Tensorflow

Basic Dense Network Model

Instantiate a Dense network model with a single layer, the simplest possible model Tensorflow can support. This configuration will have:

  • input_shape=[1] - The input to this layer is a single value; that is, the shape is a one-dimensional array with one element. Since this is the only layer, it is also the input shape of the entire model.

  • units=1 - Specifies the number of neurons in the layer, i.e. how many internal variables the layer has available to learn the problem. Since this is the final layer, it is also the size of the model's output.

  • Sequential model - Takes a list of layers as an argument, specifying the calculation order from input to output.

  • Loss function - A way of measuring how far off predictions are from the desired outcome. In this case we'll use mean_squared_error.

  • Optimizer function - A way of adjusting the internal values in order to reduce the loss. In this case we'll use Adam.

    • We'll give it a learning rate of 0.1. This value matters: if the learning rate is too small, training takes too long; if it is too large, the results become inaccurate. Finding an optimal value for your use case can involve trial and error.

import numpy as np
import tensorflow as tf

# Example training data: Celsius inputs and the matching Fahrenheit values
celsius_q    = np.array([-40, -10,  0,  8, 15, 22,  38], dtype=float)
fahrenheit_a = np.array([-40,  14, 32, 46, 59, 72, 100], dtype=float)

# Initiate the layer
l0 = tf.keras.layers.Dense(units=1, input_shape=[1])

# Assemble layers into the model
model = tf.keras.Sequential([l0])

# You can also do all this by defining the layers inside the model
# (this replaces the model built above)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(units=1, input_shape=[1])
])

# Compile the model with the loss and optimizer functions
model.compile(loss='mean_squared_error',
              optimizer=tf.keras.optimizers.Adam(0.1))

# Train the model using the fit method
history = model.fit(celsius_q, fahrenheit_a, epochs=500, verbose=False)
print("Finished training the model")

# Make a prediction (100 C should come out near 212 F)
print(model.predict(np.array([100.0])))
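
The fit method returns a history object that records the loss after every epoch, which makes it easy to check whether training actually converged. A minimal sketch, assuming matplotlib is installed (it isn't used anywhere else on this page):

import matplotlib.pyplot as plt

# history.history['loss'] holds one loss value per training epoch
plt.xlabel('Epoch Number')
plt.ylabel('Loss Magnitude')
plt.plot(history.history['loss'])
plt.show()

The curve should drop steeply at first and then flatten out as the weights settle near their final values.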

Dense Layers

In basic terms, a Dense layer is fully connected: every neuron in the layer is connected to every neuron in the previous layer and the next layer.
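
One consequence of full connectivity is that a Dense layer with n inputs and m units holds n × m weights plus m biases. A small sketch to see this; the layer sizes here are arbitrary, picked only for illustration:

import tensorflow as tf

# First layer: 3 inputs, 4 units -> 3*4 weights + 4 biases = 16 parameters
# Second layer: 4 inputs, 2 units -> 4*2 weights + 2 biases = 10 parameters
model = tf.keras.Sequential([
    tf.keras.layers.Dense(units=4, input_shape=[3]),
    tf.keras.layers.Dense(units=2)
])
model.summary()  # reports 16 and 10 trainable parameters per layer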

The weights (w's) and biases (b's) are adjusted iteratively during the training process until the final values are very close to correct.

The 1-layer Dense network we established previously to learn the Celsius-to-Fahrenheit conversion is just a single neuron computing output = w × input + b. Since the true equation is F = C × 1.8 + 32, training should drive w toward 1.8 and b toward 32.
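
You can confirm this by printing the trained layer's variables with get_weights(), a standard Keras layer method, assuming the model was assembled from l0 (the first form above). The exact numbers will vary slightly from run to run:

# The single Dense layer holds one weight and one bias
print("These are the layer variables: {}".format(l0.get_weights()))
# Expect something close to [array([[1.8]]), array([31.9])],
# matching F = 1.8*C + 32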
