Deep Learning

A summary of notes from taking this course to prepare for grad school: https://www.udacity.com/course/intro-to-tensorflow-for-deep-learning--ud187

Basic Terms

Artificial Intelligence - The field of computer science concerned with making computers perform intelligent tasks the way humans do.

  • Machine Learning - Computational techniques in which computers are trained from data rather than explicitly programmed

    • Neural Network - Inspired by the networks of neurons found in biological brains; the fundamental component of Deep Learning

      • Deep Learning - A subset of ML using Neural Networks

Supervised Learning - You know what you want to teach the computer
Unsupervised Learning - You let the computer figure out what can be learned

Feature - Input(s) to an ML model
Labels - The output a model predicts

Layer - A collection of nodes together within a network
Model - The representation of your neural network (or whatever Machine Learning paradigm you're going for)
Learning Rate - The "step size" for loss improvement during gradient descent
Batch - Set of examples used during training of the neural network
Epoch - A full pass over the entire training dataset
Forward pass - Computation of output values from input
Backward pass (backpropagation) - The calculation of internal variable adjustments according to the optimizer algorithm, starting from the output layer and working back through each layer to the input.
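
The course works in TensorFlow, so a minimal tf.keras sketch can show where these terms appear in practice. The layer sizes and toy data below are made up for illustration and are not taken from the course notebooks.

import numpy as np
import tensorflow as tf

# Model: a stack of layers (here, a hidden Dense layer and an output Dense layer).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),      # one input feature per example
    tf.keras.layers.Dense(units=4),  # hidden layer: 4 nodes
    tf.keras.layers.Dense(units=1),  # output layer: 1 node
])

# Learning rate: the "step size" the optimizer uses when adjusting the
# model's internal variables during gradient descent.
model.compile(loss="mean_squared_error",
              optimizer=tf.keras.optimizers.Adam(learning_rate=0.1))

# Features (inputs) and labels (expected outputs) -- toy values only.
features = np.array([0.0, 1.0, 2.0, 3.0, 4.0], dtype=float).reshape(-1, 1)
labels = np.array([1.0, 3.0, 5.0, 7.0, 9.0], dtype=float)

# Each epoch is one full pass over the training dataset. Within an epoch,
# every batch of batch_size examples goes through a forward pass (compute
# outputs from inputs) and a backward pass (backpropagation: the optimizer
# adjusts the internal variables from the output layer back toward the input).
model.fit(features, labels, epochs=50, batch_size=2, verbose=0)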

Current Applications

What is Machine Learning?

In traditional programming, the input and the algorithm are known, and you write a function to produce an output:

  • Input data

  • Apply logic to it

  • Produce a result

# Convert Celsius to Fahrenheit
def celsius_to_fahrenheit(C):  # Input
    F = C * 1.8 + 32           # Algorithm
    return F                   # Output
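
The course contrasts this with the machine learning approach: the inputs and outputs are known as examples, and the algorithm itself is what the network learns. A rough tf.keras sketch of that idea follows; the exact data values and hyperparameters are illustrative rather than copied verbatim from the course notebook.

import numpy as np
import tensorflow as tf

# Known inputs (features) and known outputs (labels);
# the conversion algorithm itself is what the model learns.
celsius = np.array([-40, -10, 0, 8, 15, 22, 38], dtype=float).reshape(-1, 1)
fahrenheit = np.array([-40, 14, 32, 46, 59, 72, 100], dtype=float)

# A single Dense node is enough to recover F = C * 1.8 + 32.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(units=1),
])
model.compile(loss="mean_squared_error",
              optimizer=tf.keras.optimizers.Adam(learning_rate=0.1))

model.fit(celsius, fahrenheit, epochs=500, verbose=0)
print(model.predict(np.array([[100.0]])))  # should print a value close to 212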
