TensorFlow

TensorFlow is a machine learning framework that can be applied to problems beyond traditional machine learning, but it is most popularly used for deep learning. It provides an extensive library of tools spanning deep learning research through to the deployment of trained models.

This competency includes the basics of linear algebra, static computational graphs, the data types available in TensorFlow (variables, placeholders, and constants), sessions, tensor operations, using GPUs, implementation of logistic regression using basic operations, tf.data.Dataset, data preprocessing, an understanding of Keras, optimizers, loss functions available in TensorFlow, saving and loading models, implementation of neural networks, debugging, implementation of complex neural network architectures, distributed training, TensorFlow in production using TFX, using TensorBoard to visualize states of the machine learning workflow, and usage of TFRecords.

Key competencies: 

  1. Basics of Linear Algebra - A good understanding of concepts such as vectors, matrices, tensors, matrix multiplication, etc.
  2. Static Computational Graphs - A theoretical understanding of computational graphs and the ability to visualize simple programs such as the addition of two numbers as computational graphs. An understanding of the difference between static and dynamic computational graphs.
  3. Variables, Placeholders, and Constants - These are the three main data types available in TensorFlow. Variables store the trainable parameters of a model. Placeholders are commonly used for feeding input data into the model; they have no value when defined and are fed data during a session run. Constants hold fixed values for a particular operation and are not updated during backpropagation.
  4. tf.Session - After the computational graph has been set up, tf.Session is the wrapper used to execute a subgraph or the complete graph (a graph with all three data types and a session run is sketched after this list).
  5. Tensor Operations - Ability to create and manipulate tensors using TensorFlow: reshaping, multiplication, summing along an axis, transformations, assignment, etc., and performing tasks such as converting an RGB image to grayscale (see the tensor-operations sketch after this list).
  6. Using GPUs - Usage of tf.device to place operations on the CPU or a specific GPU (sketched after this list).
  7. Logistic Regression - Ability to set up a computational graph for simple logistic regression with gradient descent (a logistic-regression sketch follows this list).
  8. tf.data.Dataset - Deep learning generally uses large datasets that do not fit in memory; tf.data.Dataset provides a way to stream this data efficiently (an input-pipeline sketch follows this list).
  9. Data Preprocessing - Usage of TensorFlow features for reading, writing and manipulating different types of data such as images, text, audio, etc.
  10. Keras - A good understanding of Keras and how it works. Keras is a popular open-source library for building neural networks in an intuitive way and is integrated into TensorFlow as tf.keras.
  11. Optimizers - TensorFlow provides several gradient-based optimizers such as SGD, Adam, RMSProp, Adagrad, etc.
  12. Loss - Ability to use a loss function from the TensorFlow library or to write a custom loss function for a machine learning model (a custom-loss sketch follows this list).
  13. Saving and Loading Models - Usage of model.save() and tf.keras.models.load_model() to save and load models and their parameters, along with an understanding of cases where loading can fail (sketched after this list).
  14. Neural Networks - Ability to use Keras to build a neural network for simple problems such as binary classification or linear regression (see the Keras sketch after this list).
  15. Debugging - Ability to debug problems during training or testing by inspecting the states of parameters from the session or by other means.
  16. Neural Network Architectures - Ability to use TensorFlow Keras to solve more complex problems on unstructured data (images, video, text, audio, etc.), such as image classification, segmentation, object detection, text summarization, and video/audio classification, using architectures built from convolutional layers, recurrent layers, etc.
  17. Using Pre-trained Models - Ability to load and fine-tune pre-trained models for different tasks (a fine-tuning sketch follows this list).
  18. Distributed Training - Ability to set up distributed training with tf.distribute to train models across multiple GPUs, TPUs, or even machines (sketched after this list).
  19. Production and TFX - Usage of TFX for serving models and building model APIs. A good understanding of how TensorFlow can be used for large-scale machine learning.
  20. TensorBoard - Usage of TensorBoard for measuring and visualizing different states of a machine learning workflow (a callback-based sketch follows this list).
  21. TFRecord - Usage of TFRecords for saving and loading data. TFRecord is a format for storing data as a sequence of binary records that can be read efficiently across platforms (a write/read sketch follows this list).
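
The sketches below illustrate several of the competencies above; the data, shapes, file names, and hyperparameters in them are assumptions chosen purely for illustration. First, a minimal sketch of basic tensor creation and manipulation (competencies 1 and 5); the grayscale conversion uses the common luminance weights.

```python
import tensorflow as tf

# Create tensors and perform basic linear-algebra operations.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.reshape(tf.range(4, dtype=tf.float32), (2, 2))  # reshape a flat range into 2x2
product = tf.matmul(a, b)                               # matrix multiplication
row_sums = tf.reduce_sum(product, axis=1)               # sum along an axis
transposed = tf.transpose(product)                      # transformation

# Convert a (height, width, 3) RGB image tensor to grayscale with luminance weights.
rgb = tf.random.uniform((32, 32, 3))
gray = tf.tensordot(rgb, tf.constant([0.299, 0.587, 0.114]), axes=1)  # shape (32, 32)

print(product.numpy(), row_sums.numpy(), gray.shape)
```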
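
A TF1-style sketch of a static computational graph with a placeholder, a variable, a constant, and a session run (competencies 2-4). It assumes the tf.compat.v1 API with eager execution disabled, which is how these constructs are accessed in TensorFlow 2.

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # use TF1-style static graphs

# Build the graph: nothing is computed yet, these are just nodes.
x = tf.compat.v1.placeholder(tf.float32, shape=(None,), name="x")  # fed at run time
w = tf.compat.v1.Variable(2.0, name="w")                           # trainable parameter
b = tf.constant(1.0, name="b")                                     # fixed value
y = w * x + b                                                      # symbolic result

# Execute a subgraph inside a session, feeding the placeholder.
with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [1.0, 2.0, 3.0]}))  # [3. 5. 7.]
```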
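
A sketch of explicit device placement with tf.device (competency 6); the "/GPU:0" name assumes at least one visible GPU, with a CPU fallback otherwise.

```python
import tensorflow as tf

# Pick a device explicitly; fall back to the CPU when no GPU is visible.
device = "/GPU:0" if tf.config.list_physical_devices("GPU") else "/CPU:0"
with tf.device(device):
    a = tf.random.uniform((1000, 1000))
    b = tf.random.uniform((1000, 1000))
    c = tf.matmul(a, b)
print("computed on:", c.device)
```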
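
A sketch of logistic regression trained with gradient descent (competency 7). For brevity it uses TF2 eager execution with tf.GradientTape on synthetic data rather than an explicit TF1 graph; the learning rate and step count are arbitrary.

```python
import tensorflow as tf

# Synthetic, linearly separable data (assumed for illustration).
x = tf.random.normal((200, 2))
y = tf.cast(tf.reduce_sum(x, axis=1) > 0, tf.float32)

w = tf.Variable(tf.zeros((2, 1)))
b = tf.Variable(0.0)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for step in range(100):
    with tf.GradientTape() as tape:
        logits = tf.squeeze(tf.matmul(x, w), axis=1) + b
        loss = tf.reduce_mean(
            tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits))
    grads = tape.gradient(loss, [w, b])
    optimizer.apply_gradients(zip(grads, [w, b]))

print("final loss:", float(loss))
```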
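
A sketch of a tf.data input pipeline with image preprocessing (competencies 8-9). The image paths, labels, and target size are hypothetical placeholders.

```python
import tensorflow as tf

# Hypothetical image paths and labels; replace with a real dataset.
paths = ["img0.jpg", "img1.jpg", "img2.jpg", "img3.jpg"]
labels = [0, 1, 1, 0]

def load_and_preprocess(path, label):
    image = tf.io.read_file(path)                       # read raw bytes
    image = tf.io.decode_jpeg(image, channels=3)        # decode to an RGB tensor
    image = tf.image.resize(image, (224, 224)) / 255.0  # resize and normalize
    return image, label

dataset = (tf.data.Dataset.from_tensor_slices((paths, labels))
           .map(load_and_preprocess, num_parallel_calls=tf.data.AUTOTUNE)
           .shuffle(buffer_size=4)
           .batch(2)
           .prefetch(tf.data.AUTOTUNE))  # stream batches without loading everything into memory
```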
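
A sketch of a small Keras network for binary classification, compiled with a built-in optimizer and loss (competencies 10, 11, and 14); the layer sizes and synthetic data are arbitrary.

```python
import tensorflow as tf

# A small fully connected network for binary classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss=tf.keras.losses.BinaryCrossentropy(),
              metrics=["accuracy"])

# Synthetic data purely for illustration.
x = tf.random.normal((256, 4))
y = tf.cast(tf.reduce_sum(x, axis=1) > 0, tf.float32)
model.fit(x, y, epochs=5, batch_size=32, verbose=0)
```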
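
A sketch of a custom loss function written with TensorFlow ops (competency 12), here a Huber-style loss with an arbitrarily chosen delta.

```python
import tensorflow as tf

# A Huber-style loss written with TensorFlow ops; delta is chosen arbitrarily.
def custom_huber_loss(y_true, y_pred, delta=1.0):
    error = y_true - y_pred
    is_small = tf.abs(error) <= delta
    squared = 0.5 * tf.square(error)                 # quadratic for small errors
    linear = delta * (tf.abs(error) - 0.5 * delta)   # linear for large errors
    return tf.reduce_mean(tf.where(is_small, squared, linear))

# A custom loss can be passed to compile() in place of a built-in one, e.g.
# model.compile(optimizer="adam", loss=custom_huber_loss)
```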
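
A sketch of saving and reloading a Keras model (competency 13); the native .keras format assumes a reasonably recent TensorFlow release.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

# Save the full model (architecture, weights, optimizer state) and reload it.
model.save("my_model.keras")
restored = tf.keras.models.load_model("my_model.keras")

# Loading fails if the model uses custom layers or losses that are not
# registered or passed via the `custom_objects` argument of load_model.
```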
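
A sketch of loading and adapting a pre-trained model (competency 17), using MobileNetV2 from tf.keras.applications as an example backbone; the 10-class head is hypothetical.

```python
import tensorflow as tf

# Load MobileNetV2 pre-trained on ImageNet, without its classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze the backbone for feature extraction

# Attach a new head for a hypothetical 10-class problem.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# To fine-tune, set base.trainable = True and recompile with a low learning rate.
```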
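
A sketch of single-machine distributed training with tf.distribute.MirroredStrategy (competency 18); other strategies cover TPUs and multi-machine setups.

```python
import tensorflow as tf

# MirroredStrategy replicates the model across all visible GPUs on one machine;
# other strategies (TPUStrategy, MultiWorkerMirroredStrategy) cover TPUs and clusters.
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    # Variables created in this scope are mirrored across replicas.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")

x = tf.random.normal((256, 4))
y = tf.cast(tf.reduce_sum(x, axis=1) > 0, tf.float32)
model.fit(x, y, epochs=2, batch_size=64, verbose=0)
```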
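
A sketch of logging training with the Keras TensorBoard callback (competency 20); the log directory name is an arbitrary choice.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Log losses, metrics, and histograms to ./logs, then inspect them with:
#   tensorboard --logdir logs
tb_callback = tf.keras.callbacks.TensorBoard(log_dir="logs", histogram_freq=1)

x = tf.random.normal((256, 4))
y = tf.cast(tf.reduce_sum(x, axis=1) > 0, tf.float32)
model.fit(x, y, epochs=3, verbose=0, callbacks=[tb_callback])
```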
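
A sketch of writing and reading TFRecords (competency 21); the feature names and file name are arbitrary.

```python
import tensorflow as tf

# Write a few records, each holding a feature vector and a label.
with tf.io.TFRecordWriter("data.tfrecord") as writer:
    for i in range(3):
        example = tf.train.Example(features=tf.train.Features(feature={
            "feature": tf.train.Feature(
                float_list=tf.train.FloatList(value=[1.0 * i, 2.0 * i])),
            "label": tf.train.Feature(
                int64_list=tf.train.Int64List(value=[i])),
        }))
        writer.write(example.SerializeToString())

# Read the records back as a tf.data.Dataset.
feature_spec = {
    "feature": tf.io.FixedLenFeature([2], tf.float32),
    "label": tf.io.FixedLenFeature([], tf.int64),
}

def parse(record):
    return tf.io.parse_single_example(record, feature_spec)

for parsed in tf.data.TFRecordDataset("data.tfrecord").map(parse):
    print(parsed["feature"].numpy(), parsed["label"].numpy())
```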