First, what is TensorFlow?
I needed to first understand what I was getting myself involved with. TensorFlow is an open-source software library developed by Google that has become one of the most popular tools for implementing machine learning and deep learning. It was originally released in 2015. TensorFlow is designed to be flexible, efficient, extensible, and portable, making it suitable for a wide range of applications, from research to production.
At its core, TensorFlow is a framework for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the edges represent the multidimensional data arrays (tensors) communicated between them. This architecture allows TensorFlow to efficiently execute large-scale machine learning models on various hardware platforms, including CPUs, GPUs, and TPUs.
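To make that less abstract, here is a minimal sketch of my own (a toy computation, not taken from any official guide) where each operation is a node and the tensors are the data flowing between them:

import tensorflow as tf

# Each operation (matmul, add) is a node in the graph; the tensors
# produced and consumed are the edges connecting those nodes.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[5.0], [6.0]])

product = tf.matmul(a, b)       # node: matrix multiplication
result = tf.add(product, 1.0)   # node: element-wise addition

print(result)  # a tensor of shape (2, 1): [[18.], [40.]]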
Some key concepts I've learned in TensorFlow so far...
Tensors and Variables
A good example of this would be my screenshot at the top. I took the liberty of circling some important aspects that you will notice while reading. You can also see more of my work on my GitHub, where you can follow my experience as I continue to grow!
To add more to this, understand that tensors are the fundamental data structures in TensorFlow. They are essentially multidimensional arrays, similar to NumPy arrays, but with additional capabilities for use in machine learning. Tensors can have various dimensions, known as ranks, and can store data of different types, such as float32, int32, and string. I believe that if you are familiar with NumPy, then TensorFlow will be a smooth ride for you. Variables, on the other hand, are used in TensorFlow to store and update parameters during training. Unlike tensors, which are immutable, variables can be modified.
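To put this into something concrete, here is a small sketch of my own (the values are made up purely for illustration):

import tensorflow as tf

# Tensors: immutable multidimensional arrays, each with a dtype and a shape.
t = tf.constant([[1, 2, 3], [4, 5, 6]], dtype=tf.int32)
print(t.shape)   # (2, 3) -> a rank-2 tensor
print(t.dtype)   # <dtype: 'int32'>

s = tf.constant("hello tensorflow")      # tensors can also hold strings
r = tf.constant(3.14, dtype=tf.float32)  # a rank-0 tensor (a scalar)

# Variables: mutable containers for values that change during training.
w = tf.Variable([[0.5, 0.5, 0.5]], dtype=tf.float32)
w.assign_add([[0.1, 0.1, 0.1]])  # updated in place, which a plain tensor cannot do
print(w.numpy())                 # [[0.6 0.6 0.6]]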
One of the things I noticed about Jupyter that I loved, and that helped me, was that I could click back into a code cell after it had executed, change it, re-run it, and add notes anywhere I wanted. Paying attention to the 'dtype' and 'shape' is important, as I learned.
Graphs and Sessions
In TensorFlow 1.x, computations are defined in a computational graph, and sessions are used to execute operations in that graph. However, TensorFlow 2.x has made eager execution the default mode, which allows operations to be executed immediately as they are called from Python. For a more comprehensive review, I strongly recommend the official documentation guide.
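As a rough sketch of the difference (again, my own toy example rather than anything from the docs): in 2.x you can just run operations directly, and tf.function is the modern way to get graph-style execution without sessions, which I will come back to in the next article.

import tensorflow as tf

# Eager execution (the TF 2.x default): operations run immediately, like normal Python.
x = tf.constant([1.0, 2.0, 3.0])
print(tf.reduce_sum(x))  # tf.Tensor(6.0, shape=(), dtype=float32)

# tf.function traces the Python function into a graph behind the scenes,
# giving graph-style execution in TF 2.x without an explicit session.
@tf.function
def scale_and_sum(values, factor):
    return tf.reduce_sum(values * factor)

print(scale_and_sum(x, 2.0))  # tf.Tensor(12.0, shape=(), dtype=float32)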
Part 2
What to expect
In my next article, be prepared for me to dive deep into the following key components:
Graphs and tf.function
Keras API: This high-level API is built into TensorFlow and provides an easy-to-use interface for building and training neural networks. Beginners should become familiar with creating models using Keras layers and the Sequential API (see the quick sketch after this list).
Model training loops: Learning how to implement basic training loops is crucial for understanding how models are trained in TensorFlow.
Basic operations: Understanding how to perform mathematical operations on tensors, such as addition, multiplication, and matrix operations. (Matrix multiplication can be tricky, and I will show you how it works along with some advice to make it easy for you.)
Sessions (for TensorFlow 1.x): While less relevant in TensorFlow 2.x, understanding the concept of sessions can be helpful when working with older codebases or resources.
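As a small preview of the Keras piece, here is a minimal sketch of what a Sequential model and its built-in training loop look like (the data, layer sizes, and hyperparameters below are made-up placeholders for illustration, not recommendations):

import numpy as np
import tensorflow as tf

# Hypothetical toy data: 100 samples with 4 features and a binary label.
features = np.random.rand(100, 4).astype("float32")
labels = np.random.randint(0, 2, size=(100, 1))

# A tiny Sequential model built by stacking Keras layers.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# compile() wires up the optimizer, loss, and metrics; fit() runs the training loop.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(features, labels, epochs=3, batch_size=16, verbose=0)

print(model.predict(features[:2]))  # predicted probabilities for the first two samples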
By focusing on these key components, I can build a strong foundation in TensorFlow, enabling me to create, train, and deploy machine learning models effectively. As I progress, I can delve deeper into more advanced topics and specialized areas of the TensorFlow ecosystem. For others on a similar path, I hope this discussion reaches you and helps in some way, now and in the future!