
Machine learning with TensorFlow


The Google Brain team developed a software library to perform machine learning across a range of tasks. Their aim was to serve the needs of their machine learning systems, which were capable of building and training neural networks. The software was meant to help such systems detect and decipher patterns and correlations, much like the way human beings learn and reason.

In November 2015, Google released this tool under Apache 2.0 licence, making it open to use and providing everyone an opportunity to work on their own artificial intelligence (AI) based projects. By June 2016, 1500 repositories on GitHub mentioned the software, of which only five were from Google.

The tool under discussion is TensorFlow. What is it about this tool that makes it suitable for machine learning applications? How do you use it? Why does Google hold it in such high regard, and why are so many people contributing to it? This article gives you an overview of how TensorFlow flows.


Working with TensorFlow

When you import TensorFlow into the Python environment, you get complete access to its classes, methods and symbols. TensorFlow operations are arranged into a graph of nodes called the computational graph. Typically, each node takes tensors as inputs and produces a corresponding output tensor. Node values are evaluated only when a session is run. Nodes can be combined with operations, which are themselves a kind of node, to build more complicated computations.
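As a quick illustration, here is a minimal sketch of the graph-and-session workflow described above, written in the TensorFlow 1.x style API that this article describes; the values and node names are arbitrary.

```python
import tensorflow as tf  # TensorFlow 1.x style API

# Two constant nodes: each produces a tensor as its output
a = tf.constant(3.0, dtype=tf.float32)
b = tf.constant(4.0, dtype=tf.float32)

# An operation node that combines them into a larger computation
total = tf.add(a, b)

# Nothing has been computed yet; values are evaluated when a session runs
with tf.Session() as sess:
    print(sess.run(total))  # prints 7.0
```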

Fig. 1: A fully-configured TensorBoard

Customise and improvise.

To tune this for your machine learning application, you need to construct the model such that it can take arbitrary inputs and deliver outputs accordingly. The way to do this in TensorFlow is to add variables, which represent trainable parameters. Each variable has a type and an initial value, letting you tune your system towards the required behaviour.

How do you know if your system is functioning exactly the way you intended it to? Simple: introduce a loss function. TensorFlow provides optimisers that slowly change each variable so that the loss function is minimised. There are also higher abstractions for common patterns, structures and functionality.
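Putting these ideas together, the sketch below builds a tiny linear model with trainable variables, attaches a squared-error loss and lets a gradient-descent optimiser minimise it. This is only an illustrative example in TensorFlow 1.x style; the learning rate and training data are arbitrary.

```python
import tensorflow as tf  # TensorFlow 1.x style API

# Trainable parameters: each variable has a type and an initial value
W = tf.Variable([0.3], dtype=tf.float32)
b = tf.Variable([-0.3], dtype=tf.float32)

# Placeholders let the model take arbitrary inputs and desired outputs
x = tf.placeholder(tf.float32)
y = tf.placeholder(tf.float32)
linear_model = W * x + b

# Loss function: sum of squared errors between predictions and targets
loss = tf.reduce_sum(tf.square(linear_model - y))

# The optimiser slowly changes each variable to minimise the loss
train = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        sess.run(train, {x: [1, 2, 3, 4], y: [0, -1, -2, -3]})
    print(sess.run([W, b]))  # converges towards W ~ -1, b ~ 1
```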

Multiple APIs for easier control

For a new user of any software, it is important to enjoy the experience. TensorFlow is built with that mindset: its highest-level application programming interface (API) is tuned for easy learning and usage. With experience, you will learn how to handle the tool, and what modification results in what kind of change to the overall functionality.

Eventually, you will want to work closer to the model and have finer control over it. The TensorFlow Core API, the lowest-level API, gives you this fine control. The other, higher-level APIs are built on top of this core. The higher the level of the API, the easier it is to perform repetitive tasks and to keep the workflow consistent between multiple users.
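As an example of what a higher-level API hides, the sketch below trains the same kind of linear model with tf.estimator, which manages the graph, session, training loop and checkpoints for you. The feature name 'x' and the training values are arbitrary, and this again assumes the TensorFlow 1.x API.

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x style API

# Describe the input features; the estimator builds the model for us
feature_columns = [tf.feature_column.numeric_column('x', shape=[1])]
estimator = tf.estimator.LinearRegressor(feature_columns=feature_columns)

# A small, arbitrary training set
x_train = np.array([1., 2., 3., 4.])
y_train = np.array([0., -1., -2., -3.])

input_fn = tf.estimator.inputs.numpy_input_fn(
    {'x': x_train}, y_train, batch_size=4, num_epochs=None, shuffle=True)

# No explicit graph, session or loop: the high-level API handles them
estimator.train(input_fn=input_fn, steps=1000)
```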

MNIST is ‘Hello World’ to machine learning

The Modified National Institute of Standards and Technology (MNIST) database is a computer vision dataset commonly used to train machine learning systems. It is a set of handwritten digits that the system has to learn to identify by the corresponding label. The accuracy of your model depends on how thoroughly you train it: the broader the training dataset, the better the accuracy of your model.

One example is the Softmax Regression model, which uses probabilities to decipher a given image. Since every image in MNIST is a handwritten digit between zero and nine, the image being analysed can only be one of ten digits. Based on this understanding, Softmax Regression assigns to every image under test a probability of being each particular digit.
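The sketch below is the classic softmax regression example on MNIST, in TensorFlow 1.x style; it assumes the tutorial dataset loader bundled with TensorFlow 1.x, and the batch size, step count and learning rate are arbitrary choices for illustration.

```python
import tensorflow as tf  # TensorFlow 1.x style API
from tensorflow.examples.tutorials.mnist import input_data

# Download MNIST; labels arrive as one-hot vectors of length 10
mnist = input_data.read_data_sets('MNIST_data', one_hot=True)

x = tf.placeholder(tf.float32, [None, 784])     # flattened 28x28 images
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, W) + b)          # probability for each digit

y_ = tf.placeholder(tf.float32, [None, 10])     # true labels
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), axis=1))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        batch_xs, batch_ys = mnist.train.next_batch(100)
        sess.run(train_step, {x: batch_xs, y_: batch_ys})

    # Fraction of test images whose most probable digit matches the label
    correct = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
    accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))
    print(sess.run(accuracy, {x: mnist.test.images, y_: mnist.test.labels}))
```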

Smart handling of resources.

As this process involves a good deal of heavy lifting, TensorFlow, like other numerical libraries, performs its expensive computations outside the Python environment. As the developers describe it, instead of running a single expensive operation independently from Python, TensorFlow lets you describe a graph of interacting operations that run entirely outside Python.

Fig. 2: Embeddings in TensorFlow

A few noteworthy features

Using TensorFlow to train your system comes with a few added benefits.
Visualising learning.

No matter what you hear or read, a concept stays in your mind best when you actually see it. The easiest way to understand the computational graph is, of course, to view it pictorially, and a utility called TensorBoard displays this very picture. The representation is very similar to a flow or block diagram.

Graph visualisation.

Computational graphs are complicated and not easy to view or comprehend. The graph visualisation feature of TensorBoard helps you understand and debug these graphs easily. You can zoom in or out, click on blocks to check their internals, check how data flows from one block to another, and so on. Name your scopes as clearly as possible in order to visualise them better. The graph also supports series collapsing: names with number indexes are condensed to a single node on the graph, which can be expanded at will. There are also special icons for different types of nodes to help you distinguish them easily. You can also dump statistical information, such as time and resource usage data, if you need it.
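For instance, clearly named scopes like those sketched below become collapsible blocks in TensorBoard's graph view. The scope names and log directory here are purely illustrative, assuming the TensorFlow 1.x API.

```python
import tensorflow as tf  # TensorFlow 1.x style API

# Clearly named scopes group related nodes into collapsible blocks
with tf.name_scope('inputs'):
    x = tf.placeholder(tf.float32, name='x')
    y = tf.placeholder(tf.float32, name='y')

with tf.name_scope('model'):
    W = tf.Variable(0.5, name='weight')
    b = tf.Variable(0.0, name='bias')
    prediction = tf.add(tf.multiply(W, x), b, name='prediction')

with tf.name_scope('loss'):
    loss = tf.reduce_mean(tf.square(prediction - y), name='mse')

# Write the graph to an event file that TensorBoard can display
writer = tf.summary.FileWriter('./logs', tf.get_default_graph())
writer.close()
# Then launch:  tensorboard --logdir ./logs
```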

Other than viewing the graphs pictorially, TensorFlow lets you plot quantitative metrics about the execution of your graphs and also shows additional data like images that pass through it. The base for TensorBoard is the event file generated while running the tool. This file contains summary data from nodes that you select for generating a summary.
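Here is a minimal sketch of how such an event file is produced, assuming the TensorFlow 1.x summary API and an arbitrary './logs' directory: the loss node is selected for summarising, and a data point is appended to the event file at every training step.

```python
import tensorflow as tf  # TensorFlow 1.x style API

x = tf.placeholder(tf.float32)
y = tf.placeholder(tf.float32)
W = tf.Variable(0.5)
loss = tf.reduce_mean(tf.square(W * x - y))
train = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

# Select the nodes whose summary data should go into the event file
tf.summary.scalar('loss', loss)
merged = tf.summary.merge_all()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter('./logs', sess.graph)
    for step in range(100):
        summary, _ = sess.run([merged, train],
                              {x: [1., 2., 3.], y: [2., 4., 6.]})
        writer.add_summary(summary, step)  # one data point per step
    writer.close()
```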

Embedding visualisation.

Embedding Projector, the built-in visualiser of TensorFlow, aids interactive visualisation and analysis of high-dimensional data like embeddings. The projector reads the embeddings from a model checkpoint file and loads any 2D tensor or embeddings.
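A rough sketch of how a 2D tensor can be saved to a checkpoint and registered with the Embedding Projector, assuming the contrib projector plugin shipped with TensorFlow 1.x; the tensor shape, variable name and log directory are arbitrary.

```python
import os
import tensorflow as tf  # TensorFlow 1.x style API
from tensorflow.contrib.tensorboard.plugins import projector

LOG_DIR = './logs'

# Any 2D tensor can serve as an embedding, e.g. 500 points in 64 dimensions
embedding_var = tf.Variable(tf.random_normal([500, 64]), name='my_embedding')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # The projector reads the embedding from a model checkpoint file
    saver = tf.train.Saver([embedding_var])
    saver.save(sess, os.path.join(LOG_DIR, 'embedding.ckpt'))

    # Tell TensorBoard's Embedding Projector where to find the tensor
    config = projector.ProjectorConfig()
    embedding = config.embeddings.add()
    embedding.tensor_name = embedding_var.name
    projector.visualize_embeddings(tf.summary.FileWriter(LOG_DIR), config)
```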

Begin with the smallest unit, the tensor

Data in TensorFlow is based on its central unit, the tensor. A tensor is an array of primitive values and can have any number of dimensions (its rank).
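A few tensors of increasing rank, as a quick illustration in TensorFlow 1.x style (the values are arbitrary):

```python
import tensorflow as tf  # TensorFlow 1.x style API

rank0 = tf.constant(3.0)                       # a scalar: rank 0
rank1 = tf.constant([1.0, 2.0, 3.0])           # a vector: rank 1
rank2 = tf.constant([[1., 2., 3.],
                     [4., 5., 6.]])            # a 2x3 matrix: rank 2

with tf.Session() as sess:
    print(sess.run([tf.rank(rank0), tf.rank(rank1), tf.rank(rank2)]))  # [0, 1, 2]
```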

Loading data.

Sending data into a TensorFlow program can be done in three different ways. It can be fed directly via Python code, read from input files, or preloaded into a constant or variable. The last is useful for small datasets. Choose the method most suitable for your purpose.
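The sketch below contrasts the feeding and preloading approaches (reading from files is only noted in a comment); it is illustrative only, in TensorFlow 1.x style, with arbitrary values.

```python
import tensorflow as tf  # TensorFlow 1.x style API

# 1. Feeding: values are supplied through a placeholder at run time
x = tf.placeholder(tf.float32, shape=[3])

# 2. Preloading: a small dataset is embedded in the graph as a constant
small_data = tf.constant([10.0, 20.0, 30.0])

total = x + small_data

with tf.Session() as sess:
    print(sess.run(total, feed_dict={x: [1.0, 2.0, 3.0]}))  # [11. 22. 33.]

# 3. Reading from files uses input readers (or the tf.data pipeline),
#    which stream records from disk as the session runs.
```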

Fig. 3: Graphs and statistics on TensorFlow

Large-scale numerical computation made easy

TensorFlow offers powerful support for implementing and training deep neural networks, owing to its highly efficient C++ backend. This support has acted as the foundation stone for many other projects. One example is DeepDream, a program that uses a trained neural network to find and amplify patterns in images, built on TensorFlow.

Another application is RankBrain, Google's machine-learning system built to supplement, and partly replace, its static, algorithm-based handling of search queries.

Google also went on to build the Tensor Processing Unit (TPU), a custom application-specific integrated circuit for machine learning. The unit is a programmable accelerator for AI-based projects and is tailored for TensorFlow. Google announced that it had been running these units inside its data centres for over a year and had achieved better results for machine learning applications.

You can download the latest version of the software from the official TensorFlow website.

