TensorFlow Introduction & Training Data Using A Python Example

TensorFlow Introduction

TensorFlow is one of the most efficient open-source libraries for numerical computing, which is an important factor in neural network calculations. It is accompanied by application programming interfaces for many of the major languages used in the machine learning field. We will cover much more in this TensorFlow introduction.

As far as beginners are concerned, everything revolves around tensors, the primitive unit in TensorFlow. The tensor data structure is used to describe all of the data. In mathematics, tensors are geometric objects that describe linear relationships between other geometric objects. In TensorFlow, they are multi-dimensional arrays, i.e., matrices; using these arrays, we can perform matrix operations efficiently. Linear algebra gives a good picture of this concept. As an example, the code below defines two constant tensors and adds one to the other:

import tensorflow as tf

# Define two constant tensors
const1 = tf.constant([[1, 2, 3], [1, 2, 3]])
const2 = tf.constant([[3, 4, 5], [3, 4, 5]])

# Add the two tensors element-wise
result = tf.add(const1, const2)

# Run the graph in a session to compute the result
with tf.Session() as sess:
    output = sess.run(result)
    print(output)

Constants are static values. But TensorFlow also lets us define other types of data, including variables, since it has a rich API:

import tensorflow as tf

# Define two 2x2 variables
var1 = tf.Variable([[1, 2], [1, 2]], name="variable1")
var2 = tf.Variable([[3, 4], [3, 4]], name="variable2")
result = tf.matmul(var1, var2)

with tf.Session() as sess:
    # Variables must be initialized before they can be used
    sess.run(tf.global_variables_initializer())
    output = sess.run(result)
    print(output)

TensorFlow also uses data flow graphs, in which the nodes represent mathematical operations and the edges represent the tensors communicated between them.
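To make the separation between building the graph and running it concrete, here is a minimal sketch (the names graph, x, y and z are illustrative, not from the examples above):

import tensorflow as tf

# Build the data flow graph: nodes are operations, edges carry tensors
graph = tf.Graph()
with graph.as_default():
    x = tf.constant(2.0)   # node producing a tensor
    y = tf.constant(3.0)   # node producing a tensor
    z = tf.multiply(x, y)  # node consuming both tensors

# Nothing is computed until a session executes the graph
with tf.Session(graph=graph) as sess:
    print(sess.run(z))  # prints 6.0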

Installation and Setup

TensorFlow offers a large range of language APIs, including Python, C++, Java, Go, Haskell and R (as a third-party library). Many operating systems are supported as well. Here, however, we will cover the installation process on Windows 10 using Python 3.5 and 3.6. One more important thing to check is the hardware configuration of your system.

Below are the two ways you can opt for when installing TensorFlow:

• TensorFlow with GPU support
• TensorFlow with CPU support only
To install TensorFlow with GPU support, your system must be equipped with an NVIDIA® GPU. The CPU version is much easier to install and configure, but the GPU version is faster.

Below are the steps to install TensorFlow using Anaconda:

1. Run the below command to create a conda environment:
conda create -n tensorflow pip python=3.5

2. Issue the below command to activate the environment created:
activate tensorflow

3. Now we will install TensorFlow within your environment.

-> Below is the command for CPU version:
pip install --ignore-installed --upgrade tensorflow

-> Below is the command for GPU version:
pip install --ignore-installed --upgrade tensorflow-gpu
Also, one can opt to install TensorFlow via “native pip”.

-> Run the below command for CPU version:
pip3 install --upgrade tensorflow

-> And the command for GPU TensorFlow version is below:
pip3 install --upgrade tensorflow-gpu
With this, the TensorFlow installation is complete. Let us talk about the typical workflow.
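One quick way to verify the installation is to import the library, print its version and run a trivial session:

import tensorflow as tf

# Print the installed version and evaluate a trivial constant
print(tf.__version__)
hello = tf.constant('Hello, TensorFlow!')
with tf.Session() as sess:
    print(sess.run(hello))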

Workflow for TensorFlow

Below is the workflow for TensorFlow codes:
• Import the dataset
• Extend the dataset with additional columns to describe the data
• Select the type of model
• Train the model
• Evaluate the accuracy of the model
• Predict results using the model

Training and evaluation are important parts of developing any artificial neural network. Most commonly, two datasets are used for these processes: one for training and one for testing the accuracy of the trained network. Even if you have just one dataset, you will need to split it into two separate datasets to perform these steps.
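As a minimal sketch of such a split using Pandas (the file name iris.csv and the 80/20 ratio are illustrative choices, not fixed requirements):

import pandas as pd

# Load a single dataset (illustrative file name)
full_dataset = pd.read_csv('iris.csv')

# Shuffle and take 80% of the rows for training, the rest for testing
train_dataset = full_dataset.sample(frac=0.8, random_state=42)
test_dataset = full_dataset.drop(train_dataset.index)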
Code

In this article, we will be using the Spyder IDE for development, so the whole process is explained in that environment.
First and foremost, import the dataset and parse it. We will use another Python library, Pandas, to do this. Pandas is an open-source library that offers easy-to-use data structures and data analysis tools for Python.

• Import dataset

# Import `tensorflow` and `pandas`
import tensorflow as tf
import pandas as pd

# Column names for the Iris dataset
COLUMN_NAMES = [
    'SepalLength',
    'SepalWidth',
    'PetalLength',
    'PetalWidth',
    'Species'
]

# Import training dataset
training_dataset = pd.read_csv('iris_training.csv', names=COLUMN_NAMES, header=0)
train_x = training_dataset.iloc[:, 0:4]
train_y = training_dataset.iloc[:, 4]

# Import testing dataset
test_dataset = pd.read_csv('iris_test.csv', names=COLUMN_NAMES, header=0)
test_x = test_dataset.iloc[:, 0:4]
test_y = test_dataset.iloc[:, 4]

In the above code, the read_csv function is used to import the dataset into local variables. We then create four separate matrices, separating the inputs (train_x, test_x) from the expected outputs (train_y, test_y).

• Now we will define the feature columns

# Setup feature columns
columns_feat = [
    tf.feature_column.numeric_column(key='SepalLength'),
    tf.feature_column.numeric_column(key='SepalWidth'),
    tf.feature_column.numeric_column(key='PetalLength'),
    tf.feature_column.numeric_column(key='PetalWidth')
]

• Model selection
Here, we are trying to predict the class of an Iris flower based on its attributes. For this, we need to choose one of the estimators from the TensorFlow API. An Estimator class object encapsulates the logic for building a TensorFlow graph and running a TensorFlow session. For this task, we opt for a DNNClassifier with two hidden layers of ten neurons each.

# Build Neural Network - Classifier
classifier = tf.estimator.DNNClassifier(
    feature_columns=columns_feat,
    # Two hidden layers of 10 nodes each.
    hidden_units=[10, 10],
    # The model classifies 3 classes
    n_classes=3)

• Training
Once the model has been selected, it is time to train the neural network with the data from the training dataset. The first step is to define a training function. This function supplies the neural network with data from the training set by repeating it and creating multiple batches; the shuffle function is called to make training more effective.

# Define train function
def train_function(inputs, outputs, batch_size):
    # Build a tf.data pipeline: shuffle, repeat, and batch the training data
    dataset = tf.data.Dataset.from_tensor_slices((dict(inputs), outputs))
    dataset = dataset.shuffle(1000).repeat().batch(batch_size)
    return dataset.make_one_shot_iterator().get_next()

# Train the Model.
classifier.train(
    input_fn=lambda: train_function(train_x, train_y, 100),
    steps=1000)

• Evaluate the accuracy of the model
Here we evaluate the neural network and report its accuracy.

# Define evaluation function
def evaluation_function(attributes, classes, batch_size):
    attributes = dict(attributes)
    # Without labels, feed the attributes alone (prediction mode)
    if classes is None:
        inputs = attributes
    else:
        inputs = (attributes, classes)
    dataset = tf.data.Dataset.from_tensor_slices(inputs)
    assert batch_size is not None, "batch_size must not be None"
    dataset = dataset.batch(batch_size)
    return dataset.make_one_shot_iterator().get_next()

# Evaluate the model.
eval_result = classifier.evaluate(
    input_fn=lambda: evaluation_function(test_x, test_y, 100))
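The evaluate call returns a dictionary of metrics, so the accuracy of the network can be printed like this:

# Print the accuracy from the returned metrics dictionary
print('Test set accuracy: {accuracy:0.3f}'.format(**eval_result))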

• Predict results using the model
With this code, we get an accuracy of 0.93. Now we can call our classifier to get predictions for new samples, as sketched below.
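A minimal sketch of the prediction step (the measurements in predict_x are illustrative samples; the evaluation_function defined above is reused with classes=None as the input function):

# Illustrative new samples: four measurements per flower
predict_x = {
    'SepalLength': [5.1, 5.9],
    'SepalWidth': [3.3, 3.0],
    'PetalLength': [1.7, 4.2],
    'PetalWidth': [0.5, 1.5],
}

# Reuse the evaluation function without labels to feed the classifier
predictions = classifier.predict(
    input_fn=lambda: evaluation_function(predict_x, None, batch_size=100))

for pred in predictions:
    class_id = pred['class_ids'][0]
    probability = pred['probabilities'][class_id]
    print('Predicted class: {}, probability: {:.3f}'.format(class_id, probability))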

Conclusion

Neural networks gained popularity when Google made TensorFlow available to the public. Nowadays there are also higher-level APIs that simplify the implementation of neural networks even further; Keras is one such API, and it runs on top of TensorFlow.
