
Simple TensorFlow example on the iris dataset

This one trains a DNN on the iris dataset and also creates a confusion matrix. The comments will help you follow what is happening at each step.

===

# imports

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow as tf
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

# Load data

# Download it from here: https://archive.ics.uci.edu/ml/datasets/iris. Ensure that the categories are
# labeled as Iris-setosa, Iris-virginica, Iris-versicolor and that their column is named Species.
iris = pd.read_csv("irisdata.csv")
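If the CSV isn't at hand, an equivalent frame can be built from scikit-learn's bundled copy of the dataset. A sketch, assuming the CSV layout described above; the feature column names here are my own choice, not prescribed by the post.

```python
import numpy as np
import pandas as pd
from sklearn.datasets import load_iris

# Build a DataFrame matching the assumed CSV layout: four float32 feature
# columns plus a "Species" column holding the original string labels.
data = load_iris()
names = ["SepalLength", "SepalWidth", "PetalLength", "PetalWidth"]  # hypothetical names
iris = pd.DataFrame(data.data.astype(np.float32), columns=names)
labels = ["Iris-setosa", "Iris-versicolor", "Iris-virginica"]
iris["Species"] = [labels[i] for i in data.target]
```

With this fallback, the float32 conversion step below is already done.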

# explore what is loaded
#print(iris.shape)
#print(iris.head())

# Convert the data type of columns to float32 type

#print(iris.dtypes)

#print(iris.iloc[:,0:4])
iris.iloc[:,0:4] = iris.iloc[:,0:4].astype(np.float32)

#print(iris.dtypes)


# encode the classes to numeric values
iris["Species"] = iris["Species"].map({"Iris-setosa":0,"Iris-virginica":1,"Iris-versicolor":2})
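The mapping can be checked on a small Series in isolation; note that with the dictionary used in the post, virginica encodes to 1 and versicolor to 2:

```python
import pandas as pd

# Tiny check of the label encoding used above
s = pd.Series(["Iris-setosa", "Iris-versicolor", "Iris-virginica"])
codes = s.map({"Iris-setosa": 0, "Iris-virginica": 1, "Iris-versicolor": 2})
print(list(codes))  # [0, 2, 1]
```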


# train test split
X_train, X_test, y_train, y_test = train_test_split(iris.iloc[:,0:4], iris["Species"], test_size=0.33, random_state=42)
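On 150 rows, test_size=0.33 leaves 100 rows for training and 50 for testing (scikit-learn rounds the test share up). A quick sketch on dummy data:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# 150 dummy samples, same split parameters as above
X = np.arange(150).reshape(150, 1)
y = np.arange(150) % 3
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=42)
print(len(X_train), len(X_test))  # 100 50
```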

# get only the column names of input features
columns = iris.columns[0:4]

#print(columns)


# declare each feature column as a real-valued column
feature_columns = [tf.contrib.layers.real_valued_column(k) for k in columns]
#print(feature_columns)


# convert the input into the format TensorFlow expects:
# takes a dataframe and a label series/vector and returns a dict of feature tensors plus a label tensor

def input_fn(df,labels):
    feature_cols = {k:tf.constant(df[k].values,shape = [df[k].size,1]) for k in columns}
    label = tf.constant(labels.values, shape = [labels.size,1])
    return feature_cols,label

# Define the classifier with hidden layers and number of classes and feature columns
classifier = tf.contrib.learn.DNNClassifier(feature_columns=feature_columns,hidden_units=[10,20,10],n_classes = 3)


# train the classifier, feeding it input prepared by the input function, for the given number of steps
classifier.fit(input_fn=lambda: input_fn(X_train,y_train),steps = 1000)


# evaluate the classifier
ev = classifier.evaluate(input_fn=lambda: input_fn(X_test,y_test),steps=1)
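ev is a dict of metrics; with tf.contrib.learn it typically includes an "accuracy" key, though the exact keys vary by TensorFlow version. The metric itself is easy to compute by hand from two label lists, independent of any framework. A sketch with made-up values:

```python
# Hypothetical labels and predictions, purely to illustrate the metric
actual = [0, 0, 1, 1, 2, 2]
predicted = [0, 1, 1, 1, 2, 0]

# Fraction of positions where prediction matches the true label
accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
print(accuracy)  # 4 of 6 correct
```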


# convert values to TensorFlow types for prediction: only the input features, not the label variable
def input_predict(df):
    feature_cols = {k:tf.constant(df[k].values,shape = [df[k].size,1]) for k in columns}
    return feature_cols


# prediction
pred = classifier.predict_classes(input_fn=lambda: input_predict(X_test))


# print prediction
print("The predictions")

#print(list(pred))

# Confusion matrix
actual = pd.Series(list(y_test))
predicted = pd.Series(list(pred))
print(pd.crosstab(actual, predicted))
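pd.crosstab tabulates actual against predicted labels; scikit-learn's confusion_matrix produces the same table and makes a handy cross-check. A sketch with hypothetical values (rows are actual classes, columns are predicted):

```python
from sklearn.metrics import confusion_matrix

actual = [0, 0, 1, 1, 2, 2]       # hypothetical true labels
predicted = [0, 1, 1, 1, 2, 0]    # hypothetical predictions

# Entry (i, j) counts samples of true class i predicted as class j
cm = confusion_matrix(actual, predicted)
print(cm)
```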
===
Major part of the code is from Wei LI.
