# Manual creation of Neural Networks

In this notebook we will manually build out a neural network that mimics the TensorFlow API.

## 1. Super() and Object-Oriented Programming

In [4]:
class SimpleClass():

    def __init__(self):
        print('hello')

    def yell(self):
        print('YELLING')

In [5]:
x = SimpleClass()

hello


The yell function defined inside the class is an example of a method:

In [7]:
x.yell()

YELLING


ExtendedClass inherits from SimpleClass, so instances such as y inherit its methods. The super() function allows ExtendedClass to call SimpleClass's \_\_init\_\_ as well, as demonstrated below:

In [12]:
class ExtendedClass(SimpleClass):

    def __init__(self):
        super().__init__()
        print('Extend!')

In [13]:
y = ExtendedClass()

hello
Extend!

In [11]:
y.yell()

YELLING
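To make the mechanism concrete, here is a minimal sketch (the Parent/Child names are made up for illustration) showing that super().\_\_init\_\_() in a single-inheritance chain behaves like calling the parent's \_\_init\_\_ directly, so the child instance ends up with attributes set by both initializers:

```python
class Parent:
    def __init__(self):
        self.greeting = 'hello'

class Child(Parent):
    def __init__(self):
        super().__init__()  # runs Parent's __init__, which sets self.greeting
        self.extra = 'Extend!'

c = Child()
print(c.greeting, c.extra)  # → hello Extend!
```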


## 2. Operations

In [25]:
class Operation():
    """
    An Operation is a node in a "Graph". TensorFlow will also use this concept of a Graph.

    This Operation class will be inherited by other classes that actually compute the
    specific operation, such as adding or matrix multiplication.
    """

    def __init__(self, input_nodes=[]):

        self.input_nodes = input_nodes
        self.output_nodes = []

        for node in input_nodes:
            node.output_nodes.append(self)

        _default_graph.operations.append(self)

    def compute(self):
        """
        This is a placeholder function. It will be overwritten by the actual specific
        operation that inherits from this class.
        """
        pass


In [26]:
class add(Operation):

    def __init__(self, x, y):
        super().__init__([x, y])

    def compute(self, x_var, y_var):
        self.inputs = [x_var, y_var]
        return x_var + y_var


Multiplication

In [27]:
class multiply(Operation):

    def __init__(self, x, y):
        super().__init__([x, y])

    def compute(self, x_var, y_var):
        self.inputs = [x_var, y_var]
        return x_var * y_var


Matrix Multiplication

In [28]:
class matmul(Operation):

    def __init__(self, x, y):
        super().__init__([x, y])

    def compute(self, x_var, y_var):
        self.inputs = [x_var, y_var]
        return x_var.dot(y_var)


## 3. Placeholders, Variables and Graphs

A Placeholder is an 'empty' node that needs a value to be fed in before the output can be computed.

Variables are the changeable parameters of the Graph.

A Graph is a global variable connecting Variables and Placeholders to Operations.

In [29]:
class Placeholder():

    def __init__(self):

        self.output_nodes = []

        _default_graph.placeholders.append(self)

In [30]:
class Variable():

    def __init__(self, initial_value=None):

        self.value = initial_value
        self.output_nodes = []

        _default_graph.variables.append(self)

In [31]:
class Graph():

    def __init__(self):

        self.operations = []
        self.placeholders = []
        self.variables = []

    def set_as_default(self):

        global _default_graph
        _default_graph = self


## 4. Simple example

z = Ax + b

A = 10,
b = 1

z = 10x + 1

In [32]:
g = Graph()

In [33]:
g.set_as_default()

In [34]:
A = Variable(10)

In [35]:
b = Variable(1)


x can take any value, but since we don't have x values yet, we set it as a Placeholder to take in future values:

In [36]:
x = Placeholder()


y = 10x

In [37]:
y = multiply(A,x)


z = y + 1

In [38]:
z = add(y,b)


## 5. Session

Now that the Graph (g) has all the nodes, we need to execute all the operations within a Session. We will use a PostOrder Tree Traversal to make sure we execute the nodes in the correct order.

Traversing Operation Nodes

In [39]:
import numpy as np

In [40]:
def traverse_postorder(operation):
    """
    PostOrder Traversal of Nodes. Basically makes sure computations are done in
    the correct order (Ax first, then Ax + b).
    """
    nodes_postorder = []

    def recurse(node):
        if isinstance(node, Operation):
            for input_node in node.input_nodes:
                recurse(input_node)
        nodes_postorder.append(node)

    recurse(operation)
    return nodes_postorder
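To see what order this produces, here is a self-contained sketch using tiny stand-in Node/Op classes (made up here, not the classes above) with the same recursion: every node's inputs are visited before the operation that consumes them, so Ax comes before Ax + b:

```python
# Minimal stand-in node types, just enough for the traversal demo.
class Node:
    def __init__(self, name):
        self.name = name

class Op(Node):
    def __init__(self, name, input_nodes):
        super().__init__(name)
        self.input_nodes = input_nodes

def postorder(operation):
    order = []
    def recurse(node):
        if isinstance(node, Op):
            for input_node in node.input_nodes:
                recurse(input_node)
        order.append(node)
    recurse(operation)
    return [n.name for n in order]

A, x, b = Node('A'), Node('x'), Node('b')
y = Op('Ax', [A, x])
z = Op('Ax+b', [y, b])
print(postorder(z))  # → ['A', 'x', 'Ax', 'b', 'Ax+b']
```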


Session

In [41]:
class Session():

    def run(self, operation, feed_dict={}):

        nodes_postorder = traverse_postorder(operation)

        for node in nodes_postorder:

            if type(node) == Placeholder:
                node.output = feed_dict[node]
            elif type(node) == Variable:
                node.output = node.value
            else:
                node.inputs = [input_node.output for input_node in node.input_nodes]
                node.output = node.compute(*node.inputs)

            if type(node.output) == list:
                node.output = np.array(node.output)

        return operation.output


Run the session and input x = 10

In [42]:
sess = Session()

In [43]:
result = sess.run(operation=z, feed_dict={x:10})

In [44]:
result

Out[44]:
101

Another example with matrix multiplication

In [46]:
g = Graph()
g.set_as_default()
A = Variable([[10,20],[30,40]])
b = Variable([1,2])
x = Placeholder()
y = matmul(A,x)
z = add(y,b)

In [47]:
sess = Session()

In [49]:
sess.run(operation=z, feed_dict={x:10})

Out[49]:
array([[101, 202],
[301, 402]])

## 6. Activation Function

In [50]:
import matplotlib.pyplot as plt
%matplotlib inline

In [51]:
def sigmoid(z):
    return 1 / (1 + np.exp(-z))

In [52]:
sample_z = np.linspace(-10,10,100)
sample_a = sigmoid(sample_z)

In [53]:
plt.plot(sample_z,sample_a)

Out[53]:
[<matplotlib.lines.Line2D at 0x11455d518>]

Sigmoid as Operation

In [55]:
class Sigmoid(Operation):

    def __init__(self, z):
        super().__init__([z])

    def compute(self, z_val):
        return 1 / (1 + np.exp(-z_val))


## 7. Classification Example

In [56]:
from sklearn.datasets import make_blobs

In [58]:
data = make_blobs(n_samples=50,n_features=2,centers=2,random_state=75)

In [59]:
data

Out[59]:
(array([[  7.3402781 ,   9.36149154],
[  9.13332743,   8.74906102],
[  1.99243535,  -8.85885722],
[  7.38443759,   7.72520389],
[  7.97613887,   8.80878209],
[  7.76974352,   9.50899462],
[  8.3186688 ,  10.1026025 ],
[  8.79588546,   7.28046702],
[  9.81270381,   9.46968531],
[  1.57961049,  -8.17089971],
[  0.06441546,  -9.04982817],
[  7.2075117 ,   7.04533624],
[  9.10704928,   9.0272212 ],
[  1.82921897,  -9.86956281],
[  7.85036314,   7.986659  ],
[  3.04605603,  -7.50486114],
[  1.85582689,  -6.74473432],
[  2.88603902,  -8.85261704],
[ -1.20046211,  -9.55928542],
[  2.00890845,  -9.78471782],
[  7.68945113,   9.01706723],
[  6.42356167,   8.33356412],
[  8.15467319,   7.87489634],
[  1.92000795,  -7.50953708],
[  1.90073973,  -7.24386675],
[  7.7605855 ,   7.05124418],
[  6.90561582,   9.23493842],
[  0.65582768,  -9.5920878 ],
[  1.41804346,  -8.10517372],
[  9.65371965,   9.35409538],
[  1.23053506,  -7.98873571],
[  1.96322881,  -9.50169117],
[  6.11644251,   9.26709393],
[  7.70630321,  10.78862346],
[  0.79580385,  -9.00301023],
[  3.13114921,  -8.6849493 ],
[  1.3970852 ,  -7.25918415],
[  7.27808709,   7.15201886],
[  1.06965742,  -8.1648251 ],
[  6.37298915,   9.77705761],
[  7.24898455,   8.85834104],
[  2.09335725,  -7.66278316],
[  1.05865542,  -8.43841416],
[  6.43807502,   7.85483418],
[  6.94948313,   8.75248232],
[ -0.07326715, -11.69999644],
[  0.61463602,  -9.51908883],
[  1.31977821,  -7.2710667 ],
[  2.72532584,  -7.51956557],
[  8.20949206,  11.90419283]]),
array([1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 1, 1,
0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0,
0, 0, 0, 1]))
In [60]:
features = data[0]
labels = data[1]

In [63]:
plt.scatter(features[:,0],features[:,1],c=labels,cmap='coolwarm')

Out[63]:
<matplotlib.collections.PathCollection at 0x11932f278>
In [66]:
x = np.linspace(0,11,10)
y = -x + 5

In [67]:
plt.scatter(features[:,0],features[:,1],c=labels,cmap='coolwarm')
plt.plot(x,y)

Out[67]:
[<matplotlib.lines.Line2D at 0x119473e48>]

Defining Perceptron

y = mx + b

y = -x + 5

f1 = mf2 + b , m=-1

f1 = -f2 + 5

f1 + f2 - 5 = 0

Convert to a Matrix Representation of Features

$$w^T x + b = 0$$

$$\begin{pmatrix} 1 , 1 \end{pmatrix} f - 5 = 0$$

Then if the result is greater than 0, the label is 1; if it is less than 0, the label is 0.

Example Point

Let's say we have the point f1=8, f2=10, otherwise stated as (8,10). Then we have:

$$\begin{pmatrix} 1 , 1 \end{pmatrix} \begin{pmatrix} 8 \\ 10 \end{pmatrix} - 5 = 13$$

In [71]:
np.array([1,1]).dot(np.array([[8],[10]])) - 5

Out[71]:
array([13])
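The thresholding step can be sketched with plain NumPy (a minimal illustration; the classify helper and the second sample point (0,-10) are made up here, not part of the graph classes above):

```python
import numpy as np

w = np.array([1, 1])
b = -5

def classify(f):
    # Label 1 when w.f + b > 0, label 0 otherwise.
    return 1 if w.dot(f) + b > 0 else 0

print(classify(np.array([8, 10])))   # 8 + 10 - 5 = 13 > 0  → label 1
print(classify(np.array([0, -10])))  # 0 - 10 - 5 = -15 < 0 → label 0
```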

Using an Example Session Graph

In [72]:
g = Graph()

In [75]:
g.set_as_default()

In [76]:
x = Placeholder()

In [77]:
w = Variable([1,1])

In [78]:
b = Variable(-5)

In [79]:
z = add(matmul(w,x),b)

In [80]:
a = Sigmoid(z)

In [81]:
sess = Session()


A high probability (close to 1) means class 1:

In [82]:
sess.run(operation=a,feed_dict={x:[8,10]})

Out[82]:
0.99999773967570205

A probability close to 0 means class 0:

In [83]:
sess.run(operation=a,feed_dict={x:[0,-10]})

Out[83]:
3.0590222692562472e-07
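As a sanity check, the two session results can be reproduced with plain NumPy, independent of the graph classes above, by applying the sigmoid to w·x + b directly:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

w, b = np.array([1, 1]), -5
p1 = sigmoid(w.dot([8, 10]) + b)    # sigmoid(13), very close to 1
p0 = sigmoid(w.dot([0, -10]) + b)   # sigmoid(-15), very close to 0
print(p1, p0)
```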