
Academic year: 2023


(1)

Machine Intelligence

Preet Kanwal

Associate Professor

Department of Computer Science & Engineering

Acknowledgement: Dr. Rajanikanth K (Former Principal MSRIT, PhD (IISc), Academic Advisor, PESU)

Teaching Assistant: Ameya Bhamare (Sem VII)

(2)

Machine Intelligence

Unit 2 - History of Artificial Neural Networks

Preet Kanwal

Department of Computer Science & Engineering

(3)

Unit 2 - ANN Outline

● History of ANN

■ MP neuron

■ Perceptron

● Applications

(4)

Unit 2 - ANN

History - Biological Neurons

Reticular Theory

Joseph von Gerlach proposed that the nervous system is a single continuous network, as opposed to a network of many discrete cells.

(5)

Staining Technique

Camillo Golgi discovered a chemical reaction that allowed him to examine nervous tissue in much greater detail than ever before. He was a proponent of reticular theory.

(6)

Neuron Doctrine

Santiago Cajal used Golgi's technique to study the nervous system and proposed that it is actually made up of discrete individual cells forming a network (as opposed to a single continuous network).

(7)


The Term Neuron

Coined by Heinrich Wilhelm Gottfried von Waldeyer-Hartz around 1891. He further consolidated the Neuron Doctrine.

(8)


Nobel Prize

Both Golgi (reticular theory) and Cajal (neuron doctrine) were jointly awarded the 1906 Nobel Prize in Physiology or Medicine. Their conflicting views led to a lasting controversy between the two scientists.

(9)


The Final Word

In the 1950s, electron microscopy finally confirmed the neuron doctrine by unambiguously demonstrating that nerve cells are individual cells interconnected through synapses (a network of many individual neurons).

(10)

History - MP Neurons

McCulloch-Pitts Neuron

McCulloch (neuroscientist) and Pitts (logician) proposed a highly simplified model of the neuron (1943)
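The model can be sketched in a few lines of Python (an illustration; the function names and the inhibitory-input handling are my own, following the 1943 model, not the slides): binary inputs, a fixed threshold, a binary output.

```python
# McCulloch-Pitts neuron: binary inputs, fixed threshold, binary output.
def mp_neuron(inputs, threshold, inhibitory=()):
    # any active inhibitory input vetoes firing outright
    if any(inputs[i] for i in inhibitory):
        return 0
    return 1 if sum(inputs) >= threshold else 0

# Boolean gates fall out of the threshold choice alone (no learned weights):
AND = lambda x1, x2: mp_neuron([x1, x2], threshold=2)
OR  = lambda x1, x2: mp_neuron([x1, x2], threshold=1)
NOT = lambda x1:     mp_neuron([x1], threshold=0, inhibitory=(0,))
```

Note that there are no weights to learn here: all the behaviour comes from the threshold and the wiring.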

(11)

Perceptron

“the perceptron may eventually be able to learn, make decisions, and translate languages” -Frank Rosenblatt
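Rosenblatt's mistake-driven learning rule can be sketched as follows (a minimal illustration; the hyperparameters are assumptions, and it is trained here on AND, which, unlike XOR, is linearly separable):

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            # update only on mistakes, nudging the boundary toward the example
            w += lr * (target - pred) * xi
            b += lr * (target - pred)
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])                      # AND truth table
w, b = train_perceptron(X, y)
preds = [1 if xi @ w + b > 0 else 0 for xi in X]
```

Unlike the MP neuron, the weights here are learned from examples rather than fixed by hand.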

(12)


History - Multilayer Perceptrons

First generation Multilayer Perceptrons

(13)

Perceptron Limitations

In their now-famous book “Perceptrons”, Minsky and Papert outlined the limits of what perceptrons could do.

They showed that a simple function like XOR cannot be modelled by a single neuron, which led to severe cuts in funding.

They did note that a multilayer perceptron can model it, but that point was largely overlooked; in effect, their words were misconstrued.
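The point about multilayer perceptrons can be made concrete (the weights below are hand-picked for illustration, not from the book): no single threshold unit computes XOR, but two layers of them do.

```python
def step(z):
    return 1 if z > 0 else 0

def xor_mlp(x1, x2):
    h1 = step(x1 + x2 - 0.5)       # hidden unit 1: OR(x1, x2)
    h2 = step(1.5 - x1 - x2)       # hidden unit 2: NAND(x1, x2)
    return step(h1 + h2 - 1.5)     # output unit: AND(h1, h2) = XOR(x1, x2)
```

The hidden layer re-represents the inputs so that the output unit faces a linearly separable problem.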

(14)

History - AI Winter

AI Winter of Connectionism

Funding dropped severely. There are two broad strands of AI, symbolic and connectionist, and the winter almost led to the abandonment of connectionist AI.

(15)

Backpropagation

Discovered and rediscovered several times throughout the 1960s and 1970s.

Werbos (1982) first used it in the context of artificial neural networks.

Eventually popularized by the work of Rumelhart et al. in 1986.
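A minimal sketch of the algorithm (the architecture and hyperparameters are illustrative assumptions, not from the slides): a one-hidden-layer tanh network fit to sin(x), with gradients propagated backward layer by layer via the chain rule.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
y = np.sin(X)

W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr, losses = 0.05, []
for _ in range(2000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    # backward pass: chain rule, output layer first
    d_out = 2.0 * err / len(X)
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)      # derivative of tanh
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)
    # gradient descent step on every parameter
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The training loss falls steadily, which is the whole point: backpropagation makes the gradient of the loss with respect to every weight cheap to compute.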

(16)

Gradient Descent

Cauchy proposed gradient descent in 1847, motivated by the need to compute the orbits of heavenly bodies.
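The idea is unchanged today: repeatedly step against the gradient. A minimal sketch on f(x) = (x - 3)^2, whose minimum is at x = 3 (the function and step size are my own choices for illustration):

```python
# Gradient descent on f(x) = (x - 3)^2, using f'(x) = 2(x - 3).
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)          # move downhill along the negative gradient
    return x

x_min = gradient_descent(grad=lambda x: 2.0 * (x - 3.0), x0=0.0)
```

Each step shrinks the distance to the minimum by a constant factor (here 0.8), so the iterate converges to 3.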

(17)

Universal Approximation Theorem

A network of neurons with a single hidden layer can approximate any continuous function to any desired precision.

No matter how complex a continuous function, some NN with one hidden layer can represent it (the theorem guarantees existence, not an easy way to learn the weights).
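One way to see why this is plausible (a hand-built construction for illustration, not a proof): steep sigmoids act as step functions, and a single hidden layer of 200 such steps rebuilds f(x) = x^2 on [0, 1] as a fine staircase.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60.0, 60.0)))

f = lambda x: x ** 2                       # target continuous function
n = 200                                    # number of hidden units
grid = np.linspace(0.0, 1.0, n + 1)
breaks = grid[:-1]                         # where each sigmoid "step" turns on
heights = np.diff(f(grid))                 # output weights: the staircase jumps

x = np.linspace(0.0, 1.0, 1000)
hidden = sigmoid(10000.0 * (x[:, None] - breaks[None, :]))  # hidden layer
approx = f(0.0) + hidden @ heights         # linear output layer
max_err = float(np.max(np.abs(approx - f(x))))
```

Adding more hidden units makes the staircase finer, driving the worst-case error as low as desired.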

(18)

Unsupervised Pre-Training

Hinton and Salakhutdinov (2006) described an effective way of initializing the weights that allows deep autoencoder networks to learn a low-dimensional representation of data.
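A toy illustration of the autoencoder idea (a tiny linear autoencoder on synthetic data, not the authors' deep network or pre-training scheme): when 10-D data has only 2 intrinsic dimensions, a 2-D code learned by gradient descent steadily improves reconstruction.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(200, 2))              # hidden low-dimensional factors
X = Z @ rng.normal(size=(2, 10))           # observed 10-D data (rank 2)

E = rng.normal(0.0, 0.1, (10, 2))          # encoder weights: 10-D -> 2-D code
D = rng.normal(0.0, 0.1, (2, 10))          # decoder weights: 2-D code -> 10-D
lr, losses = 0.01, []
for _ in range(500):
    H = X @ E                              # the low-dimensional representation
    R = H @ D - X                          # reconstruction residual
    losses.append(float(np.mean(R ** 2)))
    # gradient descent on the mean squared reconstruction error
    dD = H.T @ R * (2.0 / R.size)
    dE = X.T @ (R @ D.T) * (2.0 / R.size)
    E -= lr * dE
    D -= lr * dD
```

Because the data is genuinely 2-dimensional, the 2-D bottleneck loses nothing in principle, and the reconstruction error falls as training proceeds.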

(19)


Applications of ANN

Success in Handwriting Recognition

Graves et al. outperformed all entries in an international Arabic handwriting recognition competition.

Success in Speech Recognition

Dahl et al. showed relative error reductions of 16.0% and 23.2% over a state-of-the-art system.

(20)


Applications of ANN - ImageNet

• In 2012, a team from U. Toronto submitted AlexNet, a deep CNN architecture.

• In the first year of the competition, every team had an error rate of at least 25%.

• The AlexNet team was the first to use deep learning, and the only one to achieve an error rate below 25%.

• After this, everyone moved to deep learning for visual recognition challenges!

(21)


Winning more visual recognition challenges

Network     Error   Layers

AlexNet     16.0%   8

ZFNet       11.2%   8

VGGNet      7.3%    19

GoogLeNet   6.7%    22

MS ResNet   3.6%    152

(22)


Applications of ANN - Voice recognition

(23)


Applications of ANN - Machine translation

(24)


Applications of ANN - Question answering

(25)


Applications of ANN - Object detection and recognition

(26)


Applications of ANN - Visual tracking

(27)


Applications of ANN - Visual question answering

(28)


Applications of ANN - Driverless cars

(29)


Applications of ANN - Image captioning

(30)


Applications of ANN - Deep fakes

These people do not exist!

(31)

preetkanwal@pes.edu

THANK YOU

Preet Kanwal

Department of Computer Science & Engineering
