What is a neural network

Introduction to Neural Networks
Contents
• Real to artificial NN
• Bits of history
• Learning
Information Processing
• Preprocessing
• Variable selection
• Network parameters
• Postprocessing
NN flavors
Tutorials
• Antibiotics
• Car insurance
• Credit card
• Sales forecast
• Stocks
• Kohonen
• Time recursive
• ….
NN for time series and finance
• Structure of time series
• NN enhancement
Model for the brain
What is a neural network?
[Diagram: historic data (variables → goals) are used for learning; for new data the variables are known but the goals are to be predicted (variables → ?)]
Neural Networks learn from examples
Mathematician/Physicist:
Universal approximant
(a huge set of functions that is unbiased, robust, flexible and implements Bayesian inference)
Businessman:
Prediction tool
(objective, consolidated, adaptable to complex problems, easy to integrate)
What are they good for?
Classification
Good/Bad client, Helicity of a particle
Interpolation
I need to guess the behavior of a client
Optimize the operation of a chemical oven
Modeling
Build a quantitative model for fire propagation in cables
Prediction
Sun spots, Sales forecast
They can be used to deal with any statistical inference problem
Idea: Copy Nature
Real Neural Networks
Large sets of neurons take control of highly specialized tasks.
Connectivity among these sets is very complex.
Real neural networks differ in shape and in the tasks they perform.
Our brain contains over 1 000 000 000 000 neurons
Each neuron handles thousands of connections
Every minute some 10 000 neurons die in our brain!
The neuron
Neuron:
• Dendrites
• Axon
• ....
How does a neuron work?
Flow of charged ions (Calcium)!
Synapses
A neuron can
• contribute to the activation of other neurons
• inhibit the activation of other neurons
If the incoming potential gets over a threshold, the neuron fires.
[Diagram: incoming potentials V1, V2, V3 arriving at a neuron with potential U]
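A minimal Python sketch of this firing rule (the potentials V1, V2, V3 follow the diagram above; the numerical values and the threshold are made up for illustration):

def fires(incoming_potentials, threshold):
    # The neuron fires (output 1) only when the summed incoming
    # potential gets over the threshold; otherwise it stays silent (0).
    total = sum(incoming_potentials)
    return 1 if total > threshold else 0

# Hypothetical incoming potentials V1, V2, V3 and an assumed threshold
V1, V2, V3 = 0.4, -0.1, 0.5   # an inhibitory synapse contributes a negative value
print(fires([V1, V2, V3], threshold=0.6))   # -> 1, since 0.8 > 0.6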
Short summary of real NN
• Information processing takes place in neural networks
• Information is transferred by flows of electricity
• Neurons die, but information processing remains robust
• A neuron fires depending on a local processing of its inputs versus a threshold
• Synapses evolve in time (enhanced / suppressed)
The big picture
Alan Turing (37), Church, Post: Turing machine
McCulloch and Pitts (43): binary neuron
John von Neumann: von Neumann computer
Two major schools of thought, 1950s-60s:
• symbol manipulation
Intelligent behavior consists of rules to manipulate symbols
(subsymbolic level is overlooked)
• pattern matching, or feature detection
– Hearing, vision, taste, and tactile input to brain
– People develop many context-sensitive models of what to expect as we
interact with the world
[Comparison table]
• examples vs. rules
• parallel vs. serial
• fuzzy vs. boolean
• robust vs. brittle
• general vs. expert
Prolog and Lisp, AI machines
Rule-based expert systems
Mid-1980s: it was realized that this approach was not a full success,
and the work from the 1960s on neural networks was reexamined.
Learning to learn
• Hebb (49), Caianiello (61)
First learning algorithm
• Rosenblatt (62)
Perceptron learning rule
• Minsky & Papert (69)
XOR (CNOT) cannot be learnt by a perceptron
• Little (74), Hopfield (82), ...
Relation to spin glasses
Content-addressable associative memory
• (80s) Kohonen, Carpenter, Grossberg, Rumelhart, Zipser
Unsupervised learning
• Werbos (74), Parker, Rumelhart, Hinton, Williams (85)
Error back-propagation learning
Real vs. artificial neuron
[Diagram: input weights → threshold → activation → output weights]
How does a neural network work?
Multilayer feedforward neural network
[Diagram: layer 1, layer 2, …, layer l]
z_i^{(l)} = f\left( \sum_{j=1}^{n^{(l-1)}} w_{ij}^{(l)} z_j^{(l-1)} + t_i^{(l)} \right)
• The function f can make the response of the neurons non-linear
• The weights w and the thresholds t define the way information is processed in every neuron
• The number of layers and the number of neurons in each layer define the architecture of the neural network (a forward-pass sketch follows below)
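As an illustration of the layer formula above, here is a minimal NumPy sketch of the forward pass z_i^(l) = f( sum_j w_ij^(l) z_j^(l-1) + t_i^(l) ); the 3-4-2 architecture, the sigmoid choice for f, and the random weights are assumptions made only for the example:

import numpy as np

def sigmoid(x):
    # a common non-linear choice for the activation function f
    return 1.0 / (1.0 + np.exp(-x))

def forward(z, weights, thresholds, f=sigmoid):
    # Propagate an input vector z layer by layer:
    # each layer computes z_i = f(sum_j w_ij * z_j_prev + t_i).
    # weights[l] has shape (n_l, n_{l-1}); thresholds[l] has shape (n_l,).
    for W, t in zip(weights, thresholds):
        z = f(W @ z + t)
    return z

# Assumed architecture: 3 inputs -> 4 hidden neurons -> 2 outputs
rng = np.random.default_rng(0)
weights    = [rng.normal(size=(4, 3)), rng.normal(size=(2, 4))]
thresholds = [rng.normal(size=4), rng.normal(size=2)]
print(forward(np.array([0.2, -1.0, 0.5]), weights, thresholds))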
The error back-propagation learning algorithm (1985) is a systematic procedure to adjust the weights and thresholds of a neural network so that it reproduces known example patterns.
No knowledge of the underlying model is necessary.
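One common way to write what back-propagation does at each step (a standard formulation, not spelled out on the slide) is a gradient-descent update of every weight and threshold, where E is the error on the known examples and \eta is a small learning rate:

w_{ij}^{(l)} \leftarrow w_{ij}^{(l)} - \eta \, \frac{\partial E}{\partial w_{ij}^{(l)}}, \qquad t_{i}^{(l)} \leftarrow t_{i}^{(l)} - \eta \, \frac{\partial E}{\partial t_{i}^{(l)}}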
T vs. C?
[Figure: example patterns of the letters T and C to be classified]
Training (supervised learning of T / C)
0. Start from random w and t
1. Feed an example (T)
2. If the output is T: fine. If the output is C: error.
3. Propagate a change of w and t through the net to reduce the error
4. Go to 1
Robust if a neuron dies!
(this loop is sketched in code below)
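A minimal Python sketch of this training loop (steps 0-4). The tiny two-layer network, the squared-error measure, the hand-derived sigmoid gradients, and the XOR toy data are assumptions for illustration only; XOR is not the T/C data of the slide, but it is a fitting toy case since a single perceptron cannot learn it while a small multilayer net can:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy patterns (assumed): XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, t1 = rng.normal(size=(2, 3)), np.zeros(3)   # step 0: random weights and thresholds
W2, t2 = rng.normal(size=(3, 1)), np.zeros(1)
eta = 1.0                                        # learning rate (assumed)

for epoch in range(10000):                       # step 4: go back to step 1
    z1 = sigmoid(X @ W1 + t1)                    # step 1: feed the examples forward
    z2 = sigmoid(z1 @ W2 + t2)
    err = z2 - Y                                 # step 2: output vs. target (fine / error)
    d2 = err * z2 * (1 - z2)                     # step 3: propagate the error backwards
    d1 = (d2 @ W2.T) * z1 * (1 - z1)             #         and adjust w and t to reduce it
    W2 -= eta * (z1.T @ d2);  t2 -= eta * d2.sum(axis=0)
    W1 -= eta * (X.T @ d1);   t1 -= eta * d1.sum(axis=0)

print(np.round(z2.ravel(), 2))   # should approach [0, 1, 1, 0] (depends on the random start)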
Serious pattern recognition
A neural network is trained to recognize military plane patterns.
The NN detects a military plane hidden under a commercial one.
Belgrade, 19/04/1999
Summary
• Nature has tried many problem-solving approaches
• Neural Networks implement inference through learning
• NN: robust, non-linear, adaptable, consolidated; they learn from incomplete or deteriorated data
• Standard in scientific data analysis
