Pit Pattern Classification with Support Vector Machines and
Neural Networks
Christian Kastinger
February 1, 2007
Outline
- Introduction
- Feature Extraction and Selection
- Neural Network
- Support Vector Machine
- Results
- Conclusion
Objectives
Investigation of support vector machines and neural networks for the classification of pit patterns (256 × 256 RGB images), with prior feature extraction (co-occurrence histogram) and feature selection (principal component analysis), for a 2-class and a 6-class problem.
Classification Process
Co-occurrence histogram
- Considers dependencies between adjacent pixels.
- The co-occurrence distance (between two samples) can be varied.
- The orientation (vertical, horizontal, ...) is not fixed.
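A horizontal co-occurrence histogram can be sketched as follows; the bin count, quantization step, and function name are illustrative choices, not taken from the talk:

```python
import numpy as np

def cooccurrence_histogram(img, distance=1, bins=16):
    """Horizontal co-occurrence histogram of a grayscale image.

    Counts how often each pair of intensity bins occurs at pixels
    `distance` apart along the horizontal axis.
    """
    # Quantize intensities into `bins` levels to keep the histogram small.
    q = (img.astype(np.float64) / 256.0 * bins).astype(int).clip(0, bins - 1)
    left = q[:, :-distance].ravel()
    right = q[:, distance:].ravel()
    hist = np.zeros((bins, bins), dtype=np.int64)
    np.add.at(hist, (left, right), 1)  # accumulate one count per pixel pair
    return hist

img = np.random.randint(0, 256, size=(256, 256))
h = cooccurrence_histogram(img, distance=1, bins=16)
print(h.shape)  # (16, 16)
print(h.sum())  # 65280  (256 rows x 255 horizontal pairs)
```

Varying `distance` and transposing the image give the distance and orientation variations mentioned above.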
Principal Component Analysis (PCA)
- PCA aims to provide a better representation with a lower dimension.
- Compaction of information.

Process
- Create a mean-adjusted data matrix X̃.
- Calculate the m × m covariance matrix Σ from the mean-adjusted data X̃.
- Compute the n most significant eigenvectors W of the covariance matrix Σ.
- Perform the dimensionality reduction: Y = Wᵀ X̃.
- n is a tradeoff between "compression" and quality.
PCA Example
- Data points: x = (1, 3, 3, 5), y = (1, 3.5, 0.5, 3)
- Covariance matrix and its eigenvectors (columns of W, ordered by eigenvalue):
    Σ = [ 2.667  1.333 ]        W = [ 0.77   −0.639 ]
        [ 1.333  2.167 ]            [ 0.639   0.77  ]
- Projections of the points onto the principal axes:
    x′ = w₁ᵀ(x, y) = (1.409, 4.546, 2.63, 5.767)
    y′ = w₂ᵀ(x, y) = (0.131, 0.778, −1.532, −0.885)
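The example can be checked numerically; the use of `numpy.linalg.eigh` and the variable names are implementation choices, not from the slides (eigenvector signs may flip depending on the solver):

```python
import numpy as np

# The four 2-D data points from the example.
X = np.array([[1.0, 1.0], [3.0, 3.5], [3.0, 0.5], [5.0, 3.0]])

# Mean-adjust and compute the m x m covariance matrix (here 2 x 2).
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)

# Eigen-decomposition; sort eigenvectors by descending eigenvalue.
vals, vecs = np.linalg.eigh(cov)
order = np.argsort(vals)[::-1]
W = vecs[:, order]

# Project onto the principal axes; keeping only the first column of W
# (n = 1) would halve the dimension, as in the "Process" slide.
Y = X @ W
print(np.round(cov, 3))  # [[2.667 1.333] [1.333 2.167]]
```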
Basic Concept
- Inputs: x_1, ..., x_j ∈ [0, 1]
- Synapses (weights): w_1, ..., w_j ∈ R
- Neuron: net = Σ_j w_j x_j
- Output: y = f(net − θ)
- Bias value: θ
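A minimal sketch of this neuron model; the logistic sigmoid for f is an assumed activation, since the slide does not fix one:

```python
import math

def neuron(x, w, theta):
    """A single neuron: weighted sum of the inputs (net = sum_j w_j * x_j),
    shifted by the bias theta and passed through the activation f."""
    net = sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-(net - theta)))  # f = logistic sigmoid

y = neuron(x=[0.5, 1.0], w=[2.0, -1.0], theta=0.0)
print(y)  # 0.5  (net - theta = 0, sigmoid(0) = 0.5)
```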
Multi-Layer Network
- The layer weights are adjusted during learning based on input/output patterns.
- Learning typically starts at the output layer and moves toward the input layer (back-propagation).
Mathematical Model
- Activation of the hidden layer:
    net_j = Σ_i w_ji x_i
- Output of the hidden layer:
    y_j = f(net_j) = f(Σ_i w_ji x_i)
- Activation of the output layer:
    net_k = Σ_j w_kj f(net_j)
- Net output:
    y_k = f(net_k) = f(Σ_j w_kj f(Σ_i w_ji x_i)) = f(Σ_j w_kj y_j)
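The four formulas above amount to a forward pass through one hidden layer; the layer sizes and the sigmoid activation below are illustrative assumptions, not values from the talk:

```python
import numpy as np

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

def forward(x, W_hidden, W_out):
    """Forward pass: net_j = sum_i w_ji x_i, y_j = f(net_j);
    net_k = sum_j w_kj y_j, y_k = f(net_k)."""
    y_hidden = sigmoid(W_hidden @ x)   # hidden-layer outputs y_j
    return sigmoid(W_out @ y_hidden)   # net outputs y_k

rng = np.random.default_rng(0)
x = rng.random(4)                       # 4 inputs
W_hidden = rng.standard_normal((3, 4))  # weights w_ji, 3 hidden neurons
W_out = rng.standard_normal((2, 3))     # weights w_kj, 2 output neurons
y = forward(x, W_hidden, W_out)
print(y.shape)  # (2,)
```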
Basic Concept
- A support vector machine is a learning algorithm that attempts to separate patterns by a hyperplane defined through:
  - the normal vector w,
  - the offset parameter b.
What is an Optimal Hyperplane?
Separation with Maximal Margin
- Support vectors are all points lying on the margin closest to the hyperplane.
Kernel Trick
- Nonlinear and complex separation in the 2-dimensional input space.
- Easier, and often linear, separation in a higher-dimensional feature space.
Kernel Examples
- Linear kernel: k(x, x′) = xᵀx′ = ⟨x, x′⟩
- Polynomial kernel of degree d: k(x, x′) = (γ⟨x, x′⟩ + coef0)^d
- Radial basis function (RBF) kernel: k(x, x′) = exp(−γ ‖x − x′‖²)
- MLP or sigmoid kernel: k(x, x′) = tanh(γ⟨x, x′⟩ + coef0)
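The four kernels can be written directly; the parameterization below follows the LIBSVM convention cited in the bibliography, and the default parameter values are arbitrary:

```python
import numpy as np

def linear(x, z):
    return x @ z                                   # <x, z>

def polynomial(x, z, gamma=1.0, coef0=0.0, degree=3):
    return (gamma * (x @ z) + coef0) ** degree     # (gamma<x,z> + coef0)^d

def rbf(x, z, gamma=0.5):
    return np.exp(-gamma * np.sum((x - z) ** 2))   # exp(-gamma ||x - z||^2)

def sigmoid_kernel(x, z, gamma=1.0, coef0=0.0):
    return np.tanh(gamma * (x @ z) + coef0)        # tanh(gamma<x,z> + coef0)

x = np.array([1.0, 0.0])
z = np.array([1.0, 0.0])
print(linear(x, z), rbf(x, z))  # 1.0 1.0  (identical points -> RBF = 1)
```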
Classification Principle
Mathematical Model
- Dual optimization problem:
    max_{α ∈ R^m} W(α) = Σ_{i=1}^m α_i − (1/2) Σ_{i,j=1}^m α_i y_i α_j y_j k(x_i, x_j)
  subject to α_i ≥ 0 for all i = 1, ..., m, and Σ_{i=1}^m α_i y_i = 0
- Decision function:
    f(x) = sgn(Σ_{i=1}^m α_i y_i ⟨φ(x), φ(x_i)⟩ + b) = sgn(Σ_{i=1}^m α_i y_i k(x, x_i) + b)
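The decision function is straightforward to evaluate once the multipliers α_i, labels y_i, and offset b are known; the toy values below are hand-picked for illustration, not a trained model:

```python
import numpy as np

def rbf(x, z, gamma=0.5):
    return np.exp(-gamma * np.sum((x - z) ** 2))

def decision(x, support_vectors, alphas, labels, b, kernel=rbf):
    """f(x) = sgn( sum_i alpha_i * y_i * k(x, x_i) + b )"""
    s = sum(a * y * kernel(x, xi)
            for a, y, xi in zip(alphas, labels, support_vectors))
    return int(np.sign(s + b))

# Two hand-picked "support vectors", one per class.
sv = [np.array([1.0, 1.0]), np.array([-1.0, -1.0])]
alphas = [1.0, 1.0]
labels = [+1, -1]
print(decision(np.array([0.9, 1.1]), sv, alphas, labels, b=0.0))  # 1
```

A point near the positive support vector gets a dominant positive kernel term, so the sign comes out +1.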
Multi-Class Approaches
- Decomposition of the multi-class problem into several binary classification tasks.
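One common decomposition is one-vs-rest (the slide does not name a specific scheme); a label-decomposition sketch, with helper name and data chosen for illustration:

```python
import numpy as np

def one_vs_rest_labels(y, classes):
    """Decompose a multi-class label vector into one binary task per class:
    +1 for samples of the class in question, -1 for all the rest."""
    return {c: np.where(y == c, 1, -1) for c in classes}

y = np.array([0, 1, 2, 1, 0, 2])          # e.g. labels of a 3-class problem
tasks = one_vs_rest_labels(y, classes=[0, 1, 2])
print(tasks[1])  # [-1  1 -1  1 -1 -1]
```

Each binary label vector then trains one binary SVM; at prediction time, the class whose classifier gives the largest decision value wins.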
SVM Optimization
PCA Dependency
Co-occurrence Distance Dependency
PCA Dependency for 6 Classes
Additional Investigations
- Color histogram (3-dimensional).
  - Too high data dimension.
  - Low classification results due to the high data compression.
- Vertical co-occurrence histogram.
  - 3-5% lower classification results compared with the horizontal histogram.
- Combination of the horizontal and vertical co-occurrence histograms.
  - Lower classification results than with the horizontal histogram alone.
Problems
- High data dimension.
- Data scaling.
- Time-intensive parameter optimization.
- The SVM accepts invalid input.
- Too few training samples.
Summary
- The SVM provides 10% better results than the NN.
- The SVM parameters have to be optimized carefully.
- Better classification with PCA due to the higher compaction of information.
- The co-occurrence distance has a low impact on the classification accuracy.
Outlook
- Evaluation of the SVM with a larger number of pit patterns.
- Consideration of other feature extraction and selection strategies.
- Investigation of other neural network topologies.
Bibliography
- R.O. Duda, P.E. Hart and D.G. Stork. Pattern Classification, 2nd ed. New York: John Wiley & Sons, 2001.
- C.C. Chang and C.J. Lin. LIBSVM: a library for support vector machines. URL: http://www.csie.ntu.edu.tw/~cjlin/libsvm, [Feb. 24, 2006].
- N. Cristianini and J. Shawe-Taylor. An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge: Cambridge University Press, 2000.
- B. Schoelkopf and A.J. Smola. Learning with Kernels. Cambridge, MA: MIT Press, 2002.
- J.E. Jackson. A User's Guide to Principal Components. John Wiley & Sons, 2003.
- P. Chang and J. Krumm. Object recognition with color cooccurrence histograms. In Proceedings of CVPR '99, 1999.
Questions?
Thank you for your attention.