(Generalized) Polynomial Chaos representation
Didier Lucor
Laboratoire de Modélisation en Mécanique
UPMC Paris VI, www.lmm.jussieu.fr
Outline
Context & Motivation
Polynomial Chaos and generalized Polynomial
Chaos Expansions
Limitations & Difficulties of the method
Possible applications
Interaction of model and data
Roger Ghanem
(University of Southern California, USA)
Uncertainty Quantification (UQ)
Modeling errors/uncertainties, numerical errors, and data errors/uncertainties can interact; this creates the need for uncertainty quantification.
We need to assess the impact of uncertain data on simulation outputs.
When a reference solution is lacking, the validity of the model can be established only if the uncertainty in the numerical predictions due to uncertain input parameters can be quantified.
Difficulty: instead of looking for the unique solution of a single deterministic problem, one is now interested in finding and parameterizing the space of all possible solutions spanned by the uncertain parameters.
Sources of uncertainty
Uncertain flow at the boundary (stochastic process)
Simulation parameters/constants, operating conditions
Uncertain structural parameters
Transport coefficients, physical properties
Geometry
Boundary conditions, initial conditions
Uncertain boundary conditions
Constitutive laws, numerical schemes
Representation of Random Processes
Statistical (non-deterministic) methods
Monte Carlo: convergence in 1/√N; the convergence rate does not depend on the number of random variables.
Monte Carlo and its variants: descriptive sampling, Latin hypercube and optimal Latin hypercube (Latin Hypercube Sampling), Quasi-Monte Carlo (QMC), Markov Chain Monte Carlo (MCMC), importance sampling, correlated sampling, conditional sampling, variance reduction techniques, Response Surface Method (RSM).
Non-statistical (direct) methods
Taylor series expansion or perturbation method (1st or 2nd order).
Iterative method or Neumann series, and weighted integral method.
Spectral methods and orthogonal expansion methods: Polynomial Chaos (PC, homogeneous chaos, Hermite chaos), generalized Polynomial Chaos (gPC, Askey chaos), Karhunen-Loève expansion.
Wiener, The homogeneous chaos, Amer. J. Math., 60 (1938).
Ghanem & Spanos, Stochastic Finite Elements: A Spectral Approach, Springer-Verlag (1991).
Loève, Probability Theory, 4th ed., Springer-Verlag (1977).
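A minimal Monte Carlo sketch (illustrative; the integrand is a made-up example, NumPy assumed available) of the 1/√N error decay quoted above:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_mean(n):
    # Estimate E[f(U)] with U ~ Uniform(0, 1) and f(u) = u**2 (exact mean: 1/3).
    u = rng.random(n)
    return np.mean(u**2)

# The error shrinks roughly like 1/sqrt(N), independently of the
# number of random dimensions -- the strength (and slowness) of MC.
for n in (100, 10_000, 1_000_000):
    est = mc_mean(n)
    print(n, est, abs(est - 1 / 3))
```

The same scaling holds in any dimension, which is why the variants listed above (LHS, QMC, variance reduction) aim at shrinking the constant or improving the rate.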
Spectral modeling of uncertainty
• Concept:
A probabilistic approach which considers that the uncertainty generates new dimensions and that the solution depends on these dimensions.
The solution is represented as a convergent expansion built by projection onto a spectral basis; the coefficients are computed by means of projections.
• Advantages:
Efficient measure of the sensitivity of the solution to the uncertain input parameters.
An explicit form of the solution is obtained, along with its moments and PDF.
• Applications:
Stochastic elastic structural mechanics, flow in porous media, Navier-Stokes equations, thermal problems, combustion and reacting flows, seismology, micro-fluidics and electrochemistry.
Polynomial Chaos (Wiener 1938)
(After Karniadakis, Su, Xiu, Lucor, Schwab, and Todor.)

Let (Ω, A, P) be a probability space, where Ω is the event space, A ⊆ 2^Ω its σ-algebra, and P its probability measure.

A random field X : Ω → V is a second-order random field over a Hilbert space V if

E‖X‖² = E(X, X) < ∞,

where E denotes the expectation of a random variable Y ∈ L¹(Ω, ℝ), defined by

EY = ∫_{ω∈Ω} Y(ω) dP(ω).

Over an interval, random fields are stochastic processes. For stochastic partial differential equations, V is often a space of generalized functions in a physical domain D ⊂ ℝᵈ, d = 1, 2, 3. In all examples considered here, V is a Hilbert space with dual V′, norm ‖·‖, and inner product (·, ·) : V × V → ℝ. As V is densely embedded in V′, we abuse notation and denote by (·, ·) also the V × V′ duality pairing.

Generalized polynomial chaos (gPC) is a means of representing second-order stochastic processes X(ω), i.e. mappings X : Ω → V from the probability space into a function space, parametrically through a set of random variables {ξ_j(ω)}_{j=1}^N, N ∈ ℕ, through the events ω ∈ Ω:

X(ω) = Σ_{k=0}^∞ a_k Φ_k(ξ(ω)).    (2.1)

Here {Φ_j(ξ(ω))} are orthogonal polynomials in terms of a zero-mean random vector ξ := {ξ_j(ω)}_{j=1}^N, satisfying the orthogonality relation

(Φ_i Φ_j) = (Φ_i²) δ_ij,    (2.2)

where δ_ij is the Kronecker delta and (·, ·) denotes the ensemble average. The number of random variables N ∈ ℕ is in general infinite, and so is the index in (2.1). In practice, however, we need to retain a finite set of random variables, i.e. {ξ_j}_{j=1}^N with N < ∞, and a finite-term truncation of (2.1).

Cameron & Martin theorem (1947): the homogeneous PC expansion converges for any functional in L². Written out term by term, X can be expressed as a function of the events ω ∈ Ω as

X(ω) = a₀ Φ₀
     + Σ_{i1=1}^∞ a_{i1} Φ₁(ξ_{i1}(ω))
     + Σ_{i1=1}^∞ Σ_{i2=1}^{i1} a_{i1 i2} Φ₂(ξ_{i1}(ω), ξ_{i2}(ω))
     + Σ_{i1=1}^∞ Σ_{i2=1}^{i1} Σ_{i3=1}^{i2} a_{i1 i2 i3} Φ₃(ξ_{i1}(ω), ξ_{i2}(ω), ξ_{i3}(ω)) + ···
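As a concrete one-dimensional sketch of (2.1)-(2.2) (illustrative, not from the slides; NumPy assumed available): project X = exp(ξ), ξ ~ N(0, 1), onto the probabilists' Hermite polynomials He_k by Gauss quadrature. For this choice the exact coefficients are a_k = e^{1/2}/k!.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He  # probabilists' Hermite He_k

# Gauss nodes/weights for the weight exp(-x^2/2); the raw weights sum
# to sqrt(2*pi), so normalize them into a probability measure.
x, w = He.hermegauss(40)
w = w / np.sqrt(2.0 * np.pi)

def pc_coeff(f, k):
    """k-th Hermite-chaos coefficient a_k = E[f(xi) He_k(xi)] / E[He_k^2],
    using E[He_k^2] = k! for a standard Gaussian xi."""
    Hk = He.hermeval(x, [0.0] * k + [1.0])
    return float(np.sum(w * f(x) * Hk)) / math.factorial(k)

# Example: X = exp(xi) has the exact coefficients a_k = exp(1/2) / k!
coeffs = [pc_coeff(np.exp, k) for k in range(6)]
```

The projection formula is generic: replacing `np.exp` by any square-integrable functional of ξ yields its Hermite-chaos coefficients.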
Polynomial Chaos (continued)

X(ω) = Σ_{k=0}^∞ a_k Φ_k(ξ(ω)).

Spectral expansion on an orthogonal (in the mean sense: ⟨Φ_i, Φ_j⟩ = 0 if i ≠ j) Hermite polynomial basis Φ_k.
ξ is here a "random array" of independent Gaussian random variables of the random event ω.
Once computed, the knowledge of the coefficients a_k fully determines the random process X(ω).
This concept can be generalized to other, non-normal measures.
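A small numerical check (illustrative; NumPy assumed available) of the mean-sense orthogonality ⟨Φ_i, Φ_j⟩ = 0 for i ≠ j, using the probabilists' Hermite polynomials, for which the ensemble average is E[He_i He_j] = i! δ_ij:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Gauss quadrature for the Gaussian weight exp(-x^2/2), normalized
# so the weights form a probability measure.
x, w = He.hermegauss(30)
w = w / np.sqrt(2.0 * np.pi)

def inner(i, j):
    # Ensemble average E[He_i(xi) He_j(xi)] with xi ~ N(0, 1).
    Hi = He.hermeval(x, [0.0] * i + [1.0])
    Hj = He.hermeval(x, [0.0] * j + [1.0])
    return float(np.sum(w * Hi * Hj))

# (Phi_i Phi_j) = (Phi_i^2) delta_ij, with (He_i^2) = i! here.
```

Since the integrands are polynomials, the 30-point Gauss rule evaluates these averages exactly (up to rounding) for the low orders checked.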
Polynomial Chaos (truncated form)

X(x, t, ξ) = X(x, t, ξ₁, ξ₂, …, ξ_N) ≈ Σ_{j=0}^M X_j(x, t) Φ_j(ξ),    (2.5)

where ξ = (ξ₁, …, ξ_N) is an N-dimensional vector with independent random variables ξ_i. If we denote the highest order of the polynomials {Φ} as P, then the number of expansion terms (M + 1) is

(M + 1) = (N + P)! / (N! P!).    (2.6)

The N-dimensional generalized polynomial chaos expansion {Φ_j, j = 0, …, M} is constructed as the tensor product of the corresponding one-dimensional expansions. Note: in the one-dimensional case (N = 1), M = P.

ξ is the set of vectors spanning the process, and the orthogonal basis Φ_j is a set of polynomials with degree at most equal to P. The orthogonality relation gives:

⟨Φ_i(ξ), Φ_j(ξ)⟩ = ∫ Φ_i(ξ) Φ_j(ξ) w(ξ) dξ = 0   if i ≠ j,

with the inner product defined as:

(f(ξ) g(ξ)) = ∫_{ω∈Ω} f(ξ) g(ξ) dP(ω) = ∫ f(ξ) g(ξ) w(ξ) dξ.

We now consider a general setting for a differential equation with random inputs:

L(x, t, ω; u) = f(x, t; ω),   x ∈ D(Λ), t ∈ (0, T), ω ∈ Ω,    (2.7)

where L is a differential operator, D(Λ) ⊂ ℝᵈ (d = 1, 2, 3) a bounded domain, and T > 0. (Ω, A, P) is an appropriately defined complete probability space: Ω is the event space, A ⊆ 2^Ω its σ-algebra, and P the probability measure; u := u(x, t; ω) is the solution and f(x, t; ω) the source term. The general procedure of applying polynomial chaos consists of the following steps: represent the random inputs by a finite number of random variables ξ(ω) = {ξ₁(ω), …, ξ_N(ω)}, and rewrite the stochastic problem parametrically.
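The count (2.6) is simply a binomial coefficient; a quick sketch using only the standard library:

```python
from math import comb, factorial

def n_terms(N, P):
    # (M + 1) = (N + P)! / (N! P!) = C(N + P, P):
    # the number of N-variate polynomials of total degree <= P.
    assert comb(N + P, P) == factorial(N + P) // (factorial(N) * factorial(P))
    return comb(N + P, P)

# e.g. N = 2 random dimensions at order P = 2 give 6 basis terms,
# and the curse of dimensionality is visible as N and P grow.
```

Note the factorial growth: for N = 10 and P = 4 the basis already has 1001 terms, which is the main practical limitation of the method.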
Example of multi-dimensional homogeneous PC expansion
An example of a multi-dimensional Polynomial Chaos expansion (Wiener 1938), with N = 2; here ξ := {ξ₁, ξ₂}. If ξ is a Gaussian vector, the weight function becomes:

w(ξ₁, ξ₂) = (1 / 2π) exp(−ξ₁²/2) exp(−ξ₂²/2).

[Figure: zeroth- and first-order, and second-order two-dimensional Hermite polynomials Ψ₀, …, Ψ₅, plotted over (ξ₁, ξ₂) ∈ [−3, 3]².]
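The two-dimensional basis Ψ₀, …, Ψ₅ shown in the figure can be built as tensor products He_i(ξ₁) He_j(ξ₂) with total degree i + j ≤ P; a sketch (NumPy assumed available; the graded ordering of the Ψ's is an illustrative choice, not necessarily the one used in the slides):

```python
import numpy as np
from numpy.polynomial import hermite_e as He

def psi_2d(P):
    """2-D Hermite-chaos basis: all He_i(x1) * He_j(x2) with i + j <= P."""
    basis = []
    for deg in range(P + 1):           # graded ordering: degree 0, 1, ..., P
        for i in range(deg, -1, -1):   # within a degree: He_i(x1) He_{deg-i}(x2)
            j = deg - i
            basis.append(lambda x1, x2, i=i, j=j:
                         He.hermeval(x1, [0.0] * i + [1.0])
                         * He.hermeval(x2, [0.0] * j + [1.0]))
    return basis

psi = psi_2d(2)   # 6 functions, matching (M + 1) = (2 + 2)!/(2! 2!) = 6
# psi[0] is the constant 1; the (i, j) = (1, 1) term is simply x1 * x2.
```

Evaluating these on a grid over [−3, 3]² reproduces surfaces like the Ψ₀-Ψ₅ panels of the figure.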
Example of multi-dimensional homogeneous PC expansion (continued)
Third-order polynomials:

[Figure: third-order two-dimensional Hermite polynomials Ψ₆, …, Ψ₉, plotted over (ξ₁, ξ₂) ∈ [−3, 3]².]

Polynomial Chaos Expansions, INSTN-13/05/04
Hermite-Chaos Expansion of Beta Distribution
Uniform distribution: exact PDF and the PDFs of the 1st-, 3rd-, and 5th-order Hermite-Chaos expansions.

Hermite-Chaos Expansion of Gamma Distribution
Exponential distribution: PDF of the exponential distribution and of the 1st-, 3rd-, and 5th-order Hermite-Chaos expansions.

Exponential Input: Laguerre (optimal) vs. Hermite
Convergence with respect to the number of expansion terms.
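To reproduce this kind of comparison, a non-Gaussian variable can be projected onto the Hermite basis through its transform from a Gaussian; a sketch (illustrative; standard library + NumPy) for X ~ Exp(1), written as X = −ln F_N(ξ) with F_N the standard normal CDF:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

x, w = He.hermegauss(60)
w = w / np.sqrt(2.0 * np.pi)   # normalized Gaussian quadrature weights

def norm_cdf(t):
    # erfc form avoids underflow of the CDF in the far left tail.
    return 0.5 * math.erfc(-t / math.sqrt(2.0))

# X = -ln(F_N(xi)) is Exp(1)-distributed when xi ~ N(0, 1).
fx = np.array([-math.log(norm_cdf(t)) for t in x])

def a(k):
    # Hermite-chaos coefficient a_k = E[X He_k(xi)] / k!
    Hk = He.hermeval(x, [0.0] * k + [1.0])
    return float(np.sum(w * fx * Hk)) / math.factorial(k)

# a(0) approximates E[X] = 1; truncating at orders 1, 3, 5 and sampling
# the truncated series gives the PDFs compared in the slides.
```

An optimal (Laguerre) basis would represent this input exactly at first order, which is the point of the Laguerre-vs-Hermite convergence comparison above.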
Polynomial Chaos (Generalized)

Spectral representation of a second-order stochastic process:

X(ω) = Σ_{k=0}^∞ a_k Φ_k(ξ(ω))

a_k: deterministic spectral coefficients to be computed, which completely determine the process.
Φ_k: orthogonal functions (trigonometric, wavelets, …, polynomials).
ξ: independent random variables (with prescribed statistical distributions); not limited to a Gaussian distribution!

Inner product:

(f(ξ) g(ξ)) = ∫_{ω∈Ω} f(ξ) g(ξ) dP(ω) = ∫ f(ξ) g(ξ) w(ξ) dξ,    (2.3)

with w(ξ) denoting the density of the law dP(ω) with respect to the Lebesgue measure dξ, and with integration taken over a suitable domain, determined by the range of the random vector ξ. In the discrete case, the above orthogonality relation takes the form

(f(ξ) g(ξ)) = Σ f(ξ) g(ξ) w(ξ).    (2.4)

In the representation (2.1), there is a one-to-one correspondence between the type of the orthogonal polynomials {Φ} and the law of the random variables ξ.
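Since the basis is not limited to the Gaussian/Hermite pairing, it can be matched to the law of ξ; a sketch (illustrative; NumPy assumed available) of a Legendre-chaos projection for ξ ~ U(−1, 1), the pairing prescribed by the Askey scheme:

```python
import numpy as np
from numpy.polynomial import legendre as L

# Gauss-Legendre nodes/weights on [-1, 1]; the raw weights sum to 2,
# so divide by 2 to get the uniform probability measure on [-1, 1].
x, w = L.leggauss(40)
w = w / 2.0

def legendre_coeff(f, k):
    # a_k = E[f(xi) P_k(xi)] / E[P_k^2], with E[P_k^2] = 1 / (2k + 1)
    # under the uniform measure on [-1, 1].
    Pk = L.legval(x, [0.0] * k + [1.0])
    return float(np.sum(w * f(x) * Pk)) * (2 * k + 1)

# Example: xi**2 = (1/3) P_0(xi) + (2/3) P_2(xi) exactly, so the
# Legendre chaos of a polynomial input terminates at finite order.
```

For a uniform input, this Legendre (optimal) basis converges exponentially, whereas forcing a Hermite basis on the same input converges much more slowly, as in the Laguerre-vs-Hermite comparison earlier.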
satisfying
the
orthogonality
relation
Generalized
polynomial
chaos
(gPC)
is
a
means
of
representing
j=1
∞
"
ic processestic
X(ω)
parametrically
through a setthrough
of random
variables
{ξj (ω)}
processes
X(ω)X(ω)
parametrically
a
set
of
random
vari
=
a
Φ
(ξ(ω)).
(2.
k k
2
(Φ
Φ
)
=
(Φ
)δ
,
(2.2)
N, through the
events
ω
∈
Ω:
i
j
ij
i
N, through the events ω ∈ Ω:
Polynômes de Chaos (Généralisés)
k=0
∞
s the
Kronecker
delta
and
(·,
·)
denotes
the
ensemble
average.
The
number
"
∞
Here {Φj (ξ(ω))} are orthogonal polynomials
in terms
of
a
zero-mean
random vect
"
N is in general infinite,
X(ω)
=
a
Φ
(ξ(ω)).
variables
∈N
so
is
the
index
in
(2.1).
In
practice,
k
k
ξ := {ξjN
(ω)}
,
satisfying
the
orthogonality
relation
X(ω) =
a k Φk N
(ξ(ω)).
j=1
we need to retain a finite set of randomk=0
variables, i.e., to {ξj }j=1 with
k=0
2
(2.
nd a finite-term truncation of (2.1). (Φi Φj ) = (Φi )δij ,
Hereproduct
{Φj (ξ(ω))}
are
orthogonal
polynomials
in
terms
of
a
zero-mean
random
nner
in
(2.2)
is
in
the
Hilbert
space
determined
by
the
measure
of
Here
{Φ
(ξ(ω))}
are
orthogonal
polynomials
in
terms
of
a
zero
j
N the Kronecker delta and (·, ·) denotes the ensemble average. The numb
where
δ
is
ij
ξm:=
{ξ
(ω)}
,
satisfying
N the orthogonality relation
j
variables
j=1
ξ
:=
{ξ
(ω)}
,
satisfying
the
orthogonality
relation
n’est
pas
limité
à
une
distribution
gaussienne!
j
j=1
of random variables
N
∈
N
is
in
general
infinite,
so
is
the
index
in (2.1). In practic
!
!
N
2
however,
we
need
to
retain
a
finite
set
of
random
variables,
i.e.,
to
{ξ
}
wit
j
j=1
(Φ=
) f=(ξ)g(ξ)w(ξ)dξ
(Φi )δij , entre la2 fonction
f (ξ)g(ξ)dP Etroite
(ω)
(2.3) de
(f (ξ)g(ξ)) =
i Φjcorrespondance
N < ∞, and a finite-term
truncation of (2.1). (Φi Φj ) = (Φi )δij ,
ω∈Ω
poids
du
polynôme
choisi
et
la
densité
de
The
inner
product
in
(2.2)
is
in
the
Hilbert
space
determined
by
the
measure
where
δ
is
the
Kronecker
delta
and
(·,
·)
denotes
the
ensemble
average.
The
ij
probabilité
de
l’incertitude
denoting
the where
density
law Kronecker
dP (ω) with delta
respectand
to the
δof
is the
(·, Lebesgue
·) denotesmeasure
the ensemble
the random
variables
ij the
ofh random
variables
N∈
N is! in domain,
general determined
infinite, !so by
is the
the range
indexofinthe
(2.1). In
integration
taken
over
a
suitable
of random variables N ∈ N is in general infinite, so is the indexN
however,
we
need
to
retain
a
finite
set
of
random
variables,
i.e.,
to
{ξ
}
j
ctor
ξ.
produit interne:
= set
f (ξ)g(ξ)w(ξ)dξ
(2.j
(f (ξ)g(ξ))
=
however,
we need
tof (ξ)g(ξ)dP
retain a (ω)
finite
of random variables,
ω∈Ω relation
Ndiscrete
< ∞, and
finite-term
truncation
of (2.1).
case, athe
above orthogonal
takes the form
N < ∞, and a finite-term truncation of (2.1).
The
inner
product
in
(2.2)
is
in
the
Hilbert
space
determined
by
the
me
"
with w(ξ) denoting
the
density
of the in
law(2.2)
dP (ω)iswith
respect
to thespace
Lebesgue
measu
The
inner
product
in
the
Hilbert
determin
f (ξ)g(ξ)w(ξ).
(2.4)
(f (ξ)g(ξ)) =
he dξ
random
variables
and with integration taken over a suitable domain, determined by the range of th
the random variables
ξ
!
!
random vector ξ.
!
!
In
the
discrete
case,
the
above
orthogonal
relation
takes
the
form
f
(ξ)g(ξ)dP
(ω)
=
f
(ξ)g(ξ)w(ξ)dξ
(f
(ξ)g(ξ))
=
ation (2.1), there is a one-to-one correspondence between the type of the
(f (ξ)g(ξ))
=
ω∈Ω
f (ξ)g(ξ)dP (ω) =
f (ξ)g(ξ)
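As a small numerical sketch (not part of the original slides), the orthogonality relation (2.2)–(2.3) can be checked for Legendre polynomials, whose weight w(ξ) = 1/2 on [−1, 1] is exactly the uniform probability density; the helper name `phi` is ours.

```python
import numpy as np

# Check the orthogonality relation (2.2) for Legendre polynomials,
# whose weight w(xi) = 1/2 on [-1, 1] is the uniform PDF.
nodes, weights = np.polynomial.legendre.leggauss(20)  # Gauss-Legendre rule

def phi(i, x):
    """i-th Legendre polynomial evaluated at x."""
    c = np.zeros(i + 1); c[i] = 1.0
    return np.polynomial.legendre.legval(x, c)

# (Phi_i Phi_j) = integral of Phi_i Phi_j w(xi) dxi, with w = 1/2
G = np.array([[np.sum(weights * 0.5 * phi(i, nodes) * phi(j, nodes))
               for j in range(5)] for i in range(5)])

# off-diagonal inner products vanish; (Phi_i^2) = 1/(2i+1) for Legendre
assert np.allclose(G - np.diag(np.diag(G)), 0.0, atol=1e-12)
```

The diagonal entries reproduce the known norms (Φ_i²) = 1/(2i + 1) of the Legendre family under the uniform measure.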
Hypergeometric Orthogonal Polynomials
• Generalized hypergeometric series
• Pochhammer symbol
• The infinite series converges under certain conditions.
• Examples: 0F0 is the exponential series; 1F0 is the binomial series.
• If one of the ai's is a negative integer (−n), the series terminates at the nth term and becomes a hypergeometric orthogonal polynomial.
• Limit relations hold between neighboring families in the Askey scheme.
The Askey Scheme of Hypergeometric Polynomials
Hypergeometric Orthogonal Polynomials
• Orthogonal polynomials
• Three-term recurrence:
• Favard’s inverse theorem
• Orthogonality:
• Weighting functions and PDFs:
 Continuous:
 Discrete:
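The three-term recurrence mentioned above can be verified numerically; as a concrete instance (our choice, not the slide's), the Legendre family satisfies (i+1) P_{i+1}(x) = (2i+1) x P_i(x) − i P_{i−1}(x).

```python
import numpy as np

# Verify the Legendre three-term recurrence
#   (i+1) P_{i+1}(x) = (2i+1) x P_i(x) - i P_{i-1}(x);
# every family in the Askey scheme satisfies such a recurrence.
x = np.linspace(-1.0, 1.0, 101)

def P(i, t):
    """i-th Legendre polynomial evaluated at t."""
    c = np.zeros(i + 1); c[i] = 1.0
    return np.polynomial.legendre.legval(t, c)

for i in range(1, 8):
    lhs = (i + 1) * P(i + 1, x)
    rhs = (2 * i + 1) * x * P(i, x) - i * P(i - 1, x)
    assert np.allclose(lhs, rhs)
```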
Orthogonal Polynomials and Probability Distributions
 Continuous cases:
• Hermite polynomials: Gaussian distribution
• Laguerre polynomials: Gamma distribution (special case: exponential distribution)
• Jacobi polynomials: Beta distribution
• Legendre polynomials: Uniform distribution
Orthogonal Polynomials and Probability Distributions
 Discrete cases:
• Charlier polynomials: Poisson distribution
• Krawtchouk polynomials: Binomial distribution
• Hahn polynomials: Hypergeometric distribution
• Meixner polynomials: Pascal (negative binomial) distribution
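The Hermite/Gaussian pairing from this table can be checked directly: probabilists' Hermite polynomials He_i are orthogonal under the standard normal PDF, with ⟨He_i He_j⟩ = i! δ_ij. A sketch (the helper `he` is ours):

```python
import math
import numpy as np

# Wiener-Askey correspondence check: probabilists' Hermite polynomials
# are orthogonal under the standard Gaussian PDF (their weight function).
x, w = np.polynomial.hermite_e.hermegauss(30)  # weight exp(-x^2/2)
w = w / math.sqrt(2 * math.pi)                 # normalize to the N(0,1) PDF

def he(i, x):
    """i-th probabilists' Hermite polynomial He_i evaluated at x."""
    c = np.zeros(i + 1); c[i] = 1.0
    return np.polynomial.hermite_e.hermeval(x, c)

M = np.array([[np.sum(w * he(i, x) * he(j, x)) for j in range(5)]
              for i in range(5)])

# <He_i He_j> = i! * delta_ij under the Gaussian measure
assert np.allclose(M, np.diag([math.factorial(i) for i in range(5)]), atol=1e-9)
```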
KARNIADAKIS, SU, XIU, LUCOR, SCHWAB, AND TODOR

Polynomial Chaos (Summary)

    X(ω) = Σ_{j=0}^{M} a_j Φ_j(ξ(ω)),    (2.5)

where ξ = (ξ_1, ..., ξ_N)^T is an N-dimensional random vector with ξ_i independent of ξ_j for all 1 ≤ i ≠ j ≤ N. If we denote the highest order of polynomial {Φ} as P, then the total number of expansion terms (M + 1) is

    (M + 1) = (N + P)! / (N! P!).    (2.6)

Here N is the dimension of the probability space and P the highest polynomial order; the input distribution is not limited to a Gaussian!

The multi-dimensional generalized polynomial chaos expansion is constructed as the tensor product of the corresponding one-dimensional expansions. Note that in one-dimensional expansions (N = 1), M = P.

Mean:     X̄(ω) = <X(ω)> = a_0   (since Φ_0(ξ) = 1).
Variance: var(X(ω)) = <(X(ω) − X̄(ω))²> = Σ_{j=1}^{M} a_j² <Φ_j²>.

Example (Gaussian input distribution, Hermite polynomials, N = 2, P = 2):

    Φ_0(ξ) = 1,  Φ_1(ξ) = ξ_1,  Φ_2(ξ) = ξ_2,
    Φ_3(ξ) = ξ_1² − 1,  Φ_4(ξ) = ξ_2² − 1,  Φ_5(ξ) = ξ_1 ξ_2.
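The dimension formula (M + 1) = (N + P)!/(N! P!) can be checked with a short script; the function names `gpc_size` and `multi_indices` are ours, introduced for illustration.

```python
from itertools import product
from math import comb

def gpc_size(N, P):
    """Number of gPC terms M+1 = (N+P)!/(N!P!) for N variables, total order P."""
    return comb(N + P, P)

def multi_indices(N, P):
    """All multi-indices (one degree per variable) with total degree <= P."""
    return [a for a in product(range(P + 1), repeat=N) if sum(a) <= P]

# matches the N = 2, P = 2 Hermite example: 6 basis terms,
# e.g. the index (1, 1) corresponds to xi1*xi2 (Phi_5 above)
assert gpc_size(2, 2) == 6
assert len(multi_indices(2, 2)) == 6
```

Each multi-index maps to one product of one-dimensional polynomials, which is the tensor-product construction described above.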
Using PC to solve a stochastic differential equation

INTRUSIVE approach (method of weighted residuals)

Let us now consider a general setting for a differential equation with random inputs:

    L(x, t, ω; u) = f(x, t; ω),    x ∈ D(Λ), t ∈ (0, T), ω ∈ Ω,    (2.7)

where L is a differential operator, D(Λ) ⊂ ℝ^d (d = 1, 2, 3) a bounded domain with diameter Λ > 0, and T > 0. (Ω, A, P) is an appropriately defined complete probability space, where A ⊆ 2^Ω is the σ-algebra and P the probability measure; u := u(x, t; ω) is the solution and f(x, t; ω) is the source term. The general procedure of applying the generalized polynomial chaos consists of the following steps:

1. Discretize the random process: express the random inputs by a finite number of (independent) random variables ξ(ω) = {ξ_1(ω), ..., ξ_N(ω)}, and rewrite the stochastic problem parametrically as

    L(x, t, ξ; u(x, t; ξ)) = f(x, t; ξ).    (2.8)

This step is trivial when the random inputs already take the form of random variables. When the random inputs are random fields, a decomposition technique is needed. One popular choice of such decomposition is the Karhunen-Loève expansion, which will be discussed in the following section.

2. Write the solution and the uncertain input parameters as finite PC sums: approximate them by the finite-term polynomial chaos expansions (2.5) and substitute the expanded variables into the equation,

    L(x, t, ξ(ω); Σ_{i=0}^{M} u_i Φ_i(ξ(ω))) = f(x, t; ξ(ω)).

3. Perform a Galerkin projection onto each of the orthogonal polynomial basis functions:

    ⟨L(x, t, ξ; Σ_{i=0}^{M} u_i Φ_i(ξ)), Φ_k(ξ)⟩ = ⟨f, Φ_k(ξ)⟩,    k = 0, 1, ..., M.

This procedure results in a set of (M + 1) deterministic, in general coupled, differential equations, which can be solved by conventional discretization methods (one obtains a linear system). Importantly, this is a successive Galerkin discretization "in probability" (by gPC approximation).

- The PC modes are implicitly coupled.
- The approach requires adapting the deterministic solver.
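The implicit coupling of the PC modes comes from the Galerkin triple products e_ijl = ⟨Φ_i Φ_j Φ_l⟩. A minimal sketch (our own, using Legendre chaos on [−1, 1] as the concrete basis) computes them by quadrature and shows that they do not vanish off the diagonal:

```python
import numpy as np

# Galerkin coupling coefficients e_ijl = <Phi_i Phi_j Phi_l> for Legendre
# chaos (uniform measure on [-1, 1]): these make the (M+1) deterministic
# equations of the intrusive system mutually coupled.
P = 3
x, w = np.polynomial.legendre.leggauss(2 * P + 2)  # exact up to degree 3P
w = 0.5 * w                                        # uniform density on [-1, 1]

def phi(i, t):
    c = np.zeros(i + 1); c[i] = 1.0
    return np.polynomial.legendre.legval(t, c)

e = np.array([[[np.sum(w * phi(i, x) * phi(j, x) * phi(l, x))
                for l in range(P + 1)] for j in range(P + 1)]
              for i in range(P + 1)])

# e[1,1,2] != 0: mode 2 is driven by products of mode-1 terms,
# which is why the PC modes cannot be solved independently.
assert abs(e[1, 1, 2]) > 1e-8
```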
Using PC to solve a stochastic differential equation

NON-INTRUSIVE approach (collocation method)

The intrusive application of PC consists of substituting the PC decomposition into the PDE under consideration, projecting the resulting system (Galerkin-type projection) onto the PC basis, and then solving the coupled deterministic system that results from this projection. That approach will not be described in detail in this report; moreover, it is costly in terms of development, since the underlying deterministic computational code must be modified.

The so-called non-intrusive approach consists instead of projecting the stochastic solution directly onto the set of polynomials of the PC basis. In this way the coefficients J_k can be obtained directly through a Galerkin-type projection of the solution onto the PC basis; this is the approach we will use in this study. The orthogonality of the basis makes this projection immediate; we have

    (∀k ∈ {0, ..., P})    J_k = <u Φ_k(ξ(ω))> / <Φ_k²(ξ(ω))>.    (5)

Recall that <Φ_k(ξ(ω))> = 0 for k > 0. The zero mode of the PC, J_0, represents the mean J̄, and the other, higher-order modes are a measure of the variability of the solution about its mean value (in particular, the variance of the solution).

- The projection amounts to evaluating many numerical quadratures: risk of aliasing.
- Simplicity of use: no adaptation of the deterministic solver is needed (black box).

Example: first-order linear ODE

We illustrate the gPC for a simple ordinary differential equation, and present error analysis both through numerical examples and theoretical estimates. The model problem takes the following form:

    dy/dt (t, ω) = −k(ω) y,    y(0) = ŷ,    t ∈ (0, T),    (4.1)

where the decay rate coefficient k(ω) is a random variable with a certain continuous distribution function f(k) and zero mean value (for a single random variable one may write k = k̄ + σ_k ξ). The solution takes the simple form

    y(t, ω) = ŷ e^{−k(ω)t}.    (4.2)

We apply the gPC expansion (2.5) to the solution y and to the random input k:

    y(t, ω) = Σ_{i=0}^{M} y_i(t) Φ_i(ξ(ω)),    k(ω) = Σ_{i=0}^{M} k_i Φ_i(ξ(ω)).    (4.3)

Note that here the only random input is k(ω), so a one-dimensional gPC is needed, i.e., N = 1 in (2.6) and M = P, where P is the highest order of expansion. By substituting the expansions into the governing equation, we obtain

    Σ_{i=0}^{M} (dy_i(t)/dt) Φ_i = − Σ_{i=0}^{M} Σ_{j=0}^{M} Φ_i Φ_j k_i y_j(t).    (4.4)

We then project the above equation onto the random space spanned by the orthogonal polynomial basis {Φ_i}, taking the inner product of the equation with each basis function. By utilizing the orthogonality condition (2.2), we obtain

    dy_l(t)/dt = − (1/<Φ_l²>) Σ_{i=0}^{M} Σ_{j=0}^{M} e_{ijl} k_i y_j(t),    l = 0, 1, ..., M,    (4.5)

where e_{ijl} = <Φ_i Φ_j Φ_l>. Note that the coefficients are smooth, and thus any standard ODE solver can be employed here.

In Figure 4.1, the computational results of the Jacobi-chaos expansion are shown, where the random input k(ω) is assumed to have a beta distribution with PDF of the form

    f(k; α, β) = (1 − k)^α (1 + k)^β / (2^{α+β+1} B(α + 1, β + 1)),    −1 < k < 1,  α, β > −1,    (4.6)

where B(α, β) is the Beta function, defined as B(p, q) = Γ(p)Γ(q)/Γ(p + q). An important special case is α = β = 0, when the distribution becomes the uniform distribution and the corresponding Jacobi-chaos becomes the Legendre-chaos. We observe exponential convergence of the errors in mean and variance on the right of Figure 4.1, with different sets of parameter values α and β. In particular, we note that the asymptotic convergence rate seems to be the same for the variance and the mean, unlike in Monte Carlo methods.

For this simple ODE we can also perform error analysis for different types of distributions. To this end, we define the relative mean-square error as ε_P = <(y(t) − y_P(t))²> / <y²(t)>, where y_P(t) is the finite-term expansion (4.3). The following results have been obtained in [34].

[Figure 4.1: deterministic solution, and errors of the mean and variance vs. time for the Jacobi-chaos with (α = 0, β = 0) and (α = 1, β = 3).]
Advantages of gPC methods

An efficient method that provides a quantitative estimate of the sensitivity of the solution to the uncertainties of the input parameters.

Spectral convergence and an optimal representation of the uncertainty (compactness and accuracy) thanks to an appropriate choice of polynomials. Possibility of a non-intrusive representation by projection of the solution onto the polynomial chaos basis.

Not limited to Gaussian distributions of uncertainty or to small uncertainties.

All the moments and the PDF of the solution are available.

Computational cost generally far below Monte-Carlo-type methods (Gaussian distribution: 1 to 2 orders of magnitude; uniform distribution: 3 to 4 orders of magnitude).
i=0
i=0 j=0
dy
for a simple ordinary differential equation,
and
present
error
(t, ω)
−k(ω)y,
y(0) = by
ŷ, tthe
∈ (0,
T ),
We then project the above equationdtonto
the=random
space spanned
orthogonal
gh
numerical
examplesEquations.
and theoretical
estimates.
The we
model
inary
Differential
In
this
section,
theeach
solupolynomial basis {Φi } by taking the inner product of theillustrate
equation with
basis.
the of
following
form
ure
gPC
for
a where
simple
differential
equation,
and present
theordinary
decay rate
coefficient
k(ω)
is a random
variableerror
with certain c
By utilizing
the orthogonality
condition
(2.2), we
obtain:
Difficulty of the method
distribution
function and
f (k) theoretical
and zero mean
value. TheThe
solution
takes a simp
both through numerical
examples
estimates.
model
M !
M
!
dyl (t)=form
1 (0, T ),
(t, ω) =takes
−k(ω)y,
y(0)
ŷ,
t
∈
(4.1)
nsider
the following
=− 2
eijl ki yj (t),
l = 0,−k(ω)t
1, . . . , M,
(4.5)
dt
#Φl $
i=0 j=0
y(t, ω) = ŷe
.
dy is a random variable with certain continuous
oefficient k(ω)
We
apply
thethat
gPC
expansion
(2.5)
to the solution
y+
and
random
input
ω)==
y(0)
=
ŷ,
t
∈
(0,
T
),
(4.1)
k
=
k̄
σ
ξ
k
2
where(t,
eijl
#Φ−k(ω)y,
Φ
Φ
$.
Note
the
coefficients
are
smooth
and
thus
any
standard
i j l
k) and zero
mean
value.
The solution takes a simple form of
dt
ODE solver can be employed here.
M
M
!
!
In Figure 4.1,
the computational
results
of Jacobi-chaos
expansioncontinuous
is shown, where
−k(ω)t
decay ratey(t,
coefficient
k(ω)
is
a
random
variable
with
certain
y(t, ω) =
yi (t)Φi (ξ(ω)),(4.2) k(ω) =
ki Φi (ξ(ω)).
ω) = ŷe
.
the random input k(ω) is assumed to have a beta distribution with PDF of the form
i=0 solution
i=0
function
f (k) and
zero mean value. The
takes AND
a simple
form of
14
KARNIADAKIS,SU,XIU,LUCOR,
SCHWAB,
TODOR
In Figure 4.1, the computational results of the Jacobi-chaos expansion are shown, where the random input k(ω) is assumed to have a beta distribution with PDF of the form

    f(k; α, β) = (1 − k)^α (1 + k)^β / (2^(α+β+1) B(α + 1, β + 1)),    −1 < k < 1,  α, β > −1,    (4.6)

where B(α, β) is the Beta function defined as B(p, q) = Γ(p)Γ(q)/Γ(p + q). An important special case is α = β = 0, when the distribution becomes the uniform distribution and the corresponding Jacobi-chaos becomes the Legendre-chaos. We observe exponential convergence rate of the errors in mean and variance on the right of Figure 4.1, with different sets of parameter values (α = 0, β = 0) and (α = 1, β = 3). In particular, we note that the asymptotic convergence rate seems to be the same for the variance and the mean, unlike in Monte Carlo methods.
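A quick numerical sanity check of the density (4.6) is straightforward: it must integrate to one over (−1, 1). The sketch below (illustrative, using the two parameter pairs shown in Figure 4.1) verifies this with Gauss–Legendre quadrature:

```python
import numpy as np
from math import gamma
from numpy.polynomial.legendre import leggauss

def beta_pdf(k, a, b):
    # f(k; alpha, beta) on (-1, 1), equation (4.6);
    # B(p, q) = Gamma(p) Gamma(q) / Gamma(p + q)
    B = gamma(a + 1) * gamma(b + 1) / gamma(a + b + 2)
    return (1 - k)**a * (1 + k)**b / (2**(a + b + 1) * B)

nodes, w = leggauss(64)
totals = {}
for a, b in [(0.0, 0.0), (1.0, 3.0)]:   # the parameter pairs used in Figure 4.1
    totals[(a, b)] = np.sum(w * beta_pdf(nodes, a, b))   # should integrate to 1
print(totals)
```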
For this simple ODE we can also perform error analysis for different types of distributions. To this end, we define the relative mean-square error as ε_P = ⟨(y(t) − y_P(t))²⟩ / ⟨y²(t)⟩, where y_P(t) is the finite-term expansion (4.3). The following results have been obtained in [34]:

[Figure: left, solution versus time for the deterministic case and increasing order P; right, error of the mean and the variance versus time for (α = 0, β = 0) and (α = 1, β = 3)]
Effect of the gPC expansion order P on the
convergence rate in time.
k is a random variable with zero mean
and constant variance and a certain
probability distribution f(k):
Uniform distribution (Legendre)
Mean solution:
Variance solution:
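The "Mean solution" and "Variance solution" formulas appear as images on the original slide. They can be reconstructed in closed form; assuming, for the Legendre case shown, that k is uniform on [−b, b] (hence standard deviation σ = b/√3), the exact statistics of y(t, ω) = ŷ e^{−kt} are:

```latex
\mathbb{E}[y(t)] = \hat{y}\,\frac{\sinh(bt)}{bt},
\qquad
\mathrm{Var}[y(t)] = \hat{y}^2\left[\frac{\sinh(2bt)}{2bt}
  - \left(\frac{\sinh(bt)}{bt}\right)^{2}\right].
```

Both follow directly from averaging e^{−kt} and e^{−2kt} against the uniform density 1/(2b).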
Figure 4.1. Left: solution of each random mode with beta random input by 4th-order Jacobi-chaos (α = β = 0: Legendre-chaos). Right: error convergence of the mean and the variance in time.
Difficulty of the method
[Figure: error of the mean and the variance versus time, for (α = 0, β = 0) and (α = 1, β = 3)]
If k is a Gaussian random variable with zero mean and standard deviation σ, and Hermite-chaos is used, then

    ε_P ≤ [(σt)^(2(P+1)) / ((P + 1)! (e^((σt)²) − 1))] · [1 − (σt)²/(P + 2)]^(−1).    (4.7)

If k is an exponential random variable with zero mean and standard deviation σ, and Laguerre-chaos is used, then

    ε_P = (σt / (1 + σt))^(2P).    (4.8)

If k is a uniform random variable and Legendre-chaos is used, no explicit formula for the error is available. However, the error can be readily evaluated with the three-term recurrence formula of the Legendre polynomials.

We plot the number of expansion terms (P + 1) that is needed to keep the error below a prescribed value. It can be seen that as time increases, the number of terms required grows, but the growth is different for the three cases; the Legendre-chaos has the slowest growth.

[Figure: number of expansion terms (P + 1) versus time for Hermite-chaos (Gaussian distribution), Laguerre-chaos (exponential distribution), and Legendre-chaos (uniform distribution)]
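The estimates above can be turned into a small tolerance calculator. The sketch below uses the bound (4.7) and the formula (4.8) as reconstructed above; `terms_needed` is an illustrative helper, not from the original text, and σt = 1 and the tolerance 1e-6 are arbitrary choices:

```python
import math

def hermite_err_bound(P, s):
    # Upper bound (4.7): Gaussian k with std sigma, Hermite-chaos, s = sigma * t
    # (valid when s**2 < P + 2)
    return (s**(2 * (P + 1)) / (math.factorial(P + 1) * (math.exp(s**2) - 1))
            / (1.0 - s**2 / (P + 2)))

def laguerre_err(P, s):
    # Exact relative mean-square error (4.8): exponential k, Laguerre-chaos
    return (s / (1.0 + s))**(2 * P)

def terms_needed(err_fn, s, tol=1e-6, P_max=200):
    # Smallest number of expansion terms (P + 1) keeping the error below tol.
    for P in range(1, P_max):
        if err_fn(P, s) < tol:
            return P + 1
    return None

print(terms_needed(hermite_err_bound, 1.0), terms_needed(laguerre_err, 1.0))
```

Scanning s = σt over a range reproduces the qualitative behaviour described above: the required number of terms grows with time.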
Problems and possible remedies...
Spectral estimation of non-linear terms when no closed-form expressions are
available
•
use pseudo-spectral approximation.
Slow convergence for non-Gaussian processes:
•
use the appropriate measure with Generalized PC.
Convergence failure for discontinuous or non-smooth processes
(stochastic bifurcation)
•
develop adapted (non-smooth or local) bases: multi-wavelets or
multi-element gPC.
CPU cost for large scale problems
•
design new solvers, use different types of (sparse) numerical
quadratures, sparse tensor products.
Challenge: development of bases and techniques to improve the
convergence and robustness of spectral expansions for processes
with steep or discontinuous dependence on uncertain parameters, or
processes depending on a large number of random variables.
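The pseudo-spectral remedy listed above can be sketched for a quadratic nonlinearity: given the Legendre-chaos coefficients of u(ξ), evaluate u at quadrature nodes, apply the nonlinearity pointwise, and project back onto the basis. All names and parameters below are illustrative assumptions:

```python
import numpy as np
from numpy.polynomial.legendre import legval, leggauss

def pseudo_spectral_square(u_coeffs, order):
    """Legendre-chaos coefficients of w = u**2, computed pseudo-spectrally.

    u_coeffs: coefficients of u in the Legendre basis (xi uniform on [-1, 1]).
    order:    highest retained polynomial order of the result.
    """
    nodes, weights = leggauss(2 * (len(u_coeffs) + order))  # ample quadrature
    weights = weights / 2.0            # uniform probability density 1/2
    u_vals = legval(nodes, u_coeffs)   # evaluate u at the nodes
    w_vals = u_vals**2                 # nonlinearity applied pointwise
    w = np.empty(order + 1)
    for l in range(order + 1):
        c = np.zeros(l + 1); c[l] = 1.0
        phi_l = legval(nodes, c)
        # project: divide by <P_l^2> = 1/(2l + 1)
        w[l] = np.sum(weights * w_vals * phi_l) * (2 * l + 1)
    return w

# u = 1 + xi  ->  u^2 = (4/3) P0 + 2 P1 + (2/3) P2
w = pseudo_spectral_square(np.array([1.0, 1.0]), order=2)
print(w)
```

The same node-evaluate-project pattern extends to any pointwise nonlinearity (exp, powers, rational terms) for which no Galerkin closed form exists.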
Possible applications so far...
Solid mechanics (Ghanem & Spanos 1989-91).
Flow through porous media (Ghanem & Dham 1998, Zhang & Lu
2004).
Heat diffusion in stochastic media (Hien & Kleiber 1997-98, Xiu &
Karniadakis 2003).
Incompressible flows (Le Maître et al, Karniadakis et al, Hou et al).
Fluid-Structure interaction (Karniadakis et al).
Micro-fluid systems (Debusschere et al 2001).
Reacting flows & combustion (Reagan et al 2001).
0-Mach flows & thermo-fluid problems (Le Maître et al 2003).