Tensor Renormalization in classical statistical models and quantum lattice models

Transcription
Z. Y. Xie (谢志远)
Institute of Physics, Chinese Academy of Sciences
[email protected]
Part I


Tensor network representation of classical statistical models.
Tensor renormalization in a tensor network model
Part II


Projected entangled simplex (PESS) representation in frustrated systems
Tensor renormalization in a quantum lattice model
Basic notation in tensor graphs: dot, free line, link
Ising model as tensor-network model
 Partition function as a network scalar: no free outer lines.
 R. J. Baxter: vertex models, 1982.
 All classical statistical models with only local interactions can be effectively written as a tensor-network model.
• Ising model on square lattice: already a tensor network:
Periodic boundary condition is assumed.
Ising model as tensor network model

Convert to a normal form

Introduce auxiliary DOF on each bond
This method is universal for all nearest-neighbor (n.n.) interactions (a concrete sketch follows below).
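As a concrete illustration of the auxiliary-bond construction for the square-lattice Ising model, here is a minimal numpy sketch (my own code and naming, not taken from the slides; the symmetric splitting of the bond weight is one possible choice):

    import numpy as np

    def ising_site_tensor(beta):
        # Bond Boltzmann weight W[s, s'] = exp(beta * s * s') for s, s' = +1, -1
        W = np.array([[np.exp(beta), np.exp(-beta)],
                      [np.exp(-beta), np.exp(beta)]])
        # Symmetric splitting W = P @ P.T
        # (eigenvalues 2*cosh(beta) and 2*sinh(beta) are positive for beta > 0)
        vals, vecs = np.linalg.eigh(W)
        P = vecs @ np.diag(np.sqrt(vals))
        # Site tensor: sum over the physical spin, one auxiliary (bond) index per leg
        return np.einsum('si,sj,sk,sl->ijkl', P, P, P, P)

Contracting one copy of this rank-4 tensor per site over the whole lattice reproduces the partition function Z.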

How to evaluate the physical model

How to contract the infinite tensor network to get the partition function?

It is an NP-hard problem to contract it exactly!
Tensor renormalization enters in the evaluation of the partition function and of expectation values.
Renormalization: compression of DOF (information, Hilbert space) by discarding the irrelevant part.


How to evaluate the tensor-network model


There are at least 4 classes of methods:
Transfer Matrix Renormalization Group (TMRG): Nishino (1995); X. Q. Wang and T. Xiang (1997)

Time Evolving Block Decimation (TEBD) / boundary MPS: Vidal, PRL (2003)

Corner Transfer Matrix Renormalization Group (CTMRG): Baxter (1968); Nishino (1996); Orus and Vidal (2009)
How to evaluate the tensor-network model

Coarse-graining Tensor Renormalization Group (TRG): Kadanoff block-spin decimation.
Levin-Nave TRG: PRL (2007)
Coarse-graining tensor renormalization



In 2D, they all work very well;
In 3D or higher, they do not work so well!
Tensor renormalization based on the higher-order singular value decomposition (HOSVD), i.e., HOTRG: our group (2012)
[Figure: the coarse-grained network at successive RG scales 1, 2, 3]
Block local spins in HOTRG

Block spin: how to decimate the local DOF.
Bond dimension: DOF introduced on each bond

Dimension scales super-exponentially!
Decimation of DOF

Decimation: shrink the dimensions back down.

More than one cutoff has to be made simultaneously! Low-rank approximation of a tensor is itself an open problem!
HOSVD: nearly optimal; in most cases at least very good.

(1) Different blocks (slices) of the core tensor along the same mode are mutually orthogonal.
(2) The blocks are sorted in decreasing order of their norms.
Ref: L. De Lathauwer, B. De Moor, and J. Vandewalle, SIAM J. Matrix Anal. Appl. 21, 1253 (2000).
There is no mystery in HOSVD!

How to calculate the HOSVD of a given tensor:

Successive SVDs:

Independent SVDs, one per mode (see the sketch below):
Ref: L. De Lathauwer, B. De Moor, and J. Vandewalle, SIAM J. Matrix Anal. Appl. 21, 1253 (2000).
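The "independent SVD" route can be written down in a few lines; the following numpy sketch is my own illustrative code, not taken from the reference:

    import numpy as np

    def hosvd(T):
        # Mode matrices from independent SVDs of the mode unfoldings
        Us = []
        for mode in range(T.ndim):
            unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
            U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
            Us.append(U)
        # Core tensor S = T x_1 U1^dagger x_2 U2^dagger ...
        # Its blocks are all-orthogonal and ordered by decreasing norm.
        S = T
        for U in Us:
            # contract the current leading mode with U*; the new index is appended at the end
            S = np.tensordot(S, U.conj(), axes=(0, 0))
        return S, Us

For an order-3 tensor, the reconstruction np.einsum('abc,ia,jb,kc->ijk', S, Us[0], Us[1], Us[2]) recovers the input up to round-off.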
How to use it in our case?
A single renormalization step in 2D

Coarse-grain along the y direction. [figure]

Repeat the same procedure along the x direction. [figure]
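To make the step concrete, here is a minimal numpy sketch of one coarse-graining step along y (my own code and index conventions; for brevity only the left isometry is used, whereas HOTRG keeps whichever of the left/right isometries gives the smaller truncation error):

    import numpy as np

    def hotrg_step_y(T, Dcut):
        # T[l, r, u, d] on a translation-invariant square lattice; assumes D_l == D_r.
        Dl, Dr, Du, Dd = T.shape
        # Stack two tensors vertically, contracting the shared vertical bond
        M = np.einsum('aium,bjmd->abijud', T, T)
        M = M.reshape(Dl * Dl, Dr * Dr, Du, Dd)   # fuse (l1, l2) and (r1, r2)
        # Isometry from the SVD of the left-mode unfolding (one mode of the HOSVD of M)
        U, _, _ = np.linalg.svd(M.reshape(Dl * Dl, -1), full_matrices=False)
        U = U[:, :min(Dcut, U.shape[1])]
        # Project both fused horizontal legs into the truncated basis
        return np.einsum('ab,aiud,ij->bjud', U.conj(), M, U)

Alternating this step with the analogous one along x, while rescaling the tensor and recording the rescaling factors, generates the RG flow used below for the free energy.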
Calculation of expectation value

Free energy
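One standard piece of bookkeeping (my paraphrase; conventions vary): rescale the coarse-grained tensor at every step, T^{(n)} -> T^{(n)}/c_n, to keep the numbers finite. Since one tensor at step n represents 2^n original sites,

    f = -\frac{1}{\beta N}\ln Z \simeq -\frac{1}{\beta}\sum_{n}\frac{\ln c_n}{2^{n}},

up to the contribution of the final trace.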
Calculation of expectation value

Local physical quantities: a definition is required (one common choice is sketched below).
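One common choice (the impurity-tensor approach; the wording is mine): replace the tensor at one site by an impurity tensor \tilde{T} that carries the local operator, coarse-grain it together with the uniform tensors, and evaluate

    \langle O \rangle = \frac{\mathrm{Tr}\,(T \cdots T\, \tilde{T}\, T \cdots T)}{Z}.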
A renormalization step in 3D: cubic lattice

Coarse-grain along the z direction.
HOSVD of M (an order-6 tensor); insert 4 isometries.
Repeat along the x, y, and z directions in turn.
Performance: cubic lattice

Magnetization:
Monte Carlo: 0.3262
Series expansion: 0.3265
HOTRG (D = 14): 0.3295
Performance: cubic lattice

Critical properties (comparison with former NRG results). [figure]
Main Problem: local update

Renormalize the tensor network site by site independently: a local update, without considering the environment.
Z = Tr(M M^env), where M is the (local) system and M^env its environment.
NRG -> DMRG!
What we need: a decimation scheme that optimizes the global Z instead of the local system M itself!

Central idea:
1. Do a forward iteration (e.g., HOTRG) to construct the tensor network at different RG scales.
2. Find the relation between the environments at two neighboring scales.
3. Use this relation to do a backward iteration and obtain the environment of the targeted system (the one that needs to be renormalized).
4. Use the environment to do a global optimization of the system (schematically below).
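Schematically (my notation, following the Z = Tr(M M^env) picture above): since the quantity to preserve is Z, a truncation M -> M' should minimize the global error

    |\delta Z| = \big|\,\mathrm{Tr}\big[(M - M')\,M^{\mathrm{env}}\big]\,\big|,

rather than the local error \|M - M'\|; the backward iteration is precisely what supplies M^{env} at each scale.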

SRG VERSION OF HOTRG? HOSRG!!!
Use the environment to do global optimization

There are several methods to modify the local decomposition using M^env:
 Splitting M^env to form an open system: cut at the bond indicated in the figure.
Use the environment to do global optimization

There are several methods to modify the local decomposition using M^env:
 Splitting a bond to target a closed system: insert P P^{-1} and cut at Λ.
Sweep scheme and performance

NRG -> iDMRG -> fDMRG
TRG -> SRG -> fSRG with sweep
Sweep scheme:
Ising model on the square lattice (D = 20). [Figure legend: D = 24, HOSRG]
Part I


Tensor network representation of classical statistical models.
Tensor renormalization in a tensor network model
Part II


Projected entangled simplex (PESS) representation in frustrated systems
Tensor renormalization in a quantum lattice model
Tensor network states and quantum lattice models









Some partial history:
AKLT (Affleck, Kennedy, Lieb, Tasaki): PRL (1987), Commun. Math. Phys. (1988)
prototype of the matrix product state and of the honeycomb tensor network
Niggemann: Z. Phys. B: CMP (1996, 1997)
special tensor-network wavefunction for the honeycomb Heisenberg model; equivalence between the expectation value and a classical partition function
Sierra and Martin-Delgado: general ansatz, Proceedings on the ERG (1998)
Nishino: variational ansatz to study 3D classical lattices, Prog. Theor. Phys. (2001)
PEPS, MERA, and so on: F. Verstraete and J. I. Cirac, arXiv:0407066; Vidal, PRL (2007)
Note: the wavefunction itself does not provide any intuition about the entanglement structure between its constituents; the only constraint comes from the area law.
It doesn't matter whether the cat is black or white, as long as it catches mice!
Projected Entangled Pair State construction on square lattice
F. Verstraete and J. I. Cirac, arXiv:0407066

Reinterpretation in terms of projectors (onto the physical space) and local entangled pairs

Some important properties:
Satisfies the area law.
Formally has no sign problem.
Local pair entanglement.
Can have power-law-decaying correlation functions.
The ground state of any local Hamiltonian can be represented as a PEPS, as long as D is very large.
'Can be represented' does not mean equally efficient: possible in principle does not mean effective in practice!
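Schematically, on the square lattice (the index conventions here are mine, not from the slides):

    |\Psi\rangle = \sum_{\{s\}} \mathrm{tTr}\Big(\prod_i A^{[i]\,s_i}_{l r u d}\Big)\, |s_1 s_2 \cdots s_N\rangle,

where each site tensor A^{[i]} carries one physical index s_i and four virtual indices (l, r, u, d) of bond dimension D, and tTr denotes the contraction of all shared virtual indices.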






Tensor renormalization in tensor network states


Choose a wavefunction ansatz/form of the targeted state.
Determine the unknown parameters in the wavefunction form.
 1. Global variational extremum problem: find a PEPS which minimizes the energy; this leads to a generalized eigenvalue problem Ax = E Bx.
 2. Imaginary-time evolution.
In practice the central problem is reduced to how to update/renormalize the wavefunction after a small evolution step:
(1) Global variational extremum problem: find a PEPS which minimizes the difference; this leads to a linear problem Ax = Y.
(2) Reduce it to a standard local SVD/HOSVD problem by a mean-field entanglement approximation: bond-vector projection / simple update (a sketch follows below).
(3) Regard the surrounding cluster as the whole environment.
Good review: R. Orus, Annals of Physics 349, 117 (2014)
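To make step (2) concrete, here is a minimal numpy sketch of one simple-update step, written for a 1D chain for brevity (all names and index conventions are mine); on a PEPS the same bond projection is done with the weights of all surrounding bonds absorbed as the mean-field environment:

    import numpy as np

    def simple_update_bond(A, B, lam_left, lam_mid, lam_right, gate, Dcut):
        # A[a, s, m], B[m, t, b]: site tensors (left bond, physical, right bond)
        # lam_*: strictly positive bond weights acting as the mean-field environment
        # gate[u, v, s, t]: two-site gate <u v| exp(-tau * h) |s t>
        Da, d, _ = A.shape
        Db = B.shape[2]
        # Absorb all surrounding bond weights into the two-site wavefunction
        theta = np.einsum('a,asm,m,mtb,b->astb', lam_left, A, lam_mid, B, lam_right)
        # Apply the imaginary-time gate on the two physical legs
        theta = np.einsum('uvst,astb->auvb', gate, theta)
        # Renormalize the updated bond with a plain (local) SVD truncation
        U, s, Vh = np.linalg.svd(theta.reshape(Da * d, d * Db), full_matrices=False)
        k = min(Dcut, s.size)
        lam_new = s[:k] / np.linalg.norm(s[:k])
        # Strip the outer weights again to recover bare site tensors
        A_new = U[:, :k].reshape(Da, d, k) / lam_left[:, None, None]
        B_new = Vh[:k, :].reshape(k, d, Db) / lam_right[None, None, :]
        return A_new, lam_new, B_new

Sweeping such updates over all bonds, with a small imaginary-time step, drives the state towards the ground state within this approximation.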
Tensor renormalization in tensor network states

Calculate the expectation value. [figure]
Again a 2D network scalar: identical to a classical partition function!
Tensor renormalization in tensor network states

Calculate the expectation value:

Biggest obstacle for application in QLM: the bond dimension becomes D -> D^2 in the double-layer network; cf. MERA.
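In formulas (a standard construction; the indices are mine): the norm and expectation values are contractions of the double tensor

    \mathbb{A}_{(l l')(r r')(u u')(d d')} = \sum_{s} A^{s}_{l r u d}\,\big(A^{s}_{l' r' u' d'}\big)^{*},

so the effective classical network carries bond dimension D^2.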
PEPS on Kagome lattice: hidden frustration

PEPS ansatz:

Degeneracy: the entanglement spectrum on each bond is fully doubly degenerate.

Information cancellation: similar to the sign problem / frustration in Monte Carlo!
Local pair entanglement: PEPS!
A, B, and C have a dominant element equal to 1.
Projected Entangled Simplex State(PESS)

Structure
 Introduce a simplex tensor S: triangle/simplex entanglement instead of pair entanglement.
 The ansatz is defined on an unfrustrated lattice (honeycomb): no hidden frustration here!

Property
Simplex ~ a possible building block, e.g. the triangle for the kagome lattice.
All the advantages of PEPS: area law, no fermion sign problem, power-law-decaying correlators, ...
our group, PRX 4, 011025 (2014)
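Schematically, for the 3-PESS on the kagome lattice (the index conventions here are mine):

    |\Psi\rangle = \sum_{\{s\}} \mathrm{tTr}\Big(\prod_{\triangle} S_{\alpha\beta\gamma} \prod_{i} A^{s_i}_{\alpha\alpha'}\Big)\, |s_1 s_2 \cdots\rangle,

where a simplex tensor S sits on every triangle, and the projection tensor A^{s_i} on site i carries one physical index and the two virtual indices shared with its two neighboring simplices; the resulting virtual network is the (unfrustrated) honeycomb lattice.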
Determination of PESS ground state wavefunction

Local update scheme with mean-field entanglement approximation:
Two exact examples on the kagome lattice

Spin-2 Valence Bond Solid (VBS): S = 2 decomposed as 4 x 1/2; each pair of spin-1/2's on a bond forms a singlet.
Spin-2 Simplex Solid (SS): S = 2 decomposed as 2 x 1; the three spin-1's on each triangle form a simplex singlet.
Other possible PESS obtained by choosing a larger simplex
(a) 3-PESS
Simplest PESS: the 3-PESS, with the smallest simplex, the triangle.
Multi-site and longer-range interactions can be encoded easily if needed.
 3-PESS, 5-PESS, and 9-PESS are all effective.
 5-PESS, 9-PESS < 3-PESS.
 13-state PESS: promising and competitive.
[Figure: energy comparison with RVB trial wavefunction (2013), MERA (2010), series expansion (2008), iDMRG m = 5000 (2011), VMC + Lanczos (2013), extrapolated HOCC (2011), extrapolated fDMRG m = 16,000 (2012)]
If you like extrapolation, as in DMRG...
Note here:
1. D = 19: better.
2. Extrapolation gives a lower energy.
Convergence? In which regime?
DMRG reference: extrapolated finite-size SU(2) DMRG keeping 16,000 states.
PESS on other lattices: e.g. [figure]

Difference between PEPS and PESS:

PESS is more flexible than PEPS.
Many other things not included here:
















MERA and branching MERA
Variation in PEPS
Exact PEPS representation of some special states
TEBD, CTM(RG)
fPEPS and Grassmann TPS
Topological measure and entanglement
Continuous-space matrix product state
Thermal and excited states, non-equilibrium process, spin glass
Tree tensor networks and correlator product states
Combination of tensor renormalization with Monte Carlo
Symmetry
Canonical form
Real time evolution
RG flow
Filtering scheme in TRG
...
The field is still young; more effort and a deeper mathematical foundation are needed!