
Chen, Bin-Bin; Gao, Yuan; Guo, Yi-Bin; Liu, Yuzhi; Zhao, Hui-Hai; Liao, Hai-Jun; Wang, Lei; Xiang, Tao; Li, Wei and Xie, Z. Y. (2020): Automatic differentiation for second renormalization of tensor networks. In: Physical Review B, Bd. 101, Nr. 22, 220409

Full text not available on 'Open Access LMU'.

Abstract

Tensor renormalization group (TRG) constitutes an important methodology for accurate simulations of strongly correlated lattice models. Facilitated by the automatic differentiation technique widely used in deep learning, we propose a uniform framework of differentiable TRG (∂TRG) that can be applied to improve various TRG methods in an automatic fashion. ∂TRG systematically extends the essential concept of second renormalization [Phys. Rev. Lett. 103, 160601 (2009)], where the tensor environment is computed recursively in the backward iteration. Given the forward TRG process, ∂TRG automatically finds the gradient of local tensors through backpropagation, with which one can deeply "train" the tensor networks. We benchmark ∂TRG in solving the square-lattice Ising model, and we demonstrate its power by simulating one- and two-dimensional quantum systems at finite temperature. The global optimization as well as GPU acceleration renders ∂TRG a highly efficient and accurate many-body computation approach.
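The core identity behind ∂TRG is that the environment of a local tensor is exactly the gradient of the partition function with respect to that tensor, which backpropagation delivers for free. A minimal toy sketch (not the paper's code) illustrates this on a 1D Ising chain, where Z = Tr(T^N) for a 2x2 transfer matrix T; the closed-form gradient N (T^(N-1))^T plays the role of the environment, and we verify it against finite differences:

```python
import numpy as np

# Toy illustration, not the authors' implementation: for a 1D Ising
# chain of N sites at inverse temperature beta, the partition function
# is Z = Tr(T^N) with transfer matrix T. The "environment" of T is the
# gradient dZ/dT, which reverse-mode autodiff would compute
# automatically; here we use the known matrix-calculus result
# dTr(T^N)/dT = N * (T^(N-1))^T and check it numerically.

beta, N = 0.4, 8
T = np.exp(beta * np.array([[1.0, -1.0], [-1.0, 1.0]]))

def Z(T, N=N):
    return np.trace(np.linalg.matrix_power(T, N))

# Analytic environment (gradient of the scalar Z w.r.t. T)
env = N * np.linalg.matrix_power(T, N - 1).T

# Central finite-difference check, component by component
eps = 1e-6
fd = np.zeros_like(T)
for i in range(2):
    for j in range(2):
        dT = np.zeros_like(T)
        dT[i, j] = eps
        fd[i, j] = (Z(T + dT) - Z(T - dT)) / (2 * eps)

print(np.allclose(env, fd, rtol=1e-5))  # environment matches gradient
```

In ∂TRG the forward pass is a full coarse-graining iteration rather than a matrix power, but the principle is the same: differentiating the scalar output (free energy) through the contraction sequence yields the environments used in second renormalization.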
