Tensor Networks
59 papers with code • 0 benchmarks • 0 datasets
Most implemented papers
Can recursive neural tensor networks learn logical reasoning?
Recursive neural network models and their accompanying vector representations for words have seen success in an array of increasingly semantically sophisticated tasks, but almost nothing is known about their ability to accurately capture the aspects of linguistic meaning that are necessary for interpretation or reasoning.
Tensor Ring Decomposition
In this paper, we introduce a fundamental tensor decomposition model that represents a large higher-order tensor by circular multilinear products over a sequence of low-dimensional cores. This can be graphically interpreted as a cyclic interconnection of 3rd-order tensors, and is thus termed the tensor ring (TR) decomposition.
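The cyclic structure described above can be sketched in a few lines of numpy: each entry of the represented tensor is the trace of a product of slices of the 3rd-order cores. Sizes and variable names below are illustrative, not taken from the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 4, 2, 3  # tensor order, mode size, TR rank (toy values)
# one 3rd-order core per mode, shaped (rank, mode, rank)
cores = [rng.standard_normal((r, d, r)) for _ in range(n)]

def tr_entry(cores, idx):
    """T[i1,...,in] = Trace(G1[:, i1, :] @ ... @ Gn[:, in, :])."""
    m = np.eye(cores[0].shape[0])
    for G, i in zip(cores, idx):
        m = m @ G[:, i, :]
    return np.trace(m)

# assemble the full tensor entry by entry (only feasible for tiny examples)
T = np.array([tr_entry(cores, idx)
              for idx in itertools.product(range(d), repeat=n)]).reshape((d,) * n)

# the same contraction as a single einsum: note the index 'a' closing the ring
T_ein = np.einsum('aib,bjc,ckd,dla->ijkl', *cores)
print(np.allclose(T, T_ein))  # True
```

The trace makes the decomposition invariant to cyclic shifts of the cores, which is the key difference from an open-boundary tensor train.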
Supervised Learning with Tensor Networks
Tensor networks are approximations of high-order tensors which are efficient to work with and have been very successful for physics and mathematics applications.
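The efficiency claim is easy to make concrete: an order-n tensor has d^n entries, while an open-boundary tensor train (MPS) with bond dimension r stores only O(n·d·r²) parameters. A minimal numpy sketch, with toy sizes of my own choosing:

```python
import numpy as np

d, n, r = 2, 20, 4  # mode size, tensor order, bond dimension (illustrative)
full_params = d ** n
# open-boundary MPS: first and last cores are matrices, the rest 3rd-order
tt_params = d * r + (n - 2) * r * d * r + r * d
print(full_params, tt_params)  # 1048576 vs 592

# contracting a tiny 3-core MPS back into the full tensor it represents
rng = np.random.default_rng(1)
cores = [rng.standard_normal((d, r)),
         rng.standard_normal((r, d, r)),
         rng.standard_normal((r, d))]
T = np.einsum('ia,ajb,bk->ijk', *cores)
print(T.shape)  # (2, 2, 2)
```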
Logic Tensor Networks for Semantic Image Interpretation
Logic Tensor Networks (LTNs) are an SRL framework which integrates neural networks with first-order fuzzy logic to allow (i) efficient learning from noisy data in the presence of logical constraints, and (ii) reasoning with logical formulas describing general properties of the data.
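The fuzzy-logic side of this integration can be sketched with differentiable connectives from the product t-norm family, applied to predicate outputs in [0, 1]. The helper names and the mean aggregator below are illustrative choices, not the LTN library's API.

```python
import numpy as np

# differentiable fuzzy connectives (product t-norm family)
def f_not(a):        return 1.0 - a
def f_and(a, b):     return a * b
def f_or(a, b):      return a + b - a * b
def f_implies(a, b): return f_or(f_not(a), b)

# a "predicate" is any model emitting truth degrees in [0, 1];
# here we fake two with fixed scores for a batch of three objects
is_cat = np.array([0.9, 0.1, 0.8])
has_tail = np.array([0.95, 0.5, 0.7])

# satisfaction of "forall x: cat(x) -> has_tail(x)", aggregated by the mean
sat = f_implies(is_cat, has_tail).mean()
print(round(float(sat), 3))  # 0.888
```

Because every connective is smooth, the satisfaction degree can serve directly as a training objective, which is what lets logical constraints coexist with noisy data.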
Tensor Networks for Dimensionality Reduction and Large-Scale Optimizations. Part 2: Applications and Future Perspectives
Part 2 of this monograph builds on the introduction to tensor networks and their operations presented in Part 1.
On the Long-Term Memory of Deep Recurrent Networks
A key attribute that drives the unprecedented success of modern Recurrent Neural Networks (RNNs) on learning tasks involving sequential data is their ability to model intricate long-term temporal dependencies.
A Generalized Language Model in Tensor Space
Theoretically, we prove that such a tensor representation is a generalization of the n-gram language model.
TensorNetwork for Machine Learning
We demonstrate the use of tensor networks for image classification with the TensorNetwork open source library.
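The contraction pattern behind MPS-based image classification (a local feature map per pixel, contracted against a weight MPS with a label index on one core) can be mimicked in plain numpy. This is a toy re-implementation of the scheme, not the TensorNetwork library's API; sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, d, r, n_classes = 6, 2, 4, 3  # toy sizes

# local feature map: pixel value in [0, 1] -> 2-vector (cos/sin encoding)
def feature_map(x):
    return np.stack([np.cos(np.pi / 2 * x), np.sin(np.pi / 2 * x)], axis=-1)

# MPS "weight tensor": one core per pixel, label index on an extra middle core
cores = [rng.standard_normal((r, d, r)) * 0.5 for _ in range(n_pixels)]
label_core = rng.standard_normal((r, n_classes, r)) * 0.5

def scores(pixels):
    phis = feature_map(pixels)              # (n_pixels, d)
    left = np.ones(r)                       # boundary vectors (illustrative choice)
    for G, p in zip(cores[:3], phis[:3]):
        left = left @ np.einsum('aib,i->ab', G, p)
    right = np.ones(r)
    for G, p in zip(reversed(cores[3:]), reversed(phis[3:])):
        right = np.einsum('aib,i->ab', G, p) @ right
    return np.einsum('a,acb,b->c', left, label_core, right)

out = scores(rng.random(n_pixels))
print(out.shape)  # (3,)
```

The point of the left-to-right sweep is that the cost stays linear in the number of pixels, instead of exponential as for a dense weight tensor.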
Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning
Inspired by these developments, and the natural correspondence between tensor networks and probabilistic graphical models, we provide a rigorous analysis of the expressive power of various tensor-network factorizations of discrete multivariate probability distributions.
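One of the factorization families analyzed in this line of work, a non-negative tensor train, can be shown to define a valid discrete distribution in a few lines. The construction below is a generic sketch of that correspondence, not code from the paper:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 4, 2, 3  # number of variables, alphabet size, bond dimension (toy)
# non-negative TT cores define an unnormalized distribution over {0, 1}^n
cores = [np.abs(rng.standard_normal((d, r))),
         np.abs(rng.standard_normal((r, d, r))),
         np.abs(rng.standard_normal((r, d, r))),
         np.abs(rng.standard_normal((r, d)))]

def unnorm_p(x):
    """Chain of matrix-vector products selected by the outcome x."""
    v = cores[0][x[0]]                 # (r,)
    v = v @ cores[1][:, x[1], :]
    v = v @ cores[2][:, x[2], :]
    return float(v @ cores[3][:, x[3]])

# normalizing constant by brute force (cheap only for this tiny example)
Z = sum(unnorm_p(x) for x in itertools.product(range(d), repeat=n))
p = {x: unnorm_p(x) / Z for x in itertools.product(range(d), repeat=n)}
print(round(sum(p.values()), 6))  # 1.0
```

Replacing the non-negativity constraint with a squared (Born-rule) amplitude gives the quantum-inspired model class whose relative expressive power the paper analyzes.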
Efficient Contraction of Large Tensor Networks for Weighted Model Counting through Graph Decompositions
We show that tree decompositions can be used both to find carving decompositions and to factor tensor networks with high-rank, structured tensors.
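The reduction underlying this approach, turning each clause into a 0/1 tensor and counting models by contracting shared variable indices, fits in a tiny example. This sketch shows only the reduction itself, not the paper's carving/tree-decomposition machinery:

```python
import numpy as np

# each variable is a binary index; each clause becomes a 0/1 tensor whose
# entries mark the satisfying assignments of that clause
OR  = np.array([[0, 1], [1, 1]])  # clause (x OR y): zero only at x=y=0
NOR = OR[::-1, :]                 # clause (NOT x OR y): flip the x axis

# model count of (x OR y) AND (NOT x OR y): contract the shared indices
count = np.einsum('xy,xy->', OR, NOR)
print(int(count))  # 2  (models: x=0,y=1 and x=1,y=1)
```

For weighted model counting, the clause tensors simply carry the literal weights instead of 0/1 entries; the contraction order, which the graph decompositions optimize, is what determines the cost.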