no code implementations • ICML 2020 • Fangcheng Fu, Yuzheng Hu, Yihan He, Jiawei Jiang, Yingxia Shao, Ce Zhang, Bin Cui
Recent years have witnessed intense research interest in training deep neural networks (DNNs) more efficiently with quantization-based compression methods, which aid DNN training in two ways: (1) activations are quantized to shrink memory consumption, and (2) gradients are quantized to reduce communication cost.
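The two uses of quantization mentioned above can be illustrated with a minimal uniform-quantization sketch; this is a generic scheme for compressing float tensors (activations or gradients), not the specific method proposed in the paper:

```python
import numpy as np

def quantize_uniform(x, num_bits=8):
    """Uniformly quantize a float array to num_bits, then dequantize.

    The uint8 codes are what would be stored (activations) or sent
    (gradients); the caller gets back the lossy reconstruction.
    """
    qmax = 2 ** num_bits - 1
    lo, hi = x.min(), x.max()
    scale = (hi - lo) / qmax if hi > lo else 1.0
    codes = np.round((x - lo) / scale).astype(np.uint8)  # compressed form
    return codes * scale + lo                            # dequantized values

rng = np.random.default_rng(0)
grads = rng.normal(size=1000).astype(np.float32)
approx = quantize_uniform(grads, num_bits=8)
max_err = np.abs(grads - approx).max()
```

With 8 bits the reconstruction error per element is bounded by half a quantization step, while the payload shrinks from 32 bits to 8 bits per value.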
no code implementations • 25 Nov 2021 • Yihan He, Joan Bruna
In this example, we provide non-asymptotic bounds that depend strongly on the sparsity of the receptive field constructed by the algorithm.
no code implementations • 21 Nov 2021 • Yihan He
This work addresses the problem of learning a network with communication between vertices.
no code implementations • 19 Nov 2021 • Yihan He
We consider the problem of recovering the rank of a set of $n$ items based on noisy pairwise comparisons.
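As a concrete baseline for this setting, ranks can be estimated by counting how often each item wins its noisy comparisons (a Copeland/Borda-style count); this is a standard illustrative estimator, not necessarily the one analyzed in the paper:

```python
import numpy as np

def rank_by_wins(comparisons, n):
    """Rank n items from noisy pairwise comparisons by counting wins.

    comparisons: list of (i, j) pairs meaning "item i beat item j"
    in one comparison. Returns ranks where rank 0 is the best item.
    """
    wins = np.zeros(n)
    for i, j in comparisons:
        wins[i] += 1
    order = np.argsort(-wins)          # items sorted by descending wins
    ranks = np.empty(n, dtype=int)
    ranks[order] = np.arange(n)
    return ranks

# With enough comparisons, the win counts concentrate and the true
# order is recovered despite per-comparison noise.
comparisons = [(0, 1), (0, 2), (1, 2), (0, 1), (0, 2), (1, 2)]
ranks = rank_by_wins(comparisons, 3)
```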
1 code implementation • 1 Jan 2021 • Yihan He, Wei Cao, Shun Zheng, Zhifeng Gao, Jiang Bian
In this work, we present a new method named Fourier Temporal State Embedding (FTSE) to address the temporal information in dynamic graph representation learning.
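Since the abstract does not spell out the FTSE construction, the following is only a generic sketch of how scalar timestamps can be lifted into Fourier (sin/cos) features, the usual starting point for embedding temporal information:

```python
import numpy as np

def fourier_time_features(t, num_freqs=4):
    """Encode scalar timestamps with sin/cos features at dyadic frequencies.

    A generic Fourier-feature sketch; the actual FTSE embedding in the
    paper may use a different frequency set or learned components.
    """
    t = np.asarray(t, dtype=float)[:, None]      # shape (T, 1)
    freqs = 2.0 ** np.arange(num_freqs)          # shape (F,)
    angles = 2 * np.pi * t * freqs               # shape (T, F)
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=1)

emb = fourier_time_features([0.0, 0.5], num_freqs=4)  # shape (2, 8)
```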
no code implementations • 1 Jan 2021 • Yihan He, Wei Cao, Shun Zheng, Zhifeng Gao, Jiang Bian
In recent years, research communities have been developing stochastic sampling methods to handle large graphs when it is infeasible to fit the whole graph into a single batch.
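The stochastic sampling idea can be sketched with one-hop neighbor sampling (as popularized by GraphSAGE-style training); this illustrates the general technique, not the specific sampler studied in the paper:

```python
import random

def sample_neighbors(adj, batch_nodes, fanout=2, seed=0):
    """Keep at most `fanout` randomly chosen neighbors per batch node.

    adj: dict mapping node -> list of neighbor nodes. Subsampling the
    neighborhood bounds the per-batch computation regardless of degree.
    """
    rng = random.Random(seed)
    sampled = {}
    for v in batch_nodes:
        nbrs = adj.get(v, [])
        sampled[v] = list(nbrs) if len(nbrs) <= fanout else rng.sample(nbrs, fanout)
    return sampled

adj = {0: [1, 2, 3], 1: [0]}
mini_batch = sample_neighbors(adj, [0, 1], fanout=2)
```

Stacking this sampler per layer yields mini-batches whose size is independent of the full graph, at the cost of stochastic (but unbiased, with proper normalization) aggregation.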