no code implementations • 20 Dec 2023 • Rahul Chand, Yashoteja Prabhu, Pratyush Kumar
Extensive experiments on multiple natural language understanding benchmarks demonstrate that DSFormer obtains up to 40% better compression than state-of-the-art low-rank factorizers, leading semi-structured sparsity baselines, and popular knowledge distillation approaches.
no code implementations • 1 Apr 2023 • Rahul Chand, Rajat Arora, K Ram Prabhakar, R Venkatesh Babu
We present a framework that uses the recently introduced Capsule Networks to solve optical flow estimation, one of the fundamental computer vision tasks.