no code implementations • EMNLP 2021 • Arman Zharmagambetov, Magzhan Gabidolla, Miguel A. Carreira-Perpinan
Classification problems with thousands or more classes occur naturally in NLP, for example in language modeling or document classification.
no code implementations • 29 Sep 2021 • Yerlan Idelbayev, Arman Zharmagambetov, Magzhan Gabidolla, Miguel A. Carreira-Perpinan
We show that neural nets can be further compressed by replacing some of their layers with a special type of decision forest.
no code implementations • CVPR 2021 • Yerlan Idelbayev, Pavlo Molchanov, Maying Shen, Hongxu Yin, Miguel A. Carreira-Perpinan, Jose M. Alvarez
We study the problem of quantizing N sorted, scalar datapoints with a fixed codebook containing K entries that are allowed to be rescaled.
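A minimal sketch of this setup: each datapoint is mapped to the nearest entry of the scaled codebook, and the single scale factor is then refit in closed form by least squares. This alternating heuristic is only an illustration of the problem, assuming a single global rescaling, not the paper's actual algorithm; the function name `quantize_rescaled` is hypothetical.

```python
import numpy as np

def quantize_rescaled(x, codebook, iters=20):
    """Illustrative alternating heuristic (not the paper's method):
    quantize scalars x with a fixed codebook whose entries may be
    rescaled by a single factor s."""
    x = np.sort(np.asarray(x, dtype=float))        # N sorted scalars
    c = np.asarray(codebook, dtype=float)          # K fixed entries
    s = 1.0
    for _ in range(iters):
        # assignment step: nearest entry of the scaled codebook
        a = np.argmin(np.abs(x[:, None] - s * c[None, :]), axis=1)
        # scale step: least-squares optimal s for fixed assignments
        denom = np.sum(c[a] ** 2)
        if denom > 0:
            s = np.sum(x * c[a]) / denom
    error = np.sum((x - s * c[a]) ** 2)
    return s, a, error
```

For example, quantizing `[0.1, 0.2, 1.9, 2.1]` with codebook `[0, 1, 2]` recovers a scale near 1 and a small squared error.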
no code implementations • CVPR 2020 • Elad Eban, Yair Movshovitz-Attias, Hao Wu, Mark Sandler, Andrew Poon, Yerlan Idelbayev, Miguel A. Carreira-Perpinan
Despite the success of deep neural networks (DNNs), state-of-the-art models are too large to deploy on low-resource devices or common server configurations in which multiple models are held in memory.
no code implementations • NeurIPS 2018 • Miguel A. Carreira-Perpinan, Pooya Tavallali
Learning a decision tree from data is a difficult optimization problem.
no code implementations • NeurIPS 2015 • Miguel A. Carreira-Perpinan, Max Vladymyrov
This has two advantages: 1) the algorithm is universal, in that a learning algorithm for any choice of embedding and mapping can be constructed by simply reusing existing algorithms for the embedding and for the mapping.