no code implementations • 23 Nov 2023 • Muneeswaran I, Shreya Saxena, Siva Prasad, M V Sai Prakash, Advaith Shankar, Varun V, Vishal Vaddina, Saisubramaniam Gopalakrishnan
Large Language Models (LLMs) are widely used in critical fields such as healthcare, education, and finance due to their remarkable proficiency in various language-related tasks.
no code implementations • 25 Aug 2023 • M V Sai Prakash, Siddartha Reddy N, Ganesh Parab, Varun V, Vishal Vaddina, Saisubramaniam Gopalakrishnan
While recent advances in Graph Neural Networks (GNNs) and Transformers have proven effective and promising, they face the following limitations: Transformer self-attention does not explicitly consider the underlying molecular structure, while GNN feature representations alone are insufficient to capture the granular and hidden interactions and characteristics that distinguish similar molecules.
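One common way to make self-attention aware of molecular structure — sketched here purely as an illustration, not as this paper's actual method — is to mask the attention scores with the molecular graph's adjacency matrix, so each atom attends only to its bonded neighbours (and itself). A minimal pure-Python sketch, with dot-product attention over node features:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def masked_self_attention(feats, adj):
    """Self-attention over node features, restricted to graph neighbours.

    feats: list of per-node feature vectors (e.g. atom/GNN embeddings)
    adj:   adjacency matrix; adj[i][j] == 1 if nodes i and j are connected.
           Include self-loops (adj[i][i] == 1) so each node attends to itself.
    This is an illustrative sketch of adjacency-masked attention, not the
    architecture proposed in the paper above.
    """
    n, d = len(feats), len(feats[0])
    scale = math.sqrt(d)
    out = []
    for i in range(n):
        # Structure enters via the mask: scores exist only for neighbours.
        nbrs = [j for j in range(n) if adj[i][j]]
        scores = [sum(feats[i][k] * feats[j][k] for k in range(d)) / scale
                  for j in nbrs]
        w = softmax(scores)
        # Output is a convex combination of neighbour features.
        out.append([sum(w[t] * feats[j][k] for t, j in enumerate(nbrs))
                    for k in range(d)])
    return out
```

With an identity adjacency (self-loops only), each node simply reproduces its own features; a fully connected adjacency recovers ordinary unmasked self-attention.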
Ranked #2 on Molecular Property Prediction on ClinTox
no code implementations • 12 Dec 2020 • Saisubramaniam Gopalakrishnan, Pranshu Ranjan Singh, Haytham Fayek, Savitha Ramasamy, ArulMurugan Ambikapathi
Deep neural networks have shown promise in several domains, and the learned data- (task-) specific information is implicitly stored in the network parameters.
no code implementations • 16 Apr 2020 • Saisubramaniam Gopalakrishnan, Pranshu Ranjan Singh, Yasin Yazici, Chuan-Sheng Foo, Vijay Chandrasekhar, ArulMurugan Ambikapathi
Utilizing classification latent space information for downstream reconstruction and generation is an intriguing and relatively unexplored area.