Search Results for author: Octavian-Eugen Ganea

Found 20 papers, 16 papers with code

Independent SE(3)-Equivariant Models for End-to-End Rigid Protein Docking

1 code implementation • ICLR 2022 • Octavian-Eugen Ganea, Xinyuan Huang, Charlotte Bunne, Yatao Bian, Regina Barzilay, Tommi Jaakkola, Andreas Krause

Protein complex formation is a central problem in biology: it is involved in most of the cell's processes and is essential for applications such as drug design and protein engineering.

Graph Matching · Translation

Message Passing Networks for Molecules with Tetrahedral Chirality

1 code implementation • 24 Nov 2020 • Lagnajit Pattanaik, Octavian-Eugen Ganea, Ian Coley, Klavs F. Jensen, William H. Green, Connor W. Coley

Molecules with identical graph connectivity can exhibit different physical and biological properties if they differ in stereochemistry, a spatial structural characteristic.
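A minimal illustration of that point using RDKit (an external library, not code from this paper): two alanine enantiomers share the same graph connectivity but carry opposite tetrahedral chirality tags.

```python
from rdkit import Chem

# Two SMILES with identical connectivity but opposite tetrahedral chirality (assumed example).
smiles = ["C[C@H](N)C(=O)O", "C[C@@H](N)C(=O)O"]
for smi in smiles:
    mol = Chem.MolFromSmiles(smi)
    # Stripping stereo information yields the same canonical SMILES for both molecules...
    print(Chem.MolToSmiles(mol, isomericSmiles=False))
    # ...but the assigned chiral centers (R/S labels) differ between the two.
    print(Chem.FindMolChiralCenters(mol, includeUnassigned=True))
```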

Drug Discovery

Optimal Transport Graph Neural Networks

2 code implementations • 8 Jun 2020 • Benson Chen, Gary Bécigneul, Octavian-Eugen Ganea, Regina Barzilay, Tommi Jaakkola

Current graph neural network (GNN) architectures naively average or sum node embeddings into an aggregated graph representation -- potentially losing structural or semantic information.
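For reference, a minimal NumPy sketch (names assumed, not taken from the paper's code) of the naive mean/sum readout the abstract criticizes, which collapses all node embeddings of a graph into a single vector:

```python
import numpy as np

def naive_readout(node_embeddings: np.ndarray, mode: str = "mean") -> np.ndarray:
    """Collapse an (n_nodes, d) matrix of node embeddings into one d-dimensional graph vector.

    This is the aggregation step referred to above: the structure encoded in the set of
    node embeddings is reduced to a single average or sum.
    """
    if mode == "mean":
        return node_embeddings.mean(axis=0)
    return node_embeddings.sum(axis=0)

# Example: a 5-node graph with 8-dimensional node embeddings.
h = np.random.randn(5, 8)
graph_vector = naive_readout(h, mode="mean")  # shape (8,)
```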

Ranked #1 on Graph Regression on Lipophilicity (using extra training data)

Drug Discovery · Graph Regression +2

Computationally Tractable Riemannian Manifolds for Graph Embeddings

1 code implementation • 20 Feb 2020 • Calin Cruceru, Gary Bécigneul, Octavian-Eugen Ganea

Representing graphs as sets of node embeddings in certain curved Riemannian manifolds has recently gained momentum in machine learning due to their desirable geometric inductive biases; e.g., hierarchical structures benefit from hyperbolic geometry.
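For concreteness, here is the standard geodesic distance of one such curved space, the unit-curvature Poincaré ball model of hyperbolic space (a textbook formula, not this paper's contribution), in a small NumPy sketch:

```python
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Geodesic distance in the Poincare ball model of hyperbolic space (curvature -1).

    d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    Points must lie strictly inside the unit ball.
    """
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_diff / denom))

# Distances grow rapidly near the boundary, which is what lets tree-like
# (hierarchical) graphs embed with low distortion.
print(poincare_distance(np.array([0.1, 0.0]), np.array([0.0, 0.1])))
print(poincare_distance(np.array([0.9, 0.0]), np.array([0.0, 0.9])))
```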

BIG-bench Machine Learning

Mixed-curvature Variational Autoencoders

1 code implementation • ICLR 2020 • Ondrej Skopek, Octavian-Eugen Ganea, Gary Bécigneul

Euclidean geometry has historically been the typical "workhorse" for machine learning applications due to its power and simplicity.

Constant Curvature Graph Convolutional Networks

no code implementations • ICML 2020 • Gregor Bachmann, Gary Bécigneul, Octavian-Eugen Ganea

Interest has been rising lately in methods that represent data in non-Euclidean spaces, e.g. hyperbolic or spherical, which provide specific inductive biases useful for certain real-world data properties, e.g. scale-free, hierarchical or cyclical structure.
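For the spherical case mentioned here, a minimal sketch of the constant positive-curvature (great-circle) distance between unit-norm points; again standard geometry rather than this paper's method:

```python
import numpy as np

def spherical_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Great-circle distance between two points on the unit sphere: d(u, v) = arccos(<u, v>)."""
    dot = np.clip(np.dot(u, v), -1.0, 1.0)  # guard against rounding slightly outside [-1, 1]
    return float(np.arccos(dot))

# Two points 90 degrees apart on the 2-sphere: distance is about pi/2.
print(spherical_distance(np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])))
```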

Node Classification

Noise Contrastive Variational Autoencoders

no code implementations • 23 Jul 2019 • Octavian-Eugen Ganea, Yashas Annadani, Gary Bécigneul

We take steps towards understanding the "posterior collapse (PC)" difficulty in variational autoencoders (VAEs), i.e. ...
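Posterior collapse is commonly diagnosed by the KL term of the VAE objective shrinking towards zero, so that the approximate posterior matches the prior and ignores the input. A minimal sketch of that diagnostic for a diagonal-Gaussian encoder (an assumed setup for illustration, not the paper's code):

```python
import numpy as np

def gaussian_kl_to_standard_normal(mu: np.ndarray, log_var: np.ndarray) -> float:
    """KL( N(mu, diag(exp(log_var))) || N(0, I) ), summed over latent dimensions.

    In a collapsed posterior, mu ~ 0 and log_var ~ 0 for every input, so this term
    vanishes and the decoder effectively ignores the latent code.
    """
    return float(0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var))

# A healthy posterior vs. a (nearly) collapsed one.
print(gaussian_kl_to_standard_normal(np.array([1.5, -0.7]), np.array([-1.0, 0.3])))
print(gaussian_kl_to_standard_normal(np.array([1e-3, -1e-3]), np.array([1e-3, 0.0])))
```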

Breaking the Softmax Bottleneck via Learnable Monotonic Pointwise Non-linearities

no code implementations • 21 Feb 2019 • Octavian-Eugen Ganea, Sylvain Gelly, Gary Bécigneul, Aliaksei Severyn

The Softmax function on top of a final linear layer is the de facto method to output probability distributions in neural networks.
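For reference, the standard softmax mapping from logits to a probability distribution, written in the usual numerically stable form (illustrative only; the paper studies learnable monotonic pointwise non-linearities that go beyond it):

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """softmax(z)_i = exp(z_i) / sum_j exp(z_j), shifted by max(z) for numerical stability."""
    shifted = logits - np.max(logits)
    exp = np.exp(shifted)
    return exp / np.sum(exp)

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs, probs.sum())  # non-negative entries that sum to 1
```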

Language Modelling · Text Generation

Riemannian Adaptive Optimization Methods

1 code implementation • ICLR 2019 • Gary Bécigneul, Octavian-Eugen Ganea

Several first-order stochastic optimization methods commonly used in the Euclidean domain, such as stochastic gradient descent (SGD), accelerated gradient descent, or variance-reduced methods, have already been adapted to certain Riemannian settings.
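To make such an adaptation concrete, here is a toy sketch of one Riemannian SGD step on the unit sphere (project the Euclidean gradient onto the tangent space, then follow the exponential map); the manifold and step size are illustrative assumptions, not the adaptive schemes developed in the paper:

```python
import numpy as np

def riemannian_sgd_step_sphere(x: np.ndarray, euclidean_grad: np.ndarray, lr: float) -> np.ndarray:
    """One Riemannian SGD step on the unit sphere S^{n-1}.

    1. Project the Euclidean gradient onto the tangent space at x.
    2. Move along the exponential map exp_x(-lr * riem_grad), which stays on the sphere.
    """
    riem_grad = euclidean_grad - np.dot(x, euclidean_grad) * x   # tangent-space projection
    v = -lr * riem_grad
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)    # exponential map on the sphere

x = np.array([1.0, 0.0, 0.0])                     # a point on S^2
g = np.array([0.0, 1.0, 0.0])                     # some Euclidean gradient
x_new = riemannian_sgd_step_sphere(x, g, lr=0.1)
print(x_new, np.linalg.norm(x_new))               # still (numerically) unit norm
```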

Riemannian Optimization · Stochastic Optimization

Learning and Evaluating Sparse Interpretable Sentence Embeddings

no code implementations • WS 2018 • Valentin Trifonov, Octavian-Eugen Ganea, Anna Potapenko, Thomas Hofmann

Previous research on word embeddings has shown that sparse representations, which can either be learned on top of existing dense embeddings or obtained through model constraints during training, offer increased interpretability: to some degree, each dimension can be understood by a human and associated with a recognizable feature in the data.
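One common way to obtain sparse codes on top of existing dense embeddings, which the abstract mentions, is dictionary learning with a sparsity-constrained transform; a small scikit-learn sketch under that assumption (not the paper's exact setup, and the sizes are placeholders):

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

# Stand-in for pretrained dense sentence embeddings: 200 sentences, 64 dimensions.
dense = np.random.randn(200, 64)

# Learn an overcomplete dictionary and re-encode each embedding with few non-zero coefficients.
dico = DictionaryLearning(
    n_components=128,
    transform_algorithm="omp",      # orthogonal matching pursuit for the sparse codes
    transform_n_nonzero_coefs=8,    # at most 8 active dimensions per sentence
    max_iter=20,
    random_state=0,
)
sparse_codes = dico.fit_transform(dense)
print(sparse_codes.shape, (sparse_codes != 0).sum(axis=1).mean())  # ~8 active dims per code
```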

Sentence · Sentence Embedding +2

Hyperbolic Neural Networks

3 code implementations • NeurIPS 2018 • Octavian-Eugen Ganea, Gary Bécigneul, Thomas Hofmann

However, the representational power of hyperbolic geometry is not yet on par with Euclidean geometry, mostly because of the absence of corresponding hyperbolic neural network layers.
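Hyperbolic layers in this line of work are typically built from gyrovector operations in the Poincaré ball. As an illustration (a standard formula, not code from the paper), here is Möbius addition at curvature -1, the hyperbolic analogue of vector addition used, e.g., when adding a bias:

```python
import numpy as np

def mobius_add(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Mobius addition in the unit Poincare ball (curvature -1):

    u (+) v = ((1 + 2<u,v> + ||v||^2) u + (1 - ||u||^2) v) / (1 + 2<u,v> + ||u||^2 ||v||^2)

    It reduces to ordinary addition near the origin and keeps results inside the ball.
    """
    uv = np.dot(u, v)
    uu = np.dot(u, u)
    vv = np.dot(v, v)
    num = (1.0 + 2.0 * uv + vv) * u + (1.0 - uu) * v
    den = 1.0 + 2.0 * uv + uu * vv
    return num / den

p = mobius_add(np.array([0.3, 0.1]), np.array([-0.2, 0.4]))
print(p, np.linalg.norm(p) < 1.0)  # the result stays inside the unit ball
```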

Graph Representation Learning · Natural Language Inference +2

Hyperbolic Entailment Cones for Learning Hierarchical Embeddings

3 code implementations • ICML 2018 • Octavian-Eugen Ganea, Gary Bécigneul, Thomas Hofmann

Learning graph representations via low-dimensional embeddings that preserve relevant network properties is an important class of problems in machine learning.

Graph Embedding · Hypernym Discovery +2

Deep Joint Entity Disambiguation with Local Neural Attention

3 code implementations • EMNLP 2017 • Octavian-Eugen Ganea, Thomas Hofmann

We propose a novel deep learning model for joint document-level entity disambiguation, which leverages learned neural representations.

Entity Disambiguation

Neural Multi-Step Reasoning for Question Answering on Semi-Structured Tables

1 code implementation • 21 Feb 2017 • Till Haug, Octavian-Eugen Ganea, Paulina Grnarova

Second, paraphrases of logical forms and questions are embedded in a jointly learned vector space using word and character convolutional neural networks.
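A minimal sketch of the character-convolutional part of such an encoder (PyTorch; the layer sizes and names here are assumptions, not the paper's configuration): embed characters, convolve, and max-pool over time to obtain a fixed-size vector.

```python
import torch
import torch.nn as nn

class CharCNNEncoder(nn.Module):
    """Embed a character sequence and max-pool convolutional features into one vector."""

    def __init__(self, vocab_size: int = 100, char_dim: int = 16,
                 num_filters: int = 64, kernel_size: int = 5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, char_dim, padding_idx=0)
        self.conv = nn.Conv1d(char_dim, num_filters, kernel_size, padding=kernel_size // 2)

    def forward(self, char_ids: torch.Tensor) -> torch.Tensor:
        # char_ids: (batch, seq_len) integer character indices
        x = self.embed(char_ids).transpose(1, 2)   # (batch, char_dim, seq_len)
        x = torch.relu(self.conv(x))               # (batch, num_filters, seq_len)
        return x.max(dim=2).values                 # (batch, num_filters) pooled vector

encoder = CharCNNEncoder()
ids = torch.randint(1, 100, (2, 40))               # two questions of 40 characters each
print(encoder(ids).shape)                          # torch.Size([2, 64])
```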

Question Answering

Probabilistic Bag-Of-Hyperlinks Model for Entity Linking

1 code implementation • 8 Sep 2015 • Octavian-Eugen Ganea, Marina Ganea, Aurelien Lucchi, Carsten Eickhoff, Thomas Hofmann

We demonstrate the accuracy of our approach on a wide range of benchmark datasets, showing that it matches, and in many cases outperforms, existing state-of-the-art methods.

Entity Disambiguation · Entity Linking +3
