Search Results for author: Burc Gokden

Found 2 papers, 2 papers with code

Power Law Graph Transformer for Machine Translation and Representation Learning

1 code implementation • 27 Jun 2021 • Burc Gokden

We present the Power Law Graph Transformer, a transformer model with well-defined deductive and inductive tasks for prediction and representation learning.

Machine Translation • Quantization • +2
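The abstract mentions attention over graph-structured inputs with a power-law flavor. As a rough intuition only, the sketch below shows a generic power-law-shaped attention weighting over node features; the function name `power_law_attention`, the distance-based kernel, and the exponent `alpha` are illustrative assumptions, not the Power Law Graph Transformer's actual formulation or its deductive/inductive task definitions.

```python
import numpy as np

def power_law_attention(X, alpha=2.0, eps=1e-6):
    """Illustrative sketch: attention weights that decay as a power law
    of pairwise feature distance. NOT the paper's formula."""
    # pairwise Euclidean distances between node embeddings (n, n)
    diff = X[:, None, :] - X[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1)) + eps
    # power-law kernel: nearer nodes receive larger unnormalized scores
    scores = dist ** (-alpha)
    np.fill_diagonal(scores, 0.0)  # exclude self-interaction for clarity
    # row-normalize into attention weights
    return scores / (scores.sum(axis=1, keepdims=True) + eps)

# toy usage: 5 nodes with 8-dimensional features
A = power_law_attention(np.random.randn(5, 8))
print(A.shape, A.sum(axis=1))  # (5, 5), each row sums to ~1
```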

CoulGAT: An Experiment on Interpretability of Graph Attention Networks

1 code implementation • 18 Dec 2019 • Burc Gokden

We present an attention mechanism inspired by the definition of the screened Coulomb potential.

Graph Attention
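For intuition, the screened Coulomb (Yukawa) potential has the form V(r) ∝ exp(-r/λ)/r. The sketch below shapes pairwise attention scores with that functional form; the function name `screened_coulomb_attention`, the use of embedding distance as r, and the `screening_length` parameter are assumptions for illustration, not CoulGAT's actual attention definition.

```python
import numpy as np

def screened_coulomb_attention(X, screening_length=1.0, eps=1e-6):
    """Illustrative sketch: attention scores shaped like a screened
    Coulomb potential, exp(-r / lambda) / r. NOT CoulGAT's definition."""
    # pairwise "distances" between node embeddings (n, n)
    diff = X[:, None, :] - X[None, :, :]
    r = np.sqrt((diff ** 2).sum(-1)) + eps
    # Yukawa-style kernel: short-range boost, exponential screening
    scores = np.exp(-r / screening_length) / r
    np.fill_diagonal(scores, 0.0)  # drop self-interaction
    # row-normalize into attention weights
    return scores / (scores.sum(axis=1, keepdims=True) + eps)

# toy usage: 6 nodes with 4-dimensional features
A = screened_coulomb_attention(np.random.randn(6, 4))
print(A.round(3))
```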
