Search Results for author: Pavel Karpov

Found 2 papers, 2 papers with code

State-of-the-Art Augmented NLP Transformer models for direct and single-step retrosynthesis

1 code implementation • 5 Mar 2020 • Igor V. Tetko, Pavel Karpov, Ruud Van Deursen, Guillaume Godin

We investigated the effect of different training scenarios on predicting the (retro)synthesis of chemical compounds using a text-like representation of chemical reactions (SMILES) and Natural Language Processing neural network Transformer architecture.
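Before a SMILES string can be fed to a Transformer, it is typically split into chemically meaningful tokens (bracket atoms, two-letter elements, ring-closure digits, bonds). A minimal sketch of such a tokenizer follows; the regex and the example molecule are illustrative assumptions, not code from the paper:

```python
import re

# Regex covering common SMILES tokens: bracket atoms, two-letter elements,
# aromatic/organic-subset atoms, bonds, branches, and ring-closure digits.
# Illustrative, not exhaustive.
SMILES_TOKEN = re.compile(
    r"(\[[^\]]+\]|Br|Cl|Si|@@|%\d{2}|[BCNOPSFIbcnops]|[=#\-\+\\/\(\)\.@]|\d)"
)

def tokenize(smiles: str) -> list[str]:
    """Split a SMILES string into tokens for a sequence model."""
    tokens = SMILES_TOKEN.findall(smiles)
    # Sanity check: the tokens must reassemble into the original string.
    assert "".join(tokens) == smiles, "unrecognized characters in SMILES"
    return tokens

# Aspirin as an example input
print(tokenize("CC(=O)Oc1ccccc1C(=O)O"))
```

Keeping multi-character tokens such as `Br` and `Cl` intact is the usual design choice here, so the model never sees a bare `B` or `C` that was actually part of a halogen.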

Tasks: Data Augmentation, Memorization, +2

Transformer-CNN: Fast and Reliable tool for QSAR

1 code implementation • 21 Oct 2019 • Pavel Karpov, Guillaume Godin, Igor V. Tetko

Because both the augmentation and the transfer learning are based on embeddings, the method provides good results even for small datasets.
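A common way to exploit SMILES augmentation at prediction time is to average a model's outputs over several spellings of the same molecule. The sketch below assumes a hypothetical single-output QSAR model; the placeholder model and the SMILES variants are illustrative, not the paper's code:

```python
from statistics import mean
from typing import Callable

def augmented_predict(variants: list[str],
                      model: Callable[[str], float]) -> float:
    """Average model predictions over augmented SMILES of one molecule."""
    return mean(model(s) for s in variants)

# Placeholder model: scores by string length (stands in for a trained net).
def toy_model(smiles: str) -> float:
    return float(len(smiles))

# Two hypothetical SMILES spellings of the same molecule (ethanol)
print(augmented_predict(["OCC", "C(O)C"], toy_model))  # → 4.0
```

Averaging over variants acts as a cheap ensemble, which is one reason augmentation tends to stabilize predictions on small datasets.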

Tasks: Transfer Learning
