State-of-the-Art Augmented NLP Transformer models for direct and single-step retrosynthesis

We investigated the effect of different training scenarios on predicting the (retro)synthesis of chemical compounds using a text-like representation of chemical reactions (SMILES) and a Natural Language Processing Transformer neural-network architecture. We showed that data augmentation, a powerful method widely used in image processing, eliminated the effect of data memorization by the neural networks and improved their performance on the prediction of new sequences.
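The augmentation idea is to feed the model several equivalent SMILES strings for the same molecule, obtained by starting the atom traversal from different atoms. As a hedged illustration only (real pipelines typically use RDKit's SMILES randomization, e.g. `Chem.MolToSmiles(mol, doRandom=True)`; the toy graph representation and function below are hypothetical), a minimal sketch for simple acyclic molecules:

```python
import random

def random_smiles(atoms, bonds, root=None, seed=None):
    """Emit one SMILES string for an acyclic molecule by DFS from a chosen atom.

    atoms: dict mapping atom index -> element symbol (single-letter, toy case)
    bonds: iterable of frozenset atom-index pairs (single bonds only)
    root:  starting atom index; picked at random if None
    """
    rng = random.Random(seed)
    # Build an adjacency list from the bond set.
    adj = {i: [] for i in atoms}
    for b in bonds:
        i, j = tuple(b)
        adj[i].append(j)
        adj[j].append(i)
    if root is None:
        root = rng.choice(sorted(atoms))

    def dfs(node, parent):
        children = [n for n in adj[node] if n != parent]
        rng.shuffle(children)  # different orders give different valid SMILES
        out = atoms[node]
        for k, child in enumerate(children):
            sub = dfs(child, node)
            # All but the last branch go in parentheses, as in SMILES syntax.
            out += sub if k == len(children) - 1 else "(" + sub + ")"
        return out

    return dfs(root, None)

# Ethanol as a toy graph: C(0)-C(1)-O(2)
atoms = {0: "C", 1: "C", 2: "O"}
bonds = {frozenset({0, 1}), frozenset({1, 2})}
print(random_smiles(atoms, bonds, root=0))  # CCO
print(random_smiles(atoms, bonds, root=2))  # OCC
```

Each such string denotes the same molecule, so training on many of them discourages the Transformer from memorizing one canonical form.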
