1 code implementation • 12 Oct 2021 • Christoph Wick, Jochen Zöllner, Tobias Grüning
The CTC confidences are computed on the encoder, while the Transformer is used only for character-wise sequence-to-sequence (S2S) decoding.
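This is not the authors' implementation, but a minimal PyTorch sketch of the described split: a CTC head sits directly on the encoder output, while a Transformer decoder attends to the same encoder memory for character-wise S2S decoding. All layer sizes, module choices, and the alphabet size are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the paper's code): CTC confidences
# come from the encoder alone; the Transformer only does S2S decoding.
import torch
import torch.nn as nn

class EncoderDecoderHTR(nn.Module):
    def __init__(self, num_chars, d_model=256):
        super().__init__()
        # Hypothetical encoder: a bidirectional LSTM over feature frames.
        self.encoder = nn.LSTM(d_model, d_model // 2, batch_first=True,
                               bidirectional=True)
        self.ctc_head = nn.Linear(d_model, num_chars + 1)  # +1 for CTC blank
        self.embed = nn.Embedding(num_chars, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.out = nn.Linear(d_model, num_chars)

    def forward(self, frames, tgt_tokens):
        memory, _ = self.encoder(frames)
        # CTC confidences are computed on the encoder output only.
        ctc_log_probs = self.ctc_head(memory).log_softmax(-1)
        # The Transformer decodes character-wise against the encoder memory.
        tgt = self.embed(tgt_tokens)
        T = tgt.size(1)
        causal = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        dec = self.decoder(tgt, memory, tgt_mask=causal)
        return ctc_log_probs, self.out(dec)

frames = torch.randn(2, 100, 256)    # batch of encoder feature sequences
tgt = torch.randint(0, 80, (2, 20))  # teacher-forced character tokens
ctc_lp, s2s_logits = EncoderDecoderHTR(num_chars=80)(frames, tgt)
```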
2 code implementations • 23 Apr 2021 • Jochen Zöllner, Konrad Sperfeld, Christoph Wick, Roger Labahn
Currently, the most widespread neural network architecture for training language models is BERT, which has led to improvements in various Natural Language Processing (NLP) tasks.
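As a reference point, here is a minimal sketch of querying a BERT masked language model through the Hugging Face transformers library; the checkpoint name and the example sentence are assumptions, not taken from the paper.

```python
# Minimal sketch of BERT's masked-language-model objective via Hugging Face
# transformers (the checkpoint is an assumed, generic choice).
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

inputs = tokenizer("Handwritten text [MASK] is hard.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Report the top prediction for the masked position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
print(tokenizer.decode(logits[0, mask_pos].argmax().item()))
```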
1 code implementation • 18 Mar 2019 • Johannes Michael, Roger Labahn, Tobias Grüning, Jochen Zöllner
Encoder-decoder models have become an effective approach for sequence learning tasks like machine translation, image captioning and speech recognition, but have yet to show competitive results for handwritten text recognition.
Ranked #9 on Handwritten Text Recognition on IAM
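To make the encoder-decoder setup concrete, here is a hedged sketch of character-wise greedy decoding at inference time, reusing the interface of the hypothetical `EncoderDecoderHTR` sketch above; the SOS/EOS token ids and the simple stopping rule are assumptions.

```python
# Sketch (assumed interface): autoregressive greedy decoding with an
# encoder-decoder model such as the EncoderDecoderHTR sketch above.
import torch

def greedy_decode(model, frames, sos_id=0, eos_id=1, max_len=50):
    """Emit one character per step until every sequence hits EOS or max_len."""
    memory, _ = model.encoder(frames)
    tokens = torch.full((frames.size(0), 1), sos_id, dtype=torch.long)
    for _ in range(max_len):
        tgt = model.embed(tokens)
        T = tgt.size(1)
        causal = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        dec = model.decoder(tgt, memory, tgt_mask=causal)
        next_tok = model.out(dec[:, -1]).argmax(-1, keepdim=True)
        tokens = torch.cat([tokens, next_tok], dim=1)
        # Simplified stop: halt once the whole batch emits EOS together.
        if (next_tok == eos_id).all():
            break
    return tokens[:, 1:]  # strip the SOS token
```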