Search Results for author: Daniel Fleischer

Found 5 papers, 0 papers with code

Latent Universal Task-Specific BERT

no code implementations 16 May 2019 Alon Rozental, Zohar Kelrich, Daniel Fleischer

This paper describes a language representation model that combines the Bidirectional Encoder Representations from Transformers (BERT) learning mechanism described in Devlin et al. (2018) with a generalization of the Universal Transformer model described in Dehghani et al. (2018).
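The key idea of the Universal Transformer that the abstract refers to is applying the same transformer block recurrently across depth, rather than stacking distinct layers as vanilla BERT does. The following is a minimal, hypothetical sketch of that weight-sharing contrast; the `block` function is a stand-in for a full transformer layer (attention plus feed-forward), not the authors' actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 8   # hidden size (illustrative, not the paper's value)
n_steps = 4   # recurrent depth / number of layers

def block(x, W):
    # Stand-in for one transformer layer: linear map + residual + ReLU.
    return np.maximum(x + x @ W, 0.0)

x = rng.normal(size=(3, d_model))  # 3 "token" vectors

# Universal-Transformer-style: ONE shared weight matrix, applied n_steps times.
W_shared = rng.normal(scale=0.1, size=(d_model, d_model))
h = x
for _ in range(n_steps):
    h = block(h, W_shared)

# Vanilla deep stack (BERT-style): n_steps DISTINCT weight matrices.
W_per_layer = [rng.normal(scale=0.1, size=(d_model, d_model))
               for _ in range(n_steps)]
g = x
for W in W_per_layer:
    g = block(g, W)

# Same input/output interface; the shared version has 1/n_steps the parameters.
print(h.shape, g.shape)
```

Both variants map `(3, d_model)` inputs to `(3, d_model)` outputs; the difference is only in how many parameter sets the depth dimension consumes.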

Amobee at IEST 2018: Transfer Learning from Language Models

no code implementations WS 2018 Alon Rozental, Daniel Fleischer, Zohar Kelrich

This paper describes the system developed at Amobee for the WASSA 2018 implicit emotions shared task (IEST).

Transfer Learning
