no code implementations • 2 May 2024 • Nameyeh Alam, Jake Basilico, Daniele Bertolini, Satish Casie Chetty, Heather D'Angelo, Ryan Douglas, Charles K. Fisher, Franklin Fuller, Melissa Gomes, Rishabh Gupta, Alex Lang, Anton Loukianov, Rachel Mak-McCully, Cary Murray, Hanalei Pham, Susanna Qiao, Elena Ryapolova-Webb, Aaron Smith, Dimitri Theoharatos, Anil Tolwani, Eric W. Tramel, Anna Vidovszky, Judy Viduya, Jonathan R. Walsh
We show that the same neural network architecture can be trained to generate accurate digital twins for patients across 13 different indications simply by changing the training set and tuning hyperparameters.
no code implementations • 1 Mar 2024 • LiWei Wang, Xinru Liu, Aaron Smith, Yves Atchade
Cyclical MCMC is a framework recently proposed by Zhang et al. (2019) to address the challenge posed by high-dimensional, multimodal posterior distributions, such as those arising in deep learning.
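The entry above does not spell out the mechanism; the defining ingredient in Zhang et al.'s cyclical framework is a step size that periodically restarts at a large value (to jump between modes) and decays within each cycle (to explore locally). A minimal sketch of the cosine schedule used in that line of work, with hypothetical parameter names of my own choosing:

```python
import math

def cyclical_stepsize(k, total_steps, n_cycles, alpha0):
    """Cosine cyclical step-size schedule (after Zhang et al., 2019).

    The step size restarts at alpha0 at the start of each cycle and
    decays toward 0 by the cycle's end; large restarts help the sampler
    escape one posterior mode and find another."""
    steps_per_cycle = math.ceil(total_steps / n_cycles)
    t = (k % steps_per_cycle) / steps_per_cycle  # position within cycle, in [0, 1)
    return alpha0 / 2 * (math.cos(math.pi * t) + 1)

# example: 100 sampling steps split into 4 cycles, initial step size 0.1
alpha = cyclical_stepsize(k=12, total_steps=100, n_cycles=4, alpha0=0.1)
```

Each cycle begins at the full step size `alpha0` (e.g. `k = 0, 25, 50, 75` here) and decays smoothly to near zero before the next restart.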
no code implementations • 13 Apr 2023 • Guanxun Li, Aaron Smith, Quan Zhou
Informed importance tempering (IIT) is an easy-to-implement MCMC algorithm that extends the familiar Metropolis-Hastings algorithm with one special feature: informed proposals are always accepted. Zhou and Smith (2022) showed that it converges much more quickly in some common circumstances.
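The abstract does not give the algorithm in detail, but the always-accept idea can be illustrated on a toy discrete target: proposals are drawn in proportion to a balancing function of the target ratio and never rejected, and an importance weight per visit corrects the resulting bias. A minimal sketch, assuming a ring state space and the balancing function h(r) = sqrt(r) (the paper treats a more general setting):

```python
import math
import random

def iit_ring(log_pi, n_states, n_steps, seed=0):
    """Sketch of informed importance tempering on a ring of n_states.

    At state x, neighbor y is proposed with weight h(pi(y)/pi(x)),
    h(r) = sqrt(r), and the move is ALWAYS accepted. The chain's
    stationary law is proportional to pi(x) * Z(x), where Z(x) is the
    sum of proposal weights at x, so each visit carries importance
    weight w(x) = 1/Z(x) to recover expectations under pi."""
    rng = random.Random(seed)
    x = 0
    samples, weights = [], []
    for _ in range(n_steps):
        nbrs = [(x - 1) % n_states, (x + 1) % n_states]
        # h(pi(y)/pi(x)) evaluated stably in log space
        u = [math.exp(0.5 * (log_pi(y) - log_pi(x))) for y in nbrs]
        z = sum(u)
        samples.append(x)
        weights.append(1.0 / z)
        x = rng.choices(nbrs, weights=u)[0]  # informed proposal, no rejection step
    return samples, weights

# Hypothetical toy target pi proportional to [1, 2, 3, 2, 1] (not from the paper).
table = [1.0, 2.0, 3.0, 2.0, 1.0]
xs, ws = iit_ring(lambda s: math.log(table[s]), 5, 100_000)
est_mean = sum(w * x for w, x in zip(ws, xs)) / sum(ws)
# by symmetry the target mean is 2.0; est_mean should be close to it
```

Because every proposal is accepted, the chain never stalls at a rejection; the self-normalized importance weights do the work that the accept/reject step does in plain Metropolis-Hastings.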
no code implementations • CONLL 2018 • Aaron Smith, Bernd Bohnet, Miryam de Lhoneux, Joakim Nivre, Yan Shao, Sara Stymne
We present the Uppsala system for the CoNLL 2018 Shared Task on universal dependency parsing.
no code implementations • EMNLP 2018 • Aaron Smith, Miryam de Lhoneux, Sara Stymne, Joakim Nivre
We provide a comprehensive analysis of the interactions between pre-trained word embeddings, character models and POS tags in a transition-based dependency parser.
1 code implementation • 9 Aug 2018 • Oren Mangoubi, Natesh S. Pillai, Aaron Smith
In this paper, we investigate a different scaling question: does HMC beat RWM for highly multimodal targets?
1 code implementation • ACL 2018 • Sara Stymne, Miryam de Lhoneux, Aaron Smith, Joakim Nivre
How to make the most of multiple heterogeneous treebanks when training a monolingual dependency parser is an open question.
no code implementations • LREC 2014 • Liane Guillou, Christian Hardmeier, Aaron Smith, Jörg Tiedemann, Bonnie Webber
We present ParCor, a parallel corpus of texts in which pronoun coreference (reduced coreference in which pronouns are used as referring expressions) has been annotated.