no code implementations • 30 Oct 2023 • Gökberk Özsoy, Luis Salamanca, Matthew Connelly, Raymond Hicks, Fernando Pérez-Cruz
In this paper, we present the KG-FRUS dataset, comprising more than 300,000 US government diplomatic documents encoded in a Knowledge Graph (KG).
no code implementations • 23 Aug 2021 • Aurora Cobo Aguilera, Pablo Martínez Olmos, Antonio Artés-Rodríguez, Fernando Pérez-Cruz
Language models (LMs) have grown non-stop over the last decade, from sequence-to-sequence architectures to the state-of-the-art attention-based Transformers.
1 code implementation • 4 Jun 2020 • Aurora Cobo Aguilera, Antonio Artés-Rodríguez, Fernando Pérez-Cruz, Pablo Martínez Olmos
Deep learning requires regularization mechanisms to reduce overfitting and improve generalization.
3 code implementations • 28 Jan 2019 • Pablo Sánchez-Martín, Pablo M. Olmos, Fernando Pérez-Cruz
We propose a new method to evaluate GANs, namely EvalGAN.
no code implementations • 31 Oct 2016 • Rafael Boloix-Tortosa, Juan José Murillo-Fuentes, Irene Santos Velázquez, Fernando Pérez-Cruz
Usually, complex-valued RKHSs are presented as a straightforward extension of the real-valued case.
no code implementations • 12 Mar 2013 • Fernando Pérez-Cruz, Steven Van Vaerenbergh, Juan José Murillo-Fuentes, Miguel Lázaro-Gredilla, Ignacio Santamaria
Gaussian processes (GPs) are versatile tools that have been successfully employed to solve nonlinear estimation problems in machine learning, but they are rarely used in signal processing.
no code implementations • NeurIPS 2012 • Francisco Ruiz, Isabel Valera, Carlos Blanco, Fernando Pérez-Cruz
In this paper, we seek the hidden causes behind suicide attempts, for which we propose to model the subjects using a nonparametric latent model based on the Indian Buffet Process (IBP).
no code implementations • NeurIPS 2011 • Pablo M. Olmos, Luis Salamanca, Juan Fuentes, Fernando Pérez-Cruz
We show an application of a tree structure for approximate inference in graphical models using the expectation propagation algorithm.
no code implementations • NeurIPS 2008 • Fernando Pérez-Cruz
We analyze the estimation of information-theoretic measures of continuous random variables, such as differential entropy, mutual information, and the Kullback-Leibler divergence.