no code implementations • ICML 2018 • Shinji Ito, Akihiro Yabe, Ryohei Fujimaki
Predictive optimization, however, suffers from the problem that the calculated optimal solution is evaluated too optimistically, i.e., the value of the objective function is overestimated.
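This overestimation effect (often called the optimizer's curse) can be demonstrated with a toy Monte Carlo simulation, not taken from the paper: when every candidate solution is truly worth 0 but the objective is estimated with noise, the value at the estimated optimum is biased well above the truth.

```python
import numpy as np

# Toy illustration (our own, not the paper's method): with noisy objective
# estimates, the value at the argmax of the estimates overshoots the truth.
rng = np.random.default_rng(0)
true_values = np.zeros(50)          # every candidate is truly worth 0
trials = 10_000
gap = 0.0
for _ in range(trials):
    estimates = true_values + rng.standard_normal(50)  # noisy predictions
    best = np.argmax(estimates)     # "optimal" solution under the model
    gap += estimates[best] - true_values[best]
gap /= trials
print(gap)  # ≈ 2.25, the expected max of 50 standard normals, vs. a true value of 0
```

The average gap matches the expected maximum of 50 i.i.d. standard normals, showing the optimistic bias grows with the number of candidates compared.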
no code implementations • NeurIPS 2017 • Zhao Song, Yusuke Muraoka, Ryohei Fujimaki, Lawrence Carin
We propose a scalable algorithm for model selection in sigmoid belief networks (SBNs), based on the factorized asymptotic Bayesian (FAB) framework.
no code implementations • 7 Nov 2017 • Masato Asahara, Ryohei Fujimaki
The importance of interpretability of machine learning models has been increasing due to emerging enterprise predictive analytics, data privacy concerns, and the accountability of artificial intelligence in society.
1 code implementation • 10 Jul 2017 • Wei Qian, Wending Li, Yasuhiro Sogawa, Ryohei Fujimaki, Xitong Yang, Ji Liu
Sparsity learning with known grouping structure has received considerable attention due to wide modern applications in high-dimensional data analysis.
no code implementations • ICML 2017 • Haichuan Yang, Shupeng Gui, Chuyang Ke, Daniel Stefankovic, Ryohei Fujimaki, Ji Liu
The cardinality constraint is an intrinsic way to restrict the solution structure in many domains, for example, sparse learning, feature selection, and compressed sensing.
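A common building block for cardinality-constrained problems is the Euclidean projection onto the constraint set {x : ||x||_0 ≤ k}, which simply keeps the k largest-magnitude entries. A minimal numpy sketch (the function name is ours, for illustration only):

```python
import numpy as np

def project_cardinality(x, k):
    """Euclidean projection onto {x : ||x||_0 <= k}: keep the k
    largest-magnitude entries of x and zero out the rest."""
    x = np.asarray(x, dtype=float)
    if k >= x.size:
        return x.copy()
    keep = np.argpartition(np.abs(x), -k)[-k:]  # indices of k largest |x_i|
    out = np.zeros_like(x)
    out[keep] = x[keep]
    return out

print(project_cardinality([3.0, -0.5, 2.0, 0.1], 2))  # → [3. 0. 2. 0.]
```

This hard-thresholding step is what projected-gradient-style methods apply after each gradient update in sparse learning and compressed sensing.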
no code implementations • NeurIPS 2016 • Shinji Ito, Ryohei Fujimaki
On the basis of this connection, we propose an efficient algorithm that employs network flow algorithms.
no code implementations • 18 May 2016 • Shinji Ito, Ryohei Fujimaki
This paper addresses a novel data science problem, prescriptive price optimization, which derives the optimal price strategy to maximize future profit/revenue on the basis of massive predictive formulas produced by machine learning.
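The core idea can be sketched with a deliberately simple stand-in: given a demand model learned by regression, enumerate candidate prices and pick the profit maximizer. The linear demand coefficients below are invented for illustration and are not from the paper, which handles massive sets of predictive formulas.

```python
import numpy as np

# Hypothetical learned demand model q(p) = a - b*p (coefficients invented
# for illustration; the paper works with regression formulas produced by ML).
a, b, cost = 100.0, 2.0, 10.0

def profit(p):
    """Predicted profit at price p: (p - cost) * predicted demand."""
    return (p - cost) * (a - b * p)

# Enumerate candidate prices and pick the profit maximizer.
candidates = np.linspace(10.0, 50.0, 401)
best = candidates[np.argmax(profit(candidates))]
print(best)  # for q = 100 - 2p and cost 10, the optimum is p = 30
```

With one product this is a one-dimensional search; the paper's setting couples many products' prices, which is what makes the optimization nontrivial.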
no code implementations • 26 Jun 2015 • Shaohua Li, Ryohei Fujimaki, Chunyan Miao
Factorial hidden Markov models (FHMMs) are powerful tools for modeling sequential data.
no code implementations • 22 Apr 2015 • Kohei Hayashi, Shin-ichi Maeda, Ryohei Fujimaki
Our analysis provides a formal justification of FIC as a model selection criterion for LVMs and also a systematic procedure for pruning redundant latent variables that have been removed heuristically in previous studies.
no code implementations • NeurIPS 2014 • Deguang Kong, Ryohei Fujimaki, Ji Liu, Feiping Nie, Chris Ding
Group lasso is widely used to enforce structural sparsity, achieving sparsity at the inter-group level.
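The inter-group sparsity comes from the group lasso penalty λ Σ_g ||w_g||₂, whose proximal operator shrinks each group as a block and zeroes out weak groups entirely. A minimal numpy sketch (function names are ours, assuming disjoint groups):

```python
import numpy as np

def group_lasso_penalty(w, groups, lam):
    """Group lasso penalty: lam * sum_g ||w_g||_2 over disjoint groups."""
    return lam * sum(np.linalg.norm(w[g]) for g in groups)

def group_prox(w, groups, lam):
    """Proximal operator of the group lasso penalty: block soft-thresholding.
    Groups whose norm is <= lam become exactly zero, which is how
    inter-group sparsity arises."""
    out = np.array(w, dtype=float)
    for g in groups:
        norm = np.linalg.norm(out[g])
        out[g] = 0.0 if norm <= lam else out[g] * (1 - lam / norm)
    return out

w = np.array([3.0, 4.0, 0.1, -0.1])
groups = [np.array([0, 1]), np.array([2, 3])]
print(group_prox(w, groups, 1.0))  # → [2.4 3.2 0. 0.]: the weak group is zeroed
```

Note that the surviving group is shrunk but kept dense; sparsity *within* groups requires additional penalties (e.g., sparse group lasso).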
no code implementations • NeurIPS 2014 • Hidekazu Oiwa, Ryohei Fujimaki
One of the key challenges in their use is the non-convexity of jointly optimizing regions and region-specific models.
no code implementations • 31 Dec 2013 • Ji Liu, Ryohei Fujimaki, Jieping Ye
Our new bounds are consistent with the bounds for a special case (least squares) and fill a previously existing theoretical gap for general convex smooth functions; 3) we show that the restricted strong convexity condition is satisfied if the number of independent samples exceeds $\bar{k}\log d$, where $\bar{k}$ is the sparsity number and $d$ is the dimension of the variable; 4) we apply FoBa-gdt (with the conditional random field objective) to the sensor selection problem for human indoor activity recognition, and our results show that FoBa-gdt outperforms other methods (including those based on forward greedy selection and L1-regularization).
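FoBa-style methods interleave gradient-based forward selection with backward deletion. As a hedged illustration only, the sketch below implements just the gradient-based forward half for a least-squares objective (the backward deletion step and the general convex setting of FoBa-gdt are omitted):

```python
import numpy as np

def forward_greedy(X, y, k):
    """Gradient-based forward greedy selection for least squares: a
    simplified sketch of the *forward* half of FoBa-style methods.
    Repeatedly add the feature with the largest gradient magnitude,
    then refit on the selected support."""
    n, d = X.shape
    support, w = [], np.zeros(d)
    for _ in range(k):
        grad = X.T @ (X @ w - y)   # gradient of 0.5 * ||Xw - y||^2
        grad[support] = 0.0        # consider only unselected features
        support.append(int(np.argmax(np.abs(grad))))
        # refit by least squares restricted to the current support
        w = np.zeros(d)
        w[support] = np.linalg.lstsq(X[:, support], y, rcond=None)[0]
    return support, w

# Synthetic check: noiseless data generated from a 2-sparse model.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
y = 3.0 * X[:, 2] - 2.0 * X[:, 7]
support, w = forward_greedy(X, y, 2)
print(sorted(support))  # with this synthetic data the true support {2, 7} is recovered
```

The backward step that FoBa adds removes previously selected features whose deletion barely increases the objective, which is what yields the stronger recovery guarantees the paper analyzes.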
no code implementations • NeurIPS 2013 • Kohei Hayashi, Ryohei Fujimaki
This paper extends factorized asymptotic Bayesian (FAB) inference for latent feature models~(LFMs).