Research Repository

Outputs (10)

Regularized joint mixture models (2023)
Journal Article
Perrakis, K., Lartigue, T., Dondelinger, F., & Mukherjee, S. (2023). Regularized joint mixture models. Journal of Machine Learning Research, 24, 1-47.

Regularized regression models are well studied and, under appropriate conditions, offer fast and statistically interpretable results. However, large data in many applications are heterogeneous in the sense of harboring distributional differences betw...
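The sketch below is a toy illustration of the general idea only (soft clustering driven jointly by a model for the covariates and a regularized regression for the response within each group), not the authors' regularized joint mixture estimator; the diagonal-Gaussian covariate model, the Lasso penalty, and all parameter choices are assumptions made for the example.

```python
# Toy EM sketch of the general idea only (NOT the authors' RJM estimator):
# soft-cluster samples using, jointly, a diagonal-Gaussian model for X and a
# Lasso regression for y | X within each group. All modelling choices here
# (diagonal covariance, Lasso penalty, fixed alpha) are assumptions for the demo.
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import Lasso

def toy_joint_mixture(X, y, K=2, n_iter=50, alpha=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    resp = rng.dirichlet(np.ones(K), size=n)      # random soft assignments to start
    models = [Lasso(alpha=alpha) for _ in range(K)]
    for _ in range(n_iter):
        # M-step: component weights, covariate means/scales, weighted Lasso fits
        pis = resp.mean(axis=0)
        w = resp.sum(axis=0)[:, None]
        mus = (resp.T @ X) / w
        sds = np.sqrt(np.maximum((resp.T @ X**2) / w - mus**2, 1e-12))
        log_dens = np.zeros((n, K))
        for k in range(K):
            models[k].fit(X, y, sample_weight=resp[:, k])
            res_sd = np.sqrt(np.average((y - models[k].predict(X))**2,
                                        weights=resp[:, k])) + 1e-6
            # joint log-density: log p(x | k) + log p(y | x, k) + log pi_k
            log_dens[:, k] = (norm.logpdf(X, mus[k], sds[k]).sum(axis=1)
                              + norm.logpdf(y, models[k].predict(X), res_sd)
                              + np.log(pis[k]))
        # E-step: normalized responsibilities
        log_dens -= log_dens.max(axis=1, keepdims=True)
        resp = np.exp(log_dens)
        resp /= resp.sum(axis=1, keepdims=True)
    return resp, models
```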

Scalable prediction of acute myeloid leukemia using high-dimensional machine learning and blood transcriptomics (2019)
Journal Article
Warnat-Herresthal, S., Perrakis, K., Taschler, B., Becker, M., Baßler, K., Beyer, M., …Schultze, J. L. (2020). Scalable prediction of acute myeloid leukemia using high-dimensional machine learning and blood transcriptomics. iScience, 23(1), Article 100780. https://doi.org/10.1016/j.isci.2019.100780

Acute myeloid leukemia (AML) is a severe, mostly fatal hematopoietic malignancy. We were interested in whether transcriptomic-based machine learning could predict AML status without requiring expert input. Using 12,029 samples from 105 different stud...
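As a hedged illustration of the generic workflow only (not the paper's pipeline, cohorts, or models), the snippet below fits a cross-validated, L1-penalized logistic regression to a synthetic stand-in for a high-dimensional expression matrix; the sample size, gene count, and labels are placeholders.

```python
# Hedged sketch of the generic workflow only (not the paper's pipeline or cohorts):
# a cross-validated, L1-penalized logistic regression on a synthetic stand-in for
# a high-dimensional blood transcriptome matrix with binary AML / non-AML labels.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_genes = 300, 1000                  # placeholder sizes, not the study's
X = rng.normal(size=(n_samples, n_genes))       # stand-in for expression values
y = rng.integers(0, 2, size=n_samples)          # stand-in for disease labels

clf = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(Cs=5, cv=3, penalty="l1", solver="saga", max_iter=5000),
)
clf.fit(X, y)
print("chosen regularization C:", clf[-1].C_)
print("genes with non-zero weight:", np.count_nonzero(clf[-1].coef_))
```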

Variations of power-expected-posterior priors in normal regression models (2019)
Journal Article
Fouskakis, D., Ntzoufras, I., & Perrakis, K. (2020). Variations of power-expected-posterior priors in normal regression models. Computational Statistics & Data Analysis, 143, Article 106836. https://doi.org/10.1016/j.csda.2019.106836

The power-expected-posterior (PEP) prior is an objective prior for Gaussian linear models, which leads to consistent model selection inference, under the M-closed scenario, and tends to favour parsimonious models. Recently, two new forms of the PEP p...
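For orientation, one common way to write the baseline PEP construction is sketched below; the notation (imaginary data y*, power δ, reference model M_0) follows the general PEP literature and may differ in detail from the paper, which studies variations of this baseline.

```latex
% Hedged sketch of the baseline PEP prior for model M_l with parameters theta_l,
% imaginary data y* of size n*, power delta (typically delta = n*), and reference model M_0.
\[
  \pi^{\mathrm{PEP}}_{\ell}(\theta_{\ell} \mid \delta)
  = \int \pi^{N}_{\ell}(\theta_{\ell} \mid y^{*}, \delta)\,
         m^{N}_{0}(y^{*} \mid \delta)\, \mathrm{d}y^{*},
  \qquad
  \pi^{N}_{\ell}(\theta_{\ell} \mid y^{*}, \delta)
  \propto f_{\ell}(y^{*} \mid \theta_{\ell})^{1/\delta}\, \pi^{N}_{\ell}(\theta_{\ell}).
\]
```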

Scalable Bayesian regression in high dimensions with multiple data sources (2019)
Journal Article
Perrakis, K., Mukherjee, S., & The Alzheimer's Disease Neuroimaging Initiative (2020). Scalable Bayesian regression in high dimensions with multiple data sources. Journal of Computational and Graphical Statistics, 29(1), 28-39. https://doi.org/10.1080/10618600.2019.1624294

Applications of high-dimensional regression often involve multiple sources or types of covariates. We propose methodology for this setting, emphasizing the “wide data” regime with large total dimensionality p and sample size n≪p. We focus on a flexib...
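A minimal sketch of the general idea (block-specific shrinkage, i.e. a ridge-type Gaussian prior whose scale differs by data source) is given below; it is not the paper's prior or algorithm, and the two synthetic "sources" and penalty values are placeholders.

```python
# Hedged sketch (not the paper's estimator): a generalized ridge fit in which each
# data source/block of covariates gets its own shrinkage level, i.e. a Gaussian
# prior beta_j ~ N(0, tau_s^2) with tau depending on the source s of covariate j.
import numpy as np

def blockwise_ridge(X_blocks, y, lambdas):
    """Ridge-type estimate under block-specific penalties.

    X_blocks : list of (n, p_s) arrays, one per data source (placeholder inputs)
    lambdas  : one penalty per block (larger = stronger shrinkage)
    """
    X = np.hstack(X_blocks)
    # One penalty value per column, repeated within each source's block
    penalties = np.concatenate([np.full(Xb.shape[1], lam)
                                for Xb, lam in zip(X_blocks, lambdas)])
    A = X.T @ X + np.diag(penalties)
    return np.linalg.solve(A, X.T @ y)

# Toy usage in the wide-data regime with two synthetic "sources"
rng = np.random.default_rng(1)
n = 100
X1, X2 = rng.normal(size=(n, 20)), rng.normal(size=(n, 500))
beta1 = rng.normal(size=20)
y = X1 @ beta1 + rng.normal(scale=0.5, size=n)     # only source 1 is informative
beta_hat = blockwise_ridge([X1, X2], y, lambdas=[1.0, 50.0])
```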

Power-expected-posterior priors for generalized linear models (2017)
Journal Article
Fouskakis, D., Ntzoufras, I., & Perrakis, K. (2017). Power-expected-posterior priors for generalized linear models. Bayesian Analysis, 13(3), 721-748. https://doi.org/10.1214/17-ba1066

The power-expected-posterior (PEP) prior provides an objective, automatic, consistent and parsimonious model selection procedure. At the same time it resolves the conceptual and computational problems due to the use of imaginary data. Namely, (i) it...

Bayesian inference for transportation origin-destination matrices: the Poisson-inverse Gaussian and other Poisson mixtures (2014)
Journal Article
Perrakis, K., Karlis, D., Cools, M., & Janssens, D. (2015). Bayesian inference for transportation origin-destination matrices: the Poisson-inverse Gaussian and other Poisson mixtures. Journal of the Royal Statistical Society: Series A, 178(1), 271-296. https://doi.org/10.1111/rssa.12057

Transportation origin–destination analysis is investigated through the use of Poisson mixtures by introducing covariate-based models which incorporate different transport modelling phases and also allow for direct probabilistic inference on link traf...
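As a hedged aside, the Poisson-inverse Gaussian distribution featured in the title arises by mixing a Poisson mean over an inverse-Gaussian distribution; the short simulation below only illustrates the resulting overdispersion and does not reproduce the paper's origin-destination models.

```python
# Hedged illustration: the Poisson-inverse Gaussian (PIG) distribution arises by
# drawing a Poisson mean from an inverse-Gaussian mixing distribution. This only
# demonstrates the overdispersion of the mixture, not the paper's OD-matrix models.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
mu, lam = 5.0, 2.0                              # placeholder inverse-Gaussian parameters

rates = rng.wald(mean=mu, scale=lam, size=n)    # lambda_i ~ inverse-Gaussian(mu, lam)
counts = rng.poisson(rates)                     # y_i | lambda_i ~ Poisson(lambda_i)

# For a plain Poisson(mu), mean == variance; the PIG mixture is overdispersed:
print("mean:", counts.mean())                   # close to mu
print("variance:", counts.var())                # exceeds the mean, reflecting the mixing
```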

On the use of marginal posteriors in marginal likelihood estimation via importance sampling (2014)
Journal Article
Perrakis, K., Ntzoufras, I., & Tsionas, E. G. (2014). On the use of marginal posteriors in marginal likelihood estimation via importance sampling. Computational Statistics & Data Analysis, 77, 54-69. https://doi.org/10.1016/j.csda.2014.03.004

The efficiency of a marginal likelihood estimator where the product of the marginal posterior distributions is used as an importance sampling function is investigated. The approach is generally applicable to multi-block parameter vector settings, doe...
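A minimal sketch of the generic estimator (not the paper's specific constructions or efficiency analysis) is given below: the marginal likelihood m(y) = ∫ f(y|θ)π(θ)dθ is estimated by importance sampling with an importance function equal to a product of approximate marginal posteriors, here crude normal approximations fitted to posterior draws.

```python
# Hedged sketch of the generic estimator (not the paper's specific construction):
# estimate m(y) = \int f(y|theta) pi(theta) dtheta by importance sampling with
# g(theta) = prod_j g_j(theta_j), where each g_j is a crude normal approximation
# to the j-th marginal posterior, fitted from posterior (e.g. MCMC) draws.
import numpy as np
from scipy.stats import norm

def log_marginal_is(log_lik, log_prior, posterior_draws, n_is=10_000, seed=0):
    rng = np.random.default_rng(seed)
    m, s = posterior_draws.mean(axis=0), posterior_draws.std(axis=0)
    theta = rng.normal(m, s, size=(n_is, posterior_draws.shape[1]))   # draws from g
    log_g = norm.logpdf(theta, m, s).sum(axis=1)                      # product of marginals
    log_w = log_lik(theta) + log_prior(theta) - log_g                 # log importance weights
    return np.logaddexp.reduce(log_w) - np.log(n_is)                  # log of the IS average

# Toy conjugate example: y_i ~ N(theta, 1), theta ~ N(0, 1), so m(y) has a closed form
y = np.array([0.3, -0.1, 0.7, 0.2])
n = len(y)
post_var = 1.0 / (n + 1.0)
post_mean = post_var * y.sum()
draws = np.random.default_rng(1).normal(post_mean, np.sqrt(post_var), size=(5000, 1))

log_lik = lambda th: norm.logpdf(y, th, 1.0).sum(axis=1)
log_prior = lambda th: norm.logpdf(th[:, 0], 0.0, 1.0)
print("IS estimate of log m(y):", log_marginal_is(log_lik, log_prior, draws))
```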