Hamiltonian Monte Carlo Inference of Marginalized Linear Mixed-Effects Models

31 October 2024
Jinlin Lai
Justin Domke
Daniel Sheldon
Abstract

Bayesian reasoning in linear mixed-effects models (LMMs) is challenging and often requires advanced sampling techniques like Markov chain Monte Carlo (MCMC). A common approach is to write the model in a probabilistic programming language and then sample via Hamiltonian Monte Carlo (HMC). However, there are many ways a user can transform a model that make inference more or less efficient. In particular, marginalizing some variables can greatly improve inference but is difficult for users to do manually. We develop an algorithm to easily marginalize random effects in LMMs. A naive approach introduces cubic time operations within an inference algorithm like HMC, but we reduce the running time to linear using fast linear algebra techniques. We show that marginalization is always beneficial when applicable and highlight improvements in various models, especially ones from cognitive sciences.
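The cost reduction the abstract describes can be illustrated with a standard identity. For a simple LMM with marginalized random effects, the marginal likelihood is a Gaussian whose covariance is a low-rank update of a diagonal matrix; the Woodbury identity and matrix determinant lemma then let the log-density be evaluated with only small factorizations, linear in the number of observations. The sketch below is not the paper's algorithm, just a minimal illustration under assumed scalar variance components `sigma2` (noise) and `tau2` (random effect), comparing a naive O(n³) evaluation against the fast O(nq²) one:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
n, p, q = 200, 3, 5  # observations, fixed effects, random effects
X = rng.normal(size=(n, p))          # fixed-effects design
Z = rng.normal(size=(n, q))          # random-effects design
beta = rng.normal(size=p)
sigma2, tau2 = 0.5, 2.0              # assumed noise / random-effect variances
y = (X @ beta
     + Z @ rng.normal(scale=np.sqrt(tau2), size=q)
     + rng.normal(scale=np.sqrt(sigma2), size=n))

# Naive: build the full n x n marginal covariance and factor it, O(n^3).
cov = sigma2 * np.eye(n) + tau2 * Z @ Z.T
naive = multivariate_normal.logpdf(y, mean=X @ beta, cov=cov)

# Fast: Woodbury identity + matrix determinant lemma.
# Only a q x q Cholesky is needed, so the cost is O(n q^2) -- linear in n.
r = y - X @ beta
c = tau2 / sigma2
A = np.eye(q) + c * Z.T @ Z          # q x q "capacitance" matrix
L = np.linalg.cholesky(A)
w = np.linalg.solve(L, Z.T @ r)
quad = (r @ r - c * w @ w) / sigma2                      # r' Sigma^{-1} r
logdet = n * np.log(sigma2) + 2 * np.sum(np.log(np.diag(L)))
fast = -0.5 * (n * np.log(2 * np.pi) + logdet + quad)

assert np.isclose(naive, fast)
```

Inside HMC, this kind of evaluation (and its gradient) happens at every leapfrog step, which is why reducing its complexity matters so much in practice.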

@article{lai2025_2410.24079,
  title={Hamiltonian Monte Carlo Inference of Marginalized Linear Mixed-Effects Models},
  author={Jinlin Lai and Justin Domke and Daniel Sheldon},
  journal={arXiv preprint arXiv:2410.24079},
  year={2025}
}