Why Simple Quadrature is just as good as Monte Carlo

2 August 2019
Kevin Vanslette
Abdullatif Al-Alshaikh
K. Youcef-Toumi
Abstract

We motivate and calculate the Newton--Cotes quadrature integration variance and compare it directly with the Monte Carlo (MC) integration variance. We find an equivalence between deterministic quadrature sampling and random MC sampling by noting that MC random sampling is statistically indistinguishable from a method that applies deterministic sampling to a randomly shuffled (permuted) function. We use this statistical equivalence to regularize the form of permissible Bayesian quadrature integration priors so that they are guaranteed to be objectively comparable with MC. This leads to the proof that simple quadrature methods have expected variances that are less than or equal to their corresponding theoretical MC integration variances. Separately, using Bayesian probability theory, we find that the theoretical standard deviations of the unbiased errors of simple Newton--Cotes composite quadrature integrations improve over their worst-case errors by an extra dimension-independent factor $\propto N^{-1/2}$. This dimension-independent factor is validated in our simulations.
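
A minimal numerical sketch of the comparison described in the abstract (not code from the paper; the integrand sin(pi x), the sample sizes, and the choice of the composite midpoint rule are illustrative assumptions): for a fixed budget of N function evaluations, it prints the absolute error of a simple equal-weight quadrature rule next to the empirical standard deviation of plain Monte Carlo estimates of the same integral.

import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Smooth test integrand on [0, 1]; its exact integral is 2/pi.
    return np.sin(np.pi * x)

exact = 2.0 / np.pi

for N in (16, 64, 256, 1024):
    # Composite midpoint rule (a simple Newton--Cotes-type rule) with N nodes.
    nodes = (np.arange(N) + 0.5) / N
    quad_est = f(nodes).mean()          # equal weights 1/N on [0, 1]
    quad_err = abs(quad_est - exact)

    # Monte Carlo with N uniform samples, repeated to estimate its spread.
    trials = 2000
    mc_ests = f(rng.random((trials, N))).mean(axis=1)
    mc_std = mc_ests.std()

    print(f"N={N:5d}  |quadrature error|={quad_err:.2e}  MC std={mc_std:.2e}")

For a smooth integrand like this one, the deterministic quadrature error sits at or below the Monte Carlo spread at every N and shrinks faster as N grows, which is consistent with the paper's claim that simple quadrature variances are bounded by the corresponding theoretical MC variances.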
