Proper Scoring Rules, Gradients, Divergences, and Entropies for Paths and Time Series

11 November 2021
Patric Bonnier
Harald Oberhauser
    AI4TS
ArXiv · PDF · HTML

Papers citing "Proper Scoring Rules, Gradients, Divergences, and Entropies for Paths and Time Series"

5 / 5 papers shown
Proper scoring rules for estimation and forecast evaluation
  Kartik Waghmare, Johanna Ziegel · AI4TS · 02 Apr 2025

Efficient Training of Neural Stochastic Differential Equations by Matching Finite Dimensional Distributions
  Jianxin Zhang, Josh Viktorov, Doosan Jung, Emily Pitler · DiffM · 04 Oct 2024

Non-adversarial training of Neural SDEs with signature kernel scores
  Zacharia Issa, Blanka Horvath, M. Lemercier, C. Salvi · AI4TS · 25 May 2023

Differentiable Divergences Between Time Series
  Mathieu Blondel, A. Mensch, Jean-Philippe Vert · AI4TS · 16 Oct 2020

Soft-DTW: a Differentiable Loss Function for Time-Series
  Marco Cuturi, Mathieu Blondel · AI4TS · 05 Mar 2017