
Estimating Granger Causality with Unobserved Confounders via Deep Latent-Variable Recurrent Neural Network

9 September 2019
Yuan Meng
Abstract

Granger causality analysis, one of the most popular time-series causality methods, has been widely used in economics and neuroscience. However, unobserved confounding is a fundamental problem in observational studies, and it remains unsolved for non-linear Granger causality. Applied work often handles this problem with proxy variables, which can be treated as noisy measurements of the confounder, but proxy variables have been shown to be unreliable because of the bias they may induce. In this paper, we attempt to "recover" the unobserved confounders for Granger causality analysis. We use a generative model with a latent variable to describe the relationship between the unobserved confounders and the observed variables (the tested variable and the proxy variables). The posterior distribution of the latent variable represents the confounder distribution and can be sampled to obtain estimated confounders. We adopt a variational autoencoder to approximate this intractable posterior, and a recurrent neural network to capture the temporal structure of the data. We evaluate our method on synthetic and semi-synthetic datasets. The results show that, with multiple proxies on the semi-synthetic dataset, the estimated confounders outperform the proxy variables for non-linear Granger causality. However, performance on the synthetic dataset and across different proxy noise levels remains poor; any advice would be greatly appreciated.
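
The following is a minimal sketch, not the authors' released code, of the idea the abstract describes: a recurrent variational autoencoder whose GRU encoder maps the tested variable and its noisy proxies to an approximate posterior over a latent confounder z_t, which can then be sampled and conditioned on in a downstream Granger-causality regression. It assumes PyTorch, and every name (RecurrentConfounderVAE, latent_dim, n_obs, etc.) is an illustrative assumption rather than the paper's architecture.

import torch
import torch.nn as nn

class RecurrentConfounderVAE(nn.Module):
    def __init__(self, n_obs: int, latent_dim: int = 4, hidden: int = 32):
        super().__init__()
        # n_obs = 1 tested variable + number of proxy variables per time step
        self.encoder_rnn = nn.GRU(n_obs, hidden, batch_first=True)
        self.to_mu = nn.Linear(hidden, latent_dim)      # mean of q(z_t | x_{1:t})
        self.to_logvar = nn.Linear(hidden, latent_dim)  # log-variance of q(z_t | x_{1:t})
        self.decoder = nn.Linear(latent_dim, n_obs)     # mean of p(x_t | z_t)

    def forward(self, x):
        # x: (batch, time, n_obs)
        h, _ = self.encoder_rnn(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation trick
        recon = self.decoder(z)
        return recon, mu, logvar, z

def elbo_loss(recon, x, mu, logvar):
    # Gaussian reconstruction term plus KL divergence to a standard-normal prior.
    recon_term = ((recon - x) ** 2).sum()
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum()
    return recon_term + kl

if __name__ == "__main__":
    batch, T, n_obs = 16, 50, 3          # 1 tested variable + 2 proxies (illustrative)
    model = RecurrentConfounderVAE(n_obs)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(batch, T, n_obs)     # placeholder data, not the paper's datasets
    recon, mu, logvar, z = model(x)
    opt.zero_grad()
    elbo_loss(recon, x, mu, logvar).backward()
    opt.step()
    print("estimated confounder sample shape:", z.shape)  # (batch, T, latent_dim)

In this sketch, the sampled z would play the role of the "recovered" confounder: after fitting the VAE on sequences of the tested variable and its proxies, one would include z as a covariate in the predictive model used for the non-linear Granger-causality test, in place of the raw proxy variables.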
