Directed Graph Grammars for Sequence-based Learning

Comments: Main text: 9 pages, 9 figures, 9 tables; bibliography: 3 pages; appendix: 16 pages
Abstract

Directed acyclic graphs (DAGs) are a class of graphs commonly used in practice, with examples that include electronic circuits, Bayesian networks, and neural architectures. While many effective encoders exist for DAGs, it remains challenging to decode them in a principled manner, because the nodes of a DAG can have many different topological orders. In this work, we propose a grammar-based approach to constructing a principled, compact, and equivalent sequential representation of a DAG. Specifically, we view a graph as a derivation over an unambiguous grammar, where the DAG corresponds to a unique sequence of production rules. Equivalently, the procedure that constructs such a representation can be viewed as a lossless compression of the data. Such a representation has many uses, including building a generative model for graph generation, learning a latent space for property prediction, and leveraging the representational continuity of sequences for Bayesian optimization over structured data. Code is available at this https URL.
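As a concrete illustration of the decoding difficulty, the toy Python sketch below enumerates the topological orders of a four-node "diamond" DAG and then forces a unique sequence with a canonical tie-breaking rule. This is an illustrative stand-in only, not the paper's grammar construction; the helpers edges, is_topological, and canonical_sequence are hypothetical names for this example.

from itertools import permutations

# Toy "diamond" DAG: edges point from parent to child.
edges = {("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")}
nodes = {v for e in edges for v in e}

def is_topological(order):
    pos = {v: i for i, v in enumerate(order)}
    return all(pos[u] < pos[v] for u, v in edges)

# Brute-force enumeration is fine for a four-node toy graph.
orders = [o for o in permutations(sorted(nodes)) if is_topological(o)]
print(orders)  # [('a','b','c','d'), ('a','c','b','d')] -- two valid orders

# A naive way to force uniqueness: repeatedly emit the lexicographically
# smallest node whose parents have all already been emitted.
def canonical_sequence(nodes, edges):
    remaining, seq = set(nodes), []
    while remaining:
        ready = [v for v in remaining
                 if all(u not in remaining for (u, w) in edges if w == v)]
        seq.append(min(ready))
        remaining.remove(seq[-1])
    return seq

print(canonical_sequence(nodes, edges))  # ['a', 'b', 'c', 'd']

The diamond DAG admits two valid orders, which is exactly the ambiguity a unique sequential representation must remove; the canonical sort above achieves uniqueness by fiat, whereas the paper derives it from an unambiguous grammar whose production-rule sequence losslessly encodes the DAG.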

@article{sun2025_2505.22949,
  title={Directed Graph Grammars for Sequence-based Learning},
  author={Michael Sun and Orion Foo and Gang Liu and Wojciech Matusik and Jie Chen},
  journal={arXiv preprint arXiv:2505.22949},
  year={2025}
}