Crystal Transformer: Self-learning neural language model for Generative and Tinkering Design of Materials
arXiv 2204.11953, 25 April 2022
Lai Wei, Qinyang Li, Yuqi Song, Stanislav Stefanov, Edirisuriya M Dilanga Siriwardane, Fanglin Chen, Jianjun Hu
Papers citing "Crystal Transformer: Self-learning neural language model for Generative and Tinkering Design of Materials" (7 papers)
Bayesian Optimization of Catalysis With In-Context Learning
M. C. Ramos, Shane S. Michtavy, Marc D. Porosoff, Andrew D. White. 11 Apr 2023.
Composition based oxidation state prediction of materials using deep learning
Nihang Fu, Jeffrey Hu, Yingqi Feng, G. Morrison, H. Loye, Jianjun Hu. 29 Nov 2022.
Probabilistic Generative Transformer Language models for Generative Design of Molecules
Lai Wei, Nihang Fu, Yuqi Song, Qian Wang, Jianjun Hu. 20 Sep 2022.
Scalable deeper graph neural networks for high-performance materials property prediction
Sadman Sadeed Omee, Steph-Yves M. Louis, Nihang Fu, Lai Wei, Sourin Dey, Rongzhi Dong, Qinyang Li, Jianjun Hu. 25 Sep 2021.
Frequency Effects on Syntactic Rule Learning in Transformers
Jason W. Wei, Dan Garrette, Tal Linzen, Ellie Pavlick. 14 Sep 2021.
Blank Language Models
T. Shen, Victor Quach, Regina Barzilay, Tommi Jaakkola. 08 Feb 2020.
Molecular Sets (MOSES): A Benchmarking Platform for Molecular Generation Models
Daniil Polykovskiy, Alexander Zhebrak, Benjamín Sánchez-Lengeling, Sergey Golovanov, Oktai Tatanov, ..., Simon Johansson, Hongming Chen, Sergey I. Nikolenko, Alán Aspuru-Guzik, Alex Zhavoronkov. 29 Nov 2018.