REFERENCES

1. Krishtal, A.; Sinha, D.; Genova, A.; Pavanello, M. Subsystem density-functional theory as an effective tool for modeling ground and excited states, their dynamics and many-body interactions. J. Phys. Condens. Matter. 2015, 27, 183202.

2. Kohn, W.; Sham, L. J. Self-consistent equations including exchange and correlation effects. Phys. Rev. 1965, 140, A1133-8.

3. Brockherde, F.; Vogt, L.; Li, L.; Tuckerman, M. E.; Burke, K.; Müller, K. R. Bypassing the Kohn-Sham equations with machine learning. Nat. Commun. 2017, 8, 872.

4. Li, H.; Wang, Z.; Zou, N.; et al. Deep-learning density functional theory Hamiltonian for efficient ab initio electronic-structure calculation. Nat. Comput. Sci. 2022, 2, 367-77.

5. Kochkov, D.; Pfaff, T.; Sanchez-Gonzalez, A.; Battaglia, P.; Clark, B. K. Learning ground states of quantum Hamiltonians with graph networks. arXiv 2021, arXiv:2110.06390. https://doi.org/10.48550/arXiv.2110.06390. (accessed 30 Jun 2025).

6. Behler, J.; Parrinello, M. Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett. 2007, 98, 146401.

7. Bartók, A. P.; Payne, M. C.; Kondor, R.; Csányi, G. Gaussian approximation potentials: the accuracy of quantum mechanics, without the electrons. Phys. Rev. Lett. 2010, 104, 136403.

8. Zhang, L.; Han, J.; Wang, H.; Car, R.; E, W. Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Phys. Rev. Lett. 2018, 120, 143001.

9. Chan, H.; Narayanan, B.; Cherukara, M. J.; et al. Machine learning classical interatomic potentials for molecular dynamics from first-principles training data. J. Phys. Chem. C 2019, 123, 6941-57.

10. Ko, T. W.; Ong, S. P. Recent advances and outstanding challenges for machine learning interatomic potentials. Nat. Comput. Sci. 2023, 3, 998-1000.

11. Yang, Z.; Wang, X.; Li, Y.; Lv, Q.; Chen, C. Y.; Shen, L. Efficient equivariant model for machine learning interatomic potentials. npj Comput. Mater. 2025, 11, 1535.

12. Ahmad, W.; Simon, E.; Chithrananda, S.; Grand, G.; Ramsundar, B. ChemBERTa-2: towards chemical foundation models. arXiv 2022, arXiv:2209.01712. https://doi.org/10.48550/arXiv.2209.01712. (accessed 30 Jun 2025).

13. Li, J.; Jiang, X.; Wang, Y. Mol-BERT: an effective molecular representation with BERT for molecular property prediction. Wirel. Commun. Mob. Comput. 2021, 2021, 7181815.

14. Batzner, S.; Musaelian, A.; Sun, L.; et al. E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nat. Commun. 2022, 13, 2453.

15. Duignan, T. T. The potential of neural network potentials. ACS Phys. Chem. Au 2024, 4, 232-41.

16. Dong, L.; Zhang, X.; Yang, Z.; Shen, L.; Lu, Y. Accurate piezoelectric tensor prediction with equivariant attention tensor graph neural network. npj Comput. Mater. 2025, 11, 1546.

17. Zitnick, C. L.; Das, A.; Kolluru, A.; et al. Spherical channels for modeling atomic interactions. arXiv 2022, arXiv:2206.14331. https://doi.org/10.48550/arXiv.2206.14331. (accessed 30 Jun 2025).

18. Frank, J. T.; Unke, O. T.; Müller, K. R.; Chmiela, S. A Euclidean transformer for fast and stable machine learned force fields. Nat. Commun. 2024, 15, 6539.

19. Yuan, Z.; Xu, Z.; Li, H.; et al. Equivariant neural network force fields for magnetic materials. Quantum Front. 2024, 3, 55.

20. Yu, H.; Zhong, Y.; Hong, L.; et al. Spin-dependent graph neural network potential for magnetic materials. Phys. Rev. B 2024, 109, 144426.

21. Wang, H.; Zhang, L.; Han, J.; E, W. DeePMD-kit: a deep learning package for many-body potential energy representation and molecular dynamics. Comput. Phys. Commun. 2018, 228, 178-84.

22. Sokolovskiy, V.; Baigutlin, D.; Miroshkina, O.; Buchelnikov, V. Meta-GGA SCAN functional in the prediction of ground state properties of magnetic materials: review of the current state. Metals 2023, 13, 728.

23. Kirklin, S.; Saal, J. E.; Meredig, B.; et al. The Open Quantum Materials Database (OQMD): assessing the accuracy of DFT formation energies. npj Comput. Mater. 2015, 1, 15010.

24. Ramakrishnan, R.; Dral, P. O.; Rupp, M.; von Lilienfeld, O. A. Quantum chemistry structures and properties of 134 kilo molecules. Sci. Data 2014, 1, 140022.

25. Chmiela, S.; Tkatchenko, A.; Sauceda, H. E.; Poltavsky, I.; Schütt, K. T.; Müller, K. R. Machine learning of accurate energy-conserving molecular force fields. Sci. Adv. 2017, 3, e1603015.

26. Chmiela, S.; Vassilev-Galindo, V.; Unke, O. T.; et al. Accurate global machine learning force fields for molecules with hundreds of atoms. arXiv 2022, arXiv:2209.14865. https://doi.org/10.48550/arXiv.2209.14865. (accessed 30 Jun 2025).

27. Smith, J. S.; Isayev, O.; Roitberg, A. E. ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chem. Sci. 2017, 8, 3192-203.

28. Smith, J. S.; Nebgen, B.; Lubbers, N.; Isayev, O.; Roitberg, A. E. Less is more: sampling chemical space with active learning. J. Chem. Phys. 2018, 148, 241733.

29. Smith, J. S.; Nebgen, B. T.; Zubatyuk, R.; et al. Approaching coupled cluster accuracy with a general-purpose neural network potential through transfer learning. Nat. Commun. 2019, 10, 2903.

30. Devereux, C.; Smith, J. S.; Huddleston, K. K.; et al. Extending the applicability of the ANI deep learning molecular potential to sulfur and halogens. J. Chem. Theory Comput. 2020, 16, 4192-202.

31. Schütt, K. T.; Sauceda, H. E.; Kindermans, P. J.; Tkatchenko, A.; Müller, K. R. SchNet - a deep learning architecture for molecules and materials. J. Chem. Phys. 2018, 148, 241722.

32. Eastman, P.; Behara, P. K.; Dotson, D. L.; et al. SPICE, a dataset of drug-like molecules and peptides for training machine learning potentials. Sci. Data 2023, 10, 11.

33. Chanussot, L.; Das, A.; Goyal, S.; et al. Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catal. 2021, 11, 6059-72.

34. Tran, R.; Lan, J.; Shuaibi, M.; et al. The Open Catalyst 2022 (OC22) dataset and challenges for oxide electrocatalysts. ACS Catal. 2023, 13, 3066-84.

35. Jain, A.; Ong, S. P.; Hautier, G.; et al. Commentary: The Materials Project: a materials genome approach to accelerating materials innovation. APL Mater. 2013, 1, 011002.

36. Dunn, A.; Wang, Q.; Ganose, A.; Dopp, D.; Jain, A. Benchmarking materials property prediction methods: the Matbench test set and Automatminer reference algorithm. npj Comput. Mater. 2020, 6, 138.

37. Bursch, M.; Mewes, J. M.; Hansen, A.; Grimme, S. Best-practice DFT protocols for basic molecular computational chemistry. Angew. Chem. Int. Ed. Engl. 2022, 61, e202205735.

38. Ko, T. W.; Ong, S. P. Data-efficient construction of high-fidelity graph deep learning interatomic potentials. npj Comput. Mater. 2025, 11, 1550.

39. Batatia, I.; Batzner, S.; Kovács, D. P.; et al. The design space of E(3)-equivariant atom-centred interatomic potentials. Nat. Mach. Intell. 2025, 7, 56-67.

40. Fu, X.; Wood, B. M.; Barroso-Luque, L.; et al. Learning smooth and expressive interatomic potentials for physical property prediction. arXiv 2025, arXiv:2502.12147. https://doi.org/10.48550/arXiv.2502.12147. (accessed 30 Jun 2025).

41. Chmiela, S.; Sauceda, H. E.; Müller, K. R.; Tkatchenko, A. Towards exact molecular dynamics simulations with machine-learned force fields. Nat. Commun. 2018, 9, 3887.

42. Choudhary, K.; Congo, F. Y.; Liang, T.; Becker, C.; Hennig, R. G.; Tavazza, F. Evaluation and comparison of classical interatomic potentials through a user-friendly interactive web-interface. Sci. Data 2017, 4, 160125.

43. Wilkinson, M. D.; Dumontier, M.; Aalbersberg, I. J.; et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci. Data 2016, 3, 160018.

44. Blaiszik, B.; Chard, K.; Pruyne, J.; Ananthakrishnan, R.; Tuecke, S.; Foster, I. The materials data facility: data services to advance materials science research. JOM 2016, 68, 2045-52.

45. Varughese, B.; Manna, S.; Loeffler, T. D.; Batra, R.; Cherukara, M. J.; Sankaranarayanan, S. K. R. S. Active and transfer learning of high-dimensional neural network potentials for transition metals. ACS Appl. Mater. Interfaces 2024, 16, 20681-92.

46. Zhang, L.; Lin, D. Y.; Wang, H.; Car, R.; E, W. Active learning of uniformly accurate interatomic potentials for materials simulation. Phys. Rev. Mater. 2019, 3, 023804.

47. Zhang, L.; Chen, Z.; Su, J.; Li, J. Data mining new energy materials from structure databases. Renew. Sustain. Energy Rev. 2019, 107, 554-67.

48. Focassio, B.; Freitas, L. P. M.; Schleder, G. R. Performance assessment of universal machine learning interatomic potentials: challenges and directions for materials’ surfaces. ACS Appl. Mater. Interfaces 2025, 17, 13111-21.

49. Kim, Y.; Kim, Y.; Yang, C.; Park, K.; Gu, G. X.; Ryu, S. Deep learning framework for material design space exploration using active transfer learning and data augmentation. npj Comput. Mater. 2021, 7, 609.

50. Mosquera-Lois, I.; Kavanagh, S. R.; Ganose, A. M.; Walsh, A. Machine-learning structural reconstructions for accelerated point defect calculations. npj Comput. Mater. 2024, 10, 1303.

51. Omee, S. S.; Fu, N.; Dong, R.; Hu, M.; Hu, J. Structure-based out-of-distribution (OOD) materials property prediction: a benchmark study. npj Comput. Mater. 2024, 10, 1316.

52. Musaelian, A.; Batzner, S.; Johansson, A.; et al. Learning local equivariant representations for large-scale atomistic dynamics. Nat. Commun. 2023, 14, 579.

53. Ma, X.; Chen, H.; He, R.; et al. Active learning of effective Hamiltonian for super-large-scale atomic structures. npj Comput. Mater. 2025, 11, 1563.

54. Deng, B.; Zhong, P.; Jun, K.; et al. CHGNet as a pretrained universal neural network potential for charge-informed atomistic modelling. Nat. Mach. Intell. 2023, 5, 1031-41.

55. Liao, Y. L.; Wood, B.; Das, A.; Smidt, T. EquiformerV2: improved equivariant transformer for scaling to higher-degree representations. arXiv 2023, arXiv:2306.12059. https://doi.org/10.48550/arXiv.2306.12059. (accessed 30 Jun 2025).

56. Riebesell, J.; Goodall, R. E. A.; Benner, P.; et al. Matbench Discovery - a framework to evaluate machine learning crystal stability predictions. arXiv 2023, arXiv:2308.14920. https://doi.org/10.48550/arXiv.2308.14920. (accessed 30 Jun 2025).

57. Merchant, A.; Batzner, S.; Schoenholz, S. S.; Aykol, M.; Cheon, G.; Cubuk, E. D. Scaling deep learning for materials discovery. Nature 2023, 624, 80-5.

58. McClean, J. R.; Rubin, N. C.; Lee, J.; et al. What the foundations of quantum computer science teach us about chemistry. J. Chem. Phys. 2021, 155, 150901.

59. Castelvecchi, D. The AI-quantum computing mash-up: will it revolutionize science? Nature 2024.

60. Allen, A. E. A.; Lubbers, N.; Matin, S.; et al. Learning together: towards foundation models for machine learning interatomic potentials with meta-learning. npj Comput. Mater. 2024, 10, 1339.

61. Zhong, Y.; Yu, H.; Yang, J.; Guo, X.; Xiang, H.; Gong, X. Universal machine learning Kohn–Sham Hamiltonian for materials. Chinese Phys. Lett. 2024, 41, 077103.

62. Curchod, B. F. E.; Martínez, T. J. Ab initio nonadiabatic quantum molecular dynamics. Chem. Rev. 2018, 118, 3305-36.

63. Babadi, M.; Knap, M.; Martin, I.; Refael, G.; Demler, E. Theory of parametrically amplified electron-phonon superconductivity. Phys. Rev. B 2017, 96, 014512.

64. Zou, J.; Zhouyin, Z.; Lin, D.; Zhang, L.; Hou, S.; Gu, Q. Deep learning accelerated quantum transport simulations in nanoelectronics: from break junctions to field-effect transistors. arXiv 2024, arXiv:2411.08800. https://doi.org/10.48550/arXiv.2411.08800. (accessed 30 Jun 2025).

65. Yan, B. Spin-orbit coupling: a relativistic effect. 2016. https://tms16.sciencesconf.org/data/pages/SOC_lecture1.pdf. (accessed 30 Jun 2025).

66. Inorganic Chemistry II review: 6.1 solid state structures. https://library.fiveable.me/inorganic-chemistry-ii/unit-6/solid-state-structures/study-guide/qnl67GdXxyZ74VJP. (accessed 30 Jun 2025).

67. Gong, X.; Li, H.; Zou, N.; Xu, R.; Duan, W.; Xu, Y. General framework for E(3)-equivariant neural network representation of density functional theory Hamiltonian. Nat. Commun. 2023, 14, 2848.

68. Schütt, K. T.; Gastegger, M.; Tkatchenko, A.; Müller, K. R.; Maurer, R. J. Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions. Nat. Commun. 2019, 10, 5024.

69. Unke, O. T.; Bogojeski, M.; Gastegger, M.; Geiger, M.; Smidt, T.; Müller, K. R. SE(3)-equivariant prediction of molecular wavefunctions and electronic densities. In Advances in Neural Information Processing Systems 34 (NeurIPS 2021). 2021. https://proceedings.neurips.cc/paper/2021/hash/78f1893678afbeaa90b1fa01b9cfb860-Abstract.html. (accessed 30 Jun 2025).

70. Thomas, N.; Smidt, T.; Kearnes, S.; et al. Tensor field networks: rotation- and translation-equivariant neural networks for 3D point clouds. arXiv 2018, arXiv:1802.08219. https://doi.org/10.48550/arXiv.1802.08219. (accessed 30 Jun 2025).

71. Passaro, S.; Zitnick, C. L. Reducing SO(3) convolutions to SO(2) for efficient equivariant GNNs. arXiv 2023, arXiv:2302.03655. https://doi.org/10.48550/arXiv.2302.03655. (accessed 30 Jun 2025).

72. Wang, Y.; Li, H.; Tang, Z.; et al. DeepH-2: enhancing deep-learning electronic structure via an equivariant local-coordinate transformer. arXiv 2024, arXiv:2401.17015. https://doi.org/10.48550/arXiv.2401.17015. (accessed 30 Jun 2025).

73. Zhouyin, Z.; Gan, Z.; Liu, M.; Pandey, S. K.; Zhang, L.; Gu, Q. Learning local equivariant representations for quantum operators. arXiv 2024, arXiv:2407.06053. https://doi.org/10.48550/arXiv.2407.06053. (accessed 30 Jun 2025).

74. Li, Y.; Xia, Z.; Huang, L.; et al. Enhancing the scalability and applicability of Kohn-Sham Hamiltonians for molecular systems. arXiv 2025, arXiv:2502.19227. https://doi.org/10.48550/arXiv.2502.19227. (accessed 30 Jun 2025).

75. Yin, S.; Pan, X.; Wang, F.; He, L. TraceGrad: a framework learning expressive SO(3)-equivariant non-linear representations for electronic-structure Hamiltonian prediction. arXiv 2024, arXiv:2405.05722. https://doi.org/10.48550/arXiv.2405.05722. (accessed 30 Jun 2025).

76. Yu, H.; Xu, Z.; Qian, X.; Qian, X.; Ji, S. Efficient and equivariant graph networks for predicting quantum Hamiltonian. arXiv 2023, arXiv:2306.04922. https://doi.org/10.48550/arXiv.2306.04922. (accessed 30 Jun 2025).

77. Zhong, Y.; Yu, H.; Su, M.; Gong, X.; Xiang, H. Transferable equivariant graph neural networks for the Hamiltonians of molecules and solids. npj Comput. Mater. 2023, 9, 1130.

78. Tang, Z.; Li, H.; Lin, P.; et al. A deep equivariant neural network approach for efficient hybrid density functional calculations. Nat. Commun. 2024, 15, 8815.

79. Li, H.; Tang, Z.; Fu, J.; et al. Deep-learning density functional perturbation theory. Phys. Rev. Lett. 2024, 132, 096401.

80. Yin, S.; Pan, X.; Zhu, X.; et al. Towards harmonization of SO(3)-equivariance and expressiveness: a hybrid deep learning framework for electronic-structure Hamiltonian prediction. Mach. Learn. Sci. Technol. 2024, 5, 045038.

81. Li, H.; Tang, Z.; Gong, X.; Zou, N.; Duan, W.; Xu, Y. Deep-learning electronic-structure calculation of magnetic superstructures. Nat. Comput. Sci. 2023, 3, 321-7.

82. Gong, X.; Louie, S. G.; Duan, W.; Xu, Y. Generalizing deep learning electronic structure calculation to the plane-wave basis. Nat. Comput. Sci. 2024, 4, 752-60.

83. Wang, Y.; Li, Y.; Tang, Z.; et al. Universal materials model of deep-learning density functional theory Hamiltonian. Sci. Bull. 2024, 69, 2514-21.

84. Tang, H.; Xiao, B.; He, W.; et al. Approaching coupled-cluster accuracy for molecular electronic structures with multi-task learning. Nat. Comput. Sci. 2025, 5, 144-54.

85. Zhong, Y.; Liu, S.; Zhang, B.; et al. Accelerating the calculation of electron-phonon coupling strength with machine learning. Nat. Comput. Sci. 2024, 4, 615-25.

86. Ma, Y.; Yu, H.; Zhong, Y.; Chen, S.; Gong, X.; Xiang, H. Transferable machine learning approach for predicting electronic structures of charged defects. Appl. Phys. Lett. 2025, 126, 044103.

87. Gu, Q.; Zhouyin, Z.; Pandey, S. K.; Zhang, P.; Zhang, L.; E, W. Deep learning tight-binding approach for large-scale electronic simulations at finite temperatures with ab initio accuracy. Nat. Commun. 2024, 15, 6772.

88. Zheng, H.; Sivonxay, E.; Christensen, R.; et al. The ab initio non-crystalline structure database: empowering machine learning to decode diffusivity. npj Comput. Mater. 2024, 10, 1469.

89. Huang, P.; Lukin, R.; Faleev, M.; et al. Unveiling the complex structure-property correlation of defects in 2D materials based on high throughput datasets. npj 2D Mater. Appl. 2023, 7, 369.

90. Zhou, Z.; Zhou, Y.; He, Q.; Ding, Z.; Li, F.; Yang, Y. Machine learning guided appraisal and exploration of phase design for high entropy alloys. npj Comput. Mater. 2019, 5, 265.

91. Chen, Y.; Zhang, L.; Wang, H.; E, W. DeePKS: a comprehensive data-driven approach toward chemically accurate density functional theory. J. Chem. Theory Comput. 2021, 17, 170-81.

92. Ou, Q.; Tuo, P.; Li, W.; Wang, X.; Chen, Y.; Zhang, L. DeePKS model for halide perovskites with the accuracy of a hybrid functional. J. Phys. Chem. C 2023, 127, 18755-64.

93. Kakkad, J.; Jannu, J.; Sharma, K.; Aggarwal, C.; Medya, S. A survey on explainability of graph neural networks. arXiv 2023, arXiv:2306.01958. https://doi.org/10.48550/arXiv.2306.01958. (accessed 30 Jun 2025).

94. Chen, Y.; Bian, Y.; Han, B.; Cheng, J. Interpretable and generalizable graph neural networks via subgraph multilinear extension. 2024. https://openreview.net/forum?id=dVq2StlcnY. (accessed 30 Jun 2025).

95. Miao, S.; Liu, M.; Li, P. Interpretable and generalizable graph learning via stochastic attention mechanism. arXiv 2022, arXiv:2201.12987. https://doi.org/10.48550/arXiv.2201.12987. (accessed 30 Jun 2025).

96. Hou, B.; Wu, J.; Qiu, D. Y. Unsupervised representation learning of Kohn-Sham states and consequences for downstream predictions of many-body effects. Nat. Commun. 2024, 15, 9481.

97. Miao, S.; Luo, Y.; Liu, M.; Li, P. Interpretable geometric deep learning via learnable randomness injection. arXiv 2022, arXiv:2210.16966. https://doi.org/10.48550/arXiv.2210.16966. (accessed 30 Jun 2025).

98. Zhang, C.; Tang, Z.; Zhong, Y.; Zou, N.; Tao, Z. G.; et al. Advancing nonadiabatic molecular dynamics simulations in solids with E(3) equivariant deep neural Hamiltonians. Nat. Commun. 2025, 16, 2033.

99. Zhang, X.; Xu, L.; Lu, J.; Zhang, Z.; Shen, L. Physics-integrated neural network for quantum transport prediction of field-effect transistor. arXiv 2024, arXiv:2408.17023. https://doi.org/10.48550/arXiv.2408.17023. (accessed 30 Jun 2025).

100. Chen, X.; Liu, Y.; Liu, P.; et al. Unconventional magnons in collinear magnets dictated by spin space groups. Nature 2025, 640, 349-54.

101. Törmä, P.; Peotta, S.; Bernevig, B. A. Superconductivity, superfluidity and quantum geometry in twisted multilayer systems. Nat. Rev. Phys. 2022, 4, 528-42.

102. Zhao, Y.; Wang, Z.; Zhou, J.; Zhang, C.; Shin, S.; Shen, L. Effects of loosely bound electrons and electron–phonon interaction on the thermoelectric properties of electrenes. J. Mater. Chem. C 2024, 12, 14496-504.
