Deep Learning Model for Finding New Superconductors
December 03, 2018 · Entered Twilight · Phys. Rev. B 103, 014509 (2021)
"Last commit was 5.0 years ago (โฅ5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .gitignore, 2007, 2009, 2018, LICENSE, README.md, chemical_formula_to_reading_periodic_table.py, go_open_candidate_materials_list.xlsx
Authors
Tomohiko Konno, Hodaka Kurokawa, Fuyuki Nabeshima, Yuki Sakishita, Ryo Ogawa, Iwao Hosako, Atsutaka Maeda
arXiv ID
1812.01995
Category
cs.LG: Machine Learning
Cross-listed
cond-mat.mtrl-sci,
cond-mat.supr-con,
cs.CL,
physics.comp-ph
Citations
3
Venue
Phys. Rev. B 103, 014509 (2021)
Repository
https://github.com/tomo835g/Deep-Learning-to-find-Superconductors
⭐ 12
Last Checked
2 months ago
Abstract
Exploration of new superconductors still relies on the experience and intuition of experts and is largely a process of experimental trial and error; in one study, only 3% of candidate materials showed superconductivity. Here, we report the first deep-learning model for finding new superconductors. We introduce a method named "reading the periodic table," which represents the periodic table in a way that allows deep learning to learn to read it and to learn the laws of the elements, in order to discover novel superconductors outside the training data, a task at which deep learning is generally considered to struggle. Using only the chemical composition of materials as input, we obtained an $R^{2}$ of 0.92 for predicting $T_\text{c}$ on a database of superconductors. We also introduce a method named "garbage-in" to create synthetic data for non-superconductors, which are rarely reported yet are required for deep learning to distinguish superconductors from non-superconductors. We obtained three notable results. The model predicts superconductivity with a precision of 62%, demonstrating its usefulness; it found the recently discovered superconductor CaBi2 and another, Hf0.5Nb0.2V2Zr0.3, neither of which is in the superconductor database; and, trained only on data from before 2008, it found the Fe-based high-temperature superconductors discovered in 2008. These results open the way to discovering new families of high-temperature superconductors. The candidate-materials list, data, and method are openly available at https://github.com/tomo835g/Deep-Learning-to-find-Superconductors.
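The abstract's two key ideas can be sketched in a few lines of Python. The first is the "reading periodic table" representation: a chemical formula is mapped onto a 2D grid shaped like the periodic table, with each element's molar fraction placed at its (period, group) cell. The second is "garbage-in": random compositions are generated as synthetic non-superconductor examples. This is only an illustrative guess at what the repo's `chemical_formula_to_reading_periodic_table.py` does; the `POSITIONS` table, grid shape, and `garbage_formula` recipe are assumptions, not the authors' exact method.

```python
import re
import random

# Illustrative subset of zero-indexed (period, group) positions on an
# 18-column periodic-table grid; the real script presumably covers all elements.
POSITIONS = {
    "Ca": (3, 1), "V": (3, 4), "Fe": (3, 7),
    "Zr": (4, 3), "Nb": (4, 4),
    "Hf": (5, 3), "Bi": (5, 14),
}

def parse_formula(formula):
    """Split a formula such as 'Hf0.5Nb0.2V2Zr0.3' into {element: amount}."""
    counts = {}
    for el, n in re.findall(r"([A-Z][a-z]?)(\d*\.?\d*)", formula):
        counts[el] = counts.get(el, 0.0) + (float(n) if n else 1.0)
    return counts

def formula_to_grid(formula, shape=(7, 18)):
    """Place each element's molar fraction at its periodic-table cell."""
    counts = parse_formula(formula)
    total = sum(counts.values())
    grid = [[0.0] * shape[1] for _ in range(shape[0])]
    for el, n in counts.items():
        row, col = POSITIONS[el]
        grid[row][col] = n / total
    return grid

def garbage_formula(rng, n_elements=3):
    """'Garbage-in' sketch: a random composition used as a synthetic
    non-superconductor (a guess at the paper's recipe)."""
    elements = rng.sample(sorted(POSITIONS), n_elements)
    return "".join(f"{el}{rng.randint(1, 3)}" for el in elements)

grid = formula_to_grid("CaBi2")
print(grid[5][14])  # Bi's molar fraction, 2/3
```

The grid can then be fed to a convolutional network, letting the model exploit the spatial regularities of the periodic table (e.g. neighboring elements behaving similarly) rather than treating elements as unrelated one-hot features.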
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt: Machine Learning
XGBoost: A Scalable Tree Boosting System · 👻 Ghosted
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift · 👻 Ghosted
Semi-Supervised Classification with Graph Convolutional Networks · 👻 Ghosted
Proximal Policy Optimization Algorithms · 👻 Ghosted