Nature Computational Science

Journal Information
EISSN: 2662-8457
Total articles ≈ 86

Latest articles in this journal

Published: 23 July 2021
Nature Computational Science, Volume 1, pp 449-449; doi:10.1038/s43588-021-00109-9

Abstract:
While it is crucial to guarantee the reproducibility of the results reported in a paper, let us also not forget about the importance of making research artifacts reusable for the scientific community.
Qiyuan Zhao et al.
Published: 22 July 2021
Nature Computational Science, Volume 1, pp 479-490; doi:10.1038/s43588-021-00101-3

Abstract:
Automated reaction prediction has the potential to elucidate complex reaction networks for applications ranging from combustion to materials degradation, but computational cost and inconsistent reaction coverage are still obstacles to exploring deep reaction networks. Here we show that cost can be reduced and reaction coverage can be increased simultaneously by relatively straightforward modifications of the reaction enumeration, geometry initialization and transition state convergence algorithms that are common to many prediction methodologies. These components are implemented in the context of yet another reaction program (YARP), our reaction prediction package with which we report reaction discovery benchmarks for organic single-step reactions, thermal degradation of a γ-ketohydroperoxide, and competing ring-closures in a large organic molecule. Compared with recent benchmarks, YARP (re)discovers both established and unreported reaction pathways and products while simultaneously reducing the cost of reaction characterization by nearly 100-fold and increasing convergence of transition states. This combination of ultra-low cost and high reaction coverage creates opportunities to explore the reactivity of larger systems and more complex reaction networks for applications such as chemical degradation, where computational cost is a bottleneck.

This work demonstrates that large gains still exist in accelerating and improving the coverage of reaction prediction algorithms. These advances create opportunities for computationally exploring deeper and broader reaction networks.
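The workflow outlined in this abstract (enumerate candidate reactions, initialize geometries, attempt transition-state convergence, and repeat over successive depths of the network) can be pictured as a depth-limited search. The Python sketch below is a minimal illustration of that loop only; the helpers enumerate_reactions, initialize_geometry and converge_transition_state are hypothetical placeholders for the corresponding algorithmic components, not YARP's actual API.

# Minimal, hypothetical sketch of depth-limited reaction-network exploration.
# The helpers stand in for the enumeration, geometry-initialization and
# transition-state steps discussed in the abstract; they are NOT YARP's API.
from collections import deque


def enumerate_reactions(species):
    """Placeholder: apply bond-breaking/forming rules to propose product species."""
    return []  # a real enumerator would return candidate product species


def initialize_geometry(reactant, product):
    """Placeholder: build an initial guess geometry joining reactant and product."""
    return (reactant, product)


def converge_transition_state(guess):
    """Placeholder: optimize a transition state; return None if unconverged."""
    return None


def explore_network(seed_species, max_depth=2):
    """Breadth-first, depth-limited exploration of a reaction network."""
    visited = {seed_species}
    frontier = deque([(seed_species, 0)])
    reactions = []
    while frontier:
        species, depth = frontier.popleft()
        if depth >= max_depth:
            continue
        for product in enumerate_reactions(species):
            guess = initialize_geometry(species, product)
            ts = converge_transition_state(guess)
            if ts is None:
                continue  # discard reactions whose transition state did not converge
            reactions.append((species, product, ts))
            if product not in visited:
                visited.add(product)
                frontier.append((product, depth + 1))
    return reactions


if __name__ == "__main__":
    print(explore_network("gamma-ketohydroperoxide"))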
Osama Abdin, Philip M. Kim
Published: 15 July 2021
Nature Computational Science, Volume 1, pp 456-457; doi:10.1038/s43588-021-00104-0

Abstract:
A graph-neural-network-based framework is proposed for the refinement of protein structure models, substantially improving the efficacy and efficiency of refining protein models when compared with the state-of-the-art approaches.
Jie Pan
Published: 15 July 2021
Nature Computational Science, Volume 1, pp 453-453; doi:10.1038/s43588-021-00107-x

Ananya Rastogi
Published: 15 July 2021
Nature Computational Science, Volume 1, pp 455-455; doi:10.1038/s43588-021-00106-y

Xiaoyang Jing et al.
Published: 15 July 2021
Nature Computational Science, Volume 1, pp 462-469; doi:10.1038/s43588-021-00098-9

Abstract:
Protein model refinement is the last step applied to improve the quality of a predicted protein model. Currently, the most successful refinement methods rely on extensive conformational sampling and thus take hours or days to refine even a single protein model. Here, we propose a fast and effective model refinement method that applies graph neural networks (GNNs) to predict a refined inter-atom distance probability distribution from an initial model and then rebuilds three-dimensional models from the predicted distance distribution. Tested on the Critical Assessment of Structure Prediction refinement targets, our method has an accuracy that is comparable to those of two leading human groups (FEIG and BAKER), but runs substantially faster. Our method can refine one protein model within ~11 min on one CPU, whereas BAKER needs ~30 h on 60 CPUs and FEIG needs ~16 h on one GPU. Finally, our study shows that GNN outperforms ResNet (convolutional residual neural networks) for model refinement when very limited conformational sampling is allowed.

Deep graph neural networks can refine a predicted protein model efficiently using fewer computing resources. The accuracy is comparable to that of the leading physics-based methods that rely on time-consuming conformational sampling.
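The reconstruction step described here, rebuilding three-dimensional coordinates from a predicted inter-atom distance distribution, can be illustrated with a toy distance-geometry fit. The NumPy sketch below assumes a matrix of target distances is already available (generated here from a random reference structure, standing in for a GNN's prediction) and fits coordinates to it by gradient descent; rebuild_coordinates is a hypothetical helper for illustration, not the authors' reconstruction protocol.

# Illustrative toy: fit 3-D coordinates to a target inter-atom distance matrix
# by gradient descent. The target matrix stands in for the distance
# distribution a GNN would predict; this is not the paper's actual protocol.
import numpy as np


def rebuild_coordinates(target_dist, steps=5000, lr=0.001, seed=0):
    """Minimize the squared error between pairwise distances and target distances."""
    rng = np.random.default_rng(seed)
    n = target_dist.shape[0]
    coords = rng.normal(size=(n, 3))
    for _ in range(steps):
        diff = coords[:, None, :] - coords[None, :, :]   # (n, n, 3) pairwise offsets
        dist = np.linalg.norm(diff, axis=-1)              # (n, n) pairwise distances
        np.fill_diagonal(dist, 1.0)                       # avoid division by zero
        err = dist - target_dist
        np.fill_diagonal(err, 0.0)
        # gradient of 0.5 * sum(err**2) with respect to the coordinates
        grad = 2.0 * ((err / dist)[:, :, None] * diff).sum(axis=1)
        coords -= lr * grad
    return coords


if __name__ == "__main__":
    # Fake "predicted" distances taken from a random 20-point reference structure.
    ref = np.random.default_rng(1).normal(size=(20, 3))
    target = np.linalg.norm(ref[:, None, :] - ref[None, :, :], axis=-1)
    rebuilt = rebuild_coordinates(target)
    final = np.linalg.norm(rebuilt[:, None, :] - rebuilt[None, :, :], axis=-1)
    print("mean absolute distance error:", np.abs(final - target).mean())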
Fernando Chirigati
Published: 15 July 2021
Nature Computational Science, Volume 1, pp 454-454; doi:10.1038/s43588-021-00108-w

Published: 12 July 2021
Nature Computational Science, Volume 1, pp 450-452; doi:10.1038/s43588-021-00102-2

Abstract:
Gravitational-wave discoveries have ignited a new era of astronomy. Numerical relativity plays a crucial role in modeling gravitational-wave sources for current and next-generation observatories, but it doesn’t come without computational challenges.
Correction
Carla Lupo, François Jamet, Wai Hei Terence Tse et al.
Nature Computational Science, Volume 1, pp 502-502; doi:10.1038/s43588-021-00105-z

Published: 24 June 2021
Nature Computational Science, Volume 1, pp 385-385; doi:10.1038/s43588-021-00096-x

Abstract:
The shift to virtual meetings has made networking harder, but it has also brought forth benefits to the scientific community that should be embraced moving forward.