# LitMatter
A template for rapidly experimenting with and scaling up deep learning models on molecular and crystal graphs.
## How to use
1. Clone this repository and start editing, or save it and use it as a template for new projects.
2. Edit `lit_models/models.py` with the PyTorch code for your model of interest (a minimal sketch follows this list).
3. Edit `lit_data/data.py` to load and process your PyTorch datasets.
4. Perform interactive experiments in `prototyping.ipynb`.
5. Scale network training to any number of GPUs using the example batch scripts.
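As a rough illustration of steps 2 and 4, here is a minimal sketch of what a module in `lit_models/models.py` can look like. The class name, loss, and optimizer are placeholders, not the template's actual API:

```
import torch
from torch import nn
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    """Minimal LightningModule wrapping an arbitrary PyTorch model."""

    def __init__(self, net: nn.Module, lr: float = 1e-3):
        super().__init__()
        self.net = net
        self.lr = lr

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self(x), y)  # swap in your task's loss
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)
```

From `prototyping.ipynb`, `pl.Trainer(max_epochs=10).fit(LitModel(net), train_dataloaders=loader)` is enough to start experimenting; the same module is reused unchanged by the batch scripts.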
## Principles
LitMatter uses [PyTorch Lightning](https://zenodo.org/record/3828935#.YSP51x0pC-F) to organize PyTorch code so scientists can rapidly experiment with geometric deep learning and scale up to hundreds of GPUs without difficulty. Many amazing applied ML methods (even those with open-source code) are never used by the wider community because the important details are buried in hundreds of lines of boilerplate code. It may require a significant engineering effort to get the method working on a new dataset and in a different computing environment, and it can be hard to justify this effort before verifying that the method will provide some advantage. Packaging your code with the LitMatter template makes it easy for other researchers to experiment with your models and scale them beyond common benchmark datasets.
## Features
* Maximum flexibility. LitMatter supports arbitrary PyTorch models and dataloaders.
* Eliminate boilerplate. Engineering code is abstracted away, but still accessible if needed.
* Full end-to-end pipeline. Data processing, model construction, training, and inference can be launched from the command line, in a Jupyter notebook, or through a SLURM job (see the scaling sketch after this list).
* Lightweight. Using the template is *easier* than not using it; it reduces infrastructure overhead for simple and complex deep learning projects.
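In practice, scaling comes down to `Trainer` arguments. Below is a minimal sketch assuming the PyTorch Lightning 1.5.x flags listed under Environment; `make_trainer` is an illustrative helper, not part of the template:

```
import pytorch_lightning as pl


def make_trainer(num_nodes: int = 1, gpus_per_node: int = 1) -> pl.Trainer:
    """Build a Trainer whose only scaling knobs are the node and GPU counts."""
    return pl.Trainer(
        max_epochs=100,
        gpus=gpus_per_node,   # PyTorch Lightning 1.5.x flag; newer releases use accelerator="gpu", devices=N
        num_nodes=num_nodes,  # typically taken from the SLURM allocation
        strategy="ddp",       # distributed data parallel across all GPUs
    )


# One GPU for prototyping, or many nodes from a batch script:
# make_trainer().fit(model, datamodule=dm)
# make_trainer(num_nodes=2, gpus_per_node=4).fit(model, datamodule=dm)
```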
## Examples
The example notebooks show how to use LitMatter to scale model training for different applications.
* [Prototyping GNNs](./prototyping.ipynb) - train an equivariant graph neural network to predict quantum properties of small molecules.
* [Neural Force Fields](./LitNFFs.ipynb) - train a neural force field on molecular dynamics trajectories of small molecules.
* [DeepChem](./LitDeepChem.ipynb) - train a PyTorch model in DeepChem on a MoleculeNet dataset.
* [🤗](./LitHF.ipynb) - train a 🤗 language model to generate molecules.
Note that these examples have additional dependencies beyond the core dependencies of LitMatter.
## References
If you use LitMatter for your own research and scaling experiments, please cite the following work:
[Frey, Nathan C., et al. "Scalable Geometric Deep Learning on Molecular Graphs." NeurIPS 2021 AI for Science Workshop. 2021.](https://arxiv.org/abs/2112.03364)
```
@inproceedings{frey2021scalable,
title={Scalable Geometric Deep Learning on Molecular Graphs},
author={Frey, Nathan C and Samsi, Siddharth and McDonald, Joseph and Li, Lin and Coley, Connor W and Gadepally, Vijay},
booktitle={NeurIPS 2021 AI for Science Workshop},
year={2021}
}
```
Please also cite the relevant frameworks: [PyG](https://arxiv.org/abs/1903.02428), [PyTorch Distributed](https://arxiv.org/abs/2006.15704), [PyTorch Lightning](https://github.com/PyTorchLightning/pytorch-lightning),
and any extensions you use:
[🤗](https://arxiv.org/abs/1910.03771), [DeepChem](https://github.com/deepchem/deepchem#citing-deepchem), [NFFs](https://github.com/learningmatter-mit/NeuralForceField#references), etc.
## Extensions
When you're ready to upgrade to fully configurable, reproducible, and scalable workflows, use [hydra-zen](https://github.com/mit-ll-responsible-ai/hydra-zen). hydra-zen [integrates seamlessly](https://mit-ll-responsible-ai.github.io/hydra-zen/how_to/pytorch_lightning.html) with LitMatter to self-document ML experiments and orchestrate multiple training runs for extensive hyperparameter sweeps.
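As a rough sketch of the pattern (the config and task names below are illustrative, not taken from the linked how-to), hydra-zen wraps the same objects LitMatter already uses in structured configs and launches sweeps over them:

```
import pytorch_lightning as pl
from hydra_zen import builds, instantiate, launch, make_config

# Structured configs that record every hyperparameter of the run.
TrainerConf = builds(pl.Trainer, max_epochs=10, gpus=1)
Config = make_config(trainer=TrainerConf)  # add model/data configs the same way


def task(cfg):
    trainer = instantiate(cfg.trainer)
    # trainer.fit(instantiate(cfg.model), datamodule=instantiate(cfg.data))


# A hyperparameter sweep is a multirun over config overrides:
# launch(Config, task, overrides=["trainer.max_epochs=10,50,100"], multirun=True)
```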
## Environment
Version management in Python is never fun and deep learning dependencies are always changing, but here are the latest tested versions of key dependencies for *LitMatter*:
* Python 3.8
* PyTorch Lightning 1.5.1
* PyTorch 1.10.0
## Disclaimer
DISTRIBUTION STATEMENT A. Approved for public release. Distribution is unlimited.
© 2021 MASSACHUSETTS INSTITUTE OF TECHNOLOGY
Subject to FAR 52.227-11 - Patent Rights - Ownership by the Contractor (May 2014)
SPDX-License-Identifier: MIT
This material is based upon work supported by the Under Secretary of Defense for Research and Engineering under Air Force Contract No. FA8702-15-D-0001. Any opinions, findings, conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Under Secretary of Defense for Research and Engineering.
The software/firmware is provided to you on an As-Is basis.