# BrepGen: A B-rep Generative Diffusion Model with Structured Latent Geometry (SIGGRAPH 2024)
[![arXiv](https://img.shields.io/badge/📄-arXiv%20-red.svg)](https://arxiv.org/abs/2401.15563)
[![webpage](https://img.shields.io/badge/🌐-Website%20-blue.svg)](https://brepgen.github.io)
[![Youtube](https://img.shields.io/badge/📽️-Video%20-orchid.svg)](https://www.youtube.com/xxx)
*[Xiang Xu](https://samxuxiang.github.io/), [Joseph Lambourne](https://www.research.autodesk.com/people/joseph-george-lambourne/),
[Pradeep Jayaraman](https://www.research.autodesk.com/people/pradeep-kumar-jayaraman/), [Zhengqing Wang](https://www.linkedin.com/in/zhengqing-wang-485854241/?originalSubdomain=ca), [Karl Willis](https://www.karlddwillis.com/), and [Yasutaka Furukawa](https://yasu-furukawa.github.io/)*
![alt BrepGen](resources/teaser.jpg)
> We present a diffusion-based generative approach that directly outputs a CAD B-rep. BrepGen uses a novel structured latent geometry to encode the CAD geometry and topology. A top-down generation approach is used to denoise the faces, edges, and vertices.
## Requirements
### Environment (Tested)
- Linux
- Python 3.9
- CUDA 11.8
- PyTorch 2.2
- Diffusers 0.27
### Dependencies
Install PyTorch and the other dependencies:
```pip install -r requirements.txt```
Install Diffusers:
```pip install diffusers["torch"] transformers```
Install OCCWL following the instructions [here](https://github.com/AutodeskAILab/occwl).
Note: try [building from source](https://github.com/krrish94/chamferdist?tab=readme-ov-file#building-from-source) if ```pip install chamferdist``` does not work.
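A quick way to confirm the environment is wired up correctly is to import the main dependencies. This is just an illustrative sanity check, not part of the repository:
```
# Sanity-check the environment (illustrative only, not part of the repo).
import torch
import diffusers
import transformers
from occwl.solid import Solid            # OCCWL wraps the pythonocc B-rep types
from chamferdist import ChamferDistance  # used for point-cloud distances

print("PyTorch", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("Diffusers", diffusers.__version__, "| Transformers", transformers.__version__)
```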
## Data
Download [ABC](https://archive.nyu.edu/handle/2451/43778) STEP files (100 folders), or the [Furniture Data](https://drive.google.com/file/d/16nXl7OXOZtPxRhkGobOezDTXiBisEVs2/view?usp=sharing).
The faces, edges, and vertices need to be extracted from the STEP files.
Process the B-reps (under the ```data_process``` folder):
```sh process.sh```
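For reference, the extraction that ```process.sh``` drives amounts to walking each solid's topology and recording its faces, edges, and vertices. The sketch below is illustrative only and assumes OCCWL's ```load_step``` loader and ```Solid``` accessors; the actual preprocessing lives in ```process_brep.py```:
```
# Illustrative sketch only; the real preprocessing is in data_process/process_brep.py.
# Assumes occwl.io.load_step and the Solid accessors shown below.
from occwl.io import load_step

def summarize_step(step_path):
    solids = load_step(step_path)       # list of occwl.solid.Solid
    for i, solid in enumerate(solids):
        faces = list(solid.faces())     # B-rep faces (trimmed surfaces)
        edges = list(solid.edges())     # B-rep edges (trimmed curves)
        verts = list(solid.vertices())  # B-rep vertices (3D points)
        print(f"solid {i}: {len(faces)} faces, {len(edges)} edges, {len(verts)} vertices")

summarize_step("00000001.step")         # hypothetical ABC file name
```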
Remove duplicate CAD models (under the ```data_process``` folder; the default deduplication precision is ```6 bit```):
```sh deduplicate.sh```
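The deduplication step drops CAD models whose geometry collides under a quantized hash. The sketch below is a hedged illustration of that idea, assuming each model is summarized by surface point samples normalized to [-1, 1] and quantized to 6 bits; the repository's actual logic is in ```deduplicate_cad.py``` and ```deduplicate_surfedge.py```:
```
# Hedged illustration of hash-based deduplication; see data_process/deduplicate_cad.py
# for the real implementation.
import hashlib
import numpy as np

def model_hash(points, n_bits=6):
    """Quantize normalized points to n_bits and hash the result."""
    levels = 2 ** n_bits - 1
    quantized = np.round((points + 1.0) / 2.0 * levels).astype(np.int32)
    return hashlib.sha256(quantized.tobytes()).hexdigest()

def deduplicate(models):
    """models: iterable of (name, (N, 3) point array). Keeps the first of each hash."""
    seen, unique = set(), []
    for name, points in models:
        h = model_hash(points)
        if h not in seen:
            seen.add(h)
            unique.append(name)
    return unique
```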
We also provide the post-processed data for [DeepCAD](https://drive.google.com/drive/folders/1N_60VCZKYgPviQgP8lwCOVXrzu9Midfe?usp=drive_link), [ABC](https://drive.google.com/drive/folders/1bA90Rz5EcwaUhUrgFbSIpgdJ0aeDjy3v?usp=drive_link), and [Furniture](https://drive.google.com/drive/folders/13TxFFSXqT4IgyIwO4z6gbm4jg3JrnbZL?usp=drive_link).
## Training
Train the surface and edge VAEs (wandb is used for logging):
```sh train_vae.sh```
Train the latent diffusion model (update the paths to the previously trained VAEs):
```sh train_ldm.sh```
- ```--cf```: enables classifier-free training for the Furniture dataset.
- ```--data_aug```: randomly rotates the CAD model during training (optional); see the sketch below.
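As a rough illustration of what ```--data_aug``` does, random-rotation augmentation draws one rotation per model and applies it consistently to the face, edge, and vertex coordinates. This is a hedged sketch, not the trainer's code (see ```trainer.py``` for the actual behavior):
```
# Hedged sketch of random-rotation augmentation; the actual --data_aug logic is in trainer.py.
import numpy as np

def random_rotation_matrix(rng):
    """Random 3D rotation via QR decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q = q * np.sign(np.diag(r))   # fix column signs so the factorization is unique
    if np.linalg.det(q) < 0:      # ensure a proper rotation (det = +1)
        q[:, 0] = -q[:, 0]
    return q

def augment(face_pts, edge_pts, vert_pts, rng=None):
    """Rotate all (..., 3) coordinate arrays of one model by the same random rotation."""
    rng = rng or np.random.default_rng()
    rot = random_rotation_matrix(rng)
    return face_pts @ rot.T, edge_pts @ rot.T, vert_pts @ rot.T
```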
## Generation and Evaluation
Randomly generate B-reps from Gaussian noise; both STEP and STL files will be saved:
```python sample.py --mode abc```
This loads the settings in ```eval_config.yaml```. Make sure to update the model paths to the correct folders.
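After sampling, a quick way to eyeball the outputs is to reload the saved STL meshes. A minimal check, assuming ```trimesh``` is installed (it is not in this repo's requirements) and that the outputs landed in a hypothetical ```samples/``` folder:
```
# Quick post-generation check on the saved STL meshes (illustrative only).
# Assumes trimesh is installed and "samples/" is where the outputs were written.
import glob
import trimesh

for path in sorted(glob.glob("samples/*.stl")):
    mesh = trimesh.load(path)
    print(f"{path}: {len(mesh.faces)} triangles, watertight={mesh.is_watertight}")
```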
Run this script for evaluation (change the path to the generated data folder; at least 3,000 samples are required):
```sh eval.sh```
This computes the JSD, MMD, and COV scores. Please also download the sampled point clouds for the [test set](https://drive.google.com/drive/folders/1kqxSDkS2gUN9_qpuWotFDhl4t7czbfOc?usp=sharing).
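For reference, MMD and COV compare point clouds sampled from the generated shapes against the test set under a Chamfer distance. A condensed, illustrative sketch is below; the repository's full evaluation, including JSD, lives in ```pc_metric.py```:
```
# Condensed sketch of Chamfer-based MMD and COV (illustrative only; see pc_metric.py).
# gen and ref are lists of (N, 3) torch tensors of sampled surface points.
import torch

def chamfer(a, b):
    """Symmetric Chamfer distance between two (N, 3) point clouds."""
    d = torch.cdist(a, b)  # pairwise distances
    return (d.min(dim=1).values.mean() + d.min(dim=0).values.mean()).item()

def mmd_cov(gen, ref):
    dist = torch.tensor([[chamfer(g, r) for r in ref] for g in gen])  # (|gen|, |ref|)
    mmd = dist.min(dim=0).values.mean().item()             # avg distance from each ref to its nearest gen
    cov = dist.argmin(dim=1).unique().numel() / len(ref)   # fraction of ref shapes matched by some gen
    return mmd, cov
```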
## Pretrained Checkpoint
We also provide individual checkpoints trained on each dataset.
| **Source Dataset** | **VAE** | **Latent Diffusion** |
|--------------------|-----------|-----------|
| DeepCAD | [vae model](https://drive.google.com/drive/folders/1UZYqJ2EmTjzeTcNr_NL3bPpU4WrufvQa?usp=drive_link) | [latent diffusion model](https://drive.google.com/drive/folders/1jonuCzoTBFOKKlnaoGlbmhT6YlnH0lma?usp=drive_link) |
| ABC | [vae model](https://drive.google.com/drive/folders/18Ib9L0kpFf4ylZIRTCYFhXZB_GVIUm53?usp=drive_link) | [latent diffusion model](https://drive.google.com/drive/folders/1hv7ZUcU-L3J0LiONK60-TEh7sAN0zfve?usp=drive_link) |
| Furniture | [vae model](https://drive.google.com/drive/folders/1HT5h8b6mxcgBfz0Ciwue8nANcKgmRTd-?usp=drive_link) | [latent diffusion model](https://drive.google.com/drive/folders/1NxuZ9en6yWSkmb2pBQ97aFlWvtBSNnjU?usp=drive_link) |
## Acknowledgement
This research is partially supported by NSERC Discovery Grants with Accelerator Supplements and DND/NSERC Discovery Grant Supplement, NSERC Alliance Grants, and the John R. Evans Leaders Fund (JELF). We also thank Onshape for their support and for access to the publicly available CAD models.
## Citation
If you find our work useful in your research, please cite the following paper:
```
@article{xu2024brepgen,
title={BrepGen: A B-rep Generative Diffusion Model with Structured Latent Geometry},
author={Xu, Xiang and Lambourne, Joseph G and Jayaraman, Pradeep Kumar and Wang, Zhengqing and Willis, Karl DD and Furukawa, Yasutaka},
journal={arXiv preprint arXiv:2401.15563},
year={2024}
}
```