bytedance / ImageDream
- Thursday, December 21, 2023, 00:00:06
The code releasing for https://image-dream.github.io/
Peng Wang, Yichun Shi
Project Page | Paper | Demo
This part is the same as the original MVDream-threestudio setup. Skip it if you have already installed the environment.
Clone the model card from the Hugging Face ImageDream model page into ./extern/ImageDream/release_models/.
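The download step can be sketched as below; the clone URL is an assumption, so copy the exact one from the Hugging Face model page:

```shell
# Sketch of fetching the released weights; the Hugging Face clone URL is an
# assumption -- copy the exact one from the ImageDream model page.
set -e
target="./extern/ImageDream/release_models"
mkdir -p "$target"
# Checkpoints are stored with git-lfs, so install it before cloning:
# git lfs install
# git clone https://huggingface.co/<repo-id-from-model-page> "$target/ImageDream"
ls -d "$target"
```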
In the paper, we use the configuration with soft shading. In most cases it needs an A100 GPU to compute normals:
```shell
export PYTHONPATH=$PYTHONPATH:./extern/ImageDream

method="imagedream-sd21-shading"
image_file="./extern/ImageDream/assets/astronaut.png"
ckpt_file="./extern/ImageDream/release_models/ImageDream/sd-v2.1-base-4view-ipmv.pt"
cfg_file="./extern/ImageDream/imagedream/configs/sd_v2_base_ipmv.yaml"

python3 launch.py \
    --config configs/$method.yaml --train --gpu 0 \
    name="imagedream-sd21-shading" tag="astronaut" \
    system.prompt_processor.prompt="an astronaut riding a horse" \
    system.prompt_processor.image_path="${image_file}" \
    system.guidance.ckpt_path="${ckpt_file}" \
    system.guidance.config_path="${cfg_file}"
```
For the diffusion-only model, refer to the subdirectory ./extern/ImageDream/.
Check ./threestudio/scripts/run_imagedream.sh for a bash example.
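A minimal dry-run wrapper in the spirit of that script could look like the sketch below; it only echoes the command assembled from the example above (this is not the actual run_imagedream.sh, and the default paths are assumptions):

```shell
#!/bin/bash
# Hypothetical wrapper sketch -- not the actual run_imagedream.sh.
# It only echoes the assembled command; drop the `echo` to launch training.
image_file="${1:-./extern/ImageDream/assets/astronaut.png}"
prompt="${2:-an astronaut riding a horse}"
ckpt_file="./extern/ImageDream/release_models/ImageDream/sd-v2.1-base-4view-ipmv.pt"
cfg_file="./extern/ImageDream/imagedream/configs/sd_v2_base_ipmv.yaml"

echo python3 launch.py \
  --config configs/imagedream-sd21-shading.yaml --train --gpu 0 \
  name="imagedream-sd21-shading" tag="astronaut" \
  system.prompt_processor.prompt="$prompt" \
  system.prompt_processor.image_path="$image_file" \
  system.guidance.ckpt_path="$ckpt_file" \
  system.guidance.config_path="$cfg_file"
```

Passing a different image path and prompt as the first two arguments overrides the defaults via `${1:-...}` parameter expansion.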
Tip: the elevation of the object in the input image should be within [0, 30] degrees; otherwise, you may do image outpainting and follow tip 1.

If you find ImageDream helpful, please consider citing:
```bibtex
@article{wang2023imagedream,
  title={ImageDream: Image-Prompt Multi-view Diffusion for 3D Generation},
  author={Wang, Peng and Shi, Yichun},
  journal={arXiv preprint arXiv:2312.02201},
  year={2023}
}
```