Alpha-VLLM / LLaMA2-Accessory
An Open-source Toolkit for LLM Development
Installation: see docs/install.md.
Pretraining and fine-tuning: see docs/pretrain.md and docs/finetune.md.
Core contributors: Chris Liu, Ziyi Lin, Guian Fang, Jiaming Han, Renrui Zhang, Wenqi Shao, Peng Gao
If you find our code and papers useful, please cite:
@article{zhang2023llamaadapter,
  title   = {LLaMA-Adapter: Efficient Fine-tuning of Language Models with Zero-init Attention},
  author  = {Zhang, Renrui and Han, Jiaming and Liu, Chris and Gao, Peng and Zhou, Aojun and Hu, Xiangfei and Yan, Shilin and Lu, Pan and Li, Hongsheng and Qiao, Yu},
  journal = {arXiv preprint arXiv:2303.16199},
  year    = {2023}
}

@article{gao2023llamaadapterv2,
  title   = {LLaMA-Adapter V2: Parameter-Efficient Visual Instruction Model},
  author  = {Gao, Peng and Han, Jiaming and Zhang, Renrui and Lin, Ziyi and Geng, Shijie and Zhou, Aojun and Zhang, Wei and Lu, Pan and He, Conghui and Yue, Xiangyu and Li, Hongsheng and Qiao, Yu},
  journal = {arXiv preprint arXiv:2304.15010},
  year    = {2023}
}
Llama 2 is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.