LAION-AI / Open-Assistant
Open Assistant is a project meant to give everyone access to a great chat-based large language model.
We believe that by doing this we will create a revolution in innovation in language. In the same way that stable-diffusion helped the world make art and images in new ways, we hope Open Assistant can help improve the world by improving language itself.
We want to get to an initial MVP as fast as possible, by following the three steps outlined in the InstructGPT paper.
We can then take the resulting model and continue with completion sampling step 2 for the next iteration.
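For context, the three steps from the InstructGPT paper are: supervised fine-tuning on human demonstrations, training a reward model on human-ranked comparisons, and reinforcement-learning fine-tuning against that reward model. The following is only a schematic Python outline of that pipeline; every name in it is an illustrative placeholder, not Open Assistant code.

# Schematic outline of the three InstructGPT-style training steps.
# All names below are illustrative placeholders, not Open Assistant code;
# the step descriptions follow the InstructGPT paper (Ouyang et al., 2022).
from dataclasses import dataclass


@dataclass
class Demonstration:
    prompt: str
    response: str  # human-written assistant reply


@dataclass
class Comparison:
    prompt: str
    ranked_responses: list[str]  # candidate replies, best first, as ranked by humans


def step1_supervised_finetune(base_model, demos: list[Demonstration]):
    """Step 1: fine-tune the base language model on human demonstrations."""
    ...


def step2_train_reward_model(sft_model, comparisons: list[Comparison]):
    """Step 2: train a reward model that predicts which reply humans prefer."""
    ...


def step3_rl_finetune(sft_model, reward_model):
    """Step 3: optimize the fine-tuned model against the reward model (e.g. with PPO)."""
    ...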
We are not going to stop at replicating ChatGPT. We want to build the assistant of the future, able to not only write emails and cover letters, but do meaningful work, use APIs, dynamically research information, and much more, with the ability to be personalized and extended by anyone. And we want to do this in a way that is open and accessible, which means we must not only build a great assistant, but also make it small and efficient enough to run on consumer hardware.
All open source projects begin with people like you. Open source is the belief that if we collaborate we can together gift our knowledge and technology to the world for the benefit of humanity.
Fill out the contributor signup form
Join the LAION Discord Server!
We have a growing task list of issues. Find an issue that appeals to you and make a comment that you'd like to work on it. Include in your comment a brief description of how you'll solve the problem and if there are any open questions you want to discuss. Once a project coordinator has assigned the issue to you, start working on it.
If the issue is currently unclear but you are interested, please post in Discord and someone can help clarify the issue with more detail.
We're all working on different parts of Open Assistant together. To make contributions smoothly we recommend the following:
Install pre-commit and make sure all files have formatting fixed. This simplifies life for reviewers.
Make sure your changes can be merged into main without any problems. If there are changes to files you're working on, resolve them by updating your branch against main.
Additionally, if someone is working on an issue that interests you, ask if they need help on it or would like suggestions on how to approach the issue. If so, share wildly. If they seem to have a good handle on it, let them work on their solution until a challenge comes up.
A review finishes when all blocking comments are addressed and at least one owning reviewer has approved the PR. Be sure to acknowledge any non-blocking comments either by making the requested change, explaining why it's not being addressed now, or filing an issue to handle it later.
Work is organized in the project board.
Anything that is in the Todo column and not assigned is up for grabs, meaning we'd be happy if anyone did those tasks.
If you want to work on something, assign yourself to it or write a comment that you want to work on it and what you plan to do.
See scripts/backend-development/README.md for backend development and scripts/frontend-development/README.md for frontend development; follow the backend instructions first to make a backend available. There is also a minimal implementation of a frontend in the text-frontend folder.
We are using Python 3.10 for the backend.
Check out the High-Level Protocol Architecture.
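As a rough orientation for how a frontend might talk to the backend over that protocol, here is a minimal sketch. The base URL, endpoint paths, and payload fields are assumptions made for illustration only; the actual API is defined by the backend code and the protocol document.

# Hypothetical sketch of a frontend requesting a task from the backend and
# reporting a user response back. The port, endpoint paths, and fields are
# assumptions for illustration; see the High-Level Protocol Architecture and
# the backend code for the real protocol.
import json
import urllib.request

BACKEND_URL = "http://localhost:8080"  # assumed local backend address


def post_json(path: str, payload: dict) -> dict:
    request = urllib.request.Request(
        BACKEND_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)


# Ask the backend for the next task to present to a user (hypothetical endpoint).
task = post_json("/api/v1/tasks", {"type": "generic"})
# Send the user's reply back so the backend can record it (hypothetical endpoint).
post_json(f"/api/v1/tasks/{task['id']}/interaction", {"text": "user reply here"})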
If you are interested in just taking a look at the project, you can set up the entire stack needed to run Open Assistant, including the website, backend, and associated dependent services.
To start the demo, run this in the root directory:
docker compose up --build
Then, navigate to http://localhost:3000
and interact with the website. When
logging in, navigate to http://localhost:1080
to get the magic email login
link.
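If you'd rather have a script tell you when the containers are ready instead of refreshing the browser, a small unofficial helper like the one below can poll the two ports mentioned above. This is just a convenience sketch, not part of the project.

# Wait until the demo website (port 3000) and the local fake mail server used
# for the magic email login link (port 1080) respond over HTTP.
import time
import urllib.request


def wait_until_up(url: str, timeout: float = 300.0) -> None:
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            urllib.request.urlopen(url, timeout=5)
            print(f"{url} is up")
            return
        except OSError:
            time.sleep(3)  # containers still starting; retry shortly
    raise TimeoutError(f"{url} did not come up within {timeout} seconds")


wait_until_up("http://localhost:3000")  # the website
wait_until_up("http://localhost:1080")  # inbox for the magic login link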
The website is built using Next.js and is in the website
folder.
Install pre-commit
and run pre-commit install
to install the pre-commit hooks.
In case you haven't done this, have already committed, and CI is failing, you can run pre-commit run --all-files
to run the pre-commit hooks on all files.
Upon making a release on GitHub, all Docker images are automatically built and pushed to ghcr.io. The Docker images are tagged with the release version and the latest tag. Further, the Ansible playbook in ansible/dev.yaml is run to automatically deploy the built release to the dev machine.