> [!NOTE]
> Just looking for the docs? Go here: https://docs.privategpt.dev/
PrivateGPT is a production-ready AI project that allows you to ask questions about your documents using the power of Large Language Models (LLMs), even in scenarios without an Internet connection. 100% private: no data leaves your execution environment at any point.
The project provides an API offering all the primitives required to build private, context-aware AI applications. It follows and extends the OpenAI API standard, and supports both normal and streaming responses.
The API is divided into two logical blocks:

- High-level API, which abstracts all the complexity of a RAG (Retrieval Augmented Generation) pipeline implementation.
- Low-level API, which allows advanced users to implement their own complex pipelines.
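Because the API follows the OpenAI convention, a chat request against a locally running PrivateGPT server looks like a standard chat-completions call. The sketch below builds such a request; the base URL, port, and endpoint path are assumptions based on the OpenAI API convention, so check https://docs.privategpt.dev/ for the actual details.

```python
import json
import urllib.request

# Assumed address of a locally running PrivateGPT server (port is an assumption).
BASE_URL = "http://localhost:8001"

def build_chat_request(prompt: str, stream: bool = False) -> tuple[str, bytes]:
    """Build an OpenAI-style chat-completions request (URL and JSON body)."""
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # True asks the server for a streaming (chunked) response
    }
    return f"{BASE_URL}/v1/chat/completions", json.dumps(payload).encode("utf-8")

url, body = build_chat_request("What does my contract say about termination?")
# To actually send it (requires a running server):
# req = urllib.request.Request(url, data=body,
#                              headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

Setting `"stream": true` instead switches the server to incremental responses, which is what the UI uses to render tokens as they are generated.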
In addition to this, a working Gradio UI client is provided to test the API, together with a set of useful tools such as a bulk model download script, an ingestion script, a documents folder watch, etc.
👂 Need help applying PrivateGPT to your specific use case? Let us know more about it and we'll try to help! We are refining PrivateGPT through your feedback.
DISCLAIMER: This README is not updated as frequently as the documentation. Please check the docs for the latest updates!
Generative AI is a game changer for our society, but its adoption by companies of all sizes and in data-sensitive domains like healthcare or legal is limited by a clear concern: privacy. Not being able to ensure that your data stays fully under your control when using third-party AI tools is a risk those industries cannot take.
The first version of PrivateGPT was launched in May 2023 as a novel approach to address the privacy concern by using LLMs in a complete offline way. This was done by leveraging existing technologies developed by the thriving Open Source AI community: LangChain, LlamaIndex, GPT4All, LlamaCpp, Chroma and SentenceTransformers.
That version, which rapidly became a go-to project for privacy-sensitive setups and served as the seed for thousands of local-focused generative AI projects, was the foundation of what PrivateGPT is becoming today. It remains a simpler, more educational implementation for understanding the basic concepts required to build a fully local (and therefore private) ChatGPT-like tool.
If you want to keep experimenting with it, we have saved it in the primordial branch of the project.
It is strongly recommended to do a clean clone and install of this new version of PrivateGPT if you come from the previous, primordial version.
PrivateGPT is now evolving into a gateway to generative AI models and primitives, including completions, document ingestion, RAG pipelines and other low-level building blocks. We want to make it easier for any developer to build AI applications and experiences, and to provide an extensible architecture the community can keep contributing to.
Stay tuned to our releases to check all the new features and changes included.
Full documentation on installation, dependencies, configuration, running the server, deployment options, ingesting local documents, API details and UI features can be found here: https://docs.privategpt.dev/
Conceptually, PrivateGPT is an API that wraps a RAG pipeline and exposes its primitives.
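Conceptually, that pipeline boils down to a few steps: ingest documents as chunks, retrieve the chunks most relevant to a question, and feed them to the LLM as context alongside the question. The toy sketch below is not PrivateGPT's actual code; it substitutes a naive word-overlap ranking for real vector embeddings and an echoing stub for a real model, just to show the shape of the flow.

```python
def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank ingested chunks by naive word overlap with the question.

    A real RAG pipeline would use embeddings and a vector store instead.
    """
    q_words = set(question.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q_words & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

def answer(question: str, chunks: list[str], llm=None) -> str:
    """Retrieval Augmented Generation: prepend retrieved context to the prompt."""
    context = "\n".join(retrieve(question, chunks))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    llm = llm or (lambda p: p)  # stand-in "LLM" that just echoes its prompt
    return llm(prompt)

docs = [
    "The termination clause allows 30 days notice.",
    "Payment is due within 15 days of invoice.",
]
out = answer("What is the termination notice period?", docs)
```

Swapping the overlap ranking for an embedding-based retriever and the lambda for an actual model turns this skeleton into a real RAG pipeline; PrivateGPT's API exposes exactly those pieces as primitives.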
The design of PrivateGPT allows you to easily extend and adapt both the API and the RAG implementation. Some key architectural decisions are:

- Usage of LlamaIndex base abstractions such as `LLM`, `BaseEmbedding` or `VectorStore`, making it immediate to change the actual implementations of those abstractions.

Main building blocks:

- APIs are defined in `private_gpt:server:<api>`. Each package contains an `<api>_router.py` (FastAPI layer) and an `<api>_service.py` (the service implementation). Each Service uses LlamaIndex base abstractions instead of specific implementations, decoupling the actual implementation from its usage.
- Components are placed in `private_gpt:components:<component>`. Each Component is in charge of providing actual implementations to the base abstractions used in the Services - for example, `LLMComponent` is in charge of providing an actual implementation of an LLM (for example `LlamaCPP` or `OpenAI`).

Contributions are welcome! To ensure code quality, we have enabled several format and typing checks; just run `make check` before committing to make sure your code is OK. Remember to test your code! You'll find a `tests` folder with helpers, and you can run tests using the `make test` command.
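The decoupling between Services and Components can be illustrated with a minimal sketch. The class names below are simplified stand-ins, not PrivateGPT's real classes (the project uses LlamaIndex's base abstractions and an injector): the point is that the service layer depends only on an abstract LLM, so the concrete backend can be swapped without touching the service.

```python
from abc import ABC, abstractmethod

class BaseLLM(ABC):
    """Toy stand-in for an LLM abstraction such as LlamaIndex's LLM base class."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...

class EchoLLM(BaseLLM):
    """Stand-in for one concrete backend (e.g. a local llama.cpp model)."""

    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

class ChatService:
    """Service layer: depends only on the abstraction, injected at construction."""

    def __init__(self, llm: BaseLLM) -> None:
        self.llm = llm

    def chat(self, message: str) -> str:
        return self.llm.complete(message)

# Swap EchoLLM for any other BaseLLM implementation without changing ChatService.
service = ChatService(EchoLLM())
reply = service.chat("hello")
```

This is the same pattern that lets PrivateGPT switch, say, a local model for a remote OpenAI-compatible one through configuration rather than code changes.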
Interested in contributing to PrivateGPT? We have the following challenges ahead of us in case you want to give a hand, for example a `make` target to remove all contents of the `local_data` folder except `.gitignore`.

Join the conversation around PrivateGPT on our community channels.
Reference to cite if you use PrivateGPT in a paper:

```bibtex
@software{PrivateGPT_2023,
  author = {Martinez, I. and Gallego, D. and Orgaz, P.},
  month = {5},
  title = {PrivateGPT},
  url = {https://github.com/imartinez/privateGPT},
  year = {2023}
}
```