# 🌺 **Petals Server**
![Petals Logo](<INSERT_IMAGE_PLACEHOLDER_HERE>.png)
**One decentralized tool for text generation and community collaboration**
## Table of Contents
1. [Introduction](#intro)
2. [Requirements](#requirements)
3. [Installation](#installation)
4. [Usage](#usage)
5. [License](#license)
6. [Contact](#contact)
7. [Endpoints](#endpoints)
---
## 🌺 **Introduction** <span id="intro"></span>
Petals is a decentralized text generation network that connects users with large language models, letting the community pool its resources for efficient, collaborative text generation. With Petals Server, you can share your hardware (CPU and GPU) with the network while also using it to generate text on demand.
---
## 🌺 **Requirements** <span id="requirements"></span>
To get started with Petals Server, ensure you have the following prerequisites:
- Git for cloning the repository
- Python 3.11 or higher
- Operating system: Linux, macOS, or Windows with WSL (Windows Subsystem for Linux)
---
## 🌺 **Installation** <span id="installation"></span>
Follow these steps to install Petals Server on your local machine (a consolidated command block follows the list):
1. Clone the Git repository using `git clone https://github.com/ParisNeo/petals_server.git`
2. Navigate into the cloned directory (`cd petals_server`)
3. Install dependencies with pip by running `pip install -e .`
4. Launch the server with `petals_server`
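For convenience, here is the same sequence as a single shell block (assuming `pip` on your PATH belongs to a Python 3.11+ installation):

```bash
# Clone the repository and enter it
git clone https://github.com/ParisNeo/petals_server.git
cd petals_server

# Install the package and its dependencies in editable mode
pip install -e .

# Launch the server
petals_server
```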
---
## 🌺 **Usage** <span id="usage"></span>
Once installed and running, Petals Server acts as a decentralized text generation client while contributing your hardware resources to the network.
---
## 🌺 **License** <span id="license"></span>
Petals Server is licensed under the [Apache License v2.0](https://www.apache.org/licenses/LICENSE-2.0).
---
## 🌺 **Contact** <span id="contact"></span>
For questions or feedback, reach out to ParisNeo on Twitter (@SpaceNerduino), join the [Discord server](https://discord.gg/BDxacQmv), or subscribe to the r/lollms subreddit for community updates and discussions.
---
## 🌺 **Endpoints** <span id="endpoints"></span>
To explore all available endpoints, open `http://localhost:8000/docs` in your browser while the server is running.
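The interactive docs page is typically backed by a machine-readable OpenAPI schema. As a minimal sketch, assuming a standard FastAPI-style setup (suggested by the `/docs` route) where the schema is served at `/openapi.json`, you can inspect it from the command line:

```bash
# With the server running locally on the port used above (8000), fetch the
# OpenAPI schema and pretty-print it. The /openapi.json path is the usual
# FastAPI default and may differ in this project.
curl -s http://localhost:8000/openapi.json | python -m json.tool
```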