<h1 align="center">Welcome to wechat-chatgpt 👋</h1>
<p>
<img alt="Version" src="https://img.shields.io/badge/version-1.0.0-blue.svg?cacheSeconds=2592000" />
<a href="#" target="_blank">
<img alt="License: ISC" src="https://img.shields.io/badge/License-ISC-yellow.svg" />
</a>
<a href="https://twitter.com/fuergaosi" target="_blank">
<img alt="Twitter: fuergaosi" src="https://img.shields.io/twitter/follow/fuergaosi.svg?style=social" />
</a>
<a href="https://discord.gg/8fXNrxwUJH" target="blank">
<img src="https://img.shields.io/discord/1058994816446369832?label=Join%20Community&logo=discord&style=flat-square" alt="Join the wechat-chatgpt Discord community"/>
</a>
</p>
> Use ChatGPT on WeChat via wechaty
>
> English | [中文文档](README_ZH.md)

[Deploy on Railway](https://railway.app/template/dMLG70?referralCode=bIYugQ)
## 🌟 Features
- Interact with WeChat and ChatGPT:
  - Use ChatGPT on WeChat with [wechaty](https://github.com/wechaty/wechaty) and the [official API](https://openai.com/blog/introducing-chatgpt-and-whisper-apis)
  - Conversation support
  - Command settings
- Deployment and configuration options:
  - Dockerfile included, deployable with [docker](#use-with-docker)
  - Deployment with [docker compose](#use-with-docker-compose)
  - Deployment on [Railway](#use-with-railway) and [Fly.io](#use-with-flyio)
- Other features:
  - [Dall·E](https://labs.openai.com/) support
  - [Whisper](https://openai.com/blog/introducing-chatgpt-and-whisper-apis) support
  - Custom prompt support
  - Proxy support (in development)
## 🚀 Usage
- [Use with Railway](#use-with-railway) (PaaS, Free, Stable, ✅ Recommended)
- [Use with Fly.io](#use-with-flyio) (PaaS, Free, ✅ Recommended)
- [Use with docker](#use-with-docker) (Self-hosted, Stable, ✅ Recommended)
- [Use with docker compose](#use-with-docker-compose) (Self-hosted, Stable, ✅ Recommended)
- [Use with nodejs](#use-with-nodejs) (Self-hosted)
## Use with Railway
> Railway offers $5 of credit or 500 hours of runtime per month
1. Click the [Railway](https://railway.app/template/dMLG70?referralCode=bIYugQ) button to open the Railway deployment page
2. Click the `Deploy Now` button
3. Fill in the repository name and `OPENAI_API_KEY` (requires linking your GitHub account; see the example values after this list)
4. Click the `Deploy` button
5. Click the `View Logs` button and wait for the deployment to complete
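The variables requested by the Railway form are the same ones described in the Environment Variables table below; the values here are only placeholders:
```shell
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxx   # your OpenAI API key (placeholder)
MODEL=gpt-3.5-turbo                  # optional
```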
## Use with Fly.io
> Allocate 512 MB of memory to the application so it can run reliably
> Fly.io waives bills of up to $5 per month (the free allowance of three 256 MB VMs is not included in the bill)
1. Install [flyctl](https://fly.io/docs/getting-started/installing-flyctl/)
```shell
# macOS
brew install flyctl
# Windows
scoop install flyctl
# Linux
curl https://fly.io/install.sh | sh
```
2. Clone the project and enter the project directory
```shell
git clone https://github.com/fuergaosi233/wechat-chatgpt.git && cd wechat-chatgpt
```
3. Create a new app
```shell
flyctl launch
? Would you like to copy its configuration to the new app? No
? App Name (leave blank to use an auto-generated name): <YOUR APP NAME>
? Select region: <YOUR REGION>
? Would you like to setup a Postgresql database now? No
? Would you like to deploy now? No
```
4. Configure the environment variables
```shell
flyctl secrets set OPENAI_API_KEY="<YOUR OPENAI API KEY>" MODEL="<CHATGPT-MODEL>"
```
5. Deploy the app
```shell
flyctl deploy
```
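After the deploy finishes, the WeChat login QR code is printed in the application logs; you can follow them with the standard flyctl command:
```shell
flyctl logs
```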
## Use with docker
```sh
# pull image
docker pull holegots/wechat-chatgpt
# run container
docker run -d --name wechat-chatgpt \
-e OPENAI_API_KEY=<YOUR OPENAI API KEY> \
-e MODEL="gpt-3.5-turbo" \
-e CHAT_PRIVATE_TRIGGER_KEYWORD="" \
-v $(pwd)/data:/app/data/wechat-assistant.memory-card.json \
holegots/wechat-chatgpt:latest
# View the QR code to log in to wechat
docker logs -f wechat-chatgpt
```
> How to get an OpenAI API key? [Click here](https://platform.openai.com/account/api-keys)
## Use with docker compose
```sh
# Copy the configuration file according to the template
cp .env.example .env
# Edit the configuration file
vim .env
# Start the container
docker-compose up -d
# View the QR code to log in to wechat
docker logs -f wechat-chatgpt
```
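If you edit `.env` later, recreate the container so the new values are picked up (standard docker compose behavior, nothing project-specific):
```sh
docker-compose up -d --force-recreate
```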
## Use with nodejs
> You need Node.js 18.0.0 or above
```sh
# Clone the project
git clone https://github.com/fuergaosi233/wechat-chatgpt.git && cd wechat-chatgpt
# Install dependencies
npm install
# Copy the configuration file according to the template
cp .env.example .env
# Edit the configuration file
vim .env
# Start project
npm run dev
```
> Please make sure your WeChat account can log in to [WeChat on Web](https://wx.qq.com/)
## 🔧 Environment Variables
| Name | Description |
| ---- | ----------- |
| API | API endpoint of ChatGPT (can point to a custom proxy, see below) |
| OPENAI_API_KEY | Your OpenAI API key ([create a new secret key](https://platform.openai.com/account/api-keys)) |
| MODEL | ID of the model to use. Currently only `gpt-3.5-turbo` and `gpt-3.5-turbo-0301` are supported. |
| TEMPERATURE | Sampling temperature, between 0 and 2. Higher values (e.g. 0.8) make the output more random; lower values (e.g. 0.2) make it more focused and deterministic. |
| CHAT_TRIGGER_RULE | Trigger rule for private chats. |
| DISABLE_GROUP_MESSAGE | Disable ChatGPT replies in group chats. |
| CHAT_PRIVATE_TRIGGER_KEYWORD | Keyword that triggers a ChatGPT reply in WeChat private chats. |
| BLOCK_WORDS | Words that block a chat message (applies to both private and group chats; separate multiple words with `,`). |
| CHATGPT_BLOCK_WORDS | Words that block a ChatGPT reply (applies to both private and group chats; separate multiple words with `,`). |
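For reference, a `.env` along these lines should work (all values below are placeholders; adjust them to your setup):
```shell
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxx
MODEL=gpt-3.5-turbo
TEMPERATURE=0.6
CHAT_PRIVATE_TRIGGER_KEYWORD=
BLOCK_WORDS=
CHATGPT_BLOCK_WORDS=
```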
## 🛠️ Using a Custom ChatGPT API
> Uses [fuergaosi233/openai-proxy](https://github.com/fuergaosi233/openai-proxy)
```shell
# Clone the project
git clone https://github.com/fuergaosi233/openai-proxy
# Install dependencies
npm install && npm install -g wrangler && npm run build
# Deploy to Cloudflare Workers
npm run deploy
```
To use a custom domain (optional), add a `routes` entry to `wrangler.toml`:
```toml
routes = [
  { pattern = "Your Custom Domain", custom_domain = true },
]
```
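Once the proxy is deployed, point the bot at it through the `API` environment variable from the table above (the domain here is a placeholder):
```shell
API=https://your-openai-proxy.example.com
```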
## ⌨️ Commands
> Enter these commands in the WeChat chat box
```shell
/cmd help # Show help
/cmd prompt <PROMPT> # Set prompt
/cmd clear # Clear all sessions since last boot
```
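For example, to give the bot a persistent role (the prompt text below is only an illustration):
```shell
/cmd prompt You are a helpful assistant who replies in concise English.
```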
## ⨠Contributor
<a href="https://github.com/fuergaosi233/wechat-chatgpt/graphs/contributors">
<img src="https://contrib.rocks/image?repo=fuergaosi233/wechat
