# Gorilla: Large Language Model Connected with Massive APIs [[Project Website](https://shishirpatil.github.io/gorilla/)]
<img src="https://github.com/ShishirPatil/gorilla/blob/gh-pages/assets/img/logo.png" width=50% height=50%>
**:fire: Gorilla OpenFunctions** is a drop-in alternative for function calling! [Release Blog](https://gorilla.cs.berkeley.edu/blogs/4_open_functions.html)
**🟢 Gorilla is Apache 2.0** With Gorilla fine-tuned on MPT and Falcon, you can use Gorilla commercially with no obligations! :golf:
**:rocket: Try Gorilla in 60s** [![Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1DEBPsccVLF_aUnmD0FwPeHFrtdC0QIUP?usp=sharing)
:computer: Use [Gorilla in your CLI](https://github.com/gorilla-llm/gorilla-cli) with `pip install gorilla-cli`
**:newspaper_roll: Check out our paper!** [![arXiv](https://img.shields.io/badge/arXiv-2305.15334-<COLOR>.svg?style=flat-square)](https://arxiv.org/abs/2305.15334)
**:wave: Join our Discord!** [![Discord](https://img.shields.io/discord/1111172801899012102?label=Discord&logo=discord&logoColor=green&style=flat-square)](https://discord.gg/SwTyuTAxX3)
`Gorilla` enables LLMs to use tools by invoking APIs. Given a natural language query, Gorilla comes up with the semantically and syntactically correct API to invoke. With Gorilla, we are the first to demonstrate how to use LLMs to invoke 1,600+ (and growing) API calls accurately while reducing hallucination. We also release APIBench, the largest collection of APIs, curated and easy to train on! Join us as we try to expand the largest API store and teach LLMs how to invoke them! Hop on our Discord, open a PR, or email us if you would like to have your API incorporated as well.
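To make the workflow concrete, here is a minimal sketch of prompting a Gorilla checkpoint with plain `transformers`. It assumes the Apache 2.0 `gorilla-llm/gorilla-mpt-7b-hf-v0` weights on Hugging Face and a bare instruction-style prompt; see the Colab and [`inference/README.md`](inference/README.md) for the supported setup.

```python
# Minimal sketch: prompt a Gorilla checkpoint with a natural-language task and
# read back the generated API call. The model id and prompt format here are
# assumptions; follow inference/README.md for the supported setup.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gorilla-llm/gorilla-mpt-7b-hf-v0"  # Apache 2.0 checkpoint mentioned below
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",          # requires `accelerate`
    trust_remote_code=True,     # may be needed depending on the checkpoint
)

query = "I want to translate a sentence from English to French."
inputs = tokenizer(query, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

# Gorilla responds with the relevant API call (e.g. a Hugging Face pipeline call).
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```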
## News
- :fire: [11/16] Excited to release [Gorilla OpenFunctions](https://gorilla.cs.berkeley.edu/blogs/4_open_functions.html)
- 💻 [06/29] Released [gorilla-cli](https://github.com/gorilla-llm/gorilla-cli), LLMs for your CLI!
- 🟢 [06/06] Released Commercially usable, Apache 2.0 licensed Gorilla models
- :rocket: [05/30] Provided the [CLI interface](inference/README.md) to chat with Gorilla!
- :rocket: [05/28] Released Torch Hub and TensorFlow Hub Models!
- :rocket: [05/27] Released the first Gorilla model! [![Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1DEBPsccVLF_aUnmD0FwPeHFrtdC0QIUP?usp=sharing) or [:hugs:](https://huggingface.co/gorilla-llm/gorilla-7b-hf-delta-v0)!
- :fire: [05/27] Released the APIZoo contribution guide for community API contributions!
- :fire: [05/25] Released the APIBench dataset and the Gorilla evaluation code!
## Gorilla Gradio
**Try Gorilla LLM models in [HF Spaces](https://huggingface.co/spaces/gorilla-llm/gorilla-demo/) or [![Gradio Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1ktnVWPJOgqTC9hLW8lJPVZszuIddMy7y?usp=sharing)**
![gorilla_webUI_2](https://github.com/TanmayDoesAI/gorilla/assets/85993243/f30645bf-6798-4bd2-ac6e-6943840ae095)
## Get Started
Inference: Run Gorilla locally [`inference/README.md`](inference/README.md)
Evaluation: We have included prompts and responses for the APIBench with and without retrievers along with the Abstract Syntax Tree (AST) matching evaluation script at [evaluation](https://github.com/ShishirPatil/gorilla/tree/main/eval).
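The AST evaluation compares the API call generated by the model with the reference call at the syntax-tree level rather than as raw strings, so formatting and argument order do not cause false mismatches. Below is a toy illustration of the idea (a simplified sketch, not the repo's `ast_eval_*` scripts, which implement the full APIBench matching rules):

```python
# Toy illustration of AST-based matching: parse both calls and compare the
# function name and keyword arguments, ignoring whitespace and argument order.
# The actual eval-scripts implement dataset-specific matching on APIBench.
import ast

def call_signature(code: str):
    """Return (function name, {kwarg: literal}) for a single call expression."""
    node = ast.parse(code, mode="eval").body
    assert isinstance(node, ast.Call)
    name = ast.unparse(node.func)
    kwargs = {kw.arg: ast.unparse(kw.value) for kw in node.keywords}
    return name, kwargs

reference = "pipeline(task='translation_en_to_fr', model='t5-base')"
generated = "pipeline(model='t5-base', task='translation_en_to_fr')"

# True: same call, different keyword order
print(call_signature(reference) == call_signature(generated))
```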
## Repository Organization
Our repository organization is shown below.
- The `data` folder contains all the evaluation APIs (`APIBench`) and the community-contributed APIs.
- The `eval` folder contains all our evaluation code as well as the Gorilla outputs.
- The `inference` folder contains all the inference code for running Gorilla locally.
- [Coming Soon!] The `train` folder contains all the training code associated with Gorilla finetuning.
For our dataset collection, all 1,640 API documents are in `data/api`. We also include the `APIBench` dataset, created with self-instruct, in `data/apibench`. For evaluation, we convert this into an LLM-friendly chat format: the questions are in `eval/eval-data/questions` and the corresponding responses are in `eval/eval-data/responses`. The evaluation scripts are in `eval/eval-scripts`. This is entirely sufficient to train Gorilla yourself and reproduce our results. Please see [evaluation](https://github.com/ShishirPatil/gorilla/tree/main/eval) for details on how to use our evaluation pipeline.
Additionally, we have released all the model weights. `gorilla-7b-hf-v0` lets you invoke over 925 Hugging Face APIs. Similarly, `gorilla-7b-tf-v0` and `gorilla-7b-th-v0` cover 626 (exhaustive) TensorFlow Hub v2 APIs and 94 (exhaustive) Torch Hub APIs, respectively. `gorilla-mpt-7b-hf-v0` and `gorilla-falcon-7b-hf-v0` are Apache 2.0 licensed models (commercially usable) fine-tuned on MPT-7B and Falcon-7B, respectively. We will release a model combining all three, with generic chat capability and community-contributed APIs, as soon as we can scale our serving infrastructure. You can run Gorilla locally following the instructions in the `inference/` sub-directory, or use our hosted Gorilla chat completion API (see Colab)! If you have any suggestions, or if you run into any issues, please reach out to us through Discord or email, or raise a GitHub issue.
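For the hosted option, the Colab drives Gorilla through an OpenAI-compatible chat-completion interface. The sketch below shows that pattern; the `base_url` is a placeholder, so copy the actual endpoint and model name from the Colab.

```python
# Sketch of calling the hosted Gorilla chat-completion API through an
# OpenAI-compatible client. Replace the placeholder base_url with the
# endpoint given in the Colab notebook.
from openai import OpenAI

client = OpenAI(base_url="http://<gorilla-endpoint>/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="gorilla-7b-hf-v0",
    messages=[{"role": "user", "content": "I would like to translate from English to Chinese."}],
)
print(response.choices[0].message.content)
```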
```
gorilla
├── data
│ ├── api (TF/HF/TH APIs used in generating apibench)
│ │ ├── {api_name}_api.jsonl
│ ├── apibench (Evaluating LLM models) v-1.0
│ │ ├── {api_name}_train.jsonl, {api_name}_eval.jsonl
│ ├── apizoo (Contributed by the community - evolving)
│ │ ├── username1.json
│ │ ├── username2.json
│ │ ├── ...
├── eval
│ ├── README.md
│ ├── get_llm_responses.py
│ ├── eval-scripts
│ │ ├── ast_eval_{api_name}.py
│ ├── eval-data
│ │ ├── questions
│ │ │ ├── API name
│ │ │ │ ├── questions_{api_name}_{eval_metric}.jsonl
│ │ ├── responses
│ │ │ ├── API name
│ │ │ │ ├── responses_{api_name}_Gorilla_FT_{eval_metric}.jsonl
│ │ │ │ ├── responses_{api_name}_Gorilla_RT_{eval_metric}.jsonl
├── inference
│ ├── README.md
│ ├── serve
│ │ ├── gorilla_cli.py
│ │ ├── conv_template.py
├── train (Coming Soon!)
```
## Contributing Your API
We aim to build an open-source, one-stop shop for all the APIs LLMs can interact with! Any suggestions and contributions are welcome! Please see the details on [how to contribute](https://github.com/ShishirPatil/gorilla/tree/main/data/README.md). THIS WILL ALWAYS REMAIN OPEN SOURCE.
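Mechanically, community APIs live as one JSON file per contributor under `data/apizoo/` (see `username1.json` in the tree above). The sketch below is purely illustrative and its field names are assumptions; the authoritative schema is in the contribution guide linked above.

```python
# Hypothetical sketch of adding an APIZoo entry. All field names below are
# assumptions for illustration only; use the schema documented in data/README.md.
import json

entry = {
    "user_name": "username1",                       # assumed field
    "api_name": "my_library.do_something",          # assumed field
    "api_call": "my_library.do_something(x, y=1)",  # assumed field
    "functionality": "One-line description of what the API does.",  # assumed field
}

with open("data/apizoo/username1.json", "w") as f:
    json.dump([entry], f, indent=2)
```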
## FAQ(s)
1. I would like to use Gorilla commercially. Is there going to be an Apache 2.0 licensed version?
Yes! We now have models that you can use commercially without any obligations.
2. Can we use Gorilla with Langchain, Toolformer, AutoGPT etc?
Absolutely! You've highlighted a great aspect of our tools. Gorilla is an end-to-end model, specifically tailored to serve correct API calls without requiring any additional coding. It's designed to work as part of a wider ecosystem and can be flexibly integrated with other tools.
Langchain is a versatile developer tool. Its "agents" can efficiently swap in any LLM, Gorilla included, making it a highly adaptable solution for various needs.
AutoGPT, on the other hand, concentrates on the art of prompting the GPT series of models. It's worth noting that Gorilla, as a fully fine-tuned model, consistently shows remarkable accuracy and lower hallucination, outperforming GPT-4 at making specific API calls.
Now, when it comes to Toolformer, Toolformer zeroes in on a select set of tools, providing specialized functionalities. Gorilla, in contrast, targets the much broader and growing set of 1,600+ API calls described above, so it complements rather than competes with these tools.
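As a concrete (hedged) sketch of that kind of integration, a Gorilla checkpoint can be wrapped as an ordinary pipeline-backed LLM inside Langchain; the model id and import path below are assumptions and depend on your Langchain version.

```python
# Sketch: wrap a Gorilla checkpoint as a Langchain LLM via a transformers
# pipeline. Model id and import path are assumptions; newer Langchain releases
# move HuggingFacePipeline into the langchain_huggingface package.
from transformers import pipeline
from langchain_community.llms import HuggingFacePipeline

generator = pipeline(
    "text-generation",
    model="gorilla-llm/gorilla-mpt-7b-hf-v0",  # Apache 2.0 checkpoint mentioned above
    max_new_tokens=256,
    trust_remote_code=True,
)
llm = HuggingFacePipeline(pipeline=generator)

# The wrapped model can now be dropped into any Langchain agent or chain.
print(llm.invoke("I want an API to detect objects in an image."))
```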