
Together AI

caution

You are currently on a page documenting the use of Together AI models as text completion models. Many popular Together AI models are chat completion models.

You may be looking for this page instead.

Together AI offers an API to query 50+ leading open-source models in a couple of lines of code.

This example goes over how to use LangChain to interact with Together AI models.

Installation

%pip install --upgrade langchain-together

Environment

To use Together AI, you'll need an API key, which you can find here: https://api.together.ai/settings/api-keys. The key can be passed in as the init param together_api_key or set as the environment variable TOGETHER_API_KEY.
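
If you prefer, you can set the key programmatically before constructing the model. A minimal sketch, assuming "YOUR_API_KEY" is a placeholder for your actual key:

import os

# Option 1: set the API key as an environment variable
os.environ["TOGETHER_API_KEY"] = "YOUR_API_KEY"

# Option 2: pass it directly via the together_api_key init param,
# as shown (commented out) in the examples below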

Example

# Querying chat models with Together AI

from langchain_together import ChatTogether

# choose from our 50+ models here: https://docs.together.ai/docs/inference-models
chat = ChatTogether(
    # together_api_key="YOUR_API_KEY",
    model="meta-llama/Llama-3-70b-chat-hf",
)

# stream the response back from the model
for m in chat.stream("Tell me fun things to do in NYC"):
    print(m.content, end="", flush=True)

# if you don't want to do streaming, you can use the invoke method
# chat.invoke("Tell me fun things to do in NYC")
API Reference: ChatTogether

# Querying code and language models with Together AI

from langchain_together import Together

llm = Together(
    model="codellama/CodeLlama-70b-Python-hf",
    # together_api_key="..."
)

print(llm.invoke("def bubble_sort(): "))
API Reference: Together