
DeepInfra

DeepInfra is a serverless inference-as-a-service platform that provides access to a variety of LLMs and embedding models. This notebook goes over how to use LangChain with DeepInfra for language models.

Set the Environment API Key

Make sure to get your API key from DeepInfra. You have to log in and get a new token.

You are given 1 hour of free serverless GPU compute to test different models (see here). You can print your token with the deepctl auth token command.

# get a new token: https://deepinfra.com/login?from=%2Fdash

from getpass import getpass

DEEPINFRA_API_TOKEN = getpass()
import os

os.environ["DEEPINFRA_API_TOKEN"] = DEEPINFRA_API_TOKEN

Create the DeepInfra instance

You can also use our open-source deepctl tool to manage your model deployments. You can view a list of available parameters here.

from langchain_community.llms import DeepInfra

llm = DeepInfra(model_id="meta-llama/Llama-2-70b-chat-hf")
llm.model_kwargs = {
    "temperature": 0.7,
    "repetition_penalty": 1.2,
    "max_new_tokens": 250,
    "top_p": 0.9,
}
API Reference: DeepInfra
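The token and the generation parameters can also be supplied at construction time. A minimal sketch, assuming deepinfra_api_token and model_kwargs are accepted as constructor arguments (both are plain fields on the wrapper, as the attribute assignment above suggests):

# hypothetical all-in-one construction, equivalent to the env var +
# attribute assignment approach above
llm = DeepInfra(
    model_id="meta-llama/Llama-2-70b-chat-hf",
    deepinfra_api_token=DEEPINFRA_API_TOKEN,
    model_kwargs={"temperature": 0.7, "max_new_tokens": 250},
)
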
# run inferences directly via wrapper
llm("Who let the dogs out?")
'This is a question that has puzzled many people'
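Note that calling the wrapper object directly is deprecated in recent LangChain releases; invoke is the equivalent, non-deprecated entry point:

# invoke() replaces the deprecated direct call above
llm.invoke("Who let the dogs out?")
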
# run streaming inference
for chunk in llm.stream("Who let the dogs out?"):
    print(chunk)
 Will
Smith
.
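Streaming also works asynchronously. A minimal sketch using the astream method that LangChain LLMs inherit from the Runnable interface:

import asyncio

async def stream_answer():
    # astream yields chunks asynchronously, mirroring the sync stream() call above
    async for chunk in llm.astream("Who let the dogs out?"):
        print(chunk)

asyncio.run(stream_answer())
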

Create a Prompt Template

We will create a prompt template for Question and Answer.

from langchain_core.prompts import PromptTemplate

template = """Question: {question}

Answer: Let's think step by step."""

prompt = PromptTemplate.from_template(template)
API Reference: PromptTemplate
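To check exactly what the model will receive, you can render the template with a concrete question before wiring it into a chain:

# format() substitutes the {question} placeholder and returns the final prompt string
print(prompt.format(question="Can penguins reach the North pole?"))
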

Initialize the LLMChain

from langchain.chains import LLMChain

llm_chain = LLMChain(prompt=prompt, llm=llm)
API Reference: LLMChain

Run the LLMChain

Provide a question and run the LLMChain.

question = "Can penguins reach the North pole?"

llm_chain.run(question)
"Penguins are found in Antarctica and the surrounding islands, which are located at the southernmost tip of the planet. The North Pole is located at the northernmost tip of the planet, and it would be a long journey for penguins to get there. In fact, penguins don't have the ability to fly or migrate over such long distances. So, no, penguins cannot reach the North Pole. "