Nebula (Symbl.ai)
This guide will help you get started with Nebula, Symbl.ai's chat model.
Integration details
For detailed documentation, visit the API reference.
Model features: TODO
Setup
Credentials
To get started, request a Nebula API key and set the NEBULA_API_KEY environment variable to your key:
import getpass
import os
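# Securely prompt for the Nebula API key without echoing it to the console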
os.environ["NEBULA_API_KEY"] = getpass.getpass()
Installation
The integration is available in the langchain-community package.
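For example, it can be installed with pip; a minimal sketch, assuming a notebook environment (drop the % prefix when running in a shell):
%pip install --upgrade --quiet langchain-community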
Instantiation
from langchain_community.chat_models.symblai_nebula import ChatNebula
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
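# The Nebula API key is read from the NEBULA_API_KEY environment variable set earlier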
chat = ChatNebula(max_tokens=1024, temperature=0.5)
Invocation
messages = [
    SystemMessage(
        content="You are a helpful assistant that answers general knowledge questions."
    ),
    HumanMessage(content="What is the capital of France?"),
]
chat.invoke(messages)
AIMessage(content=[{'role': 'human', 'text': 'What is the capital of France?'}, {'role': 'assistant', 'text': 'The capital of France is Paris.'}])
Async
await chat.ainvoke(messages)
AIMessage(content=[{'role': 'human', 'text': 'What is the capital of France?'}, {'role': 'assistant', 'text': 'The capital of France is Paris.'}])
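In a notebook, top-level await works as shown above; in a plain Python script, the coroutine has to be driven by an event loop. A minimal sketch using asyncio and the chat and messages objects defined earlier:
import asyncio

async def main():
    # ainvoke returns the same AIMessage as invoke, without blocking the event loop
    response = await chat.ainvoke(messages)
    print(response.content)

asyncio.run(main())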
Streaming
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
The capital of France is Paris.
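Chat models also expose an async streaming method, astream, through the shared Runnable interface; a minimal sketch (run inside an async function or a notebook with top-level await):
async for chunk in chat.astream(messages):
    print(chunk.content, end="", flush=True)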
Batching
chat.batch([messages])
[AIMessage(content=[{'role': 'human', 'text': 'What is the capital of France?'}, {'role': 'assistant', 'text': 'The capital of France is Paris.'}])]
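Like the other Runnable methods, batch accepts an optional config; for example, concurrency across inputs can be capped. A minimal sketch, reusing the same messages list twice for illustration:
# Process multiple message lists, with at most two requests in flight at a time
chat.batch([messages, messages], config={"max_concurrency": 2})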