1. Preface

FastAPI helps us deploy interactive AI services simply and efficiently. It makes it easy to build APIs: developers can define the input and output formats a model needs and implement the corresponding business logic. Its asynchronous, high-performance architecture can sustain large numbers of concurrent prediction requests, giving users a smooth interactive experience. FastAPI also fits well with containerized deployment: an AI model service can easily be packaged as a Docker image and deployed and scaled across environments.

In short, FastAPI greatly improves the development efficiency and user experience of AI applications, offering comprehensive support for model deployment and interaction.

Building on 开源模型应用落地-FastAPI-助力模型交互-WebSocket篇五 (the previous article in this series), this article shows how to integrate a Tool that fetches real-time data and returns it as a stream.

2. Terminology
2.1 Tool

A Tool is an auxiliary mechanism designed to extend a language model's capabilities and practical usefulness. Code Interpreter and Knowledge Retrieval, for example, are both tools.
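Conceptually, a tool is just a named, described callable that the model can choose to invoke by name. A minimal plain-Python sketch of that idea (the `Tool` class, registry, and `invoke` dispatcher here are illustrative stand-ins, not LangChain's API):

```python
# A tool pairs a callable with a name and a description that the
# model can read when deciding whether to invoke it.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    func: Callable[[str], str]

def search(query: str) -> str:
    # Stand-in for a real web-search call.
    return f"results for: {query}"

TOOLS = {
    "search": Tool(
        name="search",
        description="Look up real-time information the model does not know.",
        func=search,
    )
}

def invoke(action: str, action_input: str) -> str:
    # The agent emits an action name; we dispatch to the matching tool.
    return TOOLS[action].func(action_input)

print(invoke("search", "Guangzhou weather"))  # results for: Guangzhou weather
```

LangChain's `@tool` decorator, used later in this article, automates exactly this bookkeeping: the function name becomes the tool name and the docstring becomes the description.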
2.2 LangChain's built-in tools

https://github.com/langchain-ai/langchain/tree/v0.1.16/docs/docs/integrations/tools

These built-in tools cover most common needs; see the link above for details on how to use each one.
2.3 LangChain's streaming methods

- stream: the basic streaming method, which yields the agent's actions and observations step by step.
- astream: the asynchronous counterpart, for scenarios that require async processing.
- astream_events: finer-grained streaming that emits every concrete event of the agent run (tool start and end, model start and end, and so on), making it easy to inspect and monitor the agent's execution in detail.
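The difference between returning one finished answer and streaming it token by token can be sketched with a plain async generator (stdlib only; the function names here are illustrative, not LangChain's API):

```python
import asyncio
from typing import AsyncIterator

async def generate_batch() -> str:
    # Batch style: the caller waits for the whole answer.
    return "广州 多云"

async def generate_stream() -> AsyncIterator[str]:
    # Streaming style: tokens become available one at a time,
    # so a UI can render them as they arrive.
    for token in ["广", "州", " ", "多", "云"]:
        yield token

async def main() -> list[str]:
    whole = await generate_batch()
    parts = [tok async for tok in generate_stream()]
    # Both paths carry the same content; only delivery differs.
    assert whole == "".join(parts)
    return parts

print(asyncio.run(main()))  # ['广', '州', ' ', '多', '云']
```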
2.4 langchainhub

LangChain Hub is a central collection of LangChain-related artifacts; its purpose is to let developers discover and share commonly used prompts, chains, agents, and more. Inspired by the Hugging Face Hub, it promotes community exchange and collaboration and drives the growth of the LangChain ecosystem. In the current architecture it lives inside LangSmith and focuses mainly on prompts.
2.5 asyncio

asyncio is a standard-library module for writing concurrent code; it provides the foundation for building asynchronous applications.

3. Prerequisites
3.1 Create a virtual environment and install dependencies

Add the Google Search and langchainhub dependencies on top of the previous environment:

```shell
conda create -n fastapi_test python=3.10
conda activate fastapi_test

pip install fastapi websockets uvicorn
pip install --quiet langchain-core langchain-community langchain-openai
pip install google-search-results langchainhub
```

3.2 Register a Google Search API account
See 开源模型应用落地-FastAPI-助力模型交互-WebSocket篇五 (the previous article in this series).
3.3 Generate a Google Search API key
4. Implementation
4.1 Streaming output with a Tool
````python
# -*- coding: utf-8 -*-
import asyncio
import os

from langchain.agents import create_structured_chat_agent, AgentExecutor
from langchain_community.utilities.serpapi import SerpAPIWrapper
from langchain_core.prompts import (
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
    ChatPromptTemplate,
)
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

os.environ["OPENAI_API_KEY"] = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"  # your OpenAI API key
os.environ["SERPAPI_API_KEY"] = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0, max_tokens=512)


@tool
def search(query: str):
    """只有需要了解实时信息或不知道的事情的时候才会使用这个工具,需要传入要搜索的内容。"""
    serp = SerpAPIWrapper()
    result = serp.run(query)
    print("实时搜索结果: ", result)
    return result


tools = [search]

template = '''Respond to the human as helpfully and accurately as possible. You have access to the following tools:

{tools}

Use a json blob to specify a tool by providing an action key (tool name) and an action_input key (tool input).

Valid "action" values: "Final Answer" or {tool_names}

Provide only ONE action per $JSON_BLOB, as shown:

```
{{
  "action": $TOOL_NAME,
  "action_input": $INPUT
}}
```

Follow this format:

Question: input question to answer
Thought: consider previous and subsequent steps
Action:
```
$JSON_BLOB
```
Observation: action result
... (repeat Thought/Action/Observation N times)
Thought: I know what to respond
Action:
```
{{
  "action": "Final Answer",
  "action_input": "Final response to human"
}}
```

Begin! Reminder to ALWAYS respond with a valid json blob of a single action. Use tools if necessary. Respond directly if appropriate. Format is Action:```$JSON_BLOB```then Observation'''

system_message_prompt = SystemMessagePromptTemplate.from_template(template)

human_template = '''{input}

{agent_scratchpad}

(reminder to respond in a JSON blob no matter what)'''

human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
print(prompt)

agent = create_structured_chat_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, handle_parsing_errors=True)


async def chat(params):
    events = agent_executor.astream_events(params, version="v2")
    async for event in events:
        kind = event["event"]
        if "on_chat_model_stream" == kind:
            content = event["data"]["chunk"].content
            if content and len(content) > 0:
                print(content)


asyncio.run(chat({"input": "广州现在天气如何?"}))
````
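Each event yielded by `astream_events` is a plain dict whose `data` field carries the message chunk, so the token filtering inside `chat` can be isolated in a small helper. A stdlib-only sketch (the `Chunk` class is an illustrative stand-in for `AIMessageChunk`, which exposes a `content` attribute; the helper name is an assumption, not part of the article's code):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Chunk:
    # Minimal stand-in for langchain_core's AIMessageChunk.
    content: str

def token_from_event(event: dict) -> Optional[str]:
    """Return the streamed token if the event is a chat-model chunk
    with non-empty content, otherwise None."""
    if event.get("event") != "on_chat_model_stream":
        return None
    content = event["data"]["chunk"].content
    return content if content else None

# Mirrors the event structure emitted by astream_events.
event = {"event": "on_chat_model_stream", "data": {"chunk": Chunk(content="天")}}
print(token_from_event(event))  # 天
print(token_from_event({"event": "on_tool_start", "data": {}}))  # None
```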
Invocation result — explanation:

Each streamed chunk has the following structure:

```
{'event': 'on_chat_model_stream', 'data': {'chunk': AIMessageChunk(content='天', id='run-92515b63-4b86-4af8-8515-2f84def9dfab')}, 'run_id': '92515b63-4b86-4af8-8515-2f84def9dfab', 'name': 'ChatOpenAI', 'tags': ['seq:step:3'], 'metadata': {'ls_provider': 'openai', 'ls_model_name': 'gpt-3.5-turbo', 'ls_model_type': 'chat', 'ls_temperature': 0.0, 'ls_max_tokens': 512, 'ls_stop': ['\nObservation']}}
```

Its event type is on_chat_model_stream, and each token of the answer arrives in its own event:

```
{'event': 'on_chat_model_stream', 'data': {'chunk': AIMessageChunk(content='气', id='run-92515b63-4b86-4af8-8515-2f84def9dfab')}, 'run_id': '92515b63-4b86-4af8-8515-2f84def9dfab', 'name': 'ChatOpenAI', 'tags': ['seq:step:3'], 'metadata': {'ls_provider': 'openai', 'ls_model_name': 'gpt-3.5-turbo', 'ls_model_type': 'chat', 'ls_temperature': 0.0, 'ls_max_tokens': 512, 'ls_stop': ['\nObservation']}}
```

4.2 Using a public prompt from langchainhub

Adjust the 4.1 code as follows:
```python
# -*- coding: utf-8 -*-
import asyncio
import os

from langchain import hub
from langchain.agents import create_structured_chat_agent, AgentExecutor
from langchain_community.utilities.serpapi import SerpAPIWrapper
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

os.environ["OPENAI_API_KEY"] = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"  # your OpenAI API key
os.environ["SERPAPI_API_KEY"] = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0, max_tokens=512)


@tool
def search(query: str):
    """只有需要了解实时信息或不知道的事情的时候才会使用这个工具,需要传入要搜索的内容。"""
    serp = SerpAPIWrapper()
    result = serp.run(query)
    print("实时搜索结果: ", result)
    return result


tools = [search]

# Pull the public structured-chat-agent prompt from LangChain Hub
# instead of writing the template by hand.
prompt = hub.pull("hwchase17/structured-chat-agent")
print(prompt)

agent = create_structured_chat_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, handle_parsing_errors=True)


async def chat(params):
    events = agent_executor.astream_events(params, version="v2")
    async for event in events:
        kind = event["event"]
        if "on_chat_model_stream" == kind:
            content = event["data"]["chunk"].content
            if content and len(content) > 0:
                print(content)


asyncio.run(chat({"input": "广州现在天气如何?"}))
```

Invocation result: the streaming output is the same as in 4.1.

4.3 Putting it all together
Adjust the code from 开源模型应用落地-FastAPI-助力模型交互-WebSocket篇五 (the previous article in this series) as follows:
```python
import os
from typing import Annotated

import uvicorn
from fastapi import (
    Depends,
    FastAPI,
    WebSocket,
    WebSocketException,
    WebSocketDisconnect,
    status,
)
from langchain import hub
from langchain.agents import create_structured_chat_agent, AgentExecutor
from langchain_community.utilities import SerpAPIWrapper
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

os.environ["OPENAI_API_KEY"] = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"  # your OpenAI API key
os.environ["SERPAPI_API_KEY"] = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"


class ConnectionManager:
    def __init__(self):
        self.active_connections: list[WebSocket] = []

    async def connect(self, websocket: WebSocket):
        await websocket.accept()
        self.active_connections.append(websocket)

    def disconnect(self, websocket: WebSocket):
        self.active_connections.remove(websocket)

    async def send_personal_message(self, message: str, websocket: WebSocket):
        await websocket.send_text(message)

    async def broadcast(self, message: str):
        for connection in self.active_connections:
            await connection.send_text(message)


manager = ConnectionManager()
app = FastAPI()


async def authenticate(websocket: WebSocket, userid: str, secret: str):
    if userid is None or secret is None:
        raise WebSocketException(code=status.WS_1008_POLICY_VIOLATION)
    print(f"userid: {userid}, secret: {secret}")
    if "12345" == userid and "xxxxxxxxxxxxxxxxxxxxxxxxxx" == secret:
        return "pass"
    else:
        return "fail"


@tool
def search(query: str):
    """只有需要了解实时信息或不知道的事情的时候才会使用这个工具,需要传入要搜索的内容。"""
    serp = SerpAPIWrapper()
    result = serp.run(query)
    print("实时搜索结果: ", result)
    return result


def get_prompt():
    return hub.pull("hwchase17/structured-chat-agent")


async def chat(query):
    global llm, tools
    agent = create_structured_chat_agent(llm, tools, get_prompt())
    agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, handle_parsing_errors=True)
    events = agent_executor.astream_events({"input": query}, version="v1")
    async for event in events:
        kind = event["event"]
        if "on_chat_model_stream" == kind:
            content = event["data"]["chunk"].content
            if content and len(content) > 0:
                print(content)
                yield content


@app.websocket("/ws")
async def websocket_endpoint(
    *,
    websocket: WebSocket,
    userid: str,
    permission: Annotated[str, Depends(authenticate)],
):
    await manager.connect(websocket)
    try:
        while True:
            text = await websocket.receive_text()
            if "fail" == permission:
                await manager.send_personal_message("authentication failed", websocket)
            else:
                if text is not None and len(text) > 0:
                    async for msg in chat(text):
                        await manager.send_personal_message(msg, websocket)
    except WebSocketDisconnect:
        manager.disconnect(websocket)
        print(f"Client #{userid} left the chat")
        await manager.broadcast(f"Client #{userid} left the chat")


if __name__ == "__main__":
    tools = [search]
    llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0, max_tokens=512)
    uvicorn.run(app, host="0.0.0.0", port=7777)
```

Client page:
```html
<!DOCTYPE html>
<html>
<head>
    <title>Chat</title>
</head>
<body>
<h1>WebSocket Chat</h1>
<form action="" onsubmit="sendMessage(event)">
    <label>USERID: <input type="text" id="userid" autocomplete="off" value="12345"/></label>
    <label>SECRET: <input type="text" id="secret" autocomplete="off" value="xxxxxxxxxxxxxxxxxxxxxxxxxx"/></label>
    <br/>
    <button onclick="connect(event)">Connect</button>
    <hr>
    <label>Message: <input type="text" id="messageText" autocomplete="off"/></label>
    <button>Send</button>
</form>
<ul id="messages"></ul>
<script>
    var ws = null;

    function connect(event) {
        var userid = document.getElementById("userid")
        var secret = document.getElementById("secret")
        ws = new WebSocket("ws://localhost:7777/ws?userid=" + userid.value + "&secret=" + secret.value);
        ws.onmessage = function (event) {
            var messages = document.getElementById('messages')
            var message = document.createElement('li')
            var content = document.createTextNode(event.data)
            message.appendChild(content)
            messages.appendChild(message)
        };
        event.preventDefault()
    }

    function sendMessage(event) {
        var input = document.getElementById("messageText")
        ws.send(input.value)
        input.value = ''
        event.preventDefault()
    }
</script>
</body>
</html>
```
Invocation results:

- User input: 你好 ("hello") — no tool call is triggered; the model answers directly.
- User input: 广州现在天气如何? ("What is the weather in Guangzhou right now?") — a tool call is triggered, and the model streams:

```
Action:
{
  "action": "Final Answer",
  "action_input": "广州现在的天气是多云,温度为87华氏度,降水概率为7%,湿度为76%,风力为7英里/小时。"
}
```

PS:
1. The output above is only meant to demonstrate the streaming effect; it contains redundant information, such as the "action": "Final Answer" wrapper, which should be filtered out according to your actual needs.
2. The page's output styling is for demonstration only and can be adjusted as needed.
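As a sketch of point 1, the accumulated stream can be post-processed to keep only the final answer text (a pure-Python illustration; the helper name and the regex-based parsing are assumptions, not part of the article's code):

```python
import json
import re

def extract_final_answer(streamed_text: str) -> str:
    """Pull the final answer out of a structured-chat agent's raw
    streamed output, which wraps it in an Action JSON blob."""
    # Find the JSON object that follows the "Action:" marker.
    match = re.search(r"Action:\s*(\{.*\})", streamed_text, re.DOTALL)
    if not match:
        return streamed_text  # nothing to strip
    try:
        blob = json.loads(match.group(1))
    except json.JSONDecodeError:
        return streamed_text  # malformed blob: pass through unchanged
    if blob.get("action") == "Final Answer":
        return blob.get("action_input", "")
    return streamed_text

raw = 'Action:{"action": "Final Answer","action_input": "多云,87华氏度"}'
print(extract_final_answer(raw))  # 多云,87华氏度
```

In a real deployment you would apply such a filter server-side before `send_personal_message`, so the client only ever sees clean answer text.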