Author: CSDN Blog
Introduction
LangGraph is a powerful framework for orchestrating conversational workflows, and Mem0 is an efficient memory system. This tutorial shows how to combine the two to build a customer support assistant with memory.
Environment Setup
First, install the required dependencies (the Mem0 SDK is published on PyPI as mem0ai, and ChatOpenAI lives in the langchain-openai package):

pip install langgraph mem0ai langchain langchain-openai openai

Basic Configuration
1. Import the required modules
from openai import OpenAI
from mem0 import Memory
from mem0.configs.base import MemoryConfig
from mem0.embeddings.configs import EmbedderConfig
from mem0.llms.configs import LlmConfig
from typing import Annotated, TypedDict, List
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage
2. Configure the basic parameters
# Centralized configuration
API_KEY = "your-api-key"
BASE_URL = "your-base-url"

# Configure the LLM
llm = ChatOpenAI(
    temperature=0,
    openai_api_key=API_KEY,
    openai_api_base=BASE_URL,
    model="qwen-turbo")

# Configure Mem0
config = MemoryConfig(
    llm=LlmConfig(
        provider="openai",
        config={"model": "qwen-turbo", "api_key": API_KEY, "openai_base_url": BASE_URL}),
    embedder=EmbedderConfig(
        provider="openai",
        config={"embedding_dims": 1536, "model": "text-embedding-v2", "api_key": API_KEY, "openai_base_url": BASE_URL}))

# Create the Mem0 memory instance used by the chatbot node below
mem0 = Memory(config=config)
Core Concepts
1. State definition
In LangGraph, we need to define the conversation state:

class State(TypedDict):
    messages: Annotated[List[HumanMessage | AIMessage], add_messages]
    mem0_user_id: str
This state contains:
messages: the list of messages in the current conversation
mem0_user_id: the user identifier used by Mem0 for memory retrieval
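The add_messages reducer in the Annotated type is what makes a node's returned messages accumulate onto the existing history instead of replacing it. A minimal sketch of that behavior (the message contents are invented for illustration):

from langchain_core.messages import HumanMessage, AIMessage
from langgraph.graph.message import add_messages

# add_messages merges two message lists, appending the new messages to the old ones
history = [HumanMessage(content="Hi, I ordered a laptop last week.")]
update = [AIMessage(content="Hello! Let me look up that order for you.")]
merged = add_messages(history, update)
print(len(merged))  # 2 -- the reply was appended to the history, not substituted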
2. Implementing the chatbot node
def chatbot(state: State):
    messages = state["messages"]
    user_id = state["mem0_user_id"]

    # Retrieve relevant memories
    memories = mem0.search(messages[-1].content, user_id=user_id)
    context = "\n".join([f"- {memory['memory']}" for memory in memories["results"]])

    # Build the system prompt
    system_prompt = f"""You are a helpful customer support assistant. Use the provided context to personalize your responses and remember user preferences and past interactions.
Relevant information from previous conversations:
{context}"""

    # Generate the reply and store the new exchange as a memory
    response = llm.invoke([SystemMessage(content=system_prompt)] + messages)
    mem0.add(f"User: {messages[-1].content}\nAssistant: {response.content}", user_id=user_id)
    return {"messages": [response]}
3. Building the graph structure
graph = StateGraph(State)
graph.add_node("chatbot", chatbot)
graph.add_edge(START, "chatbot")
graph.add_edge("chatbot", END)  # end the run after the chatbot node responds
compiled_graph = graph.compile()
Workflow Walkthrough
State initialization:
An initial state containing the user message and the user ID is created, then passed through the graph to the chatbot node.
Memory retrieval:
Mem0's search method looks up relevant historical memories and returns the most relevant ones by semantic similarity (see the sketch after this list).
Context integration:
The retrieved memories are folded into the system prompt so the model can take the historical context into account.
Response generation:
The LLM generates the reply, and the new exchange is stored back into Mem0.
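For illustration, here is roughly what the retrieval and context-building steps look like in isolation, reusing the mem0 instance configured above. The query and memory texts are placeholders, and the exact shape of the search result may vary between mem0 versions:

# Hypothetical, stand-alone look at the retrieval step (placeholder data)
results = mem0.search("Do you have my shipping address on file?", user_id="customer_123")
# e.g. results == {"results": [{"memory": "User prefers express shipping"}, ...]}
context = "\n".join(f"- {item['memory']}" for item in results["results"])
print(context)  # one bullet per retrieved memory, exactly as injected into the system prompt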
Running the Conversation
def run_conversation(user_input: str, mem0_user_id: str):
    config = {"configurable": {"thread_id": mem0_user_id}}
    state = {"messages": [HumanMessage(content=user_input)], "mem0_user_id": mem0_user_id}
    for event in compiled_graph.stream(state, config):
        for value in event.values():
            if value.get("messages"):
                print("Customer Support:", value["messages"][-1].content)
                return
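A note on the thread_id above: the graph is compiled without a checkpointer, so this configuration key currently has no effect on persistence (all long-term memory lives in Mem0). If you also want LangGraph to keep per-thread conversation state, one option, not part of the original article, is to compile with an in-memory checkpointer:

# Optional sketch: give LangGraph its own per-thread memory so "thread_id" is actually used
from langgraph.checkpoint.memory import MemorySaver

compiled_graph = graph.compile(checkpointer=MemorySaver())
# Each call that passes {"configurable": {"thread_id": ...}} now resumes that thread's state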
Main Program Example
if __name__ == "__main__":
    mem0_user_id = "customer_123"
    print("Welcome to Customer Support! How can I assist you today?")
    while (user_input := input("You: ")).lower() not in ['quit', 'exit', 'bye']:
        run_conversation(user_input, mem0_user_id)
Key Features
State management:
LangGraph provides a clear state-management mechanism and supports complex conversation-flow control.
Memory retrieval:
Mem0's semantic search surfaces the most relevant past interactions for each new user message.
Flow orchestration:
The compiled graph routes every turn through the chatbot node, keeping retrieval, generation, and memory writes in one place.
Complete Code and Example
from openai import OpenAI
from mem0 import Memory
from mem0.configs.base import MemoryConfig
from mem0.embeddings.configs import EmbedderConfig
from mem0.llms.configs import LlmConfig
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from typing import List, Dict

API_KEY = "your api key"
BASE_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1"

openai_client = OpenAI(
    api_key=API_KEY,
    base_url=BASE_URL,
)

llm = ChatOpenAI(
    temperature=0,
    openai_api_key=API_KEY,
    openai_api_base=BASE_URL,
    model="qwen-turbo")

config = MemoryConfig(
    llm=LlmConfig(
        provider="openai",
        config={"model": "qwen-turbo", "api_key": API_KEY, "openai_base_url": BASE_URL}),
    embedder=EmbedderConfig(
        provider="openai",
        config={"embedding_dims": 1536, "model": "text-embedding-v2", "api_key": API_KEY, "openai_base_url": BASE_URL}))

mem0 = Memory(config=config)

prompt = ChatPromptTemplate.from_messages([
    SystemMessage(content="""You are a helpful travel agent AI. Use the provided context to personalize your responses and remember user preferences and past interactions.
Provide travel recommendations, itinerary suggestions, and answer questions about destinations.
If you don't have specific information, you can make general suggestions based on common travel knowledge."""),
    MessagesPlaceholder(variable_name="context"),
    ("human", "{input}")])  # template tuple so the {input} variable is actually filled in


def retrieve_context(query: str, user_id: str) -> List[Dict]:
    """Retrieve relevant context from Mem0"""
    memories = mem0.search(query, user_id=user_id)
    serialized_memories = ' '.join([mem["memory"] for mem in memories["results"]])
    context = [{
        "role": "system",
        "content": f"Relevant information: {serialized_memories}"
    }, {
        "role": "user",
        "content": query
    }]
    return context


def generate_response(input: str, context: List[Dict]) -> str:
    """Generate a response using the language model"""
    chain = prompt | llm
    response = chain.invoke({"context": context, "input": input})
    return response.content


def save_interaction(user_id: str, user_input: str, assistant_response: str):
    """Save the interaction to Mem0"""
    interaction = [{
        "role": "user",
        "content": user_input
    }, {
        "role": "assistant",
        "content": assistant_response
    }]
    mem0.add(interaction, user_id=user_id)


def chat_turn(user_input: str, user_id: str) -> str:
    # Retrieve context
    context = retrieve_context(user_input, user_id)
    # Generate response
    response = generate_response(user_input, context)
    # Save interaction
    save_interaction(user_id, user_input, response)
    return response


if __name__ == "__main__":
    print("Welcome to your personal Travel Agent Planner! How can I assist you with your travel plans today?")
    user_id = "john"
    while True:
        user_input = input("You: ")
        if user_input.lower() in ['quit', 'exit', 'bye']:
            print("Travel Agent: Thank you for using our travel planning service. Have a great trip!")
            break

        response = chat_turn(user_input, user_id)
        print(f"Travel Agent: {response}")
Original article: https://blog.csdn.net/qq_41472205/article/details/148260565