Author: 米落枫    Posted: 7 hours ago
Title: Tutorial on Integrating the Mem0 Memory System in LangGraph
Source: CSDN Blog
Introduction

LangGraph is a powerful framework for orchestrating conversational workflows, and Mem0 is an efficient memory system. This tutorial shows how to combine the two to build a customer support assistant with memory.
Environment Setup

First, install the required dependencies. Note that the Mem0 package is published on PyPI as mem0ai (imported as mem0), and ChatOpenAI lives in the separate langchain-openai package:
pip install langgraph mem0ai langchain langchain-openai openai
Basic Configuration

1. Import the required modules
from openai import OpenAI
from mem0 import Memory
from mem0.configs.base import MemoryConfig
from mem0.embeddings.configs import EmbedderConfig
from mem0.llms.configs import LlmConfig
from typing import Annotated, TypedDict, List
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage
2. Configure the basic parameters
# Centralized configuration
API_KEY = "your-api-key"
BASE_URL = "your-base-url"

# Configure the LLM
llm = ChatOpenAI(
    temperature=0,
    openai_api_key=API_KEY,
    openai_api_base=BASE_URL,
    model="qwen-turbo",
)

# Configure Mem0: an LLM for memory extraction plus an embedding model for retrieval
config = MemoryConfig(
    llm=LlmConfig(
        provider="openai",
        config={"model": "qwen-turbo", "api_key": API_KEY, "openai_base_url": BASE_URL},
    ),
    embedder=EmbedderConfig(
        provider="openai",
        config={"embedding_dims": 1536, "model": "text-embedding-v2", "api_key": API_KEY, "openai_base_url": BASE_URL},
    ),
)

# Instantiate the memory store used by the chatbot node below
mem0 = Memory(config=config)
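
Before wiring Mem0 into the graph, it can help to sanity-check the configuration in isolation. The snippet below is a minimal sketch that uses only the add/search calls this tutorial already relies on; the sample text and user id are made up for illustration.

# Quick smoke test for the Mem0 setup above (sample text and user id are illustrative)
mem0.add("User: I prefer vegetarian food.\nAssistant: Noted, I'll remember that.", user_id="customer_123")
hits = mem0.search("What food does the user like?", user_id="customer_123")
for hit in hits["results"]:
    print("-", hit["memory"])
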
Core Concepts

1. State definition

In LangGraph, we define the conversation state as follows:
class State(TypedDict):
    messages: Annotated[List[HumanMessage | AIMessage], add_messages]
    mem0_user_id: str
This state contains two fields: messages, the running conversation history, which the add_messages reducer appends to rather than overwrites, and mem0_user_id, the identifier used to scope memory storage and retrieval to a single user.
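
For illustration, an initial state for a single turn looks like this (the sample question is made up; run_conversation later in this tutorial builds the same structure):

# Hypothetical initial state for one turn; when a node returns {"messages": [...]},
# add_messages appends those messages instead of replacing the list.
initial_state: State = {
    "messages": [HumanMessage(content="Hi, my order hasn't arrived yet.")],
    "mem0_user_id": "customer_123",
}
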
2. Chat node implementation
def chatbot(state: State):
    messages = state["messages"]
    user_id = state["mem0_user_id"]

    # Retrieve memories relevant to the latest user message
    memories = mem0.search(messages[-1].content, user_id=user_id)
    context = "\n".join([f"- {memory['memory']}" for memory in memories["results"]])

    # Build the system prompt
    system_prompt = f"""You are a helpful customer support assistant. Use the provided context to personalize your responses and remember user preferences and past interactions.
Relevant information from previous conversations:
{context}"""

    # Generate the reply and store the exchange as a new memory
    response = llm.invoke([SystemMessage(content=system_prompt)] + messages)
    mem0.add(f"User: {messages[-1].content}\nAssistant: {response.content}", user_id=user_id)

    return {"messages": [response]}
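
As the indexing in chatbot() suggests, mem0.search returns a dictionary with a "results" list whose items each carry a "memory" string; the join then turns them into bullet lines for the system prompt. The values below are invented purely to show how the context is assembled:

# Illustrative (made-up) shape of a mem0.search result, matching how chatbot() indexes it
memories = {
    "results": [
        {"memory": "User prefers email over phone support."},
        {"memory": "User's last order was delayed."},
    ]
}
context = "\n".join(f"- {memory['memory']}" for memory in memories["results"])
print(context)
# - User prefers email over phone support.
# - User's last order was delayed.
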
3. Building the graph
graph = StateGraph(State)
graph.add_node("chatbot", chatbot)
graph.add_edge(START, "chatbot")
# The node loops back to itself; run_conversation below stops the stream after the first reply
graph.add_edge("chatbot", "chatbot")
compiled_graph = graph.compile()
Workflow Walkthrough

Running a conversation
def run_conversation(user_input: str, mem0_user_id: str):
    config = {"configurable": {"thread_id": mem0_user_id}}
    state = {"messages": [HumanMessage(content=user_input)], "mem0_user_id": mem0_user_id}

    for event in compiled_graph.stream(state, config):
        for value in event.values():
            if value.get("messages"):
                print("Customer Support:", value["messages"][-1].content)
                return
Main program example
if __name__ == "__main__":
    mem0_user_id = "customer_123"
    print("Welcome to Customer Support! How can I assist you today?")

    while (user_input := input("You: ").lower()) not in ['quit', 'exit', 'bye']:
        run_conversation(user_input, mem0_user_id)
Key Features

The design above has three notable properties: each reply is grounded in memories retrieved with mem0.search before generation; every user/assistant exchange is written back with mem0.add so later turns can recall it; and memories are scoped per user via mem0_user_id, keeping different customers' histories separate.

Complete Code Example
from openai import OpenAI
from mem0 import Memory
from mem0.configs.base import MemoryConfig
from mem0.embeddings.configs import EmbedderConfig
from mem0.llms.configs import LlmConfig
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from typing import List, Dict

API_KEY = "your api key"
BASE_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1"

openai_client = OpenAI(
    api_key=API_KEY,
    base_url=BASE_URL,
)

llm = ChatOpenAI(
    temperature=0,
    openai_api_key=API_KEY,
    openai_api_base=BASE_URL,
    model="qwen-turbo",
)

config = MemoryConfig(
    llm=LlmConfig(
        provider="openai",
        config={"model": "qwen-turbo", "api_key": API_KEY, "openai_base_url": BASE_URL},
    ),
    embedder=EmbedderConfig(
        provider="openai",
        config={"embedding_dims": 1536, "model": "text-embedding-v2", "api_key": API_KEY, "openai_base_url": BASE_URL},
    ),
)

mem0 = Memory(config=config)

prompt = ChatPromptTemplate.from_messages([
    SystemMessage(content="""You are a helpful travel agent AI. Use the provided context to personalize your responses and remember user preferences and past interactions.
    Provide travel recommendations, itinerary suggestions, and answer questions about destinations.
    If you don't have specific information, you can make general suggestions based on common travel knowledge."""),
    MessagesPlaceholder(variable_name="context"),
    # Use a ("human", template) tuple so {input} is actually substituted;
    # a plain HumanMessage instance would be passed through with the literal text "{input}".
    ("human", "{input}"),
])


def retrieve_context(query: str, user_id: str) -> List[Dict]:
    """Retrieve relevant context from Mem0"""
    memories = mem0.search(query, user_id=user_id)
    serialized_memories = ' '.join([mem["memory"] for mem in memories["results"]])
    context = [
        {"role": "system", "content": f"Relevant information: {serialized_memories}"},
        {"role": "user", "content": query},
    ]
    return context


def generate_response(input: str, context: List[Dict]) -> str:
    """Generate a response using the language model"""
    chain = prompt | llm
    response = chain.invoke({"context": context, "input": input})
    return response.content


def save_interaction(user_id: str, user_input: str, assistant_response: str):
    """Save the interaction to Mem0"""
    interaction = [
        {"role": "user", "content": user_input},
        {"role": "assistant", "content": assistant_response},
    ]
    mem0.add(interaction, user_id=user_id)


def chat_turn(user_input: str, user_id: str) -> str:
    # Retrieve context
    context = retrieve_context(user_input, user_id)
    # Generate response
    response = generate_response(user_input, context)
    # Save interaction
    save_interaction(user_id, user_input, response)
    return response


if __name__ == "__main__":
    print("Welcome to your personal Travel Agent Planner! How can I assist you with your travel plans today?")
    user_id = "john"
    while True:
        user_input = input("You: ")
        if user_input.lower() in ['quit', 'exit', 'bye']:
            print("Travel Agent: Thank you for using our travel planning service. Have a great trip!")
            break

        response = chat_turn(user_input, user_id)
        print(f"Travel Agent: {response}")
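
For a quick test without the interactive loop, a single turn can be driven directly through chat_turn; the question below is just a placeholder.

# Hypothetical one-off call; the question is a placeholder
answer = chat_turn("I'd like a 5-day itinerary for Kyoto in autumn.", user_id="john")
print("Travel Agent:", answer)
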



Original article: https://blog.csdn.net/qq_41472205/article/details/148260565



