
Integrate LangChain Chat Memory Manager into your LLM apps and connect them with 422+ apps and services

Use Chat Memory Manager to easily build AI-powered applications with LangChain and integrate them with 422+ apps and services. n8n lets you seamlessly import data from files, websites, or databases into your LLM-powered application and create automated scenarios.

Popular ways to use Chat Memory Manager integration

  • Twilio Trigger node
  • OpenAI Chat Model node
  • Twilio node
  • Redis node
  • +7

Enhance Customer Chat by Buffering Messages with Twilio and Redis

This n8n workflow demonstrates a simple approach to improving chat UX by staggering an AI Agent's reply for users who send a sequence of partial messages in short bursts.

How it works
A Twilio webhook receives the user's messages, which are recorded on a message stack powered by Redis. The execution is paused for 5 seconds, and then the stack is checked again for the latest message. This check tells us whether the user is still sending messages or is waiting for a reply. If the latest message on the stack differs from the incoming message, the execution is aborted; if they are the same, it continues. In that case, the agent receives all messages buffered up to that point and can respond to them in a single reply.

Requirements
A Twilio account and an SMS-enabled phone number to receive messages. A Redis instance for the message stack. An OpenAI account for the language model.

Customising the workflow
This workflow should also work for other common messaging platforms such as WhatsApp and Telegram. Is 5 seconds too long or too short? Adjust the wait threshold to suit your customers.
Jimleuk
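
The buffering logic described in the template above is straightforward to prototype outside n8n as well. Below is a minimal Python sketch of the same idea, assuming a local Redis instance, the redis-py client, and a hypothetical reply_with_agent() helper standing in for the workflow's AI Agent / OpenAI Chat Model step.

```python
# Minimal sketch of the message-buffering approach, not the workflow itself.
import time
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
WAIT_SECONDS = 5  # adjust the wait threshold to suit your customers


def handle_incoming(session_id: str, message: str) -> str | None:
    stack_key = f"chat_buffer:{session_id}"
    r.rpush(stack_key, message)      # record the message on the Redis stack
    time.sleep(WAIT_SECONDS)         # pause, like the workflow's Wait step

    latest = r.lrange(stack_key, -1, -1)[0]
    if latest != message:            # user is still typing: abort this run
        return None

    buffered = r.lrange(stack_key, 0, -1)  # all partial messages so far
    r.delete(stack_key)                    # clear the buffer before replying
    return reply_with_agent("\n".join(buffered))  # one reply to the burst


def reply_with_agent(prompt: str) -> str:
    # Placeholder for the AI Agent / OpenAI Chat Model call in the workflow.
    return f"(agent reply to: {prompt})"
```

With this pattern, a burst of three partial messages produces a single agent reply instead of three, because the first two executions abort when they see a newer message on the stack.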
  • Limit node
  • Aggregate node
  • +4

Chat with OpenAI Assistant (by adding a memory)

OpenAI Assistant is a powerful tool, but at the time of writing it doesn't automatically remember past messages from a conversation. This workflow demonstrates how to get around this, by managing the chat history in n8n and passing it to the assistant when required. This makes it possible to use OpenAI Assistant for chatbot use cases. Note that to use this template, you need to be on n8n version 1.28.0 or later.
David Roberts
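
The core trick in this template, keeping the chat history yourself and handing it to the model on every call, can be sketched in a few lines of Python. This is only an illustration, assuming the official openai client (v1+) and using a plain chat completion call as a stand-in for the workflow's Assistant step; the history list plays the role of the memory that n8n manages for you.

```python
from openai import OpenAI

client = OpenAI()            # reads OPENAI_API_KEY from the environment
history: list[dict] = []     # the externally managed chat memory


def chat(user_message: str) -> str:
    # Append the new user turn, then send the entire history to the model,
    # mirroring how the workflow passes stored messages back when required.
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # illustrative model name
        messages=history,
    )
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer
```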

Supported modes

  • Get Many Messages: Retrieve chat messages from connected memory
  • Insert Messages: Insert chat messages into connected memory
  • Delete Messages: Delete chat messages from connected memory
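
These three modes correspond to the basic operations on a chat message history store. For orientation, here is a rough Python equivalent using LangChain's in-memory history class (an assumption for illustration; the n8n node itself is built on the LangChain JS library, and the connected memory may be Redis, Postgres, and so on):

```python
from langchain_core.chat_history import InMemoryChatMessageHistory

memory = InMemoryChatMessageHistory()

# Insert Messages: add chat messages to the connected memory
memory.add_user_message("Where is my order?")
memory.add_ai_message("It shipped yesterday and arrives tomorrow.")

# Get Many Messages: retrieve the stored chat messages
for msg in memory.messages:
    print(msg.type, msg.content)

# Delete Messages: remove chat messages from the connected memory
memory.clear()
```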

Similar integrations

  • Pinecone: Insert node
  • Anthropic Chat Model node
  • Wikipedia node
  • Google Gemini Chat Model node
  • Google Vertex Chat Model node
  • Postgres Chat Memory node

Over 3000 companies switch to n8n every single week

Connect Chat Memory Manager with your company’s tech stack and create automation workflows