
Integrate LangChain AI Agent in your LLM apps and 422+ apps and services

Use AI Agent to easily build AI-powered applications with LangChain and integrate them with 422+ apps and services. n8n lets you seamlessly import data from files, websites, or databases into your LLM-powered application and create automated scenarios.

Popular ways to use AI Agent integration

HTTP Request node
Notion node
+8

Automate Competitor Research with Exa.ai, Notion and AI Agents

This n8n workflow demonstrates a simple multi-agent setup for competitor research. It showcases how using the HTTP Request tool can reduce the number of nodes needed to build a workflow like this.

How it works
  • The user defines a source company, which is sent to Exa.ai to find competitors.
  • Each competitor is then funnelled through 3 AI agents that go out onto the internet and retrieve specific datapoints about the competitor: company overview, product offering and customer reviews.
  • Once the agents are finished, the results are compiled into a report which is inserted into a Notion database.

Check out an example output here: https://jimleuk.notion.site/2d1c3c726e8e42f3aecec6338fd24333?v=de020fa196f34cdeb676daaeae44e110&pvs=4

Requirements
  • An OpenAI account for the LLM.
  • An Exa.ai account for access to their AI search engine.
  • A SerpAPI account for Google search.
  • A Firecrawl.dev account for web scraping.
  • A Notion.com account for the database that stores the final reports.

Customising the workflow
  • Add additional agents to gather more datapoints such as SEO keywords and metrics.
  • Not using Notion? Feel free to swap this out for your own database.
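The research step hinges on a single search call to Exa.ai. Below is a minimal sketch of that call, assuming Exa's public https://api.exa.ai/search endpoint with x-api-key authentication; the query string, result count and result handling are illustrative, not the template's exact expressions.

```python
import os
import requests

# Minimal sketch: ask Exa.ai for competitors of a source company.
# Assumes the public /search endpoint and x-api-key auth; adjust to your account.
EXA_API_KEY = os.environ["EXA_API_KEY"]
source_company = "n8n.io"  # hypothetical example input

response = requests.post(
    "https://api.exa.ai/search",
    headers={"x-api-key": EXA_API_KEY, "Content-Type": "application/json"},
    json={
        "query": f"companies competing with {source_company}",
        "numResults": 5,
    },
    timeout=30,
)
response.raise_for_status()

# Each result would then be handed to the overview/product/review agents.
for result in response.json().get("results", []):
    print(result.get("title"), result.get("url"))
```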
jimleuk
Jimleuk
Slack node
Google Calendar node
+9

Organise an Event using Slack, Google Calendar and AI

This n8n workflow takes Slack conversations and turns them into calendar events, complete with accurate dates, times and location information. Adding and removing attendees is also managed automatically.

How it works
  • The workflow monitors a Slack channel for invite messages with a "📅" reaction and sends them to the AI agent.
  • The AI agent parses the message, determining the time, date and location.
  • Using its Location tool, the AI agent searches Google Maps for the precise location address.
  • Using its Calendar tool, the AI agent creates a Google Calendar invite with the title, description and location address for the user.
  • Back in the Slack channel, others can RSVP to the invite by reacting with the "✅" emoji. The workflow polls the message after a while and adds the users who have reacted as attendees of the calendar invite, and conversely removes any attendees who have since removed their reaction.

Examples
Jill: "Hey team, I'm organising a round of Laser Tag (Bunker 51) next Thursday around 6pm. Please RSVP with a ✅"
AI: "I've helped you create an event in your calendar https://cal.google.com/..."
Jack: "✅"
AI: "I've added Jack to the event as an attendee."

Requirements
  • A Slack channel to attach the workflow to
  • An OpenAI account to use a GPT model
  • Google Calendar to create and update events

Customising the workflow
  • This workflow can work with other messaging platforms that support reactions or tagging-like features, such as Discord.
  • Don't use Google Calendar? Swap it out for Outlook or your own.
  • Use any combination of emoji reactions and add new rules like "RSVP maybe" which could send reminder updates nearer the event date.
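At its core, the RSVP half of the workflow is a diff between who has reacted with "✅" and who is already on the invite. Here is a rough, dependency-free sketch of that sync logic; the Slack reactions call and the Google Calendar update are stubbed out, and the email addresses are hypothetical.

```python
# Sketch of the reaction-to-attendee sync described above.
# `current_reactors` would come from Slack's reactions API and
# `current_attendees` from the Google Calendar event; both are stubbed here.

def sync_attendees(current_reactors: set[str], current_attendees: set[str]) -> tuple[set[str], set[str]]:
    """Return (emails to add, emails to remove) for the calendar invite."""
    to_add = current_reactors - current_attendees
    to_remove = current_attendees - current_reactors
    return to_add, to_remove


reactors = {"jack@example.com", "jill@example.com"}   # reacted with ✅
attendees = {"jill@example.com", "joe@example.com"}   # already on the invite

add, remove = sync_attendees(reactors, attendees)
print("add:", add)        # Jack reacted, so he is added
print("remove:", remove)  # Joe removed his reaction, so he is dropped
```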
jimleuk
Jimleuk
Airtable node
HTTP Request node
Gmail node
+9

Turn Emails into AI-Enhanced Tasks in Notion (Multi-User Support) with Gmail, Airtable and Softr

Purpose
This workflow automatically creates tasks from forwarded emails, similar to Asana, but better. Emails are processed by AI and converted into actionable tasks. In addition, this workflow is built so that multiple users can share this single process by setting up their individual configuration through a user-friendly portal (internal tool), instead of having to manage their own workflows.

How it works
  • One Gmail account is used to process inbound mails from different users.
  • A custom web portal enables users to define "routes". That's where the mapping between an automatically generated Gmail alias and a Notion database URL, including the personal API token, happens.
  • Using a Gmail Trigger, new entries are split by the email alias, so the corresponding route can be retrieved from the database connected to the portal.
  • Every email is then processed by AI to generate an actionable task, a short summary of the original email and some metadata.
  • Based on a predefined structure, a new page is created in the corresponding Notion database.
  • Finally, the email is marked as "processed" in Gmail.
  • If an error happens, the route gets paused to prevent a possible overflow and the user is notified by email.

Setup
  • Create a new Google account (alternatively you can use an existing one and set up rules to keep your inbox organized)
  • Create two labels in Gmail: "Processed" and "Error"
  • Clone this Softr template including the Airtable dataset and publish the application
  • Clone this workflow and choose credentials (Gmail, Airtable)
  • Follow the additional instructions provided within the workflow notes
  • Enable the workflow, so it runs automatically in the background

How to use
  • Open the published Softr application
  • Register as a new user
  • Create a new route containing the Notion API key and the Notion database URL
  • Expand the new entry to copy the email address
  • Save the address as a new contact in your email provider of choice
  • Forward an email to it and watch how it gets converted into an actionable task

Disclaimer
Airtable was chosen so you can set up this template fairly quickly. It is advised to replace the persistence layer with something you own, like a self-hosted SQL server, since we are dealing with sensitive information of multiple users. This solution is only meant for building internal tools, unless you own an embed license for n8n.
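The multi-user routing boils down to reading the Gmail alias out of the "To" address and looking up the matching route (Notion database URL and API token). A hedged sketch of that lookup follows; a plain dict stands in for the Airtable-backed routes table, and the alias and route values are invented for illustration.

```python
# Sketch of the alias-based routing step. In the real workflow the routes
# live in Airtable; a plain dict stands in for them here.

ROUTES = {
    "tasks+a1b2c3": {"notion_db_url": "https://notion.so/...", "notion_token": "secret_..."},
}

def route_for(to_address: str) -> dict | None:
    """Extract the Gmail alias (local part with '+') and return its route, if any."""
    local_part = to_address.split("@", 1)[0]          # e.g. "tasks+a1b2c3"
    if "+" not in local_part:
        return None
    return ROUTES.get(local_part)

print(route_for("tasks+a1b2c3@gmail.com"))   # matched route -> create the Notion page
print(route_for("tasks+unknown@gmail.com"))  # None -> pause the route and notify the user
```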
octionic
Mario
HTTP Request node
+16

Build a Tax Code Assistant with Qdrant, Mistral.ai and OpenAI

This n8n workflow builds another example of a knowledge-base assistant, but demonstrates how a more deliberate and targeted approach to ingesting the data can produce much better results for your chatbot. In this example, a government tax code policy document is used. Whilst we could split the document into chunks by content length, we would often lose the context of chapters and sections which may be required by the user. Our approach is therefore to first split the document into chapters and sections before importing it into our vector store. Additionally, using metadata correctly is key to allowing filtering and scoped queries.

Example
Human: "Tell me about what the tax code says about cargo for intentional commerce?"
AI: "Section 11.25 of the Texas Property Tax Code pertains to "MARINE CARGO CONTAINERS USED EXCLUSIVELY IN INTERNATIONAL COMMERCE." In this section, a person who is a citizen of a foreign country or an en..."

How it works
  • The tax code policy document is downloaded as a zip file from the government website and its pages are extracted as separate chapters.
  • Each chapter is then parsed and split into its sections using data manipulation expressions.
  • Each section is then inserted into our Qdrant vector store, tagged with its source, chapter and section numbers as metadata.
  • When our AI agent needs to retrieve data from the vector store, we use a custom workflow tool to perform the query to Qdrant. Because we're relying on Qdrant's advanced filtering capabilities, we perform the search using the Qdrant API rather than the Qdrant node.
  • When the AI agent needs to pull full wording or extracts, we can use Qdrant's scroll API and metadata filtering to do so. This makes Qdrant behave like a key-value store for our document.

Requirements
  • A Qdrant instance is required for the vector store, specifically for its filtering functionality.
  • A Mistral.ai account for embeddings and AI models.

Customising this workflow
  • Depending on your use case, consider returning the actual PDF pages (or links) to the user for extra confirmation and to build trust.
  • Not using Mistral? You can swap it out, but make sure the distance metric and dimension size of the Qdrant collection match your chosen embedding model.
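The key-value style lookup described above maps naturally onto qdrant-client's scroll call with a payload filter. A minimal sketch, assuming a local Qdrant instance and a collection named tax_codes with chapter/section payload fields; the collection and field names are illustrative, not the template's exact schema.

```python
from qdrant_client import QdrantClient
from qdrant_client.http import models

# Sketch: pull every point for one chapter/section via the scroll API,
# using payload filtering instead of a vector similarity search.
client = QdrantClient(url="http://localhost:6333")

points, _next_page = client.scroll(
    collection_name="tax_codes",  # illustrative collection name
    scroll_filter=models.Filter(
        must=[
            models.FieldCondition(key="chapter", match=models.MatchValue(value=11)),
            models.FieldCondition(key="section", match=models.MatchValue(value="11.25")),
        ]
    ),
    with_payload=True,
    limit=50,
)

for point in points:
    print(point.payload.get("text", "")[:200])
```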
jimleuk
Jimleuk
Odoo node
+10

ERP AI chatbot for Odoo sales module with OpenAI

Who is this for?
This workflow is for everyone who wants easier access to their Odoo sales data without writing complex queries.

Use case
To get a clear overview of your sales data in Odoo, you typically need to extract it manually in order to analyse it. This workflow uses OpenAI's language models to create an intelligent chatbot that provides conversational access to your Odoo sales opportunity data.

How it works
  • Creates a summary of all Odoo sales opportunities using OpenAI
  • Uses that summary as context for the OpenAI chat model
  • Keeps the summary up to date using a schedule trigger

Set up steps
  • Configure the Odoo credentials
  • Configure the OpenAI credentials
  • Toggle "Make Chat Publicly Available" in the Chat Trigger node
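The "summary as context" pattern can be approximated outside n8n with Odoo's XML-RPC external API and two chat completions. A rough sketch under these assumptions: sales opportunities live in the crm.lead model, the listed field names exist in your instance, and the model name passed to OpenAI is an illustrative choice, not the template's.

```python
import xmlrpc.client
from openai import OpenAI

# --- Pull sales opportunities from Odoo via its XML-RPC external API ---
URL, DB, USER, PASSWORD = "https://example.odoo.com", "mydb", "user@example.com", "api-key"

common = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/common")
uid = common.authenticate(DB, USER, PASSWORD, {})
models = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/object")

opportunities = models.execute_kw(
    DB, uid, PASSWORD,
    "crm.lead", "search_read",
    [[["type", "=", "opportunity"]]],
    {"fields": ["name", "expected_revenue", "stage_id", "probability"], "limit": 200},
)

# --- Summarise once, then reuse the summary as chat context ---
client = OpenAI()  # reads OPENAI_API_KEY from the environment
summary = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": f"Summarise these sales opportunities:\n{opportunities}"}],
).choices[0].message.content

answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": f"You answer questions about Odoo sales data.\n{summary}"},
        {"role": "user", "content": "Which opportunities look most likely to close this month?"},
    ],
).choices[0].message.content
print(answer)
```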
mihailtd
Mihai Farcas
Telegram node
Telegram Trigger node
OpenAI Chat Model node

Chat with OpenAI's GPT via a simple Telegram Bot

Use case
LLMs have provided a lot of value for several use cases, and some OpenAI models in particular are proving to be quite valuable. However, it's sometimes not very accessible to chat with these models. This workflow enables you to chat directly with OpenAI's GPT-3.5 via Telegram.

How it works
A simple Telegram bot that connects to your BotFather bot to give AI responses, using OpenAI's GPT-3.5 model, to a user's messages with emojis.

What to do
Add your Telegram API key and your OpenAI API key and have fun!
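Outside of n8n, the same loop is a few calls to the Telegram Bot API plus one chat completion per incoming message. A minimal long-polling sketch, assuming a bot token from BotFather and the OpenAI Python SDK; error handling and the emoji system prompt are illustrative.

```python
import os
import requests
from openai import OpenAI

# Minimal long-polling Telegram bot that replies with GPT-3.5 answers.
TG_TOKEN = os.environ["TELEGRAM_BOT_TOKEN"]   # from BotFather
TG_API = f"https://api.telegram.org/bot{TG_TOKEN}"
client = OpenAI()                             # reads OPENAI_API_KEY

offset = None
while True:
    updates = requests.get(
        f"{TG_API}/getUpdates", params={"timeout": 30, "offset": offset}, timeout=40
    ).json()["result"]

    for update in updates:
        offset = update["update_id"] + 1
        message = update.get("message", {})
        text, chat_id = message.get("text"), message.get("chat", {}).get("id")
        if not text or chat_id is None:
            continue

        reply = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": "Answer helpfully and sprinkle in fitting emojis."},
                {"role": "user", "content": text},
            ],
        ).choices[0].message.content

        requests.post(f"{TG_API}/sendMessage", json={"chat_id": chat_id, "text": reply}, timeout=30)
```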
mikepowers
Mike

About AI Agent

Related categories

Similar integrations

  • JSON Input Loader node
  • Embeddings Google PaLM node
  • Embeddings Hugging Face Inference node
  • Embeddings Mistral Cloud node
  • Google PaLM Chat Model node
  • Google PaLM Language Model node
  • Groq Chat Model node
  • Cohere Model node

Over 3000 companies switch to n8n every single week

Connect AI Agent with your company’s tech stack and create automation workflows