Pinecone Vector Store node

Integrate the LangChain Pinecone Vector Store node into your LLM apps and connect it with 422+ apps and services

Use Pinecone Vector Store to easily build AI-powered applications with LangChain and integrate them with 422+ apps and services. n8n lets you seamlessly import data from files, websites, or databases into your LLM-powered application and create automated scenarios.

Popular ways to use the Pinecone Vector Store integration

HTTP Request node, Slack node, Webhook node, +17 more

Advanced AI Demo (Presented at AI Developers #14 meetup)

This workflow was presented at the AI Developers meetup in San Francisco on 24 July 2024. It demonstrates three AI workflows:

  • Categorize incoming Gmail emails and assign custom Gmail labels. This example uses the Text Classifier node, which simplifies the use case.
  • Ingest a PDF into a Pinecone vector store and chat with it (RAG example); a minimal code sketch of this ingestion step follows the template card.
  • An AI Agent example showcasing the HTTP Request tool: we teach the agent how to check availability on a Google Calendar and book an appointment.
Max Tkacz (max-n8n)
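
The PDF-to-Pinecone ingestion step from this demo can be approximated outside n8n in a few lines of Python with LangChain. This is a minimal sketch, not the workflow itself: it assumes the langchain-community, langchain-openai, langchain-pinecone, and pypdf packages, API keys in the environment, and an already-created Pinecone index; the index name and file path are illustrative.

```python
# Minimal sketch: ingest a PDF into a Pinecone index, then ask it a question.
# Assumes PINECONE_API_KEY and OPENAI_API_KEY are set in the environment and
# that a Pinecone index called "ai-demo" already exists (hypothetical name).
from langchain_community.document_loaders import PyPDFLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_openai import OpenAIEmbeddings
from langchain_pinecone import PineconeVectorStore

# Load and chunk the PDF (the file path is illustrative).
pages = PyPDFLoader("manual.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(pages)

# Embed the chunks and upsert them into the Pinecone index.
store = PineconeVectorStore.from_documents(
    chunks,
    embedding=OpenAIEmbeddings(),
    index_name="ai-demo",
)

# "Chat with it": fetch the most relevant chunks for a question.
for doc in store.similarity_search("What does the introduction cover?", k=3):
    print(doc.metadata.get("page"), doc.page_content[:120])
```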
Google Drive node, Code node, +8 more

Chat with PDF docs using AI (quoting sources)

This workflow allows you to ask questions about a PDF document. The answers are provided by an AI model of your choice, and each answer includes a citation pointing to the information it used. You can use n8n’s built-in chat interface to ask the questions, or you could customise this workflow to use another one (e.g. Slack, Teams, etc.).

Example: The workflow is set up with the Bitcoin whitepaper, so you could ask things like:
Question: “Which email provider does the creator of Bitcoin use?”
Answer: “GMX [Bitcoin whitepaper.pdf, lines 1-35]”

Requirements: A Pinecone account (at the time of writing, their free tier is easily enough for this workflow) and access to a large language model (e.g. an OpenAI account).

Customizing this workflow: The workflow only reads in one document, but you could customise it to read in all the documents in a folder (or more). It is set up to use GPT-3.5, but you could swap that out for any other model (including self-hosted ones). A minimal retrieval-with-citations sketch follows this template card.
David Roberts (davidn8n)
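
The “quoting sources” behaviour can also be sketched in plain LangChain Python. The snippet below is an illustration under assumptions (the index name, metadata keys, and prompt wording are made up), not a copy of the template: it retrieves the most relevant chunks from Pinecone, passes them to a chat model together with their source metadata, and asks the model to cite them.

```python
# Sketch: answer a question from a Pinecone index and cite the source documents.
# Assumes the index was populated with documents whose metadata includes
# "source" and "page" (as PyPDFLoader would set them); names are illustrative.
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_pinecone import PineconeVectorStore

store = PineconeVectorStore(index_name="bitcoin-whitepaper", embedding=OpenAIEmbeddings())
question = "Which email provider does the creator of Bitcoin use?"
docs = store.similarity_search(question, k=4)

# Build a prompt that carries the sources alongside the retrieved text.
context = "\n\n".join(
    f"[{d.metadata.get('source', 'unknown')}, page {d.metadata.get('page', '?')}]\n{d.page_content}"
    for d in docs
)
prompt = (
    "Answer the question using only the context below, and cite the source "
    "in square brackets after your answer.\n\n"
    f"Context:\n{context}\n\nQuestion: {question}"
)

answer = ChatOpenAI(model="gpt-3.5-turbo").invoke(prompt)
print(answer.content)
```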

Supported modes

  • Get Many: get many ranked documents from the vector store for a query
  • Insert Documents: insert documents into the vector store
  • Retrieve Documents (For Agent/Chain): retrieve documents from the vector store to be used with AI nodes
  • Update Documents: update documents in the vector store by ID
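
For readers who prefer code to node configuration, the four modes above map roughly onto the following calls in LangChain’s Python Pinecone integration. This is a sketch under assumptions (the index name, document contents, and IDs are illustrative); the n8n node wraps the equivalent operations for you.

```python
# Rough Python analogue of the node's four modes (names and IDs are illustrative).
from langchain_core.documents import Document
from langchain_openai import OpenAIEmbeddings
from langchain_pinecone import PineconeVectorStore

store = PineconeVectorStore(index_name="demo-index", embedding=OpenAIEmbeddings())

# Insert Documents: upsert documents with explicit IDs.
docs = [Document(page_content="n8n is a workflow automation tool.", metadata={"source": "notes.txt"})]
store.add_documents(docs, ids=["doc-1"])

# Get Many: return ranked documents (with scores) for a query.
for doc, score in store.similarity_search_with_score("workflow automation", k=4):
    print(round(score, 3), doc.page_content)

# Retrieve Documents (For Agent/Chain): expose the store as a retriever for chains/agents.
retriever = store.as_retriever(search_kwargs={"k": 4})
relevant = retriever.invoke("workflow automation")

# Update Documents: upserting with an existing ID overwrites that document.
store.add_documents(
    [Document(page_content="n8n is a fair-code workflow automation tool.", metadata={"source": "notes.txt"})],
    ids=["doc-1"],
)
```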

About Pinecone Vector Store

Pinecone is a managed vector database for storing and querying embeddings, commonly used for semantic search and retrieval-augmented generation (RAG) applications.


Similar integrations

  • JSON Input Loader node
  • Embeddings Google PaLM node
  • Embeddings Hugging Face Inference node
  • Embeddings Mistral Cloud node
  • Google PaLM Chat Model node
  • Google PaLM Language Model node
  • Groq Chat Model node
  • Cohere Model node

Over 3000 companies switch to n8n every single week

Connect Pinecone Vector Store with your company’s tech stack and create automation workflows