Perform comprehensive research on a user's query by dynamically generating search terms, querying the web with Google Search (via Gemini), reflecting on the results to identify knowledge gaps, and iteratively refining the search until it can provide a well-supported answer with citations (similar to Perplexity).
This workflow is a reproduction of gemini-fullstack-langgraph-quickstart in n8n.
The gemini-fullstack-langgraph-quickstart is a demo by the Google Gemini team that showcases how to build a powerful full-stack AI agent using Gemini and LangGraph.
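The research loop described above can be sketched roughly as follows. This is a minimal illustration of the control flow only; `generateQueries`, `webSearch`, `reflect`, and `writeAnswer` are hypothetical placeholders for the Gemini and search nodes in the workflow, not real APIs.

```typescript
// Placeholders for the LLM-backed steps; in the workflow these are Gemini/search nodes.
declare function generateQueries(question: string): Promise<string[]>;
declare function webSearch(query: string): Promise<string>;
declare function reflect(
  question: string,
  results: string[]
): Promise<{ isSufficient: boolean; followUpQueries: string[] }>;
declare function writeAnswer(question: string, results: string[]): Promise<string>;

interface ResearchState {
  queries: string[];
  results: string[];
  loopCount: number;
}

async function research(question: string, maxLoops = 3): Promise<string> {
  const state: ResearchState = { queries: [], results: [], loopCount: 0 };

  // 1. Generate the initial search queries from the user's question.
  state.queries = await generateQueries(question);

  while (state.loopCount < maxLoops) {
    // 2. Query the web and collect grounded results.
    for (const q of state.queries) {
      state.results.push(await webSearch(q));
    }

    // 3. Reflect on what was found; stop if the evidence is sufficient,
    //    otherwise continue with follow-up queries that target the gaps.
    const reflection = await reflect(question, state.results);
    if (reflection.isSufficient) break;

    state.queries = reflection.followUpQueries;
    state.loopCount++;
  }

  // 4. Compose a cited answer from everything collected.
  return writeAnswer(question, state.results);
}
```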
Configure API Credentials: set up Google Gemini credentials for the Google Gemini Chat Model, GeminiSearch, and reflection nodes.
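If you want to sanity-check your API key before wiring it into the n8n credential, a minimal sketch using the official `@google/generative-ai` Node SDK looks like this. The model name is an example and not necessarily the one used by the workflow's nodes.

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

async function main() {
  // Quick check that the Gemini API key works before adding it as an n8n credential.
  const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? "");
  const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" }); // example model
  const result = await model.generateContent("Reply with OK if you can read this.");
  console.log(result.response.text());
}

main().catch(console.error);
```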
Configure Redis Source: set number_of_initial_queries and max_research_loops. Redis is used as external storage to maintain global variables (counter, search results, etc.).
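A hedged sketch of seeding those two Redis keys with the ioredis client. The key names number_of_initial_queries and max_research_loops come from the workflow; the connection details and values are placeholders to adjust for your setup.

```typescript
import Redis from "ioredis";

async function seedConfig() {
  // Adjust host/port to match the Redis credential configured in n8n.
  const redis = new Redis({ host: "127.0.0.1", port: 6379 });

  // Example values: tune them to control search breadth and loop depth.
  await redis.set("number_of_initial_queries", 3);
  await redis.set("max_research_loops", 2);

  console.log(await redis.mget("number_of_initial_queries", "max_research_loops"));
  await redis.quit();
}

seedConfig().catch(console.error);
```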
This workflow contains a loop process, which needs global variables (equivalent to State in LangGraph). Managing global variables across loop iterations is difficult in n8n without external storage, which is why Redis is used.
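To illustrate the pattern, here is a rough sketch of per-run loop state kept in Redis, again using ioredis. Apart from max_research_loops, the key names are illustrative and not the exact keys used inside the workflow.

```typescript
import Redis from "ioredis";

// Illustrative per-run state: a loop counter plus an accumulating result list,
// keyed by run ID so concurrent executions do not overwrite each other.
async function recordIteration(
  redis: Redis,
  runId: string,
  newResults: string[]
): Promise<boolean> {
  // Append this iteration's search results to the run's result list.
  if (newResults.length > 0) {
    await redis.rpush(`research:${runId}:results`, ...newResults);
  }

  // Bump the loop counter and compare it with the configured maximum.
  const loops = await redis.incr(`research:${runId}:loop_count`);
  const maxLoops = Number(await redis.get("max_research_loops")) || 2;

  // Return true if another research loop is allowed.
  return loops < maxLoops;
}
```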