UptimeRobot node

Create, update, and get a monitor using UptimeRobot

Published 3 years ago

Created by

harshil1712
ghagrawal17

Template description

This workflow allows you to create, update, and get a monitor using the UptimeRobot node.

UptimeRobot node: This node creates a new monitor of the type HTTP(S).

UptimeRobot1 node: This node updates the monitor created in the previous node.

UptimeRobot2 node: This node retrieves the information for the monitor created in the previous node.
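Under the hood, the UptimeRobot node talks to the UptimeRobot v2 REST API. Below is a minimal sketch of the same three operations as direct API calls; the parameter names follow the v2 API, while the API key and URLs are placeholders, so treat this as an illustration rather than a drop-in replacement for the node.

```javascript
// Sketch of the three operations the workflow performs, as direct UptimeRobot v2 API calls.
// Node.js 18+ (built-in fetch). YOUR_API_KEY and the example URL are placeholders.
const API = 'https://api.uptimerobot.com/v2';
const apiKey = 'YOUR_API_KEY';

async function call(endpoint, params) {
  const res = await fetch(`${API}/${endpoint}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({ api_key: apiKey, format: 'json', ...params }),
  });
  return res.json();
}

(async () => {
  // 1. Create a new monitor of type HTTP(s) (type 1).
  const created = await call('newMonitor', {
    friendly_name: 'My site',
    url: 'https://example.com',
    type: '1',
  });

  // 2. Update the monitor we just created.
  await call('editMonitor', { id: created.monitor.id, friendly_name: 'My site (renamed)' });

  // 3. Get the monitor's information back.
  const info = await call('getMonitors', { monitors: String(created.monitor.id) });
  console.log(info.monitors[0]);
})();
```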

More DevOps workflow templates

Git backup of workflows and credentials

This creates a git backup of the workflows and credentials. It uses the n8n export command together with git diff, so you can run it as often as you want, but a commit is only created when there are changes (see the sketch below).

Setup

You need shell access to the server. Create a repository somewhere remote to host your project, such as GitHub, GitLab, or your favorite private repo. Clone the repository on the server in a place that n8n can access. In the example, the base path is . and the repository name is repo; change these in the commands below and in the workflow commands (you can set the path as a variable in the workflow). Check out another branch if you won't be using master.

cd .
git clone repository

Or you could git init and then add the remote (git remote add origin YOUR_REPO_URL), whichever you prefer.

On the server, check that everything is in place for committing. Very likely you'll need to set up the git user email and name. Try to create a commit and push it upstream; anything you still need (like configuring a user to commit) will show up along the way. I strongly suggest testing the export commands as well, to guarantee they work:

cd ./repo
git commit -m "Initial commit" --allow-empty
git push -u origin master
(-u is the same as --set-upstream)

Test pushing upstream with the first exported data:

npx n8n export:workflow --backup --output ./repo/workflows/
npx n8n export:credentials --backup --output ./repo/credentials/
cd ./repo
git add .
git commit -m "manual backup: first export"
git push

After that, if everything is OK, the workflow should work just fine.

Adjustments

Adjust the path used in the workflow; note that git -C PATH is the same as cd PATH; git .... Also adjust the cron schedule to your needs. As mentioned at the beginning, you can run it even every minute, but commits are only created when there are changes.

Credentials encryption

By default, credentials are exported encrypted. You can add the --decrypted flag to the n8n export:credentials command if you need to save them in plain text. As a general rule, though, it's better to back up the encryption key (you only need to do that once) and keep the exports safely encrypted.
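For reference, here is a minimal Node.js sketch of the logic behind the scheduled backup step: export, then commit only when something changed. The template checks for changes with git diff; this sketch uses git status --porcelain for the same purpose, and ./repo is the example path from above.

```javascript
// Minimal sketch of the backup step: export, then commit only if something changed.
// Assumes the n8n CLI is available via npx and the repo was cloned to ./repo.
const { execSync } = require('child_process');

const repo = './repo';

// Export workflows and credentials into the repository working tree.
execSync(`npx n8n export:workflow --backup --output ${repo}/workflows/`, { stdio: 'inherit' });
execSync(`npx n8n export:credentials --backup --output ${repo}/credentials/`, { stdio: 'inherit' });

// git status --porcelain prints nothing when the working tree is clean.
const changes = execSync(`git -C ${repo} status --porcelain`).toString().trim();

if (changes) {
  execSync(`git -C ${repo} add .`, { stdio: 'inherit' });
  execSync(`git -C ${repo} commit -m "auto backup: ${new Date().toISOString()}"`, { stdio: 'inherit' });
  execSync(`git -C ${repo} push`, { stdio: 'inherit' });
} else {
  console.log('No changes to commit.');
}
```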
allandaemon
Allan Daemon
Google Sheets node
HTTP Request node
Slack node
+4

Host your own Uptime Monitoring with Scheduled Triggers

This n8n workflow demonstrates how to build a simple uptime monitoring service using scheduled triggers. Useful for webmasters with a handful of sites who want a cost-effective solution without the need for all the bells and whistles.

How it works
- A scheduled trigger reads a list of website URLs from a Google Sheet every 5 minutes.
- Each website URL is checked using the HTTP node, which determines whether the website is in the UP or DOWN state (see the sketch below).
- An email and a Slack message are sent for websites in the DOWN state.
- The Google Sheet is updated with the website's state and a log entry is created. Logs can be used to determine the total % of UP and DOWN time over a period.

Requirements
- Google Sheet for storing the websites to monitor and their states
- Gmail for email alerts
- Slack for channel alerts

Customising the workflow
Don't use Google Sheets? It can easily be exchanged for Excel or Airtable.
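A minimal sketch of the UP/DOWN decision made for each URL, as standalone Node.js 18+ using the built-in fetch. The URL list and the "any 2xx response counts as UP" rule are illustrative assumptions; the template's HTTP node may be configured differently.

```javascript
// Minimal sketch: classify each site as UP or DOWN based on the HTTP response.
// Node.js 18+ (built-in fetch). The URL list and field names are illustrative.
const websites = ['https://example.com', 'https://example.org'];

async function checkSite(url) {
  try {
    const res = await fetch(url, { method: 'GET', redirect: 'follow' });
    // Treat any 2xx response (after following redirects) as UP; 4xx/5xx as DOWN.
    return { url, status: res.status, state: res.ok ? 'UP' : 'DOWN' };
  } catch (err) {
    // Network errors, DNS failures, timeouts etc. also count as DOWN.
    return { url, status: null, state: 'DOWN', error: err.message };
  }
}

(async () => {
  const results = await Promise.all(websites.map(checkSite));
  for (const r of results) {
    console.log(`${r.url} -> ${r.state}${r.status ? ` (HTTP ${r.status})` : ''}`);
  }
  // In the workflow, DOWN results would trigger the Gmail and Slack alert branches.
})();
```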
jimleuk
Jimleuk
Merge node
Webhook node
+10

🦅 Get a bird's-eye view of your n8n instance with the Workflow Dashboard!

Using n8n a lot? Soar above the limitations of the default n8n dashboard! This template gives you an overview of your workflows, nodes, and tags – all in one place. 💪

Built using XML stylesheets and the Bootstrap 5 library, this workflow is self-contained and does not depend on any third-party software. 🙌 It generates a comprehensive overview JSON that can be easily integrated with other BI tools for further analysis and visualization. 📊 Reach out to Eduard if you need help adapting this workflow to your specific use case! 🚀

Benefits:
- Workflow Summary 📈: instant overview of your workflows, active counts, and triggers.
- Left-Side Panel 📋: quick access to all your workflows, nodes, and tags for seamless navigation.
- Workflow Details 🔬: deep dive into each workflow's nodes, timestamps, and tags.
- Node Analysis 🧩: identify the most frequently used nodes across your workflows.
- Tag Organization 🗂️: workflows are grouped according to their tags.
- Visually Stunning 🎨: clean, intuitive, and easy-to-navigate dashboard design.
- XML & Bootstrap 5 🛠️: built using XML stylesheets and Bootstrap 5, ensuring a self-contained and responsive dashboard.
- No Dependencies 🔒: the workflow does not rely on any third-party software. Bootstrap 5 files are loaded via CDN but can be delivered directly from your server.

⚠️ Important note for cloud users
Since the cloud version doesn't support environment variables, please make the following changes (see the sketch below):
- get-nodes-via-jmespath node: update the instance_url variable and enter your n8n URL instead of {{$env["N8N_PROTOCOL"]}}://{{$env["N8N_HOST"]}}
- Create HTML node: provide the n8n instance URL instead of {{ $env.WEBHOOK_URL }}

🌟 Check out our other workflows: n8n.io/creators/eduard and n8n.io/creators/yulia
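A small sketch of what that substitution amounts to: resolve the instance URL from environment variables when they exist, otherwise hard-code it. The fallback URL below is a placeholder, not a value from the template.

```javascript
// Sketch of the cloud-user change: use the environment variables on self-hosted
// instances, otherwise fall back to a hard-coded n8n URL.
// 'https://your-instance.app.n8n.cloud' is a placeholder, not a real value.
const instanceUrl =
  process.env.N8N_PROTOCOL && process.env.N8N_HOST
    ? `${process.env.N8N_PROTOCOL}://${process.env.N8N_HOST}`
    : 'https://your-instance.app.n8n.cloud';

console.log(`Dashboard links will point at: ${instanceUrl}`);
```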
eduard
Eduard
HTTP Request node
Merge node
+13

AI Agent To Chat With Files In Supabase Storage

Video Guide
I prepared a detailed guide explaining how to set up and implement this scenario, enabling you to chat with your documents stored in Supabase using n8n. Youtube Link

Who is this for?
This workflow is ideal for researchers, analysts, business owners, or anyone managing a large collection of documents. It's particularly beneficial for those who need quick contextual information retrieval from text-heavy files stored in Supabase, without needing additional services like Google Drive.

What problem does this workflow solve?
Manually retrieving and analyzing specific information from large document repositories is time-consuming and inefficient. This workflow automates the process by vectorizing documents and enabling AI-powered interactions, making it easy to query and retrieve context-based information from uploaded files.

What this workflow does
The workflow integrates Supabase with an AI-powered chatbot to process, store, and query text and PDF files. The steps include:
- Fetching and comparing files to avoid duplicate processing.
- Handling file downloads and extracting content based on the file type.
- Converting documents into vectorized data for contextual information retrieval.
- Storing and querying vectorized data from a Supabase vector store.

- File Extraction and Processing: automates the handling of multiple file formats (e.g., PDFs, text files) and extracts document content.
- Vectorized Embeddings Creation: generates embeddings for processed data to enable AI-driven interactions.
- Dynamic Data Querying: allows users to query their document repository conversationally using a chatbot.

Setup: N8N Workflow
- Fetch File List from Supabase: use Supabase to retrieve the stored file list from a specified bucket. Add logic to manage the empty folder placeholders returned by Supabase, avoiding incorrect processing.
- Compare and Filter Files: aggregate the files retrieved from storage and compare them to the existing list in the Supabase files table. Exclude duplicates and skip placeholder files to ensure only unprocessed files are handled.
- Handle File Downloads: download new files using detailed storage configurations for public/private access. Adjust the storage settings and GET requests to match your Supabase setup.
- File Type Processing: use a Switch node to target specific file types (e.g., PDFs or text files) and employ the relevant tools to process the content: for PDFs, extract the embedded content; for text files, process the text data directly.
- Content Chunking: break large text data into smaller chunks using the Text Splitter node. Define the chunk size (default: 500 tokens) and overlap to retain the necessary context across chunks (see the sketch below).
- Vector Embedding Creation: generate vectorized embeddings for the processed content using OpenAI's embedding tools. Ensure metadata, such as the file ID, is included for easy data retrieval.
- Store Vectorized Data: save the vectorized information into a dedicated Supabase vector store. Use the default schema and table provided by Supabase for a seamless setup.
- AI Chatbot Integration: add a chatbot node to handle user input and retrieve relevant document chunks. Use metadata like the file ID for targeted queries, especially when multiple documents are involved.

Testing
- Upload sample files to your Supabase bucket.
- Verify that files are processed and stored successfully in the vector store.
- Ask simple conversational questions about your documents using the chatbot (e.g., "What does Chapter 1 say about the Roman Empire?").
- Test for accuracy and contextual relevance of the retrieved results.
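As a rough illustration of the Content Chunking step, here is a character-based splitter with a configurable chunk size and overlap. The actual Text Splitter node works on tokens (default chunk size 500), so treat this only as a sketch of the idea.

```javascript
// Minimal sketch of the chunking step: split text into overlapping chunks.
// Chunk size and overlap are measured in characters here for simplicity;
// the template's Text Splitter node works with tokens.
function splitText(text, chunkSize = 500, overlap = 50) {
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    const end = Math.min(start + chunkSize, text.length);
    chunks.push(text.slice(start, end));
    if (end === text.length) break;
    // Step forward by chunkSize minus overlap so consecutive chunks share context.
    start += chunkSize - overlap;
  }
  return chunks;
}

// Example usage: each chunk would then be embedded and stored with its file ID.
const chunks = splitText('your document text goes here... '.repeat(40));
console.log(`Produced ${chunks.length} chunks`);
```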
lowcodingdev
Mark Shcherbakov
Merge node
MySQL node
+9

Generate SQL queries from schema only - AI-powered

This workflow is a modification of the previous template on how to create an SQL agent with LangChain and SQLite. The key difference – the agent has access only to the database schema, not to the actual data. To achieve this, SQL queries are executed outside the AI Agent node, and the results are never passed back to the agent.

This approach allows the agent to generate SQL queries based on the structure of tables and their relationships, without having to access the actual data. This makes the process more secure and efficient, especially in cases where data confidentiality is crucial.

🚀 Setup
To get started with this workflow, you'll need to set up a free MySQL server and import your database (check Steps 1 and 2 in this tutorial). Of course, you can switch MySQL for another SQL database such as PostgreSQL; the principle remains the same. The key is to download the schema once and save it locally to avoid repeated remote connections.

Run the top part of the workflow once to download and store the MySQL chinook database schema file on the server. With this approach, we avoid the need to repeatedly connect to the remote db4free database and fetch the schema every time; as a result, we reach greater processing speed and efficiency.

🗣️ Chat with your data
- Start a chat: send a message in the chat window.
- The workflow loads the locally saved MySQL database schema, without having the ability to touch the actual data. The file contains the full structure of your MySQL database for analysis.
- The LangChain AI Agent receives the schema and your input and begins to work.
- The AI Agent generates SQL queries and brief comments based solely on the schema and the user's message.
- An IF node checks whether the AI Agent has generated a query (see the sketch below). If yes, the AI Agent passes the SQL query to the next MySQL node for execution; if no, you get a direct answer from the Agent without further action.
- The workflow formats the results of the SQL query, ensuring they are convenient to read and easy to understand.
- Once formatted, you get both the Agent's answer and the query result in the chat window.

🌟 Example queries
Try these sample queries to see the schema-driven AI Agent in action:
- Would you please list me all customers from Germany?
- What are the music genres in the database?
- What tables are available in the database?
- Please describe the relationships between tables. (In this example, the AI Agent does not need to create an SQL query.)

And if you prefer to keep the data private, you can manually execute the generated SQL query in your own environment using any database client or tool you trust. 🗄️

💭 The AI Agent memory node does not store the actual data, as we run SQL queries outside the agent. It contains the database schema, user questions, and the initial Agent reply. Actual SQL query results are passed to the chat window, but the values are not stored in the Agent memory.
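A minimal sketch of the IF-node decision (did the agent produce a SQL query or a plain answer?). The detection rule used here, a line starting with a SQL keyword, is an assumption for illustration; the template may flag queries differently.

```javascript
// Sketch of the IF-node logic: decide whether the agent's reply contains a SQL query.
// The detection rule (a line starting with a SQL keyword) is an illustrative assumption.
function extractSqlQuery(agentReply) {
  const lines = agentReply.split('\n').map((l) => l.trim());
  const sqlStart = lines.findIndex((l) => /^(SELECT|INSERT|UPDATE|DELETE|WITH)\b/i.test(l));
  if (sqlStart === -1) return null;
  // Take everything from the first SQL keyword to the end of the reply.
  return lines.slice(sqlStart).join('\n').trim();
}

const reply = "Here is the query you asked for:\nSELECT * FROM Customer WHERE Country = 'Germany';";
const query = extractSqlQuery(reply);

if (query) {
  // "Yes" branch: pass the query on to the MySQL node for execution.
  console.log('Execute:', query);
} else {
  // "No" branch: return the agent's answer directly.
  console.log('Direct answer:', reply);
}
```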
yulia
Yulia
OpenAI Chat Model node

AI Agent to chat with Supabase/PostgreSQL DB

Video Guide
I prepared a detailed guide showing the whole process of building a resume analyzer.

Who is this for?
This workflow is ideal for developers, data analysts, and business owners who want to enable conversational interactions with their database. It's particularly useful for cases where users need to extract, analyze, or aggregate data without writing SQL queries manually.

What problem does this workflow solve?
Accessing and analyzing database data often requires SQL expertise or dedicated reports, which can be time-consuming. This workflow empowers users to interact with a database conversationally through an AI-powered agent. It dynamically generates SQL queries based on user requests, streamlining data retrieval and analysis.

What this workflow does
This workflow integrates OpenAI with a Supabase database, enabling users to interact with their data via an AI agent. The agent can:
- Retrieve records from the database.
- Extract and analyze JSON data stored in tables.
- Provide summaries, aggregations, or specific data points based on user queries.

- Dynamic SQL Querying: the agent uses user prompts to create and execute SQL queries on the database.
- Understand JSON Structure: the workflow identifies the JSON schema from sample records, enabling the agent to parse and analyze JSON fields effectively.
- Database Schema Exploration: it provides the agent with tools to retrieve table structures, column details, and relationships for precise query generation.

Setup
Preparation
- Create accounts: N8N (for workflow automation), Supabase (for database hosting and management), and OpenAI (for building the conversational AI agent).
- Configure the database connection: set up a PostgreSQL database in Supabase and use the appropriate credentials (username, password, host, and database name) in your workflow.

N8N Workflow
AI agent with tools (see the sketch below):
- Code Tool: execute SQL queries based on user input.
- Database Schema Tool: retrieve a list of all tables in the database. Use a predefined SQL query to fetch table definitions, including column names, types, and references.
- Table Definition: retrieve a list of columns with types for one table.
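A rough sketch of what the Database Schema and Table Definition tools boil down to: querying PostgreSQL's information_schema. The pg client is used here for illustration; the connection string is a placeholder and the template's exact query may differ.

```javascript
// Sketch of the "Database Schema Tool": fetch table and column definitions from
// PostgreSQL's information_schema. The connection string is a placeholder and the
// query is an illustrative version of the predefined schema query.
const { Client } = require('pg');

async function fetchSchema() {
  const client = new Client({
    connectionString: 'postgresql://user:password@db.your-project.supabase.co:5432/postgres',
  });
  await client.connect();

  const { rows } = await client.query(`
    SELECT table_name, column_name, data_type, is_nullable
    FROM information_schema.columns
    WHERE table_schema = 'public'
    ORDER BY table_name, ordinal_position;
  `);

  await client.end();
  return rows; // One row per column: table, column, type, nullability.
}

fetchSchema().then((schema) => console.log(schema)).catch(console.error);
```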
lowcodingdev
Mark Shcherbakov

Implement complex processes faster with n8n
