This workflow automatically tracks changes on specific websites, typically e-commerce sites where you want to be notified about price changes.
Prerequisites
- Basic knowledge of HTML and JavaScript
Nodes
- Execute Command nodes create a file named `kopacky.json` in the `/data/` folder (make sure that n8n has permission to modify this folder in your setup) and clean up the data.
- Cron node triggers the workflow at regular intervals (the default is 15 minutes), depending on how often you want to crawl the URLs of your watchers.
- Function Item node (Change me) adds the URL watchers. You can add as many watchers (URLs) as you want by editing the JavaScript code in the node (see the sketch after the list of nodes). Each watcher has four properties:
| Property | Meaning |
| --- | --- |
| slug | Unique identifier for the watcher. |
| link | URL of the website where you want to track price changes. |
| selector | CSS selector of the HTML element that contains the price. You can use your browser's developer tools to find a specific selector. |
| currency | Currency code in which the price is listed. |
- Function Item node (Init item) saves all required data from each watcher to the `kopacky.json` file.
- HTTP Request node fetches data from the website.
- HTML Extract node extracts the required information from the webpage.
- Send Email nodes (NotifyBetterPrice) send you an email when a better price is available, and when there is an issue with getting the price (which can happen if the website is down, the product you are tracking is no longer available, or the website owner changed the selector or the HTML).
- IF nodes filter the incoming data and route the workflow (a sketch of this routing logic also follows the list of nodes).
- Move Binary Data nodes convert the JSON data to binary format so it can be written to a file.
- Write Binary File nodes write the product prices to the file.
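
For illustration, the watcher list in the Change me node could look roughly like this. It is a minimal sketch, not the code shipped with the workflow: the slugs, URLs and selectors are placeholders, and the return format assumed here is the generic Function/Code node style, which may differ from the actual node. Only the four property names come from the table above.

```javascript
// Sketch only: example watcher definitions (all values are placeholders).
const watchers = [
  {
    slug: 'red-sneakers',                               // unique identifier for this watcher
    link: 'https://example.com/products/red-sneakers',  // page whose price is tracked
    selector: '.product-detail .price',                 // CSS selector of the element containing the price
    currency: 'EUR'                                     // currency code the price is listed in
  },
  {
    slug: 'blue-backpack',
    link: 'https://example.com/products/blue-backpack',
    selector: '#price-box span.amount',
    currency: 'USD'
  }
];

// Each watcher becomes one item that flows through the rest of the workflow
// (the Init item node then records it in kopacky.json).
return watchers.map(watcher => ({ json: watcher }));
```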
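
The IF-node routing essentially compares the freshly extracted price with the price stored in `kopacky.json`. The helper below only sketches that decision, under the assumption that the HTML Extract node returns the price as raw text; the function name, the status values, and the simplified locale handling are illustrative and not part of the workflow.

```javascript
// Sketch only: parse the extracted price text and decide which branch to take.
function evaluatePrice(extractedText, previousPrice) {
  // e.g. "1 299,00 Kč" -> 1299 (locale handling kept deliberately simple)
  const currentPrice = parseFloat(
    String(extractedText).replace(/[^\d,.-]/g, '').replace(',', '.')
  );

  if (Number.isNaN(currentPrice)) {
    // Page down, product gone, or selector changed -> "problem" email branch.
    return { status: 'error' };
  }
  if (previousPrice != null && currentPrice < previousPrice) {
    // Cheaper than the stored price -> NotifyBetterPrice email branch.
    return { status: 'better', currentPrice };
  }
  // No improvement -> just update the stored price for the next run.
  return { status: 'unchanged', currentPrice };
}

// Example: evaluatePrice('1 299,00 Kč', 1399) -> { status: 'better', currentPrice: 1299 }
```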
NOTE: This is the first (beta) version of this workflow, so it may have some issues. For example, it cannot fetch content from websites whose owners block requests from unknown external services, which is a typical protection against crawlers.