This workflow automatically tracks changes on specific websites, typically e-commerce product pages where you want to be notified about price changes.
- Execute Command nodes create a file named `kopacky.json` in the `/data/` folder and clean the data. (Make sure that n8n has permission to write to this folder in your setup.)
- Cron node triggers the workflow at regular intervals (default: every 15 minutes); adjust this depending on how often you want to crawl your watchers' URLs.
- Each watcher defines:
  - a unique identifier for the watcher;
  - the URL of the website where you want to track changes;
  - the CSS selector of the HTML element that contains the price (you can use your browser's developer tools to find a specific selector);
  - the currency code in which your price is set.
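For reference, a watcher entry in `kopacky.json` might look like the following. All field names and values here are illustrative assumptions; the workflow's actual property names are not shown above.

```json
[
  {
    "id": "shoes-eshop-1",
    "url": "https://example.com/product/123",
    "selector": ".product-detail .price",
    "currency": "EUR"
  }
]
```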
- Function Item node (Init item) stores all required data from each watcher on the current item.
- HTTP Request node fetches data from the website.
- HTML Extract node extracts the required information from the webpage.
- Send Email nodes (NotifyBetterPrice) email you when a better price is available, and when there is a problem getting the price (for example, the website is down, the tracked product is no longer available, or the site owner changed the selector or HTML).
- IF nodes filter the incoming data and route the workflow.
- Move Binary Data nodes convert the JSON file to binary data.
- Write Binary File nodes write the product prices to the file.
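The routing logic described above (error branch vs. better-price branch) can be sketched as plain JavaScript. This is a hypothetical sketch, not the workflow's exact node expressions; the function names and price format handling are assumptions.

```javascript
// Parse the price text extracted by the HTML Extract node into a number.
// Returns null when no usable number is found (site down, product gone,
// or the CSS selector no longer matches).
function parsePrice(text) {
  if (typeof text !== 'string') return null;
  const match = text.replace(/\s/g, '').match(/\d+(?:[.,]\d+)?/);
  return match ? parseFloat(match[0].replace(',', '.')) : null;
}

// Decide which branch the IF nodes would take for one watcher.
function route(storedPrice, extractedText) {
  const current = parsePrice(extractedText);
  if (current === null) return 'notify-error';         // problem getting the price
  if (storedPrice === null || current < storedPrice)   // first run, or price dropped
    return 'notify-better-price';
  return 'no-action';                                  // price unchanged or higher
}
```

Parsing tolerantly (spaces, comma decimal separators) matters because the extracted text is raw HTML content like `"1 299,50 Kč"`, not a clean number.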
NOTE: This is the first (beta) version of this workflow, so it may have some issues. For example, the workflow can fail to fetch content from websites whose owners block calls from unknown services, which is a typical protection against crawlers.
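One common partial workaround, sketched here as an assumption rather than part of the original workflow, is to send browser-like headers from the HTTP Request node: many sites reject requests whose User-Agent looks like a bot. (Sites that block by IP range will still refuse the request.)

```javascript
// Build request options you could mirror in the HTTP Request node's
// header settings, or pass directly to fetch().
function browserLikeFetchOptions() {
  return {
    headers: {
      // Illustrative values, not magic ones; any realistic browser UA works.
      'User-Agent':
        'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
      'Accept-Language': 'en-US,en;q=0.9',
    },
  };
}

// Example use with Node 18+ built-in fetch:
// const html = await (await fetch(watcherUrl, browserLikeFetchOptions())).text();
```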