Overview
This n8n template tracks GitHub Trending repositories (daily/weekly/monthly), parses the trending page into structured data (rank, repo name, stars, language, etc.), and stores results in Google Sheets with automatic deduping. It’s designed for teams who want a simple “trending feed” for engineering research, developer tooling discovery, and weekly reporting.
Who is this for?
- Developers, PMs, DevRel, and tooling teams who want a lightweight trend radar
- Anyone building a curated list of fast-rising open source projects
- Teams who want Sheets-based tracking without manual copy/paste
What problems it solves
- Automatically collects GitHub Trending data on a schedule
- Prevents duplicate rows using a stable `dedupe_key` (sketched below)
- Updates existing rows when values change (rank/stars/score)
How it works
- A schedule triggers the workflow.
- Inputs define the trending window (`daily`, `weekly`, or `monthly`) and optional languages.
- ScrapeOps fetches the GitHub Trending HTML reliably.
- The workflow parses repositories and ranks from the HTML (see the parsing sketch after this list).
- Cleaned rows are written to Google Sheets using Append or Update Row, matching on `dedupe_key`.
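In n8n this parsing step typically lives in a Code node. The sketch below shows its shape as plain TypeScript; the regex, the row fields, and the markup it targets (an `<h2>` wrapping an `<a href="/owner/repo">`) are assumptions about GitHub's current HTML, which can change at any time:

```typescript
interface TrendingRow {
  rank_on_page: number;
  full_name: string;
  repo_url: string;
  dedupe_key: string;
}

// Sketch only: extract repo names in page order and assign ranks.
// GitHub's markup is not a stable API; if the page changes, update the regex.
function parseTrending(html: string, since: string): TrendingRow[] {
  const repoLink = /<h2[^>]*>\s*<a[^>]*href="\/([^"\/]+\/[^"\/]+)"/g;
  const rows: TrendingRow[] = [];
  let match: RegExpExecArray | null;
  while ((match = repoLink.exec(html)) !== null) {
    const fullName = match[1];
    rows.push({
      rank_on_page: rows.length + 1,
      full_name: fullName,
      repo_url: `https://github.com/${fullName}`,
      dedupe_key: `${since}:${fullName.toLowerCase()}`,
    });
  }
  return rows;
}
```

Stars, forks, and stars-in-period would be pulled the same way from neighboring elements; they are omitted here to keep the sketch short.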
Setup steps (~5–10 minutes)
- ScrapeOps
  - Create a ScrapeOps account, copy your API key, and add it as a ScrapeOps credential in n8n.
- Google Sheets
  - Duplicate the example sheet or create your own, then add a `trending_raw` tab.
  - Add the columns used by the workflow (e.g. `captured_at`, `since`, `source_url`, `rank_on_page`, `full_name`, `repo_url`, `stars_total`, `forks_total`, `stars_in_period`, `score`, `dedupe_key`).
  - In the Google Sheets node, choose Append or Update Row and set "Column to match on" to `dedupe_key`.
- Customize
  - Change `since` to `daily`, `weekly`, or `monthly` in the Inputs node.
  - Add languages via `languages_csv` (example: `any,python,go,rust`); see the URL sketch after this list.
  - Adjust the delay between requests if needed.
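For orientation, those inputs plausibly expand into one fetch target per language, with `any` meaning the unfiltered trending page and `since` passed as a query parameter. The function name and exact expansion below are illustrative, not the template's actual node code:

```typescript
// Sketch only: expand the Inputs into one GitHub Trending URL per language.
// "any"    -> https://github.com/trending?since=weekly
// "python" -> https://github.com/trending/python?since=weekly
function buildTrendingUrls(since: string, languagesCsv: string): string[] {
  return languagesCsv
    .split(",")
    .map((lang) => lang.trim().toLowerCase())
    .filter((lang) => lang.length > 0)
    .map((lang) =>
      lang === "any"
        ? `https://github.com/trending?since=${since}`
        : `https://github.com/trending/${encodeURIComponent(lang)}?since=${since}`
    );
}

// buildTrendingUrls("weekly", "any,python,go,rust") yields four URLs.
```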
Pre-conditions
- ScrapeOps account + API key configured in n8n
- Google Sheets credentials connected in n8n
- A Sheet tab named `trending_raw` with matching columns
Disclaimer
This template uses ScrapeOps as a community node. You are responsible for complying with GitHub’s Terms of Service, robots directives, and applicable laws in your jurisdiction. Scraping targets can change at any time; you may need to update wait times and parsing logic accordingly. Use responsibly for legitimate business purposes.