Instead of guessing or relying on shallow placeholders, it scrapes real website content, summarizes it intelligently, and feeds that context into an LLM to produce outreach that feels relevant and human.
If a website is broken or unreachable, the workflow safely flags it so you can identify faulty leads early.
The workflow starts by pulling leads from Baserow, including each lead's company name and website URL.
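As a rough sketch, pulling rows from Baserow can be done through its REST API's list-rows endpoint. The table ID and token below are placeholders, and the exact fields depend on your table:

```python
# Hypothetical sketch of listing lead rows via Baserow's REST API.
# TABLE_ID and BASEROW_TOKEN are placeholders, not values from the workflow.
import urllib.request

BASEROW_TOKEN = "YOUR_DATABASE_TOKEN"  # placeholder database token
TABLE_ID = 12345                       # placeholder table ID

def build_list_rows_request(table_id: int, token: str) -> urllib.request.Request:
    """Build the GET request that lists rows with human-readable field names."""
    url = (f"https://api.baserow.io/api/database/rows/table/{table_id}/"
           "?user_field_names=true")
    return urllib.request.Request(url, headers={"Authorization": f"Token {token}"})

req = build_list_rows_request(TABLE_ID, BASEROW_TOKEN)
```

Sending the request returns JSON whose `results` array holds one object per lead, with the company name and website URL as fields.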
The lead’s website is fetched as raw HTML. If the site fails to load or respond, the workflow records no content and continues without breaking.
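The fetch-with-fallback behavior can be sketched as a function that returns an empty string instead of raising, so one dead site never halts the run (a minimal stdlib sketch, not the workflow's actual HTTP node):

```python
import urllib.request

def fetch_html(url: str, timeout: float = 10.0) -> str:
    """Fetch a page's HTML; return an empty string instead of raising on failure."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            charset = resp.headers.get_content_charset() or "utf-8"
            return resp.read().decode(charset, errors="replace")
    except Exception:
        # Broken or unreachable site: record "no content" and keep going.
        return ""
```

Downstream steps can then treat an empty string as the "no content" flag for identifying faulty leads.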
All links are extracted from the page, then filtered so only links belonging to the same website are kept.
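The same-domain filter amounts to resolving each link against the page URL and keeping only those whose host matches. A stdlib-only sketch (the workflow itself may use a different HTML parser):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(base_url: str, html: str) -> list[str]:
    """Resolve all links against the base URL and keep same-host ones, deduplicated."""
    parser = LinkCollector()
    parser.feed(html)
    base_host = urlparse(base_url).netloc
    seen, kept = set(), []
    for href in parser.links:
        absolute = urljoin(base_url, href)
        if urlparse(absolute).netloc == base_host and absolute not in seen:
            seen.add(absolute)
            kept.append(absolute)
    return kept
```

Relative links like `/about` resolve to the lead's own domain, while links to social profiles or third-party sites are dropped.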
The workflow scrapes up to five pages in total, including the main website page and up to four internal pages. This provides enough context while avoiding unnecessary data.
Each page is converted to markdown to reduce token usage and trimmed to a maximum of 5,000 characters to control LLM costs.
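The page cap and per-page character budget described above reduce to two small helpers (the 5-page and 5,000-character limits come from the workflow; the function names are illustrative):

```python
MAX_PAGES = 5      # homepage plus up to four internal pages
MAX_CHARS = 5_000  # per-page markdown cap to control LLM token costs

def select_pages(homepage: str, internal: list[str]) -> list[str]:
    """Keep the homepage plus at most four internal URLs."""
    return [homepage] + [u for u in internal if u != homepage][: MAX_PAGES - 1]

def trim_markdown(markdown: str, limit: int = MAX_CHARS) -> str:
    """Truncate a page's markdown to the character budget."""
    return markdown[:limit]
```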
All processed markdown content is combined into a single structured input.
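One plausible way to combine the per-page markdown is to label each page with its URL and join the sections, skipping pages that returned no content (a sketch, not the workflow's exact node output):

```python
def combine_pages(pages: list[tuple[str, str]]) -> str:
    """Merge (url, markdown) pairs into one labeled document for the LLM."""
    sections = [f"## Page: {url}\n\n{md}" for url, md in pages if md]
    return "\n\n---\n\n".join(sections)
```

Labeling each section with its source URL lets the model attribute claims to specific pages, and empty entries from failed fetches simply vanish from the aggregate.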
An LLM analyzes the aggregated content and generates a concise overview of the company and its offering.
A second LLM uses the company name, lead name where available, and the generated business overview to create a highly personalized subject line and icebreaker for outreach.
The final outputs are written back to the database, keeping each lead enriched and ready for outreach.
Most outreach fails because it is generic. This workflow solves that by grounding every message in real website content while staying fast, efficient, and cost-conscious.