How a BotHound Customer Scrapes Reddit for Leads Every Day

A real workflow for using BotHound to find high-intent Reddit leads, filter them, and send a daily email report.

By The BotHound Team
lead-generation reddit workflows ai-automation

Most lead generation is too broad.

You search for “small business owners,” get a giant list of people, and then spend hours figuring out who actually has a problem worth solving.

This BotHound customer took a different approach.

They built a daily Reddit lead scraper that runs every morning at 7:00 AM.

The bot searches Reddit for people already describing painful business problems, filters the discussions, summarizes the opportunity, and emails a clean report with links to the original sources.

No refreshing Reddit.

No copy-pasting links.

No daily research routine.

Just a scheduled lead report waiting in their inbox.

The workflow

The bot has two tasks:

  1. Search Reddit for recent small business pain points.
  2. Email a structured lead report with links to the original discussions.

The customer also attaches a markdown file that explains their business, their ideal customers, and the specific types of leads they want.

That part matters.

Without context, a bot can find generic leads.

With context, it can find relevant leads.

The schedule

The customer scheduled the bot to run every day at 7:00 AM.

That means lead discovery happens before the workday starts.

By the time they open their inbox, the bot has already searched Reddit, filtered the discussions, and created a report.

This solves one of the biggest problems with lead research: consistency.

Most people do manual research when they remember to do it.

This bot does it every day.
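BotHound handles the scheduling itself; for intuition, the "next 7:00 AM run" logic looks something like this (a sketch, not BotHound internals — the function name is ours):

```python
from datetime import datetime, time, timedelta

def next_run(now: datetime, run_at: time = time(7, 0)) -> datetime:
    """Return the next 7:00 AM run after `now` (today if 7:00 AM hasn't passed yet)."""
    candidate = datetime.combine(now.date(), run_at)
    if now >= candidate:
        # Today's 7:00 AM already happened, so schedule tomorrow's run.
        candidate += timedelta(days=1)
    return candidate

# For reference, the equivalent cron expression for "every day at 7:00 AM" is: 0 7 * * *
```

The point is that the run time never depends on the customer remembering anything — the next run is always defined.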

The business context file

The customer passes the bot a markdown file that explains exactly what their business does and what kind of leads they are looking for.

That file acts like a briefing document.

It can include things like:

# Business Context

We help small businesses automate repetitive research, monitoring, and reporting tasks using BotHound.

BotHound lets users create bots with tasks, prompts, tools, files, and schedules.

The best leads are people who are already talking about:
- Manual research
- Repetitive checking
- Competitor monitoring
- Review monitoring
- Missed customer messages
- Pricing changes
- Lead discovery
- Content research
- Workflow inefficiencies
- Tasks they wish could be emailed to them automatically

Avoid:
- Enterprise-only problems
- Vague complaints
- Posts where the person already has a complete solution
- Old discussions
- Low-intent conversations

Prioritize leads where BotHound could be used immediately.

This gives the bot a much better shot at finding useful opportunities.

The bot is not just searching for keywords.

It is evaluating whether each discussion matches the customer’s business.

The Bot Soul

The Bot Soul defines how the bot should think.

Here is the exact Bot Soul used:

You are a high-signal B2B lead discovery agent for BotHound.

You are given context about what BotHound is and who it serves. Use that context to guide all decisions.

Your job is to find small businesses actively expressing real operational problems that BotHound could solve.

You prioritize:
- Clear, repeatable problems (not vague complaints)
- Situations involving manual work, monitoring, or missed information
- Problems tied to time loss, revenue impact, or inefficiency
- Recent discussions (last 6 months, preferably last 2 months)

You think like a founder:
- You ignore noise and low-quality discussions
- You focus on actionable pain, not general conversation
- You identify opportunities where a BotHound bot could be immediately useful

You do NOT:
- Return generic leads
- Return enterprise-only problems
- Return leads with outdated discussions
- Return unclear or low-intent discussions
- Return posts where the author already has a solution

Every strong lead should map to a specific BotHound use case.

This turns the bot from a basic scraper into a lead discovery agent.

A generic scraper finds posts.

This bot finds pain.

Task 1: Search Reddit

The first task searches Reddit for recent discussions where small business owners are describing real operational problems.

Tool attached: WebSearch

Prompt:

Search Reddit for recent discussions (prefer last 7 days) where small business owners or operators describe real problems, frustrations, or manual work.

Focus on subreddits like:
- smallbusiness
- entrepreneurship
- startups
- ecommerce
- marketing
- localbusiness

Look for signals such as:
- Repetitive manual tasks
- Constant checking or monitoring
- Missing opportunities or reacting too late
- Tracking competitors, pricing, or reviews manually
- Workflow inefficiencies

Return up to 10 candidate discussions.

For each:
- Platform: Reddit
- Link
- Problem Summary (1–2 sentences)
- Key Quote (if useful)
- Suspected Business Type

Do not filter aggressively yet — prioritize strong candidates.
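In this workflow the WebSearch tool does the searching. For context, though, Reddit also exposes a public search endpoint that a plain scraper could query directly. A small sketch building such a query URL (`reddit_search_url` is our illustrative helper, not part of BotHound):

```python
from urllib.parse import urlencode

def reddit_search_url(subreddit: str, query: str, window: str = "week") -> str:
    """Build a Reddit search URL restricted to one subreddit, newest first.

    window='week' matches the prompt's preference for the last 7 days.
    """
    params = urlencode({
        "q": query,
        "restrict_sr": 1,   # only search within this subreddit
        "sort": "new",      # most recent discussions first
        "t": window,        # time window: hour, day, week, month, year, all
    })
    return f"https://www.reddit.com/r/{subreddit}/search.json?{params}"
```

An agent with a search tool does the same thing conceptually: scope the search, bias toward recency, and collect candidates before filtering.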

The task looks for language like:

  • “I keep missing customer messages.”
  • “I manually check competitor pricing every week.”
  • “I need to know when reviews come in.”
  • “I waste hours gathering information from different sites.”
  • “I wish this could just be emailed to me.”

Those are strong signals.

They show pain before the customer has named the solution.
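The bot evaluates posts against the business context rather than by keyword matching, but a plain keyword pass illustrates what "signal" means here. A minimal sketch using the phrases above (phrase list, names, and threshold are ours):

```python
# Illustrative only: the actual bot reasons over each post with its business
# context. This shows the cruder first-pass check a basic scraper might do.
HIGH_INTENT_PHRASES = [
    "keep missing",
    "manually check",
    "need to know when",
    "waste hours",
    "wish this could just be emailed",
]

def signal_score(post_text: str) -> int:
    """Count how many high-intent phrases appear in a post (case-insensitive)."""
    text = post_text.lower()
    return sum(phrase in text for phrase in HIGH_INTENT_PHRASES)

def is_candidate(post_text: str, threshold: int = 1) -> bool:
    """Keep any post with at least `threshold` matching signals."""
    return signal_score(post_text) >= threshold
```

A keyword filter like this finds posts; judging whether the pain actually fits the business is what the context file and Bot Soul are for.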

Task 2: Email the report

The second task turns the Reddit findings into a readable email.

Tool attached: SendEmail

Prompt:

Create an email report that displays all of these leads, along with source links to each discussion.

Recipient: theodore@bothound.com

Subject:
"BotHound Lead Report - High-Intent Opportunities"

Body:
Include the full structured report.

Ensure:
- Clean formatting
- Easy readability
- No unnecessary fluff
- Links to every source discussion

The final report includes:

  • The Reddit discussion link
  • A short problem summary
  • A useful quote when available
  • The suspected business type
  • Why the lead may be relevant
  • The BotHound use case it maps to

That makes the report immediately actionable.

The customer can open the email, review the links, and decide who is worth engaging with.
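The report itself is assembled by the bot, but the formatting logic is easy to sketch. A minimal, hypothetical renderer (the field names are ours, mirroring the report structure above):

```python
def format_report(leads: list[dict]) -> str:
    """Render leads as a plain-text email body with a link for every source."""
    sections = []
    for i, lead in enumerate(leads, start=1):
        sections.append(
            f"{i}. {lead['problem_summary']}\n"
            f"   Business type: {lead['business_type']}\n"
            f"   Use case: {lead['use_case']}\n"
            f"   Source: {lead['link']}"
        )
    header = "BotHound Lead Report - High-Intent Opportunities"
    return header + "\n\n" + "\n\n".join(sections)
```

Keeping every lead tied to its source link is what lets the customer verify a lead in one click instead of re-searching Reddit.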

Why this saves time

Manual lead research is slow because most of the work is filtering.

A person doing this manually has to:

  • Search Reddit
  • Check multiple subreddits
  • Try different keywords
  • Read dozens of posts
  • Ignore low-quality discussions
  • Save useful links
  • Summarize each problem
  • Decide whether the person fits the business
  • Put everything into a report

That can easily take 30–60 minutes a day.

This BotHound workflow turns that into a scheduled daily process.

The human still makes the final judgment, but the painful first pass is automated.

The bigger pain point: missed opportunities

A lot of customers do not struggle because information is unavailable.

They struggle because useful information is scattered.

It is buried in Reddit threads, forums, review sites, social media, competitor pages, local business groups, and public posts.

The problem is not access.

The problem is attention.

People do not have time to check every source every day.

BotHound helps by turning that recurring research into a scheduled job.

Other ways this same workflow can be used

This same pattern works far beyond Reddit lead scraping.

A business could use BotHound for:

  • Competitor price monitoring
  • Review monitoring
  • Local discount tracking
  • Brand mention monitoring
  • Social listening
  • Customer complaint discovery
  • RFP or grant monitoring
  • Content idea research
  • Ecommerce product research
  • Hiring signal discovery
  • Industry news summaries
  • Forum monitoring
  • New customer pain tracking

The structure is the same:

  1. Give the bot business context.
  2. Tell it what signals to look for.
  3. Attach the right tools.
  4. Schedule it.
  5. Send the results somewhere useful.
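None of this requires code on the customer's side, but the five-step pattern is easy to write down as data. A hypothetical sketch (this is not BotHound's actual configuration format):

```python
# The five-step pattern as plain data. Field names and values are illustrative,
# not BotHound's real config schema.
bot = {
    "context_files": ["business_context.md"],               # 1. business context
    "signals": ["manual research", "repetitive checking"],  # 2. what to look for
    "tools": ["WebSearch", "SendEmail"],                    # 3. tools the bot can use
    "schedule": "0 7 * * *",                                # 4. daily at 7:00 AM (cron syntax)
    "output": {"email": "theodore@bothound.com"},           # 5. where results go
}
```

Swap the signals and the context file, and the same skeleton becomes any of the monitoring workflows listed above.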

Why the markdown file matters

The markdown file is what makes the bot specific.

Two companies could both run a Reddit lead scraper, but they should not get the same report.

A marketing agency may want leads complaining about SEO, ads, reviews, or local visibility.

A SaaS founder may want leads complaining about broken workflows, missing integrations, or competitor pricing.

A consultant may want leads talking about compliance, documentation, or manual reporting.

The prompts define the task.

The tools let the bot act.

The schedule makes it consistent.

The markdown file gives it judgment.

Final thought

Most businesses do not need another dashboard.

They need someone to check the right places, filter the noise, and tell them what matters.

That is what this BotHound Reddit lead scraper does.

Every morning at 7:00 AM, it searches for fresh pain, matches it against the customer’s business, and sends a clean report.

That is the kind of work a bot should do.