Stop reviewing waitlist signups manually

Dennis Petri

You launched your waitlist. The signups are rolling in. You feel productive.

Then Monday morning hits and you have 247 new signups to review. You open the spreadsheet, start reading through names and emails, and realize you have absolutely no way to tell who's serious and who's just... there.

So you do what every founder does: you either review them all (and lose an entire day), or you skip the review and let everyone in (and lose the point of having a waitlist).

Neither option is good.

The math doesn't work

Let's be honest about the numbers. If you're manually reviewing signups:

  • Reading and evaluating one signup: 5-8 minutes (if you're checking LinkedIn, looking at their company, reading any notes they left)
  • 100 signups: 8-13 hours
  • 500 signups: 42-67 hours
  • 1,000 signups: you're not doing this

Most founders hit a wall around 50-100 signups. After that, review stops being rigorous and starts being "skim the email domain and guess." @gmail.com? Probably not enterprise. @bigcorp.com? Let them in. That's not lead qualification. That's pattern matching on domain names.
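Spelled out as code, that fallback heuristic is embarrassingly simple. This is a toy sketch (the free-provider list is made up), and notice what it tells you about intent: nothing.

```python
# The "skim the email domain and guess" filter that manual review degrades into.
# Hypothetical provider list for illustration only.
FREE_PROVIDERS = {"gmail.com", "yahoo.com", "outlook.com", "hotmail.com"}

def looks_enterprise(email: str) -> bool:
    """Guess seriousness from the email domain alone. Fast, and wrong often."""
    domain = email.rsplit("@", 1)[-1].lower()
    return domain not in FREE_PROVIDERS
```

A founder building a side project from a Gmail address fails this check; a bored employee at a big company passes it.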

What you're actually trying to figure out

When you review a signup manually, you're trying to answer one question: "Is this person serious?"

Serious means:

  • They have a specific problem they're trying to solve
  • They've thought about how your product fits their workflow
  • They've tried alternatives and found them lacking
  • They can articulate what success looks like
  • They're not just collecting beta access to products they'll never open

The problem is that a name and email address tell you none of this. You need responses to actual questions, and you need a way to evaluate those responses at scale.

How intent scoring automates the triage

Intent scoring replaces the manual review loop with an automated evaluation. Here's how it works:

Step 1: Ask better questions. Instead of just name and email, ask signups 3-5 targeted questions about their problem, workflow, and intent. These questions take 60-90 seconds to answer - short enough that serious people don't bounce, long enough that casual signups self-select out.

Step 2: Score the responses. An AI scoring engine reads each set of answers and evaluates them for:

  • Specificity (vague vs. concrete problems)
  • Genuine engagement (thoughtful vs. one-word answers)
  • Relevance (actually related to your product vs. off-topic)
  • Red flags (spam, prompt injection, nonsensical content)

Each signup gets a score from 0 to 100.
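To make the four criteria concrete, here's a minimal sketch of a rubric-style scorer. A real scoring engine would use an AI model to judge specificity and relevance; this toy heuristic (invented weights, invented red-flag phrases) only shows the shape of the evaluation: several dimensions folded into a single 0-100 number.

```python
def score_signup(answers: dict[str, str]) -> int:
    """Toy rubric: fold specificity, engagement, relevance, and red-flag
    checks into a single 0-100 score. All weights are illustrative."""
    text = " ".join(answers.values()).lower()
    score = 0

    # Red flags: spam or prompt-injection phrases zero the score outright.
    red_flags = ("ignore previous instructions", "buy now", "http://")
    if any(flag in text for flag in red_flags):
        return 0

    # Specificity proxy: concrete answers tend to be longer (up to 40 points).
    avg_words = sum(len(a.split()) for a in answers.values()) / max(len(answers), 1)
    score += min(40, int(avg_words * 2))

    # Engagement: penalize one-word answers.
    score -= 10 * sum(1 for a in answers.values() if len(a.split()) <= 1)

    # Relevance would come from the model's judgment; fixed bonus stands in here.
    score += 30

    return max(0, min(100, score))
```

The point isn't the arithmetic; it's that every answer set collapses to one comparable number, so 300 signups can be sorted instead of read.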

Step 3: Route automatically. High scorers (92+) get instant access. No waiting, no manual approval. Mid-range scores (80-91) go to a review queue - but now you're reviewing 30 signups instead of 300. Low scores stay on the waitlist.
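The routing step is just thresholds on that score. A sketch, using the cutoffs above:

```python
def route(score: int) -> str:
    """Map a 0-100 intent score to an action: 92+ gets instant access,
    80-91 lands in the review queue, everything else stays waitlisted."""
    if score >= 92:
        return "instant_access"
    if score >= 80:
        return "review_queue"
    return "waitlist"
```

Only the middle bucket ever needs a human, which is why 300 signups turn into 30 reviews.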

What this looks like in practice

Say you launch your waitlist on a Monday and get 200 signups by Friday.

Without intent scoring:

  • You have 200 rows in a spreadsheet
  • You spend Saturday reviewing them
  • You let in 50 people, half of whom never log in
  • You feel tired

With intent scoring:

  • 12 signups scored 92+ and got instant access (your best leads, already using the product)
  • 35 signups scored 80-91 and are in your review queue (manageable)
  • 153 signups scored below 80 (no action needed right now)
  • You spend 30 minutes reviewing the 35 mid-range signups
  • Your first 12 users are all high-intent

That's the difference between a weekend lost to triage and a launch that starts with momentum.

The self-selection effect

There's a secondary benefit most people don't expect: the questions themselves filter out low-quality signups.

When your signup form asks "What problem are you trying to solve?" instead of just "Enter your email," a certain percentage of casual signups bounce. They don't want to answer questions. They just wanted to "check it out."

This is a feature, not a bug. The people who stay and answer are exactly the people you want.

In practice, intent-scored waitlists convert better despite being smaller: a list of 50 scored signups often outperforms 500 email-only signups, because every person on it has already demonstrated engagement.

Getting started

Baitlist handles the entire scoring flow. Create a waitlist, share the signup link, and every applicant answers your questions and gets scored automatically.

The free tier handles up to 50 signups per month - enough to test the approach on a real launch. Pro scales to 1,000 signups with custom questions and embeddable forms.

Your time is better spent building than triaging. Let the scoring engine handle the filter.

Ready to stop guessing which signups matter?