Your analytics dashboard shows the pricing page has a 68% exit rate. You know people are leaving. You do not know why. That gap — between knowing what visitors do and understanding why they do it — is exactly what on-site surveys are built to close.
This guide covers everything you need to run effective on-site surveys on a SaaS website: what they are, when to use them, what to ask, how to analyze results, and which pages deserve a survey first.
Note
What is an on-site survey? An on-site survey is a short feedback form that appears directly on your website while a visitor is still on the page. Unlike link-based surveys (Google Forms, Typeform) that redirect visitors away, an on-site survey is embedded as a lightweight widget — typically a popup, slide-in panel, or bottom bar — that overlays the page without interrupting the browsing experience. Visitors answer 1-3 questions and the survey disappears.
Why on-site surveys outperform every other feedback method
Before digging into how to use them, it helps to understand why on-site surveys consistently produce better data than the alternatives.
The context problem with email surveys
Email surveys arrive hours or days after the experience you're asking about. The visitor who nearly abandoned your checkout does not remember the exact moment of friction by the time your email hits their inbox. The emotional specificity is gone. The details are blurry.
On-site surveys catch people in the moment — when the problem is live, the frustration is fresh, and the answer is specific. "The promo code field was hidden and I almost gave up" is the kind of answer you get from someone who just experienced it. Not from someone recalling it three days later.
The volume problem with user research
Qualitative user research is valuable, but it's slow, expensive, and small. A usability study gives you 8-10 sessions with recruited participants in artificial conditions. A well-timed on-site survey gives you 200-500 responses from real visitors doing real things in the context you actually care about.
Both have their place. But for fast, actionable feedback that influences product and marketing decisions month-to-month, on-site surveys are more practical.
The direction problem with analytics
Analytics tells you what happened. It doesn't tell you why.
Your pricing page has a 68% exit rate. Analytics confirms this. It does not tell you whether visitors left because the pricing was too high, they couldn't understand the difference between plans, they needed to check with a colleague first, or they found a competitor with a free trial.
On-site surveys answer the why. A 3-question survey on your pricing page — triggered on exit intent — gives you the actual objections in your visitors' own words.
A direct comparison
| Method | When you get the data | Sample size | Context quality | Cost to run |
|---|---|---|---|---|
| On-site survey | While visiting your site | 50-500+ responses | High — in-context | Low |
| Email survey | Days after the event | 10-50 responses | Low — from memory | Low |
| User research | Scheduled session | 5-15 sessions | Medium — simulated | High |
| Analytics | After the fact | All visitors | Zero — no "why" | Near zero |
| A/B testing | Weeks to reach stat sig | Thousands | Zero — no "why" | Medium |
The on-site survey is the only method that catches someone in the moment, at scale, at low cost.
The four types of on-site surveys
On-site surveys come in four display formats. The right format depends on the page and the trigger.
Modal: A centered overlay with a blurred backdrop. High visual weight — hard to miss, impossible to ignore. Use for high-stakes questions where you want maximum response rates. Best for exit intent on critical conversion pages.
Popup: A floating card, typically bottom-right or bottom-left. Less intrusive than modal. Good default format for most use cases — visible but doesn't block the page.
Slide-in: Appears by sliding in from the edge of the screen. Feels less like an interruption than a modal or popup. Works well for scroll-triggered surveys mid-article or on content pages.
Bottom bar: A thin banner that sticks to the bottom of the screen. Lowest interruption level. Good for NPS or single-question ratings on pages where you don't want to disrupt reading.
On mobile, all four formats typically collapse to a bottom sheet — a card that slides up from the bottom of the screen — which respects native mobile UX patterns and gives visitors a full-width, touch-friendly experience.
When to trigger an on-site survey
The trigger matters as much as the question. A good question shown at the wrong moment gets ignored. The same question at the right moment gets answered.
Time delay
The visitor lands on your page. After a set number of seconds — typically 5-15 — the survey appears.
Best for: pages where you want general feedback on the experience (homepage, features page). Not great for exit-intent questions since the visitor hasn't decided to leave yet.
Scroll depth
The survey appears when the visitor has scrolled X% down the page.
Best for: blog posts, long landing pages, documentation. "Was this article helpful?" at the 80% scroll mark catches readers who actually read the piece — not those who bounced after 3 seconds.
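A scroll-depth trigger boils down to one calculation: how far the bottom of the viewport has traveled relative to the full page height. A minimal sketch (function names are illustrative, not Selge's actual API):

```javascript
// How far down the page the bottom of the viewport has reached, as a percentage.
function scrollDepthPercent(scrollTop, viewportHeight, pageHeight) {
  return Math.min(100, Math.round(((scrollTop + viewportHeight) / pageHeight) * 100));
}

// True once the visitor has scrolled past the configured threshold.
function shouldTriggerAtDepth(scrollTop, viewportHeight, pageHeight, threshold) {
  return scrollDepthPercent(scrollTop, viewportHeight, pageHeight) >= threshold;
}

// In a browser you would wire this to the scroll event, e.g.:
// window.addEventListener("scroll", () => {
//   if (shouldTriggerAtDepth(window.scrollY, window.innerHeight,
//       document.documentElement.scrollHeight, 80)) showSurvey();
// }, { passive: true });
```

Clamping to 100 handles pages shorter than the viewport, where the raw ratio would exceed 100%.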
Exit intent
On desktop, exit intent detects when the visitor's cursor moves toward the top of the browser window — the motion of someone about to switch tabs or close. On mobile, it typically falls back to time delay or scroll.
Best for: conversion pages (pricing, signup) where the question is explicitly about why the visitor didn't convert. This is the highest-value trigger in on-site surveys. The exit intent audience — people who considered converting and decided not to — is the most valuable audience you can survey.
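A common desktop heuristic for detecting that cursor-toward-the-top motion: the pointer is near the top edge of the viewport and moving upward quickly. A minimal sketch, assuming illustrative thresholds (this is not Selge's actual implementation):

```javascript
// Exit-intent heuristic: cursor near the top edge AND moving upward fast,
// suggesting a move toward the tab bar or close button.
// deltaY is vertical movement since the last sample (negative = upward).
function isExitIntent(clientY, deltaY, topThreshold = 20, minUpwardSpeed = 10) {
  return clientY <= topThreshold && deltaY <= -minUpwardSpeed;
}

// Browser wiring (sketch):
// let lastY = null;
// document.addEventListener("mousemove", (e) => {
//   if (lastY !== null && isExitIntent(e.clientY, e.clientY - lastY)) showSurvey();
//   lastY = e.clientY;
// });
```

Requiring both conditions avoids firing on a cursor that merely rests near the top, or one that moves up slowly while reading.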
On click (trigger button)
A persistent button on the page (text tab or icon button) that visitors can click whenever they want to give feedback.
Best for: product dashboards, SaaS apps, and any context where you want to invite feedback without interrupting. The visitor self-selects — only those who want to respond do.
After interaction
The survey appears after a specific action — after completing a checkout, submitting a form, or reaching the end of an onboarding flow.
Best for: post-purchase, post-signup, post-cancellation surveys. The action is the context, so the question can be specific to that moment.
Which pages should get a survey first
You can put an on-site survey on any page. But some pages have dramatically higher ROI than others.
Pricing page
The highest-value page for surveys on any SaaS website. Visitors who reach the pricing page have shown intent. When they leave without converting, you want to know exactly what stopped them.
Trigger: Exit intent
Questions to consider:
- What stopped you from signing up today? (multiple choice with "Other" option)
- How clear is our pricing? (emoji scale)
- What would have made you sign up? (open text)
The multiple choice gives you categories you can quantify across hundreds of responses. The open text gives you the exact language visitors use to describe their objections — language you can then use in your own copy.
Signup or checkout page
A similar dynamic: visitors who started the signup flow and didn't finish it are the most interesting drop-offs to understand. What was the last thing that stopped them?
Trigger: Exit intent or time delay (30+ seconds on page)
Questions to consider:
- Was there anything confusing or missing on this page? (open text)
- What's one thing that would make this easier? (open text)
Post-signup or post-purchase
This is the easiest survey to get right because the visitor has just done something. The context is fresh and specific.
Trigger: After form submission or purchase completion
Questions to consider:
- What almost stopped you from signing up? (open text or multiple choice)
- How would you describe [product] to a colleague? (open text)
- How did you find us? (multiple choice)
The "what almost stopped you" question is one of the highest-ROI survey questions in conversion optimization. It catches the friction that nearly derailed a conversion — and that friction is almost certainly stopping other visitors too.
Features page
Visitors on the features page are evaluating whether your product does what they need. When they leave without converting, you want to know which feature was missing or unclear.
Trigger: Time delay (10-15 seconds) or scroll depth (60%)
Questions to consider:
- Did you find what you were looking for? (yes/no with follow-up)
- What feature matters most to you? (multiple choice)
Blog posts and content pages
Content surveys are often overlooked but produce consistently useful data. A simple "Was this article helpful?" survey tells you which content is actually landing — and which pieces are getting traffic but leaving readers unsatisfied.
Trigger: Scroll depth (80%)
Questions to consider:
- Was this article helpful? (yes/no or emoji scale)
- What question did this article leave unanswered? (open text)
Tip
Start with your highest-traffic page that also has a conversion problem — usually the pricing page or the signup page. Don't try to survey everything at once. One well-designed survey on the right page gives you more useful data than five mediocre ones scattered across the site.
How to design questions that produce useful answers
The format of the question matters. But the content of the question matters more.
Ask about behavior, not sentiment
"What stopped you from signing up today?" produces actionable answers. "How satisfied are you with your experience?" produces a number.
The first question gives you something to fix. The second gives you data you can stare at without knowing what to do.
Whenever you write a survey question, ask yourself: if the answer is [X], what would I change? If the answer doesn't map to a specific action, rewrite the question.
Use the visitor's language
Don't ask: "How would you rate the clarity of our value proposition?" Ask: "Did our pricing page make it clear what you get?"
Visitors don't think in the language of UX research. They think in plain terms. Match their language and you get more responses and better responses.
Make "Other" open text
For multiple choice questions, always include an "Other — please specify" option with a text input. Your pre-written options will miss the most interesting objections — the ones you didn't think to include. The write-ins from "Other" are often the most actionable responses you'll collect.
Pair a structured question with an open text follow-up
Multiple choice gives you quantifiable categories across hundreds of responses. Open text gives you the nuance and the specific language that explains the category.
Together they're much more powerful than either alone. A good two-question survey: multiple choice first (fast, feels low-commitment), open text second (optional, for those with more to say).
Set a maximum of 3 questions
Research consistently shows that response rates drop sharply after the third question. A 1-question survey gets 80%+ completion. A 5-question survey is already under 50%. A 10-question survey is under 20%.
On-site surveys are micro-surveys by design. Their power comes from brevity and context — not comprehensiveness.
How to interpret on-site survey results
Collecting responses is the easy part. Turning them into decisions is where most teams get stuck.
Wait for at least 50 responses before drawing conclusions
With fewer than 50 responses, the percentages are too noisy to act on. 3 people choosing "too expensive" out of 10 responses (30%) means nothing. 150 people out of 500 (30%) is a signal.
The exception: open text responses. Even 5-10 strong open-text answers can identify a specific problem to investigate. Qualitative data doesn't need statistical significance — it needs pattern recognition.
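The 50-response threshold has a simple statistical basis: the margin of error of an observed proportion shrinks with the square root of the sample size. A quick back-of-envelope calculation using the standard normal approximation (a sketch, not a substitute for proper significance testing):

```javascript
// 95% margin of error for an observed proportion p with n responses,
// using the normal approximation: 1.96 * sqrt(p(1-p)/n).
function marginOfError(p, n) {
  return 1.96 * Math.sqrt((p * (1 - p)) / n);
}

const small = marginOfError(0.3, 10);   // ~0.28: "30%" could really be anywhere from 2% to 58%
const large = marginOfError(0.3, 500);  // ~0.04: "30%" means roughly 26% to 34%
```

At 10 responses the uncertainty band is nearly as wide as the estimate itself; at 500 it is narrow enough to act on.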
Group open text responses into themes
Open text answers are messy. People write different things to describe the same underlying problem. Your job is to find the underlying theme.
Run through 50 open-text responses and create 3-5 theme buckets. Then tag each response. The distribution of themes tells you where to focus. The specific quotes tell you what to change.
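The tagging pass above can be semi-automated with simple keyword matching. A minimal sketch — the theme names and keyword lists are illustrative; define your own buckets after reading a first batch of responses:

```javascript
// Keyword-based theme tagger for open-text survey responses.
// Buckets and keywords are examples, not a fixed taxonomy.
const THEMES = {
  price: ["expensive", "price", "pricing", "cost", "budget"],
  plan_confusion: ["difference", "which plan", "compare", "unclear", "confusing"],
  missing_feature: ["missing", "doesn't have", "integration", "feature"],
};

// Returns every theme whose keywords appear in the response, or ["other"].
function tagResponse(text) {
  const lower = text.toLowerCase();
  const tags = Object.keys(THEMES).filter((theme) =>
    THEMES[theme].some((kw) => lower.includes(kw))
  );
  return tags.length ? tags : ["other"];
}

// Counts how many responses fall into each theme bucket.
function themeDistribution(responses) {
  const counts = {};
  for (const r of responses) {
    for (const tag of tagResponse(r)) counts[tag] = (counts[tag] || 0) + 1;
  }
  return counts;
}
```

Keyword matching gets you a first-pass distribution fast; hand-review the "other" bucket, since that is where the objections you didn't anticipate will land.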
Cross-reference with behavioral data
On-site survey responses become much more powerful when layered with analytics. If 32% of respondents say "I couldn't tell the difference between the plans" and your analytics shows that users who viewed the comparison section converted at 2.4x the rate of those who didn't — that's a clear case to make the comparison section more visible.
The survey identifies the problem. Analytics validates the hypothesis. The A/B test measures the fix.
Benchmark over time, not just at a point
The most useful thing you can do with on-site surveys is run the same survey on the same page repeatedly over time. After you make a change based on survey data, re-run the survey. Watch the objection distribution shift.
If you redesigned your pricing page to clarify plan differences, you'd expect "I couldn't tell the difference between plans" to drop from 32% to something lower. If it does, you fixed the right thing. If it doesn't, the change didn't address the actual objection.
Setting up your first on-site survey: step-by-step
Step 1: Identify the one page you most need to understand
Don't start with five surveys. Start with one. Pick the page that has the biggest conversion gap — high traffic, low conversion — where you genuinely don't know why people are leaving.
Step 2: Choose a template or write 1-3 focused questions
If you're using an expert template, the questions, trigger, and interpretation guide are already configured. If you're writing questions yourself: one behavioral question, one optional open-text follow-up.
Selge includes 14 expert templates with WHY/WHEN/HOW/WHAT guidance — including pre-configured targeting, trigger settings, and interpretation notes for each response type.
Step 3: Configure the trigger
Set the trigger based on what you're trying to learn. Exit intent for conversion pages. Scroll depth for content pages. Time delay for general experience feedback.
Set a dismiss cooldown (14-30 days) so return visitors don't see the same survey every visit.
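A dismiss cooldown is just a timestamp comparison against client-side storage. A minimal sketch — the storage key and function names are illustrative, not Selge's actual API:

```javascript
// Show the survey only if it was never dismissed, or the cooldown has elapsed.
const COOLDOWN_DAYS = 21;

function shouldShowSurvey(lastDismissedAtMs, nowMs, cooldownDays = COOLDOWN_DAYS) {
  if (lastDismissedAtMs == null) return true; // never dismissed
  const elapsedDays = (nowMs - lastDismissedAtMs) / (1000 * 60 * 60 * 24);
  return elapsedDays >= cooldownDays;
}

// Browser usage (sketch):
// const key = "survey_dismissed_at"; // hypothetical storage key
// const last = Number(localStorage.getItem(key)) || null;
// if (shouldShowSurvey(last, Date.now())) showSurvey();
// On dismiss: localStorage.setItem(key, String(Date.now()));
```

localStorage is per-browser, not per-person, so the same visitor on a second device will see the survey again — an acceptable trade-off for a widget that must work without accounts.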
Step 4: Install the embed script
One script tag, placed before </body>. The script is async and lazy-loads — the full widget only loads when a survey actually triggers. Under 2KB initial load.
```html
<script async src="https://widget.selge.app/loader.js" data-selge-project="YOUR_PROJECT_ID"></script>
```
Step 5: Go live and collect data
Wait for at least 50 responses before making decisions. For a page with 1,000 daily visitors at a 5% survey response rate, that's 1-2 days. For a low-traffic page, it might be 2 weeks.
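The wait-time arithmetic is simple enough to sanity-check for your own traffic before launching:

```javascript
// Days to collect `target` responses, given daily page traffic and
// an assumed survey response rate (5% is a typical planning figure).
function daysToTarget(dailyVisitors, responseRate, target = 50) {
  const perDay = dailyVisitors * responseRate;
  return Math.ceil(target / perDay);
}

const busyPage = daysToTarget(1000, 0.05); // 1,000 visitors/day -> 1 day
const slowPage = daysToTarget(200, 0.05);  // 200 visitors/day -> 5 days
```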
Step 6: Act on what you learn
Read the responses. Find the top 2-3 themes. Decide what change to test. Make the change. Re-run the survey in 4-6 weeks.
This is the loop: survey → insight → change → re-survey. Each cycle builds evidence. After 3 cycles on a page, you'll understand your visitors better than most companies understand theirs.
Common mistakes that kill response rates
Showing the survey immediately on page load. Visitors haven't had a chance to experience the page yet, so they have nothing to respond to. Wait until they've actually been on the page.
Asking more than 3 questions. Each additional question drops completion rate by 5-10%. Keep it short.
Triggering the survey on every page. Survey fatigue is real. Use URL targeting to show surveys only on the specific pages where you need the data.
No dismiss cooldown. If a visitor dismisses a survey and sees it again on their next visit, they stop trusting your site. Set a 14-30 day cooldown.
Vague questions. "Any feedback?" gets you nothing. "What almost stopped you from signing up?" gets you everything.
Not acting on the results. The most common failure mode. Survey data that doesn't lead to a change was a waste of the visitor's time and yours.
Frequently asked questions
What is an on-site survey?
An on-site survey is a short feedback widget — typically 1-3 questions — that appears directly on a webpage while a visitor is still on the site. Unlike link-based surveys that redirect visitors to a separate URL, on-site surveys overlay the current page as a popup, slide-in, modal, or bottom bar. The visitor answers without leaving the page they're on.
How many questions should a website survey have?
1-3 questions. Response rates drop sharply after the third question — a 3-question survey typically gets 5x more completions than a 15-question one. For on-site surveys, shorter is almost always better. One behavioral question and one optional open-text follow-up is the most effective structure for most use cases.
Do on-site surveys hurt conversion rates?
Not if they're triggered correctly. Surveys that appear immediately on page load disrupt the experience and can increase bounce rates. Surveys triggered after scroll depth, time on page, or exit intent do not interrupt the initial experience and have negligible impact on conversion rates. The data you collect typically drives conversion improvements that far outweigh any minor disruption effect.
What is the best way to trigger a website survey?
Exit intent is the most valuable trigger for conversion pages (pricing, signup). It catches visitors who considered converting and decided not to — the highest-intent audience you can survey. Scroll depth (70-80%) is best for content pages. Time delay (10-30 seconds) works for general experience feedback. On-click trigger buttons are best for SaaS dashboards where you want to invite feedback without interrupting.
How many responses do you need for meaningful data?
For quantitative data (multiple choice, NPS, star ratings), aim for at least 50 responses before drawing conclusions, and 200+ for decisions with real business impact. For qualitative data (open text), 15-20 responses often identify clear patterns. You don't need statistical significance for website surveys — you need enough responses to spot patterns and make directional decisions.
How do on-site surveys differ from email surveys?
The key difference is timing. On-site surveys catch visitors in context — while they're experiencing your site. Email surveys arrive hours or days later, by which point the specific friction points are forgotten and responses become generic. On-site surveys also reach a much larger audience: a survey answered by even 5% of your pricing page visitors will collect more responses than an email sent to 100 customers, because the sample is your actual at-risk population, not a separate panel.
Can on-site surveys slow down my website?
Not if they're built correctly. A lightweight survey widget with a 3-stage loading pattern (tiny loader script, conditional config fetch, widget-on-trigger) has zero measurable impact on Core Web Vitals. The key is that the full widget JavaScript should only load when a survey actually triggers — not on every page load. Selge's loader is under 2KB; the full widget (16KB gzipped) only loads when a survey is about to display.
What pages should I put a website survey on first?
Start with your highest-traffic, lowest-converting page — usually the pricing page or the signup flow. This is where the most money is being left on the table and where survey insights have the most direct impact on revenue. After you've learned what you can from that page, expand to post-signup, features page, and high-traffic blog posts.
How do I analyze open-text survey responses at scale?
Group responses into 3-5 theme buckets, then tag each response. The frequency distribution of themes shows where to focus; the specific quotes show what to change. For larger datasets (500+ responses), AI summarization tools can help identify recurring themes. Selge includes an AI summary feature that generates a holistic analysis of all responses per survey, highlighting patterns and suggested actions.
The bottom line
On-site surveys are the fastest, cheapest, and most contextually accurate way to understand why visitors behave the way they do on your website.
They don't replace analytics — they complete it. Analytics tells you what visitors do. Surveys tell you why. Together, they give you the confidence to make changes that move real business metrics rather than dashboard numbers.
The barrier isn't technical. One script tag, one good question, 48 hours. By the end of the week you'll have more useful insight about your pricing page than a $20,000 research project would give you — because your respondents are real visitors, in real context, experiencing real friction, not recruited participants simulating it in a Zoom call.
Start with one survey. Pick your pricing page. Ask why people leave. Read the answers.
Then change something.
Selge is a lightweight on-site survey tool with expert templates built from 15 years of CRO work. Install in 2 minutes, get your first responses by Friday. Browse the templates or start free.
