4 Lessons from a Year of Satisfaction Surveys

In Fall 2024, we launched our very first Parent Satisfaction Survey.

Determining what to ask, how to ask it, and how to report the findings was a significant undertaking. We knew schools would make strategic decisions based on the insights we presented, and I don’t take that responsibility lightly.

We partnered with several heads of school and a veteran market researcher to design and refine the questions. Then we went to market.

As expected, some things had to be learned from experience.

Whether you’re considering conducting your own survey internally or hiring a third party to do it, I think you’ll benefit from what we’ve learned.

1. Double-barreled questions are sneaky.

When we first launched the survey, one question asked parents about students learning to “live out and defend their faith.”

On paper, it was a great question. It captured something essential to Christian education.

But during one of our early survey debriefs, a head of school graciously pointed out that this is actually two questions masquerading as one. A parent might say “yes” to students learning to live out their faith in daily life, but “no” to being equipped to defend their beliefs intellectually. Or vice versa.

So what’s a school leader supposed to do with that response?

This is a textbook survey design error. Yet, despite carefully designing each question, we still missed it.

Side note: We still ask some double-barreled questions, such as “Availability and quality of athletic programs.” We’ve found that parents are quick to share their opinions on extracurricular availability and quality in open-text comments. School leaders can still identify where the problem lies without adding more questions.

Takeaway for DIY surveys: Read your questions out loud. Ask yourself: “Could someone honestly answer ‘yes’ to half of this and ‘no’ to the other half?” If yes, it may need to be separated.

2. AI REALLY loves to make things up.

Last spring, we made a strategic decision to leverage artificial intelligence to unearth the kind of deep, nuanced insights that usually come with a premium price tag.

In theory, the “why” is obvious:

More insights + accessible price = schools get more value without breaking the budget.

But we ran into a problem. One you are probably familiar with.

AI can be confidently, catastrophically wrong.

I’ve watched it quote parent comments that were never written. Invent survey questions we didn’t ask. Randomly replace verified data with fabricated numbers. And present all of it with the same level of certainty and clarity as the accurate information.

It’s frankly maddening the first (and 50th…) time you realize the “insights” you’ve been handed are a mix of reality and hallucination. But it would be far worse not to realize you’ve been lied to.

AI has improved in this area, but only marginally. Not enough to blindly trust it with information this important.

So, we rebuilt our process. Multiple times.

We still use AI to dig through the data. It’s good at pattern recognition and surfacing themes. But the verification steps required to ensure accuracy before it reaches your hands are far more robust than we initially anticipated.

Manual validation checks. Advanced prompting. Cross-referencing against raw data. It’s not “instant insights” anymore. But now we trust the insights. (And, of course, the raw answers are still provided.)

Takeaway for DIY surveys: If you’re using AI to analyze your survey results, use extreme caution. Question everything. Verify against the raw data. Don’t assume the AI is being honest.
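To make that concrete, here’s a minimal sketch of the simplest check in that family: confirming that every comment the AI “quotes” actually appears in your raw export. (This is illustrative Python, not our actual pipeline; the names and example comments are made up.)

```python
def unverified_quotes(ai_quotes, raw_comments):
    """Return every AI-attributed quote with no exact match in the raw
    export. Anything returned is a candidate hallucination to check by hand."""
    def normalize(text):
        return " ".join(text.lower().split())  # ignore case and extra whitespace
    raw = {normalize(comment) for comment in raw_comments}
    return [quote for quote in ai_quotes if normalize(quote) not in raw]

# Made-up example: the second "quote" was never actually written.
print(unverified_quotes(
    ["The teachers truly know my child.", "Tuition is a bargain!"],
    ["The teachers truly know my child.", "Communication could be faster."],
))  # -> ['Tuition is a bargain!']
```

Exact matching is deliberately strict here. A paraphrased quote will also get flagged, which is the point: anything the AI didn’t copy verbatim gets a human look.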

3. Question order matters.

We organize our survey questions into specific domains: academics, Christian formation, communication, culture, ease/convenience, and extracurriculars.

From the beginning, we randomized questions within each domain. If you were answering the “academics” section, you might see questions in a different order than another parent.

But we recently took it further. We started randomizing the domains themselves.

Why?

Parents get mentally tired. Their responses become less thoughtful as they progress through the survey. If “Culture” questions always appear last, they may get lower scores than if they appear first, not because culture is actually worse, but because respondents are fatigued.

There are also priming effects, where early questions set a mental frame that influences later answers, along with other psychological biases we could discuss at length.

Takeaway for DIY surveys: Randomize question order. Don’t let “tired parent syndrome” skew your most important data. If you break questions into domains as we do, you may need to pay for survey software (Google Forms doesn’t offer this flexibility), and it will take a little extra work to reassemble the randomized responses into a format that makes scoring easy.
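If you’re curious what the two-level shuffle looks like, here’s a minimal sketch in Python, assuming a simple in-memory question bank. The domain names follow our survey; the questions themselves are placeholders, and paid platforms typically offer this as a built-in setting rather than something you code yourself.

```python
import random

# Hypothetical question bank keyed by domain; the questions are placeholders.
QUESTION_BANK = {
    "Academics": ["Rigor of the curriculum", "Quality of instruction"],
    "Christian formation": ["Students learn to live out their faith"],
    "Communication": ["Timeliness of school communication"],
    "Culture": ["Warmth of the school community"],
    "Ease/convenience": ["Simplicity of drop-off and pick-up"],
    "Extracurriculars": ["Availability and quality of athletic programs"],
}

def build_survey(bank):
    """Shuffle the domains, then the questions within each domain, so no
    block is systematically stuck at the fatigue-prone end of the survey."""
    domains = list(bank)
    random.shuffle(domains)          # randomize the domains themselves
    ordered = []
    for domain in domains:
        questions = bank[domain][:]  # copy so the bank isn't mutated
        random.shuffle(questions)    # randomize within the domain
        ordered.extend((domain, question) for question in questions)
    return ordered
```

Each parent gets a freshly shuffled ordering, so fatigue is spread evenly across domains instead of always landing on whichever block comes last.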

4. Unhappy parents hold back.

Here’s an interesting insight:

The average Net Promoter Score (a measure of how likely parents are to recommend your school) across all our survey respondents was 60.

The average among parents who selected “Prefer not to answer” when asked about their education level? 28.

That’s not a small gap.
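A quick aside on mechanics, in case you haven’t computed NPS before: parents answer the standard 0–10 “how likely are you to recommend us?” question, and the score is the percentage of promoters (9s and 10s) minus the percentage of detractors (0 through 6). A minimal sketch with made-up ratings:

```python
def nps(ratings):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    using the standard 0-10 recommendation scale."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Made-up ratings: 7 promoters, 2 passives, 1 detractor out of 10 parents.
print(nps([10, 10, 10, 9, 9, 9, 9, 8, 7, 4]))  # -> 60
```

So a 60 means promoters outnumber detractors by 60 percentage points. A 28 is a much thinner margin.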

What does this mean?

We could interpret that disparity in a few ways, but here’s one that makes sense on a gut level:

In many cases, the most unhappy parents are the most concerned about anonymity. They worry their feedback will be traced back to them, so they either withhold identifying details (such as their name or demographics) or withhold their honest opinions.

Since the primary goal of a survey is to gather honest feedback, we do our best to help parents feel comfortable providing it.

Now, I understand why faceless feedback is controversial, but we’ll save the Matthew 18 discussion for another time. For now, it’s worth acknowledging what is, not what should be.

Takeaway: Trust is everything. Some parents will withhold honest feedback unless they believe it’s truly anonymous. And paradoxically, the most valuable feedback often comes from the most cautious respondents.

Skip the learning curve.

If you’re thinking about running a parent survey this year, I hope this gave you a realistic picture of what’s involved.

Done well, a parent satisfaction survey is one of the most useful strategic tools a school leader can have. Done poorly, it gives you data you can’t trust and decisions you shouldn’t make.

Everything in this article came from experience. We made these mistakes and came out the other side with something better.

If you want to run your own survey, the takeaways above are a genuine starting point. Use them.

But maybe you want analysis that goes deeper than a spreadsheet of averages: satisfaction scores broken down by domain, NPS with meaningful context, gap analysis that shows where expectations and reality are furthest apart, and comment themes. That’s what we do.
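If you’d rather try that last piece yourself first, here’s an illustrative sketch of one common form of gap analysis, assuming you collect paired importance (“how much does this matter?”) and satisfaction (“how are we doing?”) averages per domain. The numbers below are made up.

```python
def gap_analysis(importance, satisfaction):
    """Rank domains by importance minus satisfaction, largest gap first
    (assumes matching 1-5 averages for every domain)."""
    return sorted(
        ((domain, round(importance[domain] - satisfaction[domain], 2))
         for domain in importance),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Made-up averages: academics matters most and underdelivers most.
print(gap_analysis({"Academics": 4.8, "Culture": 4.2},
                   {"Academics": 3.9, "Culture": 4.1}))
# -> [('Academics', 0.9), ('Culture', 0.1)]
```

The biggest gaps, not the lowest raw scores, are usually where attention pays off first.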

If you want to talk through what this would look like for your school, I’d be glad to have that conversation.
