At a time of growing uncertainty around yield, many schools are going to extraordinary lengths to get a better read on their incoming classes. Far fewer have taken advantage of a simple, powerful fact: if you ask students the right way, they'll simply tell you what you want to know.
Students will share valuable information when asked
It’s something we’ve quantified through research and testing with hundreds of thousands of college applicants over the years.
What we’ve found, time and again, is that students will readily answer questions, if only you ask. What’s more, their responses are accurate predictors of their future behavior.
I’m sure you can imagine all sorts of potential uses this insight might have in an enrollment context. We certainly did, and we researched a dozen or so before settling on a handful that were especially promising.
Student surveying for yield analytics
Our work with admissions teams typically includes polling admitted students regarding their enrollment intentions. It really is as simple as asking, “Do you intend to enroll?”
We’ve found that their responses are strong indicators of what they’ll actually end up doing. Students answering “yes” enroll at a rate of 70%; those answering “no” enroll at a rate of 0.4%; and those answering “probably” or “maybe” enroll at rates of 54% and 15%, respectively.
The benefit of this information is twofold.
First, it's a powerful input for your yield prediction models—something that’s especially critical now that FAFSA position data is no longer available.
Second, it helps you triage follow-up efforts with admitted students, channeling your staff’s time to where it will have the greatest impact. Focusing selectively on the “maybe” and “probably” respondents, the students you’re most likely to be able to influence, can cut your follow-up workload significantly.
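To make the first point concrete, the response-level rates quoted above can be folded into a simple expected-enrollment estimate. Here's a minimal sketch in Python; the rates are those cited earlier, while the response counts are hypothetical and purely illustrative:

```python
# Enrollment rates by survey response, as quoted above.
RATES = {"yes": 0.70, "probably": 0.54, "maybe": 0.15, "no": 0.004}

def expected_enrollees(responses):
    """Expected enrollment given a dict of {survey answer: count}."""
    return sum(RATES[answer] * count for answer, count in responses.items())

# Hypothetical admitted-student responses (illustrative only).
cohort = {"yes": 400, "probably": 250, "maybe": 300, "no": 150}

print(round(expected_enrollees(cohort)))  # → 461
```

A real yield model would of course combine this signal with other inputs, but even this back-of-the-envelope version shows how directly survey answers translate into a headcount projection.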
Student surveying for competitive intelligence
That’s not to say you should simply ignore the “no” group. As students who considered you but ultimately chose elsewhere, they have valuable insight to offer into your competitive positioning.
And this is another major way that we use student surveying with our partner institutions.
All students declining admission are invited to participate in an online poll that asks, among other things, which school they ended up choosing. The insights that emerge can be transformative. We’ve found, for example, that many schools overestimate the share they’re losing to schools they think are their main competitors and, conversely, that many of their true competitors are not even on their radar as such.
On doing it properly
While the concept here is simple (just ask), execution is not. If you’re going to realize the considerable benefits of student surveying, you need to make sure you’re doing it properly, chiefly because its effectiveness depends on the quantity of responses you get. That dependence holds for two reasons.
The first has to do with statistical accuracy. The more responses you get, the more certain you can be of the conclusions you’re reaching based on them.
The second has to do with leverage across your full group of admitted students. The more responses you get, the more students you’ll be able to assess for triage purposes and the less effort you’ll waste on inconsequential follow-up.
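The statistical-accuracy point can be made concrete with the standard margin-of-error formula for a proportion, which shrinks with the square root of the response count. A minimal sketch in Python (the 54% figure echoes the “probably” rate above; the response counts are hypothetical):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for an observed proportion p from n responses."""
    return z * math.sqrt(p * (1 - p) / n)

# How precise is a 54% observed enrollment rate at different response counts?
# (counts are illustrative only)
for n in (50, 200, 800):
    print(n, round(margin_of_error(0.54, n), 3))
```

Quadrupling the number of responses halves the uncertainty, which is why driving participation up matters so much.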
Thankfully, response rate is something you can impact. Over the years, through careful research, testing, and analysis, we’ve managed to get our survey participation up to between 60% and 70%, more than enough to make the effort worthwhile. Keep in mind that this is the result of our having refined every conceivable driver of student response: media channel, message content and tone, and outreach timing and cadence, to name just a few.
Note as well that unlocking the value of the data you collect will depend on additional capabilities and assets. Benchmarking, for example. We’ve found that a lot of critical insights emerge only after we’ve compared a partner institution’s results with aggregate numbers we’ve generated from our full cohort of schools. The related data processing and analytical work are additional areas in which we’ve seen some institutions fall down when they try to go it alone. Raw data is one thing. Insight is very much another.
Unless you’re one of the lucky few who have developed capabilities like this in-house, you’ll want to make sure you’re bringing on expert help.