Students Asked AI Where to Go to School. It Didn’t Say You.
Almost a year ago, our agency partner told us something that changed how we think about marketing at Forsyth Tech. They said AI engines couldn’t find us. It wasn’t a rankings problem or a content quality problem. The systems increasingly deciding which institutions students discover couldn’t read, extract or cite what was on our website. We’d spent years perfecting our brand voice, and the robots didn’t even know we existed. Humbling, to say the least.
We listened. We restructured our program pages, implemented structured data markup, opened our site to AI crawlers and started building a content refresh cadence. We started measuring AI referral traffic separately in GA4. We did all of this inside our marketing and communications team with our agency partner, without a committee and without waiting for a mandate. I’m telling you this not because we have it figured out (we’re still learning) but because the data has gotten harder to ignore.
The Numbers Got Worse While We Were Building
EAB surveyed more than 5,000 high school students in late 2025 and published the results in February 2026. Forty-six percent are now using AI tools in their college search, up from 26 percent just months earlier. Eighteen percent removed a college from consideration based on what AI told them: an AI-generated answer the institution had no hand in shaping. Carnegie’s research tracks the same curve, with 4 percent of seniors using AI in their college search in 2023, 10 percent in 2024 and 23 percent in 2025. Between 58 and 69 percent of Google searches now end without a click, according to Semrush and Similarweb analyses. When AI Overviews appear, that jumps to 83 percent.
The students who do arrive through AI referrals convert at more than four times the rate of standard organic visitors—higher quality but far less volume. And if your institution isn’t part of the AI-generated answer, you’re not part of the consideration set at all.
What to Build This Month
If your team hasn’t started, the good news is that the highest-impact steps are also the simplest. None of this requires a new vendor or a new budget line. Promise.
- Check your robots.txt file. GPTBot, ChatGPT-User, ClaudeBot, PerplexityBot: each is a distinct crawler, and each honors your robots.txt rules separately. Many institutions block them by default because a CMS vendor’s configuration excludes them and nobody checked. Your robots.txt file lives at yoursite.edu/robots.txt. Pull it up in a browser right now. If you see “Disallow” under any of those bot names, your institution is invisible to AI search regardless of how good your content is. That’s a five-minute fix.
- Run the 30-minute (or less) audit. Open ChatGPT, Perplexity and Google’s AI Mode. Ask the questions your students actually ask: “Best [program] near [city].” “How much does [program] cost at [institution]?” “Compare [your school] to [competitor].” Collect the results for your top ten programs. This is the fastest way to see exactly how AI represents you today.
- Restructure your program pages for extraction. AI engines pull individual passages, not entire pages. They want a few clear sentences that answer a specific question without needing the rest of the page for context. Lead with cost, timeline, credential and career outcomes in the first two sentences. Add FAQ sections with question-format headings. Keep the storytelling but move it below the extractable facts.
- Implement structured data. EducationalOccupationalProgram schema tells AI engines exactly what your program is, what it costs, how long it takes, what format it’s in and what credential it awards. It’s a small block of code that lives on your program page, invisible to students but readable by every AI crawler. Your web team or CMS vendor can implement it; you just need to tell them what fields to populate. AI responses cite sites with structured data markup nearly three times as often as sites without it.
- Build a freshness cadence. AirOps research found that pages not updated quarterly are three times likelier to lose AI citations. Add visible “Last Updated” timestamps. Set a quarterly review cycle for your top program pages. This is maintenance work, not glamorous. Nobody’s winning awards for updating timestamps. Trust me, I’ve checked.
- Track AI traffic in GA4. Create a custom channel group with regex matching for chatgpt.com, perplexity.ai, claude.ai, copilot.microsoft.com and gemini.google.com. Google’s own documentation even includes an AI chatbot channel example now. Without this, AI referral traffic hides inside “Referral” and “Direct,” and you can’t measure the problem or make the case for resources.
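The robots.txt check in the first step doesn’t have to be eyeballed; Python’s standard library can parse the file and tell you exactly which crawlers are blocked. A minimal sketch, assuming a hypothetical robots.txt (substitute the text served at yoursite.edu/robots.txt) and a hypothetical example.edu page URL:

```python
import urllib.robotparser

# Hypothetical robots.txt content -- replace with the text served at
# yoursite.edu/robots.txt. This example blocks GPTBot but allows the rest.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

AI_CRAWLERS = ["GPTBot", "ChatGPT-User", "ClaudeBot", "PerplexityBot"]

def check_ai_access(robots_txt, url="https://example.edu/programs/"):
    """Map each AI crawler name to True (allowed) or False (blocked)."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, url) for bot in AI_CRAWLERS}

if __name__ == "__main__":
    for bot, allowed in check_ai_access(ROBOTS_TXT).items():
        print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

Run it against your live file and any “BLOCKED” line is the five-minute fix described above.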
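To make the structured data step concrete, here is a sketch of the JSON-LD block an EducationalOccupationalProgram page might carry. The program name, cost, duration and provider below are hypothetical placeholders; populate them from your own catalog, and have your web team wrap the output in a `<script type="application/ld+json">` tag:

```python
import json

def program_schema(name, description, cost_usd, duration_iso8601,
                   credential, mode, provider):
    """Build a minimal schema.org EducationalOccupationalProgram object."""
    return {
        "@context": "https://schema.org",
        "@type": "EducationalOccupationalProgram",
        "name": name,
        "description": description,
        "timeToComplete": duration_iso8601,    # ISO 8601, e.g. "P2Y" = 2 years
        "educationalProgramMode": mode,        # e.g. "hybrid", "online"
        "educationalCredentialAwarded": credential,
        "provider": {"@type": "CollegeOrUniversity", "name": provider},
        "offers": {
            "@type": "Offer",
            "category": "Tuition",
            "price": cost_usd,
            "priceCurrency": "USD",
        },
    }

# Hypothetical program data for illustration only.
schema = program_schema(
    name="Associate in Applied Science, Cybersecurity",
    description="A two-year program preparing students for entry-level "
                "security roles.",
    cost_usd=4200,
    duration_iso8601="P2Y",
    credential="Associate in Applied Science",
    mode="hybrid",
    provider="Example Community College",
)

print(json.dumps(schema, indent=2))
```

The fields here answer exactly the questions students ask AI engines: what the program is, what it costs, how long it takes and what credential it awards.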
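For the GA4 step, the heart of the custom channel group is a single regex over the session source. A sketch of that regex, with a hypothetical `classify_source` helper so you can sanity-check it before pasting the pattern into a GA4 “source matches regex” condition (GA4’s exact field names may differ by interface version):

```python
import re

# Referrer domains for the major AI assistants listed above. The raw pattern
# string is what you would paste into a GA4 channel-group regex condition.
AI_SOURCE_REGEX = re.compile(
    r"(^|\.)(chatgpt\.com|perplexity\.ai|claude\.ai|"
    r"copilot\.microsoft\.com|gemini\.google\.com)$"
)

def classify_source(source):
    """Label a session source as 'AI' or 'Other' (hypothetical helper)."""
    return "AI" if AI_SOURCE_REGEX.search(source.lower()) else "Other"

if __name__ == "__main__":
    for s in ["chatgpt.com", "www.perplexity.ai", "google.com", "claude.ai"]:
        print(s, "->", classify_source(s))
```

Anchoring the pattern with `(^|\.)` and `$` keeps lookalike domains (say, notchatgpt.com) out of the AI channel while still catching subdomains.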
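Even the freshness cadence can be semi-automated. A small sketch, assuming a hypothetical page inventory exported from your CMS, that flags any top program page not refreshed within the 90-day cycle:

```python
from datetime import date, timedelta

# Hypothetical inventory: page path -> last-updated date from your CMS.
PAGES = {
    "/programs/nursing": date(2026, 1, 15),
    "/programs/welding": date(2025, 6, 2),
}

def stale_pages(pages, today, max_age_days=90):
    """Return paths whose last update predates the quarterly review cutoff."""
    cutoff = today - timedelta(days=max_age_days)
    return sorted(path for path, updated in pages.items() if updated < cutoff)

if __name__ == "__main__":
    print(stale_pages(PAGES, today=date(2026, 3, 1)))
```

Run it at the start of each quarter and the output is your review queue.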
I’ll be completely honest: when our partners first walked us through schema markup, my eyes glazed over. It sounded like an IT project, but it wasn’t. Our web team implemented it on our program pages very quickly. The technical barrier is lower than you think. After we implemented structured data in September, our AI referral traffic visibly increased within weeks. Over the past year, AI referrals to our site grew more than 800 percent. The numbers are still small in absolute terms, but the trajectory is unmistakable.
Who Gets Hurt First
Harvard and Stanford dominate AI citations because of massive domain authority and brand recognition baked into the training data. They don’t need to optimize. Must be nice, right? Community colleges, regional publics, HBCUs and small privates depend on discoverability. Our students search for programs and careers. They’re not typing our names into ChatGPT. If AI answers only surface the institutions with the biggest web footprint, the students who most need affordable pathways won’t find them.
This Work Is Already the Floor
I want to be honest about something. The work I just described is already becoming table stakes. The conversation in digital marketing has moved past answer engine optimization (AEO) and generative engine optimization (GEO) toward what practitioners are calling agent optimization. Companies are building autonomous agents designed to act on a student’s behalf. Search for programs. Compare costs. Check admission requirements. Even start an application.
OpenAI launched ads inside ChatGPT in February 2026. Perplexity launched product search with checkout. The Harvard Business Review’s March-April 2026 issue has a feature on preparing brands for agentic AI. This isn’t science fiction or a 2028 conversation. It’s happening now. But here’s why the practical steps above still matter. Agents will still need to read your content and parse your structured data before acting on a student’s behalf. The infrastructure you build today for AI search visibility is the same infrastructure that serves you when agents start making enrollment decisions for students. The institutions that get the basics right now won’t have to start over when the game changes again.
Forty-six percent of high school students are already using AI in their college search. Eighteen percent have already ruled out an institution based on what AI told them. This is an exciting problem for us to solve, and your marketing team can start solving it this week. You can even put “taught a robot to recommend our college” on your annual review. We’ve spent years proving our value to students, legislators and skeptical parents. Convincing an algorithm should be the easy part.