Guide

My ChatGPT citations aren't moving. Why?

20 April 2026. 5-minute read.

You did the work. Rewrote the homepage. Added schema. Validated it in the Rich Results Test. Six weeks later, ChatGPT still names a competitor. Here are the six causes I see on client sites, in order of how often they turn out to be the problem. Work down the list.

Cause one: the model has not refreshed yet

Most common. ChatGPT, Perplexity, Claude, and Gemini do not crawl your site in real time for every answer. They blend training data, a retrieval index, and live browsing. The retrieval index refreshes on each model's own schedule, anywhere from weekly to quarterly.

Fix: wait. If your schema and copy are clean and less than 6 weeks old, the fastest thing you can do is nothing. Keep publishing, keep gathering reviews, and re-check at the 8-week mark. About half of "citations not moving" tickets I get fix themselves in this window.

Cause two: schema is present but invalid

Second most common. You pasted the JSON-LD. It looks right. But the Rich Results Test shows a warning you scrolled past, or one required field is missing, or you have two conflicting schema blocks on the same page.

Fix: run every page through the Google Rich Results Test and through Schema.org's validator. Fix every warning, not just the errors. On LocalBusiness, the common misses are absent geo coordinates and opening hours that are not in ISO-8601 format. On FAQPage, it is questions that do not match the actual page content.
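For reference, here is a minimal sketch of a LocalBusiness block with both of those fields present. The name, phone number, address, and coordinates below are placeholders, not values from the pack, so swap in your own before publishing:

  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Ltd",
    "telephone": "+44 117 496 0000",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "1 Example Street",
      "addressLocality": "Bristol",
      "postalCode": "BS1 1AA",
      "addressCountry": "GB"
    },
    "geo": {
      "@type": "GeoCoordinates",
      "latitude": 51.4545,
      "longitude": -2.5879
    },
    "openingHoursSpecification": [{
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
      "opens": "08:00",
      "closes": "17:00"
    }]
  }

The whole block goes inside a single script tag of type application/ld+json in the page head. Re-run the validator after pasting.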

If schema is new territory, the free schema pack has validated blocks for five trades. Paste and go.

Cause three: homepage copy is still vague

You rewrote the first paragraph. You think it is clear. Read it to a friend who does not work in your industry. If they cannot tell you what you sell, where, and who for in one go, the model cannot either.

A common failure mode: rewriting the homepage to say "we deliver outcomes for our clients across the UK" and calling it done. Models cannot cite that. They need "we fix boilers in Bristol within 4 hours, 24/7, 98 reviews at 4.9 stars".

Fix: redo the first 40 words with who, what, where, proof. Boring is better than clever. Models quote boring prose because it is unambiguous.

Cause four: Google Business Profile disconnected or inconsistent

This one bites local businesses hard. ChatGPT and Perplexity cross-reference Google Business Profile data when a user asks a local query. If your GBP lists a different phone number, different address format, different opening hours, or is missing categories, the model sees conflicting signals and often picks a cleaner competitor.

Fix: open your GBP and your website side by side. Check:

  1. Business name, spelled the same way in both places.
  2. Phone number.
  3. Address, written in the same format.
  4. Opening hours.
  5. Categories, set and matching what the site says you do.

Fix any mismatch. Wait 2 to 3 weeks. Re-check citations.

Cause five: review count below the threshold

AI models weight social proof. In our data, small businesses with fewer than 15 Google reviews rarely get cited for competitive local terms. Under 8 reviews, almost never. The models are not reading your reviews in detail; they are using the count and rating as a confidence signal.

Fix: systematic review collection. Every customer, every job, ask. SMS works better than email. Aim for 50 reviews in 90 days. A Belfast plumber I worked with went from 11 to 64 reviews in four months by adding a "please review us" SMS 24 hours after each job. His ChatGPT citation rate on local terms went from 0 to 40 percent in that window.

Do not buy reviews. Do not generate them. The FTC rule in effect since October 2024 carries civil penalties of up to $51,744 per violation for fake reviews, and models cross-check against Google anyway.

Cause six: robots.txt blocking AI crawlers

Less common, more painful when it hits. Your robots.txt file disallows GPTBot, OAI-SearchBot, PerplexityBot, ClaudeBot, or Google-Extended. The models literally cannot read your pages in real time, so schema changes never reach them through the live path.

Fix: open yoursite.com/robots.txt. Look for lines like:
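  User-agent: GPTBot
  Disallow: /

  User-agent: PerplexityBot
  Disallow: /

  # the same two-line pattern may appear for OAI-SearchBot, ClaudeBot, or Google-Extended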

If any of those exist and you want to be cited, remove them or change Disallow: / to Allow: /. Some hosts (Squarespace, certain WordPress privacy plugins) add these by default. Check after every platform update.

There is a legitimate reason to block AI crawlers if you do not want your content used for training. Just know that blocking them is a choice to be uncitable.

How to work the list

Go in order. Most tickets I see are cause one or two. Do not skip ahead to review systems if you have not validated your schema. Each cause takes a different amount of effort:

  1. Model refresh. Effort: zero, wait 8 weeks.
  2. Schema invalid. Effort: 30 minutes with a validator.
  3. Vague homepage. Effort: 1 hour rewriting.
  4. GBP mismatch. Effort: 30 minutes reconciling.
  5. Review count. Effort: ongoing, 90-day flywheel.
  6. Robots.txt. Effort: 5 minutes.

If you have worked all six and citations still have not moved after 12 weeks, the issue is usually a thin off-site citation graph. That is a longer conversation involving directory listings, niche press coverage, and industry mentions. Happy to talk through it.

Where to go next

If you want the exact schema blocks, the free snippet pack has them filled in for five trades. If you want a priority list for your own site, the audit gives you one with the copy and schema written.

Happy to answer anything, just reply.

Bob

Stop guessing whether you're being cited.

60-second free check. Type your URL, get a report on which AI models mention you today.

Run the free check