Do you need captchas to prevent content scraping, but aren't sure whether they affect SEO? Here is the expert answer.
You might be using captchas to gate content for any number of reasons. While a captcha protects your content, you may be missing out on SEO rankings.
Based on experience and evidence from official sources: hiding content behind captchas is NOT GOOD from an SEO perspective. Here is why.
To be clear, your website is not penalized just for having captcha-gated content. However, it won't rank easily, because search engine algorithms can't access the content to analyze and rank it.
Wait. This doesn't mean you should stop using captchas to prevent scraping. You just need a workaround for Googlebot (if you really care about rankings).
Workaround: serve a different version to Googlebot than to users. This is technically close to cloaking under Google's policies, but if you do it right, it is allowed. John Mueller confirmed this in a Google SEO office-hours video (official resource):
"From a policy point of view, we're okay with situations where you serve us the full content, and you require a captcha on the user side. If you need to do that slightly differently for Googlebot, or maybe other search engines, than you would for the average user, from our point of view that's fine."
John Mueller, Google Office Hours (YouTube)
So, on your website's end: if the visitor is Googlebot, don't serve the captcha at all. If it's any other bot or a human user, present the captcha first.
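The logic above can be sketched as a simple server-side check. This is a minimal illustration, not a production implementation: the function name and bot list are hypothetical, and since the User-Agent header can be spoofed, a real deployment should additionally verify Googlebot via reverse DNS lookup as Google recommends.

```python
# Minimal sketch: decide whether to show the captcha based on the
# request's User-Agent. Caution: user agents can be spoofed, so
# production code should also verify the crawler via reverse DNS.

# Known crawler tokens to exempt from the captcha (extend as needed).
KNOWN_SEARCH_BOTS = ("googlebot",)

def should_serve_captcha(user_agent: str) -> bool:
    """Return True when the visitor should see the captcha first."""
    ua = (user_agent or "").lower()
    # Skip the captcha only for recognized search engine crawlers;
    # every other bot or human user gets the captcha.
    return not any(bot in ua for bot in KNOWN_SEARCH_BOTS)
```

For example, a request with the official Googlebot user-agent string would bypass the captcha, while an ordinary browser user-agent would trigger it.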
A similar case is rendering content via an edge worker, which has its own SEO implications.