Not too long ago we gave our Playground page a brand-new responsive polish; it was one of the last pages on our site to go responsive. But when we finally released the update, something horrible happened: our search ranking evaporated until nothing was left.
When launching a new product or a redesign, we usually mind our Ps and Qs. We have a laundry list of things that we check, double-check and, sometimes, triple-check. Since Playground gets a ton of search traffic, we triple-checked all the SEO things we knew, such as:
- Titles
- Keywords
- URLs and redirects
We even fixed a number of things that were broken on the previous site. But that due diligence didn't seem to be enough. Week after week, we watched in horror as our search ranking dropped and dropped and dropped to nothing. We didn't even rank for some of our better-known Playground pieces, such as Orbit and Reveal. What the heck was going on? Where had we gone wrong? Was there something we missed? We were scratching our heads.
Whittling Down the Problem
First, we combed Google Webmaster Tools, which showed no errors. Bing and Yahoo still worked as expected. The problem was only in Google (read: the only one we care about). To figure out what happened, we went through all of the things that we could've messed up:
- Duplicate URLs
- Broken URLs
- Bad redirects
- Cloaking
- Content being loaded via AJAX
- WWW vs. no subdomain
- Page load time
- Invisible text
- Service disruptions
- Bad robots file
- All of this stuff here from Google
We checked all of these things. And like we said earlier, we had even fixed most of them with the redesign. We even reached out to Google to figure out what happened, and they finally gave us a short response explaining the problem:
Hello Matt,
Thank you for your email.
It appears that you're serving Googlebot a different content type than you're serving to regular users. This is preventing Google from correctly interpreting your site. If you use the Fetch as Google tool in Webmaster Tools, you'll notice that Googlebot sees pages like https://zurb.com/playground/ajax-upload with "Content-Type: application/zip; charset=utf-8". Googlebot is unable to interpret the content type of your page, which is preventing your page from being crawled properly.
What does that mean? The root cause was that our server was sending a content-type that Google didn't like, so Google stopped indexing our pages. Browsers were either not getting this content-type or simply ignoring it and working out the right one on their own. If you knew exactly what to look for in Webmaster Tools, you could ferret out that Google was getting this content-type, but no error was reported anywhere. In fact, the fetch was marked a "success".
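You don't need Webmaster Tools to ferret this out, either. Here's a rough sketch of the kind of check we could have run: request the page with Googlebot's user agent and with a regular browser's, and compare the Content-Type header that comes back. It's plain Python standard library, and the user-agent strings are approximations, not anything prescribed by Google:

```python
# Rough sketch: compare the Content-Type header a page serves to Googlebot
# versus a regular browser. Python 3 standard library only.
from urllib.request import Request, urlopen

URL = "https://zurb.com/playground/ajax-upload"  # the page Google called out

USER_AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9) AppleWebKit/537.36",
}

for name, ua in USER_AGENTS.items():
    req = Request(URL, headers={"User-Agent": ua})
    with urlopen(req) as resp:
        # A page meant to be indexed should report text/html here,
        # not something like application/zip.
        print(f"{name:10s} -> {resp.headers.get('Content-Type')}")
```

If a page you expect to rank comes back as anything other than an HTML content type, you've found the same class of problem we had.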
Eventually, we fixed the problem by setting the correct content-type in our back-end code, and everything was back to normal within a week. We also asked Google how to get the content reindexed and why Webmaster Tools hadn't shown this as a crawl error, and got another curt response:
Hello Matt,
It might take some time for Google to start reindexing URLs on your site depending on a number of factors like how often Google is able to crawl your site. These pages most likely did not show in Webmaster Tools as crawl errors because the pages were live and able to be crawled returning an HTTP 200. However, when Google tried interpreting the data on the page, it was unable to convert the file type and may have chosen not to index a page that did not have any content on it.
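To make the fix itself concrete: it boiled down to the response explicitly declaring the right content type instead of letting a misconfigured default leak through. Here's a minimal sketch of the idea as a bare-bones Python WSGI app, which is not our actual back-end stack:

```python
# Minimal sketch of the fix: explicitly set the correct Content-Type
# on the response. Plain WSGI with Python's built-in dev server;
# our real back end differs, but the header is the point.
from wsgiref.simple_server import make_server

def app(environ, start_response):
    body = b"<!doctype html><title>Playground</title><p>Hello.</p>"
    headers = [
        # The root cause: pages were going out as application/zip.
        # Declaring text/html lets crawlers interpret the page correctly.
        ("Content-Type", "text/html; charset=utf-8"),
        ("Content-Length", str(len(body))),
    ]
    start_response("200 OK", headers)
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

Note how this also explains Google's answer above: the page still returned a 200 and "crawled" successfully, so no error showed up; it was only the declared type that made the content uninterpretable.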
A Cautionary Tale
But this isn't about the solution or the resolution. This is a warning. You can't just launch something and then stop paying attention to your traffic. You might find out something horrible went wrong long after it happened.