My experience with Google’s Supplemental Index

Here’s my recent experience with the Google Supplemental Index: a relatively new site I had launched was receiving some traffic from Google. All of a sudden, the traffic stopped. I ran a site: command and found that my site was still in the index, but there were now two copies of my site’s homepage: a version with a query string appended, which was in the main index, and the plain homepage URL, which was in the supplemental index.
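For anyone who hasn’t used it, the site: operator restricts a Google search to pages from a single domain, which makes duplicate copies of a page easy to spot (example.com below is a placeholder, not my actual site):

```
site:example.com
```

Supplemental results used to be labeled as such in the listing, so a quick scan of the output shows which copy of a page landed where.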

Clearly, my homepage got placed in the supplemental index because the two URLs serve quite similar content (not identical, but evidently close enough for Google to treat them as duplicates), and Google just happened to keep the one with the query string in the main index. Of course, all the backlinks point to the plain URL, so it ranked poorly (it had been in the top 5; now it was in the 200s), which is why I was no longer receiving traffic from Google.

Once I found out what the problem was, the solution was straightforward: modify robots.txt. I changed robots.txt so that Googlebot could not crawl the query-string version of the homepage. This had the desired effect: 2-3 days later, the plain URL was back in the main index with its original ranking, and the site started receiving traffic from Google again.
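A minimal robots.txt along these lines might look like the following; it’s a sketch rather than my exact file, and it assumes the duplicate was simply the homepage with a query string appended:

```
User-agent: Googlebot
Disallow: /*?
```

Googlebot honors the * wildcard (a non-standard extension to the robots.txt convention), so this rule blocks any URL containing a question mark while leaving the plain URLs crawlable. Note that it applies site-wide, so if other query-string URLs on the site should stay indexed, a more specific Disallow pattern is the safer choice.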