One of the biggest headaches for anyone who works in SEO is reversing an unexpected slide in your website's rankings.
You’ve finished work for the day and, having just checked the rankings, you relax for the night, happy in the knowledge that your website is doing well in the SERPs. Feeling good about yourself, you head off to the pub for a well-deserved drink with a smile and a spring in your step.
The following day you return to your office, switch on your computer and, while it’s booting up, make yourself a nice cup of tea. Settling into your seat, you open your browser to check the rankings for the site you’ve recently been working on and BAM! You go from hero to zero.
The rankings are heading south at such breakneck speed that even Usain Bolt would hang up his shoes and call it a day. It must be a slight blip in the SERPs, a fluctuation; nothing you have done could affect the rankings in such a way, so you decide to leave it a day or two and check again towards the end of the week. BAM, BAM and a Double Whopper with extra cheese meal please BAM: the primary keywords for which your website held a regular top spot on the first page have now gone the way of the Dodo. What do you do next?
The best advice is to stay calm, regain your composure and work through the quick checklist below.
SEO rankings quick checklist
View your site in a browser
An obvious point, but before you start looking for problems with the rankings, make sure you can view the website yourself in a browser. Your hosting server could be down for all kinds of reasons, and if that is the case, knowing so will stop you from making changes and undoing work that was fine to begin with.
Fetch as Googlebot
In Webmaster Tools, the Fetch as Googlebot tool lets you see a page as Googlebot sees it, which is particularly useful when troubleshooting a page’s poor performance in search results. Fetch as Googlebot tells WMT to send Googlebot to a page, process it and then display it exactly as Googlebot sees it. If you see nothing, or a forbidden message, it’s probably because your hosting provider has accidentally blocked Google’s IP addresses.
Robots.txt
This may or may not be present in the root directory of your website, but the purpose of the robots.txt file is to tell the search engine spiders which directories they must not crawl. Be careful when making any changes to the robots.txt file: a misplaced “/” could prevent the search engines from indexing your website altogether.
Note: Because the robots.txt file can be seen by anyone, do not use it as the only way of blocking access to private areas of your site. Disallowing a file or folder does not make it inaccessible to humans, only search engines.
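As a quick sanity check, Python’s standard library can parse a robots.txt file and report whether a given URL is crawlable. The robots.txt content and URLs below are made-up examples; substitute your own file and pages:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- paste in your own file's contents.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A stray "Disallow: /" here would make every URL come back False.
print(parser.can_fetch("Googlebot", "http://example.com/products/widget.html"))  # True
print(parser.can_fetch("Googlebot", "http://example.com/admin/login.html"))      # False
```

Running your important landing pages through a check like this quickly reveals whether a recent robots.txt edit is the culprit.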
Wrong canonical tag
The canonical tag tells search engines that the preferred location of this url is http://example.com/page.html instead of http://www.example.com/page.html?sid=kfjd7463744.
Use this tag with caution: if it is set up incorrectly, it can result in important pages being dropped from the search engines’ indexes completely.
For more information, see the SEOmoz article on catastrophic canonicalization.
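To audit canonical tags across your pages, a small parser built on Python’s standard library can pull out every rel="canonical" href so you can verify it points where you expect. The page snippet below is a made-up example:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every rel="canonical" link tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonicals.append(attrs.get("href"))

# Hypothetical page source -- in practice, feed in your own page's HTML.
page = '<head><link rel="canonical" href="http://example.com/page.html"></head>'
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonicals)  # ['http://example.com/page.html']
```

If a page lists a canonical URL other than itself (or more than one), that is worth investigating before anything else.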
Noindex robots meta tag
Check the robots meta tag in the head section of every page affected in the SERPs to see if its content contains the “noindex” directive; if so, change it to “index” (or remove the tag altogether, since indexing is the default).
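A similar standard-library sketch can flag pages whose robots meta tag carries a noindex directive (the HTML snippet below is hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Set .noindex if a robots meta tag contains a noindex directive."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            if "noindex" in attrs.get("content", "").lower():
                self.noindex = True

# Hypothetical page source -- this one would be dropped from the index.
page = '<head><meta name="robots" content="noindex, follow"></head>'
checker = RobotsMetaChecker()
checker.feed(page)
print(checker.noindex)  # True
```

A noindex directive left over from a staging or development deployment is a classic cause of a sudden, site-wide drop.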
This is a simple, quick checklist to work through if your rankings begin to slide. I will expand on it at a later date to cover additional considerations such as external backlinks, duplicate content, malware/hacking and search engine ranking algorithm changes, to name a few.
Please feel free to add any comments or thoughts you may have regarding ranking issues, and together we can compile a complete list of possible causes and solutions to share with everyone.