Has Google just killed page rank recovery?

When it comes to SEO 101 there are few better housekeeping practices than page rank recovery. This is the process of identifying external links pointing to pages on a domain that either error (500) or don't exist (404) and then recovering their value.

The value of these broken links can be recovered in one of two ways:
  1. By forwarding the erring pages to a similar page of content with a permanent 301 redirect (link redirection)
  2. By asking the owner of the site linking to you to change the link they have to a more appropriate page (link repatriation)
In terms of linking efficiency, link repatriation is generally considered the best strategy for page rank recovery, as the complete value of the link is passed on to the new page, but it can be time-consuming to manage large numbers of broken links this way. 301 redirects are much easier to implement, but the value they eventually pass to the new page is generally considered to be reduced compared with a direct link.
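As an illustration, on an Apache server a 301 redirect can be set up with a single line of .htaccess config (the paths here are hypothetical, not from any real site):

```apache
# Permanently redirect a dead page to its closest replacement
# (paths are illustrative only)
Redirect 301 /old-article.html /new-article.html
```

Link repatriation, by contrast, needs no config at all on your side; it just needs the webmaster linking to you to edit their page.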

Whatever you decide to do, choose your page rank recovery strategy with care so you get the best result for the least amount of work. The best option is to save the manual work for the most valuable broken links pointing to a domain.

The bad news... Google have just made page rank recovery much, much harder.

It used to be that you could export crawl errors from Google Webmaster Tools and analyse these to get external links pointing to 404 or 500 pages on a domain. With this list it would be easy to either redirect or repoint these links to the new page on a site.

At some point during September 2010 Google seems to have turned off or heavily filtered the reporting of these errors in Webmaster Tools.

After a shout-out to the SEO twitterati and looking at a couple of the responses I got, this seems to be a global change... and it is a massive blow to page rank recovery.

So how do I recover page rank now?

After a tweet from a fellow SEO it became obvious that there are some paid page rank recovery tools that will do something similar, but most are based on less rich link data than the information Google provided for free.

But some page rank recovery is better than none right?

So after a bit of thought I came up with a process using our old friend Xenu and a custom link .txt file to check for broken external links:
  1. Using a pro SEOmoz account and Open Site Explorer I was able to download a list of external links pointing to my domain
  2. I exported this list to Excel and filtered out any duplicate URLs
  3. I saved this filtered list of URLs as a tab-separated text file
  4. I started a new project in Xenu and selected the .txt file I just created using the "browse" file option
  5. I made sure I changed the preferences so that Xenu was only checking through the first tree level of the links in the text file
  6. Xenu did its stuff and reported on the broken links it found. Now I had a list of erring pages on my domain that were linked to from external sources.
  7. I imported the broken links into the same workbook as the original OSE file and performed a simple Excel VLOOKUP on the erring URLs to see where these were being linked from.
  8. Now to recover that page rank!
Although this process is sufficient to enable the fixing of some broken links, it is based on incomplete link data and is therefore not as good as using the Google data we used to have. Let's hope someone at Google realises their mistake and reinstates the old data reports!
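For anyone who prefers scripting to spreadsheets, the dedupe, status-check, and VLOOKUP steps of the process above can be approximated in a short Python sketch. The URLs and function names here are illustrative only: urllib stands in for Xenu's link checker, and a dict lookup stands in for the Excel VLOOKUP back to the referring pages.

```python
# Sketch of the dedupe -> status-check -> lookup steps in Python.
import urllib.request
import urllib.error

def check_status(url, timeout=10):
    """Return the HTTP status code for a URL (0 on network failure)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
    except (urllib.error.URLError, OSError):
        return 0

def find_broken_targets(links, status_fn=check_status):
    """links: (source_url, target_url) pairs from a link export.

    Returns {broken_target: [source_urls]} for targets answering
    404, a 5xx error, or nothing at all.
    """
    # Group sources by target; this also de-duplicates the checks,
    # like filtering duplicate URLs out of the Excel export
    targets = {}
    for source, target in links:
        targets.setdefault(target, []).append(source)
    # Check each unique target once and keep only the broken ones,
    # along with the pages that link to them (the VLOOKUP step)
    broken = {}
    for target, sources in targets.items():
        code = status_fn(target)
        if code == 404 or code >= 500 or code == 0:
            broken[target] = sources
    return broken
```

The output maps each broken page on your domain to the external pages linking to it, which is exactly the list you need for either redirection or repatriation.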
 

tootricky's blog 2010