How to remove the "who to follow" suggestions on new twitter

By using the power of user styles functionality in your browser you can banish the annoying "who to follow" suggestions panel from new twitter forever. What you will need:
Install the above and you go from this:


to this:







Some poems I wrote when I was 21...

My parents recently dropped off a couple of my old portfolios from my time at Brighton Uni... hidden in the pockets of one of these I found two poems I had forgotten I had written. Not sure if they constitute "good content", or that I even remember being this person... but here they are for posterity:

Resolution


Concious of a failing love, I am on my own again.
Retrospective of the one above; to live I don't know how.

Forceful lies and weak excuses
help me avoid my deepest desires.
Seeing you; hearing you
reproduces the feelings I have tired to disacquire.

Covered by a layer of disillusion
I strain against the on-coming tide of depression.
Tones and pitch rein on the mind:
Dropping the mental barriers to see what is behind.

Unable to hold any longer, I weep myself to sleep.
Because today on my own
I can no longer cope
without hope of a lasting resolution.

The Sea


Sitting on the breakwater getting saturated with salt water.
I feel free
The horizon has no end but bends.
My resistance as I stand, is like one man against an army.
I must sit for fear of falling into the icy depths
It is cold and wet, but I can not feel it
The water itself is a blanket against such things
Aggressive waves rip-tide and drag floating debris to its cloudy deep:
The see has eternal tempo, unable to resist
No wave nor ripple from skipping stone will ever again be
This is the overwhelming power of the Luna sea.

How to optimise your post page titles on your blogspot blog

The typical page titles on a blogger post are of the form:

Blog title - Post title

This is because the default Blogger templates come with a tag inserted into the pages that
automatically puts the blog title before every post title in the title tags of the post pages.

Obviously it makes much more sense to put your post name at the beginning
of the title tag, not only for SEO but also for making your title more relevant when it appears in search engine results pages.

How to optimise the page titles of your post pages on your Blogger blog


  1. Go to Design in your Blogger admin
  2. Click on Edit HTML
  3. Download a backup of your blogger template (ALWAYS do this before mucking about with blogger templates)
  4. Look for this code in your blog template:
    <title><data:blog.pageTitle/></title>
  5. Replace this with:
    <b:if cond='data:blog.pageType != &quot;index&quot;'>
    <title><data:blog.pageName/> - <data:blog.title/></title>
    <b:else/>
    <title><data:blog.pageTitle/></title>
    </b:if>
What this code does is stipulate that anything but an index page should show the post title before the blog title. There are obviously tweaks that can be made to this simple bit of code to change things further, like removing the blog title from the page titles and replacing it with static text, as in this example:

<b:if cond='data:blog.pageType != &quot;index&quot;'>
<title><data:blog.pageName/> - Alternative Text</title>
<b:else/>
<title><data:blog.pageTitle/></title>
</b:if>


For further optimisation or tweaking I'd recommend reading the Blogger data tags help page.

Has Google just killed page rank recovery?

When it comes to SEO 101, there are few better housekeeping practices than page rank recovery. This is the process of identifying external links pointing to pages on a domain that either error (500) or don't exist (404), and then recovering their value.

The value of these broken links can be recovered in one of two ways:
  1. By forwarding the erring pages to a similar page of content with a permanent 301 redirect (link redirection)
  2. By asking the owner of the site linking to you to change the link they have to a more appropriate page (link repatriation)
In terms of linking efficiency, link repatriation is generally considered to be the best strategy for page rank recovery, as the complete value of the link is passed onto the new page, but it can be time consuming to manage large numbers of broken links this way. 301 redirects are much easier to implement, but the value they eventually pass to the new page is generally considered to be reduced compared to a direct link.

Whatever you decide to do, choose your page rank recovery strategy with care so you get the best result for the least amount of work; the best option is to save the manual work for the most valuable broken links pointing to a domain.

The bad news... Google have just made page rank recovery much much harder.

It used to be that you could export crawl errors from Google Webmaster Tools and analyse these to get external links pointing to 404 or 500 pages on a domain. With this list it would be easy to either redirect or repoint these links to the new page on a site.

At some point during September 2010 Google seems to have turned off or heavily filtered the reporting of these errors in Webmaster Tools.

After a shout-out to the SEO twitterati and looking at a couple of the responses I got, this seems to be a global change... and it is a massive blow to page rank recovery.

So how do I recover page rank now?

After a tweet from a fellow SEO it became obvious that there are some paid page rank recovery tools that will do something similar, but most are based on less rich link data compared to the information Google provided for free.

But some page rank recovery is better than none right?

So after a bit of thought I came up with a process using our old friend Xenu and a custom link .txt file to check for broken external links:
  1. Using a pro SEOmoz account and Open Site Explorer I was able to download a list of external links pointing to my domain
  2. I exported this list to Excel and filtered out any duplicate URLs
  3. I saved this filtered list of URLs as a tab-separated text file
  4. I started a new project in Xenu and selected the .txt file I just created using the "browse" file option
  5. I made sure I changed the preferences so that Xenu was only checking through the first tree level of the links in the text file
  6. Xenu did its stuff and reported on the broken links it found. Now I had a list of erroring URLs on my domain that were linked to from external sources.
  7. I imported the broken links into the same workbook as the original OSE file and performed a simple Excel VLOOKUP on the erroring URLs to see where these were being linked from.
  8. Now to recover that page rank!
Although this process is sufficient to enable the fixing of some broken links, it is based on incomplete link data and is therefore not as good as using the Google data we used to have. Let's hope someone at Google realises their mistake and reinstates the old data reports!
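For anyone who prefers a script to a spreadsheet, the de-dupe and lookup steps of the process can be sketched in a few lines of Python. This is my own illustration, not part of the original workflow: the rows below are stand-in data for an Open Site Explorer export (target URL on our domain, referring URL), and real exports carry more columns.

```python
# Stand-in data for an OSE export: (target URL on our domain, referring URL)
ose_rows = [
    ("http://www.testsite.co.uk/old-page", "http://blog-a.example/post-1"),
    ("http://www.testsite.co.uk/old-page", "http://blog-b.example/post-2"),
    ("http://www.testsite.co.uk/live-page", "http://blog-c.example/post-3"),
]

# Step 2: filter out duplicate target URLs
unique_targets = sorted({target for target, _ in ose_rows})

# Step 3: save the de-duplicated list as a text file for Xenu to check
with open("xenu_input.txt", "w") as f:
    f.write("\n".join(unique_targets))

# Step 7: given Xenu's list of broken URLs, look up who links to each one
# (the equivalent of the VLOOKUP back into the OSE workbook)
broken = {"http://www.testsite.co.uk/old-page"}  # stand-in for Xenu's report
referrers = {t: [ref for tgt, ref in ose_rows if tgt == t] for t in broken}
```

The output is exactly what you need for link repatriation: each broken URL mapped to the external pages still linking to it.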

Downloading any Google WMT report with dates in the UK format (dd/mm/yyyy)

Today @millerian tweeted about a problem I've also experienced in the past with downloads from Google WMT. The issue is that all dates in any report are always in the US format (mm/dd/yyyy), but Excel thinks the date is in UK format (dd/mm/yyyy), and fixing the problem can be a pain in the butt.

How I have fixed the problem in the past:

  • The first technique I suggested, and that I have used in the past to fix this issue, is to use the ASAP Utilities Text > Convert/Recognize dates tool
  • The second technique I have also used in the past is to create a new column with the function =TEXT(A1,"mm/dd/yyyy"), which extracts and rebuilds the date in the UK format. This new column can then be pasted into the worksheet as values.
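The same day/month swap can be sketched outside Excel in a few lines of Python (my own illustration, using only the standard library): re-parse the US-format string and emit the UK format.

```python
from datetime import datetime

def us_to_uk(us_date: str) -> str:
    """Convert a US mm/dd/yyyy date string to UK dd/mm/yyyy."""
    return datetime.strptime(us_date, "%m/%d/%Y").strftime("%d/%m/%Y")

print(us_to_uk("09/28/2010"))  # 28/09/2010
```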
These kinds of "hacks" are fine, and they work, but this issue had really started to bug me now that I thought about it! Why does WMT do this? Google knows the location of the site as well as the location of the Google account, so why does it still offer dates in the incorrect format if you are in the UK?

Eureka!

Then it hit me: Google generally passes language or location in a parameter in most of its search URLs, so perhaps they do the same with WMT download links?

So I went to look at the download link for http crawl errors on one of the sites I look after and guess what I saw:

https://www.google.com/webmasters/tools/crawl-errors?hl=en&siteUrl=http%3A%2F%2Fwww.testsite.co.uk%2F&tid=we&sort=1

See the issue? The hl=en parameter should be hl=en-gb for the UK!

So what does this mean?

If you want your dates in any WMT report to be in the UK format then simply change the download links so that the hl parameter has a value of "en-gb" not "en".

No post-download excel manipulation

No mucking about

Yes, it is that simple!
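If you want to automate the rewrite rather than edit the link by hand, here is a quick sketch (my own, not a Google tool) using only Python's standard library to force the hl parameter to en-gb:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def force_uk_dates(download_url: str) -> str:
    """Return the same WMT download URL with hl forced to en-gb."""
    parts = urlsplit(download_url)
    query = dict(parse_qsl(parts.query))
    query["hl"] = "en-gb"
    return urlunsplit(parts._replace(query=urlencode(query)))

link = ("https://www.google.com/webmasters/tools/crawl-errors"
        "?hl=en&siteUrl=http%3A%2F%2Fwww.testsite.co.uk%2F&tid=we&sort=1")
print(force_uk_dates(link))
```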

Creating a cartesian product in Excel with a Visual Basic macro

The term cartesian product sounds flash doesn't it? Like some sort of Star Trek thingy wossit.... When I first asked my "go-to Excel guy" how to create one, I didn't know that what I was asking for had a specific, flashy name.

So what is a cartesian product?

You probably already know what a cartesian product is, even if you don't know that that's what it's called. In essence, a cartesian product is the result set of all possible ordered pairs of data. Although normally used in mathematics, it is a process that can also be applied to non-numerical data sets like table fields (in SQL) or ranges (in Excel): if you want to create a result set of all possible combinations of two tables then you want a cartesian product!
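To make that concrete before we get to the Excel macro, here is the same result set sketched in Python, whose itertools.product computes exactly this for any two sequences:

```python
from itertools import product

range1 = [1, 2]
range2 = ["x", "y"]

# every ordered pair (one element from range1, one from range2)
pairs = list(product(range1, range2))
print(pairs)  # [(1, 'x'), (1, 'y'), (2, 'x'), (2, 'y')]
```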

So how do I create a cartesian product in Excel?

You need a macro that runs through your two ranges and gives you a new range of all the possible combinations. Simple, right? The problem is there isn't a cartesian product macro that works out of the box, and even common Excel plugins like ASAP Utilities don't have one either.

Visual Basic to the rescue! Here is how you make a cartesian product macro in Excel 2007:


Firstly open your Developer tab and click Visual Basic to open the Visual Basic Editor; you should see something like this:



You need to create a Module in your workbook, so right-click on the Modules folder and choose Insert > Module:



Copy the following code into the right-hand side of the Visual Basic Editor (big thanks to Andrew Edge for his masterful VB skills):

Sub cartesianproduct()
Dim range1 As Range, range2 As Range, startrange As Range
Dim array1 As Variant, array2 As Variant
Dim i As Long, x As Long, z As Long

' Type:=8 makes InputBox return a Range, so assign with Set
Set range1 = Application.InputBox(Prompt:="Please Select First Range", Type:=8)
Set range2 = Application.InputBox(Prompt:="Please Select Second Range", Type:=8)
Set startrange = Application.InputBox(Prompt:="Please select where you want to put it", Type:=8)

' read both ranges into 2D arrays (the first column of each is used)
array1 = range1.Value
array2 = range2.Value

For i = 1 To UBound(array1, 1)
    For x = 1 To UBound(array2, 1)
        z = z + 1
        ' write each ordered pair one row below the last
        startrange.Offset(z, 0).Value = array1(i, 1)
        startrange.Offset(z, 1).Value = array2(x, 1)
    Next x
Next i
End Sub


You will see that your macro now has a name "cartesianproduct":



Close the Visual Basic Editor and return to Excel. All that is needed now is two ranges of data. Start your macro by visiting the Developer tab and then clicking Macros. You'll see your cartesianproduct macro in a list.

Select cartesianproduct and click Run. You'll be asked to select two ranges and a cell where you want the result set to be placed (it's usually better to put the result set on a separate worksheet).

Sit back and wait for the macro to run.... That's it!

Caution!

Due to the nature of cartesian products and the limit on the number of rows Excel will support, this macro is only useful for relatively small range pairs: two ranges of n and m rows produce n × m result rows, and Excel 2007 tops out at 1,048,576 rows, so even a pair of 1,025-row ranges is already too big. You have been warned!

Google Adwords Account Limits

Heard it from a very reliable source today that these are the current limits for a Google adwords account:
  • campaigns per account: 100
  • ad groups per campaign: 10,000
  • keywords per ad group (includes total keywords, negative keywords, ad group site exclusions): 5,000
  • keywords per account: 1 million
  • campaign negative keywords per campaign: 10,000
  • campaign negative placements (e.g. campaign site exclusions): 10,000
  • placements per ad group: 2,000


Twitter SSL Certificate: Something Wrong?

Just got this security warning pop-up a couple of times in the last few minutes:



Seems that there may be some issue with Twitter's security certificate or its implementation. TwitterFox seems very unhappy about the whole thing! Anyone know what is going on?

Spam reporting your SEO competition an ethical decision?

In a recent post by Rand Fishkin of SEOmoz about 5 common pieces of SEO advice he disagrees with, I was surprised to see him reluctantly defending the process of reporting sites to Google that had seemingly bought links (or similar). Surprised not because I am against the practice of reporting those who contravene Google's terms of service, but because it was the first I had ever heard about this industry code of silence on the subject. Perhaps this is the point where I should "fess up"?

My name is tootricky and I happily report competitor spam/link buying to Google.

Search Engine Knight or Spawn of Search Satan?

Is "doing the dirty" on an SEO competitor an act of noble morality or is it industrial sabotage? Are we who perform this seemingly unpopular practice Knights of Search Engine ethics, or kin of the google-devil?

I see myself as neither actually but let me explain why.

The two reasons given in Rand's article for not reporting bought links and spam are:
  1. You may inadvertently hurt your site's rankings if you've engaged in (or unknowingly benefited from) particular types of spam.
  2. Reporting spam may hurt your fellow SEOs and is thus unethical.
There are two fundamental ripostes one can make to these very valid points:
  1. Most SEO is some form of managed risk, therefore, if I am happy to stand by my work I expect others to be reasonable and do the same.
  2. One can no more seriously apply ethics or morality to practices that are against the Google TOS than one can apply it to strawberry trifle.
Moralising managed risk is illogical

Link buyers and spammers perform both practices because they are generally:
  • Effective
  • Cost-efficient
  • Able to offer a measurable ROI
Therefore managing the risk of these types of marketing strategies is the main drawback, not the fact that someone somewhere will think their work "ethical" or not. The risk to them is that their bought links may be reported by a competitor or spotted by Google, and may be acted on (to their detriment).

Their decision to increase their liability is no more an ethical choice than mine is an unethical practice when I report them.

So report them I will :)

Changing a batch of under performing keywords in Adwords

There are several frustrating issues with the Adwords interface that can make optimising campaigns a real chore. One that continues to frustrate is the lack of a feature that allows you to apply changes to a group of keywords at once, based on their status.

Now before somebody points out that Adwords Editor allows you to filter keywords by their status flags (Enabled, Paused), this is not the "status" I mean. In the new Adwords interface you can filter keywords based on their status, which is currently one of:
  • Not triggering ads: requires action
  • Eligible: limited
  • Not active
  • Eligible


This filtering can be very useful when used with the "Eligible: limited" flag to find keywords with poor search volume and impressions (sometimes referred to as low share of voice) so you can tweak and improve their performance.

One thing I like to do when fiddling with campaigns is change the match type for keywords from phrase or exact match to broad match, to see if this improves impressions. The only way to change a batch of keywords through the new Adwords interface is to do each manually by selecting the relevant drop-down menu; there is no "copy to all rows" button like there is for other attributes like position preference or Max CPC (see piccy below).

My workaround to batch edit keywords using Adwords and Adwords Editor

There is a dirty workaround to this annoying problem but it involves using other attributes in the keyword edit dialogue to temporarily group keywords. By using a mix of the following attributes you can create a unique "fingerprint" for the keywords that you can then match in Adwords Editor later:
  • Status
  • Max CPC
  • Destination URL
For example if I wanted to change a bunch of keywords that had low search volume from phrase match to broad match, I would:
  • Filter the keywords in the Adwords web interface using "Eligible: limited" to get the list of under performing (low share of voice) keywords.
  • Set the mentioned attributes and "copy to all rows" to create a unique "fingerprint" for that group of keywords:
    • Change Status to Paused
    • Set Destination url to http://www.somedomain.com (obviously this will only work if you don't use individual URLs for your keywords)
      and/or
    • Set Max CPC to something unique and exact like £11.11 (only really works if your CPCs are currently all the same)
Once these attributes have been applied to the keywords, go into Adwords Editor and download the current campaign. Use advanced search to look for keywords that have the status paused and a destination url containing "somedomain" (or whatever you chose to use) and/or where the CPC is £11.11.

Once you have your keywords list filtered in Adwords Editor, select all, edit the match type of all of the keywords to "broad", remove the destination url on all keywords, correct the CPCs (if necessary) and switch all keywords to enabled and upload.
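The filter-then-edit step in Adwords Editor boils down to a simple rule, which can be sketched in Python. This is only an illustration: the rows and column names below are hypothetical stand-ins for an Adwords Editor keyword export, and "somedomain" is the fingerprint URL from the example above.

```python
# Stand-in keyword rows, as if exported from Adwords Editor
rows = [
    {"Keyword": "blue widgets", "Status": "Paused",
     "Destination URL": "http://www.somedomain.com", "Match type": "Phrase"},
    {"Keyword": "red widgets", "Status": "Enabled",
     "Destination URL": "", "Match type": "Phrase"},
]

for row in rows:
    # match the temporary fingerprint set in the web interface
    if row["Status"] == "Paused" and "somedomain" in row["Destination URL"]:
        row["Match type"] = "Broad"      # the change we actually wanted
        row["Destination URL"] = ""      # undo the fingerprint
        row["Status"] = "Enabled"        # re-enable for upload
```

Only the fingerprinted keyword is touched; everything else passes through unchanged, which is exactly what the advanced search in Adwords Editor achieves.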

Obviously all of this kind of stuff would be much easier if Google improved the filtering and informational content available through Adwords Editor, but until that becomes a reality, we have to find other ways!
 

tootricky's blog 2010