Improving the Impact of Links in Old Content

It’s a well-known fact in link building circles that links in old content simply aren’t as good as links in new content.

Taking some inspiration from a recent Whiteboard Friday, I decided to test this theory. Cyrus Shepard of SEOMoz went through the theory that a link to your website from some old content does not pass as much “link juice” as a link from a new page; you can see the video below.

What do we mean by an “old page”? From a technical, Google-definition point of view, we are talking about a page that has previously been crawled and indexed by Google; with stale content, meaning content that hasn’t been updated in a long time (it was written, has stayed that way for two or three years, and has attracted no new blog comments); and with old links, i.e. all of the links it has were earned months or years ago, with no new links coming in. If a page doesn’t meet these criteria, it’s a new page.

Tim Grice of SEOWizz published a study in March 2011 suggesting that links on old pages just weren’t worth it. Over a five-week period Tim monitored the changes in search rankings from three types of link: site-wide text links in sidebars or footers; links inserted into already-indexed static pages with a PR of 1 or more; and links placed within entirely new content in fresh blog posts.

New Links in Old Content

Source: Links In Old, Crawled Content Don’t Pass Weight

As you can clearly see, the rankings for the “old content links” barely changed at all over the period, whereas the links within the new content rose quickly.

This got me thinking, especially about Cyrus’s statement that for Google to consider a page as new again you would need to make a significant change to it or build some links to it. But exactly how much of a change would you need to make to a page?

The Experiment

I decided to build links to some of my test sites using the same principles Tim used: the old pages were already indexed by Google, and they had not had any new links built to them recently.

  • Link type 1 was my control: I only added the link into the text, with exact-match anchor text.
  • Link type 2: I inserted the link and added one paragraph to the content.
  • Link type 3: I inserted the link and added two paragraphs.
  • Link type 4: I inserted the link and created five social bookmarks to the old page.

[Image: old links 1]

No surprises: the link-only and link-plus-one-paragraph pages saw very little change in rankings, but after seeing the strong early performance of link types 3 and 4 I decided to try a combined approach on another test page, editing two paragraphs of text and creating five social bookmarks.

[Image: old links 2]

Not only did the page climb the rankings rapidly, it also stabilised at its new ranking, performing better than the social-bookmarks-only pages.

Conclusions

By no means was this a completely controlled, perfect scientific experiment: there was a new Panda update during the test period, and the content on the test pages wasn’t exactly the same. But as you can clearly see, simply adding a link to a piece of old content and editing a small amount of text has less value than adding a link, editing some text on the page and building a few links to that old page.

This flags to the spiders that the page is relevant again and should be recrawled, which in turn means the bots will follow the links within that text once more. In an ideal world it is preferable to build links within new content, which shows how important it is to continue with content-based link building methods such as blogger outreach and guest blogging. But if you are building links to old content, e.g. via broken link building, it’s worthwhile taking the time to add some more value to those old pages.

Are .Edu and .Gov Links Really Worth it?

Every SEO forum since the dawn of Google has debated the power of high-authority links such as those from .edu or .gov domains. There are literally dozens of articles out there on the web, and every few months a new debate begins about how much these links actually affect your rankings.

It’s a common belief in some SEO circles that a link from a .edu or .gov site is the best type of link you can get to improve your rankings. This belief stems from the fact that these domains are not easy to obtain on the open market: you have to be an educational or government establishment.

So do Google actually give more or less weighting to certain top-level domains (TLDs)? Well, yes, they can and do (look at the case of .co.cc domains), but in the case of .gov or .edu links they claim not to, as Googler JohnMu stated:

In general, I would like to add that no, backlinks from .EDU domains generally do not get “additional credibility from Google.” Because of that, the whole topic of working especially hard to talk webmasters of these domains into linking to your sites seems a bit problematic…

Some in the world of SEO will scream conspiracy, claiming Google don’t want to let out the secret recipe, but let’s look at these types of links from the proper angle. Google may actually be telling the truth: their algorithm may not give these TLDs a considerably higher weighting than a .com or .co.uk, because it is all actually based on PageRank. Remember that thing Larry Page invented, which means links from web pages that are themselves well linked carry more value than links from pages that aren’t.
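To make that concrete, here is a toy sketch of the PageRank idea: pages that are linked to by other well-linked pages accumulate more score, regardless of their TLD. This is a simplified version of the published algorithm, and the four-page link graph is entirely made up for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. `links` maps each page to the pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page gets a small base score, plus a share of the rank
        # of each page that links to it.
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# "news" is linked by two well-linked pages; "niche" has no inlinks at all.
graph = {
    "bbc": ["news"],
    "guardian": ["news", "bbc"],
    "news": ["guardian"],
    "niche": [],
}
scores = pagerank(graph)
```

Run on this graph, the well-linked "news" page ends up with a far higher score than the unlinked "niche" page, whatever extension either might carry.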

Below is a 2010 video by Google’s Head of Web Spam, Matt Cutts, that also confirms this:

These types of web pages are often well linked to, and many have been around for years. So what I am saying is that despite their “perceived” authority due to their offline status as institutes of learning, it is actually the quality of the pages linking into these sites (corporate blue-chip companies, major hospitals, or large news resources such as the Guardian and the BBC) that makes them an authority online, not the TLD.

How to get .Edu or .Gov Links?

With the right amount of time, hustle or money you can get a backlink on just about any site you want. Building relationships, investing in the right tools and good content will allow you to get these links easily.

From past experience I have used broken link building on .gov or .edu sites, whereby you look for broken links on a resource page and suggest the webmaster replace each dead link with a link to a relevant resource of your own.

Outreach also works quite well for gaining .edu links, as most universities now provide their students and faculty with blogging platforms and subdomains, so it is quite easy to email them content relevant to their blog or studies. If you really wanted to invest a significant amount of time and resources, you could find a piece of research they produced and reference it in a piece of your own content, e.g. an infographic; it is highly likely you will obtain a link back, as they will naturally want to share it with their peers.

Invite an academic to write an article on your blog, or even to come and speak to your workforce or at an industry conference you are running; chances are they will link to you, as they will want to reference their engagements. As you can see, you are limited only by your own imagination as to how you obtain these types of links, but by offering useful resources for government or academic webmasters to link to, you will have a much higher success rate.

So What is their Value?

If you were to ask me what I look for in a link, then I value backlinks by the number of visitors, nay, the number of pre-qualified visitors, that the link can send me. What I mean is that if I could get a few hundred visitors to my site from a backlink, visitors who are motivated to buy my product or subscribe to my mailing list, I would spend more time and money obtaining those links than chasing links from university and government sites.

So do .edu or .gov links help you get better rankings? Yes, but they are no better than links from any other well-linked domain, and remember: SEO is not just about rankings!

How to use Scrapebox for Link Building not Spamming

Scrapebox is well known in the SEO community as a Grey Hat, Black Hat, Yellow Polka Dotted Hat link building tool that is mainly used by blog comment spammers. If you have ever spent any time reading blogs you will have seen the stereotypical spam comments. They usually say things such as “Great blog post, thanks for sharing” with a keyword-rich anchor text link to a site selling fake Ugg boots.

I know a lot of my regular readers will have a heart attack at the recommendation of using Scrapebox as a “White Hat” link building tool. A lot of people in the SEO community hate the thought of automated link building, and the mere mention of a tool such as Scrapebox makes their skin crawl. I can already imagine several people ready to jump down to the comments to tell me that tools like this are ruining the internet…

Well “Soapbox White Hatters” I’m going to show you a way that you can actually use Scrapebox to make the internet a better place… in fact a safer place for all!

So what is this Scrapebox Link building technique?

This link building technique utilises some of the free plugins available for Scrapebox. The main tactic is to find a compromised or malware-infected site and open a dialogue with the site owner, in an attempt to receive a link either via a guest post or by suggesting the site owner replace broken links with your own.

Scrapebox currently costs $97 (there are a few coupons on the net for $57 if you search around), and for the amount of time and money this tool will save you it is more than worth the investment. Scrapebox allows you to harvest thousands of URLs from Google and Bing in no time at all, and by entering your own custom footprints, e.g. “submit * guest post” [keyword], you will quickly find lots of guest blogging opportunities in your niche. You can also import .txt files with lots of different search terms to put your harvesting on steroids.
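As a rough illustration of how such a .txt file can be produced, here is a short Python sketch that combines a handful of footprints with your keywords. The footprints, keywords and output filename are only examples of my own, not a definitive list.

```python
# Generate Scrapebox-ready search queries by combining search footprints
# with keywords, then write them to a .txt file for importing.
footprints = [
    '"submit * guest post" {kw}',
    '"write for us" {kw}',
    '"guest article" {kw}',
]
keywords = ["link building", "content marketing"]

# One query per footprint/keyword pair.
queries = [fp.format(kw=kw) for fp in footprints for kw in keywords]

with open("queries.txt", "w") as f:
    f.write("\n".join(queries))
```

Importing the resulting file into the harvester then runs every combination in one batch instead of typing them by hand.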

The first free plugin you will need is the Malware and Phishing Filter. Once installed, it allows you to check a list of URLs from Scrapebox to find sites that have been compromised by some form of malware. If you have Google Webmaster Tools set up on your websites, Google will normally inform you when a site has been infected by malware; sadly, many bloggers and small business owners rarely check their sites for malware, and not everyone knows how to set up Google Webmaster Tools.

Import your list of scraped URLs into the malware checker and run it; this will flag any site that has been compromised by some form of malware. You now want to export all of these bad URLs and, using OSE, check the PA/DA of the pages. I then work down my list, starting with the sites with the highest authority.
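If your PA/DA lookups end up in a spreadsheet, a few lines of Python can do the prioritising for you. This is a minimal sketch; the field names ("url", "da", "pa") and the sample rows are my own assumptions, so adjust them to match whatever your actual OSE export contains.

```python
# Order malware-flagged link prospects by Domain Authority (then Page
# Authority), highest first, so outreach starts with the best targets.
def prioritise(rows):
    return sorted(rows, key=lambda r: (int(r["da"]), int(r["pa"])), reverse=True)

# Example rows standing in for a csv.DictReader over the exported CSV.
flagged = [
    {"url": "http://example-blog.com", "da": "25", "pa": "31"},
    {"url": "http://example-uni.edu", "da": "72", "pa": "44"},
    {"url": "http://example-store.com", "da": "40", "pa": "38"},
]
ordered = prioritise(flagged)
```

With real data you would feed `csv.DictReader(open("flagged.csv"))` into `prioritise` instead of the hard-coded list.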

You can run the list through the Scrapebox Whois tool, or use Scrapebox itself to check each contact page for email addresses. You do not want to visit these sites yourself, as there is a risk that your computer may be infected by a virus.

Now you want to send an email to the webmaster informing them of the malware issue on their site, including links to some helpful blog posts on how to fix malware-infected sites. (If you haven’t checked out John Doherty’s blog post on SEOMoz about outreach emails, make sure you do!)

You obviously do not want to ask for a link at this point. Depending on the quality of the site, it might be worth using your hustle to track down alternative contact details too, such as a phone number, Twitter handle or LinkedIn profile.

I have had a very good success rate contacting webmasters using this technique, and quite often they are very grateful to you for pointing out the problem on their website. Now that you have a dialogue with the site owner, I will leave it to your imagination what approach to use next to obtain the link. But this is a good time to check the site for broken links or to pitch a guest blog, as the webmaster will probably have to recover content on the site anyway. I have even had a few webmasters offer me the chance to buy their sites for a small fee, as they no longer have the time or inclination to fix and maintain them!

So there you have it: one way you can use a notorious spam tool to speed up your link building research and help make the web a safer place.

If you have any more Scrapebox tips and tricks drop them in the comments below.

Use Content Curation to Drive More Traffic

For the last couple of weeks I have been using a content curation service called Scoop.it to generate lots of referral traffic to my blog posts, and I’ve decided to share what I have learned so far and show how content curation can help you grow your social network too.

What is Content Curation?

If you have a good understanding of the new social web landscape then there is no doubt that the phrase “content is king” comes up again and again. Content is the currency of the internet, and by sharing your own great content and other people’s, you will grow your social network and be held in high regard by your followers as the go-to source or expert in your field.

By following all the latest news sources in your niche you can soon find yourself overrun, with numerous RSS feeds, tweets, Google+ updates and Facebook shares flashing before your eyes every day. This is where content curation comes in: quite simply, content curation is the process of filtering out the best content you find and sharing it with your networks.

What is Scoop.it?

Scoop.it is a content curation service, but rather than have me rattle on about it you can watch this brief video below.

Scoop.it has the latest news delivered to you and allows you to re-share it with your social network. Another great aspect of Scoop.it is that other people can suggest content to be added to your pages too. Scoop.it has a free entry-level profile that allows you to set up five pages and get used to the interface; if you want to curate information in more niches or access analytical data, you need to upgrade to a paid membership.

How to Use Scoop.it

The key to being a great content curator is picking a niche in which to share your information and vigilantly sticking to it. The narrower the niche you decide to curate content in, the better. If you set up a Scoop.it page about knitting patterns, the majority of your regular followers are unlikely to be interested in curated pictures of kittens.

First things first, go to Scoop.it and sign up with either your Twitter or Facebook Profile.

You will then see a screen similar to this one where you fill in the name, description and keywords for your new page.

Pro Tip: Use Google’s Keyword Tool to find keywords that people looking for your content may use. If you want to learn more you can read my blog post on using the Google Keyword Tool.

Now that your page is created you want to set up your news sources. Simply enter your keywords into the search box; these keywords will be checked regularly against Google, Digg and YouTube for the latest content in your niche. Next, click on Advanced Options. This is where knowing your niche comes into its own:

As you can see from the above image, you can add various personalised news sources, such as RSS feeds, Twitter accounts and lists, Google News Search, Google Blog Search and OPML files from Google Reader. You want to add all the best curators and thought leaders in your niche to this list, along with the best blog feeds.

Pro Tip: Use a blog aggregator such as AllTop to grab feeds, and set up a Twitter list of interesting people that you can add to easily, so you don’t have to keep adding individual accounts to your Scoop.it sources.

Now you are ready to start your new career as a content curator. After about an hour Scoop.it will have scraped your RSS feeds and Twitter follows and searched Google for new content based on your keywords. Simply clicking “Scoop.it” will scoop the news to your page; from there you can share it with your Twitter and Facebook accounts, add tags to make the scoop easier to find, and change the text or images. If you don’t like a suggested article, simply click discard and it is removed.

Pro Tip: Install the Scoop.it app (it’s free) and add the Scoop.it bookmarklet to make it easier to Scoop content on the fly.

If you see a piece of content on another Scooper’s page you can “rescoop” it by clicking the arrows that look like a refresh button. It is also common etiquette to thank (thumbs up) your fellow content curator when you rescoop their find.

Generating Traffic to your Site with Scoop.it

Well, there are two ways to get traffic from Scoop.it. The first is obviously to add your own blog posts and photos to your page. This will have limited results, just like running any website, until you grow your following. Building a following takes time, and may require weeks of curating and sharing great content, following other people on Scoop.it and commenting on other people’s scoops. If you are anything like me, this looks a lot like hard work; but by doing it I have noticed I am sharing lots more content with my Twitter followers and growing my follower count.

So the second and quicker way, and I’m sure all the link builders have spotted this already, is to suggest your content to other users.

There are people on Scoop.it who already receive hundreds and hundreds of views per day on their pages; in one of the examples above, the page has had over 130k views in a matter of months. By suggesting your own content for such a page, there is a chance your post will be accepted, and a good percentage of their fellow Scoopers will come flooding to your site and re-scoop your post to their followers and other social networks too.

To start you need to find who the influencers are in your niche. This is easy to do by searching for your keyword in the search bar at the top or by browsing the topics based on popularity and current trends.

You can then quickly research the curator, as their profile often contains links to their other social profiles, e.g. Twitter or Facebook.

From this example you can clearly see links to this Scoop.it user’s LinkedIn, Twitter and Facebook accounts.

As any experienced link builder will tell you, it’s important to build a relationship with a social media influencer first, rather than bombarding them with requests out of the blue. By engaging with them on other social networks, and where possible finding their website and contact information, you can then begin to approach them with suggestions for their Scoop.it accounts.

My favourite tool at the moment for researching potential link targets is Follower Wonk. It is a great way to learn more about who your influencer influences. It will also help you discover whether they have any “thought leaders” within their network, so you can gauge whether your suggested content is likely to go viral if it is shared by them too.

My last blog post, on automating Google+ with your other social media accounts, was curated on a very popular Google+ Scoop.it page. Over the next 48 hours I received about two hundred visitors from that page and from two other pages that re-scooped my blog post. I also received about 10% more traffic from Twitter and Facebook than normal during this period.

But, isn’t Content Curation Bad for my SEO?

In a post-Panda world I can understand why people might worry about “duplicate content”, but the thing about content curation services such as Scoop.it is that you never republish the whole web page. The original page is also linked to from Scoop.it, confirming the content’s origin, and although most links on Scoop.it are no-followed to prevent spamming, a link is still a link.

Many businesses forget that SEO is not just about links or chasing the number one spot in the SERPs, but about growing and diversifying the traffic to your website. With diverse traffic sources you will be able to keep growing your business online for a long time to come, no matter what happens with the next big “algo” change.

If you have had any positive or negative experiences with curation services such as Scoop.it please leave a comment below.

Link Building – Competitor Analysis Case Study

As a webmaster, link building can seem like a very daunting task; with so much misinformation, poor advice and varied personal opinion on the web, it’s not hard to see why most people are confused about where to start. The most common questions I get asked by clients and friends are:

  • What are the best types of links to build?
  • How many links do I need to get to the first page of Google?
  • Should I use the same keyword anchor text or mix it up with brand terms?

Well all of that is relative…

Invariably my answer starts by asking them more about their competition. Competitor analysis is a very important part of the link building puzzle. By understanding what your competition are doing to hold on to those all-important, much-sought-after search engine results, you can learn more about the types of links you need, the type of anchor text, and roughly the volume of links you might need to rank well.

With so many tools available on the market, and the large amounts of data they can produce, it has never been easier to snoop on your competitors. In this post I am going to carry out a competitor analysis for Cash Cow UK, a free-to-use internet auction site, so you can understand a little more about the process I use to analyse the link profiles of my clients’ competitors.

Using the free Google Keyword Tool, I normally find 50-100 keyword phrases my client wants to rank well for to get more traffic to their site. I then run a quick Google search to find the top 5 ranking pages for those keywords and make a note of them. (There are more advanced automated techniques I use to do this, but for now I want to keep it simple.)

For the purpose of this case study I am going to just focus on 3 key word phrases:

Online Auction – 60,500 UK Searches per month

Online Car Auction – 8,500 UK Searches per month

Free Online Auction – 1,300 UK Searches per month

Intuitively you would think that eBay would be top for at least two of these search terms; however, they appear only once in these SERPs, in position 2. This is very common: when you ask a client to tell you more about their market, they often overstate or understate their competition’s online presence.

I have decided to look a little more closely at the backlink profiles of uk.madbid.com, ebay.co.uk, uk.ebid.net and totalbids.co.uk in order to find some quality link opportunities. The two tools I would recommend are SEO Spyglass, which is free to trial and less than $100 for a license, and the SEOMoz Pro tools, again free to trial but $99 a month thereafter; the latter provides lots of great data, including social data and tonnes of great reports.

Below is the output from SEOMoz’s Link Metrics Comparison tool. As you can see, there are lots of external links pointing to eBay and eBid; however, Total Bids, who we saw ranking very highly for “Free Online Auction”, only have 200 external links. So I will probably start there to understand this better.

What we are interested in here is Domain Authority (DA) and Page Authority (PA); these are values calculated by SEOMoz based on the quality of the links pointing at the domain and the page. SEO Spyglass does not provide DA and PA, but it does give you the Google PageRank of the linking page and the linking root domain, which can also help you determine good quality links.

Go to Open Site Explorer and input the page you want to analyse; in this case I have input www.totalbids.co.uk

As you can see, this page has over 1,100 links from over 100 different domains, yet it has received only 7 Facebook shares and 1 tweet. This looks like quite an easy link profile to investigate for some great links.

Firstly, we want to look at the anchor text; this is very easy to do in both OSE and SEO Spyglass:

As you can see, a lot of their anchor text is just their brand name; there are very few links containing the keywords they rank well for. I would say over 50% of their links make no mention of “Online Auction” or “Free Online Auction”. This means we are working in a niche where quality counts.
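You can put a number on that sort of claim yourself from an anchor-text export. The sketch below uses invented sample anchors; in practice you would read the anchor column out of an OSE or SEO Spyglass CSV.

```python
# Estimate what share of a competitor's backlinks use a target keyword
# anywhere in their anchor text.
def keyword_anchor_share(anchors, keyword):
    keyword = keyword.lower()
    matches = sum(1 for a in anchors if keyword in a.lower())
    return matches / len(anchors)

# Invented sample anchors standing in for a real export.
anchors = [
    "Total Bids", "totalbids.co.uk", "free online auction",
    "click here", "Total Bids", "online auction site",
]
share = keyword_anchor_share(anchors, "online auction")
```

A low share for the money keyword, paired with strong rankings, is exactly the “quality counts” signal described above.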

You now want to export the links report to Microsoft Excel using the Download to CSV function. I normally export only the domains and followed links, so I can see what type of site each is and determine how easy or difficult it will be to obtain the link. I don’t usually export no-follow links, as these are often just blog comments and carry very little link value.

Now do the same for the link profiles of all the other competitors and add them to your list. You should now have a master list of several hundred, if not thousands, of domains to target.
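Merging several exports by hand quickly produces duplicates, since competitors often share linking sites, so it is worth de-duplicating by root domain first. A minimal Python sketch (the example URLs are placeholders for your real CSV exports):

```python
from urllib.parse import urlparse

# Merge several competitor link exports into one de-duplicated,
# order-preserving list of root domains.
def unique_domains(url_lists):
    seen = set()
    ordered = []
    for urls in url_lists:
        for url in urls:
            domain = urlparse(url).netloc.lower().removeprefix("www.")
            if domain and domain not in seen:
                seen.add(domain)
                ordered.append(domain)
    return ordered

competitor_a = ["http://www.example-news.com/story", "http://blog.example.org/post"]
competitor_b = ["http://example-news.com/other", "http://example-dir.co.uk/listing"]
domains = unique_domains([competitor_a, competitor_b])
```

Note this treats `www.` as the only prefix to strip; subdomains such as `blog.` are kept separate on purpose, since they may be distinct link targets. (`str.removeprefix` requires Python 3.9+.)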

I normally then sort the list of domains by DA or PageRank and categorise each one as one of the following:

  • Info Site – low quality information site
  • Education or Government Site – hard to find links
  • News Site – quality newspaper or news site e.g. Mashable
  • Article/Press Release
  • Blog – possible Guest Blogging opportunity or product review
  • Web Directory – easy to obtain by submitting a link
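A first pass at this categorisation can be automated with simple rules before manual review. The keyword rules below are crude heuristics of my own, not a standard, and a human still needs to check each site before any outreach:

```python
# Rough first-pass bucketing of link-prospect domains into the
# categories listed above. Rules are checked in priority order.
def categorise(domain):
    if domain.endswith((".edu", ".gov", ".ac.uk", ".gov.uk")):
        return "Education or Government Site"
    if "directory" in domain or domain.startswith("dir."):
        return "Web Directory"
    if "news" in domain:
        return "News Site"
    if "blog" in domain:
        return "Blog"
    return "Info Site"  # default bucket pending manual review

prospects = ["harvard.edu", "ukdirectory.co.uk", "blog.example.com", "example.com"]
buckets = {d: categorise(d) for d in prospects}
```

Anything these rules cannot place (article and press release sites in particular) simply lands in the default bucket for manual sorting in the spreadsheet.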

Then, in the same spreadsheet, I add a few notes and the actions I have taken, e.g. when I sent an email to the webmaster, contact details of the blogger, etc.

Advanced Tip: If any of the blogs have a Twitter Account, be sure to make a note of it and write down the number of followers they have and their Klout Score. This will help you determine whether they are a key influencer in their niche and how much effort you should put into getting a link on their site. You can easily find someone’s Klout score by visiting www.klout.com/(username) where (username) is their Twitter username.

Now that you have your list of sites to approach for a link, you need to decide what your approach will be: how much time will you spend chasing that elusive link, and what can you do to get it?

I normally approach link building with the easy links first: I list the site in all the directories I have found, and if they are paid directories I weigh up whether or not the quality of the site justifies paying for the link. I usually do this by looking at the DA and PA of the pages my link would appear on, and by asking whether it would be just as easy to get a free link from another source.

As for the article and press release directories, I will submit a few pieces of content to those with a good DA, but I really don’t value these types of links, and I believe the search engines have devalued them over time too. However, SEO is not just about getting links; it’s about getting traffic and brand recognition too, so a few press releases or articles now and again never hurt.

If it is a lower-value blog or site that I want to target for a backlink, I may simply send the owner a nice email asking for a link to be added to their article, in this case to their list of eBay alternatives. During this link analysis I also discovered a number of Blogspot, WordPress, Squidoo and other Web 2.0 properties; I would consider making some Web 2.0 properties myself if I couldn’t persuade an owner to give me a link. For the higher-quality news sites and blogs, I will spend time getting to know the blogger or webmaster in order to get a guest post, share an infographic I have created, or perhaps have them review my client’s product. This is where you want to focus most of your effort: a link from a few high-profile websites can catapult your site into the big time, so this is where you want to be creative and invest your time.

I hope you found this guide useful, and hopefully it has given you some insight into how easy link building can be when you research your competition.

+Chris Dyson