SEO Blog - Internet marketing news and views  

Link Builders Guide to Historical Ranking Factors

Written by David Harry   
Tuesday, 29 April 2008 08:02

Because it's about time 

When it comes to doing SEO for Google, one thing is certain: you'd better know about backlinks. Given that the core algorithm relies heavily on links both for ranking and for detecting spam, it is certainly an area to be familiar with.

A while ago Google's patent on temporal ranking factors was re-released; it was a watershed when it first came out in 2005, as were subsequent related patents. Analyzing the re-released publication for anything new suggested it would be worth highlighting a few areas as a bit of a refresher course.

For this post I am merely going to address the implications of temporal factors on link building programs.

 

The basics of historical factors

Just because a web page is 10 years old doesn't mean it is still relevant; nor does age guarantee that it isn't. In some situations fresh content can be more relevant than older content. The discovery data can be used to re-rank a document in a positive or negative manner.

Search engines (such as Google) can use temporal data to analyze link profiles for anomalies based on discovery (inception) dates:

“…it may be assumed that a document with a fairly recent inception date will not have a significant number of links from other documents”

And it can be used for spam detection:

“While a spiky rate of growth in the number of back links may be a factor used by search engine to score documents, it may also signal an attempt to spam search engine. Accordingly, in this situation, search engine may actually lower the score of a document(s) to reduce the effect of spamming.”

 

Thus inception dates and historical data can be used for link velocity analysis.

 

“ (the) search engine may use the inception date of a document to determine a rate at which links to the document are created (e.g., as an average per unit time based on the number of links created since the inception date or some window in that period).”
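That rate calculation is simple to sketch: divide the number of links discovered by the time elapsed since inception. A minimal, hypothetical version — the function name and data shape are my own assumptions, since the patent gives no formula beyond "average per unit time":

```python
from datetime import date

def link_velocity(inception: date, link_dates: list[date], today: date) -> float:
    """Average number of new backlinks per day since the document's
    inception (discovery) date -- the simplest reading of the quote above.
    All names here are illustrative assumptions."""
    days = max((today - inception).days, 1)  # avoid division by zero on day one
    counted = [d for d in link_dates if inception <= d <= today]
    return len(counted) / days
```

A page discovered in January with three links found by April would score 3 links over roughly 100 days — a very slow velocity compared with, say, a viral news story.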

With this in mind, it becomes important to understand the market you're in when deciding what types of link building campaigns to employ, and to monitor the link velocity of your niche competitors and related verticals alike.

Just because you were a bad-assed link builder in the dating niche doesn't mean you should rush in and apply the same heavy-handed tactics to a site in a less active niche such as travel. Understanding the norm will dictate the speed of the program for maximum effectiveness; a heavy-handed link builder can undo the potential rankings of a good piece of content.

 

How do they do that?

And just how do they figure out which queries deserve freshness? One way is by looking at the averages across documents in the result set: the average document may be an older web page with a slow, steady rate of link growth, or the results may be dominated by recent documents with a high rate of growth (such as viral news). By tracking this data, anomalies will stand out.
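As a rough sketch, that averaging could be as simple as comparing the mean link-growth rate of the result set against a threshold. The function name, data shape, and threshold below are illustrative assumptions; the patent does not specify an implementation:

```python
def query_deserves_freshness(result_growth_rates: list[float],
                             fresh_threshold: float = 1.0) -> bool:
    """Average the per-document link growth rates (links/day) across a
    result set; a high average suggests a fresh or viral topic. The
    threshold value is an assumption, not something from the patent."""
    if not result_growth_rates:
        return False  # no result data, no basis for a freshness call
    average = sum(result_growth_rates) / len(result_growth_rates)
    return average >= fresh_threshold
```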

The rate of growth is not the only metric that can be tracked; the rate of loss can also be factored in. Together these can form a baseline for a given categorization (niche/market):

“link-based factors may relate to the dates that new links appear to a document and that existing links disappear.”

Furthermore, a search engine can track whether it was merely the link that was removed from the page, or the entire page/site itself. One would imagine the former being a good way of detecting paid/reciprocal links, among other uses. When crafting a link profile, consideration of stability is important.

Link velocity valuations include:

  1. when links appear or disappear,
  2. the rate at which links appear or disappear over time,
  3. how many links appear or disappear during a given time period,
  4. whether there is a trend toward the appearance of new links versus the disappearance of existing links to the document, etc.
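The four signals above can be approximated by comparing periodic snapshots of a page's backlink set. A minimal sketch, with the names and data shape assumed for illustration:

```python
def link_churn(old_links: set[str], new_links: set[str]) -> dict:
    """Compare two snapshots of a page's backlink set and report the
    appearance/disappearance signals listed above. The patent does not
    specify an implementation; this is one simple reading of it."""
    appeared = new_links - old_links       # links present now, absent before
    disappeared = old_links - new_links    # links present before, gone now
    return {
        "appeared": len(appeared),
        "disappeared": len(disappeared),
        # positive = net growth in links; negative = net link loss
        "trend": len(appeared) - len(disappeared),
    }
```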

 

When taken in context with other related websites in the index, patterns emerge; it is advisable to understand each market to ensure you stay on the right side of the threshold. The search engine can look at time-varying behaviour within the link profile of a given page when making ranking adjustments.

 

Slow and steady wins the race

Another area of concern is the long-term history of a web page's link profile. If a certain page is the target of a campaign, don't throw all your links at it in one shot, or even over a short period of time. Not only can this raise flags, it can also cause ranking problems later, when the velocity slows down:

“…a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine that a document is stale, in which case search engine may decrease the document's score”

By using these trends, search engines can try to establish the perceived freshness of a given document and increase or decrease scoring depending on the situation (should fresh content be deemed more valuable).
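The downward-trend comparison in the quote above might be sketched as a ratio of recent link growth to older link growth. The 0.5 threshold here is an illustrative assumption, not a value from the patent:

```python
def looks_stale(older_window_links: int, recent_window_links: int,
                threshold: float = 0.5) -> bool:
    """Flag a document as potentially stale when its recent rate of new
    links drops below a fraction of its older rate. Both the comparison
    and the threshold are assumptions based on the patent's wording."""
    if older_window_links == 0:
        return False  # no baseline to compare against
    return recent_window_links / older_window_links < threshold
```

A page that earned 100 links in an earlier window but only 20 in the recent one would trip this check, while a steady page would not.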

Historical factors can also be used in weighting the links, such as:

  1. the date of appearance/change of the link,
  2. the date of appearance/change of the anchor text associated with the link,
  3. the date of appearance/change of the document containing the link,
  4. the trust of the document/site where the link resides,
  5. the freshness of the document,
  6. the authority of the document's site.

 

Link Spam Detection

As discussed, there are many ways that temporal factors can be used to identify spam or otherwise devalue links to a given page. What counts as an unusual rate of growth for a link profile depends on the topical nature of the niche and its related data.

One historical factor used to identify spam is the rate of change. Link loss can identify not only a page that is stale, but also one that is trying to game the system. This might explain why spam does well early on and is tanked after a period of time:

“A large spike in the quantity of back links may signal a topical phenomenon (e.g., the CDC web site may develop many links quickly after an outbreak, such as SARS), or signal attempts to spam a search engine”

If a number of links to a document disappear over a given time frame (threshold), the document can be considered 'stale'. The signals can include:

  1. the date at which one or more links to a document disappear,
  2. the number of links that disappear in a given window of time,
  3. or some other time-varying decrease in the number of links (or links/updates to the documents containing such links)
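A toy version of that link-loss signal, counting disappearances inside a recent window — the window length and threshold are assumptions for illustration, since the patent only says "a given window of time":

```python
from datetime import date, timedelta

def stale_by_link_loss(loss_dates: list[date], today: date,
                       window_days: int = 90, max_losses: int = 10) -> bool:
    """Count link disappearances inside a recent window and flag the
    document when losses exceed a threshold. All parameter values are
    illustrative assumptions."""
    window_start = today - timedelta(days=window_days)
    recent_losses = sum(1 for d in loss_dates if window_start <= d <= today)
    return recent_losses > max_losses
```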

 

This can in turn be used to devalue or totally ignore the links pointing to a given document. You should always be aware of the norm in any market/niche you are doing SEO for. Anomalies can lead to links being discounted, value being removed, or flags being raised for the spam bots to come and have a closer look at your site/page.

Another temporal link factor that can be used to identify spam is linkage from non-related documents. When spikes of links to a document come from unrelated pages, another flag can be raised to send the spam bots over for a closer look.

 

Anchor Text

Link text information can also change over time, and it is another data source a search engine can utilize to indicate an update or a change in the focus of a document (web page). Such occurrences happen when, for example, a domain has been purchased and the link text pointing to a page no longer topically matches it. The engine can look algorithmically for the point at which the page significantly changed, or at which the link text pointing to it changed.

The search engine may then assume that a change has occurred, and links pointing to the page prior to that point can be devalued or discounted altogether.

Thus anchor text can affect scoring:

  1. by the date of appearance/change of the anchor text,
  2. the date of appearance/change of the link associated with the anchor text,
  3. and/or the date of appearance/change of the document to which the associated link points.

If anchor text pointing to a document changes over time, both the documents the links reside on and the target document can be examined to determine whether they are still topically relevant.

This makes it important to ensure that documents you update are not modified so much that the links pointing to them are no longer of value. This is certainly important for sites with frequent home page changes; the topical nature of the links to the home page should be factored into any updates. It is obviously far more difficult to go back and ask those linking to you to change their link text to restore relevance.

 

Anchor Text Anomalies

As you would also imagine, tracking the additions, changes and ratio of link texts/phrasings can also be a useful tool. By observing the link graph over time, the search engine tries to differentiate true links from spammy links through the link texts associated with the page. Real humans don't generally all use the exact same link text, so natural profiles have a more varied flavor, whereas link spam often does not:

“(artificially) generated web graphs, which are usually indicative of an intent to spam, are based on coordinated decisions, causing the profile of growth in anchor words/bigrams/phrases to likely be relatively spiky.”

A document that has a non-natural rate of growth often has spikes of new backlinks with similar/identical link text associated with it. Documents that show such spikes over time can have the links capped or otherwise devalued. By capping I mean that only the first X number of links discovered will be valued and the rest tossed in the trash.

“One reason for such spikiness may be the addition of a large number of identical anchors from many documents. Another possibility may be the addition of deliberately different anchors from a lot of documents.”
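The capping behavior described above — valuing only the first X links discovered per anchor text — might look something like this; the cap value and data shape are assumptions for illustration:

```python
from collections import Counter

def cap_anchor_links(links: list[tuple[str, str]], cap: int = 5) -> list[tuple[str, str]]:
    """Keep only the first `cap` links for each exact anchor text and
    drop the rest. Links are (source_url, anchor_text) pairs in
    discovery order; both the shape and the cap are assumptions."""
    seen = Counter()
    kept = []
    for source, anchor in links:
        seen[anchor] += 1
        if seen[anchor] <= cap:  # links beyond the cap are discarded
            kept.append((source, anchor))
    return kept
```

Run against a spiky profile — eight links all reading "buy widgets" and one natural review link — a cap of three would keep only four links and toss the rest.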

So the main consideration here is to maintain the appearance of organic involvement throughout the link building process by varying the related link texts. This is why using directory/article submission services can be a bad idea, as can any form of mass link building with only a few variants in the associated link texts.

A good link builder will go the extra step of not only timing the link development, but crafting the link texts as well.

 

In the end analysis

The main takeaway from understanding how historical factors can affect your link building is this: quality link profiles take patience and focus. As with most things in life, the quick-fix, lazy-assed approach simply isn't going to work. While you may not get whacked as a spam meister, much of your effort could end up being devalued or destroyed altogether.

You must understand the market you’re building in and also ensure that focus doesn’t change substantially for any given page on the site. From that, the speed, type and focus of the links you’ll need will emerge. It also highlights the benefits of crafting quality content that can build diverse editorial links within the link building program.

Creating natural velocity and textual diversity for a target page’s link profile is important to get the best bang for the buck….

 

 

Comments  

 
0 # waveshoppe 2008-04-29 09:32
Aloha Dave, I just wanted to compliment you on this one, it
 
 
0 # Dave 2008-04-29 20:45
last fer a while... hands in a cast ...shall post pic tomorrow... a hunt and peck guy now.... arggggg

:sad:
 
 
+1 # Matt 2008-09-05 07:39
You're the first person in 4 years of creating SEO strategies to talk about some of the issues I believe are at the heart of good SEO. Well done!
 
 
0 # michaelj72 2009-06-21 03:17
slow and easy does it, that can be my motto on this front.
answers a bunch of the questions in the forums today about link building
 
 
0 # Steve 2010-02-23 14:34
That's a very thorough and detailed post listing some of the ways Google can determine link spammers, however like with many such theories I believe a fundamental thing is missed - Google must apply this historical incoming link usage to every single one of the billions of web pages in its index. Can Google really send spiders to billions of web pages each week and record the data suggested? In an ideal environment, they of course would, however I believe in practice that Google would take many shortcuts to decrease their required bandwidth, and this in the process would make the distinction between spammy sites and legitimate ones less certain than mentioned.
 
 
0 # Martin Schweitzer 2010-06-16 14:46
of the "linking eco system". I wish that some "potential linkbuilders" here in Germany had read your article. And would stop their spammy methods.
 
