SEO Blog - Internet marketing news and views  

5 ways to improve rankings through content management

Written by David Harry   
Tuesday, 08 July 2008 09:12

Understanding historical ranking factors for content creation/management plans

Does your content creation and management plan consist of merely a fire-and-forget-it perspective? Then you may be missing out on ways to not only strengthen existing rankings but also bolster new ones. You may be making costly mistakes without even knowing it. Today, we’re going to look at how historical ranking factors can come into play in your content plans.

A while ago I discussed historical ranking factors for link builders, and at the time I felt there was more within these patents to be discussed. So let’s jump right in and look at how some of these factors can be mined from a content development and management perspective.

Historical Ranking Factors



Document Inception Date – showing up on the radar

When we look at inception dates, there are a few ways a search engine may establish one. According to these patents, the inception date of a document may be determined from:

  1. the date the document (or its domain) is first crawled
  2. the first time the document is referenced in another document, such as a news article, newsgroup posting, or mailing list, or a combination of these
  3. a combination of crawl- or submission-based indexing techniques, or in other ways
  4. the date the search engine first discovers a link to the document
  5. a time stamp associated with the document by the server hosting it

It is important in some instances to ensure you’re getting known to the search engine as early as possible. Because of the nature of some topical environments, NEW content or news is more important. Because a fresher offering will undoubtedly have fewer links than more mature ones, inception dates can actually be used as a ranking signal.

What you can do: when launching new content, be certain to understand the market environment into which it is going. If you have done some research and feel your particular item has a time-sensitive or seasonal element to it, get some good links fast.

That is not to say we necessarily want links for ranking purposes so much as for getting noticed by the search engine as quickly as possible. Using social media and other sources is likely the order of the day here. We don’t care if the links are no-followed; we simply want a crawler to come back to the content in question, valuated link or not.


Content Updating

Another area that search engines look at is the frequency and amount of updates a given document has seen. Update Frequency factors measure how often a page is updated over a given period of time, while Update Amount calculations are often based on factors such as:

  1. the number of “new”/ “unique pages” associated with a document over a period of time.
  2. the ratio of the number of new/ unique pages associated with a document over a period of time versus the total number of pages associated with that document.
  3. the amount that the document is updated over one or more periods of time
  4. the amount that the document (or page) has changed in one or more periods of time (e.g., within the last x days).

To qualify this though, we’re talking about actual content, not items such as JavaScript, comments, advertisements, navigational elements, boilerplate material, or date/time tags, which are generally ignored or lightly weighted. On the other hand, page titles, H1-H5 headings and outbound link texts can be given greater weight. This is at the discretion of the engineers and the tuning of the dials.

By looking at these factors search engines can see if there is an acceleration or decline in the changes to a given page from one timeframe to another.
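One naive way to express that acceleration check is to compare the fraction of (non-boilerplate) content that changed in the latest window against the window before it. The word-level diff and the snapshot data below are illustrative assumptions, not how any engine actually computes this:

```python
def change_fraction(old_terms: list[str], new_terms: list[str]) -> float:
    """Fraction of the new version's terms not present in the old version."""
    old = set(old_terms)
    changed = sum(1 for t in new_terms if t not in old)
    return changed / len(new_terms) if new_terms else 0.0

# Three monthly snapshots of a page's visible content (boilerplate stripped).
jan = "historical ranking factors overview".split()
feb = "historical ranking factors overview with examples".split()
mar = "historical ranking signals rewritten guide with new case studies".split()

recent = change_fraction(feb, mar)  # change in the latest window
prior = change_fraction(jan, feb)   # change in the window before it
print("accelerating" if recent > prior else "steady or declining")  # accelerating
```

Whatever the real math, the signal being modeled is the same: is this page being revised more, or less, than it used to be?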


What you can do: this becomes important to the content plan programmer in terms of not only producing new content, but also occasionally updating older target pages and internal link texts as needed. Make updating popular and time-tested pages part of your plan, not merely pumping out new pages.

Over time, as new keywords are targeted, updating the titles and link texts of pages already ranking for related terms can also boost the overall program.

Furthermore, associating related content via internal linking can show a theme and direction that brings new value to older content. Once more, when targeting new terms, look for ways to strengthen older related content with new pages via referencing.



Query Analysis and Behavioral Metrics

Query analysis is one area we have discussed more than a few times here over the last year, more specifically user behavioral metrics and query analysis. In essence: how does the end user interact with the SERP listing?

For more on that, see the posts on User Performance Metrics and Personalized Search.

In the case of historical ranking factors, a search engine may plot user interactions over time, including:

  1. Click-through rates for a given document
  2. Bounce rates
  3. Time on page
  4. Interactions (such as bookmarking, saving, printing)
  5. Scrolling data


By looking at a variety of performance factors in concert with ranking data (SERP placement), they can try to ascertain whether a given document is gaining favor or losing interest with end users for a given query.
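As a toy illustration of that idea, a least-squares slope fitted to a document's weekly click-through rates tells you whether interest is rising or fading. The data and the week-over-week framing are invented for the example:

```python
def ctr_trend(weekly_ctr: list[float]) -> float:
    """Least-squares slope of CTR over equally spaced weeks."""
    n = len(weekly_ctr)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(weekly_ctr) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, weekly_ctr))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Eight weeks of CTR for one document on one query (made-up numbers).
ctr = [0.12, 0.11, 0.11, 0.10, 0.09, 0.08, 0.08, 0.07]
slope = ctr_trend(ctr)
print("losing interest" if slope < 0 else "holding or gaining")  # losing interest
```

A search engine would of course combine many such signals; the point is simply that a steady downward drift in engagement, at a stable SERP position, is itself information.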

What you can do: the smart content programmer will always want to be intimate with the analytics, not only to get a feel for what is popular, but also to monitor pillar content and search-referrer traffic behavior to ensure those landing on those pages are getting the best possible experience.

Your content should not be a fire-and-forget-it affair. Understanding how the search engines perceive the visitors they are sending you, as far as the relevance of your listing goes, is an area to watch. You should also be looking at the search results listing to see if the meta description is being used as the snippet, and tweak it accordingly.




Link Text Valuations

Another area a search engine may look at is link texts. That is to say, when there are significant changes in either the content of a document or the link texts pointing to it, a search engine questions whether they are still relevant. If over time one of the two changes, the link texts pointing to the document may be deemed no longer relevant and weighted less or discounted altogether.

To quote one Google patent on HRFs;
“One way to address this problem is to estimate the date that a domain changed its focus. This may be done by determining a date when the text of a document changes significantly or when the text of the anchor text changes significantly. All links and/or anchor text prior to that date may then be ignored or discounted.”

Link velocity is another consideration with inbound link texts, in that newer documents often have fresher links in their profiles. While not directly related to content creation, a content plan that includes older documents should pay heed to such theories.

What you can do: when working on your content creation/management program, be careful to understand any shifts in focus from the original document you’re editing and the related inbound links. You can also promote older content and create new links over longer periods of time through various link building methods.

The content management team should always be closely aligned with the link building department; this area highlights missed opportunities and potential risks worth coordinating on.


Tin Foil Hat time – traffic patterns

OK, a slightly more conspiracy-theorist take: search engines such as Google mention using traffic data over time to valuate a given web page. I invoke the tin foil hat in that we’d now be accepting that Google uses SERP data, Toolbar data, cookies, Trends or even Google Analytics to formulate these assertions.

Changes in traffic patterns can mean the document is not fresh, has lost popularity, or has seasonal fluctuations. Either way, it is a perception we want to pay attention to. By watching historical traffic patterns, a search engine may deem a given document in a known niche to have less value and weight it accordingly.

What you can do: while watching and monitoring traffic patterns is, as stated previously, important to the (SEO) content manager, most of the above items should help you focus on creating strong SERP representation and content plans. If you put all of the above together with quality content development, traffic pattern issues should take care of themselves. But do get intimate with your analytics package and watch the traffic patterns to and around major landing/target pages.

And there we have it… I tried to keep this ride as simple as possible to allow you, dear reader, to get creative with the concepts. It is more important to understand how search engines perceive things than to follow a stringent set of rules here. Now that we’ve looked at link building and content plans, next time I want to talk about spam detection and HRFs. We’ll look at how Google and Microsoft go about detecting and weighting potential spam documents – always important to know where the lines are drawn.

Some related reading;

Documenting your SEO changes historically – SEO Round table
Historical Ranking factors revisited; and here as well - SEO by the SEA
Ranking webpages based on their history – Google System
Link builder’s guide to historical ranking factors – here on the trail

