If there is one area I’d consider myself an authority on, it’s personalized search. In fact, behavioural signals in general (implicit and explicit) are an area I do a LOT of reading (and research) in. We’ve gone as far as actually testing against Google’s personalized search (on two occasions) to see exactly how much it was being implemented (surprisingly little). It seems the folks at Google are taking it to a new level.
On Friday the folks at Google made an announcement relating to Psearch
“Today we're helping people get better search results by extending Personalized Search to signed-out users worldwide, and in more than forty languages. Now when you search using Google, we will be able to better provide you with the most relevant results possible.” - Google Blog
This is kind of interesting on a few levels;
We’re wondering (at the Dojo) if this really is something recent as we have been seeing evidence of personalization when not logged in for a short while now… so it’s possible it was rolled out on some data centers before others.
Deeper personalization requires more processing power; does that mean this is somehow related to the Caffeine update?
The main point here is that people no longer need to be logged into Google to see modified results. You will know a set of results is modified when the ‘View Customizations’ link is inserted in the SERP. Like so;
(care of Search Engine Land)
Dawn of a new age?
As I write this I can already envision the spate of feckin’ posts that will come out this week with crap like ‘Rankings are dead’ – ‘Everyone gets a different SERP’. Let’s stop that one right here and now... let’s not start that funeral procession just yet my friends. You see, in some of our past testing (can be found here) we established a few things;
For starters it is worth noting that we looked at a small set of related informational queries; there is more testing to be done. That being said, here are a few initial findings;
No 2 SERPs the same; of interest is that there was a constant state of flux, and regardless of the searcher’s location (within the US) each set of results was unique.
No SERP unrest; with the above in mind, there wasn’t massive movement. It was often more a re-ranking of the top 10 results than totally different sets of URLs in each.
Top dogs; while there was movement, the top 3-4 results were often very consistent with minor re-rankings. The 5-10 positions were far more likely to have larger re-ranking anomalies.
Personalization is a kitten; not a dragon. Another interesting finding is that personalization wasn’t having as much of an effect as many have felt it would. Yes, there was evidence of high levels of re-ranking, but the re-rankings with PS off weren’t greatly different from those with it turned on.
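For those who want to run this kind of comparison themselves, here is a minimal sketch of how you might quantify the difference between two top-10 SERPs. The URL lists and both helper functions are invented for the example; they are not the tooling we used in our testing.

```python
# Sketch: quantifying how much two top-10 SERPs differ.
# The URL lists below are hypothetical placeholders, not real test data.

def serp_overlap(serp_a, serp_b):
    """Fraction of URLs that appear in both result sets."""
    shared = set(serp_a) & set(serp_b)
    return len(shared) / max(len(serp_a), len(serp_b))

def avg_rank_shift(serp_a, serp_b):
    """Average absolute position change for URLs present in both SERPs."""
    pos_b = {url: i for i, url in enumerate(serp_b)}
    shifts = [abs(i - pos_b[url]) for i, url in enumerate(serp_a) if url in pos_b]
    return sum(shifts) / len(shifts) if shifts else 0.0

# Two hypothetical SERPs for the same query, from different users:
user1 = ["a.com", "b.com", "c.com", "d.com", "e.com"]
user2 = ["a.com", "c.com", "b.com", "d.com", "f.com"]

print(serp_overlap(user1, user2))    # 0.8 - mostly the same URLs
print(avg_rank_shift(user1, user2))  # 0.5 - mild re-ranking, not upheaval
```

High overlap with a small average shift is exactly the ‘kitten, not dragon’ pattern described above: the same URLs, gently shuffled.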
The important takeaway here is that this is NOT a wholesale change of SERP results from one user to another. This will obviously vary across query types, but from what we saw, it wasn’t the SEO killer that some have feared. It is a far more subtle change and we should not start professing that the game has changed.
Let us remember, they are not saying that the way they calculate personalization has changed, merely the presentation (i.e. logged in/out makes no difference).
Yes, me and the gang in the SEO Dojo will be doing some more testing to make sure… But my suspicion is that personalization levels haven’t changed from the testing we did earlier this year.
Levels of personalization
Ok, so what exactly constitutes personalized search? For the sake of making sure we’re on the same page, let’s take a look at some of the common elements that play into personalization;
Service account (a profile on a search engine service)
Toolbar data (auto-fill)
Search patterns (query types)
Surfing history (implicit data)
Non search referrer
Tracked via toolbar/browser
Interactions with site
Now this isn’t a blueprint for Google. The above elements are simply some of the many that search engineers consider, drawn from various papers/patents I’ve come across. The main thing is that you better understand how search engines work. Personalized search can be about a great many factors, not simply ‘search history’ or ‘surfing behaviour’.
If you, the fastidious SEO, are to truly grasp the implications and applications, you need to be mindful of the many factors at play. There is no tool that will teach you how they all work together and where they are going. Understanding personalization is almost a strategic or artistic endeavour.
I’ve listed a ton of other reading at the end… get to know this stuff!
The Spam Connection
One of the more interesting parts about implicit/explicit user feedback is that it can be VERY effective in dealing with spam. The more personalized the results, the less chance that spammy domains will be ranking. This is VERY important.
Not only will this enable them to help limit spam through personalization, it could also be a great source of query/click analysis for Google. Consider: if the click data across multiple users shows that a given entry in a query space is rarely clicked, or shows a high bounce rate, they might just use that signal as a dampening factor for said result. Yes, I did say ‘bounce rate’ lol…
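To make the ‘dampening factor’ idea concrete, here is a toy sketch. The formula, thresholds and function are entirely invented for illustration; nothing here reflects how Google actually weighs (or whether it weighs) these signals.

```python
# Illustrative only: a toy dampening factor built from aggregate click data.
# The formula and thresholds are invented for this example; Google's actual
# use of such signals (if any) is not public.

def dampening_factor(impressions, clicks, bounces, min_impressions=100):
    """Return a multiplier in [0.1, 1.0] to apply to a result's base score.

    A result that is rarely clicked, or whose clicks mostly bounce,
    gets a lower multiplier.
    """
    if impressions < min_impressions:
        return 1.0  # not enough data - leave the result alone
    ctr = clicks / impressions
    bounce_rate = bounces / clicks if clicks else 1.0
    # Low CTR and high bounce rate both pull the factor down.
    raw = 0.5 + ctr * (1.0 - bounce_rate)
    return max(0.1, min(1.0, raw))
```

The point of the sketch is the shape of the signal, not the numbers: a result that gets impressions but no satisfied clicks quietly sinks, which is exactly why this kind of aggregate data is interesting as a spam filter.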
That’s what is VERY interesting for me. I have long had a love-hate relationship with Google’s use of behavioural data – the main reasons were spammability and processing power. I’ve long held the belief that we wouldn’t see these signals adopted until those issues were dealt with.
Between deeper personalization and greater processing power (of Caffeine) we might just be at the dawn of a new age. That’s VERY exciting news for this geek.
What you can do
Obviously there will be lots of talk about what we can do to better align ourselves (and clients’ websites) to make the most of a more personalized world. Here are some areas that will become increasingly important;
If you are already knee deep in Content Strategy, incorporating Social and otherwise maxing out engagement levels, I can’t see that you have much to be concerned about. Using implicit data gives greater insight into user satisfaction (on paper at least) and those already moving in this direction should be fine. If you’re still pushing borderline MFA sites and link spamming for love; it may be time to reconsider the approach.
As far as setting goals is concerned, let’s look at the take-away from our recent research;
Top 1-4 spots are safest
Top 5-10 are secondary targets
Check multiple geo-targeted data centers for rank checking
Rankings to traffic ratios are ineffective
Consider query revisions when targeting (Analytics/Google suggest anyone?)
Over-all search traffic growth is a more important metric than rankings (as always)
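On the rank-checking front, one practical trick at the time was requesting depersonalized results directly. A minimal sketch, assuming the `&pws=0` parameter (documented at the time as disabling personalized results) and `&gl` (country) still behave as reported — verify both before relying on them:

```python
# Sketch: building SERP-check URLs that strip out personalization.
# Assumes &pws=0 disables personalized results and &gl sets the country,
# as documented at the time - treat both parameters as assumptions.

from urllib.parse import urlencode

def serp_url(query, country="us", personalized=False):
    """Build a Google search URL for rank checking."""
    params = {"q": query, "gl": country}
    if not personalized:
        params["pws"] = "0"  # ask for depersonalized results
    return "https://www.google.com/search?" + urlencode(params)

print(serp_url("seo dojo"))
# https://www.google.com/search?q=seo+dojo&gl=us&pws=0
```

Checking the same query with and without `pws=0`, and across a couple of `gl` values, is a quick way to see how much personalization and geo-targeting are moving your rankings.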
Remember, Google stated they’ve changed the presentation, not the processing. Until we have new data to the contrary I will still be working from that assumption. We’re not going to see major upheaval (yet?).
“Imagine a world of Google Search where each person/computer received a different set of results? Or sets of query results that were fluid and more alive? No longer could the Search Optimizer simply use tried and true techniques to rank a given site, identifying probabilistic models, themes and demographics would become a talent to be learned.
The SERPs (search engine results page) would be a more fluid environment and rankings would be a moving target. All of a sudden end user behavioural metrics (bounce rates, conversions, frequency) would play a role in ranking of documents to individual users or sets of user groups” – yours truly, Dec. 2007
That was some two years ago, and it was followed up with;
“You see, there are many ways to aggregate data other than a logged-in Google Account user; such as the Google Toolbar, IP addresses, cookies and more recently, the Google computer and Google Mobile (dubbed Android) possibilities are endless. This proliferation of Google services embedded in such devices means there is even more access to conversion or performance data relating to natural search results.” – yours truly, Dec. 2007
Seems we’re getting there now… so let’s take that next step.
Social search meets personalized search is something I think we’ll be seeing more of. If we consider an approach such as ‘user sensitive PageRank’ we see that search engines have an interest in not only singular personalization, but user-categorized personalization as well (smaller aggregate sets of implicit/explicit user feedback).
Granted, I’m not sure such signals always do what they’re supposed to (show intent/satisfaction), but I believe there is a place for them.
The main problems with personalization signals were always spam and processing power (beyond the aforementioned noise issues). Google has recently announced infrastructure updates (Caffeine), social search and the desire to speed up websites (in an effort to free up processing power?). These developments hint at what is in store.
Most certainly I’ll write more on that soon (being the end of the year and all).
Don’t freak out – m’kay?
Every time Google hiccups it seems the SEO world goes into spasms (and new crap services crop up). I remember a few short months ago how the ‘Vince update’ was going to mess everything up bla bla bla… Do we hear much about it now? Nope.
Thus I ask that we keep our heads on, do some due-diligence testing, and continue on with the job at hand. Once more this does highlight how SEO is most certainly not dead. Over the last month we’ve had; Social Search – Load Speed – Deeper Personalization and more. As they evolve, so do we (as SEOs).
Stay tuned for more once we’ve done some testing… If you’d like to take part, be sure to join the SEO Dojo today! (he he… shameless plugs are fun)
If you sit down and read ALL of the following… patents included… you will be one serious assed search geek when it comes to PS… Dig in!