Google’s search for the perfect result

Wednesday, 02 March 2011 / Published in Blog

Google has created a whole industry around the relative performance of websites in its search results. The quality of those results is the single biggest reason Google is so dominant on the web today.

It’s also the underpinning of the slew of products and services which have seen the search giant move into mobile, TV and many other developing areas.

That’s why everyone’s interest was piqued when Google went out of its way to announce significant changes to the search algorithms which decide just what does, or doesn’t, end up in pride of place on the first page of results for any search query.

That industry, generally known as the Search Engine Optimisation (SEO) sector, involves a range of large and small companies from marketing or tech (or both) backgrounds. They have filled a niche for website owners who simply lack the skills, or the inclination, to tweak their own web pages to maximise the chances of appearing in the organic (or unpaid) search results of the big G.

Many other companies, such as website designers, have added SEO to their list of skills, some with greater success than others. But the key point here is that plenty of service companies have built businesses on the back of helping to prioritise client websites in Google’s search results.

It’s become a bit of a battle in some highly competitive areas. Take accommodation in a big city such as Dublin or Belfast and you can see that there will be a lot of hoteliers desperate to be featured at the top of the first results page for “Hotel, Dublin” or “accommodation, Belfast”.

So when Google announces that it is changing the way it ranks websites and that it will affect more than 10 percent of the current crop of results, it’s a big deal.

Google is claiming that its only motivation is to improve these results for people searching for the best and most relevant web links. On the face of it that seems fair and reasonable: a search methodology should always sort the wheat from the chaff.

Ostensibly Google wants to reduce the number of what it terms “low-quality sites – sites which are low-value add for users, copy content from other websites or sites that are just not very useful.”

The problem is that no matter how much you try to account for every variable, there will always be some false positives in the mix. Further, whilst this major algorithmic tweak currently affects only the US search service, it will roll out to other national search sites over time.

In the US it has already been demonstrated that websites which cannot reasonably be termed ‘low-quality’ have fallen foul of this big tweak, and that has to have implications for national Google services as the new search code rolls out. One early victim was the venerable British Medical Journal; another was Cult of Mac, which covers the latest Mac-oriented news.

So, whilst it’s not unreasonable for Google to attempt to improve the search results for the end users of its service, it seems there will inevitably be collateral damage. For those legitimate sites which manage to fall foul of the latest search set-up, it’s unlikely there will be any recourse.

SEO is undoubtedly a constant war between competing sides across a huge range of sectors. For all its undoubted usefulness, Google’s search results did begin to suffer under this constant battling to be top of the search pile. In trying to rectify the worst effects, however, it looks like some legitimate sites will become inadvertent casualties.

That’s not to suggest for a moment that Google is wrong to try to keep its results relevant to the users of its service. But compared to 10 years ago, when far less rested on how much traffic went to a particular website, a change to the Google algorithm today has the power to make, or break, a business.

