Google’s search for the perfect result

by Ralph / Wednesday, 02 March 2011 / Published in Blog

Google has created a whole industry around the relative performance of websites in its search results. Search is the single reason Google is so dominant on the web today.

It’s also the underpinning of the slew of products and services which have seen the search giant move into mobile, TV and many other developing areas.

That’s why everyone’s interest was piqued when Google went out of its way to announce some significant changes to the search algorithms which decide just what does, or doesn’t, end up in pride of place on the first results pages of any search query.

That industry, generally known as the Search Engine Optimisation (SEO) sector, involves a range of large and small companies from marketing or tech backgrounds (or both). They have filled a niche for website owners who simply lack the skills, or the inclination, to tweak their own web pages to maximise the chances of appearing in the organic (unpaid) search results of the big G.

Many other companies, such as website designers, have added SEO to their list of skills, some with greater success than others. But the key point is that plenty of service companies have built businesses on the back of helping client websites rank higher in Google’s search results.

It’s become a bit of a battle in some highly competitive areas. Take accommodation in a big city such as Dublin or Belfast: there will be plenty of hoteliers desperate to be featured at the top of the first results page for “Hotel, Dublin” or “accommodation, Belfast”.

So when Google announces that it is changing the way it ranks websites and that it will affect more than 10 percent of the current crop of results, it’s a big deal.

Google is claiming that the only motivation is to improve these results for people searching for the best and most relevant web links. On the face of it that seems fair and reasonable: a search methodology should always sort the wheat from the chaff.

Ostensibly Google wants to reduce the number of what it terms “low-quality sites – sites which are low-value add for users, copy content from other websites or sites that are just not very useful.”

Problem is, no matter how much you try to account for every variable, there will always be some false positives in the mix. Further, whilst this major algorithmic tweak currently only affects the US search service, it will roll out to other national search sites over time.

In the US it has already been demonstrated that websites which can’t reasonably be termed ‘low quality’ have fallen foul of this big tweak, and that has to have implications for national Google services as the new search code rolls out. One early victim was the venerable British Medical Journal; another was Cult of Mac, which covers the latest Mac-oriented news.

So, whilst it’s not unreasonable for Google to attempt to improve the search results for the end users of its service, it seems that there will inevitably be collateral damage. For those legitimate sites which fall foul of the latest search set-up, it’s unlikely there will be any recourse.

SEO is undoubtedly a constant war between competing sides across a huge range of sectors. For all their undoubted usefulness, Google’s search results did begin to suffer under this constant battling to be top of the search pile. In trying to rectify the worst effects of this, however, it looks like some legitimate sites will inadvertently become casualties.

That’s not to suggest for a moment that Google is wrong to try and keep its results relevant to the users of its service. But compared to 10 years ago, when a lot less rested on how much traffic went to a particular website, a change to the Google algorithm today has the power to make, or break, a business.

Ralph
@ralphenn

http://ennclick.com

3 Responses to “Google’s search for the perfect result”

  1. Sheila says:

    The algorithm change has huge potential implications for SEO experts, many of whom have worked out their own way to manage Google and keep their techniques a carefully guarded trade secret. Google has never, as far as I know, set up a webpage that says “everybody, here’s how to optimize your page to show up higher in our rankings,” and I doubt there will be a guide to the new algorithm, either. To my knowledge Google continues to look favourably upon websites with a fresh and updated blog, so companies of all kinds would do well to revisit the question of when and how they can blog effectively, to keep themselves and their sites relevant.

  2. Sylvia says:

    Sheila – agreed. One of the best ways to ensure you get good Google results is to have original, regularly updated content, with plenty of inbound and outbound links. A blog is a great way to do this. Would a scrolling Twitter feed on a website also be indexed by Google for SEO purposes, I wonder?

  3. Ralph says:

    @Sylvia

    I think Google probably gives a weighting to every factor its bots discover as they traverse a website. The issue is, no-one really knows (and perhaps that’s just as well) just what that weighting is. It may be that you spend loads of time doing social stuff around your service or product, and it’s a stab in the dark as to whether the effort expended is justified by the relative improvement (or decline) in search position gained from doing so.

    I suspect, as you say, you ought to focus on the fundamentals and on making good, relevant content. Everything else is icing on the cake… if time permits.

    Ralph
    ennclick.com
