Google Algorithms & Self Respect

No doubt about it: a world without Google Search has become inconceivable.

In as little as 20 years, the big G has become a seamless part of our lives. It’s how we find the things we need and desire, who we turn to when seeking answers, and how we make sense of the world we live in. Google has become the backbone of marketing and capital flows around the world: it’s how businesses serve their consumers better and, most of all, how they reach and appeal to their target market in the first place.

The Big Online Marketing Buzz

No wonder, then, that ranking high on Google’s SERPs is the big buzz, and that the battle for the top ranks among the 130 trillion pages that constitute today’s web is fiercer than ever. Imperfections in the search engine’s ranking algorithm are exploited without delay or hesitation, giving rise to one of the most time-consuming phenomena of our everyday lives: spam.

Eliminating spam and increasing the share of high-quality results delivered to you are at the very basis of Google’s success model and at the root of every algorithm update.

Seeing the Forest beyond the Black Box

While the code constituting Google Search is likely to remain one of the best-kept secrets on the planet, the objective driving the underlying algorithms and their numerous updates is rather easy to decipher.

Google, while maintaining an undisputed monopoly on search and widely dominating the dynamics of the web, is on the other hand also bound to deliver success – success in the form of relevance. In other words, the monopoly it defends rests on gaining a precise understanding of what you want and delivering similarly precise answers – in the form of content – to your fingertips. Eliminating spam and irrelevant content from its SERPs is what its success model, and consequently its algorithm updates, are all about.

The Task for Google is big

The web, according to Google, consists of 130 trillion pages today – and this number is growing exponentially. Google’s index, where these 130 trillion pages and their content are tracked and organised, currently spans some 100 million gigabytes. Every second, Google processes more than 35,000 searches. These numbers hint at both the complexity and the sheer stability of the infrastructure behind the world’s most popular search tool.

Keep calm and be relevant

In comparison, the dynamic driving this unfathomable engine is – to most of us – rather simple: efficiently ranking content by relevance, and thereby filtering spam out of your search results.

Consequently, our task as search engine optimisers lies in understanding relevance the way Google does: by providing unique and interesting content to the user, i.e. your clients and target market.

In general, this means deploying the marketing and site development best practices that any self-respecting site owner should be following anyway, even if search engines didn’t exist.

When Google first started in 1998, the search engine was far from accurate, relying only on basic meta tags to find and return the results you were looking for. Back in the day, shortcuts to high rankings were as simple as spamming those meta tags.

A lesson from history

Today, Google has come a long way in interpreting what you are looking for, delivering quality results for your search and eliminating spam. Let’s take a quick look at the most recent and predominant milestones of this journey:

February 2011, PANDA Update – pushing for quality content

Panda focuses on the quality of web content – it was designed to prevent websites with poor-quality content from ranking high on result pages. Sites such as content farms, or those with a high ad-to-content ratio, would be penalised for poor content. Panda was also meant to lower bounce rates and weed out duplicate content – leading to an improved user experience.

Google stated that 12% of search results were affected by the initial Panda launch. A problem, however, surfaced very quickly: poor-quality websites with good SEO were now ranking higher than websites with good content but poor SEO. Subsequent Panda updates have been rolled out to refine the ranking criteria and minimise this problem.
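
To give a feel for how duplicate content can be flagged in practice, here is a minimal Python sketch using word shingles and Jaccard similarity – a standard near-duplicate technique, not Google’s actual (undisclosed) method. The two sample pages and the 3-word shingle size are purely illustrative.

```python
def shingles(text: str, k: int = 3) -> set:
    """Overlapping k-word windows ('shingles') of the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: |A & B| / |A | B| (1.0 = identical sets)."""
    return len(a & b) / len(a | b) if (a or b) else 1.0

page_a = ("panda was designed to prevent websites with poor quality "
          "content from ranking high on result pages")
page_b = ("panda was designed to prevent websites with poor quality "
          "content from ranking high in search results")

score = jaccard(shingles(page_a), shingles(page_b))
print(f"{score:.2f}")  # ~0.65: identical pages score 1.0, unrelated ones ~0.0
```

Real systems tune the shingle size and the flagging threshold; the point is simply that near-duplicate text is cheap to detect at scale.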

April 2012, PENGUIN update – eliminating sites with black hat techniques

The PENGUIN update was developed to filter out sites that artificially inflated their rankings through practices such as keyword stuffing or the unnatural accumulation of inbound and outbound links.

In other words, the sites affected were buying backlinks or creating links from articles on bogus blogs purely to boost their rankings. To avoid PENGUIN penalties, it is critical to feature natural anchor texts, contextual thematic links and high-quality, relevant backlinks from trusted sites as valuable as yours.
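
Keyword density is one rough self-check against the first of these pitfalls. The Python sketch below is illustrative only – the 3% threshold is an assumption for demonstration, not a number Google publishes.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

copy = ("Cheap shoes. Buy cheap shoes online. Our cheap shoes are the "
        "best cheap shoes for anyone who wants cheap shoes.")
density = keyword_density(copy, "cheap")
print(f"density: {density:.1%}")  # 25.0% - far beyond natural usage
if density > 0.03:                # 3% threshold is illustrative only
    print("possible keyword stuffing")
```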

August 2013, the HUMMINGBIRD algorithm – a better understanding of the user’s intent

This major update of Google’s search algorithms focuses on search queries, taking into account the meaning of the words or phrases entered into the search box and the intent of the searcher. This allows Google to match results more appropriately to people’s search queries. In other words, it focuses on semantic search rather than simple keyword matching.

What this means is that long-tail keywords are now an important part of your SEO. Beyond that, if you have high-quality content on your website and you already rank well, Hummingbird will simply make it easier for your content to be matched with relevant search queries.
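
To illustrate the idea behind semantic matching – without claiming anything about Hummingbird’s actual internals, which are not public – here is a toy Python sketch. The tiny synonym map is a hypothetical stand-in for real language understanding; it lets a page phrased differently from the query still match.

```python
# Hypothetical synonym map: a stand-in for real semantic models.
SYNONYMS = {
    "fix": {"repair", "mend"},
    "flat": {"punctured", "deflated"},
    "tire": {"tyre", "wheel"},
}

def expand(terms: set) -> set:
    """Add known synonyms to a set of query terms."""
    expanded = set(terms)
    for t in terms:
        expanded |= SYNONYMS.get(t, set())
    return expanded

def matches(query: str, page_text: str) -> bool:
    """True if any expanded query term appears in the page text."""
    q = expand(set(query.lower().split()))
    return bool(q & set(page_text.lower().split()))

# A literal keyword match would miss this page; the expanded query does not.
print(matches("fix flat tire", "how to repair a punctured tyre at home"))  # True
```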

What’s next? Differentiating between popularity (mere traffic) and authority (backlinks)

There has been speculation recently on what the next round of updates will mean for websites. Matt Cutts (head of Google’s Webspam team) has suggested a need to differentiate between sites that are popular because they attract lots of traffic, and those that are authoritative because lots of people link to them.

An organic and natural backlink strategy will therefore be essential to climbing and maintaining top rankings on Google’s SERPs. But isn’t that what any self-respecting site owner would watch out for, even without these updates?

Lessons to take home:

Google’s algorithms work in favour of good-quality, unique and entertaining content. Sites built to provide just that will naturally climb to the top of Google’s SERPs.

Efficient SEO therefore must begin with:

  • a thorough understanding of what makes you, your products and services unique,
  • the development of interesting, non-generic content that speaks to your target audience and reflects the keywords they use to find you,
  • outreach to authoritative networks of backlinks, and
  • making sure that your internal link structure and architecture are easy to navigate, your pages load fast and your title, alt, H1 and H2 tags are used appropriately (a minimal self-audit for these tags is sketched below).
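
As a minimal illustration of that last point, the following Python sketch audits a page for the basics named above. It assumes the third-party BeautifulSoup library (pip install beautifulsoup4); the checks are illustrative, not a definitive SEO audit.

```python
from bs4 import BeautifulSoup

def audit(html: str) -> list[str]:
    """Flag missing title, H1/H2 and image alt-text basics."""
    soup = BeautifulSoup(html, "html.parser")
    issues = []
    if soup.find("title") is None:
        issues.append("missing <title>")
    h1s = soup.find_all("h1")
    if len(h1s) != 1:
        issues.append(f"expected exactly one <h1>, found {len(h1s)}")
    if not soup.find_all("h2"):
        issues.append("no <h2> subheadings")
    for img in soup.find_all("img"):
        if not img.get("alt"):
            issues.append(f"image without alt text: {img.get('src', '?')}")
    return issues

sample = "<html><body><h1>Hi</h1><img src='logo.png'></body></html>"
for issue in audit(sample):
    print(issue)
# missing <title> / no <h2> subheadings / image without alt text: logo.png
```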

To ensure these four factors are well integrated into your marketing strategy, and to secure your ranking success, we have developed a work process that revolves around understanding your business as a unique player in your market space and establishing potential authoritative networks suited to your objectives.