
Flow on SEO, and why content is still King

Kate Rau
28 Feb 2013

What do MC Hammer pants, a Nirvana mix tape and search engine optimisation (SEO) have in common?

They’re all from the ‘90s. The same decade that saw Andre Agassi win the Wimbledon Men’s singles; a time when raves became popular (and then unpopular), when Dolly the sheep was cloned from a somatic cell and when Bill Clinton was elected President of the US.

But everything we know about the ‘90s has since changed.

MC Hammer was arrested, Kurt Cobain shot himself, Agassi retired, Clinton “did not have sexual relations with that woman”, and Dolly is pushing up daisies.

What we knew about SEO, and how search engines such as Google work, has changed too – a lot.

“Google’s aim is to deliver highly relevant content to its users. And it has differentiated itself from other search engines – Yahoo, Bing and Ask, for instance – by doing this better than its competitors,” says Richard Frank, head of Flow’s programming studio.

Flow’s Cape Town Manager Roy Barford agrees: “It’s important to identify searches before they start happening, and prepare quality content with clever titles and image alt tags that will make your page appear at the top of Google’s rankings for that related search.”
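The “clever titles and image alt tags” Barford mentions live in a page’s markup. A minimal illustration (the page content here is hypothetical):

```html
<head>
  <!-- The title tag is one of the strongest on-page signals for a search -->
  <title>How to Re-sole Running Shoes | Acme Shoe Repairs</title>
</head>
<body>
  <!-- Descriptive alt text tells a search engine what the image shows -->
  <img src="resole-step1.jpg" alt="Removing the worn sole from a running shoe">
</body>
```

Both elements describe the page in plain language, which is exactly the kind of relevance signal Google rewards.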

In the past, website developers and SEO “experts” could game the system and get higher search rankings merely by changing the text on a website’s pages, increasing the incidence of keywords and increasing the number of outgoing and incoming links on a website’s pages.

Those in the SEO game would build link farms – hundreds of low-quality but keyword-rich pages that would lie unseen within the site’s CMS (content management system). Advice for growing your SEO focused on keyword usage and bolding – in the title tag of a page, in the headings on each page, and in the URLs of the site’s pages.

But none of the above techniques are, or were, in line with Google’s main objective – to deliver high-quality and relevant search results to its users.

“In 2011 Google launched Panda, an updated search algorithm that sought to deliver ‘higher quality’ sites to the top of the rankings. The algorithm took a bunch of elements into consideration, but its chief focus was on quality,” adds Frank.

What Google says


The Google Webmaster Central blog featured the following advice for developers and publishers of online content:

“Below are some questions that one could use to assess the ‘quality’ of a page or an article. These are the kinds of questions we ask ourselves as we write algorithms that attempt to assess site quality. Think of it as our take at encoding what we think our users want.”

The list of questions, which was given to testers to survey dozens of websites, included the following:

• Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
• Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
• Would you be comfortable giving your credit card information to this site?
• Does this article have spelling, stylistic, or factual errors?
• Does the article provide original content or information, original reporting, original research, or original analysis?
• Does the article describe both sides of a story?
• Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
• Does this article have an excessive amount of ads that distract from or interfere with the main content?
• Would you expect to see this article in a printed magazine, encyclopedia or book?
• Would users complain when they see pages from this site?

The result was an 84% correlation between what the human testers judged to be “high quality” and what Google’s algorithm classified as “high quality”.

As if the warning wasn’t enough, in 2012 Google launched Penguin, a further update to the search algorithm that focused on reducing the rank of sites using “black-hat SEO” – techniques such as keyword stuffing, cloaking, link schemes and deliberate duplication of content.

Where we are today

Taking Google’s advice into account means that developers and content producers have direct control over their websites’ search rankings.

Websites that deliver high-quality, original content, written by experts and offering a superb user experience, will rank higher than sites that don’t. It’s that simple.

“Websites we develop have a clear navigation structure and sitemap, and are rich with content without being over-burdened by too many links. We write content for humans, stuff that is interesting and relevant. We check for broken links and fix the HTML code regularly,” says Richard.
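Checking for broken links, as Frank recommends, is easy to automate. A minimal sketch in Python, using only the standard library (the fetch step that would actually request each URL, e.g. with urllib, is left out here; this just pulls the links to verify out of a page’s markup):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all outgoing link targets found in an HTML fragment."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Example: pull the outgoing links from a fragment of page markup.
page = '<p><a href="/blog">Blog</a> and <a href="http://example.com">a site</a></p>'
links = extract_links(page)
```

From there, a script could request each collected URL and flag any that return a 404 – the kind of routine housekeeping Frank describes.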

“So many website administrators out there make the mistake of using duplicate content (content which has already appeared on one or more websites) on their websites because this seems the easiest option.

“As a user, I’m annoyed when I come across content that I’ve read on another website, particularly when the information is incorrect or insufficient, and as a media developer, I’ve noted the negative effect this has on search rankings. Plagiarism is wrong, don’t do it,” adds Roy.

“My advice is to think about what search terms people in your target market would use, and make sure you produce answers to the questions they may have, in line with that particular search. Page titles are important, but even more important is that the content within said page is unique, well written, accurate, and useful or interesting.”
