Blame Google, a panda and a penguin.
While PageRank, Google's algorithm that scores a page by the links pointing to it, was their initial primary ranking system, content has become king.
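The PageRank idea can be sketched in a few lines: a page's score is fed by the scores of the pages linking to it, spread across their outgoing links. The four-page link graph below is hypothetical, and the damping factor of 0.85 is the value suggested in the original PageRank paper; this is a minimal illustration, not Google's actual implementation.

```python
# Hypothetical link graph: each key lists the pages it links to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Simple power-iteration PageRank over a dict-of-lists graph."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # split score over outlinks
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

ranks = pagerank(links)
# Page "c" ends up with the highest score: three pages link to it.
```

Note that nothing here looks at the words on a page, which is exactly why Google needed content-based signals on top of it.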
‘Panda’ became the code name for a series of changes to Google’s ranking algorithms that ranked sites by the text and keywords found on individual pages. In practice this meant that a page with lots of text containing the keywords a person searched for would rank high in the results. Sounds logical, but it backfired.
People took advantage of that system by stuffing their sites with keywords. Someone selling used cell phones, for example, would put the phrase “used cell phones” hundreds of times on a single page, across many pages. Another popular way of cheating was scraper sites. By scraping the web, these sites would essentially steal content and copy it, creating duplicates, then wrap the stolen text in ads. Such sites could be generated automatically by the thousands, misleading first the search engine and then the visitor, who would click on the ads and hand the scrapers revenue.
Google quickly grew wise to that and adjusted Panda again. It now looked for complete articles and ignored the obvious cheaters and duplicators. Soon, however, spun content appeared: a program would generate a nonsensical article stuffed with the keywords. Since Google’s search spiders are only robots, they couldn’t tell the difference between a genuine article and a spun one.
More recently, Google changed its approach again, this time bringing artificial intelligence into the fold. It studied human test subjects as they surfed sites and rated them for quality, trustworthiness, speed, design and so on. Panda’s learning algorithm picked up on the patterns these testers agreed on, and so learned to tell spun articles from real ones, bad sites from good sites, low quality from high.
In 2012, a penguin came along. Together with a fresh Panda update, this new Google algorithm declared war on sites violating the webmaster guidelines. Fake articles, link schemes, keyword stuffing, scraper sites and other black-hat practices are now being recognized and ignored.
Thanks to the panda and the penguin, Google now only respects sites that are legit. To get a good ranking, a site needs genuine information on the subject it promotes, or at the very least genuine content containing the keywords, written by real flesh-and-blood writers. The more real articles and blog posts with keywords, the higher the respect and the higher the ranking.
This has created the need for webmasters and site owners to keep updating their sites with content. Lots of it. While that means more work and expense for webmasters and writers, it is a very fair arrangement for the end user.
After all, all we want to do is to type in a keyword and get a decent answer.