Last month a number of sites reported a sudden drop in their traffic, which would tend to indicate that Google had released a fresh algorithm update affecting search results. The SEO press (yes, that’s a thing) was quick to badger Google for an explanation, but the search giant remained tight-lipped, until eventually confessing that it had made some tweaks to how its core algorithm measures quality.
Keen search fans will recognise that quality signals are usually addressed in Google’s Panda updates; however, the big G was keen to point out that the May update was not one of those. In fact, a new Panda iteration is set to roll out within the next two to four weeks, so if your site wasn’t affected by the May update, don’t assume you’re out of the woods just yet, because more tinkering is on the way.
Wait, What Is Panda?
Oh come on, haven’t we been here before? Really? Fine. Panda was the name Google gave to the major algorithm update it applied to its mighty search index back in 2011. The Panda update was designed to weed out sites with thin content and strike them from the prominent ranking positions in search results, allowing users to enjoy a better search experience as they were returned higher-quality, more authoritative results for their chosen search queries.
Like the responsible SEO experts that we are here at ThoughtShift, we put together a very handy guide to surviving Panda at the time, and much of its content is as relevant today as it was when the algorithm update was first unleashed. Fast forward to today and we’ve had numerous updates to the brutal Panda algorithm, which currently stands at version 4.1 (as of September 2014), with each subsequent iteration and refresh punishing a whole new swathe of sites deemed too low on quality to be given that oh-so-valuable page-one ranking space.
Should I Be Worried About This Latest Panda Update?
The official line from Google is always the same: if your website is of a decent quality, you’ve nothing to fear. The reality, however, is often quite different, and there are plenty of high-profile examples of sites that have seemingly played by the rules but still been hit with devastating losses to their rankings, which in turn decimated the volume of traffic they received. Well-established, high-authority sites such as eBay, parenting.com and digitaltrends.com were among those that saw a significant loss in search visibility following the last major update to the Panda algorithm in May last year, proving that Google was showing no mercy in dishing out punishment across the web. The message was clear: not even the big boys were safe from the rampant Panda.
Ultimately, the biggest losers at the hands of this update have typically been sites that rely on other people’s content. Sites that aggregate news or information from external sources, without adding any research or insight of their own, weren’t deemed to have the requisite credibility to appear high up in the search results. It stands to reason from Google’s point of view: after all, if a user is seeking information on a particular model of fridge freezer, a price comparison site might be useful, but it shouldn’t be the top result, because it has nothing fresh to offer that can’t already be found elsewhere on the web. Having sites that scrape other sites for information outrank the original sources is insulting to those who’ve put their time and effort into the original content.
Whilst the Internet may be based on the sharing of content, that doesn’t mean that those who market that content best should be able to profit from it at the expense of its original creators. In a world where content is still king, credit has to be given to authors and not just curators. In fact, Google’s dalliance with attributing more credibility to individual authors of online content via the rel=author attribute (canned last year) was a clear indication of its desire to give the greatest weighting to those at the top of the content creation tree.
Help, What Do I Do?
Abiding by the general rules of keeping on Google’s good side will be enough for most websites, and that means heeding its warnings about creating valuable content and building high-quality sites. Recite the mantra of:
- Is my site useful?
- Is my content unique?
- Am I engaging?
- Am I credible?
And as long as you can answer YES to all of the above with some degree of authority, the chances are you’ll be spared the wrath of the Panda. Remember, however, that for as much as Google tries to be proactive, updates such as this tend to be reactive: the webspam team at Google HQ notices where certain sites have achieved competitive rankings despite not offering the quality of content deemed appropriate for such lofty search result positions, and meddles with the algorithm to try and stop those sites from continuing to enjoy the advantages that come with such serious search visibility.
So, given that nobody can be certain of exactly how Google will decide on the latest victims of its quality cull, for most webmasters it’s simply a case of sitting tight and hoping you’ve made the grade come the big day. Should the worst happen and you discover your site has been affected, the important thing is not to panic, because recovery is possible. In fact, there are some great examples of a Panda recovery on our very own blog (where else?), along with top tips to restore your visibility following an algorithmic sucker punch.
Follow my contributions to the blog to find out more about Google updates, creating engaging content and eCommerce SEO best practice, or sign up to the ThoughtShift Guest List, our monthly email, to keep up to date on all our blog posts, guides and events.