Google’s Panda algorithm is an enigma, a paradox. On one hand, it has devastated many web-based businesses. That is not a trivial thing. On the other hand, in time it has the potential to truly improve search quality and evolve information retrieval. It simultaneously destroys SEO-reliant sites that have leveraged Google’s vulnerabilities, while emphatically raising the barrier to entry for organic search as a marketing channel.
Panda has left sites starved and depleted of traffic, crushed in its wake. It has also brought out the best efforts of SEOs trying to find answers and solutions to the riddles.
In my own experience, for some queries it has greatly improved search quality, while for others it has caused an odd step backward. Anyone remember the Vince update? Brand bias seems to be back in a big way, something Aaron Wall has been pretty vocal about.
One important note: while brands tend to benefit from authority signals, Vince was technically not a brand bias per se, but rather a weighting toward authority metrics for head terms, from which brands naturally benefited the most. This is important in the context of Panda, too.
While there certainly is a reason to lean on brands for relevance in search, much of what we’re seeing on the web is the ‘real world’ lining up with the Internet. Brands and big business rule, on and off the web, but that wasn’t always true.
There was a time when the Internet held the promise of a level playing field. Perhaps that’s changing; or perhaps that’s Google’s vision for local search?
Friendly Fire
There is too much bloodshed out there. I can't remember a Google algorithm change that has done so much damage – so much collateral damage – as the February 2011 Panda update. One only has to read comments like this one from angry webmasters to get a flavor of the kind of "friendly fire" Panda has peppered across the web. Not to mention sites like WonderHowTo.com seeing scrapers and syndication partners outrank them for their own content.
Panda is a profound change to Google's algorithm, and it's no surprise that there are sites out there being hurt that may not deserve it. The only caveat I would offer: sites that feel they are being incorrectly singled out often turn out, upon closer examination, to have plenty of practices contributing to their loss in traffic.
In short, take a close look at your site and be brutally honest. If you still feel Panda is mislabeling your site as low-quality, then let Google know about it, as 2,000+ sites have done in this thread. Just be sure your house is in order first.
That said, it’s unfortunate that one of the few known reversals of Panda came to a site that may not have been impacted by Panda in the first place.
Panda-hit sites look like the following, with the start of the decline matching the update date (around the week of Feb 6-12):
[Image: classic-panda]
While traffic loss from Panda tends to fall off a cliff, it can also slowly deteriorate, especially if there are other technical problems:
[Image: slow-panda-decline]
One of the only sites we know of to recover fully from Panda in short order:
[Image: panda-recovery]
Because Google is updating the algorithm at intervals of 30 days or longer, recovery can take time, even after steps have been taken to remedy the situation. Additionally, because this change is algorithmic, there are no manual exceptions or overrides, unlike a link penalty, for example.
Why I Feel Panda Is Good For Search
In no way do I intend to minimize or neglect the pain out there for those sites that have been stung by Panda. We’re working hard on helping some of them. However, in the long term, Panda is actually a good thing for the web, a good thing for search, and a good thing for SEO.
Panda is a new way of looking at websites that goes beyond link and authority metrics, beyond on-page optimization, and folds in quality, credibility, and perceived trust metrics. It’s odd to see the anomalies popping up in search results – spammy-looking sites ranking, scraper sites ranking. Even a few keyword-stuffed subdomains are showing up. But these are the corners and edges of the algorithm, exceptions that can hold great clues to what Panda is scoring.
Panda is an evolution beyond a sole focus on the PageRank model. Together with user metrics and social data, Google can rank websites based on a whole slew of scoring criteria beyond just the link graph and SEO 101 on-page factors (which remain important). Content integrity, usability, even aesthetics are all now potentially in the realm of SEO.
Michael Martinez astutely asked recently, "where has all the PageRank gone?" and within the same time period, toolbar PageRank was being hidden. Are these signs that Google is trying to grow beyond its PageRank model?
While not directly tied to search quality, the correspondence one affiliate marketer had with the AdWords quality team was revealing. Among other tidbits, the Google team wrote these instructions for judging the “compliance” of content-to-advertising ratios:
“The site must have user value other than providing ads. For example, Google provides web search, news sites provide regularly updated original content, and other services. To check that your website complies with our arbitrage policy:
1. Open the site in a new browser.
2. Expand the browser to a minimum of a 1024 x 768 pixel display.
3. Make sure you have minimal browser menus and your font is set to medium or normal.
4. Scroll to the very top of the page, as evaluation is based on what appears above the fold.
5. The site is considered compliant if the area of ads is less than or equal to the area of content.
“Please use the instructions above to evaluate your entire website and, if necessary, bring it into compliance with our arbitrage policy. If you’re not in compliance, you may receive a low landing page quality score, which can negatively affect your Quality Scores, cost-per-clicks, and ad positions.”
Content quality and uniqueness, together with the density of advertising on the page, are strong contributors to Panda’s algorithm.
SEO For Panda
One disclaimer first: I would never recommend that anyone “optimize for Panda.” It is shortsighted to try to rank pages based on a single algorithm change, no matter how profound.
What has worked in the past continues to work, as I can attest on behalf of our clients. That said, I think any web team that's been hit by Panda would strongly disagree with Amit Singhal when he quips,
“…focus on delivering the best possible user experience on your websites and not to focus too much on what [you] think are Google’s current ranking algorithms or signals.”
Websites are in fact forced to make strategic changes based on the latest algorithm change from Google. It’s unfortunate, but it’s a fact.
Still, the message to focus on creating a stellar user experience is spot on. It can sometimes be hard, however, when a company’s lifeline is cut off and revenue plummets.
Matt Cutts told Wired in that famous interview that, "...our most recent algorithm does contain signals that can be gamed." And Amit Singhal agreed: "There is absolutely no algorithm out there which, when published, would not be gamed."
However, rushing headlong into an attempt to reverse engineer Google’s algorithm would be patently foolish, as would spending precious time and resources optimizing solely for Panda. Instead, let’s explore some areas we suspect are important with this change.
I've written about SEO for Panda before, and included some tips for rescuing sites from the gaping maw of traffic loss. Before we get into the details here, let’s go back to Google's Amit Singhal for this gem:
"Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?"
This is the core issue for most sites, along with saturation of advertising and lack of unique content. Beyond that problem, which can be profound for sites that have taken the easy street with SEO over the last several years, here are the major factors we’re seeing with Panda.
Affiliate links and ad units: Ensure the ratio of affiliate links to non-affiliate links is not too high; these can be a trigger. Advertising, too, is a factor, and you’ll want to ensure the content-to-ad ratio makes for a good user experience. When in doubt, remove advertising from some positions, especially above the fold, or where returns are nominal compared to your primary placements.
Low-quality or thin content: Remove these URLs, but carefully; it should be done deliberately and methodically, especially if there are thousands of URLs or more. The specific tactics to employ are blocking the URLs with robots.txt, applying a robots meta noindex tag, returning a 404, or placing that content on a separate sub-domain or domain. Do not annotate this low-quality content with rel canonical tags or attempt to 301 it elsewhere.
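To make those options concrete, here is a minimal sketch; the /thin-content/ path is hypothetical, and note these are alternatives rather than steps to combine (returning a 404 or relocating the pages happens at the server level, not in markup):

    # robots.txt – block crawling of a hypothetical thin-content section
    User-agent: *
    Disallow: /thin-content/

    <!-- Or, in the <head> of each low-quality URL; don't pair this with a
         robots.txt block, or crawlers will never see the tag -->
    <meta name="robots" content="noindex">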
Canonicalization: If content is syndicated, sourced from elsewhere, or significantly duplicated on-site or off, tactics should be put in place to send strong canonical signals. Rel canonical annotations, on-page messaging, and even meta noindex, follow are all potential candidates here.
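For example, a syndicated copy of an article might carry one of these annotations in its <head>; the example.com URL is a placeholder:

    <!-- Point the syndicated copy at the original source -->
    <link rel="canonical" href="http://www.example.com/original-article">

    <!-- Or keep the copy out of the index entirely, while its links
         still pass equity -->
    <meta name="robots" content="noindex, follow">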
Site speed: Beyond making for a good user experience, site speed was announced as a ranking factor well before Panda. Focus here, because we suspect it carries more weight now.
Quality: This is the hardest part. Sites must make the effort to contribute value to the web, in the form of frequently published resources, information, guides, images, videos, or whatever. Sites serious about SEO need to commit to an editorial schedule and continually produce here.
Social signals: Facebook shares and likes – the former of which appears to influence rankings – Twitter activity, Google +1 use (coming soon), and quality links from social are of paramount importance.
Search result pages: Google has long publicly stated that it dislikes search results in its search results. However, search result pages have long ranked in Google, to a greater or lesser extent. With Panda, it seems the dial has been turned up a bit on search results, and we’ve witnessed one site in particular suffer here.
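If your site exposes its own internal search results to crawlers, the usual remedy is a robots.txt block; the /search path below is hypothetical:

    # robots.txt – keep internal search result pages out of the crawl
    User-agent: *
    Disallow: /search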
One Potentially Troubling Change
Panda has given new prominence to meta noindex, and that is potentially a troubling issue for SEOs. Does anyone remember nofollow and PageRank sculpting? After announcing that it had the potential to help with SEO, Matt Cutts reversed his advice and now recommends that sites almost never use nofollow on their links.
Meta noindex is similar. It has long been a handy tool for SEOs who need to keep content out of the search indexes while still allowing it to be crawled and to pass equity to other pages. This is still how it behaves.
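The parallel is easy to see in markup: nofollow is a per-link hint, while meta noindex governs an entire page (the example.com URL is a placeholder):

    <!-- nofollow: ask engines not to pass equity through this one link -->
    <a href="http://www.example.com/page" rel="nofollow">anchor text</a>

    <!-- meta noindex: keep the page out of the index; by default its
         links are still crawled and pass equity -->
    <meta name="robots" content="noindex">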
However, with Google now advising webmasters to use meta noindex (among other options), its usage will proliferate across the web, dramatically changing the landscape. Perhaps that’s a good thing, but I don’t think anyone can really predict how it will change things, or how its use may be recommended or discouraged by Google and Bing in the future.