On Feb. 25th, 2011, I woke up to discover, in horror, that Google’s Panda had shat all over my business. And I’m not talking about a single, steaming turd. The Panda covered our entire site in a stinky brown mess. This wasn’t some spammy made-for-Adsense site, or even a high quality affiliate site. It was a real small business, an e-commerce site with thousands of happy customers, an A+ BBB rating with no complaints…a 6-year-old business featured in the mainstream media, with a real physical address and employees. But the day before, Google made the horribly irresponsible decision to release this joke of an update, a wild reckless animal, into the general population. The beast released its huge load on both low quality sites and high quality businesses, and it’s still running loose today. I’m able to joke about it now, but for months those of us who owned and worked in the business were devastated.
Here’s what happened to our rankings between Feb. 21 and Feb. 25, 2011:

Panda Ranking Devaluation
(The chart above is a screen shot from a rank-tracking report. An “X” marks where a rank disappeared from the top 50. The numbers on the left are the current rank, and the numbers on the right are the ranking change.)
The image above is just a snapshot of the keywords we monitor our rankings for. We lost 60% of our traffic overnight. Fortunately for us we’ve always operated with this principle: Minimizing risk is as important as maximizing profits. We kept our fixed expenses as low as possible, and while we watched companies in our industry go bankrupt from Google’s flawed and careless Panda, we were able to hang on. This past March, we made a full recovery:

Panda Ranking Recovery
How We Did It
Google claimed the update was meant “to reduce rankings for low-quality sites”. After a flood of complaints from businesses, they posted guidance for sites that had been hit by Panda. Their list of questions one should ask about a website was insulting to us, to say the least, as we could positively answer all the relevant questions. But at the bottom of their post they wrote:
…low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.
We didn’t have low quality content on our site, at least nothing that would be considered low quality by a human. But we did have a significant number of “shallow pages”, as most e-commerce sites do. (Is that really a problem, Google? Seriously?) When you’re selling thousands of products, especially when many are variations of the same product (different sizes, colors, etc.), there’s no way to have deep content or a high quantity of content on each of those pages, yet the pages are still valuable and necessary for both users and the business. I’ll reiterate that our site was a real business, not an affiliate site with product descriptions pulled from an affiliate feed, or with copied manufacturer descriptions. Every page on our site was uniquely written by us.
So what was the problem? Apparently Panda considers content that’s great for users to be bad for Google search results. The key here is to do the opposite of what Google tells you to do. Google tells you to think about your customers first. But they’ll penalize you for that. What you need to do is think about what Google wants, because if you don’t please Google’s wacky creatures, your business is toast.
Content Removal & Consolidation
What is Google saying between the lines? Panda treats low quality pages, pages with a low quantity of content, and pages with similar content, no matter how useful they are for visitors, as bad. If a large share of your pages fit that description, you’re risking a massive devaluation by Panda. So we went about getting rid of countless useful pages with a low quantity of content. We dramatically reduced the number of pages on our site through both deletion and consolidation. Month after month, nothing happened.
Sub-domain Fix
After about 6 months, we finally came up with a solution that worked pretty well. We split our site into a number of sub-domains, hoping they’d be treated as new sites and escape the Panda’s wrath. It worked for a while, as you can see in the graph at the top of this post, with the exception of the home page and the terms it ranked for. (Using the sub-domain fix on the root index page, or “home page”, appeared to transfer Panda’s devaluation in tests we did on other sites, so we left the home page on the root domain.) We breathed a serious sigh of relief! But on Oct. 14th the sub-domain fix quit working and we were fully Pandalized again. It took 5 more months before we finally recovered, an amount of time that would cause most businesses to go under, scale back, or fire employees.
The Verdict
Panda kills websites and businesses for what it thinks are issues of quality, whether that’s accurate or not. A combination of content removal, consolidation, and a long period of time did the trick for us. 13 months after losing 60% of our traffic, we’re back.
Complications
As you’ve probably noticed in the graph at the top of this post, our rankings still aren’t where they were at the end of 2010. Around Dec. 10, 2010, we were hit with an external anchor text over-optimization penalty. Although it looks in the graph to be as serious as Panda, it was not. The graph is skewed toward higher traffic “head” terms, and it was a couple of those terms that were hit. If you’ll look at the yellow line, you’ll see that our current rankings are now better than just before Panda was released, and also better than just after the over-optimization penalty. Now that this site has recovered from Panda, we’re working on tackling the over-optimization penalty.
Note: Another factor that helped us stay in business was our diverse group of sites. As a company, we’ve created a number of high quality, profitable sites. That diversification of income sources has helped us survive when one site gets penalized by Google.
Next up…Google’s second careless and irresponsible creature: Penguin
Super article, thank you!
In your last “Note”, do you mean you have multiple sites for the same specific business? Or were you using the word “business” as an umbrella for different ventures?
I ask because I’m wondering how G would perceive multiple sites for the same business venture. Different content of course, but still, multiple sites for the same specific business. Thoughts?
Thanks Brian.
I was using the word business as an “umbrella for different ventures”. I have a number of sites in different niches, with different monetization methods. Take a look at my next post after this one. In that post I discuss my business strategy, which includes diversifying with multiple projects.
Regarding how Google would perceive multiple sites for the same business venture: I really think it depends on the sites. In some cases I do have more than one site in the same niche, but only when those sites are substantially different. If users would find them both useful and interesting, and not realize they were owned by the same person by looking at them, I think you’re probably ok.
If you’re making two sites in the same niche that are very similar in terms of content but different in the way you promote them (link building, etc.), then it probably makes sense to use multiple, completely unattached Google accounts. But you have to be very careful with that so Google doesn’t tie them together.
Sweet article. Just what I was looking for. Around the 24th I noticed a spike of around 500 backlinks out of nowhere for a site!! Then it got pushed way back. It took over 3 weeks to get it ranking again, but it’s still not #1. The funny thing is that when you look at the SERPs for certain keywords, Google is serving up some pure garbage while good legit sites are gone!
Matt
Good insights here. I’ve heard conflicting information on whether you need to wait for the next Panda update to recover, or whether you can bounce back the next time Google reindexes your site.
Do you know which is true?
I’m not sure. It makes sense to me that if Panda is re-run periodically, a site could only recover on one of those dates. My recovery dates have been on or right near these dates.
What I do know is that leaving comments just to prop up links will get your URL removed. I’m fine with a link to your own site, but not with comments left purely to build links.
Our website was affected a bit after the Panda update, but we made some changes to the website and our strategy, and our ranking was back in a couple of days. The key to recovering from Panda is to post content on a regular basis, so you serve fresh content to users and search engines.
That’s a pretty sad excuse for a comment, for a supposed SEO company. You’re going to have to do a MUCH better job than that if you want a link.
Besides, your Panda recovery strategy is the opposite of what works.
Hi
I too have been hit by Panda/Penguin. You can check out my post on the Google forums here for more info and analytics graphs.
I’ve made hundreds of changes: removed low quality pages, consolidated and improved content, and reduced bounce rate. Conversion rates are through the roof as a result, but organic results are getting worse, from 1,200 clicks at the high point to 70 yesterday.
Surprisingly, we are still growing, but it would be nice to have our old free traffic back.
Any ideas where I could be going wrong? Any more pointers, or anything obvious you can spot? I’ve had several consultants looking at this and have spent 6k so far… no joy.
The site is .
Kind Regards
Andy
one desperate small business owner!
Unfortunately for you, and many others, the days of ranking a relatively “cookie cutter” e-commerce site in Google are over. The only way you’ll have a chance at ranking again is if your site is substantially different from other sites offering similar products/services AND provides unique value. Or, you have to be a big brand, which is the solution to everything these days.
E-commerce sites were hit hard by Panda, and in my opinion, small business e-commerce without something very unique and valuable is a sinking ship. Sorry!
Hey Max,
Nice piece, I caught this on seobook as well. I had actually read it quite a while before the inevitable. Knew where to go looking when the Panda munched my site.
In your opinion, do you think Panda is more of a user experience thing, or a technical issue with the site? You mentioned in another post how you doubled your sales and increased all of your metrics during this process. I’m wondering if that might have played a bigger part in your recovery.
We’ve got a site that was hit by Panda in early March. I was thinking about doing what you mentioned on Seobook, that is, noindexing a good majority of our products. Most of our traffic (90%) comes to top level category pages and tips/advice articles written about our niche. It wouldn’t be a significant hit to drop the products from the index. It sounds like you actually removed product pages altogether, and I wonder if the results would be much different with a noindex tag. We have unique descriptions written for every product that average about 75-100 words, but I’m wondering if some of the brands we offer carry too many similar products. It might be overkill to offer them all when it comes to duplicate/similar content issues and Panda.
I think our bigger problem might be from sorting/filtering on our 150 category pages; they have been duplicated to the tune of 16,000 pages. We finally implemented rel=canonical to fix that issue. I’m thinking this is probably the biggest issue. We had used robots.txt to block the parameters, but then the dupes just sat in the index for 2 years. Bad advice.
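The tag we added looks something like this on each sorted/filtered variant of a category page (a minimal sketch; example.com and the widgets category stand in for our real URLs):

<!-- placed in the <head> of every parameterized variant, e.g.
/category/widgets/?sort=price&page=2, so Google consolidates
them all onto the single base category URL -->
<link rel="canonical" href="http://www.example.com/category/widgets/">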
One of the other things I noticed is that about 60-70% of our tips/advice pages have terrible metrics: on average about 30 seconds, 1.25 pages, and an 85% bounce rate per visit. For the product level stuff, about 30-40% of pages show similar metrics when visitors come off the search. In comparison, top level category pages do much better: 2 min, 4.25 pages, and a 35-40% bounce rate per visit. The plan was to prune the pages with bad metrics from the index, through noindexing or 404, depending on their usefulness to our customers.
Sorry for the long post. Appreciate any feedback you can provide. It’s nice to see someone give real methods to fixing panda issues.
Thanks Greg.
I’ve had a number of sites hit by Panda. My e-commerce site recovered (as I wrote in this post), but then was hit again, and then recovered again. I think Panda is related to a number of things, and the interactions are complex. It’s not easy (or even possible?) to get some sites out. There are various thresholds, and making changes to get to the good side of one could put you over the bad side of another. Then, you’ve also got Penguin to contend with. So if deep links, for example, are a protective factor against Panda, they could also cause you to get hit by Penguin. It’s not easy.
*I* don’t see Panda as a user experience thing, but other people I respect do think it is a big component. So it could be the samples I’m looking at. The reason I don’t see it is that sites I own and know of that were hit are so much better, in terms of usability and stickiness, than sites that weren’t hit, that I find the user experience explanation hard to believe.
With my e-commerce site, my usage data was better for a YEAR before it recovered. And, at the time of recovery, there were very few visitors. If it were ONLY usage data, then surely my site would have recovered much sooner. And, it wouldn’t have gotten hit again. Again, my e-commerce site is very high quality with great usage data. There are and were other sites ranking that look like crap, and no doubt have much worse usage data.
I don’t want to tell you what to do with your site, but I will tell you that my second recovery looks to have come from noindexing most of my remaining pages. Getting rid of my product pages (the cart still generates them, but I removed all links to them, put them on a sub-domain, and added noindex tags…just to be triple sure) reduced my site size from about 2,000 pages to about 200 pages. I THINK that’s what got me out of Panda the first time around. I got hit again last October. In February of this year I decided to take it further. I noindexed about 100 more pages (sub category pages) AND I used the Google WMT URL removal tool. It took a while for those removals to take effect, according to both Google WMT and looking at what was in the index. Days after all the pages were removed from the index, I recovered again. So, I took my site down from 2,000 useful pages (for visitors) to about 200 pages. Then, I had to noindex another 100+.
I think having lots of similar pages will kill you with Panda, and I think that noindexing most of them, particularly if they’re not search landing pages, is a good move if you’re already hit by Panda. Might as well try it. It sucks, and Google sucks, but it is what it is.
The rel=canonical tag is important to deal with the sort issues. But I’ve noticed that Google can take forever to take note of it, and the pages sometimes seem to come back into the index despite the fix.
So again…*I* don’t see the usability connection. For me, the only success I’ve had has come from blocking pages, IMO. But it could just be the sample I’m working with.
Hey Max,
Thank you for the reply! Really appreciate you taking the time to help out and answer my questions. It helped sway my decision to noindex products and even knock out more pages (shallow sub categories) that I was on the fence about.
Couple more quick questions.
Do you see a big difference in using the following tags?
<meta name="robots" content="noindex"> vs. <meta name="robots" content="noindex, follow">
How long did it take you to recover from the second hit?
Thanks,
Greg
Greg,
No problem at all.
I haven’t tested “noindex” vs. “noindex, follow”. However, I wouldn’t use the former. You want Google to follow the internal links, in order to spread the “page rank”. If you use plain “noindex”, as far as I’m aware, you just bleed the page rank or link juice into a black hole. Also, I would NOT use “robots”, I would use “googlebot”. Remember, you don’t need to block Yahoo, Bing, etc.
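Putting that together, the tag I’d use looks like this (a minimal sketch of the advice above):

<!-- keeps the page out of Google's index while still letting "page rank"
flow through the page's internal links; Bing, Yahoo, etc. ignore the
googlebot name and continue indexing normally -->
<meta name="googlebot" content="noindex, follow">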
It took me about 9 months to recover from the second hit, but only about 5 months from when I noindexed all the additional pages.
Max
Hey Max,
That’s a bummer. I was under the impression (from reading some of Google’s articles) that follow was implied. So if you use just noindex, the page would not show in the index, but they would still follow the links. It was only if you used noindex, nofollow that they stopped dead in their tracks. I have quite a few pages already getting pulled with the other format. I guess you can’t be too sure, and it probably makes more sense to be VERY obvious. I’ll have to get that updated. Thanks for the tip on the Googlebot tag. I only get a trickle of traffic from the other two, so I just sort of neglected them.
How were the metrics on the pages you ended up removing? I know you didn’t think it had that much to do with it, but I’m really curious. We sort of had a benchmark: an average visit of 1 min or less was the biggest indicator, IMO, of a bad page on our site, especially when our good pages average 2+ min, a 35-43% bounce rate, and a 35-43% exit rate. So those were the sub category pages I noindexed and planned on beefing up to make a little stickier.
Greg
Greg,
Ah…ok…maybe it is implied. I’ve just never taken chances like that. I may have been thinking of the other common use, “noindex, nofollow”. So you may be ok.
Regarding the metrics on the pages I removed, I didn’t look at them much. For me, as I believe I mentioned before, I just don’t see usage data as a big factor (but I could be wrong). Some pages SHOULDN’T be lingered on, when a user finds what he or she wants, and others should. For example, a “sub category” page that is between a category page and a product page…is very useful and important for navigation, but a user will spend almost no time on it…depending on the type of sub category page it is, of course. Some of those pages will be quickly clicked through. But in any case, that’s something I haven’t looked at in depth…maybe a blind spot of mine in this regard.
Hi Max,
Great article. You say your traffic recovered fully but you had to remove all the product pages and 100 category pages. Does that mean the traffic that returned to the remaining pages was more than you had before Panda for those pages or weren’t the removed pages attracting traffic anyway?
I’m a fellow Panda victim: an ecommerce site reselling for other companies and, at the time Panda hit, using their datafeeds plus some of our own content, so fair game. But, as you say, there are still sites around using those datafeeds, less unique than ours but doing fine (I know the owners).
I’ve suspected the product pages for a long time: we had about 1,000 products at the time, tried reducing that to 500, wrote our own descriptions, added reviews, and added useful content to our category pages and elsewhere, but no success.
I’ve never contemplated completely removing all our product pages, though, as they do generate traffic. But if removing them brings more traffic to the remaining pages, I guess the sacrifice is worth it.
Thanks,
Steve
Thanks Steve.
I haven’t compared traffic recently to pre-Panda 1.0 levels. But… when I first recovered, after only “removing” product pages, my traffic did return to pre-Panda levels. Then my site was hit again, and then I got it out again by noindexing sub-category pages. Right now I can compare traffic to last year, when I was temporarily out, as I got hit again last October. My traffic is down 10-15% compared to last year. So to answer your questions: the traffic that returned to my remaining pages is about the same as pre-Panda, however my overall traffic is down a bit…likely due to having so many pages out of the index. The remaining pages did NOT bring in more traffic.
If your site was completely trashed, then it wouldn’t hurt to try removing the product pages from the index (just adding a “googlebot” noindex tag). But if you’re still getting traffic to those pages, then you would lose that traffic.
I should add here that I have a few other sites hit by Panda, which I have not been able to get out more than very briefly, and it’s possible those escapes were due more to Google making changes to Panda thresholds than anything I did. And, some of those sites are cut down to about 10 indexed pages! So, removing pages is no guarantee that you will get out, anyway.
To be honest, at this point I have given up on SEO as a traffic source for sites/businesses. It is still my main source of income, but I’m focusing 100% now on generating traffic/exposure in other ways, depending on the business. I’m not doing anything on most of my sites…just letting them sit and make whatever they make as long as they do/can. I’m focusing/working only on 3 sites (vs. 30 or so before), and aiming to replace all of my SEO dependent income with other means…contacting businesses interested in buying products, networking/promoting my sites in places where there are real/existing audiences, a bit of Adwords, etc.
This site, Hungry Piranha, was built right before I gave up on SEO, and I’ve totally neglected it, as you can see, lol. However, the info is all still valid, and I may add to it in the future, as working on the web is still what I do. I also don’t think “SEO is dead”. As I said, it still provides 95% of my income now…I just don’t actively do it!
I think that selling other people’s products via small business e-commerce, reliant upon SEO, is a sinking ship at BEST though. For many people who made lots of money for years, it’s already over. You need something unique and valuable that cannot be copied. Otherwise, the combination of competition + penalties is a killer.
Max
Thanks for the quick reply: 24 minutes! Even my customers don’t always get an answer that quickly!
No, SEO is not dead as you say, but it has changed in a big way since Panda came along. I agree with you regarding the unique business model that can’t be copied as the way forward for small sites. Kind of how business was before the internet came along – we’ve gone full cycle.
I had one partial recovery last year that lasted a few months, with more traffic at each Panda refresh, and a few minor, short term improvements, all after making various structural changes to the site. Those included canonicalising similar product pages so only one version was indexed, adding filter navigation to improve user interaction, and so on. I remain just as bemused by the recoveries as I am by the knock backs though.
Deep down I know the site doesn’t deserve to rank well, but I’m having one last bash at it: moving to a responsive design which will be much more mobile friendly and modern (just in case user experience does matter), improving the product pages with more images etc., and trying a more PR-led approach to link building in the hope that it will bring us some really high quality ‘natural’ links (just in case Panda can be overridden by quality links; although the content will have to be great to get them, so one may lead onto the other).
Thanks for sharing your story and the quick reply. Good luck with your mission!
Full cycle…yes. We took advantage of a window of opportunity with SEO, and it will likely be open in various ways for many years to come. Obviously the web is still a great platform for business, and there will always be ways to leverage it. I think the key is to be able to adapt. Also, it’s very important, IMO, to keep expenses as low as possible if you’re relying on SEO, as you never know when you might get booted.
In any case, good luck with your efforts! Let me know if you’re able to get out.