Need some urgent Panda advice. Open discussion about recovering from the Panda algorithm.
-
I have a site that has been affected by Panda, and I think I have finally found the problem.
When I created this site in 2006, I bought content without checking it. Recently, when I went through the site, I found that this content had many duplicates around the web. Not 100% exact, but close to it.
The first thing I did was ask my best writer to rewrite these topics, as they are a must for my site. She is a very experienced writer, and she will make the categories and subpages outstanding.
The second thing I did was put a NOINDEX, FOLLOW robots meta tag in place on the pages I determined to be bad. They haven't been de-indexed yet.
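For reference, the tag described here is the standard robots meta element placed in each page's head; it asks search engines not to index the page while still following its links:

```html
<!-- Tells crawlers: don't index this page, but do follow its links -->
<meta name="robots" content="noindex, follow">
```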
Another thing I recently did was separate out the other languages and move them to other domains (with 301s redirecting the old locations to the new ones). This means the site now has an /en/ directory in its URLs which is no longer used.
With this in mind, I was thinking of relocating the NEW content and 301ing the old URLs (to preserve the juice for a while). For example:
http://www.mysite.com/en/this-is-a-pandalized-page/
301 to
http://www.mysite.com/this-is-the-rewritten-page/
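On Apache, that redirect could be a one-line rule in .htaccess (a sketch assuming mod_alias is available; the paths are the example URLs above, and since the new slugs differ from the old ones, you would need one Redirect line per rewritten page):

```apache
# Permanently redirect the old Pandalized URL to the rewritten page
Redirect 301 /en/this-is-a-pandalized-page/ http://www.mysite.com/this-is-the-rewritten-page/
```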
The benefits of doing this are:
- decreasing the number of directories in the URL
- getting rid of pages that are possibly causing trouble
- getting fresh pages added to the site
Now, the advice I am looking for is basically this: do you agree with the above, or don't you? If you don't, please be so kind as to include a reason with your answer. If you do, and you have any additional information or would like to discuss it, please go ahead.
Thanks,
Giorgio
PS: Is it confirmed that Panda is now a continuously running update, or is it still executed periodically?
-
The reason I know for a fact it's Panda is that the site lost its rankings (and thus about 60% of its traffic) at the end of February 2011.
Since then I have managed to get slightly better rankings by adding loads of content and rewriting some categories (those with thin pages and not too many subpages) from the bottom up. However, I never realized I still had the content I recently located, which is terrible in terms of quality and has duplicates all over the web. Like I said, this content dates back to 2006, when I didn't have a clue about SEO.
It's not that the content will be rewritten based on what's already there. I simply told my writer to write about topic X and topic Y and make it very informative, so I will go from bad pages to really good ones.
Moving the new pages to new locations and getting rid of the other "infected" pages seems best in my opinion, despite the age of those pages and the occasional link pointing to them.
-
Panda still runs in installments, not continuously. Rewriting content sounds like a massive task; I hope it's worth it (i.e., might it be better to write new material instead?). Do you have any pagination on the site, or indexable search results? We are assuming here that you are certain the problem is caused by the Panda filter and not some other factor, that your page layout and ads are not the cause of the drop, and that the problem is not link related. I see no problem with 301ing duplicate-content pages to new, better content pages. That sounds like something users might appreciate as well.
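On the indexable-search-results point: a common way to keep internal search pages out of the crawl is a robots.txt rule (a sketch, assuming the results live under a hypothetical /search path; note this only blocks crawling, so a noindex meta tag on those pages is the surer way to keep them out of the index):

```
User-agent: *
Disallow: /search
```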