2013 Panda Update Question
-
Hi everyone, I'm new here. So far I've had wonderful success SEO-wise, and none of the updates (neither Penguin nor Panda) affected any of my sites until this one.
For example, one site has 7 keywords I'm optimizing for. Out of those 7, all but 2 (plus variations of those 2 - one-word vs. long-tail) completely tanked. These keywords were all on pages 2-3. One of the two survivors never budged from page 2 (it's a brand keyword, so I was thrilled to finally get it to page 2).
Now when I check rankings, the other terms show up in the 200-400 range, but NOT for the URL I was optimizing (the category page) - instead, random products from that category are ranking.
The only thing I did differently with the 2 keywords that are still doing well was focus: we did more link building for those, but not an extreme amount - we never over-optimize.
My question is: how did 2 survive while the other 5 are still floating up and down? Last night I saw one go up 122 spots; today it's down 14. I'm really struggling with this.
Thank you
-
I just ran a diagnostic - no errors, no duplicate content, nothing.
-
I just ran a quick check with a free plagiarism checker. The results show some outside blog posts using the same text - not a 100% match, though.
Just wondering whether having these removed will alleviate the situation, or if I need to do more?
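As a side note, one rough way to quantify a "not 100%" match like that is a sequence-similarity ratio between your page copy and the blog post's text. A minimal sketch in Python (the sample strings are hypothetical placeholders for your own copy):

```python
# Rough duplicate-content check: estimate how much of your category-page
# copy also appears in an outside blog post. Sample strings below are
# hypothetical placeholders.
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0.0-1.0 similarity ratio, ignoring case and whitespace."""
    norm_a = " ".join(text_a.lower().split())
    norm_b = " ".join(text_b.lower().split())
    return SequenceMatcher(None, norm_a, norm_b).ratio()

category_copy = "Shop our full range of widgets, hand-made from oak."
blog_post = "Shop our full range of widgets, hand-made from oak and pine."

score = similarity(category_copy, blog_post)
print(f"similarity: {score:.2f}")  # roughly 0.92 here; anything above ~0.8 deserves a closer look
```

It's only a crude signal, but it helps separate "a sentence in common" from "the whole page was lifted".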
-
Hey, what have you done so far?
Have you checked internally for duplicates? Have you used Copyscape to see if there is external duplication?
A single blog post should not cause a huge problem; I suspect the issue may be more widespread.
What CMS are you using here?
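For a quick first pass at the internal check, you can fingerprint each page's main copy and group pages whose normalized text collides. A minimal sketch (the URLs and page texts are hypothetical placeholders, standing in for your own crawl data):

```python
# Internal duplicate check: hash each page's normalized body text and
# group URLs whose fingerprints collide. The `pages` dict is a
# hypothetical stand-in for text pulled from your own crawl.
import hashlib
from collections import defaultdict

def fingerprint(text: str) -> str:
    """Hash of the text with case and whitespace differences ignored."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

pages = {
    "/category/widgets": "Our widgets are hand-made from oak.",
    "/blog/widget-news": "Our widgets are   hand-made from oak.",  # same copy, extra spaces
    "/category/gadgets": "Gadgets for every budget.",
}

groups = defaultdict(list)
for url, body in pages.items():
    groups[fingerprint(body)].append(url)

duplicates = [urls for urls in groups.values() if len(urls) > 1]
print(duplicates)  # → [['/category/widgets', '/blog/widget-news']]
```

This only catches exact (after normalization) duplicates, but it's a fast way to surface the category-page/blog-post overlaps you're describing.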
-
Thank you!
I am finding category pages with duplicate content, plus a blog article with the same content. Will having the blog article removed fix this, or does the content on the category pages need to be rewritten?
-
Hey, it's going to be near impossible to answer that question without examples, I'm afraid.
This week's update was a Panda update, so it should be primarily related to content and duplication. The very first thing I would check for is duplicate content issues, both on and off your site.
This would be a good read:
http://www.seomoz.org/blog/fat-pandas-and-thin-content
Then run your site through Copyscape to get some quick feedback on any external duplication issues.
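Copyscape gives you the external picture quickly; for a rough in-house signal you can also compare word shingles between your page and a suspected copy. A minimal sketch (both sample strings are hypothetical):

```python
# Rough external-duplication signal: Jaccard overlap of word shingles
# between your page and a suspected copy. Sample strings are hypothetical.

def shingles(text: str, size: int = 3) -> set:
    """Set of lowercase word n-grams (shingles) of the given size."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(text_a: str, text_b: str) -> float:
    """Shingle overlap between two texts: 0.0 (disjoint) to 1.0 (identical)."""
    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "hand made oak widgets shipped free across the UK"
scraped = "hand made oak widgets shipped free across the UK and Ireland"

print(f"{jaccard(original, scraped):.2f}")  # 7 shared shingles out of 9 total -> 0.78
```

Scores near 1.0 on any page pair are the ones to chase first, whether that means rewriting your copy or getting the scraper's version taken down.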
If you have been hit, there will be a reason, so you need to start digging, get a handle on the issues, and put measures in place to resolve them.
Also consider that this may have something to do with the work the client is doing on the site (more likely the content), or it could come down to entirely external factors (scrapers stealing content, etc.).
Hope that gives you some direction!
Marcus