Did our shared articles cause a Panda hit to our high-quality site?
-
Hello,
We are a quality site hit by Panda
Our article collection:
http://www.nlpca(dot)com/DCweb/NLP_Articles.html
consists partly of articles written by the site owners and partly of articles that also appear elsewhere on the web.
We have permission to post every article, but I don't know if Google knows that.
Could this be why we were hit by Panda? And if so, what do we do? We've dropped way down in rank but have worked our way halfway back up.
Two of our main keywords are:
NLP
NLP Training
Thanks!
-
You have some valid points to consider... things seem to be improving and the articles that you might cut do pull in some traffic.
I can't tell you how to make your decision, but here is how I made mine:
I had hundreds of republished articles, but a lot more that I had written myself. Deleting lots of republished articles would cut my traffic and cut my income. Noindexing them would cut my traffic and cut my income. However, although those were serious losses, they were small in comparison to other content on my site. So, knowing that Google does not like duplicate content, I got rid of them. There is still lots of great content on my site, visitors still find stuff to read, and I know which of the things that I cut are worth replacing with a customized version written for my own site.
The upside.... My site is more compact but still has thousands of content pages and the content that remains should be a lot stronger. After making the cuts my rankings, income and traffic increased. Not quite to previous levels but back to nice numbers.
I have reduced risk and am pleased with that. Everything that I cut was redirected to similar content. The most valuable of what was cut will be replaced with custom content with 301 redirects from the old content.
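For anyone following along, redirecting a cut article to its closest related page can be done in Apache's .htaccess; a minimal sketch, with hypothetical paths rather than anyone's actual URLs:

```apache
# Permanently (301) redirect a deleted republished article to
# the most similar surviving page. Paths are hypothetical examples.
Redirect 301 /articles/republished-guide.html /articles/our-own-guide.html
```

A 301 (rather than a 302) tells Google the move is permanent, which is what passes the old page's link equity to the new one.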
-
How likely is it that this list of 60 articles (out of 200 pages) has caused, or will cause, a major problem with past or future Panda updates? 17 of the 60 are by us, a few were written for us, and several more show up as only ours when you type the title into Google surrounded by quotes.
What from this is unique? Definitely keep that. Keep what is not struggling in Google. Keep what is essential to your site, but replace it with better content that you create yourself.
Do you see further risk in future panda updates?
Yep... that's why I cut off my foot.
My thoughts are to rel=author each of our own articles,
YES... In the past I wanted all of my content to be anonymously written. I have changed my mind on that and used rel=author on the best stuff.
no-index the duplicates between our 3 sites (We have 3 sites that share a few articles) and no-index the remaining articles.
heh.... Here I would be chopping off two of those sites and merging them into one. I would have done that years ago before panda was ever heard of.
I think that the drop in traffic will be outweighed by the lack of risk of current or future ranking drops.
I agree.
-
Hi EGOL,
We are getting a lot of traffic from some of these articles, so the site owners are not sure they want to no-index them, just in case that's not causing the problem. Our rankings have come up from 40 to 26 on our main term, and similarly for other terms, even though we still have duplicate content. We were originally at 19 before a big drop in November/December.
How likely is it that this list of 60 articles (out of 200 pages) has caused, or will cause, a major problem with past or future Panda updates? 17 of the 60 are by us, a few were written for us, and several more show up as only ours when you type the title into Google surrounded by quotes.
What would you suggest I let the owners know? Do you see further risk in future Panda updates?
My thoughts are to rel=author each of our own articles, no-index the duplicates between our 3 sites (We have 3 sites that share a few articles) and no-index the remaining articles. I think that the drop in traffic will be outweighed by the lack of risk of current or future ranking drops.
However, it's not my decision, your thoughts?
-
I don't know. Everything that I have done is an experiment.
If you are really scared, delete... if you have some tolerance for uncertainty, then play around with noindex or canonical. I deleted from a really important site... and used canonical where the ranking loss was small and the risk was not jugular.
-
Hi EGOL,
When is no-indexing enough and when would you suggest deletion?
-
Can we no-index all the duplicate stuff? Or is some deletion necessary?
On one of my sites I deleted a lot and applied noindex,follow to everything else that was a duplicate. We saw rankings recover in about a month.
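For reference, noindex,follow goes in the head of each duplicate page; a minimal sketch:

```html
<!-- Tells search engines not to index this page,
     but still to follow (and pass value through) its links -->
<meta name="robots" content="noindex, follow">
```

The page stays usable for visitors and keeps passing link equity; it just drops out of the index.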
On another site I had a lot of .pdf documents that were used to control printing of graphics. We used rel=canonical on them. That works very, very slowly to remove them from the index. We are seeing slow recovery on that site.
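Since a PDF can't carry a meta tag, rel=canonical for PDFs is sent as an HTTP Link header instead. A minimal Apache sketch (requires mod_headers; filenames and URLs are hypothetical examples):

```apache
# Point the PDF's canonical at the HTML page it duplicates.
# File and URL below are hypothetical examples.
<Files "print-version.pdf">
  Header add Link "<https://www.example.com/graphics-page.html>; rel=\"canonical\""
</Files>
```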
If I take the first two sentences of an article and type them into Google, and someone is showing up above us, do we need to no-index that article?
If the article belongs to someone else, then I would noindex or delete it. (Just saying what I would do if it was on my site.) If it was my content, I would set up a Google+ profile and use rel=author and rel=me to attribute it to a verified author.
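A minimal sketch of that authorship markup, with a hypothetical profile ID:

```html
<!-- On each article: link to the author's Google+ profile -->
<a href="https://plus.google.com/1234567890" rel="author">About the author</a>

<!-- On the author's bio page: link back to the same profile -->
<a href="https://plus.google.com/1234567890" rel="me">My Google+ profile</a>
```

Verification also requires the Google+ profile to link back to the site as a "contributor to", so the attribution is confirmed in both directions.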
-
Perhaps you could add a link to the original source on some of these where you have permission. This should send a signal to Google that you are showing it on your site for the convenience of users, but that it comes from a different source.
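Something as simple as an attribution line at the top or bottom of the reprinted article would do; a sketch, with a hypothetical source URL:

```html
<!-- Attribution link on a republished article (URL is a hypothetical example) -->
<p>Originally published at
  <a href="https://www.original-publisher.example.com/article">Original Publisher</a>,
  reprinted here with permission.</p>
```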
-
Can we no-index all the duplicate stuff? Is that enough to save our arse? Or is some deletion necessary?
I assume that if we are not first in Google for the content and title of an article, it is a potential duplicate-content problem, correct? For example, if I take the first two sentences of an article and type them into Google, and someone is showing up above us, do we need to no-index that article?
Any advice is appreciated. You're one of the best EGOL.
-
We have permission to post every article, but I don't know if Google knows that.
Google probably does not know and certainly does not care. If you have duplicate content on your site you are a potential target.
What type of link-building have you been doing? You might have been hit by the overoptimization penalty.
I was republishing some third-party content on a couple of my sites. I deleted most of it and noindexed the rest. Cut off your foot to save your arse.