What's the extent of the penalization applied by Google?
-
Hi!
I still don't get this website penalization applied by Google due to duplicate content.
My site has many pages that ranked among the first positions for top keywords (it's a Photoshop website).
Those pages were linked by sites like LifeHacker, BoingBoing, Microsiervos, SmashingMagazine, John Nack, and many other well known blogs.
After mid-February 2012, everything went down the drain. I lost half of my traffic, and my well-ranked pages are now almost nowhere to be found.
I have plenty of ads on some pages of my site, and duplicate content (Amazon product descriptions only) on other pages.
So the good-quality pages my site has are no longer considered good quality just because I have some duplicate content or ad-filled pages?
I'm not complaining. I'm trying to understand this.
Google needs to serve good information to its visitors. But since they found some trash on my site, they decided to remove both the trash and the good information from the search engine?
That doesn't sound logical to me. Why don't they just remove the trash and leave the good content?
Of course, I understand that information is added every day and someone may come up with something better than mine, but dropping 40 or more places in the rankings sounds more like a penalty to me.
Again, I'm not complaining (although it sounds like I am!); I just want to understand the reasons behind this.
Thanks,
Enrique
-
Yes, thanks Anthony. I will post back as soon (soon..?) as I find something.
Enrique
-
Sometimes what you call natural, Google calls spammy or unnatural. Just sayin'. Good luck. Post back with your findings; I'm interested to see how things work out for you. Best regards.
-
Yes, thanks, I will check that. I was planning to add nofollow to the Amazon pages, and I will also check the anchors, but since they are all natural, any change I make will look artificial.
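For reference, adding nofollow to those Amazon links is just a `rel` attribute on each outgoing link; a minimal sketch (the product URL and tag are made-up examples) would be:

```html
<!-- Hypothetical Amazon affiliate link; rel="nofollow" tells search
     engines not to pass ranking credit through this link -->
<a href="https://www.amazon.com/dp/B000EXAMPLE?tag=mysite-20" rel="nofollow">
  Photoshop CS6 on Amazon
</a>
```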
Enrique
-
Have you tried removing the Amazon data feed from those pages, just to see if that is in fact what is impacting your rankings? What about the thousands of natural links pointing to your site? Are they all using varied anchor text, or is it just five phrases for your five main pages? If it's just five, that could also be affecting your ranking.
-
Yes, I know that's the thing to do, but you must agree with me that it's something unnatural.
I have thousands of incoming links, and I only exchanged or asked for fewer than 20 of those. The rest are natural. If I spent time analyzing links, it would be something absolutely artificial.
The same goes for quality pages. Let's say that I have four or five pages that are the most referenced in my industry (just an example, of course). Visitors who read those pages get really good, top-class information. But I have an Amazon datafeed on my site.
Suddenly, the information on those top-quality pages is hidden from Google's search results because my site has an Amazon datafeed?
I know it's a simplistic example, but it can be translated as:
"A good article isn't good anymore just because of a site penalty"
It seems that Google is saying something like, "Hey, you can't read this amazing article because it comes from a site that has lots of junk. So suck it up and read this article of lesser quality but from a pristine site!"
It is not about my site anymore, but about trying to understand the concept of it all. And of course it is an extreme example, but I think it is relevant.
-
No, Google does care about good-quality pages. It's just that if you throw in a bunch of bad pages, they dilute the goodness of your good pages. Once you clean up the duplicate content, I would suggest running a report on your inbound links. Check to see if your anchor text is spammy, or concentrated on only a few choice keywords. When it comes to link building, you want to spread out the keywords so there aren't just one or two money keywords carrying all the anchor text.
Also, I would remove any inbound links from questionable directories. Once you do that, I would think you should see some significant gains in rankings.
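As a rough sketch of the anchor-text report described above: assuming you have exported your inbound links from a backlink tool into a CSV with an `anchor` column (the column name and sample data here are assumptions, not any specific tool's format), a few lines of Python can show whether one or two phrases dominate:

```python
import csv
import io
from collections import Counter

def anchor_distribution(rows):
    """Count how often each anchor text appears, ignoring blanks,
    and return (anchor, count) pairs from most to least common."""
    counts = Counter(
        row["anchor"].strip().lower()
        for row in rows
        if row["anchor"].strip()
    )
    return counts.most_common()

def dominance_ratio(counts):
    """Share of all links held by the single most common anchor.
    A value near 1.0 means one 'money keyword' dominates."""
    total = sum(n for _, n in counts)
    return counts[0][1] / total if total else 0.0

# Demo with a tiny inline sample standing in for a real export;
# a real run would open the CSV file from your backlink tool instead.
sample = io.StringIO(
    "anchor\n"
    "photoshop tutorials\n"
    "photoshop tutorials\n"
    "photoshop tutorials\n"
    "great photoshop site\n"
)
counts = anchor_distribution(csv.DictReader(sample))
for anchor, n in counts:
    print(f"{n:3d}  {anchor}")
print(f"Top anchor holds {dominance_ratio(counts):.0%} of all links")
```

If one phrase holds the bulk of your links, that is the "concentrated on a few choice keywords" pattern described above.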
-
Thanks! So it is clear that Google doesn't care about single, good-quality pages with good-quality links.
A good quality page needs a quality site to back it up.
Is that the criteria?
It sounds reasonable to me, but very difficult to repair.
Just for the record, my site isn't trash or low quality, but it is an old site and has some quirks from the old days: lots of directory entries with little content, and datafeeds that used to work very well some years ago.
-
The trash part of your site affects the site as a whole, not just the trash pages themselves. If Google penalized only those pages, you would still benefit from using trash to promote your good pages.
Now, from what I understand about penalties, there are manual penalties and algorithmic (or natural) penalties.
The algo penalty can be fixed fairly easily by addressing the issue behind it, which in your case would be duplicate content. Clean up all the duplicate content and you will be on your way to flying under the penalty radar, so to speak. However, you will still need to add more quality content to make up for the removed or cleaned-up duplicate content.
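For the cleanup step, one common option (rather than deleting the thin datafeed pages outright) is to keep them for visitors but ask search engines not to index them, with a robots meta tag in each page's head:

```html
<!-- In the <head> of each thin datafeed page: keeps the page usable
     for visitors while asking search engines not to index it -->
<meta name="robots" content="noindex, follow">
```

This is a sketch of the general technique, not the only fix; removing the duplicate descriptions or rewriting them as unique content are the other usual routes.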
Once that takes place you should notice your ranking drop stabilize, and over time begin the crawl back up. This would be a good time to implement other strategies like social media and quality link building.
Now if it's a manual penalty, then you need to clean up all the duplicate content, ask for a manual review, and pray. Manual penalties are hard to overcome and will require much more work. Sometimes it's best to just start from scratch with a new domain.
Hope this helps some.