Are we being penalized? Can someone assess, please?
-
We have two eCommerce sites. Both sites can broadly be divided into three types of pages:
1. Home page
2. Detail pages
3. Category pages
(Altogether, each site has approximately 3 million pages.)
These are the site URLs:
http://bit.ly/9tRZIi - targeted at a US audience
http://bit.ly/P8MxPR - targeted at a UK audience
The .com domain, which was launched earlier in 2011, is doing okay with decent organic traffic.
Precautions taken: To avoid duplicate content across the two sites, we are using:
a. Geo-targeting through Google Webmaster Tools
b. The rel=alternate tag on printsasia.co.uk (sketched below)
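For reference, here is a rough sketch of what those rel=alternate hreflang annotations look like on a pair of corresponding book detail pages. The URLs below are placeholders for illustration only, not our exact page paths:

```html
<!-- Placeholder URLs, shown for illustration only -->
<!-- In the <head> of the US (.com) detail page: -->
<link rel="alternate" hreflang="en-us" href="http://www.printsasia.com/book/example-isbn" />
<link rel="alternate" hreflang="en-gb" href="http://www.printsasia.co.uk/book/example-isbn" />
<!-- The same pair of annotations also appears in the <head> of the matching UK (.co.uk) detail page -->
```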
Problem
1. The .co.uk domain, which was launched in May 2012, started gaining organic traffic slowly but then suddenly dropped to almost zero after September 18.
2. When we use the operator site:printsasia.co.uk and filter by past week/month, we see no results, while the same operator with "any time" does return some results.
3. According to Webmaster Tools, Google has indexed 95% of the URLs in our sitemap.
Our concern: Is our UK site penalized for some reason? If yes, what could be the possible reason(s) for this penalty, and what steps could get us out of it? We would appreciate it if the experts here could review our site and help us.
-
It doesn't necessarily matter whether the auto-generated content is unique or not - Panda was intended to penalize low-quality content (such as auto-generated content), not just duplicate content.
Even if you were able to figure out a way to auto-generate content that didn't get penalized, there's a good chance you'd get penalized in a future update.
-
Yes, what you see through Copyscape is correct, as that content comes along with the book and will be the same for all retailers and marketplace websites, be it Amazon, B&N or AbeBooks.
Since we could not think of any other way to come up with unique content, we opted for auto-generated content. I slightly differ here, as this auto-generated content is at least partially unique for each page, though I am not 100% confident that this is a great way to go about it.
Yes, reviews are something we are definitely introducing, and this may help.
-
Hi Cyril,
I doubt that the rel=alternate tag will help. Copyscape shows that at least some of the content is duplicated across other sites, not just your two sites.
I also doubt that auto-generated content will help you avoid Panda. That's one of the things Panda was specifically created to penalize - auto-generated content.
If you're getting unique reviews from users and/or writing editor reviews, that very well may help.
I realize that it is impractical to write content for 3 million pages, but you may find that is what you need to do. You may need to start with your top pages and work from there, and in the meantime block indexing of all pages without unique content. I would not take that step hastily, but it may be what you end up having to do.
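If you do go that route, one common approach (just a sketch, not specific to your platform) is a robots meta tag on each thin page:

```html
<!-- Blocks indexing of a thin page while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```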
~Adam
-
Thank you, Adam, for your time and valuable feedback. We also suspected we had been hit by Panda, and as a corrective step we implemented the rel=alternate tag on printsasia.co.uk.
It's been just a few days since we implemented this, so we are unable to say yet whether it is working.
Secondly, to increase the content we are introducing a review program and, at the same time, have also added some auto-generated content, since it is impossible to develop content for 3 million (and growing) pages. If you look at the last "Book Information" section on this page http://bit.ly/QqMAFR you will understand what I mean.
This section will appear on all book detail pages. Your comments after reviewing it would be appreciated.
-
According to this, there was a Panda update on Sept 18, so I suspect that's what hit your site. Panda mainly targets the content of your website - my guess would be that your site was penalized because it has a lot of "thin content" pages. In other words, all your book pages have very little (no?) unique textual content.
FYI, I would say your US site is also in danger of being penalized by Panda and/or Penguin. I see that over 1/3 of linking root domains link to you with the anchor text "buy books online". Over-use of keyword anchor text like that is strongly correlated with getting a Penguin penalty.
-
Good plan. I would wait at least 4 weeks after removing the link before you decide whether or not it's worked.
-
Mark, thanks for your time and valuable feedback. I think you have largely answered my question about why only one site is being penalized and not the other.
Mark, you are right when you say "Looking at your link profile, you simply don't have sufficient volume or diversity of links, nor do you have enough links from high-authority sites within your space".
As I mentioned, the .co.uk site is just 5 months old and it's taking us some time to build links, but we are definitely working on it.
I believe having fewer links can only be the reason for poor PageRank and low positions in the SERPs; it should not be the reason for being penalized. I hope you will agree with this.
As immediate steps:
1. I will first remove the sitewide link and see if this was the reason. If things improve over time, we will put the link back with changed anchor text.
2. We will definitely take care of the blog comments, considering their importance to brand reputation.
-
See my answer below!
-
Thanks for your time. You mean blog.printsasia.com and not blog.bookshopasia, right?
Just a question: blog.printsasia.com is our own official blog, and we have placed links to both sites on that subdomain. If that is the reason for a penalty, then why is our other site not being penalized? Or rather, why is only the .co.uk site penalized while the .com site has no issues?
-
I agree in part with easyrider2. There may be a problem caused by the sitewide header links from your own blog (blog.printsasia.com). These currently use the keyword-rich anchor text "Online Bookshop UK", although I suspect you previously had this as "Bookshop UK", as this is what OSE has picked up. Either way, they look like the kind of links that might be targeted by Penguin, as they don't use your brand name as anchor text.
If the blog was a subdomain of your UK site (blog.printsasia.co.uk), I don't think this would be a problem. But because it's a subdomain of a US site (albeit the same company), this could look like a spammy type of link.
Note: it may be that Google has not penalised you, but has simply decided to discount a set of links, perhaps these ones.
The good news is that as this is your company blog you can quickly change the link.
You could try one of the following:
1. Remove the sitewide link from blog.printsasia.com altogether
2. Change the anchor text to your brand name (eg Printsasia UK)
3. Remove the sitewide link and add a few more "natural" links into blog posts (as easyrider2 suggests)
Personally, I would try 1, assuming it doesn't drive significant traffic to your site. If that helps then you know you've identified a problem.
However, I don't think this is your only problem, and I'm not even convinced it is a problem. Looking at your link profile, you simply don't have sufficient volume or diversity of links, nor do you have enough links from high-authority sites within your space. So even if you "fix" this immediate problem, you still need to focus on some serious linkbuilding (by which I mean relationship building) within your industry.
I agree with easyrider2 about the spammy blog comments. These may not cause a problem with Google but they look very poor to users (and webmasters who might potentially link to your sites).
-
Looking at Open Site Explorer for your UK bookshop, I would say with 99% confidence that you are being penalised because of over-optimisation. It sounds like you got hit in the Penguin refresh around Sept 18.
Your anchor text is nearly all "Bookshops UK" - 236 times, in fact, and the nearest alternative is printsasia.co.uk at 4 times. Plus they are all coming from the same domain (blog.bookshopasia). You need to vary your anchor text. However, I am guessing that the link is in the page template; although I could only see a link for "online bookshop UK", it has to be in there somewhere, as OSE picks it up.
If it is in the template, make that link nofollow and get links from different domains for different keywords.
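As a sketch only (assuming the link sits in the blog's header template), that change might look something like this:

```html
<!-- Hypothetical template link: nofollowed, with branded rather than keyword-rich anchor text -->
<a href="http://www.printsasia.co.uk/" rel="nofollow">Printsasia UK</a>
```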
You also need to get on top of your blog commenting. People using names such as "how to build your own iphone app" are just leaving spammy, worthless comments. Even if you disallow links to their websites, poor content adds nothing to your site.