So What On My Site Is Breaking The Google Guidelines?
-
I have a site that I'm trying to rank for the Keyword "Jigsaw Puzzles"
I was originally ranked around #60, and then all of a sudden my site stopped ranking for that keyword (my other keyword rankings stayed).
I contacted Google via a reconsideration request and got the generic response...
So I went through and deleted as many links as I could find that I thought Google may not have liked... heck, I even removed links that I don't think I should have, JUST so I could have this fixed.
I responded with a list of all the links I removed, and also any links that I tried to remove but couldn't, for whatever reason.
They are STILL saying my website is breaking the Google guidelines... mainly around links.
Can anyone take a peek at my site and see if there's anything on the site that may be breaking the guidelines? (because I can't)
Website in question: http://www.yourjigsawpuzzles.co.uk
UPDATE:
Just to let everyone know that after multiple reconsideration requests, this penalty has been removed.
They stated it was a manual penalty.
I tried removing numerous different types of links but they kept saying no, it's still breaking rules.
It wasn't until I removed some website directory links that they removed this manual penalty.
Thought it would be interesting for some of you guys.
-
Great news, Rhys!
-
It's potentially quicker to rank well if you build back up on a fresh domain with no poor history. That said, who's to say Google doesn't have methods in place to identify domain owners doing this - potentially by comparing content, code and copy - so you might end up redoing everything on your website just to be safe.
Sticking with the same domain just means that you have to build a relatively significant number of natural links to bring down the ratio of that anchor text to total external backlinks in the profile. It's doable, though, if you subscribe to a few blogs and regularly comment on articles, and maybe write some content for publication at toy (or other related) blogs - ensuring that you avoid blog rings, link networks and farms.
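To make the anchor-text ratio point concrete: you can tally the anchor distribution from any backlink export (an Open Site Explorer CSV, for instance). The anchors below are made up purely for illustration:

```python
from collections import Counter

# Hypothetical anchor texts pulled from a backlink export - not real data.
anchors = [
    "jigsaw puzzles", "jigsaw puzzles", "jigsaw puzzles",
    "jigsaw puzzles", "yourjigsawpuzzles.co.uk", "click here",
    "jigsaw puzzles", "puzzle shop",
]

counts = Counter(anchors)
total = len(anchors)

# Print each anchor's share of the profile, most common first.
for anchor, n in counts.most_common():
    print(f"{anchor}: {n}/{total} ({n / total:.0%})")
```

If one commercial phrase accounts for the majority of anchors (here 5 of 8, roughly 62%), that's the ratio you'd be working to dilute with branded and natural anchors.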
-
In a situation like this, on a fairly new domain, would it be quicker to start from scratch on a new domain?
-
View your link profile here.
Links are mostly coming from low-authority sources and mostly contain the same anchor text. This is what you'll have to work on: start building natural links with varying anchor text to counterweight the poor link profile and history on your domain.
-
Sounds like you're an excellent candidate for some fun memes attempting to gain social traction!
-
Hi Rhys,
I agree with everything that Alan stated regarding existing links.
Moving forward, I'd suggest the following:
- Add the ability for customers to share your products socially. I don't see any social media icons on any of the pages, especially the product pages. Add FB, Twitter, Pinterest, Google Plus.
- Do you have social media accounts for your site? If not, create the four above and start posting! You'll get more of a sense of community, and people will be able to share which puzzles they've completed, which ones they want to purchase next, etc. I'm not personally into puzzles, but I know people who are, and they can't wait to get their next one as soon as they finish one.
- Highlight your competitive advantage more (in the item template, page titles etc.). What makes you stand out? Free shipping (by the way, it's really confusing having two free-shipping thresholds, one in dollars and one in pounds), the best customer service, fast shipping, the latest puzzles? Give people a reason to shop with you.
- You've got review functionality, but none of the products I viewed had any reviews. I'd suggest emailing customers 3-4 weeks after purchase asking for reviews, if you don't already. This would also tie in nicely with your social media pages. Reviews are great for original content.
- Your blog has no entries and is dated from 2010? This doesn't look great...
- If you're struggling to get good/unique content on the site try adding more pictures/videos/staff testimonials/staff favourites etc.
Hope some of that helps
-
"When you factor in hundreds of links on every page" - I don't really see how I could reduce the number of links on a page. As it's an eCommerce website, there's nothing on there that I can see that would be helpful to remove.
"almost no depth of content" - Yeah, this is a problem we've run into. The problem is that a jigsaw puzzle of a cat is exactly that: there's not much more you can add content-wise. Even if you try to force extra content out, the most we can get is "this cat looks like he's relaxing in the garden shed."
"the ability to find products through several paths" - I don't think we can really change this, as the products genuinely do fall under multiple categories. We've implemented canonicalization.
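For readers unfamiliar with the term: canonicalization here means each category path to a product declares one preferred URL, so Google consolidates the duplicates. A sketch with hypothetical paths (not taken from the actual site):

```html
<!-- On every variant URL for the same product, e.g.
     /animal-puzzles/cat-puzzle and /1000-piece/cat-puzzle,
     point at the single preferred URL: -->
<link rel="canonical" href="http://www.yourjigsawpuzzles.co.uk/cat-puzzle" />
```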
-
A quick review in Open Site Explorer shows that since you apparently don't have a huge volume of links, there's way too many coming from blatantly spam based domains: wywlinks.com, onelinkseo.com, contentrichdirectory.com, organisedlinks.com, yourlinkmarket.com, regularseo.com, elaboratedirectory.com, greatindexdirectory.com, linksmaximum.com, directorysuper.com, gatewayoflinks.com....
Even if you've cleared some of these out, the overall picture is that no great effort was put into obtaining high-quality off-site signals - that it was an attempt to game the Google system. Since you say you've done what you could to remove links, it's possible that I'm looking at a "before" snapshot from within OSE, so I can't definitively say this is the issue, but it sure smells like it.
From there, when you factor in hundreds of links on every page, almost no depth of content, and the ability to find products through several paths (leading to duplication issues), the site gives the appearance of being "link polluted" both inbound and on-site.
So I'd say clear out all the links you can from directories. Dramatically reduce the on-site link structure, and if you want multiple paths to products, block some of those from indexing.
Then work to get more depth of descriptive text content on your category pages, and work to get high quality off-site recognition.
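If you go the blocking route suggested above, a minimal robots.txt sketch might look like the following (the browse paths are hypothetical, not taken from the site):

```
# robots.txt - keep secondary browse paths out of the crawl so each
# product is reachable through one indexable path
User-agent: *
Disallow: /browse-by-theme/
Disallow: /browse-by-brand/
```

An alternative that still lets crawlers follow the links is a robots meta noindex on the duplicate paths rather than blocking crawling outright.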
Related Questions
-
Google Ignoring Canonical Tag for Hundreds of Sites
Bazaar Voice provides a pretty easy-to-use product review solution for websites (especially sites on Magento): https://www.magentocommerce.com/magento-connect/bazaarvoice-conversations-1.html
If your product has over a certain number of reviews/questions, the plugin cuts off the number of reviews/questions that appear on the page. To see the reviews/questions that are cut off, you have to click the plugin's next or back function. The next/back buttons' URLs have a parameter of "bvstate....."
I have noticed Google is indexing this "bvstate..." URL for hundreds of sites, even with the proper rel canonical tag in place. Here is an example with Microsoft: http://webcache.googleusercontent.com/search?q=cache:zcxT7MRHHREJ:www.microsoftstore.com/store/msusa/en_US/pdp/Surface-Book/productID.325716000%3Fbvstate%3Dpg:8/ct:r+&cd=2&hl=en&ct=clnk&gl=us
My website is seeing hundreds of these "bvstate" URLs being indexed even though we have a proper rel canonical tag in place. It seems that Google is ignoring the canonical tag. In Webmaster Console, the main source of my duplicate titles/metas in the HTML improvements section is the "bvstate" URLs.
I don't necessarily want to block "bvstate" in robots.txt, as that would prohibit Google from seeing the reviews that were cut off. The same goes for prohibiting Google from crawling "bvstate" in the Parameters section of Webmaster Console. Should I just keep my fingers crossed that Google honors the rel canonical tag?
Home Depot is another site that has this same issue: http://webcache.googleusercontent.com/search?q=cache:k0MBLFcu2PoJ:www.homedepot.com/p/DUROCK-Next-Gen-1-2-in-x-3-ft-x-5-ft-Cement-Board-172965/202263276%23!bvstate%3Dct:r/pg:2/st:p/id:202263276+&cd=1&hl=en&ct=clnk&gl=us
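One workaround often suggested for pagination like this, since it leaves the reviews crawlable while keeping the duplicates out of the index, is a robots meta tag served only on the "bvstate" variants (a sketch, not something from the Bazaar Voice documentation):

```html
<!-- Serve this only when the URL carries the bvstate parameter -->
<meta name="robots" content="noindex, follow" />
```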
Intermediate & Advanced SEO | | redgatst1 -
Google and JavaScript
Hey there! Recent announcements from Google encourage webmasters to let Google crawl JavaScript:
http://www.googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html
http://googlewebmastercentral.blogspot.com/2014/05/rendering-pages-with-fetch-as-google.html
We have always put JS and CSS behind robots.txt, but we are now considering taking them out of robots. Any opinions on this?
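For anyone weighing the same change: it usually just means deleting (or overriding) the old Disallow rules. A hedged sketch, assuming the assets live under hypothetical /js/ and /css/ paths:

```
User-agent: *
# Previously: Disallow: /js/ and Disallow: /css/
# Removing those lines (or explicitly allowing, as below)
# lets Googlebot fetch the assets and render pages fully.
Allow: /js/
Allow: /css/
```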
Intermediate & Advanced SEO | | CleverPhD0 -
Site not indexed in Google UK
This site was moved to a new host by the client a month back and is still not indexed in Google UK if you search for the site directly. www.loftconversionswestsussex.com Webmaster tools shows that 55 pages have been crawled and no errors have been detected. The client also tried the "Fetch as Google Bot" tactic in GWT as well as running a PPC campaign and the site is still not appearing in Google. Any thoughts please? Cheers, SEO5..
Intermediate & Advanced SEO | | SEO5Team0 -
Google Indexed my Site then De-indexed a Week After
Hi there, I'm working on getting a large e-commerce website indexed and I am having a lot of trouble. The site is www.consumerbase.com. We have about 130,000 pages and only 25,000 are getting indexed. I use multiple sitemaps so I can tell which product pages are indexed, and we need our "Mailing List" pages the most - http://www.consumerbase.com/mailing-lists/cigar-smoking-enthusiasts-mailing-list.html
I submitted a sitemap a few weeks ago of a particular type of product page, and about 40k of the 43k pages were indexed - great! A week ago Google de-indexed almost all of those new pages. Check out this image, it kind of boggles my mind and makes me sad: http://screencast.com/t/GivYGYRrOV
While these pages were indexed, we immediately received a ton of traffic to them - making me think Google liked them. I think our breadcrumbs, site structure, and "customers who viewed this product also viewed" links would make the site extremely crawlable. What gives? Does it come down to our site not having enough Domain Authority? My client really needs an answer about how we are going to get these pages indexed.
Intermediate & Advanced SEO | | Travis-W0 -
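The per-product-type sitemap setup described above is typically built as a sitemap index file, so Webmaster Tools reports indexed counts per child sitemap (the filenames here are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One child sitemap per page type; indexation is then reported per file -->
  <sitemap><loc>http://www.consumerbase.com/sitemap-mailing-lists.xml</loc></sitemap>
  <sitemap><loc>http://www.consumerbase.com/sitemap-products.xml</loc></sitemap>
</sitemapindex>
```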
Google and private networks?
I have one or two competitors (in the UK) in my field who buy expired 1 - 8 year old domains on random subjects (SEO, travel, health, you name it). They are in the printing business, and they stick 1 - 2 articles (unrelated to what was on there before) on these domains and that's it. I think they stick with PA and DA above 30, and most have 10 - 100 links, so well-used expired domains, hosted in the USA, and most have different IPs, although they now have so many (over 70% of their backlink profile) that some share the same IP. On further investigation, none of the blogs have any contact details, but it does look like they have been a little smart here and added content to the about us page (along the lines of "I used to run xxx but now do xxx"); they also have one or two tabs with article-length content on the same subject the blog used to cover, with matching titles. So basically they are finding expired 1 - 10 year old domains that have only been expired for six months at most (from what I can see), putting 1 - 2 print-related articles on the home page (maybe adding a third on the subject the blog used to cover), adding 1 - 3 articles via tabs at the top on subjects the sites used to cover, registering the details via xbybssgcf@whoisprivacyprotect.com, and that's it. They have been ranking via this method for the last couple of years (through all the Google updates) and still do extremely well. Does Google not have any way to combat link networks other than the obvious stuff such as public link networks? It just seems that if you know what you are doing you get away with it, and if you're big enough you get away with it, but the middle-of-the-road (mum and pop) sites get screwed over with spam pointing to their site that no spammer would dream of doing anyway.
Intermediate & Advanced SEO | | BobAnderson0 -
Multiple Google+ Local (Google Place) under one email address
As an automotive dealership group, we have 15+ business listings set up under one Google+ Local account. Google+ Local (Google Places) offers the ability to upload a data file for 10+ listings, so we've kept all listings under one login for efficiency. Is there any specific local SEO benefit, or any general benefit at all, to having each business listing set up under its own separate email address?
Intermediate & Advanced SEO | | autoczar0 -
Google +1 and Yslow
After adding Google's +1 script and call to our site (loading asynchronously), we noticed YSlow is giving us a D for not having expires headers for the following scripts:
https://apis.google.com/js/plusone.js
https://www.google-analytics.com/ga.js
https://lh4.googleusercontent.com...
1. Is there a workaround for this issue, so expires headers are added to the plusone and GA scripts? Or are we being too nit-picky about this issue?
Intermediate & Advanced SEO | | GKLA0 -
How to best utilize network of 50 sites to increase traffic on main site
Hey All, First off I want to thank everyone who has responded to all my previous questions! Love to see a community that is so willing to help those who are learning the ropes. Anyways, back to my point. We have a main site that is a PR 3 and our main focal point for lead generation. We recently acquired 50 additional sites (all with a PR of 1-3) that we would like to use for our own little backlinking campaign. All the domains are completely relevant to our main site, as well as to specific pages within our main site. I know that reciprocal links will get me nowhere and that Google is quickly on to the attempted three-way link exchange. My question is: how do I best link these 50 sites to not only maintain their own integrity and PR but also assist our main site? Thanks All!
Intermediate & Advanced SEO | | deuce1s0