Your site's pages may be using techniques that are outside Google's Webmaster Guidelines
-
Hi all,
I received the message below from Google Webmaster Tools. Please tell me how to resolve this problem.
Dear site owner or webmaster of http://testedfatburners.com/,
We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
-
My suggestion: follow the Google Webmaster Guidelines if you want Google to index your site. I took a couple of long phrases from your site and found them in several other places on the web. The duplicate content isn't helping, in addition to the things mentioned previously.
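One quick way to spot-check this yourself is to pull a few long sentences from your own pages and run them as exact-match (quoted) searches. Below is a minimal Python sketch of that idea; the page URL is taken from the thread, the phrase-length threshold is an arbitrary placeholder, and the script only builds the quoted search URLs for you to check by hand rather than scraping results.

```python
import re
import urllib.parse
import urllib.request

# Placeholder: the page you want to spot-check for duplicate content.
PAGE_URL = "http://testedfatburners.com/"

def extract_long_phrases(html, min_words=12, max_phrases=5):
    """Pull a few long sentences out of the page text to use as exact-match queries."""
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)   # strip remaining tags
    text = re.sub(r"\s+", " ", text)       # collapse whitespace
    sentences = re.split(r"(?<=[.!?])\s+", text)
    long_ones = [s.strip() for s in sentences if len(s.split()) >= min_words]
    return long_ones[:max_phrases]

def quoted_search_url(phrase):
    """Build a quoted Google query so you can eyeball where else the phrase appears."""
    return "https://www.google.com/search?q=" + urllib.parse.quote(f'"{phrase}"')

if __name__ == "__main__":
    with urllib.request.urlopen(PAGE_URL) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    for phrase in extract_long_phrases(html):
        print(phrase)
        print("  check:", quoted_search_url(phrase))
```

If the same long phrases turn up on several unrelated domains, that's the duplicate-content signal being described here.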
-
Any other suggestions?
-
Here is another example: http://www.moreoldies.com/. It has only 4 pages, but its PR is 3 and it ranks high in the SERPs for keywords like "phen375 custom reviews".
-
I'm sure they'll eventually catch that one too. They are very low quality sites, the very type of sites Google does not want in their index. Sorry, but there's not much of a future for sites like these in Google.
-
This site is clearly spam: it has two pages about diet pills, keyword stuffing, and a spam comment at the bottom for fake Oakleys. Your best bet is to start over with a quality site, or throw in the towel now.
Related Questions
-
Large site with content silos - best practice for deep indexing silo content
Thanks in advance for any advice/links/discussion. This honestly might be a scenario where we need to do some A/B testing.

We have a massive (5 million) content silo that is the basis for our long-tail search strategy. Organic search traffic hits our individual "product" pages, and we've divided our silo by a parent category and then secondarily by a field (so we can cross-link to other content silos using the same parent/field categorizations). We don't anticipate or expect top-level category pages to receive organic traffic - most people are searching for the individual/specific product (long tail). We're not trying to rank or get traffic for searches of all products in "category X"; others are competing and spending a lot in that area (head). The intent of the site structure/taxonomy is to make it easier for bots/crawlers to get deeper into our content silos. We've built the pages for humans, but included the link structure/taxonomy to assist crawlers.

So here's my question on best practices: how to handle categories with 1,000+ pages of pagination. In our most popular product categories, there might be hundreds of thousands of products in one category. My top-level hub page for a category looks like www.mysite/categoryA; the page shows 50 products and then pagination from 1 to 1,000+. Currently we're using rel=next for pagination, and for pages like www.mysite/categoryA?page=6 we make the page reference itself as canonical (not the first/top page, www.mysite/categoryA). Our goal is deep crawl/indexation of our silo. (A rough sketch of this markup pattern follows below.)

I use Screaming Frog and a SEOmoz campaign crawl to sample (the site takes a week+ to fully crawl), and with each of these tools it looks like crawlers have gotten a bit bogged down in large categories with tons of pagination. For example, rather than crawl multiple categories or fields to get to multiple product pages, some bots will hit all 1,000 (rel=next) pages of a single category. I don't want to waste crawl budget going through 1,000 pages of a single category versus discovering/crawling more categories.

I can't seem to find a consensus on how to approach the issue. I can't have a page that lists "all" - there's just too much, so we're going to need pagination. I'm not worried about category pagination pages cannibalizing traffic, as I don't expect any (should I make pages 2-1,000 noindex and canonically reference the main/first page in the category?). Should I worry about crawlers going deep into pagination within one category versus getting to more top-level categories? Thanks!
Moz Pro | | DrewProZ1 -
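For reference on the pagination setup described in the question above: the pattern being described is a self-referencing canonical on each paginated URL plus rel="prev"/"next" link elements. Here is a small, hypothetical Python sketch that just generates those tags for a given page number; the domain, URL pattern, and page counts are made-up placeholders, not the asker's real setup.

```python
# Hypothetical sketch: emit the <link> tags for one page of a paginated category.
# Assumes URLs of the form /categoryA?page=N, with page 1 living at /categoryA.

BASE = "https://www.example.com/categoryA"  # placeholder domain/category

def page_url(page):
    return BASE if page == 1 else f"{BASE}?page={page}"

def pagination_link_tags(page, total_pages):
    """Self-referencing canonical plus rel=prev/next pointers for a paginated series."""
    tags = [f'<link rel="canonical" href="{page_url(page)}">']
    if page > 1:
        tags.append(f'<link rel="prev" href="{page_url(page - 1)}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{page_url(page + 1)}">')
    return "\n".join(tags)

print(pagination_link_tags(page=6, total_pages=1000))
```

Whether to additionally noindex pages 2+ and point their canonicals at the first page is the crawl-budget judgment call the asker is weighing; the sketch only shows the self-canonical variant they currently use.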
Duplicate Content - Multiple URLs
I know a few of these problems come from products being in the same categories, but I have no idea how to get rid of the URLs that are showing duplicate content when the product is in the exact same place. Hard to explain, but here are example URLs:

http://www.ocelco.com/store/pc/www.ocelco.com/store/pc/Bathtub-Floor-Corner-Stainless-Steel-Grab-Bar-Right-Hand-left-hand-pictured-688p3308.htm
http://www.ocelco.com/store/pc/www.ocelco.com/store/pc/Bathtub-Floor-Corner-Stainless-Steel-Grab-Bar-Right-Hand-left-hand-pictured-696p3308.htm
http://www.ocelco.com/store/pc/Bathtub-Floor-Corner-Stainless-Steel-Grab-Bar-Right-Hand-left-hand-pictured-p3308.htm
http://www.ocelco.com/store/pc/Bathtub-Floor-Corner-Stainless-Steel-Grab-Bar-Right-Hand-left-hand-pictured-688p3308.htm

Any ideas how to fix or get rid of these URLs? Thanks! (A rough sketch of one way to normalize the doubled paths follows below.)
Moz Pro | | Mike.Bean0 -
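On the doubled-path URLs above: the first two examples repeat the whole "www.ocelco.com/store/pc/" segment inside the path, so one practical first step is to detect and collapse that duplication (and then 301 the bad forms to the canonical URL). A rough, hypothetical Python sketch of the detection/normalization step, using only the URL pattern visible in the question:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_doubled_path(url):
    """Collapse a path that repeats the host + store prefix inside itself.

    e.g. /store/pc/www.ocelco.com/store/pc/product.htm -> /store/pc/product.htm
    """
    parts = urlsplit(url)
    marker = parts.netloc + "/store/pc/"   # the repeated fragment seen in the examples
    path = parts.path
    if marker in path:
        path = "/store/pc/" + path.split(marker, 1)[1]
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

urls = [
    "http://www.ocelco.com/store/pc/www.ocelco.com/store/pc/Bathtub-Floor-Corner-Stainless-Steel-Grab-Bar-Right-Hand-left-hand-pictured-688p3308.htm",
    "http://www.ocelco.com/store/pc/Bathtub-Floor-Corner-Stainless-Steel-Grab-Bar-Right-Hand-left-hand-pictured-p3308.htm",
]
for u in urls:
    print(normalize_doubled_path(u))
```

The "688"/"696" suffixed variants of the same product would still need a canonical tag or 301 pointing at the base product URL; how to do that depends on the cart software, so it isn't shown here.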
What's the best way to eliminate "429 : Received HTTP status 429" errors?
My company website is built on WordPress. It receives very few crawl errors, but it does regularly receive a few (typically 1-2 per crawl) "429 : Received HTTP status 429" errors through Moz. Based on my research, my understanding is that my server is essentially telling Moz to cool it with the requests. That means it could be doing the same for search engines' bots and even visitors, right? This creates two questions for me, which I would greatly appreciate your help with:

1. Are "429 : Received HTTP status 429" errors harmful for my SEO? I imagine the answer is "yes" because Moz flags them as high-priority issues in my crawl report.
2. What can I do to eliminate "429 : Received HTTP status 429" errors? (A small sketch for probing the rate limit follows below.)

Any insight you can offer is greatly appreciated! Thanks,
Ryan
Moz Pro | | ryanjcormier0 -
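A 429 just means the server (or a rate-limiting plugin/CDN in front of it) is asking the client to slow down, so it can indeed affect crawlers as well as Moz. To see how your own server responds to a handful of quick requests, and to respect any Retry-After header it sends back, here is a small sketch using the Python requests library; the URL and request count are placeholders.

```python
import time
import requests

URL = "https://www.example.com/"   # placeholder: one of your own pages
ATTEMPTS = 5                       # a few quick requests, just to probe the rate limit

for i in range(ATTEMPTS):
    resp = requests.get(URL, timeout=10)
    print(f"request {i + 1}: HTTP {resp.status_code}")
    if resp.status_code == 429:
        # Retry-After is usually a number of seconds; it can also be an HTTP date,
        # so fall back to a fixed wait if it isn't a plain integer.
        retry_after = resp.headers.get("Retry-After", "30")
        wait = int(retry_after) if retry_after.isdigit() else 30
        print(f"  rate limited; backing off {wait}s before retrying")
        time.sleep(wait)
```

If a few quick requests already trigger 429s, the fix is on the server side (raising the rate limit or whitelisting known crawlers), not in the client.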
Low KDS but high DA for all page 1 sites
Hi there. A KDS report question... My client has a keyword with a low KDS (garden hose - 28). Intuitively this seemed too low, and when I checked what ranks, it is all high-quality brand sites with high DAs. Admittedly the specific pages ranking have low PAs, but I can't see my client ever being able to compete with these big boys right now. How come the KDS is so low? This seems just wrong, as the metric implies it is a good keyword to target. I know there are always lots of other things to check, but this does make me less confident in KDS reporting. Has anyone got any thoughts on this? Many thanks
Moz Pro | | Chammy3 -
How do YOU use site explorer?
I normally use Open Site Explorer to identify links that my clients' competitors have, and sometimes this gives me what I call 'low hanging fruit' to go after (and, of course, links that are more challenging to get). I don't know why this didn't occur to me sooner: if my client is a chiropractor, why not look at the links for 50 or 100 of the top-ranking chiropractic sites all over the US? This would HAVE to uncover a wealth of blogs to comment on that have good authority, plus great industry associations, publications, forums - a whole wealth of items. It made me wonder how many people use Site Explorer the way I have been (the top 3-4 competitors your client has) versus identifying links pointing to LOTS of competitors. How do you use it? Couldn't you almost base an entire link building campaign on OSE? If not, why would this be a bad idea? Just some random thoughts. (A rough sketch of mechanizing this follows below.) THE WEEKEND IS ALMOST HERE - Have a great day everybody! 🙂
Moz Pro | | Mrupp441 -
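The workflow described in the question above (pull the link profiles of many top-ranking competitors and look for sources they have in common) is easy to mechanize once you've exported the inbound-link CSVs from Open Site Explorer. Below is a hypothetical Python sketch that counts how many competitor exports each linking domain appears in; the export filenames and the "Source URL" column name are assumptions about the CSV format, not something confirmed in the thread.

```python
import csv
import glob
from collections import Counter
from urllib.parse import urlsplit

link_sources = Counter()

# Assumption: one exported inbound-links CSV per competitor, with a "Source URL" column.
for path in glob.glob("exports/competitor_*.csv"):
    domains_in_file = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            source = row.get("Source URL", "")
            if source:
                domains_in_file.add(urlsplit(source).netloc.lower())
    link_sources.update(domains_in_file)   # count each domain once per competitor

# Domains that link to many competitors are the "low hanging fruit" to chase first.
for domain, count in link_sources.most_common(25):
    print(f"{count:3d} competitors linked from {domain}")
```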
What is the best ranking checker solution for hundreds of sites?
Hello, We used IBP for over 2 years and it worked great. We were able to schedule every client's site to run automatically and email our clients. Now IBP is terrible due to Google's new updates. We are looking for something cost-effective, since we have hundreds of websites we check on a weekly basis. We are looking either for good software that uses proxies to check, or for a service that offers unlimited sites and is cheap per month. We have searched for many; however, there are so many that we aren't sure what is good and what isn't. We tried Jonathan Ledger's new one and it's not good. We looked into Web CEO, and it's priced per number of websites, which is expensive. We tried Cute Rank tracker, which is free, and added proxies, but it doesn't work; it lags out and doesn't even track ranks properly. It wouldn't hurt if it had a built-in report analysis of the website as well. So what's a good one?
Moz Pro | | MarketingOfAmerica0 -
I know our business is listed in Yahoo and medranks.com (for example), but my Open Site Explorer report doesn't show those. However, on their sites I see the listing. Why is this?
I know our business is listed in Yahoo and medranks.com (for example), but my Open Site Explorer report doesn't show those links in the inbound report. However, on their respective sites I see the listing when I search for us, and the link does work. Why is this? Why don't I see it in the Open Site Explorer report?
Moz Pro | | cschwartzel0 -
Does Rank Tracker violate Google's Guidelines?
I would like to find out whether the use of the SEOmoz Rank Tracker violates Google's guidelines: "Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google."
Moz Pro | | PGRob0