Your site's pages may be using techniques that are outside Google's Webmaster Guidelines
-
Hi All,
I received the message below from Google Webmaster Tools. Please tell me how I can solve this problem.
Dear site owner or webmaster of http://testedfatburners.com/,
We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
-
My suggestion: follow the Google Guidelines if you want Google to index your site. I took a couple of long phrases from your site and found them in several other places on the web. The duplicate content isn't helping, in addition to the things mentioned previously.
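To show what "finding the same long phrases elsewhere" amounts to, here's a minimal sketch of a near-duplicate check using word shingles and Jaccard similarity. This is only an illustration (the sample sentences are made up, and Google's actual duplicate detection is far more sophisticated):

```python
# Rough sketch: score how similar two passages are using overlapping
# k-word windows (shingles) and Jaccard similarity on the shingle sets.

def shingles(text, k=5):
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two sets, from 0.0 (disjoint) to 1.0 (identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical page snippets, one lightly reworded from the other.
page_a = "Phen375 is a powerful fat burner that helps you lose weight fast"
page_b = "Phen375 is a powerful fat burner that helps you shed pounds fast"

similarity = jaccard(shingles(page_a), shingles(page_b))
print(f"Similarity: {similarity:.2f}")  # values near 1.0 suggest copied content
```

Pulling a few long phrases from a page and searching for them in quotes, as described above, is the manual equivalent of this comparison.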
-
Any other suggestions?
-
Here's another example: http://www.moreoldies.com/ has only 4 pages, but its PR is 3 and it ranks high in the SERPs for keywords like phen375 custom reviews.
-
I'm sure they'll eventually catch that one too. They are very low quality sites, the very type of sites Google does not want in their index. Sorry, but there's not much of a future for sites like these in Google.
-
This site is clearly spam. It has two pages about diet pills, there's keyword stuffing, and a spam comment at the bottom for fake Oakleys. Your best bet is to start over with a quality site or throw in the towel now.
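Since keyword stuffing came up: a crude way to see it is to measure what fraction of a page's words a single phrase accounts for. This is a rough illustrative heuristic with a made-up sample (it is not a published Google signal, and any threshold you pick is a judgment call):

```python
# Rough sketch: fraction of word positions on a page occupied by one phrase.

def keyword_density(text, phrase):
    """Return the fraction of words in text that belong to occurrences of phrase."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words) if words else 0.0

# Hypothetical stuffed snippet of the kind described above.
body = "phen375 reviews best phen375 reviews buy phen375 reviews cheap phen375 reviews"
density = keyword_density(body, "phen375 reviews")
print(f"Density: {density:.0%}")  # a large fraction like this is a red flag
```

On a normal page, a target phrase rarely accounts for more than a few percent of the text; a number like the one this sample produces is a sign of stuffing.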