Help Understanding GWT Message
-
Brief background: A few months ago, our firm exchanged blog posts (with followed links) with another law firm in Pennsylvania. Though we did exchange links, the posts weren't spammy. They wrote "A Floridian's Guide To A Car Accident In Pennsylvania" and we wrote one for Pennsylvanians in Florida. (The reason for this is that personal injury law varies drastically from state to state, and Florida has a ton of people who move back and forth.)
My question: His firm got a message from Google saying our link to him violated Google's guidelines. I went and removed the link, BUT I didn't get any message saying his link to our site was a violation. Shouldn't we both have gotten messages? Perhaps mine is "in the mail," so to speak, but I would think both would go out at the same time, so I'm wondering if there is another possible reason?
Thanks,
Ruben
-
Yes, I got it from Majestic SEO. View the page source, then Ctrl+F and type in 'daisy'. A lot of the domains have a lot of exact-match anchors hidden behind giant images.
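Travis's "view source + Ctrl+F" check can be automated if you're auditing more than one page. Here's a minimal stdlib-only sketch; the sample HTML (and the `example.com` URL in it) is hypothetical, standing in for fetched page source:

```python
# Find anchor tags whose visible text contains a suspect phrase,
# e.g. exact-match anchors tucked behind a giant image.
from html.parser import HTMLParser


class AnchorCollector(HTMLParser):
    """Collect (href, anchor text) pairs from raw HTML."""

    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.href = ""
        self.text_parts = []
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor = True
            self.href = dict(attrs).get("href") or ""
            self.text_parts = []

    def handle_endtag(self, tag):
        if tag == "a" and self.in_anchor:
            self.anchors.append((self.href, "".join(self.text_parts).strip()))
            self.in_anchor = False

    def handle_data(self, data):
        if self.in_anchor:
            self.text_parts.append(data)


def find_anchors(html, needle):
    """Return (href, text) pairs whose anchor text contains `needle`."""
    parser = AnchorCollector()
    parser.feed(html)
    return [(h, t) for h, t in parser.anchors if needle.lower() in t.lower()]


# Hypothetical page: an exact-match anchor next to a large image.
sample = '<div><img src="big.jpg"><a href="http://example.com/shop">daisy fondant cutters</a></div>'
print(find_anchors(sample, "daisy"))
```

In practice you'd feed `find_anchors` the source of each page a backlink tool flags, rather than eyeballing Ctrl+F hits one page at a time.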
The registrant of the site probably has no idea this all happened.
-
I'm searching for that anchor text with a site:sexualproblems.net, and once the site comes up (it takes a while), I'm not seeing that anchor text on the site itself, though Google did see it at one point. I wonder if their site got hacked?
Travis, it'd be great if you told us which tool you used. Thanks!
-
Travis,
Can you show me how you're seeing we got this information, so I can figure out what to do with it? I just ran our site through OSE, and I'm not seeing that. Maybe I'm just not looking in the right spot.
Thanks,
Ruben
-
We never posted on sexualproblems.com. Well, at least I didn't, but thanks for pointing that out. I'll definitely look into it.
-
Your site appears to have received a little less than 70K links from sexualproblems.net in the last 90 days. The anchor text is "daisy fondant cutters". It looks like the result of a post on sexualproblems.com. I could go on, but it looks like the guest posting strategy isn't working out.
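For anyone who wants to reproduce this kind of finding from a backlink export, here's a hedged sketch. It assumes a CSV with `SourceURL` and `AnchorText` columns (those column names are my assumption; Majestic's actual export headers may differ) and tallies links per referring-domain/anchor pair so a spike like ~70K "daisy fondant cutters" links stands out immediately:

```python
# Tally backlinks by (referring domain, anchor text) from a CSV export.
# Column names are assumed; adjust them to match your tool's real export.
import csv
import io
from collections import Counter
from urllib.parse import urlparse


def anchor_counts(csv_text):
    counts = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        domain = urlparse(row["SourceURL"]).netloc
        counts[(domain, row["AnchorText"])] += 1
    return counts


# Tiny in-memory sample standing in for a real export file.
sample_export = """SourceURL,AnchorText
http://sexualproblems.net/page-1,daisy fondant cutters
http://sexualproblems.net/page-2,daisy fondant cutters
http://example.com/guest-post,kemp ruge
"""

counts = anchor_counts(sample_export)
for (domain, anchor), n in counts.most_common():
    print(f"{n:>6}  {domain}  {anchor}")
```

Sorting by count puts the worst domain/anchor pairs at the top, which is usually all you need to spot injected or negative-SEO links.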
-
Hi Ruben,
Since these are "manual" messages, a Google employee had to personally review the account and decide to send the message. It is likely that something about the site flagged it for Google's attention, prompting the manual review. We don't really have a way of knowing whether what they address in the message is what flagged the site, but it is likely.
If Google thinks what this site did violated the guidelines, that means any site, including your own, that is in violation of that guideline has the potential to receive the same message.
It is likely that yours is "in the mail." You could receive it a week from now or ten months from now.
I would recommend cleaning up or nofollowing the link. You might consider waiting until the other site has submitted its reconsideration request, to find out whether your actions to remove the link were acceptable.
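If you go the nofollow route, you can sanity-check the cleanup by listing any outbound links that still lack `rel="nofollow"`. A minimal stdlib-only sketch (the sample URLs are hypothetical):

```python
# List hrefs of <a> tags that do NOT carry rel="nofollow".
from html.parser import HTMLParser


class FollowedLinkFinder(HTMLParser):
    """Collect hrefs of anchors whose rel attribute lacks "nofollow"."""

    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            rel_tokens = (a.get("rel") or "").split()
            if "nofollow" not in rel_tokens:
                self.followed.append(a.get("href") or "")


def followed_links(html):
    parser = FollowedLinkFinder()
    parser.feed(html)
    return parser.followed


# Hypothetical post body: one link cleaned up, one still followed.
sample = (
    '<a href="http://other-firm.example/guide" rel="nofollow">their guide</a>'
    '<a href="http://still-followed.example/">missed this one</a>'
)
print(followed_links(sample))
```

Running this over the flagged post's HTML before filing a reconsideration request is a cheap way to confirm nothing was missed.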
-
Here's a link to the post that got flagged (the links are removed now): http://www.kempruge.com/a-floridians-guide-to-a-car-accident-in-pennsylvania/