Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Google Webmaster Tools - content keywords containing spam?
-
Hi all,
When I looked in Google Webmaster Tools today, I found under Google Index > Content Keywords that the list is full of spammy keywords (e.g. "Viagra" at no. 1, and the like).
Around April we built a whole new website, uploaded a new XML sitemap, and did all the other things Google Webmaster Tools suggests when setting up a Google Webmaster account.
Under the "Security Issues" menu, nothing is mentioned.
Altogether I find it hard to believe that the site is hacked - so WHY is Google finding these content keywords on our site?
Should I fear that this will harm my SEO efforts?
Best regards,
Christian
-
I've encountered the same issue on clients' websites before. They had a WordPress site and, as it turned out, a hidden virus that was showing different content to Google than to users - which is why the infection couldn't easily be detected.
I ended up using the WP plugin "Anti-Malware & Brute Force Security" to scan the entire website. It flagged the vulnerable files, which were removed, and everything was cleaned up.
And as previous posters mentioned, Google/Sucuri may not see the malware now, but when they do, you're at risk of the site being blacklisted - so it's best to take care of it right away.
Hope that helps!
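A quick way to test for this kind of cloaking is to request a page twice - once with a normal browser user-agent and once with Googlebot's - and diff the two responses. Here is a minimal sketch, assuming Python with the requests library; the URL and user-agent strings are placeholders:

```python
import difflib

import requests

# Placeholder page to test; swap in a URL from your own site.
URL = "https://example.com/"

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(user_agent):
    """Fetch the page with the given user-agent and return its lines."""
    response = requests.get(URL, headers={"User-Agent": user_agent}, timeout=30)
    return response.text.splitlines()

# Cloaking malware often injects spam keywords or links only when the
# visitor identifies itself as a search engine crawler, so any diff
# between these two responses is worth investigating.
diff = difflib.unified_diff(fetch(BROWSER_UA), fetch(GOOGLEBOT_UA),
                            fromfile="browser", tofile="googlebot", lineterm="")
for line in diff:
    print(line)
```

Note that some infections cloak on the visitor's IP address rather than the user-agent, so an empty diff here doesn't prove the site is clean.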
-
Hi Henrik,
Has this issue been resolved? We'd love an update! Thanks.
Christy
-
Hi,
In the Search Analytics report, could you put the filter on "Pages" rather than "Keywords" and check whether any of the pages listed seem suspicious to you?
Did you try "spammy keyword" site:yoursite.com in Google?
Recently there were quite a few questions about WordPress sites that had been hacked - mainly caused by this vulnerability:
https://blog.sucuri.net/2014/09/slider-revolution-plugin-critical-vulnerability-being-exploited.html
or this one https://blog.sucuri.net/2014/12/revslider-vulnerability-leads-to-massive-wordpress-soaksoak-compromise.html
I guess that Sucuri would normally detect this - but maybe you could double-check that you're not using this kind of slider on your site.
rgds,
Dirk
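To double-check Dirk's slider point, you can read the version headers straight out of the installed plugin files. Below is a minimal sketch, assuming shell access to the WordPress install; the plugins path is a placeholder:

```python
import os
import re

# Placeholder path; point this at your install's plugins directory.
PLUGINS_DIR = "/var/www/html/wp-content/plugins"

# WordPress plugins declare "Version: x.y.z" in a comment block
# at the top of their main PHP file.
version_re = re.compile(r"^\s*\*?\s*Version:\s*(.+)$", re.MULTILINE)

for name in sorted(os.listdir(PLUGINS_DIR)):
    plugin_dir = os.path.join(PLUGINS_DIR, name)
    if not os.path.isdir(plugin_dir):
        continue
    for fname in os.listdir(plugin_dir):
        if not fname.endswith(".php"):
            continue
        with open(os.path.join(plugin_dir, fname),
                  encoding="utf-8", errors="ignore") as f:
            header = f.read(4096)  # headers live at the top of the file
        match = version_re.search(header)
        if match:
            # Report the first file in the plugin that declares a version.
            print(f"{name}: {match.group(1).strip()}")
            break
```

If revslider shows up in the output, compare the printed version against the vulnerable releases described in the Sucuri posts above.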
-
Hi Laura,
Thank you for your answer.
- No comment spam.
- No malware found using Sucuri.
- No problems found using site:example.com.
(And the above is why I am surprised - I have never seen this before.)
So the only reason I can think of is that the old site was spammed/hacked and the new site is not...?
-
Hi Andy,
Thank you for your answer.
The site is built in WordPress.
I have sent you a PM with my website (couldn't find your name, so I PM'ed iq SEO).
Best regards
-
Here's what I would look for:
- Comment spam - Do you have any visible comment spam on your website that includes these types of keywords? If so, remove it all.
- Malware - Some types of malware will create new pages on your site that you can't access through a simple crawl from your home page. Search "site:example.com" in Google, replacing "example.com" with your domain name. Go through all of the results to see if Google has any pages indexed for your site that you don't recognize.
Even if Google doesn't flag your site for malware, it may still be there. Look at the files on the server to see if there are new files that shouldn't be there. Scan your site at https://sitecheck.sucuri.net/ to see if anything comes up there.
If your site is infected with malware, Google will eventually find it and flag your site. Yes, it will certainly harm your SEO efforts.
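For the "look at the files on the server" step, sorting everything under the web root by modification time makes injected files easier to spot. Below is a minimal sketch, assuming shell access to the server; the web root path is a placeholder:

```python
import os
import time

# Placeholder document root; replace with your site's web root.
WEB_ROOT = "/var/www/html"

files = []
for dirpath, _dirnames, filenames in os.walk(WEB_ROOT):
    for fname in filenames:
        path = os.path.join(dirpath, fname)
        try:
            files.append((os.path.getmtime(path), path))
        except OSError:
            pass  # skip files that vanish or can't be stat'd

# Newest first: on a site that rarely changes, injected malware is
# usually among the most recently modified files.
for mtime, path in sorted(files, reverse=True)[:50]:
    print(time.strftime("%Y-%m-%d %H:%M", time.localtime(mtime)), path)
```

Bear in mind that attackers sometimes backdate file timestamps, so treat this as one signal among several rather than a definitive check.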
-
Hi Christian,
It's a little difficult to say what might be going on without being able to actually see the site. You are welcome to PM me this or post it here and I will gladly take a look for you.
"Should I fear that this will harm my SEO efforts?" It all depends. If your site has indeed been hacked, you need to find out what has happened (and how it happened). What did you build the site in?
-Andy
Related Questions
-
Japanese URL-structured sitemap (pages) not being indexed by Bing Webmaster Tools
Hello everyone, I am facing an issue with the sitemap submission feature in Bing Webmaster Tools for a Japanese-language subdirectory project. To outline the key points: the website lives in a subdirectory (example.com/ja/), and the Japanese URLs (when pages are published in WordPress) are not being encoded - they are entered in pure Kanji. Google Webmaster Tools, for instance, has no issues reading and indexing the pages' URLs in its sitemap submission area (all pages are being indexed). With Bing Webmaster Tools it's a different story, though. After the sitemap has been submitted (example.com/ja/sitemap.xml), it reports an error saying it failed to download one part of the sitemap: "page-sitemap.xml" (the sitemap featuring all of the site's pages). That means no URLs have been submitted to Bing either. My suspicion is that Bing Webmaster Tools does not understand the Japanese URLs (or the Kanji, for that matter), so I wonder what the correct way is to go about this. When viewing the sitemap (example.com/ja/page-sitemap.xml) in a web browser, though, the Japanese characters are already displayed as encoded. I am not sure submitting the Kanji-style URLs separately is a solution; in Bing Webmaster Tools this can only be done at the root domain level (example.com). Surely there must be a way to make Bing's sitemap submission understand Japanese-style sitemaps? Many thanks everyone for any advice!
Technical SEO | Hermski0
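On the encoding point: the sitemap protocol expects URLs to be UTF-8 percent-encoded, so one thing to try is encoding the Kanji path segments before they go into the sitemap. A minimal sketch in Python; the example URL is hypothetical:

```python
from urllib.parse import quote

# Hypothetical Japanese URL as WordPress publishes it.
raw_url = "https://example.com/ja/お問い合わせ/"
prefix = "https://example.com"

# Percent-encode the path as UTF-8, leaving the slashes intact,
# which is the form the sitemap protocol expects.
encoded_url = prefix + quote(raw_url[len(prefix):], safe="/")
print(encoded_url)
# https://example.com/ja/%E3%81%8A%E5%95%8F%E3%81%84%E5%90%88%E3%82%8F%E3%81%9B/
```

If the sitemap your browser shows is already encoded this way, it may also be worth fetching page-sitemap.xml directly and checking its HTTP status code, since Bing's error is about downloading the file rather than parsing it.
-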
Sitemap error in Webmaster Tools - 409 error (conflict)
Hey guys, I'm getting this weird error when I submit my sitemap to Google. It says I'm getting a 409 error (conflict) on my post-sitemap.xml file (https://cleargear.com/post-sitemap.xml), but when I check it, it looks totally fine. I am using Yoast SEO to generate the sitemap.xml file. Has anyone else experienced this? Is it a big deal? If so, does anyone know how to fix it? Thanks
Technical SEO | Extima-Christian0
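One thing worth ruling out with a 409 like this is a server, firewall, or CDN rule that answers crawlers differently than browsers. Here is a minimal sketch that fetches the sitemap with two user-agent strings and prints the status codes, assuming Python with the requests library:

```python
import requests

SITEMAP_URL = "https://cleargear.com/post-sitemap.xml"

# If a firewall or CDN rule matches on the user-agent, the sitemap can
# return 200 to a browser while sending an error status to a bot.
user_agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for label, ua in user_agents.items():
    response = requests.get(SITEMAP_URL, headers={"User-Agent": ua}, timeout=30)
    print(f"{label}: HTTP {response.status_code}")
```
-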
If I'm using a compressed sitemap (sitemap.xml.gz), that's the URL that gets submitted to Webmaster Tools, correct?
I just want to verify that if a compressed sitemap file is being used, the URL that gets submitted to Google, Bing, etc., and the URL that's used in robots.txt should indicate that it's a compressed file - for example, "sitemap.xml.gz". Thanks!
Technical SEO | jgresalfi0
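Yes - search engines accept gzip-compressed sitemaps, and the URL you submit (and the one you reference in robots.txt) is the .gz file itself. A minimal sketch of producing the compressed file and the matching robots.txt line, with placeholder paths and domain:

```python
import gzip
import shutil

# Compress an existing sitemap.xml into sitemap.xml.gz.
with open("sitemap.xml", "rb") as src, gzip.open("sitemap.xml.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)

# The robots.txt line then points at the compressed file:
print("Sitemap: https://example.com/sitemap.xml.gz")
```
-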
Bulk URL Removal in Webmaster Tools
One of my WordPress sites was hacked (for about 10 hours), and Google picked up 4000+ URLs in the index. The site is fixed, but I'm stuck with all those URLs in the index. All the URLs are of the form: walkerorthodontics.com/index.php?online-payday-cash-loan.htmloncewe The only bulk removal option I could find was to remove an entire folder, but I can't do that, as it would leave only the homepage and kill off everything else. For some crazy reason, the removal tool doesn't support wildcards, so that obvious solution is right out. So, how do I get rid of 4000 results? And no, waiting around for them to 404 out of the index isn't an option.
Technical SEO | MichaelGregory0
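While chipping away at removals, it's also worth confirming that every hacked URL now returns a 404 or (better) a 410, since that's what lets them drop out on recrawl. A minimal sketch, assuming the hacked URLs have been collected into a text file, one per line; the filename is a placeholder:

```python
import requests

# Placeholder file with one hacked URL per line, e.g. exported
# from the Search Console index or crawl reports.
with open("hacked_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # 404/410 is what we want; a 200 means the cleanup missed a page.
    status = requests.get(url, timeout=30, allow_redirects=False).status_code
    marker = "ok" if status in (404, 410) else "!!"
    print(f"{marker} {status} {url}")
```
-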
Why is Google Webmaster Tools showing 404 Page Not Found Errors for web pages that don't have anything to do with my site?
I am currently working on a small site with approx. 50 web pages. In the crawl error section in WMT, Google has highlighted over 10,000 page not found errors for pages that have nothing to do with my site. Has anyone come across this before?
Technical SEO | Pete40 -
Using the Google Remove URL Tool to remove https pages
I have found a way to get a list of 'some' of my 180,000+ garbage URLs now, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that, my robots.txt file and the URL parameters, I'm hoping to see some change each week. I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL at the front. For example, I add to the removal tool:
https://www.mydomain.com/blah.html?search_garbage_url_addition
On the confirmation page, the URL actually shows as:
http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition
I don't want to accidentally remove my main URL or cause problems. Is this the right way this should look?
AND PART 2 OF MY QUESTION: If you see the search description in Google for a page you want removed that says the following in the SERP results, should I still go to the trouble of putting in the removal request?
www.domain.com/url.html?xsearch_...
A description for this result is not available because of this site's robots.txt – learn more.
Technical SEO | sparrowdog1
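On part 2: that snippet means Google knows the URL exists but isn't allowed to crawl it, so a removal request is still worthwhile. Before submitting, you can check which URLs your robots.txt actually blocks using Python's standard-library parser. A minimal sketch; the domain and URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain; point this at your own robots.txt.
parser = RobotFileParser("https://www.mydomain.com/robots.txt")
parser.read()

test_urls = [
    "https://www.mydomain.com/blah.html?search_garbage_url_addition",
    "https://www.mydomain.com/",
]

for url in test_urls:
    # can_fetch() applies the robots.txt rules as a crawler would.
    allowed = parser.can_fetch("Googlebot", url)
    print(("allowed" if allowed else "blocked"), url)
```
-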
Why is Google's cache preview showing a different version of the webpage (i.e. not displaying content)?
My URL is: http://www.fslocal.com. Recently, we discovered Google's cached snapshots of our business listings look different from what's displayed to users. The main issue? Our content isn't displayed in cached results (while the content isn't visible on the front end of cached pages, the text can be found when you view the page source of that cached result). These listings are structured so everything is coded and contained within one page (e.g. http://www.fslocal.com/toronto/auto-vault-canada/). But even though the URL stays the same, we've created separate "pages" of content (e.g. "About," "Additional Info," "Contact," etc.) for each listing, and only one "page" of content will ever be displayed to the user at a time. This is controlled by JavaScript and by using display:none in CSS. Why do our cached results look different? Why would our content not show up in Google's cache preview, even though the text can be found in the page source? Does it have to do with the way we're using display:none? Are there negative SEO effects with regards to how we're using it (i.e. we're employing it strictly for aesthetics, but is it possible Google thinks we're trying to hide text)? Google's Technical Guidelines recommend against using "fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash." If we were to separate those business listing "pages" into actual separate URLs (e.g. http://www.fslocal.com/toronto/auto-vault-canada/contact/ would be the "Contact" page) and employ static HTML code instead of complicated JavaScript, would that solve the problem? Any insight would be greatly appreciated. Thanks!
Technical SEO | fslocal0 -
How can I find my Webmaster Tools HTML file?
So, totally amateur hour here, but I can't for the life of me find our HTML verification file for Webmaster Tools. I see nowhere to look at it in the Google Webmaster Tools console; I tried a site: search; I googled it - and all the info out there is about how to verify a site. Ours is verified, but I need the verification file code to sync up with the Google API, and no one seems to have it. Any thoughts?
Technical SEO | healthgrades0
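The verification file sits in the site's document root and is named "google" followed by a long token, ending in .html, so with shell access a simple glob will find it. A minimal sketch; the web root path is a placeholder:

```python
from pathlib import Path

# Placeholder document root; replace with your server's web root.
for path in Path("/var/www/html").glob("google*.html"):
    print(path.name)  # e.g. google1234567890abcdef.html (hypothetical token)
```

The verification details in Search Console should also let you re-download the same file if it has gone missing from the server.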