Malicious Software Warnings in Search Console
-
Google Search Console said my site failed due to malicious software being hosted. The files creating the problems are behind a username and login gate and are user-submitted. The host URL is also a subdomain of the main site. Unfortunately, catching all malicious files is an unwinnable game of cat and mouse.
What is the best SEO strategy for this situation?
-
Hi woolbert! Does this response help to answer your question or are you looking for more information? If you're good to go, please mark this as answered. Thanks!
-
Hey Woolbert -
Definitely something you need to solve. If this keeps up, your site may be marked in the SERPs as containing potentially malicious code, or even worse, Google may show a warning interstitial between the SERPs and your site, driving people away.
You need to clean up your site. First, remove the offending files. Then request a review through Search Console to let Google know what you've done. Then implement stricter controls over what can and cannot be uploaded. Without knowing your site, it's impossible to say how feasible this is, but it's what you need to do.
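To make "stricter upload controls" concrete, here is a minimal sketch of one common approach: an extension allowlist combined with a magic-byte check, so a renamed executable can't slip through as a PDF or image. All names and the simplified file signatures here are illustrative assumptions, not details from the poster's site.

```python
# Hypothetical upload gate: allowlist of extensions plus a check that the
# file's leading bytes actually match the claimed format.

ALLOWED_EXTENSIONS = {".png", ".jpg", ".pdf"}

# Simplified leading-byte signatures for each allowed format.
MAGIC_BYTES = {
    ".png": b"\x89PNG",
    ".jpg": b"\xff\xd8\xff",
    ".pdf": b"%PDF",
}

def is_upload_allowed(filename: str, data: bytes) -> bool:
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext not in ALLOWED_EXTENSIONS:
        return False
    # Reject files whose contents don't match the claimed type.
    return data.startswith(MAGIC_BYTES[ext])

print(is_upload_allowed("report.pdf", b"%PDF-1.7 ..."))  # True
print(is_upload_allowed("evil.pdf", b"MZ\x90\x00 ..."))  # False: EXE bytes
```

This won't catch everything (nothing will, as the poster notes), but it raises the bar well above trusting the file extension alone.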
And finally, make sure your site is running HTTPS.
Good luck.
Related Questions
-
After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019; in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" yet listed as "Valid" in the coverage report. Many of them are still hack-related URLs recorded as indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but I have no way of finding out from the Search Console reports why the jump happened or which new URLs were added; the only sort mechanism is last crawled, and they don't show up there. How long can I expect it to take for these remaining URLs to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why or how the number of valid indexed URLs doubled over one weekend?
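One technique sometimes used to speed deindexing after a cleanup (assuming the spammy URLs share a recognizable pattern, which the question doesn't confirm) is serving 410 Gone instead of 404, since 410 signals permanent removal. A hypothetical Apache sketch, with the URL pattern invented for illustration:

```text
# .htaccess sketch: return 410 Gone for hack-generated URL patterns.
# The "cheap-widgets-" prefix is a made-up example.
RewriteEngine On
RewriteRule ^cheap-widgets- - [G]
```

The mod_rewrite `[G]` flag short-circuits the request with a 410 response.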
Intermediate & Advanced SEO | rickyporco0
Search visibility consistently low. No apparent cause.
I'm trying to nail down why a site would be consistently showing a search visibility of 1-2% whilst similar competitors are in the 20-30s. There are no site errors reported. Sitemap, robots.txt, meta titles & descriptions, keyword presence, alt attributes, load times, canonicals, etc. all check out as fine. The backlink profile is healthier than competitors'. Yet even searching for our main product, the result links to an obscure blog page rather than our main site, despite the presence of identical and similar keywords on our homepage, in our title, h1 tags, and web address. Site content and design seem subjectively good and at the very least match better-performing competitor sites'. Does anyone know of any less visible reason why a site would be tanking so badly in search rankings? I have checked using other SEO tools and they all report the same as Moz.
Intermediate & Advanced SEO | SimonZM1
How to fix Duplicate Content Warnings on Pagination? Indexed Pagination?
Hi all! So we have a WordPress blog that properly has pagination tags of rel="prev" and rel="next" set up for pages, but we're still getting crawl errors in Moz for duplicate content on all of our pagination pages. All of our pagination pages are also being indexed; I'm talking pages as deep as page 89 for the home page. Is this something I should ignore? Is it potentially hurting my SEO? If so, how can I start tackling a fix? Would "noindex" or "nofollow" be a good idea? Any help would be greatly appreciated!
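For reference, the "noindex" idea the poster floats is usually expressed as a robots meta tag on the deep paginated pages only; a hedged sketch (the exact pages to apply it to depend on the site):

```text
<!-- Placed in the <head> of /page/2/ and deeper: keeps the page out of
     the index while still letting crawlers follow its links. -->
<meta name="robots" content="noindex, follow">
```

"nofollow", by contrast, would stop link equity flowing through the pagination, which is rarely what's wanted.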
Intermediate & Advanced SEO | jampaper0
Google Search Console Site Property Questions
I have a few questions regarding Google Search Console. Google Search Console tells you to add all versions of your website: https, http, www, and non-www. 1.) Do I then add ALL the information for ALL versions? Sitemaps, preferred site, etc.? 2.) If yes, when I add sitemaps to each version, do I add the sitemap URL of the site version I'm on, or of my preferred version? For instance, when adding a sitemap to a non-www version of the site, do I use the non-www version of the sitemap? Or since I prefer https://www.domain.com/sitemap.xml, do I use it there? 3.) When adding my preferred site (www or non-www), do I use my preferred site on all site versions? (https, http, www, and non-www) Thanks in advance. Answers vary throughout Google!
Intermediate & Advanced SEO | Mike.Bean0
Huge spike in "access denied" in search console
Hey guys, We have seen a huge spike in "Access Denied" status in Google Search Console for our website, and I have no idea why that would be the case. Is there anyone who can shed some light on what is going on, or who can point me in the direction of an SEO specialist that we can pay to fix the issue? Thanks. denied.png
Intermediate & Advanced SEO | fbchris0
Site: search showing funny results
Hi, When I do a site: search on my domain, the very last result it returns is a URL that is listed under my domain but does not exist on my website. When clicked, it redirects to a really spammy page. If I'm not being clear just let me know; it's quite hard to explain the situation! Any thoughts on how to get rid of this?
Intermediate & Advanced SEO | TheZenAgency0
How should I handle URLs created by an internal search engine?
Hi, I'm aware that internal search result URLs (www.example.co.uk/catalogsearch/result/?q=searchterm) should ideally be blocked using the robots.txt file. Unfortunately the damage has already been done, and a large number of internal search result URLs have already been created and indexed by Google. I have double-checked, and these pages only account for approximately 1.5% of traffic per month. Is there a way I can remove the internal search URLs that have already been indexed and then stop this from happening in the future? I presume the last part would be to disallow /catalogsearch/ in the robots.txt file. Thanks
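The robots.txt part the poster presumes would look roughly like this (path taken from the question's example URL):

```text
# robots.txt sketch: stop crawling of internal search result pages.
User-agent: *
Disallow: /catalogsearch/
```

One caveat worth hedging on: a robots.txt block stops crawling but does not by itself remove already-indexed URLs, and it prevents crawlers from ever seeing a noindex on those pages. The already-indexed URLs are typically handled first (e.g. via a noindex directive or a Search Console removal request) before the Disallow goes in.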
Intermediate & Advanced SEO | GrappleAgency0
How to Hide Directories in Search?
I noticed bad 404 error links in Google Webmaster Tools, and they were pointing to directories that do not have an actual page but hold information. Ex: there are links pointing to our PDF folder, which holds all of our PDF documents. If I type in example.com/pdf/, it brings up an unformatted webpage that displays all of our PDF links. How do I prevent this from happening? Right now I am blocking these in my robots.txt file, but if I type them in, they still appear. Or should I not worry about this?
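Assuming an Apache host (not stated in the question), the bare file listing described here is auto-generated directory indexing, and it can be switched off at the server rather than hidden via robots.txt; a hypothetical .htaccess fragment for the /pdf/ folder:

```text
# Disable the auto-generated directory listing for this folder.
Options -Indexes

# Optionally also ask search engines not to index the files served here.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex"
</IfModule>
```

With `-Indexes` set, requests for example.com/pdf/ return 403 Forbidden instead of a file listing, while direct links to individual PDFs keep working.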
Intermediate & Advanced SEO | hfranz0