My site was hacked and spammy URLs pointing to external sites were injected. The issue was fixed, but GWT is still reporting more of these links.
-
Excuse me for posting this here, I wasn't having much luck going through GWT support.
We recently moved our eCommerce site to a new server, and in the process the site was hacked. Spammy URLs were injected, all of which pointed outward to some spammy eCommerce retail stores. I removed ~4,000 of these links, but more continue to pile in. As you can see, there are now over 20,000 of these links. Note that our server support team does not see these links anywhere.
I understand that Google doesn't generally view this as a problem. But is that true given my circumstance? I cannot imagine that 20,000 new, senseless 404's can be healthy for my website.
If I can't get a good response here, would anyone know of a direct Google support email or number I can use for this issue?
-
Hi
Yeah, let's say they use Xrumer.
They hack your site, insert pages of their own, and links on your pages.
They put those URLs in text files based on their keyword targets/groups.
They run the software using those lists with their link sources and their auto-insert random URL template.
That pings a 404 on your site, so the 404 shows up in GWT.
If these are pros, they already know the pages are dead by now, as they confirm links after each run. It just takes a bit more time for GWT to get notified, so you'll see them trickle in.
So you'll see those 404 pages getting links from different dates.
Hope that helps
-
I don't understand why more spam links would be coming in though. Is it because the spam network doesn't realize that I've removed the injected pages? In other words, are they unknowingly linking to 404s?
-
Since those URLs are already gone after you cleaned it up, you can just mark those as fixed. GWT usually is pretty late with picking those up. I've handled my share of hacked sites, some with invisible links.
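If it helps, before marking them as fixed you can sanity-check that the injected URLs really do return 404s now. A minimal sketch, assuming you have the URL list in a plain text file (one per line, e.g. downloaded from the Crawl Errors report) and the `requests` library installed:

```python
import requests

# Hypothetical file: one injected URL per line.
with open("crawl_errors.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

still_live = []
for url in urls:
    try:
        # HEAD keeps it light; some servers only respond properly to GET.
        resp = requests.head(url, allow_redirects=True, timeout=10)
    except requests.RequestException:
        continue  # unreachable counts as gone for our purposes
    if resp.status_code not in (404, 410):
        still_live.append((url, resp.status_code))

for url, status in still_live:
    print(f"{status}  {url}")  # anything listed here is NOT safe to mark as fixed yet
```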
If they appear again, then you'll need to find where they are getting through. It's a pain, but you have to fully check your files for injected scripts and obfuscated code.
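For what it's worth, here's a rough sketch of the kind of check I mean. It just walks the site's files and flags the usual obfuscation patterns (eval, base64_decode, gzinflate); the directory path is only an example, and a hit isn't proof of a hack, just something to review by hand:

```python
import os
import re

# Example path only; point this at your actual document root.
SITE_ROOT = "/var/www/html"

# Patterns commonly seen in injected PHP; every match needs manual review,
# since legitimate code can use these too.
SUSPICIOUS = re.compile(
    rb"(eval\s*\(|base64_decode\s*\(|gzinflate\s*\(|str_rot13\s*\(|preg_replace\s*\(.*/e)"
)

for dirpath, _dirnames, filenames in os.walk(SITE_ROOT):
    for name in filenames:
        if not name.endswith((".php", ".js", ".htaccess")):
            continue
        path = os.path.join(dirpath, name)
        try:
            with open(path, "rb") as f:
                data = f.read()
        except OSError:
            continue
        if SUSPICIOUS.search(data):
            print("Check manually:", path)
```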
Aside from those, it's just time. Google will eventually stop showing them.
Good luck Andrew!
PS. You might want to look at some of your pages using Google's cache result. You can see invisible links using that. Just in case you haven't done this part.
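PPS. If you'd rather script that check than eyeball the cache, here's a rough sketch that flags links hidden with inline styles. The URL is just a placeholder, it assumes `requests` and `BeautifulSoup` are installed, and it only catches the crude inline-style tricks, not hiding done via CSS files or JavaScript:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical URL; swap in one of your own pages.
page = requests.get("https://www.example.com/some-page", timeout=10)
soup = BeautifulSoup(page.text, "html.parser")

for a in soup.find_all("a", href=True):
    # Gather inline styles on the link and every ancestor element.
    styles = [a.get("style") or ""] + [p.get("style") or "" for p in a.parents]
    blob = " ".join(styles).replace(" ", "").lower()
    if ("display:none" in blob or "visibility:hidden" in blob
            or "font-size:0" in blob):
        print("Possibly hidden link:", a["href"])
```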
-
Thank you for your response.
I also believe I was hacked through my WordPress install. What exactly did you do once you realized the .htaccess file was changed? Did you change it back to whatever code was there before?
I already submitted a reconsideration request to Google and it was successful. I no longer have "this site may be hacked" in the SERPs, but I still have thousands of URLs pointing to 404 pages.
-
Same thing happened to me last month due to a security breach in a plugin that was part of my WordPress theme.
After hacking the site with injected URLs, they also altered the .htaccess file (check that out). They changed it so that if you entered your URL directly you saw the correct version of your site, but if you came in from a Google search, traffic went to those spammy Viagra pages.
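For illustration, here's a rough way to spot that kind of referrer cloaking from the outside. The URL is hypothetical, it assumes the `requests` library, and some hacks key off the user agent or Googlebot's IP instead, so it won't catch everything:

```python
import requests

URL = "https://www.example.com/"  # hypothetical; use one of your own pages

def fetch(referer=None):
    headers = {"Referer": referer} if referer else {}
    resp = requests.get(URL, headers=headers, allow_redirects=True, timeout=10)
    return resp.url, resp.status_code, len(resp.text)

direct = fetch()
from_google = fetch("https://www.google.com/")

print("Direct visit:       ", direct)
print("With Google referer:", from_google)
# If the final URL or page size differs wildly between the two,
# the .htaccess (or injected PHP) is probably still redirecting search traffic.
```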
I also received a manual action on my site.
What I did:
1- Removed the injected files that were creating the spammy URLs
2- Edited the .htaccess file, locating the code they had changed (see the sketch after this list)
3- Submitted a reconsideration request explaining what was happening
4- Removed in Webmaster Tools all the spammy URLs created on my site, to get them out of Google's index
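On step 2, one way to pin down exactly what they changed is to diff the live .htaccess against a known-good backup, if you have one. A minimal sketch (the file paths are just examples):

```python
import difflib

# Example paths only; point these at your backup and the live file.
with open("backup/.htaccess") as f:
    clean = f.readlines()
with open("live/.htaccess") as f:
    current = f.readlines()

# Lines starting with "+" were added since the backup -- that's where
# the injected RewriteCond/RewriteRule blocks usually show up.
for line in difflib.unified_diff(clean, current, fromfile="backup", tofile="live"):
    print(line, end="")
```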
After 10 days the manual action was removed. But to this day I still have spammy links pointing to 404s on my site. This happens because they also hacked other sites and built spammy linking networks out of them. As people start recovering their sites, the number of links to these pages will drop.
In my experience this big volume of 404s cost me about 30% of my traffic. That traffic has now recovered almost completely, and the number of 404s is decreasing with time.
So my conclusion is that these 404s are not healthy, but they will be gone with time and your site will recover.
-
Ha, sorry about the initial test post. It wasn't publishing on my main computer at first.
-
Can you please be more specific?