Open Site Explorer - Top Pages that don't exist / result of a hack(?)
-
Hi all,
Last year, a website I monitor was either hacked or infected with malware; I'm not sure which.
The visible result is hundreds of 'not found' entries in Google Search Console / Crawl Errors for non-existent pages relating to variations of 'Canada Goose'. A couple of such links are also showing up in SERPs. Here are example URLs:
ourdomain.com/canadagoose.php
ourdomain.com/replicacanadagoose.php
I looked for advice on the webmaster forums and was advised to just keep marking them as 'fixed' in the console, on the basis that sooner or later they would disappear. A year later, they still appear.
I've just signed up for a Moz trial and, in Open Site Explorer -> Top Pages, the top 2-5 pages relate to these non-existent pages: URLs that are the result of this 'Canada Goose' spam attack. Each of the non-existent pages has around 10 linking root domains and around 50 inbound links.
My question is: is there more direct action I should take here? For example, informing Google of the offending domains hosting these backlinks.
Any thoughts appreciated! Many thanks
-
Hi Mª Verónica B
That's great, many thanks for the confirmation.
All the best,
Colin
-
Hi Colin,
If the backlinks/inbound links are spam, then yes, upload a disavow file covering only those links.
If there are multiple 'ghost' pages in WordPress left over from deleted hacked pages, then yes, use the new hidden page with all the instructions above, applied only to the spam pages.
All the best,
Mª Verónica B.
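For reference, Google's disavow file is plain text, one entry per line: a `domain:` prefix disavows every link from a whole site, while a bare URL disavows a single page, and `#` lines are comments. A minimal sketch using hypothetical spam domains:

```text
# Disavow file sketch — the domains below are hypothetical examples
# Disavow every link from an entire spam domain:
domain:spam-links-example-1.com
domain:spam-links-example-2.net
# Or disavow one specific linking URL:
http://spam-links-example-3.org/canadagoose-links.html
```

The file is then uploaded for the affected property via Google's disavow links tool.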
-
Thanks again, Mª Verónica, for taking the time to respond.
OK, if I understand correctly: since those spam 'canadagoose'-related backlinks do indeed exist*, a disavow file for Google would be the thing to do here?
There was indeed a hack, which happened before I came along, and it is reported in Google Search Console. There are hundreds of 'canadagoose'-related crawl errors with a 404 response code that just keep coming back. It looks like those pages did once exist and must have been deleted by the website developers. So the 'empty page' technique would apply here?
*It seems to me that the 'canadagoose' pages, which have apparently since been deleted, and the backlinks pointing to those 'ghost' pages, are all part of the hack:
- hack website, create 'canadagoose' pages
- link to 'canadagoose' pages from other websites
Many thanks,
Colin
-
Hi Colin,
Not exactly!
We are not talking about backlinks. Backlinks come from other websites, so we cannot control them, except by uploading a disavow file for Google. That is quite different.
We are talking about the hundreds of "ghosts" of deleted pages: we deleted them because the website was hacked. But deleting them all is not enough.
Crawlers will still "see" 330 or more pages with 404 status!
That is awful for SEO, because Google and the other crawlers "understand" it to mean you do not care about user experience: you have so many erased pages that if somebody lands on one, there will be nothing there. So:
1. Use Moz Top Pages to find all the spam pages, and then check Google too, of course, to be sure.
2. Create a new page, not completely empty. It should say something like
"We are truly sorry... Thanks."
This is for the crawlers; presumably no human knows about those spam pages, except the one who hacked your website.
3. Redirect all the spam pages on the list, with a 301 (a permanent redirect), to the noindex/nofollow page you just created.
4. Verify: copy and paste each URL from the list into the browser, check that it goes to the new page, and also verify the page status with the MozBar.
Thanks. Good luck,
Mª Verónica
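As a sketch of step 3 above: assuming an Apache server (the usual WordPress setup), the 301s can go in the site's .htaccess using mod_alias's `Redirect` directive. The spam paths and the holding-page slug below are hypothetical:

```apache
# .htaccess sketch — spam paths and the /spam-notice/ slug are hypothetical
# Permanently redirect each known spam URL to the hidden holding page
Redirect 301 /canadagoose.php /spam-notice/
Redirect 301 /replicacanadagoose.php /spam-notice/
```

The holding page itself then carries the noindex/nofollow directive, either via a robots meta tag in its head or via an SEO plugin's per-page settings.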
-
Many thanks for your response, Mª Verónica B, very helpful.
I've never used the disavow backlinks tool in Google Search Console. I would have assumed this is the ideal scenario for using it, to disavow specific backlinks (not all backlinks). But instead what you're suggesting is:
- Create an empty/hidden (WordPress) page, and make it noindex/nofollow
- Get a list of all the spam URLs from Google Search Console
- Redirect all the spam URLs on the list to the empty noindex/nofollow page
This would never have occurred to me, I'm going to do this right now.
Again, many thanks!
Colin
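If the Search Console export runs to hundreds of URLs, writing the redirect lines by hand gets tedious. A minimal sketch of automating it, assuming a CSV export with a `URL` column and a hypothetical `/spam-notice/` holding page:

```python
# Sketch: turn a GSC crawl-errors CSV export into Apache 'Redirect 301' rules.
# The column name "URL" and the /spam-notice/ path are assumptions.
import csv
import io

HOLDING_PAGE = "/spam-notice/"  # the hidden noindex/nofollow page

def redirect_rules(csv_text, holding_page=HOLDING_PAGE):
    """Build one 'Redirect 301' line per spam URL in the export."""
    rules = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        url = row["URL"].strip()
        # Keep only the path portion, e.g. '/canadagoose.php'
        path = "/" + url.split("/", 3)[-1] if "://" in url else url
        rules.append(f"Redirect 301 {path} {holding_page}")
    return rules

sample = "URL,Response Code\nhttps://ourdomain.com/canadagoose.php,404\n"
print(redirect_rules(sample))  # → ['Redirect 301 /canadagoose.php /spam-notice/']
```

The output lines can then be pasted into .htaccess, or adapted to whatever redirect mechanism the site uses.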
-
Hi,
It seems that this website is in a similar situation to the one I shared before.
In my case, though, I had to take immediate action because it was creating a very serious problem by sending malicious signals to all the crawlers.
I also discovered the issue using the same Moz feature. Thanks, Moz!
https://moz.com/community/q/more-than-450-pages-created-by-a-hacker
In my experience, sending all those pages to a new hidden page, using a 301 and the noindex and nofollow directives, sends the right signals to the crawlers of Google and the other search engines.
Let's say it strongly informs all the crawlers that those spam/404 pages are neither relevant nor interesting for your website.
Andy's response agrees that this is the best solution. He also recommends the Wordfence plugin for WordPress as a preventive measure against further issues.
Good luck!
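For completeness, the "hidden page" described above just needs a robots meta tag in its head so the search engines drop it from the index. A minimal sketch (the wording and title are hypothetical, echoing the advice above):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Keeps the holding page itself out of the index -->
  <meta name="robots" content="noindex,nofollow">
  <title>Page removed</title>
</head>
<body>
  <!-- Brief human-readable note, per the advice above -->
  <p>We are truly sorry, this page no longer exists. Thanks.</p>
</body>
</html>
```

In WordPress, the same effect can usually be achieved by marking the page noindex/nofollow through an SEO plugin rather than editing the template directly.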