Time to deindexing: WMT Request vs. Server not found
-
Google indexed some subdomains (13!) that were never supposed to exist but apparently returned a 200 status code when Google somehow crawled them. I can get these subdomains to return a "server not found" error by turning off wildcard subdomains at my DNS. I've been told that these subdomains will be deindexed just from this server-not-found error.
I was going to use Webmaster Tools and verify each subdomain, but I'm on an economy GoDaddy server, and apparently subdomains just get forwarded to a directory, so subdomain.domain.com gets redirected to domain.com/subdomain. With that being the case, I'm not even sure I can get WMT to recognize and remove these subdomains.
Should I fret about this, or will the "server not found" message get Google to remove these soon enough?
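Before waiting on Google, it can help to confirm what each stray subdomain actually returns now that the wildcard is off. A minimal sketch using only the standard library (the subdomain names are hypothetical placeholders; substitute your own):

```python
import socket
import http.client

def probe(host, timeout=5):
    """Return the HTTP status code for http://host/, or 'NXDOMAIN' if DNS fails."""
    try:
        conn = http.client.HTTPConnection(host, timeout=timeout)
        conn.request("HEAD", "/")
        return conn.getresponse().status
    except socket.gaierror:
        # Host does not resolve: the "server not found" case described above.
        return "NXDOMAIN"

def deindex_signal(result):
    """True if the response should eventually get the URL dropped from the index
    (no DNS record, or a hard 4xx/5xx); a 200 means Google still sees a live page."""
    return result == "NXDOMAIN" or (isinstance(result, int) and result >= 400)

# Hypothetical stray subdomains; substitute the 13 from your own domain:
# for sub in ("vww", "wwww", "mail2"):
#     result = probe(f"{sub}.example.com")
#     print(sub, result, "drops" if deindex_signal(result) else "still indexable")
```

The key thing to verify is that none of the stray hosts still answers 200; a soft 404 (a "not found" page served with a 200 status) will keep the URL indexable.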
-
Unfortunately, Google may continue to keep those pages in its index for months, even if they return a 404. The two best options in these cases are usually:
- Claim the profile in GWT, which would probably be possible here, but it requires a lot of work with GoDaddy to configure the subdomains just so you can claim each profile and request de-indexing.
- I haven't tried it, but Google introduced a URL removal tool for URLs you don't control. It might be a good use case here. Here's some info: http://googlewebmastercentral.blogspot.com/2013/12/improving-url-removals-on-third-party.html
-
I've seen this a couple of times. It does go away eventually.
-
No, they were not duplicates. They all just showed a soft 404 provided by GoDaddy. We had wildcards turned on, but even so I don't understand how Google found these. They were just never used for anything, e.g. vww.example.com.
People have pointed to them as something wonky, so I'm trying to get rid of them in case they are hurting our site's overall performance in the SERPs.
-
Yes, this will eventually stop the pages being indexed. It may take several days in some cases, but they will go.
Were these subdomains duplicates of your main domain? If so, you could try 301 redirecting them, as this could speed the process up.
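If the subdomains do mirror the main site, the redirect can be done with a single wildcard rule. A sketch for Apache .htaccess (this assumes mod_rewrite is available and that example.com stands in for your domain; GoDaddy shared hosting may restrict what you can override):

```apache
RewriteEngine On
# Any host other than example.com / www.example.com gets a 301 to the
# main domain, preserving the requested path.
RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

With wildcard DNS turned off entirely, the rule never fires because the stray hosts no longer resolve; it only helps while the subdomains still answer.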
Related Questions
-
Homepage is deindexed in Google
We recently noticed that our primary page was de-indexed in Google. Looking in Google Search Console, there are no manual actions. We did add a few new banners to the site, but I have no idea why that would have negatively affected it. I did add a new page, https://enleaf.com/company/testimonials/, that had some duplicate testimonials that were also on the home page, but I have since removed that. Not sure where to go from here.
Technical SEO | AChronister
Log files vs. GWT: major discrepancy in number of pages crawled
Following up on this post, I did a pretty deep dive on our log files using Web Log Explorer. Several things have come to light, but one issue I've spotted is the vast difference between the number of pages crawled by Googlebot according to our log files and the number of pages indexed in GWT. Consider:
- Number of pages crawled per the log files: 2,993
- Crawl frequency (i.e. number of times those pages were crawled): 61,438
- Number of pages indexed by GWT: 17,182,818 (yes, that's right: more than 17 million pages)

We have a bunch of XML sitemaps (around 350) that are linked on the main sitemap.xml page; these pages have been crawled fairly frequently, and I think this is where a lot of links have been indexed. Even so, would that explain why we have relatively few pages crawled according to the logs but so many more indexed by Google?
Technical SEO | ufmedia
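For comparing what Googlebot actually fetched against what GWT reports, a quick tally from the raw access logs can ground the discussion. A minimal sketch assuming Apache/nginx combined log format (the regex and field positions are assumptions; adjust them to your server's configured format):

```python
import re
from collections import Counter

# Combined log format: we only need the request path (group 1)
# and the user-agent string (group 2).
LOG_RE = re.compile(
    r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" \d{3} \S+ "[^"]*" "([^"]*)"'
)

def googlebot_crawl_stats(lines):
    """Return (unique_pages, total_hits) for requests whose UA claims Googlebot.
    Note: a UA string can be spoofed; verifying via reverse DNS is stricter."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group(2):
            hits[m.group(1)] += 1
    return len(hits), sum(hits.values())

# Usage (hypothetical path):
# with open("access.log") as f:
#     pages, total = googlebot_crawl_stats(f)
```

Numbers like the 2,993 vs. 61,438 above fall straight out of the two values this returns; the 17-million GWT figure would have to come from URLs Google discovered elsewhere (sitemaps, links) rather than from crawls of this server.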
Content on top-level-domain vs. content on subpage
Hello SEOmoz community, I just built a new website, mainly for a single affiliate program, and it ranks really well on Google. Unfortunately the merchant doesn't like the name of my domain, so I was thrown out of the affiliate program. Suppose the merchant is a computer monitor manufacturer named "Digit". The name of my domain is something like monitorsdigital.com at the moment. (It's just an example; I don't own this URL.) The structure of my website is one homepage with a lot of content on it, plus a blog. The last five blog entries are displayed on the homepage. Because I got kicked out of the affiliate program, I want to permanently redirect monitorsdigital.com to another domain. But what should the new website look like? I have two possibilities:
- Copy the whole monitorsdigital website to a new domain, called something like supermonitors.com.
- Integrate the monitorsdigital website into my existing website about different monitor manufacturers, e.g. allmonitors.com/digit-monitors.html (that URL is permitted by the merchant).

What do you think is the better way? I just got the impression that it seems a little easier to rank high with a top-level domain (www.supermonitors.com) than with a subpage (www.allmonitors.com/digit-monitors.html). However, the subpage can benefit from the domain authority generated by other subpages. Thanks for your help and best regards, MGMT
Technical SEO | MGMT
500 Server Error on RSS Feed
Hi there, I am getting multiple 500 errors on my RSS feed. Here is the error:

Title: 500 : Error
Meta Description: Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/init.py", line 391, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 500 Internal Server Error
Meta Robots: Not present/empty
Meta Refresh: Not present/empty

Any ideas as to why this is happening? They are valid feeds.
Technical SEO | mistat2000
Problems with my Site? (If you have time take a look :P thanks)
Hey! If anyone has a moment, I would really appreciate any tips on, or problems you see with, our website. I don't expect much, but would appreciate any suggestions. We are working on backlinking and content generation, but I know that may not be much use if the website itself is not built well enough! Thank you to anyone who takes a moment of their time to take a look! Site: http://earthsaverequipment.com I am more interested in SEO issues or suggestions, not comments that you dislike my artwork 😛 haha Cheers, Charles
Technical SEO | WebNooby
Does Server Location have anything to do with Search Results
Good Morning Everyone... Does having a site hosted in Europe have any effect on Search Engine results in the US? Thanks
Technical SEO | Prime85
Microsite on subdomain vs. subdirectory
Based on this post from 2009, it's recommended in most situations to set up a microsite as a subdirectory as opposed to a subdomain: http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites. The primary argument seems to be that the search engines view the subdomain as a separate entity from the domain, and therefore the subdomain doesn't benefit from any of the trust rank, quality scores, etc. Rand made a comment suggesting the subdomain could SOMETIMES inherit some of these factors, but didn't expound on those instances. What determines whether the search engine will view your subdomain-hosted microsite as part of the main domain vs. a completely separate site? I read it has to do with the interlinking between the two.
Technical SEO | ryanwats