Time to deindexing: WMT Request vs. Server not found
-
Google indexed some subdomains (13 of them!) that were never supposed to exist but apparently returned a 200 status code when Google somehow crawled them. I can get these subdomains to return a "server not found" error by turning off wildcard subdomains at my DNS. I've been told that these subdomains will be deindexed from this "server not found" error alone.
I was going to verify each subdomain in Webmaster Tools, but I'm on an economy GoDaddy server where subdomains just get forwarded to a directory, so subdomain.domain.com gets redirected to domain.com/subdomain. With that being the case, I'm not even sure I can get WMT to recognize and remove these subdomains at all.
Should I fret about this, or will the "server not found" message get Google to remove these soon enough?
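In case it matters, here's a rough sketch (Python, with placeholder subdomain names) of how I'd confirm each stray subdomain actually stops resolving once the wildcard record is removed:

```python
import socket

# Placeholder names - swap in the 13 stray subdomains Google indexed.
SUBDOMAINS = [
    "vww.example.com",
    "wwww.example.com",
]

for host in SUBDOMAINS:
    try:
        ip = socket.gethostbyname(host)
        print(f"{host} still resolves to {ip} - wildcard may still be active")
    except socket.gaierror:
        print(f"{host} no longer resolves ('server not found')")
```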
-
Unfortunately, Google may keep those pages in its index for months, even if they return a 404. The two best options in these cases are usually:
- Claim the profile in GWT - probably possible, but it requires a lot of work with GoDaddy configuring the subdomains just so you can claim each profile and request de-indexing.
- I haven't tried it, but Google introduced a URL removal tool for URLs you don't control. It might be a good fit here. Here's some info: http://googlewebmastercentral.blogspot.com/2013/12/improving-url-removals-on-third-party.html
-
I've seen this a couple of times.
It does go away eventually.
-
No, they were not duplicates. They all just showed a soft 404 page provided by GoDaddy. We had wildcards turned on, but even so I don't understand how Google found these. They were simply never used for anything, e.g. vww.example.com.
People have pointed to them as something wonky, so I'm trying to get rid of them in case they are hurting our site's overall performance in the SERPs.
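In case it helps anyone checking the same thing: a soft 404 just means the "not found" page is served with an HTTP 200 status, so a quick check like this (a rough Python sketch using the requests library; the hostnames are placeholders) shows whether a crawler would see a 200 or a real error:

```python
import requests

# Placeholder URLs - replace with the stray subdomains in question.
URLS = [
    "http://vww.example.com/",
    "http://wwww.example.com/",
]

for url in URLS:
    try:
        resp = requests.get(url, timeout=10)
        # A "soft 404" is a not-found page that still returns HTTP 200.
        print(f"{url} -> HTTP {resp.status_code}")
    except requests.exceptions.ConnectionError:
        # DNS failure / refused connection - what a browser reports as "server not found".
        print(f"{url} -> connection failed (server not found)")
```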
-
Yes, this will eventually stop the pages from being indexed. It may take several days in some cases, but they will go.
Were these subdomains duplicates of your main domain? If so, you could try 301 redirecting them to the main domain, as this could speed the process up.
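If you do go the 301 route, one way to confirm the redirects are actually in place is to request each subdomain without following redirects and check the status code and Location header. A rough Python sketch (the hostnames and target URL are placeholders):

```python
import requests

# Placeholder values - replace with the subdomains being redirected and the main site.
SUBDOMAINS = ["vww.example.com", "wwww.example.com"]
MAIN_SITE = "http://example.com/"

for host in SUBDOMAINS:
    try:
        resp = requests.get(f"http://{host}/", allow_redirects=False, timeout=10)
    except requests.exceptions.ConnectionError:
        print(f"{host}: did not respond (no redirect in place)")
        continue
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.startswith(MAIN_SITE):
        print(f"{host}: 301 to {location} - looks good")
    else:
        print(f"{host}: HTTP {resp.status_code}, Location: {location or 'none'}")
```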