Content From One Domain Mysteriously Indexing Under a Different Domain's URL
-
I've pulled out all the stops, and so far this seems like a very technical issue with either Googlebot or our servers. I'd greatly appreciate responses from anyone with knowledge of technical SEO or server problems. First, some background info:
Three websites, http://www.americanmuscle.com, m.americanmuscle.com, and http://www.extremeterrain.com, as well as all of their sub-domains, could potentially be involved. AmericanMuscle sells Mustang parts; ExtremeTerrain is Jeep-only.
Sometime recently, Google has been crawling our americanmuscle.com pages and serving them in the SERPs under an extremeterrain sub-domain, services.extremeterrain.com. You can see for yourself below.
Total # of services.extremeterrain.com pages in Google's index: http://screencast.com/t/Dvqhk1TqBtoK
When you click the cached version of these supposed pages, you see an americanmuscle page (some desktop, some mobile), none of which exist on extremeterrain.com:
http://screencast.com/t/FkUgz8NGfFe
All of these links give you a 404 when clicked...
Many of the pages I've checked have been cached multiple times while still returning a 404; Googlebot has apparently re-crawled them repeatedly, so this is not a one-time fluke.
The services. sub-domain serves both AM and XT and lives on the same server as our m.americanmuscle website, but the two answer on different ports.
services.extremeterrain is never used to feed AM data, so why Google is associating the two is a mystery to me.
The mobile americanmuscle website is set to respond only on a different port than services., and it only answers AM mobile sub-domains, not Googlebot or any other user-agent.
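One possible mechanism, and this is purely a hypothesis with an invented vhost table, not the real server config: if Googlebot requests services.extremeterrain.com on the standard port 80 and nothing is configured to answer for that exact name/port pair, many web servers silently fall back to a default virtual host, which in this setup could be the AM desktop site. A minimal Python sketch of that fallback logic:

```python
# Hypothetical illustration of default-vhost fallback. The vhost table
# below is invented for the example; it is NOT the real server config.

DEFAULT = ("www.americanmuscle.com", 80)

VHOSTS = {
    ("services.extremeterrain.com", 8080): "XT services app",
    ("m.americanmuscle.com", 8081): "AM mobile site",
    DEFAULT: "AM desktop site",
}

def resolve(host, port):
    """Mimic a server that falls back to its default vhost when the
    requested Host/port pair has no explicit entry."""
    return VHOSTS.get((host, port), VHOSTS[DEFAULT])

# Googlebot requests the services hostname on the standard port 80:
# no vhost listens for that pair, so the default (AM desktop) answers.
print(resolve("services.extremeterrain.com", 80))    # AM desktop site
print(resolve("services.extremeterrain.com", 8080))  # XT services app
```

If that is what is happening, adding an explicit catch-all vhost that returns a 404 (or redirects to the correct site) for unrecognized Host headers would stop the leakage.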
Any ideas? As one could imagine this is not an ideal scenario for either website.
-
A similar thing happened to me once. In my case, the DNS settings were incorrect; check that each hostname's records actually resolve to the server you expect.
-
I'm not sure what would be causing this. It looks like the pages did exist on the services subdomain at one time. Maybe try adding the subdomain as a property in Webmaster Tools and removing all of its pages. You might also want to add a robots.txt to the subdomain and disallow bots from crawling it.
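If you do block the subdomain, a minimal robots.txt would look like the one below; here it is sanity-checked with Python's standard-library urllib.robotparser (the URL is just an example). One caveat worth noting: robots.txt stops future crawling but does not by itself remove URLs already in the index, so it pairs with the Webmaster Tools removal request suggested above.

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt that blocks all crawlers from the entire subdomain.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot would be barred from every URL on the host serving this file.
print(rp.can_fetch("Googlebot", "https://services.extremeterrain.com/any-page"))  # False
```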
Related Questions
-
Different content on the same URL depending on the IP address of the visitor
Hi! Does anybody have any experience of the SEO impact of changing the content of a page depending on the IP address of the visitor? This would apply to text content as well as meta information, all on the same URL. Many thanks.
Intermediate & Advanced SEO | Schoellerallibert
-
Why Aren't My Images Being Indexed?
Hi, one of my clients submitted an image sitemap with 465 images. It was submitted on July 20, 2017 to Google Search Console, and none of the submitted images have been indexed. I'm wondering why? Here's the image sitemap: http://www.tagible.com/images_sitemap.xml We do use a CDN for the images, and the images are hosted on a subdomain of the client's site, e.g. https://photos.tagible.com/images/Les_Invalides_Court_Of_Honor.jpg Thanks in advance! Cheers, Julian
Intermediate & Advanced SEO | SEOdub
-
Community Discussion: Are You Optimizing Your Brand's Content for Featured Snippets?
My latest post on the Moz Blog, Featured Snippets: A Dead-Simple Tactic for Making, explores how to keep Featured Snippets once you have them. I'm curious to know how many brands are actively working to get into the answer box, and for those who are, what have the results been?
Intermediate & Advanced SEO | ronell-smith
-
Sub Domains vs. Persistent URLs
I've always been under the assumption that when building a micro-site it was better, from an SEO perspective, to use a true path URL (e.g. yourcompany.com/microsite) as opposed to a sub-domain (microsite.yourcompany.com). Can you still generate significant SEO gains from a sub-domain if you were forced to use it, provided the primary domain (e.g. yourcompany.com) had a lot of link clout/authority? Meaning, if I had to go the sub-domain route, would it be the end of the world?
Intermediate & Advanced SEO | VERBInteractive
-
Removing Dynamic "noindex" URLs from Index
6 months ago my client's site was overhauled, and the user-generated searches had an index tag on them. I switched that to noindex, but didn't catch it fast enough to avoid hundreds of those pages being indexed in Google. It's been months since switching to the noindex tag and the pages are still indexed. What would you recommend? Google crawls my site daily, but never the pages that I want removed from the index. I am trying to avoid submitting hundreds of these dynamic URLs to the removal tool in Webmaster Tools. Suggestions?
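Before waiting on Google any longer, it may be worth verifying that the noindex tag is actually present in the HTML served for those dynamic URLs; a tag added in a template can fail to render on some parameterized pages. A small standard-library checker, assuming a conventional meta robots tag (the sample HTML is invented):

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Scan HTML for a <meta name="robots"> tag whose content includes noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            name = (a.get("name") or "").lower()
            content = (a.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def has_noindex(html):
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

print(has_noindex('<html><head><meta name="robots" content="noindex, follow"></head></html>'))  # True
```

An alternative when editing templates is awkward is to serve an `X-Robots-Tag: noindex` HTTP response header, which Google honors the same way. Also note that if robots.txt blocks these URLs, Googlebot can never re-crawl them to see the noindex.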
Intermediate & Advanced SEO | BeTheBoss
-
.com Outranking my ccTLDs and cannot figure out why.
So I have a client that has a number of sites for a number of different countries, each with its specific ccTLD. They also have a .com in the US. The problem is that the UK site hardly ranks for anything, while the .com ranks for a ton in the UK. I have set up GWT for the UK site and the .com to be specific to their geographic locations. So I have the ccTLDs, and I have GWT showing where I want these sites to rank. The problem is that it apparently is not working. Any clues as to what else I could do?
Intermediate & Advanced SEO | DRSearchEngOpt
-
Lots of incorrect URLs indexed - "Googlebot found an extremely high number of URLs on your site"
Hi, any assistance would be greatly appreciated. Our rankings and traffic have been dropping massively recently, and Google sent us a message stating "Googlebot found an extremely high number of URLs on your site". This first alerted us to the problem: for some reason our eCommerce site has recently generated loads (potentially thousands) of junk URLs, giving us duplication everywhere, which Google is obviously penalizing us for in terms of rankings.
Our developer is trying to find the root cause, but my concern is: how do we get rid of all these bogus URLs? If we use GWT to remove URLs, it's going to take years. We have just amended our robots.txt file to exclude them going forward, but they have already been indexed, so do we put a 301 redirect on them, or serve an HTTP 404 to tell Google they don't exist? Do we also put a noindex on the pages? What is the best solution?
A couple of examples of our problem: in Google, type site:bestathire.co.uk inurl:"br" and you will see 107 results. This is one of many lots we need to get rid of. Also, site:bestathire.co.uk intitle:"All items from this hire company" shows 25,300 indexed pages we need to get rid of.
Another thing that would help tidy this mess up going forward is to improve our pagination. Our site uses rel=next and rel=prev but no canonical. As a belt-and-braces approach, should we also put canonical tags on our category pages where there is more than one page? I was thinking of doing it on page 1 of our most important pages, or the view-all page, or both. What's the general consensus? Any advice on both points greatly appreciated. Thanks, Sarah.
Intermediate & Advanced SEO | SarahCollins
-
Duplicate Content across 4 domains
I am working on a new project where the client has 5 domains, each with identical website content. There is no rel=canonical. There is a great variation in the number of pages in the index for each of the domains (from 1 to 1,250). OSE shows a range of linking domains from 1 to 120 for each domain. I will be strongly recommending that the client focus on one website and 301 everything from the other domains. I would recommend focusing on the domain that has the most pages indexed and the most referring domains, but I've noticed the client has started using one of the other domains in their offline promotional activity, and it is now their preferred domain. What are your thoughts on this situation? Would it be better to 301 to the client's preferred domain (and lose a level of ranking power through the 301 reduction factor, plus wait for other pages to get indexed), or stick with the highest-ranking/most-linked domain even though it doesn't match the client's preferred domain used for email addresses etc.? Or would it be better to use cross-domain canonical tags? Thanks
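For reference, a cross-domain canonical is just a link element in the head of each page on the duplicate domains, pointing at the equivalent URL on the preferred domain (the domain names below are placeholders, not the client's actual domains):

```html
<!-- On every page of a duplicate domain, e.g. https://www.duplicate-domain.com/widgets/ -->
<link rel="canonical" href="https://www.preferred-domain.com/widgets/" />
```

Bear in mind that Google treats cross-domain canonicals as a hint rather than a directive; a 301 redirect is the stronger consolidation signal.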
Intermediate & Advanced SEO | bjalc2011