Odd scenario: subdomain not indexed nor cached, reason?
-
Hi all,
Hopefully somebody can help me with this issue.
Six months ago a number of pages hosted at the domain level were moved to a subdomain with 301 redirects, and some others were created from scratch (at the subdomain level too).
What happens is that not only are the new URLs at the subdomain level neither indexed nor cached, but the old URLs are still indexed in Google, although clicking on them leads to the new URLs via the 301 redirect.
The question is: with 301 redirects to the new URLs in place and no issues with robots.txt, meta robots, etc., why are the new URLs still not indexed? I might remind you that a few pages (100 or so) were created from scratch, and they are not indexed either.
The only issue found across the pages is the cache-control header, set as follows:
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
I am not familiar with cache-control headers. Can this be an issue for correct indexing?
Thanks in advance,
Dario
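For what it's worth: as far as I know, Cache-Control and Pragma tell browsers and proxies not to store a copy of the page; they are not indexing directives (a noindex would come from a robots meta tag or an X-Robots-Tag response header, which is worth checking too). A minimal stdlib sketch for splitting that header into its directives, just to double-check what it actually says:

```python
def parse_cache_control(value: str) -> dict:
    """Split a Cache-Control header value into {directive: argument-or-None}."""
    directives = {}
    for token in value.split(","):
        token = token.strip()
        if not token:
            continue
        name, _, arg = token.partition("=")
        directives[name.strip().lower()] = arg.strip() or None
    return directives

# The header from the pages in question:
header = "no-store, no-cache, must-revalidate, post-check=0, pre-check=0"
print(parse_cache_control(header))
# {'no-store': None, 'no-cache': None, 'must-revalidate': None,
#  'post-check': '0', 'pre-check': '0'}
```

None of those directives should block indexing on their own; `post-check` and `pre-check` are old IE-only extensions and are ignored by everything else.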
-
How long has it been since the change? Google will need weeks and weeks to recrawl and reindex everything.
If it's been a while, this is one of those issues where we kind of need the URL. It can be a lot of different things, and sometimes it's a lot faster and easier if someone just gets in there and digs around.
Related Questions
-
Accidental No Index
Hi everyone, we control several client sites at my company. The developers accidentally had a noindex robots tag implemented in the site code when we did the HTTPS upgrade, without knowing it (yes, it's true). Ten days later we noticed traffic was falling. After a couple of days we found the noindex tags, removed them, and resubmitted the sitemaps. The sites started ranking for their own keywords again within a day or two. Organic traffic is still down considerably, though, and for other keywords they are not ranking in the same spot as before, or at all. If I look in Google Search Console, it says we submitted, for example, 4,000 URLs and only 160 have been indexed. I feel like maybe Google is taking a long time to re-index the remainder of the sites? Has anyone had this issue? We're starting to get very concerned, so any input would be appreciated. I read an article on here from 2011 about a company that did the same, and they were ranking for their keywords within a week. It's been 8 days since our fix.
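For anyone hit by the same thing: a quick way to audit pages for a stray noindex is to scan each page's HTML for a robots meta tag. A minimal stdlib sketch (you would feed it the HTML fetched from each URL in your submitted sitemaps):

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Sets .noindex when the page contains <meta name="robots" content="...noindex...">."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html: str) -> bool:
    detector = NoindexDetector()
    detector.feed(html)
    return detector.noindex

print(has_noindex('<meta name="robots" content="noindex,follow">'))  # True
print(has_noindex('<meta name="robots" content="index,follow">'))    # False
```

Note this only catches the meta-tag variant; a noindex sent via the X-Robots-Tag HTTP header would need a separate check of the response headers.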
Technical SEO | AliMac260 -
Site not getting indexed by Googlebot.
The following question is in regard to http://footeschool.org/. This site is not getting indexed by Google (Googlebot). This only happens when the user agent is set to Googlebot. This is a recent issue. We are using DNN as our CMS. Are there any suggestions to help resolve this issue?
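One way to reproduce what you're describing is to request the page with Googlebot's user-agent string and compare the status code against a normal browser request. A rough sketch, assuming the server differentiates on the User-Agent header (note that real Googlebot traffic is also verifiable by reverse DNS, so a server doing IP checks may still treat a spoofed UA differently):

```python
import urllib.error
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def build_request(url: str, user_agent: str) -> urllib.request.Request:
    """GET request carrying an explicit User-Agent header."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

def fetch_status(url: str, user_agent: str) -> int:
    """Return the HTTP status code the server answers with for this user agent."""
    try:
        with urllib.request.urlopen(build_request(url, user_agent), timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# e.g. compare fetch_status("http://footeschool.org/", GOOGLEBOT_UA)
# against fetch_status("http://footeschool.org/", "Mozilla/5.0")
```

If the two calls come back with different status codes (or very different bodies), something server-side is singling out the Googlebot UA, which would explain the indexing drop.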
Technical SEO | bcmull0 -
Subdomains or Subdirectory for multisite SEO structure?
Hey Mozzers, I work for a startup releasing several apps, each within its own niche: hiking, mountain biking, skiing, running, etc. We've decided to go the WordPress Multisite route, and I was wondering what the best site structure would be. For example: would hike.myapp.com or myapp.com/hike be more beneficial to our growth plans? My thinking follows that of geotargeting strategies for franchises (uk.travel.com etc.): go for the subdomain option in order to build each individual site's authority, because each sport has a niche audience. Or am I talking nonsense? I've read varying advice and thought I'd ask you guys. Cheers, A
Technical SEO | AdamRob011 -
Why are only a few of our pages being indexed
Recently rebuilt a site for an auctioneers; however, it has a problem in that none of the lots and auctions are being indexed by Google on the new site, only pages like About, FAQ, Home, and Contact. Checking WMT shows that Google has crawled all the pages, and I've done a "Fetch as Google" on them and they load up fine, so no crawling issue stands out. I've set the "URL Parameters" too, to no effect. I also built a sitemap with all the lots in it and pushed it to Google, which then crawled them all (massive spike in crawl rate for a couple of days), but it's still just indexing a handful of pages. Any clues to look into would be greatly appreciated. https://www.wilkinsons-auctioneers.co.uk/auctions/
Technical SEO | Blue-shark0 -
Web page is showing up on Google but doesn't show when it was cached, so is it indexed?
Hey everyone, so I created a new page on a WordPress website; it was live for a few hours until I changed my mind and switched it back to a draft. Just out of curiosity I did a site:www.example.com/Example search on Google to see if it had been indexed, and apparently it had, but when I click on "Cached" to see exactly what time it got indexed, it shows me an error. So does this mean it is indexed or not?
Technical SEO | conversiontactics0 -
Understanding page and subdomain metrics in OSE
When using OSE to look at the **Total External Links** of a website's homepage, I don't understand why the page and subdomain metrics are so different. For example, privacy.net has 1,992 external links at the page level and 55,371 at the subdomain level. What subdomain? www redirects to privacy.net. And they have 56,982 at the root domain level. Does that mean they have around 55k deep links, or what?
Technical SEO | meterdei0 -
Subdomains
Hi, I have recently started working in-house for a company, and one site development was started and completed just as I joined. A new area of the site has been developed, but the developers built this new section in PHP, which they tell me cannot be hosted on the Windows server the site is running on (is this correct?). They want to add the new section as a subdomain, http://newarea.example.co.uk/, whereas I would have preferred the section added as a subfolder. I plan to ensure that future developments do not have this problem, but is the best solution to work with the subdomain (in this instance it may not be too bad, as it is a niche area of the site), or can I redirect the pages hosted on the subdomain to a subfolder, and is this recommended? Thanks for your time.
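If you do later consolidate into a subfolder, one common option is a blanket 301 from the subdomain to the matching subfolder paths. A sketch only, assuming the subdomain ends up on an Apache box with mod_rewrite enabled and using your example hostnames:

```apache
# .htaccess at the document root of newarea.example.co.uk
RewriteEngine On
RewriteCond %{HTTP_HOST} ^newarea\.example\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.example.co.uk/newarea/$1 [R=301,L]
```

Of course, this only helps once the content actually exists under /newarea/ on the main server; a redirect alone doesn't solve the PHP-hosting problem.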
Technical SEO | LSLPS0 -
Subdomain Robots.txt
I have a subdomain (a blog) whose tags and categories are being indexed when they should not be, because they create duplicate content. Can I block them using a robots.txt file? Do I need a separate robots.txt file for my subdomain? If so, how would I format it? Do I need to specify that it is a subdomain robots file, or will the search engines automatically pick this up? Thanks!
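Robots.txt is per-hostname: crawlers request it from the root of each host, so a blog subdomain needs its own file at its own root, and there is nothing special to declare about it being a subdomain. A sketch, assuming WordPress-style /tag/ and /category/ paths and a hypothetical blog.example.com hostname (swap in your blog's real URL structure):

```
# Served at http://blog.example.com/robots.txt
User-agent: *
Disallow: /tag/
Disallow: /category/
```

One caveat: robots.txt blocks crawling, not indexing of URLs Google already knows about; for duplicate-content cleanup a noindex robots meta tag on the tag/category archives is generally the more reliable route.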
Technical SEO | JohnECF0