Website not being indexed after relocation
-
I have a scenario where a 'draft' website was built using Google Sites and published on a Google Sites subdomain. Subsequently, the 'same' website was rebuilt and published on its own domain.
So effectively there were two sites, both more or less identical, with identical content.
The first website was thoroughly indexed by Google.
The second website has not been indexed at all - I am assuming for the obvious reasons, i.e. that Google is viewing it as an obvious rip-off of the first site, duplicate content, etc.
I was reluctant to take down the first website until I had found an effective long-term way to resolve this issue, ensuring that in future Google would index the second 'proper' site.
A permanent 301 redirect was put forward as a solution - however, believe it or not, the Google Sites platform has no facility for implementing this.
For lack of an alternative solution I have gone ahead and taken down the first site. I understand that this may take some time to drop out of Google's index, however, and I am merely hoping that eventually the second site will be picked up in the index.
I would sincerely appreciate any advice or recommendations on the best course of action - if any! - that I can take from here.
Many thanks!
Matt.
-
Nice catch, Lynn. That's got to be (at least the majority of) the problem.
-
Hi Matt,
It looks like noindex headers are being sent out on your site. If you have the Web Developer toolbar installed in Firefox and view the response headers on your homepage, you will see:
X-Robots-Tag: noindex, nofollow, nosnippet
This works like a noindex, nofollow meta tag and is blocking search engines from crawling and indexing your site. If you find out where those headers are being set and remove them, you should see your site getting indexed pretty quickly.
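If it helps, here's a quick way to check for this outside the browser - a minimal Python sketch using only the standard library (the example URL is just a placeholder, and the exact header values will vary by site):

```python
import urllib.request

def fetch_robots_header(url):
    """Fetch a page and return all X-Robots-Tag response header values."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get_all("X-Robots-Tag") or []

def is_blocked_by_robots_header(header_values):
    """Return True if any X-Robots-Tag value carries a noindex directive."""
    directives = set()
    for value in header_values:
        # A single header can hold several comma-separated directives,
        # e.g. "noindex, nofollow, nosnippet".
        for part in value.split(","):
            directives.add(part.strip().lower())
    return "noindex" in directives or "none" in directives

# Example (hypothetical URL):
# if is_blocked_by_robots_header(fetch_robots_header("https://www.example.com/")):
#     print("Search engines are told not to index this page.")
```

The same check works with `curl -I` if you prefer the command line; the point is just to confirm whether the header is present on the live homepage.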
-
Hi Matt,
Majestic, Open Site Explorer and ahrefs are all showing zero links pointing to the entire domain, waydownunder.com.au.
I'm not suggesting this proves you don't have enough links for Google to crawl/index the site - I have also repeatedly seen Google index sites that don't have links yet. However, if these three major link indexes are showing zero links, there's a good chance Google isn't discovering the site through regular crawling either.
Have you tried creating and submitting a sitemap via Webmaster Tools?
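For what it's worth, a sitemap doesn't need to be fancy - it's just an XML list of your page URLs. A minimal sketch of generating one in Python (the URLs below are placeholders, not your actual pages):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries +
        "\n</urlset>\n"
    )

# Example (hypothetical URLs):
# xml = build_sitemap([
#     "https://www.example.com/",
#     "https://www.example.com/about",
# ])
```

You'd upload the resulting file to your site root (e.g. /sitemap.xml) and submit that URL in Webmaster Tools, which prompts Google to crawl the listed pages directly.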
Best,
Mike -
Thanks Lynn and Mike - really appreciate your feedback. What you've both said about duplicate content being a ranking rather than indexing issue certainly makes sense.
Unfortunately the old site cannot be restored. On the other hand, regular links have been posted to the site through social media (Facebook) as well as a blog site (which IS being indexed regularly).
So this has me entirely stumped! I just cannot see any reason why the site is not being indexed at all. The site has been live now for around 2-3 months, and I've had other sites with far less content and fewer active links indexed in no time at all.
The website in question is www.waydownunder.com.au - if anyone had a minute to take a quick look and see if I've missed anything obvious, I would really really appreciate it.
Thanks kindly,
Matt.
-
Hi Matt,
I would echo Lynn's recommendations here.
I doubt Google is actively filtering the second site from search results. The duplicate content filter is applied sparingly (you'll find no shortage of duplicated sites that are indexed), and it's more of a results filter than an index filter: duplicate content is still indexed, it just isn't shown in SERPs when the filter is active.
It's more likely that you simply haven't sent Google a strong enough ping that the site is worth indexing. Generate some marketing activity around the site, link to it from the current site as Lynn suggested (esp. with turning those pages into summaries), and I expect the site will show up in the index within a couple of weeks.
Best of Luck,
Mike -
Hi Matt,
It can take some time to index new sites. Submitting a sitemap to GWT, building a couple of links, and sharing the site on social channels will usually help speed up the process. I'm not very familiar with Google Sites, but if you can re-enable the Google-hosted site, it might be an idea to announce there that the site is now hosted elsewhere and link to it. You could reduce the content on the Google Sites pages to just an abstract/intro on each page and link to the full content on the new site. That should take care of duplicate content issues and also show a clear connection between the two sites, for both incoming users and search engines.