Website not being indexed after relocation
-
I have a scenario where a 'draft' website was built using Google Sites and published on a Google Sites subdomain. Subsequently, the 'same' website was rebuilt and published on its own domain.
So effectively there were two sites, both more or less identical, with identical content.
The first website was thoroughly indexed by Google.
The second website has not been indexed at all. I am assuming this is for the obvious reason, i.e. that Google views it as a rip-off of the first site / duplicate content etc.
I was reluctant to take down the first website until I had found an effective way to resolve this issue long-term, i.e. ensuring that in future Google would index the second, 'proper' site.
A permanent 301 redirect was put forward as a solution - however, believe it or not, the Google Sites platform has no facility for implementing this.
For lack of an alternative solution I have gone ahead and taken down the first site. I understand, however, that it may take some time for the old site to drop out of Google's index, and I am merely hoping that eventually the second site will be picked up.
I would sincerely appreciate any advice or recommendations on the best course of action - if any! - I can take from here.
Many thanks!
Matt.
-
Nice catch, Lynn. That's got to be (at least the majority of) the problem.
-
Hi Matt,
It looks like you have noindex headers being sent out on your site. If you have the Web Developer toolbar installed in Firefox and view the response headers on your homepage, you will see:
X-Robots-Tag: noindex, nofollow, nosnippet
This works like a noindex, nofollow meta tag and is basically blocking the search engines from spidering and indexing your site. If you find out where those headers are getting set and get rid of them, you should see your site getting indexed pretty quickly.
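You can also run this check from the command line or a short script rather than the Firefox toolbar. A minimal sketch - the helper names below are my own, not from any of the tools mentioned; to check a live page, fetch its headers with urllib or curl and pass them in:

```python
# Sketch: decide whether a page's HTTP headers block indexing via
# X-Robots-Tag. Helper names are illustrative; the directive values
# ("noindex", "none", etc.) are the real ones search engines honor.

def robots_directives(headers):
    """Return the lowercased directives from an X-Robots-Tag header, if any."""
    # HTTP header names are case-insensitive, so normalize the keys first.
    normalized = {k.lower(): v for k, v in headers.items()}
    value = normalized.get("x-robots-tag", "")
    return [d.strip().lower() for d in value.split(",") if d.strip()]

def blocks_indexing(headers):
    """True if the headers tell search engines not to index the page."""
    directives = robots_directives(headers)
    return "noindex" in directives or "none" in directives

# Example using the header reported for the homepage in this thread:
sample = {"Content-Type": "text/html",
          "X-Robots-Tag": "noindex, nofollow, nosnippet"}
print(blocks_indexing(sample))  # True
```

The same directives can also appear as a `<meta name="robots">` tag in the HTML, so it is worth checking both places.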
-
Hi Matt,
Majestic, Open Site Explorer and ahrefs are all showing zero links pointing to the entire domain, waydownunder.com.au.
I'm not suggesting this proves that you don't have enough links for Google to crawl/index the site, as I have repeatedly seen Google index sites that don't have links yet. However, if these three major link indexes are showing zero links, there's a good chance Google isn't discovering the site through regular crawling either.
Have you tried creating and submitting a sitemap via Webmaster Tools?
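If not, note that a sitemap can just be a small hand-written XML file following the sitemaps.org protocol, submitted through Webmaster Tools. A minimal sketch - the specific paths and date below are illustrative, not taken from the actual site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled. -->
  <url>
    <loc>http://www.waydownunder.com.au/</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
  <url>
    <loc>http://www.waydownunder.com.au/about</loc>
  </url>
</urlset>
```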
Best,
Mike
-
Thanks Lynn and Mike - really appreciate your feedback. What you've both said about duplicate content being a ranking rather than indexing issue certainly makes sense.
Unfortunately, the old site cannot be restored. On the other hand, regular links have been posted to the site through social media (Facebook) as well as a blog site (which IS being indexed regularly).
So this has me entirely stumped! I just cannot see any reason why the site is not being indexed at all. The site has been live now for around 2-3 months, and I've had other sites with far less content / active links indexed in no time at all.
The website in question is www.waydownunder.com.au - if anyone has a minute to take a quick look and see if I've missed anything obvious, I would really appreciate it.
Thanks kindly,
Matt.
-
Hi Matt,
I would echo Lynn's recommendations here.
I doubt Google is actively filtering the 2nd site from search results. The duplicate content filter is applied sparingly - you'll find no shortage of duplicated sites that are indexed. It's also more of a results filter than an index filter, meaning duplicate content is still indexed; it just isn't shown in SERPs when the filter is active.
It's more likely that you simply haven't sent Google a strong enough signal that the site is worth indexing. Generate some marketing activity around the site, link to it from the current site as Lynn suggested (especially the idea of turning those pages into summaries), and I expect the site will show up in the index within a couple of weeks.
Best of Luck,
Mike
-
Hi Matt,
It can take some time to index new sites. Submitting a sitemap to GWT, building a couple of links, and sharing it a bit on social channels will usually help speed up the process. I am not very familiar with Google Sites, but if you can re-enable the Google-hosted site, then maybe it is an idea to announce there that the site is now hosted elsewhere and link to it. You could reduce the content on the Google Sites pages to just an abstract/intro on each page and link to the full content, which is now on the new site. That should take care of duplicate content issues and also show a clear connection between the two of them (for both incoming users and search engines).
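As a sketch of what one of those trimmed-down Google Sites pages might look like - the path, heading, and copy here are made up for illustration; only the destination domain comes from this thread:

```html
<!-- Illustrative abstract page left on the old Google Sites domain.
     A short intro remains, with a prominent link to the full page
     on the new site. -->
<h1>Our Tours</h1>
<p>This page has moved. In short: we run guided tours across the
   region, with full details now on our new site.</p>
<p><a href="http://www.waydownunder.com.au/tours">Read the full page
   at waydownunder.com.au</a></p>
```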