Website not being indexed after relocation
-
I have a scenario where a 'draft' website was built using Google Sites and published on a Google Sites subdomain. Subsequently, the 'same' website was rebuilt and published on its own domain.
So effectively there were two sites, both more or less identical, with identical content.
The first website was thoroughly indexed by Google.
The second website has not been indexed at all - I am assuming for the obvious reason, i.e. that Google is viewing it as a rip-off of the first site / duplicate content, etc.
I was reluctant to take down the first website until I had found an effective way to resolve this issue long-term, ensuring that in future Google would index the second 'proper' site.
A permanent 301 redirect was put forward as a solution - however, believe it or not, the Google Sites platform has no facility for implementing this.
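For anyone curious what that would look like on a host that does allow it: a permanent redirect is just a 301 status plus a Location header pointing at the new domain. A purely illustrative Python/WSGI sketch (the domain and path are examples taken from this thread, not a real configuration):

```python
NEW_DOMAIN = "http://www.waydownunder.com.au"  # example: the new 'proper' site

def redirect_app(environ, start_response):
    """Answer every request with a 301 pointing at the new domain."""
    # Preserve the requested path so deep links carry over.
    location = NEW_DOMAIN + environ.get("PATH_INFO", "/")
    start_response("301 Moved Permanently", [("Location", location)])
    return [b""]

# Call the app directly to show the status and Location header it emits:
captured = {}
def fake_start_response(status, headers):
    captured["status"] = status
    captured["headers"] = dict(headers)

redirect_app({"PATH_INFO": "/contact"}, fake_start_response)
print(captured["status"])               # 301 Moved Permanently
print(captured["headers"]["Location"])  # http://www.waydownunder.com.au/contact
```

The point is that the old URLs answer with the new location instead of content, so search engines transfer the old site's standing to the new one - exactly the mechanism Google Sites doesn't expose.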
For lack of an alternative solution I have gone ahead and taken down the first site. I understand it may take some time for it to drop out of Google's index; in the meantime I am simply hoping that the second site will eventually be picked up.
I would sincerely appreciate any advice or recommendations on the best course of action - if any! - I can take from here.
Many thanks!
Matt.
-
Nice catch, Lynn. That's got to be (at least the majority of) the problem.
-
Hi Matt,
It looks like noindex headers are being sent out on your site. If you have the Web Developer toolbar installed in Firefox and view the response headers on your homepage, you will see:
X-Robots-Tag: noindex, nofollow, nosnippet
This works like a noindex, nofollow meta tag and is effectively blocking the search engines from spidering and indexing your site. If you find out where those headers are being set and get rid of them, you should see your site getting indexed pretty quickly.
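If you'd rather check this without the browser toolbar, you can inspect the header programmatically. A small Python sketch - the parsing helper is generic, and the commented-out fetch at the end shows how you'd test the live homepage (network required):

```python
def robots_directives(header_value):
    """Split an X-Robots-Tag value like 'noindex, nofollow' into a set."""
    if not header_value:
        return set()
    return {d.strip().lower() for d in header_value.split(",") if d.strip()}

def blocks_indexing(header_value):
    """True if the header carries a directive that forbids indexing."""
    return bool(robots_directives(header_value) & {"noindex", "none"})

# The header value reported above:
print(blocks_indexing("noindex, nofollow, nosnippet"))  # True

# To check a live page (requires network access):
# from urllib.request import urlopen
# resp = urlopen("http://www.waydownunder.com.au/")
# print(resp.headers.get("X-Robots-Tag"))
```

Once the header is gone, the same check should print False for the homepage.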
-
Hi Matt,
Majestic, Open Site Explorer and ahrefs are all showing zero links pointing to the entire domain, waydownunder.com.au.
I'm not suggesting this proves that you don't have enough links for Google to crawl/index the site - I have repeatedly seen Google index sites that don't have links yet. However, if these three major link indexes are showing zero links, there's a good chance Google's not discovering the site through regular crawling either.
Have you tried creating and submitting a sitemap via Webmaster Tools?
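For reference, a sitemap is just an XML file listing your URLs, which you then upload to your site and submit in Webmaster Tools. A quick, hypothetical Python sketch that builds a minimal one (the URLs are examples from this thread):

```python
from xml.sax.saxutils import escape  # escape &, <, > in URLs

def build_sitemap(urls):
    """Return a minimal XML sitemap (sitemaps.org protocol) for the URLs."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

print(build_sitemap([
    "http://www.waydownunder.com.au/",
    "http://www.waydownunder.com.au/about",
]))
```

For a small site a hand-written file is just as good; the useful part is submitting it so Google has an explicit list of pages to crawl.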
Best,
Mike
-
Thanks Lynn and Mike - really appreciate your feedback. What you've both said about duplicate content being a ranking rather than indexing issue certainly makes sense.
Unfortunately the old site cannot be restored. On the other hand, regular links have been posted to the site through social media (Facebook) as well as a blog (which IS being indexed regularly).
So - this has me entirely stumped! I just cannot see any reason why the site is not being indexed at all. The site has been live now for around 2-3 months, and I've had other sites with far less content, fewer active links, etc. indexed in no time at all.
The website in question is www.waydownunder.com.au - if anyone had a minute to take a quick look and see if I've missed anything obvious, I would really really appreciate it.
Thanks kindly,
Matt.
-
Hi Matt,
I would echo Lynn's recommendations here.
I doubt Google is actively filtering the 2nd site from search results. The duplicate content filter is rarely applied - you'll find no shortage of duplicated sites that are indexed - and it's more of a results filter than an index filter, meaning duplicate content is still indexed; it just isn't shown in SERPs when the filter is active.
It's more likely that you simply haven't sent Google a strong enough signal that the site is worth indexing. Generate some marketing activity around the site, link to it from the current site as Lynn suggested (especially by turning those pages into summaries), and I expect the site will show up in the index within a couple of weeks.
Best of Luck,
Mike
-
Hi Matt,
It can take some time to index new sites. Submitting a sitemap to GWT, building a couple of links, and sharing the site a bit on social channels will usually help speed up the process. I am not very familiar with Google Sites, but if you can re-enable the Google-hosted site, it may be an idea to announce there that the site is now hosted elsewhere and link to it. You could reduce the content on the Google site pages to just an abstract/intro on each page and link to the full content on the new site. That should take care of duplicate content issues and also show a clear connection between the two (for both incoming users and search engines).