Google Indexed Site A's Content On Site B, Site C etc
-
Hi All,
I have an issue where the content (pages and images) of Site A (www.ericreynolds.photography) is showing up in Google under different domains: Site B (www.fastphonerepair.com), Site C (www.quarryhillvet.com), and Site D (www.spacasey.com). I believe this happened because I installed an SSL cert on Site A but didn't set the default SSL domain on the server, so you could visit Site B's domain with any page path from Site A and the page would load properly.
I have since fixed that SSL issue and am now 301 redirecting any https request to Sites B, C, and D over to Site A, since Sites B, C, and D are not using an SSL cert.
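For anyone following along, a catch-all redirect like that might look roughly like this in an .htaccess file on one of the duplicate domains. This is only a sketch assuming Apache with mod_rewrite; the exact rule depends on how the server is actually configured:

```apache
# Sketch only: on a duplicate domain (e.g. www.fastphonerepair.com),
# 301 every request to the matching path on Site A
RewriteEngine On
RewriteRule ^(.*)$ https://www.ericreynolds.photography/$1 [R=301,L]
```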
My question is: how can I prompt Google to recrawl all of the sites and remove the wrong listings from the index? I have a screenshot attached so you can see the issue more clearly.
I have resubmitted my sitemap, but I'm not seeing much of a change in the index for my site. Any help on what I could do would be great.
Thanks
Eric -
Hi Eric,
Thanks for the update.
The screenshot showing the 301 is correct - all good there.
Regarding the sitemap, sorry, I should have been clearer on this: can you exclude it from the redirects so that when Google crawls it, it doesn't get redirected and instead finds all of the URLs from the old site?
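To illustrate, the exclusion could be sketched like this, again assuming Apache with mod_rewrite (adjust the sitemap filename to whatever the site actually uses):

```apache
# Sketch only: redirect everything on the duplicate domain
# except the sitemap, so Google can still crawl it in place
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/sitemap\.xml$
RewriteRule ^(.*)$ https://www.ericreynolds.photography/$1 [R=301,L]
```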
Cheers.
Paddy
-
Hi Paddy,
It's been a few days since I added the sites into Webmaster Tools and I'm now seeing the following (attached image) on all of them. Is that correct, or is there something else I need to do?
Also, when I submit a sitemap for the sites with the 301 redirect, it loads the sitemap on my correct site (since it's a redirecting site). I assume that is correct, but I just wanted clarification.
Thanks
Eric
-
Great, thank you.
I'll give it a shot and let you know how it worked.
-
Hi Eric,
I'd set up a profile for whichever version of the URLs 301s to your main site. So if the https version redirects, then use that one.
I don't think you need to submit every single URL; I'd recommend submitting a handful of the main ones (in terms of traffic or site architecture) and asking Google to also crawl all links on the page.
On the sitemap, you'd enter the URLs that have redirects in place, which are the URLs on your old sites. In your example, this would be Sites B, C, and D, which each need their own Search Console property and an XML sitemap covering the redirected pages on that site.
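For illustration, such a sitemap for Site B would just list the old, now-redirecting URLs in the standard sitemaps.org format. The paths here are made-up examples, not Eric's actual URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- old URLs on Site B that now 301 to Site A; paths are hypothetical -->
  <url><loc>https://www.fastphonerepair.com/</loc></url>
  <url><loc>https://www.fastphonerepair.com/gallery/</loc></url>
</urlset>
```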
Cheers.
Paddy
-
Hi Paddy,
I do have access to all of those domains, so I can set them up in Search Console. Would I set up the https version in Search Console and then run the crawl?
I have about 100 URLs on each site that are wrong. It's not a huge deal for me to do it manually, but is there a faster way to have them submitted and recrawled? If I do the sitemap, would I enter the old URLs that are indexed or the new URLs that I want them to redirect to?
Thanks
Eric
-
Hi Eric,
Thanks for the question.
Are you able to register each of the duplicate sites with Google Search Console? If so, you could do that and then use the Fetch as Google feature, which lets you submit pages to the Google index. So you could enter the URL of a page that is now redirected and ask Google to recrawl it.
You could also set up sitemaps for the duplicate sites and submit those to try to prompt Google to recrawl them.
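If writing out ~100 URLs per site by hand is a pain, a small script can generate each sitemap file from a URL list. Here's a rough sketch in Python; the domain and paths are hypothetical examples, not the actual URLs from this thread:

```python
# Sketch: generate a minimal sitemaps.org-format XML sitemap for a
# duplicate site's redirected URLs, ready to upload and submit in that
# site's Search Console property. Domain/paths below are hypothetical.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return an XML sitemap document listing the given URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Example: the old (redirecting) URLs on one duplicate domain
old_urls = [
    "https://www.fastphonerepair.com/",
    "https://www.fastphonerepair.com/portfolio/",
]
sitemap = build_sitemap(old_urls)
print(sitemap)
```

You'd run this once per duplicate site with that site's URL list, then save the output as sitemap.xml on that domain.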
Hope that helps!
Paddy