Google Indexed Site A's Content on Site B, Site C, etc.
-
Hi All,
I have an issue where the content (pages and images) of Site A (www.ericreynolds.photography) is showing up in Google under different domains: Site B (www.fastphonerepair.com), Site C (www.quarryhillvet.com) and Site D (www.spacasey.com). I believe this happened because I installed an SSL cert on Site A but didn't set a default SSL domain on the server: you could request any of Site A's pages under Site B's domain and the page would load properly.
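For context, this kind of content bleed usually happens when the server has no catch-all SSL virtual host, so HTTPS requests for any hostname fall through to the only SSL vhost (Site A's). A minimal sketch of a default SSL vhost that stops other hostnames from being served Site A's content (this assumes Apache, which the thread doesn't state, and the cert paths are placeholders):

```apache
# Hypothetical catch-all: HTTPS requests for hostnames that have no
# vhost of their own land here and get a 404 instead of Site A's pages.
<VirtualHost _default_:443>
    ServerName default.invalid
    SSLEngine on
    # Placeholder cert paths; any self-signed cert will do for a catch-all.
    SSLCertificateFile /etc/ssl/certs/placeholder.pem
    SSLCertificateKeyFile /etc/ssl/private/placeholder.key
    Redirect 404 /
</VirtualHost>
```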
I have since fixed that SSL issue and am now 301 redirecting any HTTPS request on Sites B, C and D over to Site A, since those sites are not using an SSL cert.
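The redirect Eric describes might look like this in an Apache .htaccess on Sites B, C and D (a sketch only — the thread doesn't say what server software is in use):

```apache
# Hypothetical rule for Site B (same pattern for Sites C and D):
# 301 any HTTPS request to the same path on Site A.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ https://www.ericreynolds.photography/$1 [R=301,L]
```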
My question is: how can I get Google to recrawl all of the sites and remove the wrong listings from the index? I have attached a screenshot so you can see the issue more clearly.
I have resubmitted my sitemap but I'm not seeing much change in the index for my site. Any help on what I could do would be great.
Thanks
Eric
-
Hi Eric,
Thanks for the update.
The screenshot showing the 301 is correct - all good there.
Regarding the sitemap, sorry, I should have been clearer on this: can you exclude it from the redirects, so that when Google crawls it, it doesn't get redirected and instead finds all of the URLs from the old site?
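If the 301s are done with Apache mod_rewrite (an assumption; the thread doesn't specify), one way to carve the sitemap out is an extra condition ahead of the redirect rule:

```apache
RewriteEngine On
# Skip the 301 for the sitemap so Google can still fetch the old URL list.
RewriteCond %{REQUEST_URI} !^/sitemap\.xml$
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ https://www.ericreynolds.photography/$1 [R=301,L]
```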
Cheers.
Paddy
-
Hi Paddy,
It's been a few days since I added the sites to Webmaster Tools, and I'm now seeing the following (attached image) on all of them. Would that be correct, or is there something else I need to do?
Also, when I submit a sitemap for the sites with the 301 redirect, it loads the sitemap on my correct site (since it's a redirect site). I assume that's correct, but I just wanted clarification.
Thanks
Eric
-
Great, thank you. I'll give it a shot and let you know how it works.
-
Hi Eric,
I'd set up a profile for whichever version of the URLs 301s to your main site. So if the https version redirects, use that one.
I don't think you need to submit every single URL; I'd recommend submitting a handful of the main ones (in terms of traffic or site architecture) and asking Google to also crawl all links on the page.
On the sitemap, you'd enter the URLs that have redirects in place, i.e. the old site's URLs. In your example, that's Sites B, C and D, which each need their own Search Console profile plus an XML sitemap listing the redirected pages on that site.
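With ~100 redirected URLs per site, such a sitemap is easiest to generate from a list. A minimal sketch in Python (the example paths on Site B are hypothetical):

```python
# Sketch: build an XML sitemap listing the *old* (redirected) URLs on a
# duplicate site, so Google re-crawls them and picks up the 301s.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string for the given list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical old URLs on Site B that should now 301 to Site A:
old_urls = [
    "https://www.fastphonerepair.com/gallery/",
    "https://www.fastphonerepair.com/about/",
]
print(build_sitemap(old_urls))
```

The output is a standard `<urlset>` document that can be saved as sitemap.xml on the old site and submitted in its Search Console profile.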
Cheers.
Paddy
-
Hi Paddy,
I do have access to all of those domains, so I can set them up in Search Console. Would I set up the https version in Search Console and then run the crawl?
I have about 100 URLs on each site that are wrong. It's not a huge deal for me to do it manually, but is there a faster way to have them submitted and recrawled? If I do the sitemap, would I enter the old URLs that are indexed or the new URLs that I want them to go to?
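Before asking Google to recrawl ~100 URLs, it's worth verifying each one actually returns a 301 to the matching path on Site A. A small sketch of that check (the fetching itself, e.g. with urllib, is left out; this only validates a status code and Location header, and the example URL is hypothetical):

```python
# Sketch: decide whether a redirect response is a correct 301 from an
# old URL on Site B/C/D to the same path on Site A.
from urllib.parse import urlsplit

TARGET_HOST = "www.ericreynolds.photography"  # Site A, from the thread

def redirect_ok(old_url, status, location):
    """True if the response is a 301 to the same path on Site A."""
    if status != 301:
        return False
    old, new = urlsplit(old_url), urlsplit(location)
    return new.hostname == TARGET_HOST and new.path == old.path

# e.g. a correct redirect for one hypothetical hijacked URL on Site B:
print(redirect_ok("https://www.fastphonerepair.com/gallery/",
                  301, "https://www.ericreynolds.photography/gallery/"))
```

Running this over the list of wrong URLs catches 302s or mismatched paths before they're resubmitted to Google.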
Thanks
Eric
-
Hi Eric,
Thanks for the question.
Are you able to register each of the duplicate sites with Google Search Console? If so, you could do that and then use the Fetch as Google feature, which lets you submit pages to the Google index. So you could enter the URL of a page that is now redirected and ask Google to recrawl it.
You could also set up sitemaps for the duplicate sites and submit those to try to prompt Google to recrawl them.
Hope that helps!
Paddy