Two URLs with same content
-
We recently had a client who owns multiple brands switch from having multiple URLs to having a single domain with multiple subdomains. I've posted an example below to better explain.
My question: the original URLs are still functional, so there are two URLs with identical content, yet I haven't been getting a duplicate content error. Also, would a rel=canonical link be beneficial in this case, since the duplicate content is on two separate domains?
My thought was to put a 301 redirect on the original pages so they permanently forward to the new subdomain format. Is this the best course of action? If not, what would you recommend?
Example:
Original URLs
www.example1.com
www.example2.com
www.example3.com
www.parentcompany.com
New URLs
example1.parentcompany.com
example2.parentcompany.com
example3.parentcompany.com
www.parentcompany.com
Let me know if I need to clarify anything in more detail.
Thanks in advance! -
I would prefer a 301 over a canonical. You don't lose as much link juice/PageRank with a 301 as you do with a canonical. In this scenario, it's best to follow Matt Cutts' advice in the attached video.
Take a look at the video, and please don't hesitate to ask me further questions.
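For reference, if you did go the canonical route instead, a cross-domain canonical is just a link element in the head of each page on the old domain, pointing at the matching URL on the new subdomain. A minimal sketch using the example domains from the question:

```html
<!-- In the <head> of a page on www.example1.com -->
<!-- Tells search engines the subdomain URL is the preferred version -->
<link rel="canonical" href="https://example1.parentcompany.com/" />
```

Note that users would still land on and link to the old domain, which is why the 301 is the stronger option here.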
-
A 301 redirect is the best solution, as it will point both users and bots to what will then be the only source of the content. A rel="canonical" will tell search engines which page is the canonical version, but you will still have users hitting pages on the old domain and potentially creating links to those pages instead of to your new subdomains, which isn't ideal.
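To implement the 301s, something like the following in an .htaccess file on each old domain would work. This is a sketch only, assuming an Apache server with mod_rewrite enabled and the example domains from the question; adjust the host and target names to match your actual setup:

```apache
# .htaccess on www.example1.com
# Permanently redirect every URL to the same path on the new subdomain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example1\.com$ [NC]
RewriteRule ^(.*)$ https://example1.parentcompany.com/$1 [R=301,L]
```

Repeat the same pattern for example2.com and example3.com, each pointing at its own subdomain, so existing deep links carry their path through to the new location.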