Question about moving content from one site to another without a 301
-
I could use a second opinion about moving content from some inactive sites to my main site.
Once upon a time, we had a handful of geotargeted websites set up targeting various cities that we serve. This was in addition to our main site, which was mostly targeted to our primary office and ranked great for those keywords. Our main site has plenty of authority, has been around for ages, etc.
We built out these geo-targeted sites with some good landing pages and kept them active with regularly scheduled blog posts which were unique and either interesting or helpful. Although we had a little success with these, we eventually saw the light and realized that our main site was strong enough to rank for these cities as well, which made life a whole lot easier, not to mention a lot less spammy.
We've got some good content on these other sites that I'd like to use on our main site, especially the blog posts. Now that I've got it through my head that there's no such thing as a duplicate content penalty, I understand that I could just start moving this content over so long as I put a 301 redirect in place where the content used to be on these old sites.
Which leads me to my question. Our SEO was careful not to have these other websites pointing to our main site to avoid looking like we were trying to do something shady from a link building perspective. His concern is that these redirects would undermine that effort and having a bunch of redirects from a half dozen sites could end up hurting us somehow.
Do you think that is the case?
What he is suggesting we do is remove all of the content that we'd like to use and use Webmaster Tools to request that this content be removed from the index. Then, after the sites have been recrawled, we'll check for ourselves to confirm they've been removed and proceed with using the content however we'd like.
Thoughts?
-
Your SEO has a point regarding the 301s, in the sense that a 301 won't pass 100% of the original URL's ranking authority. The usual rule of thumb is that a 301 loses somewhere between 1% and 10%.
The alternative he's suggesting sounds a bit complicated. To save that complication, keep the content on both sites, but put canonical tags on the pages with the duplicate content. This is the preferred approach from Google's perspective: it tells Google which version is the original. If you intend to remove the content from its original locations, then rel=canonical won't be needed.
Google does not want to index duplicate content, so as good practice you should either use the canonical tag or delete the copies from the old sites.
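For what it's worth, a cross-domain canonical is just a single link element in the head of the duplicate page; the URL below is a placeholder, not anyone's real site:

```html
<!-- In the <head> of the duplicate post on the old geo-targeted site -->
<!-- (placeholder URL; point it at the copy on your main site) -->
<link rel="canonical" href="https://www.mainsite.com/blog/some-helpful-post/" />
```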
Hope that is of use
Bruce
-
I see his concern but think it may be unwarranted. If you are going to be getting rid of the microsites, then the 301 redirect would be the way to go to conserve all of the authority you built to them. Unless the microsites were penalized, you should be fine.
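As a concrete sketch, if the old microsites run on Apache, the redirects could live in each site's .htaccess; the domains and paths below are placeholders, not anyone's real URLs:

```apache
# .htaccess on an old geo-targeted microsite (all URLs here are placeholders)
# One Redirect line per moved page: old path -> its new home on the main site
Redirect 301 /blog/some-helpful-post/ https://www.mainsite.com/blog/some-helpful-post/
Redirect 301 /springfield-services/ https://www.mainsite.com/locations/springfield/
```

Mapping each old URL to its closest equivalent page (rather than redirecting everything to the homepage) is what preserves the authority those individual pages earned.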
Another alternative is to leave the microsites up, copy the content over to your site and set up canonicals pointing to the new location of the content. This would transfer over the authority while keeping the microsites live.
Related Questions
-
Content Strategy/Duplicate Content Issue, rel=canonical question
Hi Mozzers: We have a client who regularly pays to have high-quality content produced for their company blog. When I say 'high quality' I mean 1000 - 2000 word posts written to a technical audience by a lawyer. We recently found out that, prior to the content going on their blog, they're shipping it off to two syndication sites, both of which slap rel=canonical on them. By the time the content makes it to the blog, it has probably appeared in two other places. What are some thoughts about how 'awful' a practice this is? Of course, I'm arguing to them that the ranking of the content on their blog is bound to be suffering and that, at least, they should post to their own site first and, if at all, only post to other sites several weeks out. Does anyone have deeper thinking about this?
Intermediate & Advanced SEO | Daaveey
-
Can a move to a new domain (with 301s) shake off a Google algorithm penalty
We have done everything under the sun, following Google's guidelines to the letter, to try to get our site back onto page 1 for our domain. We have recovered from Penguin and Panda algorithm filters for keywords that went from page 1 to page 7 and are now on page 2. It's been 2 years and we can't hit page 1 again. This is the final phase we can think of: do you think it will work if we move to a new domain, and how much traffic/rankings can we expect to lose in the short term?
Intermediate & Advanced SEO | Direct_Ram
-
Content Aggregation Site: How much content per aggregated piece is too much?
Let's say I set up a section of my website that aggregated content from major news outlets and bloggers around a certain topic. For each piece of aggregated content, is there a bad, fair, and good range of word count that should be stipulated? I'm asking this because I've been mulling it over—both SEO (duplicate content) issues and copyright issues—to determine what is considered best practice. Any ideas about what is considered best practice in this situation? Also, are there any other issues to consider that I didn't mention?
Intermediate & Advanced SEO | kdaniels
-
Site been plagiarised - duplicate content
Hi, I look after two websites; one sells commercial mortgages, the other sells residential mortgages. We recently redesigned both sites, and one was moved to a new domain name as we rebranded it from being a trading style of the other brand to being a brand in its own right. I have recently discovered that one of my most important pages on the residential mortgages site is not in Google's index. I did a bit of poking around with Copyscape and found another broker has copied our page almost word-for-word. I then used Copyscape to find all the other instances of plagiarism on the other broker's site, and there are a few! It now looks like they have copied pages from our commercial mortgages site as well. I think the reason our page has been removed from the index is that we relaunched both these sites with new navigation and consequently new URLs. Can anyone back me up on this theory? I am 100% sure that our page is the original version because we write everything in-house and I check it with Copyscape before it gets published. Also, the fact that this other broker has copied from several different sites corroborates this view. Our legal team has written two letters (not sent yet): one to the broker and the other to the broker's web designer. These letters ask the recipient to remove the copied content within 14 days. If they do remove our content from their site, how do I get Google to reindex our pages, given that Google thinks OUR pages are the copied ones and not the other way around? Does anyone have any experience with this? Or will it just happen automatically? I have no experience of this scenario! In the past, where I've found duplicate content like this, I've just rewritten the page and chalked it up to experience, but I don't really want to in this case because, frankly, the copy on these pages is really good! And I don't think it's fair that someone else could potentially be getting customers who were persuaded by OUR copy.
Any advice would be greatly appreciated. Thanks, Amelia
Intermediate & Advanced SEO | CommT
-
Duplicate content: is it possible to write a page, delete it and use it for a different site?
Hi, I have a simple question. Some time ago I built a site and added pages to it. I have since found out that the site was penalized by Google, and I have neglected it. The problem is that I had written well-optimized pages on that site which I would like to use on another website. So my question is: if I delete a page I had written on site 1, can I use it on site 2 without being penalized by Google for duplicate content? Please note: site 1 would still be online; I would simply delete some pages and use them on site 2. Thank you.
Intermediate & Advanced SEO | salvyy
-
Merging three sites to one
Hi guys, I just wanted confirmation that this is the right way to go about doing this. I need to merge three websites into a brand new site, and I've never done that before. So we have Sitex.com, Sitey.com, and Sitez.com. We've created SiteB.com, which has SiteB.com/SiteXCat, SiteB.com/SiteYCat, and SiteB.com/SiteZCat. Each of X, Y, and Z has over 1,000 pages, but only about 10 pages each with Page Authority above 10, and the domains aren't that strong. What I plan to do is: 301 redirect each site domain (X, Y, Z) to its corresponding category, e.g. Sitex.com > SiteB.com/SiteXCat; then 301 redirect each page on X, Y, and Z that has a Page Authority above 10 to its new page on SiteB.com. Then I'm unsure if I should 410 every other URL... I don't think it's worth 301ing every single URL if they aren't in search results much - but maybe it is if they have a lot of inbound links, even with low page authority? Any ideas, and does the above seem the best practice? Thanks.
Intermediate & Advanced SEO | Profero
-
Handling Similar page content on directory site
Hi All, SEOMOZ is telling me I have a lot of duplicate content on my site. The pages are not duplicate, but very similar, because the site is a directory website with a page for cities in multiple states in the US. I do not want these pages being indexed and was wanting to know the best way to go about this. I was thinking I could do a rel ="nofollow" on all the links to those pages, but not sure if that is the correct way to do this. Since the folders are deep within the site and not under one main folder, it would mean I would have to do a disallow for many folders if I did this through Robots.txt. The other thing I am thinking of is doing a meta noindex, follow, but I would have to get my programmer to add a meta tag just for this section of the site. Any thoughts on the best way to achieve this so I can eliminate these dup pages from my SEO report and from the search engine index? Thanks!
Intermediate & Advanced SEO | cchhita
-
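As a side note on the question above, the meta robots option it describes is a single tag in each page's head; the snippet below is illustrative. One reason it's usually the safer choice: a robots.txt Disallow only blocks crawling, so Google may keep already-indexed URLs in the index and will never see a noindex tag on a page it can't crawl.

```html
<!-- On each near-duplicate city page: keep it out of the index
     but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```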
Are there any concerns moving a site to https?
I am currently having analytics issues where the non-secured (http) front end of my site is not properly communicating with the secured (https) back end. When a user jumps between the secured and non-secured pages, it displays as a bounce in GA and I get duplicate visits. GA has a workaround for this, but it is messy and not working. So my question is: has anyone had good/bad experiences moving a non-secured site over to the secured side? Thanks!
Intermediate & Advanced SEO | 2comarketing