Old website, new domain name
-
Hi,
We have an old website that is currently being 301-redirected to a new domain name / bespoke e-commerce website, as we're rebranding and expanding the range of products we sell.
My question is: can we keep the original e-commerce website selling the original products under a new domain name and avoid duplicate content issues with the product descriptions? We have copied and pasted the product descriptions to the new website.
I'm looking to the future, and building up another domain will help with our expansion plans.
Both websites are registered to us at the same business address, if that helps.
Please feel free to ask questions if I haven't worded this very well!
Many thanks in advance.
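(For reference, a domain-level 301 normally maps each old URL to the same path and query string on the new host, so every old product page points at its twin on the new site. A minimal Python sketch of that mapping — the hostnames here are placeholders, not your actual domains:)

```python
from urllib.parse import urlsplit, urlunsplit

def redirect_target(old_url: str, new_host: str) -> str:
    """Map a URL on the old domain to the same path on the new domain.

    Illustrative helper: a domain-wide 301 typically preserves the path
    and query string, swapping only the hostname.
    """
    parts = urlsplit(old_url)
    return urlunsplit((parts.scheme, new_host, parts.path, parts.query, parts.fragment))

print(redirect_target("https://old-shop.example/products/widget?colour=red",
                      "new-brand.example"))
# -> https://new-brand.example/products/widget?colour=red
```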
-
Hi,
Sorry, I should have made it clear that the new domain name on the old website (or rather a copy of it) was to allow it to trade as a standalone business.
What I'm doing is taking the Old Website, rebranding it as the New Website with all the original content replicated, and 301-redirecting across to keep rankings (1,000+ top-3 positions). We will then add additional categories and products to expand.
I am then left with a Magento website that I would like to start afresh with and build up as we have done with the first one over the last two years.
This is where I have 1,000+ products and need to know whether I have to change the product descriptions (just trying to save a lot of work!), even though Google can see I own both domains and host them on my own server.
-
Oliver-
Keeping the original site has its advantages: additional SEO value and the ability to effectively create a defensive URL that could show up alongside the new site in search results.
Copying and pasting the product descriptions is a definite no. There are a lot of very informative posts on this issue; some SEO folks say you have to change over 50% of the content for the new descriptions to register as original. We have done some work in this area and have found that rewriting all of the descriptions, at least to some extent, is the only way to generate additional SEO value (obviously SKUs will probably stay the same). Another easy win is changing the meta descriptions for the photos, which counts towards the "original content" meter.
You will get some short and long term value if you take the time to rewrite the product descriptions.
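To put a rough number on "how different is different enough", here's an illustrative heuristic: compare the word overlap between an original and a rewritten description. This is only a back-of-the-envelope sketch (Google does not publish how it measures duplication, and the example descriptions below are made up), but it gives you a quick sanity check while rewriting:

```python
def word_overlap(original: str, rewritten: str) -> float:
    """Rough Jaccard similarity between two descriptions' word sets (0.0-1.0)."""
    a = set(original.lower().split())
    b = set(rewritten.lower().split())
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "Hand-made leather wallet with six card slots and a coin pocket."
rewritten = "A hand-stitched leather wallet featuring six card slots plus a zipped coin pouch."
print(round(word_overlap(original, rewritten), 2))  # well under 0.5 overlap
```

If the score stays high after a rewrite, you've mostly shuffled words rather than produced original copy.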
Hope this helps. If so, let me know....
Mark
-
To avoid duplicate content, you should probably 301 all of those pages as well. If not, you can deindex them in Google Webmaster Tools.
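If you go the 301 route for a catalogue of 1,000+ products, generating the redirect rules in bulk beats hand-editing. A hypothetical sketch that emits one redirect line per old product path, in an nginx-map-style `old-path new-url;` format (the paths and hostname are made up for illustration; adapt the output format to whatever your server actually expects):

```python
def redirect_map_lines(old_paths, new_host):
    """Build one 'old-path new-url;' redirect line per old product path."""
    return [f"{path} https://{new_host}{path};" for path in old_paths]

for line in redirect_map_lines(["/widget-a", "/widget-b"], "new-brand.example"):
    print(line)
# /widget-a https://new-brand.example/widget-a;
# /widget-b https://new-brand.example/widget-b;
```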