Redirecting acquired website: DNS or 301?
-
Hi all,
We have taken down some infringing websites and acquired them; so far we've done this with around 40 websites. We are currently using DNS to send visitors from the acquired domains to our website. But I've recently read that DNS pointing is not a redirect and that we should 301 redirect from each old site to the new site. There is also a potential harm in the DNS method, because the duplicate content it creates can severely hurt SERP performance. Which is the best way: DNS or a 301 redirect?
Thanks
-
Agree with Tom re getting it done with 301 redirects. The other thing to be very careful about, though, is making sure the newly acquired websites don't have major quality issues, especially spammy, manipulative backlink profiles. If the infringing sites were using inappropriate SEO methods and were coming under algorithmic pressure/penalties as a result, those issues will also be passed on to your "real" site.
Paul
-
+1 to what Tom said.
Also, be careful not to redirect a whole bunch of pages/websites to the same page (or the home page).
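For illustration, here's a minimal sketch (Apache mod_rewrite, with old-acquired-site.com and www.yoursite.com as made-up placeholder domains) of keeping each old URL pointed at its counterpart instead of collapsing everything into one page:

# What to avoid: a blanket rule that dumps every old URL onto the homepage
# RedirectMatch 301 .* https://www.yoursite.com/

# If the new site mirrors the old structure, preserve the path instead,
# so /some-page on the acquired domain lands on /some-page on your site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-acquired-site\.com$ [NC]
RewriteRule ^(.*)$ https://www.yoursite.com/$1 [R=301,L]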
Here, Matt Cutts said it:
"Is there a limit to how many 301 (Permanent) redirects I can do on a site?" - Google Webmasters YouTube channel.
Best of luck.
GR. -
I'd go ahead and 301 redirect those websites and pages.
With a 301 redirect you will also pass on any link equity the infringing websites once had, which in turn may help your organic ranking performance.
However, in order for that to happen, you need to make sure you redirect the individual pages on those websites to the most relevant/equivalent pages on your own site. Otherwise, those 301 redirects may be treated as soft 404 errors.
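As a rough illustration, here's a hedged sketch of that page-to-page mapping in an .htaccess file on one of the acquired domains (all paths and target URLs below are made-up placeholders, not a prescription):

# Map each old page to its closest equivalent on the main site;
# each pair is a judgment call based on the content, not a formula
Redirect 301 /widgets/blue-widget https://www.yoursite.com/products/blue-widget
Redirect 301 /blog/widget-care-tips https://www.yoursite.com/guides/widget-care
Redirect 301 /about-us https://www.yoursite.com/about

# Pages with no sensible equivalent are usually better left to return 404/410
# than funnelled to the homepage, where they may be treated as soft 404s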
Hope this helps.
Related Questions
-
Have you ever seen or experienced a page being indexed from a website that is blocked by robots.txt?
Hi all, We use the robots.txt file and meta robots tags to stop bots from crawling a website or specific pages. Mostly robots.txt is used for the whole website, with the expectation that none of its pages get indexed. But there is a catch: a page from a site blocked by robots.txt can still be indexed by Google, because the crawler may find a link to that page somewhere else on the internet, as stated here in the last paragraph. I wonder if this is really why some of our webpages have been indexed. And if we use meta tags at the page level, do we still need to block via robots.txt? Can we use both techniques at the same time? Thanks
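For what it's worth, when the goal is keeping pages out of the index, the usual pattern is to rely on noindex (a meta robots tag or an X-Robots-Tag response header) and not to disallow those URLs in robots.txt, because a robots.txt block stops Google from ever fetching the page and seeing the noindex. A hedged sketch, assuming Apache with mod_headers, a hypothetical /members/ section, and access to the server or virtual-host config:

# Keep /members/ URLs crawlable but out of the index;
# do NOT also Disallow these paths in robots.txt, or the header is never seen
<LocationMatch "^/members/">
    Header set X-Robots-Tag "noindex, nofollow"
</LocationMatch>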
Algorithm Updates | vtmoz
-
What happens when most website visitors end up at a "noindex" log-in page?
Hi all, As most of our users visit the website to log in, we are planning to deindex the login page. Since they can't find it in the SERPs, they will come to our website and then log in; I just wonder what happens when most visitors end up at the homepage and then browse into a "noindex" page. Obviously it increases bounce rate and exit rate, as they essentially just disappear. Is this going to push us down in the rankings? What other concerns should we check? Thanks
Algorithm Updates | vtmoz
-
Any risks involved in redirecting a low-quality infringement website?
Hi all, Recently we have taken over one of the websites (with a trademark infringement) that had been using our domain name in its domain. That website has no traffic or backlinks. Is there any risk involved in redirecting that website to our website? Thanks
Algorithm Updates | vtmoz
-
Mobile versions of a website for different languages
Hi There! A website is available in two languages (EN, FR), but the mobile version is only available for one language (EN). The mobile website follows the 'separate URLs' configuration, so all English desktop URLs are redirected to the corresponding mobile pages. For example: https://www.sitegeek.com/godaddy redirects to https://m.sitegeek.com/godaddy (if the page is opened on mobile). But the same URL in the French version, https://fr.sitegeek.com/godaddy, is not redirected to https://m.sitegeek.com/godaddy. So what would be the correct implementation? Should the French version be redirected to the English mobile version until an FR version is ready, or should it not be redirected at all? Rajiv
Algorithm Updates | gamesecure
-
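One possible arrangement, as a sketch only (assuming Apache with mod_rewrite on the www host, and choosing to leave the French pages untouched until an FR mobile version exists), is to scope the device redirect to the English hostname:

# Send mobile visitors of the English desktop site to m.sitegeek.com, keeping the path;
# fr.sitegeek.com is left alone until a French mobile version is built
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.sitegeek\.com$ [NC]
RewriteCond %{HTTP_USER_AGENT} (android|iphone|ipad|mobile) [NC]
RewriteRule ^(.*)$ https://m.sitegeek.com/$1 [R=302,L]

Whichever route is chosen, the separate-URL annotations should only link language pairs that actually exist: rel="alternate" on the desktop page pointing to its mobile URL, and rel="canonical" on the mobile page pointing back.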
Complete website redesign: original domain vs subdomain vs new domain?
Hello dear community fellas! The story goes like this: my company has a good ol' website launched back in 2008, and since then nothing much was updated there. Our rank dropped significantly because, well, barely any SEO was done for it. My team and I decided to redesign the whole thing: content, structure, visuals, links, everything, but this time really making it right. However, with our oldie we managed to get a nice user base, so we still want to get all the traffic juice out of it. Now the question is where you think is the best place to publish our new website: our original domain www.companyname.com, a subdomain new.companyname.com, or a totally new domain www.namecompany.com? Cheers!
Algorithm Updates | PayPro
-
Multiple Websites for Same Company (different areas of practice)
My company has had one primary website for a number of years; a few years back we created a second website, separate and apart from the first one, to generate more business in a niche market that we cater to. Since then we have ended up adding 3 more websites to help increase our footprint with more content. Each of the new websites deals with a major aspect of our business, and the content generated on those websites is related to those areas of our business. My question is: is it a bad idea to have a network of 5 websites for SEO purposes? What are the pros and cons, and why? Any supporting resources to back up your position would be greatly appreciated. Note there is no "duplicate content" problem here; all content we create is unique to the site it is hosted on.
Algorithm Updates | goldbergweismancairo
-
Long term plan for a large htaccess file with 301 redirects
We set up a pretty large .htaccess file in February for a site that involved over 2,000 lines of 301 redirects from old product URLs to new ones. The 'old' URLs still get a lot of traffic from product review sites and other pretty good sites, which we can't change. We are now trying to reduce page load times and we're ticking all of the boxes apart from the size of the .htaccess file, which seems to be causing a considerable hang on load times. The file is currently 410kb! My question is, what should I do in terms of a long-term strategy, and has anyone come across a similar problem? At the moment I am inclined to remove the 2,000 lines of individual redirects and put in a 'catch all' whereby anything from the old site will go to the new site's homepage. Example code:
RedirectMatch 301 /acatalog/Manbi_Womens_Ear_Muffs.html /manbi-ear-muffs.html
RedirectMatch 301 /acatalog/Manbi_Wrist_Guards.html /manbi-wrist-guards.html
There is no consistency between the old URLs and the new ones, apart from the fact that they all sit in the subfolder /acatalog/.
Algorithm Updates | gavinhoman
-
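For reference, one long-standing way to take a large, stable redirect table out of .htaccess entirely is a RewriteMap lookup (a sketch, assuming Apache mod_rewrite and access to the virtual-host config; RewriteMap is not permitted inside .htaccess, and product-map.txt is a hypothetical two-column file mapping the old /acatalog/ file names to their new URLs):

# In the virtual host, not .htaccess
RewriteEngine On
RewriteMap productmap "txt:/etc/apache2/maps/product-map.txt"

# Only old catalogue URLs are looked up; anything unmapped falls through untouched
RewriteCond "${productmap:$1|NOT_FOUND}" "!NOT_FOUND"
RewriteRule "^/acatalog/(.+)$" "${productmap:$1}" [R=301,L]

# For very large maps, httxt2dbm can convert the txt file into a dbm map
# (RewriteMap productmap "dbm:/etc/apache2/maps/product-map.map") for faster lookups

That would keep the page-to-page mappings (and whatever equity they pass) while removing the per-request parsing cost of a 410kb .htaccess.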
Website "penalized" 3 times by Google
I have a website that I'm working with that has had the misfortune of gaining rankings/traffic on Google, then having the rankings/traffic removed... 3 times! (Very little was changed on the site to gain or lose "favor" with Google, either.)
Notes:
Site is a mixture of high quality original content and duplicate content (vacation rental listings)
When traffic crashes, we lose nearly all rankings and traffic (90+%)
When traffic crashes, we lose all rankings sitewide, including those gained by our high quality, unique pages
None of the "crash" dates appear to coincide with any Panda update dates
We are working on adding unique content to our pages with duplicate content, but it's a long process and so far doesn't seem to have made any difference
I'm confounded why Google keeps "changing its mind" about our site
We have an XML sitemap, and Google keeps our site indexed pretty well, even when we lose our rankings
Due to the drastic and sitewide loss of rankings, I'm assuming we are dealing with some sort of algorithmic penalty
Timeline:
Traffic steadily grows starting in Jan 2011
Traffic crashes on Feb 19, 2011. We assumed it was due to a pre-Panda anti-scraper update, but don't know.
Google sends traffic to our site on March 1, then none the next day
On June 16th, I block part of the site using robots.txt (most of the section wasn't indexed anyway)
On June 17th, Google starts ranking our site again. I thought it might be due to the robots.txt change, but I had just made the change a few hours ago, and Google wasn't even indexing the part of the site I blocked
Traffic/rankings crash again on July 6th. No theory why.
Site URL: http://www.floridaisbest.com
Traffic Stats: Attached
I know that we need more backlinks and less duplicate content, but I can't explain why our Google rankings are "on again, off again". I have never seen a site gain and lose all of its rankings/traffic so drastically multiple times, for no apparent reason. Any thoughts or ideas would be welcome. Thanks!
Algorithm Updates | AdamThompson