We have set up 301 redirects for pages from an old domain, but they aren't working and we are having duplicate content problems - Can you help?
-
We have several old domains. One is http://www.ccisound.com; our "real" site is http://www.ccisolutions.com. The 301 redirect from the old domain to the new domain works. However, the 301 redirects for interior pages, like:
http://www.ccisound.com/StoreFront/category/cd-duplicators do not work. This URL should redirect to http://www.ccisolutions.com/StoreFront/category/cd-duplicators, but as you can see, it does not.
Our IT director supplied me with this code from the .htaccess file in hopes that someone can help point us in the right direction and suggest how we might fix the problem:
RewriteCond %{HTTP_HOST} ccisound.com$ [NC]
RewriteRule ^(.*)$ http://www.ccisolutions.com/$1 [R=301,L]
Any ideas on why the 301 redirect isn't happening? Thanks all!
-
Yes, that is the best thing you can do. It seems there are some other issues in your configs, and we cannot see all of your configuration from here. In any case, the code we have given you will work once you solve those other problems.
Best Regards
Prasad
-
Thanks, Prasad, for all your help, and thank you to Ersin also. We have solved the problem. Apparently, our URL rewrite at the Tomcat level was taking precedence over the .htaccess file. Once our IT director added the appropriate redirect for these domain pages into the URL rewrite file, the problem was fixed, and those pages now respond with a 301 redirect to the correct page on the correct domain. I have encouraged him to write a blog post about this and share it here, because there was very little documentation online about Tomcat rewrites taking precedence over an .htaccess file.
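[Editor's note: for anyone who hits the same conflict: when a rewrite layer inside Tomcat handles the request, its rules win over .htaccess, so the redirect has to be added at the Tomcat level as well. The thread never shows the actual file, but if the site uses Tomcat's RewriteValve, whose rewrite.config uses mod_rewrite-style syntax, the rule might look roughly like this. This is a sketch under that assumption, not the poster's real configuration:]
# rewrite.config for Tomcat's RewriteValve (illustrative only; assumes the
# valve is enabled via a <Valve className="org.apache.catalina.valves.rewrite.RewriteValve"/>
# element in context.xml)
RewriteCond %{HTTP_HOST} ^(www\.)?ccisound\.com$ [NC]
# In the RewriteValve the captured path already begins with "/",
# so no extra slash is added between the host and $1
RewriteRule ^(.*)$ http://www.ccisolutions.com$1 [R=301,L]
[If the rewriting is instead done by a servlet filter such as the Tuckey UrlRewriteFilter, the equivalent rule would go in its urlrewrite.xml instead.]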
-
Thanks Prasad,
We tried your suggestion because there was a lot of stuff [1,220 lines of code] in our .htaccess file. We stripped everything out except your code and still had the same problem. Our IT director is wondering if perhaps there is a conflict between the .htaccess file, which operates at the Apache level, and the URL rewrite file, which operates at the Tomcat level. Does one of them take precedence over the other? In other words, could our URL rewrite file be causing the redirects in the .htaccess file to not work properly?
I am thinking maybe we need to hire someone to look at the code in both files to figure out where and why we are having a conflict.
Dana
-
Hi,
I think you are making several mistakes. First, take all of the other rules out of your .htaccess file, then copy in only one set of code, either mine or Ersin's. While you have other rules related to this domain's redirection, there may be conflicts. And your current code does work for your root domain, which means your .htaccess redirection is working; the problem is that it has not been used correctly.
-
Thanks to both Ersin and Prasad. I very much appreciate your efforts to help. My IT director tried both versions of the code without success. Here is exactly what he wrote:
" I tried the suggestions without success. I even moved the ccisound
redirects to the top of the file thinking that some other redirect was grabbing
it first, But no go, same results. Top level redirected, lower level not."Any suggestions as to why neither code succeeded at creating a "catch all." ?
-
cprasad's purpose and mine are the same, so both of our responses will work; there are just some differences.
The <IfModule mod_rewrite.c> line checks whether Apache's rewrite module is activated; if it is, the code inside runs. The second difference is that my RewriteCond lines are fewer than cprasad's, but they express the same conditions. By the way, I tested this just before writing it.
Just do it..
-
Hi,
Do not worry about the differences between the code provided by me and by Ersin; both function the same. He has just added module-activation tags around the code, so you may use whichever version you prefer and works for you.
As for your question about the code you were using: it may not initiate a wildcard redirection.
Are there more rules inside the .htaccess?
If you can post the exact code, without hyperlinking any URL, I can tell you the exact reason why your code does not do the job.
Prasad
-
Thanks Ersin,
Can you explain how your coding suggestion is different from Prasad's? I am not a coder, so I am just wondering if there is a different methodology behind the two suggestions.
-
Yes, a catch-all makes sense to me. What is the difference between your code suggestion and the one just below, posted by Ersin A.? Also, just for our own understanding, can you explain why the code we were using wasn't accomplishing what we wanted? (I just want to be able to explain it to our Web team.) Thanks!
-
Hi,
The URL you supplied for the interior page of the old domain, the one you said is not redirecting, appears to be incorrect, because you have linked that text to the new domain. In any case, I understand your problem. Your problem is that
http://www.ccisound.com
is redirecting to
http://www.ccisolutions.com
but
http://www.ccisound.com/StoreFront/category/cd-duplicators
is not redirecting to
http://www.ccisolutions.com/StoreFront/category/cd-duplicators
If I have understood correctly, the solution to your problem is to do a wildcard 301 redirect. It will redirect all the inner pages to the new domain's inner pages, but all the inner pages on the new site must have the same paths as on the old site. I hope you understand what I mean.
Also, looking at the example URLs you have provided, it seems both sites have the same content, so there is nothing to worry about; it seems you have just changed the domain name.
So, in any case, use the following code in the .htaccess file inside your root folder:
RewriteEngine on
Options +FollowSymLinks
RewriteCond %{HTTP_HOST} ^ccisound\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.ccisound\.com$
RewriteRule ^(.*)$ http://www.ccisolutions.com/$1 [R=301,L]
Do not write the first two lines again if they are already present in your .htaccess file.
The above code will initiate a wildcard 301 redirect and will solve your problem. Hopefully you can adapt the code for all the other domains you have; otherwise, post them here and I will do it for you.
Regards
Prasad
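[Editor's note: for anyone adapting this thread's fix to several old domains at once, the two RewriteCond lines above can be collapsed into one case-insensitive condition that lists every old host. A minimal sketch; the second domain name is a placeholder, since the thread never names the other old domains:]
RewriteEngine on
# Match any listed old host, with or without "www", case-insensitively
RewriteCond %{HTTP_HOST} ^(www\.)?(ccisound\.com|old-domain-example\.com)$ [NC]
# Permanently redirect to the same path on the new domain
RewriteRule ^(.*)$ http://www.ccisolutions.com/$1 [R=301,L]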