Can I redirect a link even if the link is still on the site?
-
Hi Folks,
I've got a client with a duplicate content problem: they deliberately create the same piece of content and store it in 2 different places.
Generating this duplicate content also creates a 2nd link on the site pointing to the duplicate page. They want that 2nd link to always redirect to the first page, but for architectural reasons they can't remove the 2nd link from the site navigation.
We can't use rel=canonical because they don't want visitors landing on that 2nd page at all.
Here is my question: Are there any adverse SEO implications to maintaining a link on a site that always redirects to a different page?
I've already gone down the road of "don't deliberately create duplicate content" with the client. They've heard me, but won't change.
So, what are your thoughts?
Thanks!
-
Are you using a CMS, or some in-house solution? If it's a CMS, in many cases you should be able to configure it so that the 2 links are generated but the page itself isn't generated twice.
Another option, if 2 pages must exist, would be to set a canonical on both pages pointing to the 1 main location for the content, while using pushState on the URL to steer the browser toward the main path. Although the more I think about that one, it may not be a 100% viable option.
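Just to sketch that idea (a purely hypothetical example - the domain and paths below are made up), the duplicate page would carry a canonical tag pointing at the main URL, and a bit of JavaScript would use pushState to swap the path shown in the address bar. Worth stressing that this only changes what the visitor sees - the duplicate page is still the one being served:

    <!-- Served on the duplicate page, e.g. /second-location/product-x (hypothetical path) -->
    <link rel="canonical" href="https://www.example.com/main-location/product-x" />
    <script>
      // Sketch only: swap the duplicate path for the main one in the address bar.
      // The server still returned the duplicate page; only the displayed URL changes.
      if (window.location.pathname.indexOf('/second-location/') === 0) {
        history.pushState(null, '', window.location.pathname.replace('/second-location/', '/main-location/'));
      }
    </script>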
-
I agree - but as with many things, there's politics involved... I'll leave it at that.
-
Although, depending on Craig's site structure, it could be a simple, one-time setup in the .htaccess file so that all of the 2nd links 301 to their 1st-link counterparts.
For example, if creating website.com/category1/product1 also creates a duplicate page at /category2/product1, he could use regex so that all products under /category2/ redirect to their /category1/ product URL.
You're right that it's still not the most elegant of solutions, but it's a simple enough way to make sure users are where you want them to be without requiring any effort every time you create a new page - and it shouldn't upset Googlebot.
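To sketch what that might look like (assuming Apache with mod_rewrite enabled, and using the hypothetical /category1/ and /category2/ paths from the example above), something along these lines in the .htaccess file should do it:

    # Assumes Apache with mod_rewrite enabled; paths follow the hypothetical example above.
    RewriteEngine On
    # 301 every URL under /category2/ to its /category1/ counterpart.
    RewriteRule ^category2/(.+)$ /category1/$1 [R=301,L]

If mod_rewrite isn't available, a RedirectMatch rule in mod_alias can do the same job. Either way, every new product the CMS creates under /category2/ is covered automatically, with no per-page work.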
-
Yes, you absolutely can redirect this link. However, I think your time would be better spent on a solution that prevents this from happening long term. You will continually have to redirect new content for as long as the site keeps working this way.
-
Redirecting the 2nd link would probably be the best option, in my opinion. If the 2nd link is an integral part of the site structure and navigation, but you don't want users (or Google) to land on that duplicate page, I don't see how you could do it any other way if your client insists that the 2nd page has to be created.