301 redirects for duplicate content on dynamic URLs
-
I have a number of pages that appear with duplicate titles in Google Webmaster Tools. They all stem from a brand-name query. I want to 301-redirect these pages since I'm going to relaunch my site on WordPress and don't want 404s on them. A simple 301 redirect doesn't work because they are dynamic URLs.
Here is an example:
/kidsfashionnetherlands/mimpi.html?q=brand%3Amim+pi%3A&page=2&sort=relevance
/kidsfashionnetherlands/mimpi.html?q=mim+pi&page=3&sort=relevance
/kidsfashionnetherlands/mimpi.html?q=mim+pi&page=5&sort=relevance
These should all 301 to the original page that I want to remain indexed:
/kidsfashionnetherlands/mimpi.html
I have a lot of these, each for a different query. Should I do a 301 on each of them to avoid 404s when I move my site to WordPress?
Thanks
-
I can't believe it, but it worked! I've been trying to fix this for days. Thanks so much for your help!
-
It's going to be a bit complex to get this done right, but based on your example above, it looks like you just want to redirect anything with a query string back to the base URL.
There's a discussion specifically about that right here:
http://www.webmasterworld.com/apache/3203401.htm
You could start with this code and refine it from there:
RewriteEngine On
# Only fire when a query string is present
RewriteCond %{QUERY_STRING} .
# The trailing "?" on the target strips the query string from the redirect
RewriteRule (.*) http://www.example.com/$1? [R=301,L]

Good luck.
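If the only URLs you need to catch follow the pattern in the question, a rule scoped to that one page may be safer than a site-wide catch-all. This is a sketch assuming the rules live in the site root's .htaccess; the path is taken from the example URLs above:

```apache
RewriteEngine On
# Only fire when a query string (q=, page=, sort=, ...) is present
RewriteCond %{QUERY_STRING} .
# Redirect the search-results variants back to the clean base URL;
# the trailing "?" drops the query string, so the redirected request
# (which has no query string) no longer matches the condition above
RewriteRule ^kidsfashionnetherlands/mimpi\.html$ http://www.dashinfashion.com/kidsfashionnetherlands/mimpi.html? [R=301,L]
```

Because the condition requires a non-empty query string, the redirected request can't loop back into the same rule.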
-
Thanks Marek. I know how to do a simple 301 redirect, which the generator creates, but it doesn't work for a dynamic URL such as http://www.dashinfashion.com/kidsfashionnetherlands/mimpi.html?q=brand%3Amim+pi%3A&page=2&sort=relevance redirecting to http://www.dashinfashion.com/kidsfashionnetherlands/mimpi.html
Any idea how to do this with a dynamic URL?
-
Thanks for your reply. I'm actually going to relaunch a new site and want to get rid of these duplicate pages, and I don't have access to the duplicated pages to add a noindex tag.
I thought of doing a 301 since these pages will be eliminated by the new website. Any idea how to do a 301 for a dynamic URL?
-
I set up the URL parameter tool yesterday to block out these pages, but my concern is that when I relaunch my website I'll have lots of 404s.
I also thought of doing a 301 redirect for the pages listed in Google Webmaster Tools as duplicates, but I can't figure out the right script. A simple 301 redirect won't work because they're dynamic URLs.
Do you know how to set up a dynamic 301? For example:
http://www.dashinfashion.com/kidsfashionnetherlands/mimpi.html?q=brand%3Amim+pi%3A&page=2&sort=relevance
redirected to: http://www.dashinfashion.com/kidsfashionnetherlands/mimpi.html
Thanks!
-
Hi,
The title of all these pages is the same:
<title>Mim Pi Netherlands Holland Dutch Clothes Girls Clothes Girls Dresses Baby Clothes Kids Fashion Mim Pi Children's Clothing Girls Fashion Designer</title>
In my opinion, if you cannot differentiate the titles, the best solution is to noindex the search-query pages.
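Since the pages themselves can't be edited here, one hedged alternative (assuming the server runs Apache 2.4 with mod_headers enabled, which this thread doesn't confirm) is to send the noindex signal as an X-Robots-Tag HTTP header for any request carrying a query string, without touching the HTML:

```apache
# Requires Apache 2.4 (<If> expressions) and mod_headers
<If "%{QUERY_STRING} =~ /./">
    # Equivalent to a <meta name="robots" content="noindex, follow"> tag
    # on every query-string variant of the page
    Header set X-Robots-Tag "noindex, follow"
</If>
```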
Marek
-
Have you seen the URL parameter tool in Google Webmaster Tools? It's made just to handle these kinds of situations:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1235687
That said, there's no reason you can't set up redirects for all of these URLs, but it will probably be time-consuming to work out all of the possible URL combinations you have and where each one should redirect.
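If the duplicate URLs all follow the same shape as the examples in this thread (a brand page plus search parameters), one pattern rule can often cover the whole family instead of enumerating every URL. A sketch, assuming the old pages all live at paths like /kidsfashionnetherlands/<brand>.html; the regex is inferred from the example URLs, so adjust it to the real URL structure before relying on it:

```apache
RewriteEngine On
# Only act on requests that carry a query string
RewriteCond %{QUERY_STRING} .
# Capture the base page path and redirect to it; the trailing "?"
# drops the query string, turning every parameter variant of a
# brand page into a single 301 to its clean URL
RewriteRule ^(kidsfashion[a-z]+/[a-z0-9-]+\.html)$ http://www.dashinfashion.com/$1? [R=301,L]
```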