Dynamic 301s causing duplicate content
-
Hi, wonder if anyone can help?
We have just changed our site, which was hosted on IIS with page URLs like this: example.co.uk/Default.aspx?pagename=About-Us. The new page URL is example.co.uk/About-Us/ and the site now runs on Apache.
The 301s our developer told us to use were in this format:
RewriteCond %{REQUEST_URI} ^/Default.aspx$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/ [R=301,L]
This seemed to work from a 301 point of view; however, it also seemed to allow both of the URLs below to serve the same page!
example.co.uk/About-Us/
example.co.uk/About-Us/?pagename=About-Us
Webmaster Tools has now picked up on this and is flagging it as duplicate content.
Can anyone explain why it would be doing this, please? I'm not totally clued up, and our host/developer can't understand it either.
Many Thanks
-
Right, I have done some research, as this was bugging me...
Remove:
RewriteCond %{REQUEST_URI} ^/Default\.aspx$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/ [R=301,L]
And replace it with:
```
RewriteCond %{QUERY_STRING} ^pagename=About-Us$ [NC]
RewriteRule ^Default\.aspx$ http://www.domain.co.uk/About-Us/? [R=301,L,NC]
```
Test it and let me know...
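The trailing ? on the substitution is the important bit: without it, mod_rewrite re-appends the original query string to the new URL, which is how the redirect produced example.co.uk/About-Us/?pagename=About-Us in the first place. On Apache 2.4 or later the QSD (query string discard) flag does the same job; a minimal equivalent sketch, assuming the server is on 2.4+:

```
# Same redirect, using QSD instead of a trailing ? to drop the old query string
RewriteCond %{QUERY_STRING} ^pagename=About-Us$ [NC]
RewriteRule ^Default\.aspx$ http://www.domain.co.uk/About-Us/ [R=301,L,NC,QSD]
```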
Keith
-
Are you looking to redirect a single URL or many similar URLs?
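If it turns out to be many similar URLs, one generic rule can cover them all. A minimal sketch, assuming every old address follows the Default.aspx?pagename=X pattern and each X matches its new folder name (neither of which is confirmed here):

```
# Assumed generic mapping: /Default.aspx?pagename=X -> /X/
# %1 is the value captured by the RewriteCond; the trailing ? drops the old query string
RewriteCond %{QUERY_STRING} ^pagename=([^&]+)$ [NC]
RewriteRule ^Default\.aspx$ http://www.domain.co.uk/%1/? [R=301,L]
```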
-
Ok, maybe someone else will be able to pop in with some more info. Here's a post with a similar issue from Webmaster World:
-
In that case it didn't work. I was able to visit both URLs.
-
If the above rewrite rules work correctly (always check), they will redirect both Google and humans, passing Google's PageRank from your old page to the new one (well, most of it anyway).
-
Would this stop example.co.uk/About-Us/?pagename=About-Us from working, or just stop Google from listing it as a dupe?
-
Hi Neil, try this:
RewriteCond %{REQUEST_URI} ^/Default.aspx$
RewriteCond %{REQUEST_URI} ^About-Us/$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/ [R=301,L]
Hope this helps!
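One thing to double-check with the rules above: RewriteCond lines are ANDed by default, and REQUEST_URI cannot match both ^/Default.aspx$ and ^About-Us/$ at the same time, so as written the rule may never fire. If the aim is also to stop example.co.uk/About-Us/?pagename=About-Us resolving as a separate page, a minimal sketch for that case on its own (an assumption about the intent, not a tested fix; pattern written for .htaccess context, i.e. no leading slash):

```
# If the new URL is requested with the stale query string,
# 301 it to the clean URL; the trailing ? drops the query string
RewriteCond %{QUERY_STRING} ^pagename=About-Us$ [NC]
RewriteRule ^About-Us/?$ http://www.domain.co.uk/About-Us/? [R=301,L]
```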