Bad Duplicate content issue
-
Hi,
for grappa.com I have about 2,700 warnings of duplicate page content. My CMS generates long URLs like http://www.grappa.com/deu/news.php/categoria=latest_news/idsottocat=5 and http://www.grappa.com/deu/news.php/categoria%3Dlatest_news/idsottocat%3D5 (these are flagged as duplicate content).
What's the best solution to fix this problem? Should I set up a 301 redirect for all the duplicated pages, or insert rel=canonical, or rel=prev/next?
It's complicated because it's a multilingual site, and this is my first time dealing with this kind of issue.
Thanks in advance.
-
Your original question had two URLs, one where the "=" was replaced with "%3D". If that was an actual crawled URL (and not a copy-and-paste error), then it's likely coming from bad links within your own site. That URL is malformed, so you should definitely track it down. A desktop crawler like Xenu or Screaming Frog could help find the culprit:
http://www.seomoz.org/blog/crawler-faceoff-xenu-vs-screaming-frog
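To see why the two URLs in the question collide, note that "%3D" is just the percent-encoded form of "=". A quick sketch using Python's standard library (just to illustrate the encoding, not anything specific to your CMS):

```python
from urllib.parse import unquote

url_a = "http://www.grappa.com/deu/news.php/categoria=latest_news/idsottocat=5"
url_b = "http://www.grappa.com/deu/news.php/categoria%3Dlatest_news/idsottocat%3D5"

# The raw strings differ, so a crawler treats them as two distinct URLs...
print(url_a == url_b)           # False
# ...but after percent-decoding, %3D becomes "=" and they are the same page.
print(unquote(url_b) == url_a)  # True
```

That's why the crawler reports them as duplicates: two different URL strings, one page of content.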
-
Thanks Peter for the reply!
What do you mean by "bad internal links"?
I'm well ranked, so based on your suggestions what I have to do is set up the rel=canonical and rel=alternate tags properly, right? I'm still a bit scared about the duplicate content report in the SEOmoz campaign; 2,700 warnings is kind of a big deal.
-
One of these URLs just seems to be the encoded version of the other, and the two should resolve identically. I'm not seeing any evidence that Google is indexing both. I have a feeling that you may have some bad internal links that need to be fixed. I'm seeing the English/German versions of this page in the index, but that should be fine. As Khem said, you could use .
Be careful about converting to a "static" version. It's not that it's a bad idea, but the problem is that you could end up turning 2 duplicates into 3 duplicates. You'll still have to canonicalize the dynamic version to the static version. In other words, done badly, changing your URLs could actually make the problem worse.
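If you do move to static URLs, the safety net is a canonical tag on the dynamic version pointing at the static one. A minimal sketch, assuming a hypothetical static URL (use whatever your rewrite actually produces):

```html
<!-- In the <head> of the dynamic page. The static target URL below is
     a hypothetical example, not the site's actual rewritten path. -->
<link rel="canonical" href="http://www.grappa.com/deu/news/latest-news/" />
```

With that in place, even if both versions stay reachable, search engines are told which one counts.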
-
Rel=prev/next is for paginated series, such as internal search results. While I see you have a pagination parameter on these pages ("idpagina=13"), it doesn't seem like this is a series or that the two pages are even duplicates. I'm a bit confused on the intent, but my initial reaction is that rel=prev/next doesn't fit the bill here.
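For reference, when rel=prev/next does apply (a genuine paginated series), it's just two link elements in the head of each page. A sketch for a hypothetical page 2 of a three-page series (example URLs, not the site's):

```html
<!-- On page 2 of a paginated series; page 1 omits rel="prev"
     and the last page omits rel="next". -->
<link rel="prev" href="http://www.example.com/articles?page=1" />
<link rel="next" href="http://www.example.com/articles?page=3" />
```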
-
Since you are managing a multilingual site, it is always recommended to use rel="alternate" hreflang, even if you're redirecting your website.
As for prev/next, don't use it unless you feel it is really required; I couldn't see the need for it here. Maybe I missed something, so could you please be a bit more specific?
-
Thanks Raj! I will for sure rewrite the dynamic URLs as static ones, and that's a starting point. Take for example this page:
http://www.grappa.com/eng/grappa.php/argomento=grappa_in_italy/idsezione=1/idpagina=13
Do you suggest in this case using rel=next/prev?
I thought about using rel="alternate" for the multilingual issue, but right now my site redirects automatically from www.grappa.com to www.grappa.com/eng/index.php. Is that bad for SEO? Should I point rel="canonical" to www.grappa.com?
Many thanks
-
Hey Nicola, ~2,700 warnings is a huge number.
I would suggest you talk to your programmer/developer about rewriting the dynamic URLs as static ones, which I am sure they can easily do.
Second, make sure to delete all the duplicate pages or mark them noindex. Using a 301 for all the duplicate pages is not a bad option, but it's not a permanent solution on its own. It is better to rewrite all the dynamic URLs as static ones, delete all the duplicate pages, and then 301 redirect the deleted pages to the originals.
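As a rough illustration of the 301 step, assuming an Apache server with mod_rewrite enabled (the rule below is a hypothetical sketch based on one of your URLs; the static target path is invented, and the pattern would need adjusting for your real URL scheme):

```apache
# Hypothetical .htaccess sketch: permanently redirect one deleted
# dynamic news URL to its rewritten static equivalent.
RewriteEngine On
RewriteRule ^deu/news\.php/categoria=latest_news/idsottocat=5$ /deu/news/latest-news/ [R=301,L]
```

In practice you'd want one generic rule that maps the whole dynamic pattern to the static scheme rather than thousands of one-off redirects.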
For the multilingual issue you can use rel="alternate" hreflang annotations.
The tag enables you to say, "This page is for Spain; this one is for Germany."
For example, the rel="alternate" hreflang="es" annotation helps Google serve the Spanish-language or regional URL to searchers.
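A sketch of what those annotations could look like in the head of each language version. The /eng/ and /deu/ URLs follow the site's visible pattern, but the exact paths (and the /esp/ one especially) are hypothetical examples:

```html
<!-- Place the full set of annotations on every language version
     of the page, including a self-referencing one. -->
<link rel="alternate" hreflang="en" href="http://www.grappa.com/eng/index.php" />
<link rel="alternate" hreflang="de" href="http://www.grappa.com/deu/index.php" />
<link rel="alternate" hreflang="es" href="http://www.grappa.com/esp/index.php" />
```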