Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Duplicate content issue with ?utm_source=rss&utm_medium=rss&utm_campaign=
-
Hello,
Recently, I was checking how my site content is getting indexed in Google, and today I noticed two links indexed for the same article.
This is the proper link: https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/
But I don't know why this URL was indexed: https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/?utm_source=rss&utm_medium=rss&utm_campaign=hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims
Could you please tell me how to solve this issue? Thank you
-
Hi @Dinsh007!
I usually exclude such parameterised pages in the robots.txt file.
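For what it's worth, the rule this suggestion implies would use a wildcard such as `Disallow: /*?utm_`. One caveat: robots.txt only stops crawling, and Google can still index a blocked URL if it is linked from elsewhere, so a canonical tag is often the safer fix. The sketch below, in Python, mimics how major crawlers document their `*` and `$` wildcard matching; the helper name is mine, not part of any library:

```python
import re

# A robots.txt rule like "Disallow: /*?utm_" blocks any URL whose
# path-plus-query contains "?utm_". This helper imitates the wildcard
# semantics Google documents for robots.txt rules: "*" matches any
# sequence of characters, and a trailing "$" anchors to the URL's end.
def rule_matches(rule: str, path_and_query: str) -> bool:
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"  # turn the literal "$" back into an anchor
    return re.match(pattern, path_and_query) is not None

print(rule_matches("/*?utm_", "/my-article/?utm_source=rss&utm_medium=rss"))  # True
print(rule_matches("/*?utm_", "/my-article/"))                                # False
```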
-
We are also having the same issue with UTM tags in the SERPs:
?utm_source=rss&utm_medium=rss&utm_campaign=some-article-title
We have canonicals in place, but Google still decides to show these. That would be acceptable if the links came from real external sources, but we can't find those sources: we've checked Search Console and searched Ahrefs, and found nothing.
We are using RSS to feed Google Publisher Center, but the feed itself does not carry UTM values on its links.
We are close to disabling our global website feed just to get rid of non-canonical links in the SERPs.
-
Thank you very much for so many solutions. I will implement them if I see any issues again, and I'm also bookmarking this page.
Regards
Dinesh Singh -
Yoast SEO should do the job for WordPress; it allows you to define the canonical URL on each of your pages/posts:
https://yoast.com/help/canonical-urls-in-wordpress-seo/
Another option which I usually find effective, and which I initially skipped over, is the URL Parameters section in Google Search Console. You'll find it in the legacy tools section, and you can set these parameters to "No: Doesn't affect page content", which is the option for tracking-code parameters.
-
Hello Paddy,
Thank you very much for your reply.
It seems that the issue is gone now, but if it happens again in the future, is there any WordPress plugin to resolve it, instead of manually messing with the code? Thank you.
-
Hi there,
This is quite a common issue. It happens because, technically, the addition of those parameters at the end of the URL means it's a new URL from Google's perspective. These kinds of parameters are very common, and Google often figures this out, drops the duplicate URLs from its index, and focuses on the correct version. However, it doesn't always do this, and it may not do it as quickly as you'd like.
One way to deal with this is to add a rel=canonical tag to any duplicate versions of the URL and have that tag point back to the correct URL. Based on your question, the canonical tag would look like this:
<link rel="canonical" href="https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/" />
This would go into the <head> section of this URL (and any other duplicates):
https://techplusgame.com/hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims/?utm_source=rss&utm_medium=rss&utm_campaign=hideo-kojima-not-interested-in-new-silent-hills-revival-insider-claims
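The canonical target is simply the parameterised URL with its tracking parameters removed. If canonical tags are generated programmatically (say, in a template or plugin), the stripping step can be sketched like this in Python; the helper name and the list of tracked prefixes are my own choices, not part of any particular CMS:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

TRACKING_PREFIXES = ("utm_",)  # extend with e.g. "gclid", "fbclid" as needed

def canonical_url(url: str) -> str:
    """Return the URL with tracking query parameters stripped."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.startswith(TRACKING_PREFIXES)]
    # Drop the fragment as well; fragments never reach the server.
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url(
    "https://techplusgame.com/some-article/"
    "?utm_source=rss&utm_medium=rss&utm_campaign=some-article"))
# https://techplusgame.com/some-article/
```

Non-tracking parameters (pagination, filters) survive the strip, so the same function can feed the `href` of the canonical tag on every variant of a page.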
Hope that helps!
Paddy
Related Questions
-
Duplicate content, although page has "noindex"
Hello, I had an issue with some pages being listed as duplicate content in my weekly Moz report. I've since discussed it with my web dev team and we decided to stop the pages from being crawled. The web dev team added this coding to the pages <meta name='robots' content='max-image-preview:large, noindex dofollow' />, but the Moz report is still reporting the pages as duplicate content. Note from the developer "So as far as I can see we've added robots to prevent the issue but maybe there is some subtle change that's needed here. You could check in Google Search Console to see how its seeing this content or you could ask Moz why they are still reporting this and see if we've missed something?" Any help much appreciated!
Technical SEO | Jun 9, 2022, 2:29 PM | rj_dale -
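A note on the tag quoted in this question: "dofollow" is not a recognised robots directive (following links is the default, and the valid token is "follow"), so a standard form of that meta tag would be:

```html
<meta name="robots" content="max-image-preview:large, noindex, follow" />
```

Even with a correct tag, a crawler only honours it on the next recrawl, and only if the page is not also blocked in robots.txt, which would prevent the tag from being seen at all.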
Duplicate Content Issues with Pagination
Hi Moz Community, We're an eCommerce site, so we have a lot of pagination issues, but we were able to fix them using the rel=next and rel=prev tags. However, our pages have an option to view 60 items or 180 items at a time. This is now causing duplicate content problems when, for example, page 2 of the 180-item view is the same as page 4 of the 60-item view (URL examples below). Wondering if we should just add a canonical tag pointing to the main view-all page on every page in the paginated series to get rid of this issue. https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2 https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4 Thoughts, ideas or suggestions are welcome. Thanks
Technical SEO | Jul 10, 2017, 12:31 PM | znotes -
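If the consolidation route described in this question were taken, each paginated variant would carry the same canonical tag. A sketch using the URLs from the question, and assuming the view-all page lives at the `view=all` URL without pagination parameters (the question doesn't state this):

```html
<!-- In the <head> of /gifts/for-the-couple?view=all&n=180&p=2
     and every other paginated variant: -->
<link rel="canonical" href="https://www.example.com/gifts/for-the-couple?view=all" />
```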
Duplicate content on job sites
Hi, I have a question regarding job boards. Many job advertisers will upload the same job description to multiple websites e.g. monster, gumtree, etc. This would therefore be viewed as duplicate content. What is the best way to handle this if we want to ensure our particular site ranks well? Thanks in advance for the help. H
Technical SEO | Apr 30, 2015, 12:57 PM | HiteshP -
How to deal with duplicated content on product pages?
Hi, I have a webshop with products in different sizes and colours. Each item has a different URL with almost the same content (title tag, product description, etc). To prevent duplicate content, I'm wondering what the best way to solve this problem is, keeping in mind: -It's impossible to create one page/URL per product with filters on colour and size -It's impossible to rewrite the product descriptions to make them unique I'm considering canonicalizing the rest of the colour/size variations, but the disadvantage is that if a product goes out of stock it disappears from the website. Looking forward to your opinions and solutions. Jeroen
Technical SEO | Mar 16, 2015, 11:43 AM | Digital-DMG -
Duplicate Content
We have a ton of duplicate content/title errors on our reports, many of them showing errors of: http://www.mysite.com/(page title) and http://mysite.com/(page title) Our site has been set up so that mysite.com 301 redirects to www.mysite.com (we did this a couple years ago). Is it possible that I set up my campaign the wrong way in SEOMoz? I'm thinking it must be a user error when I set up the campaign since we already have the 301 Redirect. Any advice is appreciated!
Technical SEO | Mar 21, 2013, 10:22 PM | Ditigal_Taylor -
Duplicate content and http and https
Within my Moz crawl report, I have a ton of duplicate content caused by identical http and https versions of the same pages. For example: http://www.bigcompany.com/accomodations https://www.bigcompany.com/accomodations The strange thing is that 99% of these URLs are not sensitive in nature and do not require any security features. No credit card information, booking, or carts. The web developer cannot explain where these extra URLs came from or provide any further information. Advice or suggestions are welcome! How do I solve this issue? THANKS MOZZERS
Technical SEO | May 28, 2018, 11:47 PM | hawkvt10 -
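The usual fix for the http/https duplication described in this question is a site-wide 301 redirect to the https version, which consolidates the duplicates at the crawl level. A minimal sketch, assuming an Apache server with mod_rewrite (the equivalent exists for nginx and IIS):

```apache
# .htaccess: redirect every http request to its https counterpart.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Pairing the redirect with self-referencing canonical tags on the https pages covers both the crawler and any stray links to the http versions.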
Block Quotes and Citations for duplicate content
I've been reading about the proper use for block quotes and citations lately, and wanted to see if I was interpreting it the right way. This is what I read: http://www.pitstopmedia.com/sem/blockquote-cite-q-tags-seo So basically my question is, if I wanted to reference Amazon or another stores product reviews, could I use the block quote and citation tags around their content so it doesn't look like duplicate content? I think it would be great for my visitors, but also to the source as I am giving them credit. It would also be a good source to link to on my products pages, as I am not competing with the manufacturer for sales. I could also do this for product information right from the manufacturer. I want to do this for a contact lens site. I'd like to use Acuvue's reviews from their website, as well as some of their product descriptions. Of course I have my own user reviews and content for each product on my website, but I think some official copy could do well. Would this be the best method? Is this how Rottentomatoes.com does it? On every movie page they have 2-3 sentences from 50 or so reviews, and not much unique content of their own. Cheers, Vinnie
Technical SEO | Feb 22, 2012, 5:29 PM | vforvinnie -
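The markup pattern discussed in this question looks like the sketch below; the quoted text and URL are placeholders. Note that the `cite` attribute is not shown to users and search engines make no guarantee that quoted text is discounted as duplicate, so a visible source link alongside the quote is the safer habit:

```html
<blockquote cite="https://www.example.com/source-review">
  <p>Quoted review text from the source site.</p>
</blockquote>
<p>Source: <a href="https://www.example.com/source-review">Example source</a></p>
```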
Duplicate Content issue
I have been asked to review an old website and identify opportunities for increasing search engine traffic. Whilst reviewing the site I came across a strange loop. On each page there is a link to a printer-friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes That page also has a link to a printer-friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes&printfriendly=yes and so on and so on....... Some of these pages are being included in Google's index. I appreciate that this can't be a good thing; however, I am not 100% sure of the extent to which it is a bad thing, or the priority that should be given to getting it sorted. Just wondering what views people have on the issues this may cause?
Technical SEO | Jun 15, 2011, 9:01 AM | CPLDistribution