Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Duplicate page titles are the same URL listed twice
-
The system says I have two duplicate page titles. The page titles are identical because the two URLs are identical. The same two URLs also show up in the Duplicate Page Content report, because they are the same page.
We also have a blog, and there are two tag pages showing identical content. I have now blocked the blog in robots.txt, because the blog is only for writers. I suppose I could have just blocked the tag pages.
-
Oh, and the trailing slash of course - thanks for the reminder!
If you think it's a software error you might be best contacting support directly. Have you checked for similar errors in Google and/or Bing Webmaster Tools?
-
Thank you for the fast response, Alex.
They are character-for-character exact matches, both with a trailing slash. All other variants are 301 redirected to that URL, except uppercase/lowercase variants, which return a 404 response.
-
Are you sure the URLs are exactly the same? The following are all different URLs:
- examplesite.com
- www.examplesite.com
- www.examplesite.com/index.html
- www.examplesite.com/Index.html (note the capital 'I')
If they are different, it's best to 301 redirect one to the other, or, if that's not possible, add canonical tags to the two pages pointing to your preferred version. Beyond that, make sure you link to each page with a single, consistent URL throughout your site.
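For reference, consolidating variants like the ones above usually comes down to a rewrite rule plus a canonical tag. A minimal sketch, assuming an Apache server with mod_rewrite and using examplesite.com purely as a stand-in domain:

```apache
# Hypothetical .htaccess sketch: send the bare domain and the
# /index.html variant to the preferred https://www.examplesite.com/ form
RewriteEngine On
RewriteCond %{HTTP_HOST} ^examplesite\.com$ [NC]
RewriteRule ^(.*)$ https://www.examplesite.com/$1 [R=301,L]
RewriteRule ^index\.html$ https://www.examplesite.com/ [R=301,L]
```

Where a redirect isn't possible, a `<link rel="canonical" href="https://www.examplesite.com/" />` tag in the `<head>` of each variant points search engines at the preferred version instead.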
Related Questions
-
URL Length Issue
Moz is telling me the URLs are too long. I did a little research and found that URL length is not really a serious problem; in fact, others recommend ignoring the situation. Even on the Moz blog I found this explanation:

"Shorter URLs are generally preferable. You do not need to take this to the extreme, and if your URL is already less than 50-60 characters, do not worry about it at all. But if you have URLs pushing 100+ characters, there's probably an opportunity to rewrite them and gain value. This is not a direct problem with Google or Bing - the search engines can process long URLs without much trouble. The issue, instead, lies with usability and user experience. Shorter URLs are easier to parse, copy and paste, share on social media, and embed, and while these may all add up to a fractional improvement in sharing or amplification, every tweet, like, share, pin, email, and link matters (either directly or, often, indirectly)."

And yet, I have these questions: in this case, why do I get this error telling me the URLs are too long, and what are the best practices to resolve it? Thank you
Moz Pro | | Cart_generation1 -
Url-delimiter vs. SEO
Hi all, Our customer is building a new website. Its pages are generated by modules, such as a blog page generated by the blog module (not only for blogs, but also for lightboxes). For URL parsing, the programmer uses a URL delimiter, for example /b/ or /s/, so the URLs look like this:
- www.test.ch/de/blog/b/an-article
- www.test.ch/de/s/management-coaching
Does the URL delimiter (/b/ or /s/ in the URL) have a negative influence on SEO? Should we remove the /b/ or /s/ for better SEO performance? Thank you in advance for your feedback. Greetings, Samuel
Moz Pro | | brunoe10 -
My "tag" pages are showing up as duplicate content. Is this harmful?
Hi. I ran a Moz site crawl. I see "Yes" under "Duplicate Page Content" for each of my tag pages. Is this harmful? If so, how do I fix it? This is a WordPress site. Tags are used in both the blog and ecommerce sections of the site; ecommerce is a very small portion. Thank you.
Moz Pro | | dlmilli1 -
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 duplicate page content issues. Most of them come from dynamically generated URLs that have specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages and, to stop it, I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters. I want to do this because among these 380 pages there are some other pages with no parameters (or different parameters) that I need to take care of. Basically, I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few related topics, but there is no clear answer on how to block only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:

User-agent: dotbot
Disallow: /*numberOfStars=0

User-agent: rogerbot
Disallow: /*numberOfStars=0

My questions: 1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact? 2. Do I need an empty line between the two groups (between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot"), or does it even matter? I think this would help many people, as there is no clear answer on how to block crawling of only pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
Moz Pro | | Blacktie0 -
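On question 1, the common wildcard convention (which most modern crawlers follow) anchors a Disallow rule at the start of the URL path and lets * match any run of characters. A small Python sketch of that matching logic (an illustration of the convention, not Moz's actual parser) shows which URLs the rule would catch:

```python
import re

def blocked(path, rule="/*numberOfStars=0"):
    """Would a robots.txt Disallow rule with * wildcards match this path?

    Follows the common convention: the rule is anchored at the start
    of the URL path, and * matches any run of characters.
    """
    pattern = "^" + re.escape(rule).replace(r"\*", ".*")
    return re.search(pattern, path) is not None

blocked("/products?numberOfStars=0&page=2")  # True: parameter present
blocked("/products?numberOfStars=5")         # False: different value
blocked("/products")                         # False: no parameter at all
```

On question 2, a blank line between groups is the conventional record separator in robots.txt, so it is safest to keep it, though most parsers are tolerant either way.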
Tool recommendation for Page Depth?
I'd like to crawl our ecommerce site to see how deep (clicks from the home page) pages are. I want to verify that every category, sub-category, and product detail page is within three clicks of the home page for Googlebot. Suggestions? Thanks!
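If you already have (or can export) the internal-link graph, click depth is just a breadth-first search from the home page. A minimal sketch, with a made-up site structure standing in for real crawl data:

```python
from collections import deque

def click_depths(links, home="/"):
    """Breadth-first search over an internal-link graph.

    links: dict mapping each URL to the URLs it links to.
    Returns {url: minimum number of clicks from the home page}.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:          # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site structure
site = {
    "/": ["/category"],
    "/category": ["/category/sub"],
    "/category/sub": ["/product-1", "/product-2"],
}
depths = click_depths(site)
too_deep = [url for url, d in depths.items() if d > 3]  # flag anything past 3 clicks
```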
Moz Pro | | Garmentory0 -
Remove geographic modifiers from keyword list
I just pulled a search term report for all of 2013 from my PPC account. What I got was 673,000 rows of terms that garnered at least one impression in 2013, which is exactly what I was looking for. My issue is that the vast majority of terms are geo-modified to include the city, the city and state, or the ZIP code. I am trying to remove the geographic information to get a list of root words people are interested in, based on their search query patterns. Does anyone know how to remove all city, state, and ZIP code modifiers quickly, without doing a find-and-replace for each geo-modifier in Excel? For example, if I could get a list of all city and state combinations in the US and a list of all ZIP codes, put that list on a separate tab, and then have a macro find and remove from the original tab any instance of anything on the second tab, that would probably do the trick. Then I could remove duplicates and have my list of root words.
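Outside of Excel, this is straightforward to script. A rough Python sketch, using a tiny made-up geo list where a real implementation would load the full set of US city and state names:

```python
import re

# Hypothetical geo lists; a real run would load every US city/state name
# and rely on the regex below for 5-digit and ZIP+4 codes.
GEO_TERMS = {"chicago", "illinois", "il", "new york", "ny"}
ZIP_RE = re.compile(r"\b\d{5}(?:-\d{4})?\b")

def strip_geo(query):
    """Remove ZIP codes and known city/state terms from a search query."""
    q = ZIP_RE.sub("", query.lower())
    # Longest terms first so multi-word names like "new york" are removed
    # whole; \b word boundaries keep "il" from matching inside other words
    for term in sorted(GEO_TERMS, key=len, reverse=True):
        q = re.sub(r"\b" + re.escape(term) + r"\b", "", q)
    return " ".join(q.split())

strip_geo("plumber chicago il 60601")  # -> "plumber"
```

Run that over the 673,000-row export, then de-duplicate the results to get the root-word list.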
Moz Pro | | dsinger0 -
Is there a tool to upload multiple URLs and gather statistics and page rank?
I was wondering if there is a tool where you can compile a list of URL resources, upload them as a CSV, and run a report that gathers statistics for each individual page. Does anyone know of a tool that can do this, or do we need to create one?
Moz Pro | | Brother220 -
How fast can page authority be grown
I understand that it is easier to rank for a particular keyword given a higher DA score. How fast can page authority be established and grown for a given keyword if DA is 10/20/30/50? What are the relative measures that dictate the establishment and growth of this authority? Can it be enumerated as a percentage of domain links, or a percentage of domain links given an assumed C-block ratio? For example, you have a website with a DA of 40 and you want to target a new keyword; the average PA of the top-ranked pages is 30, the average number of domain links is 1,000, and the average number of linking domains is 250. If you aim to build 1,000 links per month from 500 linking domains, how fast can you approximate the establishment of page authority for the keyword?
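As a very rough illustration only (raw link counts say nothing about link quality, which is what actually drives PA), the numbers in the question can be turned into a back-of-envelope timeline to raw link parity with the top-ranked pages:

```python
# Back-of-envelope estimate using the figures from the question:
# competitors average 1,000 links from 250 linking domains; the plan
# is 1,000 links/month from 500 linking domains/month.
competitor_links, competitor_domains = 1000, 250
links_per_month, domains_per_month = 1000, 500

months_links = competitor_links / links_per_month        # months to match link count
months_domains = competitor_domains / domains_per_month  # months to match domain count
months_to_parity = max(months_links, months_domains)     # slower of the two
```

At that pace the raw counts reach parity in about a month; how quickly PA itself follows depends on link quality and crawl/index lag, which this arithmetic ignores.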
Moz Pro | | NickEubanks0