Duplicate Page Title
-
Virtually all of my pages are coming up with a "Duplicate Page Title" error even though the page titles are different. I assume this is down to the end of the page title having the company name. Is this the reason, and is it a problem to have a page title like the one below?
"Page title description - Company Name"
-
Hi Justin
I have now resolved the duplicate page titles as mentioned.
However, I am still getting lots of "duplicate page content" errors on all my WordPress tag pages. Any ideas why, and how to resolve this?
Thanks
Pete
-
Facepalm. Such a good call. Can't believe I didn't think of that.
-
It isn't possible to change from a root domain to a subdomain; the only way of doing this is to create a new campaign. You will, however, start from scratch on the historical data of rankings etc.
I would strongly recommend you keep the campaign set up as is and use 301s to resolve the duplicate issues; if you don't, your rankings are likely to be hurt by duplicate content.
I'm sure you are aware already, but if not, here is the info for setting up 301s.
For Apache:
http://www.isitebuild.com/301-redirect.htm
For Windows:
http://thatsit.com.au/seo/tutorials/how-to-fix-canonical-domain-name-issues
As I say, you really should fix this issue as opposed to working around it.
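For what it's worth, here is a minimal sketch of the non-www to www redirect on Apache (assuming mod_rewrite is enabled; domain.com is a placeholder for your own domain):

```apache
# .htaccess: 301 any request for domain.com to the www version,
# preserving the requested path
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

Check it with something like `curl -I http://domain.com/` and confirm you get a `301` with a `Location: http://www.domain.com/` header.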
-
Thanks for the reply. You're right. I can't see the option in my settings to change from a root domain to a subdomain. Any ideas?
-
I assume here that you are referring to a "Duplicate Page Title" error showing up in the SEOmoz web app?
The most likely issue is that you have set your campaign up as a root domain (i.e. with no www prefix). Root domains are fine; however, if the links in your site point to both www.domain.com and domain.com, this will show as a duplicate page.
This behaviour is by design in SEOmoz, as Google will also be seeing your pages as duplicates, which will most likely harm your SERPs.
The best solution to this problem is to 301 or rel=canonical to the preferred domain.
i.e. if a user types in domain.com, you 301 to www.domain.com.
Make sure you do this for all domains/subdomains you have pointing to your site, and add the trailing slash to the domain (i.e. http://www.domain.com/), as this gets added anyway.
This will stop the pages reporting as duplicates; Google will also only see one version of your pages, which should give you some rankings bonuses.
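If server-side redirects aren't an option, here is a sketch of the rel=canonical alternative (domain.com and /page are placeholders): the tag goes in the head of every duplicate version of a page and points at the one preferred URL.

```html
<!-- In the <head> of both http://domain.com/page and http://www.domain.com/page -->
<link rel="canonical" href="http://www.domain.com/page" />
```

Note that a 301 is still the stronger signal; rel=canonical is a hint rather than a directive.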
Hope that helps
Justin
-
Depends on the tool you are using, but I would expect "duplicate page title" to mean an exact match on the page title...
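To illustrate what "exact match" would mean in practice, here is a hypothetical sketch of how such a check might work (the crawl data and function name are made up for illustration, not how any particular tool actually does it):

```python
# Hypothetical duplicate-title check: group crawled pages by exact title
from collections import defaultdict

def find_duplicate_titles(pages):
    """pages: list of (url, title) tuples from a crawl."""
    by_title = defaultdict(list)
    for url, title in pages:
        # Only an exact (whitespace-trimmed) match counts as a duplicate
        by_title[title.strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = [
    ("http://www.example.com/", "Home - Company Name"),
    ("http://example.com/", "Home - Company Name"),
    ("http://www.example.com/about", "About - Company Name"),
]
print(find_duplicate_titles(pages))
```

Note how the www and non-www versions of the home page flag as duplicates even though every title on the site is unique, which is exactly the root-domain situation described above.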
-
Where are you seeing this error appear?
And no, it's unlikely that your page titles all being appended with your company name would flag duplicate page titles.
If your website is dynamic, it is possible that your page titles are being rewritten, but something is delaying the rewrite, so any bot/crawler is detecting them as duplicates... possibly.
-
Thanks, but I'm still unsure.
Does "Duplicate Page Title" mean the whole page title is used on another page, or can it be that just certain words are repeated? I am getting lots of errors even though the page titles are different.
-
If the words are different, it is highly unlikely that this is the issue. It might be possible if the titles were (for example) "Sugar - Company Name", "Sugars - Company Name", "Sugary - Company Name".
But even then it would be a stretch. Are you sure that your website platform hasn't dynamically created mirror pages? This happened to us in Magento recently. The developer didn't know any better and we had to set him straight.