How to avoid duplicate title tags?
-
I've got roughly 1200 location pages for a travel client. Since the business does the same thing at every location, the title tags and descriptions are almost identical except for the location name. I know Google likes title tags and meta descriptions to be unique, but how many different ways can I write the same title within a 55-character limit? For example, here's how the titles look:
Things to do in San Jose, CA | Company Name
Things to do in Dallas, TX | Company Name
Things to do in Albuquerque, NM | Company Name
**My question: Are 1200 title tags structured this way unique enough for Google?**
I have the same problem with the meta descriptions, but I can vary those a bit more because I have more characters to work with.
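For what it's worth, the templated titles can be sanity-checked mechanically for uniqueness and length; a minimal sketch (the template is taken from the examples above, and 55 characters is the budget from the question, not an official Google limit):

```python
# Sketch: sanity-check a batch of templated title tags for uniqueness
# and length. The template and the 55-character budget come from the
# question above; in practice the city list would hold all 1200 locations.

TITLE_TEMPLATE = "Things to do in {city} | Company Name"
MAX_LENGTH = 55  # rough character budget from the question

def build_titles(cities):
    """Render one title per city and flag duplicates and overruns."""
    titles = [TITLE_TEMPLATE.format(city=city) for city in cities]
    duplicates = len(titles) - len(set(titles))
    too_long = [t for t in titles if len(t) > MAX_LENGTH]
    return titles, duplicates, too_long

cities = ["San Jose, CA", "Dallas, TX", "Albuquerque, NM"]
titles, dupes, overruns = build_titles(cities)
print(dupes)      # 0 -> every title is unique as long as city names differ
print(overruns)   # [] -> none of these titles blow the 55-character budget
```

As long as the city names differ, the titles are literally unique; the check above just confirms there are no accidental collisions (e.g. two branches in the same city) or overlong titles.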
Thanks for your input,
Dino -
Thanks Anthony and Miriam, this site doesn't really have a lot of unique textual content on the pages. It's sort of a portal site where consumers can download travel brochures in each city. I thought Googlebots might read the pages as duplicate because the pages are mostly images and links. There's a one or two sentence description for each page (brochure) as well as a physical address (NAP), but that's about it for text.
My thought was that since it's all images and links, Google would have very little to crawl and would therefore consider all the pages duplicates, but from what I can tell the Moz crawl is not flagging them as duplicate pages. Do you think that, since the pages are light on textual content, a duplicate content issue could pose a problem?
-
Hi Dino!
Agree with Anthony that those are unique and that the main concern here is to be certain that your content on those pages is unique. Definitely a challenge with 1200 pages, but fortunately, there should be totally unique things to do in each of those cities, so it's quite possible to manage this.
-
The title tags in this example are unique and are fine.
The bigger concern would be whether those pages all have unique, valuable, non-thin content, as opposed to being created purely to rank for a variety of keywords.
Related Questions
-
How to check if the website has duplicate content?
I've been working with websites for a couple of months, and I've always wondered whether there is a legitimate way to find out if a website has duplicate content. I've tried a couple of tools I found through Google, but nothing worked for me. It would be much appreciated if anyone can help. Thanks
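As a starting point, a rough check can be scripted with nothing but the standard library by comparing the visible text of pages pairwise; a minimal sketch (the URLs and text are made up, and the 0.8 threshold is an arbitrary assumption to tune per site):

```python
# Sketch: flag near-duplicate pages by comparing their visible text.
# The 0.8 similarity threshold is an arbitrary assumption; tune it for
# your site. Real page text would come from a crawl, not from literals.
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicates(pages, threshold=0.8):
    """Return URL pairs whose text similarity meets the threshold."""
    flagged = []
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio >= threshold:
            flagged.append((url_a, url_b, round(ratio, 2)))
    return flagged

pages = {
    "/page-a": "Download our free travel brochure for San Jose today.",
    "/page-b": "Download our free travel brochure for Dallas today.",
    "/page-c": "Completely different editorial content about hiking trails.",
}
print(near_duplicates(pages))  # only /page-a and /page-b are flagged
```

This is a blunt instrument (it ignores boilerplate navigation and compares raw strings), but it is enough to surface templated pages that differ only by a word or two.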
Web Design | | rajveer_singh0 -
Bing Indexation and handling of X-ROBOTS tag or AngularJS
Hi MozCommunity, I have been tearing my hair out trying to figure out why Bing won't index a test site we're running. We're in the midst of upgrading one of our sites from archaic technology and infrastructure to a fully responsive version.
Web Design | | AU-SEO
This new site is a fully AngularJS-driven site. There are currently over 2 million pages, and as we develop the new site in the backend, we would like to test out the tech with Google and Bing. We're looking at a pre-render option to create static HTML snapshots of the pages we care about most, which will be available in the sitemap.xml.gz. We set up 3 completely static HTML control pages: one with no robots meta tag on the page, one with the robots NOINDEX meta tag in the head section, and one with a dynamic header (X-ROBOTS meta) carrying the same NOINDEX directive. We expected the one without the meta tag to at least get indexed, along with the homepage of the test site. In addition to those 3 control pages, we had 3 more pages: an internal search results page with the dynamic NOINDEX header, a listing page with no such header, and the homepage with no such header. With Google, the correct indexation occurred, with only 3 pages being indexed: the homepage, the listing page, and the control page without the meta tag. With Bing, however, there's nothing. No page indexed at all. Not even the flat static HTML page without any robots directive. I have a valid sitemap.xml file and a robots.txt directive open to all engines across all pages, yet nothing. I used the Fetch as Bingbot tool, the SEO Analyzer tool, and the Page Preview tool within Bing Webmaster Tools, and they all show a preview of the requested pages, including the ones with the dynamic header asking Bing not to index them. I'm stumped. I don't know what to do next to work out whether Bing can correctly process dynamic headers or AngularJS content. Checking BWT, there has definitely been crawl activity, since it marked the XML sitemap as successful and put a 4 next to the number of crawled pages. Still no results when running a site: command, though.
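For anyone reproducing this control-page setup, the indexability decision both engines are supposed to make can be expressed as a small check; a minimal sketch (the header and markup values here are invented for illustration, not taken from the actual test site):

```python
# Sketch: decide whether a page is blocked from indexing, mirroring the
# check a crawler makes against both the X-Robots-Tag HTTP header and
# the robots meta tag. Values below are invented for illustration; in
# practice the headers/html would come from an HTTP fetch of each page.
import re

def noindexed(headers, html):
    """True if either the X-Robots-Tag header or a robots meta tag
    carries a noindex directive (case-insensitive)."""
    header_val = headers.get("X-Robots-Tag", "")
    if "noindex" in header_val.lower():
        return True
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return bool(meta and "noindex" in meta.group(1).lower())

# The three control-page shapes described above:
print(noindexed({}, "<html><head></head></html>"))                        # no directive
print(noindexed({}, '<head><meta name="robots" content="noindex"></head>'))
print(noindexed({"X-Robots-Tag": "noindex"}, "<html></html>"))
```

If Bing's preview tools render the pages but nothing appears for a site: query, the directives themselves are probably not the blocker, since the control page with no directive at all should have been indexed either way.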
Google responded perfectly and understood exactly which pages to index and crawl. Has anyone else used dynamic headers or AngularJS who might be able to chime in, perhaps with results from similar tests? Thanks in advance for your assistance!
0 -
Redirects Not Working / Issue with Duplicate Page Titles
Hi all, We are being penalised in Webmaster Tools and Crawl Diagnostics for duplicate page titles and I'm not sure how to fix it. We recently switched from HTTP to HTTPS, but when we first switched over, we accidentally set a permanent redirect from HTTPS to HTTP for a week or so(!). We now have a permanent redirect going the other way, HTTP to HTTPS, and we also have canonical tags in place pointing to HTTPS. Unfortunately, it seems that because of this short time with the permanent redirect the wrong way round, Google is confused and sees our HTTP and HTTPS sites as duplicate content. Is there any way to get Google to recognise this new (correct) permanent redirect and completely forget the old (incorrect) one? Any ideas welcome!
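In practice the old redirect is simply superseded once Google recrawls and consistently sees the new 301s plus the canonicals; there is no way to force it beyond requesting a recrawl. The rule itself can be sanity-checked as a pure function before it goes into server config; a sketch (example.com is a placeholder domain):

```python
# Sketch: the redirect rule the site needs, expressed as a pure function
# so it can be unit-tested before touching server config. The domain is
# a placeholder, not the asker's real site.
from urllib.parse import urlsplit, urlunsplit

def redirect_target(url):
    """Return (status, location) for an incoming request: 301 every
    http:// URL to its https:// twin, and leave https:// URLs alone."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        return 301, urlunsplit(("https",) + tuple(parts[1:]))
    return 200, url

print(redirect_target("http://example.com/page?a=1"))
print(redirect_target("https://example.com/page"))
```

The key property to verify is that no URL ever 301s back to HTTP, i.e. the week-old mistake can never recur, and that query strings survive the hop.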
Web Design | | HireSpace0 -
Duplicate Product Descriptions for Each Variant
Hi, I am setting up a Shopify e-commerce store and I have a question about duplicate product descriptions. I have written unique product descriptions for all our products. Each product has at least 10 color options. I am thinking that it would look better if I created each color variant as a unique product, i.e. store.com/nice-shirt-blue, store.com/nice-shirt-red, etc. Here is the kicker: would I be penalized for using the same product description for each product type?
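One common pattern for this situation is to keep the per-color URLs but point each variant's rel=canonical at a single parent product, so the shared description is consolidated rather than competing with itself. A minimal sketch (the URL shapes follow the question's examples; the helper is illustrative, not Shopify's API):

```python
# Sketch: generate the canonical tag each color-variant page should
# carry, all pointing at one parent product URL. The store.com base and
# URL shapes mirror the question's examples and are illustrative only.

def canonical_tags(parent_path, colors, base="https://store.com"):
    """Map each color-variant URL to the canonical tag it should carry,
    every one pointing at the parent product."""
    tag = f'<link rel="canonical" href="{base}{parent_path}">'
    return {f"{parent_path}-{color}": tag for color in colors}

tags = canonical_tags("/nice-shirt", ["blue", "red"])
print(tags["/nice-shirt-blue"])  # <link rel="canonical" href="https://store.com/nice-shirt">
```

With this setup the duplicated descriptions are unlikely to be treated as a penalty-worthy problem, since the canonical tells search engines which copy to index.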
Web Design | | Jon_B0 -
Duplicate Content? Designing new site, but all content got indexed on developer's sandbox
An ecommerce site I'm helping is getting a complete redesign. Their developer had a sandbox version of the new site for design and testing. Several thousand products were loaded into the sandbox site. Then Google/Bing crawled and indexed it (because the developer didn't have a robots.txt), picking up and caching about 7,200 pages. There were even 2-3 orders placed on the sandbox site, so people were finding it. So what happens now?
Web Design | | trafficmotion
When the sandbox site is transferred to the final version on the proper domain, is there a duplicate content issue?
How can the developer fix this?
0 -
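A common cleanup for the sandbox situation above is to 301 every indexed sandbox URL to its counterpart on the proper domain once the real site launches, and to keep the sandbox behind noindex or authentication going forward. A sketch of the URL mapping (the hostnames are placeholders, not the actual domains involved):

```python
# Sketch: map each indexed sandbox URL to its counterpart on the live
# domain so a blanket 301 can be set up. Hostnames are placeholders.
from urllib.parse import urlsplit, urlunsplit

SANDBOX = "sandbox.example.dev"
LIVE = "www.example.com"

def live_equivalent(sandbox_url):
    """Rewrite a sandbox URL to the matching live URL (same path and
    query), or return None for URLs that aren't on the sandbox host."""
    parts = urlsplit(sandbox_url)
    if parts.netloc != SANDBOX:
        return None  # not a sandbox URL; nothing to redirect
    return urlunsplit(("https", LIVE) + tuple(parts[2:]))

print(live_equivalent("http://sandbox.example.dev/product/123?ref=a"))
```

Because the paths match one-to-one after a straight content transfer, the redirects also pass along whatever signals the 7,200 cached sandbox pages accumulated, and the duplicate-content window closes as the engines recrawl.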
Am I pigeonholing myself with geo-targeted titles?
I've got a site that ranks very well in my local area for mobile app development related keywords. We use terms like "NYC" and "New York" in our title tags. However, I do not believe that these are NORMALLY "local" geo keywords. The reason I believe this is that my competitors rank all over the USA (and even in Europe) for these key terms, but we only rank in our local area of New York. Is it possible that by including geographic terms like NYC and New York, we are actually HURTING our rankings in other cities like Los Angeles and Chicago? If we removed these words, could we see ranking increases in other parts of the world? The other side of the coin is that if we remove the "NYC" and "New York" key terms, could we see serious drops in the local area as a result?
Web Design | | Fueled0 -
Google Tag Manager
I recently discovered Google Tag Manager and I am in the process of updating many of my websites with it. I am using Tag Manager to manage Google Analytics, Google Remarketing, Alive Chat, Woopra, etc. I have one question about how Tag Manager actually works. As best I can tell, the Tag Manager code snippet that I insert into my web pages is the same for all my websites and does not include a unique ID. If that is the case, then Tag Manager must search all the URLs in the TM database to find a match. What is to stop someone else from adding rules for my URLs to their containers? I expect Google has a method to ensure proper matching, but I'm not clear on how that is enforced. Best,
Web Design | | ChristopherGlaeser
Christopher0 -
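One point worth double-checking here: the standard GTM installation snippet does carry a unique container ID, the GTM-XXXXXXX value passed as the last argument to the loader, which is how Google ties a page to a specific container rather than matching by URL. A sketch that pulls it out (the snippet text is abbreviated and the ID is made up):

```python
# Sketch: extract the container ID from a GTM installation snippet.
# SNIPPET is an abbreviated, illustrative form of the standard loader;
# the GTM-ABC1234 ID is made up for this example.
import re

SNIPPET = (
    "(function(w,d,s,l,i){/* loader code fetches gtm.js?id=i */})"
    "(window,document,'script','dataLayer','GTM-ABC1234');"
)

def container_id(snippet):
    """Pull the GTM container ID out of an installation snippet."""
    match = re.search(r"GTM-[A-Z0-9]+", snippet)
    return match.group(0) if match else None

print(container_id(SNIPPET))  # GTM-ABC1234
```

So if two of your sites appear to share an identical snippet, it is worth diffing them for that trailing ID; a snippet with someone else's ID would load their container, not yours.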
Penalized by duplicate content?
Hello, I am in a very weird position. I am managing a website (EMD), part of which dynamically creates pages. The former webmaster who created this system thought it would help with SEO, but I have my doubts! The thing is that the site now has about 1500 pages which must look duplicate, but are they really duplicates? Each page has a unique URL, but the content is pretty much the same: one image and a different title of 5-8 words. There is more: all these pages are not accessible to users, only to crawlers!!! This URL machine is part of a PHP-made photo gallery whose purpose I never understood! The site overall is not performing very well in the SERPs, especially after Penguin, but judging by the link profile, Domain Authority, construction (OK, besides that crazy photo gallery) and content, it has never reached the positions it should have. The majority of these mysterious pages, and mostly their images, are cached by Google, and some of them are in top places for some SERPs (the ones that match the small on-page title), but the numbers are poor: 10-15 clicks per month. Are these pages considered duplicates even though they are cached, and is it safe for the site to remove all 1500 at once? The SEOmoz tools have flagged some of them as dupes, but not the majority! Can these pages affect the image of the whole site in search engines? (A drop in Google, and the site has disappeared from Yahoo and Bing!) Do I also have to tell Google about the removal? I have not seen anything like this before, so any comment would be helpful! Thank you!
Web Design | | Tz_Seo0