Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
What are best page titles for sub-folders or sub-directories? Same as website?
-
Hi all,
We always mention "brand & keyword" in every page title, along with the topic, like "Topic | vertigo tiles".
Let's say there is a sub-directory with hundreds of pages... what is the best practice for mentioning "brand & keyword" across all pages of the sub-directory to benefit in terms of SEO?
Can we add "vertigo tiles" to all pages of the sub-directory, or should we avoid repeating the same phrase?
Thanks,
-
VTCRM,
Good luck!
-- Jewel
-
Thanks Jewel,
So we'll stick with this and get back to you with any other clarifications.
-
VTCRM,
I'm glad my response helped you.
To my eyes, without looking at keyword rankings, etc., the middle one looks like the most natural language version.
Good luck, and feel free to ping me if I can provide any additional ideas.
-- Jewel
-
I think it's okay to go with it, as I've noticed many in our industry practising the same. And I feel the "brand & keyword" suffix is not going to hurt; if it were, it would be hurting every page of the website, since the same suffix appears across all page titles. I think the topic name will play the key role, so we probably won't have duplicate content issues. Our new sub-directory is a help guide, and I am planning to add "help" and choose one of the formats below.
Topic | help - vertigo tile
Topic | vertigo tiles help
Topic | vertigo tiles - help
-
Hi Jewel,
Thanks for such a descriptive answer; it explains a lot. Rather than worrying about getting penalised, I'd like to work out which way of using the brand and main keyword across these page titles helps most in terms of SEO. Our sub-directory is all about help guides, so I have decided to go with our brand name and keyword, as per your suggestion, with high confidence. I also need to add "help", so I'm now trying to find the most natural-looking of the options below:
Topic | help - vertigo tile
Topic | vertigo tiles help
Topic | vertigo tiles - help
-
Hello VTCRM,
This is a tough call, because it is a branding-versus-SEO issue. The convention is to put the website's name on all pages. However, you are correct to be concerned about duplication and "too much".
I decided to poke around on some big websites, where I know they have usability experts and ought to have the $$$ for high-quality SEO. It looks like the convention is to have the name in there, either as a repeated tagline, the company name, or as part of the product name.
Target uses SquareSpace, AFAIK, so even with customization, repeating the tagline may be a requirement of the platform. But having used SquareSpace, I'd say it is probably their choice, as they have the programmers to change that.
I looked at Home Depot, and they do use their name in the product title. I also examined Nike. They integrate the name into the product name, so it is not tagged on at the end.
My advice, then, would be to follow the convention and add the name to the title. I think Google's search engine has been programmed well enough to tell a brand name from spamming.
Nike's way of integrating the name into the product is the one that stands out to me as a potential spam flag. However, again, I think search engines ought to be able to pick apart a site or product name from spamming.
I think if you stick to convention and do "Topic | vertigo tiles", you'll be all right. As don_quixote pointed out, removing the standard branding name from the title does give you more room for other keywords. I agree with him that you should think through your navigation carefully, as you are doing, and that includes the page names ==> URL/slug names (the overall Information Architecture).
To summarize: do I think you'll be penalized for following the web convention of the past 20 years and tacking your brand name/website name onto the title? No.
Then your question will be, do you want to do this?
It sounds like you do, but you are hesitant because of fears of a duplication penalty. I don't think you need to worry about that, especially given these big sites are doing it.
The other aspect of information retrieval is the placement of one term or phrase near another, which creates associations and helps findability. Associating "product X women's tennis shoes" with "Nike" is a genuine association.
I think you'll be fine to add that name to the title, assuming you don't want the real estate for other keywords. IOW, I see no reason why you would be penalized. (And if you are, contact me, and I'll help you fix it on my own time!)
Me? I tend to follow convention in that regard. I'll buck convention in other areas, but you ought to be fine. (If it matters, I started building websites in 1995, I have worked with CMS systems for years, and I have yet to be penalized.)
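To make the title/real-estate trade-off concrete, here is a minimal sketch (hypothetical helper, with 60 characters used as a rough stand-in for the pixel-width display limit mentioned elsewhere in this thread) of composing "Topic | brand" titles and dropping the shared suffix only when the topic alone eats the budget:

```python
MAX_CHARS = 60  # rough character proxy for the title display width limit

def build_title(topic: str, brand_suffix: str, max_chars: int = MAX_CHARS) -> str:
    """Compose a 'Topic | brand' title; drop the repeated suffix if over budget."""
    title = f"{topic} | {brand_suffix}"
    if len(title) <= max_chars:
        return title
    # The unique topic is what differentiates the page; the repeated
    # brand suffix is the expendable part when space runs out.
    return topic[:max_chars]
```

With this approach every page keeps a unique lead, and the shared suffix only appears when there is room for it.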
-- Jewel
-
Without going into how to technically achieve the outcome, it may be beneficial to go back one step and consider drawing up the URL structure. Lay out the keyword(s) being targeted for the global home page and then for the first sub-folders. The URL structure, when laid out with keywords, should provide guidance on the layout of titles and H1s. We often take out the company name/brand when required and use the 600 pixels available to optimise the page. This allows more individual title tags for search and for customers. That is, your client will likely rank No. 1-3 for their brand anyway, so brand-labelling inner pages is unlikely to assist brand ranking unless it benefits the customer experience.
You may only want to index some of the sub-directory pages as well, and rel=canonical the juice back to the header page.
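For the rel=canonical approach, a sketch of what a thin or filtered sub-directory page might carry in its `<head>` (the URL is hypothetical):

```html
<!-- On a page you don't need indexed in its own right, point
     consolidated signals at the main section ("header") page. -->
<link rel="canonical" href="https://www.example.com/help/" />
```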
Anyway, I like to go back to the URL structure, and I find that when I get that right, everything flows easily from there.
So in answer to your question: no, I would not recommend putting "vertigo tiles" on every page of the sub-folder. I would make sure each page has a unique, relevant title, and a closely (though not exactly) matching H1 for the page content. I'd add that I see "black vertigo tiles" as different from "white vertigo tiles".
Hope that assists.
-
Hi Jewel,
Our website is WordPress, and yes, it auto-generates our company name and main keyword at the end of every page title. This is good, because we have our targeting keywords and brand on all pages.
Our sub-directory is a different CMS; it's hosted independently with its own design, and its titles will also be auto-generated. My doubt is whether repeating the same "company name and keyword" in all page titles of this sub-directory is good or bad. Will this look like duplication to Google, or will it help us in the keyword scenario?
Thanks
-
What website platform/CMS are you using? Does it auto-generate your website name to either the front or rear of the page title? For example, as WordPress does? Or, is this something you can suppress, which I believe SquareSpace allows (but don't quote me on that).
Related Questions
-
Multiple Markups on The Same Page - Best Solution?
Hi there! I have a website built in React (JavaScript), and I'm trying to use markup on my pages. They are mostly articles about general topics with common questions (about the topic), and for most articles I would like to use two markups: article markup + FAQ markup (for the questions in the article), or article markup + how-to markup. Can I do this, or will Google get confused, since I have two @type values at the same time, for example "@type": "FAQPage" and "@type": "Article"? How should I think about it? I'm using https://schema.dev/ right now. Thanks!
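One way this is commonly handled (a sketch, not schema.dev's own output; treat the URLs and field values as placeholders) is to put both types as separate nodes in a single JSON-LD `@graph`, rather than giving one node two `@type` values:

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Article",
      "@id": "https://www.example.com/article#article",
      "headline": "Example article headline"
    },
    {
      "@type": "FAQPage",
      "@id": "https://www.example.com/article#faq",
      "mainEntity": [
        {
          "@type": "Question",
          "name": "Example question?",
          "acceptedAnswer": { "@type": "Answer", "text": "Example answer." }
        }
      ]
    }
  ]
}
```

Each node then stands on its own, so a parser never has to reconcile conflicting required properties on one entity.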
Intermediate & Advanced SEO | Leowa0
-
Should we 301 redirect old events pages on a website?
We have a client that has an events category section that is filled to the brim with past events webpages. Another issue is that these old events webpages all contain duplicate meta description tags, so we are concerned that Google might be penalizing our client's website for this issue. Our client does not want to create specialized meta description tags for these old events pages. Would it be a good idea to 301 redirect these old events landing pages to the main events category page to pass off link equity & remove the duplicate meta description tag issue? This seems drastic (we even noticed that searchmarketingexpo.com is keeping their old events pages). However it seems like these old events webpages offer little value to our website visitors. Any feedback would be much appreciated.
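As a sketch of the 301 in question, assuming an Apache setup and a hypothetical /events/ path structure:

```apache
# Permanently redirect retired event sub-pages to the events category page.
# ^/events/.+$ requires at least one character after /events/, so the
# category page itself is not caught in a redirect loop.
RedirectMatch 301 ^/events/.+$ /events/
```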
Intermediate & Advanced SEO | RosemaryB0
-
Best way to remove full demo (staging server) website from Google index
I've recently taken over an in-house role at a property auction company, they have a main site on the top-level domain (TLD) and 400+ agency sub domains! company.com agency1.company.com agency2.company.com... I recently found that the web development team have a demo domain per site, which is found on a subdomain of the original domain - mirroring the site. The problem is that they have all been found and indexed by Google: demo.company.com demo.agency1.company.com demo.agency2.company.com... Obviously this is a problem as it is duplicate content and so on, so my question is... what is the best way to remove the demo domain / sub domains from Google's index? We are taking action to add a noindex tag into the header (of all pages) on the individual domains but this isn't going to get it removed any time soon! Or is it? I was also going to add a robots.txt file into the root of each domain, just as a precaution! Within this file I had intended to disallow all. The final course of action (which I'm holding off in the hope someone comes up with a better solution) is to add each demo domain / sub domain into Google Webmaster and remove the URLs individually. Or would it be better to go down the canonical route?
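One caveat on the plan above: a blanket Disallow in robots.txt stops crawlers from fetching the pages at all, so they never see the noindex tags - the two measures work against each other. A response header avoids touching every template; a sketch for the demo vhosts, assuming Apache with mod_headers enabled:

```apache
# Serve a site-wide noindex on every response from the demo vhost.
# Do NOT also block crawling in robots.txt, or this header is never fetched.
Header set X-Robots-Tag "noindex, nofollow"
```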
Intermediate & Advanced SEO | iam-sold0
-
Best practice for retiring old product pages
We’re a software company. Would someone be able to help me with a basic process for retiring old product pages and redirecting the SEO value to new pages? We are retiring some old products to focus on new products. The new software has much of the same functionality as the old software, but has more features. How can we ensure that the new pages get the best start in life? Also, what is the best way of doing this for users? Our plan currently is to: leave the old pages up initially with a message to the user that the old software has been retired. There will also be a message explaining that the user might be interested in one of our new products, and a link to the new pages. When traffic to these pages drops, we will delete them and redirect them to the homepage. Has anyone got any recommendations for how we could approach this differently? One idea that I’m considering is to immediately redirect the old product pages to the new pages. I was wondering if we could then provide a message to the user explaining that the old product has been retired but that the new, improved product is available. I’d also be interested in pointing the redirects to the most relevant new product pages rather than the homepage, so that they get the value of the old links. I’ve found in the past that old retirement pages for products can outrank the new pages, as until you 301 them all the links and authority flow to those pages. Any help would be very much appreciated 🙂
Intermediate & Advanced SEO | RG_SEO0
-
Robots.txt: how to exclude sub-directories correctly?
Hello here, I am trying to figure out the correct way to tell SEs to crawls this: http://www.mysite.com/directory/ But not this: http://www.mysite.com/directory/sub-directory/ or this: http://www.mysite.com/directory/sub-directory2/sub-directory/... But with the fact I have thousands of sub-directories with almost infinite combinations, I can't put the following definitions in a manageable way: disallow: /directory/sub-directory/ disallow: /directory/sub-directory2/ disallow: /directory/sub-directory/sub-directory/ disallow: /directory/sub-directory2/subdirectory/ etc... I would end up having thousands of definitions to disallow all the possible sub-directory combinations. So, is the following way a correct, better and shorter way to define what I want above: allow: /directory/$ disallow: /directory/* Would the above work? Any thoughts are very welcome! Thank you in advance. Best, Fab.
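The proposed pair of rules does work under Google's wildcard semantics, where the most specific (longest) matching rule wins and ties go to the allow rule. A small self-contained sketch of that matching logic - not Google's actual code - to check the patterns:

```python
import re

def rule_to_regex(rule: str) -> re.Pattern:
    # robots.txt wildcards: '*' matches any characters, '$' anchors the end;
    # rules are implicitly anchored at the start of the path.
    pattern = ""
    for ch in rule:
        if ch == "*":
            pattern += ".*"
        elif ch == "$":
            pattern += "$"
        else:
            pattern += re.escape(ch)
    return re.compile(pattern)

def is_allowed(path: str, allows: list, disallows: list) -> bool:
    # Most specific (longest) matching rule wins; on a length tie,
    # the less restrictive (allow) rule wins.
    matches = [(len(r), 1) for r in allows if rule_to_regex(r).match(path)]
    matches += [(len(r), 0) for r in disallows if rule_to_regex(r).match(path)]
    if not matches:
        return True
    return max(matches)[1] == 1

allows, disallows = ["/directory/$"], ["/directory/*"]
print(is_allowed("/directory/", allows, disallows))               # True
print(is_allowed("/directory/sub-directory/", allows, disallows)) # False
```

So /directory/ itself stays crawlable while everything beneath it is blocked, without enumerating the sub-directories. (Note that this relies on the `*`/`$` extensions, which major engines support but the original robots.txt standard does not.)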
Intermediate & Advanced SEO | fablau1
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers I know that it’s best practice to block Google from indexing internal search pages, but what’s best practice when “the damage is done”? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3 %). I would like to block Google from indexing the search pages via the meta noindex,follow tag because: Google Guidelines: “Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.” http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769 Bad user experience The search pages are (probably) stealing rankings from our real landing pages Webmaster Notification: “Googlebot found an extremely high number of URLs on your site” with links to our internal search results I want to use the meta tag to keep the link juice flowing. Do you recommend using the robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I’m looking forward to your answer! Edit: Google have currently indexed several million of our internal search pages.
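For the meta-tag route described above, the tag on each internal search results page would look like this (a sketch):

```html
<!-- Asks for removal from the index while letting link equity keep flowing.
     Note: if robots.txt blocks these URLs, crawlers never see this tag. -->
<meta name="robots" content="noindex, follow" />
```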
Intermediate & Advanced SEO | HrThomsen0
-
Culling 99% of a website's pages. Will this cause irreparable damage?
I have a large travel site that has over 140,000 pages. The problem I have is that the majority of pages are filled with dupe content. When Panda came in, our rankings were obliterated, so I am trying to isolate the unique content on the site and go forward with that. The problem is, the site has been going for over 10 years, with every man and his dog copying content from it. It seems that our travel guides have been largely left untouched and are the only unique content that I can find. We have 1000 travel guides in total. My first question is, would reducing 140,000 pages to just 1,000 ruin the site's authority in any way? The site does use internal linking within these pages, so culling them will remove thousands of internal links throughout the site. Also, am I right in saying that the link juice should now move to the more important pages with unique content, if redirects are set up correctly? And finally, how would you go about redirecting all these pages? I will be culling a huge amount of hotel pages; would you consider redirecting all of these to the generic hotels page of the site? Thanks for your time, I know this is quite a long one, Nick
Intermediate & Advanced SEO | Townpages0