Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
URL delimiter vs. SEO
-
Hi all,
Our customer is building a new website. It uses pages generated by special modules, for example a blog page generated by the blog module (and not only for blogs; the same applies to lightboxes).
For this, the developer uses a URL delimiter for his URL parsing, for example /b/ or /s/. The URLs look like this:
www.test.ch/de/blog/b/an-article
www.test.ch/de/s/management-coaching
Does the URL delimiter (/b/ or /s/ in the URL) have a negative influence on SEO? Should we remove the /b/ or /s/ for better SEO performance?
Thank you in advance for your feedback.
Greetings, Samuel
-
Hi Samuel,
In general, URLs should not contain any unnecessary folders (delimiters). In your first example, the /b/ is not needed, since you've already got a /blog/ folder. In the second example, the page appears to be main site content, so you don't need any additional folders unless they specify a general topic under which you'll be adding more specific pages.
You're also burying your keywords one step deeper into the URL than is needed. Google says they don't put much weight on URL structure, but in my experience, well-planned, logical URL structures perform better. It's not going to have a huge impact on your rankings, but it will help to some degree.
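If you do remove the delimiter, remember to 301-redirect the old URLs to the new ones so existing links keep their value. As a minimal sketch (not part of the original answer; the helper name is hypothetical, the paths are from the question), the rewrite rule could be expressed like this in Python:

```python
import re

def strip_delimiter(path: str) -> str:
    """Collapse a single-letter delimiter segment (/b/ or /s/) in a URL path.

    Returns the cleaned path, suitable as the target of a 301 redirect.
    Paths without a delimiter are returned unchanged.
    """
    return re.sub(r"/(?:b|s)/", "/", path)

old = "/de/blog/b/an-article"
new = strip_delimiter(old)  # "/de/blog/an-article"
```

The actual redirect would be configured in the web server or the module's routing layer; the point is that every old URL should map deterministically to exactly one new URL.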
Related Questions
-
Pages with URL Too Long
Hello Mozzers! Moz keeps kindly telling me that my URLs are too long. However, this is largely due to the structure of the e-commerce site, which has to include 'brand', 'range' and 'product' keywords. For example: https://www.choicefurnituresuperstore.co.uk/Devonshire-Rustic-Oak-Bedside-Cabinet-1-Drawer-p40668.html
Moz recommends no more than 75 characters. This means we have 25-30 characters for both the brand name and the product name. Questions:
If it is an issue, how do we fix it on our site?
If it's not an issue, how can we turn off this alert in Moz?
Does anyone know how big an issue URL length is as a ranking factor? I thought it was pretty low.
Moz Pro | tigersohelll
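As a side note on auditing this: the 75-character figure above is Moz's recommendation, not a hard limit. A quick way to flag long URLs in a crawl export (hypothetical helper, not from the thread) is a simple length filter:

```python
def over_limit(urls, limit=75):
    """Return the URLs whose total length exceeds the given character limit."""
    return [u for u in urls if len(u) > limit]

urls = [
    "https://www.choicefurnituresuperstore.co.uk/Devonshire-Rustic-Oak-Bedside-Cabinet-1-Drawer-p40668.html",
    "https://www.example.com/short-page",
]
flagged = over_limit(urls)  # only the first URL exceeds 75 characters
```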
How long do changes in title tags take to affect SEO?
This is kind of a loaded question. I'm completely new to SEO. I think my boss signed up for Moz Pro sometime in February and started adding data to our e-commerce site to help with rankings. Sometime before this, I changed some of the title tags on the site (trying to help with organic search and CTR). I did not do a site-wide change, just maybe 10-20 tags (a guess). I did it with keywords in mind but did not make note of when I did it. I didn't know better at the time, and I did not have access to Google Analytics or Moz Pro.
I was looking through the ranking data for February and March. It won't let me look before February 29th (which is why I think my boss started the Moz Pro subscription around that time). On that day it said we ranked 12 keywords in the 1-3 spot, and the following week (March 7) it went down to 6. I don't know of any major site changes that were implemented, so I'm not sure why that happened and whether it has anything to do with the title tag changes I made a week or two before.
Since then the keyword ranking numbers have stayed about the same, with organic traffic slowly going down (though that could be because we are getting out of season for our industry). The second week of March the site was upgraded, and since then the menu has been completely changed around. Last week I did a site-wide title tag change, so the minor changes I made in February are no longer in effect anyway. I added more keywords to Moz earlier this week and the number of 1-3 spot keywords went up from 6 to 20. It also says my ranking moved up for 4 keywords and down for 13 keywords.
Anyway, I am wondering how seriously I should take these changes and whether I'm damaging the site. I am new to Moz Pro, so all the data you can access is kind of confusing/overwhelming.
Moz Pro | AliMac26
404 Crawl Diagnostics with void(0) appended to URL
Hello, I am getting loads of 404s reported in my crawl report, all with void(0) appended at the end. For example: http://lfs.org.uk/films-and-filmmakers/watch-our-films/1289/void(0)
The site is running on Drupal 7. Has anyone come across this before? Kind regards, Moshe
Moz Pro | moshen
Woocommerce filter urls showing in crawl results, but not indexed?
I'm getting hundreds of Duplicate Content warnings for a WooCommerce store I have. The URLs don't seem to be indexed in Google, and the canonical points to the shop base URL. They seem to be simply URLs generated by WooCommerce filters. Is this simply a false alarm from the Moz crawl?
Moz Pro | JustinMurray
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 duplicate content pages. Most of them come from dynamically generated URLs that have specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages and, to stop it, I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters, because among these 380 pages there are other pages with no parameters (or different parameters) that I need to take care of. Basically, I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few related topics, but there is no clear answer on how to block only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:

User-agent: dotbot
Disallow: /*numberOfStars=0

User-agent: rogerbot
Disallow: /*numberOfStars=0

My questions: 1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact? 2. Do I need an empty line between the two groups (between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot"), or does it even matter? I think this would help many people, as there is no clear answer on how to block crawling only for pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
Moz Pro | Blacktie
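The pattern matching in those Disallow lines can be checked mechanically. A simplified sketch (not from the thread; note that Python's standard-library robotparser does not support the '*' wildcard, so this implements the wildcard-as-regex idea directly):

```python
import re

def blocked(path: str, pattern: str) -> bool:
    """Check whether a URL path matches a robots.txt Disallow pattern,
    treating '*' as a wildcard (a simplified, Google-style prefix match)."""
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'.
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.match(regex, path) is not None

# Pages carrying the parameter match the rule; others are left alone.
blocked("/hotels?numberOfStars=0", "/*numberOfStars=0")  # True
blocked("/hotels?numberOfStars=3", "/*numberOfStars=0")  # False
```

This only illustrates the matching logic; how a given crawler actually interprets wildcards and blank lines between groups depends on that crawler's robots.txt parser.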
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder? I have a few large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites; I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication.
I am in the process of re-setting up my campaigns to get better data than I have been getting. I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set them to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit.
Here is an example of what I mean. To see how the EPA is doing in searches related to air quality, ideally I'd track all of the EPA's web presence. www.epa.gov has 560,000 pages; if I put www.epa.gov into a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get? www.epa.gov/air has only 1,450 pages, so if I track this in a campaign, the crawl will cover that sub-folder completely and I'll get a complete picture of this air-focused sub-folder. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count towards the 50,000-page crawl limit and would get me up to it? How do the competitors you choose figure into the crawl limit?)
Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
I am looking for SEO tips specifically for magazine sites
I have a client whose website is based on a magazine; they make their money through advertising. I am primarily an inbound marketer. I would be very grateful if anyone out there has tips for a site that has been around for quite a while (over 10 years). We are transforming the site from HTML into WordPress, then hosting it with a fast managed WordPress host using a CDN. I feel the lack of links is an obvious place to start; however, if there's anything specific to magazine-based websites, I would be more than grateful to hear your opinions. Thank you all in advance. Sincerely, Thomas von Zickell
Moz Pro | BlueprintMarketing
TLD vs Sub Domain in Regards to Domain Authority
I have always been under the impression that a top-level (root) domain can hold different domain authority than a subdomain, meaning that a subdomain and the root domain can hold different rank and strength in search engine result pages. Is this correct, or just an assumption? When I add the root domain and a subdomain into the campaign manager, I get back the same link information and domain authority:
www.datalogic.com
www.automation.datalogic.com
Have I made an incorrect assumption, or is this an issue with the SEOmoz campaign manager?
Moz Pro | kchandler