Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
URL Structure & Best Practice when Facing 4+ Sub-levels
-
Hi.
I've spent the last day fiddling with the setup of a new URL structure for a site, and I can't "pull the trigger" on it.
- Example: domain.com/games/type-of-game/provider-name/name-of-game/
- Specific example: arcade.com/games/pinball/deckerballs/starshooter2k/
The example is a good description of the content I have to organize. The aim is to a) define the URL structure, b) facilitate good UX, and c) create a good starting point for content marketing and SEO without stuffing multiple keywords into URLs.
The problem? Not all providers offer the same types of game. That means that once I get past /type-of-game/, I have to write a new category/page/content for each /provider-name/.
No matter how I switch the different "sub-levels" around in the URL, at some point the provider name stops fitting, because it needs fresh content written for it multiple times over.
The solution? I can skip "provider-name". The caveat, though, is that I lose out on ranking for provider keywords because I won't have a cornerstone content page for them.
Question: Using the URL structure outlined above in WordPress, would you A) go with "Pages", or B) use "Posts"?
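A third structural option in WordPress, besides Pages and Posts, is a custom post type with its own taxonomies. The rough, untested sketch below (all names are hypothetical) shows how the /games/type-of-game/provider-name/name-of-game/ pattern could be wired up:

```php
<?php
// Rough sketch only (all names hypothetical, untested): a "game" custom
// post type plus "game_type" and "provider" taxonomies, with permalinks
// rewritten to /games/type-of-game/provider-name/name-of-game/.
add_action( 'init', function () {
    register_taxonomy( 'game_type', 'game', array(
        'public'       => true,
        'hierarchical' => true,
        'rewrite'      => array( 'slug' => 'games', 'with_front' => false ),
    ) );
    register_taxonomy( 'provider', 'game', array(
        'public'  => true,
        'rewrite' => array( 'slug' => 'provider', 'with_front' => false ),
    ) );
    register_post_type( 'game', array(
        'public'      => true,
        'has_archive' => 'games',
        'rewrite'     => false, // single-game permalinks come from the filter below
    ) );

    // Resolve /games/{type}/{provider}/{game}/ to the single game post.
    add_rewrite_rule(
        '^games/([^/]+)/([^/]+)/([^/]+)/?$',
        'index.php?post_type=game&name=$matches[3]',
        'top'
    );
} );

// Build each game's permalink from its assigned type and provider terms.
add_filter( 'post_type_link', function ( $link, $post ) {
    if ( 'game' !== $post->post_type ) {
        return $link;
    }
    $type     = get_the_terms( $post, 'game_type' );
    $provider = get_the_terms( $post, 'provider' );
    if ( ! is_array( $type ) || ! is_array( $provider ) ) {
        return $link; // fall back if either term is missing
    }
    return home_url( sprintf(
        '/games/%s/%s/%s/',
        $type[0]->slug,
        $provider[0]->slug,
        $post->post_name
    ) );
}, 10, 2 );
```

Permalinks would need flushing (Settings, then Permalinks) after a change like this. It doesn't settle whether provider landing pages are worth building; it only shows that the URL structure itself doesn't force a choice between Pages and Posts.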
-
I'm not quite sure if I'm correct since I'm not in the gaming industry, but I don't think there would be many SEO benefits to having the provider name in the URL. There are a few reasons I wouldn't add the provider name to the URL:
- Your URL will be much longer; Moz suggests keeping it within roughly 75 characters, including https://www. (see the quick length check below)
- I'm guessing the more important keyword here is the game name; by including the publisher name you're pushing your important keyword further from the root domain
- It costs more time and effort to create and manage content for each provider page
- The provider name is a branded keyword, and it would be hard for you to outrank the provider for its own brand name
- On its own, the provider name doesn't add much SEO value
When I search for "Battlefield 1", almost all the top results have "Battlefield 1" close to their root domain.
https://www.windowscentral.com/battlefield-1-2018
https://www.gamespot.com/battlefield-1/
https://www.g2a.com/en/battlefield-1-origin-key-global-i10000016618004
https://www.origin.com/sgp/en-us/store/battlefield/battlefield-1#store-page-section-criticalacclaim
I think in the end it depends on which keyword you're trying to rank for and whether having the publisher name helps.
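As a rough illustration of the length point above, here is a quick check of the two candidate structures using the example URLs from this thread (plain PHP, nothing site-specific):

```php
<?php
// Quick length check of the two candidate structures from this thread
// against the ~75-character guideline mentioned above.
$with_provider    = 'https://www.arcade.com/games/pinball/deckerballs/starshooter2k/';
$without_provider = 'https://www.arcade.com/games/pinball/starshooter2k/';

printf( "with provider:    %d characters\n", strlen( $with_provider ) );
printf( "without provider: %d characters\n", strlen( $without_provider ) );
```

Both examples happen to fit, but longer provider and game names add up quickly, and every extra directory also pushes the game name further from the root.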
-
Thank you for both your time and effort in response to my thread, Joseph.
The reason I do not want to use WordPress's tags function is that it is a WP feature, not something that benefits SEO at all. On the contrary, tags create new URLs, duplicate content, and thin content.
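For what it's worth, if thin tag archives are the main worry, they can stay crawlable but be kept out of the index. A minimal sketch, assuming a WordPress version new enough to have the wp_robots filter (5.7+):

```php
<?php
// Sketch: mark tag archives noindex,follow so they don't surface as thin
// pages in search while still being crawlable. Requires WordPress 5.7+;
// Yoast also exposes an equivalent "noindex tag archives" toggle.
add_filter( 'wp_robots', function ( $robots ) {
    if ( is_tag() ) {
        $robots['noindex'] = true;
        $robots['follow']  = true;
    }
    return $robots;
} );
```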
Following up on your point about "relevant content to the keywords": does this mean I can skip "provider-name" in the URL?
I've seen competitors that include the provider in the URL, and they outrank others that don't. I recognize that the URL is not the only signal/factor here, but I'm seeing a trend, hence my question.
-
Hey Dan,
Though keywords in the URL can be helpful, they aren't much of a ranking factor. If your page has content relevant to the keywords, Google will be able to tell. Alternatively, since you're using WordPress, would tags be another solution?
Regards,
Joseph Yap
Related Questions
-
SEO Best Practices regarding Robots.txt disallow
I cannot find hard and fast direction about the following issue: it looks like the robots.txt file on my server has been set up to disallow "account" and "search" pages within my site (Disallow: /Account/ and Disallow: /?search=), so I am receiving warnings from Google Search Console that URLs are being blocked by robots.txt. Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked by robots.txt ("Sitemap contains urls which are blocked by robots.txt"). It seems that I wouldn't want that many URLs blocked, right? Thank you!!
Intermediate & Advanced SEO | jamiegriz
Looking to remove dates from URL permalink structure. What do you think of this idea?
I know most people who remove dates from their URL structure usually do so and then set up a 301 redirect. I believe that's the right way to go about this typically. My biggest fear with doing a global 301 redirect implementation like that across an entire site is that I've seen cases where this has sort of shocked Google and the site took a pretty bad hit in organic traffic. Here's what I'm thinking a safer approach would be, and I'd like to hear others' thoughts. What if we:
- Changed the permalink structure moving forward to remove the date from future posts
- Left all current URLs as they are, with their dates
- Went back and optimized past posts in waves (including proper 301 redirects and better URL structure)
This way we avoid potentially shocking Google with a global change across all URLs. Do you know of a way this is possible with a large WordPress website? Do you see any complications that could come about in this process? I'd like to hear any other thoughts about this please. Thanks!
Intermediate & Advanced SEO | HashtagJeff
Best-practice URL structures with multiple filter combinations
Hello, We're putting together a large piece of content that will have some interactive filtering elements. There are two types of filters, topics and object types. The architecture under the hood constrains us so that everything needs to be in URL parameters. If someone selects a single filter, this can look pretty clean: www.domain.com/project?topic=firstTopic or www.domain.com/project?object=typeOne
The problems arise when people select multiple topics, potentially across two different filter types: www.domain.com/project?topic=firstTopic-secondTopic-thirdTopic&object=typeOne-typeTwo
I've raised concerns around the structure in general, but it seems to be too late at this point, so now I'm scratching my head thinking of how best to get these indexed. I have two main concerns:
- A ton of near-duplicate content and hundreds of URLs being created and indexed with various filter combinations added
- Over-reacting to the first point above and over-canonicalizing/no-indexing combination pages to the detriment of the content as a whole
Would the best approach be to index each single topic filter individually, and canonicalize any combinations to the 'view all' page? I don't have much experience with e-commerce SEO (which this problem seems to have the most in common with) so any advice is greatly appreciated. Thanks!
Intermediate & Advanced SEO | digitalcrc
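For the approach floated in the question above (index single-filter pages, canonicalize any combination back to the 'view all' page), a rough sketch of the decision logic, in plain PHP with hypothetical names, could look like this:

```php
<?php
// Hypothetical helper: pick the canonical URL for a filtered view.
// Single-filter pages keep a self-referencing canonical; any multi-value
// or multi-type combination points back to the unfiltered "view all" page.
function canonical_for_filters( string $base, array $filters ): string {
    $selected = 0;
    foreach ( array_filter( $filters ) as $value ) {
        // Values like "firstTopic-secondTopic" carry several selections.
        $selected += count( explode( '-', $value ) );
    }
    if ( 1 === $selected ) {
        return $base . '?' . http_build_query( array_filter( $filters ) );
    }
    return $base; // everything else canonicalizes to the view-all page
}

// Examples:
echo canonical_for_filters( 'https://www.domain.com/project', array( 'topic' => 'firstTopic' ) ), "\n";
// https://www.domain.com/project?topic=firstTopic
echo canonical_for_filters( 'https://www.domain.com/project', array( 'topic' => 'firstTopic-secondTopic', 'object' => 'typeOne' ) ), "\n";
// https://www.domain.com/project
```

The trade-off is the one the question already names: single-filter pages get a shot at ranking, while combination pages consolidate back to the main URL instead of competing with it.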
Does Google read URLs if they include a # tag? Re: SEO value of clean URLs
An ECWID rep stated, in regard to an inquiry about how the ECWID URLs are not customizable, that "an important thing is that it doesn't matter what these URLs look like, because search engines don't read anything after that # in URLs."
Example: http://www.runningboards4less.com/general-motors#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
Basically all of this: #!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
That is a snippet out of a conversation where ECWID said that dirty URLs don't matter beyond a hash. Is that true? I haven't found any rule saying that Google or other search engines (Google is really the most important) don't index, read, or place value on the part of the URL after a # tag.
Intermediate & Advanced SEO | Atlanta-SMO
If I own a .com URL and also have the same URL with .net, .info, and .org, will I want to point them to the .com IP address?
I have a domain, for example, mydomain.com and I purchased mydomain.net, mydomain.info, and mydomain.org. Should I point the host @ to the IP where the .com is hosted in wpengine? I am not doing anything with the .org, .info, .net domains. I simply purchased them to prevent competitors from buying the domains.
Intermediate & Advanced SEO | djlittman
Yoast & rel canonical for paginated WordPress URLs
Hello, our WordPress blog at http://www.jobs.ca/career-resources has a rel canonical issue since we added pagination to the front page and category-pages. We're using Yoast and it's incorrectly applying a rel-canonical meta tag referencing page 1 on page 2, 3, etc. This is a known misuse of the rel-canonical tag (per Google's Webmaster Blog - http://googlewebmastercentral.blogspot.ca/2013/04/5-common-mistakes-with-relcanonical.html, which says rel-canonical should be replaced with rel-prev and rel-next for page 2, 3, etc.). We don't see a way to specify anywhere in Yoast's options to correct this behaviour for page 2, 3, etc. Yoast allows you to override a page's canonical URL, otherwise it automatically uses the WordPress permalink. My question is, does anyone know how to configure Yoast to properly replace rel-canonical tags with rel-prev and rel-next for paginated URLs, or do I need to look at another plugin or customize the behavior directly in my child theme code? This issue was brought up here as well: http://moz.com/community/q/canonical-help, but the only response did not relate to Yoast. (We're using WordPress 3.6.1 and Yoast "WordPress SEO" 1.4.18)
Intermediate & Advanced SEO | aactive
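One possible route for the question above, sketched below and untested: suppress Yoast's canonical on paginated archives via its wpseo_canonical filter and emit rel-prev/next from the theme. The filter name is real, but verify its behaviour against the Yoast version in use:

```php
<?php
// Sketch only: on paginated blog/category archives, drop the canonical
// (which points at page 1 in the Yoast version discussed) and emit
// rel-prev/next links instead.
add_filter( 'wpseo_canonical', function ( $canonical ) {
    return is_paged() ? false : $canonical;
} );

add_action( 'wp_head', function () {
    if ( ! is_home() && ! is_category() ) {
        return; // only the paginated archives discussed above
    }
    $paged = max( 1, (int) get_query_var( 'paged' ) );
    $max   = (int) $GLOBALS['wp_query']->max_num_pages;
    if ( $paged > 1 ) {
        printf( '<link rel="prev" href="%s" />' . "\n", esc_url( get_pagenum_link( $paged - 1 ) ) );
    }
    if ( $paged < $max ) {
        printf( '<link rel="next" href="%s" />' . "\n", esc_url( get_pagenum_link( $paged + 1 ) ) );
    }
} );
```

Later Yoast releases changed how pagination canonicals and prev/next are handled (and Google has since retired rel-prev/next as an indexing signal), so check current behaviour before layering anything like this on.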
URL structure for multiple search filters applied to products
We have a product catalog with several hundred similar products. Our list of products allows you to apply filters to hone your search, so that in fact there are over 150,000 different individual searches you could come up with on this page. Some of these searches are relevant to our SEO strategy, but most are not. Right now (for the most part) we save the state of each search with the fragment of the URL, or in other words in a way that isn't indexed by the search engines. The URL (without hashes) ranks very well in Google for our one main keyword. At the moment, Google doesn't recognize the variety of content possible on this page. An example is: http://www.example.com/main-keyword.html#style=vintage&color=blue&season=spring We're moving towards a more indexable URL structure and one that could potentially save the state of all 150,000 searches in a way that Google could read. An example would be: http://www.example.com/main-keyword/vintage/blue/spring/ I worry, though, that giving so many options in our URL will confuse Google and create a lot of duplicate content. After all, we only have a few hundred products and inevitably many of the searches will look pretty similar. Also, I worry about losing ground on the main http://www.example.com/main-keyword.html page, when it's ranking so well at the moment. So I guess the questions are:
- Is there such a thing as having URLs be too specific?
- Should we noindex or set rel=canonical on the pages whose keywords are nested too deep?
- Will our main keyword's page suffer when it has to share all the inbound links with these other, more specific searches?
Intermediate & Advanced SEO | boxcarpress
There's a website I'm working with that has a .php extension. All the pages do. What's the best practice to remove the .php extension across all pages?
Client wishes to drop the .php extension on all their pages (they've got around 2k pages). I assured them that wasn't necessary. However, in the event that I do end up doing this, what's the best-practice (and easiest) way to do it? This is also a WordPress site. Thanks.
Intermediate & Advanced SEO | digisavvy