Permalink structure of an existing website
-
Hey everyone,
We run a large business directory website with lots of B2B listings. Currently, our URL structure for categories looks like this:
dispensarystack.com/category/professional-services/
**We want to rank for the keyword "dispensary professional services".**
Is it enough that the word "dispensary" already appears in our brand name, and therefore in the URL, or is it better to make the slug:
dispensarystack.com/category/**dispensary-professional-services**/
Or even closer to the root of the website:
dispensarystack.com/**dispensary-professional-services**/
Thank you for your time and answers!
-
Hi there,
Although many SEO articles and guides encourage webmasters to use the keyword in the URL slug, in my experience with large websites there is no measurable gain from doing that.
If you are optimizing a really small website competing on low-competition SERPs, then there is a small chance that having your target keyword in the slug helps. I would spend my resources and time on optimizing content, paying down technical debt, and improving speed on those pages rather than on this small slug change.
Did I answer your question?
Hope it helps.
Best of luck! -
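One practical note: if you do decide to test a keyword-rich slug, the old category URL should 301-redirect to the new one so existing links and rankings carry over. A minimal sketch, assuming an nginx server (Apache or your CMS can do the equivalent); the paths are the ones from the question:

```nginx
# Permanently redirect the old category slug to the keyword-rich one
location = /category/professional-services/ {
    return 301 /category/dispensary-professional-services/;
}
```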
Related Questions
-
My website page speed is not increasing
Hey experts, my website's page speed is not improving. I use the WP Rocket plugin, but I still see the errors "Reduce unused CSS", "Properly size images", and "Avoid serving legacy JavaScript to modern browsers" (see Screenshot (7).png). I have tried many speed-optimization plugins, and I optimized the images manually in Photoshop, but image size is still flagged. Since the Google Core Web Vitals update, my keyword positions have dropped because of the slow speed. Please guide me on how to increase the page speed of my website https://karmanwalayfabrics.pk. Thanks
Technical SEO | frazashfaq110 -
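For the "Properly size images" warning in the question above, serving explicitly sized, responsive images often does more than any plugin. A minimal sketch (the file names are placeholders, not from the question):

```html
<!-- width/height reserve space and prevent layout shift;
     srcset/sizes let the browser download an appropriately sized file -->
<img src="fabric-800.jpg"
     srcset="fabric-400.jpg 400w, fabric-800.jpg 800w, fabric-1600.jpg 1600w"
     sizes="(max-width: 600px) 400px, 800px"
     width="800" height="600"
     loading="lazy"
     alt="Fabric product photo">
```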
URL folder structure
I work for a travel site with pages for properties in various destinations, and I am trying to decide how best to organize the URLs. We have our main domain, resort pages, and articles about each resort, so the URL structure will get longer. Note that some of the resort names are long, and spaces are being replaced dynamically with dashes.

A. Separate keyword and location folders:
domain.com/main-keyword/state/city-region/resort-name
_domain.com/family-condo-for-rent/orlando-florida/liki-tiki-village_
_domain.com/family-condo-for-rent/orlando-florida/liki-tiki-village/kid-friend-pool_

B. Combine the keyword and location into one folder:
domain.com/main-keyword-in-state-city/resort-name
_domain.com/family-condo-for-rent-in-orlando-florida/liki-tiki-village_
domain.com/main-keyword-in-state-city/resort-name-feature
_domain.com/family-condo-for-rent-in-orlando-florida/liki-tiki-village-kid-friend-pool_

Question: is that too many folders, or should I combine or break them up? What would you do with this? Trying to avoid too many dashes.
Technical SEO | Vacatia_SEO -
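Since the resort names in the question above are slugged dynamically, it is worth normalizing them consistently (lowercase, single dashes, no punctuation) whichever folder structure wins. A minimal sketch in Python; the helper name is illustrative, not from the question:

```python
import re

def slugify(name: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace/dashes to single dashes."""
    slug = name.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)   # drop punctuation
    slug = re.sub(r"[\s-]+", "-", slug)        # collapse runs of spaces/dashes
    return slug.strip("-")

print(slugify("Liki Tiki Village"))  # liki-tiki-village
print(slugify("Family Condo for Rent -- Orlando, Florida"))
```

Normalizing in one place also avoids accidental duplicate URLs ("Liki Tiki Village" vs "liki tiki village") that would then need canonicalization.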
Sitemaps for international websites
Hey Mozzers, here is the case I would appreciate your reply on. I will build a sitemap for a .com domain which has sibling domains for other countries (Italy, Germany, etc.).

1. Can I put the hreflang annotations in sitemap 1 only, and have a sitemap 2 with all URLs for the EN/default version of the .com site, then put both sitemaps in a sitemap index? The issue is that some localized pages go away quickly (in 1-2 days), and I prefer not to give annotations for them; I want to keep the lang annotations in sitemap 1 clean. That way I would replace only sitemap 2 and keep sitemap 1 intact. Would it work, or is it better to put everything in one sitemap?
2. Do you recommend doing the same exercise for all subdomains and other domains? I have read much on the topic, but I am not sure whether it is worth the effort.
3. If I have www.example.it and it.example.com, should I include both in my sitemap with hreflang annotations (the sitemap on www.example.com), using "it" for the subdomain and "it-it" for the .it domain (to specify language, and language + country)?

Thanks a lot for your time and have a great day, Ani
Technical SEO | SBTech -
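For the third question above, a sitemap-based hreflang entry lists a URL once and cross-references every language/region variant, including itself; each variant listed must also have its own reciprocal `<url>` entry. A minimal sketch using the hostnames from the question (the /page/ path is a placeholder):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/page/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/page/"/>
    <xhtml:link rel="alternate" hreflang="it" href="https://it.example.com/page/"/>
    <xhtml:link rel="alternate" hreflang="it-it" href="https://www.example.it/page/"/>
  </url>
</urlset>
```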
Our website gets too many visits from this IP?
Hi, we get a lot of visits from the IP below. What is this?

Host Name: google-proxy-66-249-85-8.google.com
IP Address: 66.249.85.8
Browser: IE 8.0
Operating System: WinVista
Location: Mountain View, California, United States
Resolution: 1024x768
Technical SEO | Asjad -
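Hostnames like google-proxy-….google.com are typically Google's own fetch/proxy services rather than spoofed traffic, and the standard way to verify any IP claiming to be Google is a reverse DNS lookup followed by a forward lookup that resolves back to the same IP. A minimal sketch in Python; the suffix check is factored out because `socket.gethostbyaddr` needs network access:

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com", ".googleusercontent.com")

def is_google_hostname(hostname: str) -> bool:
    """True if the hostname belongs to a domain Google uses for crawlers/proxies."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_google_ip(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-resolve to confirm."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)       # reverse DNS
    except socket.herror:
        return False
    if not is_google_hostname(hostname):
        return False
    return ip in socket.gethostbyname_ex(hostname)[2]   # forward-confirm

print(is_google_hostname("google-proxy-66-249-85-8.google.com"))  # True
```

The forward-confirmation step matters because anyone can set a fake reverse DNS record ending in google.com on an IP they control.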
What are the best website reporting tools for speed and errors?
Hi, I have just made some changes to my site by adding some redirects, and I want to know what effect this has had on www.in2town.co.uk. I have been using tools such as Pingdom Tools and http://gtmetrix.com, but they always give me different timing reports, so I do not know the true speed of my site, and they give different advice. How can I check the true speed of my site from the UK, and how can I check for errors to make it better? Any advice on which tools to use would be great.
Technical SEO | ClaireH-184886 -
Absolute or Relative Internal Website Links
Hi, I am not sure what is considered best practice when linking between pages on the same site: absolute or relative URLs? I notice a lot of CMS systems (WordPress, for example) use the absolute method. Is there a reason? Any help much appreciated. Barney.
Technical SEO | barnst -
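The practical difference in the question above shows up when a relative link is resolved against the page it appears on, which is also why absolute links are more robust against scrapers and protocol/host mix-ups. A short illustration using Python's standard urljoin (the URLs are placeholders):

```python
from urllib.parse import urljoin

page = "https://www.example.com/blog/post-one/"

# A relative link resolves against the current page's path...
print(urljoin(page, "about-us/"))   # https://www.example.com/blog/post-one/about-us/
# ...a root-relative link resolves against the host...
print(urljoin(page, "/about-us/"))  # https://www.example.com/about-us/
# ...and an absolute link is unambiguous wherever it appears.
print(urljoin(page, "https://www.example.com/about-us/"))
```

The first result is usually the surprise: the same relative link points somewhere different on every page that contains it.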
The SEOmoz robot is not able to crawl my website
Hi, the SEOmoz robot crawls only two pages of my website. I contacted the SEOmoz team, and they told me the problem is caused by JavaScript use. What is the solution to this? Should I contact my web design company and ask them to remove the JavaScript code?
Technical SEO | ashish211 -
Website has been penalized?
Hey guys, we have been link building and optimizing our website since the beginning of June 2010. Around August-September 2010, our site appeared on the second page for the keywords we were targeting for around a week, then dropped off the radar, although we could still see our website at #1 when searching for our company name, domain name, etc. So we figured we had been put into the Google "sandbox", and we dealt with that.

Then in December 2010, we appeared on the first page for our keywords and maintained first-page rankings, even moving up the top 10, for just over a month. On January 13th, 2011, we disappeared from Google for all of the keywords we were targeting; we don't even come up in the top pages for a company-name search, although we do come up when searching for our domain name, and we are being cached regularly.

Before we dropped off the rankings in January, we made some semi-major changes to the site: we changed the meta description, changed content around, and added a disclaimer to our pages with click-tracking parameters (this is when SEOmoz flagged our disclaimer pages as duplicate content). So we added the disclaimer URL to our robots.txt so Google couldn't access it, made the disclaimer an onclick link instead of an href, added nofollow to the link, and told Google to ignore these parameters in Google Webmaster Central. We have fixed the duplicate-content side of things now, we have continued to link build, and we have been adding content regularly. We also updated the index page's meta description and some subpages' titles and descriptions, and fixed the HTML errors flagged in Google Webmaster Central and SEOmoz.

Do you think the duplicate content (on over 13,000 pages) could have triggered a loss in rankings, or do you think it's something else? The only other reason I can think of is a link-exchange script on our site, where people could add our link to their site and add theirs to ours, but we applied the nofollow attribute to those outbound links. Any information that will help us get our rankings back would be greatly appreciated!
Technical SEO | bigtimeseo
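For reference, the robots.txt exclusion described in the question above takes this shape; the /disclaimer/ path is a placeholder, since the real URL isn't given, and the exchanged outbound links would carry rel="nofollow" in the anchor tag:

```
User-agent: *
Disallow: /disclaimer/
```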