How to Structure URLs for Multiple Locations
-
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations.
We currently have 60 locations nationwide and our URL structure is as follows:
www.mydomain.com/locations/{location}
Where {location} is the specific street the location is on or the neighborhood the location is in (e.g. www.mydomain.com/locations/waterford-lakes).
The issue is that {location} is usually too specific and not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes".
To address this, we want to introduce state and city pages. Each state and city page would link to every location within that state or city (e.g. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1
Use our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside the URL path.
Option 2
Build the city and state pages into the URL and breadcrumb path:
www.mydomain.com/locations/{state}/{area}/{location}
(e.g. www.mydomain.com/locations/fl/orlando/waterford-lakes)
Any insight is much appreciated. Thanks!
-
Hi David,
Typically, your main landing pages are going to be those that represent the city of the location, as in:
www.mydomain.com/locations/orlando
etc.
What I'm trying to understand is whether you are saying you have more than one office within a single city (as in Orlando office A, Orlando office B, Orlando office C) and are trying to hash out how to distinguish these same-city offices from one another. Is this the scenario, or am I not getting it? Please feel free to provide further details.
-
David -
It looks like there are two main options for you:
Keep the same URL structure (Option 1) and create state-based / area-based category pages, each with a short description of every location in that geographic area and a link to its location page.
This is typically how it might be done with an eCommerce site, where you'd have a parent category (e.g. shoes) and then a sub-category (e.g. running shoes).
The downside to this is that you risk having duplicate content on these category pages.
Option #2 would be my recommendation, because you are including the area / state information in the URL itself.
One company that does not do this well is Noodles & Company. Their location URL looks like this:
http://www.noodles.com/locations/150/
... where "150" is a store ID in a database. Easy to pull out of a database table. Less helpful to the end user who doesn't know that store ID 150 = the one closest to them.
It would be much better to have it listed like:
http://www.noodles.com/locations/Colorado/Boulder/2602-Baseline/
You don't want to go much beyond 4 layers, but it's a better way of indicating the location tree to Google and other search engines.
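Just to illustrate how those hierarchical URLs might be generated, here's a rough sketch (the helper names are made up for the example; this isn't how either site actually does it):

```python
import re

def slugify(text: str) -> str:
    """Lowercase the text and collapse any run of non-alphanumerics into a hyphen."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def location_url(state: str, city: str, location: str) -> str:
    """Build a hierarchical location path: /locations/{state}/{city}/{location}."""
    return "/locations/" + "/".join(slugify(p) for p in (state, city, location))

print(location_url("Colorado", "Boulder", "2602 Baseline"))
```

The same function covers the Orlando case: `location_url("FL", "Orlando", "Waterford Lakes")` yields `/locations/fl/orlando/waterford-lakes`, which matches the Option 2 breadcrumb path.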
Also, I'd highly recommend using a rich-data format for displaying the location information.
For example, on the Customer Paradigm site, we use the RDFa system for tagging the location properly:
Customer Paradigm
5353 Manhattan Circle
Suite 103
Boulder, CO 80303
303.473.4400
... and then Google doesn't have to guess what the location's address and phone number actually are.
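The markup behind that address block might look roughly like this in RDFa (shown here with the schema.org LocalBusiness vocabulary as an assumption; the actual attributes used on the Customer Paradigm site may differ):

```html
<div vocab="https://schema.org/" typeof="LocalBusiness">
  <span property="name">Customer Paradigm</span>
  <div property="address" typeof="PostalAddress">
    <span property="streetAddress">5353 Manhattan Circle, Suite 103</span>
    <span property="addressLocality">Boulder</span>,
    <span property="addressRegion">CO</span>
    <span property="postalCode">80303</span>
  </div>
  <span property="telephone">303.473.4400</span>
</div>
```

Each `property` attribute tells the crawler exactly which piece of the address it's looking at, so nothing has to be inferred from the visible text.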