Local SEO - Multiple stores on same URL
-
Hello guys,
I'm working on a local SEO plan for a client that manages over 50 local stores. At the moment, all the stores share the same URL, and I wanted to ask whether it's better to build a unique page for each store or fine to keep them all on the same URL.
What do you think? What's the best way and why?
Thank you in advance.
-
Hey There!
You have 2 possible approaches here:
-
Build a unique landing page on the website for each store if you feel you can create high-quality content for each. This should be possible if the project has the funding to post things like unique specials, events, products, and offerings for each store on an ongoing basis. Link the Google My Business page and all other citations for each location to its unique landing page on the company website. A good example of this is a site like REI.com. Go to http://www.rei.com/map/store and type in a geographic search. For example, searching for San Diego, CA takes you here: http://www.rei.com/map/store#San Diego%2C CA. From there you can click on the map-based link to get to the individual page: http://www.rei.com/stores/san-diego.html. Look at what a nice job they've done with that page!
-
If the project seems overwhelming, the alternative would be something more like this: http://www.mcdonalds.com/us/en/restaurant_locator.html. When you type in a zip code, it simply brings up a map; there does not seem to be a unique page for each store. No doubt the corporation felt such an approach would be futile, given that McDonald's has some 35,000 locations globally and they all serve basically the same thing (though I have heard you can get an unlisted green chile cheeseburger at the McDonald's in Window Rock, AZ, in Navajo country).
Which approach is stronger? #1, in most cases, but whether you can take that approach is going to depend on the funding for the project.
Hope this helps.
-
Although keeping every store on the same URL might not cause any outright issues, it's not the ideal approach.
I would suggest siloing each location onto its own page and using a geo modifier in the URL (so the New York location would be mydomain.com/new-york). Include the info John recommended and, if feasible, mark it up with schema (a sketch follows below). The regionally specific URL then has more potential to rank in the normal organic results, as well as in the 3-pack (which is generated from GMB listings).
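For illustration, a minimal LocalBusiness JSON-LD sketch for a hypothetical page at mydomain.com/new-york might look like this; every value below is a placeholder, not data from this thread:

```html
<!-- Hypothetical markup for mydomain.com/new-york; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Store - New York",
  "url": "https://www.mydomain.com/new-york",
  "telephone": "+1-212-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Ave",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10001",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Sa 09:00-18:00"
}
</script>
```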
Not only will this increase your index count, it will also give you a specific URL for each location's GMB listing, which makes measuring success easier. I would also recommend setting up UTM parameters on the URLs in the GMB profiles so you can segment out that traffic and prove value to the client.
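As a sketch of that UTM setup, the website field of the hypothetical New York GMB profile could point at a tagged URL like this; the parameter values are just one common convention, not a requirement:

```
https://www.mydomain.com/new-york?utm_source=google&utm_medium=organic&utm_campaign=gmb-new-york
```

In your analytics you can then filter sessions on that campaign value to isolate GMB-driven visits.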
-
There is no issue as such; in a perfect world, each store has its own unique page with amazing content. The reality is that this is not practical for some companies. For 50 stores, though, I would be pushing for sign-off to create a unique page for each store. Local search drives so much in-store traffic that it should be worth it.
Include maps, opening hours, contact details, and pics of the location, and if the staff are good-looking and fun, throw in a pic of them too (see the rough skeleton below).
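As a rough sketch, a location page pulling those elements together might be structured like this; all names and details are hypothetical:

```html
<!-- Hypothetical location landing page, e.g. mydomain.com/new-york -->
<h1>Example Store - New York</h1>
<p>123 Example Ave, New York, NY 10001 | +1-212-555-0100</p>
<h2>Opening Hours</h2>
<ul>
  <li>Mon-Sat: 9:00am to 6:00pm</li>
  <li>Sun: Closed</li>
</ul>
<h2>Find Us</h2>
<!-- Embedded map and photos of the store (and staff) would go here -->
<img src="/images/new-york-storefront.jpg" alt="Our New York storefront">
```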
Hope that assists.
Related Questions
-
What's your proudest accomplishment in regard to SEO?
After many years in the industry, you come to realize a few things. One of the biggest pain points for us at Web Daytona was being able to give clients a quick keyword ranking cost estimation. After much trial and error, and relying on API data from one of the most reliable SEO software providers in our industry, we were able to develop an SEO tool that lets us quickly and accurately estimate the cost to rank for a given keyword using multiple variables.

Most agencies can relate to that story. It's something my colleagues and I at Web Daytona have been through before. Finding the cost and amount of time needed to rank for a keyword is a time-consuming process. That's why it's common practice to sell SEO packages of 5-10 keywords for about $1,000-2,000/month. The problem is that not all keywords are equally valuable, and most clients know this. We constantly get questions from clients asking: "How much to rank for this specific keyword?" It's difficult to answer that question with a pricing model that treats the cost of ranking every keyword equally.

So is the answer to spend a lot more time doing tedious, in-depth keyword research? If we did, we could give our clients more precise estimates. But given that a decent proposal can take as long as 2-5 hours to put together, and agency life isn't exactly full of free time, that wouldn't be ideal. That's when we asked a question: what if we could automate the research needed to find the cost of ranking keywords? We looked around for a tool that did this, but we couldn't find one. So we decided to make it ourselves.

It wasn't going to be easy, but after running an SEO agency for over a decade, we knew we had the expertise to create a tool that wouldn't just be fast and reliable; it would also be precise. Fast forward to today, and we're proud to announce that the Keyword Cost Estimator is finally done. Now we're releasing it to the public so other agencies and businesses can use it too. You can see it for yourself here.
Local Website Optimization | WebDaytona
-
International SEO - How do I show correct SERP results in the UK and US?
Hi, Moz community. I hope you're all OK and keeping busy during this difficult period. I have a few questions about international SEO, specifically when it comes to ranking pages in the UK and the US simultaneously. We currently have two websites set up, aimed at their respective countries: a '.com' and a '.com/us'. If anybody could help with the issues below, I would be very grateful. Thank you all.

Issues: When looking at US Google search through a VPN, the title tag for our UK page appears in the SERP, e.g. I will see: UK [Product Name] | [Brand]. When checking the Google cache, the UK page version also appears. This is a problem, especially when I am creating title tags and meta descriptions that are meant to be unique to the US versions. However, when clicking through from the SERP link, the US page appears as it should. I find it very bizarre that Google shows the US page when you click through but the UK version in the overall search results.

Current set-up: Our UK and US page content is often very similar across our '.com' and '.com/us' websites, and our US pages are canonicalised to their UK page versions to avoid potential penalisation for duplication. We have also added hreflang to our UK and US pages.

Query: How do I show our US SERP as opposed to the UK version in US Google search?

My theories/answers: US page versions have to be completely unique, with content related to US search intent, and be indexed separately, therefore no longer canonicalised to the UK version. Ensure hreflang is set up to point Google to the correct local page versions. Ensure local backlinks point to localised pages.

If anyone can help, it will be much appreciated. Many thanks all.
Local Website Optimization | Katarina-Borovska
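For the hreflang question above, a minimal sketch of the annotations might look like this on both the UK and US versions of a page (URLs are placeholders). Note that hreflang and cross-locale canonicals work against each other: canonicalising the US page to the UK version tells Google to index only the UK URL, which matches the symptom described, so each version would normally keep a self-referencing canonical instead:

```html
<!-- Placed on both example.com/product/ and example.com/us/product/ -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/product/">
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/product/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product/">
```
-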
How to get Google to index the new URL and not the old URL
Hi Team, We are undertaking a domain migration to move our content from one domain to another. 1. The redirection of pages is handled at the reverse proxy level. 2. We do have 301 redirects in place. However, we still see Google indexing pages on our old domain alongside pages from the new domain. Is there a way for us to stop Google from indexing pages on the old domain? The usual recommendations, a noindex in the page meta tags or a robots.txt disallow, do not work for us, since the redirection is set up at the reverse proxy and Google's crawlers are always redirected to the new pages before they could see those directives.
Local Website Optimization | bhaskaran
-
Research on industries that are most competitive for SEO?
I am trying to see if there is a reputable, research-backed source that shows which industries are most competitive for search engine optimization. In particular, I'd be interested in reports or research related to the residential real estate industry, which, based on anecdotal experience, I believe to be extremely competitive.
Local Website Optimization | Kevin_P
-
In local SEO, how important is it to include city, state, and state abbreviation in doctitle?
I'm trying to balance local geographic keywords with product keywords. I appreciate the feedback from the group! Michael
Local Website Optimization | BFMichael
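As an illustration of balancing the geo and product terms the question above asks about, one common title pattern looks like this; the service, city, and brand are hypothetical:

```html
<title>Appliance Repair in Springfield, IL | Example Repair Co.</title>
```

Whether to also spell out the full state name usually comes down to the length budget of the title; many pages lead with the service, follow with city plus state abbreviation, and end with the brand.
-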
Multi-location silo SEO technique
A physical therapy company has 8 locations in one city and 4 locations in another, with plans to expand. I've seen two methods to approach this. The first, which I feel is sloppy, is giving each location its own URL that is linked to from the location pages on the main domain. The second is to use the silo technique incorporated with metro-scale addition: you have the main domain with a number of silos (individual stores), and each silo has its own content (what they do at each store is pretty much the same). My question is: besides making sure there is no duplicate copy, should the focus of each silo be to increase its own hyperlocal outreach? Focus on social, reviews, and content curated for the specific location. How would you attack this problem?
Local Website Optimization | Ohmichael
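As a sketch of the siloed structure discussed above (domain and place names are hypothetical), the URL hierarchy might look like:

```
example.com/locations/                       <- hub page linking to both cities
example.com/locations/springfield/           <- city silo page (8 clinics)
example.com/locations/springfield/downtown/  <- individual clinic page
example.com/locations/springfield/north/
example.com/locations/shelbyville/           <- second city silo (4 clinics)
example.com/locations/shelbyville/east/
```

Each clinic page then carries its own hyperlocal content (reviews, staff, directions) and links up to its city silo, keeping internal linking within each geographic branch.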
Applying NAP Local Schema Markup to a Virtual Location: spamming or not?
I have a client that has multiple virtual locations to show website visitors where they provide delivery services. These are individual pages that include a unique phone number, zip code, city, and state. However, there is no street address (each is just a service area). We wanted to apply schema markup to these landing pages. Our development team successfully applied schema to the phone, state, city, etc., but for the address property alone they entered VIRTUAL LOCATION. This checked out fine in the Google structured data testing tool. Our question is this: can having VIRTUAL LOCATION as the address property be construed as spamming? The landing page provides pertinent information for the end user, but since there is no brick-and-mortar address, I'm trying to determine whether having VIRTUAL LOCATION as the value could be frowned upon by Google. Any insight would be very helpful. Thanks
Local Website Optimization | RosemaryB
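One hedged alternative to a placeholder address (the thread doesn't confirm Google's position on it) is to drop the address property entirely and describe the coverage with areaServed, which schema.org supports on LocalBusiness; all values below are hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Delivery Co. - Springfield",
  "url": "https://www.example.com/springfield-delivery/",
  "telephone": "+1-217-555-0142",
  "areaServed": {
    "@type": "City",
    "name": "Springfield"
  }
}
</script>
```
-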
Can PPC harm SEO results, even if it's off-domain?
Here's the scenario. We're doing SEO for a national franchise business. We have over 60 location pages on the same domain, which we control. Another agency is doing PPC for the same business, except they're sending people to un-indexable landing pages off-domain. Apparently they're also using location extensions that have been set up improperly, at least according to the Account Strategists at Google that we work with.

We're having a real issue with these businesses ranking in the multi-point markets (where they have multiple locations in a city). The client wants all their location landing pages to rank organically for geolocated service queries in those cities (we'll say the query is "fridge repair"). We're trying to tell them that the PPC is having a negative effect on our SEO efforts, even though there shouldn't be any correlation between the two.

I still think the PPC should be focused on their on-domain location landing pages (and so does our Google rep), because it shows consistency of brand, etc. I'm getting a lot of pushback from the client and the other agency, of course. They say it shouldn't matter. Has anyone here run into this? Any ammo to offer up to convince the client that having us work at "cross-purposes" is a bad idea? Thanks so much for any advice!
Local Website Optimization | Treefrog_SEO