Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo-location). All our stores have a consistent range of products, with variation in the 25% range. I have a few questions.
-
How do we build a sitemap? Since it will be mandatory for a store to be selected, and the flow will be the same for the bot and the user, should we have all products across all stores in the sitemap? We are allowing users to find any product across all stores if they search by product identifier, but they will only be able to see products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all the stores, or, since it will be geolocated to only one store, will only the content belonging to that one store be indexed?
-
We are also allowing customers to search for older products which they might have bought a few years ago and that are not part of our catalogue any more. These products will not appear in the online hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated.
Thanks - Costa
-
If you consistently detect the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are many ways to do this, like blocking the resources that handle the personalization (i.e. the JavaScript file associated with personalization based on history or geo-location), or what was suggested above. Blocking that piece of script in the robots.txt file is less likely to be seen as cloaking.
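For example, if the geo-personalization logic lives in its own script file, a robots.txt rule along these lines would keep crawlers from fetching it (the path here is purely hypothetical; point it at wherever your script actually lives):

```
# robots.txt — keep crawlers from fetching the personalization script
# (illustrative path; substitute your site's real script location)
User-agent: *
Disallow: /js/geo-personalize.js
```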
All of this raises the question, though: if you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode using Chrome (and thus not keep cookies), would I see the same site and product assortments no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the visitor has a cookie, but the URL should probably never change, and the content shouldn't change by IP if they don't have a cookie.
1. Check IP
2. Embed their location in a cookie
3. Set cookie
4. If the cookie is accepted and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message that says you must accept cookies to get the best experience, but don't make it block any major portion of the content. A rough sketch of this flow is below.
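Here is one way that flow could look as Express-style middleware. This is only a sketch: the `lookupNearestStore` function, cookie name, and store identifiers are all hypothetical stand-ins for whatever geo-IP service and naming your stack actually uses.

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Hypothetical IP-to-store lookup; swap in a real geo-IP service.
function lookupNearestStore(ip: string): string {
  return "store-42"; // placeholder store identifier
}

app.use((req, res, next) => {
  const store = req.cookies["store"];
  if (store) {
    // Step 4: the cookie came back, so the browser accepted it — personalize.
    res.locals.store = store;
  } else {
    // Steps 1–3: check the IP, derive the location, set the cookie.
    // This response itself stays un-personalized, so cookieless visitors
    // (and Googlebot, which doesn't keep cookies) see the default catalogue.
    res.cookie("store", lookupNearestStore(req.ip ?? ""), {
      maxAge: 30 * 24 * 60 * 60 * 1000, // 30 days
    });
    res.locals.store = null;
  }
  next();
});
```

The key property of this design is that the URL and the default content never change; only the cookie, once accepted, narrows what is shown.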
-
Thanks for this. A few clarifications, please:
Isn't having a different journey for a user and a bot cloaking? Will Google not penalise a site for that? To make it clear: we have a single website, and based on the geo of the user, we will filter product availability. If a customer is from state A, we will show "X" products, and if a customer is from state B, we will show X+Y or X-Y. All the products will have a canonical URL as part of the sitemap, so even if a product is not navigable through the hierarchy on the website, crawlers will be able to find it through the direct canonical URL.
Here is a link to an article where John Mueller from Google has some comments on the subject: https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked excerpts from your reply where I have some doubts; it would be great if you could shed more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same (X) products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content by store? We are thinking the content on the product pages will be the same, but the availability of the product alone will differ based on geo. The sitemap will also remain the same across stores, with the canonical product URLs.
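To illustrate the setup being described: every store's rendering of a given product would carry the same canonical tag, something like the following (the URL is invented for the example):

```html
<!-- Served on every store's view of this product, regardless of geo -->
<link rel="canonical" href="https://www.example.com/products/organic-oats-1kg" />
```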
-
- "Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
Could you please explain a bit more what you mean by redirect? All products will exist on the website for a crawler to see if the canonical URL is used for crawling; only the availability, and the product's visibility through the navigation journey, will change based on geo.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
"We are a Grocery co-operative retailer and have chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo location). All our stores have a consistent range of products + Variation in 25% range. I have few questions" - make sure you exempt Googlebot's user-agent from your geo-based redirects otherwise the crawling of your site will end up in a big horrible mess
-
"How to build a site-map. Since it will be mandatory for a store to be selected and same flow for the bot and user, should have all products across all stores in the sitemap? we are allowing users to find any products across all stores if they search by product identifier. But, they will be able to see products available in a particular store if go through the hierarchical journey of the website." - any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google ti index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as NON-canonical) it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or Meta no-index directives, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap.The main thing to concern yourself with, is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site which such highly divergent duplication is never going to yield great results, you'll just have to be aware of that from the outset
-
"Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
-
"We are also allowing customers to search for older products which they might have bought few years and that are not part of out catalogue any more. these products will not appear on the online hierarchical journey but, customers will be able to search and find the products . Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps