Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo-location). All our stores have a consistent range of products, with variation in the 25% range. I have a few questions.
-
How should we build a sitemap? Since it will be mandatory for a store to be selected, and the flow will be the same for the bot and the user, should we have all products across all stores in the sitemap? We are allowing users to find any product across all stores if they search by product identifier, but they will only be able to see the products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all the stores, or, since it will be geolocated to only one store, will only the content belonging to that store be indexed?
-
We are also allowing customers to search for older products which they might have bought a few years ago and that are not part of our catalogue any more. These products will not appear in the hierarchical journey of the site, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated.
Thanks - Costa
-
-
If you consistently look at the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are many options for this, like blocking the resources that handle the personalization (i.e. the JavaScript file associated with personalization based on history or geo-location), or what was suggested above. Blocking that piece of script in the robots.txt file is less likely to be seen as cloaking.
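For what it's worth, a minimal robots.txt sketch of that blocking approach might look like this (the script path is hypothetical; substitute whatever file actually drives your personalization):

```
# robots.txt
User-agent: *
# Hypothetical path: block only the script that swaps content by
# geo-location/history, so crawlers render the default page instead.
Disallow: /assets/js/geo-personalize.js
```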
All of this begs the question though: If you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode using Chrome (and thus not accept cookies), would I see the same site and product assortments no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if the visitor has a cookie, but the URL should probably never change - and the content shouldn't change by IP if they don't have a cookie.
1. Check IP
2. Embed their location in a cookie
3. Set cookie
4. If the cookie is accepted and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message that says the visitor must accept cookies to get the best experience, but don't make it block any major portion of the content. A rough sketch of this flow is below.
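To make that concrete, here's a minimal sketch in TypeScript/Express, with a hypothetical `lookupStateByIp` helper standing in for a real geo-IP service:

```typescript
// Cookie-first personalization: geo-IP only seeds the cookie; content is
// personalized solely from the cookie, so cookieless visitors (including
// Googlebot) always get the same default page at the same URL.
import express, { Request, Response } from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Hypothetical helper - swap in a real geo-IP lookup service here.
function lookupStateByIp(ip: string): string {
  return "CA"; // placeholder
}

app.get("/products/:sku", (req: Request, res: Response) => {
  const storeState = req.cookies["store_state"];

  if (!storeState) {
    // Steps 1-3: check the IP, embed the location in a cookie, set it...
    res.cookie("store_state", lookupStateByIp(req.ip ?? ""), {
      maxAge: 30 * 24 * 60 * 60 * 1000, // 30 days
    });
    // ...but do NOT personalize this response: same URL, same default content.
    return res.send(`Default page for ${req.params.sku} (no store selected)`);
  }

  // Step 4: the cookie came back, so it was accepted - personalize.
  res.send(`Page for ${req.params.sku} with ${storeState} availability`);
});

app.listen(3000);
```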
-
Thanks for this. A few clarifications, please.
Isn't having a different journey for a user and a bot cloaking? Will Google not penalise a site for that? To make it clear: we have a single website and, based on the geo of the user, we will filter product availability. If a customer is from State A, we will show "X" products, and if a customer is from State B, we will show X+Y or X-Y. All the products will have a canonical URL as part of the sitemap, so even if a product is not navigable through the hierarchy of the website, crawlers will be able to find it through the direct canonical URL.
Here is a link to an article where John Mueller from Google has some comments on the subject: https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked excerpts from your reply where I have some doubts; it would be great if you could throw more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same X products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content per store? We are thinking the content on the product pages will be the same; only the availability of the product will differ based on geo. The sitemap will also remain the same across stores, with the canonical product URLs.
-
- "Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
Could you please explain a bit more about what you mean by redirect? All products will exist on the website for a crawler to see if the canonical URL is used for crawling; only the availability, and the product's visibility through the navigation journey, will change based on geo.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
-
-
"We are a Grocery co-operative retailer and have chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo location). All our stores have a consistent range of products + Variation in 25% range. I have few questions" - make sure you exempt Googlebot's user-agent from your geo-based redirects otherwise the crawling of your site will end up in a big horrible mess
-
"How to build a site-map. Since it will be mandatory for a store to be selected and same flow for the bot and user, should have all products across all stores in the sitemap? we are allowing users to find any products across all stores if they search by product identifier. But, they will be able to see products available in a particular store if go through the hierarchical journey of the website." - any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google ti index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as NON-canonical) it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or Meta no-index directives, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap.The main thing to concern yourself with, is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site which such highly divergent duplication is never going to yield great results, you'll just have to be aware of that from the outset
-
"Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
-
"We are also allowing customers to search for older products which they might have bought few years and that are not part of out catalogue any more. these products will not appear on the online hierarchical journey but, customers will be able to search and find the products . Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps