Structured Data Schema for Local Business
-
Hi
Where should you add 'local business' schema — the home page, the 'About Us' page, the 'Contact Us' page? I presume the page with the address, such as the 'Contact Us' page, but if the address appears on every page (say, in the footer), is it OK to add the address schema to every page?
I know someone who did this and hasn't got any rich snippets out of it, so I presume it's best to focus on one primary page, such as a 'Contact' or 'About' page?
Also:
If your business serves multiple areas, can you add schema for those areas too, or is it only for your primary business address?
For example, if your business address is listed in Wandsworth but you visit and serve customers in Clapham, Balham, and other regions of South West London, is there any way of adding local business structured data to your site for these areas too (to help target local searches in those regions)? Many thanks,
Dan -
ps - any idea how to add schema for the regions served?
For example, my client's business address is in area A, but the customers and regions they serve are in areas B, C, and D, if you see what I mean, and they really want to target local search for those regions (in part by applying region-specific schema), not just the company address.
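Something like this is roughly what I have in mind — I'm guessing at the areaServed property here, and the business name and areas below are just placeholders, so please correct me if this isn't how it's meant to be used:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Ltd",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Wandsworth",
    "addressRegion": "London",
    "addressCountry": "GB"
  },
  "areaServed": [
    { "@type": "Place", "name": "Clapham" },
    { "@type": "Place", "name": "Balham" }
  ]
}
</script>
```

As far as I can tell, areaServed is a legitimate schema.org property on LocalBusiness, but whether Google actually uses it as a local ranking signal is exactly what I'm unsure about.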
Any advice much appreciated!
all best
dan
-
Thanks, Martijn!
-
Hi Dan,
I would definitely put it on the 'About Us' and 'Contact Us' pages. You could also go for all pages if, for example, all of your contact information can be found in the footer of every page.
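For reference, a minimal LocalBusiness JSON-LD sketch you could place on those pages might look like the snippet below — the name, URL, phone, and address are placeholders, so swap in your real NAP details and, where possible, a more specific type than LocalBusiness (e.g. Plumber or Restaurant):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "url": "https://www.example.com/",
  "telephone": "+44 20 7946 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "SW18 1AA",
    "addressCountry": "GB"
  }
}
</script>
```

It's worth validating the markup with Google's Rich Results Test before rolling it out site-wide, and keeping it identical to the NAP details shown visibly on the page.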
Related Questions
-
Proper SEO structure for Franchise/Franchisee websites
Hi Neighbors, Franchise website design and development can be difficult; there's no doubt about it. I had to find the right balance between a unique, unified brand identity and a localized experience that accurately reflects the individual franchisees and their efforts. Because of the many benefits, I have structured the URLs to read domain.com/location, where domain.com is the root domain and /location is a subfolder (the location page). I have also built a customized CMS (e.g. Drupal) and have given each location access to manage their location page (subfolder). To accommodate local SEO, franchisees have complete control over optimizing their location page: title tags, meta descriptions, alt tags, etc. Will any local optimization performed in the subfolder (location page) be stifled because it was done in the subfolder rather than on the root domain?
Local Website Optimization | | Jeffvertus1 -
How to Get 1st Page Google Rankings for a Local Company?
Hi guys, I own a London removal company, Mega Removals, and want to achieve first-page rankings on Google UK for keywords like "removals London", "removal company London", and "house removals London", but have had no success so far. I need professional advice on how to do it. Should I hire an SEO or focus on content? I will be very grateful for your help.
Local Website Optimization | | nanton1 -
Preventing multiple market domains from appearing in the local search results
I'm working with an international client; how would you prevent multiple market domains from appearing in the local search results?
Local Website Optimization | | Cristiana.Solinas0 -
Hreflang errors ("no return tags") in sitemap.xml, and local search landing pages in the wrong language
We really need help. When searching in Google (US), our website returns the global page (keywords: asus / asus zenfone3), and Search Console also reports "no return tags" errors. Another weird thing: when Googlebot crawls our sitemap.xml, it cannot finish the file and processes less than a quarter of it. Can you please advise on what needs to be edited or changed to make sure my implementation is correct and not returning errors?
Local Website Optimization | | June01270 -
Optimizing Local SEO for Two Locations
Hi there! I have a client that has just opened a 2nd location in another state. When optimizing for local, I have a few questions: We're creating a landing page for each location; these will have contact information and ideally some information on each location. Any recommendations for content on these landing pages? The big question is dual-city optimization: should I include the city and state of BOTH locations in all my title tags, or should I leave that to the unique city landing pages? What other on-page optimizations should I consider across the site? Thanks! Jordan
Local Website Optimization | | WorkhorseMKT0 -
Can you recommend any widgets or additions for a local landing page?
Our company has locations in several different cities, and we're in the process of creating landing pages for each city that feature relevant information. We use Drupal, fwiw. In the past, we've talked about trying to include a local weather widget, a news widget, or something similar as a way to help improve our local rankings for each area. Have you used anything like that? What did you find to be effective? Can you recommend anything similar? Thanks!
Local Website Optimization | | ScottImageWorks0 -
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol and strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.

Say we have a restoration service franchise with over 40 franchisees we perform SEO for, all over the US. Every franchise has its own local website, e.g. restorationcompanylosangeles.com. Every franchise purchases territories in which they want to rank; some service over 100 cities. Most franchises also have PPC campaigns, and as part of our strategy we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages', which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence, working with PR and local news companies to build relationships for natural backlinks, and developing a social media strategy for national and local outlets. We use major aggregators to distribute local citations for our branch offices and make sure all NAP is consistent across all citations. We are Google partners, so we work with them on new branches to create their Google listings (My Business & G+). We use local business schema markup on all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, keyword placement, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions; we use several call tracking services to monitor calls, callers' locations, etc., and are testing CallRail to start monitoring the landing pages and keywords generating our leads.

Parts that I want to change: Some of the local sites have over 100 pages targeted for 'water damage + city', aka what Moz would call "doorway pages". These pages have 600-1000 words, all talking about services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages, adding about 100 words about the city location, but that is the only unique variable. We pump out about 10 new local pages a month per site — so yes, over 300 local pages a month — yet traffic to the local sites is very scarce, and the content protocol/strategy is only tested based on ranking! We have a tool that monitors ranking on all domains, but it does not account for mobile, local, or user-based preference searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location, and if you are not seen then you will not get traffic or leads. Ranking for power locations is poor, while less competitive, low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time) and checking rankings every day for about a week to determine whether an experiment was a success.

What I need: an internal duplicate content analyzer, to prove that writing over 400 pages a month about 'water damage + city' IS duplicate content; unique content for the 'power pages' (I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations matter most to us and beef them up); blog content for the non-'power' locations; a new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc.; and a deeper dig into call metrics and their sources.

Now I am at a roadblock, because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires two pages that are the same except for one variable; we'd either noindex these or canonicalize them, and both options rule out testing rankings for the same term. Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
Local Website Optimization | | MilestoneSEO_LA1 -
Which is better for local & national coupons: 1000s of indexed pages per city, or only a few?
Not sure where this belongs. I am developing a coupons site listing local coupons and national coupons (think Valpak + RetailMeNot), eventually in all major cities, and am VERY concerned about how many internal pages to let Google follow for indexing, as it can exceed 10,000 per city. Is there a way to determine the optimal approach to internal paging/indexing BEFORE I actually launch the site? It is about ready except for this darned URL question, which seems critical. For example, can I put in search terms for Google to determine which pages are most worthy of their own indexed URL? I'm a newbie, sort of, so please put the answer in simple terms. I'm one person with limited funds and need the cheapest way to get the best organic results for each city I cover. Is there a generic answer? One SEO firm told me the more variety the better; another told me that simple is better, and to use content on the simple pages to get variety. So confused, I decided to consult the experts here!

Here's the site concept, FOR EACH CITY: The user inputs a location — the main city only (i.e. Houston), 1 of 40 city regions (suburb, etc.), a zip code, a zip-street combo, or a GPS lookup — and a mileage range is defaulted or chosen by the user. After the search area is determined, the user chooses 1 of 6 types of coupon searches: 1. Online shopping with national coupon codes, with a choice of 16 categories (electronics, health, clothes, etc.) and 100 subcategories (computers, skin care products, men's shirts); these are national offers for chains like Kohls, which do not use the user's location at all. 2. Local in-store shopping coupons, with the same 16 categories and 100 subcategories used in #1 (a mom & pop shoe store or a local chain offer); results are within the user's chosen location and range. 3. Local restaurant coupons, with about 60 subcategories (pizza, fast food, sandwiches); results are again within the user's chosen location and range. 4. Local services coupons, with 8 categories (auto repair, activities, etc.) and around 200 subcategories (brakes, miniature golf, etc.); results are within the user's chosen location and range. 5. Local groceries: one page for the main city with coupons.com grocery coupons, listing the main grocery stores in the city; this page does not break down by subregion, zip, etc. 6. Local weekly ad circulars: one page for the main city displaying about 50 main national stores located in that city.

So, what is the best way to handle the URLs indexed for the dynamic searches by location, coupon type, category/subcategory, and business page? The combinations of potential URLs to index are nearly unlimited. Does the user's location matter when he searches for one thing (restaurants) but not for another (Kohls)? If so, how do I know this, and should I tailor indexed URLs to that knowledge? Is there an advantage to having a URL for NATIONAL companies that ties to each main city, e.g. shopping/Kohls vs. shopping/Kohls/Houston or even shopping/Kohls/Houston-suburb? Again, I'm talking about 'follow' links for indexing. I realize I can have Google index just a few main categories and subcategories and not the others, or a few city regions but not all of them, etc., while actually having internal pages for all of them. Is it better to have 10,000 URLs for, say, coupon-type/city-region/subcategory, or just one for the main city (main-city/all-coupons), or something in between? You get the gist. I don't know how to begin to figure out the answers to these kinds of questions, and yet they seem critical to the design of the site.

The competition: sites like Valpak, MoneyMailer, and LocalSaver seem to favor the 'more is better' approach, with coupons/zipcode/category or coupons/bizname/zipcode. But a site like 8coupons.com appears to have no indexing for categories or subcategories at all! They have city-subregion/coupons and individual businesses at bizname/city-subregion, but as far as I can see no city/category or city-subregion/category. And a very popular coupons site in my city only has maincity/coupons, maincity/a-few-categories, and maincity/bizname/coupons. Sorry this is so long, but it seems very complicated to me and I wanted to make the issue as clear as possible. Thanks, couponguy
Local Website Optimization | | couponguy1