Schema markup for a local directory listing and website name
-
Howdy there! Two schema-related questions here:
- Schema markup for local directory
We have a page that lists information for multiple locations in a single, directory-type listing. Each listing links to another page with more in-depth information about that location.
We have seen sites using schema.org LocalBusiness markup for each location listed on the directory page. Examples:
- http://www.yellowpages.com/metairie-la/gold-buyers
- http://yellowpages.superpages.com/listings.jsp?CS=L&MCBP=true&C=plumber%2C+dallas+tx
Both of these validate in Google's structured data testing tool, but what is strange is that the yellowpages.com example uses the URL of a location's profile page as the "name" in the LocalBusiness schema, while superpages.com uses the actual name of the location. Other sites, such as Yelp, have no markup at all for locations on a directory-type page.
We want to stay with schema and are leaning toward the Superpages option. Any opinions on the best route to go with this?
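For illustration, the difference between the two approaches would look roughly like this in JSON-LD (placeholder values only, not copied from either site):

<!-- First approach (as on yellowpages.com): the profile-page URL is used as the "name" -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "http://www.example.com/locations/metairie-la",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Metairie",
    "addressRegion": "LA"
  }
}
</script>

<!-- Second approach (as on superpages.com): the actual business name is used as the "name" -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Gold Buyers",
  "url": "http://www.example.com/locations/metairie-la",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Metairie",
    "addressRegion": "LA"
  }
}
</script>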
- Schema markup for logo and social profiles vs website name.
If you read Google's article on schema markup for your logo and social profiles, it recommends using the @type Organization in the markup:
https://developers.google.com/structured-data/customize/social-profiles
If you then click down the left column on that page to "Show your name in search results," it recommends using the @type WebSite in the markup instead:
https://developers.google.com/structured-data/site-name
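For reference, that site-name article shows markup roughly like this (placeholder values):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "name": "Your WebSite Name",
  "alternateName": "An alternative name for your WebSite",
  "url": "http://www.example.com"
}
</script>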
We want to have markup for the logo, social profiles, and website name. Do we just need to add a second block of WebSite schema alongside what we have for Organization (two sets of markup)? Our concern is that both reference the same home page, so on one page we would be saying we are an organization in one place and a website in another. Does this matter? Will Google still accept the logo and social profile markup if we use the WebSite designation?
Thanks!
-
Thanks Everett!
Makes sense, and the examples from your site were very helpful. We realized that we needed to add G+ and LinkedIn to our "sameAs" list as well.
-
Hello Heahea,
1. I would agree that you should go with the Superpages option: Name = Name of the Location
2. I would go with Organization Schema Type using JSON-LD script, as outlined here:
https://developers.google.com/structured-data/customize/social-profiles
I wouldn't have two schema types on that page. Use Organization and provide the URL using the url property, as defined here:
https://schema.org/Organization
This is how we do it on http://www.GoInflow.com. View the source and look for our JSON-LD script.
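It follows the Organization pattern from Google's article above; a minimal sketch with placeholder values (not our actual details) looks roughly like this:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Your Company Name",
  "url": "http://www.example.com",
  "logo": "http://www.example.com/images/logo.png",
  "sameAs": [
    "http://www.facebook.com/yourcompany",
    "http://www.twitter.com/yourcompany",
    "http://plus.google.com/+yourcompany",
    "http://www.linkedin.com/company/yourcompany"
  ]
}
</script>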