Schema Address Question
-
I have a local business with a contact page that I want to add schema markup to. However, I was wondering if having the address with schema info on the contact page instead of the home page has any adverse effects on the rich snippet showing up in search.
There's no logical place to add schema for a local business on the home page, so having it on the contact page—not in the footer or sidebar—is the only option.
-
Hi Dan, so glad to help. Do hang onto the idea of putting schema-encoded NAP in your footer - it's kind of Local SEO 101 to do so. If, for some reason, it would be easier to put it up in your masthead or in a sitewide nav bar, that would be a good substitute.
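For reference, a minimal sketch of what schema-encoded NAP in a footer might look like. JSON-LD is one accepted format (microdata works too), and every detail below is a placeholder, not your actual business info:

```html
<!-- Hypothetical example: name, address, and phone are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-555-0100"
}
</script>
```

Because it's a self-contained script block, the same snippet works equally well in a footer, masthead, or contact page template.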
-
Thanks for the reply, Miriam. I'm glad to know it doesn't have to be on the home page. It's possible for me to put it in the footer, but it would be more hassle than it's worth, so I wanted to make it clear for answering purposes that I didn't want to do that.
-
Hi Dan, Rich snippets come and go in the SERPs, so I don't have a simple answer to that, but it is definitely a best Local SEO practice to add your complete NAP (name, address, phone number) as the very first thing on a Contact Us page. You do not have to have it on your home page. This will not hurt you in any way. In addition to this, it's a good idea to place the same in your footer, sitewide, but you seem to be indicating that you have no way to do this. Why are you unable to do so? I'm curious.
Related Questions
-
Fundamental HTTP to HTTPS Redirect Question
Hi All, I'm planning an HTTP to HTTPS migration for a site with over 500 pages. The site content and structure will stay the same; this is simply an HTTPS migration. Can I just confirm the answer to this fundamental question? From my reading, I do not need to create a 301 redirect for each and every page, but can add a single generic redirect so that all HTTP references are redirected to HTTPS. Can I just double-check that this would suffice to preserve existing Google rankings? Many thanks
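Assuming an Apache server with mod_rewrite enabled, the usual single-rule approach looks like this (a sketch, not tested against your setup); it 301s every HTTP URL to its HTTPS equivalent, so no per-page rules are needed:

```apacheconf
# .htaccess - redirect all HTTP requests to HTTPS,
# preserving the hostname, path, and query string
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

On nginx or IIS the equivalent is a single server-level redirect rather than an .htaccess rule.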
Technical SEO | ruislip180
302 Redirect Question
After running a site crawl, I found two 302 redirects. The two redirects go from site.com to www.site.com and from site.com/products to www.site.com/products. How do I fix the 302 redirect and change it to a 301 redirect? I have no clue where to start. Thanks.
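If the site runs on Apache, a stray 302 at the root is often a default or misconfigured redirect rule; a hypothetical .htaccess sketch that forces the www hostname with an explicit 301 instead (using the site.com placeholder from the question):

```apacheconf
# .htaccess - force the www hostname with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule ^(.*)$ http://www.site.com/$1 [L,R=301]
```

The first step, though, is finding where the existing 302 is issued (host control panel, CMS plugin, or server config), since a new rule on top of an old one can create a redirect chain.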
Technical SEO | Ryan_1320
Easy Question: regarding noindex meta tag vs robots.txt
This seems like a dumb question, but I'm not sure what the answer is. I have an ecommerce client who has a couple of subdirectories, "gallery" and "blog". Neither directory gets a lot of traffic or really turns into many conversions, so I want to remove the pages so they don't drain my page rank from more important pages. Does this sound like a good idea? I was thinking of either disallowing the folders via the robots.txt file, adding a "noindex" tag, 301 redirecting, or deleting them. Can you help me determine which is best?

**DEINDEX:** As I understand it, the noindex meta tag will still allow the robots to crawl the pages, but they won't be indexed. The supposed good news is that it still allows link juice to be passed through. This seems like a bad thing to me because I don't want to waste my link juice passing to these pages. The idea is to keep my page rank from being diluted on these pages. A similar question: if page rank is finite, does Google still treat these pages as part of the site even if it's not indexing them? If I do deindex these pages, I think there are quite a few internal links to them. Even though these pages are deindexed, they still exist, so it's not as if the site would return a 404, right?

**ROBOTS.TXT:** As I understand it, this will keep the robots from crawling the page, so it won't be indexed and the link juice won't pass. I don't want to waste page rank which links to these pages, so is this a bad option?

**301 REDIRECT:** What if I just 301 redirect all these pages back to the homepage? Is this an easy answer? Part of the problem with this solution is that I'm not sure if it's permanent, but even more importantly, 80% of the site is currently made up of blog and gallery pages, and I think it would be strange to have the vast majority of the site 301 redirecting to the home page. What do you think?

**DELETE PAGES:** Maybe I could just delete all the pages. This will keep the pages from taking link juice and will deindex them, but I think there are quite a few internal links to these pages. How would you find all the internal links that point to these pages? There are hundreds of them.
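For reference, the two non-destructive mechanisms being weighed here look like this (directory names taken from the question; which one fits is exactly the judgment call being asked about):

```
# robots.txt - blocks crawling entirely; blocked pages cannot
# pass link equity because their links are never seen
User-agent: *
Disallow: /gallery/
Disallow: /blog/

<!-- noindex meta tag, placed in each page's <head>: the page is
     still crawled and its links can still be followed, but the
     page itself is dropped from the index -->
<meta name="robots" content="noindex, follow">
```

Note the two don't combine well: if robots.txt blocks a page, crawlers never fetch it, so they never see its noindex tag.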
Technical SEO | Santaur0
Google Enterprise Search Questions
Hi Everybody, A client has asked me to take a look at Google Enterprise Search for them. It has been a few years since I last fooled around with implementing a Google search box on a website, and that was the free version, which included off-site results. This appears to be the main page describing the paid product: http://www.google.com/enterprise/search/ I have three questions:

1. The search testing function on the above page doesn't seem to be working. I'm typing in a URL and search term, as prompted, and the page is simply refreshing. It never provides me an example set of results. Is it working for you?
2. This client has a moderately large e-commerce site (about 200 products). Have you implemented Google Enterprise Search on such a site, and are you happy with its performance? The goal here is to let users search for a topic and be returned both product and informational pages. How well does this tool do this?
3. Am I going to need to know any special types of coding (beyond HTML/CSS) to implement this? If so, what are they?

If you have experience with this product, I would surely appreciate your feedback. Thank you!
Technical SEO | MiriamEllis0
Can Google Read schema.org markup within Ajax?
Hi All, as a local business directory, we also display opening hours on a business listing page, e.g. http://www.goudengids.be/napoli-kontich-2550/
Technical SEO | TruvoDirectories
At the same time, I also have schema.org markup for opening hours implemented. But, for technical reasons (performance), the opening hours (and the markup alongside them) are displayed using AJAX. I'm wondering if Google is able to read the markup. The Rich Snippets testing tool and markup plugins like Semantic Inspector can't "see" the markup for opening hours. Any advice here?
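If changing the load order isn't an option, one workaround is to have the same script that performs the AJAX call inject the opening hours as a JSON-LD block once the data arrives, since JSON-LD doesn't need to wrap the visible HTML. A hypothetical sketch (the function name and business details are made up, not taken from your page, and whether a given crawler executes the script is exactly the open question here):

```javascript
// Hypothetical helper: build a JSON-LD string for opening hours.
// Business name and hours are placeholders.
function buildOpeningHoursJsonLd(name, hours) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": name,
    "openingHours": hours // e.g. ["Mo-Fr 09:00-18:00", "Sa 10:00-14:00"]
  });
}

// In the browser, append the script tag once the AJAX response arrives.
// (Guarded so the snippet also runs outside a browser environment.)
if (typeof document !== "undefined") {
  const tag = document.createElement("script");
  tag.type = "application/ld+json";
  tag.textContent = buildOpeningHoursJsonLd("Napoli", ["Mo-Fr 09:00-18:00"]);
  document.head.appendChild(tag);
}
```

The testing tools you mention fetch raw HTML, so they will still miss script-injected markup even when a JavaScript-executing crawler would pick it up.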
Are there SEO implications to blocking foreign IP addresses?
We are dealing with a foreign company that has completely ripped off our entire site template, design, and branding. This is such an inconvenience, and we've had similar things happen enough in the past that we're considering blocking large ranges of IP addresses from accessing our site via .htaccess. Is this something that will potentially cause problems with search engine bots crawling or indexing our site? We are in the US and our site is hosted in the US, but I'm not sure if the major search engines could potentially be using foreign-based bots. Looking for any insight on this or any other potential SEO problems to consider. Thanks
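For what it's worth, an IP-range block in .htaccess looks like this (a sketch in Apache 2.2 syntax; the ranges below are documentation placeholders, not real offender IPs). Googlebot crawls mostly from US-based IPs, so US-hosted blocks of foreign ranges generally shouldn't affect it, but bots like Yandex or Baidu crawl from their home regions and would be cut off:

```apacheconf
# .htaccess (Apache 2.2 syntax; Apache 2.4 uses "Require not ip" instead)
# Deny placeholder IP ranges - replace with the ranges you actually need
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24
Deny from 198.51.100.0/24
```

Checking your server logs for which bot user agents arrive from the ranges you plan to block is a sensible sanity check before deploying.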
Technical SEO | ZeeCreative0
How to add a business address in local directories for a consistent NAP
Hi Mozers, I keep puzzling over this one! I work from home and really don't want to plaster my address all over the web. The GP page now allows me to hide my exact location, which is great. However, as far as I can see, this is not the case with all the potential local directories and listings. I have been trying to get around this by omitting my house number and the last digit and two characters of my postcode. So far this has been allowed by the local listings I have signed up with. When I tried doing as recommended by the excellent Miriam and checking my business name with 'Getlisted', I found that I could only see these local listings if I added the doctored address, i.e. no house number or full postcode. My question, finally, is: if I continue in this fashion for businesses based at home addresses, am I going to confuse the search engines? I want to provide a consistent NAP, but GPP insists that I add a full postcode. The only way I can possibly see around this is to add street name, city, and full postcode, and omit the house name/number. Would this be a reasonable workaround to maintain client confidentiality and satisfy the NAP requirement of local search?
Technical SEO | catherine-2793880
Microformats & Schema.org query
Just finished watching the Microformats & Schema.org webinar (thanks for a good presentation, Richard) and picked up some interesting tips. It got me thinking about ways I could use them with a couple of ecommerce sites I am working on. At present there are no reviews on the page, so I cannot add that tag; however, the product pages have a Facebook Like and a Tweet option, so maybe I could add a tag based around that? Another one I am considering is putting the sizes of the items in one tag, 'available in sizes 12-32' for example, as women often ponder whether a store will have it in their size. I guess my question is, would these ways of using it be considered too spammy? I note the webinar states that using microformats can be useful but there is a risk if they are too spammy. Any opinions would be most welcome, Carl
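For illustration, the size idea could be expressed in markup along these lines. This is a hypothetical microdata sketch with placeholder product details, using schema.org's Product `size` property (a later addition to the vocabulary), and whether this crosses into spammy territory is exactly the question above:

```html
<!-- Hypothetical sketch - product name and size range are placeholders -->
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Wrap Dress</span>
  <span itemprop="size">Available in sizes 12-32</span>
</div>
```

Markup that describes a real, visible attribute of the product is generally on safer ground than markup added purely to trigger a rich snippet.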
Technical SEO | Grumpy_Carl0