NAP - is lack of consistency in address elements an issue?
-
I've been looking at a local business in London with multiple sites, each with several versions of the same address in their NAP. The addresses themselves are correct; the variation is in the order of the address elements (some omit London, some include it).
For example, one listing puts the postcode after the city district, another before it. Sometimes London is included in the address, though often not (the postal service doesn't include London in their "official" version of the addresses).
So the addresses are never wrong - it's just that the elements are ordered a little differently, and some include London while others don't.
Should I be concerned about this lack of address consistency, or should I try to make the various versions match exactly?
-
Sounds like a good plan, Luke! Good luck with the work, and be sure the calendar is crawlable
-
Hi Luke,
It's a complex topic. I think you'll find this Matt McGee article from SmallBusinessSEM and this one from Marcus Miller at Search Engine Land extremely helpful. Both cover how to optimize multi-location businesses and, very specifically, data consistency - including whether Google pays attention to slight variations like the ones you described in your question, where the addresses are never wrong, just "mixed up a little".
"... for the most part, the algo handles those minor discrepancies well. That being said, you don’t want to tempt fate."
-
Yes, sorry - it needed clarification; I was struggling to describe the issue. What you suggest sounds like a good idea, indeed - I will put a complete NAP only at the top of each of the 8 main landing pages, in Schema, along with a calendar on each landing page linking to the class descriptions. Many thanks for your help with this - much appreciated!
-
Ah, got it, Luke! Thanks for clarifying. It seems to me, then, that what you might need is some kind of a calendar on the main city landing page for each location that links to the different class descriptions. Would this be a way to format 38 different links so that customers can understand them easily and see what's available? Just a thought!
-
Hi Miriam - yes, the 38 pages per location have been created about the services from that specific location (in this case health and fitness classes). The classes are unique to each location in terms of times, tutors, and often type, so there would be a strong contextual relationship between the pages and the location.
So the idea humming around in my brain was whether to do a location-specific footer for each section of the site - with the single address relevant to the content above it in the footer, rather than all 8 business locations consistently in the footer.
I was originally thinking of adding all 8 business addresses consistently to the footer, but thought location-specific addresses might be more user-friendly, and might even help Google understand the locational context.
-
Hi Luke,
Hmm ... that doesn't sound right to me. I may be missing something, but unless these 38 pages for each location have genuinely been created about the location and things relating specifically to it, I would not stick the NAP on there, just for the sake of putting it on a bunch of pages. What you're describing to me sounds like some kind of afterthought.
I also wouldn't change the footer around like that. It could create usability difficulties if it's changing throughout the site. Rather, my preference would be complete NAP only at the top of a single landing page per physical location, and NAP of all 8 businesses consistently in the sitewide footer. And, again, NAP of all 8 on the Contact page. This is what I consider to be the normal structure.
As for what to do with those several hundred pages, are they of really high quality? Are they city-specific or just generic to the business' topic? An example of city-specific might be something like a website for an arborist. He has a page for City A talking about how Dutch Elm Disease has hit that city. For City B, he has a page about birch tree borers that have affected that city's trees. So, from the main city A landing page, he could link to the Dutch Elm piece and for the main city B landing page, he could link to the birch borer page, as additional resources.
But if the content is just generic and you're trying to divvy it up between the cities, if there's not a strong contextual relationship, then there isn't really a good reason for doing so.
-
Hi Miriam,
What I meant is there are 8 business locations, and the site's 300-odd pages are divided between them (so each geographical location has around 38 pages dedicated to that specific location and its services).
So what I was planning to do was simply put the correct location-specific NAP in the footer of each location-specific page (so every page in each location's run of 38 pages would have the relevant single NAP in its footer).
But my co-worker said only put the correct [single] NAP in the footer of the 8 location home(/landing) pages within the site, rather than on every page.
Hope that makes sense [it's been a long week ;-I]
-
(Miriam responding here, but signed into Mozzer Alliance right now)
Hi Luke,
If you mean in the footer, and it's 10 or fewer locations, I'd say it's okay to put the NAP for the 8 businesses there, but not in the main body of the page.
My preferred method would be to put the complete NAP, in Schema, for Location A at the top of City Landing Page A, complete NAP for Location B at the top of City Landing Page B, etc. I would not suggest putting all of this NAP anywhere else on the site but the Contact Page.
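For anyone following along, here's a minimal sketch of that kind of per-location Schema NAP, built here with Python's json module so it's easy to repeat for all 8 locations. The business name, address, phone, and URL below are placeholders, not any real client's details:

```python
import json

# Hypothetical NAP for one location -- placeholders, not real business details.
location_a = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Fitness Studio - Location A",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "London",
        "postalCode": "SW1A 1AA",
        "addressCountry": "GB",
    },
    "telephone": "+44 20 7946 0000",
    "url": "https://www.example.com/location-a/",
}

# Emit the JSON-LD; this would sit inside a <script type="application/ld+json">
# tag at the top of City Landing Page A, with one such block per landing page.
print(json.dumps(location_a, indent=2))
```

The key point is one complete, consistent NAP per city landing page - the same values you use everywhere else that location is listed.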
-
Thanks Miriam - it sure does. Their website is divided up by location, so I'm planning to put the relevant NAP at the bottom of every page throughout the website (8 locations and NAPs in total - 300 pages). A colleague suggested just putting the NAP on each of the 8 location homepages, though I suspect it would help more if the NAP were at the foot of every page (so long as it's the correct NAP on the correct page, ha!). Is that the right thing to do?
-
Hey Luke!
NAP consistency was judged to be the second most influential pack ranking factor on this year's Local Search Ranking Factors (https://moz.com/local-search-ranking-factors) so, yes, it's of major importance! Hope this helps.
-
When it comes to NAP, it should be as close to an exact match as you're able to achieve. Inconsistency in this area - while not the biggest detriment you can have - should be avoided.
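As a rough illustration of why those "mixed up" variants usually survive - this is a sketch of the concept, not how Google's matching actually works - a normalized comparison might tokenize each address and ignore element order and the optional "London":

```python
import re

def normalize_nap(address: str) -> frozenset:
    """Crude normalization: lowercase, strip punctuation, drop the
    optional 'london' element, and ignore token order."""
    tokens = re.findall(r"[a-z0-9]+", address.lower())
    return frozenset(t for t in tokens if t != "london")

# Two hypothetical variants of the same (made-up) address
a = "10 Example Road, Camden, London, NW1 7AA"
b = "10 Example Road, NW1 7AA, Camden"  # postcode before district, no London

print(normalize_nap(a) == normalize_nap(b))  # prints True
```

Minor reordering survives this kind of comparison, but a changed suite number or old phone number would not - which is why you still aim for an exact match rather than tempting fate.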