Google Places & Multiple Listings
-
Our client used to have a listing in each city, but after updating the addresses they were forever under review. Google said that businesses serving customers at their locations can only list their primary office.
Back when this client had multiple city listings, all addresses but one were UPS boxes. If they change back to "No, all customers come to the business location," can they once again submit a listing for each city using these addresses?
Yes, I realize they are UPS boxes, but they insist on being listed for each city.
-
You are so welcome, Zeke!
-
Thank you, Miriam. Sometimes it's good to have a third party confirm what you already know the correct answer should be. Appreciate it.
-
Hi Zeke,
Oh, clients like these are a handful! Explain, very clearly, to the client that their listings went under review because they broke the rules. What they want to do now is still breaking the rules and could put their one legitimate location's rankings at risk if Google decides they are spamming the index. Don't be vague. Be totally straightforward on this. Show them the guidelines: http://support.google.com/places/bin/answer.py?hl=en&answer=107528
Especially this part:
Business Location: Use a precise, accurate address to describe your business location.
Do not create a listing or place your pin marker at a location where the business does not physically exist. P.O. Boxes are not considered accurate physical locations.
Do not create more than one listing for each business location, either in a single account or multiple accounts.
Businesses that operate in a service area, as opposed to a single location, should not create a listing for every city they service. Businesses that operate in a service area should create one listing for the central office or location and designate service areas. Learn how to add service areas to your listing.
If you don't conduct face-to-face business at your location, you must select "Yes, this business serves customers at their locations" under the "Service Areas and Location Settings" section of your dashboard, and then select the "Do not show my business address on my Maps listing" option.
If the client cannot see that these rules are precisely describing that what they want to do is a violation, my advice is to drop them like a hot potato.
Local SEOs strive to help honest business people - not to abet rule breakers. If your client changes his tune after he sees the guidelines, then you can offer him an alternative, legitimate strategy that would work along these lines:
-
The client may go after true local rankings for his city of location by running a well-optimized website that incorporates important local hooks, by having a single Places listing/Google+ Local Page that follows all the rules, and by building citations for his single, legit address.
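One concrete "local hook" the single legitimate location can carry is structured markup for its name, address, and phone number. Here is a minimal sketch that emits a schema.org LocalBusiness JSON-LD block; every business detail in it is a made-up placeholder, not the client's real data:

```python
import json

# Hypothetical business details, for illustration only.
listing = {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co.",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
    },
}

# Wrap the data in the script tag that would go into the page's HTML.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(listing, indent=2)
print(snippet)
```

The one address marked up here must be the real, physical office — marking up a UPS box would be the same guideline violation in a different costume.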
-
If he is a service-radius-type business (like a plumber, carpet cleaner, chimney sweep) and serves customers at their locations rather than at his location, then he must comply with the hide address rule on his single Places Listing.
-
All of the above goes toward achieving high local rankings within the pinned, lettered blended/local pack of results.
-
Now, to approach the task of ranking well for his service cities (as a plumber, carpet cleaner or lawyer would), he can begin to showcase his work in these other surrounding cities where he is not physically located by creating awesome city landing pages for each. These pages must feature totally unique, first-class copy (no cutting and pasting copy, no thin content). He can create a unique page for each city that he serves.
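A quick way to sanity-check that city pages aren't find-and-replace copies of one another is a text-similarity ratio. A rough sketch using Python's standard library, with invented page copy (real thin-content detection is more involved than this):

```python
from difflib import SequenceMatcher

def too_similar(copy_a, copy_b, threshold=0.8):
    """Rough cut-and-paste check: how much of the text on two city pages overlaps."""
    return SequenceMatcher(None, copy_a, copy_b).ratio() >= threshold

# Hypothetical page copy: the first pair differs only by city name (thin, duplicated),
# while the rewritten page describes genuinely different local work.
springfield = "Our Springfield crew handles burst pipes, slab leaks, and water heater swaps."
shelbyville = "Our Shelbyville crew handles burst pipes, slab leaks, and water heater swaps."
rewritten = "Shelbyville homeowners call us most often for sump pump failures after spring storms."
```

In this sketch, `too_similar(springfield, shelbyville)` flags the find-and-replace pair, while the rewritten copy passes.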
-
He can then work on earning links to these pages to improve their chances of ranking.
-
Unlike the goal of steps 1, 2, and 3, the goal of steps 4 and 5 for his service cities will be organic rankings - not local rankings. Google predominantly views any business as being most relevant to its city of location - not its service cities - so this is vital for the client to understand.
By following the above method, the client will be doing all he can to gain high local rankings for his city-of-location terms and high organic rankings for his service-city terms. This is a completely valid way of working with this type of business model.

Lay out clearly for the client what you can do, and then let him make a decision. If he just won't see the light, walk away...he's going to be living in penalty land until he decides to play by the rules.

In my own work as a Local SEO, I have learned to shoot straight with clients like this one, who spam either because they don't understand the rules, or because they do know the rules and want to bend them for their own perceived benefit. The first type, I have a wonderful opportunity to educate. The second type, I can be quite direct in stating that I only offer guidelines-compliant services. Then, let them decide. Good luck and I hope this helps!
-
Related Questions
-
Google Indexing our site
We have 700 city pages on our site. We submitted a sitemap to Google via https://www.samhillbands.com/sitemaps/locations.xml, but they have only indexed 15 so far. Yes, the content is similar on all of the pages...thoughts on getting them to index the remaining pages?
-
Google Indexing Duplicate URLs : Ignoring Robots & Canonical Tags
Hi Moz Community, We have the following robots.txt rule that should prevent URLs with tracking parameters from being indexed: Disallow: /*?

We have noticed Google has started indexing pages that use tracking parameters. Example below:
http://www.oakfurnitureland.co.uk/furniture/original-rustic-solid-oak-4-drawer-storage-coffee-table/1149.html
http://www.oakfurnitureland.co.uk/furniture/original-rustic-solid-oak-4-drawer-storage-coffee-table/1149.html?ec=affee77a60fe4867

These pages are identified as duplicate content, yet they have the correct canonical tags: https://www.google.co.uk/search?num=100&site=&source=hp&q=site%3Ahttp%3A%2F%2Fwww.oakfurnitureland.co.uk%2Ffurniture%2Foriginal-rustic-solid-oak-4-drawer-storage-coffee-table%2F1149.html&oq=site%3Ahttp%3A%2F%2Fwww.oakfurnitureland.co.uk%2Ffurniture%2Foriginal-rustic-solid-oak-4-drawer-storage-coffee-table%2F1149.html&gs_l=hp.3..0i10j0l9.4201.5461.0.5879.8.8.0.0.0.0.82.376.7.7.0....0...1c.1.58.hp..3.5.268.0.JTW91YEkjh4

With the various affiliate feeds available for our site, we effectively have a duplicate version of every page, due to the tracking query string, that Google seems willing to index, ignoring both robots.txt rules and canonical tags. Can anyone shed any light on the situation?
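For auditing at scale, a small script can compute what the canonical tag on each tracked variant ought to point at. A minimal sketch, assuming the only difference between variants is the query string:

```python
from urllib.parse import urlparse, urlunparse

def canonical_form(url):
    """Drop the query string so every tracking variant collapses to one canonical URL."""
    p = urlparse(url)
    # Rebuild the URL from scheme, host, and path only (no params, query, or fragment).
    return urlunparse((p.scheme, p.netloc, p.path, "", "", ""))

tracked = ("http://www.oakfurnitureland.co.uk/furniture/"
           "original-rustic-solid-oak-4-drawer-storage-coffee-table/1149.html"
           "?ec=affee77a60fe4867")
clean = canonical_form(tracked)
print(clean)
```

Comparing `canonical_form(url)` against the `rel="canonical"` value actually served on each page would confirm whether the tags are in fact correct across the affiliate variants.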
-
Google penalty or what???
Hi, we have a blog site, xxxxxxxxxxx.es, that yesterday disappeared from Google's rankings all of a sudden. It only appears if you search for xxxxxxxxx.es. I have checked Google Webmaster Tools and there are no manual actions, no messages. Also, we don't have many links pointing to this site; Webmaster Tools shows only 319 links. We don't understand what has happened. Never seen something similar. What do you think? Any help would be appreciated. How do you proceed in these cases? It doesn't seem to be a link problem. How do you know what kind of penalty you have? Thank you. Update: Hi, the domain is www.crearcorreoelectronico.es. I have checked Majestic SEO, OSE, and WMT to get the links. We have some links that are not good, but they are automatic ones that some portals generate. Maybe it is something related to the content. I don't know. Thanks
-
Google Fetch Issue
I'm having some problems with what Google is fetching and what it isn't, and I'd like to know why. For example, Google IS fetching a non-existent page but listing it as an error: http://www.gaport.com/carports, but the actual URL is http://www.gaport.com/carports.htm. Google is NOT able to fetch http://www.gaport.com/aluminum/storage-buildings-10x12.htm. It says the page doesn't exist (even though it does), and when I click on the not-found link in Google Fetch, it adds %E2%80%8E to the URL, causing the problem. One theory we have is that this may be some sort of server/hosting problem, but that's only really because we can't figure out what we could have done to cause it. Any insights would be greatly appreciated. Thanks and Happy Holidays! Ruben
-
How is Google crawling and indexing this directory listing?
We have three Directory Listing pages that are being indexed by Google:
http://www.ccisolutions.com/StoreFront/jsp/
http://www.ccisolutions.com/StoreFront/jsp/html/
http://www.ccisolutions.com/StoreFront/jsp/pdf/

How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although the /jsp/html/ and /jsp/pdf/ pages both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why. If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those Directory Listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site?

Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the Directory Listing referenced above, http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this Web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML. This page is indexed in Google and we don't want it to be. But so is the actual page where we intended the content contained in that file to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff. As you can see, this results in duplicate content problems.

Is there a way to disallow Googlebot from crawling that Directory Listing page and, provided that we have this URL in our sitemap: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff, solve the duplicate content issue as a result? For example:
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/

Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
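To reason about what those proposed Disallow lines would and wouldn't block, note that classic robots.txt Disallow rules are path-prefix matches. A sketch of that matching logic against the URLs in question:

```python
# Proposed robots.txt Disallow rules from the question, modeled as path prefixes.
PROPOSED_DISALLOW = [
    "/StoreFront/jsp/",
    "/StoreFront/jsp/html/",
    "/StoreFront/jsp/pdf/",
]

def is_blocked(path):
    """Classic Disallow semantics: block any path that starts with a disallowed prefix."""
    return any(path.startswith(rule) for rule in PROPOSED_DISALLOW)

print(is_blocked("/StoreFront/jsp/html/CCI-SALES-STAFF.HTML"))
print(is_blocked("/StoreFront/category/meet-our-sales-staff"))
```

Under this prefix model, the directory-listing paths are blocked while /StoreFront/category/... is untouched, since it doesn't share the /StoreFront/jsp/ prefix. (Note this sketch only models crawling; blocking a crawl does not by itself remove a URL that is already indexed.)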
-
Okay. So I had the top spot for organic rank (#1 position) above the Places section. Now it has been eliminated and I am in the top spot for just the Places section. What gives?
So I don't just want to be number one in the Places section.
-
Multiple city network
I'm currently setting up a large network, and my original thought was to target keywords via the city, setting up a website with the domain name being that keyword. Now I'm thinking that in the long run that's going to be a massive pain in my ass. I'm thinking what I should do is something along these lines... "www.companyname.com/cityorkeywordhere" Any thoughts? Thanks for the help
-
Where do we place Google plus one button
Does the Google +1 button have to be placed on each page of the website, or just on the home page?