Local area coverage
-
Hi Moz people...
I have a tricky one, and I can't quite decide on the best way of achieving the client's goals.
My client has a local business but also travels. His coverage area is around 50 miles from his base. He also has a few skills, all under one main discipline. When users search for his skill set, they search for a specific skill against where they live, e.g. "skill1 in city A".
What is the best way to concentrate on these locations and make sure he is targeting them? I know that in the old days dynamic pages were created for each location (now black hat?), but I don't think that's the best way forward now. Any help, tips, or even just a 'look at this' would really help me out.
-
Hi Jimbo,
Ryan has highlighted some good ideas for you. I will add:
-
Yes, it's still forbidden to create a Google Places/+Local page for service areas where the business has no physical location. Google's local product hinges on physical location.
-
I would create two sets of pages on the website. One set will contain a page for each of his major services. The other set will contain a city landing page for each of his main service cities. I wrote a pretty well-cited article some months back on the topic of city landing pages that I feel may be of some help to you: The Nitty Gritty of City Landing Pages
-
Then, once you've established the two sets of static pages on the client's site, I would consider following up with any or all of the following:
-
Linkbuilding
-
Blogging to showcase his projects in different towns
-
Social media with a local focus
-
Video marketing with a local focus
Provided the playing field isn't too tough, a path like the one I've described will enable most clients to gain quite a bit of visibility.
-
-
Set the radius on your local listing to a wider area when possible, but Google is really making this type of situation difficult because technically he doesn't have a "physical presence" in all of these places. I would say go ahead and create content around these separate locations, but make sure the content is unique and good. Don't simply copy the same content, swap "city1" for "city2", and call it good. Google has specifically targeted that kind of thing; it's something they do not want in search results. But having great content and information for each of those locations is an excellent idea.
One idea would be to house all of that content under a "service areas" type of page on the site. List each area/town/city he provides service in, with deeper links to unique content for each of those areas. Then make sure to utilize breadcrumbs in the navigation with a clean URL structure, e.g. www.service.com/plumbing/city1. That way the specialty skills become the topical/category top-level pages, with the service areas nested underneath each specialty (hope that makes sense).
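To make that nesting concrete, here is a minimal sketch of the specialty-then-city hierarchy described above, where the breadcrumb trail mirrors the URL path. The domain, service names, and city names are placeholders, not details from this thread:

```python
# Sketch of the nested URL structure: specialty pages at the top level,
# with city service-area pages nested underneath each specialty.
# BASE, services, and cities are illustrative placeholders.

BASE = "https://www.service.com"

services = ["plumbing", "heating"]
cities = ["city1", "city2"]

def service_url(service):
    # Top-level topical/category page, e.g. /plumbing
    return f"{BASE}/{service}"

def city_url(service, city):
    # Service-area page nested under its specialty, e.g. /plumbing/city1
    return f"{BASE}/{service}/{city}"

def breadcrumb(service, city):
    # Breadcrumb trail mirroring the URL hierarchy
    return ["Home", service.title(), city.title()]

for s in services:
    print(service_url(s))
    for c in cities:
        print(city_url(s, c), "|", " > ".join(breadcrumb(s, c)))
```

Each city page under a specialty would then carry its own unique copy, rather than templated text with only the city name swapped.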
Just some ideas that have worked on my end with local. Definitely check out the GetListed resources and utilize those services as much as possible.