SEO For Local Searches
-
I run a driving school of over 100 instructors in the UK. We cover around 60 different areas.
My homepage www.driveJohnsons.co.uk is optimised mainly for the 'driving lessons' and 'driving school' search terms.
My area pages are optimised for the same terms with the area included, e.g. Driving Lessons Birmingham or Driving Lessons Leeds.
I've seen a drop in many areas...
I've cleaned up my incoming links using the disavow tool and built up more relevant links from the same industry as mine.
The question I have is: should I change the URLs for my area pages from www.driveJohnsons.co.uk/driving-lessons-leeds to www.driveJohnsons.co.uk/leeds?
I've been told that stuffing the URL with keywords for an area actually dilutes the strength of my homepage and all the other area pages.
At the moment I have 60 area pages with: www.drivejohnsons.co.uk/driving-lessons-area
It used to work a treat, but I've started seeing some companies change their URLs to /area, dropping the driving-lessons part.
If I make this change, I'm either going to have to bite the bullet and build up links for those areas again, or set up a redirect for each area.
I've added most areas to Google Places and I've added a Google Map to many of the area pages too.
If anyone knows a bit more, please let me know...
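On the redirect option mentioned above: with 60 areas, the per-area redirects can be generated rather than written by hand. The sketch below is purely illustrative (the area list is an assumption, and it emits Apache mod_alias `Redirect 301` lines as one possible format); it simply maps each old /driving-lessons-AREA URL to the new /AREA one:

```python
# Illustrative sketch: build one 301 redirect rule per area page,
# assuming old URLs of the form /driving-lessons-<area> and new
# URLs of the form /<area>. The area list here is a made-up example.
areas = ["leeds", "birmingham", "manchester"]

def redirect_rules(areas):
    """Return Apache mod_alias 'Redirect 301' lines, one per area."""
    return [f"Redirect 301 /driving-lessons-{area} /{area}" for area in areas]

for rule in redirect_rules(areas):
    print(rule)
```

If the URL change goes ahead, a 301 (permanent) redirect is the usual choice here, since it signals to search engines that the old URL's link equity should pass to the new one rather than starting the link building from scratch.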
-
Hi Anthony,
So glad the resource helped!
-
Hi Miriam...
I found the local listings article very interesting, quite frightening in fact. It all makes sense, and you've put a lot of things back into perspective.
-
Unfortunately, these kinds of pages, regardless of the URL, can present problems for organic SEO (even when they have some local SEO benefit). If the 60 pages are basically cookie-cutter (the same content except for the city/region), then this is the kind of thin content that can cause problems with Panda. The keyword-loaded URL might marginally increase the problem, but I think the risk is real either way.
As Miriam said, if it's a few locations, it's not usually a big deal. Hundreds is definitely a risk. Sixty is a bit borderline, IMO. If your site had 1,000 indexed pages and 60 of them were local pages, it would probably be no big deal. If you have 100 pages total and 60 are local, then I'd be concerned. There's no easy solution. Either you (1) focus the regions and pare the list down a bit, or (2) work to create more unique content on each of these pages and make sure they don't look thin.
-
Hi Anthony,
Back again. You might like to check out the good examples of multi-location businesses in this article from Local U:
http://localu.org/blog/designing-business-location-website-pages-part-2-multiple-location-business/
I'm still pretty much sticking to my original suggested URL structure, and I think these examples may be useful in your planning.
-
Hi Anthony,
You've raised an interesting point. I've asked some of our traditional SEO experts if they would weigh in on this with you. My clients typically have just a handful of locations, in which case it's been a no-brainer for me to go with the /service-city URL structure, but I think what you're asking is a valid question. It might still be the best choice to go with this structure, but if you're really concerned about it, you could go with just a /city URL structure for the office landing pages. On the other hand, the URL is only one step in your optimization work. Even if you did just go with the /city structure, wouldn't you be optimizing the tags and text of the pages with the core service phrase? Another thought, too, would be to go with /business-name-city, but when one considers that many business names may contain the core service phrase (Superior Driving Lessons, for example), this brings us back to square one.
When things become messy like this, I try to step back and ask myself if what I'm doing is natural. In this case, I think having pages on your website that specify that driving lessons are offered in X city is totally natural. It's not like you're trying to game anything with explaining this. You're giving an honest representation of what the business does and where it does it. Sometimes, I can over-think things about my clients, in which case, coming back to what is natural and honest can often provide a guiding light.
As I've said, I think your question is worthy of an answer. I've shared my thinking on this, but I really hope you'll get feedback from some of our other staff on this as I believe several heads may be better than one in hashing out the technical specifics of this.
-
Hi Miriam,
I've got physical addresses for the areas. A Google postcard has been sent to each area to authorise it.
I'm not bothered if I rank on the first page locally or organically.
My concern with the URL was: if I have 60 URLs saying /driving-lessons-AREA,
would that not dilute each area's strength, because of the heavy use of 'driving lessons', and also dilute my homepage optimisation, which targets 'driving lessons' for the whole of the UK?
As for links, I had a good clean-up around 2 months ago, but it seems the disavow tool takes time, as these horrible links are still present in the 'links to your site' section of my Webmaster Tools.
It's weird: I have another website that is spammed to hell, with bad links and poor content, and that sits on the first page of the local listings, and I've done no work on it since the Penguin update, as there was too much to do and I had other priorities.
All I've tried to do with my main site is follow good SEO practice.
So you think the driving-lessons in every area page URL shouldn't make a difference?
-
Hi Anthony,
Are you saying that your business has a physical office in each of these cities? I am assuming this is so, as you are only allowed to create Google+ Local pages for physical offices. If there is some chance that you've done so, lacking physical offices, then you could expect Google to remove these listings if they become aware that they don't represent physical offices.
I see no problem with your URLs. I'm curious as to what you read. Those look like perfectly fine URLs for local landing pages to me.
Are you aware that there is currently a shakeup going on in Google's local results? It's possible that this could account for any fluctuations you are seeing.
If not, it sounds like you may have had some link problems in the past. Is there any chance that you might have run afoul of the Google Places Quality Guidelines in some way? Here's a link to them: https://support.google.com/places/answer/107528?hl=en
Ranking fluctuations happen in Local. Sometimes they are caused by tweaks to Google's algo. Other times, they occur when you are surpassed by a competitor's efforts. And, in some cases, a business drops because it has engaged in bad practices. Consider these three scenarios and see which one fits your business most closely. Hope this helps.