Link Building: Location-specific pages
-
Hi! I've technically been a member for a few years, but just recently decided to go Pro (and I gotta say, I'm glad I did!).
Anyway, as I've been researching and analyzing, one thing I noticed a competitor is doing is creating location-specific pages. For example, they've created a page that has a URL similar to this: www.theirdomain.com/seattle-keyword-phrase
They have a few of these for specific cities. They rank well for the city-keyword combo in most cases. Each city-specific page looks the same and the content is close to being the same except that they drop in the "seattle keyword phrase" bit here and there.
I noticed that they link to these pages from their site map page, which, if I were to guess, is how SEs are getting to those pages. I've seen this done before on other sites outside my industry too. So my question is, is this good practice or is it something that should be avoided?
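For illustration, the city-keyword URL pattern described above could be sketched like this. This is a minimal, hypothetical sketch: the `geo_page_path` helper, the city list, and the keyword phrase are all placeholders, not anything from the competitor's actual site.

```python
# Minimal sketch of generating city-specific page slugs in the
# /seattle-keyword-phrase style described above.
# The cities and keyword phrase are hypothetical placeholders.

def geo_page_path(city: str, keyword_phrase: str) -> str:
    """Build a URL path like /seattle-keyword-phrase."""
    parts = city.lower().split() + keyword_phrase.lower().split()
    return "/" + "-".join(parts)

cities = ["Seattle", "Portland", "San Francisco"]
for city in cities:
    print(geo_page_path(city, "homes for sale"))
# /seattle-homes-for-sale
# /portland-homes-for-sale
# /san-francisco-homes-for-sale
```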
-
As stated, having a subdirectory works, but I don't think it gives that much of a benefit over the example you gave. But yes, location and geo targeting with specific pages can be a great strategy. It works well for me, but I'm a local business, so everything I do is defined by location. What you want to avoid is creating pages with duplicate content just to appear local. Simply swapping out location keywords in the content is not going to give you a sustainable advantage. If you are going to create geo-specific pages, make the content unique to that location. This isn't just good for SEO; it's good for selling and converting as well.
-
Subdomains can also turn into a real mess!
-
That's the right bias to have!
-
Ah, I do see what you mean. Thanks for the input. I tend to stay away from subdomains as general practice anyway. My own personal bias as a web designer/dev I think.
-
I agree!
-
Yikes! Who would want to start over with link building to a subdomain!?
-
Angie,
I would have to say this is not a "bad practice." Matt does not say it is bad or spammy, nor does Google. It also really depends on your site structure as to the best way to do this. My site is structured just like this, as are all of my major competitors except for one.
They use subdomains. For example: Seattle.mydomain.com
And I have to tell you, in my opinion, it is not as effective as the way I and many others do it. A good example of what I am saying is in the real estate industry. Go to Google and search "seattle homes for rent" or "seattle homes for sale" and you will see what I am talking about. You will also see that one company uses a subdomain plus a directory to target the location for the user's search. The result looks like this:
washington.theirdomain.com/Seattle. In this instance it does work well, but if you do some searches in other major markets, or just some different terms for this industry, you will see all the big sites have the structure www.theirdomain.com/target-city
And it works well, and has for years. But who knows if Google wakes up in a bad mood tomorrow? Good luck!
-
Glad I could help
-
That. Is. Awesome. Thank you. Somehow I missed that video this summer (I subscribe to those Google Webmaster videos).
-
From the Matt Cutts video I saw earlier: http://www.youtube.com/watch?v=c9vD9KGK7G8&feature=player_embedded
It seems like it would be better to put the geo-specific pages in a subdirectory of your website, and geo-target them with Webmaster Tools. Then you can start building local, relevant links to that page or directory.