Link Building: Location-specific pages
-
Hi! I've technically been a member for a few years, but just recently decided to go Pro (and I gotta say, I'm glad I did!).
Anyway, as I've been researching and analyzing, one thing I noticed a competitor is doing is creating location-specific pages. For example, they've created a page that has a URL similar to this: www.theirdomain.com/seattle-keyword-phrase
They have a few of these for specific cities. They rank well for the city-keyword combo in most cases. Each city-specific page looks the same and the content is close to being the same except that they drop in the "seattle keyword phrase" bit here and there.
I noticed that they link to these pages from their site map page, which, if I were to guess, is how search engines are getting to those pages. I've seen this done before on other sites outside my industry too. So my question is: is this good practice, or is it something that should be avoided?
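For the record, the pattern I'm describing — one templated URL per city, all collected on a single sitemap page for crawlers to find — looks roughly like this sketch (the domain, cities, and keyword phrase here are made-up placeholders, not the competitor's actual values):

```python
# Sketch of the pattern described above: one location-specific URL per
# city, all gathered onto one HTML sitemap page so crawlers discover them.
# Domain, city list, and keyword phrase are hypothetical placeholders.

CITIES = ["seattle", "portland", "tacoma", "spokane"]
KEYWORD_PHRASE = "keyword-phrase"
DOMAIN = "www.theirdomain.com"

def city_page_urls(cities, domain=DOMAIN, phrase=KEYWORD_PHRASE):
    """Build one location-specific URL per city, e.g. /seattle-keyword-phrase."""
    return [f"https://{domain}/{city}-{phrase}" for city in cities]

def sitemap_links(urls):
    """Render the URLs as a plain HTML link list, as on a sitemap page."""
    items = "\n".join(f'  <li><a href="{u}">{u}</a></li>' for u in urls)
    return f"<ul>\n{items}\n</ul>"

urls = city_page_urls(CITIES)
print(sitemap_links(urls))
```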
-
As stated, having a subdirectory works, but I don't think it gives much of a benefit over the example you gave. But yes, location and geo-targeting with specific pages can be a great strategy. It works well for me, but I'm a local business, so everything I do is defined by location. What you want to avoid is creating pages with duplicate content just to appear local. Simply swapping out location keywords in the content is not going to give you a sustainable advantage. If you are going to create geo-specific pages, then make the content unique to each location. This isn't just good for SEO; it's good for selling and converting as well.
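To put a number on "close to being the same": one quick way to check whether two city pages are really just keyword swaps is to mask out the city names and compare what's left. Here's a minimal sketch using only Python's standard library (the page copy is invented for illustration):

```python
import difflib
import re

def similarity_without_cities(text_a, text_b, cities):
    """Compare two pages after masking city names, so a pure
    find-and-replace template scores near 1.0."""
    pattern = re.compile("|".join(map(re.escape, cities)), re.IGNORECASE)
    stripped_a = pattern.sub("CITY", text_a)
    stripped_b = pattern.sub("CITY", text_b)
    return difflib.SequenceMatcher(None, stripped_a, stripped_b).ratio()

# Hypothetical templated copy: only the city name changes between pages.
seattle = "Our Seattle plumbers serve the whole Seattle metro area."
portland = "Our Portland plumbers serve the whole Portland metro area."
score = similarity_without_cities(seattle, portland, ["Seattle", "Portland"])
print(f"similarity: {score:.2f}")
```

A score at or near 1.0 flags the kind of keyword-swap pages described above; genuinely unique location content will score much lower.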
-
Subdomains can also turn into a real mess!
-
That's the right bias to have!
-
Ah, I do see what you mean. Thanks for the input. I tend to stay away from subdomains as general practice anyway. My own personal bias as a web designer/dev I think.
-
I agree!
-
Yikes! Who would want to start over with link building to a subdomain!?
-
Angie,
I would have to say this is not a "bad practice." Matt doesn't say it is bad or spammy, and neither does Google. It also really depends on your site structure as to the best way to do this. My site is structured just like this, as are all of my major competitors except one.
They use subdomains, for example: Seattle.mydomain.com
And I have to tell you, in my opinion it is not as effective as the way I and many others do it. A good example of what I am saying is the real estate industry. Go to Google and search "seattle homes for rent" or "seattle homes for sale" and you will see what I am talking about. You will also see that one company uses a subdomain plus a directory to target the location in the user's search. The result looks like this:
washington.theirdomain.com/Seattle. In this instance it does work well, but if you do some searches in other major markets, or just some different terms for this industry, you will see all the big sites use the structure www.theirdomain.com/target-city
And it works well, and has for years. But who knows if Google wakes up tomorrow in a bad mood or not? Good luck!
-
Glad I could help
-
That. Is. Awesome. Thank you. Somehow I missed that video this summer (I subscribe to those Google Webmaster videos).
-
From the Matt Cutts video I saw earlier: http://www.youtube.com/watch?v=c9vD9KGK7G8&feature=player_embedded
It seems like it would be better to put the geo-specific pages in a subdirectory of your website and geo-target it with Webmaster Tools. Then you can start building local, relevant links to that page or directory.