Location Pages and Duplicate Content and Doorway Pages, Oh My!
-
Google has this page on location pages. It's very useful, but it doesn't say anything about handling the duplicate content a location page might have, given that the locations may offer very similar services.
Let's say they have example.com/location/boston, example.com/location/chicago, or maybe boston.example.com, chicago.example.com, etc.
Each is a landing page for its location, housing that location's contact information while showing the same services/products as every other location. This information may also live on the main domain's homepage or services page as well.
My initial reaction agrees with this article: http://moz.com/blog/local-landing-pages-guide - but I'm really asking: what does Google expect? Does this location pages guide from Google tell us we don't really have to make sure each of those location pages is unique? Sometimes creating "unique" location pages feels like you're creating **doorway pages** - "Multiple pages on your site with similar content designed to rank for specific queries like city or state names".
In a nutshell, Google's Guidelines seem to have a conflict on this topic:
Location Pages: "Have each location's or branch's information accessible on separate webpages"
Doorway Pages: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names"
Duplicate Content: "If you have many pages that are similar, consider expanding each page or consolidating the pages into one."

Now you could avoid making it a doorway page or a duplicate content page if you just put the location information on a page. Each page would then have a unique address, phone number, email, contact name, etc. But then the page would technically be in violation of this page:
Thin Pages: "One of the most important steps in improving your site's ranking in Google search results is to ensure that it contains plenty of rich information that includes relevant keywords, used appropriately, that indicate the subject matter of your content."
...starting to feel like I'm in a Google Guidelines Paradox!
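As an aside, the "consolidating the pages into one" option from the duplicate content guideline is usually implemented with a rel=canonical tag. A minimal sketch with hypothetical URLs (this only makes sense where pages really are interchangeable; it would be the wrong move for location pages whose unique contact details you want indexed):

```html
<!-- Hypothetical: placed on a near-duplicate page, this signals that the
     main services page is the version Google should index. -->
<link rel="canonical" href="https://example.com/services" />
```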
Do you think this guide from Google means that duplicate content on these pages is acceptable as long as you use that markup? Or do you have another opinion?
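Assuming "that markup" means the structured data Google's location pages guide recommends, a minimal LocalBusiness sketch would look like the following (all values are hypothetical placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co. (Boston)",
  "url": "https://example.com/location/boston",
  "telephone": "+1-617-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Boston",
    "addressRegion": "MA",
    "postalCode": "02110"
  }
}
</script>
```

The markup disambiguates each location for Google; whether it exempts otherwise-duplicate page copy is exactly the question here.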
-
Thanks for the comment Laura!
I was aware that duplicate content wasn't the issue, but it baffled me that this very obvious black-hat tactic wasn't punished by Google in any way, even though their guidelines clearly state doorway pages are a big "no-no".
Let's hope the December 2017 update has a noticeable impact.
Have a nice day!
-
The Panda filter is just that, a filter. It doesn't remove pages from the index, and you won't get a manual penalty because of it.
In the case of duplicate content, Google chooses the most relevant or original version and filters out the duplicates. On the other hand, when a website has many pages with the same content, that can drag down the perceived quality of the entire website, which can hurt search performance as well.
Then there's the issue of doorway pages, which are duplicate pages created for the purpose of funneling visitors to the same destination. This goes against Google's guidelines, and they confirmed a December 2017 algorithm update that affects sites using doorway pages.
-
Hi Laura,
It seems like this age-old black-hat tactic still works, though. Maybe only outside of the US? Check out this SERP: https://www.google.be/search?q=site:trafficonline.be+inurl:seo-&ei=Z0RnWqHED47UwQLs5bkQ&start=0&sa=N&filter=0&biw=1920&bih=960&num=100
You don't have to understand the language to see that these are nearly identical pages, set up purely to rank well for localized terms (city names). Each page has the exact same content but swaps in a few variables so the text isn't literally identical: nearby city names, a Google Maps embed, and even the number of people living in each city (as if that's relevant information for the user). The content itself is really thin and the same for all cities.
The crazy thing is this site ranks well for some city names in combination with their keywords, even though it's very clearly using black-hat SEO tactics (doorway pages) to manipulate rankings for localized search terms. I would think websites that so blatantly violate the Google Guidelines would be completely removed from the search index, but that definitely isn't the case here.
Any thoughts as to why sites like this aren't removed for violating Google's terms and conditions? Or how can I keep telling our clients they can't use black-hat tactics because Google might remove them from the index, when it appears the chance of such a removal is almost non-existent?
Thanks in advance,
Kind regards -
Some great ideas: Content Creation Strategy for Businesses with Multiple Location Pages
-
Yeah, it seems like the most logical answer is that each location page needs unique content developed for it, even though it still feels a little forced.
Goes to show you that Google has really pushed SEO firms to think differently about content, and when you have to do something just for SEO purposes, it now feels icky.
Yes, creating unique content for each location's page can be seen as useful to users, but it feels icky because the user would probably be satisfied with the core content. We're creating unique, location-specific content mostly to please Google... not the user.
For example, what if Walmart came to this same conclusion? Wouldn't it be a little forced if Walmart developed pages for every location with that location's weather, facts about the city, etc.?
Due to its brand, it's able to get away with the thin-content version of location pages: http://www.walmart.com/store/2300/details - they don't even use the markup... but any SEO knows you can't really follow what's working for a giant brand like Walmart.
-
Following on from the comments above, the key thing for our business with these extra landing pages is to remember that fresh, unique content is best.
We have spent a lot of money building extra pages, both for our own websites and for clients, and what we do is have a plan. For example, if we have 30 pages to add, we spread them over a period of weeks or months rather than bashing them all out together. We do everything in a natural, organic manner.
Hope this helps; it's our first post!
-
Welcome to my hell! I have 18 locations. I think it's best practice to have a location page for each location with 100% original content, and plenty of it. Yes, it seems redundant to talk about plumbing in Amherst, plumbing in Westfield, and plumbing in... wherever. Do your best to make the content valuable and original so users will find it helpful. A little local flair goes a long way with potential customers, and it also makes it pretty clear you're not spinning the same article. That said, with Google Local bulk spreadsheet uploads, according to the people I've spoken with at Google, your business description can be word-for-word the same between locations and it won't hurt your rank in the maps/local packs one bit. Hope this helps!
-
These do appear to be contradictory guidelines until you understand what Google is trying to avoid here. Historically, SEOs have tried to rank businesses for geo-specific searches in areas other than where a business is located.
Let's say you run a gardening shop in Atlanta and you have an ecommerce side of the business online. Yes, you want to get walk-in traffic from the metro Atlanta area, but you also want to sell products online to customers all over the country. Ten years ago, you might set up 50 or so pages on your site with the exact same content, with only the city and state switched out. That way you could target keywords like the following:
- gardening supplies in Nashville, TN
- gardening supplies in Houston, TX
- gardening supplies in Seattle, WA
- gardening supplies in San Francisco, CA
- and so on...
That worked well 10 years ago, but the Panda update put a stop to that kind of nonsense. Google understands that someone searching for "gardening supplies in Nashville, TN" is looking for a brick and mortar location in Nashville and not an ecommerce store.
If you have locations in each of those cities, you have a legitimate reason to target the above search queries. On the other hand, you don't want to incur the wrath of Google with duplicate content on your landing pages. That's why the best solution is to create unique content that will appeal to users in that location. Yes, this requires time and possibly money to implement, but it's worth it when customers are streaming through the door at each location.
Check out Bright Local's recent InsideLocal Webinar: Powerful Content Creation Ideas for Local Businesses. They discussed several companies that are doing a great job with local landing page content.