Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Location Pages and Duplicate Content and Doorway Pages, Oh My!
-
Google has this page on location pages. It's very useful, but it doesn't say anything about handling the duplicate content a location page might have, seeing as the locations may offer very similar services.
Let's say they have example.com/location/boston and example.com/location/chicago, or maybe boston.example.com and chicago.example.com, etc.
They are landing pages for each location, housing that location's contact information and showing the same services/products as every other location. This information may also live on the main domain's homepage or services page.
My initial reaction agrees with this article: http://moz.com/blog/local-landing-pages-guide - but I'm really asking: what does Google expect? Does this location pages guide from Google tell us we don't really have to make each of those location pages unique? Sometimes creating "unique" location pages feels like you're creating **doorway pages** - "Multiple pages on your site with similar content designed to rank for specific queries like city or state names".
In a nutshell, Google's Guidelines seem to have a conflict on this topic:
Location Pages: "Have each location's or branch's information accessible on separate webpages"
Doorway Pages: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names"
Duplicate Content: "If you have many pages that are similar, consider expanding each page or consolidating the pages into one."
Now, you could avoid making it a doorway page or a duplicate-content page if you just put the location information on each page. Each page would then have a unique address, phone number, email, contact name, etc. But then the page would technically be in violation of this guideline:
Thin Pages: "One of the most important steps in improving your site's ranking in Google search results is to ensure that it contains plenty of rich information that includes relevant keywords, used appropriately, that indicate the subject matter of your content."
...starting to feel like I'm in a Google Guidelines Paradox!
Do you think this guide from Google means that duplicate content on these pages is acceptable as long as you use that markup? Or do you have another opinion?
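For context, the markup that Google's location pages guide points to is LocalBusiness structured data. A minimal sketch of what one location page might carry - all names, URLs, and details here are hypothetical, not from Google's guide:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co - Boston",
  "url": "https://example.com/location/boston",
  "telephone": "+1-617-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Boston",
    "addressRegion": "MA",
    "postalCode": "02110",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Fr 09:00-17:00"
}
```

The point of the markup is that the address and phone number differ per location even when the service descriptions don't, which is exactly the tension the question is about.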
-
Thanks for the comment Laura!
I was aware that duplicate content wasn't the issue, but it baffled me that this very obvious black-hat tactic wasn't punished by Google in any way, even though their guidelines clearly state that doorway pages are a big "no-no".
Let's hope the December 2017 update has a noticeable impact.
Have a nice day!
-
The Panda filter is just that, a filter. It doesn't remove pages from the index, and you won't get a manual penalty because of it.
In the case of duplicate content, Google chooses the most relevant or original content and filters out the duplicates. On the other hand, when a website has multiple pages with the same content, that can affect the overall quality of the entire website. This can affect search performance as well.
Then there's the issue of doorway pages, which are duplicate pages created for the purpose of funneling visitors to the same destination. This goes against Google's guidelines, and they confirmed a December 2017 algorithm update that affects sites using doorway pages.
-
Hi Laura,
It seems like this age-old black-hat tactic still works though. Maybe only outside of the US? Check out this SERP: https://www.google.be/search?q=site:trafficonline.be+inurl:seo-&ei=Z0RnWqHED47UwQLs5bkQ&start=0&sa=N&filter=0&biw=1920&bih=960&num=100
You don't have to understand the language to see that these are almost identical pages, set up purely to rank well for localized terms (city names). Each page has the same content but swaps in a few variables so the text isn't exactly the same: nearby city names, a Google Map embed, and even the number of people living in the city (as if that's relevant information for the user). The content itself is really thin and the same for all cities.
The crazy thing is this site ranks well for some city names in combination with their keywords, even though it's very clearly using black-hat SEO tactics (doorway pages) to manipulate rankings for localized search terms. I would think websites that so blatantly violate Google's guidelines would be completely removed from the search index, but that definitely isn't the case here.
Any thoughts as to why sites like this aren't removed for violating Google's terms and conditions? Or how I could keep telling our clients they can't use black hat tactics because Google might remove them from the index, even though it appears the chance of such a removal is almost non-existent?
Thanks in advance,
Kind regards -
Some great ideas: Content Creation Strategy for Businesses with Multiple Location Pages
-
Yeah, it seems like the most logical answer is that each location page needs unique content developed for it, even though it still feels a little forced.
Goes to show that Google has really pushed SEO firms to think differently about content; when you have to do something just for SEO purposes, it now feels icky.
Yes, creating unique content for each location's page can be seen as useful to users, but it feels icky because the user would probably be satisfied with the core content. We're creating unique, location-specific content mostly to please Google... not the user.
For example, what if Walmart came to this same conclusion? Wouldn't it be a little forced if Walmart developed pages for every location that included that location's weather, facts about the city, etc.?
Due to its brand, it's able to get away with the thin-content version of location pages: http://www.walmart.com/store/2300/details - they don't even use the markup... but any SEO knows you can't really follow what works for a giant brand like Walmart.
-
In response to the extra landing pages: the key thing for our business, following on from the above comments, is to remember that fresh and unique content is best.
We have spent a lot of money on our own websites as well as clients' in building extra pages, and what we do is have a plan. For example, if we have 30 pages to add, we spread this over a period of weeks/months rather than bashing them all out together. We do everything in a natural, organic manner.
Hope this helps - it's our first post!
-
Welcome to my hell! I have 18 locations. I think it's best practice to have a location page for each location with 100% original content. And plenty of it. Yes, it seems redundant to talk about plumbing in Amherst, and plumbing in Westfield, and plumbing in... wherever. Do your best and make the content valuable, original content that users will find helpful. A little local flair goes a long way with potential customers, and it also makes it pretty clear you're not spinning the same article. That said, with Google Local bulk spreadsheet uploads, according to the people I've spoken with at Google, your business description can be word for word the same between locations and it won't hurt your rank in the maps/local packs one bit. Hope this helps!
-
These do appear to be contradictory guidelines until you understand what Google is trying to avoid here. Historically, SEOs have tried to rank businesses for geo-specific searches in areas other than where a business is located.
Let's say you run a gardening shop in Atlanta and you have an ecommerce side of the business online. Yes, you want to get walk-in traffic from the metro Atlanta area, but you also want to sell products online to customers all over the country. Ten years ago, you might have set up 50 or so pages on your site with the exact same content, with just the city and state switched out. That way you could target keywords like the following:
- gardening supplies in Nashville, TN
- gardening supplies in Houston, TX
- gardening supplies in Seattle, WA
- gardening supplies in San Francisco, CA
- and so on...
That worked well 10 years ago, but the Panda update put a stop to that kind of nonsense. Google understands that someone searching for "gardening supplies in Nashville, TN" is looking for a brick and mortar location in Nashville and not an ecommerce store.
If you have locations in each of those cities, you have a legitimate reason to target the above search queries. On the other hand, you don't want to incur the wrath of Google with duplicate content on your landing pages. That's why the best solution is to create unique content that will appeal to users in that location. Yes, this requires time and possibly money to implement, but it's worth it when customers are streaming through the door at each location.
Check out Bright Local's recent InsideLocal Webinar: Powerful Content Creation Ideas for Local Businesses. They discussed several companies that are doing a great job with local landing page content.
Related Questions
-
Geo-location by state/store
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website where we would geo-locate the closest store to the customer and direct them to that store (selected based on a cookie and geolocation). All our stores have a consistent range of products, with variation in the 25% range. I have a few questions:
- How should we build a sitemap? Since a store must be selected, and the flow is the same for bots and users, should the sitemap include all products across all stores?
- We allow users to find any product across all stores if they search by product identifier, but they will only see products available in a particular store if they go through the hierarchical journey of the website. Will the bot crawl pages across all stores, or, since it will be geolocated to one store, will only that store's content be indexed?
- We also allow customers to search for older products they might have bought a few years ago that are no longer part of our catalogue. These products will not appear in the online hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated. Thanks - Costa
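On the sitemap question, one common pattern (a sketch under assumptions, not a prescription - the domain and URLs below are hypothetical) is a single sitemap that lists every product page at its store-specific address, so crawlers can discover all stores' content regardless of which store the geolocation logic would pick:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Same product, one entry per store it appears in -->
  <url>
    <loc>https://www.example-grocer.com/store/boston/products/organic-oats</loc>
  </url>
  <url>
    <loc>https://www.example-grocer.com/store/chicago/products/organic-oats</loc>
  </url>
</urlset>
```

This only helps if each store's pages are reachable at stable URLs without the geolocation redirect forcing the bot into one store.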
Local Website Optimization | | Hanuman881 -
I want to rank a national home page for a local keyword phrase
Hello - We are a nationally available brand based in Denver, CO. Our home page currently ranks #8 (it used to be #5) for "real estate photography in Denver". I want to improve this ranking, but our home page is generalized and geared not toward Denver, CO but to all of our markets. I'm trying to troubleshoot this and have a few ideas; I would love advice on the best route, or a different route altogether:
- Create a Denver-specific page - _will that page compete with my home page, which is already ranked in the top ten?_
- Add the keyword phrase in the image alt attribute
- Add the keyword phrase into the content - need to make sure that viewers realize we are national
I already updated the meta description to say "real estate photography in Denver and beyond".
Service Area Location Pages vs. User Experience
I'm familiar with the SAB best practices outlined here. Here's my issue: doing local landing pages as described there might not be ideal from a user-experience point of view. Having a "Cities We Serve" or "Service Areas" link in the main navigation isn't necessarily valuable to the user when the city-specific landing pages are all places within a 15-mile radius of the SAB's headquarters. It would just look like the company did it for SEO. It wouldn't look natural. Seriously, it feels like best practices are totally at odds with user experience here. If I absolutely must create location pages for 10 or so municipalities within my client's service area, I'd rather NOT put the service areas as a primary navigation item. It is not useful to the user. Anyone who sees that the company provides services in the [name of city] metropolitan area will already understand that the company can service their town 5 miles away. It is self-evident. For example, **who would wonder whether a plumbing company with a Los Angeles address also services Beverly Hills?** It's just... silly. But the Moz guide says I've got to do those location pages! And that I've got to put them high up in the navigation! This is a problem because we've got to do local SEO, but we also have to provide an ideal experience. Thoughts?
More pages on website better for SEO?
Hi all, Is creating more pages better for SEO, assuming of course the pages are valuable content? Is this because you want the user to spend as much time as possible on your site? A lot of my competitors' websites seem to have more pages than mine, and their domain authorities are higher. For example, the services we provide are all on one page, whereas each of my competitors' services has its own page. Kind Regards, Aqib
How to Handle Franchise Duplicate Content
My agency handles digital marketing for about 80 Window World stores, each with separate sites. For the most part, the content across all of these sites is the exact same, though we have slowly but surely been working through getting new, unique content up on some of the top pages over the past year. These pages include resource pages and specific product pages. I'm trying to figure out the best temporary solution as we go through this process. Previously, we have tried to keep the pages we knew were duplicates from indexing, but some pages have still managed to slip through the cracks during redesigns. Would canonicals be the route to go? (do keep in mind that there isn't necessarily one "original version," so there isn't a clear answer as to which page/site all the duplicated pages should point to) Should we just continue to use robots.txt/noindex for all duplicate pages for now? Any other recommendations? Thanks in advance!
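For anyone weighing the two interim options this question mentions, the difference comes down to two snippets in the page head. A minimal sketch - the URL is hypothetical, and with no clear "original version" the canonical target is exactly the judgment call the question raises:

```html
<!-- Option 1: cross-site canonical - consolidates signals to one chosen version -->
<link rel="canonical" href="https://store-a.example.com/windows/double-hung/">

<!-- Option 2: keep the duplicate out of the index until unique content exists -->
<meta name="robots" content="noindex, follow">
```

Note that canonicals are treated by Google as a hint, not a directive, while noindex removes the page from search results entirely.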
Sub domain for geo pages
Hello Group! I have been tossing around the idea of using subdomains for the geo pages for each of my clients. For example: one of my clients is a lawyer in the very competitive Atlanta market, http://bestdefensega.com. Can I set his geo page to woodstock.bestdefensega.com? Is this a viable option? Will I get penalized? Thoughts or suggestions always appreciated! Thanks in advance
Is it okay for my H3 Tag to appear above my H2 Tag on the Web Page
Hello All, I am currently doing the H1, H2, and H3 tags on my redesigned website. We have the ability to link to relevant DIY guides at the bottom of our webpage, and these are currently displayed under a heading "DIY Useful Guides" above my on-page content, which is at the bottom of the page. My H2 tag will obviously be the title that sits above my on-page content at the bottom of the webpage, and I was going to make the "DIY Useful Guides" heading an H3 tag. Is it a problem if the H3 tag sits above the H2 tag on the page? Or have I got this wrong and need to move the DIY guides (links) below the on-page content so the H3 tag sits below the H2 tag? Thanks, Pete
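To make the layout in the question concrete, here is a minimal sketch of the structure described - an H3 block of guide links appearing in the source before the H2 that titles the on-page content (headings and links are hypothetical):

```html
<h1>Window Fittings | Example Site</h1>
<!-- ...product listings... -->

<h3>DIY Useful Guides</h3>
<ul>
  <li><a href="/guides/fitting-a-sash-window">Fitting a Sash Window</a></li>
  <li><a href="/guides/replacing-a-hinge">Replacing a Hinge</a></li>
</ul>

<h2>About Our Window Fittings</h2>
<p>On-page content sits here, at the bottom of the page...</p>
```

The question is whether this out-of-order outline (H1 → H3 → H2) causes any problem.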
Duplicate content on a proxy site?
I have a local client with a 500-page site. They advertise online and use traditional media like direct mail. A print media company, Valpak, has started a website and wants the client to use their trackable phone number and a proxy website. When I type the proxy domain in the browser, it appears to be the client's home page at this proxy URL. The vendor wishes to track activity on its site to prove their value or something. My question is: is there any "authority" risk to my client's website by allowing this proxy site?