Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Targeting local areas without creating landing pages for each town
-
I have a large ecommerce website which is structured very much for SEO as it existed a few years ago, with a landing page for every product/town combination nationwide (it's a lot of pages).
Then along came Panda...
I began shrinking the site in February last year in an effort to tackle duplicate content. We had initially used a template, changing only the product/town name.
My first change was to reduce the number of pages by half, merging the top two categories, as they are semantically similar enough to not need their own pages. This worked a treat: traffic didn't drop at all, and the remaining pages are bringing in the desired search terms for both these products.
Next I have rewritten the content for every product to ensure they are now as individual as possible.
However, with 46 products each generating a product/area page, we still have a heap of duplicate content. Now I want to reduce the town pages. I have already started writing content for my most important areas, again to make these pages as individual as possible.
The problem I have is that nobody can write enough unique content to target every town in the UK via an individual page (multiplied by 46 products), so I want to reduce these too.
QUESTION: If I have a single page for "croydon", will mentioning other local surrounding areas on this page, such as Mitcham, be enough to rank this page for both towns?
I have approx 25 Google local place/map listings and growing, and am working outwards from these areas. I want to bring the site right down to about 150 main area pages to tackle all the duplicate content, but obviously don't want to lose my traffic for so many areas at once.
Any examples of big sites that have reduced in size since Panda would be great.
I have a headache... Thanks community.
-
My pleasure, Silkstream. I can understand how what you are doing feels risky, but in fact, you are likely preventing fallout from worse risks in the future. SEO is a process, always evolving, and helping your client change with the times is a good thing to do! Good luck with the work.
-
Thank you Miriam. I appreciate you sharing with me the broad idea of the type of structure that you feel a site should have in this instance (if starting from scratch).
You have pretty much echoed my proposal for a new site structure, built for how Google works nowadays rather than 2-3 years ago. We are currently reducing the size of the current site to bring it as close to this type of model as possible. However, the site would need a complete redesign to make this type of structure viable.
I guess what I've been looking for is some kind of reassurance that we are moving in the right direction! It's a scary prospect, reducing such a huge number of pages down to a compact, targeted set. With the prospect of losing so much long-tail traffic, it can make us a little hesitant.
However, the on-site changes we have made so far seem to be having a positive effect. And thank you for giving me some ideas about content creation for each town. I really like this as an idea to move forward with after the changes are complete, which will hopefully be by the new year!
-
Hi Silkstream,
Thank you so much for clarifying this! I understand now.
If I were starting with a client like this, from scratch, this would be the approach I would take:
-
View content development as two types of pages. One set would be the landing pages for each physical location, optimized for each city, with unique content. The other set would be service pages, optimized for the services, but not for a particular city.
-
Create a Google+ Local page for each of the physical locations, linked to its respective landing page on the website. So, let's say you now have 25 city pages and 46 service pages. That's a fairly tall order, but certainly do-able.
-
Build structured citations for each location on third-party local business directories. Given the number of locations, this would be an enormous job.
-
Build an onsite blog and designate company bloggers, ideally one in each physical office. The job of these bloggers would be something like each of them creating one blog post per month about a project that was accomplished in their city. In this way, the company could begin developing content under its own steam that would meet the need of showcasing a given service in a given city. Over time, this body of content would grow the pool of queries for which they have answers.
-
Create a social outreach strategy, likely designating brand representatives within the company who could be active on various platforms.
-
You would likely need to develop a link-earning strategy tied in with steps 4 and 5.
-
Consider video marketing. A good video or two for each physical location could work wonders.
I'm painting in broad strokes here, but this is likely what the overall strategy would look like. You've come into the scenario midway and don't have the luxury of starting from scratch. You are absolutely right to be cleaning up duplicate content and taking other measures to reduce the spaminess and improve the usefulness of the site. Once you've got your cleanup complete, I think the steps I've outlined would be the direction to go in. Hope this helps.
-
-
Hi Miriam,
Thanks for jumping in.
The business model is service-based, so when I refer to "46 products" they are actually 46 different types of service available.
The customer will typically book and pay online, through the website, and they are then served at their location, which is most often either their home or place of work. They actually have far more than the 25 actual locations; much closer to 120, I believe. However, I only began their SEO in February, AFTER they were hit by Panda. So building up their local listings is taking time, as the duplicate content issue seems far more urgent. I'm trying to strike a balance and fix this all slowly over time to lay a solid foundation for inbound marketing, as it's being diluted by the poor site structure.
Does this help? Am I doing the right things here?
-
Hi Silkstream,
I think we need to clarify what your business model is. You say you have a physical location in each of your 25 towns. So far, so good, but are you saying that your business has in-person transactions with its customers at each of the 25 locations? The confusion here is arising from the fact that e-commerce companies are typically virtual, meaning that they do not have in-person transactions with their customers. The Google Places Quality Guidelines state:
Only businesses that make in-person contact with customers qualify for a Google Places listing.
Hence my wanting to be sure that your business model is actually eligible, given that you've described it as an e-commerce business, which would be ineligible. If you can clarify your business model, I think it will help you to receive the most helpful answers from the community.
-
You scared me then Chris!
-
Of course, if you've got the physical locations, you're in good shape there.
-
"It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back to haunt you in the form of merging or penalization."
Why? The business has a physical location in every town, so why should they not have a page for every location? This is what we were advised to do?
"If there was no other competition, you would almost certainly rank for your keywords along with the town name"
I have used this tactic before for another nationwide business, but on a smaller scale, and it worked; i.e. they ranked (middle of page 1), but for non-competitive keywords, and the page has strong backlinks. With this site, the competition is stronger and the pages will not have a strong backlink profile at first.
My biggest worry is cutting all the existing pages and losing the 80% of the site's current traffic that comes from the long tail. But what other way is there to tackle so much duplicate content?
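For what it's worth, one mechanical way to retire town pages without simply dropping the long tail is to 301 each retired town URL to its nearest retained area page, so existing rankings and links are consolidated rather than lost. A rough sketch in Python (all towns, coordinates, and URL patterns below are hypothetical; the output targets Apache's mod_alias Redirect directive):

```python
import math

# Hypothetical coordinates (lat, lon) for the ~150 retained "main area" pages.
hubs = {
    "croydon": (51.376, -0.098),
    "kingston": (51.412, -0.300),
}

# Hypothetical towns whose standalone pages are being retired.
towns = {
    "mitcham": (51.401, -0.168),
    "surbiton": (51.394, -0.307),
}

def nearest_hub(lat, lon):
    """Return the retained area page closest to the given point.

    Uses an equirectangular approximation: good enough for ranking
    nearby candidates, not for true distances.
    """
    def dist2(hub):
        hlat, hlon = hubs[hub]
        dlat = lat - hlat
        dlon = (lon - hlon) * math.cos(math.radians(lat))
        return dlat * dlat + dlon * dlon
    return min(hubs, key=dist2)

def redirect_rules(product):
    """Emit one 301 rule per retired town page for a given product slug."""
    return [
        f"Redirect 301 /{product}/{town} /{product}/{nearest_hub(*coords)}"
        for town, coords in sorted(towns.items())
    ]

for rule in redirect_rules("carpet-cleaning"):
    print(rule)
```

The point of the sketch is that the mapping can be generated once from a town list rather than decided page by page, and the retained page then mentions the absorbed towns in its copy.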
-
It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back to haunt you in the form of merging or penalization. If not that, it's likely to stop being worth the time as a visibility tactic.
As far as whether or not mentioning local surrounding towns in your page copy will be enough to get you to rank for them, it would depend on competition. If there was no other competition, you would almost certainly rank for your keywords along with the town name but with competition, all the local ranking factors start coming into play and your ability to rank for each one will depend on a combination of all of them.
Related Questions
-
Images on their own page?
Hi Mozers, We have images on their own separate pages that are then pulled onto content pages. Should the standalone pages be indexable? On the one hand, it seems good to have an image on its own page, with its own title. On the other hand, it may be better SEO for the crawler to find the image on a content page dedicated to that topic. Unsure. Would appreciate any guidance! Yael
Intermediate & Advanced SEO | yaelslater
JSON-LD schema markup for a category landing page
I'm working on some schema for a client and have a question regarding the use of schema for a high-level category page. This page is merely the main lander for Categories. For example: https://www.examples.com/pages/categories And all it does is list links to the three main categories (Men's, Women's, Kid's) - it's a clothing store. This is the code I have right now. In short, it simply uses @type ItemList with an array of ListItem entries. Structured Data Testing Tool returns no errors with it, but my main question is this: Is this the correct way to do a page like this, or are there better options? Thanks.
Intermediate & Advanced SEO | Alces
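For reference, the kind of markup described - an ItemList whose itemListElement array holds one ListItem per category link - can be sketched like this in Python (the category names echo the question; the exact URLs are hypothetical):

```python
import json

# Hypothetical category URLs for the clothing store's top-level lander.
categories = [
    ("Men's", "https://www.examples.com/pages/categories/mens"),
    ("Women's", "https://www.examples.com/pages/categories/womens"),
    ("Kid's", "https://www.examples.com/pages/categories/kids"),
]

# Build a schema.org ItemList: one ListItem per category link,
# with 1-based positions as the vocabulary expects.
item_list = {
    "@context": "https://schema.org",
    "@type": "ItemList",
    "itemListElement": [
        {
            "@type": "ListItem",
            "position": i,
            "name": name,
            "url": url,
        }
        for i, (name, url) in enumerate(categories, start=1)
    ],
}

# Printed output is what would go inside the page's
# <script type="application/ld+json"> block.
print(json.dumps(item_list, indent=2))
```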
Multiple pages optimised for the same keywords but pages are functionally different and visually different
Hi MOZ community! We're wondering what the implications would be on organic ranking of having 2 pages, with quite different functionality, optimised for the same keywords. So, for example, one of the pages in question is
https://www.whichledlight.com/categories/led-spotlights
and the other page is
https://www.whichledlight.com/t/led-spotlights
Both of these pages are basically geared towards the keyword "led spotlights". The first link essentially shows the options for led spotlights, the different kinds of fittings available, and the second link is a product search/results page for all products that are spotlights. We're wondering what the implications of this could be, as we are currently looking to improve the ranking of the site, particularly for this keyword. Is this even safe to do? Especially since we're at the bottom of the hill of climbing the ranking ladder for this keyword. Give us a shout if you want any more detail on this to answer more easily 🙂
Intermediate & Advanced SEO | TrueluxGroup
Can too many "noindex" pages compared to "index" pages be a problem?
Hello, I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have no-indexed most of the product pages hoping for a sort of recovery (not yet seen, though!). So, currently we have about 4,000 "index" pages compared to about 80,000 "noindex" pages. Now, we plan to add an additional 100,000 new product pages from a new publisher to offer our customers more music choice, and these new pages will still be marked as "noindex, follow". At the end of the integration process, we will end up having something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages. Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario cause any negative effect on our current natural SEs profile? Or is this something that doesn't actually matter? Any thoughts on this issue are very welcome. Thank you! Fabrizio
Intermediate & Advanced SEO | fablau
Creating 100,000s of pages: good or bad idea?
Hi Folks, Over the last 10 months we have focused on quality pages but have been frustrated with competitor websites outranking us because they have bigger sites. Should we focus on the long tail again? One option for us is to take every town across the UK and create pages using our activities, e.g. Stirling:
Stirling paintball
Stirling Go Karting
Stirling Clay shooting
We are not going to link to these pages directly from our main menus but from the site map. These pages would then show activities that were within a 50-mile radius of the towns. At the moment we have focused our efforts on regions, e.g. Paintball Scotland, Paintball Yorkshire, focusing all the internal link juice on these regional pages, but we don't rank highly for towns that the activity sites are close to. With 45,000 towns and 250 activities we could create over a million pages, which seems very excessive! Would creating 500,000 of these types of pages damage our site? This is my main worry. Or would it make our site rank even higher for the tougher keywords and also get lots of traffic from the long tail like we used to get? Is there a limit to how big a site should be?
Intermediate & Advanced SEO | PottyScotty
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The url structure is like such: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... Is this possible? Thanks! Leona
Intermediate & Advanced SEO | HD_Leona
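As a rough local sanity check of bot-specific rules like these, Python's standard-library urllib.robotparser can parse a draft robots.txt. Note that its user-agent group matching (first matching group wins, via substring match) only approximates how Google actually resolves groups, so treat this as a sketch rather than an authority on crawler behaviour:

```python
from urllib.robotparser import RobotFileParser

# Draft rules: block Googlebot from the photo pages, but include an
# explicit group allowing Googlebot-Image to fetch them. The more
# specific group is listed first because robotparser uses the first
# user-agent group whose token matches.
robots_txt = """\
User-agent: Googlebot-Image
Allow: /community/photos/

User-agent: Googlebot
Disallow: /community/photos/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Hypothetical URL following the structure described in the question.
page = "https://example.com/community/photos/someuser/picture111111.aspx"
print(parser.can_fetch("Googlebot", page))        # blocked
print(parser.can_fetch("Googlebot-Image", page))  # allowed
```

Whether Google's own image crawler honours the split the same way is a separate question; this only checks that the draft file says what you intend.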
How to Target Keyword Permutations
I have a client that wants to rank for a keyword phrase that has many permutations, e.g. "Alaska Hill Country Resort", "Hill Country Resort Alaska", "Hill Country Alaska Resort". But I'm wondering if I should target these all on the same page or not. I'm assuming all of these permutations are actually valid searches, because I did my keyword research for 'exact match' keywords and got results like this (let me know if I'm missing something here, or if this sounds right):
[Alaska Hill Country Resort] - 230 Local Searches
[Hill Country Resort Alaska] - 140 Local Searches
[Hill Country Alaska Resort] - 30 Local Searches
The phrase we're targeting is their main keyword phrase, so I've chosen their home page as the page to rank for this phrase. My thought is to optimize for the most popular phrase (e.g. "Alaska Hill Country Resort") and sprinkle in the other phrases throughout the copy. Next I would run a link-building campaign targeting the main phrase first, then the next phrase, and so on, so that my anchor text is more heavily focused on the more popular terms, but I would also make sure to include the less popular terms. Do you think this is the best way to go about this? Do I really need to make individual pages for each of the permutations, or is it okay to target them all on one page, since they are essentially the same keyword?
Intermediate & Advanced SEO | ATMOSMarketing56
NOINDEX listing pages: Page 2, Page 3... etc?
Would it be beneficial to NOINDEX category listing pages, except for the first page? For example, this site: http://flyawaysimulation.com/downloads/101/fsx-missions/ has lots of pages such as Page 2, Page 3, Page 4, etc.: http://www.google.com/search?q=site%3Aflyawaysimulation.com+fsx+missions Would there be any SEO benefit of NOINDEX on these pages? Of course, FOLLOW is the default, so links would still be followed and juice applied. Your thoughts and suggestions are much appreciated.
Intermediate & Advanced SEO | Peter264