Targeting local areas without creating landing pages for each town
-
I have a large ecommerce website that is structured very much for SEO as it existed a few years ago, with a landing page for every product/town combination nationwide (it's a lot of pages).
Then along came Panda...
I began shrinking the site in February last year in an effort to tackle duplicate content. We had initially used a template, changing only the product/town name.
My first change was to reduce the number of pages by half by merging the top two categories, as they are semantically similar enough not to need their own pages. This worked a treat: traffic didn't drop at all, and the remaining pages are bringing in the desired search terms for both of these products.
Next I have rewritten the content for every product to ensure they are now as individual as possible.
However, with 46 products, each generating a product/area page, we still have a heap of duplicate content. Now I want to reduce the town pages. I have already started writing content for my most important areas, again to make these pages as individual as possible.
The problem I have is that nobody can write enough unique content to target every town in the UK with an individual page (multiplied by 46 products), so I want to reduce these too.
QUESTION: If I have a single page for "Croydon", will mentioning surrounding local areas on this page, such as Mitcham, be enough to rank the page for both towns?
I have approximately 25 Google local place/map listings and growing, and am working outwards from these areas. I want to bring the site right down to about 150 main area pages to tackle all the duplicate content, but obviously don't want to lose my traffic for so many areas at once.
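To make the consolidation concrete, here is a rough sketch of the kind of mapping I have in mind: each retired town page gets 301-redirected to its nearest main area page, so existing links and long-tail visitors still land somewhere relevant. All slugs and URLs below are made-up examples rather than our real structure, and the output assumes Apache-style redirect directives:

```python
# Hypothetical sketch only: the town and service slugs are invented examples.
# Each retired town page is mapped to the main area page that absorbs it, and a
# 301 rule is emitted so link equity and long-tail visitors are passed on
# instead of hitting a 404.

town_to_area = {
    "mitcham": "croydon",
    "thornton-heath": "croydon",
    "purley": "croydon",
    # ...one entry per retired town, pointing at one of ~150 main area pages
}

def redirect_rules(service_slug: str) -> list[str]:
    """Emit one Apache-style 301 directive per retired product/town page."""
    return [
        f"Redirect 301 /{service_slug}/{town}/ /{service_slug}/{area}/"
        for town, area in town_to_area.items()
    ]

# Example: generate the rules for one of the 46 services.
for rule in redirect_rules("example-service"):
    print(rule)
```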
Any examples of big sites that have reduced in size since Panda would be great.
I have a headache... Thanks community.
-
My pleasure, Silkstream. I can understand how what you are doing feels risky, but in fact, you are likely preventing fallout from worse risks in the future. SEO is a process, always evolving, and helping your client change with the times is a good thing to do! Good luck with the work.
-
Thank you Miriam. I appreciate you sharing with me the broad idea of the type of structure that you feel a site should have in this instance (if starting from scratch).
You have pretty much echoed my proposal for a new site structure, built for how Google works nowadays rather than two or three years ago. We are currently reducing the size of the current site to bring it as close to this type of model as possible. However, the site would need a complete redesign to make this type of structure truly viable.
I guess what I've been looking for is some kind of reassurance that we are moving in the right direction! It's a scary prospect, reducing such a huge number of pages down to a compact, targeted set, and the prospect of losing so much long-tail traffic makes us a little hesitant.
However, the on-site changes we have made so far seem to be having a positive effect. And thank you for giving me some ideas about content creation for each town. I really like that as a way to move forward after the changes are complete, which will hopefully be by the new year!
-
Hi Silkstream,
Thank you so much for clarifying this! I understand now.
If I were starting with a client like this, from scratch, this would be the approach I would take:
1. View content development as two types of pages. One set would be the landing pages for each physical location, optimized for its city with unique content. The other set would be service pages, optimized for the services but not for a particular city. (A rough sketch of this page inventory follows at the end of this answer.)
2. Create a Google+ Local page for each of the physical locations, linked to its respective landing page on the website. So, let's say you now have 25 city pages and 46 service pages. That's a fairly tall order, but certainly do-able.
3. Build structured citations for each location on third-party local business directories. Given the number of locations, this would be an enormous job.
4. Build an onsite blog and designate company bloggers, ideally one in each physical office. The job of these bloggers would be something like each of them creating one blog post per month about a project accomplished in their city. In this way, the company could begin developing content under its own steam that showcases a given service in a given city. Over time, this body of content would grow the pool of queries for which the site has answers.
5. Create a social outreach strategy, likely designating brand representatives within the company who could be active on various platforms.
6. Develop a link earning strategy tied in with steps 4 and 5.
7. Consider video marketing. A good video or two for each physical location could work wonders.
I'm painting in broad strokes here, but this is likely what the overall strategy would look like. You've come into the scenario midway and don't have the luxury of starting from scratch. You are absolutely right to be cleaning up duplicate content and taking other measures to reduce the spaminess and improve the usefulness of the site. Once you've got your cleanup complete, I think the steps I've outlined would be the direction to go in. Hope this helps.
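To make the contrast with the old product-by-town model concrete, here is a rough sketch of the page inventory this two-tier structure implies. Every city name and URL slug below is a placeholder, and the counts are just the round numbers discussed above:

```python
# Placeholder sketch of the proposed structure: one landing page per physical
# location plus one page per service, instead of a page for every
# service x town combination. All slugs are invented examples.

cities = ["croydon", "mitcham", "sutton"]               # stand-ins for the ~25 locations
services = ["example-service-a", "example-service-b"]   # stand-ins for the 46 services

# One landing page per physical location, optimized for that city:
location_pages = [f"/locations/{city}/" for city in cities]

# One page per service, optimized for the service but not tied to any city:
service_pages = [f"/services/{slug}/" for slug in services]

# The old model generated a page for every service x town pairing, which quickly
# runs into thousands of near-duplicate pages; the new model stays small and unique.
old_model_pages = 46 * 150   # e.g. 46 services x ~150 towns
new_model_pages = 25 + 46    # 25 location pages + 46 service pages

print(location_pages + service_pages)
print(f"old model: {old_model_pages} pages vs new model: {new_model_pages} pages")
```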
-
Hi Miriam,
Thanks for jumping in.
The business model is service-based, so when I refer to "46 products" they are actually 46 different types of service available.
The customer will typically book and pay online, through the website, and they are then served at their location, which is most often either their home or place of work. They actually have far more than 25 physical locations, much closer to 120 I believe. However, I only began their SEO in February, AFTER they were hit by Panda, so building up their local listings is taking time, and the duplicate content issue seems far more urgent. I'm trying to strike a balance and fix this all gradually over time, to lay a solid foundation for inbound marketing, as it's currently being diluted by the poor site structure.
Does this help? Am I doing the right things here?
-
Hi Silkstream,
I think we need to clarify what your business model is. You say you have a physical location in each of your 25 towns. So far, so good, but are you saying that your business has in-person transactions with its customers at each of the 25 locations? The confusion here is arising from the fact that e-commerce companies are typically virtual, meaning that they do not have in-person transactions with their customers. The Google Places Quality Guidelines state:
Only businesses that make in-person contact with customers qualify for a Google Places listing.
Hence my wanting to be sure that your business model is actually eligible, given that you've described it as an e-commerce business, which would be ineligible. If you can clarify your business model, I think it will help you to receive the most helpful answers from the community.
-
You scared me then Chris!
-
Of course, if you've got the physical locations, you're in good shape there.
-
"It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back and haunt you unless you in the form of merging or penalization."
Why? The business has a physical location in every town, so why should they not have a page for every location? This is what we were advised to do?
"If there was no other competition, you would almost certainly rank for your keywords along with the town name"
I have used this tactic before for another nationwide business, but on a smaller scale, and it worked; i.e. they ranked (middle of page 1), but for non-competitive keywords, and the page had strong backlinks. With this site, the competition is stronger and the pages will not have a strong backlink profile at first.
My biggest worry is cutting all the existing pages and losing the 80% of long-tail traffic the site currently pulls in. But what other way is there to tackle so much duplicate content?
-
It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back to haunt you in the form of merging or penalization. If not that, it's likely to stop being worth the time as a visibility tactic.
As far as whether or not mentioning surrounding towns in your page copy will be enough to get you to rank for them, it depends on the competition. If there was no other competition, you would almost certainly rank for your keywords along with the town name; but with competition, all the local ranking factors start coming into play, and your ability to rank for each town will depend on a combination of all of them.