Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Targeting local areas without creating landing pages for each town
-
I have a large ecommerce website which is structured very much for SEO as it existed a few years ago, with a landing page for every product/town combination nationwide (it's a lot of pages).
Then along came Panda...
I began shrinking the site in Feb last year in an effort to tackle duplicate content. We had initially used a template, changing only the product/town name.
My first change was to cut the number of pages in half by merging the top two categories, as they are semantically similar enough not to need their own pages. This worked a treat: traffic didn't drop at all, and the remaining pages are bringing in the desired search terms for both products.
Next I have rewritten the content for every product to ensure they are now as individual as possible.
However, with 46 products each generating a product/area page, we still have a heap of duplicate content. Now I want to reduce the town pages. I have already started writing content for my most important areas, again to make these pages as individual as possible.
The problem I have is that nobody can write enough unique content to target every town in the UK with an individual page (multiplied by 46 products), so I want to reduce these too.
QUESTION: If I have a single page for "Croydon", will mentioning other surrounding areas on this page, such as Mitcham, be enough to rank this page for both towns?
I have approx 25 Google local place/map listings and growing, and am working from these areas outwards. I want to bring the site right down to about 150 main area pages to tackle all the duplicate content, but obviously don't want to lose my traffic for so many areas at once.
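To put the scale of the consolidation in perspective, the arithmetic looks roughly like this. The town count below is a placeholder assumption, not a figure from the thread; only the 46 products and the ~150-page target are stated above.

```python
# Rough page-count arithmetic for the proposed consolidation.
# NUM_TOWNS is a hypothetical assumption; the real figure depends on
# how many UK towns the site currently targets.
NUM_PRODUCTS = 46        # stated in the question: 46 products/services
NUM_TOWNS = 900          # placeholder count of towns currently targeted
TARGET_AREA_PAGES = 150  # proposed compact set of main area pages

current_pages = NUM_PRODUCTS * NUM_TOWNS
consolidation_ratio = current_pages // TARGET_AREA_PAGES

print(current_pages)        # pages under the old product-x-town model
print(consolidation_ratio)  # how many old pages fold into each new one
```

Even with a conservative town count, each of the ~150 surviving area pages would absorb the long-tail targets of a couple of hundred templated pages, which is why the unique-content-per-page approach stops scaling.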
Any examples of big sites that have reduced in size since Panda would be great.
I have a headache... Thanks community.
-
My pleasure, Silkstream. I can understand how what you are doing feels risky, but in fact, you are likely preventing fallout from worse risks in the future. SEO is a process, always evolving, and helping your client change with the times is a good thing to do! Good luck with the work.
-
Thank you Miriam. I appreciate you sharing with me the broad idea of the type of structure that you feel a site should have in this instance (if starting from scratch).
You have pretty much echoed my proposal for a new site structure, built for how Google works nowadays rather than 2-3 years ago. We are currently reducing the size of the current site to bring it as close to this type of model as possible. However, the site would need a complete redesign to make this type of structure viable.
I guess what I've been looking for is some kind of reassurance that we are moving in the right direction! It's a scary prospect, reducing such a huge number of pages down to a compact, targeted set. The prospect of losing so much long-tail traffic makes us a little hesitant.
However, the on-site changes we have made so far seem to be having a positive effect. And thank you for giving me some ideas about content creation for each town. I really like this as an idea to move forward with after the changes are complete, which will hopefully be by the new year!
-
Hi Silkstream,
Thank you so much for clarifying this! I understand now.
If I were starting with a client like this, from scratch, this would be the approach I would take:
-
View content development as two types of pages. One set would be the landing pages for each physical location, optimized for each city, with unique content. The other set would be service pages, optimized for the services, but not for a particular city.
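As a sketch, the two content tracks described above might map to URL paths like the following. The slugs and path scheme are invented purely to illustrate the split between city-optimized location pages and city-agnostic service pages.

```python
# Hypothetical URL structure for the two page types described above:
# one landing page per physical location, one page per service.
# All slugs are invented for illustration.
locations = ["croydon", "mitcham", "sutton"]              # physical offices
services = ["carpet-cleaning", "end-of-tenancy-cleaning"] # service types

# Location pages carry unique, city-optimized copy.
location_pages = [f"/locations/{town}/" for town in locations]

# Service pages are optimized for the service itself, with no city targeting.
service_pages = [f"/services/{service}/" for service in services]

print(location_pages)
print(service_pages)
```

The key design point is that the two sets do not multiply: 25 locations plus 46 services yields 71 pages, not the 25 x 46 grid of thin pages the old templated model produced.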
-
Create a Google+ Local page for each of the physical locations, linked to its respective landing page on the website. So, let's say you now have 25 city pages and 46 service pages. That's a fairly tall order, but certainly do-able.
-
Build structured citations for each location on third-party local business directories. Given the number of locations, this would be an enormous job.
-
Build an onsite blog and designate company bloggers, ideally one in each physical office. The job of these bloggers would be something like each of them creating one blog post per month about a project that was accomplished in their city. In this way, the company could begin developing content under its own steam that meets the need of showcasing a given service in a given city. Over time, this body of content would grow the pool of queries for which they have answers.
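The cadence above builds coverage at a predictable rate. A rough sketch, reusing the 25 locations and 46 services mentioned in this thread and assuming one post per blogger per month with no duplicated topics:

```python
# How long the blogging cadence takes to touch every service-city pair,
# assuming one post per blogger per month and no repeated topics.
NUM_LOCATIONS = 25   # one blogger per physical office
NUM_SERVICES = 46    # service types to showcase
POSTS_PER_BLOGGER_PER_MONTH = 1

total_combinations = NUM_LOCATIONS * NUM_SERVICES             # service-city pairs
posts_per_month = NUM_LOCATIONS * POSTS_PER_BLOGGER_PER_MONTH
months_to_cover = total_combinations / posts_per_month        # = NUM_SERVICES months

print(total_combinations)
print(months_to_cover / 12)  # years of steady posting to cover every pair
```

In other words, full coverage of every service in every city takes years at this pace, which is exactly why it works as a long-term content strategy rather than a quick fix for the duplicate-content problem.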
-
Create a social outreach strategy, likely designating brand representatives within the company who could be active on various platforms.
-
Likely need to develop a link earning strategy tied in with steps 4 and 5.
-
Consider video marketing. A good video or two for each physical location could work wonders.
I'm painting in broad strokes here, but this is likely what the overall strategy would look like. You've come into the scenario midway and don't have the luxury of starting from scratch. You are absolutely right to be cleaning up duplicate content and taking other measures to reduce the spaminess and improve the usefulness of the site. Once you've got your cleanup complete, I think the steps I've outlined would be the direction to go in. Hope this helps.
-
-
Hi Miriam,
Thanks for jumping in.
The business model is service-based. So when I refer to "46 products", they are actually 46 different types of service available.
The customer will typically book and pay online, through the website, and they are then served at their location, which is most often either their home or place of work. They actually have far more than 25 locations, much closer to 120 I believe. However, I only began their SEO in February, AFTER they were hit by Panda. So building up their local listings is taking time, as the duplicate content issue seems far more urgent. I'm trying to strike a balance and fix this all slowly over time to lay a solid foundation for inbound marketing, as it's currently being diluted by the poor site structure.
Does this help? Am I doing the right things here?
-
Hi Silkstream,
I think we need to clarify what your business model is. You say you have a physical location in each of your 25 towns. So far, so good, but are you saying that your business has in-person transactions with its customers at each of the 25 locations? The confusion here is arising from the fact that e-commerce companies are typically virtual, meaning that they do not have in-person transactions with their customers. The Google Places Quality Guidelines state:
Only businesses that make in-person contact with customers qualify for a Google Places listing.
Hence my wanting to be sure that your business model is actually eligible, given that you've described it as an e-commerce business, which would be ineligible. If you can clarify your business model, I think it will help you to receive the most helpful answers from the community.
-
You scared me then Chris!
-
Of course, if you've got the physical locations, you're in good shape there.
-
"It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back to haunt you in the form of merging or penalization."
Why? The business has a physical location in every town, so why should they not have a page for every location? This is what we were advised to do?
"If there was no other competition, you would almost certainly rank for your keywords along with the town name"
I have used this tactic before, for another nationwide business, but on a smaller scale, and it worked. I.e., they ranked (middle of page 1), but for non-competitive keywords, and the page had strong backlinks. With this site, the competition is stronger and the pages will not have a strong backlink profile at first.
My biggest worry is that cutting all the existing pages will lose the 80% long-tail traffic the site currently pulls in. But what other way is there to tackle so much duplicate content?
-
It sounds like you're saying that your one ecommerce company has 25 Google local business listings--and growing?! It's very possible that could come back to haunt you in the form of merging or penalization. If not that, it's likely to stop being worth the time as a visibility tactic.
As far as whether mentioning surrounding towns in your page copy will be enough to rank for them, it depends on the competition. If there were no competition, you would almost certainly rank for your keywords along with the town name; with competition, all the local ranking factors start coming into play, and your ability to rank for each town will depend on a combination of all of them.