Location Pages on Website vs. Landing Pages
-
We have been having a terrible time in the local search results for 20+ locations. I have Google Places set up and all, but we decided to create location pages on our site for each location - a brief description and content optimized for our main service. The path would be something like .com/location/example.
One option that has come up is to create landing pages / "mini websites" that would probably live at something like location-example.url.com.
I believe that the latter option, mini sites for each location, would be a bad idea, as those kinds of tactics have long been considered spammy.
What are your thoughts, and what resources can I use to convince my team of the best practice?
-
Hi KJ,
Agree with the consensus here that building mini sites is not the right approach. Take whatever energy you would have put into developing these and channel it into making the landing pages for your locations the best in their industry/towns. I was just watching a great little video by Darren Shaw in which this is one of the things he covers. Might be worth sharing with your team:
http://www.whitespark.ca/blog/post/70-website-optimization-basics-for-local-seo
And earlier this year, Phil Rozek penned some pretty fine tips on making your pages strong:
I am curious about one element of your original post. You mention, "We have been having a terrible time in the local search results for 20+ locations." I wasn't sure whether you were saying that you've never done well in them, were doing well in them until something changed (such as the universal rollout of Local Stacks), or something else. If the latter, I would guess that a huge number of businesses are now struggling to cope with the fact that there are only 3 spots to rank for any keyword, necessitating greater focus on lower-volume keywords/categories, organic results, and paid results. Everybody but the top 3 businesses is now in this boat. Very tough.
-
Hi KJ,
First things first: do you have a physical address for each location, and are these set up in Google My Business? I doubt you have premises in each location, so ranking for all the areas is going to be an uphill task.
Google is smart and knows whether you have physical premises in the targeted location; after all, it's all about delivering highly relevant results to its users. Let's say, for example, you're an electrician and a user searches for "Electrician in Sheffield" - realistically, if you only have premises in Leeds, it's going to be difficult to rank above a company that is actually located in Sheffield.
I would first target 2-3 of your primary locations and focus on building 10x content. I would aim to write 1,000+ words for each page (completely unique content) while focusing on your set keywords, but be natural and don't keyword stuff. Put reviews from customers in that specific area on the landing page and build citations from local directories.
Again, you can't build citations unless you have physical premises in the location. Trust me, I've done it for years for a roofing company and it's taken some time to see the results. He's #1 for the city he is located in, but for other cities it's a very difficult task. Writing about the same service for each location is a daunting task too; you could consider a service like Great Content to outsource the writing if you're stuck for ideas. It's a low-budget solution and will save you mountains of time.
I would also use folders and not subdomains. Build a 'service areas' page with a page for each location underneath it.
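For illustration, a hypothetical roofing company's structure might look something like this (made-up domain and towns, purely to show the folder approach):
roofingcompany.co.uk/service-areas/
roofingcompany.co.uk/service-areas/leeds/
roofingcompany.co.uk/service-areas/sheffield/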
-
Hello KJ,
You absolutely don't want to begin creating subdomains for different locations. That will split your link equity across multiple subdomains (rather than consolidating it within a single domain).
It sounds like you are attempting a silo structure for your website (multiple locations targeting the same keyword), but this can be seen as keyword stuffing if done incorrectly. Using multiple pages to rank for a single keyword is problematic, as it raises both Panda and Penguin red flags. What you want to do is rank for different keywords, or at least ensure that the content on each of these location pages is unique and sufficiently long (500+ words) to avoid arousing suspicion.
Your site structure sounds like it is okay. For example, a silo we put in place for one of our clients followed this pattern:
domain.com/country/region/city/service
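(A concrete, hypothetical instance of that pattern would be something like domain.com/uk/yorkshire/sheffield/roof-repairs.)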
We hit about 15 cities using this tactic, and they have been sitting on the first page for the last year or so. We also built sufficient links to the home page and relevant pages and ensured that our technical SEO was spotless, so perhaps these are the areas you might engage your team to move forward on.
If you want to know more about our process, feel free to touch base and I will provide what advice I can.
Hope this helps and best of luck moving forward!
Rob
-
Right. You will not beat the other folks with the subdomain approach. You are getting beaten because your competitors are taking the time to make better content in a niche. Find a way to get better content on those pages and mark them up with schema to make the info more readable to the search engines and possibly earn an enhanced listing in the SERPs.
We went through a site relaunch and the review schema on our location pages got messed up. It did not impact our rankings, but it did impact click-through from the search engines: none of the stars were showing up in the SERPs due to the schema goof-up. We got the schema fixed and traffic came back up.
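For reference, a minimal sketch of what LocalBusiness markup with aggregate review data on a location page can look like (every name, address, and rating below is a placeholder, not a real listing):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Roofing - Sheffield",
  "url": "https://www.example.com/locations/sheffield/",
  "telephone": "+44-114-000-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Sheffield",
    "addressRegion": "South Yorkshire",
    "postalCode": "S1 1AA",
    "addressCountry": "GB"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "37"
  }
}
</script>

Validating markup like this with Google's structured data testing tools before and after a relaunch is a cheap way to catch exactly the kind of breakage described above.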
This link will point you toward the relevant Moz resources:
https://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating
If you are happy with my response, please feel free to mark it as a "Good Answer" - thanks!
-
I agree with you. Some marketing people believe the reason we cannot beat out smaller companies is that we are too diverse in our services. We do great with niche keywords and markets, but we are being beaten by companies that focus on only one of our key services. That is why they thought subdomains would do better. I remember Rand posting something on subdomains vs. subfolders, but I cannot find the original source.
Thanks for your answer...
-
This is similar to the question of whether a blog should be on a subdomain (blog.website.com) or in a folder (website.com/blog).
Most people agree that the folder is the better option: with every blog post that earns links, you are building your domain authority, and generally speaking, a rising tide lifts all ships.
You would run into the same issue with your option to set up subdomains for each location. You would also end up having to deal with separate webmaster accounts for each, etc. I don't think the subdomain is the solution. I run a site with thousands of locations using a folder structure, and the business pages rank well for a given location when you search on the name of the location - so I know it works, and it can be managed at scale.
I would get back to looking at any technical issues you have and your on-page options for the pages. Is there anything more you can do to make these pages 10x better than any other page on the net for those locations?
Good luck!