Posts made by trevogre
- RE: Meta Keywords Good or Bad
What about using it simply as a place to keep your targeted keyword for a given page while you are working? If you have a number of hand-coded pages, you could quickly go through and note what you are aiming for on each page before you start changing it to suit. I'm not sure why you would care if competitors knew: if your page is optimized, the keyword phrase for that page is already in the title and the h1, so competitors can crawl your site and collect all of them anyway.
- RE: Local SEO: How to optimize for multiple cities on website
"Why would you take the approach of creating blank pages to be filled out later?"
You would create "blank" pages, by which I mean templated pages with calls to action or forms, because any pages you link into your site hierarchy should make sense together. You wouldn't want a service-areas page that links to a few cities with good content but leaves a broken index for the rest, where some cities are embellished and others aren't listed or aren't links. I understand the SEO caution, but if you build a site flow where your landing pages are actually linked in, a user wondering whether you will service them can go to the service-areas page, click their location if it's there, fill out a request form, and/or read whatever local content you have drummed up to make that page work as a landing page.
So I suppose they wouldn't be blank pages, but they would be duplicates in the sense that they would only serve as request pages for a certain area. If you do it this way, the question is whether you can just nofollow the links so that you keep a site architecture that makes sense while still acknowledging to Google that you don't have something unique for a given city/service combo (even though you really do want it listed).
I wasn't talking about the local pack. What I'm talking about is location-based organic search: if you search for "lawyer" you might get the local pack, but you might also get organic results from high-content sites that don't address your concern, so you add a city + service to find what you are looking for in the organic results. In that case, location is completely relevant, because the organic results will return pages optimized for a given city rather than a truly local service-area result. In those circumstances there is a bias in organic toward the city name as a keyword rather than a location/quality mix that respects the business type. For food you want food in a given city; for lawyers you want the closest lawyer to that city who is expert at solving your problem, not the guy who just happens to have an address in that city. That is what the local pack is for (if I understand it properly): for when you want to find a business in a given location, not when you want the best organic result around your location.
So I think Google isn't returning quality results for city-name keywords because it isn't parsing intent.
I have seen this in a few places when analyzing competitors: they have lower page and domain authority but still rank ahead of your site simply because they put the city name in the title or repeat it on the page. While that might be great SEO, those are bad search results, because you get sites that are forced to be spammy by injecting geo-targeted keywords, and you end up with the problem everyone here is wrestling with: making cruddy landing pages to capture those searches rather than focusing on high-quality content.
My intuitive grasp of this is lacking, so maybe Google is doing something I don't understand. But in some categories I would like to see results from a different city than the one I entered when those results have better content. That might further disenfranchise small-town businesses, but it would also give them the opportunity to focus on quality content rather than geo landing pages.
- RE: Local SEO: How to optimize for multiple cities on website
So, I haven't looked this up; I thought I'd just ask here.
Say I create a service-areas page with links to landing pages for each county/city that is serviced, then create the pages for each area, leave the content blank, and fill it in over time. Is the absence of content considered duplicate content? Would you get penalized for the essentially blank pages?
If so, could you keep the pages but mark them noindex (or nofollow the links to them), so that you are telling Google you want the page for users but don't consider it something it should index (and penalize you for)?
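Something like this is what I have in mind for generating the placeholder pages, just a rough sketch to show the idea; the city list, the "real content" set, the template, and the file names are all made up for illustration:

```python
# Rough sketch: generate templated city landing pages that carry a
# robots noindex tag until real local content exists for that city.
# City list, template, and output paths are hypothetical.

CITIES = ["Olympia", "Lacey", "Tumwater"]      # hypothetical service areas
CITIES_WITH_REAL_CONTENT = {"Olympia"}         # pages I have actually written up

PAGE_TEMPLATE = """<!DOCTYPE html>
<html>
<head>
  <title>Family Law in {city} | Example Firm</title>
  {robots}
</head>
<body>
  <h1>Family Law Services in {city}</h1>
  <!-- request form preloaded with the {city} service area goes here -->
</body>
</html>
"""

for city in CITIES:
    if city in CITIES_WITH_REAL_CONTENT:
        robots = ""  # unique content exists, let it be indexed
    else:
        # placeholder page: keep it in the site flow for users,
        # but ask search engines not to index it yet
        robots = '<meta name="robots" content="noindex, follow">'
    html = PAGE_TEMPLATE.format(city=city, robots=robots)
    with open(f"service-areas-{city.lower()}.html", "w") as f:
        f.write(html)
```

As I understand it, the "noindex, follow" directive tells search engines not to index the placeholder page but still to follow its links, so the site structure stays intact for users without asking Google to rank a thin page.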
You could put a form on those pages, which I would assume isn't considered "content" (or shouldn't be), and have the form preloaded with the location information for that service area.
The other issue I see here is with the concept of "core" services. If you are a law firm with many lawyers and lines of business, you would logically want each service area / line of business combination to have its own landing page.
So without complaining about the unique-content problem, which I get, structurally you want to build all of this out so that it acts as a placeholder and exists in non-competitive areas, where, unique content or not, you are the only person targeting a given keyphrase. But you don't want to be considered spam.
So I'm not sure what the right answer is. It doesn't seem right not to optimize for an adjacent city/service combination just because there are only so many things to say about that service.
The suggestion of "service description pages" and "city landing pages" is, I guess, a place to start.
Ultimately, I think this is a bias that Google is supporting that is wrong. It assumes an urbanized world is a positive: larger cities, while potentially more competitive because of the larger traffic, get an unfair advantage over businesses in outlying areas, and non-location-sensitive businesses (anything knowledge related) get penalized for not being in urban areas. Ultimately I think this leads to poor organic search results, because the quality of a small business has nothing to do with its location. I suppose the current approach helps by allowing local businesses to be listed at all against stronger competitors, but I think it would be better to use a combination of signals, so that you show as local to the nearest n people. In a city like Seattle you might have an audience of a million, but in a city of 50,000 that runs into other cities of 50,000 within 20 miles, the definition of local should change.
I think when you break it down, geographic terms should always be parsed out, so that "cars seattle" doesn't actually look for the word Seattle but looks for cars (+ locations within x miles of Seattle) that also have high domain or page authority. I think that would lead to better results and would solve the problem of location-based optimization so that we can stop wasting time on it.
It could start with a default service-area size, but then calculate the service area based upon result density, so a search for a given business type would automatically return results from three states away if the next-closest business was five states away.
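To make that concrete, here is a toy sketch of what I mean by widening the service area based on result density and then ranking by quality rather than by whether the city name appears on the page; the businesses, coordinates, and authority scores are entirely made up:

```python
import math

# Toy illustration of "expand the service area until there are enough
# results nearby" -- all businesses and numbers are hypothetical.

BUSINESSES = [
    # (name, latitude, longitude, quality score such as domain authority)
    ("Smith Law",  47.04, -122.90, 42),
    ("Jones Law",  47.25, -122.44, 55),
    ("Acme Legal", 47.61, -122.33, 61),
]

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles (haversine)."""
    r = 3959.0  # Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def local_results(searcher_lat, searcher_lon, min_results=3, start_radius=25.0):
    """Expand the radius until at least min_results businesses are found,
    then rank the ones inside it by quality rather than by city name."""
    radius = start_radius
    while True:
        nearby = [
            (name, score, miles_between(searcher_lat, searcher_lon, lat, lon))
            for name, lat, lon, score in BUSINESSES
            if miles_between(searcher_lat, searcher_lon, lat, lon) <= radius
        ]
        if len(nearby) >= min_results or radius > 500:
            # rank by quality signal, not by which pages mention the city
            return sorted(nearby, key=lambda b: b[1], reverse=True), radius
        radius *= 2  # result density too low: widen the service area

# Searcher in a small town between cities (made-up coordinates)
results, radius_used = local_results(47.10, -122.60)
print(f"Radius used: {radius_used} miles")
for name, score, dist in results:
    print(f"{name}: authority {score}, {dist:.1f} miles away")
```

Obviously the real signals are far more complicated; the point is only that the radius comes from how dense the results are, not from which pages happen to repeat the city name.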
I'm sure some of this is already around; I'm just sharing my thoughts because this is a massively irritating and time-consuming issue. I suppose it just pushes us further in the direction of content and link building, equally unnatural pursuits for your average small business. Sometimes I feel like Google should reward businesses that don't create content and somehow work on the less-is-more principle.
Is the best search result the one from a business paying crazy rents in a large city, or one in an adjacent city that is more affordable and equally qualified but doesn't show up because it isn't inside the city limits?