Lots of city pages - how do I ensure we don't get penalized?
-
We are planning on having a job posting page for each city where we are looking to hire new CFO partners. The problem is, we have LOTS of locations. What would be the best way to have similar content on each page (since the job description and requirements are the same for each posting) without being hit by Google for duplicate content? One of the main reasons we decided to have location-based pages is that we noticed visitors to our site are searching for "cfo job in [location]", but most of these visitors then leave. We believe it is because the pages they land on make no mention of the location they were looking for and are a little incongruent with what they were expecting.
We are looking to use the following URL and Title/Description as an example:
URL: http://careers.b2bcfo.com/cfo-jobs/Alabama/Birmingham
Title: CFO Careers in Birmingham, AL
Description: Are you looking for a CFO career in Birmingham, Alabama? We're looking for partners there. Apply today!
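As a sketch, each city page's head section could follow a single template with the city and state swapped in per location. The markup below simply extends the Birmingham example above; the exact template fields are an assumption, not the site's actual code:

```html
<!-- Hypothetical <head> template for one city page; the city, state,
     and URL path would be filled in per location. -->
<head>
  <title>CFO Careers in Birmingham, AL</title>
  <meta name="description"
        content="Are you looking for a CFO career in Birmingham, Alabama? We're looking for partners there. Apply today!">
</head>
```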
Any advice you have for this would be greatly appreciated.
Thank you.
-
We would have the job description on each page mentioning the locations, then we would also have the job capture form.
You are right in that these descriptions do have unique data on them. I am thinking we are just going to have to take the time to write as much unique content as possible.
Thanks for the feedback.
-
Hey, the last sentence was based around other ways to bring in this inbound traffic but scratch that for now.
So, have you examined how these other, well-ranking sites are doing what they do? Are they living off the fact that they are big domains? Is the content on these pages unique? I just Googled:
CFO Careers in Birmingham, AL
And it appears they are job listings specific to that location, so I am guessing that content is fairly unique and the listings are the content.
These pages that you would create, what content would they have on them? Would they all be different?
My initial understanding was that this would just be a data capture form, but if we actually have unique job listings like on Indeed.com, SimplyHired, Jobs2Careers, etc., then these pages should be unique enough to rank.
Or am I missing something? (It is late in the day here, 7pm, and I'm hitting my 12th hour of work, so the old synapses may be failing me somewhat.)
-
Marcus,
I am not sure I understand the last line of your post. But I have looked at the Keyword Difficulty tool, and these are fairly competitive phrases.
The problem we have is that we are competing against the likes of Indeed.com, Monster.com and sites such as that. While we do use these sites, they don't quite provide the flexibility we are looking for.
We used to rank quite highly for these types of phrases, but I have noticed a recent trend of Google ranking the job search sites ahead of us. The hope is that if we provide similar content, then Google would start pushing us up the rankings again.
-
A lot of this depends on the competitiveness of the search query and would need some testing to better determine your approach.
You can use the Keyword Difficulty tool here, but also just Google the terms and see what comes up. If the results are weak, you could try this as a stage 1 approach and see how you get on.
Maybe there is another way to think about it: what about the job listings themselves, or does it not work that way?
-
I think these need to be indexed, as it is through organic search that people have been getting to our site using terms such as "cfo jobs in [location]".
I have been thinking about adding new content for each city, but you are right, that is a LOT of work. I wonder if it might be worth having one page with unique, location-based content for the main city in an area, with a list of nearby cities on the page that we are also hiring in.
-
Hey Danny
A few suggestions:
1. Make each location page unique enough that you can safely have it on the site without worrying about duplication (lots of work).
2. If people are only searching or browsing to these pages internally, then don't index them (robots.txt / meta noindex).
3. You could do this dynamically and use a canonical to your main enquiry page on these pages.
4. You could just create all the variations and add a canonical to your main enquiry page; if it is not mega competitive, they may still rank (a bit risky, but easy to fix if it causes issues).
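For reference, options 2 and 3 above each come down to one tag in the city page's head. This is just a sketch; the main enquiry page URL below is hypothetical, not taken from the site:

```html
<!-- Option 2: keep the page out of Google's index entirely -->
<meta name="robots" content="noindex, follow">

<!-- Option 3: let the page be crawled, but point ranking signals
     at the main enquiry page (hypothetical URL) -->
<link rel="canonical" href="http://careers.b2bcfo.com/cfo-jobs/apply">
```

One caveat: a meta noindex only works if Google can crawl the page to see it, so don't also block these URLs in robots.txt if you go that route.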
I would always try to look at this from the perspective of your users. If you don't really care about having these as organic search landing pages, then simply noindexing them would seem an ideal solution.
Hope that helps!
Marcus