Job Posting Page and Structured Data Issue
-
We run a website where we publish job postings. We add the data to the site manually.
The job postings are also covered by various other websites, including the original recruiting organisations. The details of a posting remain the same everywhere, for instance the eligibility criteria, the exam pattern, the syllabus, and so on.
We create pages that list the jobs, and we keep the detailed pages, which carry the duplicate data, disallowed in robots.txt.
Lately we have been thinking of indexing these detailed pages as well, as the number of non-indexed pages is very high and some of our competitors have these pages indexed. But we are not sure whether this is the right move, or whether there is a safe way to deal with it. There is also the problem that some job posts have very little data, such as fees, age limit, and salary, which is thin content and might contribute to a poor-quality issue.
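Our robots.txt rule for those detailed pages looks roughly like this (the folder path is simplified for illustration):

```
User-agent: *
Disallow: /job-details/
```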
Secondly, we wanted to use enriched result snippets for our job postings, but Google doesn't want the structured data placed on the listing pages:
"Put structured data on the most detailed leaf page possible. Don't add structured data to pages intended to present a list of jobs (for example, search result pages). Instead, apply structured data to the most specific page describing a single job with its relevant details."
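A minimal JobPosting block on such a leaf page would be JSON-LD along these lines (all values here are hypothetical placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "JobPosting",
  "title": "Junior Clerk",
  "description": "<p>Eligibility criteria, exam pattern, and syllabus details for the post...</p>",
  "datePosted": "2024-01-15",
  "validThrough": "2024-02-15T00:00",
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example Recruiting Board",
    "sameAs": "https://www.example.org"
  },
  "jobLocation": {
    "@type": "Place",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "New Delhi",
      "addressCountry": "IN"
    }
  },
  "baseSalary": {
    "@type": "MonetaryAmount",
    "currency": "INR",
    "value": {
      "@type": "QuantitativeValue",
      "value": 25000,
      "unitText": "MONTH"
    }
  }
}
</script>
```

Google also expects the structured data to match the visible content of the page it sits on.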
Now, how do we handle this situation? Is it safe to lift the robots.txt disallow on the detailed pages, given that they carry duplicate and sometimes not-so-high-quality job data?
-
First of all, those more detailed URLs should have been handled via canonical tags, not via robots.txt.
You are probably safe to allow the detailed URLs to rank; try allowing a sample of them to rank whilst keeping the others disallowed. First, fix the architecture: stop using robots.txt, and on the detailed URLs add canonical tags pointing to their parent listing pages.
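Such a canonical is a single tag in the `<head>` of each detailed page; for example (URLs hypothetical):

```html
<link rel="canonical" href="https://www.example.com/jobs/clerk-recruitment-2024/" />
```

Bear in mind that Google only sees a canonical tag on pages it is allowed to crawl, which is why lifting the robots.txt disallow has to come first.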
Once that is done, select a volume of the detailed URLs as a test. Remove the canonical tags from those URLs, allowing them to be indexed. Do they start ranking and performing? Do you get duplicate-content warnings?
Depending on the outcome, you may want to lift the canonical tags from all detailed URLs, or even reverse the canonicals so that the detailed pages get ranking preference.
Related Questions
-
Doorway page penalty
Has Google changed their interpretation of doorway pages? We do not sell widgets, but allow me to use widgets for this example. If we sold 25 very different widgets, an online vendor would typically have one "mother" website with 25 different inner pages, each page explaining a type of widget they sell. However, for the past 9 years our approach has been to have 25 different websites, one for each widget. With these 25 sites we concentrated on ranking the home page only. All these sites link back to our noindexed "mother" site via nofollow links, where we keep our shopping cart and terms of business. We did this partly to avoid having 25 separate shopping carts, and to avoid having to change our terms 25 times each time that became necessary. But yes, we also did it because it was so much easier to rank each different type of widget in the SERPs. We also think it's a better user experience, as in our business buyers of yellow widgets will not be interested in blue widgets. We have been reading for years that Google does not like doorway pages, but we were not 100% certain whether they might regard our sites as such, because our approach had worked great for nine years. That is, until December last year, when 95% of our sites fell dramatically in the SERPs, usually from page 1 to page 2 or 3. The first thing we did was go through all our sites and search for the obvious: toxic links, duplicate content, keyword density, HTTPS issues, mobility issues, anchor text, and of course content. We found no obvious problems that could affect 95% of the sites at the same time, but we ordered new homepage content for most of our sites from expert SEO writers.
However, after putting on this new content 3-4 weeks ago, our sites have not moved up the SERPs at all. So we are left with the inescapable conclusion that Google sees and devalues our sites as doorway pages, especially as 95% of our sites were affected at the same time. Would any SEO experts on this forum agree, or be able to offer an opinion? If so, what might be the solution going forward? We have two solutions under consideration: 1) Remove all links from each of our 25 sites to our "mother" site, and put a shopping cart and our TOS on each of the 25 sites so they are all truly independent, stand-alone websites. 2) Create 25 inner pages on our mother site (after removing the noindex), one for each of the 25 widgets we sell, then 301 each of the 25 individual sites' home pages to its inner page on the mother site. I think this might be the best solution, partly because almost all of our higher-ranking competitors are ranking their inner pages, not their home pages. But I worry whether these 25 sites will really pass much link juice if they have been devalued by Google. Any advice will be gratefully received.
Intermediate & Advanced SEO | apcsilver90
How long will old pages stay in Google's cache? We have a new site that is two months old, but we are still seeing old pages even though we used 301 redirects.
Two months ago we launched a new website (same domain) and implemented 301 redirects for all of the pages. Two months later we are still seeing old pages in Google's cache. How long should I tell the client it will take for them all to be removed from search?
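For reference, a permanent redirect of the kind described might be configured in Apache along these lines (paths hypothetical):

```apacheconf
# .htaccess: permanently (301) redirect an old URL to its replacement
Redirect 301 /old-page.html https://www.example.com/new-page/
```

In practice old URLs drop out of the index gradually as they are recrawled, which can take weeks to months; the Removals tool in Google Search Console can hide individual URLs from results faster.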
Intermediate & Advanced SEO | Liamis0
Regarding SEO Structured Data
1. Should we add organization schema on all pages of the website, or just the homepage?
2. What is the best practice for catalog page schema, as every website seems to follow a different pattern?
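For context, organization schema is a small JSON-LD block, and a common practice is to place it on the homepage only; a minimal sketch with hypothetical values:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Widgets Ltd",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "sameAs": ["https://www.facebook.com/examplewidgets"]
}
```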
Intermediate & Advanced SEO | Rajesh.Prajapati1
What is the impact of an off-topic page to other pages on the site?
We are working with a client who has one irrelevant, off-topic post ranking incredibly well and driving a lot of traffic. However, none of the other pages on the site that are relevant to this client's business are ranking. Links are good and in line with competitors for the various terms. Oddly, very few external links reference this off-topic post; most point to the home page. The local profile is also in line with competitors, including reviews, categorization, geo-targeting, pictures, etc. No spam issues exist and there are no warnings in Google Search Console. The only thing that seems unusual is this off-topic post, but could it affect rankings on other pages of the site? Would removing the off-topic post potentially help increase traffic and rankings for the other, more relevant pages? Appreciate any and all help or ideas of where to go from here. Thanks!
Intermediate & Advanced SEO | Matthew_Edgar0
I have 2 keywords I want to target, should I make one page for both keywords or two separate pages?
My team sells sailboats and pontoon boats all over the country. So while they are both boats, the target markets are two different types of people... I want to make a landing page for each state, so that if someone types in "Pontoon boats for sale in Michigan" or "Pontoon boats for sale in Tennessee," my website will come up. But I also want to come up if someone is searching for sailboats for sale in Michigan or Tennessee (or any other state, for that matter). So my question is: should I make one page for each state that targets both pontoon boats and sailboats (a total of 50 landing pages), or two pages for each state, one targeting pontoon boats and the other sailboats (a total of 100 landing pages)? My team has seen success targeting each state individually for a single keyword, but has not had a situation like this come up before.
Intermediate & Advanced SEO | VanMaster0
Product with two common names: A separate page for each name, or both on one page?
This is a real-life problem on my ecommerce store for the drying rack we manufacture: some people call it a clothes drying rack, while others call it a laundry drying rack, but it's really the same thing. Search volume is higher for the clothes version, so we give it the most attention. I currently have two separate pages with the on-page optimization focused on each name (URL, title, h1, img alts, etc.): a clothes-focused page and a laundry-focused page. But the ranking of both pages is terrible. The fairly generic homepage shows up instead of the individual pages in Google searches for "clothes drying rack" and "laundry drying rack", though I can get an individual page to appear for a long-tail search like "round wooden clothes drying rack". So my thought is that maybe I should combine both pages into one page that will hopefully be more powerful. We would have to set up the on-page optimization to cover both "clothes & laundry drying rack", but that seems possible. Please share your thoughts. Is this a good idea or a bad idea? Is there another solution? Thanks for your help! Greg
Intermediate & Advanced SEO | GregB1230
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image Search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, so that Googlebot-Image can still pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages while still getting the images picked up... is this possible? Thanks! Leona
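What Leona describes should be possible, since Google's crawlers follow the most specific matching user-agent group in robots.txt, and Googlebot-Image only falls back to the Googlebot rules when it has no group of its own. A sketch, using the folder from the question (worth verifying against Google's current robots.txt documentation):

```
# Block the photo pages from web-search crawling
User-agent: Googlebot
Disallow: /community/photos/

# Give Googlebot-Image its own group so it does not
# inherit the Googlebot rules and can fetch the images
User-agent: Googlebot-Image
Allow: /community/photos/
```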
Intermediate & Advanced SEO | HD_Leona0
Canonical category pages
A couple of years ago I used to receive a lot of traffic via my category pages, but now I don't receive as much; in the past year I've added canonical tags to the category pages. I have 15 genres of category page. Other than most-recent sorting there is no sorting available to users on these pages, so a recently added image link can over time drop off to page 2 of a category. For example:
mysite.com/cat-page1.html = 100 image links per page with numbered page navigation, pages 1-23. A new image link can drop off to page 2.
mysite.com/dog-page1.html = 100 image links per page with numbered page navigation, pages 1-53. A new image link can drop off to page 2.
mysite.com/turtle-page1.html = 100 image links per page with numbered page navigation, pages 1-2. A new image link can drop off to page 2.
On the first page (e.g. mysite.com/cat-page1.html) I've set rel=canonical to mysite.com/cat-page1.html itself. One thing I have noticed is that the unique pop-up short-description tooltips I have on the image links only appear in Google for the first page of each category; it seems to ignore the other pages. In view of this, am I right in applying the canonical, or should I just treat these as normal pages? Thanks
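For reference, the self-referencing canonical described on the first category page would be a single tag in its `<head>` (URL taken from the example above):

```html
<link rel="canonical" href="https://mysite.com/cat-page1.html" />
```

Google's pagination guidance has generally been to let each paginated page carry its own self-referencing canonical rather than pointing deeper pages at page 1, which may explain why only the first pages are being picked up.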
Intermediate & Advanced SEO | Flapjack0