Job Posting Page and Structured Data Issue
-
We have a website where we publish job postings. We add the data to our website manually.
The job postings are covered by various other websites, including the original recruiting organisations, and the details of each posting remain the same everywhere: the eligibility criteria, the exam pattern, the syllabus, etc.
We create pages where we list the jobs, and we keep the detailed pages, which contain the duplicate data, disallowed in robots.txt.
Lately, we have been thinking of indexing these pages as well, as the number of non-indexed pages is very high, and some of our competitors have these pages indexed. But we are not sure whether doing this is the right move, or whether there is a safe way to deal with it. Additionally, some job posts have very little data (fees, age limit, salary, etc.), which is thin content, so that might contribute to a poor-quality assessment.
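As an aside on mechanics, the setup described above - listing pages crawlable, detailed pages disallowed - corresponds to robots.txt rules like the ones below. The paths are hypothetical placeholders, not the site's real URLs; Python's standard `urllib.robotparser` is a quick way to sanity-check such rules before deploying them:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt mirroring the setup described in the question:
# listing pages crawlable, detailed job pages blocked. The /jobs/ paths
# are assumptions for illustration, not the asker's actual URL structure.
robots_txt = """\
User-agent: *
Disallow: /jobs/detail/
Allow: /jobs/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Listing page: allowed; detail page: blocked
print(rp.can_fetch("*", "https://example.com/jobs/"))            # True
print(rp.can_fetch("*", "https://example.com/jobs/detail/123"))  # False
```

Testing rules this way before editing the live file avoids accidentally blocking (or unblocking) whole sections of the site.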
Secondly, we wanted to use rich result snippets for our job postings. Google doesn't want the structured data to be placed on the listing page:
"Put structured data on the most detailed leaf page possible. Don't add structured data to pages intended to present a list of jobs (for example, search result pages). Instead, apply structured data to the most specific page describing a single job with its relevant details."
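For reference, the structured data Google is describing here is JobPosting markup embedded in each detail page inside a `<script type="application/ld+json">` tag. The sketch below uses placeholder values throughout - every name, date, and amount is an assumption to illustrate the shape, not data from the site in question. Google's documentation lists `title`, `description`, `datePosted`, `hiringOrganization`, and `jobLocation` among the key properties:

```json
{
  "@context": "https://schema.org/",
  "@type": "JobPosting",
  "title": "Staff Nurse (Grade II)",
  "description": "<p>Eligibility criteria, exam pattern, and syllabus details go here.</p>",
  "datePosted": "2024-01-15",
  "validThrough": "2024-02-15T00:00:00",
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example Recruiting Board"
  },
  "jobLocation": {
    "@type": "Place",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "New Delhi",
      "addressCountry": "IN"
    }
  },
  "baseSalary": {
    "@type": "MonetaryAmount",
    "currency": "INR",
    "value": { "@type": "QuantitativeValue", "value": 45000, "unitText": "MONTH" }
  }
}
```

This is exactly why the detail pages matter: the markup describes a single job, so there is no valid place for it on a listing page.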
Now, how do we handle this situation? Is it safe to allow, in robots.txt, the detailed pages that have duplicate job data and sometimes not-so-high-quality content?
-
First of all, those more detailed URLs should have been handled via canonical tags, not via robots.txt.
You are probably safe to allow the detailed URLs to rank; try allowing a sample of them whilst keeping the others disallowed. First, fix the architecture: stop using robots.txt, and on the detailed URLs add canonical tags pointing to their parent listing pages.
Once that is done, select a volume of the detailed URLs as a test. Remove the canonical tags from those URLs, allowing them to index. Do they start ranking and performing? Do you get duplicate content warnings?
Depending on the outcome, you may want to lift the canonical tags from all detailed URLs, or even reverse the canonicals so that the detailed pages have ranking preference.
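For concreteness, the canonical setup described above is a single link element in the `<head>` of each detailed page, pointing at its parent listing URL. Both URLs below are hypothetical placeholders:

```html
<!-- In the <head> of a detailed page, e.g. /jobs/staff-nurse-2024/details/ -->
<!-- Tells search engines the parent listing page is the preferred URL -->
<link rel="canonical" href="https://www.example.com/jobs/staff-nurse-2024/" />
```

Lifting the canonical for a test URL then just means removing this tag (or making it self-referencing), after which the detail page becomes eligible to index on its own.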
Related Questions
-
I have a metadata issue. My site crawl is coming back with missing descriptions, but all of the pages look like site tags (i.e. /blog/?_sft_tag=call-routing)
Intermediate & Advanced SEO | amarieyoussef
-
Post-migration issues - #11 + configuration issue
Hello Moz community. I'm keen to find out your experiences on the following. Have you ever experienced a migration whereby a large % of keywords are stuck in position #11 post-migration? The keywords do not move up or down (whilst competitors jump from 13 to 9 and vice versa) over a three-month period. Please see the % difference in the attached e-mail (sample of 1,000 keyword terms). Question: Has anyone ever experienced this type of phenomenon before? If so, what was the root cause, did it happen post-migration, and what solution did you use to rectify it? Secondly, have you ever seen a cross-indexing issue between two domains (each serving a different purpose) post-migration, which impacts the performance of the main brand domain? I will explain a little further - say you have www.example.com (brand site) and www.example-help.com (customer service site), and the day the brand website is migrated (same domain, just a different file structure), www.example-help.com points to the same server that www.example.com is on and starts to inherit the legacy file structure. For example, the following is implemented on migration day: www.example.com/fr/widgets-purple 301s to www.example.com/fr/widgets/purple, but www.example-help.com now points to the same server where the customer service content is hosted.
So although the following is rendered correctly: www.example-help.com/how-can-we-help, we also have the following indexed in Google.fr, competing for the same keyword terms while the main brand website has dropped in rankings: www.example-help.com/fr/widgets-purple [legacy content from the main brand website]. Even when the legacy content is 301 redirected from www.example-help.com to www.example.com, the authority isn't passed across, and www.example.com (as per Q1) is now a lot lower in Google than pre-migration. Question: Have you ever experienced a cross-indexing issue like the above, whereby Google potentially isn't passing authority across from the legacy to the new setup? I'm very keen to hear your experiences on these two subjects and whether you have had similar problems on any of your domains.
Intermediate & Advanced SEO | SMVSEO
-
XML sitemap issue... XML sitemap generator including only a few pages for indexing
Help me - I have a website. Earlier, 10,000 webpages were included in the XML sitemap for indexation, but for the last few days the XML sitemap generator has been including only 3,300 webpages for indexing. Please help me to resolve the issue. I have checked Google Webmaster Tools' indexed pages; it's showing 8,141. I have tried 2-3 paid tools, but all of them include only 3,300 pages for indexing. I am not getting what the exact problem is - whether the server is not allowing the crawl, or the problem is with the XML sitemap generator. Please, please help me…
Intermediate & Advanced SEO | udistm
-
Site with fewer than 20 pages shows 1,400+ pages when crawled
Hello! I'm new to SEO, and have been soaking up as much as I can. I really love it, and feel like it could be a great fit for me - I love the challenge of figuring out the SEO puzzle, plus I have a copywriting/PR background, so I feel like that would be perfect for helping businesses get a great jump on their online competition. In fact, I was so excited about my newfound love of SEO that I offered to help a friend who owns a small business with his site. Once I started, though, I found myself hopelessly confused.

The problem comes when I crawl the site. It was designed in WordPress, and is really not very big (part of my goal in working with him was to help him get some great content added!). Even though there are only 11 pages - and 6 posts - for the entire site, when I use Screaming Frog to crawl it, it sees hundreds of pages. It stops at 500, because that is the limit for the free version. In the campaign I started here at SEOmoz, it says over 1,400 pages have been crawled, with something like 900 errors. Not good, right?

So I've been trying to figure out the problem. When I look closer in Screaming Frog, I can see that some things are being repeated over and over. If I sort by the title, the URLs look like they're stuck in a loop somehow - one line will have /blog/category/postname, the next line will have /blog/category/category/postname, the next line will have /blog/category/category/category/postname, and so on, with another /category/ added each time.

So, with that, I have a few questions: Does anyone know what the problem is, and how to fix it? Do professional SEO people troubleshoot this kind of stuff all of the time? And is this the best place to get answers to questions like that - and if not, where is?

Thanks so much in advance for your help! I've enjoyed reading all of the posts available here so far; it seems like a really excellent and helpful community. I'm looking forward to the day when I can actually answer the questions! 🙂
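One common cause of this exact /category/category/category/ pattern - offered here as a hypothesis, since the theme's actual markup isn't shown - is a relative href in the theme that omits the leading slash, so each crawled page resolves the link against its own URL and appends the path again:

```html
<!-- Problematic: a relative path resolves against the current page's URL,
     so from /blog/category/ this link becomes /blog/category/category/postname -->
<a href="category/postname">Post title</a>

<!-- Fixed: a root-relative (or absolute) path resolves identically from any page -->
<a href="/blog/category/postname">Post title</a>
```

WordPress setups often answer these ever-deeper nonsense URLs with the same content instead of a 404, which is why the crawler never runs out of "new" pages to find.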
Intermediate & Advanced SEO | K.Walters
-
I had an issue with a friendly-URL module and lost all my rankings. Will they return, now that the issue is resolved, the next time I'm crawled by Google?
I have 'Magic SEO URLs' installed on my Zen Cart site, except that, for some reason no one can explain, the files were disabled. So my static links went back to dynamic ones (index.php?**********) etc. The issue with the module was resolved, but in that time Google must have crawled my site and I lost all my rankings; I'm nowhere to be found in the top 50. Did this really cause as extravagant an SEO issue as my web developers told me? Can I expect my rankings to return the next time my site is crawled by Google?
Intermediate & Advanced SEO | Pete79
-
301 - should I redirect entire domain or page for page?
Hi, we recently enabled a 301 on our domain from our old website to our new website. On the advice of fellow mozzers, we copied the old site exactly to the new domain, then did the 301, so that the sites are identical. The question is: should we be doing the 301 as a whole-domain redirect, i.e. www.oldsite.com now redirects to www.newsite.com, or individually setting each page, i.e. www.oldsite.com/page1 is now www.newsite.com/page1, etc., for each page in our site? Remember that both old and new sites (for now) are identical copies. Also, we set the 301 about 5 days ago and have verified it's working, but we haven't seen a single change in rank for either the old site or the new one - is this because Google likely hasn't re-indexed yet? Thanks, Anthony
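On an Apache server, a whole-domain redirect that still maps page-for-page can be sketched in .htaccess roughly as follows. The domain names are placeholders, and this assumes mod_rewrite is enabled:

```apache
RewriteEngine On
# Match any host variant of the old domain…
RewriteCond %{HTTP_HOST} ^(www\.)?oldsite\.com$ [NC]
# …and 301 the identical path to the new domain
RewriteRule ^(.*)$ https://www.newsite.com/$1 [R=301,L]
```

Because `$1` carries the original path across, this behaves like an individual redirect for every page while needing only one rule - so the whole-domain vs. page-for-page choice largely dissolves when the URL structures are identical.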
Intermediate & Advanced SEO | Grenadi
-
Our site is receiving traffic for both .com/page and .com/page/ with the trailing slash.
Our site is receiving traffic for both .com/page and .com/page/ with the trailing slash. Should we rewrite to just the version with the trailing slash, or the one without, because of the duplicates? The other question is: if we do a rewrite, Google has indexed some pages with the slash and some without - I am assuming we will lose rank for one of them once we do the rewrite, correct?
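If the no-trailing-slash version is chosen, a normalization rule on Apache can be sketched roughly like this (the server stack is an assumption; equivalent rules exist for Nginx and other servers):

```apache
RewriteEngine On
# Skip real directories, which legitimately use trailing slashes
RewriteCond %{REQUEST_FILENAME} !-d
# 301 /page/ to /page so only one variant stays indexable
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

Because the redirect is a 301, signals from the redirected variant are consolidated into the surviving URL rather than simply lost.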
Intermediate & Advanced SEO | Profero
-
Will Google Visit a Non-Canonicalized Page Again and Restore Its Original Ranking?
I have 2 questions about canonicalization. 1. Will Google ever visit Page A again after it has been canonicalized to Page B? 2. If Google still visits Page A and finds that it no longer canonicalizes to Page B, will the original rankings and traffic of Page A return to what they were before it was canonicalized? Thanks.
Intermediate & Advanced SEO | globalsources.com