Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Solved: How to solve orphan pages on a job board
-
I'm working on a website that has a job board with over 4,000 active job ads. All of these ads are listed on a single "job board" page, though obviously they don't all load at the same time.
The individual ads aren't linked to from anywhere else, so SEO tools are listing all of these job ad pages as orphans.
How much of a red flag are these orphan pages? Do sites like Indeed have this same issue? Their job ads are completely dynamic, so how do those pages get indexed?
We use Google's Search API to handle any expired jobs, so those are not the issue; it's the active but orphaned pages we're looking to solve. The site is hosted on WordPress.
What is the best way to solve this? Just create a job category page and link to each individual job ad from there? Are there simpler, perhaps more obvious solutions? What would the website structure need to look like for the problem to be solved? I'd appreciate any advice you can share!
-
@cyrus-shepard-0 Thanks so much for your input! The categorization option was what we were thinking about as well, but I'm not sure the client will be ready to invest the time. Will definitely suggest it to them.
I'm not majorly concerned about the jobs being found via Google search as individual posts; it's more about avoiding the orphans, as I'm sure they will be seen as a red flag.
Also, yes, you're correct: the job posts are covered in a sitemap.
-
@michael_m Seems like you have a number of options.
Can you categorize the jobs into more specific types (e.g. region, job type, etc.) and then add them to more category-specific "job board" pages? Even if you had duplication across job boards, it seems like you'd get better crawl and indexation coverage. Anything that creates a clearer crawl path to those pages helps. Even 20-50 job categories (or other sort/filter features) might provide benefit, and those category pages probably have a better chance of ranking on their own.
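Since the site runs on WordPress, one concrete way to get those category pages is a custom taxonomy, which gives you crawlable archive pages that link out to every job assigned to a term. A minimal sketch, assuming the ads live in a hypothetical custom post type named job; the taxonomy name and slug here are placeholders, not your actual setup:

```php
<?php
// Sketch: register a public 'job_region' taxonomy so WordPress creates
// crawlable archive pages (e.g. /jobs/region/london/) that link to every
// job ad assigned to that term. 'job' is an assumed post type name.
add_action( 'init', function () {
    register_taxonomy( 'job_region', 'job', array(
        'public'       => true,  // front-end archives and URLs
        'hierarchical' => true,  // behaves like categories
        'rewrite'      => array( 'slug' => 'jobs/region' ),
        'labels'       => array(
            'name'          => 'Job Regions',
            'singular_name' => 'Job Region',
        ),
    ) );
} );
```

Assigning each ad to a region (and/or job type) term then gives every job page at least one internal link from an archive page.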
Cross-linking from similar/related jobs might also be a good option to explore, much like how we link to related questions here in the Q&A.
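To illustrate, a hedged sketch for a single-job template, reusing the hypothetical job post type and job_region taxonomy from the sketch above:

```php
<?php
// Sketch: on a single job page, link to up to five other jobs sharing a
// region term, giving crawlers a path between otherwise orphaned ads.
$term_ids = wp_get_post_terms( get_the_ID(), 'job_region', array( 'fields' => 'ids' ) );
if ( ! is_wp_error( $term_ids ) && $term_ids ) {
    $related = new WP_Query( array(
        'post_type'      => 'job',
        'posts_per_page' => 5,
        'post__not_in'   => array( get_the_ID() ), // skip the current job
        'tax_query'      => array( array(
            'taxonomy' => 'job_region',
            'field'    => 'term_id',
            'terms'    => $term_ids,
        ) ),
    ) );
    if ( $related->have_posts() ) {
        echo '<ul class="related-jobs">';
        while ( $related->have_posts() ) {
            $related->the_post();
            printf(
                '<li><a href="%s">%s</a></li>',
                esc_url( get_permalink() ),
                esc_html( get_the_title() )
            );
        }
        echo '</ul>';
        wp_reset_postdata(); // restore the main query's global post
    }
}
```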
Orphaned pages aren't always a problem, as long as the pages are getting indexed and ranked. I imagine the search volume is pretty low for some of those jobs, but Google's sitemap indexation report is going to be your friend here.
Hope that helps!
Are the job postings covered in a sitemap? Since the SEO tools are finding them as orphaned, I assume they are discovering the pages via sitemaps.
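If they aren't, WordPress 5.5+ generates XML sitemaps for public post types out of the box; here's a minimal sketch of making sure an assumed job post type is included (the post type name is a placeholder):

```php
<?php
// Sketch: ensure the (assumed) 'job' post type appears in WordPress core
// XML sitemaps (WP 5.5+), so search engines keep discovering the ads even
// without internal links.
add_filter( 'wp_sitemaps_post_types', function ( $post_types ) {
    $job = get_post_type_object( 'job' );
    if ( $job ) {
        $post_types['job'] = $job;
    }
    return $post_types;
} );
```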
Related Questions
-
Why are product pages throwing Missing field "image" and Missing field "price" in WordPress WooCommerce?
I have a WordPress WooCommerce website where I have uploaded hundreds of products, but it's giving me errors in GSC under the merchant listings tab. When I tested, it showed "missing field image" and "missing field price". I have done everything according to https://developers.google.com/search/docs/appearance/structured-data/product#merchant-listing-experiences and applied fixes, i.e. images are 800x800 and the price range is also there. What else can be done here?
Technical SEO | Ravi_Rana
-
How important is Lighthouse page speed measurement?
Hi, many experts cite Lighthouse speed as an important factor for search ranking. It's confusing, because several top sites have Lighthouse scores of 30-40, yet they rank well. Also, some sites that load quickly have a low Lighthouse score (when I test on mobile/desktop they load much quicker than Lighthouse suggests). When we look at other image-rich sites (such as Airbnb, John Deere, etc.), the Lighthouse score can be 30-40. Our site https://www.equipmentradar.com/ loads quickly on desktop and mobile, but the Lighthouse score is similar to Airbnb and so forth. We have many photos, probably 30-40, many of which load async. Should we spend more time optimizing for Lighthouse, or is it OK? Are large images fine to load async? Thank you, Dave
Reporting & Analytics | erdev
-
Page speed or site speed: which one does Google consider a ranking signal?
I've read many threads online claiming that website speed is a ranking factor. A friend's website scores 44 (slow) on Google PageSpeed Insights. Despite his website being slow, he outranks me in Google search results. It confuses me that I optimized my website for speed, yet my competitor's slow site outperforms mine. On Six9ja.com, I did amazing work by reaching my target score of 100 (fast) on Google PageSpeed Insights. But in Google Search Console, some of my pages show average scores, while others show slow scores. Search Console proves me wrong: none of my pages are fast. So where did the fast metrics go? Could it be because I added three AdSense JavaScript snippets to all my blog posts? If so, that means the AdSense code is slowing performance despite having an async tag. I tested my blog post speed and found the page score dropped by 48 due to the three AdSense snippets, leaving me at 62 (average). Now my site speed is 100, but my page speed is 62. Does this mean that Google considers page speed rather than site speed as a ranking factor? Screenshots: https://imgur.com/a/YSxSwOG Regarding: https://six9ja.com/
Reporting & Analytics | Kingsmart
-
Will noindex pages still get link equity?
We think we get link equity from some large travel domains to white-label versions of our main website. These pages are noindexed because they carry the same URLs and content as our main B2C website and have canonicals to the pages we want indexed. The question is: is there REALLY link equity to pages on our domain which have "noindex,nofollow" on them? Secondly, we're looking to put all these white-label pages on a separate structure, to better protect our main indexed pages from duplicate content risks. The best bet would be a subfolder rather than a subdomain, yes? That way, even though the pages are still noindexed, we'd get link equity from these big domains to www.ourdomain.com/subfolder, where we wouldn't to subdomain.ourdomain.com? Thank you!
Reporting & Analytics | HTXSEO
-
Help Blocking Crawlers. Huge Spike in "Direct Visits" with 96% Bounce Rate & Low Pages/Visit.
Hello, I'm hoping one of you search geniuses can help me. We have a successful client who started seeing a HUGE spike in direct visits as reported by Google Analytics. This traffic now represents approximately 70% of all website traffic. These "direct visits" have a bounce rate of 96%+ and only 1-2 pages/visit. This is skewing our analytics in a big way and rendering them pretty much useless. I suspect this is some sort of crawler activity, but we have no access to the server log files to verify this or identify the culprit. The client's site is on a GoDaddy Managed WordPress hosting account. The way I see it, there are a couple of possibilities:
1.) Our client's competitors are scraping the site on a regular basis to stay on top of site modifications, keyword emphasis, etc. It seems like whenever we make meaningful changes to the site, one of their competitors does a knock-off a few days later. Hmmm.
2.) Our client's competitors have this crawler hitting the site thousands of times a day to raise bounce rates and decrease the average time on site, which could likely have a negative impact on SEO. Correct me if I'm wrong, but I don't believe Google is going to reward sites with 90% bounce rates, 1-2 pages/visit and an 18-second average time on site.
The bottom line is that we need to identify these bogus "direct visits" and find a way to block them. I've seen several WordPress plugins that claim to help with this, but I certainly don't want to block valid crawlers, especially Google, from accessing the site. If someone out there could please weigh in on this and help us resolve the issue, I'd really appreciate it. Heck, I'll even name my third-born after you. Thanks for your help. Eric
Reporting & Analytics | EricFish
-
Does traffic coming from AdWords increase overall Domain Authority or PageRank?
If I'm setting up an AdWords campaign, will setting my homepage as the landing page boost my domain rank? And will the PageRank of the landing page get boosted because of the high click rate coming from the AdWords campaign?
Reporting & Analytics | s2bkevin
-
Does Analytics track an order twice on a refresh of the confirmation page?
Hi there,
I have a quick question. Does Google Analytics track an order twice if the user buys a product, sees the confirmation page, and then refreshes, or clicks back and forward again? The order/tracking data would be the same, but I guess the tracking code runs on every refresh and therefore tracks the order twice in Analytics. Or does Analytics know that it is the same order? Can someone clarify this? Thanks!
Regards,
Kasper
Reporting & Analytics | Webdannmark
-
Time on page: What happens when I open many tabs?
Hello everyone, I was studying Analytics and learned that time on page is calculated as the difference between the time you entered the page and the time you clicked through to another one. But how is the time calculated when I open several links in new tabs at different moments? Does Google count the last tab? Just a guess... Thanks!
Reporting & Analytics | seomasterbrasil