Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
Google for Jobs: how to deal with third-party sites that appear instead of your own?
-
We have shared our company's job postings on several third-party websites, including The Muse, as well as putting the job postings on our own website. Our site and The Muse have about the same schema markup except for these differences:
The Muse...
• Lists Experience Requirements
• Uses HTML in the description with tags and other markup (our website just has plain text)
• Has a Name in JobPosting
• URL is specific to the position (our website's URL just goes to the homepage)
• Has a logo URL for Organization

When you type the exact job posting's title into Google, The Muse posting shows up in Google for Jobs, not our website's duplicate copy. The only way to see our website's job posting is to type in the exact job title plus "site:http://www.oursite.com".
What is a good approach for getting our website's posting to be the priority in Google for Jobs? Do we need to remove postings from third-party sites? Structure them differently? Do organic factors affect which version of the job posting is shown, and if so, can I assume that our site will face challenges outranking a big third-party site?
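For reference, the fields being compared above map onto schema.org JobPosting properties. Here is a minimal sketch of a posting that includes all of them, title, HTML description, experienceRequirements, a job-specific URL, and a hiringOrganization with a logo. The property names come from schema.org; every value (company, URL, location) is a placeholder, not taken from the thread:

```python
import json

# Illustrative JobPosting structured data covering the fields discussed
# above. All concrete values are placeholders.
job_posting = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "Senior Widget Engineer",
    # HTML description (The Muse's approach), rather than plain text
    "description": "<p>Design and build widgets.</p><ul><li>5+ years experience</li></ul>",
    "datePosted": "2017-07-01",
    "validThrough": "2017-09-01",
    "employmentType": "FULL_TIME",
    "experienceRequirements": "5+ years of widget engineering",
    # Job-specific URL, not the homepage
    "url": "https://www.oursite.com/careers/senior-widget-engineer",
    "hiringOrganization": {
        "@type": "Organization",
        "name": "Our Company",
        "sameAs": "https://www.oursite.com",
        "logo": "https://www.oursite.com/images/logo.png",
    },
    "jobLocation": {
        "@type": "Place",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Main St",
            "addressLocality": "Ann Arbor",
            "addressRegion": "MI",
            "postalCode": "48103",
            "addressCountry": "US",
        },
    },
}

# Emit as a JSON-LD script block ready to drop into the page's <head>.
script_tag = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
    job_posting, indent=2
)
print(script_tag)
```

In the scenario described, the key gap is that the company's own page lacks the job-specific URL, HTML description, and logo that the third-party copy has.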
-
We have found the following:
1. Using the Indexing API is better than waiting for Google to crawl the jobs.
2. Google has "must have" data fields, but also "would like to have" and "would be tickled pink if you have" fields. Filling in all three tiers changed rankings in the testing we have done.
3. The quality of the title you give versus the title Google actually understands matters.
4. The overall authority of your site seems to play a role. Nothing exact on this yet; it's a gut-feel factor.
5. SERP results are also jumping around like crazy right now: we see the Google for Jobs panel with no organic links for a search, then four hours later four organic links for the same search, then a day later two, then none, then back to four, then an hour later none. Testing Google for Jobs since it landed in the UK three weeks ago, its results are inconsistent with its own rules: we have found jobs with the wrong suggested title format and the wrong address format, landing pages that are not actual jobs have found their way onto the service, jobs with red warnings have made it on, and the list goes on.
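Point 1 above can be sketched as follows. The endpoint and request-body shape follow Google's Indexing API documentation; the job URL is a placeholder, and authentication (an OAuth 2.0 bearer token for a Search Console-verified service account) is only noted in comments rather than implemented:

```python
import json

# Notifying Google's Indexing API that a job URL was added, updated, or
# removed, instead of waiting for a recrawl. The endpoint below is from
# Google's Indexing API docs; the job URL is a placeholder.
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(job_url: str, removed: bool = False) -> dict:
    """Build the request body: URL_UPDATED for new/changed postings,
    URL_DELETED when a job is taken down."""
    return {
        "url": job_url,
        "type": "URL_DELETED" if removed else "URL_UPDATED",
    }

payload = build_notification("https://www.oursite.com/careers/senior-widget-engineer")
print(json.dumps(payload))
# An actual call would POST this JSON to ENDPOINT with an OAuth 2.0
# bearer token for a service account that is verified as an owner of
# the site in Search Console.
```

Sending URL_DELETED when a position is filled matters too, since expired jobs are one of the things Google flags.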
-
Yeah, I'm sorry I'm not seeing a really good resource for you, Kevin. It's early days. The person who takes on the task of writing that resource will have valuable information to share. I would say your best hope is in experimentation with this, but I don't see that anyone has figured out a solution to the important questions you've asked.
-
Thanks, Miriam. This article offers a good summary of information that Google put out there, but it doesn't discuss factors that may affect which version of a duplicate posting appears. Ideally, there'd be a way to canonicalize third-party duplicates, but I'm not sure whether that would be possible with these huge third-party job posting sites, or whether it would even affect which version of the posting appeared in Google for Jobs.
-
Hi Kevin! It's nice to speak with you, too. Another article that might help:
http://www.clearedgemarketing.com/2017/06/optimize-google-jobs/
I'd love to see someone do a deep dive on the exact questions you've raised.
-
Wow, a reply by the Miriam Ellis! I've found your past posts on local search very useful.
Seriously, though, this was a very good thread on which I could begin to pull. I took a look at the article and found this helpful line: "For jobs that appeared on multiple sites, Google will link you to the one with the most complete job posting." I'd be interested in knowing more about what constitutes "complete." I'm assuming it's the post that has the most schema items included and in particular the "critical" items according to Google's rich cards report. If this is the case, then it would seem that organic signals may not affect the visibility of the job posts as much as I originally suspected.
Then again, there's got to be some keyword relevance going on here.
Our website's job posting is being included in Google for Jobs. However, this posting only appears with a very specific search (typing in the exact job title plus "site:http://www.oursite.com".)
So, maybe it's a combination: multiple versions of the same job can be part of Google for Jobs, but Google for Jobs will show the posting that is both most keyword relevant and most complete. This is just a theory without significant research (everyone's favorite kind of theory, right?), but I'm going to send an email to the author of the TechCrunch article to see if there's any more detail he can share. Thanks again!
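The "most complete posting wins" theory can be made concrete with a toy comparison. The field tiers and weights below are assumptions for illustration, not anything Google has published; they simply mimic the idea that required fields count more than recommended ones:

```python
# Toy "completeness" score for duplicate postings: a guess at the idea
# behind "Google will link you to the most complete job posting". The
# tiers and the 2x weight are assumptions, not Google's actual rules.
REQUIRED = {"title", "description", "datePosted", "hiringOrganization", "jobLocation"}
RECOMMENDED = {"validThrough", "employmentType", "baseSalary",
               "experienceRequirements", "url"}

def completeness(posting: dict) -> float:
    """Weight required fields double and normalize to the 0..1 range."""
    score = 2 * len(REQUIRED & posting.keys()) + len(RECOMMENDED & posting.keys())
    return score / (2 * len(REQUIRED) + len(RECOMMENDED))

# Our posting has only the required fields; the third-party copy adds
# several recommended ones (values elided; only keys matter here).
ours = {"title": "...", "description": "...", "datePosted": "...",
        "hiringOrganization": "...", "jobLocation": "..."}
theirs = dict(ours, experienceRequirements="...", url="...",
              validThrough="...", employmentType="...")

assert completeness(theirs) > completeness(ours)
```

Under this heuristic, the third-party version described in the question would beat the plain-text duplicate, which is consistent with what the thread observed.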
-
Hey Kevin,
I'm afraid I'm not very familiar with Google for Jobs, but here's something that caught my eye in a TechCrunch article:
To create this comprehensive list, Google first has to remove all of the duplicate listings that employers post to all of these job sites. Then, its machine learning-trained algorithms sift through and categorize them.
This sounds like it might be applicable to what you're describing. Maybe read the rest of the article? And I'm hoping you'll get further community input from folks who have actually been experimenting with this new Google function.