Product schema GSC Error 'offers, review, or aggregateRating should be specified'
-
I do not have a SKU, global identifier, rating, or offer for my product. Nonetheless, it is my product. The price is variable (it's insurance), so it would be inappropriate to provide a high or low price. Therefore, these properties were not included in my Product schema. The Structured Data Testing Tool showed two warnings, for the missing SKU and global identifier.
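For illustration, markup like the one described might look like this (all names and values here are hypothetical, not the poster's actual code): a Product carrying only a name, description, and brand, with no sku, gtin, offers, review, or aggregateRating properties.

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Insurance Product",
  "description": "Cover with a price that varies per quote.",
  "brand": { "@type": "Brand", "name": "ExampleCo" }
}
```

This parses as valid JSON-LD, which is why the testing tool only raises warnings for it.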
Google Search Console gave me an error today that said: 'offers, review, or aggregateRating should be specified'
I don't want to be dishonest by supplying any of these, but I also don't want my page demoted in the search results. BUT I DO want my item to show up as a product. Should I drop the Product schema altogether? Any advice or suggestions?
Thanks in advance.
-
I'm really interested to see that others have been receiving this too; we have had this flagged on a couple of sites / accounts over the past month or two.
Basically, Google Search Console's schema error view is 'richer' than that of Google's standalone Structured Data Testing Tool, which has been left behind a bit in terms of changing standards. Quite often you can put the pages highlighted by GSC (Google Search Console) into Google's testing tool and they will show as having warnings only (no errors), yet GSC says there are errors (very confusing for a lot of people).
Let's look at an example:
- https://d.pr/i/xEqlJj.png (screenshot step 1)
- https://d.pr/i/tK9jVB.png (screenshot step 2)
- https://d.pr/i/dVriHh.png (screenshot step 3)
- https://d.pr/i/X60nRi.png (screenshot step 4)
... basically the testing tool separates issues into two categories: errors and warnings.
But Google Search Console's view of schema errors is now richer and more advanced than that (so adhere to GSC's specs, not the testing tool's, if they ever contradict each other!)
What GSC is basically saying is this:
"Offers, review, and aggregateRating are recommended only and usually cause a warning rather than an error if omitted. However, we now take a more complex view. Omitting any one of these properties is okay, but at least one of the three MUST now be present, or the warning becomes an error. SO to be clear: if one or two of these are missing, it's not a big deal; but if all three are missing, then to us at Google the product no longer constitutes a valid product."
So what are the implications of having schema which generates erroneous, invalid products in Google's eyes?
This was the key statement I found from Google:
Google have this document on the Merchant Center (all about Google Shopping paid activity): https://support.google.com/merchants/answer/6069143?hl=en-GB
They say: "Valid structured markup allows us to read your product data and enable two features: (1) Automatic item updates: Automatic item updates reduce the risk of account suspension and temporary item disapproval due to price and availability mismatches. (2) Google Sheets Merchant Center add-on: The Merchant Center add-on in Google Sheets can crawl your website and uses structured data to populate and update many attributes in your feed. Learn more about using Google sheets to submit your product data. Prevent temporary disapprovals due to mismatched price and availability information with automatic item updates. This tool allows Merchant Center to update your items based on the structured data on your website instead of using feed-based product data that may be out of date."
So basically, without 'valid' schema markup, your Google Shopping (paid) results are much more likely to be rejected, as Google's organic crawler passes data to Google Shopping through schema (and presumably it will only do this if the schema is marked as error-free). Since you haven't said anything about using Google Shopping (PLA, Product Listing Ads), this 'primary risk' is mostly mitigated.
It's likely that without valid Product schema, your products will not appear as 'product' results within Google's normal, organic results. As you know, occasionally product results make it into Google's normal results. I'm not sure whether this can be achieved without paying Google for a PLA (Product Listing Ad) for the hypothetical product in question. If webmasters can occasionally achieve proper product listings in Google's SERPs without PLAs, e.g. like this:
https://d.pr/i/XmXq6b.png (screenshot)
... then be assured that, if your products have schema errors, you're much less likely to get them listed in such a way for free. In the screenshot I just gave, they are clearly labelled as sponsored (meaning they were paid for). As such, I'm not sure how much of an issue this would be.
For product URLs which rank in Google's SERPs but do not render as products:
https://d.pr/i/aW0sfD.png (screenshot)
... I don't think such results would be impacted as heavily. You'll see that even with the plain-text / link results, you sometimes get schema embedded, like those aggregate product review ratings. Obviously, if the schema had errors, the richness of the SERP might be affected (the little stars might disappear, for example).
Personally I think this is going to be a tough one that we're all going to have to come together and solve collectively. Google are basically saying that if a product has no individual review they can read, no aggregate star rating from a collection of reviews, and no offer attached (a product must have at least one of these three things), then to Google it doesn't count as a product any more. That's how it is now; there's no arguing or getting away from it (though personally I think it's pretty steep, and they may even back-track on this one at some point, since it's relatively infeasible for most companies to adopt for all their thousands of products).
You could take the line of re-assigning all your products as services, but IMO that's a very bad idea. I think Google will cotton on to such 'clever' tricks pretty quickly and undo them all. A product is a product and a service is a service (everyone knows that).
Plus, if your items are listed as services, they're no longer products and may not be eligible for some types of SERP deployment as a result.
The real question for me is, why is Google doing this?
I think it's because marketers and SEOs have known for a long time that any type of SERP injection (universal search results, e.g. video results, news results, or product results injected into Google's 'normal' results) is more attractive to users, and because people 'just trust' Google, these results get a lot of clicks.
As such, PLA (Google Shopping) has been relatively saturated for some time now, and maybe Google feel that the quality of their product-based results has dropped in some way. It would make sense to pick two or three things that really define the contents of a trustworthy site that is being more transparent with its user base, and then to re-define 'what a product is' around those things.
In this way, Google will be able to reduce the number of PLA results, reduce the amount of 'noise' they are generating, and keep the extrusions (the nice product boxes in Google's SERPs) for the sites that they feel really deserve them. You might say: if this could result in their PLA revenue decreasing, why do it? Seems crazy.
Not really, though, as Google make all their revenue from the ads they show. If it becomes widely known that Google's product-related search results suck, people will move away from Google (in fact, Google have often cited Amazon as their leading competitor, not another search engine directly).
People don't want to search for website links any more. They want to search for 'things': bits of info that pop out (like how you can use Google as a calculator or dictionary now, if you phrase your queries correctly). They want to search for products, items, things that are useful to them.
IMO this is just another step towards that goal
Thank you for posting this question as it's helped me get some of my own thoughts down on this matter
-
I had a similar issue, as we offer SaaS solutions at various different prices.
I resolved this by changing the entity type from Product to Service. Then you no longer need the SKU or other product-related properties.
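As a rough sketch of that approach (all names and values here are hypothetical), Service markup might look like this; schema.org's Service type has no sku or gtin properties, so those warnings disappear:

```json
{
  "@context": "https://schema.org",
  "@type": "Service",
  "serviceType": "SaaS accounting platform",
  "provider": { "@type": "Organization", "name": "ExampleCo" },
  "areaServed": "GB",
  "description": "Subscription software with tiered, variable pricing."
}
```

As noted in the answer above, though, re-typing genuine products as services carries its own risks.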