Schema.org product offer with a price range, or multiple offers with single prices?
-
I'm implementing Schema.org (JSON-LD) on an eCommerce site. Each product has a few different variations, and these variations can change the price (think T-shirts: blue and white cost $5, red is $5.50, and yellow is $6).
In my markup, each Product's offer could either be a single Offer with a price range (minPrice: $5, maxPrice: $6), or a separate Offer for each variation, each with its own correct price.
Is one of these better than the other? Why? I've been looking at the WooCommerce code and it seems to use a single offer with a price range, but that could be because it's more flexible for a system used by millions of people.
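For concreteness, here is roughly what the two options look like in JSON-LD, as a minimal sketch using the hypothetical T-shirt prices above (the product name, SKU, and availability values are illustrative only). One common way to express the price-range form is an AggregateOffer with lowPrice/highPrice, which is also where the answers below end up:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example T-Shirt",
  "sku": "TSHIRT-001",
  "offers": {
    "@type": "AggregateOffer",
    "priceCurrency": "USD",
    "lowPrice": "5.00",
    "highPrice": "6.00",
    "offerCount": 4
  }
}
```

The per-variation form instead lists one Offer per variant, each carrying its own price:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example T-Shirt",
  "sku": "TSHIRT-001",
  "offers": [
    { "@type": "Offer", "name": "Blue",   "price": "5.00", "priceCurrency": "USD", "availability": "https://schema.org/InStock" },
    { "@type": "Offer", "name": "White",  "price": "5.00", "priceCurrency": "USD", "availability": "https://schema.org/InStock" },
    { "@type": "Offer", "name": "Red",    "price": "5.50", "priceCurrency": "USD", "availability": "https://schema.org/InStock" },
    { "@type": "Offer", "name": "Yellow", "price": "6.00", "priceCurrency": "USD", "availability": "https://schema.org/InStock" }
  ]
}
```

Both are valid schema.org; the AggregateOffer form is more compact, while per-variation offers let each variant carry its own price and availability.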
-
I have a question about the offerCount item within an AggregateOffer type.
I want to show the "true" price range across every product in our inventory, but we don't automatically load them all onto the page. Most implementations I have seen that trigger the price range appearing in the SERP also have the individual offers marked up further down the page, but that wouldn't work for us; we show 10 or so out of hundreds.
In my mind there are two options here. We can use the true aggregate price of the set and skip marking up the individual offers. Or we can mark up the offers that are displayed but still show what I am calling the "true" aggregate price (see the sketch below). Any opinions on whether Google needs the individual offers marked up? And on whether the marked-up individual offers need to "match" the aggregate offer prices?
THANKS
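A minimal sketch of the second option described above — marking up only the offers that are actually rendered, while the AggregateOffer figures describe the full inventory (every name, price, and count below is hypothetical):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product Range",
  "offers": {
    "@type": "AggregateOffer",
    "priceCurrency": "USD",
    "lowPrice": "4.99",
    "highPrice": "89.99",
    "offerCount": 250,
    "offers": [
      { "@type": "Offer", "price": "4.99", "priceCurrency": "USD" },
      { "@type": "Offer", "price": "12.50", "priceCurrency": "USD" }
    ]
  }
}
```

Here lowPrice, highPrice, and offerCount describe the full set, while the nested offers array lists only the handful of items shown on the page.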
-
Anytime, John, I am happy to help!
-
Thanks Thomas.
AggregateOffer is what I was looking for.
-
> Each product can have a few different variations
See Google's https://developers.google.com/search/docs/data-types/product
**Aggregate offer properties**
An AggregateOffer is a kind of Offer representing an aggregation of other offers. When marking up aggregate offers within a product, use the following properties of the schema.org AggregateOffer type:
- lowPrice — Number, required. The lowest price of all offers available. Floating point number.
- highPrice — Number, recommended. The highest price of all offers available. Floating point number.
- priceCurrency — Text, required. The currency used to describe the product price, in three-letter ISO 4217 format.
- offerCount — Number, recommended. The number of offers for the product.
https://developers.google.com/search/docs/data-types/product
**Just 1**
Product rich results provide users with information about a specific product, such as its price, availability, and reviewer ratings. The following guidelines apply to product markup:
- Use markup for a specific product, not a category or list of products. For example, “shoes in our shop” is not a specific product. See also our structured data guidelines for multiple entities on the same page.
- Adult-related products are not supported.
- Reviewer's name needs to be a valid name for a Person or Team. For example, "James Smith" or "CNET Reviewers." By contrast, "50% off on Black Friday" is invalid.
To include product information in Image Search, follow these guidelines for required markup:
- To show your product information in the rich image viewer: include the name, image, price, and priceCurrency properties. Alternatively, instead of price and priceCurrency, you can include any four properties and exclude price.
- To show your product information in the Related Items feature: include the name, image, price, priceCurrency, and availability properties.
Be careful that the text you use in the markup is the same text that is on the page.
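A minimal Product sketch that includes the properties listed above for the rich image viewer and Related Items (the name, image URL, description, and price are placeholder values):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example T-Shirt",
  "image": "https://www.example.com/images/example-t-shirt.jpg",
  "description": "Example description that matches the copy shown on the page.",
  "offers": {
    "@type": "Offer",
    "price": "5.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```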
-
https://www.distilled.net/resources/understanding-and-implementing-json-ld/
-
http://www.remicorson.com/add-woocommerce-product-to-cart-from-url-using-products-sku/
/*
 * Remove the default WooCommerce 3 JSON-LD structured data output.
 */
function remove_output_structured_data() {
    remove_action( 'wp_footer', array( WC()->structured_data, 'output_structured_data' ), 10 ); // Frontend pages
    remove_action( 'woocommerce_email_order_details', array( WC()->structured_data, 'output_email_structured_data' ), 30 ); // Emails
}
add_action( 'init', 'remove_output_structured_data' );