How do I Enable Rich Snippets for an Events Page that is updated weekly?
-
Hello Moz World!
I have a client with an events page that they update every week. They run weekly demos of their software for current and potential customers, and they post the dates, times, and topics for each demo. I'd like to enable event rich snippets for their website (see attached image for an example), but I'm not sure exactly how to do that. A) Do I just need to set up Event schema tags? B) Does the markup need to be updated manually every week? C) Is there a software solution?
Thanks ahead of time for all the great responses!
Cheers
Will H.
-
I use JSON-LD and have had no issues since implementing it. It's easier, and it's the format Google recommends for structured data.
-
Hi Lauren,
JSON-LD is usually an easy way to implement schema tags because you can add the markup as a single block in the <head> of a page instead of marking it up inline, element by element. This can make it simpler to roll out schema tags at scale. Theoretically, the different ways you implement schema tags shouldn't affect the outcome, only the ease of implementing and maintaining them.
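To make the difference concrete, here's a minimal sketch of the same event name and start date marked up both ways (the demo name and date are made-up placeholders):

<!-- Inline microdata: attributes added element by element in the page body -->
<div itemscope itemtype="https://schema.org/Event">
  <h2 itemprop="name">Weekly Product Demo</h2>
  <time itemprop="startDate" datetime="2016-08-04T14:00">Aug 4, 2:00 PM</time>
</div>

<!-- JSON-LD: one self-contained block that can sit in the <head> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Weekly Product Demo",
  "startDate": "2016-08-04T14:00"
}
</script>

Both tell search engines the same thing; the JSON-LD version is just easier to generate and keep in one place.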
Hope this helps!
-
Hi Daniel,
I'm researching the benefits of implementing Schema markup via JSON-LD vs. inline HTML markup. Why do you recommend using JSON-LD? I'm trying to decide which method is best for us. Thanks!
Lauren
-
Hi there,
You're right that setting up schema tags is necessary (I'd recommend using JSON-LD). It won't guarantee a rich snippet, but it's the best you can do. You can also use this tool to validate your structured markup: https://developers.google.com/structured-data/testing-tool/
It's definitely doable to pull this data in automatically instead of adding it manually each week, but you'll have to talk with the client's engineering team about the specific implementation. Unfortunately, I don't know of an out-of-the-box software solution.
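To give you a rough idea of what that markup could look like for one of the weekly demos, here's a sketch (every name, date, and URL below is a made-up placeholder; the real values would come from the client's event data):

<!-- Example Event markup for a single weekly demo; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Weekly Software Demo: Reporting Features",
  "startDate": "2016-08-11T14:00-05:00",
  "endDate": "2016-08-11T15:00-05:00",
  "url": "https://www.example.com/events/weekly-demo",
  "location": {
    "@type": "Place",
    "name": "Example Software HQ",
    "address": "123 Example Street, Springfield"
  },
  "description": "Live walkthrough of the reporting features for current and prospective customers."
}
</script>

If the events page lists several upcoming demos, you can put multiple Event objects in an array inside a single script tag, and the testing tool above will show whether Google can read each one. That block is also the piece the engineering team could regenerate automatically each week from wherever the demo schedule is stored.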
Related Questions
-
Insane loss of traffic and indexed pages after the June Core Update, what can I do to bring it back?
Hello everybody! After the June Core Update was released, we saw an insane drop in traffic/revenue and indexed pages in GSC (image attached below). The biggest problem: the pages that dropped out of the index are shown as "Blocked by robots.txt", and when we run the "Fetch as Google" tool, it says "Crawl Anomaly". Our robots.txt is completely clean (without any disallow or noindex rules), so I strongly believe this error pattern is being caused by the June Core Update. I've come up with some solutions, but none of them seems to work:
1 - Add hreflang to the domain (sketch below): We have sites in other countries, and ours seems to be the only one without this tag. The June update was primarily made to limit SERP results to two per domain (or more if Google thinks it's relevant). Maybe the other sites have "taken our spot" in the SERPs; our domain is considerably newer than the ones in other countries.
2 - Manually index all the important pages that were lost: The idea was to refresh the content on each page (title, meta description, paragraphs and so on) and use the manual GSC index tool. But none of that seems to work either; all it says is "Crawl Anomaly".
3 - Create a new domain: If nothing else works, this should. We would look for a new domain name and treat it as a whole new site. (Frankly, there should be some other way out; this is for an EXTREME case and only if nobody can help us.)
I'm open to ideas, and as the days go by, our organic revenue and traffic don't seem to be coming back. I'm desperate for a solution. Any ideas?
Intermediate & Advanced SEO | | muriloacct0 -
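For reference on the hreflang idea in option 1, the tags are just alternate link elements in the <head> of each country version; the domains below are invented for illustration:

<!-- On one country site, referencing itself and a sister site -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="pt-br" href="https://www.example.com.br/" />

Each country version should carry the full set of tags, including a self-referencing one.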
Google Page Speed
Is it worthwhile going after a good score on Google PageSpeed? I've had prices quoted, but it's a LOT of money, and I don't know if it's worth it or not. To add to the complication, it's a new site. Does anyone have experience with whether it helps rankings? Thanks
Intermediate & Advanced SEO | | seoman100 -
Base copy on 1 page, then adding a bit more for another page - potential duplicate content. What to do?
Hi all, We're creating a section for a client that is based on road trips, for example New York to Toronto. We have a 3 day trip, a 5 day trip, a 7 day trip and a 10 day trip. The 3 day trip is the base; for the 5 day trip we add another couple of stops, for the 7 day trip we add a couple more, and for the 10 day trip there might be two or three times the number of stops of the initial 3 day trip. However, the base content is similar: you start in New York, you finish in Toronto, and you likely go through Niagara on all trips. It's not exact duplicate content, but it is similar content, and I'm not sure how to handle it. The thoughts we have are:
1) Use canonical tags pointing the 3, 5 and 7 day trips to the 10 day trip (sketch below).
2) It's not exactly duplicate content, so just go with the content as it is.
We don't want to get hit by any penalty for duplicate content, so we just want to work out what you guys think is the best way to go about this. Thanks in advance!
Intermediate & Advanced SEO | | digitalhothouse0 -
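If you go with option 1, the canonical tag is a single link element in the <head> of each shorter-trip page, pointing at the version you want to rank (the URL here is an invented placeholder):

<!-- Placed on the 3, 5 and 7 day trip pages, assuming the 10 day trip is the preferred version -->
<link rel="canonical" href="https://www.example.com/road-trips/new-york-to-toronto-10-day/" />

Keep in mind that a canonical consolidates ranking signals to the 10 day page, so the shorter-trip pages would generally drop out of the search results.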
What to do with Authoritative footer pages?
Hello everyone! The site I'm working on has a homepage that essentially used the footer as the main form of navigation, and the PA of each of those pages reflects that. I'm helping them re-organize the site (I'm still a noob though), and I was curious for some input on this particular situation. Some of the most authoritative pages are:
1. www.charged.fm/privacy - PA 29
2. www.charged.fm/terms - PA 29
My question: Is this just a consequence of previous mistakes that we live with, or is there something involving 301s and the creation of new pages that could help us utilize the link juice on these pages? Or should we come up with ways to internally link to 'money' pages from these pages instead? Thanks for any input, Luke
Intermediate & Advanced SEO | | keL.A.xT.o0 -
WordPress - Dynamic pages vs. static pages
Hi, Our site has over 48,000 indexed links, with a good mix of pages, posts and dynamic pages. For the purposes of SEO and the recent talk of "fresh content", would it be better to keep dynamic pages as they are, or manually create static pages/subpages? The one noticeable downside with dynamic pages is that they aren't picked up by any sitemap plugins; you need to manually create a separate sitemap just for these dynamic links. Any thoughts?
Intermediate & Advanced SEO | | danialniazi1 -
Pages with Little Content
I have a website that lists events in Dublin, Ireland. I want to provide a comprehensive number of listings, but there are not enough hours in the day to provide a detailed (or even short) unique description for every event. At the moment I have some pages with little detail other than the event title and venue. Should I try to prevent Google from crawling/indexing these pages for fear of reducing the overall ranking of the site? At the moment I only link to these pages via the RSS feed. I could remove the pages entirely from my feed, but then that means I remove information that might be useful to people following the events feed. Here is an example page with very little content
Intermediate & Advanced SEO | | andywozhere0 -
Category pages in forums
I would like to hear feedback on the best SEO practice for forum category pages. An example would be a forum about cars. You can have a Chevrolet category which contains forums for every Chevy model. Often this category page is simply a list of all the forums. If I "noindex, follow" the page, am I missing an opportunity? I am thinking of Google sitemaps, for example, where this page can be used for a category link. If I noindex the page, there probably isn't another great place for a sitemap to link to. I could fill out the page with wiki-like generic Chevy information. Please share any thoughts or best practices.
Intermediate & Advanced SEO | | RyanKent0 -
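For anyone weighing the "noindex, follow" option from the question above, it's a single robots meta tag in the category page's <head>; a generic sketch:

<!-- Keeps the category page out of the index while letting crawlers follow its links to the individual forums -->
<meta name="robots" content="noindex, follow">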
Duplicate Page Content
There are over 300 pages on our client's site with duplicate page content. Before we embark on a programming solution to this with canonical tags, our developers are requesting a list of the originating sites/links/sources for these odd URLs. How can we find a list of the originating URLs? If we can provide a list of originating sources, that would be helpful. For example, the following pages are showing (as a sample) as duplicate content:
www.crittenton.com/Video/View.aspx?id=87&VideoID=11
www.crittenton.com/Video/View.aspx?id=87&VideoID=12
www.crittenton.com/Video/View.aspx?id=87&VideoID=15
www.crittenton.com/Video/View.aspx?id=87&VideoID=2
"How did you get all those duplicate urls? I have tried to google the "contact us", "news", "video" pages. I didn't get all those duplicate pages. The page id=87 on the most of the duplicate pages are not supposed to be there. I was wondering how the visitors got to all those duplicate pages. Please advise."
Note: the CMS does not create this type of hybrid URL. We are as curious as you as to where/why/how these are being created. Thanks.
Intermediate & Advanced SEO | | dlemieux0