Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
What is the best way to deal with an event calendar
-
I have an event calendar that has multiple items repeating into the future. They are classes that typically all have the same titles but occasionally have different information. I don't know the best way to deal with them and am open to suggestions.
Currently, Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions, and overly dynamic URLs). I'm assuming it's flagging duplicate elements far into the future.
I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable.
Thanks,
-
Sorry for all the posts, but maybe this will also help you get rid of the dynamic URLs:
http://www.webconfs.com/url-rewriting-tool.php
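For illustration, here is a minimal .htaccess rewrite of the kind that tool generates - the calendar.php script and its year/month parameters are hypothetical placeholders, not from the original site:

RewriteEngine On
# serve the clean URL /calendar/2013/06/ from the underlying dynamic script
RewriteRule ^calendar/([0-9]{4})/([0-9]{2})/?$ calendar.php?year=$1&month=$2 [L,QSA]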
Thomas
-
This is a good example of the type of difference that changing the robots.txt file can make.
I would read all the information you can on it, as the situation seems to be constantly developing.
I used the info below as an example of a happy ending, but to see the problems people ran into, read the other stories in this thread:
http://wordpress.org/support/topic/max-cpu-usage/page/2
CPU usage dropped from over 90% to less than 15%. Memory usage dropped by almost half, from 1.95 GB to 1.1 GB including cache/buffers.
My setup is as follows:
Linode 2GB VPS
Nginx 1.4.1
Percona SQL Server using XtraDB
PHP-FPM 5.4 with APC caching DB requests and opcode via W3 Total Cache
WordPress 3.5.2
All in One Event Calendar 1.11
All the best,
Thomas
-
I got the robots.txt file; I hope this will help you.
This is built into every GetFlywheel.com website (they are a managed WordPress-only hosting company). The reason they did this is the same one Dan describes above.
I'm not saying this is a perfect fix, but after speaking with the founder of GetFlywheel, I know they place this in the robots.txt file of every website they host to try to get rid of the crawling issue.
This is an exact copy of the default robots.txt file from GetFlywheel.com:
Default Flywheel robots file:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /calendar/action:posterboard/
Disallow: /calendar/action:agenda/
Disallow: /calendar/action:oneday/
Disallow: /calendar/action:month/
Disallow: /calendar/action:week/
Disallow: /calendar/action:map/

As found on a brand-new website. If you Google "Max CPU All in One Calendar" you will see more about this issue.
I hope this is of help to you,
Thomas
PS: The maker of the All in One Event Calendar has also listed a fix on their site.
-
Hi Landon
I had a client in a similar situation. Here's what I feel is the best goal:
Calendar pages (weeks/months/days etc) - don't crawl, don't index
Specific event pages - crawl and index
Most likely the calendar URLs have not been indexed, but you can check with some site: searches. Assuming they have not been indexed, the best solution is to block crawling of those URLs with robots.txt - calendars can go on into infinity, and you don't want to send crawlers off into a black hole, as that's bad for crawl budget and for directing them to your actual content.
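A minimal robots.txt sketch of that approach, assuming (hypothetically) that calendar views live under /calendar/ and individual events under /events/ - adjust the paths to your actual URL structure:

User-agent: *
# block the infinite day/week/month calendar views
Disallow: /calendar/
# nothing here blocks /events/, so specific event pages stay crawlable and indexable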
-
Is this the All-in-One Event Calendar for WordPress?
If so, I can give you the information, or you can just Google "CPU Max WordPress".
Essentially, you have to change the robots.txt file so the crawlers don't have the huge issues with it that they do now.
GetFlywheel has that built into their robots.txt file; if that is your issue, I can go in and grab it for you.
Sincerely,
Thomas
-
Besides this, take a look at the schema.org markup for events; it might help you mark up the page better so Google understands what the page/event is about: http://schema.org/Event
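For example, a rough sketch of Event microdata on a class page - the class name, date, and venue below are invented placeholders, not from your site:

<div itemscope itemtype="http://schema.org/Event">
  <h2 itemprop="name">Beginner Pottery Class</h2>
  <meta itemprop="startDate" content="2013-07-15T18:00">Monday, July 15th, 6:00 pm
  <div itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Main Street Studio</span>
  </div>
  <p itemprop="description">An introductory class covering basic wheel-throwing techniques.</p>
</div>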
-
Do the same classes in the future link to the same page? Are you using canonical tags correctly? Sharing your URL would help diagnose the problem and guide you better.
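If each repeated class gets its own URL, the usual pattern is to point the repeats at one main class page with rel=canonical - a sketch with made-up URLs:

<!-- in the <head> of a repeat instance such as /classes/beginner-pottery/2013-07-22/ -->
<link rel="canonical" href="http://www.example.com/classes/beginner-pottery/">

Note that canonicalizing repeats only makes sense when they really are duplicates; instances with genuinely different information should keep their own canonical URLs.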