Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Recommended Website Monitoring Tools
-
Hi,
I was wondering what people would recommend for website monitoring (i.e. is my website working as it should?).
I need something that will:
1/. Allow monitoring of multiple pages, not just the homepage
2/. Do header status checking
3/. Do page content checking (i.e. if the page changes massively, or includes the word "error", we have an issue!)
4/. Multiple alert options.
We currently use www.websitepulse.com and it is a good service that does all of the above. However, it is so overly complex that it's hard to understand what is going on, and its complex functionality and feature set are really a negative in our case.
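For anyone rolling their own, checks 1-3 above can be sketched in a few lines of Python. This is a minimal illustration rather than a substitute for a hosted service; the error markers and the minimum body length are hypothetical thresholds you would tune to your own pages.

```python
import urllib.error
import urllib.request

# Hypothetical markers: strings whose presence suggests a broken page.
ERROR_MARKERS = ("error", "exception", "stack trace")

def evaluate_page(status, body, min_length=500):
    """Return a list of problems found in a fetched page (empty = healthy)."""
    problems = []
    if status != 200:
        problems.append(f"unexpected HTTP status {status}")
    if len(body) < min_length:
        problems.append(f"body shrank below {min_length} chars")
    lowered = body.lower()
    for marker in ERROR_MARKERS:
        if marker in lowered:
            problems.append(f'body contains "{marker}"')
    return problems

def check_url(url, timeout=10):
    """Fetch a page and evaluate it; network failures count as problems too."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return evaluate_page(resp.status, resp.read().decode("utf-8", "replace"))
    except urllib.error.HTTPError as e:
        return evaluate_page(e.code, "")
    except Exception as e:
        return [f"request failed: {e}"]
```

Calling `check_url` for each important page gives you multi-page coverage (requirement 1), the status test covers requirement 2, and the body-length and marker tests cover requirement 3.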
Thanks
-
We use Pingdom to monitor a lot of client websites. It is great because we receive SMS messages when something is wrong. The detailed reporting, iPhone app and ability to monitor HTTP statuses are exceptional!
-
I have not, but given that there is a free tier, the paid service is likely worth a try since it is more robust. Most of our sites do not have that level of complexity, so it is less of a need for us. Hopefully some of the Mozzers who do more eCommerce work will see this and respond. Also, if you have a private question available, you might use it to go straight to Moz and see what they suggest.
-
PS - I had a look at Mon.itor.us - have you tried their paid service: http://portal.monitis.com/ ?
-
Hi Rob,
Essentially we have a pretty complex website with many different sections. It is constantly being developed, so there will probably be code releases 4-5 times per week. Any one of these changes may end up causing an issue with one of the pages (i.e. pages of a specific type). In addition, we can get issues with the database or server memory which can occasionally cause the website to fail.
All of these issues are pretty disastrous for business, so what I need (or, to be more exact, what our developers need) is to know as soon as an issue occurs - most of the services mentioned allow you to set a checking period of, say, every 5 minutes - so it can be fixed, as opposed to waiting for a customer to tell us there is a website issue, or manually checking every page type with every code release.
As I say, we do have WebsitePulse at the moment, which is great, but it is also far too complex to easily set up and manage, so I am just doing research around this area and seeing if anyone has advice.
Thanks
-
Mon.itor.us works well and is free.
-
It seems you are looking for something that constantly monitors the site and simply alerts you to problems. From my point of view as an agency with more than a few sites up, it might be overkill, and I am not sure what it would be. What we do to cover what you are listing is this: we have a Moz Pro Plus membership and do campaign tracking with it. We can see on a weekly basis via email (and daily if we just log in): 4xx and 5xx errors, duplicate page titles, missing page titles, blocked bots, etc., as well as on-page SEO issues, robots.txt, rel=canonical, and so on.
For content checking of page changes I am at a loss; error reports are covered as above and server downtime as below (mon.itor.us), with good results. The beauty of the SEOmoz campaign for me is that it also tracks rankings, connects to Google Analytics, and provides competitive link analysis (DA, PA, etc.).
For the Headers you can use Screaming Frog (I just love that name and it works).
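If you want to script a quick header audit between full Screaming Frog crawls, here is a minimal Python sketch (the URL list you pass in would be your own; `classify_status` simply buckets response codes the way crawl reports do):

```python
import urllib.error
import urllib.request

def classify_status(code):
    """Bucket an HTTP status code the way a crawl report groups them."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if 400 <= code < 500:
        return "client error (4xx)"
    if 500 <= code < 600:
        return "server error (5xx)"
    return "other"

def audit_headers(urls, timeout=10):
    """HEAD each URL and report its status class and Content-Type header."""
    report = {}
    for url in urls:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                report[url] = (classify_status(resp.status),
                               resp.headers.get("Content-Type"))
        except urllib.error.HTTPError as e:
            report[url] = (classify_status(e.code), None)
    return report
```

HEAD requests keep the audit fast since no body is downloaded, though a few servers mishandle HEAD and would need a GET fallback.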
Hope that helps.
-
Doing some digging I found a useful list:
http://mashable.com/2010/04/09/free-uptime-monitoring/
Anyone have any feedback/reviews on these specific tools?