Monitoring Survey 2015 - Background

As many of you are aware, I recently ran a small monitoring survey. I ran a similar survey last year and decided to see if the results had changed. Assuming interest continues, I'll run it again next year too.

Again, the intent of the survey was to understand the state of maturity across some key areas of monitoring. I was specifically interested in what sort of monitoring people were doing, some idea of why they were doing that monitoring, and what tools they were using to do that monitoring. I am also writing a book about monitoring and wanted to get some insights that could help shape the book.

The survey greatly benefited from community feedback and was tweaked in response to that and the data I received last year.

This year the survey was 15 questions across 5 pages. The questions (which included some skip logic) are reproduced here:

  1. Which of the following best describes your IT job role?
  2. How big is your organization?
  3. Are you responsible for IT monitoring in your organization?
  4. If you are not responsible for monitoring, who is?
  5. What tools do you use for monitoring? (Choose all that apply)
  6. What parts of your environment do you monitor? (Select all that apply)
  7. Do you collect metrics on your infrastructure and applications?
  8. What tools do you use to collect metrics? (Choose all that apply)
  9. What tools do you use to store your metrics?
  10. What tools do you use to visualize your metrics?
  11. If you collect metrics, what do you use the metrics you track for? (Select all that apply)
  12. When do you most commonly add monitoring checks or graphs to your environment?
  13. Do you ever have unanswered alerts in your monitoring environment?
  14. How often does something go wrong that IS NOT detected by your monitoring?
  15. Do you use a configuration management tool like Chef, Puppet, Salt or Ansible to manage your monitoring infrastructure?

The survey was launched 6/15/2015 and ran until 7/20/2015. It was advertised on this blog and Twitter, and via a number of monitoring, DevOps, SysAdmin, and tooling events, publications, and mailing lists. As a result there's likely some bias in the responses towards more open source, DevOps, Operations, and startup-centric communities.

In total there were 1,116 responses (slightly more than last year's 1,016), of which 884 were complete (866 last year). In my analysis I've considered complete responses and, where appropriate, some partial responses.

I'll again be analyzing each section of the survey in a series of posts, starting with the demographics of the respondents. Once I've posted my analysis, I'll make the source data available to anyone who wants to use it.

The posts: