As many of you are aware I recently ran a small Monitoring survey. I ran a similar survey last year and decided to see if the results had changed. Assuming interest continues I’ll run it again next year too.
Again, the intent of the survey was to understand the state of maturity across some key areas of monitoring. I was specifically interested in what sort of monitoring people were doing, some idea of why they were doing that monitoring, and what tools they were using to do that monitoring. I am also writing a book about monitoring and wanted to get some insights that could help shape the book.
The survey greatly benefited from community feedback and was tweaked in response to that feedback and to the data I received last year.
This year the survey was 15 questions across 5 pages. The questions (which included some skip logic) are reproduced here:
- Which of the following best describes your IT job role?
- How big is your organization?
- Are you responsible for IT monitoring in your organization?
- If you are not responsible for monitoring, who is?
- What tools do you use for monitoring? (Choose all that apply)
- What parts of your environment do you monitor? (Select all that apply)
- Do you collect metrics on your infrastructure and applications?
- What tools do you use to collect metrics? (Choose all that apply)
- What tools do you use to store your metrics?
- What tools do you use to visualize your metrics?
- If you collect metrics, what do you use the metrics you track for? (Select all that apply)
- When do you most commonly add monitoring checks or graphs to your environment?
- Do you ever have unanswered alerts in your monitoring environment?
- How often does something go wrong that IS NOT detected by your monitoring?
- Do you use a configuration management tool like Chef, Puppet, Salt or Ansible to manage your monitoring infrastructure?
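For readers unfamiliar with that last question, "managing monitoring with configuration management" usually means declaring your checks and agent configuration as code rather than editing them by hand. A minimal, hypothetical Ansible sketch (the host group, check name, and paths are invented for illustration, not taken from the survey) might look like:

```yaml
# Hypothetical playbook: deploy a Sensu check definition with Ansible.
# Host group, file paths, and thresholds are illustrative only.
- hosts: monitored_servers
  become: true
  tasks:
    - name: Install a disk usage check definition
      ansible.builtin.copy:
        dest: /etc/sensu/conf.d/check_disk.json
        content: |
          {
            "checks": {
              "check_disk": {
                "command": "check-disk-usage.rb -w 80 -c 90",
                "subscribers": ["base"],
                "interval": 60
              }
            }
          }
      notify: restart sensu-client

  handlers:
    - name: restart sensu-client
      ansible.builtin.service:
        name: sensu-client
        state: restarted
```

The same pattern applies with Chef, Puppet, or Salt: the monitoring configuration lives in version control and converges automatically, which is why the survey asks about it.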
The survey was launched 6/15/2015 and ran until 7/20/2015. It was advertised on this blog, Twitter, and a number of monitoring, DevOps, SysAdmin and tools events, publications and mailing lists. As a result there’s likely some bias in the responses towards more open source, DevOps, Operations and startup-centric communities.
In total there were 1,116 responses (slightly more than last year’s 1,016), of which 884 were complete (866 last year). In my analysis I’ve considered complete and, where appropriate, some partial responses.
I’ll again be analyzing each section of the survey in a series of posts, starting with the demographics of the respondents. Once I’ve posted my analysis I’ll make the source data available to anyone who wants to use it.