I must have been daydreaming again when this was being covered too. Could someone please tutor me before the test?

How can the standard deviation be greater than the maximum value of the data set?

Probe Interval: 15000 (15 seconds)

Alert Interval: 15000 (15 seconds)

All others default

dpinger output:

```
loss: 0, delay: .0095890, stddev: .0079330
loss: 0, delay: .0091930, stddev: .0081340
loss: 0, delay: .0092940, stddev: .0080790
loss: 0, delay: .0097080, stddev: .0078550
loss: 0, delay: .1656320, stddev: .2779040
loss: 0, delay: .1651380, stddev: .2781920
loss: 0, delay: .1653530, stddev: .2780680
loss: 0, delay: .1652700, stddev: .2781160
loss: 0, delay: .0046320, stddev: .0013230
loss: 0, delay: .0055140, stddev: .0006520
loss: 0, delay: .0056070, stddev: .0007150
loss: 0, delay: .0052770, stddev: .0008860
```
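For what it's worth, one way numbers like the middle readings can arise: the `delay` column is an average over the sample window, not a maximum, so a handful of large spikes among mostly small samples can push the standard deviation above the mean. A minimal sketch (the sample values below are made up for illustration, not taken from dpinger):

```python
import statistics

# Hypothetical RTT samples in seconds: mostly ~5 ms, with a few
# large spikes. This shape is only a guess at what the link did.
samples = [0.005] * 56 + [1.2] * 8

mean = statistics.mean(samples)       # average delay over the window
stddev = statistics.pstdev(samples)   # population standard deviation

print(f"mean={mean:.4f} stddev={stddev:.4f}")
# The spikes make stddev larger than the mean, even though
# stddev can never exceed the largest single sample here (1.2 s).
```

So stddev greater than the *reported* delay is possible; it just means the window was very spiky relative to its average.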

Also see attached quality graph image.

2.3-BETA (i386)

built on Fri Mar 04 19:46:06 CST 2016