In any measurement anyone has ever conducted, there was some amount of uncertainty involved. If I use a glass measuring cup to measure a cup of water, I'll know I have about 1 cup, but the uncertainty in that measurement may be ±5 tbsp. If I use a graduated cylinder, it may be ±1 tbsp. If I use a very accurate and precise scale to weigh the water and then calculate its volume, it may be ±0.1 tbsp. But I can never be entirely certain that I am measuring exactly 1 cup of water; it's simply impossible. What's important is that the measured value and its uncertainty are acceptable for the situation.

The NHTSA requires that the uncertainty of a stationary speed-measuring device be at most +1 mph and -2 mph.

If the officer used a moving radar unit, the reading relied on both the patrol car's speedometer and the radar measurement of the target vehicle. Uncertainty is exacerbated when a result depends on more than one measuring device. Speedometers inherently drift in accuracy because they are based on the RPM of the drivetrain and the circumference of the tires, and the circumference of a tire can change quite dramatically over its service life. I would estimate the speedometer in an average car, over the life of a set of tires, to be accurate to about ±2 mph, potentially more. This code requires that buses have a speedometer accurate to within ±5 mph. A recently calibrated police unit may have significantly less uncertainty.
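As a rough illustration (assuming the two errors are independent, and using the ±2 mph speedometer estimate above together with roughly ±1 mph for the radar head; both figures are only ballpark assumptions), independent uncertainties combine in quadrature:

$$
\Delta v_{\text{combined}} = \sqrt{\Delta v_{\text{speedometer}}^2 + \Delta v_{\text{radar}}^2} = \sqrt{(2\,\text{mph})^2 + (1\,\text{mph})^2} \approx 2.2\,\text{mph}
$$

In the worst case, where both errors push in the same direction, they simply add, giving roughly 3 mph. Either way, a moving-mode reading could plausibly be a couple of mph off in either direction.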


This uncertainty creates the "buffer" fastline is talking about, because it casts doubt on the officer's contention that the defendant was traveling exactly 40 mph. How large that buffer is, I do not know; the defendant would have to research it, and the judge will likely have their own idea of it in their head, which may or may not be enough to warrant an acquittal or dismissal. I mean, surely if someone were given a citation for going 1 mph over, most judges would dismiss it, right?