Windows PCs use regular interrupts to manage timers and form events; a common repeat period is 15.6 milliseconds, which corresponds to 64 interrupts per second. Microsoft chose this as a general-purpose compromise between timer resolution and UI responsiveness, which are better served by higher interrupt frequencies, and power consumption, which improves at lower interrupt frequencies. Among other things, the interrupt frequency determines the resolution of the Windows "Sleep" method, which this cache uses to effect throttling.
If a PC is using the standard 15.6ms interrupt period, sleep times will be rounded up to the nearest multiple of 15.6ms, making a 1ms sleep last 15.6ms, a 16ms sleep last 31.2ms, and so on. To work around this, if required, the cache can increase the sleep timing resolution to the maximum supported by the PC or 1ms, whichever is the larger. The sleep resolution is only changed when the first GetXXX or SetXXX call is made that specifies a non-zero value for MaximumCallFrequency; if every call specifies a MaximumCallFrequency of 0.0, the cache leaves the timing resolution unchanged.
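The round-up behaviour can be modelled as a ceiling to the next whole interrupt period. The following is an illustrative sketch only (the `effective_sleep_ms` helper is hypothetical, not part of the cache API; 15.625ms is the exact period of a 64-per-second interrupt):

```python
import math

def effective_sleep_ms(requested_ms: float, period_ms: float = 15.625) -> float:
    """Model the Windows Sleep rounding: the actual delay is the requested
    time rounded up to the next whole multiple of the interrupt period."""
    if requested_ms <= 0:
        return 0.0
    return math.ceil(requested_ms / period_ms) * period_ms

# At the default 15.625ms period, short sleeps are stretched considerably:
print(effective_sleep_ms(1))                 # 15.625 - a 1ms sleep lasts a full period
print(effective_sleep_ms(16))                # 31.25  - just over one period rounds up to two
print(effective_sleep_ms(1, period_ms=1.0))  # 1.0    - with 1ms resolution, as requested
```

This is why the cache raises the timing resolution to 1ms when accurate throttling intervals are needed.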
The Cache.Dispose() method should be called when the driver is closing down in order to restore the original timing resolution. If this is not done, Windows will still restore the original value when the overall application terminates, but it remains good practice to dispose of objects at the proper time.
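The set-once / restore-on-dispose lifecycle described above can be sketched as follows. All names here are hypothetical stand-ins for illustration; the real cache works through the Windows multimedia timer APIs (timeBeginPeriod/timeEndPeriod) rather than this toy class:

```python
class TimingResolutionSketch:
    """Illustrative model of the cache's timer-resolution handling:
    raise the resolution on first use, restore the original on dispose."""

    DEFAULT_PERIOD_MS = 15.625  # standard 64-interrupts-per-second period

    def __init__(self):
        self.period_ms = self.DEFAULT_PERIOD_MS
        self._original_ms = None  # remembered so Dispose can restore it

    def increase_resolution(self, period_ms: float = 1.0) -> None:
        """Called on the first Get/Set with a non-zero MaximumCallFrequency."""
        if self._original_ms is None:            # only change the resolution once
            self._original_ms = self.period_ms
        self.period_ms = max(period_ms, 1.0)     # never finer than 1ms

    def dispose(self) -> None:
        """Mirrors Cache.Dispose(): put the original resolution back."""
        if self._original_ms is not None:
            self.period_ms = self._original_ms
            self._original_ms = None

cache = TimingResolutionSketch()
cache.increase_resolution()   # first throttled call raises the resolution
print(cache.period_ms)        # 1.0
cache.dispose()               # driver closing down
print(cache.period_ms)        # 15.625 - original resolution restored
```

As the text notes, Windows restores the resolution itself when the process exits, so the dispose step is a correctness nicety rather than a hard requirement.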