What’s the difference between GetTickCount and timeGetTime?

I’ve always believed that the most frequently used multimedia API in winmm.dll was the PlaySound API.  However, I was recently working with the results of some static analysis tools that were run on the Windows 7 codebase, and I realized that the most commonly used multimedia API (in terms of code breadth) was actually the timeGetTime API.  In fact, almost all of the multimedia APIs use timeGetTime, which was somewhat surprising to me at the time.

The MSDN article for timeGetTime says that timeGetTime “retrieves the system time, in milliseconds. The system time is the time elapsed since the system started.”

But that’s almost exactly what the GetTickCount API returns: “the number of milliseconds that have elapsed since the system was started, up to 49.7 days.”  (Obviously timeGetTime has the same 49.7-day limit, since both APIs return 32-bit counts of milliseconds.)
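As a quick sanity check on that 49.7-day figure (this snippet isn’t from the original post, it just spells out the arithmetic), a 32-bit counter holds 2^32 millisecond values before wrapping:

 #include <stdio.h>

 int main(void)
 {
     // A 32-bit millisecond counter wraps after 2^32 ms.
     // 4294967296 / (1000 * 60 * 60 * 24) is about 49.71 days.
     double days = 4294967296.0 / (1000.0 * 60.0 * 60.0 * 24.0);
     printf("32-bit millisecond counter wraps after %.2f days\n", days);
     return 0;
 }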

So why do all these multimedia APIs use timeGetTime and not GetTickCount, if the two APIs apparently return the same value?  I wasn’t sure, so I dug in a bit deeper.

The answer is that they don’t.  You can see this with a tiny program:

 #include <windows.h>
 #include <mmsystem.h>   // timeGetTime
 #include <stdio.h>
 #include <tchar.h>
 #pragma comment(lib, "winmm.lib")   // timeGetTime lives in winmm.dll

 int _tmain(int argc, _TCHAR* argv[])
 {
     int i = 100;
     DWORD lastTick = 0;
     DWORD lastTime = 0;
     while (--i)
     {
         DWORD tick = GetTickCount();
         DWORD time = timeGetTime();
         // DWORD is unsigned, so use %lu; unsigned subtraction also survives wraparound.
         printf("Tick: %lu, Time: %lu, dTick: %3lu, dTime: %3lu\n",
                tick, time, tick - lastTick, time - lastTime);
         lastTick = tick;
         lastTime = time;
         Sleep(53);   // deliberately not a multiple of the typical ~15.6ms system tick
     }
     return 0;
 }

If you run this program, you’ll notice that the deltas between successive timeGetTime results are MUCH more stable than the deltas between successive GetTickCount results (note that the program sleeps for 53ms, which usually doesn’t line up with the native system timer resolution):

Tick: 175650292, Time: 175650296, dTick:  46, dTime:  54
Tick: 175650355, Time: 175650351, dTick:  63, dTime:  55
Tick: 175650417, Time: 175650407, dTick:  62, dTime:  56
Tick: 175650464, Time: 175650462, dTick:  47, dTime:  55
Tick: 175650526, Time: 175650517, dTick:  62, dTime:  55
Tick: 175650573, Time: 175650573, dTick:  47, dTime:  56
Tick: 175650636, Time: 175650628, dTick:  63, dTime:  55
Tick: 175650682, Time: 175650683, dTick:  46, dTime:  55
Tick: 175650745, Time: 175650739, dTick:  63, dTime:  56
Tick: 175650792, Time: 175650794, dTick:  47, dTime:  55
Tick: 175650854, Time: 175650850, dTick:  62, dTime:  56

That’s because GetTickCount only advances by the clock tick interval on each clock tick, so its deltas waver around the actual elapsed time (note that the deltas average out to about 55ms, so on average GetTickCount returns an accurate result, just not for individual spot measurements), while timeGetTime’s delta is highly predictable.
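As an aside, the precision behind timeGetTime isn’t fixed: the documented timeGetDevCaps/timeBeginPeriod/timeEndPeriod calls let an application request a finer timer period (typically down to 1ms).  Here’s a minimal sketch of that pattern; it’s illustrative rather than anything from the original program:

 #include <windows.h>
 #include <mmsystem.h>
 #include <stdio.h>
 #pragma comment(lib, "winmm.lib")

 int main(void)
 {
     TIMECAPS tc;
     if (timeGetDevCaps(&tc, sizeof(tc)) == TIMERR_NOERROR)
     {
         // Request the finest period the timer device supports (usually 1ms).
         UINT period = (tc.wPeriodMin > 1) ? tc.wPeriodMin : 1;
         timeBeginPeriod(period);

         DWORD start = timeGetTime();
         Sleep(53);
         printf("Slept for %lu ms at a requested %u ms timer period\n",
                timeGetTime() - start, period);

         timeEndPeriod(period);   // always pair timeEndPeriod with timeBeginPeriod
     }
     return 0;
 }

Raising the timer resolution has a power cost, so well-behaved applications keep the timeBeginPeriod/timeEndPeriod window as short as they can.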

It turns out that for isochronous applications (those that depend on clear timing) it is often important to be able to retrieve the current time in a fashion that doesn’t vary, that’s why those applications use timeGetTime to achieve their desired results.
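To make that concrete, here’s a minimal pacing-loop sketch of the kind of thing an isochronous workload does; DoOneUnitOfWork is a hypothetical placeholder for whatever the application renders or mixes each period, not anything from winmm:

 #include <windows.h>
 #include <mmsystem.h>
 #include <stdio.h>
 #pragma comment(lib, "winmm.lib")

 // Stand-in for the real per-period work (mixing a buffer, rendering a frame, etc.).
 static void DoOneUnitOfWork(void)
 {
     printf("work at %lu ms\n", timeGetTime());
 }

 int main(void)
 {
     const DWORD periodMs = 10;           // desired pacing interval
     DWORD next = timeGetTime();
     for (int i = 0; i < 50; i++)
     {
         DoOneUnitOfWork();
         next += periodMs;                // schedule against the timeline, not "now"
         DWORD now = timeGetTime();
         if ((LONG)(next - now) > 0)      // signed compare tolerates the 49.7-day wrap
             Sleep(next - now);
     }
     return 0;
 }

Because each deadline is computed as an offset from the previous one rather than from “now,” small scheduling hiccups don’t accumulate, which only works well if the clock being read advances smoothly.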