When using SDL_GetPerformanceCounter() for timing, we divide by SDL_GetPerformanceFrequency() to convert the raw counter values into meaningful time units (typically seconds). This is necessary because SDL_GetPerformanceCounter() returns platform-specific "ticks" that don't directly correspond to standard time units.
SDL_GetPerformanceFrequency()
SDL_GetPerformanceFrequency() returns the number of "ticks" per second for the high-resolution counter. This value varies depending on the system's hardware and OS, but it remains constant for the duration of your program's execution.
For example, if SDL_GetPerformanceFrequency() returns 1,000,000, there are one million ticks per second, so each tick represents one microsecond.
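If you're curious what the frequency is on your own machine, you can query it directly and print it. The following is a minimal sketch; the values it prints will vary from system to system:
#include <SDL.h>
#include <iostream>

int main(int argc, char** argv) {
  SDL_Init(SDL_INIT_TIMER);

  // Ticks per second of the high-resolution counter
  Uint64 frequency{SDL_GetPerformanceFrequency()};

  std::cout << "Ticks per second: " << frequency
            << "\nNanoseconds per tick: "
            << 1000000000.0 / frequency;

  SDL_Quit();
  return 0;
}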
To convert the difference between two SDL_GetPerformanceCounter() values into seconds, we divide by the frequency:
#include <SDL.h>
#include <iostream>

int main(int argc, char** argv) {
  SDL_Init(SDL_INIT_TIMER);

  Uint64 start{SDL_GetPerformanceCounter()};

  // Simulate some work
  for (int i{0}; i < 1000; ++i) {
    std::cout << "Working...\n";
  }

  Uint64 end{SDL_GetPerformanceCounter()};
  Uint64 counterDelta{end - start};

  double secondsElapsed{
    static_cast<double>(counterDelta) /
    SDL_GetPerformanceFrequency()};

  std::cout << "Counter delta: "
            << counterDelta
            << "\nSeconds elapsed: "
            << secondsElapsed;

  SDL_Quit();
  return 0;
}
Working...
Working...
Working...
Counter delta: 448647
Seconds elapsed: 0.0448647
In this example, counterDelta represents the number of ticks that passed between start and end. By dividing this by the frequency, we convert it to seconds.
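To make the arithmetic concrete: in the run shown above, the counter advanced by 448,647 ticks and the elapsed time came out to 0.0448647 seconds, which implies the counter on that particular machine ticks 10,000,000 times per second (once every 100 nanoseconds). On your hardware, both the frequency and the raw delta will likely differ.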
By adjusting the division, we can easily obtain different time units:
#include <SDL.h>
#include <iostream>

int main(int argc, char** argv) {
  SDL_Init(SDL_INIT_TIMER);

  Uint64 start{SDL_GetPerformanceCounter()};

  // Simulate some work
  for (int i{0}; i < 1000; ++i) {
    std::cout << "Working...\n";
  }

  Uint64 end{SDL_GetPerformanceCounter()};
  Uint64 counterDelta{end - start};

  double frequency{
    static_cast<double>(
      SDL_GetPerformanceFrequency())};

  double secondsElapsed{
    counterDelta / frequency};
  double millisecondsElapsed{
    (counterDelta * 1000.0) / frequency};
  double microsecondsElapsed{
    (counterDelta * 1000000.0) / frequency};

  std::cout << "Seconds: " << secondsElapsed
            << "\nMilliseconds: " << millisecondsElapsed
            << "\nMicroseconds: " << microsecondsElapsed;

  SDL_Quit();
  return 0;
}
Working...
Working...
Working...
Seconds: 0.0405826
Milliseconds: 40.5826
Microseconds: 40582.6
You might wonder why we don't just use the raw counter values. The problem is that these values are not standardized across systems: on one computer, 1,000 ticks might represent one millisecond, while on another, the same 1,000 ticks might represent only one microsecond.
By dividing by the frequency, we normalize these differences, ensuring our timing calculations are consistent across different hardware and operating systems.
This approach allows us to write portable code that behaves consistently regardless of the underlying system's timer resolution, making it crucial for cross-platform development.
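In practice, it's convenient to wrap the conversion in a small helper function so the division by the frequency lives in one place. The sketch below does exactly that; the elapsedSeconds() helper is our own, not part of SDL:
#include <SDL.h>
#include <iostream>

// Helper (our own, not part of SDL): converts a counter
// delta into seconds using the system's tick frequency
double elapsedSeconds(Uint64 start, Uint64 end) {
  return static_cast<double>(end - start) /
    static_cast<double>(SDL_GetPerformanceFrequency());
}

int main(int argc, char** argv) {
  SDL_Init(SDL_INIT_TIMER);

  Uint64 start{SDL_GetPerformanceCounter()};
  SDL_Delay(16); // Wait roughly 16 milliseconds
  Uint64 end{SDL_GetPerformanceCounter()};

  std::cout << "Seconds elapsed: "
            << elapsedSeconds(start, end);

  SDL_Quit();
  return 0;
}
Because the helper always divides by whatever frequency the current system reports, the same code yields correct results whether the counter ticks a million times per second or ten million.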