399e1586be
There are hardware timers whose frequency cannot be expressed as an integer ticks-per-microsecond value because only an integer ticks-per-millisecond value is precise enough. We do not want to use expensive floating-point arithmetic here, but nonetheless want to translate from ticks to time with microseconds precision. Thus, we split the input value in two halves and translate both parts separately. This way, we can raise precision by shifting each part to its optimal bit position. Afterwards, the partial results are shifted back and merged again. As this algorithm is no longer trivial and is used by at least three timer drivers (base-hw/x86_64, base-hw/cortex_a9, timer/pit), move it to a generic header to avoid redundancy.

Ref #2400
Changed file: util.h
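The split-and-translate scheme described in the commit message can be sketched as below. This is an illustrative sketch only, not the code of the actual util.h: the function name ticks_to_us, the 64-bit tick type, and the split at bit 32 are assumptions chosen for the example. The remainder of the upper half's division is carried into the lower half so that microsecond precision is preserved across the split, without any floating-point math.

	#include <cstdint>

	/*
	 * Sketch: convert a hardware-timer tick value to microseconds using only
	 * integer arithmetic. The tick value is split at bit 32, both halves are
	 * translated separately, and the partial results are shifted back and
	 * merged. Assumes ticks_per_ms is well below 2^32 (true for real timer
	 * frequencies), so the intermediate products cannot overflow 64 bits.
	 */
	static inline uint64_t ticks_to_us(uint64_t ticks, uint32_t ticks_per_ms)
	{
		uint64_t const hi = ticks >> 32;          /* upper half of the input */
		uint64_t const lo = ticks & 0xffffffffu;  /* lower half of the input */

		/* translate the upper half, keeping the division remainder */
		uint64_t const hi_us  = hi * 1000 / ticks_per_ms;
		uint64_t const hi_rem = hi * 1000 % ticks_per_ms;

		/* fold the remainder into the lower half, then translate it */
		uint64_t const lo_us = ((hi_rem << 32) + lo * 1000) / ticks_per_ms;

		/* shift the upper result back to its bit position and merge */
		return (hi_us << 32) + lo_us;
	}

As a usage example under these assumptions: a 2.4 MHz timer has 2400 ticks per millisecond but a non-integer 2.4 ticks per microsecond, which is the case motivating the commit; ticks_to_us(4800, 2400) yields 2000 microseconds.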