
Dear Reinhard Meyer,
In message 4CC62B6C.30601@emk-elektronik.de you wrote:
> In such cases I prefer to use:
>
> 	uint64_t etime;
> 	...
> 	etime = get_ticks() + get_tbclk();	/* 1 second */
> 	do {
> 		whatever;
> 		udelay (xx);
> 	} while (condition && get_ticks() <= etime);
>
> That is far more accurate than calling udelay() 100000 times.
It may be more accurate, but it may also be HORRIBLY WRONG!!
Do NOT do that!! NEVER implement such a delay loop as
	end = time() + delay;
	while (time() < end)
		...
It fails when the timer wraps around.  Assume 32-bit counters, a start
time of 0xFFFFFFF0, and a delay of 0x20.  It will compute end = 0x10,
the while condition is immediately false, and you don't get any delay
at all, which most probably generates a false error condition.
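Just to make the failure mode concrete, here is a small stand-alone
sketch using the hypothetical example values from above (the names
start, delay and end are only for illustration):

	#include <stdint.h>
	#include <stdio.h>

	int main(void)
	{
		uint32_t start = 0xFFFFFFF0; /* 32 bit counter near wrap */
		uint32_t delay = 0x20;       /* desired delay in ticks   */
		uint32_t end   = start + delay; /* wraps to 0x00000010   */

		/* The broken test: "current time < end" is false right
		 * away, so a "while (time() < end)" loop would not
		 * delay at all.
		 */
		printf("start < end             : %d\n", start < end); /* 0 */

		/* The wrap-safe test: elapsed ticks computed by unsigned
		 * subtraction; nothing has elapsed yet, so it keeps
		 * waiting.
		 */
		printf("(start - start) < delay : %d\n",
		       (start - start) < delay);                       /* 1 */

		return 0;
	}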
A correct implementation of such a timeout should always look like this:
	start = time();
	while ((time() - start) < delay)
		...
This works much better (assuming unsigned arithmetic).
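Applied to the loop quoted above, the same pattern might look like this
(a sketch only; whatever, condition and xx are the placeholders from
that code):

	uint64_t start = get_ticks();
	uint64_t delay = get_tbclk();	/* 1 second worth of ticks */

	do {
		whatever;
		udelay (xx);
	} while (condition && (get_ticks() - start) < delay);

The unsigned subtraction gives the elapsed ticks even if the counter
wraps between the two reads, so the loop always waits the intended time.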
Best regards,
Wolfgang Denk