
On Wed, 2015-09-09 at 12:37 -0400, Tom Rini wrote:
On Wed, Sep 09, 2015 at 11:22:25AM -0500, Scott Wood wrote:
On Tue, 2015-09-08 at 21:01 +0300, ivan.khoronzhuk wrote:
Hi, Andreas
On 07.09.15 14:43, Andreas Bießmann wrote:
From: Heiko Schocher <hs@denx.de>
Introduce a BIT() definition, used in the at91_udc gadget driver.
Signed-off-by: Heiko Schocher <hs@denx.de>
[remove all other occurrences of BIT(x) definition]
Signed-off-by: Andreas Bießmann <andreas.devel@googlemail.com>
Full buildman is running
....
+#define BIT(nr) (1UL << (nr))
Why UL? Why not simply 1 << (nr)?
That would give the wrong result for nr == 31 if the value is used as a 64-bit number: the shift is done in int, so the result is negative and sign-extends. It would also produce undefined behavior for nr >= 32 (though even with 1UL the shift is still undefined for nr >= 32 on 32-bit builds).
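A quick sketch of the failure modes (assuming 32-bit int and long, as on a typical 32-bit target; the variable names are just for illustration):

	#include <stdint.h>
	#include <stdio.h>

	int main(void)
	{
		/* int shift: in practice the result is INT_MIN, which
		 * sign-extends to 0xffffffff80000000 on conversion */
		uint64_t wrong = 1 << 31;
		/* unsigned long shift: 0x0000000080000000, as intended */
		uint64_t right = 1UL << 31;
		/* undefined: shift count >= width of a 32-bit unsigned long
		 * uint64_t worse = 1UL << 32;
		 */

		printf("%016llx %016llx\n",
		       (unsigned long long)wrong, (unsigned long long)right);
		return 0;
	}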
What if I need to set a bit in a ULL value on a 32-bit system? Thanks for the explanation.
Yes, ULL would be better.
That would be BIT_ULL(nr), no? I want to assume that some care was given upstream here. It was about two years ago that the kernel added a specific BIT_ULL() and family, in addition to BIT(nr), which dates back to 2007.
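For reference, a sketch of the upstream definitions (in the kernel they live in include/linux/bitops.h):

	#define BIT(nr)		(1UL << (nr))
	#define BIT_ULL(nr)	(1ULL << (nr))

So a driver touching, say, bit 40 of a 64-bit register on a 32-bit build needs BIT_ULL(40); BIT(40) would shift past the width of unsigned long and be undefined there.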
A quick search didn't turn up much justification for keeping them separate (and it seems like using BIT where BIT_ULL is needed could be a source of difficult bugs), but sure, we don't want to encourage writing driver code that will break on Linux.
-Scott