How can I compute the range of signed and unsigned types

Walter B. Ligon III walt at
Wed Apr 18 13:03:26 EDT 2001

> Errrrr... :-) have fun
> #include <stdio.h>
> #define BYTESIZE(me)    (sizeof(me))
> #define BITSIZE(me)     (2 << sizeof(me))

Please excuse my ignorance, but what exactly is BITSIZE doing here?
I don't understand it, and the results I get don't seem to make sense.
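For concreteness, here is a tiny test I ran to see what the quoted macro
actually produces (my own sketch, not from the original message; it assumes
the common type sizes of 1/2/4/8 bytes):

#include <stdio.h>

/* The macro as quoted above. */
#define BITSIZE(me)     (2 << sizeof(me))

int main(void)
{
    /* With char=1, short=2, int=4, long long=8 bytes, this prints
       4, 8, 32, 512 -- only the 4-byte case happens to match the
       real bit count of 8 * sizeof(me). */
    printf("char: %d, short: %d, int: %d, long long: %d\n",
           BITSIZE(char), BITSIZE(short),
           BITSIZE(int), BITSIZE(long long));
    return 0;
}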
I would have written it as:

#define BITSIZE(me) (8 * sizeof(me))

but of course that assumes 8 bits per byte, which is normal but not
strictly guaranteed (see the old CDC machines).
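And since the subject line asks how to compute the ranges: here is a
minimal sketch of how I'd do it, assuming two's-complement integers with
no padding bits, and using CHAR_BIT from <limits.h> rather than a
hard-coded 8:

#include <stdio.h>
#include <limits.h>                     /* CHAR_BIT = bits per byte */

#define BITSIZE(me)     (CHAR_BIT * sizeof(me))

int main(void)
{
    unsigned int umax = ~0u;            /* all bits set: 2^N - 1      */
    int imax = (int)(umax >> 1);        /* 2^(N-1) - 1                */
    int imin = -imax - 1;               /* -2^(N-1), two's complement */

    printf("unsigned int: 0 .. %u   (%lu bits)\n",
           umax, (unsigned long)BITSIZE(unsigned int));
    printf("signed int:   %d .. %d  (%lu bits)\n",
           imin, imax, (unsigned long)BITSIZE(int));
    return 0;
}

(Of course, <limits.h> already gives you INT_MIN, INT_MAX, UINT_MAX,
etc. directly, so in practice you rarely need to compute these.)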

Dr. Walter B. Ligon III
Associate Professor
ECE Department
Clemson University
