Frank Heckenbach wrote:
Perhaps I'm too much a mathematician to ever understand such motivations.
<snip>
My suggestion would be:
Numbers are interpreted in the mathematical sense (i.e., positive, as long as they don't contain a `-' sign, etc.).
`not', `and', `or' and `xor' are conceptually defined as functions on the integer numbers (defined such that they correspond to the bit operations given a sufficiently large representation), see appendix.
<snip>
Appendix: Mathematical definition of "bit" functions
And I am too much a programmer to understand mathematicians ...
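So, to make sure I read the appendix right, let me restate the definitions the way a programmer would, as code. This is a sketch of my own in Python (whose unbounded integers behave like mathematical integers); the function names and base cases are my reconstruction, not Frank's actual text:

# Bit i of a mathematical integer x is (x div 2^i) mod 2, with floor
# division, so a negative number carries infinitely many leading 1-bits.

def bnot(x):
    # Complementing every bit of that infinite expansion:
    return -x - 1

def band(x, y):
    # 0 (all zeros) and -1 (all ones) are the two fixpoints of
    # repeated floor division by 2, hence the base cases.
    if x == 0 or y == 0:
        return 0
    if x == -1:
        return y
    if y == -1:
        return x
    return 2 * band(x // 2, y // 2) + (x % 2) * (y % 2)

def bor(x, y):
    # De Morgan: x or y = not (not x and not y)
    return bnot(band(bnot(x), bnot(y)))

def bxor(x, y):
    # x xor y = (x or y) and not (x and y)
    return band(bor(x, y), bnot(band(x, y)))

# The sketch agrees with Python's built-in operators on unbounded integers:
for a in range(-9, 10):
    for b in range(-9, 10):
        assert bnot(a) == ~a
        assert band(a, b) == a & b
        assert bor(a, b) == a | b
        assert bxor(a, b) == a ^ b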
Looking at the mathematical definitions some more, aren't we transforming bitwise operations and a specific bitwise notational convention (two's complement) back into mathematics? If so, isn't it true that:
- the mathematical definitions aren't universal mathematical definitions, independent of bit conventions
- the bit-level definitions are simpler

In other words, aren't the mathematical definitions artificial, too complex, and valid only for a specific notational bit convention (two's complement)?
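To make that concrete: the complement rule not(x) = -x - 1 from my sketch above is exactly the familiar two's-complement identity, and the underlying "bit" function reproduces a two's-complement register bit for bit (again Python, again my own illustration):

def bit(x, i):
    # Floor division makes negative numbers sign-extend with 1-bits.
    return (x // 2 ** i) % 2

# The low 8 "mathematical" bits of -6 ...
print(''.join(str(bit(-6, i)) for i in reversed(range(8))))  # 11111010

# ... match the two's-complement byte a machine register would hold:
print(format(-6 & 0xFF, '08b'))                              # 11111010

So the mathematical definition looks like two's complement in disguise, merely extended to infinitely many bits.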
Regards,
Adriaan van Os