I'VE FIXED IT!! The problem was caused by a negative index into the hash table. To fix it, change the line (gpi-hash.c:122):
(((((char *)(NODE))-(char *)0) >> 2) % MAX_HASH_TABLE)
to:
(((((unsigned long)(NODE))-(unsigned long)0) >> 2) % MAX_HASH_TABLE)
(patch attached)
Now for the explanation: when gcc subtracts pointers, a SIGNED integer is produced, and all further operations on that value use the signed versions (e.g. cwd/idiv to get the modulus). The reason this showed up for me and not for Peter is djgpp's sbrk algorithm mixed with Windows' memory allocation: sbrk allocates a separate DPMI memory chunk each time it is called, and under Windows (3.x or 95) the allocated memory can very easily come from BELOW the beginning of the program. This results in pointers in the range 0xfxxxxxxx. As for the reproducibility: it all depends on the fragmentation of DPMI memory under Windows (CWSDPMI doesn't seem to have this problem, and neither does dosemu, I suppose).
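If anyone wants to see the effect in isolation, here is a minimal standalone sketch (not part of the patch) that simulates the old and new hash computations for a 32-bit pointer value in the 0xfxxxxxxx range. MAX_HASH_TABLE and the example address are made-up values just for illustration:

#include <stdio.h>
#include <stdint.h>

#define MAX_HASH_TABLE 1009   /* made-up table size, just for the demo */

int main (void)
{
  /* simulate a 32-bit pointer allocated below the program's start */
  uint32_t addr = 0xf0001234;

  /* old macro: the pointer difference is SIGNED, so with gcc the shift
     is arithmetic and the modulus can come out negative */
  int32_t old_index = ((int32_t) addr >> 2) % MAX_HASH_TABLE;

  /* fixed macro: cast to an unsigned type first, so the index is always
     in 0 .. MAX_HASH_TABLE-1 */
  uint32_t new_index = ((uint32_t) addr >> 2) % MAX_HASH_TABLE;

  printf ("old (signed) index:   %ld\n", (long) old_index);
  printf ("new (unsigned) index: %lu\n", (unsigned long) new_index);
  return 0;
}

With gcc the old computation prints a negative index for such an address, which is exactly the out-of-range hash table access the patch avoids.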
Well, anyway, maybe now this bug can be laid to rest...
Bill -- Leave others their otherness.
*** gpi-hash.c~	Tue Jul 22 14:04:30 1997
--- gpi-hash.c	Wed Jul 30 16:04:00 1997
***************
*** 119,125 ****
  #define HASH_FUNC(NODE) \
!   (((((char *)(NODE))-(char *)0) >> 2) % MAX_HASH_TABLE)
  
  /* return pointer to a node uid field */
  int *
--- 119,125 ----
  #define HASH_FUNC(NODE) \
!   (((((unsigned long)(NODE))-(unsigned long)0) >> 2) % MAX_HASH_TABLE)
  
  /* return pointer to a node uid field */
  int *