Frank Heckenbach wrote:
Gale Paeper wrote:
[snip]
There is also a definition in <time.h> if _ANSI_SOURCE and _POSIX_SOURCE aren't defined:
char *timezone __P((int, int));
That's it. So it's a function and treating it as an integer variable obviously produces nonsense.
There is also a note in the GETTIMEOFDAY(2) man page which states that timezone (as a variable, I think) is no longer used. In addition, I see there are gpc and gcc configure tests for the availability of timezone.
(As a variable.)
In the glibc manual I found this:
: - Variable: long int timezone
:     This contains the difference between UTC and the latest local
:     standard time, in seconds west of UTC. For example, in the U.S.
:     Eastern time zone, the value is `5*60*60'. Unlike the `tm_gmtoff'
:     member of the broken-down time structure, this value is not
:     adjusted for daylight saving, and its sign is reversed. In GNU
:     programs it is better to use `tm_gmtoff', since it contains the
:     correct offset even when it is not the latest one.
: - Data Type: struct tm
:
:     [...]
:
:     `long int tm_gmtoff'
:         This field describes the time zone that was used to compute
:         this broken-down time value, including any adjustment for
:         daylight saving; it is the number of seconds that you must
:         add to UTC to get local time. You can also think of this as
:         the number of seconds east of UTC. For example, for U.S.
:         Eastern Standard Time, the value is `-5*60*60'. The
:         `tm_gmtoff' field is derived from BSD and is a GNU library
:         extension; it is not visible in a strict ISO C environment.
I think I could check for this field and use it where available. If this is present on Mac OS X (I hope so, since the field seems to come from BSD), this might solve the problem.
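A minimal compile test for the field might look something like this (just a sketch of the idea; the actual gpc configure machinery may do it differently):

  /* Compiles only where struct tm has a tm_gmtoff member. */
  #include <time.h>

  int main (void)
  {
    struct tm t;
    t.tm_gmtoff = 0;
    return (int) t.tm_gmtoff;
  }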
I don't think any variable solution is going to work for Mac OS X's <time.h> or <sys/time.h>, since there are no variables declared in those headers with guaranteed allocated storage. (There is one iffy one, extern char *tzname[], but it depends on _ANSI_SOURCE not being defined.) Everything guaranteed to always be present in those two headers is just preprocessor defines of constants and macros, type declarations, and function declarations.
There is a `tm_gmtoff' field in <time.h>; however, it is just a struct field in a type declaration:
struct tm {
        int     tm_sec;         /* seconds after the minute [0-60] */
        int     tm_min;         /* minutes after the hour [0-59] */
        int     tm_hour;        /* hours since midnight [0-23] */
        int     tm_mday;        /* day of the month [1-31] */
        int     tm_mon;         /* months since January [0-11] */
        int     tm_year;        /* years since 1900 */
        int     tm_wday;        /* days since Sunday [0-6] */
        int     tm_yday;        /* days since January 1 [0-365] */
        int     tm_isdst;       /* Daylight Savings Time flag */
        long    tm_gmtoff;      /* offset from CUT in seconds */
        char    *tm_zone;       /* timezone abbreviation */
};
So, in brief, on Mac OS X, the only way to get a valid value for TimeStamp's timezone field from <time.h> or <sys/time.h> is through one of the appropriate function calls. Since I don't have the breadth of systems experience necessary, I don't know what to suggest for a GPC solution. If it would help, Frank, I could send you copies of the two Mac OS X header files in private e-mail.
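For what it's worth, the kind of call sequence I have in mind would be something like the following, assuming the BSD-derived tm_gmtoff field is actually filled in by localtime() (which is exactly the point in question, so take this as illustrative only):

  #include <stdio.h>
  #include <time.h>

  int main (void)
  {
    time_t now = time (NULL);
    struct tm *local = localtime (&now);  /* fills tm_gmtoff on BSD/GNU systems */

    /* Seconds east of UTC, including any daylight saving adjustment. */
    printf ("offset from UTC: %ld seconds\n", (long) local->tm_gmtoff);
    return 0;
  }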
The descriptions also made me aware of the sign issue (which I hadn't considered before). Which sign is the "correct" one? (Independently of the C semantics, since we can easily flip it if necessary.) Mail headers, e.g., use the latter convention (negative in the west, positive in the east), so I'd tend toward that one (which would change GPC's current behaviour, but I don't think it's too firmly established yet). But are there any relevant standards?
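To make the two conventions concrete: for U.S. Eastern Standard Time, the glibc `timezone' variable is `5*60*60' (seconds west of UTC) while `tm_gmtoff' is `-5*60*60' (seconds east of UTC); a mail-header zone like `-0500' follows the latter. A little helper along these lines (purely hypothetical, just to show the sign handling) would turn a tm_gmtoff-style offset into the mail-header form:

  #include <stdio.h>

  /* Print an offset given in seconds east of UTC (the tm_gmtoff
     convention) in the +-HHMM style used by mail headers. */
  static void print_mail_zone (long gmtoff)
  {
    long a = gmtoff < 0 ? -gmtoff : gmtoff;
    printf ("%c%02ld%02ld\n", gmtoff < 0 ? '-' : '+', a / 3600, (a % 3600) / 60);
  }

  int main (void)
  {
    print_mail_zone (-5 * 60 * 60);  /* U.S. Eastern Standard Time -> -0500 */
    return 0;
  }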
I really don't know. We're in areas I've never had to deal with before.
Gale Paeper
gpaeper@empirenet.com