Rationale of the KEP2 design: central in-format TZ-data vs. individual local TZ-databases and more (Part 2)
Florian v. Samson
florian.samson at bsi.bund.de
Thu Mar 31 19:09:38 CEST 2011
On Tuesday, 29 March 2011 at 17:41:54, Bernhard Reiter wrote:
> On Tuesday, 29 March 2011 at 14:28:05, Florian v. Samson wrote:
> > > So how do you know which is the UTC for 13:00 in Europe/Berlin on
> > > 30.3.2015?
> > By taking a closer look at the TZ-data, which accompanies the
> > TZ-ID "Europe/Berlin": add the offset, which is valid for that date
> > (likely to be "-02:00" for the 30 March 2015, see below) to the local
> > time. This results in 11:00 UTC, right?
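The calculation above can be sketched in Python; note this draws on the system's local Olson/IANA database via `zoneinfo` (i.e. a local TZ-database, not in-format TZ-data), which for this date yields the same offset as assumed above:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python >= 3.9; backed by the Olson/IANA TZ database

# The local wall-clock time from the example above
local = datetime(2015, 3, 30, 13, 0, tzinfo=ZoneInfo("Europe/Berlin"))

# DST is in effect on 30 March 2015, so the UTC offset is +02:00;
# subtracting it from the local time yields 11:00 UTC.
assert local.utcoffset() == timedelta(hours=2)
utc = local.astimezone(ZoneInfo("UTC"))
print(utc)  # 2015-03-30 11:00:00+00:00
```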
> > > My understanding is that you would use the same rules you have today
> > > as those are the best information you have, and thus calculate UTC
> > > based on those rules. But this is merely an assumption. You do not
> > > know whether those rules are going to change between now and 2015.
> > This example is very fuzzy, please be concise.
> > I see at least two different cases addressed here:
> > 1. The DST switching dates for this specific TZ are not defined yet.
> > 2. The DST rules for this specific TZ are redefined (i.e. they are
> > altered), but they have been defined (although differently) before.
> > Which one did you intend to address?
> Both, in both cases we cannot be sure that the tz-data stays the same.
Well, "not sure" is a very weak statement, which can mean anything
from "extremely unlikely" to "almost guaranteed".
In case 1 a change is quite unlikely IMO (Georg has a different view on
this and came up with some interesting examples I was not aware of;
though fewer than 5 countries in the next 5 years still qualifies as "quite
unlikely" for me).
For case 2, at least one Kolab-client actually knows for *sure* that the
TZ-data has changed.
> > So the "DST assumption" (whose meaning I never really comprehended)
> > simply means "the TZ-data used", right?
> Yes, as far as I understand it.
> My email in the last days explained the same thing in different words.
Ugh, I failed to read it that way. Sorry for writing the same content
again; I was not aware of that.
> > And thanks for pointing out this big advantage of in-format TZ-data:
> > For case 1 above (TZ-data not yet preassigned by legal authorities) a
> > single source of TZ-data in the Kolab-object would prevent that every
> > client may do its own (potentially deviating) extrapolation of the
> > (definitely deviating) local sources of TZ-data (e.g. a local
> > installation of the Olson database).
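To illustrate that advantage with a hedged sketch (the concrete in-format encoding of TZ-data is an assumption here, not taken from the actual Kolab format): every client evaluates the same embedded offset, so the computed UTC cannot diverge between clients, whatever the state of their local Olson databases.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical in-format TZ-data: the offset stored in the Kolab-object
# alongside the TZ-ID "Europe/Berlin" (the real encoding may differ).
embedded_offset = timedelta(hours=2)

local_naive = datetime(2015, 3, 30, 13, 0)   # intended local time
utc = (local_naive - embedded_offset).replace(tzinfo=timezone.utc)
print(utc)  # 2015-03-30 11:00:00+00:00

# Every client applying the same embedded offset arrives at the same UTC,
# independent of its local installation of the Olson database.
```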
> It has a good and a bad side effect. For me it comes out less favorable.
Well, this is a matter to discuss in order to become aware of all aspects;
even then, people may still arrive at different conclusions.
> It is true, tz-data within the object will make all clients display the
> same, but in the case that the tz-data changes, this will not be what the
> user wants.
So you think that when the TZ-data changes, it is less critical to have
different clients accessing one Kolab-object display different times
(some of them wrong as well) than to have all clients show the same time,
even though that time might be wrong?
I am not inclined to follow that assessment, as this is also "not what the
user wants".
I believe there is no right or wrong in many of the discussed topics and
issues on this mailing list, hence I believe it is crucial to thoroughly
evaluate these issues.
> So the tz-data would need updating, which means rewriting the
> object with the drawbacks coming with it.
Georg once mentioned that you had listed these drawbacks in detail,
concluding that my proposition was rebutted, but I was unable to find that.
Do you have a pointer / web-link at hand?
> > > which ultimately boils down to a pre-processing step whether the
> > > stored UTC was correctly calculated and can be relied upon, or needs
> > > to be adjusted.
> > No, not at all! By definition of this process, the stored UTC *is*
> > correctly calculated against the TZ-data used and *must* be relied upon
> > (what else would you rely on?).
> I agree with Georg: Using the "old" tz-data will lead to an appointment
> which is not where the user wanted it to be. So you need to start
> guessing or update the tz-data.
This is what I wrote a couple of times: "Do update the TZ-data in this case,
and the Kolab-client doing so may have to perform some additional changes
in this Kolab-object". But the old, now stale TZ-data and (in case of
storing UTC) the stored date-time fields _were_ correctly calculated by
definition; one needs them to correctly calculate the intended local time.
Then the fresh TZ-data is used to calculate the new UTC equivalent of the
intended local time, which is stored together with the new TZ-data in the
Kolab-object, replacing the old, stale data there: this has to be
an "atomic update" from the perspective of other Kolab-clients accessing
that Kolab-object (i.e. first both changes have to be applied, then the
updated object is written).
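The two-step recalculation described above might be sketched as follows (a hypothetical helper: plain offsets stand in for full TZ-data, naive datetimes are used throughout, and the atomic write-back to the Kolab-object itself is out of scope):

```python
from datetime import datetime, timedelta

def refresh_utc(stored_utc: datetime, stale_offset: timedelta,
                fresh_offset: timedelta) -> datetime:
    """Recompute the stored UTC date-time after a TZ-data change.

    Step 1: the stale TZ-data is still correct *for the stored value*,
            so it recovers the intended local time.
    Step 2: the fresh TZ-data yields the new UTC equivalent.
    Both the new UTC and the fresh TZ-data must then be written back
    to the Kolab-object in one atomic update.
    """
    intended_local = stored_utc + stale_offset   # step 1
    return intended_local - fresh_offset         # step 2

# Example: the offset hypothetically changes from +02:00 to +01:00.
new_utc = refresh_utc(datetime(2015, 3, 30, 11, 0),
                      timedelta(hours=2), timedelta(hours=1))
print(new_utc)  # 2015-03-30 12:00:00 -- the intended 13:00 local time is preserved
```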