Rationale of the KEP2 design: central in-format TZ-data vs. individual local TZ-databases and more (Part 2)

Bernhard Reiter bernhard at intevation.de
Tue Mar 29 17:41:54 CEST 2011


On Tuesday, 29 March 2011 14:28:05, Florian v. Samson wrote:
> > So how do you know which is the UTC for 13:00 in Europe/Berlin on
> > 30.3.2015?
>
> By taking a closer look at the TZ-data, which accompanies the
> TZ-ID "Europe/Berlin": add the offset, which is valid for that date (likely
> to be "-02:00" for the 30 March 2015, see below) to the local time.
> This results in 11:00 UTC, right?
>
> > My understanding is that you would use the same rules you have today as
> > those are the best information you have, and thus calculate UTC based on
> > those rules. But this is merely an assumption. You do not know whether
> > those rules are going to change between now and 2015.
>
> This example is very fuzzy, please be concise.  
> I see at least two different cases addressed here:
> 1. The DST switching dates for this specific TZ are not defined yet.
> 2. The DST rules for this specific TZ are redefined (i.e. they are
> altered), but they have been defined (although differently) before.
> Which one did you intend to address?

Both; in both cases we cannot be sure that the tz-data stays the same.
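To make the arithmetic in Florian's example concrete, here is a minimal sketch using Python's zoneinfo module (which draws on the local Olson database) to compute the stored UTC value from the local wall-clock time, under the rules known today:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+; backed by the local Olson/IANA tz database

# The local wall-clock time for the appointment, as the user entered it.
local = datetime(2015, 3, 30, 13, 0, tzinfo=ZoneInfo("Europe/Berlin"))

# Under today's rules, 2015-03-30 falls after the switch to CEST (UTC+02:00),
# so the UTC value a client would store is 11:00.
utc = local.astimezone(ZoneInfo("UTC"))
print(utc)  # 2015-03-30 11:00:00+00:00
```

This result is only as good as the tz-data used at write time; if the rules for Europe/Berlin change before 2015, the stored 11:00 UTC no longer corresponds to 13:00 local.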

> So the "DST assumption" (whose meaning I never really comprehended) simply
> means "the TZ-data used", right?

Yes, as far as I understand it.
My email from the last few days explained the same thing in different words.

> And thanks for pointing out this big advantage of in-format TZ-data:
> For case 1 above (TZ-data not yet preassigned by legal authorities) a
> single source of TZ-data in the Kolab-object would prevent that every
> client may do its own (potentially deviating) extrapolation of the
> (definitely deviating) local sources of TZ-data (e.g. a local installation
> of the Olson database).

It has both a good and a bad side effect, and to me the balance comes out
unfavourable. It is true that tz-data embedded in the object makes all
clients display the same time, but if the tz-data changes, that uniform
display is no longer what the user wants. The embedded tz-data would then
need updating, which means rewriting the object, with all the drawbacks
that entails.

> > which ultimately boils down to a pre-processing step whether the
> > stored UTC was correctly calculated and can be relied upon, or needs to
> > be adjusted.
>
> No, not at all!  By definition of this process, the stored UTC *is*
> correctly calculated against the TZ-data used and *must* be relied upon
> (what else would you rely on?).

I agree with Georg: Using the "old" tz-data will lead to an appointment 
which is not where the user wanted it to be. So you either have to start
guessing or update the tz-data.
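A small sketch of what goes wrong, using fixed offsets to stand in for a hypothetical rule change (the year-round UTC+01:00 scenario is invented for illustration):

```python
from datetime import datetime, timedelta, timezone

# The client stored this UTC value using the rules known at write time,
# which placed Berlin on CEST (UTC+02:00) for 2015-03-30.
stored_utc = datetime(2015, 3, 30, 11, 0, tzinfo=timezone.utc)

# Hypothetical rule change: suppose Germany had meanwhile moved to
# year-round UTC+01:00.
rules_at_write_time = timezone(timedelta(hours=2))  # what the writer assumed
rules_now_in_force = timezone(timedelta(hours=1))   # what actually applies

print(stored_utc.astimezone(rules_at_write_time).time())  # 13:00, what the user entered
print(stored_utc.astimezone(rules_now_in_force).time())   # 12:00, what now gets displayed
```

The stored UTC is still "correct" relative to the old tz-data, but displaying it with the new rules shifts the appointment by an hour, so the client must either guess the user's intent or rewrite the object with updated tz-data.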

Best,
Bernhard
-- 
Managing Director - Owner: www.intevation.net       (Free Software Company)
Deputy Coordinator Germany: fsfe.org. Board member: www.kolabsys.com.
Intevation GmbH, Osnabrück, DE; Amtsgericht Osnabrück, HRB 18998
Geschäftsführer Frank Koormann, Bernhard Reiter, Dr. Jan-Oliver Wagner