Rationale of the KEP2 design: central in-format TZ-data vs. individual local TZ-databases and more (Part 2)

Florian v. Samson florian.samson at bsi.bund.de
Tue Mar 15 14:01:52 CET 2011


this is part 2, trying to split the discussion threads about (more or 
less) unrelated topics a bit.

> > > The concept has been tried in iCalendar, likely because they also
> > > could not agree on one approach and thus allowed both, and leads to
> > > inconsistent client behaviour.
> >
> > Please name your "one", "both" and "inconsistent client behaviour"
> > specifically.
> There are two ways of encoding DST data.

Oh, I can envision many more ...

> One is to encode it statically, e.g. this time zone always switches to
> summer time on 3rd of March every year and back on the 5th of October.

... but this one would be simply idiotic.
But please mind your wording (which is why I failed to understand you 
before): an encoding is always static (as long as the specification for 
that encoding does not change), but in your example you depict the TZ-data 
itself as immutable, persistent and identical for every year.  

> The other is to encode it dynamically, 

See above: if I understand correctly, you do not mean that the encoding 
changes dynamically, but rather that the TZ-data is somewhat "dynamic".
Still, I do not fully comprehend which aspects you are trying to address:
- alterable?
- updateable?
- different for different years?
- etc.?

> through a database, 

No, the source of the TZ-data is completely unrelated to its properties!
So a database is just one possible source of that TZ-data, exactly the same 
things can be achieved with in-format TZ-data (i.e. the Kolab-objects 
containing the TZ-data), e.g. updates, changes in the TZ-data (= the 
TZ-definition), etc.

Using in-format TZ-data, only a client which newly creates a Kolab-object 
MUST have some other source of TZ-data at hand to copy it into that object.  
This client or other clients with write access to this Kolab-object MAY 
update this TZ-data when they have newer TZ-data for that TZ and alter that 
Kolab-object anyway; hence a UTC-timestamp of the original release date of 
that TZ-data should be provided with the in-format TZ-data.
And again, please leave it up to the implementer which source of TZ-data is 
preferred to be used.
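
The create/update rules above can be sketched as follows; the record fields 
(`tzid`, `rules`, `released`) and all function names are illustrative 
assumptions of mine, not part of the Kolab format:

```python
from datetime import datetime, timezone

# Hypothetical in-format TZ-data record; the field names are invented
# for illustration, not taken from any Kolab specification.
def make_tzdata(tzid, rules, released):
    return {"tzid": tzid, "rules": rules, "released": released}

def create_event(summary, local_start, tzdata):
    # A client newly creating a Kolab-object MUST copy TZ-data from
    # some source (system database, web-service, ...) into the object.
    return {"summary": summary, "start": local_start, "tz": tzdata}

def maybe_update_tzdata(event, candidate):
    # A client altering the object anyway MAY replace the embedded
    # TZ-data, but only with a strictly newer release of the same TZ;
    # this is what the UTC release-timestamp is needed for.
    current = event["tz"]
    if (candidate["tzid"] == current["tzid"]
            and candidate["released"] > current["released"]):
        event["tz"] = candidate
    return event

old = make_tzdata("Europe/Berlin", "CET/CEST 2010 rules",
                  datetime(2010, 5, 1, tzinfo=timezone.utc))
new = make_tzdata("Europe/Berlin", "CET/CEST 2011 rules",
                  datetime(2011, 3, 1, tzinfo=timezone.utc))

ev = create_event("KEP2 review", "2011-04-01T10:00:00", old)
maybe_update_tzdata(ev, new)
assert ev["tz"]["released"].year == 2011  # newer release won
```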

> The dynamic approach is likely to be more correct over time, but has the
> potential pitfall that users with very old/outdated systems may be shown
> the wrong time.

No, not if it is done in-format, in contrast to relying on sources external 
to the PIM-Client (e.g. a database): all clients always show the same time, 
which is guaranteed not to be the case with different sources of TZ-data 
sooner or later, e.g. be it different versions of the same database, or 
fundamentally different sources such as a web-service and a database.
If the PIM-Clients ever go wrong with in-format TZ-data, they all go wrong 
the same way, so no one will realise and their users will still successfully 
meet.

So the whole "source of TZ-data: in-format vs. local database" discussion 
boils down to:
1. A single, central in-format TZ-data entry in every Kolab-event (for those 
which are not explicitly defined in UTC), which _all_ clients accessing 
that Kolab-object use
2. An individual, local database providing the TZ-data: so each client may 
have a different TZ-mapping and display events at different times.
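
The divergence described in point 2 can be illustrated with a small sketch; 
the two rule sets, their switch dates and the offsets are invented for 
illustration, not real Olson data:

```python
from datetime import datetime, timedelta

# Two hypothetical versions of the DST rules for one TZ-ID, as two
# clients with differently aged local databases might hold them.
rules_v1 = {"dst_start": datetime(2011, 4, 1)}   # stale database
rules_v2 = {"dst_start": datetime(2011, 3, 27)}  # updated database

def to_utc(local, rules):
    # Standard time UTC+1, summer time UTC+2 after the DST switch.
    hours = 2 if local >= rules["dst_start"] else 1
    return local - timedelta(hours=hours)

event_local = datetime(2011, 3, 29, 10, 0)  # the stored local time

# Approach 2 (individual local databases): the same event maps to two
# different UTC instants on the two clients.
client_a = to_utc(event_local, rules_v1)
client_b = to_utc(event_local, rules_v2)
assert client_a != client_b  # the two clients disagree by one hour

# Approach 1 (in-format TZ-data): both clients apply the one embedded
# rule set, so they agree -- right or wrong, they agree.
assert to_utc(event_local, rules_v2) == to_utc(event_local, rules_v2)
```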

> If BOTH time zone identifier and static encoding are stored, they
> necessarily drift apart as soon as the DST rules change.

No, a TZ-ID and TZ-data cannot possibly "drift apart", as the TZ-ID merely 
names that TZ-data.
I think you mean that not-updated ("stale") TZ-data might be simply wrong 
(as outdated): sure, this is a no-brainer and true for any source of 
TZ-data, databases included.

> > You correctly provided a reason, why we must agree on and specify
> > timezone identifiers (you call them geographical identifiers here):
> > interoperability on the format-level.
> I am glad we agree that storing timezone identifiers are the right
> approach.

I also think that the TZ-IDs defined by the Olson-database are a good choice 
(there are others just as good as well), but only the ID definitions.
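
For illustration, a client that only needs the Olson ID definitions (not 
the rule data behind them) could validate a stored TZ-ID on the 
format-level with Python's `zoneinfo` module; a minimal sketch:

```python
from zoneinfo import available_timezones

# The set of known Olson/IANA TZ-IDs, e.g. "Europe/Berlin"; only the
# identifiers are used here, none of the DST rule data behind them.
olson_ids = available_timezones()

def is_valid_tzid(tzid):
    # Format-level check: is this a known Olson identifier?
    return tzid in olson_ids

assert is_valid_tzid("Europe/Berlin")
assert not is_valid_tzid("Middle-Earth/Shire")
```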

> > But you always lacked and still do lack any reason why ...
> > a. the *source* for the timezone data must be a database
> What else would that source be, in your view?

I do not care, and I believe the Kolab-format should not either: anything 
the client-implementer likes or can easily access.  The fact is that 
various clients have very different restrictions and possibilities.

> All systems I know use a system wide database for this kind of
> information, often in the form of some ASCII table with system libraries
> that make it easy to access for all applications.

BTW, it was you who named a web-service as another possible source, even 
pointing to an ongoing specification effort for such a service.

> Which other methodology do you propose?

Any which suffices: that is definitely more than you and I can momentarily 
think of.

> > b. this *source* must be the same database for all (actually this point
> > was weakened by you, lately)
> That point was never made, 

AFAIR it was: IIRC you wrote numerous times that the Olson *database* shall 
be mandatory.

> The narrowing down of Olson has always been for the reason of providing
> an encompassing, yet limited, set of tzids, for the reasons already
> discussed, and to which you agreed.

Then please cease using the term "Olson database" throughout this discussion 
(and in KEP2), as you now state (for the first time, AFAICR) that you 
merely intended to refer to the TZ-IDs defined by Olson.

> > > Not all communication takes place in email on list, as explained.
>  >
>  > Consequently they do not exist for the readers of this list.
> That would be inappropriately exclusive, I believe.

How can one possibly take something into account of which one cannot 
possibly be aware?  
Excluding the participants on this list from knowing what has been 
discussed, while referring to those "behind the scenes"-discussions 
multiple times, is "inappropriate" IMO, as you could state anything 
"the developers of XYZ" allegedly said, without providing any of the 
original source.

> People should have the right to participate in this process in any way
> they consider workable for them, and I'd rather get more input, than
> preclude their ability to provide input by trying to force them to
> formulate it in full extent to the list.

Oh, I was not aware that you define this as your process, not as a process 
of the Kolab initiative: yes, I fully agree, if the KEPs are just for you 
personally, then it is fully sufficient that all information reaches you.
If the KEPs are meant as something happening within the Kolab-initiative, 
then I must fully disagree, as this is completely opaque for the 
participants in the public discussion as well as for anybody tracing back 
these decision-making processes in the future.

> But everyone is following the discussion, I believe, so could have and
> still can speak up if they felt their input was somehow misrepresented.

This is not the point: I actually do not care, as anybody can speak for 
themselves.  Those who decide not to speak for themselves do not speak and 
hence obviously decided not to have a relevant opinion on the ongoing 
discussion.
I am not at all criticising private conversations on these topics, but 
opinions must be voiced here in order to exist for the participants of the 
public discussion.  Hence I am glad, and appreciate, that Till finally 
spoke up here on this list (last week).

> > Yes we do.  Any of us needed some time to fully comprehend the various
> > aspects and depth of the timezone issue.  Sure, that applies to Henrick
> > as well, and it was RFC3339 limited to UTC only: please do read his
> > mail you referenced.
> I did. Quoting from

I would rather let Henrick speak for himself, which is what he did meanwhile.

> > Sorry, that is not true: the proposal for RFC3339 came from you.
> Please re-read the above.

I did, and it does not change the fact that you and/or Jeroen initially 
proposed RFC3339.
Henrick's mail you quoted was sent amidst the intense RFC3339 discussion 
and is quite ripped out of context the way you present it, IMO.

> Unfortunately the understanding of most other people had not yet reached
> the point that local time zone storage is indeed the best solution. Most
> of us, myself included, were fiercly of the opinion that UTC + time zone
> should be sufficient to model this issue.

These two are equivalent.
The only, but vast, difference is that "UTC + TZ" is compatible with the 
Kolab-Format, but "Local Time + TZ" is not! 
Any additional ambiguity you sensed in "UTC + TZ" over "Local Time + TZ" 
does not exist (or I just do not see it).
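
A minimal sketch of that equivalence, using Python's `zoneinfo` (my choice 
of tooling, not anything mandated by the format): as long as both sides 
apply the same TZ-data, the two encodings are interconvertible without loss.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

berlin = ZoneInfo("Europe/Berlin")

# "Local Time + TZ": the wall-clock time the user entered, plus a TZ-ID.
local = datetime(2011, 7, 1, 10, 0, tzinfo=berlin)

# "UTC + TZ": the same instant, stored as UTC plus the same TZ-ID.
utc = local.astimezone(timezone.utc)

# Round-trip: applying the same TZ-data recovers the original values,
# so neither encoding carries more information than the other.
assert utc.astimezone(berlin) == local
assert local.astimezone(timezone.utc) == utc
```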

> There is just no alternative.

There always is at least one. ;-))
Seriously, way too many "too quick conclusions" have been presented 
throughout this discussion, and similar statements have been made a couple 
of times, only to falter and collapse later.

