JFC (Jefsey) Morfin wrote:
> At 18:11 27/08/2005, David Hopwood wrote:
>> JFC (Jefsey) Morfin wrote:
>>> [...] The DNS root is updated around 60 times a year. It is likely
>>> that the langroot is currently similarly updated with new langtags.
>>
>> No, that isn't likely at all.
>
> Your objection is perfectly admissible, but it should be documented.
For the long-term, sustained rate of updates to the registry to be 60 a
year, there would have to be real-world changes in the status of countries,
or in the classification of languages and scripts, occurring at that rate
(i.e. one change every 6 days). And even in times of significant political
upheaval, that is simply implausible.
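To make the arithmetic explicit, a throwaway sketch (the 60-per-year figure
is the one under dispute, not an established fact):

```python
# Rate implied by the claimed "60 updates a year":
# roughly one change every 6 days.
DAYS_PER_YEAR = 365
updates_per_year = 60  # the disputed figure

days_between_updates = DAYS_PER_YEAR / updates_per_year
print(round(days_between_updates, 1))  # -> 6.1
```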
> The order of magnitude is the same. I did not note the number of entries
> in the IANA file during the last months. This is something I will
> certainly track if the registry stabilises.

Exactly; the registry has not stabilised. It will do, but until it does,
there is little point in arguing statistics about how frequently it is
updated.
>>> The langtag resolution will be needed for every HTML, XML, email page.
>>
>> Patent nonsense. In practice the list will be hardcoded into software
>> that needs it, and will be updated when the software is updated.
>
> Then? Langtag resolution is the translation of the langtag into
> machine-understandable information. It will happen every time a langtag
> is read, just as domain name resolution is needed every time a URL is
> used.
>> The langtags would already be encoded in a form that can be interpreted
>> directly by each application. You were trying to imply that repeatedly
>> downloading this information would impose significant logistical costs:
>>
>> # Even if the users cache their 12.000 to 600.000 k zip file when they
>> # boot, or accept an update every week or month, we are in the logic of
>> # an anti-virus update.
>>
>> In fact there is unlikely to be any additional cost apart from that of
>> upgrading software using existing mechanisms. This is perfectly
>> sufficient. After all, font or character encoding support for new
>> scripts and languages (e.g. support for Unicode version updates) has to
>> be handled in the same way.
> I am afraid you confuse the process with the update of the necessary
> information. And you propose in part the solution I propose :-) .

If it is sufficient to upgrade software using existing mechanisms, then
there is no problem that is not already solved.
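To illustrate why no per-document "resolution" step is needed, here is a
minimal sketch of an application interpreting a tag purely from tables
compiled into it. The table contents and the `interpret` helper are a tiny
illustrative subset of my own invention, not the real registry or any
standard API:

```python
# Tables shipped inside the application; updated only when the software is.
LANGUAGES = {"en": "English", "uk": "Ukrainian", "fr": "French"}
SCRIPTS = {"Latn": "Latin", "Cyrl": "Cyrillic"}
REGIONS = {"UA": "Ukraine", "FR": "France"}

def interpret(tag):
    """Split a langtag into named parts using only the local tables.

    No network lookup happens; reading a tag is a purely local operation.
    """
    parts = tag.split("-")
    info = {"language": LANGUAGES.get(parts[0], "unknown")}
    for part in parts[1:]:
        if len(part) == 4 and part.title() in SCRIPTS:
            info["script"] = SCRIPTS[part.title()]
        elif len(part) == 2 and part.upper() in REGIONS:
            info["region"] = REGIONS[part.upper()]
    return info

print(interpret("uk-Cyrl-UA"))
# -> {'language': 'Ukrainian', 'script': 'Cyrillic', 'region': 'Ukraine'}
```

The point is that reading a million tagged documents costs a million cheap
in-memory lookups, not a million registry downloads.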
>> Languages, scripts, countries, etc. are not domains.
>
> The DNS root tends to be much more stable. What counts is not the number
> of changes, but their frequency.
> - there is no difference between ccTLDs and country codes. We probably
>   can say that there is one change a year. At least.
What happens if the change isn't immediately picked up by all software?
Not much. Only use of that particular country code is affected.
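A sketch of why stale software degrades gracefully rather than failing
outright; `REGIONS` and `region_name` are hypothetical stand-ins for an
application's embedded tables, not a real API:

```python
# An application whose embedded tables predate a country-code change still
# works; only the display of that one code is affected.
REGIONS = {"FR": "France", "DE": "Germany"}  # illustrative, possibly stale

def region_name(code):
    # A code the tables don't yet include (e.g. a newly assigned one) is
    # passed through as-is rather than causing the whole tag to be rejected.
    return REGIONS.get(code.upper(), code.upper())

print(region_name("fr"))  # -> France
print(region_name("XK"))  # not in the stale tables: falls through as "XK"
```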
> Now, if there are updates, this means there are needs to use them now -
> not in some years' time.

And the people who need them will upgrade their software -- which is what
they have to do anyway to actually make use of any new localisations,
scripts, ...
> PS. The problem is: one way or another, one billion users, with various
> systems and appliances, must get reasonably maintained related
> information, which today weighs 15 K and is going to grow to 600 K at
> some future date,
The subset of the information needed by any particular application will
typically be much less than 600K. If there is a real issue of database size,
operating systems will start providing shared libraries to look up this
information, so that only an OS update is needed (and similarly for the
Unicode data files, which are already significantly more than 600K).
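The subsetting point can be sketched as follows, assuming a simplified
record format (the real IANA registry file uses a different, record-per-
paragraph syntax; the entries and the `subset` helper here are
illustrative):

```python
# Ship only the registry records an application actually supports;
# the subset is far smaller than the full file.
FULL_REGISTRY = [
    {"Type": "language", "Subtag": "en", "Description": "English"},
    {"Type": "language", "Subtag": "uk", "Description": "Ukrainian"},
    {"Type": "script", "Subtag": "Latn", "Description": "Latin"},
    {"Type": "script", "Subtag": "Cyrl", "Description": "Cyrillic"},
    {"Type": "region", "Subtag": "UA", "Description": "Ukraine"},
    # ... thousands more records in the real file ...
]

def subset(registry, wanted):
    """Keep only the records whose subtags the application ships with."""
    return [rec for rec in registry if rec["Subtag"] in wanted]

app_records = subset(FULL_REGISTRY, {"uk", "Cyrl", "UA"})
assert len(app_records) < len(FULL_REGISTRY)
```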
> with a change from every week to every day (IMHO much more, as people
> start mastering and adapting a tool currently not well adapted to
> cross-lingual exchanges). From a single source (in the exclusive case)
> or from hundreds of specialised sources in an open approach. This should
> not be multiplied by all the languages that will progressively want to
> support langtags, but will multiply the need by two or three. For
> example, a Ukrainian will want langtags in Ukrainian, in both Latin and
> Cyrillic scripts [...]
You pick one of the very few languages that are written in more than
one script, and use that example to imply that the total number of
language-script combinations used in practice is 2 to 3 times the number
of languages. Please stop exaggerating.
David Hopwood <david.nospam.hopwood@xxxxxxxxxxxxxxxx>
Ietf mailing list