Where is there an end of it? | Alex Brown's weblog



My father-in-law, possibly amused by watching me dick around with a DSLR and laptop over the weekend, decided to dig his camera equipment out of storage.

These are the cameras he used, over three decades, to take the pictures for his magnum opus (so becoming the first non-Russian to be awarded the Russian Academy of Fine Arts’ gold medal). He asserted he'd always been pleased with Rolleiflex ...

The episode has a useful pay-off ... it established with my wife a new baseline for the number of cameras it is reasonable for a man to own :-)

Document Format Standards and Patents

This post is part of an ongoing series. It expands on item 9 of Reforming Standardisation in JTC 1.


Historically, patents have been a fraught topic, co-existing uneasily with standards. Within JTC 1, perhaps one of the most notorious recent examples surrounded the JPEG Standard, and, prompted in part by such problems, there are certainly many people of good will wanting better management of IP in standards. Judging by some recent developments in document format standardisation, it seems probable that this will be the area where progress can next be made …

Most recently, the Fast Track standardisation of ISO/IEC 29500 in 2007/8 saw much interest in the IPR regime surrounding that text, with much dark suspicion surrounding Microsoft's motives. However, the big development in this space – when it came – was from an unexpected direction …

The i4i Patent

Back in the SGML days I remember touring the floor of trade shows and noticing the S4-Desktop product from Canadian company Infrastructures for Information, Inc. (i4i). Like a number of other products at the time (including Microsoft's own long-forgotten SGML Author for Word, and Interleaf's BladeRunner) it attempted to make Word™ a structure-aware authoring environment, based on the (accurate) belief that while many companies wanted structured data they didn't want to have to grapple with pointy brackets.

Keen to avoid the phenomenon that Rob Weir describes whereby

There is perhaps no occasion where one can observe such profound ignorance, coupled with reckless profligacy, as when a software patent is discussed on the web.

I will avoid any punditry about the ongoing legal course of this patent. Those interested would do well to read IP lawyer Andy Updegrove's post (and follow-up) on the legalities of this matter.

On the technical merits of the patent, though, there appears to me to be unanimity among disinterested experts qualified to judge. For example Jim Mason (for 22 years the chair of the ISO committee responsible for all-things-markup) commented:

[T]his technique did not originate with i4i. It was already established in other commercial products and was, in effect, standardized in ISO/IEC 8613, Office Document Architecture. ODA essentially described a binary format for word-processor document representation, which worked by pointers into a byte stream. Its original interchange format, ODIF, started as a representation of that structure, but it was extended to have an alternative SGML stream, exported by a process similar to that described in the i4i patent. So there was prior art, specifically prior art described in public standards.

This point was expanded on by markup veteran Rick Jelliffe, who concluded:

By the end of the judgment I was left thinking "what interactive XML system with any links wouldn't be included in this?" which is utterly ridiculous.

I was creating SGML systems from 1989, and the i4i patent is just as obvious then as it is now.

In a Guardian interview, i4i chairman Loudon Owen seemed to make it clear that the patent would not be licensed on a reasonable and non-discriminatory (RAND) basis (at least – or especially – where Microsoft are concerned):

On licensing to Microsoft, Owen sounds on the edge of anger: "No. No. This is our property. We are going to build our business. There's no right for Microsoft to use it and go forward." But i4i could license it at some humungous, eye-watering price that Microsoft might have to pay, surely? No, says Owen.

The Wider Context

As part of its amicus brief (PDF) in the Bilski case pending before the Supreme Court, IBM offered what might be termed the orthodox pro-patent position. In a section headed “Software Patent Protection Provides Significant Economic, Technological, and Societal Benefits” we thus find a footnote quoting this text:

Given the reality that software source code is human readable, and object code can be reverse engineered, it is difficult for software developers to resort to secrecy. Thus, without patent protection, the incentives to innovate in the field of software are significantly reduced. Patent protection has promoted the free sharing of source code on a patentee’s terms—which has fueled the explosive growth of open source software development.

While it is somewhat surprising to learn here of the affinity between FOSS and patents, the point is of course that the idea of patents is not wholly without foundation: a state-sanctioned restraint of trade (for such is a patent) is justified if it allows innovators to monetize their inventions. However, increasingly, when we listen to the voices of actual FOSS (and non-FOSS) people, the view seems to be that any advantages are outweighed by the problems of patents. For example Mike Kay (developer of the superb Saxon family of XSLT, XQuery, and XML Schema processing products), in an open letter to his MP, argues against software patenting in a piece which is well worth reading in its entirety:

The software business does not need incentives to innovate. If you don't innovate, you die. [...] [I]n the software business, patenting of ideas benefits no-one: certainly, it does not benefit society or the economy at large, which is the only possible justification for governments to interfere with the market and grant one company a monopoly over an idea.

And, in specific reference to the i4i patent:

recently an otherwise unsuccessful company has been awarded a similar [i.e. 9-figure] sum against Microsoft, for an idea which most people in the industry considered completely trivial and obvious.

More colourfully Tim Bray lists some horror-story cases (again well worth reading) and opines that the whole patent system is "too broken to be fixed". He also addresses the question of whether patent activity benefits society, and comes down firmly against:

And here are a few words for the huge community of legal professionals who make their living pursuing patent law: You’re actively damaging society. Look in the mirror and find something better to do.

The Myth of Unencumbered Technology

Given the situation we are evidently in, it is clear that no technology is safe. The brazen claims of corporations, the lack of diligence by the US Patent Office, and the capriciousness of the courts mean that any technology, at any time, may suddenly become patent-encumbered. Technical people – being logical and reasonable – often make the mistake of thinking the system is bound by logic and reason; they assume that because they can see 'obvious' prior art, it will apply; however, as the case of the i4i patent vividly illustrates, this is simply not so.

Turning to document format standards, we can see there most certainly are known and suspected patents in play. For example:

  • the i4i patent mentioned above (which, in his Guardian interview, the i4i Chairman refuses to rule out as applying to ODF)
  • 45 unspecified patents which Microsoft has claimed are infringed, some number of which may relate to the ODF specification (and over which Sun and Microsoft agreed a cease-fire until 2014 – at least as far as Sun is/was concerned)
  • an unknown number of unspecified patents which have led IBM to include ODF under its Interoperability Specifications Pledge
  • an unknown number of unspecified patents which have led Microsoft to include OOXML under its Open Specification Promise (though presumably clear OOXML-specific patents such as US Patent 7,676,746 are in scope here)

Now, as is clear from the above, large corporations have a preferred means of neutralising their IP stake in standards: by "promises", "covenants" and the like.

The question for standardizers remains: is the current situation acceptable? And if not, what can be done to improve it?

The ISO Rules (and Are They Followed?)

Since 2007 the "big three" International SDOs (ISO, IEC and ITU-T) have operated a common patent policy predicated on the wholly reasonable premise that standards should be "accessible to everybody without undue constraints". The policy is implemented in detail by JTC 1 (which joins the forces of ISO and IEC) and which – as we know – governs the International Standards ODF and OOXML.

The Policy as implemented in the Directives has several aspects, which I would categorise as falling under the following headings …

Personal Disclosure

Anybody aware of an IPR issue has a duty to speak out:

any party participating in the work of the Organizations should, from the outset, draw their [sic] attention to any known patent or to any known pending patent application, either their own or of other organizations. (ISO Directives Part 1, Clause 3)

And indeed committee secretaries and chairs are routinely reminded by Geneva to issue a request for IPR disclosure at meetings, to jog people's memory.

Formal Disclosure in Standards

Readers of Standards can expect to have the IPR/patent situation made explicit in the text before them, and accordingly there are many textual items mandated for Standards to which patents apply. In particular it is stated that "[a] published document for which patent rights have been identified during the preparation thereof, shall include the following notice in the introduction:"

The International Organization for Standardization (ISO) [and/or] International Electrotechnical Commission (IEC) draws attention to the fact that it is claimed that compliance with this document may involve the use of a patent concerning (…subject matter…) given in (…subclause…).

Centralised Record-keeping

A JTC 1 "patent database" (served as a huge HTML document) is maintained in Geneva which gathers together all the patents applying to published standards, and the terms under which patent holders have agreed to make licenses available.

Clear Access Rights

Patent Holders who have signed the licensing declaration to ISO, IEC or ITU-T agree to license their patents under a clear regime: either RAND, ZRAND (i.e. RAND with a free-of-charge license), or – exceptionally – on a per-case commercial basis. Anybody accessing the patent database is able to see this and, by referring to the ISO/IEC governing documents, know what it means, not least because no deviations from Geneva's wording are permitted:

the patent holder has to provide a written statement to be filed at ITU-TSB, ITU-BR or the offices of the CEOs of ISO or IEC, respectively, using the appropriate "Patent Statement and Licensing Declaration" Form. This statement must not include additional provisions, conditions, or any other exclusion clauses in excess of what is provided for each case in the corresponding boxes of the form.

Problem Handling

And if things go wrong:

2.14.3 Should it be revealed after publication of a document that licences under patent rights, which appear to cover items included in the document, cannot be obtained under reasonable and non-discriminatory terms and conditions, the document shall be referred back to the relevant committee for further consideration.

Unfortunately, when we hold up the big two document standards of ODF and OOXML against the goals set out, we see there is work still to be done …

Moving Forward

While the "broken stack" of patents is beyond repair by any single standards body, at the very least the correct application of the rules can make the situation for users of document format standards more transparent and certain. In the interests of making progress in this direction, it seems a number of points need addressing now.

  • Users should be aware that the various covenants and promises being pointed to by the US vendors need not be relevant to them as regards standards use. Done properly, International Standardization can give a clearer and stronger guarantee of license availability – without the caveats, interpretable points and exit strategies these vendors' documents invariably have.
  • In particular it should be of concern to NBs that there is no entry in JTC 1's patent database for OOXML (there is one for DIS 29500, its precursor text: a ZRAND promise from Microsoft); there is no entry whatsoever for ODF. I would expect there to be declarations from the big US vendors who profess patent interests in these standards, and I would expect this to be addressed as a matter of urgency (perhaps in parallel with the publication of these standards' forthcoming amendments).
  • In the case of the i4i patent, one implementer has already commented that implementing CustomXML in its entirety may run the risk of infringement (and this is probably, after all, why Microsoft patched Word in the field to remove some aspects of its CustomXML support). OOXML needs to be referred back to its committee (this may be JTC 1, not SC 34) for a decision on what happens next. My personal guess is that CustomXML will be left in OOXML Transitional (patent-encumbrance will be just one more of the many warning stickers on this best-avoided variant), and modified in, or removed from, OOXML Strict.
  • When declaring their patents to JTC 1, patent holders are given the option of making a general declaration about the patents that apply to a standard, or of making a particular declaration about each and every itemized patent which applies. I believe NBs should be insisting that patent holders enumerate precisely the patents they hold which they claim apply to ODF or OOXML, as this will give greater transparency about what is (or is not) covered and prevent the vague threat ("there may be patents but we're not saying what") which seems to apply at the moment.

There is obviously much to do, and I am hoping that at the forthcoming SC 34 meetings in Stockholm this work can begin. Certainly, anybody reading this blog post now knows there are outstanding IPR issues which we as standardizers have a duty to raise …

Hi-Fi Life (so far)

The Dark Ages

My first awareness of Hi-Fi came from glossy advertisements in the Sunday supplements. The age of the “music centre” had (just) passed and the in-vogue technology was the “tower” – preferably one that lit up like a Jodrell Bank control console, and which contained a record deck, cassette tape player, amp, tuner and the inevitable “graphic equalizer” – all preferably encased in a cabinet with a smoked-glass front. I particularly remember the Philips “Black Tulip” system as seeming especially desirable.

Philips X70 Black Tulip 3e Set
Philips' Black Tulip Hi-Fi
(image used with kind permission of Vintage Collection)


Soon, like any self-respecting teenage geek, I learned that in the realm of true Hi-Fi one had separates: a record player, an amp, and speakers. Graphic equalizers were frowned upon, and even tone controls were seen as a bit iffy. Received wisdom was that the most important thing was the source (the record player) – on the not-unreasonable premise that if what was being extracted from the vinyl was no good, it could not be rescued downstream. So began my quest for decent sound, ending (largely through purloining things from my father’s system) with a Garrard 401 turntable (which type – amazingly – still seems to command good money on eBay), and an SME Series V tonearm (which – even more amazingly – is still available in a newer incarnation starting from £2,050) together with some fiddly moving-coil cartridge. Everything sounded okay on its own terms, but the trouble came when comparing this “Hi-Fi” to live music. I’d go to London and listen to Klaus Tennstedt conduct the London Philharmonic Orchestra in a Mahler symphony, then come home and listen to the same forces perform the same work on record; there was no comparison. Nor was the problem limited to this setup: every other “audiophile” setup I heard exhibited the same kinds of problems, which were (as Ken Rockwell recently wrote in typically forthright fashion) essentially caused by LPs:

LPs are awful. Audiophiles are often hoping that I'm endorsing LPs, but no. These "plastic dog plops," as one mastering engineer referred to them, are loaded with noise, wow & flutter, distortion, echoes, room feedback and even pitch changes from never being pressed on-center. LPs usually have their lows cut, or at least summed to mono. Some people prefer the added noise and distortion, much as a veil hides the defects in an ugly woman's face, allowing our brain to fill-in what we want to see. 

Perfect Sound Forever?

Soon after CD was launched commercially as an audio format, it became clear which way things were heading, and so in 1984 I happily sold off my LP collection and purchased a Philips CD100 top-loading CD player. It was odd recently to take my son to the Science Museum and see one of these units as part of a “how we used to live” exhibit! This CD player, with its signal routed (via a potentiometer) into a home-made power amplifier (a J. Linsley Hood design like this) and feeding a pair of cheap-but-good KEF Coda III speakers, clearly offered a step change from anything I’d heard from LP-based systems, although amazingly some people claimed at the time – and still claim – that LP could compete with digital sound; either they had something very wrong with their systems or else never listened to live music!

This kind of combination (with occasional modest improvements from upgrading the player or the speakers) was my “Hi-Fi” for the next 25 years – although one significant improvement came in year 21 with the addition of a subwoofer (in this case an MJ Acoustics Pro 50 Mk II). It takes a while to get the levels set right, but a properly integrated sub-woofer not only fills in the all-important bottom octaves, but seems to make the whole sound (for example in orchestral music) more realistically airy.

In the Power Amp
Inside a home-brew power amp

But how “hi” was the “fi” of this system? Although pretty good it always seemed there were a few things not quite right…

  • Character: I tend to classify loudspeakers as “happy” or “sad” – and the distinction sometimes only becomes apparent after long acquaintance. In general I prefer (happy) speakers with good presence and a tendency to warmness – speakers with a recessed middle ultimately give less pleasure even if their “sparkling” treble and/or deep bass may have sounded good at first.
  • Volume handling: as speakers get louder (at an orchestral climax for example) they can have a tendency to sound more and more strained – and savour more of “speakers” than of the music. This is particularly the case with less-powerful transistor-based amplification – which also tends to impart a certain unpleasant hard character to the sound.
  • Real bass: real bass (as experienced in the concert hall) is not warm, fuzzy and indistinct, but has a musical and visceral quality far removed from the inchoate whumping noises emitted by low-quality sub-woofers.
  • Detail: more is not necessarily better (low-fi reproduction can sometimes accentuate things a high-fidelity setup won’t); what matters is the presentation of different instruments in the overall blend, such that one can concentrate on something if one wants to. For stereo recordings, a good strong stereo image helps here, as each instrument will be coherently positioned at a location in the sound stage.

Audio File

The problem of too many CDs seemed like a good opportunity for a re-think: if these could all be ripped to a server and then streamed, might this also be an opportunity to upgrade the “Hi-Fi”?

Looking around, it seemed the answer might be yes. In particular, a British firm caught my attention: AVI Hi-Fi and its ADM 9.1 active loudspeakers. AVI’s robust promotional material seemed congenial (its lambasting of cable fetishism is worth a read), and its premises made engineering sense, in particular that:

  • The digital source does not matter – a cheapo off-brand CD player will extract the same bits from a CD as the most exotic “transport” (funny how this turns on its head the old wisdom of the source mattering most)
  • The DAC does matter, hence the best that money can buy should be used
  • Even given a good speaker design, there are advantages to be had from doing away with the need for a passive crossover (as the AVI site has it: “this is fact not hype” !).

The ADM 9.1’s are thus an almost-complete “system” needing only to be fed a signal (preferably bits via their optical input, to take advantage of the high-quality inbuilt DAC). Each unit is a ported two-driver active speaker; each requires mains power; and each contains two amplifiers – 75 watts for the tweeter and 250 watts for the woofer. The master speaker unit accepts line-level input and/or up to two optical inputs via TOSLINK. The connection to the slave unit is via RCA phono cable, and there is an additional line-level output to drive a sub-woofer (I continue to use my MJ Acoustics unit, but AVI also make their own matched sub-woofer for rather more money). The setup is controlled by a simple remote handset which allows for source selection and volume control.

And how do they sound? Well, after a few weeks making sure to eliminate any false first impressions, I find them superb. All the traditional problem areas of hi-fi reproduction have been addressed, in particular:

  • There is no apparent false warmth; the lower midrange and bass are tight and precise (and even better with a sub-woofer). But “lack of warmth” does not imply these are cool speakers, more that they are neutral … if you’ve ever listened to high-end headphones you’ll recognise the free, uncoloured type of sound. I’ve read some reports these speakers are too clinical. I don’t agree: it’s just fidelity.
  • Volume: these babies go loud, and do so without losing the plot. A nice side-effect of this is that even at volumes approaching “realistic levels” they are not unduly fatiguing (I mean “realistic levels” for classical music!).
  • Detail: the stereo image is particularly solid, and maybe this helps the very strong sense of detail – so for example when listening to a Beethoven symphony the subsidiary background motoric rhythms in the strings are “there” if you care to listen to them.

Most surprisingly, I was expecting the ADM’s to be a mercilessly revealing lens through which to view the problems of early recordings, but – on the contrary – good-quality analogue material sounds better than ever – something like Peter Maag’s legendary 1960 recording of Mendelssohn's Scottish Symphony (for example) sounds gorgeous. I wonder what the reason for this is.

Are they perfect? No, not quite – listening to the human voice one is sometimes aware of a slight cabinet-i-ness to the sound. But this is a picky caveat – what I’m hearing now is the closest to real I’ve ever heard from my own or other systems.

"Perfect Sound Forever" ?
An ADM 9.1 unit with a CD and Logitech Squeezebox. Karajan's Eine Alpensinfonie (pictured)
was the first commercial CD pressed; the rather nasty recording quality of this,
and many other early CDs, became associated with digital recording itself and set
the cause of digital audio back among audiophiles.

The end of the CD

To drive the ADM 9.1’s I use a Logitech Squeezebox. The long process of ripping my CD collection is underway but it is increasingly clear music in future is going to come straight off the net. I’ve just bought the “studio master” 24-bit FLACs of a thrilling new set of Mozart symphonies from Sir Charles Mackerras and the Scottish Chamber Orchestra. From a server upstairs the bits are sent over a wireless network to the Squeezebox and then down an optical cable straight into the ADM 9.1’s. I’m not sure my ears are able to detect the advantage of 24-bit over 16-bit material, but the end result is absolutely riveting musical reproduction. Maybe this kind of system will form the template for my next 25 years of Hi-Fi life …
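On the 16-bit/24-bit question, the theoretical numbers at least are easy to compute: the usual rule of thumb for linear PCM puts the dynamic range at roughly 6.02 × bits + 1.76 dB. A quick back-of-envelope check in Python (theoretical figures only, ignoring dither and real-world converter performance):

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of linear PCM: the familiar
    rule of thumb 6.02 * bits + 1.76 dB, here computed as
    20 * log10(2 ** bits) plus the 1.76 dB full-scale sine term."""
    return 20 * math.log10(2 ** bits) + 1.76

print(f"16-bit: {dynamic_range_db(16):.1f} dB")  # 98.1 dB
print(f"24-bit: {dynamic_range_db(24):.1f} dB")  # 146.3 dB
```

Since 16-bit already exceeds the dynamic range of almost any domestic listening room, it is perhaps no surprise that the 24-bit advantage is hard to hear.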

[Disclaimer: I have no connection with any of the companies or products mentioned in this posting!]

Update — February 2012

I am still as impressed with these speakers as when I wrote this post originally. I have had a query about how well the MJ Acoustics subwoofer works: again, very well. The one tricky thing is getting it properly integrated with the main speakers. My (not very technical) method for doing this is to use a recording with a very well-recorded bass drum, and then adjust the settings until it sits properly in the orchestral texture. My current disc of choice for this exercise is the excellent recording of Mahler's Symphony No 1 by the Pittsburgh Symphony Orchestra conducted by Manfred Honeck. The settings achieved by doing this, at least in my listening room, can be seen in the photograph below.

Subwoofer settings

SC 34 WG meetings in Paris last week

The croissants of AFNOR

Last week I was in Paris for a stimulating week of meetings of ISO/IEC JTC 1/SC 34 WGs, and as the year draws to a close it seems an opportune time to take the temperature of our XML standards space and look ahead to where we may be going next.

WG 1 (Schema languages)

WG 1 can be thought of as tending to the foundations upon which other SC 34 Standards are built - and of these foundations perhaps none is more important than RELAX NG, the schema language of many key XML technologies including ODF, DocBook and the forthcoming MathML 3.0 language. WG 1 discussed a number of potential enhancements to RELAX NG, settling on a modest but useful set which will enhance the language in response to user feedback.

A proposed new schema language for cross-reference validation (including ID/IDREF checking) was also discussed; the question here is whether to have something simple and quick (that addresses the ID/IDREF validation of RELAX NG, say), or whether to develop a more fully-featured language capable of meeting challenges like cross-document cross-reference checking in an OOXML or ODF package. It seems as if WG 1 is strongly inclining towards the latter.
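The simple end of that spectrum is easy to picture. Here is a minimal sketch in Python (standard library only) of the core of ID/IDREF checking; the id and ref attribute names are illustrative, not schema-declared ID/IDREF types:

```python
import xml.etree.ElementTree as ET

def check_refs(xml_text):
    """Two passes over the tree: collect every id, then report each
    ref that points at no collected id - the core of ID/IDREF checking."""
    root = ET.fromstring(xml_text)
    ids = {e.get("id") for e in root.iter() if e.get("id") is not None}
    return [
        f"dangling reference: {e.get('ref')!r}"
        for e in root.iter()
        if e.get("ref") is not None and e.get("ref") not in ids
    ]

doc = """<doc>
  <section id="intro"/>
  <xref ref="intro"/>
  <xref ref="missing"/>
</doc>"""
print(check_refs(doc))  # ["dangling reference: 'missing'"]
```

The fully-featured alternative would need much more than this, of course: keys scoped to parts of a document, references that cross package boundaries, and so on.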

Other work centred on proposing changes for cleaning up the unreasonable licensing restrictions which apply to "freely-available" ISO/IEC standards made available by the ITTF: the click-through license here is obviously out-of-date, and text is required to attach to schemas so that they can be used on more liberal, FOSS-friendly terms. (I mentioned this before in this blog entry.)


WG 4 (OOXML)

WG 4 had a full agenda. One item of business requiring immediate attention was the resolution of comments accompanying the just-voted-on set of DCOR ballots. These had received wide support from the National Bodies, though it was disappointing to see that the two NBs who had voted to disapprove had not sent delegates to the meeting. P-members are obliged both to vote on ballots and to attend meetings in SCs, and so these nations (Brazil and Malaysia are the countries in question) are not properly honouring their obligations as laid down in the JTC 1 Directives:

3.1.1 P-members of JTC 1 and its SCs have an obligation to take an active part in the work of JTC 1 or the SC and to attend meetings.

I note with approval the hard line taken by the ITTF, who have just forcibly demoted 18 JTC 1 P-members who had become inactive.

Nevertheless, all comments received were resolved and the set of corrigenda will now go forward to publication, making a significant start to cleaning up the OOXML standard.


The other big topic facing WG 4 was the thorny problem of what has come to be called the issue of "Strict v Transitional". In other words, deciding on some strategy for dealing with these two variants of the 29500 Standard.

The UK has a clear consensus on the purpose of the two formats. Transitional (aka "T") is (in the UK view) a format for representing the existing legacy of documents in the field (and those which continue to be created by many systems today); no more, and no less. Strict (aka "S") is viewed as the proper place for future innovation around OOXML.

Progress on this topic is (for me) frustratingly slow – ah! the perils of the consensus forming process – but some pathways are beginning to become visible in the swirling mists. In particular it seems there is a mood to issue a statement that the core schemas of T are to be frozen, and that any dangerous features (such as the date representation option blogged about by WG 4 experts Gareth Horton and Jesper Lund Stocholm) are removed from T.

This will go some way to clarifying for users what to expect when dealing with a 29500-conformant document. However, I foresee further work ahead, since within the two variants (Strict and Transitional) there are many sub-variants which users will need to know about. In particular the extensibility mechanism of OOXML (MCE) allows additional structures to be introduced into documents. And so, is a "Transitional" (or "Strict") document:

  • Unextended ?
  • Extended, but with only standardized extensions ?
  • Extended, but with proprietary extensions ?
  • Extended in a backwards-compatible way relative to the core Standard ?
  • Extended in a backwards-incompatible way ?

I expect WG 4 will need to work on conformance classes and content labelling mechanisms (a logo programme?) to enable implementers to convey with precision what kind of OOXML documents they can consume and emit, and for procurers to specify with precision what they want to procure.
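One mechanical ingredient of any such labelling scheme would be detecting whether a document part uses only namespaces its conformance class permits. A rough sketch in Python (standard library only; the allow-list here is hypothetical and nowhere near complete):

```python
import xml.etree.ElementTree as ET

# Hypothetical allow-list: the namespaces a given conformance
# class would permit (illustrative only).
STANDARD_NS = {
    "http://schemas.openxmlformats.org/wordprocessingml/2006/main",
    "http://schemas.openxmlformats.org/markup-compatibility/2006",
}

def foreign_namespaces(xml_text):
    """Namespaces used by elements or attributes that fall outside
    the allowed set - their presence marks the part as 'extended'."""
    root = ET.fromstring(xml_text)
    seen = set()
    for elem in root.iter():
        for name in [elem.tag, *elem.attrib]:
            # ElementTree encodes namespaces as '{uri}localname'
            if isinstance(name, str) and name.startswith("{"):
                seen.add(name[1:].split("}", 1)[0])
    return seen - STANDARD_NS

sample = (
    '<w:document'
    ' xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main"'
    ' xmlns:x="http://example.com/vendor-extension">'
    '<w:body><x:widget/></w:body></w:document>'
)
print(foreign_namespaces(sample))  # {'http://example.com/vendor-extension'}
```

A real conformance checker would also have to honour MCE's Ignorable and MustUnderstand semantics, which is precisely why this deserves standardized treatment rather than ad-hoc tooling.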

WG 5 (Document interop)

WG 5 continues its work with TR 29166, Open Document Format (ISO/IEC 26300) / Office Open XML (ISO/IEC 29500) Translation Guidelines, setting out the high-level differences between the ISO versions of the OOXML and ODF formats. I attended to hear about a Korean idea for a new work item focussed on the use of the clipboard as an interchange mechanism.

This is interesting because the clipboard presents some particular challenges for implementers. What happens (for example) when a user selects content for copying which does not correspond to well-formed XML (from the middle of one paragraph to the middle of another)? I am interested in seeing exactly what work the Koreans will propose in this space ...
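To make the problem concrete, here is a minimal sketch (Python standard library; the synthetic wrapper element is arbitrary) of the well-formedness test a clipboard consumer would need to apply to a pasted fragment:

```python
import xml.etree.ElementTree as ET

def fragment_is_well_formed(fragment):
    """Wrap the copied fragment in a synthetic root and try to parse:
    a selection cut from mid-paragraph to mid-paragraph will fail."""
    try:
        ET.fromstring(f"<clip>{fragment}</clip>")
        return True
    except ET.ParseError:
        return False

print(fragment_is_well_formed("<p>a whole paragraph</p>"))           # True
print(fragment_is_well_formed("end of one</p><p>start of another"))  # False
```

The hard part, of course, is what to do in the False case: repairing such a fragment (re-opening the elements the selection cut through) is exactly the kind of behaviour a clipboard interchange standard could usefully pin down.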

WG 6 (ODF)

Although I had registered for the WG 6 meeting, I had to take the Eurostar home on Thursday and so attempted to participate in Friday's WG 6 meeting by Skype (as much as rather intermittent wi-fi connectivity would allow).

From what I heard of it, the meeting was constructive and business-like, sorting out various items of administrivia and turning attention to the ongoing work of maintaining ISO/IEC 26300 (the International Standard version of ODF).

To this end, it is heartening to see the wheels finally creak into motion:

  • The first ever set of corrigenda to ISO/IEC 26300 has now gone to ballot
  • A second set is on the way, once a mechanism has been agreed for re-wording those bits of the Standard which are unimplementable
  • A new defect report from the UK was considered (many of these comments have already been addressed within OASIS, and so fixes are known)

Most significant of all is the work to align the ISO version of ODF with the current OASIS standard so that ISO/IEC 26300 and ODF 1.1 are technically equivalent. The National Bodies present reiterated a consensus that this was desirable (better, by far, than withdrawing ISO/IEC 26300 as a defunct standard) and are looking forward to the amendment project. The world will, then, have an ISO/IEC version of ODF which is relevant to the marketplace while waiting for a possible ISO/IEC version of ODF 1.2 – as even with a fair wind this is still around two years away from being published as an International Standard.


I'll update this entry with links to documents as they become available. To start with, here are some informal records :-)