Where is there an end of it? | Notes on Document Conformance and Portability #4

In my last post I wrote about the reaction to Microsoft's ODF support in the recent service pack released for their Office 2007 product, and in particular how claims of its "non-conformance" seemed ill-founded. Now, to look a little deeper at the conformance question, I will use an XML pipeline to validate some would-be ODF documents, to get a clear-sighted and spin-free look at what the state of ODF conformance really is.

XML Pipelines: The Next Big Thing

For many years pipelines have been recognised as something the XML community badly needed. Eager markup geeks would seek out Sean McGrath or Uche Ogbuji to hear miraculous tales of how XML pipelines could be put to work; some bold experimenters would try to coerce technologies like Apache Ant into action, and some pioneers would even specify and implement their own pipelining languages – witness, for example, Eric van der Vlist's xvif, or maybe XPL, which happily sits at the heart of the awesome Orbeon Forms framework.

Now, however, the W3C is on the cusp of finalising its XProc language, and this looks set to bring pipelines into the mainstream. I am convinced that XProc is the most significant specification from the W3C since XSL, and I fully expect it to become just as pervasive in XML shops.

So what are pipelines? Well, as we know, XML processing can be described as conforming to the model: "in; out; shake it all about". The "in" bit is catered for by XML storage technologies (eXist, maybe), and the "out" bit by web servers; XProc is for the "shake it all about" bit where, together with XSLT, it will become the engine of many an XML process. XSLT is great for transforms, but less convenient for a number of day-to-day things we routinely want to do with XML: validating, stripping elements, renaming attributes, glomming documents together, splitting them up ... Essentially, pipelines are for doing stuff to XML in a step-by-step way, but without the overhead of a full-on programming language, since XProc pipelines are written using nice, declarative XML.

Pipelines and Office Documents

One of these typical "day to day" tasks is validating XML inside ZIPs. ODF and OOXML resources are not simply XML documents, but "packages" (ZIP archives) containing several XML documents. So to perform a full validation, we need to visit all the XML resources in the package and validate each against its governing schema to get an overall validation result. This is exactly the sort of scenario where XML pipelines can help.
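For readers who want to poke at a package without an XProc processor to hand, the same manifest walk can be sketched in a few lines of Python. This is a rough stand-in for the pipeline, not the pipeline itself; it uses only the standard library, and the element and attribute names are those defined by the ODF manifest schema:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# ODF manifest namespace, as used in META-INF/manifest.xml
MF = "urn:oasis:names:tc:opendocument:xmlns:manifest:1.0"

def xml_parts(package):
    """List the full-paths of XML resources named in an ODF package's
    manifest, skipping entries that live inside META-INF itself."""
    with zipfile.ZipFile(package) as zf:
        manifest = ET.fromstring(zf.read("META-INF/manifest.xml"))
    parts = []
    for entry in manifest.iter("{%s}file-entry" % MF):
        path = entry.get("{%s}full-path" % MF)
        media = entry.get("{%s}media-type" % MF)
        if media == "text/xml" and not path.startswith("META-INF"):
            parts.append(path)
    return parts
```

Calling `xml_parts("document.odt")` on a typical text document would return entries such as `content.xml` and `styles.xml` – exactly the set of documents the pipeline below visits.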

A Walk Through

I am going to describe an XML pipeline for performing ODF validation using Calabash, a FOSS (GPL v2) implementation of XProc for the JVM written by Norm Walsh (the XProc WG chair). I'm not going to cover the absolute basics; for those (and more) consult some of the excellent material on XProc already appearing on the web.

We start, immediately after the root element, with a couple of "option" elements. These allow values to be passed in from the outside. In our case, we need the name of the package we want to validate ...

<?xml version="1.0"?>
<pipeline name="validate-odf" xmlns="http://www.w3.org/ns/xproc"
  xmlns:cx="http://xmlcalabash.com/ns/extensions"
  xmlns:mf="urn:oasis:names:tc:opendocument:xmlns:manifest:1.0"
  xmlns:o="urn:oasis:names:tc:opendocument:xmlns:office:1.0">

  <!-- the URL of the package to be validated must be supplied by the caller -->
  <option name="package-url" required="true"/>

  <!-- whether to enforce use of the IEC/ISO 26300 schema -->
  <option name="force-26300-validation" select="'false'"/>

Next we import some extensions. Like XSLT, XProc is designed to be extensible, and additional sets of steps are already becoming available. Calabash ships with a handy step for ZIP extraction which we are going to need.

  <!-- we use the Calabash extension in this library for looking inside ZIP files -->
  <import href="extensions.xpl"/>

Now we start the processing proper. This next step uses the ZIP extraction mechanism to pull the "manifest.xml" document out of the archive, and outputs that XML for onward processing.

  <!-- emits the package manifest -->
  <cx:unzip file="META-INF/manifest.xml">
    <with-option name="href" select="$package-url"/>
  </cx:unzip>

As a sanity check, we are going to make sure that this manifest actually conforms to the ODF manifest schema. I made this schema by manually extracting it from the ODF 1.1 specification (here referred to as "odf-manifest.rng"). As you can see, XProc makes this kind of document validation a cinch:

  <!-- validate the manifest against the manifest schema -->
  <cx:message message="Validating manifest ..."/>
  <validate-with-relax-ng assert-valid="false">
    <input port="schema">
      <document href="odf-manifest.rng"/>
    </input>
  </validate-with-relax-ng>

[Update: I have added an @assert-valid="false" attribute here, as this is just a 'sanity check']

Now we start to visit the individual documents in the package referenced by the manifest. This is done here using the viewport step, which offers a kind of "keyhole surgery" option allowing us to isolate bits of a document. Here we're interested in all the <file-entry> elements in the manifest which (1) have a media type of "text/xml" and (2) aren't residing in the "META-INF" folder itself.

  <!-- visit each file entry in the manifest which targets an XML resource -->
  <viewport name="handle"
    match="/mf:manifest/mf:file-entry[@mf:media-type='text/xml'
      and not(starts-with(@mf:full-path,'META-INF'))]">

For each of these <file-entry> elements, a @full-path attribute specifies the name of an XML resource in the ZIP; again we use the unzip step to pull each of these XML documents from the archive:

    <!-- assume paths are relative to package base, and extract the XML resource -->
    <!-- assume paths are relative to package base, and extract the XML resource -->
    <cx:unzip name="get-validation-candidate">
      <with-option name="href" select="$package-url"/>
      <with-option name="file" select="/*/@mf:full-path"/>
    </cx:unzip>

Once we've grabbed an XML resource, we need to work out which schema to use to validate it. Generally this can be done by looking at a @version attribute on the root element. However, ODF does not make this mandatory and so implementations are free to omit it. ODF specifies no fall-back rules, so we need to invent our own. What I've done here is to use the version specified, but fall back to the most recent published standard (1.1) when it is not specified.

    <!-- emits the RELAX NG schema that corresponds to the ODF version -->
    <choose name="get-relax-ng-schema">
      <when test="$force-26300-validation='true' or /*/@o:version='1.0'">
        <cx:message message="Validating with v1.0 schema ..."/>
        <load href="OpenDocument-schema-v1.0.rng"/>
      </when>
      <when test="/*/@o:version='1.2'">
        <cx:message message="Validating with draft v1.2 schema ..."/>
        <load href="OpenDocument-schema-v1.2-cd01-rev05.rng"/>
      </when>
      <otherwise>
        <cx:message message="Validating with v1.1 schema ..."/>
        <load href="OpenDocument-schema-v1.1.rng"/>
      </otherwise>
    </choose>
    <identity name="the-schema"/>
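The fall-back rule this step implements can be restated compactly. Here is a Python sketch of the same selection logic, for illustration only; the schema file names are the ones used in the pipeline above:

```python
import xml.etree.ElementTree as ET

# ODF office namespace, where the version attribute lives
OFFICE = "urn:oasis:names:tc:opendocument:xmlns:office:1.0"

def schema_for(document_xml, force_26300=False):
    """Pick a RELAX NG schema using the pipeline's fall-back rule:
    honour the root element's office:version attribute if present,
    otherwise assume the most recent published standard (1.1)."""
    root = ET.fromstring(document_xml)
    version = root.get("{%s}version" % OFFICE)
    if force_26300 or version == "1.0":
        return "OpenDocument-schema-v1.0.rng"
    if version == "1.2":
        return "OpenDocument-schema-v1.2-cd01-rev05.rng"
    # version absent or "1.1": fall back to the published 1.1 schema
    return "OpenDocument-schema-v1.1.rng"
```

Note that the "no @version attribute" case is precisely where ODF leaves implementations to invent their own policy, which is why the choice here had to be stated explicitly.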

So now we have the document to validate, and the schema to use. We simply need to apply one to the other:

    <!-- validate the candidate against the schema -->
    <validate-with-relax-ng>
      <input port="schema">
        <pipe step="the-schema" port="result"/>
      </input>
      <input port="source">
        <pipe step="get-validation-candidate" port="result"/>
      </input>
    </validate-with-relax-ng>
  </viewport>
</pipeline>


Et voilà, a complete pipeline for validating ODF instances. Running it against packages which contain invalid XML will cause the pipeline processor to halt and report a dynamic error, for that is the default behaviour of the validate-with-relax-ng step.

Since ODF is clear that invalid XML signals non-conformance to the spec, we know that any package which fails this pipeline is, beyond argument, non-conformant.
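A home-grown check along the same lines can be sketched in Python. Without a RELAX NG validator in the standard library this tests only that each manifest-listed part is well-formed XML, which is the bare minimum the pipeline's validation steps demand (full schema validation would need a library such as lxml, or a processor like Calabash):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# ODF manifest namespace, as used in META-INF/manifest.xml
MF = "urn:oasis:names:tc:opendocument:xmlns:manifest:1.0"

def check_package(package):
    """Well-formedness pass over every XML part named in the manifest.
    Returns a dict mapping part names to True, or to an error string."""
    results = {}
    with zipfile.ZipFile(package) as zf:
        manifest = ET.fromstring(zf.read("META-INF/manifest.xml"))
        for entry in manifest.iter("{%s}file-entry" % MF):
            path = entry.get("{%s}full-path" % MF)
            if entry.get("{%s}media-type" % MF) != "text/xml" \
                    or path.startswith("META-INF"):
                continue
            try:
                ET.fromstring(zf.read(path))   # parse, discarding the tree
                results[path] = True
            except (KeyError, ET.ParseError) as e:
                results[path] = str(e)         # missing part or malformed XML
    return results
```

Any part that comes back with an error string marks the package as non-conformant before schema validation even begins.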

Running It

Rob Weir helpfully provided a ZIP of the spreadsheets used for his Maya's Wedding Planner piece. Consult his blog entry for details of how these documents were produced. Putting these 7 test files through our pipeline we get this result:

Producer                   FAIL    PASS
Google                      X
KSpread                              X
Symphony                    X
OpenOffice                  ? *
Sun Plugin                  ? *
CleverAge                            X
MS Office 2007 SP2                   X
* See update below

So, Why the Failures?

  • Google failed because for some bizarre reason the manifest.xml document in its package specified a document type declaration referring to a non-existent "Manifest.dtd"; the processor cannot find this DTD and aborts with an IO Exception.
  • Symphony failed because its styles.xml document contained a date-value of "0-00-00". This fails to match the datatyping rules the ODF 1.1 schema uses to police date values.
  • OpenOffice failed because its manifest was not valid to the 1.1 schema. Now, this is an odd result, as the manifest claims to be valid to version "1.2" of the ODF schema, yet consulting the latest drafts of ODF 1.2 it appears the manifest schema is not defined there, but is planned to be specified in a new "Part 3" of ODF. I cannot find Part 3 of ODF in draft – maybe the OOo code has been written, but the standards text not fitted to it yet. If somebody can point me to a public draft of this schema, I'd like to re-run this test. [Update: I have now been pointed at the draft of Part 3 of ODF 1.2, and it does indeed contain a new schema. This draft is unfinished and contains no conformance clause, so it is not really possible to know for sure whether a package conforms to it. However, the OOo package here is invalid to the schema. I am going to assume that Part 3 will mirror the draft of Part 1 of ODF 1.2, and so will require schema validity. On that (reasonable) basis this OOo package is non-conformant; but of course the draft might change tomorrow. We do not know quite what version of the spec is being targeted here ...]
  • The Sun Plugin also failed because its manifest uses a @manifest:version attribute which the 1.1 schema does not declare. Again, maybe this is valid to some draft schema I have not seen, but it certainly does not conform to any published version of ODF. As above, if I can get a new schema I can re-run the test. [Update: see bullet above, it's the same here]


There has been a lot of spin in the blogosphere about who is, and who is not, supporting ODF at the moment. This validation test focusses on a small but important area of that discussion: conformance. One of the reasons it is important is that it is testable. From the test above we have the hard fact that most of the mainstream ODF applications are failing to emit standards-conformant ODF, even for a case as simple as "Maya's Wedding Planner". Surprisingly, when assessing conformance it appears KOffice, Microsoft and CleverAge are leading the conformance pack, while Sun, Google and IBM have fallen behind.

To me this merely goes to confirm one of the fundamental dynamics of standardisation: done right, standards wrench "ownership" from those who thought they owned them, and distribute that ownership through the community at large. We, as users, should be applauding the widening adoption of ODF - and should be keeping the pressure on those vendors that seem to have been left behind, to raise their games.

Comments (64)

  • AndréR

    5/17/2009 4:45:38 AM |

    "KSpread, Microsoft and CleverAge are leading the conformance pack; while Sun, Google and IBM have fallen behind."

    Which KSpread version are you running?

  • Alex

    5/17/2009 5:03:16 AM |


    I didn't run it, Rob Weir did. He reports he used version 1.6.3.

  • AndréR

    5/17/2009 7:42:55 AM |

    I was curious, so I installed KSpread 2 RC1. I never took KOffice very seriously, but I must confess that I am very impressed by the progress made since the Fosdem 2008 meeting in Brussels where the KDE developers outlined the work on the next version.

    When you happen to use Ubuntu Jaunty a simple

      apt-get install koffice-kde4

    installs it. For the Windows version there still seem to be compilation problems with Visual C++. In principle KOffice 2.x is supposed to compile and run under Windows and MacOSX, too. This makes KOffice interesting in particular for the Apple platform. Sure, there are only a few developers, but I was equally sceptical about khtml, and WebKit browsers such as Safari and Chromium show that it was worth the effort. The implementation is also potentially useful for future cloud adaptations.
    I tried to open a larger document I wrote with OO 3.1 and there are still some visual glitches with the footnotes (functionality does not seem to be implemented yet), but I am pleased with the results. A new odf form letter template I received today was totally messed up. KOffice 2 RC says the document contains different versions of the document.  Also my Opentype fonts render fine. Loading documents takes a bit long but it looks quite promising.

  • hAl

    5/17/2009 11:39:37 PM |

    Amusing that you used Rob Weir's example files, meant to show MS Office as least compatible, to show that MS Office is actually one of the few conformant ODF implementations, with OOo and Rob Weir's IBM Symphony implementations failing conformance validation.

  • Inigo

    5/18/2009 1:42:37 AM |

    You should also, I think, be able to do it in Calabash with the jar: URL handler rather than the cx:unzip extension step.

  • Alan Bell

    5/18/2009 2:39:43 AM |

    Interesting results, I had already put the OOo and Office SP2 files through the ODF validator and both came back as valid. I didn't try the others, I suspect they would all be reported as valid. I note your issues mostly surround the manifest, I wonder if that is included in the ODF validator. I am a little curious that such issues with the manifests don't cause bigger problems on opening files.
    I totally agree with your conclusions, standardisation should take ownership of the format and the implementations should all be required to raise their game to keep up with and follow improvements and innovations that happen in the standard first. In terms of the end user experience of interoperability I am not sure I would describe any implementation as being "left behind", that is an overgeneralisation as far as I am concerned. From the user perspective I am not sure how anyone would ever detect a non-conformant manifest, however spreadsheet formulas being stripped out is a little more obvious.
    They are all trying to catch up with the standard, and in some cases they may get a little ahead of it in some areas, or in one case dart off at an unexpected tangent. Any conformance testing is done at a point in time and finding issues is the first step in fixing them. It certainly seems healthy to me that the topic of conversation has turned to making ODF and implementations of it better in terms of standards compliance and true interoperability.

  • Alex

    5/18/2009 2:58:45 AM |


    I didn't want to make the post too long, but in fact there's a problem with all these manifests, as I reported here:

    If I'd stuck to the absolute letter of the spec in my pipeline, all these implementations would be non-conformant (arguably). In general you're right: these kinds of smaller non-conformance bugs can get fixed and we can all usefully move forward.

    However, since it was clear a branding iron was in the fire ready for application to Microsoft's implementation (alone) as being "non-conformant", I think it's useful for everybody to have all the facts available to them ...

    And the XProc stuff is interesting too!

  • Alex

    5/18/2009 3:17:58 AM |


    Some information about the Ooo ODF "validator" is provided here:

    Basically, it appears the file that a user submits is quietly amended so that various non-conformant things OOo has done in its various versions are fixed up prior to validation, so that it appears what you're submitting is problem-free.

    Not so much ODF, as faux-DF.

  • Alan Bell

    5/18/2009 5:01:37 AM |

    the info is here
    and I think this is what you are referring to:

    If the test type is conformance test, and if the file is not a formula file, then the sub files content.xml, styles.xml, meta.xml and settings.xml are pre-processed as described in section 1.5 of the OpenDocument specification (that is foreign elements and attributes are removed), and are then validated with respect to the schema of the selected OpenDocument version.

    which I have to say is news to me, I didn't realise it did that. On the strict checking it doesn't do the pre-processing, but still doesn't fail the documents.

    I think a better online validator would be a handy tool.

    The XProc stuff looks quite cool, kind of xpath but more powerful. I think the use cases for things like this are really what XML file formats are all about. This will enable document management systems of various types to dip into the documents they manage and pluck out interesting facts.

  • Alex

    5/18/2009 1:06:09 PM |


    How very intriguing. I wonder what extra it's doing?

    The site seems to be down now, so I can't see :-(

  • Rob Weir

    5/19/2009 10:11:15 AM |

    Alex, where do you read that validity of manifest.xml is required for conformance?  That is news to me.  Section 1.5 defines document conformance and says that conformance requires validity with respect to the OpenDocument schema.  Section 1.4 defines that schema to be that described in chapters 1-16.  This is confirmed in Appendix A.  The optional packing format, including manifest.xml, is described in chapter 17, which the astute reader will observe is not between 1 and 16.  Therefore it is not part of the OpenDocument schema to which validity is required for conformance.

    XProc doesn't help if you don't read the specification more carefully.

  • Alan Bell

    5/19/2009 3:15:59 PM |

    @AndréR I don't know if there is a formal definition of "true interoperability" but I think it means that I don't have to know or care what application was last used in editing a file, I can use whatever application I like to make changes, then pass the file on to someone else without knowing or caring what application they might use to make further edits to it.
    Think of it as interoperability from the perspective of someone who really doesn't understand or care what their computer is doing.

  • Questor

    5/19/2009 4:21:34 PM |

    Rob, if I get your drift right for interoperability an office suite has to meet two out of three:

    1) conform to the required part of ODF 1.1 schema
    2) conform to the optional part of ODF 1.1 schema
    3) conform to OOo ODF 1.2 draft implementation quirks

    What I do not get is why you favor 1 & 3 and not 1 & 2.

  • Alex

    5/19/2009 4:40:19 PM |


    Err, you're reading the wrong version of the spec. The manifest invalidity problem only occurs for the two apps (OOo/Sun plugin) which claim to be using so-called "ODF 1.2"; so for the results here, the conformance section 1.5 of ODF 1.1, that you quote, is not relevant.

    However, to loosen this rule and improve the pipeline for other packages then yes, I should have @assert-valid="false" on the manifest validation for ODF 1.0/1.1 packages. However, it astonishes me that the schema in these specifications is apparently only there for decoration!

    Now, what to do with these "ODF 1.2" packages? I have now been pointed at the latest draft of Part 3, and it does contain a schema. So, I will re-run the test as I stated I would. However, the Part 3 draft is not finished and does not contain a conformance clause. I think it is a reasonable assumption to make that Part 3 will mirror Part 1, and require schema validity and outlaw the use of extensions in the ODF namespaces - and on the basis of that assumption the results are unchanged.

    Do you believe any of the documents are conformant, which I have labelled as being non-conformant?

  • Alex

    5/19/2009 6:06:25 PM |


    I've just run the MS spreadsheet through the validator and it complains about a missing "settings.xml" file in the package ("error"). By my reading of the spec this isn't a legitimate conformance test and so can be safely disregarded ...

  • Luc Bollen

    5/19/2009 7:50:08 PM |

    @AndréR, @Patrick Durusau: Mr Durusau's post should have been written one year ago, when Microsoft announced support of ODF in SP2.  At that point in time, all the optimistic statements of Mr Durusau were valid.

    Unfortunately, it turned out that the aim of Microsoft was not to spot underspecified parts in order to improve the standard, but rather to exploit them in order to break interoperability under the cover of "strict conformance".  It seems that Mr Durusau never heard about the Embrace, Extend and Extinguish tactic used so many times by Microsoft (Java, HTML and now ODF being some famous examples).

    Mr Durusau wrote: "Being a member of the ODF community involves asking questions about possible lack of clarity or missing features in current or prior versions of ODF. In a forum where they can be usefully discussed. Some people prefer to blog on such topics but personally I would suggest joining in the discussions at the ODF TC at OASIS." I asked to Doug Mahugh: "Did you ever submitted this issue (how to have interoperable formulas in ODF 1.1 implementations) to the OIC TC ?  It seems to me a very good place to discuss this subject."  His reply was: "The ODF TC actually is not a very good place to discuss this subject. [...]"

    I think this tells enough about the real intentions of Microsoft.  As Rob Weir wrote: "Of course, being useless might be the intent here. Didn't the Germans in WWII hatch the plan of counterfeiting the British currency so they could air drop it over the UK to destabilize their economy? The same idea works with document formats. If you have the monopoly on silver coins, then you mint millions of fake gold coins."

  • Alex

    5/19/2009 8:14:28 PM |


    Everyone's known for yonks that ODF didn't specify spreadsheet formulas, and everyone's known for yonks that OpenFormula is slated to address this in ODF 1.2.

    Are you suggesting MS should have proposed an alternative to OpenFormula?

  • Rob Weir

    5/19/2009 8:31:31 PM |

    Alex, you can assume whatever you want, but no text supports your assumption, neither the ODF 1.0, ODF 1.1 nor the draft ODF 1.2 specifications.  It seems you like to ignore requirements in order to defend Microsoft, while at the same time inventing requirements in order to criticize the other vendors.  Curiouser and curiouser...

    In any case, if you have a specific idea for a Part 3 conformance clause, then send it along to the office-comment list.  The conformance clause tends to be the last part we write, so there is still ample time to make proposals.

  • Rick Jelliffe

    5/19/2009 8:40:17 PM |

    I would be interested to see the results apart from any manifest.xml issues and 8601 year problems. As you say, these are quite trivial. What would be more interesting and compelling would be answers to questions like "Are any non-ODF elements, attributes or list values used?" and "Are any required elements or attributes missing?"

    (And, to be tedious, knowing only the first validation error (as grammar-based validation tends to report) really does not tell us what we need to know. We need to get a broad view and count of all the validation errors.)

  • Alex

    5/19/2009 9:02:37 PM |


    > no text supports your assumption

    Incorrect: there is the statement in the Part 3 draft that "Part 3 (this document) defines a package format to be used for OpenDocument documents".

    Or are you asserting that the schema is not part of that "definition"?

    Anyway, one needs to assume something when trying to assess conformance to an unfinished draft. I have plainly stated my assumption, and I'll make a prediction for you: the final conformance clause of 1.2 Part 3 will be in accord with what I have written.

    As to Microsoft, as you well know I have been happy to point out invalidities in their product's documents where they exist. It's not my fault if Symphony's got a bug in it (nothing to do with the manifest, mind you). Get it fixed!

  • Alex

    5/19/2009 10:07:13 PM |


    Apart from the manifests, the OO/Sun Plugin package docs conform at least to the recent draft version of the 1.2 schema I happen to have. Apart from this:

    - For the Symphony document, it appears the date problem is the only problem.

    - For Google, there is a bunch of problems: 6 x @svg:font-family attributes used invalidly in styles.xml, and over 2000 errors for content.xml, most (but not all) of which are for the use of non-NCNames for style names (which the schema mandates).

  • Luc Bollen

    5/19/2009 10:34:20 PM |

    @Alex: "Are you suggesting MS should have proposed an alternative to OpenFormula?".  Quite funny. This is exactly what MS did (they imposed an alternative to both OOo formulas and OpenFormula), but not what I suggested.

    You can find my suggestion to Doug in the comment I made to his post on May 14, 2009 4:19 AM: "[...] as Microsoft decided to break the interoperability solution used for a long time by ALL the ODF implementations (including the so called CleverAge add-in), it would have been appropriate to bring the subject to the OIC TC, explain the reason of your choice, and at least reach a common view on the impact of this choice.  This would have been more appropriate than discussing it with Rob Weir during a DII workshop."

  • AndréR

    5/19/2009 10:55:53 PM |

    @Luc: I understand your views and I didn't endorse Durusau's views. But what I observe is that your message is not well-developed yet, have a look at Jeremy Allison's comments that are very comprehensible. He points to Posix compatibility as an analogy, -- where the followup happened in competition law fora, not standardisation bodies.

    Slavish conformance is of course not necessarily a contribution to "interoperability". I would suggest that a contra-interoperability "strict conformance" would constitute a competition law concern which does not relate to the standard. It looks like the US authorities somehow have an open bill with Microsoft. So I expect the next investigations to go into office exchange document formats, first in the US but also in Europe. I am still surprised that what I perceived as a PR stunt of Hakon resulted in a real investigation in Europe; he presented that in Geneva.

    I am sure that a criticism of the open document format will lead to its improvement. This has nothing to do with the competition sphere.

    As a flanking measure we will also have the interoperability frameworks and regulations strategic debate around the world. In Europe the EIF2 is expected soon. Found this document online where I argued against the premature Gartner EIF2 draft:
    Of course the Gartner proposal was obsolete even when it was delivered. We will get a full fledged interoperability program, with a lot of parallel work and backup going on.

    I fully agree with the Semic good practices study and the somewhat controversial remark:
    "Given the numerous different economic, legal, and cultural backgrounds of the Member States [of the EU], achieving interoperability without standards is the most efficient solution."

    Interoperability != standardisation.

  • Rob Weir

    5/19/2009 11:34:36 PM |

    Alex, your post makes it clear that you were testing the ODF 1.1 schemas.  You concluded "From the test above we have the hard fact that most of the mainstream ODF applications are failing to emit standards-conformant ODF".  This is a bold (and false) claim, not backed up by the text of the ODF 1.1 standard, as I have indicated.

    You misread the standard.  But rather than admit and accept that, you are now trying to redeem your post by a back-up argument, which tries to apply your interpretation and prediction of what Part 3 of ODF 1.2 will say when it is eventually approved.  But this text has not even reached Committee Draft stage in OASIS.  It is merely an Editor's Draft at this point.  Are you really going to persist in making bold claims like this based on an Editor's Draft, which has not been approved, or in fact even reviewed by the ODF TC, especially when contradicted by the actually approved ODF 1.0 (in ISO and OASIS versions) and ODF 1.1?

    Do you get paid to spread FUD like this, or is it merely a dilettantish pursuit?

  • Luc Bollen

    5/19/2009 11:40:29 PM |

    @AndréR: you did not put a link to Jeremy Allison's post.  Here it is:  This is indeed a very good analysis and explanation of Microsoft's strategy.

    About Patrick Durusau's post: I would say that I agree with his viewpoint, in general. Patrick Durusau comments definitely make sense. What I find disturbing is the timing: posting this text now could appear as an endorsement of Microsoft deeds, without explicitly telling so. Very subtle if this is indeed the intention.

    And I agree that Microsoft behaviour should constitute a competition law concern: in May 2008 they claim that they will provide "practical interoperability", and in April 2009 they deliver "slavish conformance" that works against any interoperability.  The usual dirty trick of Microsoft to try and extend its monopoly.

  • Alex

    5/20/2009 12:33:11 AM |


    No, Rob - you are the one misrepresenting ODF.

    The ODF 1.1 standard states: "[t]he normative XML Schema for OpenDocument Manifest files is embedded within this specification." (s17.7.1)

    How is it anything other than non-conformant, for an XML document to be invalid to a normative schema?

    > This is bold (and false) claim

    No Rob - the claim stands:

    - Symphony's document has a plain error in its styles.xml content
    - Google's documents are lousy with faults
    - OOo's document doesn't even *attempt* to aim at a standard
    - likewise for Sun's plugin

    Therefore it most certainly is a hard fact that most of the mainstream ODF applications are failing to emit standards-conformant ODF. None of the four mentioned above are so doing, by the evidence of the files that you yourself made.

    > Do you get paid to spread FUD like this,
    > or is it merely a dilettantish pursuit?

    *shrug* well, you're evidently showing an increasing appetite for name-calling, innuendo and smears these days. My advice to you would be to stop it, as it does you no credit.

  • Rob Weir

    5/20/2009 1:10:22 AM |

    Alex, as you know, normative clauses include those that state recommendations as well as requirements.  So 'shall' is normative, but so is 'should' and so is 'may'.  Normative refers to all provisions, not merely requirements.  Please refer to ISO Directives Part 2, definitions 3.8 and 3.12.  So normative does not imply "required for conformance".  The conformance clause defines conformance and that clause clearly defines it in terms of the schema excluding chapter 17. Of course you know this all.  I am unable to even imagine that you would be ignorant of basic standards terminology. So why do you persist in intentionally misleading your readers?

  • Luc Bollen

    5/20/2009 1:37:51 AM |

    @Rob: "So why do you persist in intentionally misleading your readers?"

    If I wanted to distract the discussion from the real issue (MS-ODF breaking interoperability with the existing corpus of ODF documents), as if I was as clever as Alex, this is exactly what I would do: find conformance issues in other implementations (real issues or misrepresented ones, I wouldn't care) and blog about them in a way that will spark a heated discussion.

    Have you noticed that neither Doug Mahugh, nor Gray Knowlton, nor Alex are blogging about interoperability ?  The main subject being discussed is now strict conformance rather than practical interoperability.  Bingo !

  • Doug Mahugh

    5/20/2009 1:53:28 AM |

    Luc, the latest post on my blog (and the longest one I've ever written) is 100% about tracked-changes interoperability.  I'm also planning posts on other interoperability topics, coming soon.  Other than formulas, which isn't even in the current version of ODF at all, and has been exhaustively discussed in numerous forums already, is there an interoperability topic you'd like to see covered in more detail?  One that's actually in the ODF standard?

  • Luc Bollen

    5/20/2009 2:02:45 AM |

    Doug, "the latest post on my blog is 100% about tracked-changes interoperability". Indeed. And it is quite an interesting one, but on a peripheral subject, and only to explain (with valid arguments) why you are not interoperable.

    Indeed there is an interoperability topic of interest to me: what will Microsoft do to become interoperable with the existing corpus of ODF files.  And please don't claim that it is not in the ODF Standard: of course, interoperability issues are caused by what is NOT in the standard.  So these are the interesting topics to blog about.

  • Alex

    5/20/2009 2:40:49 AM |


    I'm sorry, but you have veered off into the surreal now.

    If you're maintaining that "the normative XML Schema for OpenDocument Manifest files" is in fact NOT REALLY "the normative XML Schema for OpenDocument Manifest files", then we're going to have to disagree.

  • Rob Weir

    5/20/2009 3:06:17 AM |

    Alex, read more carefully.  I never said that schema was not normative.  What I said is that "normative" is not the same as "required for conformance", as you had been asserting.  But now I think you may be confused on this as well.  Take a look at the ISO Directives references I gave you and see if that clears it up.

  • AndréR

    5/20/2009 3:31:02 AM |

    @Luc: Indeed, and this makes you feel angry?

    OpenXML was ripped apart; now the opportunity is seen to criticise ODF. How? 1) Ridicule conformance and 2) affirm the prior talking point that ODF 1.0/1.1 does not specify formulas (while OpenXML does). Here you find the new sales arguments:
    Let's bet there is more in the pipeline for the catharsis of the OOXML team!

    Now, in my opinion the spec criticism is desirable as it would lead to a better ODF 1.2 and better fidelity between applications. A simple small automated patch for the customers can resolve the spreadsheet prank. A simple competition complaint would achieve that. Right now the argument is used to hassle the chair and flex the muscles inside the committee, and to pretend that ODF was not important. In fact all these resources are spent because ODF broad adoption is no real commercial threat.

    To my perception these messages are of course also targeted at public administrations along the lines above. What still strikes me are the disproportionate resources attached. Here the attempt is made to mimic the past debate and to reverse the communication roles, to blur the scheme. I can tell you why that doesn't work.

    Here is the ODF Alliance link btw:

    I saw your "call to arms" but I would rather say: relax, choose your target, and don't take the flags raised on uninhabited islands too seriously. Also the ODF Alliance operates far too reactively, too defensively. I often observe this with advocacy: once they adopt a reactive work scheme they continue like that, pain-driven campaigning. I prefer clear strategic objectives, in particular when you are on the potentially winning side.

  • Luc Bollen

    5/20/2009 3:56:39 AM |

    @AndréR: Indeed, MS's dirty tricks, hypocrisy and cynicism infuriated me.  But thankfully, you seem wiser than me  Wink

    You are right: let them smile now; we will smile when we see the fine imposed by the EC, and again when Microsoft replaces OOXML with ODF in a future version of Office...

  • Alex

    5/20/2009 4:04:14 AM |


    Something that violates the mandatory normative provisions of a standard is non-conformant. The normative ODF Manifest schema sets rules for elements declared in a certain XML namespace; the faulty manifests in question violate those rules by using those very elements in ways which are schema-invalid.

  • Rob Weir

    5/20/2009 4:59:38 AM |


    Your first sentence is true.  And your second sentence may be true.  But you have nothing that connects them other than a thin gruel of irrelevancy.

    To the point: can you point me to where it is stated that validity to the manifest schema is an ODF conformance requirement?  No, I don't think you can, because such a requirement does not exist.

    That is the curious thing about you, Alex.  You'll take a pedantic reading of the text to defend Microsoft's sham implementation of ODF spreadsheets, to portray something that is patently incompatible and of dubious conformity as being legitimate and even proper. And the next day you toss out sloppy analysis of every other vendor's ODF conformance, of documents which are undoubtedly interoperable: applying the wrong schema versions to the documents, mixing and matching conformance clauses across versions, misusing basic standards terminology, and generally constructing a text by whim.   Very curious.

  • marc

    5/20/2009 5:10:05 AM |

    Alex, you said

    "Incorrect: there is the statement in the Part 3 draft that "Part 3 (this document) defines a package format to be used for OpenDocument documents"."

    I don't understand why you talk about the Part 3 draft of ODF 1.2.

    Can you clarify if your post is about ODF 1.1 conformance or ODF 1.2 conformance?

    Thanks for the clarification.

  • Paul E. Merrell, J.D. (Marbux)

    5/20/2009 5:16:44 AM |

    @Rob Weir: "I am unable to even imagine that you would be ignorant of basic standards terminology. So why do you persist in intentionally misleading your readers?"

    Rob, you owe Alex an apology. You certainly have been placed on notice that your definition of "normative" is questionable and even if it were not, you have not laid out a sufficient case to accuse Alex of "intentionally misleading" his readers.

    You and I have already had the conversation about your interpretation of "normative" and you left it, not with a bang but with a whimper. You led off on the pertinent discussion with the following statement:

    > Strictly speaking, "normative" clauses in a standard define the provisions
    > of the standard. And provisions of the standard include the mandatory as
    > well as the optional requirements.  So the "shall's" and the "should's"
    > are both normative.  

    I said in my first response:

    "You might take up that issue with OASIS:

    "'Normative Statement' – a statement made in the body of a specification
    that defines *prescriptive* requirements on a Conformance Target....

    "4. Normative Statements

    "A specification broadly consists of descriptive text and Normative
    Statements. The Normative Statements define what a Conformance Target
    MUST do to adhere to that part of the specification, and the
    descriptive text provides background information, descriptions and
    examples. ..."

    When you responded, you simply ignored what I had quoted. Instead, you came back, as you do here, with the following statement:

    "You had the disagreement on the meaning of "normative".

    "I'm taking my definition from ISO.  You can see it here in ISO Directives,
    Part 2 "Rules for the structure and drafting of International Standards".

    "Section 3, 'Terms and Definitions' lays out the basic standards
    vocabulary.  3.8 defines 'normative elements' as 'elements that describe
    the scope of the document, and which set out provisions'.  This appears to
    be distinguished from 'required elements' which is separately defined.

    "Annex H of this document, which is declared to be a normative annex, is a
    statement of the "verbal forms for the expression of provisions".  Note
    that it gives a vocabulary for expressing a range of provisions, including
    requirements, recommendations, permissions and possibilities.  Since these
    are all stated to be ways of expressing provisions in a standard, and
    normative elements are defined as those which set out provisions of a
    standard, then it logically follows that normative content of a standard
    includes statements of requirements, recommendations, permissions and

    I replied:

    "I considered and rejected the ISO/IEC definition in developing The
    Interop Glossary as being a non-definition. Here are a few of the
    problems I have with it in relevant regard. The definition states in
    its entirety:


    3.8 normative elements
    elements that describe the scope of the document, and which set out
    provisions ( 3.12)"


    The cross reference to section 3.12 leaves one perplexed as to
    why it was made:


    3.12 Provisions

    3.12.1 requirement
    expression in the content of a document conveying criteria to be
    fulfilled if compliance with the document is to be claimed and from
    which no deviation is permitted

    NOTE Table H.1 specifies the verbal forms for the expression of requirements.

    3.12.2 recommendation
    expression in the content of a document conveying that among several
    possibilities one is recommended as particularly suitable, without
    mentioning or excluding others, or that a certain course of action is
    preferred but not necessarily required, or that (in the negative form)
    a certain possibility or course of action is deprecated but not
    prohibited

    NOTE Table H.2 specifies the verbal forms for the expression of recommendations.

    3.12.3 statement
    expression in the content of a document conveying information

    NOTE Table H.3 specifies the verbal forms for indicating a course of
    action permissible within the limits of the document. Table H.4
    specifies the verbal forms to be used for statements of possibility
    and capability.


    "So does the cross reference to 3.12 refer to the entirety of the
    section or only to a subsection? Given that 3.12.3 includes any
    'statement,' 'normative element' becomes a synonym for 'any statement
    in a standard,' in other words, 'any provision,' rendering the
    adjective "normative" entirely superfluous. E.g., one could quote the
    entirety of Walt Whitman's Leaves of Grass in an electronic document
    standard and it would constitute "provisions" of that standard and
    therefore be "normative elements" of that standard. All notion of
    relevance is lost.

    "Granted, a document may define 'up' to mean 'down' and vice versa yet
    still be rationally interpreted. But one would expect non-ambiguous
    definitions in a circumstance where one departs from the common and
    ordinary meaning of those terms.

    "So too, at least in my mind, in regard to 'normative.' The
    'prescriptive' sense of 'normative' is the only common and ordinary
    sense that might apply: 'serving to prescribe : laying down rules
    or directions : giving precise instructions,' according to Webster's
    Unabridged (Third).

    "Can one still view Mr. Whitman's quoted prose as 'normative elements?'
    I think instead one must begin to wonder if the drafter of the
    definitions was paying more attention to a passing skirt than to his
    work product.

    "This is not to say that I cannot understand your reasoning in arriving
    at the meaning you ascribe. But it does not, in my view, lead to a
    useful result. 'Normative' becomes superfluous.

    "I do believe that the OASIS definition is more useful. But I think
    this is a situation where we'll need to await a decision by a court or
    legislative body before we acquire an authoritative definition.

    "I do apologize for introducing 'normative' into the conversation. I
    knew people ascribe quite different meanings to the term and for that
    reason I should have avoided its usage without providing a definition."

    In your parting response, did you address the merits of what I had said, arguing for example that the definition could still be rescued from ambiguity by pointing to any error in my analysis?  I think not. You turned tail and ran. Here is the entirety of your parting response:

    "Well, nothing in ISO practice is written with a rigor that can withstand
    an adverse reading.  But you are stretching it too far, I believe, to
    suggest that specifications of optional features are not normative.  In
    fact, by that argument, the very ISO Directives, Part 2 (the ISO standard
    for standards), which we were just looking at, would be improper, since
    they label text as being normative even when it includes specifications of
    optional features."

    Did you in any way meet the substance of my analysis? If there be logic in your parting message, it is logic that I am unable to divine. It is a response too reminiscent of a lawyer friend's frequent joke, "If you can't dazzle 'em with your footwork, baffle 'em with your [bovine excrement]."

    Having not responded in any meaningful way to my dissection of the ISO/IEC definition leading to my conclusion that it is a non-definition, you now appear on Alex's blog, citing the same authority as your "proof" that Alex is "ignorant" in his principal area of expertise and was ***"intentionally misleading"*** his readers.

    But at all relevant times you knew that you could not respond on the merits if Alex took the time to write the same analysis I did. I call foul.

    Foul 1: You accused Alex of ignorance and deceit.
    Foul 2: You had no informed basis for those insults.
    Foul 3: You knew you had no informed basis for your insults.
    Foul 4: You have put me to the work of repeating the conversation we already had.

    Shame on you, Rob Weir. The position you took was unprincipled. You are the one who has intentionally misled Alex's readers. You are caught.

    If you are a principled person, you will immediately retract your insults and apologize to Alex Brown for your deceit in as public a manner as you inflicted your deceit. If you do not do so, the undeniable record lies here of a man who is not man enough to take responsibility for his wrongs and apologize.

  • Rob Weir

    5/20/2009 5:37:34 AM |

    Ah, Marbux, what circus is complete without the clowns?

    When Alex uses the term 'normative' one assumes he is using the ISO definition of that term, not the OASIS definition, since he is an ISO member and not an OASIS member.  

    The problem with simply saying normative==required for conformance is that you can't answer the question: conformant to what?  As soon as you have multiple conformance classes and conformance targets (and even ODF 1.0/ODF 1.1 defines document and consumer/producer conformance), you can no longer just speak of a "conformance requirement".  It needs to be a requirement for a specific target and class.  So if, purely for the sake of argument, we said that validity was required for conformance, then how do you determine what conformance targets and classes apply to it?  Only extended documents?  All documents?  Producers and Consumers?  Producers but not Consumers?  Only spreadsheet producers?  Non-extended spreadsheet producers?  The point is that conformance flows down from the conformance clause.  It doesn't percolate up from random normative statements scattered through the standard, not even ones that are stated as requirements.  Conformance requirements need to be invoked from the conformance clause.

    You don't see manifest validity invoked from the conformance clause.  In fact, you see it quite explicitly omitted, whereas validity to the main schema is invoked as a conformance requirement for conformant ODF documents.  What do you lawyers say... ah..yes.. "expressio unius est exclusio alterius": expressing one thing excludes the other.  If there are two schemas and validity to one is explicitly stated as required for conformance, then the presumption is validity to the one not stated is not required for conformance.

  • Luc Bollen

    5/20/2009 5:38:51 AM |

    @Marbux: Wow! So, if I understand your very long post correctly, you are using an ISO "non-definition", applying it to an OASIS standard known to use a different vocabulary, to explain that Alex's explanation is better than Rob's, on which basis you conclude that Rob has no strong basis to cast doubt on Alex's conclusions?  Sorry if I have not understood correctly, but I think you lost everybody long before the end of your post.

    I'm still missing something here: how will all this semantic analysis help to solve Microsoft's non-interoperable mess?

  • Alex

    5/20/2009 5:39:09 AM |


    Well, if you want to maintain the incredible position that a normative schema in a standard has non-normative provisions, then good luck to you.

    We then move into your latter-day mode of Weiresque innuendo.

    Just so you are clear:

    - my position on the wisdom of MS's approach to spreadsheet implementation is equivocal

    - however, I have shown that branding it "non-conformant" was not justified. I'm glad to see you have retreated to a position where you now just call it "dubious"

    - my analysis has shown (using the correct schemas N.B.) technical faults in the Symphony, Google, OOo and Sun plugin documents. You have taken issue with the last two of these, and failed to carry your point.

    You are the one with the tortured readings and misrepresentations of ODF -- topped with a tendency to retreat into vague accusations and colourful rhetoric when you can't argue the points of detail.

  • Alex

    5/20/2009 5:46:52 AM |


    If you look at the pipeline, you'll see it selects *either* the 1.0, 1.1 or a 1.2 (draft) schema according to the version attribute it finds on the root element of the documents it validates.
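    For those without an XProc processor to hand, that dispatch step can be sketched in a few lines of Python. The schema file names below are hypothetical stand-ins for locally saved copies of the RELAX NG schemas; only the version-attribute lookup is the point here.

```python
import xml.etree.ElementTree as ET

OFFICE_NS = "urn:oasis:names:tc:opendocument:xmlns:office:1.0"

# Hypothetical local file names -- point these at wherever the
# OASIS RELAX NG schemas have actually been saved.
SCHEMAS = {
    "1.0": "OpenDocument-schema-v1.0.rng",
    "1.1": "OpenDocument-schema-v1.1.rng",
    "1.2": "OpenDocument-v1.2-draft-schema.rng",
}

def pick_schema(content_xml: bytes) -> str:
    """Choose a schema from the office:version attribute on the root element."""
    root = ET.fromstring(content_xml)
    version = root.get("{%s}version" % OFFICE_NS)
    if version not in SCHEMAS:
        raise ValueError("unrecognised or missing office:version: %r" % version)
    return SCHEMAS[version]
```

The real pipeline then validates the document against whichever schema the dispatch selected.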

    - Alex.

  • Luc Bollen

    5/20/2009 6:00:34 AM |

    @Alex: For a simple ODF user like me, interested not in semantic analysis but in real-life interoperability, the irony is that even with the assumed technical faults in the Symphony, Google, OOo and Sun plugin documents, they are largely interoperable, while the supposedly conformant Microsoft SP2 documents are interoperable with none of the other implementations.

  • Rob Weir

    5/20/2009 6:00:37 AM |

    Alex, there you go again.  ISO Directives quite clearly states that normative provisions define requirements, recommendations or statements of possibility. All requirements are normative statements.  But this does not mean that all normative statements are requirements.  Similarly, when you indicate that a clause is normative or informative, it does not mean that every statement in the normative clause is a requirement.  Normative clauses often contain both requirements and recommendations, often mixed into the same sentence.  I'm sure you've seen that.

    Saying that something is the normative schema is like saying clause X is a normative clause.  It is denoting whether a portion of the standard is normative or informative.  It doesn't mean every sentence in clause X is a conformance requirement.  You need to trace it out from the conformance clause to tell what is a conformance requirement, and for what conformance target and class.  We clearly did that for the OpenDocument schema.  But we clearly did not do that for the manifest schema.  You may not like that, but you cannot fail to notice that a distinction was made.

  • marc

    5/20/2009 9:05:15 AM |

    Thank you Alex.

    Talking about ODF 1.1 conformance, do you note that section 1.4 (Relax-NG Schema) of the ODF 1.1 specification says

    "The normative XML Schema for the OpenDocument format is embedded within this specification. It can be obtained from the specification document by concatenating all schema fragments contained in chapters 1 to 16" ( the schema for the manifest files is embedded in chapter 17 )

    and that section 1.5 (Document [...] conformance) says:

    "Conforming applications either shall read documents that are valid against the OpenDocument schema if all foreign elements and attributes are removed before validation takes place, or shall write documents that are valid against the OpenDocument schema if all foreign elements and attributes are removed before validation takes place."


    Do you read anything about the manifest file there?  

    Where do you read in the ODF 1.1 specification that document conformance includes the manifest schema?

  • AndréR

    5/20/2009 9:53:58 AM |

    @marbux: Who is Walt Whitman?
    does not help me to get the context. It is extremely difficult for me to catch up with Walter Mittys, Whitmans and other American folklore references.

    We have similar "normative" fun with the European Patent Office:


    It reminds me of a TV team that interviewed airport security, concerned that travellers in transit from Switzerland were not searched and were carrying weapons and trophies from Africa. The airport security answered that these flights were checked randomly, but the TV team claimed not to understand what "randomly" means and insisted there were no checks at the airport (which was true in that case, as no one from that flight was checked). When you apply different meanings in different contexts, confusion is guaranteed. And random searches can also mean that no passenger from these flights ever gets checked. Still the airport security can claim they check these flights on a random basis.

    Confusion may arise from the fact that required, valid, conformant, mandatory and lawful are restatements of the same semantic concept "normative" on different levels. So in standards language a similar test is actually applied in different contexts and with different functions. As I understand it, "normative" is just a property marker for a section of a standards document, to be contrasted with "informative".

  • Rick Jelliffe

    5/20/2009 11:42:44 AM |

    I think there are two different issues here: there can be "conforming" and "Conforming" to a standard.

    Small-c conforming to a standard is where your thing accords to all clauses marked normative and which don't have specific "may" or "should" escape terms. This is the type of general conformance I think Alex is talking about. It requires no specific wording in a conformance section.

    Capital-C conformance is where the standard itself creates conformance classes with tests. For example, IS8879 clause 15 has an explicit statement that
    "If an SGML document complies with all provisions of this International Standard, it is a conforming SGML document."   (Goldfarb's gloss is that if a document comes close to conforming with all the provisions, "it is still an SGML document not a conforming one.")  IS8879 then specifies various kinds of conformance: Basic SGML Document, Minimal SGML Document, Variant Conforming SGML Document.

    I think Rob only means capital-C conformance in his comments. Because ODF is frankly underspecified in this area, it should not be a surprise that something cannot be held to be non-conforming against non-existent straw conformance classes. Rob's category is sound; his application of it in this case is fatuous.

    General conformance requirements exist from a clause being part of the normative text. Indeed, it may even be the most common case. General conformance requirements certainly exist in the absence of explicit conformance classes (and may exist even when there are explicit classes).

    In the case of ISO standards, I think any absence of an explicit statement linking conformance to the provisions in the rest of the text is a flaw, but not a serious one (unless the standard uses but does not explicitly define different conformance classes): general conformance is the only workable way to interpret most standards, has in fact been the way that they are interpreted, goes to their very purpose and structure, and the JTC1 Directives are

    Fast-tracked standards like IS26300 are a little different, because they come from different drafting rules. In that case, you would take into consideration drafting conventions from the sponsoring organization too (i.e. OASIS-Open.)

    I think the SGML conformance clauses would provide a good editorial input for the ODF TC to see a range of conformance classes they could support. For example, it gives the difference between a "conforming SGML system" with and without a "validating SGML parser": i.e. an application that can load valid SGML documents correctly but wouldn't necessarily barf on invalid ones, and an application that will validate when loading.

  • Rick Jelliffe

    5/20/2009 12:07:09 PM |

    Alex: Thanks.

    Does anyone have an online validator for ODF documents?  It would make life easier for the developers if validity to the schemas and grammars were trivial to check, I think. It is easy to blame the programmers and say this shows a gap in their tests, but they may not be particularly expert in XML validation.

  • Alex

    5/20/2009 2:27:41 PM |

    @Marc @Rob

    Okay, it seems we have a simple disagreement here. You are maintaining that a provision has to be prefigured in the conformance section to play a role in determining conformance. I am maintaining that normative clauses which mandate something play a role in assessing conformance no matter where they occur.

    To take an example other than the schema, for ZIP packages ODF has this detail:

    "If a MIME type for a document that makes use of packages is existing, then the package should contain a stream called "mimetype". This stream should be first stream of the package's zip file, it shall not be compressed, and it shall not use an 'extra field' in its header"

    This detail is not prefigured in the conformance section. However, I would judge any package which contradicted this provision (by compressing its mimetype stream, say) to be a non-conformant package.
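    As an aside, that particular provision is mechanically checkable without any schema at all. A quick sketch using Python's standard zipfile module; note that zipfile exposes the central-directory copy of the extra field, so a fully strict check of the local header's 'extra field' would need to read the raw bytes itself.

```python
import zipfile

def check_mimetype_stream(package):
    """Return a list of problems with the package's mimetype stream.

    `package` is a path or a file object containing a ZIP package.
    """
    problems = []
    with zipfile.ZipFile(package) as zf:
        infos = zf.infolist()
        # The mimetype stream shall be the first stream of the package.
        if not infos or infos[0].filename != "mimetype":
            problems.append("mimetype is not the first stream in the package")
        for info in infos:
            if info.filename == "mimetype":
                # It shall not be compressed...
                if info.compress_type != zipfile.ZIP_STORED:
                    problems.append("mimetype stream is compressed")
                # ...and shall not use an 'extra field' (central-directory
                # copy checked here; see the caveat above).
                if info.extra:
                    problems.append("mimetype stream carries an 'extra field'")
                break
        else:
            problems.append("package has no mimetype stream at all")
    return problems
```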

    I regard the schema as an equivalent case. It is a *grammar* which makes unambiguous formal provisions about the disposition of XML elements and attributes declared in a certain namespace. If the schema mandates that element x MUST contain element y, and an XML manifest makes use of element x but fails to use element y as mandated, then I say that document is as surely non-conformant as the one which disregards the ZIP format provisions mentioned above.
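    To make that concrete with a toy stand-in (this is not the real manifest schema; the hard-coded rule "x must contain y" merely plays the part of a RELAX NG constraint):

```python
import xml.etree.ElementTree as ET

def violates_mandate(xml_text: str) -> bool:
    """True if any <x> element lacks the <y> child the grammar mandates."""
    root = ET.fromstring(xml_text)
    for x in root.iter("x"):
        if x.find("y") is None:
            return True
    return False
```

    A manifest-like document that uses <x> but omits <y> fails the rule; that is the sense in which a schema-invalid manifest violates a normative provision.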

  • Paul E. Merrell, J.D. (Marbux)

    5/20/2009 7:02:49 PM |

    @AndréR: "
    does not help me to get the context."

    Whitman was a poet. But rather than referring to his Leaves of Grass, I could instead have referred to any other information that has no relevance to a standard beyond the fact that someone placed its text in the text of the standard: quotations from a pornographic novel, for example.

    To illustrate, let's run the analysis again for Rob's latest evasion, his statement above that:

    "Saying that something is the normative schema is like saying clause X is a normative clause. It is denoting whether a portion of the standard is *normative or informative*."

    Please bookmark that statement because the distinction Rob draws in the latter sentence cannot be valid under the definitions he has directed Alex to.

    The ISO/IEC Directives Part 2 definition of "normative element" is: "elements that describe the scope of the document, and which set out provisions. ( 3.12)"

    So we move to section 3.12, which I quoted earlier in its entirety, in hopes of learning what the term "provisions" means.  There we are given three  definitions rather than a single definition. The three defined terms include, as Rob says, "requirement" and "recommendation." But Rob would have us close our eyes to the third, the definition of "statement" in 3.12.3:

    "expression in the content of a document conveying information"

    So now let's return to the bookmarked Rob Weir statement. According to him, "normative" denotes "whether a portion of the standard is *normative or informative*." But we have just been told by the Directives that any "expression ... conveying information" is a "normative element." So Rob's claimed distinction is a figment of his imagination, not what the very "definitions" he directs us to actually say.

    Now let us imagine a provision of ODF 1.2 that states: "Columbus sailed the ocean blue in fourteen hundred ninety-two." Under the defective ISO/IEC definition of "normative element," our hypothesized provision is in fact normative because it is an "expression ... conveying information." ( 3.12.3.) This is true despite our expression having nothing to do with the specification whatsoever beyond the fact that it was included in the specification's text.

    And because of this problem, we may with absolute confidence brand Directives Part 2 section 3.8 as a non-definition.  "Normative elements" is a non-definition. It establishes no boundaries on what the term means other than an "expression ... conveying information." Our nursery rhyme  about Columbus (or Walt Whitman's prose) is normative under the definition.  

    What Rob directs us to as definitive in reality is hopelessly ambiguous. Moreover, it is just as hopelessly at odds with the common and ordinary meaning of "normative," which is "prescriptive," i.e., "serving to prescribe : laying down rules or directions' : giving precise instructions,'" according to Webster's Unabridged (Third), the meaning adopted in OASIS Guidelines for Drafting Conformance Clauses and unambiguously defined there: "The Normative Statements define what a Conformance Target MUST do to adhere to that part of the specification[.]"

    The elasticity with which Rob views the English language is also demonstrated by comparing a statement he made in response to me above with a prior statement. Rob says, "The point is that conformance flows down from the conformance clause. It doesn't percolate up from random normative statements scattered through the standard, not even ones that are stated as requirements."

    Rob says this as though it were a self-proving tautology. But when he drafted the ODF Interoperability and Conformance TC Charter's list of deliverables, he included, "A conformity assessment methodology specification, detailing how *each provision and recommendation* in the ODF standard may be tested for conformance[.]"

    I submit that "each provision and recommendation" in context could be stated more simply as "all provisions and recommendations."

    But more significantly for Rob's just-crafted "percolate up/trickle down" distinction between "conformance" and "normative," his suggestion in the OIC TC Charter that one may test "conformance" for "each provision" in the specification necessarily requires that conformance *both* trickle down from the conformance section and trickle up from other sections, else one could only test for conformance those provisions directly required by the conformance section. (In aid of brevity, I will not address the Charter's preposterous implicit assertion that a recommendation creates a conformity requirement, but again that illustrates the Humpty Dumpty-like elasticity in Rob's approach to the meaning of terminology. See e.g., Rob Weir, Compatibility According to Humpty Dumpty.)

    So in the OIC TC Charter, we have an example of Rob apparently applying the ISO/IEC Directives 3.8-3.12.3 definition of "normative element" as an "expression [that] conveys information" and extending "normative" to encompass "conformance."  Yet he now tells us that "normative==required for conformance" is false.

    Rob cannot have it both ways.

    And once again, as he did with Alex, Rob leads off a response to a principled discussion of the issue with a derogatory personal insult: "Ah, Marbux, what circus is complete without the clowns?"

    That, followed quickly by invoking the legal canon of statutory construction, "expressio unius est exclusio alterius," without explanation of how the canon might be applicable. Chevron, U.S.A. v. Echazabal, 536 U.S. 73, ___ (2002), Slip Op. at 81,  (the proponent of that canon's application has the burden of persuading the court that the canon applies).  Rob simply slips in the phrase as a self-invoking tautology, adorning his latest switch in definitions with a canon of statutory construction he understands so poorly that he did not even attempt to carry his requisite burden of persuasion that the canon applies.

    To which I reply: Adding Latin phrases you do not understand may make you appear learned to those who do not understand the canon thus invoked, but it unmistakably brands you at best as a sloppy scholar in the eyes of those who do understand. I suggest that you avoid that practice in the future.

    I have taken no position to date on the correct meaning of "normative." My point was and is that Rob lacked any reasonable basis for accusing Alex of ignorance and deceit on the basis of the ISO/IEC sections he cited. He was on prior notice that precisely the same definitions were worded as non-definitions and that his position was at odds with the corresponding OASIS definitions.  

    Furthermore, Rob had and has no sensible response to my prior analysis. In his most recent reply, he:

    [i] begins with a derogatory ad hominem attack, the relevance fallacy of *argumentum ad hominem* in its abusive form, ;

    [ii] cites a single legal authority he demonstrably does not comprehend as his only authority;

    [iii] makes no attempt to shoulder his assigned burden of persuasion as to that authority's applicability;  

    [iv] makes no attempt to respond to the merits of my analysis quoted from our prior discussion of the issue;

    [v] changes the subject to a new position inconsistent with his prior statement, without acknowledging that he has taken a prior inconsistent position or addressing why he should not be held to his prior position; and

    [vi] omits any defense of his abusive *argumentum ad hominem* attack on Alex Brown, the offense for which I challenged him to recant and apologize.

    The record of a man who is not man enough to take responsibility for his wrongs and apologize stands unrebutted. Rather than admitting that he cited ambiguous authority as the only basis for his personal attack on Alex and apologizing, Rob simply digs deeper the logical and factual hole that he occupies, embellished with yet another personal attack.

    I pity people who lack the fortitude to admit their errors, apologize, and move on. The inevitable result of clinging to an unsustainable position is but to provide further evidence of the position's weakness and to waste others' time.  

    Here, Rob has attempted to erect an intellectual house of cards on the foundation of a non-definition, personal insults, and other assorted horse puckey, all to divert attention from the merits of a bug report on ODF implementations.

    I do wish that bug reports on the ODF specification and its implementations were received in a principled fashion, without the invective with which Rob typically greets their authors. Bug reports are a Very Good Thing, and Rob's habit of attacking their authors can only discourage bug reports.

    Rob's invective has ramped up of late, although his employment of the ad hominem attack in combination with straw man argumentation, in lieu of responding on the merits to criticisms of ODF, dates back at least to 2007. See e.g., . This escalation of Rob's smoke and mirrors follows on too many years of tawdry IBM tactics, such as its ODF Interoperability disinformation campaign, that are impeding repair of a badly broken specification.

    In my view, Rob and his company need a disincentive for continuation of such tactics. To that end, I have just launched the "IBM: Standards and Double Standards" group on Diigo, . This group will gather bookmarks, inter alia, on Rob's employment of ad hominem attacks in aid of his company's goals, as well as addressing the contrast between IBM's positions on OOXML and ODF.

    I am well stocked with quotes, notes, and links on those and related subjects. Rob may continue on his path of employing the smear tactic to divert public attention from bug reports, but a cumulative record of his resort to the tactic will be made. Hopefully, a centralized collection of his use of the smear tactic will provide him with an incentive to employ the tactic less often. I encourage anyone who has endured Rob's personal attacks to contact me to have deserving  bookmarks added to the Diigo site.  

  • Paul E. Merrell, J.D. (Marbux)

    5/20/2009 7:08:40 PM |

    @luc: You have substituted your own mischaracterization of what I said for what I did say, then you attack your own mischaracterization. That is the straw man argumentation fallacy.

    If you wish me to respond, please address what I did say rather than a mischaracterization of it.    

  • marc

    5/20/2009 11:49:59 PM |

    paul, could you post your comments in the ZIP format? thank you


  • Luc Bollen

    5/21/2009 5:48:15 AM |

    @Marbux: My intent was not to attack your explanation, but to show that such a semantic discussion, while surely very important in front of a court, is very confusing in the context of a blog, and is far from the main issue.

    For day-to-day users, the main issue resulting from the release of Office 2007 SP2 is not the conformance of the various ODF implementations, but their interoperability. So, if there are conformance issues and solving them will improve interoperability, conformance is a valid subject today. But if you simply want to prove, as Alex is doing, that the interoperable implementations are not conformant, and that the non-interoperable one is conformant, sorry, I don't care now.

    The conformance issues will surely have to be discussed (and solved, if confirmed), but *after* the much more important issue, interoperability. So, if you wish to respond to my post, please tell us your view about the best way to ensure interoperability between all the ODF implementations. Distributing good points and bad points in the secondary discussion about conformance will not help.

  • Nobody Real

    5/22/2009 1:36:22 AM |

    I find it humorous that the ODF crowd argues that conformance to the spec is irrelevant, argues various reasons why the spec doesn't apply to them, and argues that interoperability is the only issue.

    The argument is that, since most ODF documents are written to work with OOo, all implementations of ODF should be written to work with OOo.

    This is like arguing that Web browsers should be compatible with IE, W3C specs be damned, because most web sites are written to work with IE.  

    Do you not see how hypocritical that is?

  • Alan Bell

    5/22/2009 6:37:28 AM |

    @Nobody Real
    A crowd can hold more than one opinion. I think the spec should be improved to cover gaps revealed by making multiple implementations, I also think that implementations should be improved to meet the spec.

  • Alex

    5/22/2009 5:34:14 PM |

    @Nobody Real

    Like Alan I'm unhappy with the phrase "ODF Crowd"; we really need to resist those who would like to claim there's some kind of divide based on which standard one supports. Or indeed that "supports" can be used in a kind of sports-contest way.

  • Paul E. Merrell, J.D. (Marbux)

    5/22/2009 9:11:55 PM |

    @luc: The critical flaw in your discussion is your assumed premise that there already are interoperable ODF implementations. I am aware of no evidence that there are and there is plenty of evidence that there are not. E.g., my comment on the following blog post that collects such evidence, complete with citations and links.

    Are you familiar with Microsoft propaganda falsely claiming that OOXML is designed for interoperability? Did you believe it? If not, why your assumption that IBM tells the truth when it claims ODF is designed for interoperability and has interoperable implementations? Are you approaching the situation with the mindset that this is some sort of battle between Good and Evil, with Microsoft classified as the bad guy and IBM as the good guy?  The same economic forces that lead Microsoft to lie about interoperability push IBM to do the same.  

    Although I personally triggered the ODF v. Microsoft Office XML global public debate with my investigative report in early 2005, , by 2007 I had learned that I had failed to check one critical fact: the false assurances I had been given that ODF is an open standard designed for interoperability. Like OOXML, it is nothing of the sort. It is grossly underspecified and does not include the conformity requirements that are essential to achieve interoperability. How one might describe a standard as "open" when more is unspecified than not requires a lapse into fantasy. ODF is a standard in name only.

    Since late 2004, I have been working more than full time to sort out the interoperability mess for office productivity software and I play no favorites. All of the big vendors involved have egregiously misbehaved.  

    IBM's latest excuse for keeping people locked into the code base it uses --- and for keeping the ODF specification dark and mysterious --- is that we must ignore the deficiencies of that specification and engage in application-level interoperability efforts, a sort of group grope toward interoperability, or as summarized in the agenda of an upcoming ODF plug-fest, "[f]rom interop workshops to perpetual plugfests."

    Sounds wonderful on the surface, right? But notice the omission of any intent to fix the interop defects in the ODF specification. How might one enter the ODF market with an interoperable new product after having missed all those plugfests?

    That is precisely why antitrust law forbids such collaborations among competitors not directly aimed at development of a standard. They are anti-competitive, serving only the interests of the entrenched market leaders. See e.g., U.S. Federal Trade Commission and Department of Justice, "Antitrust Guidelines for Collaborations among Competitors" (April 2000) (pg. 2 n. 5):

    These Guidelines take into account neither the possible effects of competitor collaborations in foreclosing or limiting competition by rivals not participating in a collaboration nor the possible anticompetitive effects of standard setting in the context of competitor collaborations. Nevertheless, these effects may be of concern to the Agencies and may prompt enforcement actions.

    There is also the fact that application-level interop can work when there is only a single interoperability target, e.g., the Samba Project's successful reverse engineering of Windows CIFS. But in the many-to-many interoperability situation, the complexity of such a task is nearly impossible to surmount without an application-neutral specification to use as the baseline reference, i.e., a real standard that specifies the conformity requirements that are essential to achieve the interoperability. So from a practical standpoint what IBM would have the world believe is impractical. It's just a hollow excuse for not fixing the ODF specification.

    Hiding behind IBM's call for perpetual plugfests and using as the reference implementation for ODF --- which itself is illegal --- is IBM's desperate desire to keep at the center of the ODF universe, to have the OOo implementation tail wag the ODF "standard" dog. But OOo and its flavor of ODF are anything but application-neutral. I could point you to several ODF 1.2 provisions that were kept in despite the ODF TC being put on notice that they created interop breakpoints for Microsoft Office, not to mention among ODF implementations.

    So with that background, I'll proceed to your question about my view of "the best way to ensure interoperability between all the ODF implementations."  The solution is multi-faceted and includes:

    [i] First and foremost, wresting control of ODF away from the big vendors and putting its control in the hands of government. The big vendors are stalemated; government intervention is necessary to end the stalemate;

    [ii] Confining the big vendors to an advisory role with a healthy dose of skepticism about their claims;

    [iii] Transforming ODF into a real standard, one that is application-neutral and fully specified, without license to claim conformance whilst writing application-specific extensions to the standard or supporting an undefined subset [cluestick: another gaping hole in the ODF spec is the preservation of conforming markup while processing a document generated by another app];

    [iv] Specifying a suitable range of subsetting profiles so that less featureful implementations may interoperate with more featureful implementations;

    [v] Specifying something akin to the OOXML compatibility framework so that less featureful implementations may still intelligibly process documents written to supersetting profiles;

    [vi] Holding big vendors accountable when they direct personal attacks at people who write bug reports rather than fixing the bugs;

    [vii] Educating standard development organization participants on their legal responsibilities as to interoperability;

    [viii] Persuading competition regulators to take a more holistic approach in their antitrust enforcement actions, dropping the focus on single bad actors and  holding  standards development organizations accountable for their failures to implement their gatekeeper responsibilities in regard to interoperability;

    [ix] Recognizing that standards-based interoperability is going to be a long slog in the office productivity sector; and

    [x]  Recognizing and accepting that we've got gargantuan connectivity bugs in the software infrastructure for the emerging Information Society --- ODF and OOXML --- that will not be removed before those specifications are repaired or replaced by a real standard.

    I could list many other things that need to happen, but I think that's a fair sampling that should give you the flavor of my view. I'll add to it that I am pleased as punch that Microsoft implemented formula support the way it did.

    There is no ODF interoperability anyway, will not be for many years, and Microsoft's action has finally forced a public discussion of the ODF Interoperability Myth that IBM has been peddling for so long. Rob Weir has been forced to admit publicly that ODF is underspecified. The spin he attempts to gloss that admission with, blaming Microsoft, is entirely irrelevant. Rob Weir is the ODF TC's co-chairman, not Microsoft. He and that TC are responsible for ensuring that the ODF 1.1 specification gets fixed, pursuant to the JTC 1-OASIS maintenance agreement.  But Rob would have us blame Microsoft rather than fixing the spec.  

    Microsoft has said that it intends to move to OpenFormula when that specification is firmed up. But Microsoft has also thrown down the gauntlet on the issue of whether what constitutes "ODF" should be defined by IBM and Sun's code base rather than by the standard. And that is a topic long overdue for public discussion.

    When ODF 1.2 reaches JTC 1 and its control is transferred to Subcommittee 34, the serious work to remodel ODF into a real standard can begin. But the feature freeze for ODF 1.2 at OASIS went into effect awhile back and specifying the conformity requirements essential to achieve interoperability is not on the list of remaining tasks. Moreover, every committee draft of ODF 1.2 so far has been rife with interoperability breakpoints.

    So in short,  there is and has been an ODF interop crisis for many years. But Microsoft's implementation of ODF 1.1 is not a crisis. It merely illustrates forcefully one facet of how abysmally under-specified the ODF specification really is.

    Am I happy that we have years to go before we achieve interoperability among office productivity programs?  No. In fact I think it's outrageous that we are now 53 years from the date when the word processor was invented and still have no open standard that specifies the conformity requirements essential to achieve interoperability. That's something I want to see happen before I die and I'm in the twilight years of my life.

    But my original article was a snowball I rolled down the mountain that has turned into an avalanche as I hoped. None of the misbehaving big vendors will be able to slow that avalanche for long.  The sufficiency of IT standards is now a major public and governmental issue and it is only a matter of time until both the Microsofts and the IBMs of the world are engulfed by that avalanche. Indeed, it must happen if we are to build a connected world. So if it doesn't happen before I die, I can go to my grave knowing that I set forces in motion to assure that it does happen.

    The major barriers to office productivity software interoperability are anything but technical. The techniques of standards-based interoperability are well understood. The real barrier is anti-interoperability corporate policies and companies' abuse of the standardization process for anti-competitive ends.  Their vendor lock-in tactics embodied in IT standards treat software users like sheep to be shorn rather than respected customers.

    As Alex said a few days ago, "the whole purpose of shoving an international standard up a vendor’s backside is to get them to behave better in the interests of the users."

    If you believe Microsoft is the only company whose management needs an attitude adjustment, you've either wittingly or unwittingly chosen to be part of the problem rather than its solution. I need hear no more IBM talk about ODF interop; I'm still waiting for IBM's first tiny step on the ODF interop walk. But thus far, all I've seen is IBM walking in the other direction, substituting talk for action.

    The interop solution is creating a real standard and shoving it up the backsides of all the big vendors involved. So contrary to your expressed opinion, dealing with what the specification says and conformance with it is a necessary prerequisite to ODF implementation interoperability. You've been bamboozled by IBM's latest excuse for postponing repair of the spec.

  • Andrew Sayers

    5/23/2009 2:57:20 AM |


    I'd like to add another facet to the solution: wrest control of OOo from Sun/Oracle/whoever's controlling it this week.

    In practice, ODF is sufficiently complex that we'll always need a reference implementation.  Therefore, the path of least resistance for a bug report needs to go through the implementation rather than the standard, so that people aren't tempted to declare something a feature just because fixing it would involve learning German[1].

    Although OOo is open source in a legal sense, the practical barriers to contributing seem to defeat most would-be developers.  If people could send patches in as easily as they do with most open source projects, many debates would end quickly with "it's issue number 12345 in OOo's bugzilla, there's a patch and it'll be fixed in the next version".

    Oh, and since I'm re-de-lurking after such a long time, I should take the opportunity to thank Alex for the shiny new blog engine.

      - Andrew


  • AndréR

    5/23/2009 3:03:44 AM |

    @marbux: Do you want the "ODF crowd" to wait for the compound document format (CDF) cavalry?

    I need hear no more IBM talk about ODF interop; I'm still waiting for IBM's first tiny step on the ODF interop walk. But thus far, all I've seen IBM walking in the other direction, substituting talk for action.

    The main difference between IBM and Microsoft is that the stakes for IBM are low while it makes a competitor yodel when it gunpoints a bit at the cash cow. I am sure Rob Weir views that differently.

    Killing the cash cow a bit is a matter of medium-scale investment, and that power translates into the expectation that IBM be nicer and walk the interop talk. IBM does not depend on the cattle and gunmen the other side has in crosshair range, nor on their hostages.

    Don't forget to recommend me better trivial literature. Wink

  • Paul E. Merrell, J.D. (Marbux)

    5/24/2009 11:53:18 PM |

    @Andrew: Yes, Sun is just a bit overdue delivering on that promise to transfer ownership of OOo to a non-profit foundation, isn't it?

    I don't agree on the need for or desirability of a reference implementation, particularly one controlled by a single vendor and maintained as a moving interoperability target. Saying that OOo should define the ODF standard is no different in my book from saying that Microsoft Office should define the OOXML standard.

    The problem is that neither specification is application-neutral in terms of implementability and we can't make either of them application-neutral if they remain dependent on a particular implementation. At least that's the view from here. One really must make a choice between open standards-based interop and application-level interop that can never result in universal interoperability.    

    @AndréR: On CDF, I'm after a real open standard with conforming and fully interoperable implementations. I don't care what its name is. May the best standard win. But at this point, ODF is looking like a loser because the powers that be on the ODF TC refuse to fix it. ODF interoperability was a nice myth while it lasted.

    I disagree that the stakes are low for IBM. Microsoft has been moving aggressively into Notes/Domino's turf on the server side with Sharepoint Server and its interoperable collaboration-ware. IBM claims to still have 140 million licensed seats, but that's way down even if accurate.

    I see ODF as part of IBM's play to save its Notes/Domino lunch by playing off anti-Microsoft sentiment and the popularity of OOo. In my eyes, Symphony isn't much more than free bait for Notes/Domino. And of course Notes/Domino helps IBM sell a lot of hardware.  IBM has its own cash cow in the fight.

  • AndréR

    5/25/2009 6:16:55 AM |

    @marbux: Not bad! Of course I don't agree with your criticism of ODF 1.1.

    Sun is just a bit overdue delivering on that promise to transfer ownership of OOo to a non-profit foundation, isn't it?

    A matter of financial engineering.

    I see ODF as part of IBM's play to save its Notes/Domino lunch by playing off anti-Microsoft sentiment and the popularity of OOo. In my eyes, Symphony isn't much more than free bait for Notes/Domino

    Of course! And it is a cheap investment. So what does this imply for ODF?

    On the ideological front, it is more that Microsoft regards obstruction of interoperability as crucial to its business; it remains rooted in 1995 platform ideology and struggles with the internet. The sentiment is automatically invoked when that commercial strategy clashes with the public interest, common sense or good conduct. The main difference is that IBM is conservative and strategic, while Microsoft is tactical and unreasonably aggressive in preventing the domino effect. Now, when you know that, it is a fantastic opponent for corporate warfare and public policy confrontation. Raise some flags here and there and watch the troops storm the hills like crazy.

    Think of the unreasonable fight against the EIF1 paper that turned it into a monster: Here is the dwarf:

    Think of Opera's cheap IE bundling complaint and the foolish bullying of the competition authority and other recent Lemming operations:

    ...or the standardisation of Open XML. It is always the same pattern.

  • Andrew Sayers

    5/25/2009 10:50:15 PM |


    I'm coming at this from the perspective of the web developer, which is the group I expect to form the core of an office document developer community in the next decade.  As such, my view on a reference implementation is probably a bit unusual around here.

    If I have a question about what should happen when I write part of a web page, I've got two choices: I can spend all afternoon delving through HTML, CSS, DOM, and other specs to work out what the answer should be; or I can run it through a browser and see what the answer actually is.  I know deep down that the former is the right thing to do, but it takes far too long to be practical in real life.  For reasons I'll get into below, web developers generally run their pages through Firefox, making Firefox the de facto reference implementation of HTML.  Of course, this can change (as it's changed away from IE in the past decade), and it's not an absolute (no developer would claim a bug in Firefox is actually a bug in the spec), but it's a very strong rule in day-to-day development.

    The reason Firefox has become the de facto reference implementation of HTML is that it provides good (not necessarily the best) compliance with the spec, and it provides the best debugging features by a mile. I don't see this as a bad thing, because it lets us mortals get into web development without spending a month digesting the spec before writing our first ever line.

    I certainly agree that the reference implementation shouldn't be the target of the standard (IE taught me that in the 90's), but if the standards community doesn't promote a reference, a groundswell of developers will elect one based on their own criteria.

      - Andrew
