Data Models 2: Minutes of questions and discussion, by JCM

Mireille Louys presented the latest work on the Characterization model and Francois Bonnarel presented a possible serialization. The suggestion that Accuracy be made part of Characterization rather than part of the data generated some discussion: Jonathan argued that the errors should be kept close to the data, while Doug pointed out that different levels of noise model may be useful. It wasn't entirely clear what the definition of Accuracy is in this model. Another issue was whether complicated data like exposure maps belong in Characterization or Provenance. We reiterated that Provenance should contain data carrying the instrumental signature, while Characterization should have the stuff that is 'pure' physics. Doug's view is that detailed calibration doesn't belong in Characterization; Jonathan believes that it ultimately does, although of course not for the query-response aspects used in the SSAP protocol.

Francois' presentation of the proposed XML for Characterization included an element, time, defined by restriction on a particular UCD value, and we asked for some examples to clarify this in both VOTable and XML instance form. FB commented that this construct was partly due to the removal of the Frame object from Brian's version of the Quantity model. It was suggested (GL?) that instead of inheriting things from the Quantity model, it should be reused. Arnold warned that restrictions mixed with substitution groups can lead to problems in the resulting code.

Igor presented on 3D IFU data. I note that the concepts of line spread function (as a function of instr.x, instr.y, lambda) and spatial PSF (as a function of mirror.x, mirror.y? different from instr.x, instr.y in the case of image slicers?) came up as something we may need to handle; they fit in Resolution.Spectral and Resolution.Spatial, of course. We should review the Euro3D format and see how it maps to the SED model.
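The position- and wavelength-dependent response functions discussed for IFU data could be modelled simply as callables attached to Resolution.Spectral and Resolution.Spatial. A minimal sketch, with all function names and numerical coefficients invented purely for illustration:

```python
# Hypothetical sketch: an LSF whose width varies with detector position
# and wavelength, and a spatial PSF whose width varies with mirror-plane
# position. All coefficients below are made up for illustration.

def lsf_fwhm(instr_x: float, instr_y: float, lam: float) -> float:
    """Line-spread-function FWHM (Angstroms) at (instr.x, instr.y, lambda)."""
    return 2.0 + 0.001 * lam + 0.01 * (instr_x + instr_y)

def psf_fwhm(mirror_x: float, mirror_y: float) -> float:
    """Spatial PSF FWHM (arcsec) at a mirror-plane position."""
    return 0.8 + 0.002 * (mirror_x**2 + mirror_y**2) ** 0.5

# Spectral resolution R = lambda / FWHM at the field centre:
lam = 5000.0
R = lam / lsf_fwhm(0.0, 0.0, lam)
```

The point of the sketch is only that Resolution.Spectral and Resolution.Spatial may need to carry functions of position (and wavelength), not single scalar values, to cover the image-slicer cases raised in the discussion.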
The issue of curvature/optical distortions on the IFU spectrum was raised. It was commented that the data quality flags record different parameters of the instrumental process and so have a stronger link to Provenance than to Characterization.

In a general discussion on 3D data it was felt that two main approaches cover all the main cases: a 3D data cube (x, y, lambda), and an aggregate of multiple 1D spectra, each tagged with a Support.Spatial region description; and that we could define a mapping between the two representations. In the latter (aggregate) case you of course avoid rebinning in the spatial domain, which makes people happy. We noted that in radio astronomy the former case, simple image-spectral cubes, is common. In the former case you need to deal with the WCS, which we haven't addressed in detail so far in the IVOA. Characterization doesn't include WCS, and this may be good because it can then apply to non-binned data for which WCS does not apply.

We briefly discussed the roadmap for Characterization, including a revised XSD and more instance examples. JCM agreed to provide an event list.

Norman presented work on OWL reasoning engines. The idea is to make different ontologies interoperable by mapping only a few fields in them and deducing further compatibilities. JCM expressed concern that in complicated cases only the lowest leaves on the tree might have counterparts.

Gerard berated us for not fully following UML syntax and diagram rules, and proposed an IVOA standard subset syntax that we should all follow, including bindings to XSD, Java, and databases. One problem he commented on in his scheme concerns cases where an object relationship might point outside a document: e.g., a class might refer to a human, but the object describing the human might be in some external resource. Importing via XInclude (F) or index pointers for external instances (AR) were suggested.
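The mapping between the two 3D representations discussed above (an x, y, lambda cube versus an aggregate of spatially tagged 1D spectra) can be sketched in a few lines. This is a toy illustration, not a proposed serialization: the grid shape is arbitrary, and the "region" tag here is just a pixel index standing in for a full Support.Spatial description.

```python
import numpy as np

# Hypothetical dimensions: a 4x4 spatial grid, 100 wavelength bins.
ny, nx, nlam = 4, 4, 100
rng = np.random.default_rng(0)
cube = rng.random((ny, nx, nlam))  # flux in an (y, x, lambda) cube

# Cube -> aggregate: one 1D spectrum per spaxel, each tagged with its
# spatial location (a real Support.Spatial tag would carry a region
# description, not just a pixel index).
aggregate = [
    {"region": (y, x), "flux": cube[y, x, :].copy()}
    for y in range(ny)
    for x in range(nx)
]

# Aggregate -> cube: lossless only when the tagged regions fall on a
# regular grid; otherwise spatial rebinning is needed (which the
# aggregate form avoids).
rebuilt = np.empty_like(cube)
for spec in aggregate:
    y, x = spec["region"]
    rebuilt[y, x, :] = spec["flux"]

assert np.array_equal(rebuilt, cube)
```

The round trip is exact here only because the spectra happen to tile a regular grid; for irregularly placed fibres or slicer geometries the cube direction of the mapping would involve resampling.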
Gerard pointed out that a problem with using element references is the need to have a top element in the document, which limits reuse. He will compare his STC schema binding to Arnold's and report back. He recommended avoiding restrictions and using extensions instead.

- Jonathan