STC2:Coords Proposed Recommendation: Request for Comments #2


NOTICE: This RFC page replaces RFC#1

Why RFC #2

Rationale for a second RFC round:

  • Many comments were collected during RFC #1. Some required only text improvements, but others implied significant model changes, especially for Coords.
  • A global RFC response was sent to the WG mailing list in February 2020: answer.
  • These data models should support the description of datasets and DAL responses by defining fundamental elements which are commonly used in many contexts. The intent is that they be imported as components into more complex models such as CubeDM or Mango, so that these all build from the same basis, thereby enhancing interoperability.
  • They should NOT be expected to fully support any use-cases outside of the described set. For example, they cannot currently support the complex error types found in various Catalogs. These use-cases are to be considered in future updates to the models.
  • Measurements cannot be used without Coords, since the latter is imported by Measurements; Coords, however, can be used in other contexts, e.g. Transform.
  • The nature of the changes in Coords, which in turn impact Measurements, led the TCG to decide on 2020-10-08 to hold a second RFC round for both models.

Summary

Version 1 of STC was developed in 2007, prior to the development and adoption of vo-dml modeling practices. As we progress to the development of vo-dml compliant component models, it is necessary to revisit those models which define core content. Additionally, the scope of the STC-1.0 model is very broad, making a complete implementation, and the development of validators, very difficult. As such, it may be prudent to break the content of STC-1.0 into component models which, as a group, cover the scope of the original.

This effort will start from first principles with respect to defining a specific project use-case, from which requirements will be drawn, satisfied by the model, and implemented in the use-case. We will make use of the original model to ensure that the coverage of concepts is complete and that the models will be compatible. However, the form and structure may be quite different. This model will use vo-dml modeling practices, and model elements may be structured differently to more efficiently represent the concepts.

This model describes the Coordinates model and covers the following concepts.

  • Description of single and multi-dimensional coordinate space, and coordinates within that space.
  • Coordinate Frames, providing metadata describing the origin and orientation of the coordinate space.
  • Definition of simple domain-specific coordinate types for the most common use cases.
  • Coordinate Systems, a collection of coordinate frames.
The latest version of the model and supporting docs:

Implementation Requirements

(from DM Working group twiki):

The "IVOA Document Standards" standard has a broad outline of the implementation requirements for IVOA standards. These requirements fit the higher-level standards for applications and protocols better than data models themselves. At the Oct 2017 interop in Trieste, the following implementation requirements for Data Model Standards were agreed upon, which allow the models to be vetted against their requirements and use cases without needing full science use cases to be implemented.

  • VO-DML models must validate against schema
  • Serializations which touch each entity of the model. These serializations may be 'fake' (i.e., not based on actual data files), and are to be produced by the modeler as unit tests/examples.
  • Real world serializations covering use cases, produced by others following the model, in a mutually agreed upon format.
  • Software which interprets these serializations and demonstrates proper interpretation of the content.

VO-DML Validation:

  • The Coords model was developed using the Modelio UML tool, and exported to xmi format.
  • The xmi model description was then processed using an xslt script into VO-DML/XML format.
  • This VO-DML/XML format file was then validated against the vo-dml schema with no reported violations.
    • the xml file is available in the IVOA document repository, here.

Serializations:

  • VOTable COOSYS
    • this represents a standardized serialization of a Coordinate model SpaceFrame
      • COOSYS => SpaceFrame
      • COOSYS.system => SpaceFrame.spaceRefFrame
      • COOSYS.equinox => SpaceFrame.equinox
      • COOSYS.epoch => would map to epoch of a particular measurement set, outside the scope of SpaceFrame
      • NOTE: COOSYS lacks the 'refPosition' present in SpaceFrame; this is on the list as a probable enhancement to COOSYS
  • VOTable 1.4: TIMESYS
    • this is, similarly, a standardized serialization of the Coordinate model TimeFrame
      • TIMESYS => TimeFrame
      • TIMESYS.timescale => TimeFrame.timescale
      • TIMESYS.refposition => TimeFrame.refPosition
      • TIMESYS.timeorigin => TimeOffset.time0; centralizing this information high in the serialization
  • Example serializations:
    • Annotated VOTables:
      • all model elements as VOTable files annotated with the VODML Mapping Syntax (WD:20170323), produced by the Jovial software package.
        • coordinates model elements: here
          • includes xml, and jovial dsl files
        • measurement model elements: here
          • includes xml, and jovial dsl files
        • transform model elements: here
          • includes xml, and jovial dsl files
    • Various Formats:
      • independent Python code generated example serializations spanning all elements of the models in 4 formats:
        • *.vot: VOTable-1.3 standard syntax
          • Validates using votlint
        • *.avot: VOTable-1.3 annotated with VO-DML/Mapping syntax
          • Validates using xmllint to a VOTable-1.3 schema enhanced with an imported VO-DML mapping syntax schema
        • *.xml: XML format
          • Validates against the model schema
        • *.xxx: An internal DOC format
          • XML/DOM structure representing the instances generated when interpreting the templates.
      • measurement model elements: here
      • coordinates model elements: here
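
The COOSYS/TIMESYS mappings listed above amount to a simple attribute translation. The sketch below illustrates them in Python; the dataclass names and attributes follow the model (SpaceFrame.spaceRefFrame, TimeFrame.timescale, etc.), while the helper functions and dict inputs are illustrative assumptions, not part of any standard API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpaceFrame:
    spaceRefFrame: str             # e.g. "ICRS", "FK5"
    equinox: Optional[str] = None  # e.g. "J2000.0"; note COOSYS has no 'refPosition' counterpart

@dataclass
class TimeFrame:
    timescale: str    # e.g. "TT", "TDB"
    refPosition: str  # e.g. "BARYCENTER"

def coosys_to_spaceframe(coosys: dict) -> SpaceFrame:
    # COOSYS.system -> SpaceFrame.spaceRefFrame; COOSYS.equinox -> SpaceFrame.equinox.
    # COOSYS.epoch belongs to a particular measurement set, so it is deliberately not mapped here.
    return SpaceFrame(spaceRefFrame=coosys["system"], equinox=coosys.get("equinox"))

def timesys_to_timeframe(timesys: dict) -> TimeFrame:
    # TIMESYS.timescale -> TimeFrame.timescale; TIMESYS.refposition -> TimeFrame.refPosition
    return TimeFrame(timescale=timesys["timescale"], refPosition=timesys["refposition"])
```

The asymmetry of the epoch attribute is visible here: it is consumed for TIMESYS-style metadata elsewhere, but never lands on the SpaceFrame itself.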

Software:

A detailed study was performed to determine the compatibility of the Meas/Coords data models with the AstroPy package, a popular Python package with extensive support for space and time coordinates.

In addition, several software packages have been developed which generate/manipulate Coordinates model elements.
  • Jovial: A Java toolset that helps build and generate serializations for VODML compliant data models.
  • Rama: Python package, parses annotation and instantiates instances of model classes. Includes adaptors to AstroPy classes.
  • ModelInstanceInVOT Code: Python package for processing annotated VOTables
  • TDIG: Working project of Time Series as Cube.
    • An effort to enhance SPLAT to load/interpret/analyze TimeSeries data using data annotation
      • the tool was enhanced to use new annotations (eg: TIMESYS, UTypes) to identify and interpret the data automatically.
    • Delays in settling on a standard annotation syntax have hindered this project from fully realizing the possibilities. This is a high priority for upcoming work.
  • pyVO: extract_skycoord_from_votable()
    • Demonstrated in Paris, this product of the hack-a-thon generates AstroPy SkyCoord instances from VOTables using various elements embedded in the VOTable.
      • Interrogates a VOTable, identifies key information and uses that to automatically generate instances of SkyCoord.
        • UCD: 'pos.eq.ra', 'pos.eq.dec'
        • COOSYS.system: "ICRS", "FK4", "FK5"
        • COOSYS.equinox
      • The COOSYS maps directly to SpaceFrame, with the value of system mapping to SpaceFrame.spaceRefFrame.
      • The UCD 'pos.eq' maps directly to meas:EquatorialPosition; with 'pos.eq.ra|dec' identifying the corresponding attributes (EquatorialPosition.ra|dec) as coordinates coords:Longitude and coords:Latitude.
      • This illustrates that even with minimal annotation, this sort of automatic discovery/instantiation can take place. With a defined annotation syntax, this utility could be expanded to generate other AstroPy objects very easily.
  • AST: Starlink's library for handling World Coordinate Systems
    • A project to exercise the Transform model, which includes pulling in elements from the Coordinates model to define the 'Frames' on either end of the transform.
      • Uses YAML serializations, annotated to the model elements
      • Currently includes the CoordSpace, PixelSpace, and SpaceFrame elements of the Coords model.
      • There is a high level of correlation between the AST Frame objects and the Coordinate model elements.
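
The discovery step behind extract_skycoord_from_votable can be sketched without any VOTable machinery. The following is a minimal illustration only — the field/COOSYS dicts and the find_sky_columns helper are assumptions for this sketch, not the pyVO API:

```python
# Identify the RA/Dec columns and the reference frame from minimal annotation,
# mimicking what pyVO's extract_skycoord_from_votable() does before it builds
# an AstroPy SkyCoord.
KNOWN_FRAMES = {"ICRS", "FK4", "FK5"}

def find_sky_columns(fields, coosys):
    """fields: list of dicts with 'name' and 'ucd'; coosys: dict with 'system'."""
    ra = next(f["name"] for f in fields if f.get("ucd", "").startswith("pos.eq.ra"))
    dec = next(f["name"] for f in fields if f.get("ucd", "").startswith("pos.eq.dec"))
    frame = coosys.get("system")
    if frame not in KNOWN_FRAMES:
        raise ValueError(f"unsupported frame: {frame!r}")
    # With AstroPy available, one would now build the coordinate objects, e.g.:
    #   SkyCoord(ra=table[ra], dec=table[dec], unit="deg", frame=frame.lower())
    return ra, dec, frame
```

This is the sense in which the UCDs 'pos.eq.ra|dec' identify the EquatorialPosition attributes and COOSYS.system supplies the SpaceFrame.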

Validators

As noted above, the serializations may be validated to various degrees using the corresponding schema:

  • VOTable-1.3 using votlint: verifies the serialization complies with VOTable syntax
  • VOTable-1.3 + VODML: verifies the serialization is properly annotated
  • XML using xmllint with model schema: verifies the serialization is a valid instance of the model.
  • NOTE: The modeler examples undergo all levels of validation, showing that the VOTable serializations are also valid instances of the model.
I don't believe there are validators for the various software utilities. Their purpose is to show that given an agreed serialization which can be mapped to the model(s), the data can be interpreted in an accurate and useful manner.

Usage

In the period since the close of the RFC2 review, a great deal of effort has been made to illustrate the usability of the Meas/Coords models in the context of real world scenarios. Each has confirmed the usability of the data models, and illustrates how annotating data to models can facilitate interoperability.

These include:

  • Data Model Workshop - May 2021
    • This Git repository contains original implementations from all participants.
  • DM Case Implementations
  • AstroPy Wrapper
    • Using an AstroPy wrapper in the ModelInstanceInVOT code (see Software)
    • This Git repository holds case implementations
      • Meas/Coords model elements are mapped in VOTable
      • parser interprets annotation to generate model instances, and converts them to SkyCoord instances.
      • Threads:
        • Extract positions, parallax and proper motions from ESAC archive; generate 3D plot of source positions
          • using direct Measurements model instances
          • using converted AstroPy SkyCoord instances
        • Identify annotated source positions and reconcile the coordinate frames.
        • Extract observation history of a source from ESAC XMM TAP archive, track source movement over 20 year period.
    • Examples
    • Notebook
  • ADASS 2021 BoF - TAP and the Data Models
    • This BoF discussed the possibility and benefits for TAP services to apply on-the-fly annotation of the query responses to serve not only the data, but real model instances.
    • Annotated TAP responses can be consumed by software such as that described below, to interpret the content in terms of IVOA data models, greatly enhancing the interoperability of manipulating query responses from various services in science threads.
    • Conclusions of the BoF include: "This session and the following discussions highlighted that TAP services can already serve hierarchical data and that serving legacy data with annotations or even Provenance instances is within our reach."
    • Resources

Links with Meas

The Coordinates model is a base component, primarily used to support the development of other models. Most significant is its support of the Measurement model, which is the core model of interest for most use cases. Information about the relation to that model, how the use-case requirements divide up, etc. can be found on the STC2 page.

Comments from the IVOA Community during RFC/TCG review period: 2020-10-26 - 2020-12-07

The comments from the TCG members during the RFC/TCG review should be included in the next section.

In order to add a comment to the document, please edit this page and add your comment to the list below in the format used for the example (include your Wiki Name so that authors can contact you for further information). When the author(s) of the document have considered the comment, they will provide a response after the comment.

Additional discussion about any of the comments or responses can be conducted on the WG mailing list. However, please be sure to enter your initial comments here for full consideration in any future revisions of this document.

Comments by Markus Demleitner

(1) I'm really unhappy with what the document gives as "use cases". To me, a use case says something like "User is doing X, client can do Y because of what we're specifying here". Only from something like that one can reliably derive requirements -- and avoid discussions like "should features A and B be in there at this moment?" Contrast this with "exercise the transform model" -- I can derive anything or nothing from that. Can we have use cases that actually describe uses, as in "Bring two catalogues on the same epoch" or "Overplot objects from a catalogue on a calibrated image"?

  • The primary driver for this version of the measurements and coordinates models is to support describing the main concepts and content of N-Dimensional cube data. This does not include usage threads such as the ones you describe, which would be relevant to the development of a "Catalogue" model. These models establish the framework for describing the data structure and content, which will facilitate a wide variety of usage threads. The first order 'action' is always to serialize, list and/or display the content of the data product. From there, if we find certain actions are not supported, these become specific usage cases to refine the model. -- MarkCresitelloDittmar - 2021-10-26
  • Additionally, the various usage exercises performed since the close of RFC2 (described above) illustrate the usability of the model in the context of complex datasets (Cubes, TimeSeries, Mango) and the ability to use model-aware code to find and manipulate these model instances. -- MarkCresitelloDittmar - 2022-02-15
(2) As in the STC1 days, I still disagree quite strongly with treating JD, MJD, and ISOTime at the modelling level. It's true that ISO time is a bit special in that it's not just a float continuing on and you certainly don't want free offsets on that, but still: that's container-level serialisation, and JD and MJD aren't special in any way over any other sort of floating-point time. Can't we just follow what TIMESYS in VOTable does?

  • This has been discussed in depth. These are needed at the model level to distinguish between types which are serialized the same (eg: as a real), and for non-votable usage (eg: JSON). Defining classes for these is far more efficient and verifiable than trying to convey the dependencies based on semantics (eg: if ucd=time.mjd, then these other things must be satisfied ). -- MarkCresitelloDittmar - 2021-10-26
(3) The cardinal sin of any data model is trying to do too much; since polarisation sticks out as not really related to anything in this model and it's only mentioned in a single requirement the source of which remains unclear: Can't we take it out and save it for later, when we have real use cases?

  • I agree that polarization is conspicuous in the workshop usage. However, this is within the Cube model scope; specifically with Images, where we can and do have a Polarization axis. Since the Cube model is the primary driver, I don't believe that postponing until later is a viable option. -- MarkCresitelloDittmar - 2021-10-26
(4) What would the ~coordSys of a BinnedCoordinate be? Talking about BinnedCoordinate: I don't think it's a good idea to talk about pixels too much here -- most people will encounter pixels as a coordinate system on their CCD frames, and there, they normally don't look at integral pixels, and they very certainly shouldn't use BinnedCoordinates. Come to think of it: When would they use them anyway? Perhaps that's another thing we can postpone?

  • For images, the coordSys would be a PixelCoordSystem describing the image axes.
  • The concepts of Pixel and PixelSpace are integral to the Image case (pun intended). Yes, there is also the 'pixel' unit in the spatial domain, which is continuous and would not use the binned axes. I don't believe this will be a great source of confusion for users.
  • ACTION: I will add an example to the DM use case implementation set involving image data. -- MarkCresitelloDittmar - 2021-10-26
(5) Since all times are offsets (from what's called their epoch in timekeeping), I fail to see why we would have TimeOffset and TimeInstant as separate classes.

  • The TimeInstant types (MJD, JD, ISO) have a fixed/definitive zero point. The TimeOffset type has a user specified zero-point (using a TimeInstant). The separation avoids a recursion. -- MarkCresitelloDittmar - 2021-10-26
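
The separation described in the response can be sketched as a small type hierarchy. This is an illustration only — the class names follow the model, but the attribute choices are assumptions made for the sketch:

```python
from dataclasses import dataclass

@dataclass
class TimeInstant:
    """Base for time types with a fixed, definitive zero point (MJD, JD, ISOTime)."""

@dataclass
class MJD(TimeInstant):
    date: float  # days since the MJD epoch, 1858-11-17T00:00:00

@dataclass
class TimeOffset:
    time: float         # the offset value, in some time unit
    time0: TimeInstant  # user-specified zero point; typed as TimeInstant, never
                        # TimeOffset, so the type system itself rules out recursion
```

Because time0 is a TimeInstant rather than a generic time type, an offset cannot be defined relative to another offset, which is exactly the recursion the separation avoids.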
(6) Since we know about TimeInstant: Why would Epoch not just use that but rather uses an ad-hoc time serialisation?

  • The TimeInstant and Epoch are serialized and interpreted differently, so they are separate types. e.g. MJD-OBS=59513, EPOCH=B1950.0
  • It has been suggested that Epoch should be within the TimeStamp type. If that becomes the general consensus, moving it into that family would be a minor version update to the model in the future. -- MarkCresitelloDittmar - 2021-10-26
(7) I still think it's a bad idea to consider temperature or flux as "coordinates". If you want "coordinate" to mean anything, it's linked to vector spaces, and neither temperature nor flux make sense as a vector component as such. Why would you want to do that?

  • On the contrary, most physical quantities will be representable by the PhysicalCoordinate type and will not require specialization. Specializations will only be needed when there are important associations which need to be defined.
  • As PhysicalCoordinate-s they are automatically included in the Measure scope, for associating Errors.
  • -- MarkCresitelloDittmar - 2021-10-26
(8) What's the use case for your Axis objects? For instance, is it expected that RA is explicitly marked as a cyclic axis? I see requirements coords.001 and coords.002 referenced, but I, really, still can't tell what a client would do with this information.

  • There are standard coordinate spaces defined which are the default (so one typically will not need to specify the coordinate space).
    • these are defined in terms of the Axis model elements.
    • we've talked of the possibility for users to define and register 'home-grown' standard spaces
  • The Cube model requires support for non-standard spaces:
    • Chip/Detector coordinate spaces: Cartesian axes with fixed domain space
    • MSC coordinates: MSC( Theta, Phi ) are the off-axis angle and azimuth of the photons in a frame fixed with respect to the HRMA optical axis.
  • Also, this is very consistent with the AstroPy 'Representation' concept
  • -- MarkCresitelloDittmar - 2021-10-26
(9) I'm afraid I don't find any of the "reference implementations" terribly convincing. There's far too much "work in progress" or "previous version". At least having one thing where we could see as many of the (IMHO too many) features of this model at work as possible would help mollify my concerns that very little of what is written in the spec has actually been tried out.

  • The workshop implementations should help address that.
  • Several projects have been undertaken to demonstrate the usability of the models in various scenarios (TimeSeries as Cube, for example). It is difficult to maintain all of these through multiple iterations of these core models.
  • ACTION: I will make a sweep of the twiki pages to include the workshop cases.
  • ACTION: I will see about defining/implementing additional cases which utilize the Cube elements more directly (Image, Non-standard coordinate space, polarization)
  • -- MarkCresitelloDittmar - 2021-10-26
-- MarkusDemleitner - 2020-11-25



Comments from TCG member during the RFC/TCG Review Period: 2022-02-24 - 2022-04-10

WG chairs or vice chairs must read the Document, provide comments if any (including on topics not directly linked to the Group matters) or indicate that they have no comment.

IG chairs or vice chairs are also encouraged to do the same, although their inputs are not compulsory.

TCG Chair & Vice Chair

The Coords PR has gone through a thorough review by the TCG and the community (twice), and is complemented by serializations, validators and tools. It updates a central model component of the IVOA architecture using the VO-DML standardised approach. Moreover, it provides a basis for other data models, the Measurements DM not the least of them.
The TCG review and vote have been provided and cast. TCG coordination approves this PR moving on to the REC path for Exec evaluation.

Applications Working Group

Data Access Layer Working Group

Only two very minor comments:

Appendix C:

  • "Standard Reference Position" is repeated as the title for all three vocabularies
Appendix D
  • other RECs have the most recent changes at the top (i.e. subsections in descending order) for ease of use by the reader
Response: Thanks James! I can't believe the vocabulary titles got past so many eyes! These are both corrected in Pull Request #13.

-- MarkCresitelloDittmar - 2022-07-21

-- JamesDempsey - 2022-07-04

Data Model Working Group

After having tested the model on different VizieR tables (see the globals section here), I suggest a modification to the sky coordinates.
Sky coordinates are currently represented by a 3D vector (coords:Point).
The role of the three axes (lon/lat or xyz) is given by the axis descriptions that are part of the coordinate system (coords:SpaceSys).
It would be much simpler to have an abstract class (e.g. Point) with 4 concrete subclasses:

  • The current Point (renamed as GenericPoint?)
  • CartesianPoint (x,y,z)
  • SphericalPoint (r, theta, phi)
  • CelestianPoint (lon, lat) <-- Not sure about the name, could be LonLatPoint.
This wouldn't break anything, since the abstract class Point could be used everywhere in place of the current one.

This keeps the sky coordinates independent of the sky frames.

This change would significantly simplify the data annotation.

  • The suggested changes will be made. -- MarkCresitelloDittmar - 2021-10-26
    • addendum: did not add SphericalPoint, only GenericPoint, CartesianPoint and LonLatPoint. Others may come as needed
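
The adopted hierarchy (per the addendum: GenericPoint, CartesianPoint and LonLatPoint, with SphericalPoint deferred) can be sketched as follows; the Python rendering and attribute types are illustrative assumptions:

```python
from abc import ABC
from dataclasses import dataclass
from typing import Optional, Tuple

class Point(ABC):
    """Abstract base; usable everywhere the former concrete Point appeared."""

@dataclass
class GenericPoint(Point):
    axes: Tuple[float, ...]  # axis roles given by the associated coordinate space

@dataclass
class CartesianPoint(Point):
    x: float
    y: float
    z: float

@dataclass
class LonLatPoint(Point):
    lon: float
    lat: float
    dist: Optional[float] = None  # may be skipped, as its cardinality allows
```

Since annotation targets the concrete subclass, the axis roles (x/y/z vs lon/lat) are carried by the type itself rather than recovered from the SpaceSys axis descriptions, which is what simplifies the annotation.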

Grid & Web Services Working Group

At this stage there are no particular comments from GWS.

-- GiulianoTaffoni - 2022-07-07

Registry Working Group

The Registry standards mention coordinate frames in several places, however the allowed values are defined by controlled vocabularies which are linked from (vs explicitly stated in) this Proposed Recommendation, so it seems ok for us. Otherwise, we don’t currently see any issues brought by this new standard.

Response: Thanks Renaud.. the migration of this model to use the external vocabularies was a big step.

-- MarkCresitelloDittmar - 2022-07-21

-- RenaudSavalle - 2022-07-05

Semantics Working Group

Two spots in the current PR are trouble spots for Semantics:

(1) You shouldn't include the vocabularies (as in appendix C) in the specification -- that'll only tempt people to use exactly what's mentioned there in their software. A brief comment to the effect that software is encouraged to regularly update their lists or so would of course be welcome, as would be comments on the general architecture or quirks (like the old VOTable COOSYS names). The complete list, however, will just be outdated after a while, and people will curse us because we apparently publish contradicting information. Plus, it'll shave off a few more pages, which might lessen people's aversion to take-up.

(2) The vocabulary http://www.ivoa.net/rdf/refframe definitely isn't good for REC yet. In particular:

  • I'm rather sure that we either need to re-define GENERIC_GALACTIC or make SUPER_GALACTIC independent of GENERIC_GALACTIC (did I really do this?)
  • figure out what barycentric was (I fear it's a conflation of ICRS and refpos BARYCENTER -- what do we do then?) It's one of the VOTable legacy terms. Perhaps we're lucky and nobody has ever used it?
  • figure out what geo_app was in VOTable COOSYS -- or drop it. Volunteers for cleaning this up?
-- MarkusDemleitner - 2020-11-25

  • Vocabularies will be removed from the Appendix. I understand 2 may be resolved now, but it should not be a problem to have a v1.0 with a minimally agreed upon set to go with this model. -- MarkCresitelloDittmar - 2021-10-26
    • addendum - Vocabulary details have been removed, leaving only references to the rdf repository.
Data Curation & Preservation Interest Group

Education Interest Group

Knowledge Discovery Interest Group

Radio Astronomy Interest Group

  • This looks like a good data model for the full description of coordinate systems, which is important for radio astronomy data. The Cube data model may be strongly relevant for these data.
  • Like Markus above, I am reluctant to see Epoch as something not part of TimeInstant. Decimal years (with different time origins and year-length definitions, summarized in the initial letter) are another representation of time, different from but with the same status as JD, MJD, and ISO. Let's call it DecimalYear. Then the equinox attribute in the SpaceFrame would have this type. The epoch attribute in CustomRefLocation? Is that the date for which the position of the reference point is given? If yes, then the name epoch is OK and the type would again be DecimalYear.
  • BinnedSpace, PixelIndex, PixelCoordSys: Why shouldn't we call everything "Binned"? Pixel is ambiguous because in general we consider a pixel coordinate to be a real number (at least for transformations). It's strange to have it as an integer here. I understand there is a pixel unit for spatial spaces, but is it really specific to space? In conclusion, I think binned spaces and pixel spaces (and systems + coordinates) should be managed differently.
  • Miscellaneous points: 1) Coords.005 requirement: how is the axis/coordinate association done? By order in the sequence? 2) LonLatPoint.dist and CartesianPoint.x: is there a unit, or is it with radius 1? How do we know these distances anyway? 3) GenericPoint: "spatial coordinates in a custom coordinate space". Does that mean a real volume? Something else?
Response:

  • Epoch: this could be a sub-type of TimeInstant. It has been kept separate in this model mainly because, in my experience, one does not convert an "Epoch" to other representations, ie: "my position is given in 'B1950'... what is the corresponding MJD date?" That would be a compelling reason to have it as a TimeInstant along with the others. If a case like that is worked in the future, this would be a minor-patch type of change, so I think this is a good approach.
    • OK it will be cleaner if we do that in the future -- FrancoisBonnarel - 2022-07-06
  • Binned vs Pixel: This content has been unchanged for quite some time. I do actually think of the term 'Pixel' as an integer; these basically represent the FITS NAXIS and NAXISn, which I consider the image pixel space. I do agree there is a continuous space (floating point values) with unit="pixel", which is a different entity.
  • Misc:
    • 1) Axis association is by order unless otherwise specified (eg: CartesianPoint states which axis, while GenericAxis is only by order). This was more directly associated by a reference at one point, but that proved unpopular and a bit cumbersome in action.
    • 2) LonLatPoint.dist and Cartesian.x unit: These values are Quantities, so have a unit. The UnitSphere space was cut some time ago at the request to minimize the content of the model. I expect it will be restored at some point with additional use cases. At that point, I expect the LonLatPoint can be associated with that space, and the 'dist' attribute would be NULL.
      • We have done this in preparing a derived model for instrument FoV. dist is simply skipped which is allowed by its cardinality -- FrancoisBonnarel - 2022-07-06
    • 3) GenericPoint: It means a point associated with a non-standard space. The example file for this, uses the Chandra MSC coordinate space.
  • -- MarkCresitelloDittmar - 2022-07-05

Solar System Interest Group

Points of confusion for me:

  • Section 2.1.1 (and similar): It is not clear to me what the distinction is between "pixel domain" and "physical domain". After reading the whole document, I suspect that perhaps "pixel domain" is where measured quantities that are NOT physical coordinates (like magnitude, flux, etc.) live, but it is not at all clear to me how to treat non-physical measurements in the context of this model. Perhaps they are not supposed to be - by working back and forth between this and the measurements model, it begins to make some sense in the abstract, at least. If that is the intention, then a clear statement early on describing the relationship between the two models for physical quantities would be helpful to the novice. For example, in the second bullet of section 1.2, a reference to the appropriate subsection for defining the physical coordinate axis, followed by a reference to the Measurement Model for defining value and error, might help to underline the need to consult both models simultaneously for the complete picture.
Response:
The 'pixel domain' pertains to binned image axes. You have an N-dimensional image; the pixel coordinate identifies the cell of the array.

The 'physical domain' covers the physical quantities (those that have physical meaning) that you find as columns in cubes, event lists, and catalogs, or as the physical/WCS axes and values of images. These include position, time, flux, magnitude, temperature, etc.

Section 2 does include a section each describing "Pixelated Image Cube" and "Physical Data (Observables)", which try to convey where each is used. If more is needed, I'd appreciate a suggestion on what/where clarification can be added.

There has been confusion in the past about the line/relation between the Coordinates and Measurements models.
There is certainly a relation between the Measurements and Coordinates models (Meas uses Coords), but one should not need to read Meas to understand Coords. In a nutshell, the Coordinates model defines the 'values' and the domain space in which they live. The Measurements model combines Value and Error to define what we typically see as data/properties in cubes, event lists and catalogs.
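
The pixel/physical distinction can be made concrete with two hypothetical classes; the names echo the model's PixelIndex and PhysicalCoordinate concepts, but the attributes here are assumptions chosen for illustration:

```python
from dataclasses import dataclass

@dataclass
class PixelCoordinate:
    """Pixel domain: an integral index identifying a cell of a binned image axis."""
    index: int

@dataclass
class PhysicalCoordinate:
    """Physical domain: a quantity with physical meaning (position, time, flux, ...)."""
    value: float
    unit: str

# An image axis pairs the two: a pixel index along NAXIS1 maps, through the
# WCS/transform, to a physical coordinate such as a longitude in degrees.
cell = PixelCoordinate(index=42)
longitude = PhysicalCoordinate(value=83.63, unit="deg")
```

Non-physical observables like magnitude or flux sit on the physical-domain side of this divide, which is the point of confusion the response addresses.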

-- MarkCresitelloDittmar - 2022-05-17

  • Section 4.8, item 2:
    • I don't disagree with the statement beginning "It is a bad idea...", but I don't see how quoting the ISO time format for UTC mitigates the problem, especially in light of the lack of a time scale for JD and MJD. The implication seems to be that M/JD should simply be converted to resemble the ISO 8601 format. Is that the intention? Doesn't that amount to "Don't use M/JD at all?", or worse "Don't tell us you used M/JD, just make it look like UTC?"
    • The text refers to the "restricted form of ISO 8601". If this is intended to refer to the subset defined by IETF RFC 3339 (https://datatracker.ietf.org/doc/html/rfc3339), then the format given is problematic - the restricted form does not allow for extended years with signs or more than 4 digits. If it is not intended to refer to that RFC, then it is probably better not to refer to it as "the restricted form of ISO 8601". Something like "Instead, use a subset of the ISO 8601 form ... with no time zone indicators" would be sufficient (and cover the cases of both time zone characters and numeric offsets indicating time zones).
Response: see Git issue #9 for discussion

o bullet 1: I think this is a misinterpretation... I've rephrased/reordered the bullets.

o bullet 2: No, this is the FITS-like time string specification, the same as DALI and PhotDM1.1. I've changed that to reference the ISOTime section, which describes the format.

o I'll note that there was some discussion about using the FITS format rather than ISO 8601 specifically. While it is feasible/easy to support both formats, I think the primary use-case for string formats is still, by far, the FITS convention.

o I'll also note that the document does not directly reference the DALI Timestamp spec (to avoid the extra cross-standard reference), but the text is entirely compatible. This could be added, but for DMs anyway, a better approach would be to add/update the IVOA base types to either specify ivoa:datetime as this format, or extend it to a sub-class which is specified as this format and could be directly used by the models.
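To illustrate the distinction being discussed, here is a minimal sketch of a validator for a FITS/DALI-style time string: date, optional 'T'-separated time, optional fractional seconds, and no time-zone designator ('Z' or a numeric offset). The regex is an illustrative approximation, not the normative grammar; the exact form is defined in the ISOTime section of the document.

```python
import re

# Approximate pattern for the FITS-like time string discussed above:
# "YYYY-MM-DD['T'hh:mm:ss[.sss...]]" with no time-zone designator.
# This is a sketch for illustration only, not the normative grammar.
FITS_TIME = re.compile(
    r"^\d{4}-\d{2}-\d{2}"     # date part: YYYY-MM-DD
    r"(T\d{2}:\d{2}:\d{2}"    # optional time part: Thh:mm:ss
    r"(\.\d+)?)?$"            # optional fractional seconds; no 'Z' or offset
)

def is_fits_time(s: str) -> bool:
    """Return True if s matches the FITS-like time string form."""
    return FITS_TIME.match(s) is not None

is_fits_time("2022-05-17T12:34:56.789")  # True
is_fits_time("2022-05-17T12:34:56Z")     # False: time-zone designator not allowed
```

Note how this also rejects the signed/extended years raised in the RFC 3339 comment above, since only four year digits are accepted.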

-- MarkCresitelloDittmar - 2022-05-17

  • Section 4.8, item 6:
    • The string beginning "Because TDB runs..." is not a sentence and I'm not sure whether the intended statement that all this qualifies is missing, or if perhaps "which" should be "it".
    • There is a reference to "the above cited A&A paper", but no "A&A paper" is mentioned by that designation prior to this point in the text. I suspect this should be replaced by the citation "Rots, et al. (2015)" (or perhaps "Rots and Bunclark et al, 2015", as cited in the opening of section 4.8.)
Response:

o Right; "above cited A&A paper" refers to the "FITS WCS Paper IV". I think that bullet tries to say too much, and I have reduced it to a more generic statement:

"Quite a lot! Complete and accurate Time metadata is extremely important for many IVOA use cases. We strongly encourage a review of the above cited FITS WCS paper when describing temporal data."

-- MarkCresitelloDittmar - 2022-05-17

  • Section 4.13.1: TimeOffset is a real quantity, but does not appear to have an associated unit of measure in the model. This is probably part of my general "units of measure" confusion, but it leads me to wonder: Is there a reason why this is required to be a real number and not an ISO 8601 time interval? Section 4.8 seems to be directing users to use the ISO 8601 format for all types of time measurements, so an ISO interval would seem to be a more natural way of stating an offset, and would avoid the units issue entirely, of course.
Response:

o re: units of measure: TimeOffset.time is a RealQuantity type; the units are included there.

o re: ISO 8601 interval: The vast majority of data we've seen represents time offsets as a real number; it is the natural 'type' for this. The ISOTime type accommodates a very common representation for instants seen in data which is compatible with ISO 8601, but the model is not encouraging adoption of ISO 8601.
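The point about units can be sketched as follows: the offset value is itself a RealQuantity, so the unit travels with the value and TimeOffset needs no separate unit attribute. This is an illustrative sketch, not the normative VO-DML; the class shapes and the `time0` representation are simplified here.

```python
from dataclasses import dataclass

@dataclass
class RealQuantity:
    """Sketch of a real value paired with its unit (VOUnits string)."""
    value: float
    unit: str   # e.g. "s" or "d"

@dataclass
class TimeOffset:
    """Sketch of a time offset: elapsed time relative to a reference instant."""
    time: RealQuantity  # the unit is carried by the RealQuantity itself
    time0: str          # reference instant, simplified to a string here

offset = TimeOffset(time=RealQuantity(1234.5, "s"),
                    time0="2010-01-01T00:00:00")
offset.time.unit   # "s" -- no units-of-measure ambiguity on the offset
```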

-- MarkCresitelloDittmar - 2022-05-17

Typos:

  • Section 2.1.1, "General", first major bullet, third minor bullet: "limitiations" should be "limitations".
  • Section 2.3: The PDF version does not contain a diagram, but rather a "PDF fallback:" block.
Response:

Typo corrected (git PR#11). Will ensure the next PDF includes the Architecture diagram.

-- MarkCresitelloDittmar - 2022-05-17

-- AnneRaugh - 2022-04-20

Theory Interest Group

Time Domain Interest Group

Operations

Standards and Processes Committee


TCG Vote : 2022-02-24 - 2022-04-10

If you have minor comments (typos) on the last version of the document please indicate it in the Comments column of the table and post them in the TCG comments section above with the date.

| Group | Yes | No | Abstain | Comments |
| TCG | X | | | |
| Apps | X | | | |
| DAL | X | | | |
| DM | X | | | |
| GWS | X | | | |
| Registry | X | | | |
| Semantics | X | | | |
| DCP | | | | |
| KDIG | | | | |
| RIG | X | | | |
| SSIG | X | | | |
| Theory | | | | |
| TD | X | | | |
| Ops | | | | |
| StdProc | | | | |


Topic revision: r29 - 2022-08-12 - JanetEvans
 