*IVOA DAL WG Running Meeting #10
Wednesday 28 April 2021 - 20:00 UTC - vconf

Participants: (15) Marco Molinaro, Mark Taylor, Anzhen Zhang, Brian Major, François Bonnarel, Tamara Civera, Judith Silverman, Alberto Micol, Renaud Savalle, Tess Jaffe, Christophe Arviset, Dave Morris, Steve Groom, Theresa Dower, Hendrik Heinl

Agenda:
* Ops/DAL service validation and compliance

*Notes
Slides from MT give the context, in view of the Interop. SCS, SIA, SSA and TAP are considered because they are the easiest to measure. Compliance fails for a variety of reasons; validation tools exist to help out there (even if not for all standards). VO Weather reports are presented at Interops showing the available validation figures, with details on what fails and how many times. The Ops/DAL activity aims to improve the validation reports, to improve the user experience, and to make life easier for client developers.
(slide 6/8) -> actions to improve compliance, on all sides: providers, toolkit developers, validator developers, validator runners.

*Discussion points
Questions to weather reporters, validator authors:
* What are current activities?
* How much user support is offered? How much is taken up?
* Are standards generally validator-friendly? Is there advice for standards authoring?
Questions to toolkit/service developers/operators:
* Are you using validation tools? Are they adequate? What would make them more useful?
* What other approaches to compliance are you using?
Questions to standards authors:
* Are adequate update/erratum/issue-tracking mechanisms in place?
Questions to service users and client authors:
* What are the most annoying compliance issues?
* Are there compliance failures that don't matter?
Questions to all:
* Are weather reports a good tool? Are they testing the right things?
* Are the right validators available? Do they do the right things?
* What else would help?

*Interop Planning
Schedule:
* 1 Ops/DAL session on Validation + Compliance, plus 1 other Ops session
Focus:
* Discuss known/common specific issues?
* Advertise/encourage validator use?
* Wider discussion on approach to compliance?
Possible content:
* Weather reports?
* Short presentations from validator maintainers?
  * What's on offer, how to use it, what level of user support is available
* Repeat/expand discussion from this meeting?
* Look at specific issues?
* Validation hackathon/clinic/demo?

*For reference: Ops Validation/Compliance activities to date
Approaches to improve compliance:
* Target toolkits with multiple deployments
  * Identify widely-used toolkits
    * DaCHS, TAPLib, others?
  * Identify deployments using those toolkits
    * Eventually: SoftID Note
    * Detective work
    * Ask toolkit developers, service deployers
  * Identify issues in toolkits
  * Categorise issues:
    * version-specific -> encourage upgrades
    * toolkit issues -> discuss with developer, note issues
    * deployment issues -> support deployers with help from toolkit developers
* Advertise/encourage use of validators (see the sketch after this list)
  * Offer help with running validators
  * Improve validator ease of use, documentation
  * Make sure support is available
  * Try to ensure comprehensible error messages
  * Error report wiki page/issue tracker?
  * Advertise reference implementations?
  * Provide more locally-runnable validators?
  * Discuss validation in service provider training
  * More engagement on Ops IG mailing list? Slack?
* Run validators, contact operators by email with reports, offer advice to address issues
  * People are usually keen to fix things if contacted, less keen on volunteering
  * Identify deployed services with contact people likely to be responsive?
  * Extend this to more recipients (any self/other nominations?)
* Use/examine bulk validation site data (Euro-VO, VO-Paris, HEASARC)
  * Web GUI and/or underlying DBs depending on site
  * Quite time-consuming, but lots of information there
  * Assistance usually required from maintainers
    * Generally helpful, but support commitment/expectation not always clear
* Identify/investigate common errors
  * Try to identify/address common causes
* Update validator behaviour as required
  * Fix bugs
  * Reconsider validator output
    * New category "wrong-but-harmless"?
    * New category "wrong-but-sensible"?
  * Are service compliant/non-compliant thresholds set appropriately?
* Improve standards as required
  * Errata
  * Clarifications etc. in future versions
  * Develop advice for standards authors?
* Community discussion on ways forward
  * DAL running meeting
  * Interop Ops/DAL session
  * Future Ops telecons?
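By way of illustration for the validator-running items in the list above, a minimal sketch of scripted validator use. It assumes STILTS (which ships the taplint validator) is installed with the stilts launcher on the PATH; the endpoint URL is a hypothetical placeholder, not a real service.

    # Minimal sketch: run taplint (the TAP validator shipped with STILTS)
    # against a TAP endpoint and print its report. Assumes the "stilts"
    # launcher is on the PATH; the URL below is a hypothetical placeholder.
    import subprocess

    TAP_URL = "http://example.org/tap"  # hypothetical endpoint

    result = subprocess.run(
        ["stilts", "taplint", f"tapurl={TAP_URL}"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)

A wrapper like this can be re-run after each deployment change, so regressions are caught before the external validation centres notice them.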
Challenges:
* Chasing and fixing compliance issues is time-consuming
* It's not very sexy, and a bit thankless
* The better you get at validation, the worse the results look

Actions completed/results:
* Service updates:
  * Issues cleaned up in various services: GAVO, CADC, ARI-Gaia, J-PLUS/J-PAS, ESO, IPAC, ASTRON, others
  * DaCHS current release now passes all taplint tests
* Validator updates reducing reported errors:
  * taplint: downgraded Error to Warning for unknown LANG
  * taplint: relaxed timestamp xtype checks
  * taplint: updated for VODataService 1.2
  * taplint: clarified some error messages
* User engagement:
  * "taplint clinic" announced on Ops mailing list (not much response)
  * Interactive validation hackathon at ESCAPE Tech Forum 04/2021 (worthwhile)

Actions pending:
* Software fixes/updates:
  * Array issues in TAPLib will be fixed
  * ESA-VO validator to improve trailing "?" handling
  * More checks required in taplint: EPN-TAP, UCDs, Units, XTypes
* User engagement:
  * ESCAPE Data Providers workshop 06/2021
* Standards issues to raise with DAL:
  * SimpleDAL recommended MIME types (source of many ESA-VO reports)
  * Trailing "?" inconsistency in SSA
  * Identify general advice for standards authorship

*ESO experience with validating centres
Great work, very useful services! Here are a few suggested improvements.

A. Micol: it would be better to have standalone validators, for preparing services, testing them behind firewalls, etc. What about a DataLink validator, and a SODA one? (The latter exists, MM.)
The HEASARC validation can be pointed at a URL that is publicly available even if it is not yet published, via the idorurl= and type= parameters, for example:
https://heasarc.gsfc.nasa.gov/vo/validation/bin/perl/runTest.pl?idorurl=http://vao.stsci.edu/CAOMTAP/TapService.aspx&type=TAP
I know this isn't what you asked, but it's at least helpful before publication, if not behind the firewall.

*Question mark '?' in registry endpoints
Mail DAL about this and see how to solve it; it might need an erratum or such. Some standards correctly specify that the '?' is needed; maybe SSA is not so clear about that.
1) It is the HTTP standard that requires the '?' before a QUERY_STRING.
2) If I do not add the '?', the validator tries anyway, fails, and tells me that my service is not compliant! It should instead inform me that the '?' is missing. E.g. it tries:
http://archive.eso.org/ssapRA=0&DEC=0&SR=0.000
which returns a 404. It even reports that my service returned the 404, while it is actually Apache returning that.
=> Validating centre's answer: a bug, soon to be resolved by the validating centre.
=> Good: but what about other validating centres? Are they aware? Will they fix that too?

*HTTP status > 399 not accepted
The validator complains that my SSA error VOTable is returned along with an HTTP status > 399.
1) I use a status >= 400 to tell the receiver that there is an error WITHOUT the need to parse the resulting VOTable.
That's what's in the standard, so maybe we need to comply; check SSA & DALI for a proper answer.

*Testing SSA without REQUEST=queryData
The validator complains about wrong results when trying to access SSA without the necessary REQUEST input parameter. It tries:
1) http://archive.eso.org/ssap?POS=0.0,0.0&SIZE=0.000
2) http://archive.eso.org/ssap?FORMAT=METADATA
Those two URLs return a VOTable informing that "Required String parameter 'REQUEST' is not present".
Validating centres, please use e.g.:
http://archive.eso.org/ssap?REQUEST=queryData&POS=0.0,0.0&SIZE=0.000
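Pulling together the three ESO points above (the '?', the error status code, and the REQUEST parameter), a minimal client-side sketch using only the Python standard library. The parameter values mirror the URLs quoted above, and the VOTable check is deliberately crude, not a real parser.

    # Minimal sketch combining the three ESO points above: the QUERY_STRING
    # is separated from the base URL by "?", REQUEST=queryData is always
    # sent, and the HTTP status code and the VOTable QUERY_STATUS are
    # treated as two separate signals. Parsing is deliberately simplistic.
    from urllib.error import HTTPError
    from urllib.parse import urlencode
    from urllib.request import urlopen

    base_url = "http://archive.eso.org/ssap"
    params = {"REQUEST": "queryData", "POS": "0.0,0.0", "SIZE": "0.000"}

    # HTTP requires the "?" between the resource path and the QUERY_STRING;
    # keep the comma unescaped, as in the URLs quoted above.
    url = base_url + "?" + urlencode(params, safe=",")

    try:
        response = urlopen(url)
        status = response.status
        body = response.read().decode("utf-8", "replace")
    except HTTPError as err:
        # A status >= 400 may still carry a DAL error VOTable as its body.
        status = err.code
        body = err.read().decode("utf-8", "replace")

    # Crude check of the service-level QUERY_STATUS inside the VOTable.
    query_ok = "QUERY_STATUS" in body and 'value="OK"' in body
    print(f"HTTP status: {status}; QUERY_STATUS looks OK: {query_ok}")

The point of the try/except split is exactly the ESO complaint: a status >= 400 and a well-formed error VOTable are not mutually exclusive, so a validator (or client) should inspect both.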
*Keeping validators up-to-date (e.g. after an erratum)
A validator using an old version of taplint does not implement accepted Errata, e.g.:
https://wiki.ivoa.net/twiki/bin/view/IVOA/ObsCore-1_1-Erratum-1
1) E.g. wrong UCD in ObsCore column dataproduct_type: meta.code.class != meta.id

*Support for TAP v1.1?
The validator does not seem to offer the possibility to validate a TAP v1.1 service: is only v1.0 supported?
1) Next week I will register the new ESO TAP v1.1 service: how will the validators react to it?

*ESO Lessons learned
- VO Weather reports: how representative are those reports? The above must be considered and improved!
- Data Provider <=> Validating Centre is a two-way communication: one cannot do a proper job without the other! Iterative feedback, until resolution of the issues, is very important!

CA: the feedback from providers is appreciated; it can be difficult to contact them sometimes. Better to have more two-way interactions. Version-aware validation is under preparation (not simple).
TJ: same as Christophe. It is true that some validators are shared across sites (i.e. one validator run at multiple validation centres).
CA: services can still be registered and used even if not fully compliant.
RS: Paris in the past contacted providers to solve issues, but sometimes the contact point was not up to date, resulting in a useless contact.
Tamara Civera:
- Different validators give different results. For example, I validated our SCS service against the Paris VO validator and all was OK, the service fully compliant; but when I then tried to validate it with the Euro-VO validator, it reported the service as only partially compliant (the two might be testing different features of the protocol). So you think that all is OK, but really the service is not fully compliant. Another example: I validated our TAP service with taplint and it is fully compliant, but in Euro-VO it appears as not compliant.
- Problems found with Euro-VO validators:
  - SIAP: bug in the 4.2.4.b.iii validation? Problems validating WCS_CoordProjection?
  - SCS, SIAP: it is not clear how to indicate the coordinates the validator should use. By default it uses 0,0, and if the service returns no data there, the service appears as not fully compliant.
RS: convergence of validator responses was started in the past; a document is still available on the wiki.
MT: aligning validators is fine, but making all of them identical might lose information.
AM: polygon winding direction validation.
SG: user agent validation and identification (fits the "other" Ops session).
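As a footnote to SG's point, a minimal sketch of what client self-identification via the User-Agent header looks like from the client side. The product token and the parenthesised comment are illustrative placeholders, not an agreed IVOA convention, and the endpoint is hypothetical.

    # Minimal sketch of client self-identification via the User-Agent
    # header, relevant to SG's point above. The token "my-vo-client/1.0"
    # and the "(IVOA-test)" comment are illustrative placeholders only.
    from urllib.request import Request, urlopen

    url = "http://example.org/tap/capabilities"  # hypothetical endpoint
    req = Request(url, headers={"User-Agent": "my-vo-client/1.0 (IVOA-test)"})
    with urlopen(req) as response:
        print(response.status)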