Putting the pieces together...
busko at stsci.edu
Fri May 14 05:45:42 PDT 2004
Doug Tody wrote:
> Ivo - regarding your point about data quality vectors: As you know,
> the SSA data model has a data quality vector. We don't really know what
> to put in it though. I don't think we should put anything instrumental
> in nature in the general SSA data model (this can be done but it would
> go into nonstandard extension records). Simple models for the quality
> vector would be binary (good or bad) or trinary (known good, known bad
> or flagged, or questionable). Perhaps once we get more experience with
> real data from archives it will be possible to develop a more refined
> quality model. (Note this should not be confused with the error vectors
> which we already have).
> - Doug
Thanks, Doug, that sounds good enough. I agree that nothing
instrumental in nature should be put in the data model. However,
something must be done for cases that do not follow the norm.
I have in mind cases where a binary or trinary model wouldn't be enough
to summarize the data quality information available in the original data.
A good example is FUSE data; it uses a continuously variable 2-byte
value to store a kind of "weight" (between 0 and 100), instead of the
commonly found bit-encoded mask. To cast that data into, say, a binary
model, one needs an additional piece of information, in the form of a
threshold value.
Ideally, a VO transaction involving such data should allow for the
threshold value to be either specified by the requestor, or
alternatively be set by the data provider in an instrument-dependent way.
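To make the idea concrete, here is a minimal sketch (not part of SSA or
the FUSE pipeline; the function name and the default threshold are my
own illustrative choices) of how a continuous 0-100 weight could be cast
into a binary quality vector once a threshold is known:

```python
def weights_to_binary_quality(weights, threshold=50):
    """Cast FUSE-style continuous weights (0-100) to a binary quality
    vector: 0 = good, 1 = bad. The threshold would come either from the
    requestor or from the data provider (instrument-dependent)."""
    return [0 if w >= threshold else 1 for w in weights]

# Example: weights at or above the threshold are flagged good (0),
# the rest bad (1).
print(weights_to_binary_quality([100, 73, 42, 0], threshold=50))
# [0, 0, 1, 1]
```

The same shape works for a trinary model; one would simply use two
threshold values instead of one.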
In short, my point is: shouldn't the data model allow some room for
extra bits of info such as data quality threshold values?