Friday, March 19, 2010

Sensor data in GeoRSS?


  1. The OGC Atom+GeoRSS validation results are confusing. Feed Validator has a better presentation

  2. It looks like the OGC version only supports GML

  3. How should the payload be encoded? This is the big question. See line 24, where we produce human-readable text that is hard to parse.

  4. I think the ID is wrong, but it is still not clear to me how IDs should be handled.

  5. How many data points should be in the feed? If we are collecting a point every 12 seconds, do we want to have just 120 seconds of data? Do we need a realtime feed and one that is decimated? And if decimated, what is the "best" algorithm? I hate systems where you can't find out the actual algorithm used (Ahem! You know who you are!)

  6. Does anything actually pay attention to the syndication flags? We definitely need to have these match the data window (see previous point).

  7. And lots of other issues that escape me right now...
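On point 5, a minimal sketch of what a published decimation scheme could look like, assuming the simplest possible approach (keep every Nth sample); the sample values and stride are illustrative, not a real feed:

```python
from datetime import datetime, timedelta, timezone

def decimate(points, stride):
    """Keep every `stride`-th point: a naive decimation scheme.

    Publishing the algorithm, even one this trivial, answers the
    complaint about systems that hide how they thin their data.
    """
    return points[::stride]

# 120 seconds of samples at one point per 12 seconds -> 10 points
start = datetime(2010, 3, 19, tzinfo=timezone.utc)
samples = [(start + timedelta(seconds=12 * i), 42.0) for i in range(10)]

print(len(samples), len(decimate(samples, 5)))  # -> 10 2
```

A stride filter is probably too lossy for bursty sensor data (it can drop extremes entirely); averaging or min/max per window would be alternatives, but whichever is chosen, the point stands that it should be documented.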


Thoughts?

1 comment:

  1. Hi Kurt,

    Here's my $.02:

    1. True.

    2. A GML representation of a point has a little more markup overhead than georss:point, but not so much that it impacts usability. IMO, georss:where/gml:Point is the way to go.

    3. Tailor payloads to users. There's no one size fits all format. For a human analyst using an unspecialized feed reader or one that has minimal spatial or observation parsing capabilities (like Google Maps), provide HTML content: a standard station plot or time series graph plus links to other resources so that the analyst can drill down. Feeds meant for machine processing should deliver observations in a standard format.

    4. Atom entries have an identifier (atom:id) separate from their "self" link, for those who want identifiers that aren't bound to internet domains, for feeds that aren't on the web, or to preserve synchronization in the face of a change of domain. A tag URI might fit your needs, something like: tag:ccom.nh,2010-03-22:1269253023 using unix epoch time as the last part of the identifier.

    5. Real time and decimated, yes, with degree of decimation that best suits your users.

    6. I'm not sure there's attention. It seems like a lot of the interest in optimizing syndication has moved on to efforts like RSSCloud and PubSubHubbub.
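The georss:where/gml:Point markup from (2) and the tag URI from (4) could be combined in an entry along these lines; a sketch only, with placeholder coordinates and the tag URI pattern from the comment above:

```python
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
GEORSS = "http://www.georss.org/georss"
GML = "http://www.opengis.net/gml"

def sensor_entry(epoch, lat, lon, authority="ccom.nh"):
    """Build an Atom entry with a tag URI id and a
    georss:where/gml:Point location.

    `authority`, `lat`, and `lon` are placeholder values here,
    not a real station or feed.
    """
    entry = ET.Element(f"{{{ATOM}}}entry")
    eid = ET.SubElement(entry, f"{{{ATOM}}}id")
    # tag URI: authority, minting date, then epoch seconds as the specific part
    eid.text = f"tag:{authority},2010-03-22:{epoch}"
    where = ET.SubElement(entry, f"{{{GEORSS}}}where")
    point = ET.SubElement(where, f"{{{GML}}}Point")
    pos = ET.SubElement(point, f"{{{GML}}}pos")
    pos.text = f"{lat} {lon}"  # gml:pos is "lat lon" for WGS 84
    return entry

entry = sensor_entry(1269253023, 43.07, -70.71)
print(ET.tostring(entry).decode())
```

The id stays stable even if the feed moves to a different host, which is the synchronization point made in (4).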
