issue_comments: 57926291

html_url: https://github.com/pydata/xarray/pull/175#issuecomment-57926291
issue_url: https://api.github.com/repos/pydata/xarray/issues/175
id: 57926291
node_id: MDEyOklzc3VlQ29tbWVudDU3OTI2Mjkx
user: 1217238
created_at: 2014-10-05T04:24:08Z
updated_at: 2014-10-05T04:24:08Z
author_association: MEMBER
body:

@akleeman I just read over my rebased version of this patch again, and unfortunately, although there are some useful features here (missing-value support and not writing trivial indexes), I don't think this is the right approach overall.

The idea of decoding data stores into "CF decoded" data stores is clever, but (1) it adds a large amount of indirection/complexity and (2) it's not even flexible enough (e.g., it won't suffice to decode coordinates, since those only exist on datasets). Functions for CF decoding/encoding that transform a datastore to a dataset directly (and vice versa), more similar to the existing design, seem like a better option overall.
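
To make that contrast concrete, here is a toy sketch (simplified stand-ins, none of this is xarray's real code): rather than wrapping one data store inside another "CF decoded" store, a single function maps a store's raw variables directly into a dataset-like mapping.

import numpy as np

class FakeStore:
    # Stand-in for a backend data store: raw arrays plus per-variable attributes.
    def __init__(self, variables):
        self.variables = variables  # name -> (ndarray, attrs dict)

def decode_variable(data, attrs):
    # Minimal CF-style decoding: mask _FillValue and apply scale_factor.
    data = data.astype(float)
    if "_FillValue" in attrs:
        data = np.where(data == attrs["_FillValue"], np.nan, data)
    if "scale_factor" in attrs:
        data = data * attrs["scale_factor"]
    return data

def decode_store(store):
    # Direct store -> dataset transform; no intermediate "decoded" store layer.
    return {name: decode_variable(data, attrs)
            for name, (data, attrs) in store.variables.items()}

store = FakeStore({"t": (np.array([0, -9999, 20]), {"_FillValue": -9999, "scale_factor": 0.5})})
print(decode_store(store))  # {'t': array([ 0., nan, 10.])}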

As we discussed, adding an argument like array_hook to open_dataset and to_netcdf (patterned after object_hook from the json module) should suffice for at least our immediate custom encoding/decoding needs.
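
For reference, this is how object_hook behaves in the json module, followed by a purely hypothetical sketch of what an array_hook could look like (the name, signature, and open_dataset keyword below are assumptions, not an existing xarray API):

import json

def as_point(obj):
    # object_hook is called once for every decoded JSON object; its return
    # value replaces the dict in the final result.
    if "x" in obj and "y" in obj:
        return (obj["x"], obj["y"])
    return obj

print(json.loads('[{"x": 1, "y": 2}, {"x": 3, "y": 4}]', object_hook=as_point))
# [(1, 2), (3, 4)]

# Hypothetical xarray counterpart (signature is an assumption): the hook sees
# each raw array and its attributes before the Variable is built and may
# return replacements.
#
# def array_hook(name, data, attrs):
#     if attrs.get("units") == "percent":
#         return data / 100.0, {k: v for k, v in attrs.items() if k != "units"}
#     return data, attrs
#
# ds = open_dataset("example.nc", array_hook=array_hook)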

To make things more extensible, let's add a handful of utility functions/classes to the public API (e.g., NDArrayMixin), and break down existing functions like encode_cf_variable/decode_cf_variable into more modular/extensible components.
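
As one possible illustration of the NDArrayMixin idea (a minimal sketch, not the actual xarray class): a mixin that forwards the basic ndarray interface to a wrapped self.array, so a custom lazy-decoding wrapper only overrides the behaviour it changes.

import numpy as np

class NDArrayMixin:
    # Forward shape/len/indexing to the wrapped `self.array` so subclasses
    # only need to implement what differs from plain array access.
    @property
    def shape(self):
        return self.array.shape

    def __len__(self):
        return len(self.array)

    def __getitem__(self, key):
        return self.array[key]

class ScaledArray(NDArrayMixin):
    # Example lazy decoder: applies scale/offset only when data is accessed.
    def __init__(self, array, scale_factor, add_offset):
        self.array = array
        self.scale_factor = scale_factor
        self.add_offset = add_offset

    def __getitem__(self, key):
        return self.array[key] * self.scale_factor + self.add_offset

raw = np.array([0, 10, 20], dtype="int16")
scaled = ScaledArray(raw, scale_factor=0.5, add_offset=273.15)
print(scaled.shape)  # (3,)
print(scaled[:])     # [273.15 278.15 283.15]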

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}

issue: 36467304