issue_comments: 243364284


html_url: https://github.com/pydata/xarray/issues/992#issuecomment-243364284
issue_url: https://api.github.com/repos/pydata/xarray/issues/992
id: 243364284
node_id: MDEyOklzc3VlQ29tbWVudDI0MzM2NDI4NA==
user: 3404817
created_at: 2016-08-30T08:06:49Z
updated_at: 2016-08-30T08:07:07Z
author_association: CONTRIBUTOR
issue: 173773358

But maybe the encoding dict is not the way to go after all, since its entries are keyed per variable, while it is the dimension that must be declared unlimited.
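
For reference, a minimal sketch of how the encoding dict is keyed: variable names map to per-variable settings, so there is no natural place for a per-dimension flag. The file name, variable name, and compression settings below are only examples.

import numpy as np
import xarray as xr

ds = xr.Dataset({"temperature": (("time", "x"), np.zeros((3, 10)))})

# encoding is keyed by variable name, not by dimension name
ds.to_netcdf("out.nc", encoding={"temperature": {"zlib": True, "complevel": 4}})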

Currently the dataset variables can be created in any order, and the dimensions they need are created on demand (in the set_necessary_dimensions function). I would not like to change that logic (e.g. towards first creating all dimensions required by all variables, and only then adding the data variables).
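
For context, with the netCDF4-python backend the unlimited flag is fixed at the moment a dimension is created (size=None), which is why the information has to be available inside that on-demand dimension creation. A minimal illustration, with file, dimension, and variable names chosen only for the example:

import netCDF4

nc = netCDF4.Dataset("example.nc", "w")
nc.createDimension("time", None)  # size=None makes 'time' an unlimited (record) dimension
nc.createDimension("x", 10)       # fixed-size dimension
nc.createVariable("temperature", "f4", ("time", "x"))
nc.close()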

So how about a new kw argument to to_netcdf, like

ds.to_netcdf(unlimited_dimensions=['time'])

or

ds.to_netcdf(dimension_unlimited={'time': True})

(with the second option being better for explicitly setting {'time': False})?
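
A hypothetical usage sketch of the two proposed spellings; neither keyword existed at the time, so this only illustrates the proposed call signatures, and the dataset contents and file names are made up:

import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"temperature": (("time", "x"), np.zeros((3, 10)))},
    coords={"time": [0, 1, 2]},
)

# proposed option 1: list of dimension names to write as unlimited
ds.to_netcdf("out_list.nc", unlimited_dimensions=["time"])

# proposed option 2: explicit mapping, which also allows {'time': False}
ds.to_netcdf("out_map.nc", dimension_unlimited={"time": True})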
