issue_comments: 326138431


Comment by user 2443309 (MEMBER) on https://github.com/pydata/xarray/issues/1225#issuecomment-326138431, created 2017-08-30T22:36:14Z.

@tbohn - What is happening here is that xarray is storing the netCDF4 chunk size from the input file. For the LAI variable in your example, that is `LAI:_ChunkSizes = 19, 1, 160, 160 ;` (you can see this with `ncdump -h -s filename.nc`).

```shell
$ ncdump -s -h veg_hist.0_10n.90_80w.2000_2016.mode_PFT.5dates.nc
netcdf veg_hist.0_10n.90_80w.2000_2016.mode_PFT.5dates {
dimensions:
    veg_class = 19 ;
    lat = 160 ;
    lon = 160 ;
    time = UNLIMITED ; // (5 currently)
variables:
    float Cv(veg_class, lat, lon) ;
        Cv:_FillValue = -1.f ;
        Cv:units = "-" ;
        Cv:longname = "Area Fraction" ;
        Cv:missing_value = -1.f ;
        Cv:_Storage = "contiguous" ;
        Cv:_Endianness = "little" ;
    float LAI(veg_class, time, lat, lon) ;
        LAI:_FillValue = -1.f ;
        LAI:units = "m2/m2" ;
        LAI:longname = "Leaf Area Index" ;
        LAI:missing_value = -1.f ;
        LAI:_Storage = "chunked" ;
        LAI:_ChunkSizes = 19, 1, 160, 160 ;
        LAI:_Endianness = "little" ;
...
```
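The same stored chunking is visible from Python, since xarray keeps it in each variable's `.encoding` dict. A minimal sketch (the file name comes from your example above):

```python
import xarray as xr

# xarray records the on-disk netCDF4 chunking in the variable's
# .encoding dict when the file is opened.
ds = xr.open_dataset("veg_hist.0_10n.90_80w.2000_2016.mode_PFT.5dates.nc")
print(ds["LAI"].encoding.get("chunksizes"))  # (19, 1, 160, 160)
```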

Those integers correspond to the sizes of LAI's dimensions (veg_class, time, lat, lon). When you slice your dataset, you end up with lat/lon dimensions that are smaller than the stored `_ChunkSizes`. When writing this back to netCDF, xarray still tries to apply the original `chunksizes` encoding attribute, which no longer fits.
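Continuing the sketch above, you can see the stale encoding being carried along (the slice bounds here are made up for illustration):

```python
# Take a 10x10 lat/lon subset; these dimensions are now smaller
# than the stored 160x160 chunk sizes.
subset = ds.isel(lat=slice(0, 10), lon=slice(0, 10))

# The original encoding is still attached to the sliced variable...
print(subset["LAI"].encoding.get("chunksizes"))  # still (19, 1, 160, 160)

# ...so writing it back hands chunk sizes larger than the dimensions
# to the netCDF4 layer, which is what triggers the error you saw.
subset.to_netcdf("subset.nc")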

The logical fix is to validate this encoding attribute at write time and either 1) throw an informative error if the stored chunk sizes aren't going to work, or 2) adjust the chunk sizes to fit the new dimensions.
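In the meantime, a workaround is to drop the stale chunk-size encoding yourself before writing. A sketch, reusing the hypothetical `subset` from above:

```python
# Remove the stored chunk sizes so the netCDF4 backend picks a
# valid chunking for the new, smaller dimensions on write.
subset["LAI"].encoding.pop("chunksizes", None)
subset.to_netcdf("subset.nc")
```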
