

issues: 202964277


id: 202964277
node_id: MDU6SXNzdWUyMDI5NjQyNzc=
number: 1225
title: "ValueError: chunksize cannot exceed dimension size" when trying to write xarray to netcdf
user: 1217238
state: closed
locked: 0
comments: 11
created_at: 2017-01-24T22:52:36Z
updated_at: 2017-11-13T03:22:58Z
closed_at: 2017-11-13T03:22:58Z
author_association: MEMBER

Reported on StackOverflow: http://stackoverflow.com/questions/39900011/valueerror-chunksize-cannot-exceed-dimension-size-when-trying-to-write-xarray

Unfortunately, the given example is not self-contained (cleaned up below: the missing `numpy` import and closing parenthesis have been added, and since `drop` returns a new dataset, its result must be reassigned):

```
import numpy as np
import xarray as xr

ds = xr.open_dataset("somefile.nc", chunks={'lat': 72, 'lon': 144})
myds = ds.copy()

# ds is 335 (time) x 720 x 1440 and has a variable named `var`

def some_function(x):
    return x * 2

myds['newvar'] = xr.DataArray(np.apply_along_axis(some_function, 0, ds['var']))
myds = myds.drop('var')
myds.to_netcdf("somenewfile.nc")
```

Apparently this works if `engine='scipy'` is passed to `to_netcdf`!

Something strange is definitely going on, I suspect a bug.
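A minimal, self-contained sketch of what may be going on (the dataset, its sizes, and the chunk sizes here are invented for illustration): xarray carries a variable's on-disk chunk sizes along in `.encoding`, and that encoding survives indexing operations that shrink the dimensions. On write, the netCDF4 backend rejects chunk sizes larger than the (now smaller) dimension, which would produce exactly this ValueError; the scipy backend writes netCDF3 and ignores chunk encoding entirely.

```python
import numpy as np
import xarray as xr

# Build a dataset and mark it as if it were read from a chunked netCDF file.
ds = xr.Dataset({'var': (('time', 'lat'), np.zeros((100, 50)))})
ds['var'].encoding['chunksizes'] = (100, 50)  # as if set by open_dataset

# Take a subset: the time dimension is now length 10, but the stale
# encoding still requests chunks of length 100 along it.
small = ds.isel(time=slice(0, 10))
stale = dict(small['var'].encoding)

# Writing `small` with engine='netcdf4' would then trip
# "chunksize cannot exceed dimension size". Clearing the stale encoding
# (or using engine='scipy') sidesteps it:
small['var'].encoding = {}
```

This is only a hypothesis about the mechanism, but it matches the symptom: the write fails with the netCDF4 engine yet succeeds with `engine='scipy'`.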

reactions: 0 (https://api.github.com/repos/pydata/xarray/issues/1225/reactions)
state_reason: completed
repo: 13221727
type: issue

Links from other tables

  • 2 rows from issues_id in issues_labels
  • 11 rows from issue in issue_comments