
issue_comments


31 rows where user = 11750960 sorted by updated_at descending




issue 13

  • call to colorbar not thread safe 5
  • can't store zarr after open_zarr and isel 4
  • Element wise dataArray generation 4
  • implement a more threadsafe call to colorbar 4
  • improve to_zarr doc about chunking 3
  • mfdataset fails at chunking after opening 2
  • isel slows down computation significantly after open_dataset 2
  • standard deviation over one dimension of a chunked DataArray leads to NaN 2
  • to_netcdf - RuntimeError: NetCDF: HDF error 1
  • Improving documentation on `apply_ufunc` 1
  • plot.line breaks depending on coordinate shape 1
  • automatic chunking of zarr archive 1
  • overwriting netcdf file fails at read time 1

user 1

  • apatlpo · 31

author_association 1

  • CONTRIBUTOR 31
id html_url issue_url node_id user created_at updated_at author_association body reactions performed_via_github_app issue
1435056645 https://github.com/pydata/xarray/issues/7541#issuecomment-1435056645 https://api.github.com/repos/pydata/xarray/issues/7541 IC_kwDOAMm_X85ViToF apatlpo 11750960 2023-02-17T18:12:10Z 2023-02-17T18:12:10Z CONTRIBUTOR

An issue already exists on the dask issue tracker: https://github.com/dask/dask/issues/5679

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  standard deviation over one dimension of a chunked DataArray leads to NaN 1589771368
1435054875 https://github.com/pydata/xarray/issues/7541#issuecomment-1435054875 https://api.github.com/repos/pydata/xarray/issues/7541 IC_kwDOAMm_X85ViTMb apatlpo 11750960 2023-02-17T18:10:36Z 2023-02-17T18:10:36Z CONTRIBUTOR

This seems to be an upstream issue:

`da.data.std(axis=0).compute()`

I will close shortly, apologies.
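For context, a minimal sketch of the kind of check involved here; the array shape and chunking are illustrative assumptions, not the exact case from the issue:

```
import numpy as np
import xarray as xr

# small 2-D DataArray, chunked along the dimension being reduced
da = xr.DataArray(np.random.rand(8, 4), dims=("time", "x")).chunk({"time": 2})

# std over one dimension of the chunked (dask-backed) array,
# compared against the eager NumPy result
lazy = da.std(dim="time").compute()
eager = xr.DataArray(da.values, dims=da.dims).std(dim="time")
print(np.allclose(lazy, eager))
```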

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  standard deviation over one dimension of a chunked DataArray leads to NaN 1589771368
667585879 https://github.com/pydata/xarray/issues/4284#issuecomment-667585879 https://api.github.com/repos/pydata/xarray/issues/4284 MDEyOklzc3VlQ29tbWVudDY2NzU4NTg3OQ== apatlpo 11750960 2020-08-01T20:54:16Z 2020-08-01T20:54:16Z CONTRIBUTOR

I'm having trouble reproducing this issue, so I'm closing it for now.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  overwriting netcdf file fails at read time 667763555
631479782 https://github.com/pydata/xarray/pull/4048#issuecomment-631479782 https://api.github.com/repos/pydata/xarray/issues/4048 MDEyOklzc3VlQ29tbWVudDYzMTQ3OTc4Mg== apatlpo 11750960 2020-05-20T13:39:25Z 2020-05-20T13:39:25Z CONTRIBUTOR

argh ... saved_on_disk.nc has been removed.

I was unfortunately not able to create a proper list without breaking tests, and I have exhausted my time trying to figure it out. If you know how to do that, I am more than happy to follow your advice.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  improve to_zarr doc about chunking 614854414
631461955 https://github.com/pydata/xarray/pull/4048#issuecomment-631461955 https://api.github.com/repos/pydata/xarray/issues/4048 MDEyOklzc3VlQ29tbWVudDYzMTQ2MTk1NQ== apatlpo 11750960 2020-05-20T13:07:25Z 2020-05-20T13:07:25Z CONTRIBUTOR

ok, tests are passing.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  improve to_zarr doc about chunking 614854414
629416035 https://github.com/pydata/xarray/pull/4048#issuecomment-629416035 https://api.github.com/repos/pydata/xarray/issues/4048 MDEyOklzc3VlQ29tbWVudDYyOTQxNjAzNQ== apatlpo 11750960 2020-05-15T18:38:20Z 2020-05-15T18:38:20Z CONTRIBUTOR

If anybody has a clue about how to fix the tests, that would be welcome. Thanks!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  improve to_zarr doc about chunking 614854414
625893739 https://github.com/pydata/xarray/issues/4046#issuecomment-625893739 https://api.github.com/repos/pydata/xarray/issues/4046 MDEyOklzc3VlQ29tbWVudDYyNTg5MzczOQ== apatlpo 11750960 2020-05-08T16:17:52Z 2020-05-08T16:17:52Z CONTRIBUTOR

Thanks for this speedy reply @rabernat !

Improving docs is still within my reach (I hope) and I will give it a shot. Could this documentation improvement go in the description of the `encoding` parameter of `xarray.Dataset.to_zarr`?
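For reference, a sketch of the kind of usage such a doc note would describe; the dataset, store name, and chunk size here are illustrative assumptions:

```
import numpy as np
import xarray as xr

ds = xr.Dataset({"v": (("x",), np.arange(100.0))})

# chunking of the on-disk zarr store can be set per variable through the
# `encoding` argument, independently of any in-memory dask chunks
ds.to_zarr("example.zarr", mode="w", encoding={"v": {"chunks": (25,)}})
```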

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  automatic chunking of zarr archive 614785886
611365647 https://github.com/pydata/xarray/pull/3944#issuecomment-611365647 https://api.github.com/repos/pydata/xarray/issues/3944 MDEyOklzc3VlQ29tbWVudDYxMTM2NTY0Nw== apatlpo 11750960 2020-04-09T07:01:12Z 2020-04-09T07:01:12Z CONTRIBUTOR

you're welcome, I hope one day I'll be able to make a more worthy contribution to this great library!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  implement a more threadsafe call to colorbar 595666886
611328122 https://github.com/pydata/xarray/pull/3944#issuecomment-611328122 https://api.github.com/repos/pydata/xarray/issues/3944 MDEyOklzc3VlQ29tbWVudDYxMTMyODEyMg== apatlpo 11750960 2020-04-09T04:57:09Z 2020-04-09T04:57:09Z CONTRIBUTOR

I would feel bad claiming credit for such a small contribution ... it's not even solving my problem, which I should probably document better on the issue tracker.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  implement a more threadsafe call to colorbar 595666886
610918333 https://github.com/pydata/xarray/pull/3944#issuecomment-610918333 https://api.github.com/repos/pydata/xarray/issues/3944 MDEyOklzc3VlQ29tbWVudDYxMDkxODMzMw== apatlpo 11750960 2020-04-08T12:03:49Z 2020-04-08T12:03:49Z CONTRIBUTOR

Ok, let me know if you need me to do anything else.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  implement a more threadsafe call to colorbar 595666886
610281639 https://github.com/pydata/xarray/pull/3944#issuecomment-610281639 https://api.github.com/repos/pydata/xarray/issues/3944 MDEyOklzc3VlQ29tbWVudDYxMDI4MTYzOQ== apatlpo 11750960 2020-04-07T09:31:14Z 2020-04-07T09:31:14Z CONTRIBUTOR

Argh, my bad. This implementation does not solve #1889.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  implement a more threadsafe call to colorbar 595666886
610233017 https://github.com/pydata/xarray/issues/1889#issuecomment-610233017 https://api.github.com/repos/pydata/xarray/issues/1889 MDEyOklzc3VlQ29tbWVudDYxMDIzMzAxNw== apatlpo 11750960 2020-04-07T07:48:35Z 2020-04-07T07:48:35Z CONTRIBUTOR

hehe, apparently I cannot reopen the issue.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  call to colorbar not thread safe 294735496
610232714 https://github.com/pydata/xarray/issues/1889#issuecomment-610232714 https://api.github.com/repos/pydata/xarray/issues/1889 MDEyOklzc3VlQ29tbWVudDYxMDIzMjcxNA== apatlpo 11750960 2020-04-07T07:48:00Z 2020-04-07T07:48:00Z CONTRIBUTOR

I am reopening this issue as it crossed my path again while generating figures on a dask.distributed LocalCluster. I just opened a PR suggesting a change that solves the issue in my situation.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  call to colorbar not thread safe 294735496
610167347 https://github.com/pydata/xarray/issues/3932#issuecomment-610167347 https://api.github.com/repos/pydata/xarray/issues/3932 MDEyOklzc3VlQ29tbWVudDYxMDE2NzM0Nw== apatlpo 11750960 2020-04-07T04:32:12Z 2020-04-07T04:32:12Z CONTRIBUTOR

I'll close this for now, as there don't seem to be any other ideas about it.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Element wise dataArray generation 593825520
609605285 https://github.com/pydata/xarray/issues/3932#issuecomment-609605285 https://api.github.com/repos/pydata/xarray/issues/3932 MDEyOklzc3VlQ29tbWVudDYwOTYwNTI4NQ== apatlpo 11750960 2020-04-06T07:08:19Z 2020-04-06T07:08:19Z CONTRIBUTOR

This sounds like method 1 (with dask delayed) to me. There may be no faster option; thanks for giving it a thought, @fujiisoup.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Element wise dataArray generation 593825520
609407162 https://github.com/pydata/xarray/issues/3932#issuecomment-609407162 https://api.github.com/repos/pydata/xarray/issues/3932 MDEyOklzc3VlQ29tbWVudDYwOTQwNzE2Mg== apatlpo 11750960 2020-04-05T12:17:15Z 2020-04-05T12:17:47Z CONTRIBUTOR

thanks a lot @fujiisoup, your suggestion helps get rid of the need to build the `ds['_y']` variable. Here is the updated `apply_ufunc` solution:

```
x = np.arange(10100)
y = np.arange(20100)

ds = xr.Dataset(coords={'x': x, 'y': y})

ds = ds.chunk({'x': 1, 'y': 1})  # does not change anything

# let's say each experiment outputs 5 statistical diagnostics
Nstats = 5
some_exp = lambda x, y: np.ones((Nstats,))

out = xr.apply_ufunc(some_exp, ds.x, ds.y,
                     dask='parallelized', vectorize=True,
                     output_dtypes=[float],
                     output_sizes={'stats': Nstats},
                     output_core_dims=[['stats']])
```

An inspection of the dask dashboard indicates that the computation is not distributed among workers, though. How can I make sure this happens?
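For readers following along, a minimal sketch of one way to route such a computation through distributed workers, assuming dask.distributed is installed; the bare `Client()` (a local cluster) is an illustrative choice, not the setup used in the issue:

```
from dask.distributed import Client

client = Client()             # start/connect to a local cluster
print(client.dashboard_link)  # dashboard URL for watching the workers

result = out.compute()        # the apply_ufunc graph should now run on the workers
```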

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Element wise dataArray generation 593825520
609407192 https://github.com/pydata/xarray/issues/3932#issuecomment-609407192 https://api.github.com/repos/pydata/xarray/issues/3932 MDEyOklzc3VlQ29tbWVudDYwOTQwNzE5Mg== apatlpo 11750960 2020-04-05T12:17:26Z 2020-04-05T12:17:26Z CONTRIBUTOR

sorry, closed by accident.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Element wise dataArray generation 593825520
609071924 https://github.com/pydata/xarray/issues/3933#issuecomment-609071924 https://api.github.com/repos/pydata/xarray/issues/3933 MDEyOklzc3VlQ29tbWVudDYwOTA3MTkyNA== apatlpo 11750960 2020-04-04T18:42:10Z 2020-04-04T18:42:10Z CONTRIBUTOR

thanks to you for the fix @TomNicholas !

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  plot.line breaks depending on coordinate shape 593860909
496876520 https://github.com/pydata/xarray/issues/2808#issuecomment-496876520 https://api.github.com/repos/pydata/xarray/issues/2808 MDEyOklzc3VlQ29tbWVudDQ5Njg3NjUyMA== apatlpo 11750960 2019-05-29T10:16:56Z 2019-05-29T10:16:56Z CONTRIBUTOR

I have ended up using apply_ufunc on several occasions and have developed a love/hate relationship with it. Often it turned out to be the simplest and most powerful option ... once I figured out how to use it.

So thumbs up for improved documentation.

Undertaking this task seems daunting to me, however, mostly because there are many different ways of using apply_ufunc that I am not familiar with. Maybe that's the case for other users as well ...?

If so, shouldn't we 1/ gather clean versions of our examples in a temporary place, 2/ sort these examples, and 3/ consider pushing them as documentation?

{
    "total_count": 4,
    "+1": 4,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  Improving documentation on `apply_ufunc` 420584430
432998937 https://github.com/pydata/xarray/issues/2504#issuecomment-432998937 https://api.github.com/repos/pydata/xarray/issues/2504 MDEyOklzc3VlQ29tbWVudDQzMjk5ODkzNw== apatlpo 11750960 2018-10-25T10:27:49Z 2018-10-25T10:27:49Z CONTRIBUTOR

`ds = xr.open_dataset(grid_dir_nc+'Depth.nc', chunks={'face':1}, engine='h5netcdf')`
`%time print(ds.Depth.mean().values)`

leads to:

CPU times: user 824 ms, sys: 50.4 ms, total: 875 ms
Wall time: 11.3 s

and

`ds = xr.open_dataset(grid_dir_nc+'Depth.nc', chunks={'face':1}).load()`
`%time print(ds.Depth.mean().values)`

to:

CPU times: user 61.9 ms, sys: 22.4 ms, total: 84.3 ms
Wall time: 76.2 ms

Much better ... thanks a lot for these two solutions.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  isel slows down computation significantly after open_dataset 373449569
432672506 https://github.com/pydata/xarray/issues/2504#issuecomment-432672506 https://api.github.com/repos/pydata/xarray/issues/2504 MDEyOklzc3VlQ29tbWVudDQzMjY3MjUwNg== apatlpo 11750960 2018-10-24T14:08:07Z 2018-10-24T14:08:07Z CONTRIBUTOR

additional information about the file:

```
(equinox) [pontea@visu01 LLC4320]$ ncdump -sh grid_nc/Depth.nc
netcdf Depth {
dimensions:
    i = 4320 ;
    j = 4320 ;
    face = 13 ;
variables:
    int64 i(i) ;
        i:standard_name = "x_grid_index" ;
        i:axis = "X" ;
        i:long_name = "x-dimension of the t grid" ;
        i:swap_dim = "XC" ;
        i:_Storage = "contiguous" ;
        i:_Endianness = "little" ;
    int64 j(j) ;
        j:standard_name = "y_grid_index" ;
        j:axis = "Y" ;
        j:long_name = "y-dimension of the t grid" ;
        j:swap_dim = "YC" ;
        j:_Storage = "contiguous" ;
        j:_Endianness = "little" ;
    int64 face(face) ;
        face:standard_name = "face_index" ;
        face:_Storage = "contiguous" ;
        face:_Endianness = "little" ;
    float Depth(face, j, i) ;
        Depth:_FillValue = NaNf ;
        Depth:standard_name = "ocean_depth" ;
        Depth:long_name = "ocean depth" ;
        Depth:units = "m" ;
        Depth:coordinate = "XC YC" ;
        Depth:_Storage = "chunked" ;
        Depth:_ChunkSizes = 1, 480, 480 ;
        Depth:_Endianness = "little" ;

// global attributes:
        :_SuperblockVersion = 0 ;
        :_IsNetcdf4 = 1 ;
        :_Format = "netCDF-4" ;
}
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  isel slows down computation significantly after open_dataset 373449569
404873326 https://github.com/pydata/xarray/issues/2278#issuecomment-404873326 https://api.github.com/repos/pydata/xarray/issues/2278 MDEyOklzc3VlQ29tbWVudDQwNDg3MzMyNg== apatlpo 11750960 2018-07-13T15:48:46Z 2018-07-13T15:48:46Z CONTRIBUTOR

Could you please be more specific about where this is done for netCDF?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  can't store zarr after open_zarr and isel 340192831
404503718 https://github.com/pydata/xarray/issues/2278#issuecomment-404503718 https://api.github.com/repos/pydata/xarray/issues/2278 MDEyOklzc3VlQ29tbWVudDQwNDUwMzcxOA== apatlpo 11750960 2018-07-12T12:59:44Z 2018-07-12T13:00:01Z CONTRIBUTOR

Note that there is also a fix for case 2, which is simply `del ds['v'].encoding['chunks']` prior to storing the data.
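A sketch of that fix in context; the store names and chunk sizes mirror the case 2 example in the next comment and are illustrative:

```
import xarray as xr

ds = xr.open_zarr('data.zarr')
del ds['v'].encoding['chunks']  # drop the chunk encoding inherited from the source store
ds = ds.chunk({'t': 64, 'x': 8, 'y': 8})
ds.to_zarr('data_rechunked.zarr', mode='w')
```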

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  can't store zarr after open_zarr and isel 340192831
404503025 https://github.com/pydata/xarray/issues/2278#issuecomment-404503025 https://api.github.com/repos/pydata/xarray/issues/2278 MDEyOklzc3VlQ29tbWVudDQwNDUwMzAyNQ== apatlpo 11750960 2018-07-12T12:57:17Z 2018-07-12T12:57:35Z CONTRIBUTOR

In the same setting, I get another error message, which may or may not reflect the same issue; maybe you can tell me. I am posting this because the error message is different.

Starting from the same dataset:

nx, ny, nt = 32, 32, 64
ds = xr.Dataset({}, coords={'x': np.arange(nx), 'y': np.arange(ny), 't': np.arange(nt)})
ds = ds.assign(v=ds.t*np.cos(np.pi/180./100*ds.x)*np.cos(np.pi/180./50*ds.y))
ds = ds.chunk({'t': 1, 'x': nx/2, 'y': ny/2})
ds.to_zarr('data.zarr', mode='w')

Case 1 works fine:

ds = ds.chunk({'t': nt, 'x': nx/4, 'y': ny/4})
ds.to_zarr('data_rechunked.zarr', mode='w')

Case 2 breaks:

ds = xr.open_zarr('data.zarr')
ds = ds.chunk({'t': nt, 'x': nx/4, 'y': ny/4})
ds.to_zarr('data_rechunked.zarr', mode='w')

with the following error message:

.... NotImplementedError: Specified zarr chunks (1, 16, 16) would overlap multiple dask chunks ((64,), (8, 8, 8, 8), (8, 8, 8, 8)). This is not implemented in xarray yet. Consider rechunking the data using `chunk()` or specifying different chunks in encoding.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  can't store zarr after open_zarr and isel 340192831
404415760 https://github.com/pydata/xarray/issues/2278#issuecomment-404415760 https://api.github.com/repos/pydata/xarray/issues/2278 MDEyOklzc3VlQ29tbWVudDQwNDQxNTc2MA== apatlpo 11750960 2018-07-12T07:25:36Z 2018-07-12T07:25:36Z CONTRIBUTOR

Thanks for the workaround suggestion. Apparently you also need to delete the chunks for the t singleton coordinate, though. The workaround ends up looking like:

ds = xr.open_zarr('data.zarr')
del ds['v'].encoding['chunks']
del ds['t'].encoding['chunks']
ds.isel(t=0).to_zarr('data_t0.zarr', mode='w')

Any idea how serious this is and/or where it's coming from?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  can't store zarr after open_zarr and isel 340192831
389627375 https://github.com/pydata/xarray/issues/2132#issuecomment-389627375 https://api.github.com/repos/pydata/xarray/issues/2132 MDEyOklzc3VlQ29tbWVudDM4OTYyNzM3NQ== apatlpo 11750960 2018-05-16T18:52:59Z 2018-05-16T18:52:59Z CONTRIBUTOR

it turned out I had exceeded my quota on the storage.

Sorry about the noise.

I will look into zarr though.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  to_netcdf - RuntimeError: NetCDF: HDF error 323333361
364385357 https://github.com/pydata/xarray/issues/1889#issuecomment-364385357 https://api.github.com/repos/pydata/xarray/issues/1889 MDEyOklzc3VlQ29tbWVudDM2NDM4NTM1Nw== apatlpo 11750960 2018-02-09T09:44:50Z 2018-02-09T09:44:50Z CONTRIBUTOR

Ok, thanks, do you have a piece of code with a thread-lock that I could get inspiration from?
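For readers landing here, a minimal sketch of the thread-lock pattern being asked about, assuming matplotlib and the standard-library threading module; the lock and plotting code are illustrative, not the fix eventually adopted:

```
import threading
import matplotlib.pyplot as plt

_plot_lock = threading.Lock()  # shared lock guarding non-thread-safe pyplot calls

def plot_with_colorbar(data, fname):
    # serialize figure and colorbar creation across threads
    with _plot_lock:
        fig, ax = plt.subplots()
        im = ax.imshow(data)
        fig.colorbar(im, ax=ax)
        fig.savefig(fname)
        plt.close(fig)
```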

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  call to colorbar not thread safe 294735496
363437598 https://github.com/pydata/xarray/issues/1889#issuecomment-363437598 https://api.github.com/repos/pydata/xarray/issues/1889 MDEyOklzc3VlQ29tbWVudDM2MzQzNzU5OA== apatlpo 11750960 2018-02-06T14:25:59Z 2018-02-06T14:26:10Z CONTRIBUTOR

thanks, got it. I tried the suggested fix but it did not work. Unfortunately I don't have more time on the subject for a couple of days. I'll keep you posted when I can sort things out.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  call to colorbar not thread safe 294735496
363430118 https://github.com/pydata/xarray/issues/1889#issuecomment-363430118 https://api.github.com/repos/pydata/xarray/issues/1889 MDEyOklzc3VlQ29tbWVudDM2MzQzMDExOA== apatlpo 11750960 2018-02-06T13:59:46Z 2018-02-06T13:59:46Z CONTRIBUTOR

Hi,

Yes, why not, even though I am not too familiar with the process.

I am not even able to properly install the library so far ... `python setup.py install` creates the following library: `/home1/datahome/aponte/.miniconda3/envs/pangeo/lib/python3.6/site-packages/xarray-0.10.0rc1_2_gf83361c-py3.6.egg/`. I am surely doing something wrong; I'd like to have xarray.egg.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  call to colorbar not thread safe 294735496
232115345 https://github.com/pydata/xarray/issues/896#issuecomment-232115345 https://api.github.com/repos/pydata/xarray/issues/896 MDEyOklzc3VlQ29tbWVudDIzMjExNTM0NQ== apatlpo 11750960 2016-07-12T17:18:38Z 2016-07-12T17:18:38Z CONTRIBUTOR

Along `time_counter`.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  mfdataset fails at chunking after opening 165104458
232114342 https://github.com/pydata/xarray/issues/896#issuecomment-232114342 https://api.github.com/repos/pydata/xarray/issues/896 MDEyOklzc3VlQ29tbWVudDIzMjExNDM0Mg== apatlpo 11750960 2016-07-12T17:15:14Z 2016-07-12T17:15:14Z CONTRIBUTOR

Thanks for your answer. You'll find an output from ncdump below, which should answer the part about data arrangement (I hope ...). Otherwise, we have:

dask      0.8.2    py27_0        defaults
xarray    0.7.2    py27_0        defaults
netcdf4   1.1.1    np18py27_0    defaults

Looking forward to any suggestions. Cheers, Aurelien

```
netcdf NATL60-MJM155_y2008m01d09.5d_gridT {
dimensions:
    x = 5422 ;
    y = 3454 ;
    deptht = 300 ;
    time_counter = UNLIMITED ; // (1 currently)
    time_bounds = 2 ;
variables:
    float nav_lat(y, x) ;
        nav_lat:axis = "Y" ;
        nav_lat:standard_name = "latitude" ;
        nav_lat:long_name = "Latitude" ;
        nav_lat:units = "degrees_north" ;
        nav_lat:nav_model = "grid_T" ;
        nav_lat:_Storage = "chunked" ;
        nav_lat:_ChunkSizes = 12, 5422 ;
    float nav_lon(y, x) ;
        nav_lon:axis = "X" ;
        nav_lon:standard_name = "longitude" ;
        nav_lon:long_name = "Longitude" ;
        nav_lon:units = "degrees_east" ;
        nav_lon:nav_model = "grid_T" ;
        nav_lon:_Storage = "chunked" ;
        nav_lon:_ChunkSizes = 12, 5422 ;
    float deptht(deptht) ;
        deptht:axis = "Z" ;
        deptht:long_name = "Vertical T levels" ;
        deptht:units = "m" ;
        deptht:positive = "down" ;
        deptht:_Storage = "chunked" ;
        deptht:_ChunkSizes = 300 ;
    float votemper(time_counter, deptht, y, x) ;
        votemper:long_name = "temperature" ;
        votemper:units = "degC" ;
        votemper:online_operation = "average" ;
        votemper:interval_operation = "40s" ;
        votemper:interval_write = "5d" ;
        votemper:_FillValue = 0.f ;
        votemper:missing_value = 0.f ;
        votemper:coordinates = "time_centered deptht nav_lon nav_lat" ;
        votemper:_Storage = "chunked" ;
        votemper:_ChunkSizes = 1, 1, 12, 5422 ;
        votemper:_DeflateLevel = 1 ;
    double time_centered(time_counter) ;
        time_centered:standard_name = "time" ;
        time_centered:long_name = "Time axis" ;
        time_centered:title = "Time" ;
        time_centered:calendar = "gregorian" ;
        time_centered:units = "seconds since 1958-01-01 00:00:00" ;
        time_centered:time_origin = "1958-01-01 00:00:00" ;
        time_centered:bounds = "time_centered_bounds" ;
        time_centered:_Storage = "chunked" ;
        time_centered:_ChunkSizes = 1 ;
    double time_centered_bounds(time_counter, time_bounds) ;
        time_centered_bounds:_Storage = "chunked" ;
        time_centered_bounds:_ChunkSizes = 1, 2 ;
    double time_counter(time_counter) ;
        time_counter:axis = "T" ;
        time_counter:standard_name = "time" ;
        time_counter:long_name = "Time axis" ;
        time_counter:title = "Time" ;
        time_counter:calendar = "gregorian" ;
        time_counter:units = "seconds since 1958-01-01 00:00:00" ;
        time_counter:time_origin = "1958-01-01 00:00:00" ;
        time_counter:bounds = "time_counter_bounds" ;
        time_counter:_Storage = "chunked" ;
        time_counter:_ChunkSizes = 1 ;
    double time_counter_bounds(time_counter, time_bounds) ;
        time_counter_bounds:_Storage = "chunked" ;
        time_counter_bounds:_ChunkSizes = 1, 2 ;
    float vosaline(time_counter, deptht, y, x) ;
        vosaline:long_name = "salinity" ;
        vosaline:units = "psu" ;
        vosaline:online_operation = "average" ;
        vosaline:interval_operation = "40s" ;
        vosaline:interval_write = "5d" ;
        vosaline:_FillValue = 0.f ;
        vosaline:missing_value = 0.f ;
        vosaline:coordinates = "time_centered deptht nav_lon nav_lat" ;
        vosaline:_Storage = "chunked" ;
        vosaline:_ChunkSizes = 1, 1, 12, 5422 ;
        vosaline:_DeflateLevel = 1 ;
    float sossheig(time_counter, y, x) ;
        sossheig:long_name = "sea surface height" ;
        sossheig:units = "m" ;
        sossheig:online_operation = "average" ;
        sossheig:interval_operation = "40s" ;
        sossheig:interval_write = "5d" ;
        sossheig:_FillValue = 0.f ;
        sossheig:missing_value = 0.f ;
        sossheig:coordinates = "time_centered nav_lon nav_lat" ;
        sossheig:_Storage = "chunked" ;
        sossheig:_ChunkSizes = 1, 12, 5422 ;

// global attributes:
        :description = "ocean T grid variables" ;
        :conventions = "CF-1.1" ;
        :production = "An IPSL model" ;
        :start_date = 20040101 ;
        :output_frequency = "5d" ;
        :CONFIG = "NATL60" ;
        :CASE = "MJM155" ;
        :_Format = "netCDF-4" ;
}
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  mfdataset fails at chunking after opening 165104458


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
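For completeness, a sketch of reproducing this page's query against a local copy of the database using Python's sqlite3 module; the filename github.db is an illustrative assumption:

```
import sqlite3

conn = sqlite3.connect("github.db")  # hypothetical local copy of the database
rows = conn.execute(
    """
    SELECT id, created_at, substr(body, 1, 60)
    FROM issue_comments
    WHERE user = 11750960
    ORDER BY updated_at DESC
    """
).fetchall()
for row in rows:
    print(row)
```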