html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/4043#issuecomment-1065536538,https://api.github.com/repos/pydata/xarray/issues/4043,1065536538,IC_kwDOAMm_X84_gswa,8419421,2022-03-11T21:16:59Z,2022-03-11T21:16:59Z,NONE,"I believe I am experiencing a similar issue, although with code that I thought was smart enough to chunk the data request into smaller pieces:
```
import numpy as np
import xarray as xr
from dask.diagnostics import ProgressBar
import intake

wrf_url = ('https://rda.ucar.edu/thredds/catalog/files/g/ds612.0/'
           'PGW3D/2006/catalog.xml')
catalog_u = intake.open_thredds_merged(wrf_url, path=['*_U_2006060*'])
catalog_v = intake.open_thredds_merged(wrf_url, path=['*_V_2006060*'])
ds_u = catalog_u.to_dask()
ds_u['U'] = ds_u.U.chunk(""auto"")
ds_v = catalog_v.to_dask()
ds_v['V'] = ds_v.V.chunk(""auto"")
ds = xr.merge((ds_u, ds_v))

def unstagger(ds, var, coord, new_coord):
    # Average adjacent points along the staggered dimension onto the mass grid
    var1 = ds[var].isel({coord: slice(None, -1)})
    var2 = ds[var].isel({coord: slice(1, None)})
    return ((var1 + var2) / 2).rename({coord: new_coord})

with ProgressBar():
    ds['U_unstaggered'] = unstagger(ds, 'U', 'west_east_stag', 'west_east')
    ds['V_unstaggered'] = unstagger(ds, 'V', 'south_north_stag', 'south_north')
    ds['speed'] = np.hypot(ds.U_unstaggered, ds.V_unstaggered)
    ds.speed.isel(bottom_top=10).sel(Time='2006-06-07T18:00').plot()
```
This throws an error because, according to the RDA help folks, a request for an entire variable is made, which far exceeds their server's 500 MB request limit:
```
rda.ucar.edu/thredds/dodsC/files/g/ds612.0/PGW3D/2006/wrf3d_d01_PGW_U_20060607.nc.dods?U%5B0:1:7%5D%5B0:1:49%5D%5B0:1:1014%5D%5B0:1:1359%5D
```
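For what it's worth, a back-of-the-envelope size for that request (assuming the variable is stored as 4-byte floats, which I have not verified):
```
# U[0:1:7][0:1:49][0:1:1014][0:1:1359], assuming float32
nbytes = 8 * 50 * 1015 * 1360 * 4
print(f'{nbytes / 1e9:.1f} GB')  # ~2.2 GB, far beyond a 500 MB limit
```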
Here's the error:
```
Traceback (most recent call last):
File ""/home/decker/classes/met325/rda_plot.py"", line 29, in
ds.speed.isel(bottom_top=10).sel(Time='2006-06-07T18:00').plot()
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/plot/plot.py"", line 862, in __call__
return plot(self._da, **kwargs)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/plot/plot.py"", line 293, in plot
darray = darray.squeeze().compute()
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/core/dataarray.py"", line 951, in compute
return new.load(**kwargs)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/core/dataarray.py"", line 925, in load
ds = self._to_temp_dataset().load(**kwargs)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/core/dataset.py"", line 862, in load
evaluated_data = da.compute(*lazy_data.values(), **kwargs)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/dask/base.py"", line 571, in compute
results = schedule(dsk, keys, **kwargs)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/dask/threaded.py"", line 79, in get
results = get_async(
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/dask/local.py"", line 507, in get_async
raise_exception(exc, tb)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/dask/local.py"", line 315, in reraise
raise exc
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/dask/local.py"", line 220, in execute_task
result = _execute_task(task, data)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/dask/core.py"", line 119, in _execute_task
return func(*(_execute_task(a, cache) for a in args))
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/dask/array/core.py"", line 116, in getter
c = np.asarray(c)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/core/indexing.py"", line 357, in __array__
return np.asarray(self.array, dtype=dtype)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/core/indexing.py"", line 521, in __array__
return np.asarray(self.array, dtype=dtype)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/core/indexing.py"", line 422, in __array__
return np.asarray(array[self.key], dtype=None)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/conventions.py"", line 62, in __getitem__
return np.asarray(self.array[key], dtype=self.dtype)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/core/indexing.py"", line 422, in __array__
return np.asarray(array[self.key], dtype=None)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/backends/pydap_.py"", line 39, in __getitem__
return indexing.explicit_indexing_adapter(
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/core/indexing.py"", line 711, in explicit_indexing_adapter
result = raw_indexing_method(raw_key.tuple)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/backends/pydap_.py"", line 47, in _getitem
result = robust_getitem(array, key, catch=ValueError)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/xarray/backends/common.py"", line 64, in robust_getitem
return array[key]
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/pydap/model.py"", line 323, in __getitem__
out.data = self._get_data_index(index)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/pydap/model.py"", line 353, in _get_data_index
return self._data[index]
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/pydap/handlers/dap.py"", line 170, in __getitem__
raise_for_status(r)
File ""/home/decker/local/miniconda3/envs/met325/lib/python3.10/site-packages/pydap/net.py"", line 38, in raise_for_status
raise HTTPError(
webob.exc.HTTPError: 403 403
```
I thought smaller requests would automagically happen with this code. Is it intended that a large request be made?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-657136785,https://api.github.com/repos/pydata/xarray/issues/4043,657136785,MDEyOklzc3VlQ29tbWVudDY1NzEzNjc4NQ==,221526,2020-07-11T22:01:55Z,2020-07-11T22:01:55Z,CONTRIBUTOR,Probably worth raising upstream with the THREDDS team. I do wonder if there are some issues with the chunking/compression of the native .nc files at play here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-628484954,https://api.github.com/repos/pydata/xarray/issues/4043,628484954,MDEyOklzc3VlQ29tbWVudDYyODQ4NDk1NA==,48764870,2020-05-14T08:37:43Z,2020-05-14T08:37:43Z,NONE,"We tried several times with this configuration in the THREDDS server (binLimit raised to 2000 MB):
```
<ascLimit>50</ascLimit>
<binLimit>2000</binLimit>
<serviceVersion>opendap/3.7</serviceVersion>
```
But when we request a chunk bigger than time=500 MB, the error appears: `RuntimeError: NetCDF: Access failure`
> You might want to experiment with smaller chunks.
I tried with 50MB and the elapsed time was huge.
Local Network - Elapsed time: 0.5819 minutes
OpenDAP - Elapsed time: 37.1448 minutes
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-628016841,https://api.github.com/repos/pydata/xarray/issues/4043,628016841,MDEyOklzc3VlQ29tbWVudDYyODAxNjg0MQ==,1197350,2020-05-13T14:13:06Z,2020-05-13T14:13:06Z,MEMBER,"> Using this time chunk of 500 MB, the code runs properly, but it is really slow compared with the response over the local network.
You might want to experiment with smaller chunks.
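For example, something like this (the exact chunk size is a guess; tune it against the server limit):
```python
ds = xr.open_mfdataset(urls, chunks={'time': '50MB'})
```
Each dask chunk then becomes its own, smaller opendap request.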
In general, opendap will always introduce overhead compared to direct file access.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-627882905,https://api.github.com/repos/pydata/xarray/issues/4043,627882905,MDEyOklzc3VlQ29tbWVudDYyNzg4MjkwNQ==,48764870,2020-05-13T10:01:08Z,2020-05-13T10:01:08Z,NONE,"I followed your recommendations @rabernat, please see my test code below.
```python
import xarray as xr
import os
from datetime import datetime, timedelta
import pandas as pd
import shutil
import numpy as np
import time

lonlat_box = [-4.5, -2.5, 44, 45]

# ERA5 IHdata - Local
# -------------------
ds = xr.open_mfdataset(['raw/Wind_ERA5_Global_1998.05.nc', 'raw/Wind_ERA5_Global_1998.06.nc'])
ds = ds.get('u')
# from 0º,360º to -180º,180º
ds['lon'] = (ds.lon + 180) % 360 - 180
# lat is upside down --> sort ascending
ds = ds.sortby(['lon', 'lat'])
# Make the selection
ds = ds.sel(lon=slice(lonlat_box[0], lonlat_box[1]),
            lat=slice(lonlat_box[2], lonlat_box[3]))
print(ds)

tic = time.perf_counter()
df = ds.to_dataframe()
toc = time.perf_counter()
print(f""\nLocal Network - Elapsed time: {(toc - tic)/60:0.4f} minutes\n\n"")
del ds, df

# ERA5 IHdata - Opendap
# ---------------------
ds = xr.open_mfdataset(['http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global/Wind_ERA5_Global_1998.05.nc',
                        'http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global/Wind_ERA5_Global_1998.06.nc'],
                       chunks={'time': '500MB'})
ds = ds.get('u')
# from 0º,360º to -180º,180º
ds['lon'] = (ds.lon + 180) % 360 - 180
# lat is upside down --> sort ascending
ds = ds.sortby(['lon', 'lat'])
# Make the selection
ds = ds.sel(lon=slice(lonlat_box[0], lonlat_box[1]),
            lat=slice(lonlat_box[2], lonlat_box[3]))
print(ds)

tic = time.perf_counter()
df = ds.to_dataframe()
toc = time.perf_counter()
print(f""\n OpenDAP - Elapsed time: {(toc - tic)/60:0.4f} minutes\n\n"")
del ds, df
```
Result:
```ipython
dask.array
Coordinates:
* lon (lon) float32 -4.5 -4.25 -4.0 -3.75 -3.5 -3.25 -3.0 -2.75 -2.5
* lat (lat) float32 44.0 44.25 44.5 44.75 45.0
* time (time) datetime64[ns] 1998-05-01 ... 1998-06-30T23:00:00
Attributes:
units: m s**-1
long_name: 10 metre U wind component
Local Network - Elapsed time: 0.4037 minutes
dask.array
Coordinates:
* lon (lon) float32 -4.5 -4.25 -4.0 -3.75 -3.5 -3.25 -3.0 -2.75 -2.5
* lat (lat) float32 44.0 44.25 44.5 44.75 45.0
* time (time) datetime64[ns] 1998-05-01 ... 1998-06-30T23:00:00
Attributes:
units: m s**-1
long_name: 10 metre U wind component
OpenDAP - Elapsed time: 8.1971 minutes
```
Using this time chunk of 500 MB, the code runs properly, but it is really slow compared with the response over the local network. I will work with our IT team to raise this limit in the OPeNDAP configuration to a more reasonable value.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-627387025,https://api.github.com/repos/pydata/xarray/issues/4043,627387025,MDEyOklzc3VlQ29tbWVudDYyNzM4NzAyNQ==,1197350,2020-05-12T14:38:37Z,2020-05-12T14:38:37Z,MEMBER,"> Just for my understanding: is it theoretically impossible to make big requests without using chunking?
This depends entirely on the TDS server configuration. See comment in https://github.com/Unidata/netcdf-c/issues/1667#issuecomment-597372065. The default limit appears to be 500 MB.
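If I recall the TDS docs correctly, the relevant knob is the `binLimit` element (in MB) in the `Opendap` section of `threddsConfig.xml`; roughly, the default looks like:
```xml
<Opendap>
  <ascLimit>50</ascLimit>
  <binLimit>500</binLimit>
  <serviceVersion>opendap/3.7</serviceVersion>
</Opendap>
```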
It's important to note that _none of this_ has to do with xarray. Xarray is simply the top layer of a very deep software stack. If the TDS server could deliver larger data requests, and the netCDF4-python library could accept them, xarray would have no problem.","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-627375551,https://api.github.com/repos/pydata/xarray/issues/4043,627375551,MDEyOklzc3VlQ29tbWVudDYyNzM3NTU1MQ==,48764870,2020-05-12T14:19:24Z,2020-05-12T14:19:24Z,NONE,"@rabernat - Thank you! I will review the code (thank you for the extra comments, I really appreciate that) and follow your instructions to test the chunk size.
Just for my understanding: is it theoretically impossible to make big requests without using chunking? The THREDDS server is under our management, and we want to know whether these errors can be solved through some specific configuration of the service.
Thank you in advance!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-627368616,https://api.github.com/repos/pydata/xarray/issues/4043,627368616,MDEyOklzc3VlQ29tbWVudDYyNzM2ODYxNg==,1197350,2020-05-12T14:07:39Z,2020-05-12T14:07:39Z,MEMBER,"I have spent plenty of time debugging these sorts of issues. It really helps to take xarray out of the equation. Try making your request with just the netCDF4 library--that's all that xarray uses under the hood. Overall your example is very complicated, which makes it hard to find the core issue.
You generally want to try something like this:
```python
import netCDF4
ncds = netCDF4.Dataset(OPENDAP_url)
data = ncds[variable_name][:]
```
Try playing around with the slice `[:]` to see under what circumstances the opendap server fails. Then use chunking in xarray to limit the size of each individual request. That's what's described in pangeo-data/pangeo#767.
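Concretely, something along these lines (the chunk sizes here are made up; pick them so each chunk stays below the server's limit):
```python
import xarray as xr

# each dask chunk becomes a separate opendap request
ds = xr.open_dataset(OPENDAP_url, chunks={'time': 24})
data = ds[variable_name].isel(time=slice(0, 48)).load()  # two modest requests instead of one giant one
```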
A few additional comments about your code:
```python
# Select spatial subset [lon,lat]
ds = ds.where((ds.lon >= Lon[0] - dl) & (ds.lon <= Lon[1] + dl) & (ds.lat >= Lat[0] - dl) & (ds.lat <= Lat[1] + dl), drop=True)
```
This is **NOT** how you do subsetting with xarray. `where` is meant for masking. I recommend reviewing the xarray docs on [indexing and selecting](http://xarray.pydata.org/en/stable/indexing.html). Your call should be something like:
```python
ds = ds.sel(lon=slice(...), lat=slice(...))
```
What's the difference? `where` downloads all of the data from the opendap server and then fills it with NaNs outside of your selection, while `sel` lazily limits the size of the request from the opendap server. This could make a big difference in terms of the server's memory usage.
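One way to see the difference without downloading anything (the coordinates here are made up):
```python
ds = xr.open_dataset(OPENDAP_url)  # lazy; nothing transferred yet
subset = ds.sel(lon=slice(-4.5, -2.5), lat=slice(44.0, 45.0))
print(subset.nbytes / 1e6, 'MB')  # size of what would actually be requested
```
With `where`, the full arrays have to come over the wire before the mask is applied.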
```python
ds = ds.sortby('lon', 'lat')
```
Can you do this sorting *after* loading the data? It's an expensive operation and might not interact well with the opendap server.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-627363191,https://api.github.com/repos/pydata/xarray/issues/4043,627363191,MDEyOklzc3VlQ29tbWVudDYyNzM2MzE5MQ==,48764870,2020-05-12T13:58:26Z,2020-05-12T13:58:26Z,NONE,"Thank you @dcherian. We know that if the request is small it works fine, but we want to make big requests for data. Is there any limitation when using OPeNDAP?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-627357616,https://api.github.com/repos/pydata/xarray/issues/4043,627357616,MDEyOklzc3VlQ29tbWVudDYyNzM1NzYxNg==,2448579,2020-05-12T13:48:49Z,2020-05-12T13:48:49Z,MEMBER,I would check your server logs if you can. Or avoid xarray and try with lower level pydap / netCDF4. This may be useful: https://github.com/pangeo-data/pangeo/issues/767. Maybe you're requesting too much data?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-627346640,https://api.github.com/repos/pydata/xarray/issues/4043,627346640,MDEyOklzc3VlQ29tbWVudDYyNzM0NjY0MA==,48764870,2020-05-12T13:30:39Z,2020-05-12T13:30:39Z,NONE,"Thank you @ocefpaf!
But it raised the same error. I also tried to load the ""u"" variable with MATLAB's ncread through OPeNDAP, and it also failed! So maybe it is not a problem related to Python...? I am very confused!
```
Loading files:
http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global/Wind_ERA5_Global_1998.05.nc
http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global/Wind_ERA5_Global_1998.06.nc
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
d:\2020_REPSOL\Codigos_input_TESEO\user_script.py in <module>
58 # )
59
---> 60 ERA5_windIHData2txt_TESEO(lonlat_box=[-4.5, -2.5, 44, 45],
61 date_ini=datetime(1998, 5, 28, 0),
62 date_end=datetime(1998, 6, 1, 12),
d:\2020_REPSOL\Codigos_input_TESEO\TESEOtools_v0.py in ERA5_windIHData2txt_TESEO(***failed resolving arguments***)
826
827 # From xarray to dataframe
--> 828 df = ds.to_dataframe().reset_index()
829 del ds
830 print('[Processing currents 2D...]')
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\dataset.py in to_dataframe(self)
4503 this dataset's indices.
4504 """"""
-> 4505 return self._to_dataframe(self.dims)
4506
4507 def _set_sparse_data_from_dataframe(
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\dataset.py in _to_dataframe(self, ordered_dims)
4489 def _to_dataframe(self, ordered_dims):
4490 columns = [k for k in self.variables if k not in self.dims]
-> 4491 data = [
4492 self._variables[k].set_dims(ordered_dims).values.reshape(-1)
4493 for k in columns
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\dataset.py in <listcomp>(.0)
4490 columns = [k for k in self.variables if k not in self.dims]
4491 data = [
-> 4492 self._variables[k].set_dims(ordered_dims).values.reshape(-1)
4493 for k in columns
4494 ]
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\variable.py in values(self)
444 def values(self):
445 """"""The variable's data as a numpy.ndarray""""""
--> 446 return _as_array_or_item(self._data)
447
448 @values.setter
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\variable.py in _as_array_or_item(data)
247 TODO: remove this (replace with np.asarray) once these issues are fixed
248 """"""
--> 249 data = np.asarray(data)
250 if data.ndim == 0:
251 if data.dtype.kind == ""M"":
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order)
83
84 """"""
---> 85 return array(a, dtype, copy=False, order=order)
86
87
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\array\core.py in __array__(self, dtype, **kwargs)
1334
1335 def __array__(self, dtype=None, **kwargs):
-> 1336 x = self.compute()
1337 if dtype and x.dtype != dtype:
1338 x = x.astype(dtype)
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\base.py in compute(self, **kwargs)
164 dask.base.compute
165 """"""
--> 166 (result,) = compute(self, traverse=False, **kwargs)
167 return result
168
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\base.py in compute(*args, **kwargs)
442 postcomputes.append(x.__dask_postcompute__())
443
--> 444 results = schedule(dsk, keys, **kwargs)
445 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
446
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\threaded.py in get(dsk, result, cache, num_workers, pool, **kwargs)
74 pools[thread][num_workers] = pool
75
---> 76 results = get_async(
77 pool.apply_async,
78 len(pool._pool),
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs)
484 _execute_task(task, data) # Re-execute locally
485 else:
--> 486 raise_exception(exc, tb)
487 res, worker_id = loads(res_info)
488 state[""cache""][key] = res
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\local.py in reraise(exc, tb)
314 if exc.__traceback__ is not tb:
315 raise exc.with_traceback(tb)
--> 316 raise exc
317
318
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\local.py in execute_task(key, task_info, dumps, loads, get_id, pack_exception)
220 try:
221 task, data = loads(task_info)
--> 222 result = _execute_task(task, data)
223 id = get_id()
224 result = dumps((result, id))
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk)
119 # temporaries by their reference count and can execute certain
120 # operations in-place.
--> 121 return func(*(_execute_task(a, cache) for a in args))
122 elif not ishashable(arg):
123 return arg
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\core.py in <genexpr>(.0)
119 # temporaries by their reference count and can execute certain
120 # operations in-place.
--> 121 return func(*(_execute_task(a, cache) for a in args))
122 elif not ishashable(arg):
123 return arg
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk)
119 # temporaries by their reference count and can execute certain
120 # operations in-place.
--> 121 return func(*(_execute_task(a, cache) for a in args))
122 elif not ishashable(arg):
123 return arg
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\core.py in <genexpr>(.0)
119 # temporaries by their reference count and can execute certain
120 # operations in-place.
--> 121 return func(*(_execute_task(a, cache) for a in args))
122 elif not ishashable(arg):
123 return arg
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk)
119 # temporaries by their reference count and can execute certain
120 # operations in-place.
--> 121 return func(*(_execute_task(a, cache) for a in args))
122 elif not ishashable(arg):
123 return arg
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\core.py in <genexpr>(.0)
119 # temporaries by their reference count and can execute certain
120 # operations in-place.
--> 121 return func(*(_execute_task(a, cache) for a in args))
122 elif not ishashable(arg):
123 return arg
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk)
119 # temporaries by their reference count and can execute certain
120 # operations in-place.
--> 121 return func(*(_execute_task(a, cache) for a in args))
122 elif not ishashable(arg):
123 return arg
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\dask\array\core.py in getter(a, b, asarray, lock)
98 c = a[b]
99 if asarray:
--> 100 c = np.asarray(c)
101 finally:
102 if lock:
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order)
83
84 """"""
---> 85 return array(a, dtype, copy=False, order=order)
86
87
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype)
489
490 def __array__(self, dtype=None):
--> 491 return np.asarray(self.array, dtype=dtype)
492
493 def __getitem__(self, key):
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order)
83
84 """"""
---> 85 return array(a, dtype, copy=False, order=order)
86
87
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype)
651
652 def __array__(self, dtype=None):
--> 653 return np.asarray(self.array, dtype=dtype)
654
655 def __getitem__(self, key):
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order)
83
84 """"""
---> 85 return array(a, dtype, copy=False, order=order)
86
87
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype)
555 def __array__(self, dtype=None):
556 array = as_indexable(self.array)
--> 557 return np.asarray(array[self.key], dtype=None)
558
559 def transpose(self, order):
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order)
83
84 """"""
---> 85 return array(a, dtype, copy=False, order=order)
86
87
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\coding\variables.py in __array__(self, dtype)
70
71 def __array__(self, dtype=None):
---> 72 return self.func(self.array)
73
74 def __repr__(self):
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\coding\variables.py in _scale_offset_decoding(data, scale_factor, add_offset, dtype)
216
217 def _scale_offset_decoding(data, scale_factor, add_offset, dtype):
--> 218 data = np.array(data, dtype=dtype, copy=True)
219 if scale_factor is not None:
220 data *= scale_factor
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\coding\variables.py in __array__(self, dtype)
70
71 def __array__(self, dtype=None):
---> 72 return self.func(self.array)
73
74 def __repr__(self):
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\coding\variables.py in _apply_mask(data, encoded_fill_values, decoded_fill_value, dtype)
136 ) -> np.ndarray:
137 """"""Mask all matching values in a NumPy arrays.""""""
--> 138 data = np.asarray(data, dtype=dtype)
139 condition = False
140 for fv in encoded_fill_values:
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\numpy\core\_asarray.py in asarray(a, dtype, order)
83
84 """"""
---> 85 return array(a, dtype, copy=False, order=order)
86
87
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype)
555 def __array__(self, dtype=None):
556 array = as_indexable(self.array)
--> 557 return np.asarray(array[self.key], dtype=None)
558
559 def transpose(self, order):
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\backends\netCDF4_.py in __getitem__(self, key)
70
71 def __getitem__(self, key):
---> 72 return indexing.explicit_indexing_adapter(
73 key, self.shape, indexing.IndexingSupport.OUTER, self._getitem
74 )
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\core\indexing.py in explicit_indexing_adapter(key, shape, indexing_support, raw_indexing_method)
835 """"""
836 raw_key, numpy_indices = decompose_indexer(key, shape, indexing_support)
--> 837 result = raw_indexing_method(raw_key.tuple)
838 if numpy_indices.tuple:
839 # index the loaded np.ndarray
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\backends\netCDF4_.py in _getitem(self, key)
83 with self.datastore.lock:
84 original_array = self.get_array(needs_lock=False)
---> 85 array = getitem(original_array, key)
86 except IndexError:
87 # Catch IndexError in netCDF4 and return a more informative
~\AppData\Local\Continuum\miniconda3\envs\TEST\lib\site-packages\xarray\backends\common.py in robust_getitem(array, key, catch, max_retries, initial_delay)
52 for n in range(max_retries + 1):
53 try:
---> 54 return array[key]
55 except catch:
56 if n == max_retries:
netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Variable.__getitem__()
netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Variable._get()
netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()
RuntimeError: NetCDF: Access failure
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-627326097,https://api.github.com/repos/pydata/xarray/issues/4043,627326097,MDEyOklzc3VlQ29tbWVudDYyNzMyNjA5Nw==,950575,2020-05-12T12:58:16Z,2020-05-12T12:58:16Z,CONTRIBUTOR,"> I installed xarray through the command recommended on the official website in my miniconda env some months to a year ago
That is probably it then. I see you have `libnetcdf 4.6.2`; if you recreate that env you should get `libnetcdf 4.7.4`. Can you try it with a new clean env:
```shell
conda create --name TEST --channel conda-forge xarray dask netCDF4 bottleneck
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-625675263,https://api.github.com/repos/pydata/xarray/issues/4043,625675263,MDEyOklzc3VlQ29tbWVudDYyNTY3NTI2Mw==,48764870,2020-05-08T07:16:47Z,2020-05-08T09:10:13Z,NONE,"Thank you @ocefpaf,
I installed xarray through the command recommended on the official website in my miniconda env some months to a year ago:
```shell
conda install -c conda-forge xarray dask netCDF4 bottleneck
```
I list my versions below:
```
INSTALLED VERSIONS
------------------
commit: None
python: 3.6.7 (default, Feb 28 2019, 07:28:18) [MSC v.1900 64 bit (AMD64)]
python-bits: 64
OS: Windows
OS-release: 10
machine: AMD64
processor: Intel64 Family 6 Model 42 Stepping 7, GenuineIntel
byteorder: little
LC_ALL: None
LANG: None
LOCALE: None.None
libhdf5: 1.10.4
libnetcdf: 4.6.2
xarray: 0.12.1
pandas: 0.24.2
numpy: 1.16.3
scipy: 1.2.1
netCDF4: 1.5.1.2
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.0.3.4
nc_time_axis: 1.2.0
PseudonetCDF: None
rasterio: None
cfgrib: 0.9.6.2
iris: None
bottleneck: None
dask: 1.1.5
distributed: 1.28.1
matplotlib: 3.0.3
cartopy: 0.16.0
seaborn: None
setuptools: 41.0.1
pip: 19.1.1
conda: 4.8.2
pytest: None
IPython: 7.5.0
sphinx: None
```
I've just created a new environment with Python 3.7 and all the latest versions, and the result is the same error. I list this new environment below as well:
```
INSTALLED VERSIONS
------------------
commit: None
python: 3.7.7 (default, May 6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)]
python-bits: 64
OS: Windows
OS-release: 10
machine: AMD64
processor: Intel64 Family 6 Model 42 Stepping 7, GenuineIntel
byteorder: little
LC_ALL: None
LANG: None
LOCALE: None.None
libhdf5: 1.10.4
libnetcdf: 4.7.3
xarray: 0.15.1
pandas: 1.0.3
numpy: 1.18.1
scipy: 1.4.1
netCDF4: 1.5.3
pydap: installed
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.1.2
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.2
dask: 2.15.0
distributed: 2.15.2
matplotlib: None
cartopy: None
seaborn: None
numbagg: None
setuptools: 46.1.3.post20200330
pip: 20.0.2
conda: None
pytest: None
IPython: 7.13.0
sphinx: None
```
Thank you in advance!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-625426383,https://api.github.com/repos/pydata/xarray/issues/4043,625426383,MDEyOklzc3VlQ29tbWVudDYyNTQyNjM4Mw==,950575,2020-05-07T18:35:20Z,2020-05-07T18:35:20Z,CONTRIBUTOR,"How are you installing `netcdf4`? There was a problem with the underlying `libnetcdf` some time ago that caused access failures like that. You can try upgrading it or using another backend, like `pydap`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-625330036,https://api.github.com/repos/pydata/xarray/issues/4043,625330036,MDEyOklzc3VlQ29tbWVudDYyNTMzMDAzNg==,48764870,2020-05-07T15:36:15Z,2020-05-07T15:36:15Z,NONE,"Totally agree.
From my code, the list of URLs is:
```
Loading files:
http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global/Wind_ERA5_Global_1998.05.nc
http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global/Wind_ERA5_Global_1998.06.nc
```
and through the web browser, for those dates, I can copy and paste these:
```
http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global/Wind_ERA5_Global_1998.05.nc
http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global/Wind_ERA5_Global_1998.06.nc
```
So I think the URL is properly constructed; indeed, if I select only the longitude variable, which is quite small, I can perform the `ds.to_dataframe()` method... so I think the URL is fine!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170
https://github.com/pydata/xarray/issues/4043#issuecomment-625325400,https://api.github.com/repos/pydata/xarray/issues/4043,625325400,MDEyOklzc3VlQ29tbWVudDYyNTMyNTQwMA==,2448579,2020-05-07T15:28:21Z,2020-05-07T15:28:21Z,MEMBER,"It's unfortunate that we don't print filenames when access fails.
Are you sure all the URLs you construct are actually valid?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,614144170