html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/4082#issuecomment-1094072825,https://api.github.com/repos/pydata/xarray/issues/4082,1094072825,IC_kwDOAMm_X85BNjn5,5635139,2022-04-09T15:51:46Z,2022-04-09T15:51:46Z,MEMBER,I'm trying to close issues that won't lead to changes — please reopen with an MCVE if this is still an issue.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-642905079,https://api.github.com/repos/pydata/xarray/issues/4082,642905079,MDEyOklzc3VlQ29tbWVudDY0MjkwNTA3OQ==,14808389,2020-06-11T20:15:54Z,2020-06-11T21:26:26Z,MEMBER,"the `file_cache_maxsize` option controls how many ""files"" are kept open simultaneously (see the uses of `self._cache` in https://github.com/pydata/xarray/blob/4071125feedee690364272e8fde9b94866f85bc7/xarray/backends/file_manager.py#L50), so this might be an issue in `xarray` that only happens for `netcdf` (?) on Windows.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
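For readers unfamiliar with the cache discussed in the comment above: xarray's file manager keeps open file handles in a least-recently-used cache and closes the oldest handle once the cache is full. The following is a minimal illustrative sketch of that pattern, not xarray's actual implementation (see `xarray/backends/file_manager.py` and `lru_cache.py` for the real code).

```python
# Minimal sketch of an LRU cache that closes file handles on eviction,
# illustrating (not reproducing) what xarray's CachingFileManager does.
from collections import OrderedDict

class LRUFileCache:
    def __init__(self, maxsize=128):  # mirrors the file_cache_maxsize default
        self.maxsize = maxsize
        self._cache = OrderedDict()

    def get(self, key, opener):
        if key in self._cache:
            self._cache.move_to_end(key)  # mark as most recently used
            return self._cache[key]
        handle = opener()                 # e.g. netCDF4.Dataset(url)
        self._cache[key] = handle
        if len(self._cache) > self.maxsize:
            # Evict the least recently used handle and close it.
            _, evicted = self._cache.popitem(last=False)
            evicted.close()
        return handle
```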
https://github.com/pydata/xarray/issues/4082#issuecomment-642896196,https://api.github.com/repos/pydata/xarray/issues/4082,642896196,MDEyOklzc3VlQ29tbWVudDY0Mjg5NjE5Ng==,579593,2020-06-11T19:54:49Z,2020-06-11T19:54:49Z,NONE,"@rsignell-usgs no, netcdf4-python does not do any caching.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-642841283,https://api.github.com/repos/pydata/xarray/issues/4082,642841283,MDEyOklzc3VlQ29tbWVudDY0Mjg0MTI4Mw==,1872600,2020-06-11T17:58:30Z,2020-06-11T18:00:28Z,NONE,"@jswhit, do you know if https://github.com/Unidata/netcdf4-python is doing the caching?
Just to catch you up quickly: we have a workflow that opens a bunch of OPeNDAP datasets, and while the default `file_cache_maxsize=128` works on Linux, anything above 25 files fails on Windows:
```
xr.set_options(file_cache_maxsize=25) # works
#xr.set_options(file_cache_maxsize=26) # fails
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-641468791,https://api.github.com/repos/pydata/xarray/issues/4082,641468791,MDEyOklzc3VlQ29tbWVudDY0MTQ2ODc5MQ==,905179,2020-06-09T17:40:16Z,2020-06-09T17:40:16Z,NONE,"I do not know because I do not understand who is doing the caching.
The above archive reference is no longer relevant because the DAP2 code now uses an in-memory file rather than something in /tmp.
Netcdf-c keeps its curl connections open until nc_close is called.
I would assume that each curl connection holds at least one file descriptor open.
But is the cache that shows the problem a Python-maintained cache, or a Windows cache of some sort? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
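One way to probe the question above (is it a Python-level cache or an OS-level handle limit?) is to count the OS handles held by the process as datasets are opened. A sketch, assuming the third-party `psutil` package is installed; the URL is one quoted earlier in this thread.

```python
import psutil
import xarray as xr

proc = psutil.Process()

def open_handle_count():
    # psutil exposes num_handles() on Windows and num_fds() on Unix.
    return proc.num_handles() if hasattr(proc, "num_handles") else proc.num_fds()

before = open_handle_count()
ds = xr.open_dataset(
    "https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.0/"
    "AVHRR/201703/avhrr-only-v2.20170322.nc"
)
print("handles held after open:", open_handle_count() - before)
```

If the handle count keeps growing across repeated opens only on Windows, that would point at an OS-level limit rather than at xarray's cache logic.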
https://github.com/pydata/xarray/issues/4082#issuecomment-641236117,https://api.github.com/repos/pydata/xarray/issues/4082,641236117,MDEyOklzc3VlQ29tbWVudDY0MTIzNjExNw==,1872600,2020-06-09T11:42:38Z,2020-06-09T11:42:38Z,NONE,"@DennisHeimbigner , do you not agree that this issue on windows is related to the number of files cached from OPeNDAP requests? Clearly there are some differences with cache files on windows: https://www.unidata.ucar.edu/support/help/MailArchives/netcdf/msg11190.html ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-640871586,https://api.github.com/repos/pydata/xarray/issues/4082,640871586,MDEyOklzc3VlQ29tbWVudDY0MDg3MTU4Ng==,905179,2020-06-08T20:34:30Z,2020-06-08T20:34:43Z,NONE,"So I tried to duplicate this using Cygwin with the latest netcdf master and ncdump.
It seems to work OK. But this raises a question: can someone try this command under Windows to see if it fails?
> ncdump 'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.0/AVHRR/201703/avhrr-only-v2.20170322.nc'
If it succeeds, it may mean the problem is with Python rather than netcdf.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
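The same isolation test can be run from Python without involving xarray's cache at all, by opening the URL with netcdf4-python directly and holding many datasets open at once, as xarray's cache would. A sketch using the URL from the ncdump command above; if this also fails around the same count, the problem is below xarray.

```python
import netCDF4

url = ("https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.0/"
       "AVHRR/201703/avhrr-only-v2.20170322.nc")

# Keep 30 datasets open simultaneously, bypassing xarray entirely.
datasets = [netCDF4.Dataset(url) for _ in range(30)]  # read-only by default
print("opened", len(datasets), "handles without error")
for ds in datasets:
    ds.close()
```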
https://github.com/pydata/xarray/issues/4082#issuecomment-640815050,https://api.github.com/repos/pydata/xarray/issues/4082,640815050,MDEyOklzc3VlQ29tbWVudDY0MDgxNTA1MA==,905179,2020-06-08T19:05:44Z,2020-06-08T19:08:34Z,NONE,"BTW, what version of the netcdf-c library is being used?
I see this in an above comment: `netcdf4: 1.5.3`.
But that cannot possibly be correct.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
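The `netCDF4: 1.5.3` figure in the earlier environment listing is the version of the netcdf4-python wrapper, not of the netcdf-c library, which is presumably why it looks impossible as a C library version. The wrapper reports the library versions it was built against through module attributes:

```python
import netCDF4

print("netcdf4-python:", netCDF4.__version__)         # e.g. 1.5.3 (the wrapper)
print("netcdf-c:", netCDF4.__netcdf4libversion__)     # the underlying C library
print("hdf5:", netCDF4.__hdf5libversion__)
```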
https://github.com/pydata/xarray/issues/4082#issuecomment-640813885,https://api.github.com/repos/pydata/xarray/issues/4082,640813885,MDEyOklzc3VlQ29tbWVudDY0MDgxMzg4NQ==,905179,2020-06-08T19:03:28Z,2020-06-08T19:03:28Z,NONE,"I agree. To be more precise, NC_EPERM is generally thrown when an attempt is made to modify a read-only file.
So it is possible that it isn't the DAP2 code, but that somewhere an attempt is being made to modify the dataset.
There are pieces of the netcdf-c library that are conditional on Windows.
It might be interesting if anyone can check whether this occurs under Cygwin.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-640808125,https://api.github.com/repos/pydata/xarray/issues/4082,640808125,MDEyOklzc3VlQ29tbWVudDY0MDgwODEyNQ==,1872600,2020-06-08T18:51:37Z,2020-06-08T18:51:37Z,NONE,"@DennisHeimbigner I don't understand how it can be a DAP or code issue since:
- it runs on Linux without errors with the default `file_cache_maxsize=128`.
- it runs on Windows without errors with `file_cache_maxsize=25`.
Right? Or am I missing something?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-640803093,https://api.github.com/repos/pydata/xarray/issues/4082,640803093,MDEyOklzc3VlQ29tbWVudDY0MDgwMzA5Mw==,905179,2020-06-08T18:41:16Z,2020-06-08T18:41:16Z,NONE,"You would lose your money :-)
However, I can offer some info that might help.
This message: `OSError: [Errno -37] NetCDF: Write to read only`
is NC_EPERM. It is the signal for OPeNDAP that you attempted an operation that is illegal for DAP2.
As an aside, it is a lousy message, but I cannot find anything that is any more informative.
Anyway, it means that your code somehow called one of the following netcdf-c API functions:
> nc_redef, nc__enddef, nc_create, nc_put_vara, nc_put_vars
> nc_set_fill, nc_def_dim, nc_put_att, nc_def_var
Perhaps with this info, you can figure out which of those operations you invoked. Perhaps you can set breakpoints in the Python wrappers for these functions?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
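Setting breakpoints inside Cython wrappers is awkward, so one alternative is a thin logging proxy around `netCDF4.Dataset` that reports any write-path method the caller touches. `LoggingDataset` is a hypothetical debugging helper, not part of any library; the listed names are, to my understanding, Dataset methods that reach some of the netcdf-c calls above.

```python
import netCDF4

# Dataset methods that reach write-path netcdf-c calls such as
# nc_def_dim, nc_def_var, nc_put_att, and nc_set_fill.
WRITE_OPS = {"createDimension", "createVariable", "setncattr",
             "set_fill_on", "set_fill_off"}

class LoggingDataset:
    """Hypothetical proxy: logs write-path accesses, delegates everything else."""

    def __init__(self, *args, **kwargs):
        self._ds = netCDF4.Dataset(*args, **kwargs)

    def __getattr__(self, name):
        if name in WRITE_OPS:
            print(f"write-path access: {name}")  # candidate source of NC_EPERM
        return getattr(self._ds, name)
```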
https://github.com/pydata/xarray/issues/4082#issuecomment-640590247,https://api.github.com/repos/pydata/xarray/issues/4082,640590247,MDEyOklzc3VlQ29tbWVudDY0MDU5MDI0Nw==,1872600,2020-06-08T13:05:28Z,2020-06-08T13:05:28Z,NONE,"Or perhaps Unidata's @WardF, who leads NetCDF development. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-639450932,https://api.github.com/repos/pydata/xarray/issues/4082,639450932,MDEyOklzc3VlQ29tbWVudDYzOTQ1MDkzMg==,1872600,2020-06-05T12:26:14Z,2020-06-05T12:26:14Z,NONE,"@shoyer, unfortunately these OPeNDAP datasets contain only 1 time record (1 daily value) each. And it works fine on Linux with `file_cache_maxsize=128`, so it must be some Windows cache thing, right?
Since I just picked `file_cache_maxsize=10` arbitrarily, I thought it would be useful to see what the maximum working value was. Using the good old bisection method, I determined that (for this case, anyway) the maximum size that works is 25.
In other words:
```
xr.set_options(file_cache_maxsize=25) # works
#xr.set_options(file_cache_maxsize=26) # fails
```
I would bet money that Unidata's @DennisHeimbigner knows what's going on here!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
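The manual bisection described above can also be automated. A sketch, assuming the failure reproduces deterministically for a fixed `urls` list; `works` and `max_working_cache_size` are hypothetical helpers, not library functions.

```python
import xarray as xr

def works(cache_size, urls):
    """Return True if opening all URLs succeeds with the given cache size."""
    xr.set_options(file_cache_maxsize=cache_size)
    try:
        xr.open_mfdataset(urls).close()
        return True
    except OSError:  # e.g. "NetCDF: Write to read only"
        return False

def max_working_cache_size(urls, lo=1, hi=128):
    """Largest cache size in [lo, hi] for which works() is True."""
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if works(mid, urls):
            lo = mid        # mid works: search the upper half
        else:
            hi = mid - 1    # mid fails: search the lower half
    return lo               # 25, in the experiment reported above
```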
https://github.com/pydata/xarray/issues/4082#issuecomment-639221402,https://api.github.com/repos/pydata/xarray/issues/4082,639221402,MDEyOklzc3VlQ29tbWVudDYzOTIyMTQwMg==,1217238,2020-06-05T02:30:46Z,2020-06-05T02:30:46Z,MEMBER,"I wonder if this is somehow related to the fact that the file associated with a URL remains open in different calls to `open_mfdataset` when `file_cache_maxsize > 10`. One way to test this would be to change the script to only open each day's data once, e.g., opening the entire month at once.
Another possibility is that Windows has a smaller limit on the number of open connections for some reason.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
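A sketch of the first suggestion above: open every file exactly once by loading the whole month in a single `open_mfdataset` call, then compute the windowed means with `rolling` instead of re-opening overlapping URL lists. The URL pattern and the 11-day window are illustrative guesses at the workflow described in this thread.

```python
import xarray as xr

# Open each day's file once for the whole month.
urls = [
    "https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.1/"
    f"AVHRR/201703/oisst-avhrr-v02r01.201703{day:02d}.nc"
    for day in range(1, 32)
]
month = xr.open_mfdataset(urls)

# Centered 11-day window means over time, computed lazily.
window_means = month.sst.rolling(time=11, center=True).mean()
```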
https://github.com/pydata/xarray/issues/4082#issuecomment-639145573,https://api.github.com/repos/pydata/xarray/issues/4082,639145573,MDEyOklzc3VlQ29tbWVudDYzOTE0NTU3Mw==,65610153,2020-06-04T22:11:55Z,2020-06-04T22:11:55Z,NONE,"Thanks for putting the time in to figure this out! I appreciate it. Tried it myself and it worked, just as you mentioned.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-639111588,https://api.github.com/repos/pydata/xarray/issues/4082,639111588,MDEyOklzc3VlQ29tbWVudDYzOTExMTU4OA==,1872600,2020-06-04T20:55:49Z,2020-06-04T20:55:49Z,NONE,"@EliT1626, I confirmed that this problem exists on Windows, but not on Linux.
The error:
```
IOError: [Errno -37] NetCDF: Write to read only: 'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.1/AVHRR/201703/oisst-avhrr-v02r01.20170304.nc'
```
suggested some kind of cache problem, and as you noted it always fails after a certain number of dates, so I tried increasing the number of cached files from the default 128 to 256:
```
xr.set_options(file_cache_maxsize=256)
```
but that had no effect.
Just to see if it would fail earlier, I then tried *decreasing* the number of cached files:
```
xr.set_options(file_cache_maxsize=10)
```
and to my surprise, it ran all the way through:
https://nbviewer.jupyter.org/gist/rsignell-usgs/c52fadd8626734bdd32a432279bc6779
I'm hoping someone who worked on the caching (@shoyer?) might have some idea of what is going on, but at least you can execute your workflow on Windows now!
","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-632819541,https://api.github.com/repos/pydata/xarray/issues/4082,632819541,MDEyOklzc3VlQ29tbWVudDYzMjgxOTU0MQ==,65610153,2020-05-22T17:29:02Z,2020-05-22T17:29:02Z,NONE,"After discussing this issue with someone who has a lot more knowledge than me, it seems pertinent to mention that I am using a Windows machine. He is able to run the script fine in his Linux environment, much like some of you have been able to do. I have tried changing the `window` to different amounts, and the script always fails around 25-ish calls to the OPeNDAP server. This was done in a new environment with only the required packages installed, updated to the latest versions. Is there some sort of issue with Windows in this regard? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-632378632,https://api.github.com/repos/pydata/xarray/issues/4082,632378632,MDEyOklzc3VlQ29tbWVudDYzMjM3ODYzMg==,65610153,2020-05-21T22:28:56Z,2020-05-21T22:28:56Z,NONE,"It turns out I did mix the two. I uninstalled netcdf4 from pip and reinstalled via conda, but now I am back to the original error from my first post.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-632267849,https://api.github.com/repos/pydata/xarray/issues/4082,632267849,MDEyOklzc3VlQ29tbWVudDYzMjI2Nzg0OQ==,14808389,2020-05-21T18:26:01Z,2020-05-21T18:26:01Z,MEMBER,"it looks like something went wrong when you installed `netcdf4`. Did you mix `pip install` with `conda install`? The rule for `conda` is normally that you either shouldn't use `pip` or, if you do, you shouldn't use `conda` again (`conda` doesn't know what `pip` does, so the environment might break).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-632264658,https://api.github.com/repos/pydata/xarray/issues/4082,632264658,MDEyOklzc3VlQ29tbWVudDYzMjI2NDY1OA==,65610153,2020-05-21T18:19:18Z,2020-05-21T18:19:18Z,NONE,"Okay, I updated all packages in my current environment. I am getting a new error now.
```
AttributeError Traceback (most recent call last)
in <module>
18 date_window = list_dates(cur_date - window, cur_date + window)
19 url_list = [url.format(x) for x in date_window]
---> 20 window_data=xr.open_mfdataset(url_list).sst
21 data.append(window_data.mean('time'))
22 print(data[-1])
~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, combine, autoclose, parallel, join, attrs_file, **kwargs)
906 getattr_ = getattr
907
--> 908 datasets = [open_(p, **open_kwargs) for p in paths]
909 file_objs = [getattr_(ds, ""_file_obj"") for ds in datasets]
910 if preprocess is not None:
~\Anaconda3\lib\site-packages\xarray\backends\api.py in <listcomp>(.0)
906 getattr_ = getattr
907
--> 908 datasets = [open_(p, **open_kwargs) for p in paths]
909 file_objs = [getattr_(ds, ""_file_obj"") for ds in datasets]
910 if preprocess is not None:
~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs, use_cftime)
497
498 if engine is None:
--> 499 engine = _get_default_engine(filename_or_obj, allow_remote=True)
500 if engine == ""netcdf4"":
501 store = backends.NetCDF4DataStore.open(
~\Anaconda3\lib\site-packages\xarray\backends\api.py in _get_default_engine(path, allow_remote)
145 def _get_default_engine(path, allow_remote=False):
146 if allow_remote and is_remote_uri(path):
--> 147 engine = _get_default_engine_remote_uri()
148 elif is_grib_path(path):
149 engine = _get_default_engine_grib()
~\Anaconda3\lib\site-packages\xarray\backends\api.py in _get_default_engine_remote_uri()
46 def _get_default_engine_remote_uri():
47 try:
---> 48 import netCDF4 # noqa: F401
49
50 engine = ""netcdf4""
~\Anaconda3\lib\site-packages\netCDF4\__init__.py in <module>
1 # init for netCDF4. package
2 # Docstring comes from extension module _netCDF4.
----> 3 from ._netCDF4 import *
4 # Need explicit imports for names beginning with underscores
5 from ._netCDF4 import __doc__, __pdoc__
include\membuf.pyx in init netCDF4._netCDF4()
AttributeError: type object 'netCDF4._netCDF4._MemBuf' has no attribute '__reduce_cython__'
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-632261972,https://api.github.com/repos/pydata/xarray/issues/4082,632261972,MDEyOklzc3VlQ29tbWVudDYzMjI2MTk3Mg==,14808389,2020-05-21T18:13:35Z,2020-05-21T18:13:35Z,MEMBER,"umm... I think these are the related packages:
- `libhdf5`
- `libnetcdf`
- `netcdf4`
for reference, here's my environment
```
commit: 732b6cd6248ce715da74f3cd7a0e211eaa1d0aa2
python: 3.8.2 | packaged by conda-forge | (default, Apr 24 2020, 08:20:52)
[GCC 7.3.0]
python-bits: 64
OS: Linux
byteorder: little
LC_ALL: None
libhdf5: 1.10.5
libnetcdf: 4.7.4
xarray: 0.15.1
pandas: 1.0.3
numpy: 1.18.1
scipy: 1.4.1
netCDF4: 1.5.3
pydap: installed
h5netcdf: 0.8.0
h5py: 2.10.0
Nio: 1.5.5
zarr: 2.4.0
cftime: 1.1.1.2
nc_time_axis: 1.2.0
PseudoNetCDF: installed
rasterio: 1.1.3
cfgrib: 0.9.8.1
iris: 2.4.0
bottleneck: 1.3.2
dask: 2.15.0
distributed: 2.15.2
matplotlib: 3.2.1
cartopy: 0.17.0
seaborn: 0.10.1
numbagg: installed
setuptools: 46.1.3.post20200325
pip: 20.1
conda: None
pytest: 5.4.1
IPython: 7.14.0
sphinx: None
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-632258946,https://api.github.com/repos/pydata/xarray/issues/4082,632258946,MDEyOklzc3VlQ29tbWVudDYzMjI1ODk0Ng==,65610153,2020-05-21T18:07:05Z,2020-05-21T18:07:05Z,NONE,Updating what specifically? Certain packages?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-632257306,https://api.github.com/repos/pydata/xarray/issues/4082,632257306,MDEyOklzc3VlQ29tbWVudDYzMjI1NzMwNg==,14808389,2020-05-21T18:03:24Z,2020-05-21T18:03:24Z,MEMBER,I can't reproduce this: except from a few future warnings about `auto_combine` your code sample works for me. Could you try updating your environment?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-632238093,https://api.github.com/repos/pydata/xarray/issues/4082,632238093,MDEyOklzc3VlQ29tbWVudDYzMjIzODA5Mw==,65610153,2020-05-21T17:25:20Z,2020-05-21T17:25:32Z,NONE,"Update. I tried creating a virtual environment and running this script with older versions of both netCDF4 and xarray. The end result was that the script stopped working after only a few timesteps instead of the larger number it managed before. Here is the full error traceback. I still have not been able to find any info on what this might mean. Would this also be worth posting on the netCDF4 GitHub page?
Error:
```
IOError Traceback (most recent call last)
in <module>()
18 date_window = list_dates(cur_date - window, cur_date + window)
19 url_list = [url.format(x) for x in date_window]
---> 20 window_data=xr.open_mfdataset(url_list).sst
21 data.append(window_data.mean('time'))
22 print(data[-1])
C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\api.pyc in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, autoclose, parallel, **kwargs)
622 getattr_ = getattr
623
--> 624 datasets = [open_(p, **open_kwargs) for p in paths]
625 file_objs = [getattr_(ds, '_file_obj') for ds in datasets]
626 if preprocess is not None:
C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\api.pyc in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs)
318 group=group,
319 autoclose=autoclose,
--> 320 **backend_kwargs)
321 elif engine == 'scipy':
322 store = backends.ScipyDataStore(filename_or_obj,
C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\netCDF4_.pyc in open(cls, filename, mode, format, group, writer, clobber, diskless, persist, autoclose, lock)
329 diskless=diskless, persist=persist,
330 format=format)
--> 331 ds = opener()
332 return cls(ds, mode=mode, writer=writer, opener=opener,
333 autoclose=autoclose, lock=lock)
C:\Users\Eli T\Anaconda3\envs\condavenv\lib\site-packages\xarray\backends\netCDF4_.pyc in _open_netcdf4_group(filename, mode, group, **kwargs)
228 import netCDF4 as nc4
229
--> 230 ds = nc4.Dataset(filename, mode=mode, **kwargs)
231
232 with close_on_error(ds):
netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Dataset.__init__()
netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()
IOError: [Errno -37] NetCDF: Write to read only: 'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.1/AVHRR/201703/oisst-avhrr-v02r01.20170304.nc'
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-631125403,https://api.github.com/repos/pydata/xarray/issues/4082,631125403,MDEyOklzc3VlQ29tbWVudDYzMTEyNTQwMw==,65610153,2020-05-19T22:42:14Z,2020-05-19T22:42:14Z,NONE,"Yes, it is quite long though.
Error:
```
KeyError Traceback (most recent call last)
~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock)
197 try:
--> 198 file = self._cache[self._key]
199 except KeyError:
~\Anaconda3\lib\site-packages\xarray\backends\lru_cache.py in __getitem__(self, key)
52 with self._lock:
---> 53 value = self._cache[key]
54 self._cache.move_to_end(key)
KeyError: [<class 'netCDF4._netCDF4.Dataset'>, ('https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.0/AVHRR/201703/avhrr-only-v2.20170322.nc',), 'r', (('clobber', True), ('diskless', False), ('format', 'NETCDF4'), ('persist', False))]
During handling of the above exception, another exception occurred:
OSError Traceback (most recent call last)
in <module>
17 date_window = list_dates(cur_date - window, cur_date + window)
18 url_list = [url.format(x) for x in date_window]
---> 19 window_data=xr.open_mfdataset(url_list).sst
20 data.append(window_data.mean('time'))
21 print(data[-1])
~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, lock, data_vars, coords, combine, autoclose, parallel, join, attrs_file, **kwargs)
906 getattr_ = getattr
907
--> 908 datasets = [open_(p, **open_kwargs) for p in paths]
909 file_objs = [getattr_(ds, ""_file_obj"") for ds in datasets]
910 if preprocess is not None:
~\Anaconda3\lib\site-packages\xarray\backends\api.py in (.0)
906 getattr_ = getattr
907
--> 908 datasets = [open_(p, **open_kwargs) for p in paths]
909 file_objs = [getattr_(ds, ""_file_obj"") for ds in datasets]
910 if preprocess is not None:
~\Anaconda3\lib\site-packages\xarray\backends\api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs, use_cftime)
500 if engine == ""netcdf4"":
501 store = backends.NetCDF4DataStore.open(
--> 502 filename_or_obj, group=group, lock=lock, **backend_kwargs
503 )
504 elif engine == ""scipy"":
~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in open(cls, filename, mode, format, group, clobber, diskless, persist, lock, lock_maker, autoclose)
356 netCDF4.Dataset, filename, mode=mode, kwargs=kwargs
357 )
--> 358 return cls(manager, group=group, mode=mode, lock=lock, autoclose=autoclose)
359
360 def _acquire(self, needs_lock=True):
~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in __init__(self, manager, group, mode, lock, autoclose)
312 self._group = group
313 self._mode = mode
--> 314 self.format = self.ds.data_model
315 self._filename = self.ds.filepath()
316 self.is_remote = is_remote_uri(self._filename)
~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in ds(self)
365 @property
366 def ds(self):
--> 367 return self._acquire()
368
369 def open_store_variable(self, name, var):
~\Anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in _acquire(self, needs_lock)
359
360 def _acquire(self, needs_lock=True):
--> 361 with self._manager.acquire_context(needs_lock) as root:
362 ds = _nc4_require_group(root, self._group, self._mode)
363 return ds
~\Anaconda3\lib\contextlib.py in __enter__(self)
110 del self.args, self.kwds, self.func
111 try:
--> 112 return next(self.gen)
113 except StopIteration:
114 raise RuntimeError(""generator didn't yield"") from None
~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in acquire_context(self, needs_lock)
184 def acquire_context(self, needs_lock=True):
185 """"""Context manager for acquiring a file.""""""
--> 186 file, cached = self._acquire_with_cache_info(needs_lock)
187 try:
188 yield file
~\Anaconda3\lib\site-packages\xarray\backends\file_manager.py in _acquire_with_cache_info(self, needs_lock)
202 kwargs = kwargs.copy()
203 kwargs[""mode""] = self._mode
--> 204 file = self._opener(*self._args, **kwargs)
205 if self._mode == ""w"":
206 # ensure file doesn't get overriden when opened again
netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Dataset.__init__()
netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()
OSError: [Errno -37] NetCDF: Write to read only: b'https://www.ncei.noaa.gov/thredds/dodsC/OisstBase/NetCDF/V2.0/AVHRR/201703/avhrr-only-v2.20170322.nc'
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286
https://github.com/pydata/xarray/issues/4082#issuecomment-631122293,https://api.github.com/repos/pydata/xarray/issues/4082,631122293,MDEyOklzc3VlQ29tbWVudDYzMTEyMjI5Mw==,1217238,2020-05-19T22:35:22Z,2020-05-19T22:35:22Z,MEMBER,Can you share the full error traceback you see?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,621177286