html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/5189#issuecomment-863098508,https://api.github.com/repos/pydata/xarray/issues/5189,863098508,MDEyOklzc3VlQ29tbWVudDg2MzA5ODUwOA==,6574622,2021-06-17T09:49:48Z,2021-06-17T09:49:48Z,CONTRIBUTOR,"Pydap has several important fixes which have been merged into `master` already. Nevertheless, the latest release of Pydap is from May 2017, which is before the referenced PR.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-824238589,https://api.github.com/repos/pydata/xarray/issues/5189,824238589,MDEyOklzc3VlQ29tbWVudDgyNDIzODU4OQ==,6943441,2021-04-21T17:38:43Z,2021-04-21T17:38:43Z,NONE,Well. That is strange. My version of `pydap` does not have the fix from the PR submitted to solve the issue: https://github.com/pydap/pydap/pull/151/files. When I implement this in my `pydap` I don't get that error.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-824234023,https://api.github.com/repos/pydata/xarray/issues/5189,824234023,MDEyOklzc3VlQ29tbWVudDgyNDIzNDAyMw==,6943441,2021-04-21T17:31:11Z,2021-04-21T17:31:11Z,NONE,Looks like this is the same issue: https://github.com/pydap/pydap/issues/150,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-824232570,https://api.github.com/repos/pydata/xarray/issues/5189,824232570,MDEyOklzc3VlQ29tbWVudDgyNDIzMjU3MA==,6943441,2021-04-21T17:28:36Z,2021-04-21T17:28:36Z,NONE,So it seems like it's something specific to the `NASA` dataset/server,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-824232141,https://api.github.com/repos/pydata/xarray/issues/5189,824232141,MDEyOklzc3VlQ29tbWVudDgyNDIzMjE0MQ==,6943441,2021-04-21T17:27:53Z,2021-04-21T17:27:53Z,NONE,"I tried on a public test dataset and I don't get any error:
```python
import xarray as xr
url = ""http://test.opendap.org/opendap/data/nc/sst.mnmean.nc.gz""
store = xr.backends.PydapDataStore.open(url)
ds = xr.open_dataset(store)
ds['sst'].isel(time=0, lat=0, lon=0).values
```
```
array(-1.8, dtype=float32)
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-824227019,https://api.github.com/repos/pydata/xarray/issues/5189,824227019,MDEyOklzc3VlQ29tbWVudDgyNDIyNzAxOQ==,221526,2021-04-21T17:19:21Z,2021-04-21T17:19:21Z,CONTRIBUTOR,"> KeyError: 'tmp2m%2Etmp2m'

This looks like an issue with the encoding of the URL and what the server expects.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-824226103,https://api.github.com/repos/pydata/xarray/issues/5189,824226103,MDEyOklzc3VlQ29tbWVudDgyNDIyNjEwMw==,6943441,2021-04-21T17:17:46Z,2021-04-21T17:17:46Z,NONE,"Good idea. 
I'll see if I can find one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-824220498,https://api.github.com/repos/pydata/xarray/issues/5189,824220498,MDEyOklzc3VlQ29tbWVudDgyNDIyMDQ5OA==,10194086,2021-04-21T17:08:23Z,2021-04-21T17:08:23Z,MEMBER,Would be good to get a publicly available dataset to try to figure this out. ,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-822847939,https://api.github.com/repos/pydata/xarray/issues/5189,822847939,MDEyOklzc3VlQ29tbWVudDgyMjg0NzkzOQ==,10194086,2021-04-19T23:11:20Z,2021-04-19T23:11:20Z,MEMBER,"Hmm yes makes sense with such a big dataset. xarray must be converting the integer index to something fancy internally that pydap cannot handle. I am not very familiar with that part of the code, unfortunately.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-822833896,https://api.github.com/repos/pydata/xarray/issues/5189,822833896,MDEyOklzc3VlQ29tbWVudDgyMjgzMzg5Ng==,6943441,2021-04-19T22:36:48Z,2021-04-19T22:36:48Z,NONE,I get a timeout error when trying `load()`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-822829654,https://api.github.com/repos/pydata/xarray/issues/5189,822829654,MDEyOklzc3VlQ29tbWVudDgyMjgyOTY1NA==,6943441,2021-04-19T22:31:50Z,2021-04-19T22:32:33Z,NONE,"I could try. It's a pretty big dataset though. 38x10^9 data points:

![image](https://user-images.githubusercontent.com/6943441/115311282-00947500-a135-11eb-85a6-b901ab0b1895.png)
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
https://github.com/pydata/xarray/issues/5189#issuecomment-822822057,https://api.github.com/repos/pydata/xarray/issues/5189,822822057,MDEyOklzc3VlQ29tbWVudDgyMjgyMjA1Nw==,10194086,2021-04-19T22:16:19Z,2021-04-19T22:16:19Z,MEMBER,Can you try `ds['tmp2m'].load()` before calling `isel`? ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,861684673
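
A minimal sketch pulling together the suggestions from the comments above: the public OPeNDAP test URL from issuecomment-824232141, the lazy-indexing advice from issuecomment-822822057, and the version caveat from issuecomment-863098508 (the May 2017 pydap release predates the fix merged in pydap/pydap#151). The `importlib.metadata` lookup and the install-from-master hint are assumptions added for illustration, not steps taken in the thread.

```python
# Sketch only; assumes xarray and pydap are installed.
from importlib.metadata import version  # Python 3.8+; "pydap" dist name assumed

import xarray as xr

# The fix from https://github.com/pydap/pydap/pull/151 was only on pydap master
# at the time of the thread, so a released pydap may still hit the KeyError;
# installing from master, e.g. `pip install git+https://github.com/pydap/pydap.git`,
# is one way to pick it up (an assumption, not confirmed in the thread).
print("pydap version:", version("pydap"))

# Public OPeNDAP test dataset from the thread; reported to work as-is.
url = "http://test.opendap.org/opendap/data/nc/sst.mnmean.nc.gz"
store = xr.backends.PydapDataStore.open(url)
ds = xr.open_dataset(store)

# Subset lazily with isel() before pulling values: calling .load() on the full
# ~38e9-point NASA dataset reportedly times out.
print(ds["sst"].isel(time=0, lat=0, lon=0).values)  # array(-1.8, dtype=float32)
```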