html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/516#issuecomment-135510417,https://api.github.com/repos/pydata/xarray/issues/516,135510417,MDEyOklzc3VlQ29tbWVudDEzNTUxMDQxNw==,3688009,2015-08-27T18:11:43Z,2015-08-27T18:11:43Z,NONE,"Using `ncdump -hs`, I found the chunk size of each of the files to be:
`_ChunkSizes = 1, 90, 180 ;`
Using those chunk sizes, it took even more time:
```
datal = xray.open_mfdataset(filename, chunks={'time':1, 'lat':90, 'lon':180})
In [7]: %time datal.tasmax[:, 360, 720].values
CPU times: user 3min 3s, sys: 59.4 s, total: 4min 3s
Wall time: 12min 8s
```
I should say that I am using open-source data, and therefore do not control how the original data is chunked. This is also using `open_mfdataset` on around 100 files.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,99026442
https://github.com/pydata/xarray/issues/516#issuecomment-129995032,https://api.github.com/repos/pydata/xarray/issues/516,129995032,MDEyOklzc3VlQ29tbWVudDEyOTk5NTAzMg==,3688009,2015-08-11T18:01:04Z,2015-08-11T18:01:04Z,NONE,"Hmm. I moved the uncompressed files to my local hard drive, and I am still seeing far more wall time than CPU time. 31 seconds would be more than acceptable, but 8 minutes is really pushing it.
```
%time datal.tasmax[:, 360, 720].values
CPU times: user 25.2 s, sys: 5.83 s, total: 31 s
Wall time: 8min 1s
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,99026442