html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/2227#issuecomment-425224969,https://api.github.com/repos/pydata/xarray/issues/2227,425224969,MDEyOklzc3VlQ29tbWVudDQyNTIyNDk2OQ==,291576,2018-09-27T20:05:05Z,2018-09-27T20:05:05Z,CONTRIBUTOR,"It would be ten files opened via xr.open_mfdataset() and concatenated along the time dimension, each one looking like:
```
netcdf convect_gust_20180301_0000 {
dimensions:
    latitude = 3502 ;
    longitude = 7002 ;
variables:
    double latitude(latitude) ;
        latitude:_FillValue = NaN ;
        latitude:_Storage = ""contiguous"" ;
        latitude:_Endianness = ""little"" ;
    double longitude(longitude) ;
        longitude:_FillValue = NaN ;
        longitude:_Storage = ""contiguous"" ;
        longitude:_Endianness = ""little"" ;
    float gust(latitude, longitude) ;
        gust:_FillValue = NaNf ;
        gust:units = ""m/s"" ;
        gust:description = ""gust winds"" ;
        gust:_Storage = ""chunked"" ;
        gust:_ChunkSizes = 701, 1401 ;
        gust:_DeflateLevel = 8 ;
        gust:_Shuffle = ""true"" ;
        gust:_Endianness = ""little"" ;

// global attributes:
        :start_date = ""03/01/2018 00:00"" ;
        :end_date = ""03/01/2018 01:00"" ;
        :interval = ""half-open"" ;
        :init_date = ""02/28/2018 22:00"" ;
        :history = ""Created 2018-09-12 15:53:44.468144"" ;
        :description = ""Convective Downscaling, format V2.0"" ;
        :_NCProperties = ""version=1|netcdflibversion=4.6.1|hdf5libversion=1.10.1"" ;
        :_SuperblockVersion = 0 ;
        :_IsNetcdf4 = 1 ;
        :_Format = ""netCDF-4"" ;
}
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,331668890
https://github.com/pydata/xarray/issues/2227#issuecomment-424795330,https://api.github.com/repos/pydata/xarray/issues/2227,424795330,MDEyOklzc3VlQ29tbWVudDQyNDc5NTMzMA==,291576,2018-09-26T17:06:44Z,2018-09-26T17:06:44Z,CONTRIBUTOR,"No, it does not make a difference. The example above peaks at around 5GB of memory (a bit much, but manageable). And it peaks similarly if we chunk it like you suggested.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,331668890
https://github.com/pydata/xarray/issues/2227#issuecomment-424485235,https://api.github.com/repos/pydata/xarray/issues/2227,424485235,MDEyOklzc3VlQ29tbWVudDQyNDQ4NTIzNQ==,291576,2018-09-25T20:14:02Z,2018-09-25T20:14:02Z,CONTRIBUTOR,"Yeah, it looks like if `da` is backed by a dask array and you do a `.isel(win=window.compute())` (because otherwise isel barfs on dask indexers, it seems), then the memory usage shoots through the roof. Note that in my case, the dask chunks are (1, 3000, 7000). If I do a `window.load()` prior to `window.isel()`, then the memory usage is perfectly reasonable.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,331668890
https://github.com/pydata/xarray/issues/2227#issuecomment-424479421,https://api.github.com/repos/pydata/xarray/issues/2227,424479421,MDEyOklzc3VlQ29tbWVudDQyNDQ3OTQyMQ==,291576,2018-09-25T19:54:59Z,2018-09-25T19:54:59Z,CONTRIBUTOR,"Just for posterity, though, here is my simplified (working!) example:
```
import numpy as np
import xarray as xr
da = xr.DataArray(np.random.randn(10, 3000, 7000),
                  dims=('time', 'latitude', 'longitude'))
window = da.rolling(time=2).construct('win')
indexes = window.argmax(dim='win')
result = window.isel(win=indexes)
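
# A hedged side note on what the idiom above computes: construct() builds a
# length-2 window view along time, argmax finds each window's peak position,
# and the vectorized isel picks out that value, i.e. a rolling maximum.
# A small self-contained check with reduced sizes; the _small names are
# purely illustrative, not from the original comment:
da_small = xr.DataArray(np.random.randn(4, 30, 70),
                        dims=('time', 'latitude', 'longitude'))
window_small = da_small.rolling(time=2).construct('win')
result_small = window_small.isel(win=window_small.argmax(dim='win'))
# From t=1 on (where every window is complete) this matches rolling().max();
# at t=0 the padded NaN is skipped by argmax while rolling().max() yields NaN.
expected = da_small.rolling(time=2).max()
assert np.allclose(result_small.isel(time=slice(1, None)).values,
                   expected.isel(time=slice(1, None)).values)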
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,331668890
https://github.com/pydata/xarray/issues/2227#issuecomment-424477465,https://api.github.com/repos/pydata/xarray/issues/2227,424477465,MDEyOklzc3VlQ29tbWVudDQyNDQ3NzQ2NQ==,291576,2018-09-25T19:48:20Z,2018-09-25T19:48:20Z,CONTRIBUTOR,"Huh, strange... I just tried a simplified version of what I was doing (in particular, no dask arrays), and everything worked fine. I'll have to investigate further.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,331668890
https://github.com/pydata/xarray/issues/2227#issuecomment-424470752,https://api.github.com/repos/pydata/xarray/issues/2227,424470752,MDEyOklzc3VlQ29tbWVudDQyNDQ3MDc1Mg==,291576,2018-09-25T19:27:28Z,2018-09-25T19:27:28Z,CONTRIBUTOR,"I am looking into a similar performance issue with isel, but it seems the problem is that it creates arrays much bigger than needed. For my multidimensional case (time/x/y/window), what should only take a few hundred MB spikes to tens of GB of RAM. I don't know if this might be a possible source of the performance issues.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,331668890