Comments from GitHub user 1217238 (MEMBER) on pydata/xarray issue #3165 (https://github.com/pydata/xarray/issues/3165), in chronological order:

**2019-07-28T06:55:43Z** — https://github.com/pydata/xarray/issues/3165#issuecomment-515738254

Have you tried adding more chunking, e.g., along the x dimension? That's the usual recommendation if you're running out of memory.

**2019-07-29T16:20:07Z** — https://github.com/pydata/xarray/issues/3165#issuecomment-516060323

Did you try converting `np.zeros((5000, 50000))` to use `dask.array.zeros` instead? The former will allocate 2 GB of data within each chunk.

**2019-07-29T22:33:56Z** — https://github.com/pydata/xarray/issues/3165#issuecomment-516187643

You want to use the chunks argument *inside* `da.zeros`, e.g., `da.zeros((5000, 50000), chunks=100)`.

On Mon, Jul 29, 2019 at 3:30 PM peterhob wrote:

> > Did you try converting `np.zeros((5000, 50000))` to use `dask.array.zeros` instead? The former will allocate 2 GB of data within each chunk
>
> Thank you for your suggestion. I tried as you suggested, still with the same error.
>
> ```python
> import numpy as np
> import xarray as xr
> import dask.array as da
> # from dask.distributed import Client
> temp = xr.DataArray(da.zeros((5000, 50000)), dims=("x", "y")).chunk({"y": 100})
> temp.rolling(x=100).mean()
> ```
>
> I have also tried saving the array to an nc file and reading it back after that. Rolling still gives the same error (with or without bottleneck, and with different chunks). Even though it says memory error, it doesn't consume much memory.

**2019-07-29T22:59:48Z** — https://github.com/pydata/xarray/issues/3165#issuecomment-516193582

For context, xarray's rolling window code creates a "virtual dimension" for the rolling window. So if your chunks are size (5000, 100) before the rolling window, they are size (5000, 100, 100) within the rolling window computation.

So it's not entirely surprising that there are more issues with memory usage -- these are much bigger arrays, e.g.:

```
>>> temp.rolling(x=100).construct('window')
<xarray.DataArray (x: 5000, y: 50000, window: 100)>
dask.array<shape=(5000, 50000, 100), dtype=float64, chunksize=(5000, 100, 100)>
Dimensions without coordinates: x, y, window
```

**2019-07-29T23:00:37Z** — https://github.com/pydata/xarray/issues/3165#issuecomment-516193739

Actually, there does seem to be something fishy going on here. I find that I'm able to execute `temp.rolling(x=100).construct('window').mean('window').compute()` successfully, but not `temp.rolling(x=100).mean().compute()`, even though the latter should be mostly equivalent to the former.

**2019-07-29T23:05:57Z** — https://github.com/pydata/xarray/issues/3165#issuecomment-516195053

I think this triggers a case that dask's scheduler doesn't handle well, related to this issue: https://github.com/dask/dask/issues/874