html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/1948#issuecomment-369309422,https://api.github.com/repos/pydata/xarray/issues/1948,369309422,MDEyOklzc3VlQ29tbWVudDM2OTMwOTQyMg==,19403647,2018-02-28T17:07:23Z,2018-02-28T17:07:23Z,NONE,"Alright, this might be a better idea. I'll try suggesting this functionality to matplotlib first.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,301013548
https://github.com/pydata/xarray/issues/1142#issuecomment-266032884,https://api.github.com/repos/pydata/xarray/issues/1142,266032884,MDEyOklzc3VlQ29tbWVudDI2NjAzMjg4NA==,19403647,2016-12-09T14:56:35Z,2016-12-09T14:56:35Z,NONE,"Hi, I have taken another approach for applying n-dimensional windows over several dimensions of xarray objects to perform filtering and tapering, based on `scipy.ndimage`, `scipy.signal` and `dask.array.map_overlap`. @shoyer @jhamman it is somewhat similar to what I presented during the aospy meeting. It also relates to issue #819.

For the moment, I have something that works like this:
```
import numpy as np
import xarray as xr

shape = (50, 30, 40)
dims = ('x', 'y', 'z')
dummy_array = xr.DataArray(np.random.random(shape), dims=dims)

# Define and set a window object
w = dummy_array.window
w.set(n={'x': 24, 'y': 24}, cutoff={'x': 0.01, 'y': 0.01}, window='hanning')
```
where `n` is the filter order (i.e. the window size), `cutoff` is the cutoff frequency, and `window` is any window name found in the `scipy.signal.windows` collection.

The filtering can then be performed using the `w.convolve()` method, which builds a dask graph for the convolution product. I also want to add a tapering method `w.taper()`, which would be useful for spectral analysis. For multi-tapering, it should also generate an object with an additional dimension corresponding to the number of windows. To do that, I first need to handle the window construction using dask.

Let me know if you are interested in this approach. For the moment, I plan to publish a GitHub project with signal-processing tools within the [pangeo-data](https://pangeo-data.github.io/) framework. It should be online by the end of December, and I would be happy to get feedback on it. I am not sure it falls within the scope of xarray and it may need a dedicated project, but I might be wrong.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,192248351
https://github.com/pydata/xarray/issues/1115#issuecomment-260379241,https://api.github.com/repos/pydata/xarray/issues/1115,260379241,MDEyOklzc3VlQ29tbWVudDI2MDM3OTI0MQ==,19403647,2016-11-14T16:10:55Z,2016-11-14T16:10:55Z,NONE,"I agree with @rabernat in the sense that it could be part of another package (e.g., signal processing). This would also allow the computation of statistical tests to assess the significance of the correlation (which is useful, since correlations are often misinterpreted without statistical tests).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,188996339
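The `window` accessor proposed in the #1142 comment above was never part of xarray; as a rough illustration of the same idea, the sketch below reaches directly for the building blocks that comment names (`scipy.signal`, `scipy.ndimage`, and dask's `map_overlap`). The `taps` dict, the `lowpass_block` helper, and the chunk sizes are invented for this example and are not part of any proposed or existing API.

```python
import numpy as np
import xarray as xr
import dask.array as dsa
from scipy import ndimage, signal

shape = (50, 30, 40)
dims = ('x', 'y', 'z')
arr = xr.DataArray(dsa.from_array(np.random.random(shape), chunks=(25, 30, 40)),
                   dims=dims)

# FIR taps per filtered dimension; `n` plays the role of the filter order and
# `cutoff` the normalised cutoff frequency from the proposed w.set() call.
n, cutoff = 24, 0.01
taps = {dim: signal.firwin(n + 1, cutoff, window='hann') for dim in ('x', 'y')}

def lowpass_block(block):
    # Apply the separable FIR filter along each requested axis of the block.
    for dim, w in taps.items():
        block = ndimage.convolve1d(block, w, axis=dims.index(dim), mode='nearest')
    return block

# Overlap each chunk by half the filter length so the convolution is seamless
# across chunk boundaries; map_overlap trims the halo again by default.
depth = {dims.index(dim): len(w) // 2 for dim, w in taps.items()}
filtered = arr.copy(data=arr.data.map_overlap(lowpass_block, depth=depth,
                                              boundary='reflect',
                                              dtype=arr.dtype))
```

Calling `filtered.compute()` would then evaluate the lazily built dask graph, which is the behaviour the comment ascribes to the proposed `w.convolve()` method.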