html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue https://github.com/pydata/xarray/issues/896#issuecomment-232115059,https://api.github.com/repos/pydata/xarray/issues/896,232115059,MDEyOklzc3VlQ29tbWVudDIzMjExNTA1OQ==,1217238,2016-07-12T17:17:38Z,2016-07-12T17:17:38Z,MEMBER,"@apatlpo Along what axis do your multiple files differ? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,165104458 https://github.com/pydata/xarray/issues/896#issuecomment-232101159,https://api.github.com/repos/pydata/xarray/issues/896,232101159,MDEyOklzc3VlQ29tbWVudDIzMjEwMTE1OQ==,1217238,2016-07-12T16:28:40Z,2016-07-12T16:28:40Z,MEMBER,"This error indicates that Python is running out of memory. Dask (which we use with mfdataset) can help with that, but it doesn't always solve the issue. How is the data arranged in each of the input files? What version of dask are you using? This _might_ be an issue with dask.array not fusing calls to `__getitem__` and loading entire files into memory. cc @mrocklin @jcrist ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,165104458