html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/896#issuecomment-457903796,https://api.github.com/repos/pydata/xarray/issues/896,457903796,MDEyOklzc3VlQ29tbWVudDQ1NzkwMzc5Ng==,26384082,2019-01-27T09:51:12Z,2019-01-27T09:51:12Z,NONE,"In order to maintain a list of currently relevant issues, we mark issues as stale after a period of inactivity. If this issue remains relevant, please comment here; otherwise it will be closed automatically.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,165104458
https://github.com/pydata/xarray/issues/896#issuecomment-232115345,https://api.github.com/repos/pydata/xarray/issues/896,232115345,MDEyOklzc3VlQ29tbWVudDIzMjExNTM0NQ==,11750960,2016-07-12T17:18:38Z,2016-07-12T17:18:38Z,CONTRIBUTOR,"Along time_counter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,165104458
https://github.com/pydata/xarray/issues/896#issuecomment-232115059,https://api.github.com/repos/pydata/xarray/issues/896,232115059,MDEyOklzc3VlQ29tbWVudDIzMjExNTA1OQ==,1217238,2016-07-12T17:17:38Z,2016-07-12T17:17:38Z,MEMBER,"@apatlpo Along what axis do your multiple files differ?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,165104458
https://github.com/pydata/xarray/issues/896#issuecomment-232114342,https://api.github.com/repos/pydata/xarray/issues/896,232114342,MDEyOklzc3VlQ29tbWVudDIzMjExNDM0Mg==,11750960,2016-07-12T17:15:14Z,2016-07-12T17:15:14Z,CONTRIBUTOR,"Thanks for your answer. You'll find below an output from ncdump in order to answer the part about data arrangement (I hope ...)
Otherwise, we have:

```
dask      0.8.2       py27_0  defaults
xarray    0.7.2       py27_0  defaults
netcdf4   1.1.1   np18py27_0  defaults
```

Looking forward to any suggestions. Cheers, Aurelien

```
netcdf NATL60-MJM155_y2008m01d09.5d_gridT {
dimensions:
        x = 5422 ;
        y = 3454 ;
        deptht = 300 ;
        time_counter = UNLIMITED ; // (1 currently)
        time_bounds = 2 ;
variables:
        float nav_lat(y, x) ;
                nav_lat:axis = ""Y"" ;
                nav_lat:standard_name = ""latitude"" ;
                nav_lat:long_name = ""Latitude"" ;
                nav_lat:units = ""degrees_north"" ;
                nav_lat:nav_model = ""grid_T"" ;
                nav_lat:_Storage = ""chunked"" ;
                nav_lat:_ChunkSizes = 12, 5422 ;
        float nav_lon(y, x) ;
                nav_lon:axis = ""X"" ;
                nav_lon:standard_name = ""longitude"" ;
                nav_lon:long_name = ""Longitude"" ;
                nav_lon:units = ""degrees_east"" ;
                nav_lon:nav_model = ""grid_T"" ;
                nav_lon:_Storage = ""chunked"" ;
                nav_lon:_ChunkSizes = 12, 5422 ;
        float deptht(deptht) ;
                deptht:axis = ""Z"" ;
                deptht:long_name = ""Vertical T levels"" ;
                deptht:units = ""m"" ;
                deptht:positive = ""down"" ;
                deptht:_Storage = ""chunked"" ;
                deptht:_ChunkSizes = 300 ;
        float votemper(time_counter, deptht, y, x) ;
                votemper:long_name = ""temperature"" ;
                votemper:units = ""degC"" ;
                votemper:online_operation = ""average"" ;
                votemper:interval_operation = ""40s"" ;
                votemper:interval_write = ""5d"" ;
                votemper:_FillValue = 0.f ;
                votemper:missing_value = 0.f ;
                votemper:coordinates = ""time_centered deptht nav_lon nav_lat"" ;
                votemper:_Storage = ""chunked"" ;
                votemper:_ChunkSizes = 1, 1, 12, 5422 ;
                votemper:_DeflateLevel = 1 ;
        double time_centered(time_counter) ;
                time_centered:standard_name = ""time"" ;
                time_centered:long_name = ""Time axis"" ;
                time_centered:title = ""Time"" ;
                time_centered:calendar = ""gregorian"" ;
                time_centered:units = ""seconds since 1958-01-01 00:00:00"" ;
                time_centered:time_origin = ""1958-01-01 00:00:00"" ;
                time_centered:bounds = ""time_centered_bounds"" ;
                time_centered:_Storage = ""chunked"" ;
                time_centered:_ChunkSizes = 1 ;
        double time_centered_bounds(time_counter, time_bounds) ;
                time_centered_bounds:_Storage = ""chunked"" ;
                time_centered_bounds:_ChunkSizes = 1, 2 ;
        double time_counter(time_counter) ;
                time_counter:axis = ""T"" ;
                time_counter:standard_name = ""time"" ;
                time_counter:long_name = ""Time axis"" ;
                time_counter:title = ""Time"" ;
                time_counter:calendar = ""gregorian"" ;
                time_counter:units = ""seconds since 1958-01-01 00:00:00"" ;
                time_counter:time_origin = ""1958-01-01 00:00:00"" ;
                time_counter:bounds = ""time_counter_bounds"" ;
                time_counter:_Storage = ""chunked"" ;
                time_counter:_ChunkSizes = 1 ;
        double time_counter_bounds(time_counter, time_bounds) ;
                time_counter_bounds:_Storage = ""chunked"" ;
                time_counter_bounds:_ChunkSizes = 1, 2 ;
        float vosaline(time_counter, deptht, y, x) ;
                vosaline:long_name = ""salinity"" ;
                vosaline:units = ""psu"" ;
                vosaline:online_operation = ""average"" ;
                vosaline:interval_operation = ""40s"" ;
                vosaline:interval_write = ""5d"" ;
                vosaline:_FillValue = 0.f ;
                vosaline:missing_value = 0.f ;
                vosaline:coordinates = ""time_centered deptht nav_lon nav_lat"" ;
                vosaline:_Storage = ""chunked"" ;
                vosaline:_ChunkSizes = 1, 1, 12, 5422 ;
                vosaline:_DeflateLevel = 1 ;
        float sossheig(time_counter, y, x) ;
                sossheig:long_name = ""sea surface height"" ;
                sossheig:units = ""m"" ;
                sossheig:online_operation = ""average"" ;
                sossheig:interval_operation = ""40s"" ;
                sossheig:interval_write = ""5d"" ;
                sossheig:_FillValue = 0.f ;
                sossheig:missing_value = 0.f ;
                sossheig:coordinates = ""time_centered nav_lon nav_lat"" ;
                sossheig:_Storage = ""chunked"" ;
                sossheig:_ChunkSizes = 1, 12, 5422 ;

// global attributes:
                :description = ""ocean T grid variables"" ;
                :conventions = ""CF-1.1"" ;
                :production = ""An IPSL model"" ;
                :start_date = 20040101 ;
                :output_frequency = ""5d"" ;
                :CONFIG = ""NATL60"" ;
                :CASE = ""MJM155"" ;
                :_Format = ""netCDF-4"" ;
}
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,165104458
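[Editor's note] The dump above shows each 3D field stored with on-disk chunks of `1, 1, 12, 5422` (one time step, one depth level, 12 rows, a full row of x). Requesting dask chunks that align with whole depth levels therefore keeps each read small. A minimal, self-contained sketch of that idea (dimension sizes shrunk drastically from the real `(1, 300, 3454, 5422)`; the variable and dimension names come from the dump, everything else is synthetic stand-in data):

```python
import numpy as np
import xarray as xr

# Tiny stand-in for one 5-day-mean file: dims mirror the ncdump header,
# sizes shrunk from (1, 300, 3454, 5422) to keep this runnable anywhere.
temp = xr.DataArray(
    np.random.rand(1, 4, 6, 8).astype("float32"),
    dims=("time_counter", "deptht", "y", "x"),
    name="votemper",
)
ds = xr.Dataset({"votemper": temp})

# Concatenate several "files" along time_counter (as open_mfdataset would),
# then chunk one time step / one depth level at a time, mirroring the
# on-disk _ChunkSizes of 1, 1, 12, 5422.
many = xr.concat([ds] * 5, dim="time_counter").chunk(
    {"time_counter": 1, "deptht": 1}
)

# Operations stay lazy; only the small reduced result is materialized.
sst = many["votemper"].isel(deptht=0).mean(dim=("y", "x"))
print(sst.compute().shape)  # (5,)
```

Because each chunk holds a single depth level, the per-level reduction never needs more than one level's worth of data in memory at a time.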
https://github.com/pydata/xarray/issues/896#issuecomment-232101159,https://api.github.com/repos/pydata/xarray/issues/896,232101159,MDEyOklzc3VlQ29tbWVudDIzMjEwMTE1OQ==,1217238,2016-07-12T16:28:40Z,2016-07-12T16:28:40Z,MEMBER,"This error indicates that Python is running out of memory. Dask (which we use with mfdataset) can help with that, but it doesn't always solve the issue. How is the data arranged in each of the input files? What version of dask are you using? This _might_ be an issue with dask.array not fusing calls to `__getitem__` and loading entire files into memory. cc @mrocklin @jcrist ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,165104458
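[Editor's note] The lazy-loading behavior discussed above can be reproduced end-to-end with the current xarray API. This is an illustrative sketch, not the thread's actual code: the file names and sizes are invented, `combine="nested"` postdates the xarray 0.7.2 used in the report, and `chunks={"deptht": 1}` is just one reasonable choice given the on-disk chunking shown earlier.

```python
import os
import tempfile

import numpy as np
import xarray as xr

# Write a few small single-time-step files, mimicking one file per 5-day mean.
tmpdir = tempfile.mkdtemp()
paths = []
for t in range(3):
    ds = xr.Dataset(
        {
            "votemper": (
                ("time_counter", "deptht", "y", "x"),
                np.full((1, 2, 3, 4), t, dtype="float32"),
            )
        },
        coords={"time_counter": [t]},
    )
    path = os.path.join(tmpdir, f"step{t}.nc")
    ds.to_netcdf(path)
    paths.append(path)

# Combine lazily along the unlimited time_counter dimension; dask chunks of
# one depth level avoid pulling a whole file into memory per access.
combined = xr.open_mfdataset(
    paths, combine="nested", concat_dim="time_counter", chunks={"deptht": 1}
)
print(combined["votemper"].shape)  # (3, 2, 3, 4)
```

Each file contributes one time step, so the combined array has three; nothing is read until a `.compute()` or `.values` access forces it.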