html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/2863#issuecomment-479223369,https://api.github.com/repos/pydata/xarray/issues/2863,479223369,MDEyOklzc3VlQ29tbWVudDQ3OTIyMzM2OQ==,1217238,2019-04-02T22:00:07Z,2019-04-02T22:00:07Z,MEMBER,"This is a rather large file. The T_2M variable has about 8 billion elements, which won't fit into memory when you load them all at once as float64.

I highly recommend taking a look at dask for this problem: http://xarray.pydata.org/en/stable/dask.html
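
For example, a minimal sketch (the file name, the chunk size, and the assumption that `T_2M` has a `time` dimension are only illustrative):

```python
import xarray as xr

# Open the file with dask-backed (lazy) arrays instead of loading everything
# into memory at once; 'eobs_file.nc' and the chunking are placeholders.
eobs = xr.open_dataset('eobs_file.nc', chunks={'time': 100})

# Computations stay lazy and are processed chunk by chunk on .compute():
t2m_mean = eobs['T_2M'].mean('time').compute()
```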

(or selecting out some of your data first)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,428180638
https://github.com/pydata/xarray/issues/2863#issuecomment-479007709,https://api.github.com/repos/pydata/xarray/issues/2863,479007709,MDEyOklzc3VlQ29tbWVudDQ3OTAwNzcwOQ==,30219501,2019-04-02T13:56:05Z,2019-04-02T15:09:34Z,NONE,"It could really be a memory problem. A smaller dataset with internally compressed NETCDF4 data could be read. I have 50 GB of memory available.
Is there a memory leak when reading these types of files?

","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,428180638
https://github.com/pydata/xarray/issues/2863#issuecomment-478961666,https://api.github.com/repos/pydata/xarray/issues/2863,478961666,MDEyOklzc3VlQ29tbWVudDQ3ODk2MTY2Ng==,30219501,2019-04-02T11:51:25Z,2019-04-02T11:52:30Z,NONE,"I cannot even access the data with eobs[""T_2M""].data. Maybe the file is corrupt, but 'ncdump' works on this file.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,428180638