issue_comments: 120442769


html_url: https://github.com/pydata/xarray/issues/463#issuecomment-120442769
issue_url: https://api.github.com/repos/pydata/xarray/issues/463
id: 120442769
node_id: MDEyOklzc3VlQ29tbWVudDEyMDQ0Mjc2OQ==
user: 1197350
created_at: 2015-07-10T15:53:48Z
updated_at: 2015-07-10T15:53:48Z
author_association: MEMBER
issue: 94328498

Just a little follow-up... I tried to work around the file limit by serializing the processing of the files and creating xray datasets with fewer files in each. However, I still eventually hit this error, which suggests that the files are never being closed. For example:

I would like to do:

```python
ds = xray.open_mfdataset(ddir + '*.nc', engine='scipy')
EKE = (ds.variables['u']**2 + ds.variables['v']**2).mean(dim='time').load()
```

This tries to open 8031 files and fails with `[Errno 24] Too many open files`.
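(For reference, a quick way to confirm how many files the pattern matches; `ddir` is the same data directory variable used above:)

```python
import glob

# Count how many files open_mfdataset would try to open.
n_files = len(glob.glob(ddir + '*.nc'))
print n_files  # roughly 8031, well above the 1024-file ulimit
```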

So then I try to create a new dataset for each year:

```python
EKE = []
for yr in xrange(1993, 2015):
    print yr
    # this opens about 365 files
    ds = xray.open_mfdataset(ddir + '/dt_global_allsat_msla_uv_%04d*.nc' % yr, engine='scipy')
    EKE.append((ds.variables['u']**2 + ds.variables['v']**2).mean(dim='time').load())
```

This works fine for the first two years, but by the third year I again get `[Errno 24] Too many open files`; that is the point at which the ulimit of 1024 open files is exceeded.
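In case it helps with debugging, here is a rough sketch of how I am checking the descriptor limit and how many files the process is actually holding open between loop iterations (Linux-only; the helper name is mine, not part of xray):

```python
import os
import resource

def report_open_files(label=''):
    # Soft/hard per-process limits on open file descriptors.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    # On Linux, /proc/self/fd lists the descriptors this process holds.
    n_open = len(os.listdir('/proc/self/fd'))
    print '%s open files: %d (soft limit %d)' % (label, n_open, soft)

# e.g. call report_open_files(str(yr)) at the top of each loop iteration
```

The count keeps growing from one year to the next, which is what makes me think the files opened by earlier iterations are never released.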

Using xray version 0.5.1, installed via conda.
