html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/4918#issuecomment-781514156,https://api.github.com/repos/pydata/xarray/issues/4918,781514156,MDEyOklzc3VlQ29tbWVudDc4MTUxNDE1Ng==,68662648,2021-02-18T17:35:45Z,2021-02-18T17:35:45Z,NONE,"Upon looking around, I found I needed to call June_ULCVarXr.load() before to_netcdf(). I think it was trying to use dask to write, so to make sure, I tried .load() first. It worked.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,809853736
https://github.com/pydata/xarray/issues/4918#issuecomment-781487143,https://api.github.com/repos/pydata/xarray/issues/4918,781487143,MDEyOklzc3VlQ29tbWVudDc4MTQ4NzE0Mw==,68662648,2021-02-18T16:56:12Z,2021-02-18T16:57:16Z,NONE,"So I tested saving a file without running it through dask

Maybe it was trying to use dask to write the file? I'm going to rerun the code with .load() before to_netcdf to see if that helps.
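
That pattern can be sketched like this (a minimal example with a tiny invented array standing in for the real data; it assumes dask is installed, and 'windspeed' is just an illustrative name):

```python
import numpy as np
import xarray as xr

# Tiny invented stand-in for the dask-backed result of the processing
da = xr.DataArray(np.arange(6.0).reshape(2, 3),
                  dims=('time', 'lat'), name='windspeed')
lazy = da.chunk({'time': 1})   # dask-backed, like data from open_mfdataset

loaded = lazy.load()           # compute and pull the values into memory first
# loaded.to_netcdf('out.nc')   # writing afterwards no longer goes through dask
```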
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,809853736
https://github.com/pydata/xarray/issues/4918#issuecomment-780934253,https://api.github.com/repos/pydata/xarray/issues/4918,780934253,MDEyOklzc3VlQ29tbWVudDc4MDkzNDI1Mw==,68662648,2021-02-17T23:59:49Z,2021-02-18T00:01:31Z,NONE,"> The error does not tell me anything; could you also try to open the data with xarray? You'll need to do something along the lines of:
>
> * open the files with `ds = xr.open_dataset(filename)`
> * select windspeed `da = ds.windspeed`
> * subset the array using `da.sel` or `da.isel` (http://xarray.pydata.org/en/stable/indexing.html)
> * use one of the functions to combine the data: http://xarray.pydata.org/en/stable/combining.html
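
The steps above can be sketched as follows (a minimal example; the datasets, coordinate values, and slice bounds are invented stand-ins for the real files):

```python
import numpy as np
import xarray as xr

# With real files this would be: ds1 = xr.open_dataset(filename)
ds1 = xr.Dataset({'windspeed': (('time', 'lat'), np.zeros((4, 3)))},
                 coords={'time': range(4), 'lat': [10.0, 20.0, 30.0]})
ds2 = xr.Dataset({'windspeed': (('time', 'lat'), np.ones((4, 3)))},
                 coords={'time': range(4, 8), 'lat': [10.0, 20.0, 30.0]})

da1 = ds1.windspeed                      # select the windspeed variable
da2 = ds2.windspeed

sub1 = da1.isel(time=slice(0, 2))        # positional (integer) subsetting
sub2 = da2.sel(time=slice(4, 5))         # label-based subsetting (inclusive)

combined = xr.concat([sub1, sub2], dim='time')  # combine along time
```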

The computer the data is stored on is undergoing maintenance tonight -- I'll get on it tomorrow morning to check!
I'm more or less trying out a way to handle large data, and I'd love to solve this output error: if I get it working, this would be a method I'd use on a lot of extremely large netCDFs going forward.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,809853736