html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/pull/3858#issuecomment-598577015,https://api.github.com/repos/pydata/xarray/issues/3858,598577015,MDEyOklzc3VlQ29tbWVudDU5ODU3NzAxNQ==,1217238,2020-03-13T06:49:05Z,2020-03-13T06:49:05Z,MEMBER,"If `Nio.open_file` supported these options as keyword arguments instead of environment variables, these arguments could get set in `open_dataset` via `backend_kwargs`.
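Purely as an illustration (PyNIO does not actually accept such keyword arguments today, and the `options` keyword below is made up), that usage could look something like:
```python
import xarray as xr

# Hypothetical: assumes Nio.open_file grew an ``options`` keyword argument,
# which xarray would then forward unchanged via backend_kwargs.
ds = xr.open_dataset(
    'example.grb',
    engine='pynio',
    backend_kwargs={'options': {'SOME_PYNIO_SETTING': 'value'}},
)
```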
I think that would be the ideal resolution, but if I recall correctly, PyNIO isn't under active development anymore. In that case, we could consider adding our own solution in xarray as a new backend argument, but it should be specific to PyNIO, not all xarray backends, i.e., it should live entirely in `xarray/backends/pynio_.py`.
To make this work robustly with all of xarray's file caching machinery (used with dask, etc.), the setup of the environment variables needs to happen inside a helper function wrapping `Nio.open_file`, which could replace `Nio.open_file` on this line:
https://github.com/pydata/xarray/blob/650a981734ce3291f5aaa68648ebde451339f28a/xarray/backends/pynio_.py#L54
I think you could do something similar by overriding the environment variables inside this helper function.
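As a rough sketch of what I mean (the names here are placeholders, not actual xarray code), the helper might look something like:
```python
import functools
import os
import Nio  # PyNIO

def _open_nio_file_with_env(filename, mode='r', env=None):
    # Set the requested environment variables immediately before opening,
    # so they also take effect whenever CachingFileManager re-opens the
    # file (e.g. from a dask worker thread).
    for key, value in (env or {}).items():
        os.environ[key] = value
    return Nio.open_file(filename, mode=mode)

# This partial could then be handed to CachingFileManager in place of
# Nio.open_file on the line linked above (env var name is a placeholder).
opener = functools.partial(_open_nio_file_with_env, env={'SOME_PYNIO_ENV_VAR': '1'})
```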
Ideally you could also do clean-up here, deleting these environment variables, but I'm not sure if there's an easy/safe way to do this currently. These methods can get called in multiple threads (e.g., from dask) and I don't think we have a global clean-up mechanism that would work for this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,579722569
https://github.com/pydata/xarray/pull/3858#issuecomment-598558696,https://api.github.com/repos/pydata/xarray/issues/3858,598558696,MDEyOklzc3VlQ29tbWVudDU5ODU1ODY5Ng==,1217238,2020-03-13T05:38:46Z,2020-03-13T05:38:46Z,MEMBER,"Thanks for putting together this pull request!
My main concern here is that setting environment variables feels pretty decoupled from the logic of `open_dataset`. It's also rather poor design for a library to be configurable only via environment variables, so I wouldn't want to encourage other backends to adopt this practice.
What do you think about writing your own utility for this sort of thing, e.g., based on one of the examples from this StackOverflow question:
https://stackoverflow.com/questions/2059482/python-temporarily-modify-the-current-processs-environment","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,579722569
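For concreteness, a minimal sketch of such a utility along the lines of those StackOverflow answers (`temporary_environ` is a made-up name, and the thread-safety caveat from the earlier comment still applies, since `os.environ` is process-global):
```python
import contextlib
import os

@contextlib.contextmanager
def temporary_environ(**overrides):
    # Temporarily override environment variables, restoring the previous
    # values (or removing the keys) on exit. Note: not thread-safe, since
    # os.environ is shared across the whole process.
    saved = {key: os.environ.get(key) for key in overrides}
    os.environ.update(overrides)
    try:
        yield
    finally:
        for key, value in saved.items():
            if value is None:
                os.environ.pop(key, None)
            else:
                os.environ[key] = value

# Usage (placeholder variable name):
# with temporary_environ(SOME_PYNIO_ENV_VAR='1'):
#     ds = xr.open_dataset('example.grb', engine='pynio')
```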