html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/1784#issuecomment-367166682,https://api.github.com/repos/pydata/xarray/issues/1784,367166682,MDEyOklzc3VlQ29tbWVudDM2NzE2NjY4Mg==,2443309,2018-02-21T00:10:04Z,2018-02-21T00:10:04Z,MEMBER,"> What does ds.to_netcdf(...) usually return?
If `sync=False`, the store is returned; otherwise nothing is returned.
> The term future, when used in a Dask context, generally refers to something that is off computing asynchronously somewhere, rather than a token that holds onto a yet-to-be-submitted lazy graph.
Thanks for the clarification. I wasn't aware of that distinction but it does make sense.
> What is store in this case?
A `store` is an `AbstractWritableDataStore`, basically a wrapper class that lets us read/write various file formats with various APIs under a common interface. Notably, each `store` has a `writer` attribute with a `sync` method that calls `dask.array.store`.
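As a dependency-free sketch of the layering described above (these are illustrative stand-ins, not xarray's actual classes):

```Python
# Hypothetical sketch of the store/writer pattern -- names are
# illustrative only, not xarray's real implementation.
class ArrayWriter:
    """Accumulates pending (source, target) writes; sync() flushes them."""
    def __init__(self):
        self.pending = []

    def add(self, source, target):
        self.pending.append((source, target))

    def sync(self):
        # In xarray this step dispatches to dask.array.store; here we
        # just copy values to keep the sketch dependency-free.
        for source, target in self.pending:
            target[:] = source
        self.pending = []


class WritableDataStore:
    """Common interface over different file formats; owns a writer."""
    def __init__(self):
        self.writer = ArrayWriter()

    def sync(self):
        self.writer.sync()
```

The point is that the format-specific store delegates the actual array writing to its `writer`, so the dask-facing logic lives in one place.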
----
Another way to do this would be to have user code interact with the sync method directly:
```Python
store = ds.to_netcdf('file.nc', sync=False)
# store.sync calls store.writer.sync(), which in turn calls dask.array.store
delayed_things = store.sync(compute=False)
```
This has the advantage of keeping the `to_netcdf` method a bit cleaner, but it does expose the `AbstractWritableDataStore` to user code, which is typically not a public API object. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,282178751
https://github.com/pydata/xarray/issues/1784#issuecomment-367162456,https://api.github.com/repos/pydata/xarray/issues/1784,367162456,MDEyOklzc3VlQ29tbWVudDM2NzE2MjQ1Ng==,2443309,2018-02-20T23:49:41Z,2018-02-20T23:49:41Z,MEMBER,"@shoyer - Do you have thoughts on how this feature should be exposed to the user? In #1811, I have added the `compute` keyword argument to `to_netcdf` and `to_zarr` and put a `futures` attribute on each store. So the workflow there would be something like:
```Python
store = ds.to_netcdf('file.nc', compute=False)
dask.compute(store.futures)
```
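As a dependency-free illustration of the `compute=False` pattern (not xarray's or dask's actual internals; the names below are made up), the idea is that the write call records the pending work and the caller triggers it later:

```Python
# Hypothetical sketch of a deferred write -- illustrative names only.
class DelayedWrite:
    """Stands in for a dask delayed/future: a write not yet executed."""
    def __init__(self, fn):
        self._fn = fn

    def compute(self):
        return self._fn()


def to_netcdf_sketch(data, path, compute=True):
    """Record the write; run it immediately unless compute=False."""
    written = []

    def do_write():
        written.append((path, list(data)))  # pretend this hits the disk
        return path

    write = DelayedWrite(do_write)
    if compute:
        write.compute()
    return write, written
```

With `compute=False` nothing touches the file until the caller explicitly asks for it, which is the behavior the keyword argument is meant to give the user.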
Before I spend too much time on #1811, I want to get some buy-in on the API for this feature. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,282178751