html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/847#issuecomment-219529589,https://api.github.com/repos/pydata/xarray/issues/847,219529589,MDEyOklzc3VlQ29tbWVudDIxOTUyOTU4OQ==,1217238,2016-05-16T19:58:29Z,2016-05-16T19:58:29Z,MEMBER,"> What do you think about the reverse approach - checking for a list of known array types, and everything else becomes a scalar?
Sure, we could probably make this work.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,154818715
https://github.com/pydata/xarray/issues/847#issuecomment-219527492,https://api.github.com/repos/pydata/xarray/issues/847,219527492,MDEyOklzc3VlQ29tbWVudDIxOTUyNzQ5Mg==,5635139,2016-05-16T19:50:26Z,2016-05-16T19:50:26Z,MEMBER,"> There are ways to do this besides checking against a white-list of scalar types, but we can't take the obvious approach of converting everything into a numpy array and then checking the dimensionality, because this can't be done safely for some types (e.g., dask.array).
What do you think about the reverse approach - checking for a list of known array types, and everything else becomes a scalar?
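Roughly, I'm imagining something like the sketch below (the `KNOWN_ARRAY_TYPES` list and `is_scalar` helper are made-up names for illustration, not existing xarray internals):
```
import numpy as np

# Hypothetical registry of known array types; optional backends like
# dask are only added if they can be imported.
KNOWN_ARRAY_TYPES = [np.ndarray]

try:
    import dask.array
    KNOWN_ARRAY_TYPES.append(dask.array.Array)
except ImportError:
    pass


def is_scalar(value):
    # Anything that is not a known array type (and not a plain Python
    # sequence of values) would be treated as a 0-dimensional scalar.
    return not isinstance(value, tuple(KNOWN_ARRAY_TYPES) + (list, tuple))
```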
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,154818715
https://github.com/pydata/xarray/issues/847#issuecomment-219466037,https://api.github.com/repos/pydata/xarray/issues/847,219466037,MDEyOklzc3VlQ29tbWVudDIxOTQ2NjAzNw==,1217238,2016-05-16T16:04:21Z,2016-05-16T16:04:21Z,MEMBER,"You can put anything you want in a Dataset if you provide dimensions explicitly:
```
In [1]: import xarray as xr
In [2]: class Foo:
   ...:     pass
   ...:
In [3]: xr.Dataset({'foo': ([], Foo())})
Out[3]:
Dimensions:  ()
Coordinates:
    *empty*
Data variables:
    foo      object <__main__.Foo object at 0x10824cba8>
```
The problem is that we need some rule to detect the dimensionality of input values, so we know whether we can treat them as scalars or should raise an error. There are ways to do this besides checking against a white-list of scalar types, but we can't take the obvious approach of converting everything into a numpy array and then checking the dimensionality, because this can't be done safely for some types (e.g., dask.array).
I'm certainly open to ideas on how to improve this. At the very least, we should improve the error message -- the error about mismatched dimensions arises because we assume that anything that isn't a scalar but is used as a key in the `data_vars` dict is a 1D coordinate variable along a dimension.
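For reference, the white-list idea amounts to something like the following rough sketch (the types and the `infer_ndim` helper here are illustrative, not the actual xarray code):
```
import numbers
import numpy as np

# Illustrative white-list of types that are always treated as scalars.
SCALAR_TYPES = (str, bytes, numbers.Number, np.generic, type(None))


def infer_ndim(value):
    # Values on the white-list are 0-dimensional; everything else is
    # assumed to be array-like, so we fall back on np.asarray. That
    # fallback is the unsafe part: calling np.asarray on a dask array
    # would load the whole thing into memory.
    if isinstance(value, SCALAR_TYPES):
        return 0
    return np.asarray(value).ndim
```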
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,154818715