Comment on pydata/xarray issue #508 (https://github.com/pydata/xarray/issues/508#issuecomment-553709888), posted 2019-11-14 by a CONTRIBUTOR:

I just ran into this issue. While the previous fix seems to handle one case, it doesn't handle all of them. Before I clean this up and open a new PR, does this look like it's on the right track? It worked for my case, where I was concatenating multiple datasets that always had the same dims and coordinates but were sometimes missing variables.

Starting at line 353 of `concat.py`:

```python
for k in datasets[0].variables:
    if k in concat_over:
        try:
            # new code
            for ds in datasets:
                if k not in ds.variables:
                    # Make a new array with the same dimensions and coordinates;
                    # by default this is initialized to np.nan, which is what we want.
                    from .dataarray import DataArray
                    new_array = DataArray(coords=ds.coords, dims=ds.dims)
                    ds[k] = new_array
            # end new code
            vars = ensure_common_dims([ds.variables[k] for ds in datasets])
        except KeyError:
            # this can likely be removed then
            raise ValueError("%r is not present in all datasets." % k)
        combined = concat_vars(vars, dim, positions)
        assert isinstance(combined, Variable)
        result_vars[k] = combined
```
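The idea generalizes beyond xarray internals: pad each dataset's missing variables with NaN so every dataset exposes the same variable set, then concatenate. A minimal standalone sketch of that pad-then-concatenate approach, using plain dicts of lists rather than real xarray objects (all names here are illustrative, not xarray API):

```python
import math

def concat_padded(datasets):
    """Concatenate dict-of-lists 'datasets' along their only axis,
    filling variables missing from a dataset with NaN."""
    # Union of all variable names across datasets.
    all_vars = set()
    for ds in datasets:
        all_vars.update(ds)

    result = {}
    for k in sorted(all_vars):
        combined = []
        for ds in datasets:
            if k in ds:
                combined.extend(ds[k])
            else:
                # Pad with NaN, matching this dataset's length
                # (taken from any variable it does have).
                length = len(next(iter(ds.values())))
                combined.extend([math.nan] * length)
        result[k] = combined
    return result

a = {"temp": [1.0, 2.0], "salt": [3.0, 4.0]}
b = {"temp": [5.0]}  # "salt" is missing here
merged = concat_padded([a, b])
# merged["temp"] == [1.0, 2.0, 5.0]
# merged["salt"] == [3.0, 4.0, nan]
```

This mirrors what the patch above does inside xarray: the missing variable is materialized as an all-NaN array of the right shape before the common concatenation path runs.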
