issues
2 rows where user = 48155582 sorted by updated_at descending
id | node_id | number | title | user | state | locked | comments | created_at | updated_at | closed_at | author_association
---|---|---|---|---|---|---|---|---|---|---|---
503583044 | MDU6SXNzdWU1MDM1ODMwNDQ= | 3379 | `ds.to_zarr(mode="a", append_dim="time")` not capturing any time steps under Hours | jminsk-cc 48155582 | closed | 0 | 3 | 2019-10-07T17:17:06Z | 2024-05-03T18:34:50Z | 2024-05-03T18:34:50Z | NONE

MCVE Code Sample

```python
import datetime

import xarray as xr

date = datetime.datetime(2019, 1, 1, 1, 10)

# Reading in 2 min time stepped MRMS data
ds = xr.open_rasterio(dir_path)
ds.name = "mrms"
ds["time"] = date
ds = ds.expand_dims("time")
ds = ds.to_dataset()
ds.to_zarr("fin_zarr", compute=False, mode="w-")

date = datetime.datetime(2019, 1, 1, 1, 12)

# Reading in 2 min time stepped MRMS data
# This can be the same file since we are adding time manually
ds = xr.open_rasterio(dir_path)
ds.name = "mrms"
ds["time"] = date
ds = ds.expand_dims("time")
ds = ds.to_dataset()
ds.to_zarr("fin_zarr", compute=False, mode="a", append_dim="time")
```

Expected Output
Problem Description

The output shows the minutes repeated for the whole hour until a new hour is appended. It seems to not be handling minutes correctly.

Output of `xr.show_versions()` (not included in this export)
reactions: https://api.github.com/repos/pydata/xarray/issues/3379/reactions (all counts 0)
state_reason: completed | repo: xarray 13221727 | type: issue
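One possible reading of the symptom above: when appending, xarray reuses the time encoding (units and dtype) fixed at the first write, and an on-disk encoding too coarse to represent 2-minute offsets would decode every step within an hour to the same value. Below is a minimal, hedged sketch of pinning the time encoding explicitly at store creation; the `make_step` helper, the toy data, the units string, and the store path are illustrative assumptions, not taken from the issue.

```python
import numpy as np
import pandas as pd
import xarray as xr


def make_step(ts):
    # Tiny stand-in for one 2-minute MRMS time step from the reproducer above.
    return xr.Dataset(
        {"mrms": (("time", "y", "x"), np.zeros((1, 2, 2)))},
        coords={"time": [pd.Timestamp(ts)]},
    )


# Pin an encoding that can represent 2-minute steps when the store is created;
# later appends reuse whatever time encoding is already on disk.
make_step("2019-01-01 01:10").to_zarr(
    "fin_zarr",
    mode="w-",
    encoding={"time": {"units": "minutes since 2019-01-01", "dtype": "int64"}},
)
make_step("2019-01-01 01:12").to_zarr("fin_zarr", mode="a", append_dim="time")

# Both minute-level timestamps now round-trip.
print(xr.open_zarr("fin_zarr").time.values)
```

With an integer `minutes since` encoding on disk, the 01:10 and 01:12 steps stay distinct after the append.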
id | node_id | number | title | user | state | locked | comments | created_at | updated_at | closed_at | author_association
---|---|---|---|---|---|---|---|---|---|---|---
484592018 | MDU6SXNzdWU0ODQ1OTIwMTg= | 3251 | to_zarr append with gcsmap does not work properly | jminsk-cc 48155582 | closed | 0 | 12 | 2019-08-23T15:35:08Z | 2019-08-26T20:20:05Z | 2019-08-26T20:20:05Z | NONE

MCVE Code Sample

```python
import gcsfs
from gcsfs import mapping

import xarray as xr

gcs_bucket_name = '<bucket_name_here>'
gcsmap = mapping.GCSMap(gcs_bucket_name)

# `ds` is an existing xarray Dataset (its construction is omitted in the report).
ds.to_zarr(store=gcsmap, mode='a', consolidated=True, compute=False, append_dim='time')
```

Expected Output

Zarr store in bucket appended on time dimension.

Problem Description
```python
ValueError Traceback (most recent call last)
<ipython-input-103-cb22683dde66> in <module>
----> 1 ds.to_zarr(store=gcsmap, mode='a', consolidated=True, compute=False, append_dim='time')
~/anaconda3/envs/intake/lib/python3.7/site-packages/xarray/core/dataset.py in to_zarr(self, store, mode, synchronizer, group, encoding, compute, consolidated, append_dim)
1431 return to_zarr(self, store=store, mode=mode, synchronizer=synchronizer,
1432 group=group, encoding=encoding, compute=compute,
-> 1433 consolidated=consolidated, append_dim=append_dim)
1434
1435 def __repr__(self):
~/anaconda3/envs/intake/lib/python3.7/site-packages/xarray/backends/api.py in to_zarr(dataset, store, mode, synchronizer, group, encoding, compute, consolidated, append_dim)
1090 group=group,
1091 consolidated=consolidated,
-> 1092 encoding=encoding)
1093
1094 zstore = backends.ZarrStore.open_group(store=store, mode=mode,
~/anaconda3/envs/intake/lib/python3.7/site-packages/xarray/backends/api.py in _validate_append_dim_and_encoding(ds_to_append, store, append_dim, encoding, **open_kwargs)
1053 if append_dim not in ds.dims:
1054 raise ValueError(
-> 1055 "{} not a valid dimension in the Dataset".format(append_dim)
1056 )
1057 for data_var in ds_to_append:
ValueError: time not a valid dimension in the Dataset
```
Line 1296 in `xarray.backends.api`:
```python
def to_zarr(
dataset,
store=None,
mode=None,
synchronizer=None,
group=None,
encoding=None,
compute=True,
consolidated=False,
append_dim=None,
):
"""This function creates an appropriate datastore for writing a dataset to
a zarr ztore
See `Dataset.to_zarr` for full API docs.
"""
if isinstance(store, Path):
store = str(store)
```
The mutable mapping created by gcsfs gets turned into a string, which zarr can no longer open as a store. This was tested by running

```python
import gcsfs
from gcsfs import mapping

import xarray as xr
from xarray import backends

gcs_bucket_name = '<bucket_name_here>'
gcsmap = mapping.GCSMap(gcs_bucket_name)

# same way it is called in _validate_append_dim_and_encoding, minus str(gcsmap)
ds = backends.zarr.open_zarr(gcsmap)
```

and

```python
import gcsfs
from gcsfs import mapping

import xarray as xr
from xarray import backends

gcs_bucket_name = '<bucket_name_here>'
gcsmap = mapping.GCSMap(gcs_bucket_name)

# same way it is called in _validate_append_dim_and_encoding
ds = backends.zarr.open_zarr(str(gcsmap))
```

The top example reads the data in correctly while the second one fails with:
```python
ValueError Traceback (most recent call last)
<ipython-input-102-52e941f0a8f2> in <module>
----> 1 ds = backends.zarr.open_zarr(str(gcsmap))
~/anaconda3/envs/intake/lib/python3.7/site-packages/xarray/backends/zarr.py in open_zarr(store, group, synchronizer, chunks, decode_cf, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, consolidated, overwrite_encoded_chunks, **kwargs)
554 zarr_store = ZarrStore.open_group(store, mode=mode,
555 synchronizer=synchronizer,
--> 556 group=group, consolidated=consolidated)
557 ds = maybe_decode_store(zarr_store)
558
~/anaconda3/envs/intake/lib/python3.7/site-packages/xarray/backends/zarr.py in open_group(cls, store, mode, synchronizer, group, consolidated, consolidate_on_close)
250 zarr_group = zarr.open_consolidated(store, **open_kwargs)
251 else:
--> 252 zarr_group = zarr.open_group(store, **open_kwargs)
253 return cls(zarr_group, consolidate_on_close)
254
~/anaconda3/envs/intake/lib/python3.7/site-packages/zarr/hierarchy.py in open_group(store, mode, cache_attrs, synchronizer, path, chunk_store)
1114 err_contains_array(path)
1115 elif not contains_group(store, path=path):
-> 1116 err_group_not_found(path)
1117
1118 elif mode == 'w':
~/anaconda3/envs/intake/lib/python3.7/site-packages/zarr/errors.py in err_group_not_found(path)
27
28 def err_group_not_found(path):
---> 29 raise ValueError('group not found at path %r' % path)
30
31
ValueError: group not found at path ''
```
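To make the mapping-versus-string distinction concrete, here is a small sketch that is not from the issue: it substitutes an in-memory fsspec mapping for the GCS bucket (so no credentials are needed) and assumes the zarr 2.x era stack this report was written against. Passing the mapping works; passing `str()` of it hands zarr an arbitrary repr that it falls back to treating as a path.

```python
import fsspec
import numpy as np
import xarray as xr

# Stand-in for the GCS bucket: an in-memory fsspec mapping.
mapper = fsspec.get_mapper("memory://demo-bucket")

ds = xr.Dataset({"mrms": (("time",), np.arange(3))})
ds.to_zarr(store=mapper, mode="w")

# Passing the mapping itself works ...
print(xr.open_zarr(mapper))

# ... but str(mapper) is just an object repr, which zarr treats as a path,
# so the store cannot be found -- the same failure mode as the traceback above.
try:
    xr.open_zarr(str(mapper))
except Exception as err:
    print(type(err).__name__, err)
```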
Output of `xr.show_versions()` (not included in this export)

reactions: https://api.github.com/repos/pydata/xarray/issues/3251/reactions (all counts 0)
state_reason: completed | repo: xarray 13221727 | type: issue
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
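For reference, a hedged sketch of reproducing the "2 rows where user = 48155582 sorted by updated_at descending" view above with plain sqlite3 against a local copy of this database; the `github.db` filename is an assumption, not given in the export.

```python
import sqlite3

# Query the issues table for this user's issues, newest activity first,
# mirroring the filter described at the top of this page.
conn = sqlite3.connect("github.db")  # assumed filename
rows = conn.execute(
    """
    SELECT number, title, state, state_reason, comments, created_at, updated_at, closed_at
    FROM issues
    WHERE [user] = 48155582
    ORDER BY updated_at DESC
    """
).fetchall()
for row in rows:
    print(row)
conn.close()
```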