issue_comments
10 rows where author_association = "NONE" and issue = 350899839 ("Let's list all the netCDF files that xarray can't open"), sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
1382724561 | https://github.com/pydata/xarray/issues/2368#issuecomment-1382724561 | https://api.github.com/repos/pydata/xarray/issues/2368 | IC_kwDOAMm_X85SarPR | ronygolderku 64892520 | 2023-01-14T12:11:03Z | 2023-01-14T12:11:03Z | NONE |
Is there any solution? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
1382669848 | https://github.com/pydata/xarray/issues/2368#issuecomment-1382669848 | https://api.github.com/repos/pydata/xarray/issues/2368 | IC_kwDOAMm_X85Sad4Y | ronygolderku 64892520 | 2023-01-14T05:56:44Z | 2023-01-14T05:56:44Z | NONE | Found this one. The dataset was provided based on a request; that's why... Anyway, anybody who wants to check can find this polar front dataset.
Output:
```
MissingDimensionsError                    Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_9796\2161474679.py in <module>
----> 1 data = xr.open_dataset("C:/Users/admin/Downloads/CTOH_PolarFront_weekly_1993_2019.nc")

~\anaconda3\lib\site-packages\xarray\backends\api.py in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, backend_kwargs, *args, **kwargs)
    493
    494     overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
--> 495     backend_ds = backend.open_dataset(
    496         filename_or_obj,
    497         drop_variables=drop_variables,

~\anaconda3\lib\site-packages\xarray\backends\netCDF4_.py in open_dataset(self, filename_or_obj, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta, group, mode, format, clobber, diskless, persist, lock, autoclose)
    562         store_entrypoint = StoreBackendEntrypoint()
    563         with close_on_error(store):
--> 564             ds = store_entrypoint.open_dataset(
    565                 store,
    566                 mask_and_scale=mask_and_scale,

~\anaconda3\lib\site-packages\xarray\backends\store.py in open_dataset(self, store, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta)
     37         )
     38
---> 39         ds = Dataset(vars, attrs=attrs)
     40         ds = ds.set_coords(coord_names.intersection(vars))
     41         ds.set_close(store.close)

~\anaconda3\lib\site-packages\xarray\core\dataset.py in __init__(self, data_vars, coords, attrs)
    749             coords = coords.variables
    750
--> 751         variables, coord_names, dims, indexes, _ = merge_data_and_coords(
    752             data_vars, coords, compat="broadcast_equals"
    753         )

~\anaconda3\lib\site-packages\xarray\core\merge.py in merge_data_and_coords(data, coords, compat, join)
    486     explicit_coords = coords.keys()
    487     indexes = dict(_extract_indexes_from_coords(coords))
--> 488     return merge_core(
    489         objects, compat, join, explicit_coords=explicit_coords, indexes=indexes
    490     )

~\anaconda3\lib\site-packages\xarray\core\merge.py in merge_core(objects, compat, join, combine_attrs, priority_arg, explicit_coords, indexes, fill_value)
    635         coerced, join=join, copy=False, indexes=indexes, fill_value=fill_value
    636     )
--> 637     collected = collect_variables_and_indexes(aligned)
    638
    639     prioritized = _get_priority_vars_and_indexes(aligned, priority_arg, compat=compat)

~\anaconda3\lib\site-packages\xarray\core\merge.py in collect_variables_and_indexes(list_of_mappings)
    294                 append_all(coords, indexes)
    295
--> 296             variable = as_variable(variable, name=name)
    297
    298             if variable.dims == (name,):

~\anaconda3\lib\site-packages\xarray\core\variable.py in as_variable(obj, name)
    156             # convert the Variable into an Index
    157             if obj.ndim != 1:
--> 158                 raise MissingDimensionsError(
    159                     f"{name!r} has more than 1-dimension and the same name as one of its "
    160                     f"dimensions {obj.dims!r}. xarray disallows such variables because they "

MissingDimensionsError: 'longitude' has more than 1-dimension and the same name as one of its dimensions ('time', 'longitude'). xarray disallows such variables because they conflict with the coordinates used to label dimensions.
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
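Editor's sketch (not part of the thread): the MissingDimensionsError above is triggered by a 2-D variable named after one of its own dimensions. Renaming the offending variable before the Dataset is built sidesteps the clash. Here the file's layout is mimicked in memory with made-up values, and `longitude_2d` is a hypothetical replacement name:

```python
import numpy as np
import xarray as xr

# The problematic layout: a 2-D variable 'longitude' whose dims include a
# dimension also called 'longitude' (as reported in the traceback above).
raw = {
    "longitude": (("time", "longitude"), np.arange(6.0).reshape(2, 3)),
    "sst": (("time", "longitude"), np.ones((2, 3))),
}

# Workaround: rename the conflicting variable so it no longer shadows its
# own dimension; 'longitude_2d' is a made-up name for illustration.
fixed = {
    ("longitude_2d" if name == "longitude" else name): spec
    for name, spec in raw.items()
}
ds = xr.Dataset(fixed)
print(sorted(ds.data_vars))  # ['longitude_2d', 'sst']
```

The same rename can be applied to the file itself with the netCDF4 library, after which `xr.open_dataset` accepts it.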
1320993705 | https://github.com/pydata/xarray/issues/2368#issuecomment-1320993705 | https://api.github.com/repos/pydata/xarray/issues/2368 | IC_kwDOAMm_X85OvMOp | maxaragon 39450418 | 2022-11-19T23:46:35Z | 2022-11-20T13:31:45Z | NONE | Found another example from the ICON NWP model. The files open with the netCDF4 library but not with xarray.
Error:
```
TypeError                                 Traceback (most recent call last)
Cell In [109], line 1
----> 1 ds = xr.open_dataset('20200825_hyytiala_icon-iglo-12-23.nc')

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/backends/api.py:531, in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, backend_kwargs, **kwargs)
    519 decoders = _resolve_decoders_kwargs(
    520     decode_cf,
    521     open_backend_dataset_parameters=backend.open_dataset_parameters,
   (...)
    527     decode_coords=decode_coords,
    528 )
    530 overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
--> 531 backend_ds = backend.open_dataset(
    532     filename_or_obj,
    533     drop_variables=drop_variables,
    534     **decoders,
    535     **kwargs,
    536 )
    537 ds = _dataset_from_backend_dataset(
    538     backend_ds,
    539     filename_or_obj,
   (...)
    547     **kwargs,
    548 )
    549 return ds

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/backends/netCDF4_.py:569, in NetCDF4BackendEntrypoint.open_dataset(self, filename_or_obj, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta, group, mode, format, clobber, diskless, persist, lock, autoclose)
    567 store_entrypoint = StoreBackendEntrypoint()
    568 with close_on_error(store):
--> 569     ds = store_entrypoint.open_dataset(
    570         store,
    571         mask_and_scale=mask_and_scale,
    572         decode_times=decode_times,
    573         concat_characters=concat_characters,
    574         decode_coords=decode_coords,
    575         drop_variables=drop_variables,
    576         use_cftime=use_cftime,
    577         decode_timedelta=decode_timedelta,
    578     )
    579 return ds

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/backends/store.py:29, in StoreBackendEntrypoint.open_dataset(self, store, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta)
     26 vars, attrs = store.load()
     27 encoding = store.get_encoding()
---> 29 vars, attrs, coord_names = conventions.decode_cf_variables(
     30     vars,
     31     attrs,
     32     mask_and_scale=mask_and_scale,
     33     decode_times=decode_times,
     34     concat_characters=concat_characters,
     35     decode_coords=decode_coords,
     36     drop_variables=drop_variables,
     37     use_cftime=use_cftime,
     38     decode_timedelta=decode_timedelta,
     39 )
     41 ds = Dataset(vars, attrs=attrs)
     42 ds = ds.set_coords(coord_names.intersection(vars))

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/conventions.py:509, in decode_cf_variables(variables, attributes, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables, use_cftime, decode_timedelta)
    507 # Time bounds coordinates might miss the decoding attributes
    508 if decode_times:
--> 509     _update_bounds_attributes(variables)
    511 new_vars = {}
    512 for k, v in variables.items():

File ~/.virtualenvs/INAR/lib/python3.10/site-packages/xarray/conventions.py:410, in _update_bounds_attributes(variables)
    408 for v in variables.values():
    409     attrs = v.attrs
--> 410     has_date_units = "units" in attrs and "since" in attrs["units"]
    411     if has_date_units and "bounds" in attrs:
    412         if attrs["bounds"] in variables:

TypeError: argument of type 'numpy.float32' is not iterable
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
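Editor's sketch (not from the thread): the TypeError above arises because xarray's bounds-attribute check assumes `attrs["units"]` is a string, while the ICON file stores a numeric units attribute. A defensive version of that check, for illustration only:

```python
import numpy as np

# xarray's check (conventions.py:410 in the traceback above) is roughly:
#     has_date_units = "units" in attrs and "since" in attrs["units"]
# It raises TypeError when attrs["units"] is a numpy scalar, because
# `"since" in <float32>` is not a valid containment test.
def has_date_units(attrs):
    units = attrs.get("units")
    # Only attempt the substring test when units is actually a string.
    return isinstance(units, str) and "since" in units

print(has_date_units({"units": "seconds since 1970-01-01"}))  # True
print(has_date_units({"units": np.float32(1.0)}))             # False (no TypeError)
```

As the follow-up comment notes, upgrading xarray resolves this; passing `decode_times=False` to `xr.open_dataset` also bypasses the failing code path.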
1320996459 | https://github.com/pydata/xarray/issues/2368#issuecomment-1320996459 | https://api.github.com/repos/pydata/xarray/issues/2368 | IC_kwDOAMm_X85OvM5r | maxaragon 39450418 | 2022-11-20T00:06:17Z | 2022-11-20T00:06:17Z | NONE | @andersy005 indeed, I have updated xarray and it works now. The previous version was:
```
INSTALLED VERSIONS
------------------
commit: None
python: 3.10.6 (main, Aug 30 2022, 04:58:14) [Clang 13.1.6 (clang-1316.0.21.2.5)]
python-bits: 64
OS: Darwin
OS-release: 21.6.0
machine: arm64
processor: i386
byteorder: little
LC_ALL: None
LANG: None
LOCALE: (None, 'UTF-8')
libhdf5: 1.12.2
libnetcdf: 4.9.0

xarray: 2022.6.0
pandas: 1.4.4
numpy: 1.23.2
scipy: 1.9.1
netCDF4: 1.6.0
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: 1.6.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.3.5
dask: None
distributed: None
matplotlib: 3.5.3
cartopy: None
seaborn: 0.12.1
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
flox: None
numpy_groupies: None
setuptools: 63.4.3
pip: 22.2.2
conda: None
pytest: None
IPython: 8.5.0
sphinx: None
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
800374879 | https://github.com/pydata/xarray/issues/2368#issuecomment-800374879 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDgwMDM3NDg3OQ== | ognancy4life 59902324 | 2021-03-16T15:42:25Z | 2021-03-16T15:42:25Z | NONE | @dcherian Thanks for your reply. I think I understand the issue. What, specifically, do you suggest to fix this issue in my own code considering this is not a dataset I generated? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
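Editor's sketch of the usual answer to this question (not a reply from the thread): for a third-party file you cannot regenerate, skip the conflicting variable at open time with `drop_variables`, e.g. `xr.open_dataset("MPI-SOM_FFN_SOCCOMv2018.nc", drop_variables=["date"])` for the SOCCOM file reported in the 2021-02-24 comment below. Demonstrated here on a small self-written file, since the original is not bundled:

```python
import os
import tempfile

import numpy as np
import xarray as xr

# Write a small demo file (uses whichever netCDF backend is installed).
src = xr.Dataset({"sst": ("time", np.ones(3)), "date": ("time", np.arange(3.0))})
path = os.path.join(tempfile.mkdtemp(), "demo.nc")
src.to_netcdf(path)

# drop_variables removes the named variables before xarray assembles the
# Dataset, so a name/dimension clash in a dropped variable never occurs.
ds = xr.open_dataset(path, drop_variables=["date"])
print("date" in ds.variables)  # False
```

If the dropped variable is still needed, it can be read separately with the netCDF4 library and re-attached to the Dataset under a non-conflicting name.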
785298202 | https://github.com/pydata/xarray/issues/2368#issuecomment-785298202 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDc4NTI5ODIwMg== | ognancy4life 59902324 | 2021-02-24T18:54:35Z | 2021-02-24T18:56:11Z | NONE | Found one! https://www.ncei.noaa.gov/data/oceans/ncei/ocads/data/0191304/ The dataset published in Bushinsky et al. (2019), which is basically the Landshutzer et al. (2014) climatology plus SOCCOM float-based pCO2 data, updated through 2018. I've only tried the first file in the list (https://www.ncei.noaa.gov/data/oceans/ncei/ocads/data/0191304/MPI-SOM_FFN_SOCCOMv2018.nc), but I suspect the others will have the same issue. Here's the error (it sounds like you've all discussed this before, but I can't see an easy answer):
```
MissingDimensionsError                    Traceback (most recent call last)
<ipython-input-4-9e0af51f1c05> in <module>
----> 1 SOMFFN = xr.open_dataset('MPI-SOM_FFN_SOCCOMv2018.nc')

~/opt/anaconda3/lib/python3.8/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs, use_cftime, decode_timedelta)
    573
    574     with close_on_error(store):
--> 575         ds = maybe_decode_store(store, chunks)
    576
    577     # Ensure source filename always stored in dataset object (GH issue #2550)

~/opt/anaconda3/lib/python3.8/site-packages/xarray/backends/api.py in maybe_decode_store(store, chunks)
    469
    470     def maybe_decode_store(store, chunks):
--> 471         ds = conventions.decode_cf(
    472             store,
    473             mask_and_scale=mask_and_scale,

~/opt/anaconda3/lib/python3.8/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables, use_cftime, decode_timedelta)
    598         decode_timedelta=decode_timedelta,
    599     )
--> 600     ds = Dataset(vars, attrs=attrs)
    601     ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars))
    602     ds._file_obj = file_obj

~/opt/anaconda3/lib/python3.8/site-packages/xarray/core/dataset.py in __init__(self, data_vars, coords, attrs)
    628             coords = coords.variables
    629
--> 630         variables, coord_names, dims, indexes, _ = merge_data_and_coords(
    631             data_vars, coords, compat="broadcast_equals"
    632         )

~/opt/anaconda3/lib/python3.8/site-packages/xarray/core/merge.py in merge_data_and_coords(data, coords, compat, join)
    465     explicit_coords = coords.keys()
    466     indexes = dict(_extract_indexes_from_coords(coords))
--> 467     return merge_core(
    468         objects, compat, join, explicit_coords=explicit_coords, indexes=indexes
    469     )

~/opt/anaconda3/lib/python3.8/site-packages/xarray/core/merge.py in merge_core(objects, compat, join, combine_attrs, priority_arg, explicit_coords, indexes, fill_value)
    592         coerced, join=join, copy=False, indexes=indexes, fill_value=fill_value
    593     )
--> 594     collected = collect_variables_and_indexes(aligned)
    595
    596     prioritized = _get_priority_vars_and_indexes(aligned, priority_arg, compat=compat)

~/opt/anaconda3/lib/python3.8/site-packages/xarray/core/merge.py in collect_variables_and_indexes(list_of_mappings)
    276                 append_all(coords, indexes)
    277
--> 278             variable = as_variable(variable, name=name)
    279             if variable.dims == (name,):
    280                 variable = variable.to_index_variable()

~/opt/anaconda3/lib/python3.8/site-packages/xarray/core/variable.py in as_variable(obj, name)
    152         # convert the Variable into an Index
    153         if obj.ndim != 1:
--> 154             raise MissingDimensionsError(
    155                 "%r has more than 1-dimension and the same name as one of its "
    156                 "dimensions %r. xarray disallows such variables because they "

MissingDimensionsError: 'date' has more than 1-dimension and the same name as one of its dimensions ('time', 'date'). xarray disallows such variables because they conflict with the coordinates used to label dimensions.
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
580442427 | https://github.com/pydata/xarray/issues/2368#issuecomment-580442427 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDU4MDQ0MjQyNw== | blaylockbk 6249613 | 2020-01-30T20:21:30Z | 2020-01-30T20:26:08Z | NONE | Adding another example. While working through the Model Evaluation Tool (MET) tutorial, I created a NetCDF file with the tool and wasn't able to open it with xarray.
Sounds to me like the same error caused by https://github.com/pydata/xarray/issues/2233. Below are the .nc file's contents:
```
// global attributes:
		:MET_version = "V8.1.2" ;
		:MET_tool = "pcp_combine" ;
		:RunCommand = "Sum: 4 files with accumulations of 030000." ;
		:Projection = "Lambert Conformal" ;
		:hemisphere = "N" ;
		:scale_lat_1 = "25.000000" ;
		:scale_lat_2 = "25.000000" ;
		:lat_pin = "12.190000" ;
		:lon_pin = "-133.459000" ;
		:x_pin = "0.000000" ;
		:y_pin = "0.000000" ;
		:lon_orient = "-95.000000" ;
		:d_km = "40.635000" ;
		:r_km = "6371.200000" ;
		:nx = "185" ;
		:ny = "129 grid_points" ;
}
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
443304555 | https://github.com/pydata/xarray/issues/2368#issuecomment-443304555 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQ0MzMwNDU1NQ== | nordam 319297 | 2018-11-30T18:59:26Z | 2018-11-30T18:59:26Z | NONE | Indeed. An example file (1.1 MB) can be found here: http://folk.ntnu.no/nordam/entrainment.nc And the error message I get on trying to open this file is:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
443227318 | https://github.com/pydata/xarray/issues/2368#issuecomment-443227318 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQ0MzIyNzMxOA== | rsignell-usgs 1872600 | 2018-11-30T14:53:13Z | 2018-11-30T14:53:13Z | NONE | @nordam , can you provide an example? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
443218629 | https://github.com/pydata/xarray/issues/2368#issuecomment-443218629 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQ0MzIxODYyOQ== | nordam 319297 | 2018-11-30T14:25:00Z | 2018-11-30T14:25:00Z | NONE | Just adding that netCDF files produced as output from the GOTM turbulence model cannot be opened by xarray. I believe the reason is self-referential multidimensional coordinates. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);