issue_comments
8 rows where author_association = "CONTRIBUTOR" and issue = 350899839 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | performed_via_github_app | issue |
---|---|---|---|---|---|---|---|---|---|---|---|
419225007 | https://github.com/pydata/xarray/issues/2368#issuecomment-419225007 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQxOTIyNTAwNw== | dopplershift 221526 | 2018-09-06T20:10:24Z | 2018-09-06T20:10:24Z | CONTRIBUTOR | That sounds reasonable to me. I don't necessarily expect all of the xarray goodness to work with those files, but I do expect them to open without error. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
419176628 | https://github.com/pydata/xarray/issues/2368#issuecomment-419176628 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQxOTE3NjYyOA== | dopplershift 221526 | 2018-09-06T17:28:14Z | 2018-09-06T17:28:14Z | CONTRIBUTOR | @rabernat While I agree that they're (somewhat) confusing files, I think you're missing two things:
IMO, xarray is being overly pedantic here. XArray states that it adopts the Common Data Model (CDM); netCDF-java and the CDM were the tools used to generate the failing examples above. |
{ "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
419166240 | https://github.com/pydata/xarray/issues/2368#issuecomment-419166240 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQxOTE2NjI0MA== | djhoese 1828519 | 2018-09-06T16:54:43Z | 2018-09-06T16:55:11Z | CONTRIBUTOR | @rabernat For the NetCDF files with groups I had in mind the NASA L1B data files for the VIIRS instrument onboard the Suomi-NPP and NOAA-20 satellites. You can see an example file here. The summary of the ncdump is:
```
netcdf VNP02IMG.A2018008.0000.001.2018061001540 {
dimensions:
    number_of_scans = 202 ;
    number_of_lines = 6464 ;
    number_of_pixels = 6400 ;
    number_of_LUT_values = 65536 ;

... lots of global attributes ...

group: scan_line_attributes {
  variables:
    double scan_start_time(number_of_scans) ;
        scan_start_time:long_name = "Scan start time (TAI93)" ;
        scan_start_time:units = "seconds" ;
        scan_start_time:_FillValue = -999.9 ;
        scan_start_time:valid_min = 0. ;
        scan_start_time:valid_max = 2000000000. ;
    ... lots of other variables in this group ...

group: observation_data {
  variables:
    ushort I04(number_of_lines, number_of_pixels) ;
        I04:long_name = "I-band 04 earth view radiance" ;
        I04:units = "Watts/meter^2/steradian/micrometer" ;
        I04:_FillValue = 65535US ;
        I04:valid_min = 0US ;
        I04:valid_max = 65527US ;
        I04:scale_factor = 6.104354e-05f ;
        I04:add_offset = 0.0016703f ;
        I04:flag_values = 65532US, 65533US, 65534US ;
        I04:flag_meanings = "Missing_EV Bowtie_Deleted Cal_Fail" ;
```
When I first started out with xarray I assumed I would be able to do something like:
Which I can't do, but can do with the python netCDF4 library:
```
In [7]: from netCDF4 import Dataset

In [8]: nc = Dataset('VNP02IMG.A2018008.0000.001.2018061001540.nc')

In [9]: nc['observation_data/I04']
Out[9]:
<class 'netCDF4._netCDF4.Variable'>
```
I understand that I can provide the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
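As a rough sketch of the group access described in the comment above, assuming a local copy of the quoted file: the netCDF4 path syntax is taken from the comment itself, and the xarray call uses `open_dataset`'s `group` keyword, which opens one group at a time rather than the whole file.
```python
from netCDF4 import Dataset
import xarray as xr

# Local copy of the file quoted above (assumed to be on disk).
path = "VNP02IMG.A2018008.0000.001.2018061001540.nc"

# netCDF4 addresses a variable inside a group with a slash-separated path.
nc = Dataset(path)
i04 = nc["observation_data/I04"]

# xarray opens a single group via the `group` keyword; variables in other
# groups (e.g. scan_line_attributes) are not visible in the result.
obs = xr.open_dataset(path, group="observation_data")
print(obs["I04"])
```
This gives per-group access rather than the whole-file view the commenter was hoping for.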
415402600 | https://github.com/pydata/xarray/issues/2368#issuecomment-415402600 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQxNTQwMjYwMA== | markelg 6883049 | 2018-08-23T12:50:26Z | 2018-08-23T12:50:26Z | CONTRIBUTOR | I ran into this problem too, long ago (see #457). Back then the workaround we implemented was to exclude the offending variable ("siglay" or "isobaric" in the examples above) with the "drop_variables" optional argument. Of course this is not great if you want to actually use the values in the variable you are dropping. I personally don't like the notion of a "two dimensional coordinate"; I find it confusing. However, this kind of netCDF file is common, so fully supporting them in xarray would be nice. But I don't know how. Maybe just renaming the variable instead of dropping it, with a "rename_variables" option? This is the only thing that comes to my mind. |
{ "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
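A rough sketch of the drop_variables workaround described above, together with one way to keep the dropped values under a non-clashing name. xarray's `open_dataset` has no `rename_variables` option, so the re-attach step below goes through netCDF4; the file name and the new variable name are made up for illustration.
```python
from netCDF4 import Dataset
import xarray as xr

# Hypothetical file whose 2-D "isobaric" variable shares its dimension name,
# as in the CDL shown further down this page.
path = "profile.nc"

# Excluding the offending variable lets the rest of the file open.
ds = xr.open_dataset(path, drop_variables=["isobaric"])

# If the dropped values are still needed, read them with netCDF4 and attach
# them under a name that no longer clashes with the dimension.
nc = Dataset(path)
ds["isobaric_levels"] = (nc["isobaric"].dimensions, nc["isobaric"][:])
```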
413849756 | https://github.com/pydata/xarray/issues/2368#issuecomment-413849756 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQxMzg0OTc1Ng== | djhoese 1828519 | 2018-08-17T12:26:42Z | 2018-08-17T12:26:42Z | CONTRIBUTOR | This is mentioned elsewhere (can't find the issue right now) and may be out of scope for this issue but I'm going to say it anyway: opening a NetCDF file with groups was not as easy as I wanted it to be when first starting out with xarray. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
413281638 | https://github.com/pydata/xarray/issues/2368#issuecomment-413281638 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQxMzI4MTYzOA== | dopplershift 221526 | 2018-08-15T17:58:12Z | 2018-08-15T17:58:12Z | CONTRIBUTOR | Here's a sample CDL for a file:
```
netcdf temp {
dimensions:
    profile = 1 ;
    station = 1 ;
    isobaric = 31 ;
    station_name_strlen = 10 ;
    station_description_strlen = 33 ;
variables:
    float isobaric(station, profile, isobaric) ;
        isobaric:standard_name = "isobaric" ;
        isobaric:long_name = "isobaric" ;
        isobaric:units = "Pa" ;
        isobaric:positive = "down" ;
        isobaric:axis = "Z" ;
    float Geopotential_height_isobaric(station, profile, isobaric) ;
        Geopotential_height_isobaric:standard_name = "Geopotential_height_isobaric" ;
        Geopotential_height_isobaric:long_name = "Geopotential_height_isobaric" ;
        Geopotential_height_isobaric:units = "gpm" ;
        Geopotential_height_isobaric:coordinates = "time longitude latitude isobaric" ;
    char station_name(station, station_name_strlen) ;
        station_name:long_name = "station name" ;
        station_name:cf_role = "timeseries_id" ;
    char station_description(station, station_description_strlen) ;
        station_description:long_name = "station description" ;
        station_description:standard_name = "platform_name" ;
    double latitude(station) ;
        latitude:units = "degrees_north" ;
        latitude:long_name = "profile latitude" ;
    double longitude(station) ;
        longitude:units = "degrees_east" ;
        longitude:long_name = "profile longitude" ;
    double time(station, profile) ;
        time:units = "Hour since 2018-08-15T12:00:00Z" ;
        time:calendar = "proleptic_gregorian" ;
        time:standard_name = "time" ;
        time:long_name = "GRIB forecast or observation time" ;

// global attributes:
:Conventions = "CDM-Extended-CF" ;
:history = "Written by CFPointWriter" ;
:title = "Extract Points data from Grid file /data/ldm/pub/native/grid/NCEP/GFS/Global_0p5deg/GFS_Global_0p5deg_20180815_1200.grib2.ncx3#LatLon_361X720-p25S-180p0E" ;
:featureType = "timeSeriesProfile" ;
:time_coverage_start = "2018-08-15T18:00:00Z" ;
:time_coverage_end = "2018-08-15T18:00:00Z" ;
:geospatial_lat_min = 39.9995 ;
:geospatial_lat_max = 40.0005 ;
:geospatial_lon_min = -105.0005 ;
:geospatial_lon_max = -104.9995 ;
}

MissingDimensionsError                    Traceback (most recent call last)
<ipython-input-10-d6f8d8651b9f> in <module>()
      4 query.add_lonlat().accept('netcdf4')
      5 nc = ncss.get_data(query)
----> 6 xr.open_dataset(NetCDF4DataStore(nc))

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs)
    352             store = backends.ScipyDataStore(filename_or_obj)
    353
--> 354     return maybe_decode_store(store)
    355
    356

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock)
    256             store, mask_and_scale=mask_and_scale, decode_times=decode_times,
    257             concat_characters=concat_characters, decode_coords=decode_coords,
--> 258             drop_variables=drop_variables)
    259
    260         _protect_dataset_variables_inplace(ds, cache)

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables)
    428         vars, attrs, concat_characters, mask_and_scale, decode_times,
    429         decode_coords, drop_variables=drop_variables)
--> 430     ds = Dataset(vars, attrs=attrs)
    431     ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars))
    432     ds._file_obj = file_obj

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/dataset.py in __init__(self, data_vars, coords, attrs, compat)
    363             coords = {}
    364         if data_vars is not None or coords is not None:
--> 365             self._set_init_vars_and_dims(data_vars, coords, compat)
    366         if attrs is not None:
    367             self.attrs = attrs

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/dataset.py in _set_init_vars_and_dims(self, data_vars, coords, compat)
    381
    382         variables, coord_names, dims = merge_data_and_coords(
--> 383             data_vars, coords, compat=compat)
    384
    385         self._variables = variables

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in merge_data_and_coords(data, coords, compat, join)
    363     indexes = dict(extract_indexes(coords))
    364     return merge_core(objs, compat, join, explicit_coords=explicit_coords,
--> 365                       indexes=indexes)
    366
    367

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in merge_core(objs, compat, join, priority_arg, explicit_coords, indexes)
    433     coerced = coerce_pandas_values(objs)
    434     aligned = deep_align(coerced, join=join, copy=False, indexes=indexes)
--> 435     expanded = expand_variable_dicts(aligned)
    436
    437     coord_names, noncoord_names = determine_coords(coerced)

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in expand_variable_dicts(list_of_variable_dicts)
    209                 var_dicts.append(coords)
    210
--> 211             var = as_variable(var, name=name)
    212             sanitized_vars[name] = var
    213

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/variable.py in as_variable(obj, name)
    112                 'dimensions %r. xarray disallows such variables because they '
    113                 'conflict with the coordinates used to label '
--> 114                 'dimensions.' % (name, obj.dims))
    115         obj = obj.to_index_variable()
    116

MissingDimensionsError: 'isobaric' has more than 1-dimension and the same name as one of its dimensions ('station', 'profile', 'isobaric'). xarray disallows such variables because they conflict with the coordinates used to label dimensions.
``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
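The failure can be reproduced without the file itself. The snippet below is a minimal in-memory sketch, not taken from the thread: it builds a variable that is more than one-dimensional and shares a name with one of its own dimensions, mirroring `float isobaric(station, profile, isobaric)` above, which the xarray release shown in the traceback rejects when the Dataset is constructed.
```python
import numpy as np
import xarray as xr

# A variable named "isobaric" whose dimensions also include "isobaric",
# mirroring the CDL above; sizes come from its dimensions section.
data = {"isobaric": (("station", "profile", "isobaric"), np.zeros((1, 1, 31)))}

try:
    xr.Dataset(data)
except Exception as err:
    # MissingDimensionsError in the xarray version from the traceback above.
    print(f"{type(err).__name__}: {err}")
```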
413279893 | https://github.com/pydata/xarray/issues/2368#issuecomment-413279893 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQxMzI3OTg5Mw== | dopplershift 221526 | 2018-08-15T17:52:36Z | 2018-08-15T17:52:36Z | CONTRIBUTOR |
```pytb
MissingDimensionsError                    Traceback (most recent call last)
<ipython-input-6-e2a87d803d99> in <module>()
----> 1 xr.open_dataset(gfs_cat.datasets[0].access_urls['OPENDAP'])

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs)
    344             lock = _default_lock(filename_or_obj, engine)
    345         with close_on_error(store):
--> 346             return maybe_decode_store(store, lock)
    347     else:
    348         if engine is not None and engine != 'scipy':

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock)
    256             store, mask_and_scale=mask_and_scale, decode_times=decode_times,
    257             concat_characters=concat_characters, decode_coords=decode_coords,
--> 258             drop_variables=drop_variables)
    259
    260         _protect_dataset_variables_inplace(ds, cache)

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables)
    428         vars, attrs, concat_characters, mask_and_scale, decode_times,
    429         decode_coords, drop_variables=drop_variables)
--> 430     ds = Dataset(vars, attrs=attrs)
    431     ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars))
    432     ds._file_obj = file_obj

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/dataset.py in __init__(self, data_vars, coords, attrs, compat)
    363             coords = {}
    364         if data_vars is not None or coords is not None:
--> 365             self._set_init_vars_and_dims(data_vars, coords, compat)
    366         if attrs is not None:
    367             self.attrs = attrs

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/dataset.py in _set_init_vars_and_dims(self, data_vars, coords, compat)
    381
    382         variables, coord_names, dims = merge_data_and_coords(
--> 383             data_vars, coords, compat=compat)
    384
    385         self._variables = variables

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in merge_data_and_coords(data, coords, compat, join)
    363     indexes = dict(extract_indexes(coords))
    364     return merge_core(objs, compat, join, explicit_coords=explicit_coords,
--> 365                       indexes=indexes)
    366
    367

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in merge_core(objs, compat, join, priority_arg, explicit_coords, indexes)
    433     coerced = coerce_pandas_values(objs)
    434     aligned = deep_align(coerced, join=join, copy=False, indexes=indexes)
--> 435     expanded = expand_variable_dicts(aligned)
    436
    437     coord_names, noncoord_names = determine_coords(coerced)

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in expand_variable_dicts(list_of_variable_dicts)
    209                 var_dicts.append(coords)
    210
--> 211             var = as_variable(var, name=name)
    212             sanitized_vars[name] = var
    213

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/variable.py in as_variable(obj, name)
    112                 'dimensions %r. xarray disallows such variables because they '
    113                 'conflict with the coordinates used to label '
--> 114                 'dimensions.' % (name, obj.dims))
    115         obj = obj.to_index_variable()
    116

MissingDimensionsError: 'time' has more than 1-dimension and the same name as one of its dimensions ('reftime', 'time'). xarray disallows such variables because they conflict with the coordinates used to label dimensions.
``` |
{ "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 | |
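For this OPeNDAP case, the drop_variables workaround from the 2018-08-23 comment above applies in the same way. The URL below is a placeholder standing in for gfs_cat.datasets[0].access_urls['OPENDAP'] from the traceback, and dropping "time" means losing the forecast time values it carried.
```python
import xarray as xr

# Placeholder for the OPeNDAP endpoint used in the traceback above.
url = "https://example.com/thredds/dodsC/GFS_Global_0p5deg/Best"

# Skipping the 2-D "time" variable sidesteps the MissingDimensionsError,
# at the cost of losing the time coordinate values themselves.
ds = xr.open_dataset(url, drop_variables=["time"])
```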
413277673 | https://github.com/pydata/xarray/issues/2368#issuecomment-413277673 | https://api.github.com/repos/pydata/xarray/issues/2368 | MDEyOklzc3VlQ29tbWVudDQxMzI3NzY3Mw== | ocefpaf 950575 | 2018-08-15T17:45:40Z | 2018-08-15T17:45:40Z | CONTRIBUTOR | I believe the last one in the notebook below is already fixed, and the first two are mentioned above, but here is a data point: http://nbviewer.jupyter.org/gist/ocefpaf/1bf3b86359c459c89d44a81d3129f967 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Let's list all the netCDF files that xarray can't open 350899839 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
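The row selection at the top of this page (CONTRIBUTOR comments on issue 350899839, sorted by updated_at descending) can be re-run against a local copy of the database behind this page; the database file name below is an assumption.
```python
import sqlite3

# Hypothetical local copy of the SQLite database this page is served from.
conn = sqlite3.connect("github.db")

rows = conn.execute(
    """
    SELECT id, [user], created_at, updated_at, body
    FROM issue_comments
    WHERE author_association = 'CONTRIBUTOR' AND issue = 350899839
    ORDER BY updated_at DESC
    """
).fetchall()

print(len(rows))  # 8 for the snapshot shown above
```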