
issue_comments


4 rows where author_association = "CONTRIBUTOR", issue = 350899839 and user = 221526 sorted by updated_at descending

id: 419225007
html_url: https://github.com/pydata/xarray/issues/2368#issuecomment-419225007
issue_url: https://api.github.com/repos/pydata/xarray/issues/2368
node_id: MDEyOklzc3VlQ29tbWVudDQxOTIyNTAwNw==
user: dopplershift (221526)
created_at: 2018-09-06T20:10:24Z
updated_at: 2018-09-06T20:10:24Z
author_association: CONTRIBUTOR

That sounds reasonable to me. I don't necessarily expect all of the xarray goodness to work with those files, but I do expect them to open without error.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Let's list all the netCDF files that xarray can't open (350899839)
id: 419176628
html_url: https://github.com/pydata/xarray/issues/2368#issuecomment-419176628
issue_url: https://api.github.com/repos/pydata/xarray/issues/2368
node_id: MDEyOklzc3VlQ29tbWVudDQxOTE3NjYyOA==
user: dopplershift (221526)
created_at: 2018-09-06T17:28:14Z
updated_at: 2018-09-06T17:28:14Z
author_association: CONTRIBUTOR

@rabernat While I agree that they're (somewhat) confusing files, I think you're missing two things:

  1. netCDF doesn't enforce naming on dimensions and variables. Full stop. The only naming netCDF will care about is any conflict with an internal reserved name (I'm not sure that those even exist for anything besides attributes.) IMO that's a good thing, but more importantly it's not the netCDF library's job to enforce any of it.

  2. CF is an attribute convention. This also means that the conventions say absolutely nothing about naming of variables and dimensions.

IMO, xarray is being overly pedantic here. XArray states that it adopts the Common Data Model (CDM); netCDF-java and the CDM were the tools used to generate the failing examples above.

{
    "total_count": 2,
    "+1": 2,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Let's list all the netCDF files that xarray can't open (350899839)
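The naming pattern described in the comment above can be written with the plain netCDF4 library, which imposes no rule against a multidimensional variable sharing a name with one of its dimensions. Below is a minimal sketch that is not part of the original thread: the file name and values are made up, and the final line reflects the behavior of xarray releases from the time of this discussion (2018), which raised the MissingDimensionsError shown in the comments further down.

```python
# Hypothetical sketch: a variable named "isobaric" with dimensions
# ("station", "isobaric"). Legal netCDF, legal under CF's attribute-only
# conventions, but rejected by xarray's decoder at the time of this thread.
import netCDF4
import numpy as np
import xarray as xr

with netCDF4.Dataset("multidim_coord.nc", "w") as nc:
    nc.createDimension("station", 1)
    nc.createDimension("isobaric", 31)
    var = nc.createVariable("isobaric", "f4", ("station", "isobaric"))
    var[0, :] = np.linspace(100000.0, 10000.0, 31)  # made-up pressure levels in Pa

# With 2018-era xarray this raises MissingDimensionsError, because
# "isobaric" shares its name with a dimension but is not one-dimensional.
xr.open_dataset("multidim_coord.nc")
```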
id: 413281638
html_url: https://github.com/pydata/xarray/issues/2368#issuecomment-413281638
issue_url: https://api.github.com/repos/pydata/xarray/issues/2368
node_id: MDEyOklzc3VlQ29tbWVudDQxMzI4MTYzOA==
user: dopplershift (221526)
created_at: 2018-08-15T17:58:12Z
updated_at: 2018-08-15T17:58:12Z
author_association: CONTRIBUTOR

Here's a sample CDL for a file:

```
netcdf temp {
dimensions:
    profile = 1 ;
    station = 1 ;
    isobaric = 31 ;
    station_name_strlen = 10 ;
    station_description_strlen = 33 ;
variables:
    float isobaric(station, profile, isobaric) ;
        isobaric:standard_name = "isobaric" ;
        isobaric:long_name = "isobaric" ;
        isobaric:units = "Pa" ;
        isobaric:positive = "down" ;
        isobaric:axis = "Z" ;
    float Geopotential_height_isobaric(station, profile, isobaric) ;
        Geopotential_height_isobaric:standard_name = "Geopotential_height_isobaric" ;
        Geopotential_height_isobaric:long_name = "Geopotential_height_isobaric" ;
        Geopotential_height_isobaric:units = "gpm" ;
        Geopotential_height_isobaric:coordinates = "time longitude latitude isobaric" ;
    char station_name(station, station_name_strlen) ;
        station_name:long_name = "station name" ;
        station_name:cf_role = "timeseries_id" ;
    char station_description(station, station_description_strlen) ;
        station_description:long_name = "station description" ;
        station_description:standard_name = "platform_name" ;
    double latitude(station) ;
        latitude:units = "degrees_north" ;
        latitude:long_name = "profile latitude" ;
    double longitude(station) ;
        longitude:units = "degrees_east" ;
        longitude:long_name = "profile longitude" ;
    double time(station, profile) ;
        time:units = "Hour since 2018-08-15T12:00:00Z" ;
        time:calendar = "proleptic_gregorian" ;
        time:standard_name = "time" ;
        time:long_name = "GRIB forecast or observation time" ;

// global attributes:
        :Conventions = "CDM-Extended-CF" ;
        :history = "Written by CFPointWriter" ;
        :title = "Extract Points data from Grid file /data/ldm/pub/native/grid/NCEP/GFS/Global_0p5deg/GFS_Global_0p5deg_20180815_1200.grib2.ncx3#LatLon_361X720-p25S-180p0E" ;
        :featureType = "timeSeriesProfile" ;
        :time_coverage_start = "2018-08-15T18:00:00Z" ;
        :time_coverage_end = "2018-08-15T18:00:00Z" ;
        :geospatial_lat_min = 39.9995 ;
        :geospatial_lat_max = 40.0005 ;
        :geospatial_lon_min = -105.0005 ;
        :geospatial_lon_max = -104.9995 ;
}
```

which gives:

```pytb
MissingDimensionsError                    Traceback (most recent call last)
<ipython-input-10-d6f8d8651b9f> in <module>()
      4 query.add_lonlat().accept('netcdf4')
      5 nc = ncss.get_data(query)
----> 6 xr.open_dataset(NetCDF4DataStore(nc))

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs)
    352             store = backends.ScipyDataStore(filename_or_obj)
    353 
--> 354     return maybe_decode_store(store)
    355 
    356 

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock)
    256             store, mask_and_scale=mask_and_scale, decode_times=decode_times,
    257             concat_characters=concat_characters, decode_coords=decode_coords,
--> 258             drop_variables=drop_variables)
    259 
    260         _protect_dataset_variables_inplace(ds, cache)

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables)
    428         vars, attrs, concat_characters, mask_and_scale, decode_times,
    429         decode_coords, drop_variables=drop_variables)
--> 430     ds = Dataset(vars, attrs=attrs)
    431     ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars))
    432     ds._file_obj = file_obj

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/dataset.py in __init__(self, data_vars, coords, attrs, compat)
    363             coords = {}
    364         if data_vars is not None or coords is not None:
--> 365             self._set_init_vars_and_dims(data_vars, coords, compat)
    366         if attrs is not None:
    367             self.attrs = attrs

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/dataset.py in _set_init_vars_and_dims(self, data_vars, coords, compat)
    381 
    382         variables, coord_names, dims = merge_data_and_coords(
--> 383             data_vars, coords, compat=compat)
    384 
    385         self._variables = variables

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in merge_data_and_coords(data, coords, compat, join)
    363     indexes = dict(extract_indexes(coords))
    364     return merge_core(objs, compat, join, explicit_coords=explicit_coords,
--> 365                       indexes=indexes)
    366 
    367 

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in merge_core(objs, compat, join, priority_arg, explicit_coords, indexes)
    433     coerced = coerce_pandas_values(objs)
    434     aligned = deep_align(coerced, join=join, copy=False, indexes=indexes)
--> 435     expanded = expand_variable_dicts(aligned)
    436 
    437     coord_names, noncoord_names = determine_coords(coerced)

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in expand_variable_dicts(list_of_variable_dicts)
    209                 var_dicts.append(coords)
    210 
--> 211             var = as_variable(var, name=name)
    212             sanitized_vars[name] = var
    213 

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/variable.py in as_variable(obj, name)
    112                 'dimensions %r. xarray disallows such variables because they '
    113                 'conflict with the coordinates used to label '
--> 114                 'dimensions.' % (name, obj.dims))
    115         obj = obj.to_index_variable()
    116 

MissingDimensionsError: 'isobaric' has more than 1-dimension and the same name as one of its dimensions ('station', 'profile', 'isobaric'). xarray disallows such variables because they conflict with the coordinates used to label dimensions.
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Let's list all the netCDF files that xarray can't open (350899839)
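One workaround visible in the open_dataset signature captured in the traceback above is the drop_variables argument: skipping the offending "isobaric" variable lets the rest of the file load. A hedged sketch, assuming the CDL above was written to a local file named temp.nc (that file name is an assumption):

```python
import xarray as xr

# Drop the multidimensional "isobaric" variable before xarray builds the
# Dataset; the remaining variables (Geopotential_height_isobaric, the station
# metadata, time, latitude, longitude) then load, at the cost of losing the
# vertical coordinate values themselves.
ds = xr.open_dataset("temp.nc", drop_variables=["isobaric"])
```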
id: 413279893
html_url: https://github.com/pydata/xarray/issues/2368#issuecomment-413279893
issue_url: https://api.github.com/repos/pydata/xarray/issues/2368
node_id: MDEyOklzc3VlQ29tbWVudDQxMzI3OTg5Mw==
user: dopplershift (221526)
created_at: 2018-08-15T17:52:36Z
updated_at: 2018-08-15T17:52:36Z
author_association: CONTRIBUTOR

```python
import xarray as xr
xr.open_dataset('http://thredds.ucar.edu/thredds/dodsC/grib/NCEP/GFS/Global_0p5deg/TwoD')
```

```pytb
MissingDimensionsError                    Traceback (most recent call last)
<ipython-input-6-e2a87d803d99> in <module>()
----> 1 xr.open_dataset(gfs_cat.datasets[0].access_urls['OPENDAP'])

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, group, decode_cf, mask_and_scale, decode_times, autoclose, concat_characters, decode_coords, engine, chunks, lock, cache, drop_variables, backend_kwargs)
    344             lock = _default_lock(filename_or_obj, engine)
    345         with close_on_error(store):
--> 346             return maybe_decode_store(store, lock)
    347     else:
    348         if engine is not None and engine != 'scipy':

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/backends/api.py in maybe_decode_store(store, lock)
    256             store, mask_and_scale=mask_and_scale, decode_times=decode_times,
    257             concat_characters=concat_characters, decode_coords=decode_coords,
--> 258             drop_variables=drop_variables)
    259 
    260         _protect_dataset_variables_inplace(ds, cache)

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/conventions.py in decode_cf(obj, concat_characters, mask_and_scale, decode_times, decode_coords, drop_variables)
    428         vars, attrs, concat_characters, mask_and_scale, decode_times,
    429         decode_coords, drop_variables=drop_variables)
--> 430     ds = Dataset(vars, attrs=attrs)
    431     ds = ds.set_coords(coord_names.union(extra_coords).intersection(vars))
    432     ds._file_obj = file_obj

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/dataset.py in __init__(self, data_vars, coords, attrs, compat)
    363             coords = {}
    364         if data_vars is not None or coords is not None:
--> 365             self._set_init_vars_and_dims(data_vars, coords, compat)
    366         if attrs is not None:
    367             self.attrs = attrs

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/dataset.py in _set_init_vars_and_dims(self, data_vars, coords, compat)
    381 
    382         variables, coord_names, dims = merge_data_and_coords(
--> 383             data_vars, coords, compat=compat)
    384 
    385         self._variables = variables

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in merge_data_and_coords(data, coords, compat, join)
    363     indexes = dict(extract_indexes(coords))
    364     return merge_core(objs, compat, join, explicit_coords=explicit_coords,
--> 365                       indexes=indexes)
    366 
    367 

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in merge_core(objs, compat, join, priority_arg, explicit_coords, indexes)
    433     coerced = coerce_pandas_values(objs)
    434     aligned = deep_align(coerced, join=join, copy=False, indexes=indexes)
--> 435     expanded = expand_variable_dicts(aligned)
    436 
    437     coord_names, noncoord_names = determine_coords(coerced)

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/merge.py in expand_variable_dicts(list_of_variable_dicts)
    209                 var_dicts.append(coords)
    210 
--> 211             var = as_variable(var, name=name)
    212             sanitized_vars[name] = var
    213 

~/miniconda3/envs/py36/lib/python3.6/site-packages/xarray/core/variable.py in as_variable(obj, name)
    112                 'dimensions %r. xarray disallows such variables because they '
    113                 'conflict with the coordinates used to label '
--> 114                 'dimensions.' % (name, obj.dims))
    115         obj = obj.to_index_variable()
    116 

MissingDimensionsError: 'time' has more than 1-dimension and the same name as one of its dimensions ('reftime', 'time'). xarray disallows such variables because they conflict with the coordinates used to label dimensions.
```

{
    "total_count": 5,
    "+1": 5,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Let's list all the netCDF files that xarray can't open (350899839)
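As a cross-check of the point made in these comments, the same OPeNDAP endpoint could be opened through the plain netCDF4 library at the time of the thread (assuming a netCDF4 build with DAP support and that the TwoD dataset is still published at that URL); only xarray's additional naming rule trips. A sketch:

```python
import netCDF4

# netCDF itself imposes no rule that a variable sharing a dimension's name
# must be one-dimensional, so this open succeeds where xarray refuses.
nc = netCDF4.Dataset(
    "http://thredds.ucar.edu/thredds/dodsC/grib/NCEP/GFS/Global_0p5deg/TwoD"
)
print(nc.variables["time"].dimensions)  # ('reftime', 'time'), per the error above
```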

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
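For reference, the filter described at the top of the page (author_association = "CONTRIBUTOR", issue = 350899839, user = 221526, sorted by updated_at descending) maps directly onto this schema. A sketch using Python's sqlite3, assuming a local copy of the underlying database; the file name github.db is hypothetical:

```python
import sqlite3

# Hypothetical local copy of the SQLite database behind this Datasette instance.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    SELECT id, created_at, author_association, body
    FROM issue_comments
    WHERE author_association = ? AND issue = ? AND user = ?
    ORDER BY updated_at DESC
    """,
    ("CONTRIBUTOR", 350899839, 221526),
).fetchall()
print(len(rows))  # 4 for the page shown here
```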
Powered by Datasette · About: xarray-datasette