
issues


6 rows where state = "closed" and user = 48764870 sorted by updated_at descending

Columns: id · node_id · number · title · user · state · locked · assignee · milestone · comments · created_at · updated_at (sorted descending) · closed_at · author_association · active_lock_reason · draft · pull_request · body · reactions · performed_via_github_app · state_reason · repo · type
id: 614144170 · node_id: MDU6SXNzdWU2MTQxNDQxNzA= · number: 4043 · title: Opendap access failure error · user: aragong (48764870) · state: closed · locked: 0 · comments: 17 · created_at: 2020-05-07T15:24:13Z · updated_at: 2023-12-09T05:20:14Z · closed_at: 2023-12-09T05:20:14Z · author_association: NONE

Hi all,

We are having some trouble with OPeNDAP access to our own THREDDS server. I've found that when I access small subsets of small variable arrays, OPeNDAP works perfectly, but I now get the error below when trying to access some ERA5 netCDF files (~8 GB) with xarray. Check my example code below:

```python
import xarray as xr
import os
from datetime import datetime, timedelta
import pandas as pd
import shutil
import numpy as np
import time

tic = time.perf_counter()

# ------------------------------------------------------------------------------------------------------------------
# Inputs Example:
lonlat_box = [-4.5, -2.5, 44, 45]
date_ini = datetime(1998, 5, 28, 12)
date_end = datetime(1998, 6, 1, 12)
output_path = r'test_inputs\ERA5\data'
source_path = r'http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global'
# source_path = r'D:\2020_REPSOL\Codigos_input_TESEO\raw'
dl = 0.5
# ------------------------------------------------------------------------------------------------------------------

# Create results folders and paths
if not os.path.exists(output_path):
    os.makedirs(output_path)

# Change to (-180, 180) if there is 0 to 360
if lonlat_box[0] > 180:
    lonlat_box[0] = lonlat_box[0] - 360
if lonlat_box[1] > 180:
    lonlat_box[1] = lonlat_box[1] - 360

# Check coordinates
if lonlat_box[0] < -19 or lonlat_box[1] > 5 or lonlat_box[2] < 26 or lonlat_box[3] > 56:
    print("Invalid coordinates! coordinates must be Lon:(-19º,5º) and Lat:(26º,56º)")
    exit()

# Check time range
if date_ini < datetime(1992, 1, 1, 0) or date_end > datetime(2017, 12, 31, 23):
    print("Invalid time range! This database provides data from 01/1992 to 12/2017")
    exit()

# Create a tuple to store Lon Lat
Lon = (lonlat_box[0], lonlat_box[1])
Lat = (lonlat_box[2], lonlat_box[3])
del lonlat_box

# Create date list of files to be loaded
dates = pd.date_range(start=date_ini, end=date_end, closed=None, freq='D')
file_list = []
for date in dates:
    p = [source_path + '/Wind_ERA5_Global_' + date.strftime("%Y") + '.' + date.strftime("%m") + '.nc']
    file_list = file_list + p

# Delete repeated elements
file_list = list(dict.fromkeys(file_list))
print('Loading files: \n{}\n'.format("\n".join(file_list)))

# Load data
ds = xr.open_mfdataset(file_list)

# Select variables
ds = ds.get(['u', 'v'])

# From 0º,360º to -180º,180º
ds['lon'] = (ds.lon + 180) % 360 - 180
ds = ds.sortby(['lon', 'lat'])

# Select spatial subset [lon, lat]
ds = ds.where((ds.lon >= Lon[0] - dl) & (ds.lon <= Lon[1] + dl) &
              (ds.lat >= Lat[0] - dl) & (ds.lat <= Lat[1] + dl), drop=True)

# Select temporal subset
ds = ds.where((ds.time >= np.datetime64(date_ini)) & (ds.time <= np.datetime64(date_end)), drop=True)

# Create depth-layers file for 2D simulation
winds_list = []

# From xarray to dataframe
df = ds.to_dataframe()
```

Problem Description

If I run the process with local data, the code runs perfectly and there is no problem at all. I had previously downloaded two files to my local PC to perform this test.

But when I use OPeNDAP to generalize the process for any date, using the URL `source_path = r'http://193.144.213.180:8080/thredds/dodsC/Wind/Wind_ERA5/Global'`, I get this error:

```python
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
d:\2020_REPSOL\Codigos_input_TESEO\draft_code.py in <module>
     82
     83 # From xarray to dataframe
---> 84 df = ds.to_dataframe()
     85
     86 df = df.reset_index()

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\core\dataset.py in to_dataframe(self)
-> 3337         return self._to_dataframe(self.dims)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\core\dataset.py in _to_dataframe(self, ordered_dims)
   3325         data = [self._variables[k].set_dims(ordered_dims).values.reshape(-1)
-> 3326                 for k in columns]

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\core\dataset.py in <listcomp>(.0)
-> 3326                 for k in columns]

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\core\variable.py in values(self)
--> 392         return _as_array_or_item(self._data)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\core\variable.py in _as_array_or_item(data)
--> 213     data = np.asarray(data)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\numpy\core\numeric.py in asarray(a, dtype, order)
--> 538     return array(a, dtype, copy=False, order=order)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\array\core.py in __array__(self, dtype, **kwargs)
--> 998         x = self.compute()

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\base.py in compute(self, **kwargs)
--> 156         (result,) = compute(self, traverse=False, **kwargs)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\base.py in compute(*args, **kwargs)
--> 398     results = schedule(dsk, keys, **kwargs)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\threaded.py in get(dsk, result, cache, num_workers, pool, **kwargs)
---> 76                     pack_exception=pack_exception, **kwargs)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs)
--> 462                 raise_exception(exc, tb)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\compatibility.py in reraise(exc, tb)
--> 112     raise exc

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\local.py in execute_task(key, task_info, dumps, loads, get_id, pack_exception)
--> 230         result = _execute_task(task, data)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk)
--> 118         args2 = [_execute_task(a, cache) for a in args]

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\core.py in <listcomp>(.0)
--> 118         args2 = [_execute_task(a, cache) for a in args]

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk)
--> 118         args2 = [_execute_task(a, cache) for a in args]

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\core.py in <listcomp>(.0)
--> 118         args2 = [_execute_task(a, cache) for a in args]

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk)
--> 118         args2 = [_execute_task(a, cache) for a in args]

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\core.py in <listcomp>(.0)
--> 118         args2 = [_execute_task(a, cache) for a in args]

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\core.py in _execute_task(arg, cache, dsk)
--> 119         return func(*args2)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\dask\array\core.py in getter(a, b, asarray, lock)
---> 82             c = np.asarray(c)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\numpy\core\numeric.py in asarray(a, dtype, order)
--> 538     return array(a, dtype, copy=False, order=order)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype)
--> 604         return np.asarray(self.array, dtype=dtype)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\numpy\core\numeric.py in asarray(a, dtype, order)
--> 538     return array(a, dtype, copy=False, order=order)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype)
--> 510         return np.asarray(array[self.key], dtype=None)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\numpy\core\numeric.py in asarray(a, dtype, order)
--> 538     return array(a, dtype, copy=False, order=order)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\coding\variables.py in __array__(self, dtype)
---> 68         return self.func(self.array)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\coding\variables.py in _scale_offset_decoding(data, scale_factor, add_offset, dtype)
--> 184     data = np.array(data, dtype=dtype, copy=True)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\coding\variables.py in __array__(self, dtype)
---> 68         return self.func(self.array)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\coding\variables.py in _apply_mask(data, encoded_fill_values, decoded_fill_value, dtype)
--> 135     data = np.asarray(data, dtype=dtype)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\numpy\core\numeric.py in asarray(a, dtype, order)
--> 538     return array(a, dtype, copy=False, order=order)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\core\indexing.py in __array__(self, dtype)
--> 510         return np.asarray(array[self.key], dtype=None)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\backends\netCDF4_.py in __getitem__(self, key)
     62         return indexing.explicit_indexing_adapter(
     63             key, self.shape, indexing.IndexingSupport.OUTER,
---> 64             self._getitem)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\core\indexing.py in explicit_indexing_adapter(key, shape, indexing_support, raw_indexing_method)
--> 778     result = raw_indexing_method(raw_key.tuple)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\backends\netCDF4_.py in _getitem(self, key)
---> 75                 array = getitem(original_array, key)

~\AppData\Local\Continuum\miniconda3\lib\site-packages\xarray\backends\common.py in robust_getitem(array, key, catch, max_retries, initial_delay)
     53     for n in range(max_retries + 1):
     54         try:
---> 55             return array[key]

netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Variable.__getitem__()

netCDF4\_netCDF4.pyx in netCDF4._netCDF4.Variable._get()

netCDF4\_netCDF4.pyx in netCDF4._netCDF4._ensure_nc_success()

RuntimeError: NetCDF: Access failure
```

We thought this could be related to the OPeNDAP service config in THREDDS, and we tried raising these parameters by x100 and even x1000:

```xml
<Opendap>
    <ascLimit>50</ascLimit>
    <binLimit>500</binLimit>
    <serverVersion>opendap/3.7</serverVersion>
</Opendap>
```

The result of these changes is that the error at the end now says: `RuntimeError: NetCDF: file not found`.

We do not know how to properly fix OPeNDAP access to this information; any help is highly appreciated.

Thank you in advance!!
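For what it's worth, a common mitigation for `NetCDF: Access failure` over OPeNDAP is to keep each remote request small. Below is a minimal sketch, untested against this server, that assumes the 0-360 to -180..180 coordinate fix from the script above has already been applied so `lon`/`lat` are sorted; `file_list`, `Lon`, `Lat`, `dl`, `date_ini`, and `date_end` are the names from that script, and the chunk size is a guess to tune:

```python
import numpy as np
import xarray as xr

# Open with explicit, modest dask chunks so every remote read stays well
# below the server's per-response limits.
ds = xr.open_mfdataset(file_list, chunks={"time": 24})

# Label-based slicing trims the arrays lazily; .where(..., drop=True)
# instead evaluates a mask over the full remote array first, which is
# what forces the huge DAP transfers.
subset = ds[["u", "v"]].sel(
    lon=slice(Lon[0] - dl, Lon[1] + dl),
    lat=slice(Lat[0] - dl, Lat[1] + dl),
    time=slice(np.datetime64(date_ini), np.datetime64(date_end)),
)
df = subset.to_dataframe()
```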

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4043/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: not_planned · repo: xarray (13221727) · type: issue
id: 1676792648 · node_id: I_kwDOAMm_X85j8dNI · number: 7773 · title: opendap access fails only in ubuntu machines · user: aragong (48764870) · state: closed · locked: 0 · comments: 5 · created_at: 2023-04-20T14:01:14Z · updated_at: 2023-06-12T08:02:30Z · closed_at: 2023-04-24T10:14:22Z · author_association: NONE

What happened?

I am having errors on OPeNDAP access (any of them):

```python
import xarray as xr

# Either URL reproduces the error; the second assignment overrides the first.
url = "https://erddap.emodnet.eu/erddap/griddap/bathymetry_2022"
url = "https://ihthredds.ihcantabria.com/thredds/dodsC/Bathymetry/Global/Gebco_2020.nc"

ds = xr.open_dataset(url)
```

The point is that this only occurs when I deploy and test the code on an ubuntu-latest machine using pyenv + pip (see the summary of the GitHub Actions run).

My dependencies are defined in the pyproject.toml file like this:

```toml
dependencies = [
    "geopandas",
    "xarray",
    "netCDF4",
    "h5netcdf",
    "scipy",
    "pydap",
    "zarr",
    "fsspec",
    "cftime",
    "pooch",
    "dask[complete]",
    "ipykernel",
    "matplotlib",
    "owslib",
    "shapely",
    "geojson",
    "pytest>=7",
    "lxml",
    "python-dotenv",
]
```

However, if I use conda on WSL Ubuntu 20.04 LTS with this environment.yml, it works fine:

```yaml
name: pyteseo-dev
channels:
  - conda-forge
  - defaults
dependencies:
  - python>=3.7
  - xarray
  - dask
  - netcdf4
  - bottleneck
  - ipykernel
  - matplotlib
  - geopandas
  - owslib
  - shapely
  - geojson
  - pytest>=7
  - coverage
  - flit
  - black
  - sphinx
  - myst-nb
  - sphinx-autoapi
  - sphinx_rtd_theme
  - pre_commit
  - flake8
  - pydap
  - lxml
  - scipy
  - python-dotenv
```

Public repository here: pyTESEO
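One way to narrow this down (a hedged suggestion, not a confirmed diagnosis): a frequently reported difference between the two setups is the libnetcdf build behind the `netcdf4` package, since conda-forge ships one with DAP support while a pip-installed stack may behave differently. Forcing the engine makes the difference visible:

```python
import xarray as xr

url = "https://ihthredds.ihcantabria.com/thredds/dodsC/Bathymetry/Global/Gebco_2020.nc"

# Try each backend explicitly; if only engine="netcdf4" fails, the problem
# is in the underlying libnetcdf build rather than in xarray itself.
for engine in ("netcdf4", "pydap"):
    try:
        ds = xr.open_dataset(url, engine=engine)
        print(engine, "OK")
    except Exception as exc:  # show every failure mode
        print(engine, "failed:", exc)
```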

What did you expect to happen?

I expected all tests to pass.

Minimal Complete Verifiable Example

```python
import xarray as xr

url = "https://erddap.emodnet.eu/erddap/griddap/bathymetry_2022"
url = "https://ihthredds.ihcantabria.com/thredds/dodsC/Bathymetry/Global/Gebco_2020.nc"

ds = xr.open_dataset(url)
```

MVCE confirmation

  • [X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
  • [ ] Complete example — the example is self-contained, including all data and the text of any traceback.
  • [ ] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
  • [ ] New issue — a search of GitHub Issues suggests this is not a duplicate.

Relevant log output

No response

Anything else we need to know?

No response

Environment

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/7773/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
id: 1086732825 · node_id: I_kwDOAMm_X85AxjoZ · number: 6100 · title: dataset.sel() argument to select outside closest neighbours is implemented? · user: aragong (48764870) · state: closed · locked: 0 · comments: 2 · created_at: 2021-12-22T11:32:36Z · updated_at: 2021-12-22T15:55:33Z · closed_at: 2021-12-22T15:46:12Z · author_association: NONE

Discussed in https://github.com/pydata/xarray/discussions/6099

<sup>Originally posted by **aragong** December 22, 2021</sup>

Hi all, I am always struggling with this kind of selection of a desired domain or time range. It is very common for me to select data to feed other interpolations, so this feature would be very useful in my day-to-day work. I wrote some simple code to illustrate my question; thank you in advance!

```python
import numpy as np
import pandas as pd
import xarray as xr
from datetime import datetime

# Create random_values dataset with time, latitude, longitude coords
longitudes = np.arange(-180, 180, 5)
latitudes = np.arange(-90, 90, 5)
times = pd.date_range(start=datetime(2021, 1, 1), end=datetime(2021, 12, 31), freq="D")
data = np.random.rand(len(times), len(latitudes), len(longitudes))
da = xr.DataArray(
    data=data,
    coords=[times, latitudes, longitudes],
    dims=["time", "latitude", "longitude"],
)
ds = da.to_dataset(name="random_values")

# Create slices based on tmin,tmax, lonmin,lonmax and latmin,latmax of the
# desired location and time range
t_min = datetime(2021, 2, 16, 12, 0, 0)
t_max = datetime(2021, 3, 6, 12, 0, 0)
lon_min = -3
lon_max = 28
lat_min = 12
lat_max = 48

desired_time = slice(t_min, t_max)
desired_lon = slice(lon_min, lon_max)
desired_lat = slice(lat_min, lat_max)

# Make a standard dataset selection
standard_sel_ds = ds.sel(time=desired_time, latitude=desired_lat, longitude=desired_lon)
print(
    f"time_min = {standard_sel_ds['time'].min().values}\n time_max = {standard_sel_ds['time'].max().values}"
)
print(
    f"lon_min = {standard_sel_ds['longitude'].min().values}\n lon_max = {standard_sel_ds['longitude'].max().values}"
)
print(
    f"lat_min = {standard_sel_ds['latitude'].min().values}\n lat_max = {standard_sel_ds['latitude'].max().values}"
)

# I would like an extra argument to select the closest neighbours *outside*
# the desired coordinates, resulting in:
print(
    "time_min = 2021-02-16T00:00:00.000000000\n time_max = 2021-03-07T00:00:00.000000000"
)
print("lon_min = -5\n lon_max = 30")
print("lat_min = 10\n lat_max = 50")

# Does anyone know if this is available in xarray.sel (???)
```

It would be great to add this behaviour to the method via an argument like "outside_fill" or similar!
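For reference, the behaviour asked for here can be assembled from existing primitives; below is a minimal sketch, where `sel_outer` is a hypothetical helper rather than an xarray API. It assumes each coordinate is sorted ascending, and `method="ffill"`/`"bfill"` will raise a KeyError if a bound falls outside the coordinate range:

```python
def sel_outer(ds, dim, vmin, vmax):
    """Slice `ds` along `dim`, keeping the closest labels *outside* [vmin, vmax]."""
    lo = ds[dim].sel({dim: vmin}, method="ffill").item()  # nearest label <= vmin
    hi = ds[dim].sel({dim: vmax}, method="bfill").item()  # nearest label >= vmax
    return ds.sel({dim: slice(lo, hi)})

outer_sel_ds = ds
for dim, (vmin, vmax) in [
    ("time", (t_min, t_max)),
    ("longitude", (lon_min, lon_max)),
    ("latitude", (lat_min, lat_max)),
]:
    outer_sel_ds = sel_outer(outer_sel_ds, dim, vmin, vmax)

# With the example data above this selects time 2021-02-16..2021-03-07,
# longitude -5..30 and latitude 10..50, i.e. the desired "outside" bounds.
```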

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/6100/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
id: 444367776 · node_id: MDU6SXNzdWU0NDQzNjc3NzY= · number: 2962 · title: Is it possible to perform this interpolation with xarray? · user: aragong (48764870) · state: closed · locked: 0 · comments: 5 · created_at: 2019-05-15T10:46:40Z · updated_at: 2019-05-15T12:01:42Z · closed_at: 2019-05-15T11:57:28Z · author_association: NONE

I'm trying to interpolate information from a 3D dataset (lon, lat, time) using xarray directly.

When I do a simple interpolation with only one point, I have no problem at all:

```python
lat = [44.25]
lon = [-4.5]
t = datetime.strptime('2000-02-28 01:00:00', '%Y-%m-%d %H:%M:%S')

ds = xr.open_dataset('file.nc')
vx = ds['uo_surface'].interp(longitude=lon, latitude=lat, time=t)
```

But now I'm trying to interpolate several points in the same way, and with the same syntax the operation returns more results than I expected:

```python
lat = [44.25, 45.25]
lon = [-4.5, -5]
t = datetime.strptime('2000-02-28 01:00:00', '%Y-%m-%d %H:%M:%S')

ds = xr.open_dataset('Currents\oceanTESEO.nc')
vx = ds['uo_surface'].interp(longitude=lon, latitude=lat, time=[t, t])
```

The result is this array:

```python
array([[[0.01750018, 0.05349977],
        [0.03699994, 0.11299999]],

       [[0.01750018, 0.05349977],
        [0.03699994, 0.11299999]]])
```

However, I expected only 2 values, one for each (lon, lat, t) point. Do I have to implement a loop to do that? I suppose this feature is already included in xarray. Do you know another way to compute this sort of point interpolation faster, and with 4D data arrays (lon, lat, z, time)?

Thank you in advance!!! https://stackoverflow.com/questions/56144678/interpolation-syntax
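For the record, xarray's pointwise ("vectorized") indexing covers this: passing the target coordinates as DataArrays that share a new common dimension makes `.interp()` return one value per point instead of the outer product of the input lists. A short sketch reusing the names from the example above (`lon_pts`/`lat_pts` and the "points" dimension are illustrative choices):

```python
import xarray as xr
from datetime import datetime

# One shared "points" dimension -> pointwise interpolation, not a grid.
lon_pts = xr.DataArray([-4.5, -5.0], dims="points")
lat_pts = xr.DataArray([44.25, 45.25], dims="points")
t = datetime.strptime('2000-02-28 01:00:00', '%Y-%m-%d %H:%M:%S')

vx = ds['uo_surface'].interp(longitude=lon_pts, latitude=lat_pts, time=t)
# vx now has a single "points" dimension: one value per (lon, lat) pair.
```

The same pattern extends to 4D: add a `z` DataArray on the same "points" dimension.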

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2962/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
id: 423356906 · node_id: MDU6SXNzdWU0MjMzNTY5MDY= · number: 2827 · title: multiple reference times · user: aragong (48764870) · state: closed · locked: 0 · comments: 2 · created_at: 2019-03-20T16:40:33Z · updated_at: 2019-03-21T13:10:32Z · closed_at: 2019-03-21T13:10:31Z · author_association: NONE

Problem when the reference time changes between months

I'm having some problems creating a subset from some netcdfs. Mainly, I have to open some netcdfs from a global database, make a spatial and temporal subset, and create a new netcdf to store the information on my local machine.

The problem appears when my temporal range spans from one month to the next. I load the dataset using `ds = xr.open_mfdataset()`; everything is OK and I obtain a variable `ds.time.values` with all the correct times. The problem comes when I try to export this information to a new netcdf. I use the command `ds.to_netcdf(filename, mode='w')`, but the resulting netcdf has wrong time values: it takes the reference time of the first netcdf and does not take into account that the netcdfs of the following month have a different reference time.

I suppose I'm missing something. Any idea?
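For context, this usually comes down to the `time` encoding inherited from the first input file when writing. A minimal sketch, assuming a CF-style `time` coordinate (`file_pattern` and `filename` are placeholders): overriding the units on write forces a single reference time for the whole output file.

```python
import xarray as xr

ds = xr.open_mfdataset(file_pattern)  # decoded times are correct here

# Re-encode all times against one explicit epoch instead of the units
# carried over from the first month's file.
ds.to_netcdf(
    filename,
    mode="w",
    encoding={"time": {"units": "hours since 1900-01-01 00:00:00"}},
)
```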

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2827/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
id: 423320076 · node_id: MDU6SXNzdWU0MjMzMjAwNzY= · number: 2826 · title: reference time problems when write netcdf · user: aragong (48764870) · state: closed · locked: 0 · comments: 0 · created_at: 2019-03-20T15:33:35Z · updated_at: 2019-03-20T15:34:10Z · closed_at: 2019-03-20T15:34:10Z · author_association: NONE

Code Sample, a copy-pastable example if possible

A "Minimal, Complete and Verifiable Example" will make it much easier for maintainers to help you: http://matthewrocklin.com/blog/work/2018/02/28/minimal-bug-reports

```python
# Your code here
```

Problem description

[this should explain why the current behavior is a problem and why the expected output is a better solution.]

Expected Output

Output of xr.show_versions()

# Paste the output of xr.show_versions() here
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/2826/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
  completed xarray 13221727 issue
