issues
6 rows where user = 34353851 sorted by updated_at descending

type (2 values)

  • issue 4
  • pull 2

state (2 values)

  • closed 5
  • open 1

repo (1 value)

  • xarray 6
#5225: python3.9 dask/array/slicing.py in slice_wrap_lists Don't yet support nd fancy indexing
id: 869180122 · node_id: MDU6SXNzdWU4NjkxODAxMjI= · user: JavierRuano (34353851) · state: closed · locked: 0 · comments: 5 · created_at: 2021-04-27T19:12:48Z · updated_at: 2021-04-28T09:31:22Z · closed_at: 2021-04-28T09:08:48Z · author_association: NONE

The code at line 411 of https://github.com/JavierRuano/ASI_Steady/blob/main/ASI_Datase_RACKt.py#L411 raises NotImplementedError: "Don't yet support nd fancy indexing" from dask.

The same code was working well with Python 3.7.

Condensed traceback (from the Django debug page):

  ASI_Datase_RACKt.py, in refresh_Graphics
      saveFile = self.xarray[indices_maps[str(self.typi)]].where(self.xarray.mask == 1).where( …
  /usr/local/lib/python3.9/dist-packages/xarray/core/common.py, in where
      self = self.isel(**indexers)
  /usr/local/lib/python3.9/dist-packages/xarray/core/dataarray.py, in isel
      variable = self._variable.isel(indexers, missing_dims=missing_dims)
  /usr/local/lib/python3.9/dist-packages/xarray/core/variable.py, in isel
      return self[key]
  /usr/local/lib/python3.9/dist-packages/xarray/core/variable.py, in __getitem__
      data = as_indexable(self._data)[indexer]
  /usr/local/lib/python3.9/dist-packages/xarray/core/indexing.py, in __getitem__
      return array[key]
  /usr/local/lib/python3.9/dist-packages/dask/array/core.py, in __getitem__
      dsk, chunks = slice_array(out, self.name, self.chunks, index2, self.itemsize)
  /usr/local/lib/python3.9/dist-packages/dask/array/slicing.py, in slice_array
      dsk_out, bd_out = slice_with_newaxes(out_name, in_name, blockdims, index, itemsize)
  /usr/local/lib/python3.9/dist-packages/dask/array/slicing.py, in slice_with_newaxes
      dsk, blockdims2 = slice_wrap_lists(out_name, in_name, blockdims, index2, itemsize)
  /usr/local/lib/python3.9/dist-packages/dask/array/slicing.py, in slice_wrap_lists
      raise NotImplementedError("Don't yet support nd fancy indexing")

Key local values from the debug page:

  • self (in where/isel): <xarray.DataArray 'ASI_Horton_2012' (time: 15310, latitude: 68, longitude: 81)>, backed by dask.array<copy, shape=(15310, 68, 81), dtype=float64, chunksize=(15310, 68, 81)>, with a mask (latitude, longitude) coordinate and attribute long_name: air_stagnation_index
  • cond / clipcond: <xarray.DataArray (latitude: 68, longitude: 81)> containing only False values
  • drop (in where): True; other: <NA>
  • indexers: {'latitude': array([], dtype=int64), 'longitude': array([], dtype=int64)}; key: (slice(None, None, None), array([], dtype=int64), array([], dtype=int64))
  • index reaching dask's slice_array: (array([[[0]], [[1]], …, [[15309]]]), array([], shape=(1, 0, 1), dtype=int64), array([], shape=(1, 1, 0), dtype=int64))
  • blockdims: ((15310,), (68,), (81,)); itemsize: 8; where_list: [0, 1, 2]
  • in refresh_Graphics: NETCDF_FILES_FOLDER '/var/www/stream/stream/data/', start '2020-11-01', end '2020-11-02', indices_maps {'horton': 'ASI_Horton_2012', 'huang': 'ASI_Huang_2018', 'wang': 'ASI_Wang_2017'}, lat_uno 48.0, lat_dos 48.75, lng_uno -5.0, lng_dos -5.75, self <polls.ASI_Datase_RACKt.ASI object at 0x7f585899e1f0>
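For context, here is a minimal sketch of the pattern the traceback shows: an all-False condition passed to .where(..., drop=True) on a dask-backed DataArray. The names and shapes below are illustrative only, not taken from this report, and whether the call actually raises depends on the installed dask version:

    # Hypothetical reproduction sketch; names and shapes are made up.
    import numpy as np
    import xarray as xr

    da = xr.DataArray(
        np.random.rand(100, 68, 81),
        dims=("time", "latitude", "longitude"),
        name="ASI_Horton_2012",
    ).chunk({"time": 100})  # dask-backed, like the array in the locals above

    # A condition that selects nothing, mirroring the all-False `cond`.
    cond = xr.DataArray(
        np.zeros((68, 81), dtype=bool),
        dims=("latitude", "longitude"),
    )

    # With drop=True, xarray builds empty integer indexers for latitude and
    # longitude; in this report that path ended in
    # NotImplementedError: Don't yet support nd fancy indexing
    result = da.where(cond, drop=True)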

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5225/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
#4110: Feature request: merge compat overlap
id: 627356505 · node_id: MDU6SXNzdWU2MjczNTY1MDU= · user: JavierRuano (34353851) · state: open · locked: 0 · comments: 1 · created_at: 2020-05-29T15:36:29Z · updated_at: 2021-04-19T14:46:46Z · author_association: NONE

xarray.merge should have a compat "overlap" option. That behaviour already exists as combine_first, but it would make more sense inside merge:

    import xarray as xr

    x1 = xr.DataArray([1, 2, 3, 4, 11], dims=['time'], coords=[[4, 5, 6, 7, 3]]).to_dataset(name='fusionfria')
    x2 = xr.DataArray([5, 6, 7, 8], dims=['time'], coords=[[0, 1, 2, 3]]).to_dataset(name='fusionfria')
    x1.combine_first(x2)

It could also help avoid the following error from xarray.open_mfdataset when the time index overlaps between NetCDF files:

ValueError: Resulting object does not have monotonic global indexes along dimension time
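As a rough sketch of the overlap situation described above (the dataset contents below are made up for illustration): combine_first keeps the calling dataset's values where the time indexes overlap and fills the rest from the other dataset, producing a single monotonic time index, which is roughly the behaviour this request would like merge to expose.

    import xarray as xr

    # Two datasets whose 'time' coordinates overlap at t=3.
    a = xr.DataArray([1.0, 2.0, 3.0], dims=['time'], coords=[[1, 2, 3]]).to_dataset(name='v')
    b = xr.DataArray([30.0, 40.0, 50.0], dims=['time'], coords=[[3, 4, 5]]).to_dataset(name='v')

    # combine_first prefers `a` at the overlapping index and takes the
    # remaining values from `b`; the result is monotonic along time.
    merged = a.combine_first(b)
    print(merged['v'].values)       # [ 1.  2.  3. 40. 50.]
    print(merged['time'].values)    # [1 2 3 4 5]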

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/4110/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: issue
#5086: Update examples.rst
id: 842615316 · node_id: MDExOlB1bGxSZXF1ZXN0NjAyMTUwMzAx · user: JavierRuano (34353851) · state: closed · locked: 0 · comments: 0 · created_at: 2021-03-27T22:32:25Z · updated_at: 2021-04-19T09:31:22Z · closed_at: 2021-04-19T09:31:22Z · author_association: NONE · draft: 0 · pull_request: pydata/xarray/pulls/5086

Add an external example of xarray usage.

  • [x] Closes #xxxx
  • [ ] Tests added
  • [ ] Passes pre-commit run --all-files
  • [ ] User visible changes (including notable bug fixes) are documented in whats-new.rst
  • [ ] New functions/methods are listed in api.rst
{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5086/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull
#5129: Closes #5085
id: 852809756 · node_id: MDExOlB1bGxSZXF1ZXN0NjExMDA0Njkz · user: JavierRuano (34353851) · state: closed · locked: 0 · comments: 0 · created_at: 2021-04-07T21:01:16Z · updated_at: 2021-04-19T09:31:00Z · closed_at: 2021-04-19T09:31:00Z · author_association: NONE · draft: 0 · pull_request: pydata/xarray/pulls/5129

[x] Closes #5086

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5129/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
repo: xarray (13221727) · type: pull
#3795: Dataset problem with chunk DataArray.
id: 569806418 · node_id: MDU6SXNzdWU1Njk4MDY0MTg= · user: JavierRuano (34353851) · state: closed · locked: 0 · comments: 8 · created_at: 2020-02-24T11:49:26Z · updated_at: 2021-04-19T09:28:21Z · closed_at: 2021-04-19T09:28:21Z · author_association: NONE

When I create an xr.Dataset from two variables that are DataArrays, I only obtain one chunk's worth of the DataArray. Should I call compute or something similar? How can I control the number of chunks of the DataArray so I can operate on them one by one?

I mean:

  • DataArray: dask.array<shape=(14610, 47, 68, 81), dtype=float32, chunksize=(365, 47, 68, 81)>
  • Dataset: dask.array<shape=(365, 47, 68, 81), dtype=float32, chunksize=(365, 47, 68, 81)>

:-) Perhaps the answer is to set the chunk size to the full length, .chunk(14610) in my case? What is the difference compared with compute()? The operation stays lazy...
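To illustrate the rechunk-versus-compute question, a small sketch with made-up dimension sizes (not the shapes from this report): rechunking along time keeps everything lazy, while compute() materialises the values as numpy arrays.

    import numpy as np
    import xarray as xr

    # Illustrative array chunked along time, analogous to chunksize=(365, 47, 68, 81).
    da = xr.DataArray(
        np.zeros((20, 4, 6, 8), dtype=np.float32),
        dims=("time", "level", "latitude", "longitude"),
    ).chunk({"time": 5})

    ds = da.to_dataset(name="asi")

    # Rechunk to a single chunk along time; still lazy, nothing is loaded yet.
    single = ds.chunk({"time": -1})
    print(single["asi"].data.chunksize)   # (20, 4, 6, 8)

    # compute() evaluates the dask graph and returns in-memory numpy data.
    loaded = ds.compute()
    print(type(loaded["asi"].data))       # <class 'numpy.ndarray'>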

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/3795/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue
#5085: Add example in your wiki.
id: 842610988 · node_id: MDU6SXNzdWU4NDI2MTA5ODg= · user: JavierRuano (34353851) · state: closed · locked: 0 · comments: 10 · created_at: 2021-03-27T22:02:56Z · updated_at: 2021-04-19T08:52:19Z · closed_at: 2021-04-18T21:59:47Z · author_association: NONE

Please add to the wiki, as an example, our Stagnation Index calculation program repository. It is based on xarray, with complex functions as ufuncs and integration with Django and Cartopy. It is a 2020 project of the UCM Physics faculty, and the researchers have articles based on that Copernicus ERA5 data.

https://github.com/JavierRuano/ASI_Steady

The website is http://steady-ucm.org

I have created an example with all the necessary datasets: https://github.com/JavierRuano/ASI_Steady/tree/main/Examples

The example could go here: http://xarray.pydata.org/en/stable/examples.html

Regards, Javier Ruano.

{
    "url": "https://api.github.com/repos/pydata/xarray/issues/5085/reactions",
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
state_reason: completed · repo: xarray (13221727) · type: issue

CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [active_lock_reason] TEXT,
   [draft] INTEGER,
   [pull_request] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [state_reason] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT
);
CREATE INDEX [idx_issues_repo]
    ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone]
    ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee]
    ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user]
    ON [issues] ([user]);