issue_comments: 317465968


html_url:            https://github.com/pydata/xarray/pull/1473#issuecomment-317465968
issue_url:           https://api.github.com/repos/pydata/xarray/issues/1473
id:                  317465968
node_id:             MDEyOklzc3VlQ29tbWVudDMxNzQ2NTk2OA==
user:                1217238
created_at:          2017-07-24T15:48:33Z
updated_at:          2017-07-24T15:48:33Z
author_association:  MEMBER

With the current logic, we normalize everything into a standard indexer tuple in `Variable.__getitem__`. I think we should explicitly create different kinds of indexers, and then handle them explicitly in various backends/array wrappers, e.g.:

```python
# in core/indexing.py

class IndexerTuple(tuple):
    """Base class for xarray indexing tuples."""

    def __repr__(self):
        """Repr that shows the type name."""
        return type(self).__name__ + tuple.__repr__(self)


class BasicIndexer(IndexerTuple):
    """Tuple for basic indexing."""


class OuterIndexer(IndexerTuple):
    """Tuple for outer/orthogonal indexing (.oindex)."""


class VectorizedIndexer(IndexerTuple):
    """Tuple for vectorized indexing (.vindex)."""


# in core/variable.py

class Variable(...):
    def _broadcast_indexes(self, key):
        # return a BasicIndexer if possible, otherwise an OuterIndexer
        # if possible, and finally a VectorizedIndexer
        ...


# in adapters for various backends/storage types

class DaskArrayAdapter(...):
    def __getitem__(self, key):
        if isinstance(key, VectorizedIndexer):
            raise IndexError("dask doesn't yet support vectorized indexing")
        ...
```
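To make the idea concrete, here is a self-contained, runnable sketch of the proposed tuple subclasses together with a hypothetical `classify_key` helper standing in for the dispatch that `Variable._broadcast_indexes` would perform. The classification rules shown are illustrative assumptions for this sketch, not xarray's actual logic:

```python
class IndexerTuple(tuple):
    """Base class for xarray indexing tuples."""

    def __repr__(self):
        # Show the concrete type name alongside the tuple contents.
        return type(self).__name__ + tuple.__repr__(self)


class BasicIndexer(IndexerTuple):
    """Tuple for basic indexing (ints and slices only)."""


class OuterIndexer(IndexerTuple):
    """Tuple for outer/orthogonal indexing (.oindex)."""


class VectorizedIndexer(IndexerTuple):
    """Tuple for vectorized indexing (.vindex)."""


def classify_key(key):
    """Pick the least general indexer type that can express `key`.

    Hypothetical rule of thumb: ints/slices -> basic; adding 1-d
    sequences per axis -> outer; anything fancier -> vectorized.
    """
    if all(isinstance(k, (int, slice)) for k in key):
        return BasicIndexer(key)
    if all(isinstance(k, (int, slice, list)) for k in key):
        return OuterIndexer(key)
    return VectorizedIndexer(key)


print(classify_key((0, slice(None))))  # -> BasicIndexer(0, slice(None, None, None))
print(classify_key(([0, 1], 2)))       # -> OuterIndexer([0, 1], 2)
```

Because each class is just a `tuple` subclass, backends can dispatch with a plain `isinstance` check and otherwise treat the key exactly like the tuples they already receive.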

This is a little more work at the outset, because we have to handle each indexer type in each backend, but it avoids the error-prone broadcasting/un-broadcasting logic.
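As a toy illustration of that trade-off (not xarray code), the adapter below dispatches on the indexer type instead of re-deriving the indexing kind from a raw tuple; `ListAdapter` and the minimal marker classes are assumptions made up for this sketch:

```python
# Minimal stand-ins for the proposed indexer tuple classes.
class BasicIndexer(tuple): pass
class OuterIndexer(tuple): pass
class VectorizedIndexer(tuple): pass


class ListAdapter:
    """Wraps a 2-d nested list and handles each indexer type explicitly."""

    def __init__(self, data):
        self.data = data

    def __getitem__(self, key):
        if isinstance(key, VectorizedIndexer):
            # This backend opts out of the most general case up front,
            # like the DaskArrayAdapter example above.
            raise IndexError("this backend doesn't support vectorized indexing")
        if isinstance(key, OuterIndexer):
            # Outer indexing: take the cross product of the row and
            # column selections.
            rows, cols = key
            return [[self.data[r][c] for c in cols] for r in rows]
        if isinstance(key, BasicIndexer):
            r, c = key
            return self.data[r][c]
        raise TypeError("unexpected key type: %r" % type(key))


grid = ListAdapter([[1, 2, 3], [4, 5, 6]])
print(grid[BasicIndexer((0, 2))])            # -> 3
print(grid[OuterIndexer(([0, 1], [0, 2]))])  # -> [[1, 3], [4, 6]]
```

Each branch states its indexing semantics directly, so no adapter ever has to guess which kind of tuple it was handed.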

issue: 241578773