{"database": "github", "table": "issue_comments", "rows": [["https://github.com/pydata/xarray/pull/5045#issuecomment-822467179", "https://api.github.com/repos/pydata/xarray/issues/5045", 822467179, "MDEyOklzc3VlQ29tbWVudDgyMjQ2NzE3OQ==", 5635139, "2021-04-19T13:29:07Z", "2021-04-19T13:29:07Z", "MEMBER", "Great, this is shaping up.\n\nI think we can find a way of failing early on bad indexes without attempting the whole operation on a copy.\n\nAt the very least, we could call `__getitem__` with the indexes and see whether that passes. There may be better ways yet.\n\nI also think that because the currently proposed code uses a shallow copy, it may be mutating the original when bad indexes are passed \u2014 it's worth adding a test to confirm.", "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", null, 833778859]], "columns": ["html_url", "issue_url", "id", "node_id", "user", "created_at", "updated_at", "author_association", "body", "reactions", "performed_via_github_app", "issue"], "primary_keys": ["id"], "primary_key_values": ["822467179"], "units": {}, "query_ms": 0.6422610022127628}
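The review comment above makes two points worth illustrating: failing early by probing `__getitem__` before mutating anything, and the hazard that a shallow copy shares nested state with the original. The sketch below is purely illustrative and uses a plain dict in place of an xarray object; the names `set_index`, `set_index_buggy`, `variables`, and `indexes` are hypothetical, not xarray's actual API.

```python
# Hedged sketch, assuming a dict-of-dicts stands in for the real object.
import copy

def set_index(obj, indexes):
    # Fail early: probe each index via __getitem__ before touching anything,
    # so a bad index raises KeyError with the original left unchanged.
    for name in indexes:
        obj["variables"][name]
    new = copy.copy(obj)            # shallow copy: nested dict is still shared
    new["indexes"] = list(indexes)  # safe: rebinds a top-level key on the copy only
    return new

def set_index_buggy(obj, indexes):
    # The pattern the comment warns about: operating on a shallow copy
    # and mutating shared nested state.
    new = copy.copy(obj)
    for name in indexes:
        new["variables"][name] = "indexed"  # mutates the SHARED nested dict
    return new

obj = {"variables": {"x": "data"}, "indexes": []}

# Early failure: a bad index raises before any state changes.
try:
    set_index(obj, ["missing"])
except KeyError:
    pass
assert obj == {"variables": {"x": "data"}, "indexes": []}

# Shallow-copy hazard: the "copy" leaks mutation back into the original.
set_index_buggy(obj, ["x"])
assert obj["variables"]["x"] == "indexed"  # original was mutated
```

This is why the comment suggests adding a test that passes bad indexes and then checks the original object: with a shallow copy, the failure mode is silent mutation rather than an exception.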