issues: 513916063
| field | value |
|---|---|
| id | 513916063 |
| node_id | MDU6SXNzdWU1MTM5MTYwNjM= |
| number | 3454 |
| title | Large coordinate arrays trigger computation |
| user | 14314623 |
| state | closed |
| locked | 0 |
| assignee | |
| milestone | |
| comments | 2 |
| created_at | 2019-10-29T13:27:00Z |
| updated_at | 2019-10-29T15:07:43Z |
| closed_at | 2019-10-29T15:07:43Z |
| author_association | CONTRIBUTOR |
| active_lock_reason | |
| draft | |
| pull_request | |
| body | (full text below) |
| reactions | {"url": "https://api.github.com/repos/pydata/xarray/issues/3454/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} |
| performed_via_github_app | |
| state_reason | completed |
| repo | 13221727 |
| type | issue |

body:

I want to bring up an issue that has tripped up my workflow with large climate models many times. I am dealing with large data arrays of vertical cell thickness. These are 4D arrays (x, y, z, time), but in the xarray data model I would define them as coordinates, not data_variables (e.g. they should not be multiplied when the dataset is multiplied by a value). This sort of coordinate might become more prevalent with newer ocean models like MOM6. Whenever I assign these arrays as coordinates, operations on the arrays seem to trigger computation, whereas they don't if I set them up as data_variables. The example below shows this behavior. Is this a bug or done on purpose? Is there a workaround to keep these vertical thicknesses as coordinates?

```python
import xarray as xr
import numpy as np
import dask.array as dsa

# create dataset with vertical thickness
```
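The code block in the body is cut off after the imports. As a rough reconstruction of the comparison the reporter describes, the following sketch sets up the thickness-as-data-variable versus thickness-as-coordinate scenario; the array shape, chunking, and variable names (`dz`, `temp`) are assumptions for illustration, not the original values:

```python
import xarray as xr
import dask.array as dsa

# Hypothetical 4D cell-thickness field with dims (x, y, z, time), lazily
# backed by dask. Shape and chunk sizes are made up for illustration.
shape = (90, 60, 35, 240)
chunks = (90, 60, 35, 1)
dz = xr.DataArray(dsa.ones(shape, chunks=chunks), dims=["x", "y", "z", "time"])
temp = xr.DataArray(dsa.random.random(shape, chunks=chunks), dims=["x", "y", "z", "time"])

# Variant 1: dz as a data variable -- arithmetic on the dataset stays lazy.
ds_var = xr.Dataset({"temp": temp, "dz": dz})
result_var = ds_var["temp"] * 2
print(type(ds_var["dz"].data))  # still a lazy dask array

# Variant 2: dz as a non-dimension coordinate -- per the report, operations
# here triggered eager computation of the coordinate array.
ds_coord = xr.Dataset({"temp": temp}, coords={"dz": dz})
result_coord = ds_coord["temp"] * 2
print(type(ds_coord["dz"].data))  # reported to come back as a computed numpy array
```

For experimenting with where the computation is triggered, `Dataset.set_coords` and `Dataset.reset_coords` move a variable between the two states without rebuilding the dataset.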