Comment on pydata/xarray issue #2525, posted 2018-11-01T22:51:55Z by user 1217238 (MEMBER):
https://github.com/pydata/xarray/issues/2525#issuecomment-435213658

skimage implements block_reduce via the view_as_blocks utility function: https://github.com/scikit-image/scikit-image/blob/62e29cd89dc858d8fb9d3578034a2f456f298ed3/skimage/util/shape.py#L9-L103

But given that it doesn't actually duplicate any elements and needs a C-order array to work, I think it's actually just equivalent to using reshape + transpose. For example, `B = A.reshape(4, 1, 2, 2, 3, 2).transpose([0, 2, 4, 1, 3, 5])` reproduces `skimage.util.view_as_blocks(A, (1, 2, 2))` from the docstring example.
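To sanity-check that equivalence without installing skimage, here is a sketch that compares the reshape + transpose result against explicitly sliced blocks (the array `A` matches the `view_as_blocks` docstring example; the slicing loop is my own stand-in for skimage's output):

```python
import numpy as np

# Docstring example array: shape (4, 4, 6), C-order.
A = np.arange(4 * 4 * 6).reshape(4, 4, 6)

# reshape + transpose equivalent of view_as_blocks(A, (1, 2, 2)).
B = A.reshape(4, 1, 2, 2, 3, 2).transpose([0, 2, 4, 1, 3, 5])

# B[i, j, k] should be the (1, 2, 2) window of A starting at (i, 2*j, 2*k).
for i in range(4):
    for j in range(2):
        for k in range(3):
            block = A[i:i + 1, 2 * j:2 * j + 2, 2 * k:2 * k + 2]
            assert np.array_equal(B[i, j, k], block)
```

The leading block axes come first (`[0, 2, 4, ...]`) and the within-block axes last (`[..., 1, 3, 5]`), matching skimage's `(n_blocks..., block_shape...)` layout.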

So the super-simple version of block-reduce looks like:

```python
import numpy as np

def block_reduce(image, block_size, func=np.sum):
    # TODO: input validation
    # TODO: consider copying padding from skimage
    blocked_shape = []
    for existing_size, size in zip(image.shape, block_size):
        blocked_shape.extend([existing_size // size, size])
    blocked = np.reshape(image, tuple(blocked_shape))
    # Reduce over the within-block axes (every other axis, starting at 1).
    return func(blocked, axis=tuple(range(1, blocked.ndim, 2)))
```
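For concreteness, here is how that function behaves on a small array (the definition is repeated so the snippet runs standalone; the example array and block size are my own):

```python
import numpy as np

def block_reduce(image, block_size, func=np.sum):
    # Same reshape-based reduction as above, repeated for self-containment.
    blocked_shape = []
    for existing_size, size in zip(image.shape, block_size):
        blocked_shape.extend([existing_size // size, size])
    blocked = np.reshape(image, tuple(blocked_shape))
    return func(blocked, axis=tuple(range(1, blocked.ndim, 2)))

# Downsample a (4, 6) array by (2, 3) blocks -> result shape (2, 2).
image = np.arange(24, dtype=float).reshape(4, 6)
sums = block_reduce(image, (2, 3))            # sum within each 2x3 block
means = block_reduce(image, (2, 3), np.mean)  # mean within each 2x3 block
```

Each output element aggregates one non-overlapping 2×3 window, e.g. `sums[0, 0]` is the sum of `image[0:2, 0:3]`.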

This would work on dask arrays out of the box, but it's probably worth benchmarking whether you'd get better performance doing the operation chunk-wise (e.g., with `map_blocks`).
