html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue https://github.com/pydata/xarray/issues/1208#issuecomment-275956940,https://api.github.com/repos/pydata/xarray/issues/1208,275956940,MDEyOklzc3VlQ29tbWVudDI3NTk1Njk0MA==,10050469,2017-01-29T23:56:04Z,2017-01-29T23:56:04Z,MEMBER,"I added a PR in order to allow a dev version of bottleneck to be used, too: https://github.com/pydata/xarray/issues/1235","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-275955436,https://api.github.com/repos/pydata/xarray/issues/1208,275955436,MDEyOklzc3VlQ29tbWVudDI3NTk1NTQzNg==,1217238,2017-01-29T23:31:29Z,2017-01-29T23:31:29Z,MEMBER,"@fmaussion thanks for puzzling this one out! @ghisvail thanks for the report!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-275955256,https://api.github.com/repos/pydata/xarray/issues/1208,275955256,MDEyOklzc3VlQ29tbWVudDI3NTk1NTI1Ng==,10050469,2017-01-29T23:28:18Z,2017-01-29T23:28:18Z,MEMBER,the tests now pass with bottleneck master.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-275910605,https://api.github.com/repos/pydata/xarray/issues/1208,275910605,MDEyOklzc3VlQ29tbWVudDI3NTkxMDYwNQ==,10050469,2017-01-29T12:26:03Z,2017-01-29T12:26:03Z,MEMBER,"@shoyer nevermind, I found the bug: https://github.com/kwgoodman/bottleneck/issues/161 Quite a tricky one indeed ;-)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 
https://github.com/pydata/xarray/issues/1208#issuecomment-275909069,https://api.github.com/repos/pydata/xarray/issues/1208,275909069,MDEyOklzc3VlQ29tbWVudDI3NTkwOTA2OQ==,10050469,2017-01-29T11:53:27Z,2017-01-29T11:53:27Z,MEMBER,"@shoyer there is something very weird going on. See the following example: ```python import numpy as np import bottleneck as bn import xarray as xr da = xr.DataArray(np.ones((10, 20)).astype(np.int), dims=['x', 'y'], coords={'abc':('y', np.array(['a'] * 9 + ['c'] + ['b'] * 10))}) np.testing.assert_allclose(np.sum(da[:, 9:10]), bn.nansum(da[:, 9:10])) # this will always work np.testing.assert_allclose(da.groupby('abc').reduce(np.sum), da.groupby('abc').sum()) # this won't ``` This works with bottleneck installed. Now change the ``astype(np.int)`` to ``astype(np.float)``: the first assertion will still pass, but the second won't! It will fail with: ``` --------------------------------------------------------------------------- AssertionError Traceback (most recent call last) in () 2 coords={'abc':('y', np.array(['a'] * 9 + ['c'] + ['b'] * 10))}) 3 np.testing.assert_allclose(np.sum(da[:, 9:10]), bn.nansum(da[:, 9:10])) ----> 4 np.testing.assert_allclose(da.groupby('abc').reduce(np.sum), da.groupby('abc').sum()) /home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/numpy/testing/utils.py in assert_allclose(actual, desired, rtol, atol, equal_nan, err_msg, verbose) 1409 header = 'Not equal to tolerance rtol=%g, atol=%g' % (rtol, atol) 1410 assert_array_compare(compare, actual, desired, err_msg=str(err_msg), -> 1411 verbose=verbose, header=header, equal_nan=equal_nan) 1412 1413 /home/mowglie/.pyvirtualenvs/py3/lib/python3.4/site-packages/numpy/testing/utils.py in assert_array_compare(comparison, x, y, err_msg, verbose, header, precision, equal_nan) 794 names=('x', 'y'), precision=precision) 795 if not cond: --> 796 raise AssertionError(msg) 797 except ValueError: 798 import traceback AssertionError: Not equal to tolerance 
rtol=1e-07, atol=0 (mismatch 33.33333333333333%) x: array([ 90., 100., 10.]) y: array([ 90., 100., 1.]) ``` So the sum operation is applied only to the first element of the grouped selection, and I didn't manage to use bottleneck alone to replicate this (since the first assertion always passes). Before I dig into this I'd like your opinion: any idea of what could go wrong here? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-275448697,https://api.github.com/repos/pydata/xarray/issues/1208,275448697,MDEyOklzc3VlQ29tbWVudDI3NTQ0ODY5Nw==,1217238,2017-01-26T17:14:26Z,2017-01-26T17:14:26Z,MEMBER,"@ghisvail Thanks for your diligence on this. @fmaussion If you can turn one of these into a test case for bottleneck to report upstream that would be super helpful. I would probably start with `test_groupby_sum`. It's likely that this only occurs for arrays with a particular `strides` (memory layout) and `shape`, which is what inspired the [blind guess](https://github.com/kwgoodman/bottleneck/issues/160#issuecomment-272991823) I suggested on the bottleneck tracker.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-275396760,https://api.github.com/repos/pydata/xarray/issues/1208,275396760,MDEyOklzc3VlQ29tbWVudDI3NTM5Njc2MA==,10050469,2017-01-26T14:10:48Z,2017-01-26T14:10:48Z,MEMBER,I can confirm that - I have the same problems in my pip virtualenv on Linux Mint (I tried to reproduce it on [travis](https://github.com/pydata/xarray/pull/1222#issuecomment-274188191) without success),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 
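For reference, the expected values in the traceback above (`x: [ 90., 100., 10.]`) follow directly from the example's shape: 10 rows times 9, 10 and 1 columns of ones for groups 'a', 'b' and 'c'. A minimal pure-NumPy sketch of the per-group sums that `groupby('abc').sum()` must reproduce; the bottleneck import with fallback is an assumption so the sketch runs in any environment, and note that per the comment above the direct comparison below passed even on the buggy build (the wrong value only surfaced through xarray's groupby path):

```python
import numpy as np

try:
    import bottleneck as bn      # optional: the accelerated path xarray dispatches to
    nansum = bn.nansum
except ImportError:
    nansum = np.nansum           # sketch still runs without bottleneck

# Same setup as the example above: a (10, 20) float array of ones whose
# 'abc' coordinate splits the columns into groups a (9), c (1) and b (10).
x = np.ones((10, 20), dtype=float)
slices = {"a": np.s_[:, :9], "c": np.s_[:, 9:10], "b": np.s_[:, 10:]}

# Per-group sums a correct groupby('abc').sum() must produce:
expected = {"a": 90.0, "c": 10.0, "b": 100.0}

for label, sl in slices.items():
    view = x[sl]                 # strided view into the base array
    assert np.nansum(view) == expected[label]
    # This direct comparison passed even on the buggy build; the wrong
    # value for group 'c' (1.0 instead of 10.0) only appeared through
    # xarray's groupby machinery.
    assert nansum(view) == np.nansum(view)
```

Group 'c' is the single column 9, so a result of 1.0 instead of 10.0 for that group matches the `y: [ 90., 100., 1.]` in the traceback.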
https://github.com/pydata/xarray/issues/1208#issuecomment-275392455,https://api.github.com/repos/pydata/xarray/issues/1208,275392455,MDEyOklzc3VlQ29tbWVudDI3NTM5MjQ1NQ==,1964655,2017-01-26T13:51:15Z,2017-01-26T13:51:15Z,CONTRIBUTOR,"Re-opening. Debian now has a version of NumPy with the fix for the bug that broke `bottleneck`. However, the tests still do not pass, with the following log: ``` =================================== FAILURES =================================== ___________________ TestDataArray.test_groupby_apply_center ____________________ self = def test_groupby_apply_center(self): def center(x): return x - np.mean(x) array = self.make_groupby_example_array() grouped = array.groupby('abc') expected_ds = array.to_dataset() exp_data = np.hstack([center(self.x[:, :9]), center(self.x[:, 9:10]), center(self.x[:, 10:])]) expected_ds['foo'] = (['x', 'y'], exp_data) expected_centered = expected_ds['foo'] > self.assertDataArrayAllClose(expected_centered, grouped.apply(center)) xarray/tests/test_dataarray.py:1495: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ xarray/tests/__init__.py:169: in assertDataArrayAllClose assert_allclose(ar1, ar2, rtol=rtol, atol=atol) xarray/testing.py:125: in assert_allclose assert_allclose(a.variable, b.variable) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ a = array([[ 1.086411e-01, -4.940766e-01, -3.4...e-01, 3.025169e-01, -4.970776e-03, -2.526321e-02, 4.464467e-01]]) b = array([[ 1.086411e-01, -4.940766e-01, -3.4...e-01, 3.025169e-01, -4.970776e-03, -2.526321e-02, 4.464467e-01]]) rtol = 1e-05, atol = 1e-08, decode_bytes = True def assert_allclose(a, b, rtol=1e-05, atol=1e-08, decode_bytes=True): """"""Like :py:func:`numpy.testing.assert_allclose`, but for xarray objects. Raises an AssertionError if two objects are not equal up to desired tolerance. Parameters ---------- a : xarray.Dataset, xarray.DataArray or xarray.Variable The first object to compare. 
b : xarray.Dataset, xarray.DataArray or xarray.Variable The second object to compare. rtol : float, optional Relative tolerance. atol : float, optional Absolute tolerance. decode_bytes : bool, optional Whether byte dtypes should be decoded to strings as UTF-8 or not. This is useful for testing serialization methods on Python 3 that return saved strings as bytes. See also -------- assert_identical, assert_equal, numpy.testing.assert_allclose """""" import xarray as xr ___tracebackhide__ = True # noqa: F841 assert type(a) == type(b) if isinstance(a, xr.Variable): assert a.dims == b.dims allclose = _data_allclose_or_equiv(a.values, b.values, rtol=rtol, atol=atol, decode_bytes=decode_bytes) > assert allclose, '{}\n{}'.format(a.values, b.values) E AssertionError: [[ 1.08641053e-01 -4.94076627e-01 -3.45099073e-01 2.39968246e-01 E 1.70028797e-04 -3.54496330e-01 2.22851030e-01 3.72358514e-01 E -1.43769756e-01 -2.74687658e-01 -2.45407871e-01 2.15556579e-01 E -4.03911607e-01 3.87184795e-01 -4.18200302e-01 -4.06039565e-01 E -3.83684191e-01 -2.96458046e-01 4.74874038e-02 1.26236739e-01] E [ 3.61580852e-01 -4.79206943e-02 -2.66600721e-01 1.55428994e-01 E 2.61949104e-01 -2.14716556e-01 -3.06136650e-01 1.22278069e-01 E -6.66015156e-02 -1.69577684e-01 3.87460083e-01 1.76306072e-01 E -4.44331607e-01 -5.25418189e-01 1.02862502e-01 -2.85689326e-02 E -2.02277044e-02 9.84154078e-02 2.13794060e-01 3.01328151e-01] E [ 2.95736742e-01 4.25678417e-02 1.09718836e-01 -4.88189636e-01 E 4.03312049e-02 -2.29350832e-01 4.14023579e-01 2.97767319e-02 E -2.67128794e-01 3.21459151e-01 3.17043658e-01 -2.23716814e-01 E -2.66104828e-01 -3.28529266e-01 8.57802322e-02 -3.77281481e-01 E 3.19405419e-01 -2.14206353e-01 1.44674807e-01 -3.10715376e-01] E [ 2.81673580e-01 2.73754200e-03 -3.38780740e-01 -3.34309419e-01 E -3.53325764e-01 3.60925598e-01 -1.82147600e-01 4.78926180e-01 E -5.15467328e-01 -3.67560456e-02 -4.19994747e-01 -1.07446917e-01 E 2.66933947e-01 -1.08693646e-01 4.91030682e-03 2.77558294e-01 E 
-1.86235207e-01 2.48140256e-01 4.43512756e-02 1.26844065e-01] E [ 3.01363974e-01 -1.09877347e-01 1.22309126e-01 4.31480566e-01 E 4.76386916e-01 2.85348322e-01 2.76157706e-02 4.35295440e-01 E 1.70634220e-02 -1.45109715e-01 3.74386086e-01 -3.66322521e-01 E -3.46253595e-02 -2.91631220e-01 -2.43084987e-01 2.23311739e-01 E -4.16467323e-01 2.14347435e-02 -3.60138536e-01 -9.76142770e-02] E [ -2.41033567e-01 -3.46475587e-01 -1.15366138e-01 4.41278309e-01 E -1.18403292e-01 2.82655028e-01 4.28244101e-01 -1.79257638e-01 E -1.03689353e-01 -2.18645467e-02 4.46969306e-02 5.81598169e-02 E 2.65938587e-01 1.92085853e-01 -4.32495899e-01 -2.24346584e-01 E 5.34276468e-02 5.48206231e-02 -5.89989991e-03 -3.15437391e-01] E [ -1.62051319e-01 -9.45765818e-02 -5.03109630e-01 4.17684367e-01 E -3.06186685e-01 -1.46334025e-01 -9.56430059e-02 -4.25805199e-01 E 1.03351228e-01 4.13053193e-01 -1.06769941e-01 3.80051241e-01 E -2.82681942e-01 -5.40183672e-02 -1.20729983e-01 2.51349129e-01 E 3.98594071e-01 -1.25509013e-02 -2.20384829e-01 3.53787835e-01] E [ 8.63994358e-02 3.30289251e-01 3.34951245e-01 4.74269877e-01 E -3.76742368e-01 -1.06150098e-01 2.75894947e-01 2.58433673e-01 E 1.72780099e-01 -1.23719179e-01 1.87148326e-01 2.30802594e-02 E -1.91651687e-02 -1.24840109e-01 -3.47666075e-02 1.94775931e-01 E 1.57854118e-01 2.36980122e-01 4.19620283e-01 3.92725774e-01] E [ -1.20034155e-01 2.87247357e-01 3.46581678e-02 -2.91391233e-01 E -1.41828417e-01 -1.02849154e-01 -2.22434220e-01 5.88958162e-02 E 5.10373849e-02 8.00973720e-02 -4.20339680e-01 3.77197946e-02 E -1.19250042e-01 1.89674423e-01 -3.08228692e-02 2.09467152e-01 E -2.95852048e-01 3.88376081e-02 -7.35402315e-02 2.08679192e-01] E [ -5.81460294e-02 -4.49176762e-01 1.06408912e-01 4.43941698e-01 E -4.43932980e-01 8.99339636e-02 -1.99292909e-01 -4.60542036e-01 E 1.91585641e-01 -4.28948878e-02 -4.56597716e-01 1.85430017e-01 E 3.29358074e-01 1.29811129e-01 2.80559136e-01 3.70777734e-01 E 3.02516908e-01 -4.97077643e-03 -2.52632109e-02 4.46446732e-01]] 
E [[ 1.08641053e-01 -4.94076627e-01 -3.45099073e-01 2.39968246e-01 E 1.70028797e-04 -3.54496330e-01 2.22851030e-01 3.72358514e-01 E -1.43769756e-01 1.03282389e-01 -2.45407871e-01 2.15556579e-01 E -4.03911607e-01 3.87184795e-01 -4.18200302e-01 -4.06039565e-01 E -3.83684191e-01 -2.96458046e-01 4.74874038e-02 1.26236739e-01] E [ 3.61580852e-01 -4.79206943e-02 -2.66600721e-01 1.55428994e-01 E 2.61949104e-01 -2.14716556e-01 -3.06136650e-01 1.22278069e-01 E -6.66015156e-02 2.08392364e-01 3.87460083e-01 1.76306072e-01 E -4.44331607e-01 -5.25418189e-01 1.02862502e-01 -2.85689326e-02 E -2.02277044e-02 9.84154078e-02 2.13794060e-01 3.01328151e-01] E [ 2.95736742e-01 4.25678417e-02 1.09718836e-01 -4.88189636e-01 E 4.03312049e-02 -2.29350832e-01 4.14023579e-01 2.97767319e-02 E -2.67128794e-01 6.99429199e-01 3.17043658e-01 -2.23716814e-01 E -2.66104828e-01 -3.28529266e-01 8.57802322e-02 -3.77281481e-01 E 3.19405419e-01 -2.14206353e-01 1.44674807e-01 -3.10715376e-01] E [ 2.81673580e-01 2.73754200e-03 -3.38780740e-01 -3.34309419e-01 E -3.53325764e-01 3.60925598e-01 -1.82147600e-01 4.78926180e-01 E -5.15467328e-01 3.41214002e-01 -4.19994747e-01 -1.07446917e-01 E 2.66933947e-01 -1.08693646e-01 4.91030682e-03 2.77558294e-01 E -1.86235207e-01 2.48140256e-01 4.43512756e-02 1.26844065e-01] E [ 3.01363974e-01 -1.09877347e-01 1.22309126e-01 4.31480566e-01 E 4.76386916e-01 2.85348322e-01 2.76157706e-02 4.35295440e-01 E 1.70634220e-02 2.32860332e-01 3.74386086e-01 -3.66322521e-01 E -3.46253595e-02 -2.91631220e-01 -2.43084987e-01 2.23311739e-01 E -4.16467323e-01 2.14347435e-02 -3.60138536e-01 -9.76142770e-02] E [ -2.41033567e-01 -3.46475587e-01 -1.15366138e-01 4.41278309e-01 E -1.18403292e-01 2.82655028e-01 4.28244101e-01 -1.79257638e-01 E -1.03689353e-01 3.56105501e-01 4.46969306e-02 5.81598169e-02 E 2.65938587e-01 1.92085853e-01 -4.32495899e-01 -2.24346584e-01 E 5.34276468e-02 5.48206231e-02 -5.89989991e-03 -3.15437391e-01] E [ -1.62051319e-01 -9.45765818e-02 -5.03109630e-01 
4.17684367e-01 E -3.06186685e-01 -1.46334025e-01 -9.56430059e-02 -4.25805199e-01 E 1.03351228e-01 7.91023240e-01 -1.06769941e-01 3.80051241e-01 E -2.82681942e-01 -5.40183672e-02 -1.20729983e-01 2.51349129e-01 E 3.98594071e-01 -1.25509013e-02 -2.20384829e-01 3.53787835e-01] E [ 8.63994358e-02 3.30289251e-01 3.34951245e-01 4.74269877e-01 E -3.76742368e-01 -1.06150098e-01 2.75894947e-01 2.58433673e-01 E 1.72780099e-01 2.54250868e-01 1.87148326e-01 2.30802594e-02 E -1.91651687e-02 -1.24840109e-01 -3.47666075e-02 1.94775931e-01 E 1.57854118e-01 2.36980122e-01 4.19620283e-01 3.92725774e-01] E [ -1.20034155e-01 2.87247357e-01 3.46581678e-02 -2.91391233e-01 E -1.41828417e-01 -1.02849154e-01 -2.22434220e-01 5.88958162e-02 E 5.10373849e-02 4.58067419e-01 -4.20339680e-01 3.77197946e-02 E -1.19250042e-01 1.89674423e-01 -3.08228692e-02 2.09467152e-01 E -2.95852048e-01 3.88376081e-02 -7.35402315e-02 2.08679192e-01] E [ -5.81460294e-02 -4.49176762e-01 1.06408912e-01 4.43941698e-01 E -4.43932980e-01 8.99339636e-02 -1.99292909e-01 -4.60542036e-01 E 1.91585641e-01 3.35075160e-01 -4.56597716e-01 1.85430017e-01 E 3.29358074e-01 1.29811129e-01 2.80559136e-01 3.70777734e-01 E 3.02516908e-01 -4.97077643e-03 -2.52632109e-02 4.46446732e-01]] xarray/testing.py:123: AssertionError _______________________ TestDataArray.test_groupby_math ________________________ self = def test_groupby_math(self): array = self.make_groupby_example_array() for squeeze in [True, False]: grouped = array.groupby('x', squeeze=squeeze) expected = array + array.coords['x'] actual = grouped + array.coords['x'] self.assertDataArrayIdentical(expected, actual) actual = array.coords['x'] + grouped self.assertDataArrayIdentical(expected, actual) ds = array.coords['x'].to_dataset('X') expected = array + ds actual = grouped + ds self.assertDatasetIdentical(expected, actual) actual = ds + grouped self.assertDatasetIdentical(expected, actual) grouped = array.groupby('abc') expected_agg = (grouped.mean() - 
np.arange(3)).rename(None) actual = grouped - DataArray(range(3), [('abc', ['a', 'b', 'c'])]) actual_agg = actual.groupby('abc').mean() > self.assertDataArrayAllClose(expected_agg, actual_agg) xarray/tests/test_dataarray.py:1541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ xarray/tests/__init__.py:169: in assertDataArrayAllClose assert_allclose(ar1, ar2, rtol=rtol, atol=atol) xarray/testing.py:125: in assert_allclose assert_allclose(a.variable, b.variable) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ a = array([ 0.504303, -0.53899 , -1.97593 ]) b = array([ 0.504303, -0.53899 , -0.175924]) rtol = 1e-05, atol = 1e-08, decode_bytes = True def assert_allclose(a, b, rtol=1e-05, atol=1e-08, decode_bytes=True): """"""Like :py:func:`numpy.testing.assert_allclose`, but for xarray objects. Raises an AssertionError if two objects are not equal up to desired tolerance. Parameters ---------- a : xarray.Dataset, xarray.DataArray or xarray.Variable The first object to compare. b : xarray.Dataset, xarray.DataArray or xarray.Variable The second object to compare. rtol : float, optional Relative tolerance. atol : float, optional Absolute tolerance. decode_bytes : bool, optional Whether byte dtypes should be decoded to strings as UTF-8 or not. This is useful for testing serialization methods on Python 3 that return saved strings as bytes. 
See also -------- assert_identical, assert_equal, numpy.testing.assert_allclose """""" import xarray as xr ___tracebackhide__ = True # noqa: F841 assert type(a) == type(b) if isinstance(a, xr.Variable): assert a.dims == b.dims allclose = _data_allclose_or_equiv(a.values, b.values, rtol=rtol, atol=atol, decode_bytes=decode_bytes) > assert allclose, '{}\n{}'.format(a.values, b.values) E AssertionError: [ 0.5043027 -0.53899037 -1.97592983] E [ 0.5043027 -0.53899037 -0.17592373] xarray/testing.py:123: AssertionError ----------------------------- Captured stderr call ----------------------------- /<>/.pybuild/pythonX.Y_2.7/build/xarray/tests/test_dataarray.py:1529: FutureWarning: the order of the arguments on DataArray.to_dataset has changed; you now need to supply ``name`` as a keyword argument ds = array.coords['x'].to_dataset('X') ________________________ TestDataArray.test_groupby_sum ________________________ self = def test_groupby_sum(self): array = self.make_groupby_example_array() grouped = array.groupby('abc') expected_sum_all = Dataset( {'foo': Variable(['abc'], np.array([self.x[:, :9].sum(), self.x[:, 10:].sum(), self.x[:, 9:10].sum()]).T), 'abc': Variable(['abc'], np.array(['a', 'b', 'c']))})['foo'] self.assertDataArrayAllClose(expected_sum_all, grouped.reduce(np.sum)) > self.assertDataArrayAllClose(expected_sum_all, grouped.sum()) xarray/tests/test_dataarray.py:1440: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ xarray/tests/__init__.py:169: in assertDataArrayAllClose assert_allclose(ar1, ar2, rtol=rtol, atol=atol) xarray/testing.py:125: in assert_allclose assert_allclose(a.variable, b.variable) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ a = array([ 45.861725, 46.894773, 4.057272]) b = array([ 45.861725, 46.894773, 0.877923]) rtol = 1e-05, atol = 1e-08, decode_bytes = True def assert_allclose(a, b, rtol=1e-05, atol=1e-08, decode_bytes=True): """"""Like 
:py:func:`numpy.testing.assert_allclose`, but for xarray objects. Raises an AssertionError if two objects are not equal up to desired tolerance. Parameters ---------- a : xarray.Dataset, xarray.DataArray or xarray.Variable The first object to compare. b : xarray.Dataset, xarray.DataArray or xarray.Variable The second object to compare. rtol : float, optional Relative tolerance. atol : float, optional Absolute tolerance. decode_bytes : bool, optional Whether byte dtypes should be decoded to strings as UTF-8 or not. This is useful for testing serialization methods on Python 3 that return saved strings as bytes. See also -------- assert_identical, assert_equal, numpy.testing.assert_allclose """""" import xarray as xr ___tracebackhide__ = True # noqa: F841 assert type(a) == type(b) if isinstance(a, xr.Variable): assert a.dims == b.dims allclose = _data_allclose_or_equiv(a.values, b.values, rtol=rtol, atol=atol, decode_bytes=decode_bytes) > assert allclose, '{}\n{}'.format(a.values, b.values) E AssertionError: [ 45.86172541 46.89477337 4.05727211] E [ 45.86172541 46.89477337 0.87792268] xarray/testing.py:123: AssertionError ============= 3 failed, 1161 passed, 341 skipped in 40.75 seconds ============== ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-273569412,https://api.github.com/repos/pydata/xarray/issues/1208,273569412,MDEyOklzc3VlQ29tbWVudDI3MzU2OTQxMg==,1217238,2017-01-18T19:06:58Z,2017-01-18T19:06:58Z,MEMBER,"OK, thanks for looking into this! On Wed, Jan 18, 2017 at 10:36 AM, Ghislain Antony Vaillant < notifications@github.com> wrote: > We'd need to wait for numpy-1.12.1 to be absolutely sure. I don't have > time to deploy a dev version of numpy to test.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-273561115,https://api.github.com/repos/pydata/xarray/issues/1208,273561115,MDEyOklzc3VlQ29tbWVudDI3MzU2MTExNQ==,1964655,2017-01-18T18:36:38Z,2017-01-18T18:36:38Z,CONTRIBUTOR,We'd need to wait for numpy-1.12.1 to be absolutely sure. I don't have time to deploy a dev version of numpy to test.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-273560457,https://api.github.com/repos/pydata/xarray/issues/1208,273560457,MDEyOklzc3VlQ29tbWVudDI3MzU2MDQ1Nw==,1217238,2017-01-18T18:34:05Z,2017-01-18T18:34:05Z,MEMBER,Were you able to verify that the xarray tests pass after the numpy fix?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-273559627,https://api.github.com/repos/pydata/xarray/issues/1208,273559627,MDEyOklzc3VlQ29tbWVudDI3MzU1OTYyNw==,1964655,2017-01-18T18:31:01Z,2017-01-18T18:31:01Z,CONTRIBUTOR,"It turned out to be a bug in numpy 1.12.0, fixed in 1.12.1, which made `bottleneck` fail. 
Closing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-272832666,https://api.github.com/repos/pydata/xarray/issues/1208,272832666,MDEyOklzc3VlQ29tbWVudDI3MjgzMjY2Ng==,1964655,2017-01-16T11:04:48Z,2017-01-16T11:04:48Z,CONTRIBUTOR,"Thanks, I'll iterate with the Debian maintainer of bottleneck.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727 https://github.com/pydata/xarray/issues/1208#issuecomment-272763117,https://api.github.com/repos/pydata/xarray/issues/1208,272763117,MDEyOklzc3VlQ29tbWVudDI3Mjc2MzExNw==,1217238,2017-01-16T02:58:06Z,2017-01-16T02:58:06Z,MEMBER,"Thanks for the report. My *guess* is that this is an issue with the bottleneck build -- the large float values (e.g., 1e+248) in the final tests suggest some sort of overflow and/or memory corruption. The values summed in these tests are random numbers between 0 and 1. Unfortunately, I can't reproduce this locally using the conda build of bottleneck 1.2.0 on OS X, and our build on Travis-CI (using Ubuntu and conda) is also succeeding. Do you have any more specific details describing your test setup, other than using the pre-built bottleneck 1.2.0 package? If my hypothesis is correct, this test on bottleneck might trigger a test failure in the Ubuntu build process (but it passed in bottleneck's tests on Travis-CI): https://github.com/kwgoodman/bottleneck/compare/master...shoyer:possible-reduce-bug?expand=1#diff-a0a3ffc22e0a63118ba4a18e4ab845fc ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,200908727
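Following up on the overflow hypothesis above: one cheap local check is to hand bottleneck's reductions a variety of strided, non-contiguous views of random data in [0, 1) and compare against NumPy, since any result far outside [0, size] (like the 1e+248 values in the failing log) would point at the suspected corruption. A minimal sketch; the choice of views is my own guess at layouts worth probing, and the fallback means it degrades to comparing NumPy with itself when bottleneck is absent:

```python
import numpy as np

try:
    import bottleneck as bn
    nansum = bn.nansum
except ImportError:
    nansum = np.nansum           # without bottleneck the check is trivially true

rng = np.random.RandomState(0)
base = rng.rand(10, 20)          # values in [0, 1), like the failing tests

# Views with assorted shapes and strides, including non-contiguous ones.
views = [
    base[:, 9:10],               # single column, large row stride
    base[:, :9],
    base[:, 10:],
    base.T,                      # transposed strides
    base[::2, ::3],              # stepped strides
]

for v in views:
    got = nansum(v)
    # A healthy build stays within [0, v.size] and matches NumPy
    # up to floating-point tolerance.
    assert 0.0 <= got <= v.size, got
    np.testing.assert_allclose(got, np.nansum(v))
```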