sha,message,author_date,committer_date,raw_author,raw_committer,repo,author,committer
4f5ca73cd922b3c08cb30a34795e18957d0926ac,"Make concat more forgiving with variables that are being merged. (#3364) * Make concat more forgiving with variables that are being merged. * rename test. * simplify test. * make diff smaller.",2019-10-14T18:06:53Z,2019-10-14T18:06:53Z,0c7e9e762dbfd6554e60c953bf27493047d95109,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,2448579,
291cb805bf0bf326da87152cc6548191bbeb6aab,"Add groupby.dims & Fix groupby reduce for DataArray (#3338) * Fix groupby reduce for DataArray * bugfix. * another bugfix. * bugfix unique_and_monotonic for object indexes (uniqueness is enough) * Add groupby.dims property. * update reduce docstring to point to xarray.ALL_DIMS * test for object index dims. * test reduce dimensions error. * Add whats-new * fix docs build * sq whats-new * one more test. * fix test. * undo monotonic change. * Add dimensions to repr. * Raise error if no bins. * Raise nice error if no groups were formed. * Some more error raising and testing. * Add dataset tests. * update whats-new. * fix tests. * make dims a cached lazy property. * fix whats-new. * whitespace * fix whats-new",2019-10-10T18:23:20Z,2019-10-10T18:23:20Z,0c7e9e762dbfd6554e60c953bf27493047d95109,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,2448579,
3f0049ffc51e4c709256cf174c435f741370148d,"Speed up isel and __getitem__ (#3375) * Variable.isel cleanup/speedup * Dataset.isel code cleanup * Speed up isel * What's New * Better error checks * Speedup * type annotations * Update doc/whats-new.rst Co-Authored-By: Maximilian Roos <5635139+max-sixty@users.noreply.github.com> * What's New * What's New * Always shallow-copy variables",2019-10-09T18:01:29Z,2019-10-09T18:01:29Z,2ff6d9e74ee03928d143f0bb1557924a28d3b23d,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,6213168,
132733a917171fcb1f269406eb9e6668cbb7e376,"Fix concat bug when concatenating unlabeled dimensions. 
(#3362) * Fix concat bug when concatenating unlabeled dimensions. * Add whats-new * Add back older test. * fix test * Revert ""fix test"" This reverts commit c33ca34a012c97c82be278fb0b8c1aeb000a284d. * better fix",2019-10-08T22:13:47Z,2019-10-08T22:13:47Z,0c7e9e762dbfd6554e60c953bf27493047d95109,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,2448579,
6fb272c0fde4bfaca9b6322b18ac2cf962e26ee3,"Rolling minimum dependency versions policy (#3358) * - Downgrade numpy to 1.14, pandas to 0.20, scipy to 0.19 (24 months old) - Downgrade dask to 1.1 (6 months old) - Don't pin patch versions * Apply rolling policy (see #3222) * Automated tool to verify the minimum versions * Drop Python 3.5 * lint * Trivial cosmetic * Cosmetic * (temp) debug CI failure * Parallelize versions check script * Remove hacks for legacy dask * Documentation * Assorted cleanup * Assorted cleanup * Fix regression * Cleanup * type annotations upgraded to Python 3.6 * count_not_none backport * pd.Index.equals on legacy pandas returned False when comparing vs. 
a ndarray * Documentation * pathlib cleanup * Slide deprecations from 0.14 to 0.15 * More cleanups * More cleanups * Fix min_deps_check * Fix min_deps_check * Set policy of 12 months for pandas and scipy * Cleanup * Cleanup * Sphinx fix * Overhaul readthedocs environment * Fix test crash * Fix test crash * Prune readthedocs environment * Cleanup * Hack around versioneer bug on readthedocs CI * Code review * Prevent random timeouts in the readthedocs CI * What's New polish * Merge from Master * Trivial cosmetic * Reimplement pandas.core.common.count_not_none",2019-10-08T21:23:46Z,2019-10-08T21:23:46Z,2ff6d9e74ee03928d143f0bb1557924a28d3b23d,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,6213168,
1ce91051e3751a65dbbbc7c5ff3e1a2f00ea6ee5,Fix DataArray api doc (#3309),2019-09-15T20:27:30Z,2019-09-15T20:27:30Z,07354799b3d1efde7621fcd783157c7e32436919,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,23618263,
053aed1f76b7ccbac86078475b61c23182ace982,"Reenable cross engine read write netCDF test (#2739) Fixes https://github.com/pydata/xarray/issues/2050 I'm not quite sure what was going on, but it passes now.",2019-02-04T04:42:16Z,2019-02-04T04:42:16Z,f10b21bed2846b879806f87039b77245b18e7671,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,1217238,
882deac6a38078a64032f88d4785567fd9be8d56,DOC: refresh whats-new for 0.11.3 / 0.12.0 (#2718),2019-01-27T17:11:47Z,2019-01-27T17:11:47Z,f10b21bed2846b879806f87039b77245b18e7671,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,1217238,
ddacf405fb256714ce01e1c4c464f829e1cc5058,"Update indexing.rst (#2700) Grammar error",2019-01-23T21:09:41Z,2019-01-23T21:09:41Z,85870753baa10542dc863b9b18125d3dd5515882,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,32141163,
70e9eb8fc834e4aeff42c221c04c9713eb465b8a,"add missing , and article in error message (#2557) Add missing , and article in error message when attribute values have the wrong 
type.",2018-11-16T16:40:02Z,2018-11-16T16:40:02Z,a603db48d9c38ebfa866bb90af2d40db4fc4ca3d,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,500246,
eece515aaa1bd9ec78dffe47d8d839cd69515b56,Drop the hack needed to use CachingFileManager as we don't use it anymore. (#2544),2018-11-06T16:23:16Z,2018-11-06T16:23:16Z,43536d21a9a3f92bdd2ea532482e40190547761a,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,226037,
ecee9a0fe01db13bce1e234519614aeed53a7f07,DOC: move xskillscore to 'Extend xarray capabilities' (#2387),2018-08-28T18:11:55Z,2018-08-28T18:11:55Z,3238b5518bbee4b41d6a5d78325e5a83ec253e79,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
fe99a22ca7bcb1f854c22f5f6894d3c5d40774a6,Mark some tests related to cdat-lite as xfail (#2354),2018-08-10T16:09:29Z,2018-08-10T16:09:29Z,caabe6633d9bef7f2f52807cc5a1f477f3a077d4,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,6815844,
ebe0dd03187a5c3138ea12ca4beb13643679fe21,"CFTimeIndex (#1252) * Start on implementing and testing NetCDFTimeIndex * TST Move to using pytest fixtures to structure tests * Address initial review comments * Address second round of review comments * Fix failing python3 tests * Match test method name to method name * First attempts at integrating NetCDFTimeIndex into xarray This is a first pass at the following: - Resetting the logic for decoding datetimes such that `np.datetime64` objects are never used for non-standard calendars - Adding logic to use a `NetCDFTimeIndex` whenever `netcdftime.datetime` objects are used in an array being cast as an index (so if one reads in a Dataset from a netCDF file or creates one in Python, which is indexed by a time coordinate that uses `netcdftime.datetime` objects a NetCDFTimeIndex will be used rather than a generic object-based index) - Adding logic to encode `netcdftime.datetime` objects when saving out to netCDF files * Cleanup * Fix DataFrame and Series test failures for NetCDFTimeIndex These were related to a recent minor upstream change in pandas: 
https://github.com/pandas-dev/pandas/blame/master/pandas/core/indexing.py#L1433 * First pass at making NetCDFTimeIndex compatible with #1356 * Address initial review comments * Restore test_conventions.py * Fix failing test in test_utils.py * flake8 * Update for standalone netcdftime * Address stickler-ci comments * Skip test_format_netcdftime_datetime if netcdftime not installed * A start on documentation * Fix failing zarr tests related to netcdftime encoding * Simplify test_decode_standard_calendar_single_element_non_ns_range * Address a couple review comments * Use else clause in _maybe_cast_to_netcdftimeindex * Start on adding enable_netcdftimeindex option * Continue parametrizing tests in test_coding_times.py * Update time-series.rst for enable_netcdftimeindex option * Use :py:func: in rst for xarray.set_options * Add a what's new entry and test that resample raises a TypeError * Move what's new entry to the version 0.10.3 section * Add version-dependent pathway for importing netcdftime.datetime * Make NetCDFTimeIndex and date decoding/encoding compatible with datetime.datetime * Remove logic to make NetCDFTimeIndex compatible with datetime.datetime * Documentation edits * Ensure proper enable_netcdftimeindex option is used under lazy decoding Prior to this, opening a dataset with enable_netcdftimeindex set to True and then accessing one of its variables outside the context manager would lead to it being decoded with the default enable_netcdftimeindex (which is False). This makes sure that lazy decoding takes into account the context under which it was called. 
* Add fix and test for concatenating variables with a NetCDFTimeIndex Previously when concatenating variables indexed by a NetCDFTimeIndex the index would be wrongly converted to a generic pd.Index * Further namespace changes due to netcdftime/cftime renaming * NetCDFTimeIndex -> CFTimeIndex * Documentation updates * Only allow use of CFTimeIndex when using the standalone cftime Also only allow for serialization of cftime.datetime objects when using the standalone cftime package. * Fix errant what's new changes * flake8 * Fix skip logic in test_cftimeindex.py * Use only_use_cftime_datetimes option in num2date * Require standalone cftime library for all new functionality Add tests/fixes for dt accessor with cftime datetimes * Improve skipping logic in test_cftimeindex.py * Fix skipping logic in test_cftimeindex.py for when cftime or netcdftime are not available. Use existing requires_cftime decorator where possible (i.e. only on tests that are not parametrized via pytest.mark.parametrize) * Fix skip logic in Python 3.4 build for test_cftimeindex.py * Improve error messages when for when the standalone cftime is not installed * Tweak skip logic in test_accessors.py * flake8 * Address review comments * Temporarily remove cftime from py27 build environment on windows * flake8 * Install cftime via pip for Python 2.7 on Windows * flake8 * Remove unnecessary new lines; simplify _maybe_cast_to_cftimeindex * Restore test case for #2002 in test_coding_times.py I must have inadvertently removed it during a merge. 
* Tweak dates out of range warning logic slightly to preserve current default * Address review comments",2018-05-13T05:19:09Z,2018-05-13T05:19:09Z,f04422bfa03dadc75fab6ce64de238fbe48f5be4,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,6628425,
c046528522a6d4cf18c81a19aeae82f5f7d63d34,DOC: Update link to documentation of Rasterio (#2110),2018-05-09T15:28:32Z,2018-05-09T15:28:31Z,f09379ef63029a719ce2b193d06b1c06792e4338,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,4320866,
9261601f89c0d3cfc54db16718c82399d95266bd,"Remove note about xray from docs index (#2000) We made this change several years ago now -- it's no longer timely news to share with users.",2018-03-22T04:36:55Z,2018-03-22T04:36:55Z,f10b21bed2846b879806f87039b77245b18e7671,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,1217238,
55128aac84cf59906dac1cb3ea4bd643879a5463,"Check for minimum Zarr version. (#1960) * Check for minimum Zarr version. ws * Exclude version check from coverage. additional ws.",2018-03-06T19:00:27Z,2018-03-06T19:00:27Z,83204c1f5a757fca67372ecf672c454e1863946a,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,9453271,
697cc74b9af5fbfedadd54fd07019ce7684553ec,"Use requires_netcdftime decorators in test_coding_times.py (#1929) * Switch requires_netCDF4 decorators to requires_netcdftime decorators in test_coding_times.py * Fix 'netcdftime' has no attribute 'netcdftime' error",2018-02-21T06:18:34Z,2018-02-21T06:18:34Z,f04422bfa03dadc75fab6ce64de238fbe48f5be4,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,6628425,
d191352b6c1e15a2b6105b4b76552fe974231396,"Adding .stickler.yml configuration file (#1913) * Adding .stickler.yml * Add max line length and py3k",2018-02-15T23:21:28Z,2018-02-15T23:21:28Z,cd5db554c2897b94311505ebea23941ed945586b,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,16011037,
65e5f05938dc40c6e169377f8c0b6e7774d96866,"Read small integers as float32, not float64 (#1840) * Read small integers as float32, not float64 AKA the ""I just wasted 4.6 TB of memory"" 
patch. * Tests now include a clean flake8 run",2018-01-23T20:15:28Z,2018-01-23T20:15:28Z,3ce4a2c03644349db2459c99f8b3abfb25e90fed,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,12229877,
6aa225f5dae9cc997e232c11a63072923c8c0238,"Normalisation for RGB imshow (#1819) * Normalisation for RGB imshow * Add test for error checking",2018-01-19T05:01:06Z,2018-01-19T05:01:06Z,3ce4a2c03644349db2459c99f8b3abfb25e90fed,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,12229877,
502a988ad5b87b9f3aeec3033bf55c71272e1053,"pandas casting issues (#1734) * pandas casting issues * update * single value * whats-new * add credit for 0x0L doc updates",2018-01-11T21:24:42Z,2018-01-11T21:24:42Z,d306fd639d8bc55b874f7d03c7e8a00d11e354b5,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,3621629,
5a6dea40164b5c47a78d62dce77a317b39ef2c9c,"Fix multidimensional coordinates (#1768) * Add test case for 1763. * Add whitespace and wrap long lines to pass flake8 tests. * Fix test case - time should not be a coordinate! * When encoding coordinates, change the criteria so that the dimensions of the coordinate must be a subset (or equal to) the dimensions of the variable. * Add note about bug fix (#1763) to the docs.",2018-01-11T16:54:47Z,2018-01-11T16:54:47Z,0e62f35c0b2e20af1a6b53b9920edbdbb3de7941,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,1554921,
50b0a69a7aa0fb7ac3afb28e7bd971cf08055f99,"Switch (some) coding/encoding in conventions.py to use xarray.coding. (#1803) * Switch (some) coding/encoding in conventions.py to use xarray.coding. The goal here is to eventually convert everything in xarray.conventions to using the new coding module, which is more modular and supports dask arrays. For now, I have switched over datetime, timedelta, unsigned integer, scaling and mask coding to use new coders. Integrating these into xarray.conventions lets us harness our existing test suite and delete a lot of redundant code. Most of the code/tests is simply reorganized. 
There should be no changes to public API (to keep this manageable for review). All of the original tests that are still relevant should still be present, though I have reorganized many of them into new locations to match the revised code. * Fix zarr and cmds export * add whats-new and small cleanup * Move constant to top of module * use _NS_PER_TIME_DELTA",2018-01-11T16:53:08Z,2018-01-11T16:53:08Z,f10b21bed2846b879806f87039b77245b18e7671,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,1217238,
1a012080e0910f3295d0fc26806ae18885f56751,"Fix unexpected behavior of .set_index() since pandas 0.21.0 (#1723) * fix set_index behavior using pandas 0.21.0 * review comments",2017-11-17T00:54:50Z,2017-11-17T00:54:50Z,5d060630fae3d5a365a239dae12fe85ceed98a8d,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,4160723,
9d8ec381d961e76c13f08b5ef741e43a0ecc9a73,Fix typo in np.ix_ function name (#1714),2017-11-13T16:07:26Z,2017-11-13T16:07:26Z,fc4d8d115e5e903228bc82a5759ce0b81be5ff3c,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,374657,
7e9193c0093e0accfbef2312db8fabf5fd4eaa8b,"Tweak to to_dask_dataframe() (#1667) * Tweak to to_dask_dataframe() - Add a `dim_order` argument - Always write columns for each dimension - Docstring to NumPy format * Fix windows test failure * More windows failure * Fix failing test * Use da.arange() inside to_dask_dataframe",2017-10-31T03:19:20Z,2017-10-31T03:19:20Z,f10b21bed2846b879806f87039b77245b18e7671,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,1217238,
20f9ffd86d2ca05c6fcf5285e40dd61c9788df49,"Remove Gitter links (#1673) Nobody consistently monitors Gitter, so it's not a good place to suggest for reaching out.",2017-10-31T03:15:49Z,2017-10-31T03:15:49Z,f10b21bed2846b879806f87039b77245b18e7671,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,1217238,
53b1e4a56ce8ea13d5d5b3ea3f8c432c0206b3de,fix bad merge resolution,2017-10-29T06:31:55Z,2017-10-29T06:31:55Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
bd682224a57cb383993c7653957cc5c1276dde9f,allow cleanup failures on scipy/netcdf4 test,2017-10-29T05:56:19Z,2017-10-29T05:56:19Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
5e61d8c5eba9c5bfeb417215a1a461c72a5c31db,fix merge conflicts,2017-10-28T01:37:19Z,2017-10-28T01:37:19Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
d70258a77fa3e0207c8ce643e9b2121fe286410e,add whats new note,2017-10-28T01:32:26Z,2017-10-28T01:32:26Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
8311f0b5f50d999aca93e536e9068d974d27b591,Merge branch 'master' of github.com:pydata/xarray into fix/1652,2017-10-27T23:43:29Z,2017-10-27T23:43:29Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
0a79e83821445ae250ee0ff5cff260c25bda2668,remove extra np.errstate(all=ignore) context managers,2017-10-27T23:35:42Z,2017-10-27T23:35:42Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
ed838f68a9efd2ab5a47f796d54f5890a1ab45f8,remove pytest command line argument that raises errors when warnings are issued,2017-10-27T19:48:54Z,2017-10-27T19:48:54Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
8d775bb5b3bd20a0f90872218567ffd2f9177bbc,fix some numpy warnings using numpy.seterr,2017-10-25T16:25:54Z,2017-10-25T16:25:54Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
df2d3efe63b1634714dbbf1a4ca4271311e87dab,more warnings fixes,2017-10-25T05:57:09Z,2017-10-25T05:57:09Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
4ba28bb44632b7fe31784fbbbb55bc292e4effd3,fixes for warnings related to unit tests,2017-10-25T04:33:57Z,2017-10-25T04:33:57Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
09101d63db0ece7e66b0fcee8082a13783733708,fix for pynio,2017-10-25T02:54:23Z,2017-10-25T02:54:23Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
0b1efd22f03510fd33c762f35d32a508f96a6ece,fix multiple kwargs,2017-10-24T22:48:19Z,2017-10-24T22:48:19Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
92c49a89ac926f35723fdbc0af5054dbc0f2f87d,mods for pynio,2017-10-24T22:27:20Z,2017-10-24T22:27:20Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
5a68bc59e3341be7ea638be2592ef3fba34bdd3c,refactor roundtrip tests,2017-10-24T22:06:10Z,2017-10-24T22:06:10Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
583b8e80bfe399144ab4f055ae15a04e58d65a2f,"Fix rasterio backend for rasterio=1.0a10 (#1642) * Fix window parameter for rasterio=1.0a10 Latest alpha, and future 1.0 release, need either a Window (a new class) or a tuple (gets converted to Window). 
Current behavior using a list bypasses the conversion to Window that happens for tuples * Test against rasterio 1.0 alpha as allowed failure * Add fix to whats-new",2017-10-24T20:53:09Z,2017-10-24T20:53:09Z,7d1b627df8c17aae344b193f1b8d9003f3c5abf5,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,3585769,
6ea50425f9a38bc382b39b1884ef2ec0d897dde2,pep8,2017-10-24T17:31:02Z,2017-10-24T17:31:02Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
6347cadd943d2f64047683b1e9707b7d6b711b0e,one more fix,2017-10-24T17:25:22Z,2017-10-24T17:25:22Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
d8019409bea645223628645131efae64821410a0,add engine attribute to test cases,2017-10-24T17:17:03Z,2017-10-24T17:17:03Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
6b4a30d0c04d4c4dcb0a057d5d9f55373c2aa085,cleanup tests part 1,2017-10-24T05:49:20Z,2017-10-24T05:51:28Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
faa50987e74dc7eb4f1f9ce05c8239cee906ffa0,remove check for append mode,2017-10-24T05:36:43Z,2017-10-24T05:36:43Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
04671e6f0c26f09943c4a58a92e04056ca6d8d47,Merge branch 'master' of github.com:pydata/xarray into fix/1215,2017-10-24T05:34:12Z,2017-10-24T05:34:12Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
d3e2b97cacd7023972e703c47c9dba505541dea1,doc fixes,2017-10-22T19:00:16Z,2017-10-22T19:00:16Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
0418264df2be7dbe98c43cafaac9acbf9e933c88,assert_allclose for scipy backend,2017-10-21T06:13:59Z,2017-10-21T06:13:59Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
1782108972712394a94835c92dbdc06e1871fc84,more tests and docs,2017-10-21T04:29:24Z,2017-10-21T04:33:58Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
779a4b1ed1e58de0532703bea8d7186676f4daab,Merge branch 'master' of github.com:pydata/xarray into fix/1215,2017-10-21T02:10:39Z,2017-10-21T02:10:39Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
3625035f740807836b5ff734392130aa1ae6db9b,overwrite existing vars,2017-10-18T23:15:41Z,2017-10-18T23:15:41Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
d13d48c1309fb836485a1bf58d87a3dac1031828,Merge branch 'master' of github.com:pydata/xarray into fix/1215,2017-10-18T22:38:42Z,2017-10-18T22:38:42Z,5f199557d0f8f69fbea5e027a407146e2669a812,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
27132fba1e465955e76ed155ba6cbb769c0904df,"data_vars option added to open_mfdataset (#1580) * add data_vars option to open_mfdataset * use single quotes * fix the 'line too long' warning from flake8 * document the data_vars keyword for open_mfdataset * improve the data_vars record in whats-new * update my name in wats-new.rst * Start writing the test for the data_vars keyword * use the data_vars keyword in combine * address flake8 warnings for test_backend.py * ignore flake8 warnings concerning whats-new.rst * fix function reference in whats-new.rst * open_mfdataset does not accept dim keyword argument * use single quotes for strings in the added tests * refactor data_vars related tests * Use with for opening mfdataset in data_vars related tests * add @requires_scipy_or_netCDF4 to the data_vars test class * address flake8 warnings about long lines in the data_vars related tests. 
* close opened datasets in case of a ValueError in open_mfdataset, seems important for Windows * fix line too long warnings from flake8 * refactor tests and open_mfdataset, to address comments * refactor tests for data_vars keyword in open_mfdataset * refactor to address flake8 warnings * add another example of data_vars usage in open_mfdataset * add coords keyword to open_mfdataset * add a memory and performance related observations to the whats-new and modify code snippets to use single quotes for consistency. * fixed a grammar mistake * quote variable names referenced in the text * add tests for coords keyword in the open_mfdataset, along with the similar tests for the data_vars keyword. * split a test into 2 to simplify, introduce context manager for setting up test inputs in OpenMFDatasetWithDataVarsAndCoordsKwTest",2017-10-10T20:51:18Z,2017-10-10T20:51:18Z,85c2d308af157d6c3d19c6d01de6605af4718a3c,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,900941,
14b5f1ca84b41696b7e8e3ec5e07d8064acc9a55,"Load nonindex coords ahead of concat() (#1551) * Load non-index coords to memory ahead of concat * Update unit test after #1522 * Minimise loads on concat. Extend new concat logic to data_vars. * Trivial tweaks * Added unit tests Fix loads when vars are found different halfway through * Add xfail for #1586 * Revert ""Add xfail for #1586"" This reverts commit f99313cc7b266964cb540c4dbd858f701461eb2b.",2017-10-09T21:15:30Z,2017-10-09T21:15:30Z,2ff6d9e74ee03928d143f0bb1557924a28d3b23d,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,6213168,
5bd40151fd92e4342c43ba7cd6039e63e1611812,"Fix py3 pynio backend (#1612) * Update pynio backend for python 3 Update pynio backend to work with python 3. Previous version accessed the iteritems() method of of the Nio variables object, which no longer exists in the python 3 version. Instead, access the items() method here. * Add tests for pynio python3 support Add tests for development version of pynio with python3 support. 
There is currently no official release version of pynio with python3 support, so we need to pull in the development version from the ncar conda channel to use pynio in python 3 environments and to run the pynio tests in the test suite. Also fixes a bug in the test initialization in which the presence of the pynio library was tested by trying to import ""pynio"" instead of ""Nio"" (the actual name of the library). * Add notes to bug fixes documenting changes. Add notes to big fixes section in documentation to highlight changes. * Fix requirements for py36-pynio-dev test Fix requirements for py36-pynio-dev test to remove unnecessary option. * Fix py36-pynio tests Added tests were only specified in the allowed failures and needed to be added to the main include block as well.",2017-10-06T02:05:52Z,2017-10-06T02:05:52Z,c089e069ffcc15b5d807374b149e8aca6c90b7f6,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,15826727,
dc7d733bcc10ce935304d65d03124471661243a3,"Groupby-like API for resampling (#1272) * Implement basic functionality by adding ""DataArrayResample"" and ""DatasetResample"" subclasses of their equivalent ""GroupBy"" implementation * Re-factor old resample logic to separate method * Adding test cases for new api * Adding more DataArray resample tests * Adding test_dataset test cases * Update to use proxy __resample_dim__ on resampling ops * Re-factor proxy resampling dimension and catch rare error if used as actual resampling dimension name * BUG: Fixed loss of attrs on DatasetResample even when keep_attrs=True * Update docs to add information about new resampling api * Adding 'Whats new' entry * Tweak auxiliary groupby apply and reduce methods * Squash bugs from py.test following rebase to master Un-do _combine temporary debugging output * Fixing typo in groupby.py * Add a test for resampling which includes timezones in the datetime stamps * Rolling back the timezone tests... 
this will need to be tackled separately * Re-factored resample into it's own module for easier maintenance * Adding support for 'count' via old api * Re-organizing new vs old resampling api tests * Expanded old-vs-new api tests for Dataset to replace deprecated tests * Consolidated old reasmpling api tests into new-vs-old for dataarray * Wrapping old api test invocations with pytest.warns * Added stub tests for upsampling * Update documentation with upsampling stub * Factor out a Resample object and add initial up-sampling methods - bfill, pad, asfreq * Add interpolation up-sampling to DataArray * Refine DataArray upsampling interpolation and extend to Dataset * Fix wrong time dimension length on test cases for upsampling * First initial revisions to @shoyer's comments; before modifying implementation of DataArrayResample * Tweaks to resample.py to lean on super-methods * Implementing interpolation test cases * BUG: Fix asfreq only returning 1D data in nd case * Add pad/asfreq upsampling tests * Add a check if old/new api is mixed; remove old api details from resample docstring * Fix an old bug in datetime components of timeseries doc * Tweaking time-series doc * Added what's new entry * Drop existing non-dimension coordinates along resample dimension * Update seaborn to v0.8 to fix issues with default plot styles * nearest-neighbor up-sampling now relies on re-index instead of interpolate * Adding nearest upsampling test; tweaked inference of re-indexing dimensions to avoid always having to compute means * Move what's new entry to breaking changes * Updating docs in breaking changes with example and link to timeseries page * BUG: Fixing creating merged coordinates for Dataset upsampling case * Remove old notice about resampling api updates * Applying shoyer's clean-up of figuring out valid coords on Dataset up-sampling * Add note about monotonicity assumption before interpolation * fix some pep8 and comments * More informative error message when 
resampling/interpolating dask arrays * Fix flake8 * Fixing issues with test cases, including adding one for upsampling dask arrays * Clean up scipy imports * Adding additional tweaks to cover scipy/numpy dependencies in tests",2017-09-22T16:27:35Z,2017-09-22T16:27:35Z,2db87938e25c0bd59007523c53fe1e7aff827892,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,,
a993317be46e6cba96424faa9fbcc54d3753d571,Uint support in reduce methods with skipna (#1564),2017-09-08T16:12:22Z,2017-09-08T16:12:22Z,caabe6633d9bef7f2f52807cc5a1f477f3a077d4,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,6815844,
78ca20a6ea1a42eb637ae2ef09189f481cfda9a2,"Fixed issue #1520, by adding a if-else that checks for None (#1526) * Fixed issue #1520, by adding a if-else that checks for None * Changed the read_rasterio to skip CRS entirely if it is None * Added whats new comment on fix of issue #1520 * Resolved a conflict with master branch. * Deleted a line change from whats new",2017-09-01T17:58:41Z,2017-09-01T17:58:41Z,f7c95a479f65ab37e8e9d5e51694e45a5bebb046,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,25015426,
8e541deca2e20efe080aa1bca566d9966ea2f244,"Added show_commit_url to asv.conf (#1515) * Added show_commit_url to asv.conf This should setup the proper links from the published output to the commit on Github. FYI the benchmarks should be running stably now, and posted to http://pandas.pydata.org/speed/xarray. http://pandas.pydata.org/speed/xarray/regressions.xml has an RSS feed to the regressions. 
* Update asv.conf.json",2017-08-23T16:01:49Z,2017-08-23T16:01:49Z,414a3ca56e5eb92bdfc6b3cac35417bf5ba51f54,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,1312546,
dbf9307db5ce51ef9e59ccc78387ac7ceda6b1b8,"Fix a bug in assert_allclose where rtol and atol were ignored (#1488) * Fix a bug in assert_allclose where rtol and atol were ignored * Add regression test",2017-07-27T19:57:29Z,2017-07-27T19:57:29Z,f10b21bed2846b879806f87039b77245b18e7671,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,1217238,
d275ad6df25457b53a594953f45b252d14260115,"Speed up `decode_cf_datetime` (#1414) * Speed up `decode_cf_datetime` Instead of casting the input numeric dates to float, they are casted to nanoseconds as integer which makes `pd.to_timedelta()` work much faster (x100 speedup on my machine) * Moved _NS_PER_TIME_DELTA to top of module file * Added entry to `whats-new.rst`",2017-07-25T17:42:51Z,2017-07-25T17:42:51Z,3635471f9c1d642795b95b75219f5a713b279868,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,102827,
5e50c0dc4d0e8238437963cd79d31daaddd41cd8,"Restored dim order in DataArray.rolling().reduce() (#1277) Restoring dim order in rolling.reduce().",2017-02-27T17:11:01Z,2017-02-27T17:11:01Z,caabe6633d9bef7f2f52807cc5a1f477f3a077d4,5f199557d0f8f69fbea5e027a407146e2669a812,13221727,6815844,