id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,active_lock_reason,draft,pull_request,body,reactions,performed_via_github_app,state_reason,repo,type
1377097243,PR_kwDOAMm_X84_J8JL,7051,Add parse_dims func,43316012,closed,0,,,6,2022-09-18T15:36:59Z,2022-12-08T20:10:01Z,2022-11-30T23:36:33Z,COLLABORATOR,,0,pydata/xarray/pulls/7051,"This PR adds a `utils.parse_dims` function for parsing one or more dimensions. Currently, every function that accepts multiple dimensions does this parsing itself. I decided to first see whether it would be useful to centralize the dimension parsing, and to collect input, before adding it to other functions.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7051/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
1421441672,PR_kwDOAMm_X85BcmP0,7209,Optimize some copying,43316012,closed,0,,,8,2022-10-24T21:00:21Z,2022-12-08T20:09:49Z,2022-11-30T23:36:56Z,COLLABORATOR,,0,pydata/xarray/pulls/7209,"- [x] Potentially closes #7181 - [x] Tests added - [ ] User visible changes (including notable bug fixes) are documented in `whats-new.rst` I have passed along some more memo dicts, which could prevent some double deep-copying of the same data (I don't know exactly where, but who knows). I have also found some copy calls that did not pass along the `deep` argument (I am not sure whether that breaks anything; let's find out), and finally some places where shallow copies are enough. Altogether this should improve performance considerably when copying things around.","{""url"": ""https://api.github.com/repos/pydata/xarray/issues/7209/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,,13221727,pull
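
The first PR above (#7051) describes a centralized helper for normalizing a user-supplied dimension argument. The following is only a minimal sketch of what such a helper could look like; the signature, the `check_exists` flag, and the example dimension names are assumptions for illustration and not xarray's actual `utils.parse_dims` implementation:

```python
from __future__ import annotations

from collections.abc import Hashable, Iterable


def parse_dims(
    dim: str | Iterable[Hashable] | None,
    all_dims: tuple[Hashable, ...],
    *,
    check_exists: bool = True,  # hypothetical option, assumed for this sketch
) -> tuple[Hashable, ...]:
    """Normalize a user-supplied dimension argument to a tuple of dims.

    ``None`` (or ``...``) means "all dimensions"; a single string is treated
    as one dimension name; any other iterable is converted to a tuple.
    """
    if dim is None or dim is ...:
        return all_dims
    if isinstance(dim, str):
        dim = (dim,)
    dims = tuple(dim)
    if check_exists:
        missing = [d for d in dims if d not in all_dims]
        if missing:
            raise ValueError(
                f"Dimensions {missing} do not exist. Expected one or more of {all_dims}"
            )
    return dims


# Example usage with made-up dimension names:
print(parse_dims("time", ("time", "x", "y")))        # ('time',)
print(parse_dims(None, ("time", "x", "y")))           # ('time', 'x', 'y')
print(parse_dims(["x", "y"], ("time", "x", "y")))     # ('x', 'y')
```

Centralizing this logic would let every multi-dimension API accept the same input forms (single name, list of names, or "all dims") and raise a consistent error for unknown dimensions.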
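The second PR (#7209) mentions passing memo dicts along so the same data is not deep-copied twice. Below is a self-contained sketch of that memoization idea using Python's standard `copy.deepcopy`; the `Variable` container and its attributes are made up for illustration and are not xarray's real copy code:

```python
import copy

import numpy as np


class Variable:
    """Toy container holding a NumPy array, standing in for a real data object."""

    def __init__(self, data: np.ndarray):
        self.data = data

    def copy(self, deep: bool = True, memo: dict | None = None) -> "Variable":
        # Passing ``memo`` along means that if the same underlying array is
        # reached more than once during one deep-copy pass, it is only
        # duplicated a single time.
        data = copy.deepcopy(self.data, memo) if deep else self.data
        return Variable(data)


arr = np.arange(5)
a, b = Variable(arr), Variable(arr)  # two variables sharing one array

memo: dict = {}
a2 = a.copy(deep=True, memo=memo)
b2 = b.copy(deep=True, memo=memo)
print(a2.data is b2.data)  # True: the shared array was deep-copied only once

# Without a shared memo dict, each call deep-copies the array separately:
print(a.copy(deep=True).data is b.copy(deep=True).data)  # False
```

Respecting the `deep` argument (and taking shallow copies where they suffice) avoids the array duplication entirely, which is where most of the copying cost comes from.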