html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/349#issuecomment-91445951,https://api.github.com/repos/pydata/xarray/issues/349,91445951,MDEyOklzc3VlQ29tbWVudDkxNDQ1OTUx,1217238,2015-04-10T06:15:48Z,2015-04-10T06:16:02Z,MEMBER,"You can view the rendered docs on readthedocs, even for the dev version:
http://xray.readthedocs.org/en/latest/io.html#combining-multiple-files
open_mfdataset is not quite ready for prime time -- it needs better documentation, and the library that powers it (dask) has a few annoying bugs that will hopefully be fixed soon. I can't offer any guarantees, but if you want to give it a try (you'll need to install the development version of dask), let me know how it goes. I'll release a new version of xray once those dask fixes go in, probably within the next week or two.
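In the meantime, here is a minimal sketch of what the call looks like. The file names and variable are made up, the snippet writes toy files to a temp directory, and `import xarray as xray` uses the project's later name so it runs against the current package; it also assumes dask and a netCDF backend are installed:

``` python
import os
import tempfile
import xarray as xray  # the xray project was later renamed xarray

# write two tiny single-year files standing in for the real data
tmp = tempfile.mkdtemp()
paths = []
for year, value in [(1979, 1.0), (1980, 2.0)]:
    path = os.path.join(tmp, 'year_%d.nc' % year)
    xray.Dataset({'t2m': ('time', [value])},
                 coords={'time': [year]}).to_netcdf(path)
    paths.append(path)

# open_mfdataset opens each file lazily (dask-backed) and
# concatenates them into a single Dataset along 'time'
ds = xray.open_mfdataset(paths)
```

The data stays lazy until you actually compute on it or save it, which is the point of going through dask.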
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-91437792,https://api.github.com/repos/pydata/xarray/issues/349,91437792,MDEyOklzc3VlQ29tbWVudDkxNDM3Nzky,6063709,2015-04-10T05:49:17Z,2015-04-10T05:49:17Z,CONTRIBUTOR,"Great to see open_mfdataset implemented! Awesome. The links on the documentation pages seem borked though:
https://github.com/xray/xray/blob/0cd100effc3866ed083c366723da0b502afa5a96/doc/io.rst
e.g. "":py:func:`~xray.auto_combine`"" https://github.com/xray/xray/blob/0cd100effc3866ed083c366723da0b502afa5a96/doc/io.rst#id30
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-81424360,https://api.github.com/repos/pydata/xarray/issues/349,81424360,MDEyOklzc3VlQ29tbWVudDgxNDI0MzYw,1217238,2015-03-16T05:20:14Z,2015-03-16T05:20:14Z,MEMBER,"I literally merged this into the dev version of the docs a few hours ago :).
Lazy loading goodness is next on my to-do list. Hopefully I'll have more to share soon.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-81422044,https://api.github.com/repos/pydata/xarray/issues/349,81422044,MDEyOklzc3VlQ29tbWVudDgxNDIyMDQ0,6063709,2015-03-16T05:16:08Z,2015-03-16T05:16:08Z,CONTRIBUTOR,"Sorry, I thought I had read the docs (which are very good BTW). Thanks.
I have some large files and only want to pick out a single variable from each, and was hoping for some lazy-loading goodness.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-81401283,https://api.github.com/repos/pydata/xarray/issues/349,81401283,MDEyOklzc3VlQ29tbWVudDgxNDAxMjgz,1217238,2015-03-16T04:22:22Z,2015-03-16T04:22:22Z,MEMBER,"@aidanheerdegen Not directly (yet), but there are some straightforward recipes. In fact, this has been a popular question, so I wrote a new doc section on this the other day:
http://xray.readthedocs.org/en/latest/io.html#combining-multiple-files
For now, this section is only in the development version of the docs, but everything in it is equally valid for the latest released version.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-81399209,https://api.github.com/repos/pydata/xarray/issues/349,81399209,MDEyOklzc3VlQ29tbWVudDgxMzk5MjA5,6063709,2015-03-16T04:11:37Z,2015-03-16T04:11:37Z,CONTRIBUTOR,"Is there support for an MFDataset-like multiple file open in xray?
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-76891303,https://api.github.com/repos/pydata/xarray/issues/349,76891303,MDEyOklzc3VlQ29tbWVudDc2ODkxMzAz,1217238,2015-03-03T05:53:11Z,2015-03-03T05:53:11Z,MEMBER,"Slicing the data you need before concatenating is definitely a good strategy here.
Eventually, I'm optimistic that we'll be able to make `concat` not require loading everything into memory (https://github.com/xray/xray/issues/328)
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-76890735,https://api.github.com/repos/pydata/xarray/issues/349,76890735,MDEyOklzc3VlQ29tbWVudDc2ODkwNzM1,7300413,2015-03-03T05:45:50Z,2015-03-03T05:45:50Z,NONE,"Thanks. But that really kills my machine, even though I have 12 GB of RAM.
What I finally ended up doing is slicing the initial dataset created from one nc file
to access the level and variable that I wanted. This gives me a DataArray object,
which I then combine via xray.concat() with similar objects created from the other variables.
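For anyone finding this later, a minimal sketch of that slice-then-concat recipe (toy in-memory datasets and made-up names stand in for the real files; `import xarray as xray` uses the project's later name so the snippet runs with the current package):

``` python
import xarray as xray  # the xray project was later renamed xarray

# two toy one-year datasets standing in for the real nc files
ds1 = xray.Dataset({'temp': (('time', 'level'), [[1.0, 2.0]])},
                   coords={'time': [0], 'level': [500, 850]})
ds2 = xray.Dataset({'temp': (('time', 'level'), [[3.0, 4.0]])},
                   coords={'time': [1], 'level': [500, 850]})

# pick one variable at one vertical level from each dataset...
slices = [ds['temp'].sel(level=500) for ds in (ds1, ds2)]
# ...then stitch the resulting DataArrays together along time
temp_500 = xray.concat(slices, dim='time')
```

Only the selected level is ever concatenated, so the in-memory size stays proportional to the slice, not the full files.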
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-76883374,https://api.github.com/repos/pydata/xarray/issues/349,76883374,MDEyOklzc3VlQ29tbWVudDc2ODgzMzc0,1217238,2015-03-03T04:02:25Z,2015-03-03T04:02:25Z,MEMBER,"Oh, OK. In that case, you do want to use `concat`.
Something like this should work:
``` python
ds = xray.concat([xray.open_dataset(f) for f in my_files], dim='time')
```
xray doesn't use or set unlimited dimensions. (It's pretty irrelevant for us, given that NumPy arrays can be stored in either row-major or column-major order.)
","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-76883017,https://api.github.com/repos/pydata/xarray/issues/349,76883017,MDEyOklzc3VlQ29tbWVudDc2ODgzMDE3,7300413,2015-03-03T03:57:55Z,2015-03-03T03:57:55Z,NONE,"No, not really. each file contains one year of data for four variables, and I have 35 files (1979-...)
I tried Dataset.merge as you suggested, but it says conflicting value for variable time, which I
guess is what you would expect.
Can xray modify the nc file to make the time dimension unlimited? then I could simply use something
like MFDataset...
TIA,
Joy
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251
https://github.com/pydata/xarray/issues/349#issuecomment-76748269,https://api.github.com/repos/pydata/xarray/issues/349,76748269,MDEyOklzc3VlQ29tbWVudDc2NzQ4MjY5,1217238,2015-03-02T16:46:18Z,2015-03-02T16:46:18Z,MEMBER,"To clarify -- you have different files for different variables? For example, one file has temperature, another has dewpoint, etc? I think you want to use the Dataset.merge method for this.
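A minimal sketch of what merge does in that case (variable names are made up; `import xarray as xray` uses the project's later name so the snippet runs with the current package) -- note this only works when the files share the same coordinates:

``` python
import xarray as xray  # the xray project was later renamed xarray

# two single-variable datasets sharing the same time coordinate,
# standing in for e.g. a temperature file and a dewpoint file
temp = xray.Dataset({'t2m': ('time', [15.0, 16.0])}, coords={'time': [0, 1]})
dewp = xray.Dataset({'d2m': ('time', [9.0, 10.0])}, coords={'time': [0, 1]})

combined = temp.merge(dewp)  # one Dataset with both variables
```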
On Mon, Mar 2, 2015 at 3:09 AM, JoyMonteiro notifications@github.com
wrote:
> Hello,
> I have multiple nc files, and I want to pick one variable from all of them to
> write to a separate file, and if possible pick one vertical level. The issue
> is that it has no aggregation dimension, so MFDataset does not work.
> The idea is to get all data about one variable from one vertical level into
> a single file.
> When I use the example in the netCDF4-python website, concat merges
> all variables along all dimensions, making the in-memory size really large.
> I'm new to xray, and I was hoping something of this sort can be done.
> In fact, I don't really need to write it to a new file. Even if I can get
> one ""descriptor"" (instead of an array of Dataset objects) to access my data, I will be quite
> happy!
> TIA,
>
> Joy
>
> Reply to this email directly or view it on GitHub:
> https://github.com/xray/xray/issues/349
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,59467251