issue_comments
14 rows where issue = 377947810 sorted by updated_at descending
Columns: id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | performed_via_github_app | issue
460600719 | https://github.com/pydata/xarray/issues/2547#issuecomment-460600719 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQ2MDYwMDcxOQ== | tommylees112 21049064 | 2019-02-05T11:15:49Z | 2019-02-05T11:15:49Z | NONE | But the original question was answered so thank you very much! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
460600500 | https://github.com/pydata/xarray/issues/2547#issuecomment-460600500 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQ2MDYwMDUwMA== | tommylees112 21049064 | 2019-02-05T11:15:01Z | 2019-02-05T11:15:01Z | NONE | Sorry for the silence! I got pulled away to another project. Unfortunately I wasn't able to complete the task in xarray, but I found that the easiest way around the problem was to use a combination of two functions:

```python
def change_missing_vals_to_9999f(ds, variable):
    """Change the missing values from np.nan to -9999.0f"""
    arr = ds[variable].values


def change_missing_data_values(filename):
    """Change the values INSIDE the .nc file to -9999.0f"""
    assert (
        filename.split(".")[-1] == "nc"
    ), "This function only works with .nc files. Filename: {}".format(filename)
    print(" Processing {}".format(filename))
```

and then another function using the

RUN HERE:

```python
@click.command()
@click.argument("filename", type=str)
def main(filename):
    """Run the two commands
    a) change the Values INSIDE the .nc file [python, numpy, xarray]
    b) change the associated METADATA for the .nc file headers [nco]
    """
    change_missing_data_values(filename)
    change_nc_FillValue(filename)
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
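The helper functions in the comment above are truncated in this export. For reference, here is a minimal sketch of what the described workaround could look like (replace NaN with -9999.0 inside the file via xarray, then overwrite the `_FillValue` header with nco's `ncatted`). The function bodies, the default variable name `"Rg"`, and the exact `ncatted` arguments are assumptions, not the author's actual code:

```python
import subprocess

import numpy as np
import xarray as xr


def change_missing_vals_to_9999f(ds, variable):
    """Replace np.nan with -9999.0 for one variable, in memory."""
    arr = ds[variable].values
    arr[np.isnan(arr)] = -9999.0
    ds[variable].values = arr
    return ds


def change_missing_data_values(filename):
    """Change the values INSIDE the .nc file to -9999.0."""
    assert filename.endswith(".nc"), "This function only works with .nc files"
    # Load into memory and release the file handle so we can overwrite it.
    ds = xr.open_dataset(filename).load()
    ds.close()
    for var in ds.data_vars:
        ds = change_missing_vals_to_9999f(ds, var)
    ds.to_netcdf(filename)


def change_nc_FillValue(filename, variable="Rg"):
    """Overwrite the _FillValue attribute in the file header using nco's ncatted."""
    subprocess.check_call(
        ["ncatted", "-a", "_FillValue,{},o,f,-9999.0".format(variable), filename]
    )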
460020696 | https://github.com/pydata/xarray/issues/2547#issuecomment-460020696 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQ2MDAyMDY5Ng== | jhamman 2443309 | 2019-02-03T03:50:47Z | 2019-02-03T03:50:47Z | MEMBER | @tommylees112 - this issue has sat for a bit of time now. Did you end up with a solution here? If so, is it okay with you if we close this out? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
436695050 | https://github.com/pydata/xarray/issues/2547#issuecomment-436695050 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjY5NTA1MA== | spencerkclark 6628425 | 2018-11-07T16:53:14Z | 2018-11-07T16:53:14Z | MEMBER | The

```
// global attributes:
		:Conventions = "CF-1.0" ;
		:content = "HARMONIZED WORLD SOIL DATABASE; first it was aggregated to one global file; then the missing areas were filled with interpolated data; then separate tiles were extracted" ;
		:scaling_factor = "20" ;
```

Is the time encoding important for your land surface model? That does change in my example (see the units attribute); you might need some special logic to handle that. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
436688702 | https://github.com/pydata/xarray/issues/2547#issuecomment-436688702 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjY4ODcwMg== | tommylees112 21049064 | 2018-11-07T16:36:09Z | 2018-11-07T16:46:01Z | NONE | @spencerkclark Thanks very much that is awesome! One final Q: How do I set the … The current output of …
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
436679811 | https://github.com/pydata/xarray/issues/2547#issuecomment-436679811 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjY3OTgxMQ== | spencerkclark 6628425 | 2018-11-07T16:14:27Z | 2018-11-07T16:14:27Z | MEMBER | Thanks @tommylees112 -- note that the

```python
In [2]: ds = xr.open_dataset('Rg_dummy.nc')

In [3]: times = pd.date_range("2000-01-01", "2000-12-31", name="time")

In [4]: ds['time'] = np.array([times[0]])

In [5]: ds2 = ds.reindex(time=times, method='ffill')

In [6]: ds2.to_netcdf('result.nc')
```

Regarding the issue saving to files -- I can reproduce that issue with older xarray versions. It is related to #2512 and was fixed in #2513 (i.e. it works with the master version of xarray). The good news is this bug only applies to saving |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
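For readers without the `Rg_dummy.nc` file, here is a self-contained toy version of the recipe above (synthetic data; the variable name and grid size are made up) showing that `reindex(..., method='ffill')` copies the single existing time step across the enlarged time index:

```python
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2000-01-01", "2000-12-31", name="time")

# Stand-in for the single-time-step file from the thread.
ds = xr.Dataset(
    {"Rg": (("time", "y", "x"), np.random.rand(1, 3, 3))},
    coords={"time": [times[0]], "y": np.arange(3), "x": np.arange(3)},
)

# Enlarge the time index; every new step forward-fills from the first (only) slice.
ds2 = ds.reindex(time=times, method="ffill")

print(ds2.sizes["time"])                                        # 366 (2000 is a leap year)
print(bool((ds2.isel(time=-1) == ds2.isel(time=0)).Rg.all()))   # True
```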
436668695 | https://github.com/pydata/xarray/issues/2547#issuecomment-436668695 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjY2ODY5NQ== | tommylees112 21049064 | 2018-11-07T15:45:22Z | 2018-11-07T15:46:15Z | NONE | That all worked great until I tried to write out to a .nc file.

```python
data_dir = "./"
filename = "Rg_dummy.nc"

# get the datetime range
times = pd.date_range("2000-01-01", "2000-12-31", name="time")
var = "Rg"

copyfile(data_dir + filename, "temp.nc")
ds = xr.open_dataset("temp.nc")
print("Temporary Data read to Python")

# FORWARD FILL FROM THE ORIGINAL DATA to new timesteps
ds['time'] = np.array([times[0]])
ds.reindex({"time": times})
ds.ffill("time")

ds.to_netcdf(filename, format="NETCDF3_CLASSIC")
print(filename, "Written!")

# remove temporary file
os.remove(data_dir + "temp.nc")
print("Temporary Data Removed")
del ds
```

I get the following Error message:

```
Temporary Data read to Python
TypeError                                 Traceback (most recent call last)
<ipython-input-228-e3d645224353> in <module>()
     15 ds.ffill("time")
     16
---> 17 ds.to_netcdf(filename, format="NETCDF3_CLASSIC")
     18 print(filename, "Written!")
     19
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/core/dataset.pyc in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute)
   1148         engine=engine, encoding=encoding,
   1149         unlimited_dims=unlimited_dims,
-> 1150         compute=compute)
   1151
   1152     def to_zarr(self, store=None, mode='w-', synchronizer=None, group=None,
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/api.pyc in to_netcdf(dataset, path_or_file, mode, format, group, engine, writer, encoding, unlimited_dims, compute)
    721     try:
    722         dataset.dump_to_store(store, sync=sync, encoding=encoding,
--> 723                               unlimited_dims=unlimited_dims, compute=compute)
    724         if path_or_file is None:
    725             return target.getvalue()
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/core/dataset.pyc in dump_to_store(self, store, encoder, sync, encoding, unlimited_dims, compute)
   1073
   1074         store.store(variables, attrs, check_encoding,
-> 1075                     unlimited_dims=unlimited_dims)
   1076         if sync:
   1077             store.sync(compute=compute)
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/common.pyc in store(self, variables, attributes, check_encoding_set, unlimited_dims)
    366         self.set_dimensions(variables, unlimited_dims=unlimited_dims)
    367         self.set_variables(variables, check_encoding_set,
--> 368                            unlimited_dims=unlimited_dims)
    369
    370     def set_attributes(self, attributes):
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/netCDF4_.pyc in set_variables(self, *args, **kwargs)
    405     def set_variables(self, *args, **kwargs):
    406         with self.ensure_open(autoclose=False):
--> 407             super(NetCDF4DataStore, self).set_variables(*args, **kwargs)
    408
    409     def encode_variable(self, variable):
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/common.pyc in set_variables(self, variables, check_encoding_set, unlimited_dims)
    403             check = vn in check_encoding_set
    404             target, source = self.prepare_variable(
--> 405                 name, v, check, unlimited_dims=unlimited_dims)
    406
    407             self.writer.add(source, target)
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/netCDF4_.pyc in prepare_variable(self, name, variable, check_encoding, unlimited_dims)
    451                 least_significant_digit=encoding.get(
    452                     'least_significant_digit'),
--> 453                 fill_value=fill_value)
    454             _disable_auto_decode_variable(nc4_var)
    455
netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Dataset.createVariable()
netCDF4/_netCDF4.pyx in netCDF4._netCDF4.Variable.__init__()

TypeError: illegal primitive data type, must be one of ['i8', 'f4', 'f8', 'S1', 'i2', 'i4', 'u8', 'u4', 'u1', 'u2', 'i1'], got datetime64[ns]
```

and if I try with the default netcdf writing options

I get this error message:

```
Temporary Data read to Python
ValueError                                Traceback (most recent call last)
<ipython-input-229-453d5f074d33> in <module>()
     15 ds.ffill("time")
     16
---> 17 ds.to_netcdf(filename)
     18 print(filename, "Written!")
     19
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/core/dataset.pyc in to_netcdf(self, path, mode, format, group, engine, encoding, unlimited_dims, compute)
   1148         engine=engine, encoding=encoding,
   1149         unlimited_dims=unlimited_dims,
-> 1150         compute=compute)
   1151
   1152     def to_zarr(self, store=None, mode='w-', synchronizer=None, group=None,
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/api.pyc in to_netcdf(dataset, path_or_file, mode, format, group, engine, writer, encoding, unlimited_dims, compute)
    721     try:
    722         dataset.dump_to_store(store, sync=sync, encoding=encoding,
--> 723                               unlimited_dims=unlimited_dims, compute=compute)
    724         if path_or_file is None:
    725             return target.getvalue()
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/core/dataset.pyc in dump_to_store(self, store, encoder, sync, encoding, unlimited_dims, compute)
   1073
   1074         store.store(variables, attrs, check_encoding,
-> 1075                     unlimited_dims=unlimited_dims)
   1076         if sync:
   1077             store.sync(compute=compute)
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/common.pyc in store(self, variables, attributes, check_encoding_set, unlimited_dims)
    366         self.set_dimensions(variables, unlimited_dims=unlimited_dims)
    367         self.set_variables(variables, check_encoding_set,
--> 368                            unlimited_dims=unlimited_dims)
    369
    370     def set_attributes(self, attributes):
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/netCDF4_.pyc in set_variables(self, *args, **kwargs)
    405     def set_variables(self, *args, **kwargs):
    406         with self.ensure_open(autoclose=False):
--> 407             super(NetCDF4DataStore, self).set_variables(*args, **kwargs)
    408
    409     def encode_variable(self, variable):
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/common.pyc in set_variables(self, variables, check_encoding_set, unlimited_dims)
    403             check = vn in check_encoding_set
    404             target, source = self.prepare_variable(
--> 405                 name, v, check, unlimited_dims=unlimited_dims)
    406
    407             self.writer.add(source, target)
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/netCDF4_.pyc in prepare_variable(self, name, variable, check_encoding, unlimited_dims)
    418                          unlimited_dims=None):
    419         datatype = _get_datatype(variable, self.format,
--> 420                                  raise_on_invalid_encoding=check_encoding)
    421         attrs = variable.attrs.copy()
    422
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/netCDF4_.pyc in _get_datatype(var, nc_format, raise_on_invalid_encoding)
     99 def _get_datatype(var, nc_format='NETCDF4', raise_on_invalid_encoding=False):
    100     if nc_format == 'NETCDF4':
--> 101         datatype = _nc4_dtype(var)
    102     else:
    103         if 'dtype' in var.encoding:
/home/mpim/m300690/miniconda3/envs/holaps/lib/python2.7/site-packages/xarray/backends/netCDF4_.pyc in _nc4_dtype(var)
    122     else:
    123         raise ValueError('unsupported dtype for netCDF4 variable: {}'
--> 124                          .format(var.dtype))
    125     return dtype
    126

ValueError: unsupported dtype for netCDF4 variable: datetime64[ns]
```

Version information

If this is useful:

```
In [230]: xr.show_versions()

INSTALLED VERSIONS
commit: None
python: 2.7.15.final.0
python-bits: 64
OS: Linux
OS-release: 2.6.32-696.18.7.el6.x86_64
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: None.None

xarray: 0.10.7
pandas: 0.23.0
numpy: 1.11.3
scipy: 1.1.0
netCDF4: 1.4.0
h5netcdf: None
h5py: None
Nio: None
zarr: None
bottleneck: 1.2.1
cyordereddict: None
dask: None
distributed: None
matplotlib: 1.5.1
cartopy: None
seaborn: None
setuptools: 39.1.0
pip: 18.1
conda: None
pytest: None
IPython: 5.7.0
sphinx: None
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
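An aside on the `datetime64[ns]` write failures in the comment above (this workaround is not suggested anywhere in the thread, so treat it as an assumption): besides upgrading xarray to pick up the #2513 fix, being explicit about how the time axis should be encoded on disk can help on older versions. The unit string, calendar, and output filename below are illustrative only:

```python
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2000-01-01", "2000-12-31", name="time")
ds2 = xr.Dataset(
    {"Rg": (("time",), np.zeros(len(times)))},
    coords={"time": times},
)

# Spell out how the datetime64 time coordinate should be written,
# instead of relying on xarray to infer a time encoding.
ds2.to_netcdf(
    "result.nc",
    encoding={
        "time": {
            "units": "days since 2000-01-01",
            "calendar": "proleptic_gregorian",
            "dtype": "float64",
        }
    },
)
```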
436638944 | https://github.com/pydata/xarray/issues/2547#issuecomment-436638944 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjYzODk0NA== | spencerkclark 6628425 | 2018-11-07T14:23:42Z | 2018-11-07T14:23:42Z | MEMBER | @tommylees112 I had a look at your dataset and noticed that the time coordinate was not encoded in a way that allows xarray to automatically decode the time values into datetimes. Specifically, the units attribute is not in the format of some unit of time (e.g. 'seconds') since a given reference date.

```
$ ncdump -h Rg_dummy.nc
netcdf Rg_dummy {
dimensions:
	time = 1 ;
	y = 200 ;
	x = 200 ;
variables:
	double time(time) ;
		time:_FillValue = NaN ;
		time:standard_name = "time" ;
		time:units = "day as %Y%m%d.%f" ;
		time:calendar = "proleptic_gregorian" ;
	short Rg(time, y, x) ;
		Rg:_FillValue = -1s ;
		Rg:long_name = "HWSD sub sum content" ;
		Rg:units = "percent wt" ;
		Rg:valid_range = 97., 103. ;
	double latitude(y, x) ;
		latitude:_FillValue = -99999. ;
	double longitude(y, x) ;
		longitude:_FillValue = -99999. ;

// global attributes:
		:Conventions = "CF-1.0" ;
		:content = "HARMONIZED WORLD SOIL DATABASE; first it was aggregated to one global file; then the missing areas were filled with interpolated data; then separate tiles were extracted" ;
		:scaling_factor = "20" ;
}
```
Regarding assigning a new value to the time coordinate, the following should work:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
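The snippet that "should work" is cut off in this export of the comment above. Judging from the `In [4]` line in the 2018-11-07T16:14 comment further up the page, it was presumably along these lines (a reconstruction, not the original text):

```python
import numpy as np
import pandas as pd
import xarray as xr

ds = xr.open_dataset("Rg_dummy.nc")
times = pd.date_range("2000-01-01", "2000-12-31", name="time")

# Replace the single, un-decodable float time value with a real datetime
# so that reindexing against a DatetimeIndex can match it.
ds["time"] = np.array([times[0]])
```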
436606451 | https://github.com/pydata/xarray/issues/2547#issuecomment-436606451 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjYwNjQ1MQ== | tommylees112 21049064 | 2018-11-07T12:24:36Z | 2018-11-07T12:25:02Z | NONE | The `ds.time[0]` won't let me set its value to a datetime. Instead it returns a float:

And none of the following work:

```python
# doesn't change the time value
ds.time[0].values = times[0]

# returns an error because I can't assign to a function call
ds.time[0].item() = times[0]

# returns ValueError: replacement data must match the Variable's shape
ds['time'].values = np.array(times[0])
```

Thanks for your help! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
436422432 | https://github.com/pydata/xarray/issues/2547#issuecomment-436422432 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjQyMjQzMg== | max-sixty 5635139 | 2018-11-06T21:54:18Z | 2018-11-06T21:54:18Z | MEMBER | Can you ensure |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
436400765 | https://github.com/pydata/xarray/issues/2547#issuecomment-436400765 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjQwMDc2NQ== | tommylees112 21049064 | 2018-11-06T20:40:20Z | 2018-11-06T20:41:25Z | NONE | Data: netcdf_files.zip

Code below:

```python
import numpy as np
import pandas as pd
import xarray as xr
from shutil import copyfile
import os

data_dir = "./"
filename = "Rg_dummy.nc"

# get the datetime range
times = pd.date_range("2000-01-01", "2000-12-31", name="time")
var = "Rg"

copyfile(filename, "temp.nc")
ds = xr.open_dataset("temp.nc")
print("Temporary Data read to Python")

# FORWARD FILL FROM THE ORIGINAL DATA to new timesteps
ds.reindex({"time": times})
ds.ffill("time")

ds.to_netcdf(filename, format="NETCDF3_CLASSIC")
print(filename, "Written!")

# remove temporary file
os.remove(data_dir + "temp.nc")
print("Temporary Data Removed")
del ds
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
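A small note on the script above (an xarray API fact, not something pointed out at this stage of the thread): `Dataset.reindex` and `Dataset.ffill` return new objects rather than modifying `ds` in place, so their results need to be assigned, e.g.:

```python
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2000-01-01", "2000-12-31", name="time")
ds = xr.Dataset({"Rg": (("time",), np.zeros(1))}, coords={"time": [times[0]]})

# reindex/ffill do not mutate ds; capture the returned Dataset.
ds = ds.reindex(time=times, method="ffill")
# or, in two steps:
# ds = ds.reindex(time=times)
# ds = ds.ffill("time")
```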
436390574 | https://github.com/pydata/xarray/issues/2547#issuecomment-436390574 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjM5MDU3NA== | max-sixty 5635139 | 2018-11-06T20:06:54Z | 2018-11-06T20:06:54Z | MEMBER | Your original value needs to be in the time index in order to remain there after the reindex. Currently it looks like a float value:
If you have a repro example (link in issue template) it's easier to offer help on the whole issue |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
436383962 | https://github.com/pydata/xarray/issues/2547#issuecomment-436383962 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjM4Mzk2Mg== | tommylees112 21049064 | 2018-11-06T19:45:46Z | 2018-11-06T19:50:00Z | NONE | Thanks for your help! You definitely understood me correctly! This doesn't seem to work, as it fills my
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 | |
436336253 | https://github.com/pydata/xarray/issues/2547#issuecomment-436336253 | https://api.github.com/repos/pydata/xarray/issues/2547 | MDEyOklzc3VlQ29tbWVudDQzNjMzNjI1Mw== | max-sixty 5635139 | 2018-11-06T17:23:28Z | 2018-11-06T17:23:28Z | MEMBER | IIUC, this is actually much easier - reindex on your enlarged |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
How do I copy my array forwards in time? 377947810 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);