
xarray issue #545: Negative timesteps after .to_netcdf with long time periods?

State: closed (completed) · Opened: 2015-08-21 · Closed: 2019-02-01 · Comments: 3

Hi.

I discovered that I get negative times on the time dimension for long intervals (years 1700-2013, monthly timestep).

``` python
import xray
import numpy as np
import pandas as pd

years = range(1700, 2014)
LATS = np.arange(-89.75, 90.0, 0.5)
LONS = np.arange(-179.75, 180.0, 0.5)

tlist = pd.date_range('%d-01-01' % years[0], periods=12 * len(years), freq='M')

da = xray.DataArray(np.ones((12 * len(years), 360, 720)) - 9999,
                    [('time', tlist), ('latitude', LATS), ('longitude', LONS)])

# I then fill the DataArray with info from a text file (using read_csv
# from pandas), and eventually I dump to netCDF:
ds = xray.Dataset({"mgpp": da})
ds.to_netcdf('test_%d-%d.nc' % (years[0], years[-1]))
```

If I run `ncdump -c mgpp_1700-2013.nc` I get:

```
netcdf mgpp_1700-2013 {
dimensions:
	latitude = 360 ;
	time = 3768 ;
	longitude = 720 ;
variables:
	float latitude(latitude) ;
	float mgpp(time, latitude, longitude) ;
		mgpp:units = "gCm-2" ;
	float longitude(longitude) ;
	float time(time) ;
		time:units = "days since 1700-01-31 00:00:00" ;
		time:calendar = "proleptic_gregorian" ;

data:

 time = 0, 28, 59, 89, 120, 150, 181, 212, 242, 273, 303, 334, 365, 393,
    424, 454, 485, 515, 546, 577, 607, 638, 668, 699, 730, 758, 789, 819,
    850, 880, 911, 942, 972, 1003, 1033, 1064, 1095, 1123, 1154, 1184,
    1215, 1245, 1276, 1307, 1337, 1368, 1398, 1429, 1460, 1489, 1520,
    1550, 1581, 1611, 1642, 1673, 1703, 1734, 1764, 1795, 1826, 1854,
    1885, 1915, 1946, 1976, 2007, 2038, 2068, 2099, 2129, 2160, 2191,
    2219, 2250, 2280, 2311, 2341, 2372, 2403, 2433, 2464, 2494, 2525,
    2556, 2584, 2615, 2645, 2676, 2706, 2737, 2768, 2798, 2829, 2859,
    2890, 2921, 2950, 2981, 3011, 3042, 3072, 3103, 3134, 3164, 3195,
    3225, 3256, 3287, 3315, 3346, 3376, 3407, 3437, 3468, 3499, 3529,
    3560, 3590, 3621, 3652, 3680, 3711, 3741, 3772, 3802, (...)
```

and eventually:

```
(...) 106435, 106466, 106497, 106527, 106558, 106588, 106619, 106650,
    106679, 106710, 106740, -106732.982337963, -106702.982337963,
    -106671.982337963, -106640.982337963, -106610.982337963,
    -106579.982337963, -106549.982337963, -106518.982337963,
    -106487.982337963, -106459.982337963, -106428.982337963,
    -106398.982337963, -106367.982337963, -106337.982337963,
    -106306.982337963, -106275.982337963, -106245.982337963,
    -106214.982337963, -106184.982337963, -106153.982337963,
    -106122.982337963, -106094.982337963, -106063.982337963,
    -106033.982337963, -106002.982337963, -105972.982337963,
    -105941.982337963, -105910.982337963, -105880.982337963,
    -105849.982337963, (...)
```

I'm not sure whether I can influence that at "dump" time via to_netcdf? I know about the time limitations, but my years should be non-critical, no?

