
Issue #1020: Memory error when converting dataset to dataframe

  id: 180080354
  node_id: MDU6SXNzdWUxODAwODAzNTQ=
  user: 1961038
  state: closed
  locked: no
  comments: 4
  created_at: 2016-09-29T15:21:58Z
  updated_at: 2021-05-04T18:03:32Z
  closed_at: 2016-09-30T15:38:01Z
  author_association: NONE

When working with NOAA's High Resolution Rapid Refresh (HRRR) model output data (GRIB2, automatically converted to NetCDF via either GrADS Data Server or THREDDS), xrd.to_dataframe() throws a MemoryError.

Here's a sample URL that I use to assign to an xarray dataset object:

'http://nomads.ncep.noaa.gov:9090/dods/hrrr/hrrr'+model_day+'/hrrr_sfc_'+model_hour+'z'

where model_day is of the form YYYYMMDD and model_hour is of the form HH
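As a minimal sketch, the URL construction described above might look like the following. The run date and hour here are hypothetical placeholders, chosen only to match the YYYYMMDD and HH formats the issue describes; the `xr.open_dataset` step is shown commented out because it requires network access and an OPeNDAP-capable xarray install.

```python
from datetime import datetime, timezone

# Hypothetical model run time; only the string formats come from the issue text.
run = datetime(2016, 9, 29, 12, tzinfo=timezone.utc)
model_day = run.strftime("%Y%m%d")   # YYYYMMDD
model_hour = run.strftime("%H")      # HH

url = (
    "http://nomads.ncep.noaa.gov:9090/dods/hrrr/hrrr"
    + model_day
    + "/hrrr_sfc_"
    + model_hour
    + "z"
)
print(url)

# Opening the remote dataset would then look like:
# import xarray as xr
# ds = xr.open_dataset(url)
# df = ds.to_dataframe()   # the call that raised MemoryError in this report
```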

These datasets are quite large (1155 × 2503, lat × lon) ... is there a limit as to how large an xarray dataset can be for it to be converted to a dataframe?
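A back-of-envelope estimate helps explain the MemoryError: `to_dataframe()` materializes every variable at every grid point into one flat table in memory. Only the 1155 × 2503 grid shape comes from the issue; the variable and time counts below are illustrative assumptions, as is the float64 dtype.

```python
# Rough memory estimate for flattening a gridded dataset into a dataframe.
ny, nx = 1155, 2503          # grid shape quoted in the issue
bytes_per_value = 8          # assuming float64

# A single 2-D field:
one_field = ny * nx * bytes_per_value
print(f"one 2-D field: {one_field / 1e6:.1f} MB")

# A dataframe holds every variable at every (time, lat, lon) point.
# With hypothetical counts of 50 variables and 19 forecast times:
n_vars, n_times = 50, 19
table = ny * nx * n_times * n_vars * bytes_per_value
print(f"flat table: {table / 1e9:.1f} GB")
```

Even before counting the MultiIndex that pandas builds for the row labels, a dataset of this shape can easily exceed available RAM once it is densified into a single table.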

  state_reason: completed · repo: 13221727 · type: issue

Links from other tables

  • 1 row from issues_id in issues_labels
  • 4 rows from issue in issue_comments