issue_comments

2 rows where issue = 309592370 sorted by updated_at descending
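The selection above (all comments on issue 309592370, newest first) is a plain filter-and-sort over the issue_comments table. A minimal sketch of the same query using Python's sqlite3 module, assuming a hypothetical local copy of the database named github.db:

```
import sqlite3

# Minimal sketch of the query behind this page, run against a hypothetical
# local copy of the database (the "github.db" filename is an assumption).
conn = sqlite3.connect("github.db")
conn.row_factory = sqlite3.Row

rows = conn.execute(
    """
    SELECT *
    FROM issue_comments
    WHERE issue = ?              -- the issue id shown above
    ORDER BY updated_at DESC     -- newest comment first
    """,
    (309592370,),
).fetchall()

for row in rows:
    print(row["id"], row["updated_at"], row["author_association"])
```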

id: 377118783
html_url: https://github.com/pydata/xarray/issues/2025#issuecomment-377118783
issue_url: https://api.github.com/repos/pydata/xarray/issues/2025
node_id: MDEyOklzc3VlQ29tbWVudDM3NzExODc4Mw==
user: shoyer (1217238)
created_at: 2018-03-29T04:36:06Z
updated_at: 2018-03-29T04:36:06Z
author_association: MEMBER
body:

Thanks for noting this. I think https://github.com/pydata/xarray/pull/2026 will fix it.

reactions:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: dask ImportWarning causes pytest failure (309592370)
id: 377113745
html_url: https://github.com/pydata/xarray/issues/2025#issuecomment-377113745
issue_url: https://api.github.com/repos/pydata/xarray/issues/2025
node_id: MDEyOklzc3VlQ29tbWVudDM3NzExMzc0NQ==
user: shoyer (1217238)
created_at: 2018-03-29T03:57:01Z
updated_at: 2018-03-29T03:57:01Z
author_association: MEMBER
body:

Yep, I'm seeing the same thing. Here's the full traceback:

```
============================================================== FAILURES ===============================================================
_____ NetCDF4DataTest.test_88_character_filename_segmentation_fault _____

error = <class 'Warning'>, pattern = 'segmentation fault'

@contextmanager
def raises_regex(error, pattern):
    __tracebackhide__ = True  # noqa: F841
    with pytest.raises(error) as excinfo:
      yield

xarray/tests/__init__.py:150:


self = <xarray.tests.test_backends.NetCDF4DataTest testMethod=test_88_character_filename_segmentation_fault>

def test_88_character_filename_segmentation_fault(self):
    # should be fixed in netcdf4 v1.3.1
    with mock.patch('netCDF4.__version__', '1.2.4'):
        with warnings.catch_warnings():
            warnings.simplefilter("error")
            with raises_regex(Warning, 'segmentation fault'):
                # Need to construct 88 character filepath
              xr.Dataset().to_netcdf('a' * (88 - len(os.getcwd()) - 1))

xarray/tests/test_backends.py:1143:


self = <xarray.Dataset> Dimensions: () Data variables: *empty*
path = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', mode = 'w', format = None, group = None, engine = None
encoding = {}, unlimited_dims = None

def to_netcdf(self, path=None, mode='w', format=None, group=None,
              engine=None, encoding=None, unlimited_dims=None):
    """Write dataset contents to a netCDF file.

        Parameters
        ----------
        path : str, Path or file-like object, optional
            Path to which to save this dataset. File-like objects are only
            supported by the scipy engine. If no path is provided, this
            function returns the resulting netCDF file as bytes; in this case,
            we need to use scipy, which does not support netCDF version 4 (the
            default format becomes NETCDF3_64BIT).
        mode : {'w', 'a'}, optional
            Write ('w') or append ('a') mode. If mode='w', any existing file at
            this location will be overwritten. If mode='a', existing variables
            will be overwritten.
        format : {'NETCDF4', 'NETCDF4_CLASSIC', 'NETCDF3_64BIT','NETCDF3_CLASSIC'}, optional
            File format for the resulting netCDF file:

            * NETCDF4: Data is stored in an HDF5 file, using netCDF4 API
              features.
            * NETCDF4_CLASSIC: Data is stored in an HDF5 file, using only
              netCDF 3 compatible API features.
            * NETCDF3_64BIT: 64-bit offset version of the netCDF 3 file format,
              which fully supports 2+ GB files, but is only compatible with
              clients linked against netCDF version 3.6.0 or later.
            * NETCDF3_CLASSIC: The classic netCDF 3 file format. It does not
              handle 2+ GB files very well.

            All formats are supported by the netCDF4-python library.
            scipy.io.netcdf only supports the last two formats.

            The default format is NETCDF4 if you are saving a file to disk and
            have the netCDF4-python library available. Otherwise, xarray falls
            back to using scipy to write netCDF files and defaults to the
            NETCDF3_64BIT format (scipy does not support netCDF4).
        group : str, optional
            Path to the netCDF4 group in the given file to open (only works for
            format='NETCDF4'). The group(s) will be created if necessary.
        engine : {'netcdf4', 'scipy', 'h5netcdf'}, optional
            Engine to use when writing netCDF files. If not provided, the
            default engine is chosen based on available dependencies, with a
            preference for 'netcdf4' if writing to a file on disk.
        encoding : dict, optional
            Nested dictionary with variable names as keys and dictionaries of
            variable specific encodings as values, e.g.,
            ``{'my_variable': {'dtype': 'int16', 'scale_factor': 0.1,
                               'zlib': True}, ...}``
        unlimited_dims : sequence of str, optional
            Dimension(s) that should be serialized as unlimited dimensions.
            By default, no dimensions are treated as unlimited dimensions.
            Note that unlimited_dims may also be set via
            ``dataset.encoding['unlimited_dims']``.
        """
    if encoding is None:
        encoding = {}
    from ..backends.api import to_netcdf
    return to_netcdf(self, path, mode, format=format, group=group,
                     engine=engine, encoding=encoding,
                   unlimited_dims=unlimited_dims)

xarray/core/dataset.py:1131:


dataset = <xarray.Dataset> Dimensions: () Data variables: *empty*
path_or_file = '/Users/shoyer/dev/xarray/aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', mode = 'w', format = None
group = None, engine = 'netcdf4', writer = None, encoding = {}, unlimited_dims = None

def to_netcdf(dataset, path_or_file=None, mode='w', format=None, group=None,
              engine=None, writer=None, encoding=None, unlimited_dims=None):
    """This function creates an appropriate datastore for writing a dataset to
    disk as a netCDF file

    See `Dataset.to_netcdf` for full API docs.

    The ``writer`` argument is only for the private use of save_mfdataset.
    """
    if isinstance(path_or_file, path_type):
        path_or_file = str(path_or_file)
    if encoding is None:
        encoding = {}
    if path_or_file is None:
        if engine is None:
            engine = 'scipy'
        elif engine != 'scipy':
            raise ValueError('invalid engine for creating bytes with '
                             'to_netcdf: %r. Only the default engine '
                             "or engine='scipy' is supported" % engine)
    elif isinstance(path_or_file, basestring):
        if engine is None:
            engine = _get_default_engine(path_or_file)
        path_or_file = _normalize_path(path_or_file)
    else:  # file-like object
        engine = 'scipy'

    # validate Dataset keys, DataArray names, and attr keys/values
    _validate_dataset_names(dataset)
    _validate_attrs(dataset)

    try:
        store_open = WRITEABLE_STORES[engine]
    except KeyError:
        raise ValueError('unrecognized engine for to_netcdf: %r' % engine)

    if format is not None:
        format = format.upper()

    # if a writer is provided, store asynchronously
    sync = writer is None

    # handle scheduler specific logic
  scheduler = get_scheduler()

xarray/backends/api.py:639:


get = None, collection = None

def get_scheduler(get=None, collection=None):
    """ Determine the dask scheduler that is being used.

    None is returned if not dask scheduler is active.

    See also
    --------
    dask.utils.effective_get
    """
    try:
        from dask.utils import effective_get
        actual_get = effective_get(get, collection)
        try:
          from dask.distributed import Client

xarray/backends/common.py:46:


from __future__ import absolute_import, division, print_function

try:
  from distributed import *

../../miniconda3/envs/xarray-py36/lib/python3.6/site-packages/dask/distributed.py:5:


from __future__ import print_function, division, absolute_import

from .config import config

from .core import connect, rpc

../../miniconda3/envs/xarray-py36/lib/python3.6/site-packages/distributed/__init__.py:4:


from __future__ import print_function, division, absolute_import

from collections import defaultdict, deque
from concurrent.futures import CancelledError
from functools import partial
import logging
import six
import traceback
import uuid
import weakref

from six import string_types

from toolz import assoc

from tornado import gen
from tornado.ioloop import IOLoop
from tornado.locks import Event

from .comm import (connect, listen, CommClosedError, normalize_address, unparse_host_port, get_address_host_port)

../../miniconda3/envs/xarray-py36/lib/python3.6/site-packages/distributed/core.py:20:


from __future__ import print_function, division, absolute_import

from .addressing import (parse_address, unparse_address,
                         normalize_address, parse_host_port,
                         unparse_host_port, resolve_address,
                         get_address_host_port, get_address_host,
                         get_local_address_for,
                         )
from .core import connect, listen, Comm, CommClosedError


def _register_transports():
    from . import inproc
    from . import tcp

_register_transports()

../../miniconda3/envs/xarray-py36/lib/python3.6/site-packages/distributed/comm/__init__.py:17:


def _register_transports():
  from . import inproc

../../miniconda3/envs/xarray-py36/lib/python3.6/site-packages/distributed/comm/__init__.py:13:


from __future__ import print_function, division, absolute_import

from collections import deque, namedtuple
import itertools
import logging
import os
import threading
import weakref

from tornado import gen, locks
from tornado.concurrent import Future
from tornado.ioloop import IOLoop

from ..compatibility import finalize

from ..protocol import nested_deserialize

../../miniconda3/envs/xarray-py36/lib/python3.6/site-packages/distributed/comm/inproc.py:15:


from __future__ import print_function, division, absolute_import

from functools import partial

from .compression import compressions, default_compression

from .core import (dumps, loads, maybe_compress, decompress, msgpack)

../../miniconda3/envs/xarray-py36/lib/python3.6/site-packages/distributed/protocol/__init__.py:6:


from __future__ import print_function, division, absolute_import

import logging

import msgpack

../../miniconda3/envs/xarray-py36/lib/python3.6/site-packages/distributed/protocol/core.py:5:


from msgpack._version import version
from msgpack.exceptions import *

from collections import namedtuple


class ExtType(namedtuple('ExtType', 'code data')):
    """ExtType represents ext type in msgpack."""
    def __new__(cls, code, data):
        if not isinstance(code, int):
            raise TypeError("code must be int")
        if not isinstance(data, bytes):
            raise TypeError("data must be bytes")
        if not 0 <= code <= 127:
            raise ValueError("code must be 0~127")
        return super(ExtType, cls).__new__(cls, code, data)


import os
if os.environ.get('MSGPACK_PUREPYTHON'):
    from msgpack.fallback import Packer, unpack, unpackb, Unpacker
else:
    try:
      from msgpack._packer import Packer

../../miniconda3/envs/xarray-py36/lib/python3.6/site-packages/msgpack/__init__.py:25:


???
E   ImportWarning: can't resolve package from spec or package, falling back on name and path

msgpack/_packer.pyx:7: ImportWarning

During handling of the above exception, another exception occurred:

self = <xarray.tests.test_backends.NetCDF4DataTest testMethod=test_88_character_filename_segmentation_fault>

def test_88_character_filename_segmentation_fault(self):
    # should be fixed in netcdf4 v1.3.1
    with mock.patch('netCDF4.__version__', '1.2.4'):
        with warnings.catch_warnings():
            warnings.simplefilter("error")
            with raises_regex(Warning, 'segmentation fault'):
                # Need to construct 88 character filepath
              xr.Dataset().to_netcdf('a' * (88 - len(os.getcwd()) - 1))

xarray/tests/test_backends.py:1143:


self = <contextlib._GeneratorContextManager object at 0x1209dc4e0>, type = <class 'ImportWarning'>
value = ImportWarning("can't resolve package from spec or package, falling back on name and path",)
traceback = <traceback object at 0x120ca93c8>

def __exit__(self, type, value, traceback):
    if type is None:
        try:
            next(self.gen)
        except StopIteration:
            return False
        else:
            raise RuntimeError("generator didn't stop")
    else:
        if value is None:
            # Need to force instantiation so we can reliably
            # tell if we get the same exception back
            value = type()
        try:
          self.gen.throw(type, value, traceback)

E AssertionError: exception ImportWarning("can't resolve package from spec or package, falling back on name and path",) did not match pattern 'segmentation fault'
```

reactions:

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: dask ImportWarning causes pytest failure (309592370)
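The failure in the traceback above comes down to warning escalation: warnings.simplefilter("error") turns every warning into an exception, so the ImportWarning emitted while importing msgpack's compiled packer is raised inside the pytest.raises(Warning) block and captured in place of the expected segmentation-fault warning, after which the pattern check fails. A self-contained sketch of that interaction, using a simplified stand-in for xarray's raises_regex helper (running it under pytest reproduces the same kind of AssertionError):

```
import re
import warnings
from contextlib import contextmanager

import pytest


@contextmanager
def raises_regex(error, pattern):
    # Simplified stand-in for the helper in xarray/tests/__init__.py.
    with pytest.raises(error) as excinfo:
        yield
    message = str(excinfo.value)
    assert re.search(pattern, message), (
        "exception %r did not match pattern %r" % (excinfo.value, pattern))


def test_unrelated_warning_breaks_the_match():
    with warnings.catch_warnings():
        warnings.simplefilter("error")  # every warning is now raised as an exception
        with raises_regex(Warning, "segmentation fault"):
            # Stand-in for msgpack's import-time fallback warning: with the
            # "error" filter this raises ImportWarning (a Warning subclass),
            # which pytest.raises(Warning) captures ...
            warnings.warn("can't resolve package from spec or package, "
                          "falling back on name and path", ImportWarning)
            # ... so the line that would emit the expected warning never runs
            # and raises_regex fails with the AssertionError shown above.
            warnings.warn("segmentation fault", Warning)
```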

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);
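
The user and issue columns are foreign keys into the users and issues tables, and reactions stores the JSON blob shown with each row as plain TEXT. A sketch of how those pieces can be queried together, again assuming a hypothetical local copy of the database; the users.login and issues.title column names are assumptions, since those tables' schemas are not shown on this page:

```
import json
import sqlite3

# Sketch of a join across the foreign keys declared above, against a
# hypothetical local copy of the database. users.login and issues.title
# are assumed column names.
conn = sqlite3.connect("github.db")

query = """
    SELECT issue_comments.id,
           users.login,
           issues.title,
           issue_comments.updated_at,
           issue_comments.reactions
    FROM issue_comments
    JOIN users  ON users.id  = issue_comments.[user]
    JOIN issues ON issues.id = issue_comments.issue
    WHERE issue_comments.issue = ?
    ORDER BY issue_comments.updated_at DESC
"""

for comment_id, login, title, updated_at, reactions in conn.execute(query, (309592370,)):
    # The reactions column is TEXT holding the JSON blob shown in each row.
    total = json.loads(reactions)["total_count"] if reactions else 0
    print(comment_id, login, title, updated_at, total)
```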