

issue_comments


3 rows where author_association = "NONE" and user = 11997114 sorted by updated_at descending




Columns: id · html_url · issue_url · node_id · user · created_at · updated_at ▲ · author_association · body · reactions · performed_via_github_app · issue
id: 350974418 · html_url: https://github.com/pydata/xarray/issues/1773#issuecomment-350974418 · issue_url: https://api.github.com/repos/pydata/xarray/issues/1773 · node_id: MDEyOklzc3VlQ29tbWVudDM1MDk3NDQxOA== · user: GProtoZeroW (11997114) · created_at: 2017-12-12T08:03:07Z · updated_at: 2017-12-12T08:03:07Z · author_association: NONE

@shoyer I saw where you were going with this and here is the result I got. The trick was to pre-seed the dataset with an array of dtype 'object':

```
MasterShape=SimDataSet['CoorSpace'].shape; MasterShape

# dealing with the non-broadcasting function
SimDataSet['vector']=(['x', 'y', 'z', 't'], np.zeros(MasterShape, dtype=object))
for i in Iter.product(*SimDataSet.indexes.values(), repeat=1):
    #print(i, vector_funcN(*i))
    SimDataSet['vector'].loc[dict(x=i[0], y=i[1], z=i[2], t=i[3])]=vector_funcN(*i)
```

Full code:

```
import numpy as np
import xarray as xr
from sympy import *
init_printing()
import itertools as Iter

scale_func = lambda x, y, z, t: np.cos(1*x+2*y+3*z-4*t)
scale_func(1,1,1,1)

x, y, z, t=symbols('x, y, z, t')
xDirVec=Matrix([1,0,0])
vector_func=cos(1*x+2*y+3*z-4*t)*xDirVec
vector_func

vector_funcN=lambdify((x, y, z, t), vector_func, dummify=False)

# data point test
vector_funcN(1,1,1,1)

DomainSpaceTimeSize = 5 # using cartesian 4D
SpaceTimeDensity = [10, 5] # 10 divisions in space, 5 in time

x_coord = np.linspace(-DomainSpaceTimeSize, +DomainSpaceTimeSize, SpaceTimeDensity[0])
y_coord = np.linspace(-DomainSpaceTimeSize, +DomainSpaceTimeSize, SpaceTimeDensity[0])
z_coord = np.linspace(-DomainSpaceTimeSize, +DomainSpaceTimeSize, SpaceTimeDensity[0])
t_coord = np.linspace(0, +DomainSpaceTimeSize, SpaceTimeDensity[1])

dx=xr.DataArray(x_coord, dims='x')
dy=xr.DataArray(y_coord, dims='y')
dz=xr.DataArray(z_coord, dims='z')
dt=xr.DataArray(t_coord, dims='t')

CoorEnterFunc = lambda x, y, z, t: 1+0*x+0*y+0*z+0*t
# CoorData (the seed data for 'CoorSpace') is assumed to be defined here,
# e.g. by evaluating CoorEnterFunc over the coordinate grid

SimDataSet = xr.Dataset({'CoorSpace':(['x', 'y', 'z', 't'], CoorData)},
                        coords={'x':x_coord, 'y':y_coord, 'z':z_coord, 't':t_coord})
MasterShape=SimDataSet['CoorSpace'].shape; MasterShape

SimDataSet['Scaler']=(['x', 'y', 'z', 't'], scale_func(*np.meshgrid(x_coord, y_coord, z_coord, t_coord)))

# dealing with the non-broadcasting function
SimDataSet['vector']=(['x', 'y', 'z', 't'], np.zeros(MasterShape, dtype=object))
for i in Iter.product(*SimDataSet.indexes.values(), repeat=1):
    #print(i, vector_funcN(*i))
    SimDataSet['vector'].loc[dict(x=i[0], y=i[1], z=i[2], t=i[3])]=vector_funcN(*i)

SimDataSet.info()
```

I think there is definitely room for improvement, but I am now going to start implementing said EM example, and y'all's help should get me to step 4. Step 5 should be interesting (I am not jinxing myself here), and I will let y'all know when I get started on step 6 (I have done it before, but very ad hoc to say the least). I know holoviews works with xarray, but it is high time that yt starts really taking in data from sources other than ancient obscure astronomy Fortran programs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Use of Xarray instead of np.meshgrid (280899335)
id: 350840130 · html_url: https://github.com/pydata/xarray/issues/1773#issuecomment-350840130 · issue_url: https://api.github.com/repos/pydata/xarray/issues/1773 · node_id: MDEyOklzc3VlQ29tbWVudDM1MDg0MDEzMA== · user: GProtoZeroW (11997114) · created_at: 2017-12-11T19:59:20Z · updated_at: 2017-12-11T19:59:20Z · author_association: NONE

@shoyer the `gx, gy, gz, gt = xr.broadcast(dx, dy, dz, dt)` worked, thanks.
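A minimal sketch of what that broadcast call gives (the dx, dy, dz, dt DataArrays and the cosine scalar function mirror the code elsewhere in this thread; the 10-point/5-point grid sizes are just for illustration):

```
import numpy as np
import xarray as xr

# 1-D coordinate arrays, each along its own dimension
dx = xr.DataArray(np.linspace(-5, 5, 10), dims='x')
dy = xr.DataArray(np.linspace(-5, 5, 10), dims='y')
dz = xr.DataArray(np.linspace(-5, 5, 10), dims='z')
dt = xr.DataArray(np.linspace(0, 5, 5), dims='t')

# broadcast all four against each other: each result is 4-D with dims ('x', 'y', 'z', 't')
gx, gy, gz, gt = xr.broadcast(dx, dy, dz, dt)

# a NumPy-broadcasting scalar function can now be applied to the grid directly
scale_field = np.cos(1*gx + 2*gy + 3*gz - 4*gt)
print(scale_field.dims, scale_field.shape)  # ('x', 'y', 'z', 't') (10, 10, 10, 5)
```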

Long story short on lambdify: it converts a sympy expression to a string, does a bunch of string mappings from the expression string to numpy functions (with scipy support starting to come online), then tacks a lambda onto the front of that and passes it back to the user via the black magic of eval (there is ongoing work to make this a lot better, and it's a rabbit hole). lambdify also typically does some voodoo in the background to hide non-eval-friendly parts of the eval string behind a dummy variable. So, for example, if you define a variable like invmean (from exponential distributions) as invmean=symbols('lambda') and try to pass it with dummify=False in an expression, there are going to be two lambdas in the same string and eval crashes. And starting, I think, with version 1.0 of sympy, they defaulted to using numpy as the output template in all cases, so technically all outputs are essentially a lambda function acting on an array. [Bonus info: np.piecewise used to drive me crazy, but sympy's Piecewise is much more intuitive, so I just use sympy's, run lambdify on the sympy version, and then by the python gods I have a working numpy piecewise function.]
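A small sketch of the dummify point, using a symbol whose printed name is literally the Python keyword `lambda` (the invmean name is just the example from above; the dummify=False failure mode is the one described in this comment, for the sympy version being discussed):

```
import numpy as np
from sympy import symbols, exp, lambdify

# a symbol whose printed name collides with the Python keyword `lambda`
invmean = symbols('lambda')
expr = exp(-invmean)

# with the default dummification the keyword is hidden behind a dummy name,
# so the generated source is valid Python and evaluates happily on arrays
f = lambdify(invmean, expr)
print(f(np.array([0.5, 1.0, 2.0])))

# as described above, in the sympy version discussed here, forcing dummify=False
# put two `lambda`s into the generated source, which is what crashed eval
# lambdify(invmean, expr, dummify=False)
```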

So, in theory, there should not be this issue with using lambdify, but in practice it treats the [1:] parts of the passed-in matrix as plain float and not as np.float, which does broadcast. And there ends that excursion into the wonderland of sympy, with the moral of the story being that the output is a lambda function that does not always do what you want it to do.

So, back to the code at hand. I tried `xarray.apply_ufunc(vector_funcN, dx, dy, dz, dt)` and got the following error back: `ValueError: dimensions ('x', 'y', 'z', 't') must have the same length as the number of data dimensions, ndim=2`.

So, first off, would I want to apply this to gx instead of dx, dy, dz, dt, since gx now has the comp space embedded in it, like the end result of @fmaussion's fix? And secondly, I tried setting vectorize=True and got a setting-sequence error thrown back at me.
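For what it's worth, here is a minimal sketch of one way the vectorize route can be wired up; the point_func wrapper, the stand-in definitions, and the 'component' dimension name are illustrative assumptions rather than anything from the thread:

```
import numpy as np
import xarray as xr

# stand-ins mirroring the thread's definitions: a per-point function returning a
# (3, 1) column vector, and four 1-D coordinate DataArrays
vector_funcN = lambda x, y, z, t: np.array([[np.cos(1*x + 2*y + 3*z - 4*t)], [0.0], [0.0]])
dx = xr.DataArray(np.linspace(-5, 5, 10), dims='x')
dy = xr.DataArray(np.linspace(-5, 5, 10), dims='y')
dz = xr.DataArray(np.linspace(-5, 5, 10), dims='z')
dt = xr.DataArray(np.linspace(0, 5, 5), dims='t')

# wrap the per-point function so each call returns a flat length-3 vector
point_func = lambda x, y, z, t: np.asarray(vector_funcN(x, y, z, t)).ravel()

vector_field = xr.apply_ufunc(
    point_func, dx, dy, dz, dt,
    vectorize=True,                    # evaluate point by point via np.vectorize
    output_core_dims=[['component']],  # each call contributes a new length-3 'component' axis
)
print(vector_field.dims)  # ('x', 'y', 'z', 't', 'component')
```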

So, just so we're all on the same page:

1. Define the individual spacetime comp-domain axes and their ranges/step sizes.
2. Bind the individual domain axes together into one coherent spacetime grid stored in an xarray datacube.
3. Pass the xarray spacetime grid into other functions, such as scalar, vector, and tensor functions (with names and units), as new data_vars of a fleshed-out xr.Dataset.
4. Use the resulting Dataset to pass to such things as yt for volumetric rendering, or perform more processing on it.

Find ways to minimize np.meshgrid, since there are times when it does not work, and instead use xr.apply_ufunc, so that each spacetime comp-domain data point already stored in an xarray datacube is passed into the function and the results are then bound together into an xr.Dataset for further derived processing or visualization.

So if I am working with an electromagnetic example, the workflow would be:

1. Define the extent and step size in x, y, z, t.
2. Bind the individual dimensions into a cohesive computing domain stored in a basis xarray datacube.
3. Define the scalar electric potential function and the vector magnetic potential function (let's just say we are going to try to do this in sympy and then convert it with lambdify).
4. Find the numerical values of the electric and magnetic potentials on the spacetime grid and bind them with the coordinates into an xarray.Dataset (to keep things simple we're working in free space, but if material domains are present this would be the perfect use of xarray to store that information in the dataset parallel to the potentials and fields).
5. Find the electric and magnetic fields from the potentials and add them to the dataset.
6. Pass the final dataset over to yt for volumetric rendering.
7. Review the output with an expensive drink in hand, because that would have cost a mint to do in certain proprietary "languages".
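A bare-bones sketch of steps 1 through 4 of that workflow, using the xr.broadcast approach from above; the potential expressions here are made-up placeholders, not real electromagnetic potentials:

```
import numpy as np
import xarray as xr
import sympy as sp

# step 1: extent and step size in x, y, z, t
x_coord = np.linspace(-5, 5, 10)
y_coord = np.linspace(-5, 5, 10)
z_coord = np.linspace(-5, 5, 10)
t_coord = np.linspace(0, 5, 5)

# step 2: bind the individual axes into one broadcast spacetime grid
gx, gy, gz, gt = xr.broadcast(
    xr.DataArray(x_coord, dims='x'),
    xr.DataArray(y_coord, dims='y'),
    xr.DataArray(z_coord, dims='z'),
    xr.DataArray(t_coord, dims='t'),
)

# step 3: define the potentials in sympy and lambdify them (placeholder expressions only)
x, y, z, t = sp.symbols('x, y, z, t')
V_expr  = sp.cos(x + 2*y + 3*z - 4*t)   # placeholder scalar electric potential
Ax_expr = sp.sin(x + 2*y + 3*z - 4*t)   # placeholder x-component of the vector potential
V_num   = sp.lambdify((x, y, z, t), V_expr)
Ax_num  = sp.lambdify((x, y, z, t), Ax_expr)

# step 4: evaluate on the grid and bind the potentials and coordinates into one Dataset
em = xr.Dataset(
    {'V': V_num(gx, gy, gz, gt), 'Ax': Ax_num(gx, gy, gz, gt)},
    coords={'x': x_coord, 'y': y_coord, 'z': z_coord, 't': t_coord},
)
# steps 5-7 (derive E and B from the potentials, hand the Dataset to yt) would build on `em`
```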

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Use of Xarray instead of np.meshgrid (280899335)
id: 350818717 · html_url: https://github.com/pydata/xarray/issues/1773#issuecomment-350818717 · issue_url: https://api.github.com/repos/pydata/xarray/issues/1773 · node_id: MDEyOklzc3VlQ29tbWVudDM1MDgxODcxNw== · user: GProtoZeroW (11997114) · created_at: 2017-12-11T18:46:12Z · updated_at: 2017-12-11T18:46:12Z · author_association: NONE

@fmaussion Thank you, that's going to help with a lot of my problems for my computational work

@shoyer I am looking at the docs for xarray.broadcast() and xarray.apply_ufunc. Below is my attempt at trying what you said using broadcast (though I am most likely missing something), but broadcast only acts on two arrays at a time. Below is what I tried from what I think you're saying (an example would clarify this quickly):

```
import numpy as np
import xarray as xr

DomainSpaceTimeSize = 5 # using cartesian 4D
SpaceTimeDensity = [100, 5] # 100 divisions in space, 5 in time

x_coord = np.linspace(-DomainSpaceTimeSize, +DomainSpaceTimeSize, SpaceTimeDensity[0])
y_coord = np.linspace(-DomainSpaceTimeSize, +DomainSpaceTimeSize, SpaceTimeDensity[0])
z_coord = np.linspace(-DomainSpaceTimeSize, +DomainSpaceTimeSize, SpaceTimeDensity[0])
t_coord = np.linspace(0, +DomainSpaceTimeSize, SpaceTimeDensity[1])

dx=xr.DataArray(x_coord, dims='x')
dy=xr.DataArray(y_coord, dims='y')
dz=xr.DataArray(z_coord, dims='z')
dt=xr.DataArray(t_coord, dims='t')

ds, _=xr.broadcast(dx, dy)
ds, _=xr.broadcast(ds, dz)
ds, _=xr.broadcast(ds, dt)
ds
```

@shoyer your reply on using `xarray.apply_ufunc` is what I am looking for when I get broadcasting issues such as this one from using sympy's `lambdify` (for the sake of argument, let's stay out of the sympy lambdify rabbit hole and say this could have come from anywhere) on a vector function:

```
from sympy import *
init_printing()

x, y, z, t=symbols('x, y, z, t')
xDirVec=Matrix([1,0,0])

vector_func=cos(1*x+2*y+3*z-4*t)*xDirVec
vector_func

vector_funcN=lambdify((x, y, z, t), vector_func, dummify=False)

# data point test
vector_funcN(1,1,1,1)

# target comparison
vector_func_npRef=lambda x, y, z, t: np.array([np.cos(1*x+2*y+3*z-4*t), 0, 0])

# point test
vector_funcN(1,1,1,1)
# point test match test: pass

np_SpaceResult=vector_func_npRef(*np.meshgrid(x_coord, y_coord, z_coord, t_coord)).shape
sp_SpaceResult=vector_funcN(*np.meshgrid(x_coord, y_coord, z_coord, t_coord)).shape
# and as can be seen, the two shapes don't match
```

So then how would I use `xr.apply_ufunc` on the lambdify function `vector_funcN`? Again, a quick example would be much appreciated.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Use of Xarray instead of np.meshgrid (280899335)

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [performed_via_github_app] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
);
CREATE INDEX [idx_issue_comments_issue]
    ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
    ON [issue_comments] ([user]);