issue_comments: 350840130

html_url: https://github.com/pydata/xarray/issues/1773#issuecomment-350840130
issue_url: https://api.github.com/repos/pydata/xarray/issues/1773
id: 350840130
node_id: MDEyOklzc3VlQ29tbWVudDM1MDg0MDEzMA==
user: 11997114
created_at: 2017-12-11T19:59:20Z
updated_at: 2017-12-11T19:59:20Z
author_association: NONE

@shoyer the `gx, gy, gz, gt = xr.broadcast(dx, dy, dz, dt)` worked, thanks.
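For reference, a minimal sketch of what that broadcast call does (the axis sizes here are made up for illustration):

```python
import numpy as np
import xarray as xr

# Hypothetical 1-D arrays, one per spacetime axis (sizes are arbitrary)
dx = xr.DataArray(np.linspace(0.0, 1.0, 3), dims="x")
dy = xr.DataArray(np.linspace(0.0, 1.0, 4), dims="y")
dz = xr.DataArray(np.linspace(0.0, 1.0, 5), dims="z")
dt = xr.DataArray(np.linspace(0.0, 1.0, 2), dims="t")

# xr.broadcast expands every input against the union of all dimensions,
# so each output is a full 4-D view of the spacetime grid
gx, gy, gz, gt = xr.broadcast(dx, dy, dz, dt)
print(gx.dims, gx.shape)  # ('x', 'y', 'z', 't') (3, 4, 5, 2)
```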

Long story short on lambdify: it converts a sympy expression to a string, does a bunch of string mappings from the expression string to numpy (and, starting to come online, scipy) functions, tacks a lambda onto the front of that, and passes it back to the user via the black magic of eval (there is ongoing work to make this a lot better, and it's a rabbit hole). lambdify also typically does some voodoo in the background to hide non-eval-friendly parts of the eval string in a dummy variable. So for example, if you define a variable like invmean (from exponential distributions) as invmean = symbols('lambda') and try to pass it with dummify=False in an expression, there are going to be two lambdas in the same string and eval crashes. And starting, I think, with version 1.0 of sympy, they defaulted to using numpy as the output template in all cases, so technically all outputs are essentially a lambda function acting on an array. [Bonus info: np.piecewise used to drive me crazy, but sympy's Piecewise is much more intuitive, so I just use sympy's, run lambdify on the sympy version, and by the python gods I have a working numpy piecewise function.]
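The Piecewise trick described above can be sketched like this (the expression itself is just a toy example):

```python
import numpy as np
import sympy as sp

x = sp.symbols("x")

# sympy's Piecewise reads much more naturally than np.piecewise
expr = sp.Piecewise((0, x < 0), (x**2, True))

# lambdify maps the expression onto numpy functions and evals a lambda,
# so the result works elementwise on arrays
f = sp.lambdify(x, expr, modules="numpy")
vals = f(np.array([-1.0, 0.5, 2.0]))  # zero below 0, x**2 elsewhere
```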

So, in theory, there should not be this issue with using lambdify, but in practice it is treating the [1:] parts of the passed-in matrix as plain float and not as np.float, which supports broadcasting. And there ends that excursion into the wonderland of sympy, with the moral of the story being that the output is a lambda function that does not always do what you want it to do.

So back to the code at hand. I tried xarray.apply_ufunc(vector_funcN, dx, dy, dz, dt) and got the following error back: ValueError: dimensions ('x', 'y', 'z', 't') must have the same length as the number of data dimensions, ndim=2
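For comparison, here is a sketch (toy stand-in function, made-up axis sizes) where apply_ufunc works on the 1-D inputs directly, since apply_ufunc broadcasts its inputs itself; the error above presumably comes from the lambdified function's own shape handling rather than from apply_ufunc:

```python
import numpy as np
import xarray as xr

def vector_funcN(x, y, z, t):
    # hypothetical stand-in for the lambdified function;
    # acts elementwise via numpy ufuncs, so it broadcasts cleanly
    return np.sin(x) * np.cos(y) + z * t

dx = xr.DataArray(np.linspace(0.0, 1.0, 3), dims="x")
dy = xr.DataArray(np.linspace(0.0, 1.0, 4), dims="y")
dz = xr.DataArray(np.linspace(0.0, 1.0, 5), dims="z")
dt = xr.DataArray(np.linspace(0.0, 1.0, 2), dims="t")

# apply_ufunc broadcasts the 1-D inputs against each other before
# calling the function, so no explicit xr.broadcast step is needed
out = xr.apply_ufunc(vector_funcN, dx, dy, dz, dt)
print(out.dims)  # ('x', 'y', 'z', 't')
```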

So, first off, would I want to apply gx instead of dx, dy, dz, dt, since gx now has the comp space embedded in it, like the end result of @fmaussion's fix? And secondly, I tried setting vectorize=True and got a sequence error thrown back at me.

So, just so we're all on the same page:

1. Define the individual spacetime comp domain axes and their ranges/step sizes.
2. Bind the individual domain axes into one xarray datacube to create one coherent spacetime grid.
3. Pass the xarray spacetime grid into other functions, such as scalar, vector, and tensor functions (with names and units), as new data_vars of a flushed-out xr.Dataset.
4. Use the resulting Dataset to pass to such things as yt for volumetric rendering, or perform more processing on it.
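The four steps above might be sketched like this (axes, sizes, and the scalar function are all made up for illustration):

```python
import numpy as np
import xarray as xr

# 1. define each axis and its range/step size
coords = {
    "x": np.linspace(-1.0, 1.0, 8),
    "y": np.linspace(-1.0, 1.0, 8),
    "z": np.linspace(-1.0, 1.0, 8),
    "t": np.linspace(0.0, 1.0, 4),
}

# 2. bind the axes into one coherent grid: a Dataset carrying only coords
grid = xr.Dataset(coords=coords)

# 3. evaluate a toy scalar function on the grid; the 1-D coordinate
#    arrays broadcast against each other, so no np.meshgrid is needed
phi = grid["x"] * grid["y"] + grid["z"] * grid["t"]
ds = grid.assign(phi=phi)
ds["phi"].attrs["units"] = "V"  # names and units ride along as attrs

# 4. ds is now ready for further processing or a renderer like yt
print(ds["phi"].dims)  # ('x', 'y', 'z', 't')
```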

Find ways to minimize np.meshgrid, since there are times when it does not work, and instead use xr.apply_ufunc, so that each spacetime comp domain data point already stored in an xarray datacube is passed into the function and the results are then bound together into an xr.Dataset for further derived processing or visualization.

So if I am working with an electromagnetic example, the workflow would be:

1. Define the extent and step size in x, y, z, t.
2. Bind the individual dimensions into a cohesive computing domain stored in a basis xarray datacube.
3. Define the scalar electric potential function and the vector magnetic potential function (let's just say we are going to try to do this in sympy and then convert it with lambdify).
4. Find the numerical values of the electric and magnetic potentials on the spacetime grid and bind them with the coordinates into an xarray.Dataset (to keep things simple we're working in free space, but if material domains are present this would be the perfect use of xarray to store that information in the dataset parallel to the potentials and fields).
5. Find the electric and magnetic fields from the potentials and add them to the dataset.
6. Pass the final dataset over to yt for volumetric rendering.
7. Review the output with an expensive drink in hand, because that would have cost a mint to do in certain proprietary "languages".
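Step 5 (fields from potentials) can be sketched numerically with DataArray.differentiate (available in newer xarray); the potential here is a toy linear stand-in, not a real electromagnetic solution, and the grid is 2-D only to keep it short:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(coords={
    "x": np.linspace(-1.0, 1.0, 16),
    "y": np.linspace(-1.0, 1.0, 16),
})

# toy electric potential V(x, y) = x * y (hypothetical, free space)
ds["V"] = ds["x"] * ds["y"]

# E = -grad(V), component by component, via central differences
ds["Ex"] = -ds["V"].differentiate("x")
ds["Ey"] = -ds["V"].differentiate("y")
# for V = x*y this gives Ex = -y and Ey = -x exactly,
# since finite differences are exact for linear functions
```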
