html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,performed_via_github_app,issue
https://github.com/pydata/xarray/issues/7772#issuecomment-1519897098,https://api.github.com/repos/pydata/xarray/issues/7772,1519897098,IC_kwDOAMm_X85al8oK,123355381,2023-04-24T10:51:16Z,2023-04-24T10:51:16Z,NONE,Thank you @dcherian. I cannot reproduce this on `main`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1676561243
https://github.com/pydata/xarray/issues/7772#issuecomment-1517649648,https://api.github.com/repos/pydata/xarray/issues/7772,1517649648,IC_kwDOAMm_X85adX7w,123355381,2023-04-21T10:57:28Z,2023-04-21T10:57:28Z,NONE,"The first point that you mentioned does not seem to be correct. Please see the code below (we used a sparse matrix) and its output:
```
import numpy as np
import xarray as xa

def get_data():
    lat_dim = 7210
    lon_dim = 7440
    lat = [0] * lat_dim
    lon = [0] * lon_dim
    time = [0] * 5
    nlats, nlons, ntimes = lat_dim, lon_dim, 5
    var_1 = np.empty((ntimes, nlats, nlons))
    var_2 = np.empty((ntimes, nlats, nlons))
    var_3 = np.empty((ntimes, nlats, nlons))
    var_4 = np.empty((ntimes, nlats, nlons))
    # mostly-zero (sparse-like) data with two slices set to 1
    data_arr = np.random.uniform(low=0, high=0, size=(ntimes, nlats, nlons))
    data_arr[:, 0, :] = 1
    data_arr[:, :, 1] = 1
    var_1[:, :, :] = data_arr
    var_2[:, :, :] = data_arr
    var_3[:, :, :] = data_arr
    var_4[:, :, :] = data_arr
    dataset = xa.Dataset(
        data_vars={
            'var_1': (('time', 'lat', 'lon'), var_1),
            'var_2': (('time', 'lat', 'lon'), var_2),
            'var_3': (('time', 'lat', 'lon'), var_3),
            'var_4': (('time', 'lat', 'lon'), var_4)},
        coords={
            'lat': lat,
            'lon': lon,
            'time': time})
    print(sum(v.size * v.dtype.itemsize for v in dataset.variables.values()))
    print(dataset.nbytes)

if __name__ == ""__main__"":
    get_data()
```
```
8582901240
8582901240
```
As we can observe here, both `nbytes` and `self.size * self.dtype.itemsize` give the same size.
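The same identity can be checked at a small scale without allocating ~8 GB; a minimal sketch (assuming only NumPy and xarray, with tiny made-up dimensions) that compares the manual per-variable sum against `Dataset.nbytes`:

```python
import numpy as np
import xarray as xa

# Same structure as above, but tiny (3 times x 4 lats x 5 lons)
# so it runs in negligible memory.
var = np.zeros((3, 4, 5))
ds = xa.Dataset(
    data_vars={'var_1': (('time', 'lat', 'lon'), var)},
    coords={'time': [0] * 3, 'lat': [0] * 4, 'lon': [0] * 5})

# Manual sum over every variable, coordinates included,
# mirroring the computation in the larger example above.
manual = sum(v.size * v.dtype.itemsize for v in ds.variables.values())
print(manual, ds.nbytes)
```

For NumPy-backed variables the two numbers agree, since each variable's byte count is its element count times the element size.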
And for the second point, can you share any solution for `nbytes` on a `netCDF` or `grib` file? It takes too much memory and the process gets killed.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,1676561243