I have a large dataframe with 36 columns, and I want to calculate the correlation between two successive rows. I tried data.corrwith(data, axis=0, method='pearson', drop=False), which gives the wrong result: ('Amplitude', 0) 1.0 ('Amplitude', 1) 1.0 ('Amplitude', 2) 1.0 ('Amplitude', 3) 1.0 ('Amplitude', 4) 1.0 ('Amplitude', 5) 1.0 ('Amplitude', 6) 1.0 ('Amplitude', 7) 1.0 ('Amplitude', 8) 1.0 ('Amplitude', 9) 1.0 […]
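One possible approach (a sketch with a small hypothetical frame, not the asker's actual data): correlating a frame with itself along the columns trivially yields 1.0 everywhere, so to correlate each row with the *next* row, compare the frame with a row-shifted copy of itself using axis=1.

```python
import numpy as np
import pandas as pd

# Hypothetical small frame standing in for the 36-column data.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.random((5, 4)), columns=list("abcd"))

# corrwith with axis=1 correlates each row of df with the matching
# row of df.shift(-1); row i of the shifted frame is original row
# i+1, so entry i is corr(row_i, row_{i+1}).
succ_corr = df.corrwith(df.shift(-1), axis=1)
print(succ_corr)
```

The last entry is NaN, since the final row has no successor to correlate with.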


I get an error using gradient visualization with transfer learning in TF 2.0. The gradient visualization works on a model that does not use transfer learning. When I run my code, I get the error: assert str(id(x)) in tensor_dict, 'Could not compute output ' + str(x) AssertionError: Could not compute output Tensor("block5_conv3/Identity:0", shape=(None, 14, 14, […]

I have a 3D image, which is a numpy array of shape (1314, 489, 3). I want to calculate the mean RGB color value of the mask (the cob without the black background). Calculating the RGB value for the whole image is easy: print(np.mean(colormaskcutted, axis=(0, 1))) >> [186.18434633 88.89164511 46.32022921] But […]
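A minimal sketch of one way to do this, using a tiny hypothetical image in place of the (1314, 489, 3) array: build a boolean mask of non-black pixels and average only over those.

```python
import numpy as np

# Hypothetical stand-in for the real image: black background (all
# zeros) with a small colored region representing the cob.
img = np.zeros((4, 4, 3), dtype=float)
img[1:3, 1:3] = [200.0, 100.0, 50.0]

# 2-D boolean mask: True wherever any channel is non-zero.
mask = img.any(axis=2)

# Indexing with the 2-D mask yields an (N, 3) array of the masked
# pixels; the mean over axis 0 is the per-channel mean color.
mean_rgb = img[mask].mean(axis=0)
print(mean_rgb)
```

If the background is merely *near*-black rather than exactly zero, the mask condition would need a threshold instead of `any`.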

## Weighted average of dataframes

I have found some answers about averaging dataframes, but none that includes the treatment of weights. I have figured out a way to get the result I want (see title), but I wonder if there is a more direct way of achieving the same goal. What I do is: transform each dataframe into an array of […]

I am a beginner programmer. I am trying to plot two datasets together, but I have a problem with the size of each dataset and could not resolve it. Any help or links to relevant materials would be appreciated. Thank you. import numpy as np import matplotlib.pyplot as plt import os os.environ['PROJ_LIB'] = […]

I have a signal that I would like to upsample, e.g. x=np.array([8,1,2,3,4,5]) y=np.array([3,1,8,4,0,2]) As you can see, x has a non-constant sampling frequency. Furthermore, its values are not necessarily increasing. My original idea was to upsample x using the resampling function: x_new=scipy.signal.resample(x, N_points, t=None, axis=0, window=None) and then use linear interpolation to find […]
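One caveat worth noting: scipy.signal.resample assumes uniformly spaced samples, so it is not a natural fit for a non-uniform, non-monotonic x. A hedged alternative sketch is to treat (x, y) as a parametric curve and interpolate both arrays against a common sample-index parameter:

```python
import numpy as np

x = np.array([8, 1, 2, 3, 4, 5], dtype=float)
y = np.array([3, 1, 8, 4, 0, 2], dtype=float)

# Parameterize by sample index, which *is* monotonic, then upsample
# the parameter grid and linearly interpolate x and y separately.
t = np.arange(len(x))                    # original sample index
t_new = np.linspace(0, len(x) - 1, 21)   # upsampled parameter grid

x_new = np.interp(t_new, t, x)
y_new = np.interp(t_new, t, y)
```

The original samples are preserved exactly at integer values of the parameter, with linearly interpolated points in between.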

## multi-layer grayscale input in u-net

I have successfully trained a U-Net for the specific task of cell segmentation using a (256, 256, 1) grayscale input and a (256, 256, 1) binary label. I used zhixuhao's unet implementation in Keras (git repo here). What I am trying to do now is to train the same model using multiple grayscale layers as input. To make […]
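A minimal sketch of the data-side change (hypothetical arrays, not the actual cell images): stack the registered grayscale layers along a new last axis so each sample becomes (256, 256, N), and change the model's Input shape accordingly.

```python
import numpy as np

# Hypothetical: three registered grayscale layers of one sample.
layer1 = np.random.rand(256, 256)
layer2 = np.random.rand(256, 256)
layer3 = np.random.rand(256, 256)

# Stack along a new last axis so the sample is (256, 256, 3); the
# U-Net's Input(...) shape then changes from (256, 256, 1) to
# (256, 256, 3), while the (256, 256, 1) binary label is unchanged.
multi = np.stack([layer1, layer2, layer3], axis=-1)
print(multi.shape)  # (256, 256, 3)
```

Only the first convolution's weight shape depends on the channel count, so the rest of the architecture can stay as-is.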

I have a situation like Pandas Group Weighted Average of Multiple Columns, but where some values of one column are sometimes NaN. That is, I am doing the following: import pandas as pd import numpy as np df=pd.DataFrame({'category':['a','a','b','b'], 'var1':np.random.randint(0,100,4), 'var2':np.random.randint(0,100,4), 'weights':np.random.randint(0,10,4)}) df.loc[1,'var1']=np.nan df category var1 var2 weights 0 a 74.0 99 9 1 a NaN […]
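One way to make the grouped weighted average NaN-aware (a sketch using fixed hypothetical values in place of the random ones above): zero out the weight wherever the value is NaN, so missing rows drop out of both the numerator and the denominator.

```python
import numpy as np
import pandas as pd

# Fixed values standing in for the randint data in the question.
df = pd.DataFrame({
    "category": ["a", "a", "b", "b"],
    "var1": [74.0, np.nan, 13.0, 25.0],
    "var2": [99.0, 23.0, 11.0, 8.0],
    "weights": [9, 3, 2, 4],
})

def weighted_mean(g, col):
    # Weight is zeroed where the value is NaN, so that row
    # contributes to neither the numerator nor the denominator.
    w = g["weights"].where(g[col].notna(), 0.0)
    v = g[col].fillna(0.0)
    return (v * w).sum() / w.sum()

out = df.groupby("category").apply(
    lambda g: pd.Series({c: weighted_mean(g, c) for c in ["var1", "var2"]})
)
print(out)
```

For group "a", var1 averages only the single non-NaN row, while var2 uses both rows' weights.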

My code starts by taking 4 inputs (mainly from a text file, but I started with user input as a test): the x and y coordinates of a circle on a grid, the radius of said circle, and a number of points on the grid (this part works fine). Then, it creates a two […]

I have a dataset with id, event and metric columns: df = pd.DataFrame([['a','x',1], ['a','x',2], ['b','x',3], ['b','x',3], ['a','z',4], ['a','z',5], ['b','y',5]], columns = ['id','event','metric']) id event metric 0 a x 1 1 a x 2 2 b x 3 3 b x 3 4 a z 4 5 a z 5 6 b y 5 I […]