I intend to implement a CNN-like, progressively expanded neural network in Keras. The basic idea is that the first input node can be decomposed into multiple nodes with different orders and coefficients; each term is then progressively added to form new neurons. Decomposing a single node into multiple ones can generate the different non-linear connections that […]
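To make the decomposition idea concrete, here is a minimal toy sketch under my own assumptions (a single scalar input, approximation order 3, per-term branch width 4 — none of this is a confirmed design): the input node is split into power nodes x, x², x³ with `Lambda` layers, each power term gets its own trainable `Dense` weights, and the branches are summed back into one node.

```python
# Toy sketch (assumed design, for illustration only): decompose one input
# node into power terms x, x^2, ..., x^P, weight each term with its own
# Dense layer, and sum the branches back into a single node.
from tensorflow.keras.layers import Input, Dense, Lambda, add
from tensorflow.keras.models import Model

P = 3  # approximation order (assumed value)
inp = Input(shape=(1,))
# p=p pins the loop variable; otherwise every Lambda would close over the last p
powers = [Lambda(lambda t, p=p: t ** p)(inp) for p in range(1, P + 1)]
branches = [Dense(4)(term) for term in powers]  # per-term weights (width 4 assumed)
out = Dense(1)(add(branches))
model = Model(inp, out)
```

Note the `p=p` default argument in the `Lambda`: without it, Python's late binding would make every branch compute the same (last) power.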

The above computational graph is based on a feedforward neural network. Its mathematical representation is roughly

y ≈ Σ_{p=1}^{P} α_p · w_p · x^p

where w is a weight, p is the power, and α (alpha) is the coefficient of each term. The Taylor series is an approximation function, but the decomposing node is not quite intuitive to me in terms of implementation.

Here is my data preparation for CIFAR-10:

```python
import numpy as np
from keras.models import Model, Sequential
from keras.layers import Input, Dense, Activation, Conv2D, MaxPooling2D, Dropout, Flatten, add
from keras.datasets import cifar10
from keras.utils import to_categorical

(train_imgs, train_label), (test_imgs, test_label) = cifar10.load_data()
output_class = np.unique(train_label)
n_class = len(output_class)

nrows_tr, ncols_tr, ndims_tr = train_imgs.shape[1:]
nrows_ts, ncols_ts, ndims_ts = test_imgs.shape[1:]
train_data = train_imgs.reshape(train_imgs.shape[0], nrows_tr, ncols_tr, ndims_tr)
test_data = test_imgs.reshape(test_imgs.shape[0], nrows_ts, ncols_ts, ndims_ts)
input_shape = (nrows_tr, ncols_tr, ndims_tr)

train_data = train_data.astype('float32')
test_data = test_data.astype('float32')
train_data /= 255  # note: //= (integer division) here would zero out the data
test_data /= 255
```

My baseline CNN for this dataset looks like:

```python
model = Sequential()
model.add(Conv2D(filters=32, kernel_size=3, padding='same',
                 activation='relu', input_shape=input_shape))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(n_class, activation='softmax'))
```

and this is my naive attempt to realize the attached computational graph, but it didn't work:

```python
def pown(x, n):
    return x ** n

def expandable_cnn(input_shape, output_shape, approx_order):
    inputs = Input(shape=(input_shape))
    x = Dense(input_shape)(inputs)
    y = Dense(output_shape)(x)
    for i in range(2, approx_order + 1):
        y = add([y, Dense(output_shape)(Activation(lambda x: pown(x, n=i))(x))])
    y = Dense(n_class, activation='softmax')(y)
    model = Model(inputs=inputs, outputs=y)
    return model
```

I really don't know what the right way of decomposing the input node into multiple ones is. Can anyone point out how to make this happen? Any possible thoughts?
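For reference, here is a runnable sketch of how I imagine the expansion could be wired for CIFAR-10. The assumptions are mine, not a confirmed fix: the image is flattened first, each power term x^p feeds its own `Dense` projection to the class logits, and the branches are summed before the softmax.

```python
# Sketch under stated assumptions: flatten the input, build power-term
# branches x^1 ... x^P, project each to class logits with its own Dense
# layer (the per-term weights), sum the branches, then apply softmax.
from tensorflow.keras.layers import Input, Flatten, Dense, Lambda, Activation, add
from tensorflow.keras.models import Model

def expandable_net(input_shape, n_class, approx_order):
    inputs = Input(shape=input_shape)
    flat = Flatten()(inputs)
    # p=p pins the loop variable so each branch computes a distinct power
    terms = [Lambda(lambda t, p=p: t ** p)(flat)
             for p in range(1, approx_order + 1)]
    logits = add([Dense(n_class)(t) for t in terms])
    outputs = Activation('softmax')(logits)
    return Model(inputs, outputs)

model = expandable_net((32, 32, 3), n_class=10, approx_order=3)
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
```

One design caveat worth flagging: raw pixel values raised to higher powers can explode numerically, so normalizing inputs to [0, 1] (as in the preprocessing above) matters more here than in a plain CNN.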