### How to find correlation between two time series using machine learning

I want to find the correlation between two multivariate time series using machine learning (ML) methods. I want to know: 1 – which ML methods can be used for this task in general? 2 – if I use neural networks, the input and output shapes would be (values at time steps, number of…
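Before reaching for an ML model, the linear relationship between two multivariate series can be measured directly as a baseline. A minimal NumPy sketch (the array names, shapes, and the `(time_steps, n_channels)` layout are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two multivariate series, shape (time_steps, n_channels) -- assumed layout.
T, C = 500, 3
a = rng.normal(size=(T, C))
b = 0.8 * a + 0.2 * rng.normal(size=(T, C))  # b is deliberately correlated with a

def channel_corr(a, b):
    # Pearson correlation between matching channels of a and b
    a0 = a - a.mean(axis=0)
    b0 = b - b.mean(axis=0)
    return (a0 * b0).sum(axis=0) / (
        np.sqrt((a0 ** 2).sum(axis=0)) * np.sqrt((b0 ** 2).sum(axis=0))
    )

r = channel_corr(a, b)
print(r)  # one correlation coefficient per channel, near 1 here
```

For a more ML-flavored treatment of correlation between two multivariate blocks, canonical correlation analysis (e.g. `sklearn.cross_decomposition.CCA`) finds maximally correlated linear projections of the two series.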

### What are evolutionary algorithms for topology and weights evolving of ANN (TWEANN) other than NEAT?

I wonder if there are approaches other than NEAT to evolving the architectures and weights of artificial neural networks. To be more specific: I am looking for projects/frameworks/libraries that use evolutionary/genetic algorithms to simultaneously evolve the topology and train the weights of ANNs, other than the NEAT approach. By ‘other’ I mean similar to NEAT but not based…

### Which deep learning models are suitable for image-to-image mapping?

I am working on a problem in which I need to train a neural network to map one or more input images to one or more output images (one channel per image). Below I report some examples of input & output. In this case I report 1 input and 1 output image, but I may need to pass…

### How can I implement the derivative of the Softmax function for matrices in Python?

I have trouble understanding how to implement the derivative of the Softmax function. Here is what I tried:

    def Softmax(x):
        e_x = np.exp(x - np.max(x))
        return e_x / e_x.sum()

    def d_Softmax(X):
        x = Softmax(X)
        s = x.reshape(-1, 1)
        return np.diagflat(s) - np.dot(s, s.T)

I am not sure if it works as it should. Normal softmax function when taking [m x n] matrix…
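The question's snippet flattens its input, so for an [m x n] matrix it mixes all rows into one Jacobian. One way to handle matrices (treating each row as a separate sample is an assumption about the intended behavior) is to compute one n×n Jacobian per row, and to verify it against finite differences:

```python
import numpy as np

def softmax(x):
    # row-wise, numerically stable softmax for a 2-D array
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def softmax_jacobian(x):
    # one (n, n) Jacobian per row: J_i = diag(s_i) - s_i s_i^T
    s = softmax(x)  # shape (m, n)
    n = x.shape[-1]
    return (np.einsum('ij,jk->ijk', s, np.eye(n))
            - np.einsum('ij,ik->ijk', s, s))

X = np.array([[1.0, 2.0, 3.0],
              [0.5, -1.0, 0.0]])
J = softmax_jacobian(X)  # shape (2, 3, 3)

# finite-difference check of the first row's Jacobian
eps = 1e-6
num = np.zeros((3, 3))
for k in range(3):
    d = np.zeros(3)
    d[k] = eps
    num[:, k] = (softmax(X[0:1] + d)[0] - softmax(X[0:1] - d)[0]) / (2 * eps)
print(np.allclose(J[0], num, atol=1e-6))
```

The finite-difference comparison is a cheap way to catch sign or shape mistakes in any hand-written Jacobian.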

### The role of probability in supervised learning

In the literature and textbooks, one often sees supervised learning expressed as a conditional probability, e.g., $\rho(\vec{y}\,|\,\vec{x},\vec{\theta})$, where $\vec{\theta}$ denotes a learned set of network parameters, $\vec{x}$ is an arbitrary input, and $\vec{y}$ is an arbitrary output. If we assume we have already learned $\vec{\theta}$, then, in words, $\rho(\vec{y}\,|\,\vec{x},\vec{\theta})$ is…
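This can be made concrete: for a classifier with a softmax output layer, the network literally computes $\rho(\vec{y}\,|\,\vec{x},\vec{\theta})$ as a vector of class probabilities. A minimal NumPy sketch (the weight matrix `W` and bias `b` stand in for $\vec{\theta}$ and are random assumptions, not learned values):

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 3))  # theta: 4 input features -> 3 classes (assumed)
b = np.zeros(3)

def p_y_given_x(x, W, b):
    # softmax over class scores yields a proper conditional distribution:
    # non-negative entries that sum to 1
    z = x @ W + b
    e = np.exp(z - z.max())
    return e / e.sum()

x = rng.normal(size=4)
p = p_y_given_x(x, W, b)
print(p, p.sum())
```

The point of the probabilistic reading is exactly this: for a fixed $\vec{x}$ and fixed $\vec{\theta}$, the network's output is a distribution over possible $\vec{y}$, not a single hard label.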

### Is the temperature equal to epsilon in Reinforcement Learning?

This is a piece of code from my homework.

    # action policy: implements epsilon greedy and softmax
    def select_action(self, state, epsilon):
        qval = self.qtable[state]
        prob = []
        if (self.softmax):
            # use Softmax distribution
            prob = sp.softmax(qval / epsilon)
            #print(prob)
        else:
            # assign equal value to all actions
            prob = np.ones(self.actions) * epsilon / (self.actions - 1)…
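The homework code reuses the `epsilon` argument as the softmax temperature, but the two parameters play different roles: temperature scales how peaked the softmax distribution is, while epsilon is the probability mass an epsilon-greedy policy spreads over exploratory actions. A sketch contrasting the two (function names and the example Q-values are assumptions, not part of the homework API):

```python
import numpy as np

qval = np.array([1.0, 2.0, 4.0])  # example action values

def softmax_policy(qval, temperature):
    # low temperature -> nearly greedy; high temperature -> nearly uniform
    z = qval / temperature
    e = np.exp(z - z.max())
    return e / e.sum()

def epsilon_greedy_policy(qval, epsilon):
    # epsilon of the probability mass is spread uniformly over all actions,
    # the rest goes to the greedy action
    n = len(qval)
    prob = np.full(n, epsilon / n)
    prob[np.argmax(qval)] += 1.0 - epsilon
    return prob

print(softmax_policy(qval, temperature=0.5))   # sharply peaked on action 2
print(softmax_policy(qval, temperature=10.0))  # close to uniform
print(epsilon_greedy_policy(qval, epsilon=0.1))
```

Only in the limit of temperature going to zero does the softmax policy approach the greedy policy, and no temperature value reproduces the uniform-plus-greedy mixture of epsilon-greedy, so the two parameters are not interchangeable even though this code passes one value for both.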