Formula for the learning curve in neural networks?

When training a neural network, you often see a curve showing how quickly the network is learning (e.g. accuracy as a function of training time). It usually grows very fast at first and then slows down until it is almost horizontal. Is there a mathematical formula that matches these curves? Some curves with a similar shape are: $$y=1-e^{-x}$$ $$y=\frac{x}{1+x}$$ $$y=\tanh(x)$$ $$y=1+x-\sqrt{1+x^2}$$ Is there a theoretical reason for this shape?
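For illustration, the four candidate curves can be compared numerically; this is just a sketch showing that they all share the same qualitative behavior (start at 0, rise quickly, then saturate toward 1):

```python
import math

# Candidate saturating curves that resemble a typical training curve:
# a fast initial rise that flattens out toward an asymptote.
curves = {
    "1 - e^{-x}":          lambda x: 1 - math.exp(-x),
    "x / (1 + x)":         lambda x: x / (1 + x),
    "tanh(x)":             lambda x: math.tanh(x),
    "1 + x - sqrt(1+x^2)": lambda x: 1 + x - math.sqrt(1 + x * x),
}

for name, f in curves.items():
    # Each curve is 0 at x = 0 and approaches 1 as x grows.
    print(f"{name:22s} f(0)={f(0):.3f}  f(1)={f(1):.3f}  f(10)={f(10):.3f}")
```

All four are monotone increasing and bounded, which is why they look like typical learning curves; they differ mainly in how fast they approach the asymptote.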