- Recent studies have shown that continuous dropout methods can be viewed as imposing a Bayesian interpretation on model parameters, though most such studies report results only for normal distributions. Because the posterior distributions over neural network nodes and parameters are intractable (they arise from an artificial construction intended to improve model performance rather than from observation), there is no justification for assuming that they are necessarily normal. In this paper, a unimodal, symmetric family called the generalized normal distribution, sometimes referred to as the exponential power distribution, is instantiated with various shape and scale parameter configurations. These instantiated distributions are tested as nodal representations in multilayer perceptrons trained on the MNIST and Fashion-MNIST datasets. The results show that the shape parameter of a generalized normal distribution has a statistically significant effect on the performance of a multilayer perceptron under continuous dropout on MNIST. The results also suggest, though not conclusively, that a Gaussian distribution is not necessarily optimal for continuous dropout on MNIST.
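The idea above can be sketched in code: draw multiplicative noise for each node from a generalized normal distribution centred at 1, where the shape parameter interpolates between Laplace-like (shape = 1) and Gaussian (shape = 2) noise. This is a minimal NumPy sketch, not the paper's implementation; the function names, the scale value, and the gamma-based sampling trick are assumptions for illustration.

```python
import numpy as np

def gennorm_sample(rng, shape_beta, loc=0.0, scale=1.0, size=None):
    """Sample from a generalized normal (exponential power) distribution
    with density f(x) proportional to exp(-(|x - loc| / scale)**shape_beta).
    shape_beta = 2 recovers a Gaussian; shape_beta = 1 recovers a Laplace."""
    # Standard construction: if G ~ Gamma(1/beta, 1), then
    # sign * scale * G**(1/beta) is generalized normal around 0.
    g = rng.gamma(1.0 / shape_beta, 1.0, size=size)
    sign = rng.choice([-1.0, 1.0], size=size)
    return loc + sign * scale * g ** (1.0 / shape_beta)

def continuous_dropout(h, rng, shape_beta=2.0, scale=0.5, training=True):
    """Continuous dropout: multiply activations by noise drawn from a
    generalized normal distribution with mean 1 (cf. Gaussian dropout)."""
    if not training:
        return h  # the noise has mean 1, so no rescaling at test time
    mask = gennorm_sample(rng, shape_beta, loc=1.0, scale=scale, size=h.shape)
    return h * mask

rng = np.random.default_rng(0)
h = np.ones((4, 8))                      # toy layer activations
out = continuous_dropout(h, rng, shape_beta=1.5, scale=0.3)
print(out.shape)                         # (4, 8)
```

Varying `shape_beta` while holding the scale fixed is what changes the tail weight of the nodal noise, which is the quantity whose effect the abstract reports as statistically significant.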