As the name suggests, dropout is about dropping something out. In deep learning, that something is the values passed from one layer to the next.

Question: Does dropout guarantee a performance improvement?
Answer: No, dropout does not guarantee a performance improvement; the results may be near identical. However, it is useful for preventing over-fitting and helps most when training very large networks.
REF: https://www.tensorflow.org/get_started/mnist/pros (see the last paragraph)

Can the dropout follow a specific pattern?

No. If we drop values out in a specific, regular pattern (say, always zeroing indices 0 to 10), the risk is that the model will infer the dropout pattern and readjust accordingly. Hence we need to drop the values passed from one layer to the next at random, drawing a fresh mask on every training step.

Size of dropout: a common choice is a rate of 0.5, i.e. set half of the values to zero, as in the sketch below.
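A minimal NumPy sketch of this idea (the dropout helper here is illustrative, not a framework API; real implementations also rescale the kept values, as shown further down):

```python
import numpy as np

rng = np.random.default_rng(seed=0)  # seeded only so the example is repeatable

def dropout(activations, rate=0.5):
    # Zero each value independently with probability `rate`.
    # A fresh random mask is drawn on every call, so the network
    # can never learn to anticipate which values will be dropped.
    mask = rng.random(activations.shape) >= rate
    return activations * mask

h = np.array([0.3, 1.2, 0.7, 2.1, 0.9, 1.5])
print(dropout(h))  # roughly half the values become zero
print(dropout(h))  # a different mask on the second call
```

Because the mask is redrawn on every call, no fixed pattern exists for the network to exploit.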

Fig: Dropout in action (Ref: Udacity)

Does dropout reduce the model training time?

Not really. Dropout drops activations between layers, not training examples, so the training set is not reduced. Its main benefit is reducing over-fitting; in practice the added noise often means the network needs more epochs to converge.

What if dropout does not increase the accuracy?

Then you should try a bigger network; dropout helps most when the model is large enough to over-fit in the first place.

Should you use dropout during evaluation and deployment? No, during training only.

During evaluation we want our model to be deterministic: the same input should always give the same output. Instead of dropping values at evaluation time, the activations are scaled so the next layer sees the same expected value it saw during training (with inverted dropout, that scaling is done at training time, so evaluation is a plain pass-through). See the sketch below.
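A minimal sketch of this train/eval split, using the TF 1.x-era API from the referenced tutorial (assuming TensorFlow 1.x; later versions replaced placeholders and sessions): feed keep_prob = 0.5 while training and 1.0 while evaluating.

```python
import numpy as np
import tensorflow as tf  # TF 1.x-era API, as in the referenced tutorial

x = tf.placeholder(tf.float32, shape=[None, 4])
keep_prob = tf.placeholder(tf.float32)  # 0.5 for training, 1.0 for evaluation
y = tf.nn.dropout(x, keep_prob)         # kept values are scaled by 1/keep_prob

with tf.Session() as sess:
    data = np.ones((1, 4), dtype=np.float32)
    # Training mode: random zeros; the survivors are scaled to 2.0 (= 1/0.5).
    print(sess.run(y, feed_dict={x: data, keep_prob: 0.5}))
    # Evaluation mode: deterministic, the input passes through unchanged.
    print(sess.run(y, feed_dict={x: data, keep_prob: 1.0}))
```

With keep_prob = 1.0 nothing is dropped and nothing is scaled, which is exactly the deterministic behaviour we want at evaluation time.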

Fig: Dropout is removed during evaluation (Ref: Udacity)
