Dropout-style feature-detector disabling that reduces overfitting and improves neural network generalization.
The method trains a neural network with a probabilistic switch that randomly disables feature detectors (units) in selected layers for each training example, then normalizes their contributions when the trained network is applied to test data. In effect, training samples an ensemble of many thinned sub-networks, typically governed by fixed drop probabilities, rather than fitting a single deterministic model. Previously, using this regularization strategy at scale often required licensing or access to proprietary training techniques from organizations like the assignee.
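As a rough illustration of the mechanism described above, here is a minimal NumPy sketch (all names, shapes, and the 0.5 drop rate are hypothetical, not taken from the patent): during training each unit is kept with a fixed probability, and at test time activations are scaled by that keep probability so their expected contributions match training.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, drop_prob, train=True):
    """Sketch of classic dropout (hypothetical helper, not the patented code).

    Training: each unit's activation is zeroed independently with
    probability `drop_prob`, sampling a fresh mask per example.
    Test: no units are dropped; activations are scaled by the keep
    probability to normalize their expected contribution.
    """
    keep_prob = 1.0 - drop_prob
    if train:
        mask = rng.random(x.shape) < keep_prob  # per-unit Bernoulli switch
        return x * mask
    return x * keep_prob  # test-time normalization

# Example: a hidden layer with 50% dropout (hypothetical shapes)
h = rng.standard_normal((4, 8))                       # batch of 4, 8 units
h_train = dropout_forward(h, drop_prob=0.5, train=True)
h_test = dropout_forward(h, drop_prob=0.5, train=False)  # scaled by 0.5
```

Modern frameworks usually implement the equivalent "inverted dropout", which divides by the keep probability during training so no test-time rescaling is needed; the ensemble-of-thinned-sub-networks effect is the same.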
Build generalization-focused image and text classifiers trained with dropout regularization, free of licensing fees.