Abstract: The ability to detect out-of-distribution (OOD) samples is vital for ensuring the reliability of deep neural networks in real-world applications. Given the nature of OOD samples, detection methods should not have hyperparameters whose values must be tuned to the incoming OOD samples. However, most recently proposed methods do not meet this requirement, which compromises their performance in real-world applications. In this paper, we propose a simple, computationally efficient, hyperparameter-free method that uses cosine similarity. Although recent studies have shown its effectiveness for metric learning, it remains unclear whether cosine similarity also works well for OOD detection and, if so, why. We provide an intuitive explanation of why cosine similarity works better than the standard methods that use the maximum of the softmax outputs or the logits. In addition, several design choices in the output layer differ from the standard ones and are essential for achieving the best performance. We show through experiments that our method outperforms existing methods on the evaluation test recently proposed by Shafaei et al., which takes the above issue of hyperparameter dependency into account; it also achieves performance at least comparable to the state-of-the-art on the conventional test, in which all methods but ours are allowed to use explicit OOD samples to determine their hyperparameters.
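To make the contrast in the abstract concrete, below is a minimal sketch of the two scoring rules it compares: the standard maximum-softmax baseline and a cosine-similarity score computed between penultimate-layer features and the output layer's class weight vectors. The function names and the PyTorch framing are illustrative assumptions, not the paper's exact implementation; in particular, the abstract mentions additional output-layer design choices that this sketch omits.

```python
import torch
import torch.nn.functional as F

def max_softmax_score(logits: torch.Tensor) -> torch.Tensor:
    """Baseline score: maximum softmax probability per sample.

    logits: (batch, classes). Higher score = more in-distribution.
    """
    return F.softmax(logits, dim=1).max(dim=1).values

def max_cosine_score(features: torch.Tensor,
                     class_weights: torch.Tensor) -> torch.Tensor:
    """Cosine-based score: maximum cosine similarity between each sample's
    penultimate feature and the class weight vectors (assumed bias-free).

    features:      (batch, dim)    penultimate-layer activations
    class_weights: (classes, dim)  output-layer weight vectors
    Higher score = more in-distribution.
    """
    f = F.normalize(features, dim=1)        # unit-norm features
    w = F.normalize(class_weights, dim=1)   # unit-norm class weights
    cos = f @ w.t()                         # (batch, classes) cosine similarities
    return cos.max(dim=1).values
```

Under this reading, a sample would be flagged as OOD when its score falls below a threshold; because the cosine score involves no quantities estimated from OOD data, that threshold can be set from in-distribution validation data alone, which is the hyperparameter-free property the abstract emphasizes.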

Similar Papers

Bridging Adversarial and Statistical Domain Transfer via Spectral Adaptation Networks
Christoph Raab (FHWS)*, Philipp Väth (FHWS), Peter Meier (FHWS), Frank-Michael Schleif (FHWS)
Image Inpainting with Onion Convolutions
Shant Navasardyan (Picsart Inc.)*, Marianna Ohanyan (Picsart Inc.)
Channel Pruning for Accelerating Convolutional Neural Networks via Wasserstein Metric
Haoran Duan (University of Science and Technology of China (USTC))*, Hui Li (University of Science and Technology of China (USTC))