One-class classification (OCC) aims to train a classifier with data from the target class only and has attracted great attention for its strong applicability to real-world problems. Although many advances have been made in OCC, effective OCC loss functions for deep learning are still lacking. In this paper, a novel logarithmic barrier function based OCC loss (LBL), which assigns large gradients to margin samples and thus yields a more compact hypersphere, is first proposed by smoothly approximating the OCC objective. However, the optimization of LBL may be unstable, especially when samples lie on the boundary, where the loss tends to infinity. To address this issue, a unilateral relaxation Sigmoid function is then introduced into LBL, and a novel OCC loss named LBLSig is proposed. LBLSig can be seen as a fusion of the mean square error (MSE) and the cross entropy (CE), and its optimization is smoother owing to the unilateral relaxation Sigmoid function. The effectiveness of the proposed LBL and LBLSig is demonstrated experimentally in comparisons with several popular OCC algorithms on different network structures. The source code can be found at https://github.com/ML-HDU/LBL_LBLSig.
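To make the boundary-instability point concrete, the following is a minimal, hypothetical sketch of a logarithmic-barrier style one-class loss (the exact LBL formulation is in the paper; the function name `lbl_loss`, the fixed center/radius, and the hypersphere parameterization here are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def lbl_loss(features, center, radius, eps=1e-12):
    # Hypothetical sketch: penalize samples via a logarithmic barrier on the
    # squared margin to a hypersphere of the given center and radius.
    # The -log barrier assigns increasingly large gradients to samples close
    # to the boundary, encouraging a compact hypersphere.
    d2 = np.sum((features - center) ** 2, axis=1)  # squared distance to center
    margin = radius ** 2 - d2                      # positive inside the sphere
    # As margin -> 0 (a sample on the boundary), -log(margin) -> infinity,
    # which is exactly the optimization instability LBLSig is designed to avoid.
    return float(np.mean(-np.log(np.maximum(margin, eps))))
```

Under this sketch, a sample near the center incurs a small loss, while a sample near the boundary incurs a much larger one, illustrating why the margin samples dominate the gradient.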