|
Abstract:
Large-margin Unified Machines (LUMs) have attracted wide attention in classification learning. LUMs are a family of large-margin classifiers that offer a unique transition from soft to hard classification. This paper studies an online binary classification learning algorithm based on the LUM loss function and samples drawn independently from non-identical distributions. Moreover, at each iteration of the online algorithm, the parameter of the LUM loss function decreases gradually as the iterations proceed. Under these assumptions, we derive the convergence rate of the online algorithm in a reproducing kernel Hilbert space (RKHS).
Key words: non-identically distributed samples; online classification algorithm; LUM loss function with varying parameter; reproducing kernel Hilbert space
Classification code: O29
|
ONLINE LUM CLASSIFICATION WITH VARYING THRESHOLDS AND NON-IDENTICAL SAMPLING DISTRIBUTIONS |
WANG Ze-xing
|
Abstract: |
Large-margin Unified Machines (LUMs) have been widely studied in classification. LUMs are a family of large-margin classifiers that offer a unique transition from soft to hard classification. In this paper, we investigate an online binary classification algorithm with the LUM loss function and non-identical sampling distributions, where at each step a sample is drawn independently from a different probability distribution. In particular, we consider the LUM loss function with varying thresholds, where the parameter of the loss function decreases as the iteration proceeds. A convergence analysis of the algorithm in a reproducing kernel Hilbert space (RKHS) is presented, and the learning rate of this general framework is obtained.
Key words: sampling with non-identical distributions; online classification; LUM loss with varying threshold; reproducing kernel Hilbert spaces
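
Since the abstract refers to the LUM loss and the online RKHS iteration only by name, the following Python sketch may help fix ideas. It assumes the standard LUM loss of Liu, Zhang and Wu (2011) with parameters a and c, together with a generic regularized kernel online gradient descent step; the step size eta_t, regularization parameter lam_t, and decreasing parameter c_t below are illustrative placeholders, not the schedules analyzed in the paper. The shrinking c_t only mimics the "varying threshold" setting in which the LUM parameter decreases along the iterations.

import numpy as np

def lum_loss(u, a=1.0, c=0.0):
    # Standard LUM loss V(u) with parameters a > 0, c >= 0, where u = y * f(x).
    thr = c / (1.0 + c)
    if u < thr:
        return 1.0 - u
    return (1.0 / (1.0 + c)) * (a / ((1.0 + c) * u - c + a)) ** a

def lum_loss_grad(u, a=1.0, c=0.0):
    # Piecewise derivative V'(u); continuous at the threshold c / (1 + c).
    thr = c / (1.0 + c)
    if u < thr:
        return -1.0
    return -(a / ((1.0 + c) * u - c + a)) ** (a + 1.0)

def online_lum_step(coefs, X_seen, x_t, y_t, kernel, eta_t, lam_t, a=1.0, c_t=0.0):
    # One regularized online gradient step in the RKHS.  The hypothesis is stored
    # as a kernel expansion f_t(x) = sum_i coefs[i] * kernel(X_seen[i], x).
    f_xt = sum(alpha * kernel(x_i, x_t) for alpha, x_i in zip(coefs, X_seen))
    # Gradient of the instantaneous regularized risk:
    #   V'(y_t f_t(x_t)) * y_t * K(x_t, .) + lam_t * f_t
    g = lum_loss_grad(y_t * f_xt, a=a, c=c_t) * y_t
    new_coefs = [(1.0 - eta_t * lam_t) * alpha for alpha in coefs] + [-eta_t * g]
    return new_coefs, X_seen + [x_t]

# Toy run: Gaussian kernel, decreasing step sizes and decreasing LUM parameter c_t.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    kernel = lambda x, z: float(np.exp(-np.sum((x - z) ** 2)))
    coefs, X_seen = [], []
    for t in range(1, 201):
        x_t = rng.normal(size=2)
        y_t = 1.0 if x_t[0] + x_t[1] > 0 else -1.0   # noiseless toy labels
        eta_t, lam_t, c_t = 0.5 / t ** 0.5, 1.0 / t ** 0.25, 1.0 / t
        coefs, X_seen = online_lum_step(coefs, X_seen, x_t, y_t, kernel,
                                        eta_t, lam_t, c_t=c_t)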