A class-weighted dual v-support vector machine, termed WD v-SVM, is proposed, and the Karush-Kuhn-Tucker (KKT) conditions for solving it are derived. Theoretical analysis shows that the parameters v+ and v- of WD v-SVM have physical meanings analogous to those of v in v-SVM: each is an upper bound on the fraction of bounded (margin-error) support vectors and a lower bound on the fraction of support vectors in the weighted positive or negative class, respectively, which aids parameter selection in classification tasks. Moreover, the classification performance on a small-sample class can be improved by adjusting its class weight. Experimental results show that WD v-SVM retains the advantage of v-SVM, namely parameters with a clear physical meaning, while resolving the biased classification error that v-SVM suffers on class-imbalanced data.
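The core idea of countering class imbalance by up-weighting the small class can be sketched with a toy weighted hinge-loss linear classifier trained by subgradient descent. This is only an illustration of the class-weighting mechanism, not the paper's WD v-SVM formulation (no v parameters, no dual solver); all function names and the synthetic data are invented for the sketch:

```python
import random

def train_weighted_svm(X, y, class_weight, lam=0.01, lr=0.05, epochs=300):
    """Minimize lam*||w||^2 + (1/n) * sum_i c_{y_i} * max(0, 1 - y_i*(w.x_i + b)),
    where c_{y_i} is the weight of sample i's class."""
    dim, n = len(X[0]), len(X)
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        # gradient of the regularizer
        gw, gb = [2.0 * lam * wj for wj in w], 0.0
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1.0:  # hinge is active: add the weighted subgradient
                c = class_weight[yi]
                for j in range(dim):
                    gw[j] -= c * yi * xi[j] / n
                gb -= c * yi / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

random.seed(0)
# Imbalanced toy data: 200 negatives near (-1,-1), only 20 positives near (+1,+1)
neg = [[random.gauss(-1, 1), random.gauss(-1, 1)] for _ in range(200)]
pos = [[random.gauss(+1, 1), random.gauss(+1, 1)] for _ in range(20)]
X, y = neg + pos, [-1] * 200 + [+1] * 20

plain = train_weighted_svm(X, y, {-1: 1.0, +1: 1.0})    # unweighted baseline
boosted = train_weighted_svm(X, y, {-1: 1.0, +1: 10.0})  # up-weight small class

recall_pos = lambda model: sum(predict(*model, x) == 1 for x in pos) / len(pos)
print(recall_pos(plain), recall_pos(boosted))
```

Raising the weight of the minority class makes its hinge violations costlier, which pushes the decision boundary toward the majority class and typically raises minority-class recall, mirroring the role of class weights in WD v-SVM.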