Parameter Selection Method for Support Vector Regression Based on Adaptive Fusion of the Mixed Kernel Function

Table 1

Four types of kernel functions and their characteristics.

Kernel function

Characteristics

Linear kernel function: $K(x, x_i) = x \cdot x_i$.

It is a special case of the polynomial kernel function. It has few parameters and is fast [28].

Polynomial kernel function: $K(x, x_i) = (x \cdot x_i + c)^d$, where $c$ and $d$ are the kernel parameters and satisfy the conditions $c \ge 0$ and $d \in \mathbb{N}$.

It is a global kernel function, and it reduces to the linear kernel function when $d = 1$ and $c = 0$. The greater the value of $d$, the higher the dimension of the mapping and the greater the amount of computation. When $d$ is too large, the complexity of the learning machine also increases, the generalization ability of the support vector regression is reduced, and overfitting is easily introduced [29].

Gauss kernel function (RBF kernel): $K(x, x_i) = \exp\left(-\|x - x_i\|^2 / (2\sigma^2)\right)$, where $\sigma > 0$.

The RBF kernel is a strongly local kernel function, and its extrapolation ability decreases as the parameter $\sigma$ increases. Compared with other kernel functions, the Gauss kernel requires only a single parameter to be determined, so constructing the kernel model is relatively simple. For these reasons, the RBF kernel is currently the most widely used one [30].

Sigmoid kernel function: $K(x, x_i) = \tanh(\gamma (x \cdot x_i) + c)$, where $\gamma$ and $c$ are the kernel parameters and satisfy the conditions $\gamma > 0$ and $c < 0$.

With this kernel, the theoretical basis of support vector regression guarantees that the global optimum is found rather than a local minimum. It also ensures that overlearning does not occur, owing to good generalization ability on unknown samples [31].
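The four kernels in Table 1 can be sketched as plain functions. The formulas below are the standard textbook forms; since the original equations were lost, the exact parameterization (the names $c$, $d$, $\gamma$, $\sigma$ and their defaults) is an assumption, not necessarily the paper's own notation.

```python
import math

def dot(x, xi):
    # Inner product of two equal-length vectors (lists of floats).
    return sum(a * b for a, b in zip(x, xi))

def linear_kernel(x, xi):
    # Linear kernel: K(x, x_i) = x . x_i
    return dot(x, xi)

def polynomial_kernel(x, xi, c=1.0, d=2):
    # Polynomial kernel: K(x, x_i) = (x . x_i + c)^d, with c >= 0 and integer d >= 1.
    # With d = 1 and c = 0 this reduces to the linear kernel.
    return (dot(x, xi) + c) ** d

def rbf_kernel(x, xi, sigma=1.0):
    # Gauss (RBF) kernel: K(x, x_i) = exp(-||x - x_i||^2 / (2 * sigma^2)), sigma > 0.
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, xi))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

def sigmoid_kernel(x, xi, gamma=0.5, c=-1.0):
    # Sigmoid kernel: K(x, x_i) = tanh(gamma * (x . x_i) + c), gamma > 0, c < 0.
    return math.tanh(gamma * dot(x, xi) + c)

x, xi = [1.0, 2.0], [2.0, 0.5]
print(linear_kernel(x, xi))      # 1*2 + 2*0.5 = 3.0
print(polynomial_kernel(x, xi))  # (3 + 1)^2 = 16.0
print(rbf_kernel(x, xi))         # exp(-3.25 / 2)
print(sigmoid_kernel(x, xi))     # tanh(0.5*3 - 1) = tanh(0.5)
```

Note how the RBF value depends only on the distance between the two points (a local kernel), while the linear and polynomial values grow with the inner product itself (global kernels), which matches the distinction drawn in the table.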
