In a high-dimensional feature space, the decision hyperplane of a support-vector-style learning machine tends to pass through the origin, so the bias term b is not needed. However, a bias is present in v-support vector regression (v-SVR). To study the role of the bias in v-SVR, an optimization formulation of v-SVR without bias is proposed, and a method for solving the corresponding dual optimization problem is presented. Experimental results on benchmark data sets show that the generalization ability of v-SVR without bias is better than that of standard v-SVR. Based on an analysis of the solution space of the dual optimization problem, it is concluded that the bias b should not be included in the optimization formulation of v-SVR, and that the regression hyperplane of v-SVR should pass through the origin.
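To make the bias-free formulation concrete, the following is a minimal sketch (not the paper's solver) of the dual of v-SVR without bias: maximize -1/2 (α-α*)ᵀK(α-α*) + yᵀ(α-α*) subject to 0 ≤ α_i, α*_i ≤ C/l and Σ(α_i+α*_i) ≤ Cν, where the usual equality constraint Σ(α_i-α*_i) = 0 disappears because there is no bias b. It uses projected gradient ascent with a heuristic multiplicative rescaling to restore the sum constraint; the toy data, step size, and rescaling rule are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data whose target passes through the origin (no bias needed).
X = rng.uniform(-1, 1, size=(40, 2))
w_true = np.array([1.5, -0.7])           # assumed ground-truth weights
y = X @ w_true + 0.05 * rng.standard_normal(40)

# Linear kernel; without bias the decision function is f(x) = sum_i beta_i k(x_i, x).
K = X @ X.T

C, nu = 10.0, 0.5
l = len(y)
ub = C / l                               # box bound on each alpha_i, alpha*_i

alpha = np.zeros(l)
alpha_star = np.zeros(l)
eta = 0.01                               # assumed step size

for _ in range(5000):
    beta = alpha - alpha_star
    g = y - K @ beta                     # gradient of the dual w.r.t. alpha
    alpha += eta * g                     # ascent step; gradient w.r.t. alpha* is -g
    alpha_star -= eta * g
    # Project onto the box 0 <= alpha_i, alpha*_i <= C/l.
    np.clip(alpha, 0.0, ub, out=alpha)
    np.clip(alpha_star, 0.0, ub, out=alpha_star)
    # Heuristic feasibility step (not an exact projection) for sum(alpha+alpha*) <= C*nu:
    s = alpha.sum() + alpha_star.sum()
    if s > C * nu:
        scale = C * nu / s
        alpha *= scale
        alpha_star *= scale

beta = alpha - alpha_star
rmse = float(np.sqrt(np.mean((y - K @ beta) ** 2)))
print("training RMSE:", rmse)
```

Note that with no bias there is no equality constraint to maintain, which is what allows this simple coordinate-free projection; in standard v-SVR the constraint Σ(α_i-α*_i) = 0 would have to be enforced at every step.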