The sequential minimal optimization (SMO) algorithm is an effective method for training large-scale support vector machines (SVMs). Existing SMO algorithms for support vector regression (SVR) must determine which quadrant four Lagrange multipliers lie in, which complicates their implementation. Moreover, these algorithms all assume that the kernel function is positive definite or positive semi-definite, which limits their applicability. To address these deficiencies, a simplified SMO algorithm for SVR is proposed and applied to solving ε-SVR with non-positive kernels. Unlike existing algorithms, the proposed algorithm considers only two Lagrange multipliers in its implementation, obtained by expanding the original dual program of ε-SVR and solving its KKT conditions; it can therefore be applied readily to ε-SVR with non-positive kernels. The algorithm is evaluated on a benchmark problem. Compared with existing algorithms, the simplified algorithm is much easier to implement without sacrificing space or time efficiency, and it achieves satisfactory regression accuracy while guaranteeing convergence. It therefore has both theoretical and practical significance. Furthermore, the proposed algorithm is a step toward a general-purpose SMO algorithm for SVR with arbitrary loss functions. Additionally, the technique used here for ε-SVR is also applicable to other SVR formulations with non-positive kernels.
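As a rough illustration of the two-multiplier working-set idea described above (not the paper's analytic update rule), the following sketch trains a small ε-SVR by repeatedly selecting a pair of coefficients β_i = α_i − α_i* and improving the dual objective along the one direction that preserves the equality constraint Σβ_i = 0. A grid line search stands in for the closed-form step the paper derives from the KKT conditions, and all function names, the RBF kernel, and the parameter values are illustrative assumptions.

```python
import math
import random

def rbf(x, z, gamma=0.5):
    """Gaussian RBF kernel; any kernel function could be substituted here."""
    return math.exp(-gamma * sum((a - c) ** 2 for a, c in zip(x, z)))

def dual_objective(beta, K, y, eps):
    """Dual of eps-SVR written in beta_i = alpha_i - alpha_i*:
       W(beta) = -1/2 beta'K beta - eps * sum|beta_i| + sum y_i beta_i (maximized)."""
    n = len(beta)
    quad = sum(beta[i] * K[i][j] * beta[j] for i in range(n) for j in range(n))
    return -0.5 * quad - eps * sum(abs(v) for v in beta) + sum(y[i] * beta[i] for i in range(n))

def train_eps_svr(X, y, C=1.0, eps=0.1, sweeps=200, grid=50, seed=0):
    """Didactic two-multiplier solver: pick a pair (i, j), move beta_i by +t and
       beta_j by -t (so sum(beta) stays 0), and keep the best step in the box."""
    n = len(X)
    K = [[rbf(X[i], X[j]) for j in range(n)] for i in range(n)]
    beta = [0.0] * n  # beta_i = alpha_i - alpha_i*; sum(beta) = 0 throughout
    rng = random.Random(seed)
    for _ in range(sweeps):
        i, j = rng.sample(range(n), 2)
        # Box constraints -C <= beta_i + t <= C and -C <= beta_j - t <= C:
        lo = max(-C - beta[i], beta[j] - C)
        hi = min(C - beta[i], beta[j] + C)
        best_t, best_val = 0.0, dual_objective(beta, K, y, eps)
        for k in range(grid + 1):
            t = lo + (hi - lo) * k / grid
            trial = list(beta)
            trial[i] += t
            trial[j] -= t
            val = dual_objective(trial, K, y, eps)
            if val > best_val:  # accept only strict improvements of the dual
                best_t, best_val = t, val
        beta[i] += best_t
        beta[j] -= best_t
    def raw(x):
        return sum(beta[k] * rbf(X[k], x) for k in range(n))
    # Crude bias estimate: average residual over the training set.
    b = sum(y[k] - raw(X[k]) for k in range(n)) / n
    return beta, b, raw

if __name__ == "__main__":
    X = [[0.0], [1.0], [2.0], [3.0]]
    y = [0.0, 1.0, 2.0, 3.0]
    beta, b, raw = train_eps_svr(X, y, C=2.0, eps=0.1)
    print([round(v, 3) for v in beta], round(b, 3))
```

Because every accepted step strictly increases the dual objective and respects both the box and equality constraints, the iterates form a monotone feasible sequence, which is the same structural property that underlies the convergence guarantee of a true SMO update.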