Analysis of stochastic optimization results shows that they follow a normal distribution whose expected value is the optimal solution. A secondary optimization method based on statistical theory and Newton's method is proposed to improve the results of stochastic optimization algorithms, overcoming the shortcoming of the averaging method that precision requirements often cannot be met. Taking multiple optimization results of four classic test functions obtained by a genetic algorithm as examples, the averaging method and the secondary optimization method are each used to synthesize the optimization results. Experiments show that, in handling multiple stochastic optimization results, the secondary optimization method achieves higher accuracy and better stability than the averaging method.
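The idea of synthesizing repeated stochastic runs can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the sphere function as the test objective, simulates the scattered outputs of repeated genetic-algorithm runs with Gaussian noise around the optimum, and takes "secondary optimization" to mean refining the sample mean with Newton iterations on the objective.

```python
import numpy as np

# Sphere test function f(x) = sum(x_i^2); its gradient and Hessian are known
# in closed form, so a Newton step can be applied exactly.
def f(x):
    return float(np.sum(x**2))

def grad(x):
    return 2.0 * x

def hess(x):
    return 2.0 * np.eye(len(x))

rng = np.random.default_rng(0)

# Hypothetical stand-in for 30 independent GA runs: solutions scattered
# (approximately normally) around the true optimum at the origin.
results = rng.normal(loc=0.0, scale=0.05, size=(30, 2))

# Averaging method: take the sample mean of the runs as the answer.
x_avg = results.mean(axis=0)

# Secondary optimization (assumed form): start from that sample mean,
# then refine it with a few Newton iterations on the objective.
x = x_avg.copy()
for _ in range(10):
    x = x - np.linalg.solve(hess(x), grad(x))

print(f(x_avg), f(x))  # the refined point has a lower objective value
```

On this quadratic objective a single Newton step already lands on the exact optimum; on the paper's non-quadratic test functions the refinement would instead converge over several iterations, which is where the accuracy gain over plain averaging comes from.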