Abstract: Random weight neural networks (RWNNs) have strong potential for solving qualitative and quantitative data analysis problems. Their most prominent feature is the random generation of hidden-layer parameters, which gives RWNNs several advantages over neural networks that fine-tune node parameters by gradient-descent optimization: a simple structure, easy implementation, and little human intervention. In an RWNN, the weights between the input layer and the hidden layer are randomly generated from a fixed interval, while the output weights between the hidden layer and the output layer are solved analytically. The incremental construction method starts from a small initial network and gradually adds new nodes to the hidden layer, improving the model until the expected performance goal is met. This paper provides a comprehensive review of research progress on incremental RWNNs, focusing on basic theory, incremental construction learning methods, and open research directions. First, the basic structure, theory, and analysis of RWNNs are introduced, and improvements and applications of RWNNs in incremental construction learning are further highlighted. Finally, open research questions and promising future directions for RWNNs are pointed out.
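The construction process described above can be sketched in code. The following is a minimal illustrative example, not any specific method from the literature: hidden-layer weights and biases are drawn from a fixed interval and never trained, output weights are solved analytically by least squares (pseudoinverse), and hidden nodes are added one at a time until a training-error target is met. The data, interval `[-1, 1]`, sigmoid activation, and stopping thresholds are all assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumed for illustration): fit y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()

def hidden_output(X, W, b):
    """Sigmoid activations of the randomly parameterized hidden nodes."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Incremental construction: start from an empty hidden layer and add
# random nodes; after each addition, re-solve the output weights
# analytically instead of iterating gradient descent.
W = np.empty((1, 0))   # input-to-hidden weights (random, then fixed)
b = np.empty((0,))     # hidden biases (random, then fixed)
target_rmse, max_nodes = 0.01, 100
rmse_history = []
while W.shape[1] < max_nodes:
    # Draw one new hidden node's parameters from the fixed interval [-1, 1].
    W = np.hstack([W, rng.uniform(-1, 1, size=(1, 1))])
    b = np.append(b, rng.uniform(-1, 1))
    H = hidden_output(X, W, b)
    # Analytical output-weight solve: beta = pinv(H) @ y (least squares).
    beta = np.linalg.pinv(H) @ y
    rmse = float(np.sqrt(np.mean((H @ beta - y) ** 2)))
    rmse_history.append(rmse)
    if rmse <= target_rmse:
        break

print("hidden nodes:", W.shape[1], "training RMSE:", round(rmse, 4))
```

Because each added node enlarges the column space of the hidden-output matrix `H` and the output weights are re-solved by least squares, the training error is non-increasing as nodes accumulate, which is what makes this greedy incremental scheme well-behaved.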