Most existing studies on computational modeling of neural plasticity have focused on synaptic plasticity. However, regulating the internal weights of a reservoir with synaptic plasticity alone often results in unstable learning dynamics. In this article, a structural synaptic plasticity learning rule is proposed that both trains the weights and adds or removes neurons within the reservoir; this rule is shown to alleviate the instability of synaptic plasticity and to increase the memory capacity of the network as well. Our experimental results also reveal that a few stronger connections can persist for long periods in a constantly changing network structure and are relatively resistant to decay or disruption during learning, consistent with evidence observed in biological systems. Finally, we show that an echo state network (ESN) using the proposed structural plasticity rule outperforms an ESN using synaptic plasticity, as well as three state-of-the-art ESNs, on four benchmark tasks.
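To make the idea of structural plasticity in a reservoir concrete, the sketch below shows one plausible synapse-level rewiring step for an ESN: the weakest fraction of reservoir connections is pruned and an equal number of new random connections is grown, so that strong synapses tend to survive successive rewiring rounds. This is only a minimal illustration under assumed parameters (class name, pruning/growth fractions, initialization ranges are all hypothetical), not the rule proposed in the article, which also adds and removes whole neurons.

```python
import numpy as np

rng = np.random.default_rng(0)


class StructuralPlasticityESN:
    """Minimal ESN whose reservoir is reshaped by a structural
    plasticity step (prune weak synapses, grow new random ones).
    All parameter choices here are illustrative assumptions."""

    def __init__(self, n_in=1, n_res=100, spectral_radius=0.9, density=0.1):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W *= rng.random((n_res, n_res)) < density          # sparse reservoir
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        self.W = W
        self.x = np.zeros(n_res)

    def step(self, u):
        """Standard leaky-free ESN state update."""
        self.x = np.tanh(self.W_in @ np.atleast_1d(u) + self.W @ self.x)
        return self.x

    def structural_update(self, prune_frac=0.01):
        """Hypothetical rewiring step: eliminate the weakest synapses
        and create the same number of new ones at random positions."""
        nz = np.flatnonzero(self.W)                        # existing synapses
        k = max(1, int(prune_frac * nz.size))
        weakest = nz[np.argsort(np.abs(self.W.flat[nz]))[:k]]
        self.W.flat[weakest] = 0.0                         # synapse elimination
        empty = np.flatnonzero(self.W.ravel() == 0.0)
        new = rng.choice(empty, size=k, replace=False)
        self.W.flat[new] = rng.uniform(-0.5, 0.5, k)       # synaptogenesis


# Usage: interleave state updates with occasional rewiring.
esn = StructuralPlasticityESN()
for t in range(1000):
    esn.step(np.sin(0.1 * t))
    if t % 100 == 0:
        esn.structural_update()
```

Because only the smallest-magnitude weights are pruned at each round, a handful of strong connections naturally outlive many rewiring cycles, which is the qualitative behavior the abstract attributes to the proposed rule.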