Conference paper
Tan, Guo-zhen
Deng, Qing-qing, Tian, Zhu, Yang, Ji-xiang
2007-12-15
EI, CPCI-S, Scopus
A
546-549
Reducing the training time of artificial neural networks (ANNs) on large sample sets is an active area of research. Back propagation (BP) is widely used in short-term traffic flow forecasting, which requires a training set much larger than the network size. Data parallelism is a promising way to improve training speed. This paper proposes a novel data-parallel BP based on a dish network. Theoretical analysis and experimental evidence show that the dish data-parallel BP reduces communication cost compared with the traditional approach. Experiments with real traffic flow data from Dalian city further show that the dish data-parallel BP improves training speed and raises the speed-up ratio.
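To illustrate the general idea of data-parallel BP training referred to in the abstract, the following is a minimal sketch, not the paper's method: the dish communication topology is not reproduced, and the network shape, worker count, learning rate, and synthetic data are all assumptions introduced for illustration. Each worker computes BP gradients on its own shard of the training samples, and the gradients are then combined once per epoch.

```python
# Minimal sketch of data-parallel BP (illustrative assumptions only; the
# paper's dish network topology is NOT modeled here).
import numpy as np

def forward(X, W1, W2):
    """One hidden layer with sigmoid activation, linear output."""
    H = 1.0 / (1.0 + np.exp(-X @ W1))
    return H, H @ W2

def local_gradients(X, y, W1, W2):
    """BP gradients computed on one worker's shard of the training set."""
    H, y_hat = forward(X, W1, W2)
    err = y_hat - y                      # output-layer error
    gW2 = H.T @ err / len(X)
    dH = (err @ W2.T) * H * (1 - H)      # error back-propagated to hidden layer
    gW1 = X.T @ dH / len(X)
    return gW1, gW2

rng = np.random.default_rng(0)
X = rng.normal(size=(1024, 8))           # stand-in for traffic-flow samples
y = X.sum(axis=1, keepdims=True)         # stand-in forecasting target
W1 = rng.normal(scale=0.1, size=(8, 16))
W2 = rng.normal(scale=0.1, size=(16, 1))

workers = 4                              # assumed worker count
shards_X = np.array_split(X, workers)    # data parallelism: split the samples
shards_y = np.array_split(y, workers)

lr = 0.05
for epoch in range(200):
    # Each worker computes gradients on its shard; results are combined by
    # simple averaging (conceptually one all-reduce of gradients per epoch).
    grads = [local_gradients(sx, sy, W1, W2)
             for sx, sy in zip(shards_X, shards_y)]
    gW1 = np.mean([g[0] for g in grads], axis=0)
    gW2 = np.mean([g[1] for g in grads], axis=0)
    W1 -= lr * gW1
    W2 -= lr * gW2
```

In this scheme the per-epoch communication consists of exchanging gradient arrays rather than training samples, which is why the choice of interconnection topology (such as the dish network studied in the paper) governs the communication cost.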