Hindawi Journal of Advanced Transportation
Volume 2019, Article ID 4145353, 10 pages
https://doi.org/10.1155/2019/4145353

Research Article
Spatiotemporal Traffic Flow Prediction with KNN and LSTM

Xianglong Luo (1,2), Danyang Li (2), Yu Yang (2), and Shengrui Zhang (1)
(1) School of Highway, Chang'an University, Xi'an 710064, China
(2) School of Information Engineering, Chang'an University, Xi'an 710064, China

Correspondence should be addressed to Xianglong Luo; xlluo@chd.edu.cn

Received 12 September 2018; Revised 15 January 2019; Accepted 1 February 2019; Published 27 February 2019

Guest Editor: Yasser Hassan

Copyright © 2019 Xianglong Luo et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Traffic flow prediction is becoming increasingly crucial in Intelligent Transportation Systems, and an accurate prediction result is the precondition of traffic guidance, management, and control. To improve the prediction accuracy, a spatiotemporal traffic flow prediction method combining the k-nearest neighbors algorithm (KNN) and the long short-term memory network (LSTM) is proposed, called the KNN-LSTM model in this paper. KNN is used to select the neighboring stations most related to the test station and to capture the spatial features of traffic flow. LSTM is utilized to mine the temporal variability of traffic flow, and a two-layer LSTM network is applied to predict traffic flow, respectively, at the selected stations. The final prediction results are obtained by result-level fusion with the rank-exponent weighting method. The prediction performance is evaluated with real-time traffic flow data provided by the Transportation Research Data Lab (TDRL) at the University of Minnesota Duluth (UMD) Data Center.
Experimental results indicate that the proposed model achieves better performance than well-known prediction models, including the autoregressive integrated moving average (ARIMA), support vector regression (SVR), wavelet neural network (WNN), deep belief networks combined with support vector regression (DBN-SVR), and LSTM models, with an average accuracy improvement of 12.59%.

1. Introduction

The accurate prediction of future traffic conditions (e.g., traffic flow, travel speed, and travel time) is a crucial requirement for Intelligent Transportation Systems (ITS); it can help administrators take adequate preventive measures against congestion and help travelers make better-informed decisions. Among the different applications in ITS, traffic flow prediction has attracted significant attention over the past few decades and remains a challenging topic for transportation researchers. Due to the stochastic characteristics of traffic flow, accurate traffic prediction is not a straightforward task, and many techniques have been deployed for modeling the evolution of traffic circulation.

These existing prediction schemes can be roughly classified into three categories: parametric methods, nonparametric methods, and hybrid methods. The parametric methods include the Autoregressive Integrated Moving Average method (ARIMA) [1], the Seasonal Autoregressive Integrated Moving Average method (SARIMA) [2, 3], and the Kalman filter [4, 5]. Parametric methods are widely used in traffic flow prediction, but they are sensitive to the traffic data of different situations. The nonparametric methods include artificial neural networks (ANNs) [6-9], k-nearest neighbors (KNN) [10-14], support vector regression (SVR) [15, 16], and Bayesian models [17, 18]. Compared with parametric methods, nonparametric methods are more effective in prediction performance; even so, they require a large amount of historical data and a training process. The hybrid methods mainly combine a parametric approach with a nonparametric approach [19-29]. Although the prediction accuracy of nonparametric and hybrid methods is superior to that of parametric methods, all these methods mainly consider the data close to the prediction station, which cannot fully reveal the spatiotemporal characteristics of traffic flow data. Vlahogianni et al. [30] summarized existing traffic flow prediction algorithms from 2004 to 2013. Suhas et al. [31] conducted a systematic study to aggregate previous works on traffic prediction, highlight marked changes in trends, and provide research directions for future work. Lana et al. [32] summarized the latest technical achievements in the traffic prediction field, along with an insightful update of the main technical challenges that remain unsolved. Readers interested in the details of the models applied in the traffic prediction field may refer to these review papers.

With widespread traditional traffic sensors and newly emerging traffic sensor technologies, tremendous numbers of traffic sensors have been deployed on the existing road network, and a large volume of historical traffic data at very high spatial and temporal resolution has become available. It is a challenge to deal with such big traffic data using conventional parametric methods. As for nonparametric methods, most are shallow in architecture and cannot capture the deep correlations and implicit information in traffic data. Recently, deep learning, an emerging machine learning methodology, has drawn a lot of attention from both academia and industry, and traffic flow prediction based on deep learning methods has become a new trend.

Huang et al. [33] proposed a deep architecture for traffic flow prediction with deep belief networks (DBN) and multitask learning. Lv et al. [34] used a stacked autoencoder (SAE) model to learn generic traffic flow features. Duan et al. [35] evaluated the performance of the SAE model for traffic flow prediction at daytime and nighttime. Soua et al. [36] proposed a DBN-based approach to predict traffic flow with historical traffic flow, weather data, and event-based data; an extension of Dempster-Shafer evidence theory was used to fuse traffic prediction beliefs coming from streams of data and event-based data models. Koesdwiady et al. [37] predicted traffic flow and weather data separately using DBN and merged the two predictions using data fusion techniques. Yang et al. [38] proposed a stacked autoencoder Levenberg-Marquardt model to improve prediction accuracy; the Taguchi method was developed to optimize the model structure. Zhou et al. [39] introduced an adaptive boosting scheme for the stacked autoencoder network. Polson and Sokolov [40] developed a deep learning model to predict traffic flows, proposing an architecture that combines a linear model fitted using regularization with a sequence of tanh layers. Zhang and Huang [41] employed a genetic algorithm to find the optimal hyperparameters of DBN models. In recent years, the recurrent neural network (RNN) has proved more practical than other deep learning structures for processing sequential data. Ma et al. [42] utilized a deep Restricted Boltzmann Machine and an RNN architecture to model and predict traffic congestion. However, traditional RNNs face the problems of vanishing and exploding gradients. To solve this problem, the long short-term memory network (LSTM) was proposed. Because LSTM can automatically calculate the optimal time lags and capture the features of time series with a longer time span, better performance can be achieved with LSTM models in traffic flow prediction. LSTM was developed to capture the long-term temporal dependency of traffic sequences by Ma et al. [43]. Shao and Soong [44] utilized LSTM to learn more abstract representations of the nonlinear traffic flow data. In recent years, LSTM has been very successful in traffic flow prediction, but the spatiotemporal characteristics of traffic flow were hardly considered. Zhao et al. [45] proposed an origin-destination correlation matrix to represent the correlations of different links within the road network, and a cascade-connected LSTM was used to predict traffic flow. However, the architecture of the proposed LSTM model was overly complicated, making comprehension difficult, and the prediction results were not very stable and reliable at different observation points.

In this paper, inspired by the successful application of LSTM to traffic flow prediction, the high spatiotemporal correlation of traffic flow data is exploited in order to improve prediction performance. A hybrid traffic flow prediction methodology is proposed based on KNN and LSTM. KNN is used to choose the neighboring stations most related to the test station. A multilayer LSTM is applied to predict traffic flow at each selected station. The final prediction results are obtained by weighting the prediction values of all selected stations, with weights assigned by adjusting the weight dispersion measure of the rank-exponent method. The experimental results show that the proposed method outperforms most existing traffic prediction methods in accuracy.

The main contributions of this paper are summarized as follows.

(1) A hybrid traffic flow prediction methodology combining KNN with LSTM is proposed, which utilizes the spatiotemporal characteristics of traffic flow data. Experimental results demonstrate that the proposed approach achieves on average a 12.59% accuracy improvement over the ARIMA, SVR, WNN, DBN-SVR, and LSTM models.

(2) The prediction results are obtained by weighting the prediction values of all selected stations, adjusting the weight dispersion measure with the rank-exponent method. Different from the traditional weighting method, the proposed method highlights the importance of the highly relevant stations to the prediction result.

(3) From classical understanding, stations closer to the prediction station are more correlated with it than farther stations; in fact, some farther stations are also correlated with the prediction station. This is consistent with the general fact that upstream and downstream traffic flow has great influence on the prediction result in traffic flow prediction.

The rest of this paper is organized as follows. Section 2 gives the details of the hybrid traffic prediction method based on KNN and LSTM. In Section 3, the dataset used for the numerical experiments is introduced; the results and performance evaluation are also presented. Finally, conclusions and future research directions are given in Section 4.

2. Methodology

2.1. LSTM Network. RNN is a neural network specialized for processing time sequences. Different from conventional networks, RNN allows a "memory" of previous inputs to persist in the network's internal state, which can then be used to influence the network output. Traditional RNN exhibits a superior capability for modeling nonlinear time sequence problems, such as speech recognition, language modeling, and image captioning. However, traditional RNN is not able to train on time sequences with long time lags. To overcome these disadvantages of traditional RNN, LSTM was proposed.

LSTM is a special kind of RNN designed to learn long-term dependencies. The LSTM architecture consists of a set of memory blocks. Each block contains one or more self-connected memory cells and three gates, namely, an input gate, a forget gate, and an output gate. The typical structure of an LSTM memory block with one cell is shown in Figure 1. The input gate takes new input from outside and processes the newly coming data. The forget gate decides when to forget the previous state and thus selects the optimal time lag for the input sequence. The output gate takes all the calculated results and generates the output of the LSTM cell.

[Figure 1: LSTM memory block with one cell.]

Let us denote the input time series as $X = [x_1, x_2, \cdots, x_T]$, where $T$ is the input sequence length. $N$ is the number of inputs, $H$ is the number of cells in the hidden layer, and $C$ is the number of memory cells. The subscripts $i$, $f$, and $o$ refer to the input gate, forget gate, and output gate, respectively. $\omega_{ij}$ is the weight of the connection from unit $i$ to unit $j$. $a_j^t$ is the network input to unit $j$ at time $t$, and $b_j^t$ is the value after the activation function of the same unit. $s_c^t$ is the state of cell $c$ at time $t$. $\sigma$ is the activation function of the gates, and $g$ and $h$ are, respectively, the cell input and output activation functions. The LSTM model is given by the following equations.

Input Gates

$$a_i^t = \sum_{n=1}^{N} \omega_{ni} x_n^t + \sum_{h=1}^{H} \omega_{hi} b_h^{t-1} + \sum_{c=1}^{C} \omega_{ci} s_c^{t-1} \quad (1)$$

$$b_i^t = \sigma(a_i^t) \quad (2)$$

Forget Gates

$$a_f^t = \sum_{n=1}^{N} \omega_{nf} x_n^t + \sum_{h=1}^{H} \omega_{hf} b_h^{t-1} + \sum_{c=1}^{C} \omega_{cf} s_c^{t-1} \quad (3)$$

$$b_f^t = \sigma(a_f^t) \quad (4)$$

Cells

$$a_c^t = \sum_{n=1}^{N} \omega_{nc} x_n^t + \sum_{h=1}^{H} \omega_{hc} b_h^{t-1} \quad (5)$$

$$s_c^t = b_f^t s_c^{t-1} + b_i^t g(a_c^t) \quad (6)$$

Output Gates

$$a_o^t = \sum_{n=1}^{N} \omega_{no} x_n^t + \sum_{h=1}^{H} \omega_{ho} b_h^{t-1} + \sum_{c=1}^{C} \omega_{co} s_c^{t-1} \quad (7)$$

$$b_o^t = \sigma(a_o^t) \quad (8)$$

Cell Outputs

$$b_c^t = b_o^t \, h(s_c^t) \quad (9)$$

Through the functions of the different gates, the LSTM network has the capability of processing arbitrary time lags for time sequences with long dependencies.

2.2. KNN Algorithm. The KNN algorithm is a nonparametric method used for classification and regression. The KNN method makes use of a database to search for data that are similar to the current data; the data found are called the nearest neighbors of the current data. In this paper, KNN is used to select the neighboring stations most related to the test station. Suppose there are $M$ stations in the road network. $X_o(t) = [x_o(t), x_o(t-1), \cdots, x_o(t-T)]$ is the historical traffic flow data of the test station, where $T$ is the sample data length. $X_m(t) = [x_m(t), x_m(t-1), \cdots, x_m(t-T)]$ $(m = 1, 2, \cdots, M-1)$ is the historical traffic flow data of the $m$th station, which is different from the test station. The Euclidean distance, see (10), is used to measure the correlation between the test station and the others.

$$d_m = \left\| X_o(t) - X_m(t) \right\| = \sqrt{\sum_{j} \left( x_o(j) - x_m(j) \right)^2} \quad (10)$$

According to the calculated distances, the K nearest neighbors are found, and these K stations are selected as the stations most related to the test station.

2.3. Proposed Method. Different from the conventional LSTM network, the KNN algorithm is first used to select the stations spatiotemporally correlated with the test station. A two-layer LSTM network is then applied to predict traffic flow, respectively, at the selected stations. The final prediction results at the test station are obtained by weighting with the rank-exponent method. At time $t$, the traffic flow data of the test station is denoted as $X_o(t) = [x_o(t), x_o(t-1), \cdots, x_o(t-T)]$. The traffic flow data of the $M-1$ stations near the test station is denoted as

$$X_{M-1}(t) = \begin{bmatrix} x_1(t) & x_1(t-1) & \cdots & x_1(t-T) \\ x_2(t) & x_2(t-1) & \cdots & x_2(t-T) \\ \vdots & \vdots & \cdots & \vdots \\ x_{M-1}(t) & x_{M-1}(t-1) & \cdots & x_{M-1}(t-T) \end{bmatrix} \quad (11)$$

$X_{ci}(t)$ $(i = 1, 2, \cdots, K)$ denotes a station selected by KNN. The predicted traffic flow at the selected stations and the test station can be calculated as

$$\hat{X}_{ci}(t+1) = W_{ho} b_c + b \quad (i = 1, 2, \cdots, K) \quad (12)$$

where $W_{ho}$ is the weight matrix between the hidden layer and the output layer and $b$ is the bias term. The final prediction result at the test station is obtained by weighting according to (13).

$$\hat{y}_o(t+1) = \sum_{i=1}^{K} W_i \hat{X}_{ci} \quad (13)$$

where $W_i$ is the weight coefficient. The rank-exponent method of weighting is used in this paper. The rank-exponent method provides some degree of flexibility by adjusting the weight dispersion measure, as shown in (14). The value of $z$ is set to 2 as indicated by the authors [46].

$$W_i = \frac{(K - r_i + 1)^z}{\sum_{i=1}^{K} (K - r_i + 1)^z} \quad (14)$$

where $r_i$ is the rank of the $i$th selected station, $K$ is the total number of selected stations, and $z$ is the weight dispersion measure.

The flowchart of the proposed method is shown in Figure 2, and the detailed calculation process is as follows.

Step 1. Calculate the Euclidean distance between each of the $M-1$ adjacent stations and the test station according to (10).

Step 2. Select the $K$ stations most related to the test station.

Step 3. Predict traffic flow with the LSTM network, respectively, at the selected stations according to (12).

Step 4. Weight the prediction values of the selected stations according to (13) and (14).

Step 5. Calculate the RMSE of the predicted traffic flow.

Step 6. Repeat Steps 2-5 with different $K$ $(K \le M)$.

Step 7. Find the smallest RMSE among all the different $K$.

Step 8. Obtain the predicted traffic flow at the test station when the RMSE is the smallest.

[Figure 2: The flowchart of the proposed method.]

3. Experiments

3.1. Data Description. The data used to evaluate the performance of the proposed model were collected by mainline detectors and provided by the Transportation Research Data Lab (TDRL) at the University of Minnesota Duluth (UMD) Data Center, covering March 1st, 2015, to April 30th, 2015. The sampling period of the testing dataset was 5 min. In our experiment, we selected the road network in Figure 3 as the experiment area. The area mainly contains four expressways, numbered I394, I494, US169, and TH100. There are 36 stations in the experiment area; their locations and IDs are shown in Figure 3. Stations S339 and S448 are located near a transportation hub in the road network and were therefore selected as the test stations for traffic flow prediction. Due to the similarity of traffic flow on the same workday in different weeks, we used the data of one workday as training and test data in order to ensure prediction stability. In our experiment, we chose the traffic flow data on Tuesdays; of course, any workday from Monday to Friday could be chosen. There was a total of 9 days of Tuesday traffic flow data in our dataset. The dataset was divided into two parts: the data of the first 8 days were used as training samples, while the remaining data were employed as testing samples for measuring prediction performance. The most commonly used prediction interval is 5 min, so we also selected the prediction time interval as 5 min, which is verified to be reasonable by the real experimental results.

[Figure 3: The IDs and locations of the stations in our experiment.]

Traffic flows for 5 consecutive Tuesdays at station S339 are shown in Figure 4, and typical traffic flows at station S339 and four neighboring stations are shown in Figure 5. From Figure 4, we can see that there are small differences in the rush hours; however, the profiles of the traffic flows are basically consistent. From Figure 5, it can be seen that there are some differences among the stations, but the data distributions are similar to that of station S339. Because traffic flow data have high spatiotemporal correlation, it is effective to improve traffic prediction accuracy with the spatiotemporal correlations.

[Figure 4: Traffic flows for 5 consecutive Tuesdays at station S339.]

[Figure 5: Traffic flows at station S339 and 4 neighboring stations (S321, S337, S340, S341).]

3.2. Performance Indexes. In order to evaluate the prediction performance, the Root Mean Square Error (RMSE), the most frequently used metric of prediction performance in previous work, and the prediction accuracy (ACC) were chosen to evaluate the difference between the actual and predicted values.

$$RMSE = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \left( \hat{y}_i - y_i \right)^2} \quad (15)$$

$$ACC = \left( 1 - \frac{1}{N} \sum_{i=1}^{N} \frac{\left| \hat{y}_i - y_i \right|}{y_i} \right) \times 100\% \quad (16)$$

where $N$ is the length of the prediction data and $y_i$ and $\hat{y}_i$ are the measured and predicted values of the $i$th validation sample, respectively.

4. Results and Discussions

4.1. Results Analysis. In our experiment, stations S339 and S448, located in the two directions of the road network, were chosen as the test stations. The timestep is an important hyperparameter: it is the input size of the model and determines the number of LSTM blocks at each level. Through experiment, the prediction performance achieves its optimal value when the timestep is set to 6. To validate the efficiency of the proposed method, its performance is compared with some representative approaches, including the ARIMA model, SVR, the wavelet neural network (WNN), DBN, and LSTM. In the SARIMA model, the AR and MA orders are set to 5 and 4, and the normal and seasonal differencing orders are set to 1 and 2. In the SVR model, the kernel function is the Radial Basis Function (RBF), the penalty parameter of the error term is 300, and the iteration number is 1000. In the WNN model, the number of hidden nodes is set to 6, the learning rate to 0.001, and the iteration number to 500. For the DBN model, a 3-layer architecture is used, and the number of nodes in each layer is set to 128 for simplicity.

The predicted results of the different models and the real traffic flow within one day are shown in Figures 6 and 7. It is observed that the predicted traffic flow has similar patterns to the real traffic flow, and the prediction of the proposed KNN-LSTM model almost coincides with the measured data, especially in the morning and evening peak hours. The RMSE and ACC of the different models for stations S339 and S448 are shown in Table 1. It can be seen that the proposed method has the minimum RMSE. The average ACC of the proposed method is 95.75%, an improvement of 28.92%, 8.31%, 14.44%, 6.95%, and 4.32% over the other models. The traditional ARIMA model has the worst prediction performance: it assumes that the traffic flow data form a stationary process, which is not always true in reality. The SVR and WNN methods achieve better RMSE and ACC than the ARIMA model, but they show weakness when compared with the deep learning methods. The DBN model also has no obvious advantage over SVR.

Table 1: Prediction performances of different models.

Model      | S339 RMSE | S339 ACC (%) | S448 RMSE | S448 ACC (%)
ARIMA      | 36.3223   | 61.09        | 44.6856   | 72.57
SVR        | 7.7424    | 88.17        | 18.4911   | 86.71
WNN        | 8.5240    | 74.69        | 12.4526   | 87.93
DBN-SVR    | 7.3277    | 89.60        | 15.3746   | 88.01
LSTM       | 1.8185    | 90.39        | 2.7499    | 92.47
KNN-LSTM   | 1.7403    | 94.59        | 2.5465    | 96.91

[Figure 6: The real and predicted traffic flow at S339.]

[Figure 7: The real and predicted traffic flow at S448.]

4.2. Discussions. In this paper, KNN is used to select the $K$ stations most related to the test station. Different values of $K$ give different prediction performance. We search over all possible values of $K$; the optimal $K$ is the value at which the RMSE is minimum. In our experiment, the optimal $K$ is 10 for station S339, and the IDs of the selected stations are S339, S340, S341, S321, S337, S342, S338, S344, S336, and S293. The optimal $K$ is 6 for station S448, and the IDs of the selected stations are S448, S447, S446, S450, S737, and S452. As shown in Figure 3, almost all of the selected stations are located upstream and downstream of the test stations. From classical understanding, stations closer to the prediction station are more correlated with it than farther stations; in fact, some farther stations are also correlated with the prediction station. For test station S339, the closer station S343 is not selected, and for test station S448 the closer station S451 is not selected. This is nevertheless consistent with the general fact that upstream and downstream traffic flow has great influence on the prediction result in traffic flow prediction. When $K = 1$, only the temporal correlation is considered; the average ACC is 91.43%, which is 4.32% lower than that of the proposed method. This indicates that spatiotemporal features play an important role in traffic prediction. These results verify the superiority and feasibility of KNN-LSTM, which employs KNN to capture the spatial features and mines the temporal regularity with the LSTM networks.

5. Conclusions

In this paper, we proposed a spatiotemporal traffic flow prediction method combining KNN and LSTM. KNN is used to select the most related neighboring stations, which reflect the spatiotemporal correlation with the test station. An LSTM network was applied to predict traffic flow, respectively, at the selected stations. LSTM is able to exploit the long-term dependency in the traffic flow data and discover the latent feature representations hidden in the traffic flow, which yields better prediction performance. The final prediction results at the test station are obtained by weighting with the rank-exponent method. We evaluated the performance of our model on real traffic data provided by TDRL and compared it with the ARIMA, SVR, WNN, DBN-SVR, and LSTM models. The results show that the proposed model is superior to the other methods. Since traffic flow data are affected by weather, incidents, and other factors, the impact of these factors on traffic flow data will be studied further so as to improve the prediction accuracy.

Data Availability

The data used in this paper were collected by mainline detectors and provided by the Transportation Research Data Lab (TDRL) at the University of Minnesota Duluth (UMD) Data Center (http://www.d.umn.edu/~tkwon/TMCdata/TMCarchive.html). Any researcher who requests these data can download them from the website.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This research was partly supported by the National Key R&D Program of China (2018YFC0808706) and the National Natural Science Foundation of China (Grant no. 5157081053). The authors are also grateful to the UMD Data Center (TDRL) for providing the data.

References

[1] M. van der Voort, M. Dougherty, and S. Watson, "Combining Kohonen maps with ARIMA time series models to forecast traffic flow," Transportation Research Part C: Emerging Technologies, vol. 4, no. 5, pp. 307-318, 1996.
[2] B. M. Williams and L. A. Hoel, "Modeling and forecasting vehicular traffic flow as a seasonal ARIMA process: theoretical basis and empirical results," Journal of Transportation Engineering, vol. 129, no. 6, pp. 664-672, 2003.
[3] G. Shi, J. Guo, W. Huang, and B. M. Williams, "Modeling seasonal heteroscedasticity in vehicular traffic condition series using a seasonal adjustment approach," Journal of Transportation Engineering, vol. 140, no. 5, pp. 1053-1058, 2014.
[4] L. L. Ojeda, A. Y. Kibangou, and C. C. De Wit, "Adaptive Kalman filtering for multi-step ahead traffic flow prediction," in Proceedings of the IEEE American Control Conference, pp. 4724-4729, Washington, DC, USA, 2013.
[5] J. Guo, W. Huang, and B. M. Williams, "Adaptive Kalman filter approach for stochastic short-term traffic flow rate prediction and uncertainty quantification," Transportation Research Part C: Emerging Technologies, vol. 43, pp. 50-64, 2014.
[6] B. L. Smith and M. J. Demetsky, "Short-term traffic flow prediction: neural network approach," Transportation Research Record, vol. 1453, pp. 98-104, 1994.
[7] Y. W. Zhang, Z. P. Song, X. L. Weng, and Y. L. Xie, "A new soil-water characteristic curve model for unsaturated loess based on wetting-induced pore deformation," Geofluids, vol. 2019, Article ID 5261985, 13 pages, 2019.
[8] Q. P. Wang and H. Sun, "Traffic structure optimization in historic districts based on green transportation and sustainable development concept," Advances in Civil Engineering, vol. 2019, Article ID 9196263, 18 pages, 2019.
[9] D. Chen, "Research on traffic flow prediction in the big data environment based on the improved RBF neural network," IEEE Transactions on Industrial Informatics, vol. 13, no. 4, pp. 2000-2008, 2017.
[10] M. Bernaś, B. Płaczek, P. Porwik, and T. Pamuła, "Segmentation of vehicle detector data for improved k-nearest neighbours-based traffic flow prediction," IET Intelligent Transport Systems, vol. 9, no. 3, pp. 264-274, 2015.
[11] S. Wu, Z. Yang, X. Zhu, and B. Yu, "Improved KNN for short-term traffic forecasting using temporal and spatial information," Journal of Transportation Engineering, vol. 140, no. 7, Article ID 04014026, 2014.
[12] P. Dell'Acqua, F. Bellotti, R. Berta, and A. De Gloria, "Time-aware multivariate nearest neighbor regression methods for traffic flow prediction," IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 6, pp. 3393-3402, 2015.
[13] P. Cai, Y. Wang, G. Lu, P. Chen, C. Ding, and J. Sun, "A spatiotemporal correlative k-nearest neighbor model for short-term traffic multistep forecasting," Transportation Research Part C: Emerging Technologies, vol. 62, pp. 21-34, 2016.
[14] B. Sun, W. Cheng, P. Goswami, and G. Bai, "Short-term traffic forecasting using self-adjusting k-nearest neighbours," IET Intelligent Transport Systems, vol. 12, no. 1, pp. 41-48, 2018.
[15] M. Castro-Neto, Y.-S. Jeong, M.-K. Jeong, and L. D. Han, "Online-SVR for short-term traffic flow prediction under typical and atypical traffic conditions," Expert Systems with Applications, vol. 36, no. 3, pp. 6164-6173, 2009.
[16] Y. Sun, B. Leng, and W. Guan, "A novel wavelet-SVM short-time passenger flow prediction in Beijing subway system," Neurocomputing, vol. 166, pp. 109-121, 2015.
[17] J. Wang, W. Deng, and Y. Guo, "New Bayesian combination method for short-term traffic flow forecasting," Transportation Research Part C: Emerging Technologies, vol. 43, pp. 79-94, 2014.
[18] Y. Xu, Q.-J. Kong, R. Klette, and Y. Liu, "Accurate and interpretable Bayesian MARS for traffic flow prediction," IEEE Transactions on Intelligent Transportation Systems, vol. 15, no. 6, pp. 2457-2469, 2014.
[19] J. Lai, K. Wang, J. Qiu, F. Niu, J. Wang, and J. Chen, "Vibration response characteristics of the cross tunnel structure," Shock and Vibration, vol. 2016, Article ID 9524206, 16 pages, 2016.
[20] D. Xia, B. Wang, H. Li, Y. Li, and Z. Zhang, "A distributed spatial-temporal weighted model on MapReduce for short-term traffic flow forecasting," Neurocomputing, vol. 179, pp. 246-263, 2016.
[21] F. Moretti, S. Pizzuti, S. Panzieri, and M. Annunziato, "Urban traffic flow forecasting through statistical and neural network bagging ensemble hybrid modeling," Neurocomputing, vol. 167, pp. 3-7, 2015.
[22] Z. J. Zhou, C. N. Ren, G. J. Xu, H. C. Zhan, and T. Liu, "Dynamic failure mode and dynamic response of high slope using shaking table test," Shock and Vibration, vol. 2019, Article ID 4802740, 15 pages, 2019.
[23] Y. W. Zhang, X. L. Weng, Z. P. Song, and Y. F. Sun, "Modeling of loess soaking induced impacts on metro tunnel using water soaking system in centrifuge," Geofluids, vol. 2019, Article ID 5487952, 13 pages, 2019.
[24] S.-D. Oh, Y.-J. Kim, and J.-S. Hong, "Urban traffic flow prediction system using a multifactor pattern recognition model," IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 5, pp. 2744-2755, 2015.
[25] Z. J. Zhou, S. S. Zhu, X. Kong, J. T. Lei, and T. Liu, "Optimization analysis of settlement parameters for post-grouting piles in loess area of Shaanxi, China," Advances in Civil Engineering, vol. 2019, Article ID 7085104, 11 pages, 2019.
[26] M.-L. Huang, "Intersection traffic flow forecasting based on ν-GSVR with a new hybrid evolutionary algorithm," Neurocomputing, vol. 147, no. 1, pp. 342-349, 2015.
[27] X. Luo, L. Niu, and S. Zhang, "An algorithm for traffic flow prediction based on improved SARIMA and GA," KSCE Journal of Civil Engineering, vol. 22, no. 10, pp. 4107-4115, 2018.
[28] X. Luo, D. Li, and S. Zhang, "Traffic flow prediction during the holidays based on DFT and SVR," Journal of Sensors, vol. 2019, Article ID 6461450, 10 pages, 2019.
[29] J. Lai, J. Qiu, H. Fan, Q. Zhang, J. Wang, and J. Chen, "Fiber Bragg grating sensors-based in situ monitoring and safety assessment of loess tunnel," Journal of Sensors, vol. 2016, Article ID 8658290, 10 pages, 2016.
[30] E. I. Vlahogianni, M. G. Karlaftis, and J. C. Golias, "Short-term traffic forecasting: where we are and where we're going," Transportation Research Part C: Emerging Technologies, vol. 43, pp. 3-19, 2014.
[31] S. Suhas, V. V. Kalyan, M. Katti, B. V. Prakash, and C. Naveena, "A comprehensive review on traffic prediction for intelligent transport system," in Proceedings of the 2017 International Conference on Recent Advances in Electronics and Communication Technology (ICRAECT), pp. 138-143, Bangalore, India, March 2017.
[32] I. Lana, J. Del Ser, M. Velez, and E. I. Vlahogianni, "Road traffic forecasting: recent advances and new challenges," IEEE Intelligent Transportation Systems Magazine, vol. 10, no. 2, pp. 93-109, 2018.
[33] W. Huang, G. Song, H. Hong, and K. Xie, "Deep architecture for traffic flow prediction: deep belief networks with multitask learning," IEEE Transactions on Intelligent Transportation Systems, vol. 15, no. 5, pp. 2191-2201, 2014.
[34] Y. Lv, Y. Duan, W. Kang et al., "Traffic flow prediction with big data: a deep learning approach," IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 2, pp. 865-873, 2015.
[35] Y. Duan, Y. Lv, and F. Y. Wang, "Performance evaluation of the deep learning approach for traffic flow prediction at different times," in Proceedings of the IEEE International Conference on Service Operations and Logistics, and Informatics, pp. 223-227, Beijing, China, 2016.
[36] R. Soua, A. Koesdwiady, and F. Karray, "Big-data-generated traffic flow prediction using deep learning and dempster-shafer theory," in Proceedings of the IEEE International Joint Conference on Neural Networks, pp. 3195-3202, Vancouver, Canada, July 2016.
[37] A. Koesdwiady, R. Soua, and F. Karray, "Improving traffic flow prediction with weather information in connected cars: a deep learning approach," IEEE Transactions on Vehicular Technology, vol. 65, no. 12, pp. 9508-9517, 2016.
[38] H. F. Yang, T. S. Dillon, and Y. P. Chen, "Optimized structure of the traffic flow forecasting model with a deep learning approach," IEEE Transactions on Neural Networks & Learning Systems, vol. 28, no. 10, pp. 2371-2381, 2016.
[39] T. Zhou, G. Han, X. Xu et al., "δ-agree AdaBoost stacked autoencoder for short-term traffic flow forecasting," Neurocomputing, vol. 247, pp. 31-38, 2017.
[40] N. G. Polson and V. O. Sokolov, "Deep learning for short-term traffic flow prediction," Transportation Research Part C: Emerging Technologies, vol. 79, pp. 1-17, 2017.
[41] Y. Zhang and G. Huang, "Traffic flow prediction model based on deep belief network and genetic algorithm," IET Intelligent Transport Systems, vol. 12, no. 6, 2018.
[42] X. Ma, H. Yu, Y. Wang, and Y. Wang, "Large-scale transportation network congestion evolution prediction using deep learning theory," PLoS ONE, vol. 10, no. 3, Article ID e0119044, 2015.
[43] X. Ma, Z. Tao, Y. Wang, H. Yu, and Y. Wang, "Long short-term memory neural network for traffic speed prediction using remote microwave sensor data," Transportation Research Part C: Emerging Technologies, vol. 54, pp. 187-197, 2015.
[44] H. Shao and B.-H. Soong, "Traffic flow prediction with Long Short-Term Memory Networks (LSTMs)," in Proceedings of the IEEE Region 10 Conference, pp. 2986-2989, Singapore, 2016.
[45] Z. Zhao, W. Chen, X. Wu, P. C. Chen, and J. Liu, "LSTM network: a deep learning approach for short-term traffic forecast," IET Intelligent Transport Systems, vol. 11, no. 2, pp. 68-75, 2017.
[46] F. G. Habtemichael, M. Cetin, and K. A. Anuar, "Methodology for quantifying incident-induced delays on freeways by grouping similar traffic patterns," in Proceedings of the Transportation Research Record 94th Annual Meeting, Washington, DC, USA, 2015.
Journal of Advanced Transportation – Hindawi Publishing Corporation
Published: Feb 27, 2019