dc.contributor.author | Rahman, Md. Abdur | |
dc.contributor.author | Hossain, Al-Amin | |
dc.contributor.author | Jawad, Tahmid | |
dc.date.accessioned | 2022-12-19T09:08:16Z | |
dc.date.available | 2022-12-19T09:08:16Z | |
dc.date.issued | 2022-05-30 | |
dc.identifier.citation | [1] “Shodhganga@INFLIBNET: Short Term Load Forecasting for Smart Power Systems.” http://shodhganga.inflibnet.ac.in:8080/jspui/handle/10603/222275 (accessed May 09, 2022). [2] A. S. Khwaja, A. Anpalagan, M. Naeem, and B. Venkatesh, “Joint bagged-boosted artificial neural networks: Using ensemble machine learning to improve short-term electricity load forecasting,” Electric Power Systems Research, vol. 179, p. 106080, Feb. 2020, doi: 10.1016/J.EPSR.2019.106080. [3] B. Farsi, M. Amayri, N. Bouguila, and U. Eicker, “On short-term load forecasting using machine learning techniques and a novel parallel deep LSTM-CNN approach,” IEEE Access, vol. 9, pp. 31191–31212, 2021, doi: 10.1109/ACCESS.2021.3060290. [4] M. Sajjad et al., “A Novel CNN-GRU-Based Hybrid Approach for Short-Term Residential Load Forecasting,” IEEE Access, vol. 8, pp. 143759–143768, 2020, doi: 10.1109/ACCESS.2020.3009537. [5] A. D. Papalexopoulos and T. C. Hesterberg, “A regression-based approach to short-term system load forecasting,” IEEE Transactions on Power Systems, vol. 5, no. 4, pp. 1535–1547, 1990, doi: 10.1109/59.99410. [6] M. T. Hagan and S. M. Behr, “The Time Series Approach to Short Term Load Forecasting,” IEEE Transactions on Power Systems, vol. 2, no. 3, pp. 785–791, 1987, doi: 10.1109/TPWRS.1987.4335210. [7] H. S. Hippert, C. E. Pedreira, and R. C. Souza, “Neural networks for short-term load forecasting: A review and evaluation,” IEEE Transactions on Power Systems, vol. 16, no. 1, pp. 44–55, Feb. 2001, doi: 10.1109/59.910780. [8] A. Khotanzad, R. Afkhami-Rohani, T. L. Lu, A. Abaye, M. Davis, and D. J. Maratukulam, “ANNSTLF - A neural-network-based electric load forecasting system,” IEEE Transactions on Neural Networks, vol. 8, no. 4, pp. 835–846, 1997, doi: 10.1109/72.595881. [9] A. Khotanzad and R. Afkhami-Rohani, “ANNSTLF - Artificial neural network short-term load forecaster - generation three,” IEEE Transactions on Power Systems, vol. 13, no. 4, pp. 
1413–1422, 1998, doi: 10.1109/59.736285. [10] H. S. Hippert and C. E. Pedreira, “Estimating temperature profiles for short-term load forecasting: Neural networks compared to linear models,” IEE Proceedings: Generation, Transmission and Distribution, vol. 151, no. 4, pp. 543–547, Jul. 2004, doi: 10.1049/IPGTD:20040491. [11] A. Khotanzad, M. H. Davis, and A. Abaye, “An artificial neural network hourly temperature forecaster with applications in load forecasting,” IEEE Transactions on Power Systems, vol. 11, no. 2, pp. 870–876, 1996, doi: 10.1109/59.496168. [12] S. Fan, K. Methaprayoon, and W. J. Lee, “Multiregion load forecasting for system with large geographical area,” IEEE Transactions on Industry Applications, vol. 45, no. 4, pp. 1452–1459, 2009, doi: 10.1109/TIA.2009.2023569. [13] S. Rahman, “Formulation and Analysis of a Rule-Based Short-Term Load Forecasting Algorithm,” Proceedings of the IEEE, vol. 78, no. 5, pp. 805–816, 1990, doi: 10.1109/5.53400. [14] N. F. Hubele and C. S. Cheng, “Identification of seasonal short-term load forecasting models using statistical decision functions,” IEEE Transactions on Power Systems, vol. 5, no. 1, pp. 40–45, 1990, doi: 10.1109/59.49084. [15] G. Zhang, X. Bai, and Y. Wang, “Short-time multi-energy load forecasting method based on CNN-Seq2Seq model with attention mechanism,” Machine Learning with Applications, vol. 5, p. 100064, Sep. 2021, doi: 10.1016/J.MLWA.2021.100064. [16] S. He et al., “A per-unit curve rotated decoupling method for CNN-TCN based day-ahead load forecasting,” IET Generation, Transmission & Distribution, vol. 15, no. 19, pp. 2773–2786, Oct. 2021, doi: 10.1049/GTD2.12214. [17] S. M. J. Jalali, S. Ahmadian, A. Khosravi, M. Shafie-khah, S. Nahavandi, and J. P. S. Catalao, “A Novel Evolutionary-based Deep Convolutional Neural Network Model for Intelligent Load Forecasting,” IEEE Transactions on Industrial Informatics, 2021, doi: 10.1109/TII.2021.3065718. [18] G. Dudek, P. Pelka, and S. 
Smyl, “A Hybrid Residual Dilated LSTM and Exponential Smoothing Model for Midterm Electric Load Forecasting,” IEEE Transactions on Neural Networks and Learning Systems, 2021, doi: 10.1109/TNNLS.2020.3046629. [19] M. Khan, H. Wang, A. Riaz, A. Elfatyany, and S. Karim, “Bidirectional LSTM-RNN-based hybrid deep learning frameworks for univariate time series classification,” Journal of Supercomputing, vol. 77, no. 7, pp. 7021–7045, Jul. 2021, doi: 10.1007/s11227-020-03560-z. [20] “time series | statistics | Britannica.” https://www.britannica.com/topic/time-series (accessed May 10, 2022). [21] “Time series - Wikipedia.” https://en.wikipedia.org/wiki/Time_series (accessed May 10, 2022). [22] “What Is Time Series Forecasting?” https://machinelearningmastery.com/time-series-forecasting/ (accessed May 10, 2022). [23] P. A. Gagniuc, “Markov chains: from theory to implementation and experimentation”. [24] “Stationary process - Wikipedia.” https://en.wikipedia.org/wiki/Stationary_process (accessed May 10, 2022). [25] “How to Remove Trends and Seasonality with a Difference Transform in Python.” https://machinelearningmastery.com/remove-trends-seasonality-difference-transform-python/ (accessed May 10, 2022). [26] “Stationarity in Time Series Analysis Explained using Python.” https://blog.quantinsti.com/stationarity/ (accessed May 10, 2022). [27] “How to Check if Time Series Data is Stationary with Python.” https://machinelearningmastery.com/time-series-data-stationary-python/ (accessed May 10, 2022). [28] “Augmented Dickey–Fuller test - Wikipedia.” https://en.wikipedia.org/wiki/Augmented_Dickey%E2%80%93Fuller_test (accessed May 10, 2022). [29] “What is a trend in time series? - GeeksforGeeks.” https://www.geeksforgeeks.org/what-is-a-trend-in-time-series/ (accessed May 10, 2022). [30] “Mann Kendall Trend Test: Definition, Running the Test - Statistics How To.” https://www.statisticshowto.com/mann-kendall-trend-test/ (accessed May 10, 2022). 
[31] “Seasonal Kendall Test - Statistics How To.” https://www.statisticshowto.com/seasonal-kendall-test/ (accessed May 10, 2022). [32] “Time Series and Forecasting Time Series • A time series is a sequence of measurements over time, usually obtained at equally spaced intervals-Daily-Monthly-Quarterly-Yearly”. [33] “What Are Data Trends and Patterns, and How Do They Impact Business Decisions? - DATAVERSITY.” https://www.dataversity.net/data-trends-patterns-impact-business-decisions/ (accessed May 10, 2022). [34] “Seasonality - Wikipedia.” https://en.wikipedia.org/wiki/Seasonality (accessed May 10, 2022). [35] “Seasonality Definition.” https://www.investopedia.com/terms/s/seasonality.asp (accessed May 10, 2022). [36] “seasonality - Barrons Dictionary - AllBusiness.com.” https://www.allbusiness.com/barrons_dictionary/dictionary-seasonality-4946957-1.html (accessed May 10, 2022). [37] “How to Identify and Remove Seasonality from Time Series Data with Python.” https://machinelearningmastery.com/time-series-seasonality-with-python/ (accessed May 10, 2022). [38] “Chapter 2 Time series graphics | Forecasting: Principles and Practice (2nd ed).” https://otexts.com/fpp2/graphics.html (accessed May 10, 2022). [39] “Noise in time series data. There are two types of noise in time… | by Gajanan Kothawade | Medium.” https://medium.com/@kothawadegs/noise-in-time-series-data-63c5450e10f9 (accessed May 10, 2022). [40] “White Noise Time Series with Python.” https://machinelearningmastery.com/white-noise-time-series-python/ (accessed May 10, 2022). [41] “Lecture 11: White and red noise”. [42] M. Vilela et al., “Fluctuation Analysis of Activity Biosensor Images for the Study of Information Flow in Signaling Pathways,” Methods in Enzymology, vol. 519, pp. 253–276, Jan. 2013, doi: 10.1016/B978-0-12-405539-1.00009-9. [43] “Autocorrelation - Wikipedia.” https://en.wikipedia.org/wiki/Autocorrelation (accessed May 10, 2022). 
[44] “A Gentle Introduction to Autocorrelation and Partial Autocorrelation.” https://machinelearningmastery.com/gentle-introduction-autocorrelation-partial-autocorrelation/ (accessed May 10, 2022). [45] “Partial autocorrelation function - Wikipedia.” https://en.wikipedia.org/wiki/Partial_autocorrelation_function (accessed May 10, 2022). [46] “Autoregressive model - Wikipedia.” https://en.wikipedia.org/wiki/Autoregressive_model (accessed May 10, 2022). [47] “Autoregressive Model: Definition & The AR Process - Statistics How To.” https://www.statisticshowto.com/autoregressive-model/ (accessed May 10, 2022). [48] “Autoregressive–moving-average model - Wikipedia.” https://en.wikipedia.org/wiki/Autoregressive%E2%80%93moving-average_model (accessed May 10, 2022). [49] J. Gurland and P. Whittle, “Hypothesis Testing in Time Series Analysis.,” J Am Stat Assoc, vol. 49, no. 265, p. 197, Mar. 1954, doi: 10.2307/2281054. [50] P. Whittle, Hypothesis testing in time series analysis. Uppsala: Almqvist & Wiksells boktr., 1951. [51] “The Statistical Theory of Linear Systems”, Accessed: May 10, 2022. [Online]. Available: https://epubs.siam.org/terms-privacy [52] “Autoregressive integrated moving average - Wikipedia.” https://en.wikipedia.org/wiki/Autoregressive_integrated_moving_average (accessed May 10, 2022). [53] D. Kwiatkowski, P. C. B. Phillips, P. Schmidt, and Y. Shin, “Testing the null hypothesis of stationarity against the alternative of a unit root. How sure are we that economic time series have a unit root?,” Journal of Econometrics, vol. 54, no. 1–3, pp. 159–178, 1992, doi: 10.1016/0304-4076(92)90104-Y. [54] “8.9 Seasonal ARIMA models | Forecasting: Principles and Practice (2nd ed).” https://otexts.com/fpp2/seasonal-arima.html (accessed May 10, 2022). [55] “Machine learning - Wikipedia.” https://en.wikipedia.org/wiki/Machine_learning (accessed May 10, 2022). [56] T. M. Mitchell, Machine Learning. New York: McGraw-Hill, 1997. 
[57] “Machine Learning: What it is and why it matters | SAS.” https://www.sas.com/en_us/insights/analytics/machine-learning.html (accessed May 10, 2022). [58] “Linear regression - Wikipedia.” https://en.wikipedia.org/wiki/Linear_regression (accessed May 10, 2022). [59] D. A. Freedman, “Statistical models: Theory and practice,” Statistical Models: Theory and Practice, pp. 1–442, Jan. 2009, doi: 10.1017/CBO9780511815867. [60] “Methods of Multivariate Analysis - Alvin C. Rencher, William F. Christensen - Google Books.” https://books.google.com.bd/books?id=0gPAuKub3QC&pg=PA19&redir_esc=y#v=onepage&q&f=false (accessed May 10, 2022). [61] H. L. Seal, “Studies in the History of Probability and Statistics. XV The historical development of the Gauss linear model,” Biometrika, vol. 54, no. 1–2, pp. 1–24, Jun. 1967, doi: 10.1093/BIOMET/54.1-2.1. [62] “Linear Regression for Machine Learning.” https://machinelearningmastery.com/linear-regression-for-machine-learning/ (accessed May 10, 2022). [63] “Ordinary least squares - Wikipedia.” https://en.wikipedia.org/wiki/Ordinary_least_squares (accessed May 10, 2022). [64] “Gradient descent - Wikipedia.” https://en.wikipedia.org/wiki/Gradient_descent (accessed May 10, 2022). [65] “Lasso (statistics) - Wikipedia.” https://en.wikipedia.org/wiki/Lasso_(statistics) (accessed May 10, 2022). [66] “Tikhonov regularization - Wikipedia.” https://en.wikipedia.org/wiki/Tikhonov_regularization (accessed May 10, 2022). [67] “| Xoriant.” https://www.xoriant.com/blog/product-engineering/decision-trees-machine-learning-algorithm.html (accessed May 10, 2022). [68] “1.10. Decision Trees — scikit-learn 1.0.2 documentation.” https://scikit-learn.org/stable/modules/tree.html (accessed May 10, 2022). [69] “Decision Tree in Machine Learning | by Prince Yadav | Towards Data Science.” https://towardsdatascience.com/decision-tree-in-machine-learning-e380942a4c96 (accessed May 10, 2022). 
[70] “Unlocking the True Power of Support Vector Regression | by Ashwin Raj | Towards Data Science.” https://towardsdatascience.com/unlocking-the-true-power-of-support-vector-regression-847fd123a4a0 (accessed May 10, 2022). [71] “Support Vector Regression (SVR) | Analytics Vidhya.” https://medium.com/analytics-vidhya/support-vector-regression-svr-model-a-regression-based-machine-learning-approach-f4641670c5bb (accessed May 10, 2022). [72] A. Singh, V. Kotiyal, S. Sharma, J. Nagar, and C. C. Lee, “A Machine Learning Approach to Predict the Average Localization Error with Applications to Wireless Sensor Networks,” IEEE Access, vol. 8, pp. 208253–208263, 2020, doi: 10.1109/ACCESS.2020.3038645. [73] “sklearn.svm.SVR — scikit-learn 1.0.2 documentation.” https://scikit-learn.org/stable/modules/generated/sklearn.svm.SVR.html (accessed May 10, 2022). [74] “K-Nearest Neighbor. A complete explanation of K-NN | by Antony Christopher | The Startup | Medium.” https://medium.com/swlh/k-nearest-neighbor-ca2593d7a3c4 (accessed May 10, 2022). [75] “k-nearest neighbors algorithm - Wikipedia.” https://en.wikipedia.org/wiki/K-nearest_neighbors_algorithm (accessed May 10, 2022). [76] S. M. Piryonesi and T. E. El-Diraby, “Role of Data Analytics in Infrastructure Asset Management: Overcoming Data Size and Quality Problems,” Journal of Transportation Engineering, Part B: Pavements, vol. 146, no. 2, p. 04020022, Apr. 2020, doi: 10.1061/JPEODX.0000175. [77] T. Hastie, R. Tibshirani, and J. H. Friedman, The elements of statistical learning: data mining, inference, and prediction: with 200 full-color illustrations. New York: Springer, 2001. [78] P. A. Jaskowiak and R. J. G. B. Campello, “Comparing Correlation Coefficients as Dissimilarity Measures for Cancer Classification in Gene Expression Data,” BRAZILIAN SYMPOSIUM ON BIOINFORMATICS (BSB), pp. 1–8, 2011, Accessed: May 10, 2022. [Online]. 
Available: https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.208.993 [79] D. Coomans and D. L. Massart, “Alternative k-nearest neighbour rules in supervised pattern recognition : Part 1. k-Nearest neighbour classification by using alternative voting rules,” Analytica Chimica Acta, vol. 136, no. C, pp. 15–27, Jan. 1982, doi: 10.1016/S0003-2670(01)95359-0. [80] “Wayback Machine.” https://web.archive.org/web/20150119081741/http://oz.berkeley.edu/~breiman/arcall96.pdf (accessed May 10, 2022). [81] “Boosting (machine learning) - Wikipedia.” https://en.wikipedia.org/wiki/Boosting_(machine_learning) (accessed May 10, 2022). [82] Z. Zhou, “Boosting 25 Years,” 2020. [83] “Boosting in Machine Learning | Boosting and AdaBoost - GeeksforGeeks.” https://www.geeksforgeeks.org/boosting-in-machine-learning-boosting-and-adaboost/ (accessed May 10, 2022). [84] “XGBoost - Wikipedia.” https://en.wikipedia.org/wiki/XGBoost (accessed May 10, 2022). [85] “A Gentle Introduction to XGBoost for Applied Machine Learning.” https://machinelearningmastery.com/gentle-introduction-xgboost-applied-machine-learning/ (accessed May 10, 2022). [86] “LightGBM (Light Gradient Boosting Machine) - GeeksforGeeks.” https://www.geeksforgeeks.org/lightgbm-light-gradient-boosting-machine/ (accessed May 10, 2022). [87] “LGB, the winning Gradient Boosting model ⋆ Code A Star.” https://www.codeastar.com/lgb-winning-gradient-boosting-model/ (accessed May 10, 2022). [88] “A Primer to Ensemble Learning – Bagging and Boosting.” https://analyticsindiamag.com/primer-ensemble-learning-bagging-boosting/ (accessed May 10, 2022). [89] “Bagging and Random Forest Ensemble Algorithms for Machine Learning.” https://machinelearningmastery.com/bagging-and-random-forest-ensemble-algorithms-for-machine-learning/ (accessed May 10, 2022). [90] “Random Forest | Introduction to Random Forest Algorithm.” https://www.analyticsvidhya.com/blog/2021/06/understanding-random-forest/ (accessed May 10, 2022). 
[91] “What is Deep Learning?” https://machinelearningmastery.com/what-is-deep-learning/ (accessed May 10, 2022). [92] “Deep learning - Wikipedia.” https://en.wikipedia.org/wiki/Deep_learning (accessed May 10, 2022). [93] Y. Lecun, Y. Bengio, and G. Hinton, “Deep learning,” Nature, vol. 521, no. 7553, pp. 436–444, May 2015, doi: 10.1038/nature14539. [94] “What Is Deep Learning? | How It Works, Techniques & Applications - MATLAB & Simulink.” https://www.mathworks.com/discovery/deep-learning.html (accessed May 10, 2022). [95] “A Comprehensive Guide to Convolutional Neural Networks — the ELI5 way | by Sumit Saha | Towards Data Science.” https://towardsdatascience.com/a-comprehensive-guide-to-convolutional-neural-networks-the-eli5-way-3bd2b1164a53 (accessed May 10, 2022). [96] M. V. Valueva, N. N. Nagornov, P. A. Lyakhov, G. V. Valuev, and N. I. Chervyakov, “Application of the residue number system to reduce hardware costs of the convolutional neural network implementation,” Mathematics and Computers in Simulation, vol. 177, pp. 232–243, Nov. 2020, doi: 10.1016/J.MATCOM.2020.04.031. [97] J. Tanida, K. Itoh, W. Zhang, and Y. Ichioka, “Parallel distributed processing model with local space-invariant interconnections and its optical architecture,” Applied Optics, vol. 29, no. 32, pp. 4790–4797, Nov. 1990, doi: 10.1364/AO.29.004790. [98] “Convolutional neural network - Wikipedia.” https://en.wikipedia.org/wiki/Convolutional_neural_network (accessed May 10, 2022). [99] R. Venkatesan and B. Li, “Convolutional Neural Networks in Visual Computing : A Concise Guide,” Convolutional Neural Networks in Visual Computing, Oct. 2017, doi: 10.4324/9781315154282. [100] R. Venkatesan and B. Li, “Convolutional neural networks in visual computing : a concise guide,” p. 168. [101] D. C. Cireşan, U. Meier, J. Masci, L. M. Gambardella, and J. Schmidhuber, “Flexible, High Performance Convolutional Neural Networks for Image Classification”. [102] A. Krizhevsky, I. Sutskever, and G. Hinton, “ImageNet Classification with Deep Convolutional Neural Networks”. [103] D. Ciregan, U. Meier, and J. Schmidhuber, “Multi-column deep neural networks for image classification,” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 3642–3649, 2012, doi: 10.1109/CVPR.2012.6248110. [104] “MNIST Demos on Yann LeCun’s website.” http://yann.lecun.com/exdb/lenet/ (accessed May 10, 2022). [105] O. I. Abiodun, A. Jantan, A. E. Omolara, K. V. Dada, N. A. E. Mohamed, and H. Arshad, “State-of-the-art in artificial neural network applications: A survey,” Heliyon, vol. 4, no. 11, p. e00938, Nov. 2018, doi: 10.1016/J.HELIYON.2018.E00938. [106] A. Tealab, “Time series forecasting using artificial neural networks methodologies: A systematic review,” Future Computing and Informatics Journal, vol. 3, no. 2, pp. 334–340, Dec. 2018, doi: 10.1016/J.FCIJ.2018.10.003. [107] X. Li and X. Wu, “Constructing Long Short-Term Memory based Deep Recurrent Neural Networks for Large Vocabulary Speech Recognition,” Oct. 2014, doi: 10.48550/arxiv.1410.4281. [108] H. Sak, A. Senior, and F. Beaufays, “Long Short-Term Memory Recurrent Neural Network Architectures for Large Scale Acoustic Modeling”. [109] “Turing Machines are Recurrent Neural Networks.” http://users.ics.aalto.fi/tho/stes/step96/hyotyniemi1/ (accessed May 10, 2022). [110] M. Miljanovic, “Comparative analysis of Recurrent and Finite Impulse Response Neural Networks in Time Series Prediction.” [111] “LSTM Networks | A Detailed Explanation | Towards Data Science.” https://towardsdatascience.com/lstm-networks-a-detailed-explanation-8fae6aefc7f9 (accessed May 10, 2022). 
[112] “Long short-term memory - Wikipedia.” https://en.wikipedia.org/wiki/Long_short-term_memory (accessed May 10, 2022). [113] “LSTM can Solve Hard Long Time Lag Problems.” https://papers.nips.cc/paper/1996/hash/a4d2f0d23dcc84ce983ff9157f8b7f88-Abstract.html (accessed May 10, 2022). [114] F. A. Gers, J. Schmidhuber, and F. Cummins, “Learning to Forget: Continual Prediction with LSTM,” Neural Computation, vol. 12, no. 10, pp. 2451–2471, Oct. 2000, doi: 10.1162/089976600300015015. [115] O. Calin, “Deep learning architectures : a mathematical approach,” p. 760. [116] “What is a Gated Recurrent Unit (GRU)? - Definition from Techopedia.” https://www.techopedia.com/definition/33283/gated-recurrent-unit-gru (accessed May 10, 2022). [117] F. A. Gers, J. Schmidhuber, and F. Cummins, “Learning to forget: Continual prediction with LSTM,” IEE Conference Publication, vol. 2, no. 470, pp. 850–855, 1999, doi: 10.1049/CP:19991218. [118] “Recurrent Neural Network Tutorial, Part 4 – Implementing a GRU/LSTM RNN with Python and Theano – WildML.” https://web.archive.org/web/20211110112626/http://www.wildml.com/2015/10/recurrent-neural-network-tutorial-part-4-implementing-a-grulstm-rnn-with-python-and-theano/ (accessed May 10, 2022). [119] “Understanding GRU Networks In 2021.” https://www.jigsawacademy.com/blogs/data-science/gru/ (accessed May 10, 2022). [120] M. Massaoudi, S. S. Refaat, I. Chihi, M. Trabelsi, F. S. Oueslati, and H. Abu-Rub, “A novel stacked generalization ensemble-based hybrid LGBM-XGB-MLP model for Short-Term Load Forecasting,” Energy, vol. 214, p. 118874, Jan. 2021, doi: 10.1016/J.ENERGY.2020.118874. [121] S. H. Rafi, N. Al-Masood, S. R. Deeba, and E. Hossain, “A short-term load forecasting method using integrated CNN and LSTM network,” IEEE Access, vol. 9, pp. 32436–32448, 2021, doi: 10.1109/ACCESS.2021.3060654. | en_US |
dc.identifier.uri | http://hdl.handle.net/123456789/1618 | |
dc.description | Supervised by Md. Thesun Al-Amin, Assistant Professor, Department of Electrical and Electronic Engineering (EEE), Islamic University of Technology (IUT), Board Bazar, Gazipur-1704, Bangladesh. This thesis is submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Electrical and Electronic Engineering, 2022. | en_US |
dc.description.abstract | Short-term load forecasting (STLF) has long been one of the most crucial, delicate, and precision-demanding tasks in energy systems. An efficient STLF enhances not only the financial feasibility of the system but also its safety, consistency, and dependability in performance, enabling the realization of a prospective smart electricity system. State-of-the-art models must cope with significant nonlinearity in load data and offer limited applicability in real-world circumstances; for practical deployment, the energy forecasting field requires better resilience, improved prediction accuracy, and greater adaptability. The study presented in this paper supports the case for a hybrid strategy, in which the complementary qualities of several cognitive methodologies are merged to provide a superior solution to the STLF problem. Deep learning models for STLF, integrated with statistical techniques, are presented for accurate load forecasting. Temperature, humidity, and day type are all taken into account, since they have a substantial effect on the overall performance of an appropriate STLF. The load demand data were collected from the PGCB database, the weather data from the rp5 archive, and the holidays from the government calendar, which concludes the data collection part of our research. Data refinement was then carried out, applying several preprocessing techniques to the raw data. After proper analysis and scaling, the data were fed into the deep learning models, where training, validation, and testing were performed. In terms of computational complexity and prediction accuracy, the proposed model outperforms prior hybrid models, and comparison against existing power prediction results shows that it performs better in terms of precision and accuracy. The evaluation was conducted using the Mean Absolute Percentage Error (MAPE) and the R-squared score, on which the proposed model outperforms the results reported in the existing literature. Compared to existing baseline models, the proposed technique achieved the lowest error rate on the Power Grid Company of Bangladesh (PGCB) dataset. | en_US |
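The abstract evaluates forecasts with MAPE and the R-squared score. As an illustrative sketch only (not code from the thesis; the load values below are hypothetical, not PGCB data), these two metrics can be computed as:

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

def r_squared(y_true, y_pred):
    """Coefficient of determination (R^2): 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Hypothetical hourly load values (MW) and forecasts
actual = [5200, 5350, 5100, 4980]
forecast = [5150, 5400, 5050, 5020]
print(f"MAPE = {mape(actual, forecast):.3f}%, R^2 = {r_squared(actual, forecast):.3f}")
```

A lower MAPE and an R-squared score closer to 1 indicate a better forecast, which is the sense in which the abstract's comparison against baseline models should be read.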
dc.language.iso | en | en_US |
dc.publisher | Department of Electrical and Electronic Engineering (EEE), Islamic University of Technology (IUT) | en_US |
dc.subject | CNN, LSTM, GRU, Deep Learning, Load Forecasting, Machine Learning, PGCB | en_US |
dc.title | An Efficient Short Term Load Demand Forecasting Using a Novel Parallel CNN-BiLSTM Hybrid Neural Network for Bangladesh Perspective | en_US |
dc.type | Thesis | en_US |