Predicting the Amount of Digestive Enzymes Medicine Usage with LSTM

Adhistya Erna Permanasari, Abi Mahan Zaky, Silmi Fauziati, Ida Fitriana
Universitas Gadjah Mada
How to cite (IJASEIT) :
Permanasari, Adhistya Erna, et al. “Predicting the Amount of Digestive Enzymes Medicine Usage With LSTM”. International Journal on Advanced Science, Engineering and Information Technology, vol. 8, no. 5, Oct. 2018, pp. 1845-9, doi:10.18517/ijaseit.8.5.6511.
Medicines are widely used to prevent or cure illness. One type often used to relieve stomach pain contains digestive enzymes, and it is in high demand at hospitals and other health institutions. These institutions must ensure that medications are available for patients, which forces them to cope with uncertainty in medicine usage. Ensuring medicine availability is one of the main challenges hospitals face, and the ability to forecast demand can help meet it. This study presents a forecasting model based on the Long Short-Term Memory (LSTM) method to predict the need for medicines containing digestive enzymes in a hospital. The method was chosen because it is known to achieve high accuracy when predicting stationary data. Input identification for the LSTM model uses the Autocorrelation Function (ACF) and Partial Autocorrelation Function (PACF). The results indicate that the LSTM method is suitable for time series forecasting on the historical dataset, achieving a Root Mean Square Error (RMSE) of 12.733.
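The two quantitative ingredients named in the abstract — ACF-based input identification and the RMSE evaluation metric — can be sketched as below. This is a minimal illustration, not the paper's implementation: the synthetic usage series, the function names, and the lag range are assumptions for demonstration only.

```python
import numpy as np

def acf(series, max_lag):
    """Sample autocorrelation function for lags 0..max_lag.

    Peaks at particular lags suggest which past time steps to feed
    the forecasting model as inputs.
    """
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    var = np.dot(x, x)  # lag-0 (unnormalized) autocovariance
    return np.array(
        [1.0] + [np.dot(x[:-k], x[k:]) / var for k in range(1, max_lag + 1)]
    )

def rmse(actual, predicted):
    """Root Mean Square Error between observed and forecast values."""
    a = np.asarray(actual, dtype=float)
    p = np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((a - p) ** 2)))

# Hypothetical monthly medicine-usage series (synthetic, 48 months)
rng = np.random.default_rng(0)
usage = 100 + rng.normal(0.0, 5.0, 48)

lags = acf(usage, max_lag=12)   # inspect for significant lags
error = rmse([1, 2, 3], [1, 2, 4])  # sqrt(1/3) ≈ 0.5774
```

Lags whose autocorrelation stands clearly above the noise band would be chosen as the LSTM's input window; the fitted model's forecasts are then scored with `rmse` against held-out observations, mirroring the 12.733 RMSE reported in the abstract.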

Authors who publish with this journal agree to the following terms:

    1. Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
    2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
    3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).