Comparison between Cascade Forward and Multi-Layer Perceptron Neural Networks for NARX Functional Electrical Stimulation (FES)-Based Muscle Model

Ihsan Mohd Yassin (1), Rozita Jailani (2), Megat Syahirul Amin Megat Ali (3), Rahimi Baharom (4), Abu Huzaifah Abu Hassan (5), Zairi Ismael Rizman (6)
(1-5) Faculty of Electrical Engineering, Universiti Teknologi MARA, Shah Alam, Selangor, Malaysia
(6) Universiti Teknologi MARA
How to cite (IJASEIT):
Mohd Yassin, Ihsan, et al. “Comparison Between Cascade Forward and Multi-Layer Perceptron Neural Networks for NARX Functional Electrical Stimulation (FES)-Based Muscle Model”. International Journal on Advanced Science, Engineering and Information Technology, vol. 7, no. 1, Feb. 2017, pp. 215-21, doi:10.18517/ijaseit.7.1.1388.
This paper presents the development and comparison of muscle models based on Functional Electrical Stimulation (FES) parameters, using the Nonlinear Auto-Regressive model with Exogenous Inputs (NARX) implemented with a Multi-Layer Perceptron (MLP) and a Cascade Forward Neural Network (CFNN). FES stimulation with varying frequency, pulse width and pulse duration was used to estimate muscle torque. About 722 data points were used to create the muscle model. One Step Ahead (OSA) prediction, correlation tests and residual histogram analysis were performed to validate the model. The optimal MLP results were obtained with an input lag space of 1, an output lag space of 43 and 30 hidden units. A total of three terms were selected to construct the final MLP model, producing a final Mean Square Error (MSE) of 1.1299. The optimal CFNN results were obtained with an input lag space of 1, an output lag space of 5 and 20 hidden units, with similar terms selected. The final MSE produced was 1.0320. The proposed approach approximated the behavior of the system well with unbiased residuals, with the CFNN showing an 8.66% MSE improvement over the MLP while using 33.33% fewer hidden units.
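
To illustrate the NARX structure summarized in the abstract, the following Python sketch builds lagged input/output regressors, trains a small neural network, and evaluates the One Step Ahead (OSA) prediction error. It is a minimal sketch only: the synthetic data, the use of scikit-learn's MLPRegressor, and the helper names are assumptions for illustration, not the authors' implementation, and a true cascade-forward network would additionally connect the inputs directly to the output layer.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def build_narx_regressors(u, y, nu, ny):
    """Build a NARX regression matrix from input u and output y.

    Each row holds the past nu input samples and ny output samples used
    to predict y[t]; the target vector holds y[t] itself.
    """
    start = max(nu, ny)
    X, t = [], []
    for k in range(start, len(y)):
        X.append(np.concatenate([u[k - nu:k], y[k - ny:k]]))
        t.append(y[k])
    return np.array(X), np.array(t)

# Hypothetical FES stimulation input (e.g. a pulse-width sequence) and
# measured muscle torque output, standing in for the 722-point data set.
rng = np.random.default_rng(0)
u = rng.uniform(0, 1, 722)
y = np.convolve(u, np.ones(5) / 5, mode="same") + 0.01 * rng.standard_normal(722)

# Lag spaces roughly matching the reported CFNN configuration
# (input lag 1, output lag 5); the MLP variant used an output lag of 43.
X, t = build_narx_regressors(u, y, nu=1, ny=5)

# A 20-hidden-unit network as a stand-in for the neural NARX model.
net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
net.fit(X, t)

# OSA prediction: each estimate uses measured (not predicted) past outputs.
y_osa = net.predict(X)
print("OSA MSE:", mean_squared_error(t, y_osa))

In this setup the residuals (t - y_osa) would then be examined with correlation tests and a residual histogram, as the paper does, to check that the model errors are unbiased.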
