Advanced Object Tracking through Conditional Online Updates and Noise Suppression

Suchang Lim (1), Jongchan Kim (1)
(1) Department of Computer Engineering, Sunchon National University, 235, Jungang-ro, Suncheon-si, Jeollanam-do, Republic of Korea
How to cite (IJASEIT):
Lim, Suchang, and Jongchan Kim. “Advanced Object Tracking through Conditional Online Updates and Noise Suppression”. International Journal on Advanced Science, Engineering and Information Technology, vol. 14, no. 5, Oct. 2024, pp. 1596-01, doi:10.18517/ijaseit.14.5.20444.
Object tracking under complex environmental conditions, such as background clutter, occlusion, and rapid motion, presents significant challenges. This paper addresses these issues with a tracking algorithm that integrates background suppression, target region enhancement, and an adaptive online template update mechanism to improve tracking accuracy. The proposed method uses the initial bounding box of the target object as a reference template and updates only selected regions online, suppressing noise while retaining critical features. We evaluated the proposed method on the OTB dataset. The baseline model without the proposed method achieved a success rate of 0.417 and a precision of 0.586, while the algorithm with the proposed method improved these to 0.524 and 0.728, respectively. Qualitative evaluations further confirmed the robustness of the proposed method, showing strong performance in scenarios with occlusion and complex backgrounds. Rather than updating all regions indiscriminately, the method updates the template selectively, using representative values derived from the target object's appearance. This selective update incorporates only the most relevant and accurate features, allowing the algorithm to adapt to changes in the target's appearance while minimizing the integration of noise. Emphasizing the feature regions and suppressing noise also keep the target representation clear and precise, reducing the likelihood of confusion with irrelevant background information. Future research will focus on balanced update strategies that integrate new information while maintaining stable and reliable target characteristics.

This work is licensed under a Creative Commons Attribution 4.0 International License.
