Development of an Emulator for Radiation Physics Simulator Using Stochastic Variational Gaussian Process Model