Comparison of optimizers (Adam, RMSprop, SGD and Adagrad) in a neural network for mineral resource classification: a case study in a copper deposit in Peru

Document Type: Research Paper

Authors

1 Department of Mining Engineering, Faculty of Engineering, National University of Trujillo, Trujillo, Perú.

2 Faculty of Chemical Engineering, National University of the Altiplano, Puno, Perú.

10.22059/ijmge.2025.393902.595243

Abstract

This study compared the performance of several optimizers for mineral resource classification using a multilayer perceptron (MLP) artificial neural network applied to a copper deposit in Peru. The optimizers Adam (adaptive moment estimation), RMSprop (root mean square propagation), SGD (stochastic gradient descent), and Adagrad (adaptive gradient algorithm) were evaluated for their impact on the spatial continuity of block classification. A total of 318,443 blocks were estimated by ordinary kriging and characterized by key variables including estimated grade, kriging variance, average sample distance, number of composited samples, the kriging Lagrangian, and geological confidence. The methodology involved mixed multivariate block-by-block clustering with the k-prototypes algorithm, followed by block smoothing with an artificial neural network trained using the different optimizers. Results show that the Adam optimizer achieved the highest overall accuracy (93%), outperforming RMSprop and SGD (both 92%) as well as Adagrad (90%). In addition, Adam yielded a more homogeneous classification of mineral resources: it categorized 75,869 blocks as measured (1,395.99 Mt total tonnage, 5.40 Mt fine copper), 120,039 as indicated (2,208.72 Mt and 6.56 Mt fine copper), and 122,535 as inferred (2,254.64 Mt and 6.29 Mt fine copper). In conclusion, the model trained with the Adam optimizer demonstrated superior precision and stability in mineral resource classification, effectively mitigating the “spotty dog” effect and improving the geological coherence of the block model.
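To illustrate the kind of comparison the abstract describes, the sketch below trains the same small MLP classifier with each of the four optimizers and reports validation accuracy. It is not the authors' code: the network size, learning rates, epochs, and the synthetic stand-in data are illustrative assumptions; in the study, the six input features per block are the kriging-derived variables listed above (estimated grade, kriging variance, average sample distance, number of composites, kriging Lagrangian, geological confidence), and the three target classes are measured, indicated, and inferred.

```python
# Minimal sketch: comparing Adam, RMSprop, SGD, and Adagrad on an MLP
# block classifier. Data is synthetic; all hyperparameters are assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

rng = np.random.default_rng(42)
n_blocks, n_features = 5000, 6            # stand-in for the 318,443-block model
X = rng.normal(size=(n_blocks, n_features)).astype("float32")
y = rng.integers(0, 3, size=n_blocks)     # 0 = measured, 1 = indicated, 2 = inferred

def build_mlp():
    # Small fully connected network with a 3-class softmax output
    return keras.Sequential([
        layers.Input(shape=(n_features,)),
        layers.Dense(32, activation="relu"),
        layers.Dense(16, activation="relu"),
        layers.Dense(3, activation="softmax"),
    ])

optimizers = {
    "Adam": keras.optimizers.Adam(learning_rate=1e-3),
    "RMSprop": keras.optimizers.RMSprop(learning_rate=1e-3),
    "SGD": keras.optimizers.SGD(learning_rate=1e-2),
    "Adagrad": keras.optimizers.Adagrad(learning_rate=1e-2),
}

for name, opt in optimizers.items():
    model = build_mlp()
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(X, y, epochs=10, batch_size=256,
                        validation_split=0.2, verbose=0)
    print(f"{name}: val_accuracy = {history.history['val_accuracy'][-1]:.3f}")
```

With real block-model features, the per-optimizer validation accuracies produced by such a loop correspond to the 90–93% figures reported in the abstract; the predicted class per block can then be mapped back to the block model to assess spatial continuity of the measured/indicated/inferred categories.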
