Geostatistics had its origins in the 1950s with the pioneering work of Herbert Sichel and Danie Krige. Becoming aware of these findings, Georges Matheron formalized them in the late 1950s and early 1960s into what are known today as the fundamentals of geostatistics. In the years that followed, geostatistics quickly developed into a successful field for the spatial estimation and simulation of natural phenomena, and despite its critics, it has proved itself most useful in industries such as mining, forestry, fisheries and soil science. As early as the 1950s, Danie Krige was aware of the spatial correlation exhibited between mining samples, and since its formalization in the early 1960s, geostatistics has been used extensively to evaluate mineral resources and reserves.
Alongside the new geostatistical methods that were continually being introduced, the advent of computers, and their ever-increasing processing power, made it possible to perform far more geostatistical computations in far less time. Geostatistical software packages were also developed and steadily improved, and today this software industry is a sizeable market. Together with these continual improvements, the user-friendliness of the packages has also increased, allowing increasingly complex ore deposits to be analysed geostatistically in less time.
[Figure: Example of a Geostatistical Grade Model]
A search of the internet will show that a large number of journal articles and books have been dedicated to the field of geostatistics, and various large geostatistical conferences now take place globally. Many schools throughout the world also offer courses in geostatistics, and the applications of geostatistics seem endless. Geostatistics rightly enjoys a very good standing in the scientific community.
Nonetheless, it remains true that any good geostatistical estimate is built upon sound geological and mathematical foundations. Understanding the assumptions of the methods being used, and how deviations from these assumptions will ultimately affect the estimates, remains non-negotiable. An appreciation of what the software algorithms are doing to the data is equally important; simply being able to operate the software package is not enough. It is therefore concerning to hear comments such as:
- “geostatistics is nothing more than a black-box exercise”
- “a few presses of a button can provide a functional mineral resource estimate”
- “the assumptions of the geostatistical technique being used are guidelines and generally do not need to be adhered to”
Indeed, it has become relatively easy to apply geostatistical methods to datasets without first considering the quality and statistical properties of the data, the assumptions of the geostatistical method being used, or the algorithms of the software. Nevertheless, a practitioner should never shy away from a detailed prior investigation of the dataset, a thorough understanding of the geology, and an appreciation of the mathematical assumptions and requirements of the techniques being applied. A geostatistical software package is a powerful tool in a practitioner's arsenal, but it should not be used merely as a black box or a button-pressing instrument. An example of such a prior investigation is sketched below.
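By way of illustration, the following minimal Python sketch computes a classical (Matheron) experimental semi-variogram, one of the first checks a practitioner might run before any kriging. The data, function names and parameters here are hypothetical; a real study would also examine histograms, declustering, domaining and directional behaviour. This is a sketch of the idea, not a production implementation.

```python
import numpy as np

def experimental_variogram(coords, values, lag_width, n_lags):
    """Classical experimental semi-variogram:
    gamma(h) = 1 / (2 * N(h)) * sum of (z_i - z_j)^2 over pairs ~h apart."""
    # Pairwise separation distances and squared grade differences
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)   # count each pair once
    d, sq = d[iu], sq[iu]

    lags, gammas, counts = [], [], []
    for k in range(n_lags):
        lo, hi = k * lag_width, (k + 1) * lag_width
        mask = (d >= lo) & (d < hi)
        if mask.sum() > 0:
            lags.append(d[mask].mean())
            gammas.append(0.5 * sq[mask].mean())
            counts.append(int(mask.sum()))
    return np.array(lags), np.array(gammas), np.array(counts)

# Hypothetical example: 200 random sample locations with a gentle spatial trend
rng = np.random.default_rng(42)
coords = rng.uniform(0, 100, size=(200, 2))
values = 0.05 * coords[:, 0] + rng.normal(0, 1, 200)   # synthetic grades

lags, gammas, counts = experimental_variogram(coords, values, lag_width=10, n_lags=8)
for h, g, n in zip(lags, gammas, counts):
    print(f"lag ~{h:5.1f} m  gamma = {g:6.3f}  ({n} pairs)")
```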
For this reason, it is imperative that a geostatistical practitioner remains abreast of the latest developments in geostatistics through research and continual professional development. Regulatory bodies appreciate this fact and nowadays often require members to earn continuing professional development points through research in order to maintain their professional membership. This is good practice and will facilitate the successful growth of the geostatistical field.
Moreover, practitioners should be knowledgeable in the use of their preferred software package, and should avoid using techniques in which they have limited confidence or understanding. Despite the countless methods available for geostatistical estimation, Micon has often found that the so-called less complicated geostatistical methods, with more basic assumptions, perform just as well as the more complex methods, provided the practitioner applies their expertise and experience of the natural phenomenon by incorporating geological, mathematical and geostatistical properties into the spatial estimates. A minimal example of one such simpler method, ordinary kriging, is sketched below.
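To make this concrete, below is a minimal sketch of ordinary kriging at a single location, one of the simpler and most widely used estimation methods. The spherical variogram model and all its parameters (nugget, sill, range) are hypothetical placeholders; in practice they would be fitted to the experimental variogram, and a real estimate would also account for anisotropy, search neighbourhoods and block discretization.

```python
import numpy as np

def spherical(h, nugget, sill, rng_a):
    """Isotropic spherical variogram model (hypothetical parameters)."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3)
    return np.where(h >= rng_a, sill, np.where(h == 0, 0.0, g))

def ordinary_kriging(coords, values, target, vario):
    """Solve the ordinary kriging system for one target location."""
    n = len(values)
    # Left-hand side: variogram between all sample pairs, plus the
    # unbiasedness (sum of weights = 1) row and column
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = vario(d)
    A[n, n] = 0.0
    # Right-hand side: variogram between each sample and the target
    b = np.ones(n + 1)
    b[:n] = vario(np.linalg.norm(coords - target, axis=-1))
    w = np.linalg.solve(A, b)       # weights plus Lagrange multiplier
    estimate = w[:n] @ values
    variance = w @ b                # ordinary kriging variance
    return estimate, variance

# Hypothetical example with five samples (e.g. g/t grades)
coords = np.array([[0., 0.], [30., 0.], [0., 30.], [30., 30.], [15., 10.]])
values = np.array([1.2, 0.8, 1.5, 1.0, 1.1])
vario = lambda h: spherical(h, nugget=0.1, sill=1.0, rng_a=50.0)
est, var = ordinary_kriging(coords, values, np.array([15., 15.]), vario)
print(f"OK estimate = {est:.3f}, kriging variance = {var:.3f}")
```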
A software package can very successfully perform many mathematical computations in a short amount of time, but it cannot replace the experience and understanding that the practitioner has of the natural phenomenon. The practitioner should therefore take the time to fully understand the characteristics of the dataset and the methods that will be used, and to validate the estimation results at the end of the study. Validation of estimated block models should be done not only against the available dataset, but also against the known and expected geology or the structures and constraints of the natural phenomenon. Doing so will result in models that are both sensible and reliable; a simple example of such a check is sketched below.
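As one illustration of such validation, the sketch below compares sample and block-model mean grades within swaths along a single axis, a simplified, one-dimensional version of a swath plot. All data and names here are synthetic and hypothetical; a real validation would also compare global declustered statistics, examine all three axes and check visual sections against the geology.

```python
import numpy as np

def swath_check(sample_x, sample_z, block_x, block_z, n_swaths=10):
    """Compare sample vs. block-model mean grades in swaths along one axis
    (a simple one-dimensional version of a swath-plot validation)."""
    edges = np.linspace(min(sample_x.min(), block_x.min()),
                        max(sample_x.max(), block_x.max()), n_swaths + 1)
    print(f"{'swath':>12} {'sample mean':>12} {'block mean':>11}")
    for lo, hi in zip(edges[:-1], edges[1:]):
        s = sample_z[(sample_x >= lo) & (sample_x < hi)]
        b = block_z[(block_x >= lo) & (block_x < hi)]
        if len(s) and len(b):
            print(f"{lo:5.0f}-{hi:4.0f} m {s.mean():12.3f} {b.mean():11.3f}")

# Hypothetical data: sample grades and kriged block grades along an easting axis
rng = np.random.default_rng(7)
sample_x = rng.uniform(0, 100, 300)
sample_z = 1.0 + 0.01 * sample_x + rng.normal(0, 0.3, 300)
block_x = np.repeat(np.arange(5, 100, 10), 20)                      # block centroids
block_z = 1.0 + 0.01 * block_x + rng.normal(0, 0.1, len(block_x))   # smoother, as expected

swath_check(sample_x, sample_z, block_x, block_z)
```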
Naturally, the inputs into the geostatistical estimate, i.e. the dataset itself, should be trustworthy. The common saying of “garbage in, garbage out” holds very true in the geostatistical domain, and no geostatistical method, no matter how complex, will be able to account for and correct poor-quality data. A comprehensive sample collection protocol, together with a respectable QA/QC procedure and measurement system, will ensure a high level of quality in the database. It is equally important that the database be stored in a secure environment where its integrity can be ensured. A simple example of a QA/QC check is sketched below.
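By way of example, one routine QA/QC check is to compare original samples against their field duplicates and flag pairs that disagree beyond a tolerance. The sketch below does this with a hypothetical 10% relative-difference tolerance; the threshold, data and function names are illustrative only, and real QA/QC programmes also monitor certified standards, blanks and laboratory repeats.

```python
import numpy as np

def flag_duplicate_pairs(original, duplicate, tolerance=0.10):
    """Flag field-duplicate pairs whose absolute relative difference,
    |a - b| / mean(a, b), exceeds a (hypothetical) 10% tolerance."""
    original = np.asarray(original, dtype=float)
    duplicate = np.asarray(duplicate, dtype=float)
    rel_diff = np.abs(original - duplicate) / ((original + duplicate) / 2.0)
    return rel_diff > tolerance, rel_diff

# Hypothetical assay pairs (g/t): original sample vs. field duplicate
orig = np.array([1.20, 0.85, 2.40, 0.05, 3.10])
dup  = np.array([1.25, 0.88, 1.60, 0.06, 3.05])
flags, rel = flag_duplicate_pairs(orig, dup)
for o, d, r, f in zip(orig, dup, rel, flags):
    status = "FAIL" if f else "ok"
    print(f"{o:.2f} vs {d:.2f}: rel diff {100 * r:5.1f}% -> {status}")
```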
If done diligently, a knowledgeable practitioner can reap the benefits of good sample capture and storage protocols, together with geostatistical expertise and software know-how, to produce robust and reliable geostatistical models. Such models will be constructive, establishing confidence in the spatial characteristics of the natural phenomenon rather than introducing further uncertainty.
Geostatistics is therefore more than just a black-box or button-pushing exercise. It is, in reality, a process involving a suitable collection of data and geological information, a detailed analysis of this data, an understanding of suitable estimation methods and their inherent assumptions, and a systematic and comprehensive validation of the output model. Tools such as software packages are there to make this process easier and more time-efficient. They are not magic wands that mysteriously produce models that fit the characteristics of the natural phenomenon. A sensible geostatistical estimate takes time, understanding, experience and skill to produce. With enough forethought, geostatistics is undeniably a powerful tool in the calculation of spatial estimates.
Comments
Excellent. Thank you for the article. It puts the importance and value of geostatistics into perspective.
Great article. Many inexperienced Resource Geologists just accept the geostatistical software defaults without fully understanding the implications. Often the defaults have been put in place by programmers, and they most probably do not suit the geostatistical assumptions or the mineral deposit type at hand.
A very true and honest description of the work that we carry out on a daily basis, my friends. In my gold mining industry, “Geology” is the core step and the key to all the work carried out for my geostats models. Thank u MICON.
Hi Craig. Good article.
As you intimated in your article, too little time is taken in gaining an understanding of what one is working on, and with what. This invariably leads to simplistic interpretations of what the basic parameters of the model are. In our 115-step evaluation protocol, only 5 steps are assigned to modelling of the semi-variogram and the final kriging of estimates. All the rest occur before these, and are there to ensure not only that we have a decent understanding of what we are working with, but also that we gain the knowledge of what to do with what we have, and utilise the correct constraining parameters in our estimation. So, in other words, if you are starting a project from scratch, a good rule of thumb is that 96% of your time should be spent examining and analysing your data (110 of the 115 steps) and the remaining 4% spent on the actual running of your estimates.
Excellent article. As said by Jon, understanding the geology of the deposit is the very first step and the most important aspect. Variography must reflect the anisotropy of the mineralisation within the different domains identified, avoiding pitfalls due to the information effect and other details. I agree with Leon regarding the amount of time for the estimation itself. Paraphrasing Mr. Harry Parker: “Do not assume anything. Check everything.” Many thanks Craig.
A good article and discussions.
I forget who once said that “Geostatistics without geology is just statistics” and that, at the end of the day, the mineral resource and ore reserve figures are still “estimates.”