Quality Control and Quality Assurance Programmes: A Priority, Not an Afterthought

May 14, 2016

The integrity of a resource database is fundamental to a company’s success in securing debt or equity finance for a new mining project. The validity and quality of data can only be guaranteed when appropriate sampling and assaying protocols have been implemented. No amount of mathematical sophistry can replace them.

Since the late 1990s, as a result of well-documented financial scandals within the mining industry, companies have instituted Quality Assurance/Quality Control (QA/QC) programmes, increasing the requirement for clear policies applied during all phases of sampling programmes. This has been mirrored by the regulatory environment, which has evolved to set out the specific responsibilities and duties of reporting issuers and independent Qualified Persons (QPs). However, regulatory guidelines such as the JORC Code (2012) and NI 43-101 (which incorporates the CIM Definition Standards) are not prescriptive regarding sample specifications; this aspect is deliberately left to the judgement of the QP, because of the many factors that must be considered on a project-by-project basis when designing a robust sampling and QA/QC programme.

More often than not, though, Micon observes that QA/QC programmes are overlooked in a project’s design. The most common reason is the difficulty of justifying additional sampling whose cost does not directly aid in establishing the presence or continuity of mineralization. Where QA/QC programmes have been undertaken, it is often as an afterthought: either a few samples are added into the sample stream as a somewhat token gesture, or over-reliance is placed on the laboratory’s own QA/QC programme. Even where some QA/QC sampling has been undertaken, it is unfortunately very common to see that no analysis of the data has been completed and, therefore, no timely remedial action taken. This can, and often does, lead to delays and additional costs at the later stages of project development, or to an increase in the risk associated with a project’s resource database, affecting the classification of resource models due to the increased uncertainty associated with the dataset.

QA/QC programmes need to be carefully designed and implemented at all stages of exploration and development where sampling of material is undertaken. Furthermore, QA/QC programmes require review prior to, and during, each sampling programme, with modifications made where necessary based on factors such as sample type, sample size, and the proposed sample processing and treatment methods.

QA/QC is required to ensure that all chemical data generated over the course of a sampling programme (exploration or development) adhere to three fundamental aspects:

  • Accuracy – the degree to which an analysis reflects the true (expected) value;
  • Precision – the repeatability of the result; and
  • Identification of sampling failures, errors and contamination.

The key areas subject to QA/QC auditing prior to mineral resource estimation are project-specific protocols and available sample data from QA/QC sampling and analysis.

Project-specific protocols, often in the form of Standard Operating Procedures (SOPs), should be in place before any sampling commences. SOPs should outline the procedures and operating practices to be implemented during the sampling programmes, from the initial set-up of the drill rig through to the analysis of QA/QC data upon receipt of the sampling results. SOPs help to ensure that a culture of QA/QC is established throughout sampling programmes, and also aid in identifying areas of risk within a procedure where errors could occur.

Just as the project-specific protocols are individual to a sampling programme, the QA/QC monitoring practice should also be specific. However, three essential stages are applicable to all projects:

  • It must be standard practice to include the right mix of QA/QC materials in every batch of samples submitted to a laboratory.
  • The geologist/technician initiating the analysis must critically review the results of all QA/QC samples within a sample batch as soon as the results are received from the laboratory, preferably before the associated data are included within the resource database and used to update any resource models.
  • Action must be taken when QA/QC results fall outside of predetermined acceptable limits.

QA/QC materials inserted into sample streams are used to assess the three fundamental aspects (precision, accuracy, identification of errors) of QA/QC. Best practice QA/QC programmes include a combination of the following QA/QC materials:

  • Primary Standards (or Standards, Certified Reference Materials (CRMs))

Material with a known metal content and chemical characteristics similar to the mineralisation being sampled is used to evaluate bias within a sample dataset. These can be externally sourced commercial standards (CRMs) or company ‘in-house’ standards (which require their own QA/QC analytical testing programme). A suite of project-specific primary standards should be available across a range of known grades, one of which should be close to the likely mining cut-off grade. However, the number of standards should be kept relatively small, so that each yields a purposeful assay population for assessment. For example, Micon has reviewed projects where in the order of 20 different standards were used, with some standards inserted into the sample stream only two or three times, which does not provide a large enough population for meaningful assessment.
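As an illustration only, the routine check on standard results can be scripted in a few lines. This sketch assumes an acceptance criterion of the certified value plus or minus three standard deviations, one common convention; the CRM, grades and limits shown are hypothetical, and the appropriate criterion is a project-specific decision for the QP.

```python
# Minimal sketch: flag CRM (standard) assays that fall outside the
# certified value +/- 3 standard deviations. All figures below are
# hypothetical; acceptance criteria are project-specific.

def check_standards(assays, certified_value, certified_sd):
    """Return (index, assay, status) for each CRM assay in the batch."""
    lower = certified_value - 3 * certified_sd
    upper = certified_value + 3 * certified_sd
    return [
        (i, assay, "pass" if lower <= assay <= upper else "FAIL")
        for i, assay in enumerate(assays)
    ]

# Hypothetical CRM certified at 2.50 g/t Au, standard deviation 0.05 g/t
for i, assay, status in check_standards([2.48, 2.55, 2.71, 2.51], 2.50, 0.05):
    print(i, assay, status)
```

A failing standard flagged this way would then trigger the investigative steps discussed later in the article.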

  • Blank Samples

Samples containing no detectable trace of the key mineral(s) identified within the resource are inserted into sample streams to identify any contamination introduced at the laboratory or during sample preparation. Blanks are generally sourced in-house from barren drill spoil or local rock outcrops. In-house blanks require their own QA/QC testing; typically, analysis should be undertaken at a minimum of three laboratories on a quarterly basis to ensure the blank material is of suitable quality. Alternatively, material suitable for use as blanks can be purchased commercially. Blank samples allow the QP to monitor the cleanliness of the sample preparation equipment (on site or at the laboratory) and the calibration of analytical equipment.
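The corresponding check on blank results can be sketched similarly. The threshold used here, an assumed five times the analytical detection limit, is one commonly used convention rather than a universal rule, and the figures are hypothetical.

```python
# Minimal sketch: flag blank assays exceeding a contamination threshold.
# An assumed 5x the analytical detection limit is used; the multiplier
# and figures are hypothetical, project-specific choices.

def check_blanks(assays, detection_limit, multiplier=5):
    """Return (index, assay, failed) for each blank assay."""
    threshold = multiplier * detection_limit
    return [(i, a, a > threshold) for i, a in enumerate(assays)]

# Hypothetical blank results (g/t Au) against a 0.005 g/t detection limit
for i, assay, failed in check_blanks([0.005, 0.004, 0.060, 0.008], 0.005):
    print(i, assay, "FAIL" if failed else "pass")
```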

  • Duplicates

These are the most common QA/QC sample type, allowing for the determination of analytical precision, but they do not allow for the monitoring of accuracy and therefore need to be accompanied by standards and blanks. ‘Field’ duplicate samples are formed by splitting the original sample interval into equal portions and submitting both for analysis (pulp duplicates should be taken by the chosen laboratory during sample preparation). A drawback of pre-determined duplication (e.g., every 20 samples) is that it can lead to many repeats of background (waste) material; it is therefore preferable to duplicate a high proportion of anomalous (ore-grade) samples. It is good practice not to give a duplicate sample the next consecutive number to the primary sample it is paired with, but to allocate it a number later in the sequence. This allows for the assessment of drift throughout the course of the analytical job.
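Duplicate precision is often assessed with HARD (half absolute relative difference) analysis. A minimal sketch, using hypothetical assay pairs and an assumed 10% tolerance (tolerances are a project-specific choice, as the comment thread below illustrates):

```python
# Minimal sketch: assess duplicate-pair precision with HARD
# (half absolute relative difference). The tolerance and the assay
# pairs are hypothetical; acceptable limits are project-specific.

def hard(a, b):
    """HARD = |a - b| / (a + b); undefined if both assays are zero."""
    return abs(a - b) / (a + b)

def duplicate_pass_rate(pairs, tolerance=0.10):
    """Fraction of duplicate pairs whose HARD is within tolerance."""
    passes = sum(1 for a, b in pairs if hard(a, b) <= tolerance)
    return passes / len(pairs)

# Hypothetical pulp-duplicate pairs (g/t Au)
pairs = [(1.20, 1.25), (0.80, 0.82), (2.10, 3.40), (0.05, 0.06)]
print(duplicate_pass_rate(pairs))   # 0.75 (the 2.10/3.40 pair fails)
```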

The inclusion of QA/QC materials within a sample stream should be planned with regard to the scale of the sampling programme, in order to obtain suitable sample populations for statistical analysis. Whichever system is implemented (e.g., one standard inserted every 50 samples, one per batch of 100 samples, five standards inserted randomly, etc.), accurate recording within sample records should accompany it, so that QA/QC samples can be easily identified for statistical analysis. The frequency of QA/QC materials varies depending on the type and stage of the sampling programme, for example:

  • For a soil sampling programme, where the requirement from the data is precision rather than accuracy, the submission of expensive external certified standards would not be cost effective. Routine submission of ‘in-house’ standard material, along with a few duplicated sites near any known mineralisation, may well suffice. Overall, quality control samples need not exceed, say, 1 in 50 (2%), although this depends on the total sample population of the programme.
  • When drilling a prospect that has a good chance of becoming a resource, a higher proportion of QA/QC samples will be required. These might comprise ‘in-house’ standards and blanks, duplicates, and certified standards. In addition, it is prudent to conduct occasional but regular cross-checks of mineralized samples at other ‘umpire’ laboratories (with QA/QC samples included within the batches).
  • A general industry ‘rule of thumb’ for the proportion of QA/QC samples submitted for drill programmes is 5%, comprising 2% Certified Reference Material (CRM) samples, 2% duplicate samples and 1% blank samples. In practice, for example, this requires at least 1 CRM every 50 samples, 1 duplicate every 50 samples and 1 blank every 100 samples. This ‘rule of thumb’ will not be suitable for all projects and should perhaps be considered a minimum requirement. Consideration may also be given to increasing the QA/QC samples to 10% for an initial programme and then, once a satisfactory analysis has been conducted, reducing QA/QC sampling to 5% in subsequent programmes.
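The ‘rule of thumb’ proportions above translate directly into batch counts. A minimal sketch (rounding up so that small batches still receive some coverage):

```python
# Minimal sketch of the 5% 'rule of thumb': for a planned number of
# primary samples, compute how many CRMs, duplicates and blanks to
# insert (2% / 2% / 1%), rounding up for small batches.
import math

def qaqc_counts(n_primary, crm_rate=0.02, dup_rate=0.02, blank_rate=0.01):
    return {
        "crm": math.ceil(n_primary * crm_rate),
        "duplicate": math.ceil(n_primary * dup_rate),
        "blank": math.ceil(n_primary * blank_rate),
    }

print(qaqc_counts(500))   # {'crm': 10, 'duplicate': 10, 'blank': 5}
```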

The analysis of the QA/QC sample results is key to assessing the three fundamental aspects. A number of statistical analyses and plots can be applied to the sample populations to evaluate the risk associated with the resource database, such as Shewhart X (average) and R (range) charts, scatter plots, Sichel ratio, HARD (half absolute relative difference) analysis, HRD analysis, Thompson-Howarth plots and Q-Q plots, to name just a few.

If analysis of the QA/QC samples identifies a failure, or a series of failures, within a sample batch or series of batches, action should be taken to establish the source of the failure. The sample records should be investigated first, as mislabelling of samples is a common error. Poor correlation between duplicate samples may not be the fault of the laboratory’s sample preparation or analytical processes, but may instead be due to natural variability or poor sampling practice. With regard to failures of standard samples, consideration should be given to re-assaying a series of, say, 20 samples either side of the standard, along with re-assaying the standard itself. However, some judgement must be applied to the importance of those samples: for example, if the reported standard result is just outside acceptable limits but the 20 or 30 samples either side report very low or below-detection-limit grades, re-analysis may not be justified unless an ore-grade intersection was expected.
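The re-assay window described above (say, 20 samples either side of a failed standard) can be selected programmatically. A minimal sketch, using a hypothetical batch numbering scheme:

```python
# Minimal sketch: given the position of a failed standard in a batch,
# select the surrounding samples for re-assay (20 either side here,
# as discussed above), clipped to the batch boundaries.

def reassay_window(batch_ids, failed_index, half_width=20):
    start = max(0, failed_index - half_width)
    end = min(len(batch_ids), failed_index + half_width + 1)
    return batch_ids[start:end]

batch = [f"S{i:04d}" for i in range(1, 101)]   # hypothetical 100-sample batch
window = reassay_window(batch, failed_index=10)
print(len(window))   # 31 (clipped at the start of the batch)
```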

An alternative action in the event of failures of standard samples would be to have the laboratory re-assay the entire batch. In some instances, it has been known for standards to have an inherently high variance in which case a comparison of the regular sample assays and their re-assays can be conducted. Standards can also be used in the selection of a primary assay laboratory by submitting a range of standards to several laboratories. Those unable to reproduce the expected results should be regarded with caution.

QA/QC of analytical data associated with all stages of a mineral project’s development is a primary requirement and should not be considered an optional extra. The QA/QC associated with a project’s dataset is essential in determining that the data are fit for purpose, and mitigates the potential risks associated with a project.


8 Comments

  1. Philip

    Well explained and very clear

    Reply
  2. Dave

    Excellent article, thanks. Helped me explain the requirement to management.

    Reply
  3. Thomas Rogers

    What action is required on duplicate samples that fail at >20%? The samples I am looking at are close to detection. How does one decide to reject a batch based on duplicates?

    Reply
    • Craig Morgan

      Dear Thomas,

      Some sort of action is most definitely required for a duplicate QA/QC system presenting a greater than 20% failure rate for a particular grade range. It is difficult to determine what action to take without further information, but the crux of the investigation should be in identifying where and why this deviation in duplicate grades is occurring.

      One needs to statistically investigate the resultant duplicate dataset and understand the geological and chemical properties of the orebody and samples to pin down exactly where the problem may lie. Things that need to be carefully considered include:

      • How exactly is the pass/failure criterion being defined? What constitutes a pass/failure, and is this criterion perhaps too stringent or too lenient?

      • What is the grade value of the detection limit, and how accurate is the laboratory with its assaying of values close to the detection limit?

      • How are the duplicates being produced and handled, and how near or far apart are these inserted into the assaying stream?

      • Is the duplicate assaying being done at the same laboratory or at two different laboratories?

      • What method of assay is the laboratory using, and what do the laboratory’s own QA/QC results look like with regard to precision close to the detection limit?

      • How close or far is the cut-off grade from the detection limit? It is important that waste is consistently assayed as waste, and ore consistently assayed as ore. If the cut-off is greatly above the detection limit, the overall impact of repeatability close to the detection limit may, although still a concern, be of lesser alarm.

      • Is the inherent grade variability and distribution of the samples well defined and understood? A high nugget effect could lead to high variability among lower-grade samples and thus lower repeatability around the detection limit.

      From your question, it appears that indeed repeatability issues are being introduced somewhere in the sampling and assay system. Identifying that there is an issue is the first step in rectifying the problem.

      If you would like to discuss the matter further, please feel free to contact me at Micon’s Toronto offices.


      Reply
      • Thomas Rogers

        Thanks Craig. I am now completing a QA/QC report with my colleague for the same project. I will come back to the duplicates story later. I have a question on tolerance limits. I have seen 5%, 10%, 20% and 25% used on scatter plots and the like. How does one determine the tolerance limits so they are defensible?

        Reply
        • Diouf_Latgrand

          I would suggest:

          +/-30% or more when dealing with field duplicates; this takes into account the nugget effect.

          +/-20% for crushed duplicates, or 90% of pairs passing at a HARD of 20%.

          +/-10% for pulp duplicates, or 90% of pairs passing at a HARD of 10%.

          Reply
  4. Caleb Roger Owusu

    How does QA/QC affect sampling?

    Reply
    • Asemahle

      How do I obtain mineral quality control?

      Reply
