The SAG grindability index test

In this paper, the authors undertake a critical review of the Starkey test and the publicly available information related to the test equipment, procedures, and scale-up methodology. The following recommendations are proposed to improve the test method:

1. The test should be conducted for a fixed grinding time of 120 min, regardless of the time required to reach 80% passing Tyler #10 mesh.

2. The test should be conducted with constant time intervals of 15, 30, 60, and 120 min (cumulative) in order to facilitate the application of geostatistics to the resulting index values. This would also allow for multiple tests to be conducted in parallel (through the use of multiple mill rollers).

3. The feed should be prepared using a more rigorous procedure to ensure a constant mass in each of the coarse screen fractions.

4. The curve of finished product versus time should be modeled and the resulting index calculated from the model for a standard feed size distribution, so that errors attributable to the sample preparation step are minimized (a curve-fitting sketch follows this list).
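
As one way to read recommendation 4, a minimal curve-fitting sketch is given below. It assumes hypothetical percent-passing measurements taken at the cumulative intervals of recommendation 2, uses a first-order approach to 100% passing as the model form (an assumption made here for illustration; the paper's own model is not reproduced), and reports the modeled time to reach 80% passing Tyler #10 mesh as the index.

    import numpy as np
    from scipy.optimize import curve_fit

    # Cumulative grinding times (min) from recommendation 2 and hypothetical
    # measured percent passing Tyler #10 mesh after each interval.
    times = np.array([15.0, 30.0, 60.0, 120.0])
    pct_passing = np.array([34.0, 52.0, 71.0, 88.0])  # illustrative data only

    def passing_model(t, k):
        # Assumed first-order form: percent passing approaches 100% with time.
        return 100.0 * (1.0 - np.exp(-k * t))

    (k_fit,), _ = curve_fit(passing_model, times, pct_passing, p0=[0.02])

    # Take the index as the modeled time to reach 80% passing #10 mesh,
    # i.e. solve 80 = 100 * (1 - exp(-k * t)) for t.
    index_minutes = -np.log(1.0 - 0.80) / k_fit
    print(f"fitted k = {k_fit:.4f} 1/min; modeled time to 80% passing = {index_minutes:.1f} min")

Fitting the full curve, rather than stopping the test when 80% passing is reached, is what would allow a single fixed 120 min run (recommendation 1) to still yield a time-based index.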

The improved feed preparation steps and the use of constant grinding intervals enable the development of a faster alternative to the standard test that is more cost-effective for high-volume geometallurgical programs.

In addition to the updated procedures, a new calibration equation is proposed, with calibration factors for pebble crushing, fine feed, and autogenous grinding, based on information in the public literature. Detailed descriptions of the test equipment, procedures, and calibration are provided, and it is proposed that this become an open standard procedure for SAG mill hardness testing, particularly for soft to medium-hard ores, the range over which the test is most effective.
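
The form of the proposed calibration equation is not stated in this summary. Purely as an illustrative sketch, a multiplicative correction structure of the kind described, with all symbols assumed here (E_SAG for SAG specific energy, SGI for the grindability index, T80 for the transfer size, K and n for fitted constants, and f_pc, f_ff, f_ag for the pebble crushing, fine feed, and autogenous grinding factors), could be written as:

    % Illustrative sketch only; not the calibration equation from the paper.
    % E_SAG : SAG specific energy (kWh/t)    K, n : fitted calibration constants
    % f_pc, f_ff, f_ag : assumed factors for pebble crushing, fine feed,
    %                    and autogenous grinding
    E_{\mathrm{SAG}} = K \, f_{\mathrm{pc}} \, f_{\mathrm{ff}} \, f_{\mathrm{ag}}
        \left( \frac{\mathrm{SGI}}{\sqrt{T_{80}}} \right)^{n}

In such a structure each factor would simply default to 1 when the corresponding circuit feature is absent; this convention is an assumption of the sketch, not a statement of the paper's method.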
