Why don't I get the desired I/sigI value in the high resolution shell?

This is most likely a combination of statistical jitter and ice-rings. The problem is the following:

  • when computing statistics (I/sigI, Rmerge, completeness etc), resolution shells that overlap typical ice-ring resolutions can be very noisy
  • this can be caused by poorly integrated reflections (e.g. reflections overlapped by ice-ring reflections, or with a badly estimated background within more diffuse ice-rings), or because the ice-ring range was actually excluded during integration - leaving the corresponding resolution bin very sparsely populated
  • by default, autoPROC takes all those binned statistics as they are - unless the user switched on the automatic treatment of ice-rings, i.e. exclusion of detected ice-rings at the integration step. Those excluded resolution ranges are then ignored when deciding on the high-resolution limit.
  • typical ice-ring resolutions are around 3.9, 3.67, 3.44, 2.67, 2.25, 2.07, 1.95, 1.92, 1.88 and 1.72 A.
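To illustrate the point above, here is a minimal sketch (not part of autoPROC) that flags whether a given d-spacing falls within one of those typical ice-ring resolutions; the half-width of 0.02 A is an assumed value purely for illustration:

```python
# Typical hexagonal-ice ring d-spacings in Angstrom (from the list above).
ICE_RING_D = [3.90, 3.67, 3.44, 2.67, 2.25, 2.07, 1.95, 1.92, 1.88, 1.72]
ICE_RING_HALF_WIDTH = 0.02  # assumed half-width in Angstrom (illustrative)

def in_ice_ring(d_spacing, half_width=ICE_RING_HALF_WIDTH):
    """Return True if a d-spacing (A) lies within an assumed ice-ring band."""
    return any(abs(d_spacing - d) <= half_width for d in ICE_RING_D)

print(in_ice_ring(3.68))  # True: within 0.02 A of the 3.67 A ring
print(in_ice_ring(3.00))  # False: far from any typical ice-ring d-spacing
```

Statistics in resolution shells overlapping these bands are exactly the ones that end up noisy or sparsely populated.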

So we have two different cases:

1. the images have ice-rings but we didn't tell autoPROC to deal with them.

This can have the effect of a very low I/sigI value in a resolution bin affected by an ice ring - so that the high-resolution criterion (e.g. an I/sigI value of 2.0) is triggered too early. The resolution bins beyond this ice-ring affected range could well have I/sigI values above 2.0 again, but it is the first bin falling below 2.0 that determines the high-resolution cut-off.
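A small sketch (with made-up shell statistics, not autoPROC output) shows how a "first shell below the threshold" rule cuts too early when a single ice-ring-affected shell dips below the limit:

```python
# Illustrative (d_min in A, mean I/sigI) pairs; the 2.25 A shell is
# assumed to be hit by an ice ring and has an artificially low I/sigI.
shells = [
    (2.40, 4.1), (2.30, 3.2), (2.25, 1.1),
    (2.15, 2.8), (2.05, 2.3), (1.95, 1.6),
]

def high_res_cutoff(shells, threshold=2.0):
    """Return d_min of the last shell before the first one below threshold."""
    cutoff = shells[0][0]
    for d_min, i_over_sig in shells:
        if i_over_sig < threshold:
            return cutoff
        cutoff = d_min
    return cutoff

print(high_res_cutoff(shells))  # 2.3 -- cut at the ice ring, not at 2.05 A
```

Without the ice-ring-affected 2.25 A shell, the same rule would place the cut-off at 2.05 A instead.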

2. the images have ice-rings and we told autoPROC to deal with those automatically.

In that case, any resolution bin within the range of those ice-rings will be ignored during the calculation of statistics. But what if the I/sigI criterion would be met exactly within such a resolution range? Since that range is ignored, the resolution bins just before and just after the ice ring are used (through simple linear interpolation) to define the high-resolution limit. Because autoPROC never 'sees' the values within the ice-ring resolution range, this can lead to a slightly over-optimistic high-resolution cut.

It is difficult to come up with a good mechanism that handles ice rings and high-resolution cutoffs at the same time. One possibility is to switch off the automatic determination of the high-resolution limit, e.g. via the parameter settings collected in the NoHighResCut macro (used in the aP_scale example below).

It might be easiest to add such parameter settings to a file and run with the -M <file> argument. Alternatively, use those settings on the command-line of each process or aP_scale job, or place them into a so-called macro file.

Then one can determine the high-resolution limit by hand (e.g. based on the *.mrfana file with all the statistics) and run the scaling standalone with

% aP_scale \
  -M NoHighResCut \
  -mtz XDS_ASCII.mtz \
  -P Lyso A 1 -b 1-1999 \
  -R 40.0 2.2 \
  -id 01 | tee 01_aP_scale.log