SHARP 2.2.0 - release notes
- Introduction
- List of changes and notes
- Installation
- References
These are the release notes for the new 2.2.0 production version of SHARP. Between the previous version (2.0.4) and the current release (2.2.0), several changes have been made, most of them invisible to the user.
The current production version of SHARP (2.2.0) is intended as a drop-in replacement for SHARP 2.0.4 (the previous production version). It can be run in the same way as SHARP 2.0.4, using the updated interface (SUSHI 3.4.x).
This release has been tested on
- Linux
- ia32: Pentium 4, Xeon, Athlon, etc.
- x86_64: AMD64, Opteron
- ia64: Itanium2
- Mac OS X 10.4.x (aka Tiger)
- SGI/Irix
- HP Tru64 4.0F/5.X
Changes between 2.0.4 and 2.2.0
- Parallel execution using OpenMP
A parallel version of SHARP using OpenMP is included in this release. This makes it possible to use multiple CPUs installed in a machine. It is available for Linux (ia32/x86_64 and ia64/Itanium), SGI/Irix and HP/Tru64. See the installation instructions on how to enable it; the number of threads is typically controlled via the standard OMP_NUM_THREADS environment variable.
- Spherically averaged clusters
SHARP now includes a model for the scattering from clusters, using a spherically averaged description. More information can be found in the manual describing the SHARP input file syntax (Appendix 3, keyword: SPHCLUSTER).
- MTZ files
This version has been linked against the CCP4 5.0.2 set of libraries for handling MTZ files, so the MTZ files it produces will be in that format. The current release has therefore mainly been tested (and is supported) under CCP4 5.0.2; 4.2.2 might still work, but you are encouraged to update to the latest 5.0.x release of CCP4. The upcoming CCP4 6.0 release should not pose any problems, since the underlying libraries for handling MTZ files are essentially identical. If you encounter problems running this version of SHARP in a CCP4 6.x environment, please let us know.
- This is a production release
The code has been extensively tested on both simulated data and on all the real data sets at our disposal. It was in active beta testing for over 6 months. We do not expect any significant difference in behaviour for this version relative to the previous production version. However, we would be grateful for any feedback from users, especially those familiar with previous versions of the program, about potential problems as well as overall code behaviour.
Important: Please make sure to also read the release notes for the latest version of SUSHI (which also contains the autoSHARP set of programs and scripts).
Changes between 2.0.1 and 2.0.4
- MTZ files
This version was linked against the CCP4 5.0.2 set of libraries for handling MTZ files, so the MTZ files it produces are in that format. This release was therefore mainly tested (and supported) under CCP4 5.0.2; 4.2.2 might still work, but you are encouraged to update to the latest 5.0.x release of CCP4.
Important: Please make sure to also read the release notes for the latest version of SUSHI (which also contains the autoSHARP set of programs and scripts).
Changes between 2.0.0 and 2.0.1
- General
The algorithms implemented in this major revision of the code remain the same as in the previously published papers [1], although some subtle changes have been introduced at various stages of the computations. The actual implementation, however, is almost completely new. The main changes are explained in the following sections.
- Double precision arithmetic
All computations are now done in double precision. This transition was necessary to cope with the very large range of values encountered in likelihood computations, a common phenomenon in statistics. Despite this, the new code is much faster and uses less memory.
- Core libraries
All low-level functions (mainly the likelihood functions and their derivatives) have been redesigned and rewritten from the ground up. As mentioned above, they implement the same formulae as previously published; however, we anticipate changes there in the near future. A fairly sophisticated part of this section of the code is an algorithm to integrate various likelihood functions over the complex plane. Some of the speed gains achieved initially have been deliberately sacrificed in order to achieve higher numerical accuracy.
- Middle layer of the code
All computations are now orchestrated by a set of functions reflecting, to a large extent, the special nature of the optimization problem involved. This way of coding is related to techniques such as automatic differentiation [2] and the exploitation of the sparsity of some matrices emerging at intermediate steps of the computations, but does not involve any approximations. Substantial memory savings and speed gains have been achieved here, and the scalability with respect to model size has improved significantly. With the present version it is feasible to investigate structures containing hundreds of sites. The code is also well suited for further extensions.
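As a hedged sketch of the automatic-differentiation idea referenced in [2] (class and variable names here are purely illustrative; SHARP's internal machinery is far more elaborate), forward-mode differentiation can be carried out with dual numbers that propagate a value and its derivative together:

```python
# Minimal forward-mode automatic differentiation with dual numbers.
# Illustrative only; names are not SHARP identifiers.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

x = Dual(3.0, 1.0)   # seed derivative dx/dx = 1
y = x * x + 2 * x    # f(x) = x^2 + 2x
# y.val == 15.0 and y.der == 8.0, since f'(3) = 2*3 + 2
```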
- New optimizer
The old minimizer (of the line-search type) has been replaced by a state-of-the-art bound-constrained trust-region Newton method [3],[4], which uses as a preconditioner a sparse incomplete Cholesky factorization with predictable memory requirements [5]. Although SHARP often solves relatively small problems (in terms of the number of variables), this optimizer is well suited to very large ones.
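Schematically (in generic notation, not SHARP's), each iteration of such a trust-region Newton method minimizes a quadratic model of the objective within a ball of radius Delta_k, subject to the bound constraints:

```latex
\min_{p}\; m_k(p) = f(x_k) + \nabla f(x_k)^{T} p + \tfrac{1}{2}\, p^{T} B_k p
\quad \text{s.t.} \quad \|p\| \le \Delta_k, \qquad l \le x_k + p \le u
```

Here B_k is the (possibly approximate) Hessian and the radius Delta_k is enlarged or shrunk at each iteration according to how well m_k predicted the actual decrease in f.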
- Improvements in handling the heavy-atom model
Model deficiencies caused by incorrect sites can now be detected at earlier stages, making the optimization procedure more robust. A sparse approximation of the final Hessian can now be used to achieve a further speedup for high-resolution data and/or structures with many heavy-atom sites.
- This is a production release
The code has been extensively tested on both simulated data and on all the real data sets at our disposal. It was in active beta testing for over 6 months. However, testing is never exhaustive: it can only reveal the presence of bugs, not their absence. We would be grateful for any feedback from users, especially those familiar with previous versions of the program, about potential problems as well as overall code behaviour.
Installation
You will need to update your licence keys for this version; please go to the main SHARP/autoSHARP page at
http://www.globalphasing.com/sharp/
and follow the instructions for licence renewal.
See also the installation instructions and the upgrade notes at the main SHARP/autoSHARP pages.
References
[1] de La Fortelle, E. & Bricogne, G. Maximum-Likelihood Heavy-Atom Parameter Refinement for Multiple Isomorphous Replacement and Multiwavelength Anomalous Diffraction Methods. In Methods in Enzymology, Macromolecular Crystallography, vol. 276, pp. 472-494, edited by R. M. Sweet and C. W. Carter, Jr. New York: Academic Press.
[2] Griewank, A. Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation. Frontiers in Applied Mathematics 19, SIAM, Philadelphia, USA, 2000.
[3] Conn, A. R., Gould, N. & Toint, Ph. L. Trust-Region Methods. SIAM, Philadelphia, USA, 2000.
[4] Lin, C.-J. & Moré, J. J. Newton's method for large bound-constrained optimization problems. SIAM Journal on Optimization, 9(4), pp. 1100-1127, 1999.
[5] Lin, C.-J. & Moré, J. J. Incomplete Cholesky factorizations with limited memory. SIAM Journal on Scientific Computing, 21, pp. 24-45, 1999.
[6] Schiltz, M., Dumas, P., Ennifar, E., Flensburg, C., Paciorek, W., Vonrhein, C. & Bricogne, G. Phasing in the presence of severe site-specific radiation damage through dose-dependent modelling of heavy atoms. Acta Crystallogr. D Biol. Crystallogr., 60(Pt 6), pp. 1024-1031, 2004.
[7] Bricogne, G., Vonrhein, C., Flensburg, C., Schiltz, M. & Paciorek, W. Generation, representation and flow of phase information in structure determination: recent developments in and around SHARP 2.0. Acta Crystallogr. D Biol. Crystallogr., 59(Pt 11), pp. 2023-2030, 2003.
http://www.globalphasing.com,
<sharp-develop@GlobalPhasing.com>
Last modification: 17.11.05