[sharp-discuss] question on mir phasing

Clemens Vonrhein vonrhein at globalphasing.com
Thu May 27 18:22:11 CEST 2010


Hi Fengyun,

sorry for not giving a simple answer but rather posing more questions
...

On Thu, May 27, 2010 at 10:48:51AM -0500, alpharyun at rice.edu wrote:
> Hi everyone,
> 
> I met a problem when I phase with mir. Any suggestion is appreciated.
> 
> What I have now,
> 
> native  resolution to 2.2 A
> Hg-derivative resolution to 2.78 A (Rfactor=0.084)
> Pt-derivative resolution to 2.85 A (Rfactor=0.169)

I guess this is the overall R-factor between the derivative and the
native? Ideally you want low values (isomorphous) and high values
(clear difference due to HA incorporation) at the same time.

How good are the statistics that might show that some HA substitution
actually happened:

  * fluorescence scan (after backsoaking?)

  * anomalous statistics for derivatives?

  * <FPH-FP> values for isomorphous difference?
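The difference-based checks above can be illustrated numerically. A toy sketch (all numbers are invented for illustration, not SCALEIT/SHARP output) of the overall isomorphous R-factor and an RMS fractional difference between scaled native (FP) and derivative (FPH) amplitudes:

```python
# Two quick indicators of heavy-atom substitution, computed from scaled
# native (FP) and derivative (FPH) amplitudes over common reflections.
# All data below are made-up illustrative numbers.

def r_iso(fp, fph):
    """Overall isomorphous R-factor: sum|FPH - FP| / sum(FP)."""
    return sum(abs(b - a) for a, b in zip(fp, fph)) / sum(fp)

def rms_frac_diff(fp, fph):
    """RMS fractional difference: sqrt(<(FPH - FP)^2>) / <FP>."""
    n = len(fp)
    rms = (sum((b - a) ** 2 for a, b in zip(fp, fph)) / n) ** 0.5
    return rms / (sum(fp) / n)

# Hypothetical scaled amplitudes for a handful of common reflections.
FP  = [120.0, 85.0, 240.0, 60.0, 150.0]
FPH = [135.0, 80.0, 230.0, 75.0, 165.0]

print(f"R_iso = {r_iso(FP, FPH):.3f}")            # -> R_iso = 0.092
print(f"RMS frac diff = {rms_frac_diff(FP, FPH):.3f}")  # -> 0.097
```

A usable derivative typically shows a fractional difference well above the noise level of the data; in practice you would compute these per resolution shell, not just overall.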

> The space group should be H3 or H32.

In all datasets you can't distinguish between those two? So the same
section of reciprocal space is missing in all of them?

> When I use autoSharp with both two derivatives with all reflection
> data, nothing meaningful comes out.

Ok - that opens up a whole bunch of questions about what 'nothing
meaningful' means:

  * do you get any warning messages about data quality from autoSHARP
    before it even attempts HA detection with SHELXC/D? 

  * what does it have to say about likely strength of your HA signal
    (ano and isomorphous)?

  * do you get a HA substructure solution from SHELXD that looks
    sensible? Especially:

    - what data (ANO/ISO) is it using for which derivative?

    - do you have a convincing looking plot of CCall vs CCweak (some
      high value points clearly separated from the junk)?

  * does autoSHARP leave all/most of these sites in when it does the
    iterative SHARP (phasing) and LLG analysis (model
    correction/completion) steps?

  * (important!) do you get a significant difference in scores for
    deciding which hand is correct?

    ==> unless you're really sure about your HA solution and the
        phasing: if the two hands don't show a significant difference,
        in 99% of cases it is basically a waste of time looking at the
        maps ... and convincing yourself that there is some protein
        visible.
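The CC(all) vs CC(weak) check above can be sketched as a simple separation test. This is only an illustration with invented trial scores (the threshold is my assumption, not a SHELXD criterion):

```python
# SHELXD reports CC(all)/CC(weak) per trial; a convincing substructure
# shows a small cluster of high-scoring trials clearly separated from
# the bulk of junk solutions.  The numbers here are invented.

trials = [  # (CCall, CCweak), hypothetical values
    (44.8, 25.1), (45.3, 26.0), (44.1, 24.7),             # candidate solutions
    (18.2, 6.5), (17.9, 5.8), (19.0, 7.1), (16.5, 5.2),   # junk
]

trials.sort(key=lambda t: t[0], reverse=True)
top, rest = trials[0], trials[3:]
gap_all  = top[0] - max(t[0] for t in rest)
gap_weak = top[1] - max(t[1] for t in rest)

# Crude rule of thumb (an assumption): demand a clear gap in BOTH scores,
# since CC(weak) is the better discriminator against noise.
convincing = gap_all > 10 and gap_weak > 10
print(f"gap CCall={gap_all:.1f}  gap CCweak={gap_weak:.1f}  convincing={convincing}")
```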

> If I lower the high-resolution limit, there's still no meaningful
> coming out.

To what resolution limit are you going? It might be useful to cut at
the limit of your HA signal - but if that is only 5-6A then the
statistics coming out of SHELXC/D can be very noisy and inflated.

> So I tried to phase with individual derivative dataset with H3 or H32.  
> But only in H3, I could see some reasonable map.

And is it also only in H3 that you get a convincing substructure
solution plus a significant difference in score during hand
determination?

> For Hg-derivative, some reasonable density (looks like long-helix  
> density) could be seen at 15 to 5 A.

In that resolution range (5A) I find it easier to look for beta-sheets
- they are larger secondary-structure blocks than helices.

Why are you looking at a low-resolution cutoff of 15A? Nowadays, it
should be standard to collect data to say 40A (or lower). Is there
something wrong with your low-resolution data? Big beamstop and/or
overloads and/or beamstop masking problem?

> And the phasing power is 0.297/0.147 for iso/ano data.

Not very high - is that the overall value to 5A? Then it is ... well:
poor.

At what resolution does it drop below 1.0?
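A quick way to answer that question, sketched with invented per-shell values (not SHARP output):

```python
# Given phasing power per resolution shell, find where it first drops
# below 1.0 - a common choice for the effective resolution limit of the
# phase information.  All numbers below are hypothetical.

shells = [  # (d_min in Angstrom, phasing power)
    (8.0, 1.9), (6.5, 1.5), (5.5, 1.1), (4.8, 0.8), (4.2, 0.5),
]

cutoff = next((d for d, pp in shells if pp < 1.0), None)
print(f"phasing power drops below 1.0 at about {cutoff} A")  # -> 4.8 A
```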

> For Pt-derivative, some reasonable density (also looks like helix)  
> could be seen at 15 to 4 A. And the phasing power is 0.108/0.175 for
> iso/ano data.

Same question as above.

> Right now, I want to combine these two datasets. I know that these
> two datasets might be phased on different origin and hand.

If you're confident about the helices in your two maps, then the hand
should already be correct (check the score difference from
autoSHARP!). So it is 'only' an origin issue.

> Is there any smart way to combine them?

You could create a *.hatom file e.g. from the Hg sites, place it into
the sharpfiles/datafiles directory and then run autoSHARP again. But
this time you pick that *.hatom file for the Hg derivative: this will
make autoSHARP skip the HA detection stage and start with the known
sites (as given in the *.hatom file). It will use both derivatives in
SHARP and the resulting LLG (residual) maps to then find the Pt sites
- which would be on the same hand and on the same origin.
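The file-placement step can be sketched as follows. The file name and contents here are placeholders (my assumptions); only the sharpfiles/datafiles location comes from the recipe above:

```python
import shutil
import tempfile
from pathlib import Path

# Run inside a scratch directory so this demo leaves no files behind.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    # Directory named in the recipe: sharpfiles/datafiles
    datafiles = root / "sharpfiles" / "datafiles"
    datafiles.mkdir(parents=True)

    # Hypothetical heavy-atom site file for the Hg derivative.
    hatom = root / "Hg_sites.hatom"
    hatom.write_text("dummy Hg site list\n")   # placeholder content
    shutil.copy(hatom, datafiles / hatom.name)

    names = sorted(p.name for p in datafiles.iterdir())
    print(names)  # -> ['Hg_sites.hatom']
```

On the next autoSHARP run you would then select that *.hatom file for the Hg derivative, so HA detection is skipped and phasing starts from the known sites.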

Cheers

Clemens

-- 

***************************************************************
* Clemens Vonrhein, Ph.D.     vonrhein AT GlobalPhasing DOT com
*
*  Global Phasing Ltd.
*  Sheraton House, Castle Park 
*  Cambridge CB3 0AX, UK
*--------------------------------------------------------------
* BUSTER Development Group      (http://www.globalphasing.com)
***************************************************************