An Interview with Professor Jesper Olsen

Prof. Jesper Olsen is Group Leader in Mass Spectrometry for Quantitative Proteomics in the Proteomics Program at the Novo Nordisk Foundation Center for Protein Research, University of Copenhagen, Denmark.

Prof. Olsen has had a long-standing interest in applying proteomics technology to systems-wide analyses of the dynamic post-translational modifications (e.g. phosphorylation, ubiquitylation, acetylation and glycosylation) that regulate cell signal transduction pathways. His research also focuses on the continuous development of phosphoproteomics technology, with the aim of making it more robust, reproducible and rapid.

What role does sample preparation play in proteomics today?

Certainly, when it comes to the analysis of dynamic molecules, I am convinced that the key challenges today lie mainly in the sample preparation step.

When you look at the proteomics workflow as a whole, there have been major developments and refinements in mass spectrometry equipment over the last few decades. New and improved bioinformatics tools have also become available, facilitating analysis of the generated data.

What we are left with is the old problem of “garbage in, garbage out.” You have to be able to do things in a reproducible way if you want to capture these very dynamic events.

How does looking at dynamic samples affect the way you work? 

We can take the example of peptide degradation. When our group became interested in, and started looking at, endogenous peptides, the effects of degradation became very evident.

Here, you really have to control and minimize post-mortem enzyme activity if you want to generate accurate, meaningful results. Any degradation of the sample produces additional species that compete for signal with all analytes, and this immediately kills the dynamic range of your overall experiment.

How do you ensure reproducible results?

We look at both technical and biological replicates of samples where we have introduced different kinds of perturbation and where we therefore expect to see changes in the phosphoproteome or peptidome. Obviously, we are looking for the results to show that the observed technical variability is less than the biological variability. This is how we optimize and set our standards in the lab.
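
As a rough illustration of that criterion (this sketch is not from the interview; the intensity values and the helper function are hypothetical), one way to express the check is to compare the coefficient of variation (CV) of a peptide’s signal across technical replicates with its CV across biological replicates:

```python
import statistics

def cv(values):
    """Coefficient of variation: standard deviation as a fraction of the mean."""
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical intensities for one phosphopeptide (arbitrary units).
# Technical replicates: the same sample measured three times.
technical = [1.02e6, 0.98e6, 1.05e6]
# Biological replicates: three independently perturbed samples.
biological = [0.80e6, 1.10e6, 1.35e6]

print(f"technical CV: {cv(technical):.1%}, biological CV: {cv(biological):.1%}")

# The standard described in the interview: technical noise must stay
# below the biological variability you are trying to measure.
assert cv(technical) < cv(biological), "technical variability exceeds biological"
```

If the technical CV approaches the biological CV, observed differences between perturbed samples can no longer be attributed to biology, which is why this comparison is a natural way to set reproducibility standards in the lab.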

Why did you choose heat stabilization for sample preparation?

We’ve been using a Stabilizor instrument for several years. We became convinced of its merits when we compared neuropeptide extracts with and without heat stabilization. In the samples that were not heat stabilized, we just found so many truncated versions of the peptides.

We’ve compared heat stabilization with other techniques, such as microwave irradiation and snap freezing with inhibitors. What we’ve found is that heat-stabilized samples are better preserved and provide higher reproducibility.

The Stabilizor instrument is also popular in the laboratory since it is very easy to use, and researchers know that their experimental data will be reproducible. 

Does the scientific community focus enough on sample preparation? 

It’s gaining more and more attention, but it’s still not always sufficient. At least there is now a tendency that, when you want to publish proteomics data, you need to demonstrate the reproducibility of your data using at least three biological replicates. I think it’s nice that the scientific community is demanding that.