The advances in DNA analysis and the risks arising from the ability to work with minute quantities of DNA

It is not an overstatement to say that the advent of forensic DNA analysis has revolutionised the place of scientific evidence in court. The public perception, judged by a wholesale acceptance of the National DNA Database (NDNAD) as a “Good Thing”, underlines the need to educate people, and especially those involved in the investigation and prosecution of crime, about the potential dangers that accompany the undoubted benefits of this technology.

DNA analysis is one of the most scientifically robust techniques to be placed before a court. A series of legal and scientific challenges has honed the collection, processing, analysis and evaluation of DNA evidence to reduce the possibilities for erroneous results. Already some other evidence types have benefited from the advances in evidence evaluation that followed the challenges to DNA. On the other hand, some identification disciplines with a long pedigree in court, but little or none in science, have signally failed to catch the wind that is now blowing through forensic science. DNA has reminded us that no matter how small or large the numbers, all scientific evidence is probabilistic, and only an exclusion of involvement can be regarded as certain.

Overwhelming odds?

For a DNA profile, 10 areas of DNA are analysed, rather like examining only 10 shelves in a library. Maybe 10% of people have a Harry Potter book. Five per cent have Alice in Wonderland. Thirty per cent have A Brief History of Time (and 0.01% understood it!). The probability of someone having all three books is 1/10 × 1/20 × 3/10 = 3/2000, or approximately 1 in 667. You can see that no matter how many books you choose to include in the search, the probability of finding a person with all of those books will get smaller, but never reach 0.
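The library analogy can be sketched in a few lines of code. The ownership figures are the illustrative ones from the text, and the calculation assumes (as the analogy does) that owning one book is independent of owning another:

```python
# Probabilities of owning each book, taken from the library analogy above.
# Independence between books is assumed, as in the text.
probabilities = {
    "Harry Potter": 0.10,
    "Alice in Wonderland": 0.05,
    "A Brief History of Time": 0.30,
}

joint = 1.0
for p in probabilities.values():
    joint *= p  # each extra book makes the joint probability smaller, never zero

print(f"Probability of owning all three: {joint:.4f}")  # 0.0015
print(f"Roughly 1 in {round(1 / joint)}")               # roughly 1 in 667
```

Adding a fourth or fifth book to the search simply multiplies in another factor below 1, which is why the probability shrinks but never reaches zero.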

Two features of DNA evidence that can be viewed as its major benefits also produce its greatest potential dangers: specificity and sensitivity. Specificity can be considered as being similar to what used to be called discriminating power, in effect how useful it is in telling two people apart. In DNA this is usually expressed as the match probability: the probability that the DNA profile would be obtained by choosing a person at random from the population. This number is now routinely in excess of 1 in a billion (1 thousand million). Unfortunately this is usually wrongly perceived as meaning that the odds are 1 billion to 1 that the accused is the perpetrator. This even has a name: the prosecutor’s fallacy. (The defence have their very own, different, fallacy too.) Juries, now fully conversant with forensic science via CSI, Waking the Dead and Silent Witness, add these compelling figures to the compelling figures of their TV heroes to conclude that “guilty” is the only sound choice. We have now even coined the term “CSI effect” to recognise the (damaging?) effect of these programmes on juries.

Sensitivity is a measure of how little DNA we need to perform analyses and produce a profile. Until recently DNA was recovered from visible stains like blood splashes or semen stains. Then we started to collect samples, usually by swabbing, from areas where we may expect to find DNA from body fluids, e.g. cigarette ends, cutlery, spectacle frames. The introduction of Low Copy Number DNA (LCN) analysis has seen us enter an era where minute traces of DNA, less than 100 picograms (0.0000000001 g), may produce profiles.
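To see how little material that is, the threshold can be converted into cells. The figure of roughly 6.6 picograms of DNA per human cell is a commonly quoted value, used here purely as an assumption for the arithmetic:

```python
# Rough arithmetic for the sensitivity claim above, assuming ~6.6 picograms
# of DNA per human (diploid) cell - an assumed figure for illustration only.
dna_per_cell_pg = 6.6    # assumed DNA content of one cell, in picograms
lcn_threshold_pg = 100   # LCN regime: less than 100 pg of template DNA

cells = lcn_threshold_pg / dna_per_cell_pg
print(f"100 pg corresponds to roughly {cells:.0f} cells' worth of DNA")
```

On that assumption, the entire LCN working range is the DNA content of around fifteen cells, fewer than you might leave behind by touching a door handle.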

A little part of yourself

To understand why that may be a problem it is useful to consider that each of us has about 10¹⁴ cells in our body, each with a full DNA profile packed inside. We lose a number of these cells every minute of every day (and night – that’s what keeps a family of bed bugs in food). Everywhere you go you probably leave your DNA. And here’s the problem: your DNA goes places you’ve never been. This is probably one of the main differences between DNA and fingerprints. A correct fingerprint identification, on a fixed object, may establish that you were at a particular point, but DNA can be transferred from you to someone else and from that someone to somewhere else you may never have been.

When we had blood or semen stains that could be seen, we were perhaps a little more confident that this established a link between the source of the stain and the location of the donor. But if you can walk through the supermarket and one of your cells blows into a vehicle or onto a surface a distance away, then it has literally distanced itself from you.

Combine the compelling specificity with molecular level sensitivity, and a population DNA database, and you have a potential scenario where your DNA is found at a crime scene that you have never been near. Your name and address are obtained from the database and the police are informed that there is a 1 in a billion match probability. On questioning you say, truthfully, that you have never been in that location. What is the policeman thinking? “Mistake”, “Sorry to have troubled you, sir”, or “Liar”?

Inevitably, you become a suspect. Is it just possible that this apparently compelling evidence will be placed alongside some other circumstantial evidence (can YOU account for, and prove, where you are every hour of every day?) to gain a conviction?

Contaminated evidence?

As much as the DNA technologies created controversy and challenges when they were introduced, LCN DNA has produced its very own set of problems. Not least among these is the limited number of providers of this technology. In many cases they are working with old or degraded material, or sub-microscopic quantities of it.

Many, if not all, of these old, or “cold”, cases occurred before DNA forced a rethink of the possibilities for contamination of evidence. Exhibits were collected with little regard for who was handling them or the possibilities of cross contamination from suspects to items via the investigating officer. Even the laboratory environment or procedures would not be designed to protect against the transfer of such low amounts of material. This was not negligence; the procedures simply did not anticipate that such minute traces would ever become important.

In forensic science the fact to be established is that the DNA profile originated from the material recovered from a crime scene or a suspect, not the investigator, the laboratory, packaging, or analytical instruments. A “negative control” is set up by simply processing a “blank” sample that has no DNA. All being well, this control will not show any DNA. The presence of DNA in the negative control illustrates that there has been a source of contamination in the analytical method. It does not, of itself, show where that occurred, merely that it has. The tradition over many years has been, for very sound reasons, that anything found in the “negative control” invalidates the analysis.

There are now some who argue that this principle cannot be applied to LCN DNA analysis, because even in a tightly controlled analytical procedure a significant number of supposedly negative controls give a positive result, i.e. they indicate the presence of DNA.

The issue of course is that if it cannot be established that the DNA has been introduced during the analysis, how can any of the DNA found in the crime stains be shown NOT to have come from the procedure rather than the scene?

Lastly, the very small amounts of DNA and the vagaries of the method mean that replicate samples, which should produce the same results, frequently don’t. The process gets around this difficulty by simply taking a vote of three replicates. DNA types found in two of the three are regarded as real and counted in the “consensus” profile. We use the consensus result as the basis of the statistical calculation of how rare a combination is in the population at large – in effect the probative value of the DNA evidence.
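The two-out-of-three rule can be sketched as a simple majority vote. This is an illustration of the principle described above, not the actual laboratory software, and the allele names and replicate results are invented for the example:

```python
# A minimal sketch of the "two out of three" consensus rule described above.
# Not the actual laboratory procedure; the data below are hypothetical.
from collections import Counter

def consensus(replicates, threshold=2):
    """Return the alleles seen in at least `threshold` replicate runs."""
    counts = Counter(allele for run in replicates for allele in set(run))
    return {allele for allele, n in counts.items() if n >= threshold}

# Three hypothetical replicate results at one area of DNA (one locus):
runs = [
    {"14", "16"},        # replicate 1
    {"14", "16", "18"},  # replicate 2: "18" might be a contaminant ("drop-in")
    {"14"},              # replicate 3: "16" has dropped out
]

print(sorted(consensus(runs)))  # ['14', '16']: the lone '18' is excluded
```

Notice what the vote quietly does: a type that appears once is dismissed as spurious, and a type that drops out once is retained, yet either decision could be wrong, and that uncertainty does not appear in the final figure.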

Probability and uncertainty

Now imagine that we take 10 of these consensus results for different areas of DNA to calculate the match probability. This process will yield a statement of the form, “the probability of this profile coming from X rather than some unknown, unrelated person is…”, and then a number that is frequently of the order of billions, but with no statement of the confidence that we can place in that result despite the clear, and probably measurable, uncertainty that must exist.
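How the billions arise from 10 such results is itself simple arithmetic. Assuming, purely for illustration, that the consensus genotype at each area is found in about 1 person in 8, and that the areas are independent:

```python
# A sketch of how per-area rarities multiply into the headline figure.
# The 1-in-8 frequency and independence between areas are assumptions
# chosen for illustration only.
per_locus_frequency = 1 / 8
loci = 10

match_probability = per_locus_frequency ** loci
print(f"Combined match probability: about 1 in {round(1 / match_probability):,}")
# about 1 in 1,073,741,824
```

Ten modest 1-in-8 figures compound to just over 1 in a billion, which is why the final number is so compelling, and why any unquantified error in the individual consensus results is magnified along with it.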

This is not an argument for the abandonment of any or all DNA methods. It is a warning of the dangers of not understanding the potential for honest error and margins of error. These new techniques are undoubtedly of tremendous value as intelligence in criminal investigation. In cold cases the requirement for other corroborative evidence must reflect the increased uncertainty in the LCN results.

The scientific issue is the degree of confidence that can be placed in the results and the consequent opinion. The legal issue is whether the destructive techniques meet the requirements of physical evidence acceptable in court.

    Professor Allan Jamieson is Director of The Forensic Institute, Glasgow t: 0141 202 0700 e:
