You’re hauled in for questioning, and the police seem strangely confident that you’re the right suspect. Finally, one of the detectives lets it slip: There’s fingerprint evidence against you.
Television shows built around forensic evidence make it seem as though fingerprint analysis is a disciplined science, with machines that spit out incontrovertible results on which a prosecutor can hang their hat. The reality, however, is much more complicated.
Fingerprint evidence has been used in American courts since 1911. Its long history of use, however, may have contributed to its undeserved enshrinement in the minds of jurors, judges and the general public. Once fingerprints became an accepted form of evidence in court, hardly anyone questioned their validity until recently. It is only in the last few years that scientists have started to acknowledge what defense attorneys and defendants have long known: Fingerprint analysis is more art than science, and the results often can't be duplicated from one examiner to another.
In fact, a National Academy of Sciences report issued in 2009 indicates that some examiners can’t even duplicate their own results when presented with the same fingerprints years later.
It is relatively easy to identify fingerprints when you have a perfectly clean set of prints to compare against another perfectly clean set taken from a suspect, but that's not how real life works. Most of the time, fingerprints found at crime scenes are muddled with dozens of other prints, are only partials that don't show the whole print, or are otherwise difficult to read.
Until the subjective process of matching fingerprints drastically improves and evidentiary standards are raised, innocent people can still end up convicted on shaky grounds. Don't let yourself or a loved one go to prison based on subjective evidence. Talk to an experienced defense attorney about your case as soon as possible.