Cutting-edge DNA identification techniques used by the office of New York City’s chief medical examiner were less reliable than claimed, some experts say. The allegation comes in a letter sent by the Legal Aid Society and the Federal Defenders of New York, an organization of public defenders, ProPublica and The New York Times reported last week (September 4).

“I’m 100 percent convinced that there are many people who are incarcerated who were convicted with DNA evidence who are innocent,” Bicka Barlow, a lawyer with a background in genetics and molecular biology, tells the Times.

At issue in the letter are techniques called “high-sensitivity testing,” or low copy number analysis, which detects trace amounts of DNA, and the Forensic Statistical Tool (FST), a software program to calculate whether a given person’s genetic material is likely present in a sample of mixed DNA. The lab in the office of the chief medical examiner estimates it has used high-sensitivity testing on evidence for 3,450 cases and FST for 1,350 cases, the Times reports.
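FST’s source code is not public, so the sketch below is not its algorithm. It instead shows a much simpler, long-established mixture statistic that addresses the same question: the combined probability of inclusion (CPI), the chance that a random person’s alleles would all fall within those observed in a mixed sample. All allele frequencies here are invented for illustration.

```python
import math

def combined_probability_of_inclusion(loci):
    """Combined Probability of Inclusion (CPI) across independent loci.

    For each locus, the probability that a random person's two alleles
    both fall within the set observed in the mixture is the squared sum
    of the observed alleles' population frequencies; loci multiply
    because they are treated as independent.
    """
    cpi = 1.0
    for observed_freqs in loci:
        p = sum(observed_freqs)
        cpi *= p * p
    return cpi

# Hypothetical data: population frequencies of the alleles seen in the
# mixture at each of three loci (real casework uses many more loci).
loci = [
    [0.10, 0.20, 0.05],   # locus 1: three alleles observed
    [0.15, 0.08],         # locus 2: two alleles observed
    [0.12, 0.30, 0.02],   # locus 3: three alleles observed
]

cpi = combined_probability_of_inclusion(loci)
print(f"CPI: {cpi:.6f}")
print(f"about 1 in {1 / cpi:,.0f} random people would be 'included'")
```

Probabilistic genotyping tools like FST go further, weighing allele heights and drop-out rather than mere presence, which is precisely the modeling that critics say cannot be checked without the code.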

See “Forensics 2.0”

Julie Fry, a lawyer with the Legal Aid Society, tells the Associated Press that the high-sensitivity testing technique is “like making a copy of a copy of a copy. Eventually it’s going to be faded.”
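The statistical worry behind Fry’s analogy can be made concrete with a toy model (the numbers and the per-molecule capture probability are invented): when only a handful of template molecules are present, random sampling alone can drop an allele from the reaction entirely.

```python
def dropout_probability(template_copies, capture_prob):
    """Probability that NO molecule of a given allele makes it into the
    reaction, if each template molecule is captured independently with
    probability `capture_prob` (a simplified binomial-sampling model)."""
    return (1 - capture_prob) ** template_copies

# With abundant template, drop-out is negligible; with trace amounts,
# whole alleles can vanish by chance.
for n in (500, 50, 5):
    print(f"{n:4d} template copies -> drop-out probability "
          f"{dropout_probability(n, 0.3):.5f}")
```

Under these assumed numbers, an allele backed by only five template molecules is lost roughly 17 percent of the time, which is why low copy number results are harder to interpret than conventional profiles.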

“And with FST, it’s a computer program,” she says. “We don’t have access to the code—and we can’t tell if it’s accurate or not. We don’t know what’s in the black box.”

Former FBI forensic scientist Bruce Budowle has also criticized FST, testifying at a hearing a few years ago that “five-person mixtures can look like three-person,” according to the Times. “Four contributors can look like two-person mixtures. It’s almost impossible to actually be accurate.”
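Budowle’s point—that allele sharing can make a mixture’s contributor count ambiguous—can be illustrated with a deliberately constructed example. The genotypes below are invented; the lower-bound rule (half the maximum number of distinct alleles at any locus, rounded up) is a standard back-of-the-envelope estimate, not FST’s method.

```python
import math

def min_contributors(mixture):
    """Lower bound on contributors: each person carries at most two
    alleles per locus, so at least ceil(max distinct alleles / 2)
    people are needed to explain the mixture."""
    max_alleles = max(len(alleles) for alleles in mixture.values())
    return math.ceil(max_alleles / 2)

# Four hypothetical people whose genotypes overlap heavily at three loci
# (allele names are arbitrary labels).
people = [
    {"locus1": ("A", "B"), "locus2": ("X", "X"), "locus3": ("P", "Q")},
    {"locus1": ("A", "C"), "locus2": ("X", "Y"), "locus3": ("P", "P")},
    {"locus1": ("B", "C"), "locus2": ("Y", "Y"), "locus3": ("Q", "R")},
    {"locus1": ("A", "A"), "locus2": ("X", "Y"), "locus3": ("R", "P")},
]

# Pool everyone's alleles into one mixed profile, as a crime-scene
# sample would present them.
mixture = {}
for person in people:
    for locus, alleles in person.items():
        mixture.setdefault(locus, set()).update(alleles)

print("distinct alleles per locus:",
      {locus: sorted(a) for locus, a in mixture.items()})
print("true contributors:", len(people))
print("minimum consistent with the mixture:", min_contributors(mixture))
```

Here four people produce at most three distinct alleles at any locus, so the mixture is fully consistent with only two contributors—the kind of undercounting Budowle describes.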