Computer algorithms ‘could replace animal testing’
Computer algorithms could replace toxicology testing on animals, scientists say, after new research found such methods are better at predicting toxicity.
Scientists from Johns Hopkins Bloomberg School of Public Health mined a large database of known chemicals to map the relationships between chemical structures and toxic properties.
Findings published in the journal Toxicological Sciences show that the map can be used to automatically predict the toxic properties of a chemical compound more accurately than a single animal test.
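The article does not spell out the algorithm behind the map, but the general idea of predicting a chemical's toxicity from its structural similarity to already-tested chemicals, often called "read-across", can be sketched in a few lines. The feature sets, chemical names and labels below are toy data invented for illustration, not values from the study's database.

```python
# Illustrative sketch of similarity-based "read-across" prediction.
# The study's actual model is more sophisticated; the chemicals and
# structural features here are hypothetical toy data.

def jaccard(a: set, b: set) -> float:
    """Similarity between two sets of structural features (0 to 1)."""
    return len(a & b) / len(a | b)

# Known chemicals: structural-feature sets mapped to a toxicity label.
known = {
    "chem_A": ({"ring", "chlorine", "amine"}, True),
    "chem_B": ({"ring", "hydroxyl"}, False),
    "chem_C": ({"chlorine", "nitro"}, True),
}

def predict(features: set) -> bool:
    """Predict toxicity from the most structurally similar known chemical."""
    best_features, best_label = max(
        known.values(), key=lambda kv: jaccard(features, kv[0])
    )
    return best_label

# A new compound sharing a ring and a chlorine is closest to chem_A,
# so it inherits chem_A's "toxic" label.
print(predict({"ring", "chlorine"}))  # True
```

In practice such systems use far richer structural fingerprints and many neighbours rather than one, but the principle is the same: chemicals that look alike tend to act alike.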
Principal investigator Thomas Hartung said: “These results are a real eye opener - they suggest that we can replace many animal tests with computer-based prediction and get more reliable results.”
The most advanced tool they developed was, on average, 87 per cent accurate in reproducing the consensus results of animal tests, across nine tests that together represent 57 per cent of the world’s animal toxicology testing. By comparison, repeating the same animal tests in the database reproduced the consensus results only 81 per cent of the time on average.
Each year millions of animals such as mice, rabbits, guinea pigs and dogs are used for chemical toxicity tests in laboratories around the world. Whilst this is often required by law to protect consumers, the practice is unpopular with the public for moral reasons, and among manufacturers due to the high costs and uncertainties about results.
According to Hartung, a new pesticide may require 30 separate animal tests, costing the sponsoring company around $20 million. The study found that the same chemical in the database had often been tested dozens of times in the same way.
“Our automated approach clearly outperformed the animal test, in a very solid assessment using data on thousands of different chemicals and tests,” he added. “So it’s big news for toxicology.”