Meeting report – DSEC Pre-GLP Safety Strategy

I attended a virtual conference last week, hosted by the Drug Safety Executive Council (DSEC), on “Pre-GLP Safety Strategy Session: Key Areas to Address When Planning a Pre-GLP Safety Evaluation”. It was an excellent session, very well run, with good moderators and some excellent speakers.

Werner Coussement (Johnson & Johnson) kicked off with an excellent talk on the J&J strategies for safety screening. He gave an overview of the screens and assays they use, and then went into more detail on key toxicity areas (e.g. genotox, hepatotox, phospholipidosis). His conclusion was that to deal with the current challenges (many more molecules to test, information overload, and lots of new technologies), pharmaceutical companies need to use more predictive rather than observational science, embed safety earlier in discovery, and improve knowledge sharing.

Laszlo Urban (Novartis) then gave a talk on how Novartis are managing their safety strategy, particularly at lead selection and optimisation. Again, he emphasised that it is key to bring safety earlier into the drug discovery process – right from target selection through hit expansion, lead optimisation and candidate selection.

Dave Cook’s talk (AstraZeneca) whizzed through many of the issues that we are familiar with – prediction of hazard, prediction of risk, and how to make better choices earlier in drug discovery. He talked about how we need to move beyond quantitative assay data to exploiting qualitative prior knowledge – “linking the biological world to the chemical world”.

Stefan Platz (Roche) ran through how they evaluate new tools and technologies. The bottom line was that the strategy from target identification to GLP tox testing had to be iterative, with constant re-analysis of the data from both toxicity and DMPK perspectives.

The final talk, from Abigail Jacobs (CDER, FDA), discussed assessing new biomarkers for early safety testing. She gave a good overview of the considerations for establishing the relevance of non-clinical biomarkers to humans.

During the conference, there were two sessions of moderated discussions among groups of 8–10 attendees. These were also very interesting, and in both sessions the key recurring themes related to the re-use of historic data and data-sharing between companies. Several issues were raised around re-using historic data: for example, one view was that the molecular entities being designed and tested now are very different from those of ~30 years ago, so the older data may be less relevant. Another question was whether sensitivities around animal testing would cause issues with data sharing.

We also discussed whether it would be easier for companies to share their failed compounds – i.e. keep the “good” chemistry proprietary, but where there are pharmacophores with known side effects, enable those structures to be shared. This would still be a hugely valuable resource – a known set of pharmacophores to screen against – and of benefit to the wider healthcare environment, particularly the end-users of the drug development pipeline, i.e. the patients. The overall feeling was that data-sharing was a “good thing” and would increase over the next few years, driven both by “pushes” from inside Pharma and “pulls” from patients, governments, and other external organisations.

All in all, an excellent conference – thanks to DSEC for organising, and to all the speakers for their time and effort!
