Redesign of preclinical study programmes

eTRANSAFE will focus on the challenging field of translational safety evaluation and aims to provide in silico solutions for identifying when, and to what extent, preclinical toxicological observations can predict clinical adverse drug reactions. So far, chemoinformatics approaches have largely relied on the computational prediction of chemical-biological interactions and their consequences for adverse drug outcomes. However, little mechanistic toxicological information has yet been integrated into computational toxicology strategies. The lack of direct interaction with experimental toxicologists has hampered progress in verifying the predictive tools at the biological level. Conversely, experimental toxicologists have generated massive amounts of data from in vivo animal testing as well as high-throughput screening methods, without systematically integrating the advances of chemoinformatics and bioinformatics. The preceding eTOX project successfully demonstrated for in vivo toxicity that such data sharing and integration is possible and highly valuable. The ambition of the eTRANSAFE project is now to extend this integration to chemoinformatics, bioinformatics, experimental toxicology and clinical drug safety, and to make the most of the unique expertise present in Europe in these areas.

eTRANSAFE also focuses its ambition on the translational challenges of drug safety assessment. These challenges relate to both retrospective and prospective data analysis. The retrospective analysis (“how well did animal studies predict the toxicities observed in humans?”) is a recurring theme in the discussion about the usefulness of animal experimentation (Shanks 2009) and underwent its first thorough investigation with the seminal paper of Olson et al. (Olson 2000). Numerous papers followed, but the effort needed to compile the data across companies and disciplines (preclinical vs. clinical safety) remained just as high, owing to the lack of data resources that would allow such analyses to be carried out efficiently. The prospective analysis, on the other hand (“how well will the preclinical toxicity translate into human findings?”), is currently not performed systematically during early drug development.
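
To make the retrospective question concrete, the sketch below computes the kind of concordance metrics typically reported in such analyses: the sensitivity and positive predictive value of animal findings with respect to human adverse reactions. It is a minimal illustration only, assuming a toy set of hypothetical compound records; the actual eTRANSAFE data resources and analysis tools are not represented here.

```python
# Minimal sketch of a retrospective concordance analysis, in the spirit of
# Olson et al. (2000). All compound records below are hypothetical; a real
# analysis would draw on shared preclinical and clinical databases.

# Each record: (compound, preclinical finding observed?, clinical ADR observed?)
records = [
    ("cmpd_A", True,  True),
    ("cmpd_B", True,  False),
    ("cmpd_C", False, True),
    ("cmpd_D", False, False),
    ("cmpd_E", True,  True),
]

tp = sum(1 for _, pre, clin in records if pre and clin)        # animal finding, human ADR
fp = sum(1 for _, pre, clin in records if pre and not clin)    # animal finding, no human ADR
fn = sum(1 for _, pre, clin in records if not pre and clin)    # no animal finding, human ADR
tn = sum(1 for _, pre, clin in records if not pre and not clin)

sensitivity = tp / (tp + fn)                  # how many human ADRs were anticipated preclinically
positive_predictive_value = tp / (tp + fp)    # how often an animal finding translated to humans

print(f"sensitivity = {sensitivity:.2f}, PPV = {positive_predictive_value:.2f}")
```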

eTRANSAFE aims to generate major advances in the strategies for drug safety assessment and their resulting performance. Whereas the eTOX project had to limit its aspiration for data collection and sharing mainly to systemic toxicity studies, the data to be managed in eTRANSAFE go far beyond these limits, both in volume and in the range of data types covered. In addition, eTRANSAFE aims to make a qualitative leap in the development of tools for evidence-based safety assessment (by means of read-across) and in the complexity and accuracy of the predictive tools.
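
As an illustration of the read-across idea, the sketch below infers a toxicity flag for a query compound from its structurally most similar neighbours, using Tanimoto similarity on Morgan fingerprints (RDKit). The compounds, labels and similarity threshold are hypothetical, and the example is far simpler than the evidence-based assessment tools envisaged in eTRANSAFE.

```python
# Illustrative read-across sketch: infer a toxicity flag for a query compound
# from its structurally most similar neighbours (Tanimoto similarity on
# Morgan fingerprints). Compounds and toxicity labels are hypothetical.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Hypothetical source compounds with known in vivo outcomes (SMILES, hepatotoxic?)
source = [
    ("CCO", False),
    ("CC(=O)Nc1ccc(O)cc1", True),     # paracetamol-like structure
    ("CC(=O)Oc1ccccc1C(=O)O", False), # aspirin-like structure
]
query = "CC(=O)Nc1ccc(OC)cc1"         # hypothetical query compound

def fingerprint(smiles):
    return AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles), 2, nBits=2048)

query_fp = fingerprint(query)
scored = sorted(
    ((DataStructs.TanimotoSimilarity(query_fp, fingerprint(smi)), tox) for smi, tox in source),
    reverse=True,
)

# Read-across: adopt the label of the most similar neighbour above a (hypothetical) threshold
similarity, label = scored[0]
if similarity >= 0.3:
    print(f"nearest neighbour similarity {similarity:.2f} -> inferred hepatotoxicity: {label}")
else:
    print("no sufficiently similar neighbour; no read-across inference")
```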

From first-hand experience in other related projects, the partners recognize the complexity of the challenge ahead: only part of it is purely technical in nature (e.g. designing and building a computational platform that supports interrelated functionality). Part of the challenge is scientific (e.g. building prediction models that relate preclinical data to human data). Part of the challenge is semantic (e.g. building controlled terminologies when standards are not available across different data domains). But the social component of this project must not be underestimated. The context is complex: intellectual property rights and commercial interests interact with regulatory requirements, ethics and privacy constraints surround the clinical data, and academics wish to publish as quickly as possible. The eTRANSAFE project must therefore be seen as a socio-technical project in which the technical work needs to be embedded in a social context. A project that, for example, does not allow private and sensitive data (e.g. patient-level data or confidential company data) to be combined with public data in a secure and strictly controlled environment will fail for translational risk assessment purposes.

eTRANSAFE proposes an ambitious and flexible IT architecture capable of efficiently integrating and managing proprietary and public data and of exploiting them for the development of a platform of useful software applications, complemented by a permanent dialogue and collaboration with regulators, as well as by community building around the project.
