NONMEM Users Network Archive

Hosted by Cognigen

Validation Strategy for NONMEM

From: Vilicich, Mark <Mark.Vilicich>
Date: Fri, 17 Oct 2008 13:08:10 -0700

Dear All,

I am interested in perspectives on strategies for "validating" NONMEM,
and in experiences from or with the FDA, since the FDA is a key user, a
customer of analyses, and an auditor of NONMEM use in the industry.
Without a large NONMEM staff here, the challenge I see is scaling the
validation strategy to provide the most efficient environment for
analysis that is defensible to both internal and external audits, based
on the associated GxP risk level.

Below are the concepts I've cobbled together; rather than reinvent the
wheel, I would appreciate anything you can share. Any and all gems of
insight are welcome, whether they concern the big picture or detailed
specifics, and whether they are IT-centric or business-process related.
You may reply to the listserver or to me directly, as you feel
appropriate.

From searching the archives and picking up other bits of knowledge
about NONMEM, part of the validation strategy is to recognize that
NONMEM is not to be validated in the literal sense. NONMEM may be
considered more of a development environment, optimized for developing
specialized forms of complex analysis and modeling. On that view, an
approach could be that NONMEM itself is qualified and each specific
analysis is validated individually.

To support establishing a defensible NONMEM environment, I've also read
discussions on integrating common software-development best practices,
such as version control of the NONMEM "programming", and on NMQual and
other commercial and custom tools for capturing all the metadata
related to running a specific NONMEM job. These themes support defining
the state of the NONMEM environment and the ability to reproduce its
outcomes.
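As a toy illustration of the metadata-capture idea (not a substitute for NMQual, which does considerably more), the sketch below snapshots a run directory with the control stream, a checksum, and basic environment details. All file names here (run001.ctl, metadata.txt) are hypothetical; a dummy control stream is created purely so the example is self-contained:

```shell
#!/bin/sh
# Toy example: capture minimal run metadata for reproducibility.
# File names are illustrative; a real setup would also record the
# NONMEM version, compiler, and data-file checksums.

# Create a dummy control stream so this sketch runs standalone.
printf '$PROBLEM demo\n' > run001.ctl

RUNDIR="run001_$(date -u +%Y%m%dT%H%M%S)"
mkdir -p "$RUNDIR"
cp run001.ctl "$RUNDIR/"

{
  echo "date_utc: $(date -u)"
  echo "host:     $(uname -a)"
  echo "user:     $(id -un)"
  echo "ctl_md5:  $(md5sum run001.ctl | cut -d' ' -f1)"
} > "$RUNDIR/metadata.txt"
```

Archiving a record like this alongside each run's output is one small step toward being able to state, and later reproduce, the exact conditions under which a given analysis was produced.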

The archive discussions of differences in the numeric outcomes of
NONMEM analyses across hardware platforms, etc., are also helpful to
know up front and worth considering in the validation strategy, so that
it is not destined to fail if the target environment is multi-platform
or otherwise complex.
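One practical consequence is that cross-platform acceptance checks should compare key results within a stated numeric tolerance rather than demand bit-identical output. A minimal sketch, with entirely illustrative objective-function values and tolerance (not from any real run):

```shell
#!/bin/sh
# Toy example: compare the final objective function value (OFV) from
# the same model run on two platforms, within a tolerance, instead of
# requiring identical output. Values below are made up for illustration.
ofv_platform_a=2345.6789
ofv_platform_b=2345.6791
tol=0.001

awk -v a="$ofv_platform_a" -v b="$ofv_platform_b" -v t="$tol" 'BEGIN {
  d = a - b
  if (d < 0) d = -d
  if (d <= t) print "PASS: OFVs agree within tolerance " t
  else        print "FAIL: OFV difference " d " exceeds " t
}'
```

The acceptable tolerance, and which quantities to compare (OFV, parameter estimates, standard errors), would themselves be decisions documented in the validation plan.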

Gaps noticed/topics not discussed:
Is there an opportunity in the risk-based approach sanctioned by the
FDA a few years ago, which could make the total validation deliverable,
covering both the application and the model development process, leaner
and more targeted at the primary risks?

Does this scientific software environment lend itself to modern agile
software development methodologies that go well beyond basic iterative
approaches? These methodologies are already being used in software
development for the regulated/GxP industry.

I've seen the excellent presentation that Joga Gobburu of the FDA gave
in 2004; it seems there has been some progression of thought or action
on the proposals included there. Any references to follow-up
information on it would be helpful.

Regards,

Mark Vilicich
Early Development

Received on Fri Oct 17 2008 - 16:08:10 EDT

The NONMEM Users Network is maintained by ICON plc.