The implications of CDISC for validation
Not sure how many of my readers, if any, are aware of the initiatives being advanced by the Clinical Data Interchange Standards Consortium (CDISC), but they're setting forth standards for the consistent collection and reporting of clinical trial data. One standard in particular, the Study Data Tabulation Model (SDTM), is going to have a great impact on the world of computer systems validation.
The FDA has introduced SDTM in a press release as a standardized format in which it will accept clinical data in regulatory submissions. As a result, CDISC and SDTM are oft-heard buzzwords in the clinical trial industry. They're not heard as often yet in the computer systems validation world, but they soon will be.
SDTM provides a standardized format and organizational scheme for capturing, storing, reporting on, and retrieving clinical trial data. Currently, the data for each study within a trial can be in a different format and organizational scheme. The applications used to verify and report on that data, generally written in SAS, are only informally tested and validated, both because of their sheer number (a full suite must be written for each format and organizational scheme) and because of the aggressive schedules pharmaceutical companies pursue. SDTM is going to have an enormous impact on this. With a single standard format and organizational scheme to work against, more time can be invested in the verification and reporting programs (for the most part, edit checks and tables, listings, and figures programs, respectively) and especially in the validation of those programs. The standard will reduce the time spent programming and validating while allowing more extensive validation of each program.
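To make that concrete, here's a minimal sketch of the kind of reusable edit check that becomes practical once every study delivers data in the same SDTM shape. It assumes a pandas DataFrame holding an SDTM-style Demographics (DM) domain; USUBJID and SEX are standard DM variables, but the find_dm_issues helper and its rules are my own illustration, not a CDISC- or FDA-defined check.

import pandas as pd

# Illustrative edit check against an SDTM-style Demographics (DM) domain.
# The rules below are a hypothetical sketch for demonstration only.
def find_dm_issues(dm: pd.DataFrame) -> pd.DataFrame:
    issues = []
    # Every record needs a non-missing, unique subject identifier.
    for _, row in dm[dm["USUBJID"].isna()].iterrows():
        issues.append({"USUBJID": None, "check": "Missing USUBJID"})
    dupes = dm[dm["USUBJID"].notna() & dm["USUBJID"].duplicated(keep=False)]
    for _, row in dupes.iterrows():
        issues.append({"USUBJID": row["USUBJID"], "check": "Duplicate USUBJID"})
    # SEX should hold an expected controlled-terminology value.
    for _, row in dm[~dm["SEX"].isin(["M", "F", "U"])].iterrows():
        issues.append({"USUBJID": row["USUBJID"], "check": f"Unexpected SEX: {row['SEX']}"})
    return pd.DataFrame(issues, columns=["USUBJID", "check"])

# Because every SDTM study delivers DM in the same shape, the same check
# (and its validation) carries over unchanged from one study to the next.
dm = pd.DataFrame({
    "USUBJID": ["S1-001", "S1-002", "S1-002", None],
    "SEX": ["F", "M", "M", "X"],
})
print(find_dm_issues(dm))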
Consider that initially you would have, say, 100 programs and perform a cursory validation on each that takes a half hour. You're spending 50 hours on validation at that point. With SDTM, you will have perhaps 15 programs, and you can spend 2 hours validating each one and still come in under the initial time spent. In addition, more extensive validation up front reduces the number of fixes that need to be put in (and revalidated) once the programs are in production. It's a winning proposition for everyone.
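The break-even arithmetic, using the illustrative figures from the paragraph above:

pre_sdtm_hours = 100 * 0.5   # 100 study-specific programs, a half hour of cursory validation each
post_sdtm_hours = 15 * 2.0   # ~15 standardized programs, 2 hours of thorough validation each
print(pre_sdtm_hours, post_sdtm_hours)   # 50.0 vs. 30.0 hours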