Date: Sun, 28 Dec 2003 12:10:39 -0000
Reply-To: Roland <roland@RASHLEIGH-BERRY.FSNET.CO.UK>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: Roland <roland@RASHLEIGH-BERRY.FSNET.CO.UK>
Organization: Universe Monitors
Subject: Re: why 3 weeks from database release to final reports?
"Lou" <email@example.com> wrote in message
> "Roland" <firstname.lastname@example.org> wrote in message
> > Why do the pharmaceuticals pat themselves on the backs if it takes them 3
> > weeks from database release to final statistical reports? If these are
> > pre-defined reports from months back then why aren't they correct and ready
> > to run in an instant? Why can't the reporting programs be so thoroughly
> > validated that they don't need a further QC? If it were telephone billing,
> > then would the phone company be happy with a 3 week delay between meters
> > being read and bills being sent out? Would they have QC people poring over
> > hundreds of bills checking for errors and have meetings to discuss bill
> > detail layout? Methinks 3 weeks is 3 weeks too much.
> You've obviously never worked in the field.
I have, and for 8 years...
> We aren't dealing with nice cut
> and dried data but with human beings in human situations. In telephone
> billing, it's pretty straightforward as to whether or not a given call was
> connected and for how long, but that's not the case for trials data.
> You might have written your tables months earlier and validated them fully
> against test or partial data. Then along comes the situation where the
> supplies of medication for two patients at one site were interchanged, and
> each of the two patients was in a different study arm or at a different
> dosing level. What are you going to do about it, and who decides?
You hear about it in advance, people make decisions in advance, and you make
your programming changes in advance...
> Or some patient had an adverse event severe enough to cause him to drop out
> of the study, but on the termination form the reason for early termination
> is given as something other than the adverse event and the investigator at
> the site insists that both the adverse event form and the termination form
> are correct and refuses to change the data on either form, and all of a
> sudden you have two tables that don't agree - the adverse event table says
> you had an early termination due to an adverse event but the termination
> table says there were no terminations due to adverse events and the sponsor
> INSISTS that the bundle of tables be internally consistent?
You report the collected data. There can be no dispute. Even if the data are
inconsistent then you report them as they stand. You do not plug the data or
massage them into agreement. They can insist all they like. At the end of the
day you report on the
collected data. The data collected for a trial are the official record of
the trial, hence the regulations surrounding the collection of data and the
documentation of changes. Listings just list the data and tables just
tabulate that data.
> For reporting purposes, adverse events, prior and concomitant medications
> are run against some dictionary to standardize the terms used. The
> dictionaries map the usual variations to some standard term, but if someone
> writes on the form something like "slapped by a tree branch while
> snowmobiling at night necessitating stitches and causing hellacious
> headaches" what is the standard term? Sure this is done before the database
> is "released" but after you run the tables some sharp eye notices that the
> same incident was reported three times against skin (for the stitches),
> neurological (for the headaches), and vision (for not seeing the branch in
> time to avoid it or maybe for night blindness) and it's only noticed when
> the tables don't make sense.
Then checks should have been in place to spot it, or the Data Managers
should be more careful.
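The dictionary-coding step described above amounts to a lookup from verbatim text to a preferred term, with anything unmapped flagged for manual review so one incident cannot quietly end up coded under three body systems. A minimal sketch of the idea (Python rather than SAS, purely for brevity; the dictionary entries are invented examples, not real coding-dictionary terms):

```python
# Minimal sketch of verbatim-to-preferred-term coding.
# The dictionary entries below are invented examples for illustration.
CODING_DICTIONARY = {
    "headache": "Headache",
    "hellacious headaches": "Headache",
    "stitches": "Laceration",
}

def code_term(verbatim):
    """Return (preferred_term, coded_flag) for a verbatim AE description."""
    key = verbatim.strip().lower()
    if key in CODING_DICTIONARY:
        return CODING_DICTIONARY[key], True
    # Unmapped text goes to a manual-coding queue instead of silently
    # appearing under several body systems in the tables.
    return verbatim, False

print(code_term("Hellacious headaches"))   # ('Headache', True)
```

The point is the False branch: a term the dictionary does not recognise is surfaced for a human coder before the tables are run, not after a sharp eye spots the inconsistency.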
> Sometimes, data make sense line by line, but in the aggregate they imply
> that a patient took more doses of medication than were actually dispensed,
Put data checks in place...
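A check of that kind can run on every data transfer, not just at lock. A hedged sketch, assuming hypothetical per-patient counts of tablets dispensed and doses reported taken (Python for brevity; the field layout is invented):

```python
# Sketch: flag patients whose reported intake exceeds what was dispensed.
# The data layout here is invented for illustration.
def check_dosing(dispensed, taken):
    """dispensed/taken: dicts of patient id -> tablet counts.
    Returns ids whose reported intake exceeds their supply."""
    return sorted(pid for pid, n in taken.items()
                  if n > dispensed.get(pid, 0))

dispensed = {"001": 60, "002": 60}
taken     = {"001": 58, "002": 75}   # patient 002 is inconsistent
print(check_dosing(dispensed, taken))   # ['002']
```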
> or took medication for longer (by double or triple) than the study period.
> The first organized look you have at the data as a whole that may have been
> trickling in over the last year is when the database is "locked" and tables
> are run for the first time. These inconsistencies have to be fixed, or failing
> that they have to be explained.
Put data checks in place. Test for all these eventualities as soon as the
data comes in...
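The medication-duration case above is the same story: a routine check run as each transfer arrives catches a dosing record that is double or triple the study length long before database lock. A minimal sketch with an invented study length and sample records:

```python
from datetime import date

# Sketch: flag medication records whose duration exceeds the study period.
# STUDY_DAYS and the sample records are invented for illustration.
STUDY_DAYS = 84  # e.g. a 12-week study

def overlong_meds(records):
    """records: list of (patient_id, start_date, stop_date).
    Returns patient ids of records lasting longer than the study period."""
    return [pid for pid, start, stop in records
            if (stop - start).days + 1 > STUDY_DAYS]

records = [("001", date(2003, 1, 6), date(2003, 3, 30)),   # 84 days: OK
           ("002", date(2003, 1, 6), date(2003, 9, 1))]    # far too long
print(overlong_meds(records))   # ['002']
```

Run against each incremental transfer, a pile of checks like this turns database lock into a formality instead of the first organised look at the data.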
> I worked one place, totally different field, where the programming staff
> was overseen by an ex-Navy guy. He'd served years aboard ships as an
> officer, and he knew that, say, a radar antenna could sweep the sky a
> million times and the software that analyzed the results and displayed a
> plot on a screen didn't change. He couldn't see why the programmers who
> maintained the payroll system were constantly rewriting sections of code,
> and thought they were really doing nothing, just looking busy to collect a
> paycheck.
He should have made them walk the plank. But then HIS salary check wouldn't
have been cut.
Payroll systems evolve into a state of maximum confusion. I've worked on
one. The same should not be true for Clinical reporting.
> If you think 3 weeks is 3 weeks too much, you've never been there.
8 years in clinical reporting using SAS, after 11 years using SAS for
Capacity Planning and performance tuning (including writing a chargeback
system), after a number of years doing data conversion, and before that
working for a payroll section. I've seen a broad spectrum of computing and the
longer I stay in the Clinical reporting field the more I think it should be
sorted out. I hope to be given that opportunity in the new year.