June 4, 2020

Chloroquine, yet again

Two and a half important things today.

First, a reasonably sized, well-conducted trial of chloroquine given after exposure, to prevent disease, found no real evidence of benefit (NYTimes). The original research paper is supposed to be here, but isn’t up yet.

Second, Melissa Davey, Stephanie Kirchgaessner, and Sarah Boseley, in the Guardian, have followed up on the concerns about that large multinational observational study. If you missed it so far, the paper claimed to have data on over 90,000 hospitalised patients on six continents, and found that patients getting chloroquine had substantially higher mortality. The data were supposed to have been collected from the hospitals electronically, more or less in real time, and sent to a server in the USA, but data use agreements supposedly meant that the hospitals (or even the countries they were in) couldn’t be identified, only the continents. The Guardian journalists had the same idea as I had: that the Australian data (5 hospitals, 63 patients) were the weak point for checking. Australia is a small continent, with a small COVID outbreak and a highly concentrated population, so you’d only have to call a fairly small number of hospitals and get them to deny sending their data to Surgisphere. Unlike me, they actually did it:

The Guardian has since contacted five hospitals in Melbourne and two in Sydney, whose cooperation would have been essential for the Australian patient numbers in the database to be reached. All denied any role in such a database, and said they had never heard of Surgisphere. Desai did not respond to requests to comment on their statements.

So, the Australian data, after being obviously impossible and then being corrected, are impossible again. Australian patients make up only a tiny fraction of the claimed database, less than one in one thousand, but as Mies van der Rohe said, “God is in the details.”
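A quick back-of-the-envelope check of that fraction, using only the figures quoted above (63 Australian patients, and a claimed total of over 90,000, taken here as a lower bound):

```python
# Sanity check: what share of the claimed Surgisphere database is Australian?
# Figures are those quoted in the post: 63 Australian patients out of
# "over 90,000" hospitalised patients in total.
australian_patients = 63
total_patients = 90_000  # lower bound on the claimed total

fraction = australian_patients / total_patients
print(f"Australian share: {fraction:.5f}")  # 0.00070
assert fraction < 1 / 1000  # "less than one in one thousand"
```

Since 90,000 is a lower bound on the claimed total, the true share in the paper’s data would be even smaller.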

The additional half topic is that Surgisphere says they are having the data audited:

We believe that an independent academic audit that validates those three functions as it relates to our papers in the New England Journal of Medicine and The Lancet will bring further transparency to our work, further highlight the quality of our work, and also continue to deserve the confidence of our work by our colleagues.

This process will follow strict boundaries as it relates to our data use agreements, among other considerations. We are pursuing such an independent audit with all due haste while ensuring compliance with various legal and regulatory concerns.

That’s good, but they don’t say by whom. Also, “This process will follow strict boundaries as it relates to our data use agreements, among other considerations” is potentially a problem. Surgisphere have claimed that the data use agreements do not allow the hospitals to be identified. An audit that does not verify the participation of at least an auditor-determined sample of the hospitals is not worth much, and ideally there would be some checks of data against records in those hospitals. A proper audit would be a lot of work, but Surgisphere claim to have carried out some such checks themselves; they say in the research paper:

Collection of a 100% sample from each health-care entity is validated against financial records and external databases to minimise selection bias.

Auditing, and data access in general, are expensive and annoying if you have to do them, but so is getting the wrong answer on treatment effectiveness in COVID-19.


Thomas Lumley (@tslumley) is Professor of Biostatistics at the University of Auckland. His research interests include semiparametric models, survey sampling, statistical computing, foundations of statistics, and whatever methodological problems his medical collaborators come up with. He also blogs at Biased and Inefficient. See all posts by Thomas Lumley »

Comments

  • Steve Curtis

    As the Guardian story puts it:
    “Peter Ellis, the chief data scientist of Nous Group, an international management consultancy that does data integration projects for government departments, expressed concern that Surgisphere database was ‘almost certainly a scam’.
    ‘There’s no evidence online of [Surgisphere] having any analytical software earlier than a year ago. It takes months to get people to even look into joining these databases, it involves network review boards, security people, and management. It just doesn’t happen with a sign-up form and a conversation.’”
    I think any basic checking, which the Lancet should have done, will find the study is just an empty paper bag.
