Meta-Analysis

The relationship between study design, results, and reporting of randomized clinical trials of HIV infection

J P Ioannidis et al. Control Clin Trials. 1997 Oct;18(5):431-44. doi: 10.1016/s0197-2456(97)00097-4.

Abstract

We examined whether the study design of randomized clinical trials for medications against human immunodeficiency virus (HIV) may affect the results and whether the outcomes of these trials affect reporting and publication. We used a database of 71 published randomized HIV-related drug efficacy trials and considered the following study design factors: endpoint definition and method of analysis, masked design, sample size, and duration of follow-up. Large variation was noted in the methods of analysis for surrogate endpoints. Often statistical significance for a surrogate endpoint was not associated with statistical significance for the clinical endpoint or for survival in the same trial, although disagreements in the direction of the treatment effect for surrogate endpoints and survival within individual trials were uncommon. Open-label design seemed to affect the magnitude of the treatment effect for two treatments. The magnitude of the treatment effect in trials of zidovudine monotherapy was inversely related to their sample size, but this probably reflected the confounding effect of longer duration of follow-up in large trials (with a resulting loss of efficacy) rather than publication bias. There was, however, evidence for potential bias in reporting and publication of HIV-related trials. Meta-analyses of published trials for specific treatments demonstrated a sizable treatment benefit for all the examined medications regardless of whether these medications were officially approved, controversial, or abandoned, raising concerns about either publication bias or unjustifiable rejection of potentially useful medications. Compared with trials published in specialized journals, trials published in journals of wide readership were larger (p = 0.001) and 4.4 times more likely to report "positive" results (p = 0.01). We identified several examples of trials with "negative" results that have remained unpublished for a long time. In conclusion, study design factors may have an impact on the magnitude and significance of the treatment effect in HIV-related trials. Bias in reporting can further affect the information that these studies provide.
