What Is Going Wrong With Research? Finding the Right Answer
In a recent issue of Practical Pain Management, Mark Cooper and I summarized a workshop on the role of activated glia in the onset and course of maldynia.1 [Editor's Note: You can read Part 1 of this article: Giving Severe and Chronic Pain a Name: Maldynia.] Expanding science suggests that neuroinflammation causes common features of a variety of diseases and conditions that are characterized by maldynia. How will practitioners from disparate specialties cooperate to identify these common features and their underlying pathophysiology, biochemistry, and biophysics? How can we identify therapeutic interventions that target these common features, crossing, as they must, disease differences and diverse specialty interests? This is the role of clinical research.
In this editorial, I propose that there is an opportunity, if not a responsibility, for all practitioners to search for new knowledge in pain management. It’s not just a job for a select coterie of academic clinicians and scientists. The work begins by making the tools of clinical research accessible to everyone. It’s a worthy effort.
Enhancing the Practice of Pain Care
Optimal outcomes, practitioner accountability, and cost containment all require rational choices for the best diagnostic and therapeutic alternatives. Advancing technologies bombard us with an expanding array of choices. Our only hope of doing what is right and good for our patients is to apply the best evidence available with the virtues of respect, beneficence, and justice.2 Doing right and good according to the best evidence, naturally, depends on the quality of the evidence. In the first part of this commentary, I presented four arguments for the democratization of clinical research.
In this editorial, I relate an experience of creating and implementing a disease registry and database for a 20-year longitudinal study of the health effects of complex regional pain syndrome (CRPS). I do not intend this to be a report of the results of that study, which is far from complete. Rather, it is a narrative of the opportunities and challenges inherent in undertaking clinical research. Finally, I offer some suggestions for ways in which pain practitioners can create information systems that not only will facilitate research in clinical practice, but probably will also help practitioners organize the objective impairments and subjective experience of each patient. Thus, the effort enhances the clinical practice of pain care.
A Cautionary Tale
Toward the end of 2006, a representative of a family foundation approached the Reflex Sympathetic Dystrophy Syndrome Association (RSDSA)3 with an offer to fund research on the long-term health effects of CRPS, previously known as reflex sympathetic dystrophy. RSDSA is a national, not-for-profit organization whose mission is to promote greater awareness and earlier recognition of CRPS, to fund innovative research, and to provide access to resources and support to people with CRPS, their friends, and families. The foundation’s offer amounted to an informal “request for proposal,” under the unusual circumstance that the meeting of the foundation’s board, which would decide on the research proposal and on the allocation of funds, was to take place 3 weeks hence.
The board acted on a “research proposal abstract.” Abstracts or summaries of original research reports and review articles are common in academic journals and books. A common example of abstract formatting is the one still used by The New England Journal of Medicine: background, methods, results, and conclusions.4 There are many styles of formatting, ranging from almost a dozen headings to none at all.5 I mention this, along with the present example, to illustrate how any basic science or clinical problem can be cast into a familiar, recognizable format. One can write a research proposal abstract in 30 minutes. Refining a proposal abstract to be worthy of pursuit takes longer, of course.
Thus it was that RSDSA was able to meet the deadline of the foundation’s board and obtain funding as the sponsor of a 20-year study of the health effects of CRPS. The proposal abstract grew into a full research proposal, with protocols for each of the abstract headings. The proposal, in turn, became an application for approval by an accredited review board. Institutional review board (IRB) is a term of art for such boards because they are commonly maintained by academic institutions, medical centers, and scientific institutions. IRBs ensure the proper conduct of research and protect the rights and safety of study subjects.6 For research using animal models, the analogous agency is the Institutional Animal Care and Use Committee.7 To satisfy the needs of nonacademic or nonaffiliated investigators and sponsors, notably pharmaceutical and device manufacturers, private IRBs exist. In the present case, I avoided academic entanglements by applying to a private review board.
The proposed survey—a collection of enrollment and outcome instruments for RSDSA’s 20-year research—is very large: nine instruments, each a lengthy set of questions, some with branching and cascading responses. The size of the questionnaires, and the need to collect responses over the Internet, required expertise and skills beyond those of the average office computer user—certainly beyond those of this office computer user.
RSDSA awarded a competitive contract for database management to Emerge.MD.8 The company was known to me as the database manager of the North American Research Committee on Multiple Sclerosis,9 which conducts an ongoing survey of people with multiple sclerosis. The requirements that militated for contracting a database management company were: a) integrity of the data, b) confidentiality of the data, and c) long-term security of the data. This is a 20-year project, but there’s no reason why RSDSA couldn’t maintain this registry of people with CRPS in perpetuity. Most academic journals insist that clinical research follow outcome measures for only 2 years. I’ve always wondered whether that’s really because most conditions reach a steady state 2 years after diagnosis or treatment, or because the labor for most academic research comes from students, interns, residents, and fellows whose term of service is limited. Maintaining clinical research protocols takes an order of magnitude more work once the “indentured servant” who started the project leaves the training program. Cynicism is not my purpose here. The observation supports the arguments for the democratization of clinical research.