Measuring and Improving Performance in Multicenter Research Consortia

Abstract

Background: Some evidence suggests that the quality of the organization and management of research consortia influences productivity and staff satisfaction. Collaborators in a research consortium generally focus on developing and implementing studies and thus rarely assess the process of collaboration. We present an approach to evaluating and improving a research consortium, using the HMO Cancer Research Network (CRN) as an example.

Methods: Five domains are evaluated: extent of collaboration and quality of communication; performance of projects and infrastructure; data quality; scientific productivity; and impact on member organizations. The primary assessment tool is a survey of CRN scientists and project staff, undertaken annually.

Results: Each year, the evaluation has identified critical aspects of this collaboration that could be improved. Several tangible changes have been implemented to improve the productivity of the consortium. The most important result of the CRN Evaluation is the ability to have an open dialogue about ways to improve its overall performance.

Conclusion: Optimizing the process of collaboration will contribute to achievement of the scientific goals. The experience of the CRN provides a useful framework and process for evaluating the structure of consortium-based research.

Multicenter consortia are used increasingly as a vehicle for conducting research. Cooperative Oncology Groups are established consortia conducting cancer treatment trials. Other aspects of cancer research, such as assessment of care, studies of the etiology of rare cancers, or organization-level responses to established cancer prevention guidelines, also lend themselves to consortium-based research. Thus, many large collaborations beyond cancer treatment trials have been funded in the past decade by the National Cancer Institute (NCI). The HMO Cancer Research Network (CRN), a consortium of research centers based in health care delivery systems, is one such collaboration.

Some evidence suggests that productivity can be modified in the context of a research consortium. Kaluzny (1) found that size, structure, and complexity contribute to the ability of multiple organizations to adapt in a collaborative environment, and that organizations that could mold diverse perspectives to meet common goals were more productive in terms of patient accrual. Achieving success and ensuring scientific productivity in a collaborative research enterprise generally require much more effort than a single-site research project. A multicenter collaboration benefits from a clear organizational structure, strong leadership, shared goals, and effective communication channels. These characteristics foster cooperation and can increase the effectiveness of the collaboration. Because research collaborations generally focus on implementing multiple studies simultaneously, collaborators rarely turn their lens inward to assess the process of collaboration and its impact on research. We present one approach to evaluating and improving a research consortium, using the HMO CRN as an example.

The NCI stipulated that the CRN grant recipient periodically evaluate its performance; thus, a specific evaluation core was defined in the CRN application. This was the first time NCI included such a stipulation. Based partly on the CRN's success with this requirement, such evaluation is now considered an integral aspect of consortium-based research.
Similar evaluations are being instituted by other large collaborations (2).

METHODS

Setting

The HMO CRN consists of the research programs, enrollee populations, and databases of 11 integrated health care delivery systems across the United States. The health care delivery systems participating in the CRN are Group Health Cooperative, Harvard Pilgrim Health Care, Henry Ford Health System/Health Alliance Plan, HealthPartners Research Foundation, the Meyers Primary Care Institute of the Fallon Healthcare System/University of Massachusetts, and Kaiser Permanente in six regions (Colorado, Georgia, Hawaii, Northwest [Oregon and Washington], Northern California, and Southern California). The 11 health plans have nearly 10 million enrollees. The CRN conducts collaborative research on the effectiveness of cancer prevention and control strategies in its health care systems. The CRN portfolio contains 27 projects in various stages of completion and an infrastructure of committees and guidelines that bolsters organizational efficiency.

Conceptual Models for Evaluation and Improvement

Two important conceptual frameworks guide the CRN. Kaluzny (2) has described the "lateral alliance" as a model for collaborations: in lateral alliances, similar organizations pool resources in pursuit of mutually beneficial goals. This model helped the CRN determine which aspects of our own "alliance" should be evaluated. The Model for Improvement (3) also guided the CRN by providing a tested process for evaluating and improving our efforts; it provides a framework for making changes rapidly using Plan, Do, Study, Act (PDSA) cycles.

The overall goal of the CRN Evaluation is to annually assess and improve its research projects, infrastructure, and overall performance. Although publications are an important and relevant outcome of research projects, they come late in the research cycle, often too late to take corrective action if collaborative processes are deficient. For large research enterprises, the process-oriented evaluation described here can complement customary barometers of research output (e.g., peer-reviewed publications, participant retention, influence on practice and policy), especially early in the cycle of large initiatives. Information generated by process evaluation is also useful for both internal and external stakeholders: external evaluation informs decisions about continued funding of a project, whereas internal evaluation informs the creation of approaches that improve the overall efficiency and quality of the research process. The evaluation measures performance over time to stimulate and guide improvement activities for each of the CRN's components. Feedback includes numerical ratings plus qualitative suggestions and highlights strengths or weaknesses of each component. Each year's findings are translated into improvement plans with specific goals.

Klabunde et al. (4) identified four challenges to the integrity and success of research alliances: 1) maintaining a unified purpose despite multiple and changing environments, 2) achieving and sustaining integration across organizational boundaries, 3) meeting complex information needs created by a multiorganizational arrangement, and 4) fostering a cooperative spirit among alliance partners, who … may have been competitors.
As we developed our evaluation strategy, we were attuned to these challenges, and our overall evaluation plan includes an assessment of five broad domains: 1) extent of collaboration and quality of communication, 2) performance of projects and infrastructure, 3) data quality, 4) scientific productivity, and 5) impact on member organizations. The primary assessment tool is an annual survey of CRN scientists and project staff. Yearly progress reports from project teams, accrual of publications and presentations, and the use of infrastructure resources are other measures the CRN uses to evaluate its performance.

CRN Participant Survey

Each CRN project is rated on four characteristics that are central to a successful collaboration: leadership, communication, organization, and effectiveness of interactions. Key components of the CRN infrastructure (data resources, committees, Web site, other communication tools) are also assessed for effectiveness, relevance, consistency with CRN aims, and utility. Finally, the survey measures the impact of the CRN on the researchers and participating organizations. Sample questions from the evaluation survey are presented in Table 1, including questions about the projects, the Steering Committee, and the member institutions. Given the dynamic nature of this consortium, questions are added to assess new elements and processes, such as the CRN data warehouse, communication strategies, and execution of the recompetition for a second 4-year grant. The complete survey form is available online.

Table 1. Sample questions from the Cancer Research Network participant survey

How effective is the Steering Committee at: (1 = Very Ineffective, 2 = Somewhat Ineffective, 3 = Somewhat Effective, 4 = Very Effective, 9 = Can't Evaluate)
    a. Fostering cooperation between sites
    b. Making decisions that support the overall goals of the CRN

How well do CRN publications guidelines address: (1 = Very Poorly to 4 = Very Well, 9 = Can't Evaluate)
    a. Authorship responsibilities
    b. Collaborative manuscript development

Compared to a year ago, how well is the CRN doing in each of the following areas: (1 = Worse, 2 = The Same, 3 = Better, 9 = Can't Evaluate)
    a. Promoting communication across sites
    b. Data coordination
    c. Overall leadership
    d. Providing information concerning other projects
    e. Providing useful information on the web
    f. Dealing with publication issues
    g. Assisting with administrative issues
    h. Assisting with new proposal development

Project-specific questions: (1 = Very Ineffective, 2 = Somewhat Ineffective, 3 = Somewhat Effective, 4 = Very Effective, 9 = Can't Evaluate)
    1. How effective is the leadership of the project?
    2. How effective is the organization of the project for accomplishing the scope of work?
    3. How effective are the project meetings? (Consider both conference calls and in-person meetings.)
    4. How effective are the project communications among different sites and different working groups?

Selected items on the impact of the CRN: (1 = Disagree, 2 = Somewhat Disagree, 3 = Somewhat Agree, 4 = Agree, 9 = Can't Evaluate)
    Impact on Health Plan
        The CRN is highly visible to the oncologists in my health plan.
        As a result of the CRN, our health plan is more supportive of our research center.
    Impact on Research Organization
        The CRN will enhance our research organization's reputation.
        The CRN provides my research center with the opportunity to improve the quality of its cancer research program.
    Impact on Individual
        I view the CRN as an opportunity to enhance my research career.
        The opportunity to work with researchers outside of our health care system is important to me.

The CRN Steering Committee (including NCI) developed the evaluation instrument. Most items are quantitative ratings using Likert scales. Because rating scales cannot capture all aspects of performance, open-ended items collect qualitative input on the strengths of, and areas for improvement in, each project and committee.
CRN collaborators agreed at the outset that this evaluation must always be viewed as useful and meaningful by its stakeholders. Therefore, the survey is reviewed annually, and questions are added or deleted to best meet the consortium's needs. Questions about the impact of the CRN on individual staff and member organizations remain constant to permit year-to-year comparisons.

Respondents

As projects evolve, the pool of relevant survey respondents changes from year to year. Site and project principal investigators identify the universe of potential respondents for each year's survey. Generally, these include all individuals who were involved in at least one facet of the CRN (e.g., a project or committee). Skip patterns are included for every question, since some components of the CRN may not be relevant to every respondent. The survey is administered at the end of each grant year. Because this is regarded as a quality improvement effort, neither institutional review board approval nor written informed consent is required. In the first 4 years, this was a self-administered mailed survey; in year 5, we switched to a Web-based survey to streamline data entry. Since participants were evaluating their peers, or in some cases their supervisors, protecting the identity of respondents has always been a chief concern. In year 1, the surveys were confidential; in later years, they were anonymous. Therefore, reminders to complete the surveys can be circulated only to the entire respondent pool rather than to individuals. No incentives are provided.

Data Synthesis

Means are computed for each quantitative item, and cross-tabulations are generated that show the mean score for the items by role on project and by level of involvement (i.e., involved in single versus multiple CRN components). Qualitative feedback on a project or infrastructure component is disseminated to the relevant project or committee leader and to the CRN Steering Committee. Project teams and committees review results and formulate improvement plans during in-person meetings.
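To make the synthesis step concrete, the sketch below computes item means with the "Can't Evaluate" code excluded and produces the two cross-tabulations described above. It is a minimal illustration in Python/pandas under assumed column names (role, involvement, q_leadership, q_communication) and toy data; it is not the CRN's actual analysis code.

```python
import pandas as pd
import numpy as np

# Hypothetical survey extract: Likert items coded 1-4, with 9 = "Can't Evaluate".
# All column names and values are illustrative, not the CRN's own.
df = pd.DataFrame({
    "role": ["Investigator", "Coordinator", "Programmer", "Investigator"],
    "involvement": ["multiple", "single", "multiple", "single"],
    "q_leadership": [4, 3, 9, 2],
    "q_communication": [3, 3, 4, 9],
})

items = ["q_leadership", "q_communication"]

# Recode "Can't Evaluate" (9) as missing so it never enters a mean.
rated = df[items].replace(9, np.nan)

item_means = rated.mean()       # overall mean per item, NaN skipped
n_rated = rated.notna().sum()   # N per item, excluding "Can't Evaluate"

# Cross-tabulations: mean score by primary role and by level of involvement.
by_role = df[["role"]].join(rated).groupby("role").mean()
by_involvement = df[["involvement"]].join(rated).groupby("involvement").mean()

print(item_means, n_rated, by_role, by_involvement, sep="\n\n")
```

The essential choice here is recoding 9 to missing before any averaging, so "Can't Evaluate" responses shrink N rather than distort the mean.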
RESULTS

Table 2 shows the number of participants and response rates by year, type of respondent, and level of participation. Because participation in different components of the CRN varies by individual, the number of responses to a given question may also vary based on the number of respondents involved in that component. Given evolution in project staffing, different individuals may respond to the questionnaire or to specific sections from year to year. The pool of respondents is identified by the site and project leaders and includes CRN participants who may be involved in only a single aspect of the collaboration. However, we know anecdotally that those who are most heavily involved in the CRN participate in the survey each year.

Table 2. Cancer Research Network evaluation respondents

Survey year | No. of surveys sent | No. of surveys received | Completion rate (%)
1999 | 117 | 88 | 75.2
2000 | 164 | 118 | 72.0
2001 | 148 | 106 | 71.6
2002 | 120 | 83 | 69.2
2003 | 153 | 106 | 69.3

Primary role on the CRN, N (%)* | 2001 | 2002 | 2003
Investigator/Biostatistician | 49 (44) | 49 (46) | 40 (48)
Coordinator/Project Manager/Administrator | 35 (32) | 26 (25) | 23 (28)
Programmer/Data Analyst/Site Data Manager | 23 (21) | 21 (20) | 15 (18)
Other (Financial, Research Specialist, Project Assistant, Sr. Project Assistant, Unspecified) | 10 (9) | 10 (9) | 5 (6)
*Check all that apply; totals exceed 100%.

Duration of involvement with CRN (years), % | 2001 | 2002 | 2003*
<1 | 4 | 4 | 22
1–2 | 24 | 10 | 22
>2 | 73 | 86 | 56

Level of involvement with CRN, % | 2001 | 2002 | 2003*
Single project or committee | 41 | 38 | 54
Multiple projects and/or committees | 59 | 62 | 46
*This was the first year of a new grant cycle and new core projects.

Since the intent of the evaluation is to identify successes and deficiencies, quantitative results are not tested for statistical significance. However, year-to-year comparisons are included in the quantitative results (e.g., 1999 mean versus 2000 mean) to ascertain whether improvements have been realized.
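The year-to-year comparison is likewise purely descriptive: yearly means for an item are set side by side and differenced, with no significance test, mirroring the trend columns of Tables 3 and 4. A minimal sketch, again with invented data and column names:

```python
import pandas as pd
import numpy as np

# Stacked responses across survey years; names and values are illustrative only.
responses = pd.DataFrame({
    "year": [1999, 1999, 2000, 2000, 2000],
    "item": ["overall_leadership"] * 5,
    "score": [3, 9, 4, 2, 3],  # 9 = "Can't Evaluate"
})

# Exclude "Can't Evaluate" before averaging, as in the item-mean computation.
responses["score"] = responses["score"].replace(9, np.nan)

# Mean per item per year, laid out side by side like the published trend tables.
trend = responses.pivot_table(index="item", columns="year",
                              values="score", aggfunc="mean")

# Flag whether the latest year improved on the prior one (descriptive only).
trend["change_vs_prior"] = trend[2000] - trend[1999]
print(trend.round(2))
```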
Infrastructure

Table 3 shows an excerpt of quantitative and qualitative feedback. Each year, the evaluation has identified critical aspects of this collaboration that could be improved. The infrastructure has changed over time to meet projects' and participants' needs. These changes range from the structure of the data resources core to optimizing modes of communication to keeping CRN members informed about CRN-wide activities. Midway through our first funding cycle, feedback suggested that the principal investigator's site did not have an adequate understanding of the unique needs and capabilities of participating sites. Site visits were instituted immediately and have been an invaluable element for improving collaboration.

Table 3. Sample quantitative and qualitative feedback from CRN Evaluation surveys

How effective is the Scientific and Data Resources Core (SDRC) at: | Very Ineffective (1) n (%) | Somewhat Ineffective (2) n (%) | Somewhat Effective (3) n (%) | Very Effective (4) n (%) | Can't Evaluate n | N | 2003 Mean | 2002* Mean | 2001* Mean | 2000* Mean
Providing leadership on data issues | 1 (2) | 6 (13) | 20 (42) | 21 (44) | 53 | 48 | 3.3 | 2.8 | 3.1 | 3.1
Understanding the data-related needs of the projects | 2 (5) | 6 (14) | 18 (41) | 18 (41) | 57 | 44 | 3.2 | 2.8 | 2.8 | 3.0
Providing information and tools useful to proposals and projects | 0 (0) | 7 (16) | 19 (42) | 19 (42) | 55 | 45 | 3.3 | N/A (new question) | |

Compared to a year ago, how well do you think the CRN is doing in each of the following areas: | Worse (1) n (%) | The Same (2) n (%) | Better (3) n (%) | Can't Evaluate n | N | 2003 Mean | 2002 Mean | 2001 Mean
a. Assisting with administrative issues | 1 (2) | 24 (57) | 17 (40) | 60 | 42 | 2.4 | 2.4 | 2.4
b. Assisting with new proposal development | 1 (3) | 17 (55) | 13 (42) | 71 | 31 | 2.4 | 2.6 | 2.8
c. Data coordination | 4 (9) | 17 (39) | 23 (52) | 58 | 44 | 2.4 | 2.4 | 2.5
d. Dealing with publication issues | 1 (4) | 16 (64) | 8 (32) | 77 | 25 | 2.3 | 2.5 | 2.5
e. Overall leadership | 2 (4) | 27 (59) | 17 (37) | 56 | 46 | 2.3 | 2.4 | 2.5
f. Promoting communication across sites | 1 (2) | 29 (63) | 16 (35) | 56 | 46 | 2.3 | 2.4 | 2.6
g. Providing information concerning other projects | 1 (2) | 27 (64) | 14 (33) | 59 | 42 | 2.3 | 2.3 | 2.5
h. Providing useful information on the web | 1 (2) | 19 (42) | 25 (56) | 56 | 45 | 2.5 | 2.5 | 2.8

Excerpts from qualitative feedback (multiple different respondents):
* It would be helpful to have guidelines for site participation in CRN projects. For example, if a project only includes 2 KP sites, why is that considered appropriate as a CRN project when other interested sites were turned away.
* Provide minutes for CRN site meetings for investigators and project managers and programmers who are unavailable to attend.
* I think it would be useful to have meetings (conference calls) twice a year or so with the CRN leadership and site investigators (ALL investigators, not just steering committee) to get overall updates/discussion.
* I really am proud to be part of the CRN (warts and all!). I would be thrilled if the CRN put priority on topics that can only be addressed through a large network ~ studying rarer CA and exposures.
* Need one page policy summary (checklist?) that includes the 2 to 3 boilerplate sentences expected to be in every paper.
* Fortunately, the Pub. Comm. reduced its initial barrier approach to publications approval and handles reviews in a relatively timely way, but I am unaware of anything it has done to facilitate pubs.
Abbreviations: GHC, Group Health Cooperative; CRN, Cancer Research Network.

Projects

CRN projects are typically complex efforts comprising multiple data collection components and involving at least three sites. Early in the first funding cycle, protracted decision making emerged as a critical issue on two core projects. The project leaders took steps to enhance efficiency by establishing executive committees to accelerate deliberations about design issues. Yet transparency and truly collaborative scientific decision making are also key facets of successfully implementing these complex multisite efforts, so the leaders also implemented strategies to encourage more active participation by all collaborators. Timely and accurate dissemination of design decisions is also critical for keeping a project on track while ensuring transparency. Therefore, another tangible procedural change was adding functionality to the CRN's secure Web site so that project teams at the various research sites can remotely post project documentation. Generally, overall project functioning is rated as "somewhat" or "very effective" by respondents, indicative of the project leaders' commitment to organization, clear communication, and efficient implementation strategies. The multisite aspect of every project means that involved investigators typically represent different disciplines and bring different perspectives to the research.
Project leaders must therefore remain vigilant about ensuring that all team members contribute, and must hold them accountable for completing the scope of work.

Impact of the CRN

From its inception, the CRN sought to improve the quality of cancer research and cancer care at its participating sites, and the assessment of our progress on this front is a much-anticipated piece of each year's survey. Table 4 shows results from a subset of these questions. The CRN has had a substantial impact at the individual investigator and research center level; however, impact at the health plan level is lower. As the CRN has matured, project findings have heightened the visibility of the research centers, and incrementally, the health plans have shown more support of the centers. Individual investigators consistently report that the CRN provides them with important opportunities to collaborate with external researchers, enhance their careers, and improve the quality and availability of local data resources.

Table 4. Sample results from questions pertaining to the impact of the Cancer Research Network (CRN)

Statement | Strongly Disagree (1) n (%) | Disagree (2) n (%) | Not Sure (3) n (%) | Agree (4) n (%) | Strongly Agree (5) n (%) | N | 2002* Mean | 2000 Mean | 1999 Mean

Impact on Health Plan
The leadership of my health plan is aware of the CRN. | 1 (1) | 2 (2.5) | 32 (40.5) | 33 (42) | 11 (14) | 79 | 3.6 | 3.7 | 3.7

Impact on Research Organization
The CRN is a cornerstone of our research center. | 4 (5) | 26 (32) | 17 (21) | 27 (33) | 7 (9) | 81 | 3.1 | 2.8 | 2.6
The CRN will enhance our research organization's reputation. | 0 (0) | 1 (1) | 11 (14) | 54 (67) | 15 (18) | 81 | 4.0 | 4.0 | 3.9

Impact on Individual
I believe the work of the CRN researchers will result in significant improvements in the quality of cancer data and data availability. | 0 (0) | 3 (4) | 11 (14) | 41 (51) | 26 (32) | 81 | 4.1 | 3.9 | 3.9
My involvement on the CRN is very satisfying to me. | 0 (0) | 9 (11) | 11 (13) | 46 (56) | 16 (20) | 82 | 3.8 | 3.8 | 3.8
I view the CRN as an opportunity to enhance my research career. | 1 (1) | 9 (11) | 14 (17) | 41 (50) | 17 (21) | 82 | 3.8 | 3.8 | 3.8

*Means for 2001 are not provided for comparison because the 2001 survey used different response categories for all questions in Section 3.
Improvement Plans

Once evaluation results are disseminated, research teams develop improvement plans based on the findings. These plans are generated entirely within the group, in keeping with the heuristic that implementation of, and adherence to, an improvement plan are more likely when the stakeholders develop it themselves rather than having a prescription for improvement dictated by another person or group. The improvement plans are brief, specific, and outcome-oriented. Their major aspects are to 1) establish goals, 2) develop an approach to meeting each goal, and 3) determine how to measure progress toward each goal.

DISCUSSION

The most important result of the CRN Evaluation is enhancing the ability to openly discuss ways to improve overall performance. The entire evaluation process enables discussion of deficiencies to take place in a nonthreatening context. Weaknesses in organization, communication, or leadership can be identified constructively and confidentially; the survey also offers respondents the ability to suggest ways to strengthen or modify procedures as well as interpersonal interactions. As suggestions are adopted, trust increases and interactions continue to improve.

The primary purpose of this evaluation is to rapidly identify and correct problems in the functioning of the CRN before they interfere with the research. Though initially this seemed an unorthodox undertaking in an applied research setting, implementing a self-assessment of this collaboration has been a major contributor to its success. Other NCI-funded consortia, including the Transdisciplinary Tobacco Use Research Centers, the Cancer Care Outcomes Research and Surveillance Consortium, and the Statistical Coordinating Center of the Breast Cancer Surveillance Consortium, have all approached the CRN to learn from its experiences and undertake similar evaluations. The recently awarded Centers of Excellence in Cancer Communications Research initiative also includes an evaluation component. Surfacing issues that could interfere with collaboration and research has been cited by consortium leaders as one motive for evaluation. Institutionalizing an evaluation requirement for consortia could ultimately facilitate comparison across different research collaborations and possibly identify best practices in collaborative research.

Although our evaluation has had a substantial impact internally, limitations must be acknowledged. First, it is an ongoing challenge to persuade participants of the survey's importance: time spent on the survey, though it is a fairly brief instrument, is time taken away from the research itself. Second, approximately half of the participants are involved in only a single project and therefore may not be ideally positioned to evaluate networkwide activities. But perceptions from team members who are intimately involved, as well as those on the periphery, provide a less biased and broader assessment of our effectiveness and also indicate areas where the CRN could improve its visibility. Third, with personnel turnover, different people may participate each year, reducing our ability to compare ratings of effectiveness over time. However, the dynamic nature of the research environment, and the CRN's approach to adapting processes in response to prior evaluations, preclude contacting the same respondents every year. Finally, although most questions remain unchanged from year to year, some items have been added or deleted in each iteration.
Although this tailored approach could diminish the extent to which the CRN can gauge its overall improvement over the lifetime of the collaboration, it facilitates assessment of newer organizational changes.

There is an abundance of literature on quality improvement, including seminal publications by Deming (5) and Juran (6). Several multicenter research consortia have published descriptions of their structure and purpose (7–10). Mattessich (11) has reviewed social science and educational research on optimizing collaborations. However, we found few articles that bridge these three domains, particularly in health research. Kaluzny's assessment of the Community Clinical Oncology Program (2) served as a blueprint for the CRN's evaluation strategy. But to our knowledge, few other consortia have published a systematic evaluation of their collaborative processes (12,13). Warden (14) describes the use of quality improvement methods in a multicenter depression trial; however, that trial incorporated performance-oriented research goals (e.g., patient accrual, retention) to enhance trial performance, which is very useful but quite different from the CRN's approach. Researchers in multicenter collaborations might consider a hybrid approach, using satisfaction or effectiveness measures (e.g., how well the collaboration is functioning) in combination with other cornerstones of research quality and productivity, such as response or retention rates, data quality, adherence, and number of publications.

NCI's evaluation requirement in its Request for Applications was a unique trial balloon that has been beneficial to the CRN. From examination of successful collaborations, Mattessich (11) advocates that "members' perceptions of ownership of a collaborative group need to be monitored, and needed changes in process or structure [made] in order to ensure the feeling of ownership." Moreover, as Kaluzny (2) described, consortia attuned to the interactive processes among alliance partners are more likely to achieve a productive dynamic and meet research objectives. Various CRN member organizations had collaborated previously, though never on this scale. Hence, the ability of individuals to critique their own projects, as well as networkwide activities, created a unique environment for change. Moreover, rapid-cycle discussion and implementation of improvements strengthen participants' level of investment. Contributing feedback is empowering to CRN members, and this empowerment translates into a constructive ability to address everything from structural issues to interpersonal communication.

Our findings can be of value to other consortia on two levels. First, the survey itself creates a safe environment to identify issues that may interfere with research. Second, strategies the CRN has used to improve its functioning are broadly relevant to all types of multicenter collaborations. In our experience, key strategies that enhance collaboration include in-person contact; transparent decision-making processes; efficient use of communication modalities; and creating opportunities for all collaborators to play an active role in the consortium's research and operations. Optimizing the process of collaboration will contribute to achievement of the scientific goals. The experience of the CRN provides a useful framework and process for evaluating the structure of consortium-based research.

References
(1) Kaluzny AD, Warnecke RB. Managing a health care alliance: improving community cancer care. San Francisco (CA): Jossey-Bass; 1996.
(2) Cancer control and population sciences—management and evaluation of large initiatives. Available at: http://dccps.nci.nih.gov/bb/Management_and_Evaluation.htm.
(3) Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP. The improvement guide: a practical approach to enhancing organizational performance. San Francisco (CA): Jossey-Bass; 1996.
(4) Klabunde CN, Lacey LM, Kaluzny AD. Managing care in the community. In: Kaluzny AD, Warnecke RB. Managing a health care alliance: improving community cancer care. San Francisco (CA): Jossey-Bass; 1996. p. 83–103.
(5) Deming WE. Out of the crisis. Cambridge (MA): Massachusetts Institute of Technology; 1986.
(6) Juran JM, editor. Juran's quality control handbook. 4th ed. New York (NY): McGraw-Hill; 1988.
(7) Platt R, Davis R, Finkelstein J, Go AS, Gurwitz JH, Roblin D, et al. Multicenter epidemiologic and health services research on therapeutics in the HMO Research Network Center for Education and Research on Therapeutics. Pharmacoepidemiol Drug Saf 2001;10:373–7.
(8) Dignam JJ. The role of cancer cooperative groups within the spectrum of cancer care. Cancer Control 2004;11:55–63.
(9) Ballard-Barbash R, Taplin SH, Yankaskas BC, Ernster VL, Rosenberg RD, Carney PA, et al. Breast Cancer Surveillance Consortium: a national mammography screening and outcomes database. AJR Am J Roentgenol 1997;169:1001–8.
(10) Gohagan JK, Prorok PC, Hayes RB, Kramer BS; Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial Project Team. The Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial of the National Cancer Institute: history, organization, and status. Control Clin Trials 2000;21(6 Suppl):251S–72S.
(11) Mattessich PW, Murray-Close M, Monsey BR, Wilder Research Center. Collaboration: what makes it work. 2nd ed. St Paul (MN): Wilder Publishing Center; 2001.
(12) El Ansari W, Phillips CJ, Hammick M. Collaboration and partnerships: developing the evidence base. Health Soc Care Community 2001;9:215–27.
(13) Pirkis J, Herrman H, Schweitzer I, Yung A, Grigg M, Burgess P. Evaluating complex, collaborative programmes: the Partnership Project as a case study. Aust N Z J Psychiatry 2001;35:639–46.
(14) Warden D, Rush AJ, Trivedi M, Ritz L, Stegman D, Wisniewski SR. Quality improvement methods as applied to a multicenter effectiveness trial—STAR*D. Contemp Clin Trials 2005;26:95–112.

© The Author 2005. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oxfordjournals.org.

JNCI Monographs, Volume 2005 (35), November 1, 2005. Oxford University Press. ISSN 1052-6773; eISSN 1745-6614. DOI: 10.1093/jncimonographs/lgi034. PMID: 16287882.

Abstract

Abstract Background: Some evidence suggests that the quality of the organization and management of research consortia influences productivity and staff satisfaction. Collaborators in a research consortium generally focus on developing and implementing studies and thus rarely assess the process of collaboration. We present an approach to evaluating and improving a research consortium, using the HMO Cancer Research Network (CRN) as an example. Methods: Five domains are evaluated: extent of collaboration and quality of communication; performance of projects and infrastructure; data quality; scientific productivity; and impact on member organizations. The primary assessment tool is a survey of CRN scientists and project staff, undertaken annually. Results: Each year, the evaluation has identified critical aspects of this collaboration that could be improved. Several tangible changes have been implemented to improve productivity of the consortium. The most important result of the CRN Evaluation is the ability to have open dialogue about ways to improve its overall performance. Conclusion: Optimizing the process of collaboration will contribute to achievement of the scientific goals. The experience of the CRN provides a useful framework and process for evaluating the structure of consortium-based research. Multicenter consortia are used increasingly as a vehicle for conducting research. Cooperative Oncology Groups are established consortia conducting cancer treatment trials. Other aspects of cancer research, such as assessment of care, studies of the etiology of rare cancers, or organization-level responses to established cancer prevention guidelines, also lend themselves to consortium-based research. Thus, many large collaborations beyond cancer treatment trials have been funded in the past decade by the National Cancer Institute (NCI). The HMO Cancer Research Network (CRN), a consortium of research centers based in health care delivery systems, is one such collaboration. Some evidence suggests that productivity can be modified in the context of a research consortium. Kaluzny (1) found that size, structure, and complexity contribute to the ability of multiple organizations to adapt in a collaborative environment and that organizations who could mold diverse perspectives to meet common goals were more productive in terms of patient accrual performance. Achieving success and ensuring scientific productivity in a collaborative research enterprise generally require much more effort than a single-site research project. A multicenter collaboration benefits from a clear organizational structure, strong leadership, shared goals, and effective communication channels. These characteristics foster cooperation and can increase the effectiveness of the collaboration. Since research collaborations generally focus on implementing multiple studies simultaneously, rarely do collaborators turn their lens inward to assess the process of collaboration and its impact on research. We present one approach to evaluating and improving a research consortium, using the HMO CRN as an example. The NCI stipulated that the CRN grant recipient periodically evaluate its performance. Thus, a specific evaluation core was defined in the CRN application. This was the first time NCI included such a stipulation. However, based partly on the CRN's success with this requirement, it is now considered an integral aspect of consortium-based research. Similar evaluations are being instituted by other large collaborations (2). 
METHODS Setting The HMO CRN consists of the research programs, enrollee populations, and databases of 11 integrated health care delivery systems across the United States. The health care delivery systems participating in the CRN are: Group Health Cooperative, Harvard Pilgrim Health Care, Henry Ford Health System/Health Alliance Plan, HealthPartners Research Foundation, the Meyers Primary Care Institute of the Fallon Healthcare System/University of Massachusetts, and Kaiser Permanente in six regions (Colorado, Georgia, Hawaii, Northwest [Oregon and Washington], Northern California, and Southern California). The 11 health plans have nearly 10 million enrollees. The CRN conducts collaborative research on the effectiveness of cancer prevention and control strategies in its health care systems. The CRN portfolio contains 27 projects in various stages of completion, and an infrastructure including committees and guidelines to bolster organizational efficiency. Conceptual Models for Evaluation and Improvement Two important conceptual frameworks guide the CRN. Kaluzny (2) has described a “lateral alliance” as a model for collaborations. In lateral alliances, similar organizations pool resources in pursuit of mutually beneficial goals. This model helped the CRN determine what aspects of our own “alliance” should be evaluated. The Model for Improvement (3) also guided the CRN by providing a tested process for evaluating and improving our efforts. This model provides a framework for making changes rapidly using Plan, Do, Study, Act (PDSA) cycles. The overall goal of the CRN Evaluation is to annually assess and improve its research projects, infrastructure, and overall performance. Although publications are an important and relevant outcome of research projects, they come late in the research cycle–often too late to take corrective action if collaborative processes are deficient. For large research enterprises, the process-oriented evaluation described here can complement customary barometers of research output (e.g., peer-reviewed publications, participant retention, influence on practice and policy), especially early in the cycle of large initiatives. Also, information generated by process evaluation is useful for both internal and external stakeholders. Although external evaluation informs decisions about continued funding of a project, internal evaluation informs the creation of approaches that improve overall efficiency and quality of the research process. The evaluation measures performance over time to stimulate and guide improvement activities for each of the CRN's components. Feedback includes numerical ratings plus qualitative suggestions and highlights strengths or weaknesses of each component. Each year's findings are translated into improvement plans with specific goals. Klabunde et al. (4) identified four challenges to the integrity and success of research alliances: 1) maintaining a unified purpose despite multiple and changing environments, 2) achieving and sustaining integration across organizational boundaries, 3) meeting complex information needs created by a multiorganizational arrangement, and 4) fostering a cooperative spirit among alliance partners, who … may have been competitors. 
As we developed our evaluation strategy, we were attuned to these challenges, and our overall evaluation plan includes an assessment of five broad domains: 1) extent of collaboration and quality of communication, 2) performance of projects and infrastructure, 3) data quality, 4) scientific productivity, and 5) impact on member organizations. The primary assessment tool is an annual survey of CRN scientists and project staff. Yearly progress reports from project teams, accrual of publications and presentations, and the use of infrastructure resources are other measures the CRN uses to evaluate its performance. CRN Participant Survey Each CRN project is rated on four characteristics that are central to a successful collaboration: leadership, communication, organization, and effectiveness of interactions. Also, key components of the CRN infrastructure (data resources, committees, Web site, other communication tools) are assessed for effectiveness, relevance, consistency with CRN aims, and utility. Finally, the survey measures the impact of the CRN on the researchers and participating organizations. Sample questions from the evaluation survey are presented in Table 1, including questions about the projects, steering committee, and member institutions. Given the dynamic nature of this consortium, questions are added to assess new elements and processes, such as the CRN data warehouse, communication strategies, and execution of the recompetition for a second 4-year grant. The complete survey form is available here. Table 1.  Sample questions from the Cancer Research Network participant survey How effective is the Steering Committee at:  Very Ineffective  Somewhat Ineffective  Somewhat Effective  Very Effective  Can't Evaluate      a. Fostering cooperation between sites  1  2  3  4  9      b. Making decisions that support the overall goals of the CRN  1  2  3  4  9  How well do CRN publications guidelines address:  Very Poorly      Very Well  Can't Evaluate      a. Authorship responsibilities  1  2  3  4  9      b. Collaborative manuscript development  1  2  3  4  9  Compared to a year ago, how well is the CRN is doing in each of the following areas:              Worse  The Same  Better  Can't Evaluate        a. Promoting communication across sites  1  2  3  9        b. Data coordination  1  2  3  9        c. Overall leadership  1  2  3  9        d. Providing information concerning other projects  1  2  3  9        e. Providing useful information on the web  1  2  3  9        f. Dealing with publication issues  1  2  3  9        g. Assisting with administrative issues  1  2  3  9        h. Assisting with new proposal development  1  2  3  9    Project-specific questions:  Very Ineffective  Somewhat Ineffective  Somewhat Effective  Very Effective  Can't Evaluate      1. How effective is the leadership of the project?  1  2  3  4  9      2. How effective is the organization of the project for accomplishing the scope of work?  1  2  3  4  9      3. How effective are the project meetings? (Consider both conference calls and in-person meetings.)  1  2  3  4  9      4. How effective are the project communications among different sites and different working groups?  1  2  3  4  9  Selected items on the impact of the CRN:            Impact on Health Plan  Disagree  Somewhat Disagree  Somewhat Agree  Agree  Can't Evaluate      The CRN is highly visible to the oncologists in my health plan.  1  2  3  4  9      As a result of the CRN, our health plan is more supportive of our research center.  
1  2  3  4  9  Impact on Research Organization  Disagree  Somewhat Disagree  Somewhat Agree  Agree  Can't Evaluate      The CRN will enhance our research organization's reputation.  1  2  3  4  9      The CRN provides my research center with the opportunity to improve the quality of its cancer research program.  1  2  3  4  9  Impact on Individual  Disagree  Somewhat Disagree  Somewhat Agree  Agree  Can't Evaluate      I view the CRN as an opportunity to enhance my research career.  1  2  3  4  9      The opportunity to work with researchers outside of our health care system is important to me.  1  2  3  4  9  How effective is the Steering Committee at:  Very Ineffective  Somewhat Ineffective  Somewhat Effective  Very Effective  Can't Evaluate      a. Fostering cooperation between sites  1  2  3  4  9      b. Making decisions that support the overall goals of the CRN  1  2  3  4  9  How well do CRN publications guidelines address:  Very Poorly      Very Well  Can't Evaluate      a. Authorship responsibilities  1  2  3  4  9      b. Collaborative manuscript development  1  2  3  4  9  Compared to a year ago, how well is the CRN is doing in each of the following areas:              Worse  The Same  Better  Can't Evaluate        a. Promoting communication across sites  1  2  3  9        b. Data coordination  1  2  3  9        c. Overall leadership  1  2  3  9        d. Providing information concerning other projects  1  2  3  9        e. Providing useful information on the web  1  2  3  9        f. Dealing with publication issues  1  2  3  9        g. Assisting with administrative issues  1  2  3  9        h. Assisting with new proposal development  1  2  3  9    Project-specific questions:  Very Ineffective  Somewhat Ineffective  Somewhat Effective  Very Effective  Can't Evaluate      1. How effective is the leadership of the project?  1  2  3  4  9      2. How effective is the organization of the project for accomplishing the scope of work?  1  2  3  4  9      3. How effective are the project meetings? (Consider both conference calls and in-person meetings.)  1  2  3  4  9      4. How effective are the project communications among different sites and different working groups?  1  2  3  4  9  Selected items on the impact of the CRN:            Impact on Health Plan  Disagree  Somewhat Disagree  Somewhat Agree  Agree  Can't Evaluate      The CRN is highly visible to the oncologists in my health plan.  1  2  3  4  9      As a result of the CRN, our health plan is more supportive of our research center.  1  2  3  4  9  Impact on Research Organization  Disagree  Somewhat Disagree  Somewhat Agree  Agree  Can't Evaluate      The CRN will enhance our research organization's reputation.  1  2  3  4  9      The CRN provides my research center with the opportunity to improve the quality of its cancer research program.  1  2  3  4  9  Impact on Individual  Disagree  Somewhat Disagree  Somewhat Agree  Agree  Can't Evaluate      I view the CRN as an opportunity to enhance my research career.  1  2  3  4  9      The opportunity to work with researchers outside of our health care system is important to me.  1  2  3  4  9  View Large The CRN Steering Committee (including NCI) developed the evaluation instrument. Most items are quantitative ratings using Likert scales. Since rating scales cannot capture all aspects of performance, open-ended items collect qualitative input on the strengths and areas for improvement for each project/committee. 
CRN collaborators agreed at the outset that this evaluation must always be viewed as useful and meaningful to its stakeholders. Therefore, the survey is reviewed annually, and questions are added or deleted to best meet the consortium's needs. Questions about the impact of the CRN on the individual staff and member organizations remain constant for year-to-year comparisons. Respondents As projects evolve, the pool of relevant survey respondents changes from year to year. Site and project principal investigators identify the universe of potential respondents for each year's survey. Generally, these include all individuals who were involved in at least one facet of the CRN (e.g., a project or committee). Skip patterns are included for every question, since some components of the CRN may not be relevant to every respondent. The survey is administered at the end of each grant year. Because this is regarded as a quality improvement effort, neither institutional review board approval nor written informed consent are required. In the first 4 years, this was a self-administered mailed survey. However, in year 5, we switched to a Web-based survey to streamline data entry. Since participants were evaluating their peers, or in some cases their supervisors, protecting the identity of respondents has always been a chief concern. In year 1, the surveys were confidential, but in later years, they were anonymous. Therefore, reminders to complete the surveys can be circulated only to the entire respondent pool, rather than individuals. No incentives are provided. Data Synthesis Means are computed for each quantitative item, and cross-tabulations are generated that show the mean score for the items by role on project and level of involvement (i.e., involved in single versus multiple CRN components). Qualitative feedback on a project or infrastructure component is disseminated to the relevant project or committee leader and the CRN Steering Committee. Project teams and committees review results and formulate improvement plans during in-person meetings. RESULTS Table 2 shows the number of participants and response rates by year, type of respondent, and level of participation. Because participation in different components of the CRN varies by individual, the number of responses to a given question may also vary based on the number of respondents involved in that component. Given evolution in project staffing, different individuals may respond to the questionnaire or specific sections from year to year. The pool of respondents is identified by the site and project leaders and includes CRN participants who may be involved in only a single aspect of the collaboration. However, we know anecdotally that those who are most heavily involved in the CRN participate in the survey each year. Table 2.  Cancer Research Network evaluation respondents Survey year  No. of surveys sent  No. of surveys received  Completion rate (%)      1999  117  88  75.2      2000  164  118  72.0      2001  148  106  71.6      2002  120  83  69.2      2003  153  106  69.3  Survey year  No. of surveys sent  No. 
RESULTS

Table 2 shows the number of participants and response rates by year, type of respondent, and level of participation. Because participation in different components of the CRN varies by individual, the number of responses to a given question may also vary with the number of respondents involved in that component. Given evolution in project staffing, different individuals may respond to the questionnaire, or to specific sections, from year to year. The pool of respondents is identified by the site and project leaders and includes CRN participants who may be involved in only a single aspect of the collaboration. However, we know anecdotally that those who are most heavily involved in the CRN participate in the survey each year.

Table 2. Cancer Research Network evaluation respondents

Survey year   Surveys sent   Surveys received   Completion rate (%)
1999          117            88                 75.2
2000          164            118                72.0
2001          148            106                71.6
2002          120            83                 69.2
2003          153            106                69.3

Primary role on the CRN*                                      2001 N (%)   2002 N (%)   2003 N (%)
Investigator/Biostatistician                                  49 (44)      49 (46)      40 (48)
Coordinator/Project Manager/Administrator                     35 (32)      26 (25)      23 (28)
Programmer/Data Analyst/Site Data Manager                     23 (21)      21 (20)      15 (18)
Other (Financial, Research Specialist, Project Assistant,
    Sr. Project Assistant, Unspecified)                       10 (9)       10 (9)       5 (6)
*Check all that apply; totals exceed 100%.

Percentage of respondents:

Duration of involvement with CRN (years)   2001   2002   2003†
<1                                         4      4      22
1–2                                        24     10     22
>2                                         73     86     56

Level of involvement with CRN              2001   2002   2003†
Single project or committee                41     38     54
Multiple projects and/or committees        59     62     46
†2003 was the first year of a new grant cycle and new core projects.
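As a worked check (illustrative only), the completion rates in Table 2 follow directly from the counts, that is, surveys received divided by surveys sent:

```python
# Completion rate = surveys received / surveys sent (counts from Table 2).
sent     = {1999: 117, 2000: 164, 2001: 148, 2002: 120, 2003: 153}
received = {1999: 88,  2000: 118, 2001: 106, 2002: 83,  2003: 106}

for year in sent:
    rate = 100 * received[year] / sent[year]
    print(year, f"{rate:.1f}%")  # 75.2, 72.0, 71.6, 69.2, 69.3; matches Table 2
```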
Since the intent of the evaluation is to identify successes and deficiencies, quantitative results are not tested for statistical significance. However, year-to-year comparisons of means (e.g., the 1999 mean versus the 2000 mean) are included in the quantitative results to ascertain whether improvements have been realized.

Infrastructure

Table 3 shows an excerpt of quantitative and qualitative feedback. Each year, the evaluation has identified critical aspects of this collaboration that could be improved. The infrastructure has changed over time to meet projects' and participants' needs. These changes range from the structure of the data resources core to optimizing modes of communication to keeping CRN members informed about CRN-wide activities. Midway through our first funding cycle, feedback suggested that the principal investigator's site did not have an adequate understanding of the unique needs and capabilities of participating sites. Site visits were instituted immediately and have been an invaluable element for improving collaboration.

Table 3. Sample quantitative and qualitative feedback from CRN Evaluation surveys

How effective is the Scientific and Data Resources Core (SDRC) at:
(Very Ineffective = 1, Somewhat Ineffective = 2, Somewhat Effective = 3, Very Effective = 4)

Item                                      1 n (%)   2 n (%)   3 n (%)    4 n (%)    Can't Eval. n   N    Mean: 2003   2002   2001   2000
Providing leadership on data issues       1 (2)     6 (13)    20 (42)    21 (44)    53              48   3.3          2.8    3.1    3.1
Understanding the data-related needs
    of the projects                       2 (5)     6 (14)    18 (41)    18 (41)    57              44   3.2          2.8    2.8    3.0
Providing information and tools useful
    to proposals and projects             0 (0)     7 (16)    19 (42)    19 (42)    55              45   3.3          (new question)

Compared to a year ago, how well do you think the CRN is doing in each of the following areas:
(Worse = 1, The Same = 2, Better = 3)

Area                                             Worse n (%)   Same n (%)   Better n (%)   Can't Eval. n   N    Mean: 2003   2002   2001
a. Assisting with administrative issues          1 (2)         24 (57)      17 (40)        60              42   2.4          2.4    2.4
b. Assisting with new proposal development       1 (3)         17 (55)      13 (42)        71              31   2.4          2.6    2.8
c. Data coordination                             4 (9)         17 (39)      23 (52)        58              44   2.4          2.4    2.5
d. Dealing with publication issues               1 (4)         16 (64)      8 (32)         77              25   2.3          2.5    2.5
e. Overall leadership                            2 (4)         27 (59)      17 (37)        56              46   2.3          2.4    2.5
f. Promoting communication across sites          1 (2)         29 (63)      16 (35)        56              46   2.3          2.4    2.6
g. Providing information concerning
    other projects                               1 (2)         27 (64)      14 (33)        59              42   2.3          2.3    2.5
h. Providing useful information on the web       1 (2)         19 (42)      25 (56)        56              45   2.5          2.5    2.8
Excerpts from qualitative feedback (multiple different respondents):

* It would be helpful to have guidelines for site participation in CRN projects. For example, if a project only includes 2 KP sites, why is that considered appropriate as a CRN project when other interested sites were turned away?
* Provide minutes for CRN site meetings for investigators and project managers and programmers who are unavailable to attend.
* I think it would be useful to have meetings (conference calls) twice a year or so with the CRN leadership and site investigators (ALL investigators, not just steering committee) to get overall updates/discussion.
* I really am proud to be part of the CRN (warts and all!). I would be thrilled if the CRN put priority on topics that can only be addressed through a large network ~ studying rarer CA and exposures.
* Need one page policy summary (checklist?) that includes the 2 to 3 boilerplate sentences expected to be in every paper.
* Fortunately, the Pub. Comm. reduced its initial barrier approach to publications approval and handles reviews in a relatively timely way, but I am unaware of anything it has done to facilitate pubs.

Abbreviations: GHC, Group Health Cooperative; CRN, Cancer Research Network.

Projects

CRN projects are typically complex efforts comprising multiple data collection components and involving at least three sites. Early in the first funding cycle, protracted decision making emerged as a critical issue on two core projects. The project leaders took steps toward enhancing efficiency by establishing executive committees to accelerate deliberations about design issues. Yet transparency and truly collaborative scientific decision making are also key facets of successfully implementing these complex multisite efforts, so the leaders also implemented strategies to encourage more active participation by all collaborators. Timely and accurate dissemination of design decisions is likewise critical for keeping a project on track while ensuring transparency. Therefore, another tangible procedural change was adding functionality to the CRN's secure Web site so that project teams at the various research sites can remotely post project documentation. Overall project functioning is generally rated "somewhat" or "very effective" by respondents, indicative of the project leaders' commitment to organization, clear communication, and efficient implementation strategies. The multisite aspect of every project means that the investigators involved typically represent different disciplines and bring different perspectives to the research.
Project leaders must therefore remain vigilant about ensuring that all team members contribute, and must hold them accountable for completing the scope of work.

Impact of the CRN

From its inception, the CRN sought to improve the quality of cancer research and cancer care at its participating sites, and the assessment of our progress on this front is a much-anticipated piece of each year's survey. Table 4 shows results from a subset of these questions. The CRN has had a substantial impact at the individual investigator and research center levels; impact at the health plan level, however, is lower. As the CRN has matured, project findings have heightened the visibility of the research centers, and the health plans have incrementally shown more support for the centers. Individual investigators consistently report that the CRN provides them with important opportunities to collaborate with external researchers, enhance their careers, and improve the quality and availability of local data resources.

Table 4. Sample results from questions pertaining to the impact of the Cancer Research Network (CRN)
(Strongly Disagree = 1, Disagree = 2, Not Sure = 3, Agree = 4, Strongly Agree = 5; response distributions are from the 2002 survey)

Statement                                        1 n (%)   2 n (%)   3 n (%)     4 n (%)   5 n (%)   N    Mean: 2002*   2000   1999

Impact on Health Plan
The leadership of my health plan is aware
    of the CRN.                                  1 (1)     2 (2.5)   32 (40.5)   33 (42)   11 (14)   79   3.6           3.7    3.7

Impact on Research Organization
The CRN is a cornerstone of our research
    center.                                      4 (5)     26 (32)   17 (21)     27 (33)   7 (9)     81   3.1           2.8    2.6
The CRN will enhance our research
    organization's reputation.                   0 (0)     1 (1)     11 (14)     54 (67)   15 (18)   81   4.0           4.0    3.9

Impact on Individual
I believe the work of the CRN researchers
    will result in significant improvements
    in the quality of cancer data and data
    availability.                                0 (0)     3 (4)     11 (14)     41 (51)   26 (32)   81   4.1           3.9    3.9
My involvement on the CRN is very
    satisfying to me.                            0 (0)     9 (11)    11 (13)     46 (56)   16 (20)   82   3.8           3.8    3.8
I view the CRN as an opportunity to
    enhance my research career.                  1 (1)     9 (11)    14 (17)     41 (50)   17 (21)   82   3.8           3.8    3.8

*Means for 2001 are not provided for comparison because the 2001 survey used different response categories for all questions in Section 3.
Improvement Plans

Once evaluation results are disseminated, research teams develop improvement plans based on the findings. These plans are generated entirely within the group, in keeping with the heuristic that implementation of, and adherence to, an improvement plan are more likely if the stakeholders develop it themselves rather than having a prescription for improvement dictated by another person or group. The improvement plans are brief, specific, and outcome oriented. The major steps in an improvement plan are to 1) establish goals; 2) develop an approach to meeting each goal; and 3) determine how to measure progress toward each goal.
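As one illustration of how such a plan might be tracked, the sketch below encodes the three elements (goal, approach, progress measure) in a simple record structure. This is not a CRN artifact; every name and number in it is hypothetical, and the numeric targets merely echo the survey's rating scale.

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """One goal in an improvement plan: what to achieve, how, and how to measure it."""
    statement: str          # 1) the goal itself
    approach: str           # 2) the approach to meeting the goal
    progress_measure: str   # 3) how progress toward the goal will be measured
    target: float           # hypothetical target on the survey rating scale
    baseline: float         # mean rating from the most recent survey

@dataclass
class ImprovementPlan:
    component: str                      # project or committee being improved
    goals: list = field(default_factory=list)

    def on_track(self, current: dict) -> dict:
        """Compare current mean ratings against the target for each goal."""
        return {g.statement: current.get(g.statement, g.baseline) >= g.target
                for g in self.goals}

# Illustrative use, with made-up numbers in the spirit of Table 3:
plan = ImprovementPlan("Scientific and Data Resources Core", [
    Goal("Understand project data needs", "Institute site visits",
         "Mean rating on the SDRC survey item", target=3.0, baseline=2.8),
])
print(plan.on_track({"Understand project data needs": 3.2}))
# {'Understand project data needs': True}
```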
DISCUSSION

The most important result of the CRN Evaluation is the enhanced ability to discuss openly ways to improve the consortium's overall performance. The evaluation process enables discussion of deficiencies to take place in a nonthreatening context. Weaknesses in organization, communication, or leadership can be identified constructively and confidentially, and the survey also offers respondents the ability to suggest ways to strengthen or modify procedures as well as interpersonal interactions. As suggestions are adopted, trust increases and interactions continue to improve. The primary purpose of this evaluation is to rapidly identify and correct problems in the functioning of the CRN before they interfere with the research. Though this initially seemed an unorthodox undertaking in an applied research setting, implementing a self-assessment of this collaboration has been a major contributor to its success. Other NCI-funded consortia, including the Transdisciplinary Tobacco Use Research Centers, the Cancer Care Outcomes Research and Surveillance Consortium, and the Statistical Coordinating Center of the Breast Cancer Surveillance Consortium, have approached the CRN to learn from its experiences and undertake similar evaluations. The recently awarded Centers of Excellence in Cancer Communications Research initiative also includes an evaluation component. Surfacing issues that could interfere with collaboration and research has been cited by consortium leaders as one motive for evaluation. Institutionalizing an evaluation requirement for consortia could ultimately facilitate comparison across different research collaborations and possibly identify best practices in collaborative research.

Although our evaluation has had a substantial impact internally, its limitations must be acknowledged. First, it is an ongoing challenge to persuade participants of the survey's importance: time spent on the survey, though it is a fairly brief instrument, is time taken away from the research itself. Second, approximately half of the participants are involved in only a single project and therefore may not be ideally positioned to evaluate networkwide activities. But perceptions from team members who are intimately involved, as well as from those on the periphery, provide a broader and less biased assessment of our effectiveness and also indicate areas where the CRN could improve its visibility. Third, with personnel turnover, different people may participate each year, reducing our ability to compare ratings of effectiveness over time. However, the dynamic nature of the research environment, and the CRN's approach of adapting processes in response to prior evaluations, preclude contacting the same respondents every year. Finally, although most questions remain unchanged from year to year, some items have been added or deleted in each iteration. Although this tailored approach could diminish the extent to which the CRN can gauge its overall improvement over the lifetime of the collaboration, it facilitates assessment of newer organizational changes.

There is an abundance of literature on quality improvement, including seminal publications by Deming (5) and Juran (6). Several multicenter research consortia have published descriptions of their structure and purpose (7–10). Mattessich (11) has reviewed social science and educational research on optimizing collaborations. However, we found few articles that bridge these three domains, particularly in health research. Kaluzny's assessment of the Community Clinical Oncology Program (2) served as a blueprint for the CRN's evaluation strategy, but to our knowledge, few other consortia have published a systematic evaluation of their collaborative processes (12,13). Warden (14) describes the use of quality improvement methods in a multicenter depression trial; however, those methods target performance-oriented research goals (e.g., patient accrual and retention) to enhance trial performance, which is very useful but quite different from the CRN's approach. Researchers in multicenter collaborations might consider a hybrid approach, using satisfaction or effectiveness measures (e.g., how well the collaboration is functioning) in combination with other cornerstones of research quality and productivity, such as response or retention rates, data quality, adherence, and number of publications. NCI's evaluation requirement in its Request for Applications was a unique trial balloon that has been beneficial to the CRN.

From examination of successful collaborations, Mattessich (11) advocates that "members' perceptions of ownership of a collaborative group need to be monitored, and needed changes in process or structure [made] in order to ensure the feeling of ownership." Moreover, as Kaluzny (2) described, consortia attuned to the interactive processes among alliance partners are more likely to achieve a productive dynamic and meet research objectives. Various CRN member organizations had collaborated previously, though never on this scale. Hence, the ability of individuals to critique their own projects, as well as networkwide activities, created a unique environment for change. Moreover, rapid-cycle discussion and implementation of improvements strengthen participants' level of investment. Contributing feedback is empowering to CRN members, and this empowerment translates into a constructive ability to address everything from structural issues to interpersonal communication.

Our findings can be of value to other consortia on two levels. First, the survey itself creates a safe environment in which to identify issues that may interfere with research. Second, the strategies the CRN has used to improve its functioning are broadly relevant to all types of multicenter collaborations. In our experience, key strategies that enhance collaboration include in-person contact; transparent decision-making processes; efficient use of communication modalities; and creating opportunities for all collaborators to play an active role in the consortium's research and operations. Optimizing the process of collaboration will contribute to achievement of the scientific goals. The experience of the CRN provides a useful framework and process for evaluating the structure of consortium-based research.
References

(1) Kaluzny AD, Warnecke RB. Managing a health care alliance: improving community cancer care. San Francisco (CA): Jossey-Bass; 1996.
(2) Cancer control and population sciences—management and evaluation of large initiatives. Available at: http://dccps.nci.nih.gov/bb/Management_and_Evaluation.htm.
(3) Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP. The improvement guide: a practical approach to enhancing organizational performance. San Francisco (CA): Jossey-Bass; 1996.
(4) Klabunde CN, Lacey LM, Kaluzny AD. Managing care in the community. In: Kaluzny AD, Warnecke RB, editors. Managing a health care alliance: improving community cancer care. San Francisco (CA): Jossey-Bass; 1996. p. 83–103.
(5) Deming WE. Out of the crisis. Cambridge (MA): Massachusetts Institute of Technology; 1986.
(6) Juran JM, editor. Juran's quality control handbook. 4th ed. New York (NY): McGraw-Hill; 1988.
(7) Platt R, Davis R, Finkelstein J, Go AS, Gurwitz JH, Roblin D, et al. Multicenter epidemiologic and health services research on therapeutics in the HMO Research Network Center for Education and Research on Therapeutics. Pharmacoepidemiol Drug Saf 2001;10:373–7.
(8) Dignam JJ. The role of cancer cooperative groups within the spectrum of cancer care. Cancer Control 2004;11:55–63.
(9) Ballard-Barbash R, Taplin SH, Yankaskas BC, Ernster VL, Rosenberg RD, Carney PA, et al. Breast Cancer Surveillance Consortium: a national mammography screening and outcomes database. AJR Am J Roentgenol 1997;169:1001–8.
(10) Gohagan JK, Prorok PC, Hayes RB, Kramer BS; Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial Project Team. The Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial of the National Cancer Institute: history, organization, and status. Control Clin Trials 2000;21(6 Suppl):251S–72S.
(11) Mattessich PW, Murray-Close M, Monsey BR, Wilder Research Center. Collaboration: what makes it work. 2nd ed. St Paul (MN): Wilder Publishing Center; 2001.
(12) El Ansari W, Phillips CJ, Hammick M. Collaboration and partnerships: developing the evidence base. Health Soc Care Community 2001;9:215–27.
(13) Pirkis J, Herrman H, Schweitzer I, Yung A, Grigg M, Burgess P. Evaluating complex, collaborative programmes: the Partnership Project as a case study. Aust N Z J Psychiatry 2001;35:639–46.
(14) Warden D, Rush AJ, Trivedi MH, Ritz L, Stegman D, Wisniewski SR. Quality improvement methods as applied to a multicenter effectiveness trial—STAR*D. Contemp Clin Trials 2005;26:95–112.

© The Author 2005. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oxfordjournals.org.
