Cancer Early-Detection Services in Community Health Centers for the Underserved

Background: Achieving cancer early-detection goals remains a challenge, especially among low-income and minority populations.

Design/Setting: A randomized trial based in 62 community health centers for the underserved in New York, New Jersey, and western Connecticut. Family physicians were on staff at most of the centers.

Intervention: Workshops, materials, and ongoing advice for center leaders promoted implementation of a preventive services office system to identify patients in need of services at each visit through use of medical record flow sheets, other tools, and staff involvement.

Evaluation End Points: The proportion of randomly selected patients by center who were up to date for indicated services at baseline (n=2645) and follow-up (n=2864) record review.

Results: Only 1 service (breast self-examination advice) increased more in intervention centers than in controls. Seven of 8 target services increased significantly for the 62 centers overall. During the study, the medical director changed in 26 centers (42%). Keeping the same medical director at intervention centers was associated with improvements in services.

Conclusions: Cancer early-detection services are improving in community health centers, but the intervention had only a small impact, as determined by record review. To have an impact, the intervention required that there be no change in medical director.
The relationship of changes in the practice environment to services delivered is complex and deserves more study.

Community/migrant health centers (C/MHCs) provide primary care and preventive services for populations that otherwise have limited or no access to continuous, comprehensive care. In 1995, there were 722 C/MHCs providing primary care across 2204 sites to 8.1 million people through more than 36 million visits. In addition to Medicaid, private insurance, and sliding-scale fees, C/MHCs receive funding from the Division of Community and Migrant Health of the Bureau of Primary Health Care of the Public Health Service, Health Resources and Services Administration. The C/MHCs are a proven source of cost-effective, high-quality, community-based primary care, with family physicians playing a prominent role.

Providing preventive care to underserved populations is a well-recognized challenge. Healthy People 2000: National Health Promotion and Disease Prevention Objectives specifies more modest preventive service targets for low-income and minority groups. Costs of indicated services, limited availability, difficult access, lack of clinician recommendations, patient reluctance, and other well-recognized barriers may weigh especially heavily on these groups.

Enhancing the provision of preventive services in primary care has been addressed by a number of studies. Interventions based on office systems that include use of flow sheets and other clinician reminders, or that include multiple components, have been most successful. Some studies have targeted underserved populations. Recently, Gemson and colleagues found that an office system based on the Public Health Service's Put Prevention Into Practice office system increased provision of preventive services in 1 urban academic medical center compared with a similar control center. Most services were provided by physicians in training.
How to enhance preventive care in more representative primary care settings serving special populations is not known. The Cancer Prevention in Community Practice Project (CPCP) found that an office system intervention significantly increased the provision of 6 of 10 target cancer early-detection and prevention services. The 98 participating community practices, all in New Hampshire and Vermont, included no more than 4 clinicians each, the majority of whom were family physicians. Two of the 98 sites were C/MHCs. The Community Health Center Cancer Control Project tested an intervention derived from the CPCP office system in 62 C/MHCs and other sites providing primary care to special populations. The intervention was delivered by an intermediary organization that could potentially serve as a dissemination agent after the research trial was complete. This report summarizes the findings of this study.

PARTICIPANTS AND METHODS

After recruitment and baseline data collection in 1992, 62 centers were first stratified by urban vs nonurban location, then randomly assigned to intervention or control status. In 1994, after the intervention had been provided to centers so assigned, follow-up data were collected. Rates of cancer early-detection services determined from randomly selected patient records at each interval provided the dependent variables of interest.

THE INTERMEDIARY

The intermediary organization responsible for center recruitment and intervention delivery was Clinical Directors Network Inc (CDN). Established in 1985, CDN is a not-for-profit membership organization consisting of the medical, dental, and nursing directors as well as other clinicians practicing in federal, state, and city-funded community, migrant, homeless, and other health centers in New York, New Jersey, Puerto Rico, the Virgin Islands of the United States, and elsewhere.
Clinical Directors Network provides training on clinical leadership and management issues, as well as clinical updates, for C/MHC providers and others who serve low-income, disadvantaged, and minority populations. Clinical Directors Network is also active in practice-based clinical research.

THE C/MHCs

Participation was sought from New York, New Jersey, and western Connecticut centers. The medical directors of all 97 C/MHCs open in 1992, as well as the directors of other centers supported by states and the New York City Health and Hospitals Corporation, were approached to determine eligibility. To be eligible, a center had to have opened at least 2 years earlier and provide a broad range of primary care services to adults. Sixty-two sites were eligible and agreed to participate, 27 were eligible but declined, and the balance were ineligible because they provided a limited range of services.

THE INTERVENTION

Centers assigned to intervention status received ongoing external assistance from CDN to implement the office system for cancer early detection shown to be efficacious in the earlier CPCP. The system consisted of establishing preventive care goals and practice routines to promote meeting these goals; shared responsibility among all center clinicians and office staff for identifying patients overdue for preventive services at all feasible visits; and use of various tools such as preventive care flow sheets in the medical record, patient education materials, and external chart identifiers. In addition, patient-held health diaries were adapted from Dickey and Petitti, and removable adhesive notes were modified from CPCP. Similar tools are included in the Put Prevention Into Practice kit. During 20 months, CDN provided office system implementation assistance to intervention centers in 3 ways: a workshop for key personnel; 1 or 2 visits to each center by CDN personnel who facilitated development and implementation of the office system; and ongoing support with free professional
education materials, office system tools, and advice. The workshop described the components of the office system as well as an implementation process called the preventive GAPS approach, which allowed each center to develop an office system that fit its unique resources, patient population, and preventive service goals. Professional education materials available for loan included clinical breast examination teaching models (Mammotech Corp, Gainesville, Fla), oral cavity clinical pathology slides, and office staff training videos on communication.

THE PATIENT EVALUATION SAMPLING STRATEGY

The records of established patients who were at least 42 years of age, not terminally ill, and not previously diagnosed as having cancer provided data about cancer early detection. Further patient eligibility criteria included a first visit to the practice at least 366 days before a record review date, as well as an additional visit within the most recent 365 days. With the use of a computerized random selection procedure, records for 50 eligible patients were identified at both baseline and 2-year follow-up. Patients of the first record review were not excluded from the second. All eligible patients had an equal chance of being selected; thus, patients with many visits were no more likely to be selected than patients with just 2 visits. Patients were also selected without regard to whether they had 1 usual or assigned clinician.

DESCRIPTIVE EVALUATION DATA

For each C/MHC, a center survey was completed at baseline and follow-up by the medical director or designee. This survey ascertained characteristics of the center and the population it served. Descriptive data on the patient samples were ascertained from medical records of patients in the evaluation samples.
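The equal-probability record selection described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual selection software, and the record field names (age, first_visit, and so on) are assumptions:

```python
import random
from datetime import date

def eligible(patient, review_date):
    """Apply the study's record-eligibility rules (field names are illustrative)."""
    return (
        patient["age"] >= 42
        and not patient["terminally_ill"]
        and not patient["prior_cancer"]
        # first visit at least 366 days before the record review date
        and (review_date - patient["first_visit"]).days >= 366
        # at least one additional visit within the most recent 365 days
        and (review_date - patient["last_visit"]).days <= 365
    )

def sample_records(patients, review_date, n=50, seed=0):
    """Draw n records with equal probability, so patients with many visits
    are no more likely to be chosen than patients with just 2 visits."""
    pool = [p for p in patients if eligible(p, review_date)]
    rng = random.Random(seed)  # fixed seed here only for reproducibility
    return rng.sample(pool, min(n, len(pool)))
```

Because sampling is over the patient list rather than the visit log, each eligible patient appears in the pool exactly once, which is what makes the selection visit-count independent.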
After patient eligibility was confirmed, records were reviewed to determine patient demographics, insurance status, and office visit history.

OUTCOME DATA

Evaluation end points included cancer early-detection services recommended by the National Cancer Institute Working Guidelines for Practicing Physicians at the time this project began. For women, these included clinical breast examination, breast self-examination advice, mammography (age 50 years and older), and Papanicolaou test. For both men and women aged 50 years and older, these included oral cavity examination, digital rectal examination, home fecal occult blood testing, and sigmoidoscopy. At both baseline and follow-up record reviews, each record was checked in reverse chronological order back 24 months (or until the patient's initial visit if the patient had registered less than 24 months earlier) for each service.

QUALITY CONTROL

Record reviews were completed by independent professional medical record abstractors on contract from a regional professional standards review organization (MedReview Inc, New York, NY). Reviewers were blinded to hypotheses and center study group. In all practices, a randomly selected subsample of 10% of the records was reviewed independently by a second reviewer and the reviews were compared. Reviewer agreement was high. Disagreement between reviewers varied from 0.2% to 5.9% across individual demographic characteristics. For target services, disagreement on whether a service was up to date ranged from 3.2% for clinical breast examination to 9.1% for sigmoidoscopy.

STATISTICAL METHODS

The center is the unit of analysis unless otherwise noted. Urban vs nonurban location was determined by ZIP code. Characteristics of centers in each study group are summarized as means or medians and compared by means of appropriate statistical tests.
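The double-review agreement check described under Quality Control reduces to a simple proportion of discordant abstractions. A minimal sketch, with the record structure assumed for illustration:

```python
def disagreement_rate(review_a, review_b, field):
    """Proportion of doubly reviewed records on which the two independent
    abstractors disagreed for a given field (e.g., whether a service was
    up to date). Both lists must cover the same records in the same order."""
    if len(review_a) != len(review_b):
        raise ValueError("reviews must cover the same records")
    differing = sum(a[field] != b[field] for a, b in zip(review_a, review_b))
    return differing / len(review_a)
```

Applied per field, this yields figures directly comparable to the 0.2% to 9.1% disagreement range reported above.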
Three centers (2 assigned to the intervention and 1 a control) were very large, hospital-based programs, employing over 50 clinicians each, and were excluded from the calculation of mean full-time equivalent clinicians, office staff, and number of examination rooms. For the primary analysis, services were considered up-to-date if the service was provided or recommended at least once during the 24-month period before the record review date, or during the time since enrollment at the center. For each service, the performance of individual centers was determined and then combined with the results from other centers in the same study group. Thus, centers were weighted equally, although the number of evaluation sample patients at each center who were eligible for the specific individual services varied.

Nonparametric tests were used for statistical comparisons of the primary outcomes. Median proportions of patients up-to-date for target services were compared between baseline and follow-up intervals within study groups by Wilcoxon signed rank tests and between study groups by the Wilcoxon rank sum test. An alternative analysis demonstrated similar results with the use of generalized estimating equations for longitudinal data based on Poisson distributions with overdispersed variances and odds ratios calculated by means of generalized linear models. (This analysis is not reported here.) In addition to the intention-to-treat analysis, subsample analyses explored the relationship to outcomes of various center characteristics such as a change in medical director. We considered P values less than .05 statistically significant.

RESULTS

Thirty-one centers were assigned to each study group and provided complete data. About 10% of the records reviewed for eligibility at each interval were unavailable, usually because an appointment was pending. Of the records found, the most common reason for ineligible status was too short a time as an established patient of the center.
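The analysis unit described in the statistical methods, per-center proportions with centers weighted equally, can be sketched as follows. This is illustrative only: the record fields are assumptions, and the Wilcoxon tests themselves are not reproduced here.

```python
from statistics import median

def uptodate_proportion(records, service):
    """One center's proportion of service-eligible sampled patients who were
    up to date (service provided or recommended in the prior 24 months)."""
    qualifying = [r for r in records if service in r["eligible_for"]]
    if not qualifying:
        return None  # no patient in the sample qualified for this service
    return sum(service in r["up_to_date"] for r in qualifying) / len(qualifying)

def median_change(centers, service):
    """Median across centers of the follow-up minus baseline proportion,
    so each center counts equally regardless of its sample size."""
    changes = []
    for c in centers:
        base = uptodate_proportion(c["baseline"], service)
        follow = uptodate_proportion(c["followup"], service)
        if base is not None and follow is not None:
            changes.append(follow - base)
    return median(changes)
```

Taking the median of per-center changes, rather than pooling patients, is what keeps a large center from dominating a small one in the group comparison.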
For most centers, 50 eligible records provided usable data at each evaluation interval.

CENTER CHARACTERISTICS

Characteristics of the centers and their patient populations are provided in Table 1. A majority of patients for whom ethnicity was known were African American (35%) or Hispanic (29%). Across centers, more than one fourth of the patient sample received Medicaid. At least 11% overall had no insurance. For an additional 20%, insurance status was not recorded in the record, suggesting that they also may have lacked coverage. Continuity of care, as determined by the proportion of visits for each patient with the provider seen most often, was high. Family physicians were on the staff of 61% of the intervention C/MHCs and 71% of controls, with the remaining centers depending on clinicians trained in internal medicine, pediatrics, or obstetrics and gynecology.

Table 1. Baseline Characteristics of Community Health Centers and Their Patient Samples by Study Group

Table 2 describes the practice environments during the study. In intervention practices, medical director turnover was higher, clinicians were busier, and moves or expansions were more common, although none of these comparisons was statistically significant.

Table 2. Environmental Characteristics Present in Centers During the Study Period

All the intervention centers cooperated with the 3 components of the intervention. All were represented at a workshop, 30 (97%) of 31 cooperated with at least 1 visit from CDN staff, and all made use of at least some form of follow-up assistance. Twelve (39%) of 31 centers implemented new or redesigned flow sheets.

MAIN EFFECTS ANALYSIS

Table 3 summarizes the proportion of age- and sex-appropriate patients provided indicated services within the previous 24 months overall and by study group.
Comparing baseline and follow-up proportion changes between study groups, only breast self-examination advice increased more for the intervention group (P=.009).

Table 3. Median Proportions of Patients by Center Who Received Age- and Sex-Indicated Services During the Previous 24 Months: Baseline vs Follow-up Intervals

For the 62 centers overall, 7 of 8 services showed increased proportions of patients who were up-to-date for indicated services (P<.05); only sigmoidoscopy proportions did not change. At baseline and follow-up, oral cavity examination and mammography were the most frequently provided services. Home fecal occult blood testing and sigmoidoscopy for patients aged 50 years and older and breast self-examination advice for women were the services least frequently provided.

Between baseline and follow-up in the intervention group, the median proportion of patients up-to-date increased by 0.10 or more for clinical breast examination, home fecal occult blood testing, and oral cavity examination. Within the intervention group, comparisons of baseline and follow-up proportions for clinical breast examination, breast self-examination advice, and oral cavity examination were statistically significant. The median proportion of women up-to-date for mammography increased from 0.58 to 0.65 (P=.06).

Between baseline and follow-up in the control group, the median proportion of patients up-to-date for Papanicolaou tests, fecal occult blood tests, and oral cavity and digital rectal examinations increased by 0.10 or more. Within the control group, comparisons of proportions between baseline and follow-up were significant for 4 services: clinical breast examination, Papanicolaou test, digital rectal examination, and oral cavity examination.
Comparisons over time involving breast self-examination advice and home fecal occult blood testing improved 0.06 (P=.06) and 0.12 (P=.06), respectively, in the proportion of patients up-to-date.

SUBGROUP ANALYSES

We explored the relationship of various practice characteristics to changes in early-detection services provided in intervention and control practices. No pattern of change across the 8 target services between study groups was apparent according to size, urban vs nonurban location, higher vs lower clinician or staff turnover, or center moves and expansions based on the Wilcoxon signed rank test. For comparisons in this hypothesis-generating consideration, statistical power is limited because of small cell sizes.

Because the medical director played a central role in the intervention and medical director change occurred in almost half the centers, we explored the relationship between medical director change and the impact of the intervention. Table 4 shows median differences for indicated services between baseline and follow-up according to whether there was a change in medical director (n=26) vs whether the same medical director remained (n=36). Intervention centers with the same medical director (n=14) and control centers with a change in medical director (n=9) showed greater improvement than the other groups. For intervention centers with the same medical director, the median change in proportion between baseline and follow-up was greatest for mammography (P=.05). For control centers with medical director change, the largest median change was for Papanicolaou tests (P=.10).

Table 4.
Median Change in Provision of Early-Detection Services at Intervention and Control Centers: Impact of Medical Director Change During the Study Period

Figure 1 shows the change in proportions of patients up-to-date for each service at individual centers between baseline and follow-up according to subgroups determined by combining intervention and medical director change status. Changes in sigmoidoscopy over time were small across all the groups. For the other 7 services, there was a considerable range of change in proportions within each subgroup. In general, most centers improved over time on each service. Improvements across all groups are most striking for oral cavity examination.

Figure 1. Change in proportion of patients up-to-date for the indicated services between baseline and follow-up for each community health center. The position of the median in each group is indicated by the long bar.

Some changes in individual center results indicated in Figure 1 deserve specific comment. One center in the control–same medical director subgroup improved the proportion of women up-to-date for mammography by 0.75. The same center improved Papanicolaou tests by more than 0.50. Because of the age and sex distribution of patients at this center, fewer than 10 patients were eligible for these services at each interval, suggesting that these results may be unstable because of the small denominator.

For changes in breast self-examination advice, Papanicolaou test, and home fecal occult blood testing proportions indicated in Figure 1, a different single center is an outlier in the intervention–same medical director subgroup, showing decreases in the proportions of all 3 services, each exceeding 0.50.
This was the 1 center among the 31 in the intervention group that did not allow any site visit by CDN staff to assist in office system implementation.

The intervention–different medical director subgroup had 1 center that increased digital rectal examinations by more than 0.50 (from 0.24 at baseline to 0.89 at follow-up) and another center that increased oral cavity examination by more than 0.50 (from 0.40 to 0.96). At each interval, these centers had at least 25 patients eligible for these services. These centers also showed substantial improvements in other target services, suggesting that the large improvements that appear to represent outlier data may in fact reflect substantial changes in preventive services overall.

COMMENT

Seven of 8 cancer early-detection services increased in both study groups, but intervention centers improved more than control centers for only 1 service, breast self-examination advice. While these results indicate cancer early-detection progress at C/MHCs, the intervention's impact was disappointing. Why did this study, based on the same office system shown to be effective in the CPCP, demonstrate so modest an impact here? Certain differences between the studies deserve mention in terms of the practices involved, the evaluation methods, and the national health care environment.

Almost two thirds of the eligible centers participated in the current study. This participation rate differs from that in the CPCP, in which fewer than 25% of eligible practices participated. The CPCP thus addressed whether an office system intervention can work in highly motivated practices that were not necessarily representative of others in the region. The current project explores the impact of the intervention when disseminated across more representative practices that may have had a broader range of motivation, resources, and ability to cooperate.

Another difference concerns practice size. In CPCP, most practices included 1 or 2 clinicians.
The largest practice had 4. In the current project, 50 (81%) of the 62 participating practices included more than 4 clinicians. Sample size limitations preclude exploring whether smaller practices benefited more from the intervention than larger practices, but there is no question that the intervention implementation process faced more obstacles in larger centers. As an example, the implementation process in CPCP allowed research staff to work directly over time with the entire staff of each practice. The "train the trainer" approach followed here limited ongoing direct contact to practice leaders only.

Another difference concerns practice stability. In CPCP, practices had the same clinical leadership throughout. During the current study, almost half of the practices changed leadership, and those intervention practices with no leadership change performed better. The importance of medical director stability to the impact of the intervention comes as no surprise, since the implementation process was directed at center leaders. If the medical director left, the office system intervention was compromised, and the timeline of the study precluded starting over. In addition, the competing demands faced by a new medical director would make participation in an intervention project of this type a low priority.

Evaluation also differed between studies. The evaluation of the CPCP intervention focused on patients seen in the practice near the time of follow-up record review and well after the intervention was established, not on a random sample drawn from all active records as in the current study. Thus, the current study sets a higher outcome standard, looking for benefit in all patients seen during the past year, not just those most recently seen.
Patients seen recently are more likely to be high utilizers, and they are more likely to receive the full benefit of an intervention established a year earlier; in the current study, more representative patients may have been seen only once during the previous year, potentially early in the intervention period, before the intervention was firmly established.

Differences in the national health care environment between 1987 to 1989 and 1992 to 1994 deserve note as well. The first edition of the US Preventive Services Task Force Guide was published in 1989, so both intervention and control centers in the current study, but not CPCP, benefited from the resultant momentum toward preventive service guidelines. In addition, debate raged in 1987 about what services were indicated, and the current emphasis on quality improvement was in its early stages. By 1992, these developments had created a different health care environment that may have contributed to the strong secular trend observed across intervention and control practices seen here.

A surprising finding of the subgroup analysis concerns the increases in services in those control sites that had a change in medical director. This finding has several explanations worth exploring. New medical directors may be inclined to promote new initiatives, such as more preventive services, and to have the strong support of colleagues in doing so. As such, it is plausible that services would increase, at least for a time, after medical director change. Perhaps medical director change and the intervention studied here interfere with each other, with each effective alone but not together. Thus, in control sites, a change in medical director improved services, whereas with no change in medical director these centers continued to provide the same level of services. In intervention sites, medical director stability was required for the intervention to have an impact.
If the medical director changed, intervention implementation may have lost its driving force. These and other explanations regarding the impact of the practice environment on interventions in community practice require exploration in future studies.

Medical director turnover was higher in intervention sites, as were the number of patients seen per hour by clinicians and the number of changes in practice location or expansions. Were these changes related to the intervention? This seems implausible, because the clinical demands on the centers dwarf the demands of the intervention. A more likely explanation concerns the complexity of these practice environments, our limited understanding of what environmental factors influence the ability of a practice to change, and the dearth of measures that allow rigorous assessment of these factors. Typical descriptors of practice environments, such as the number of clinicians or the age and sex distribution of the patients, may not go far enough to assure that the 2 center study groups were equivalent to begin with and, even if they were, that the vicissitudes and changes in their operations during the study period did not interfere with the study. New measures that address the practice environment in detail are needed.

Strengths of this study include the high center participation rate, an intervention that shares important similarities with the Put Prevention Into Practice program, rigorous evaluation based on quality-controlled record review, and the setting in community health centers, which are important but seldom-studied clinical practices that provide vital primary care services to special populations.

Limitations include the variable speed of implementation among intervention centers. Some practices implemented an office system within 4 months, allowing the intervention more than 18 months to show an impact in patient records.
Others required 18 months, allowing less than 6 months for changes to be reflected in the records. In addition, only 12 intervention centers (39%) initiated a new flow sheet, compared with 100% of intervention practices in CPCP. A further limitation of our evaluation design was the focus on each center overall, hence our inability to address the performance of individual clinicians and our reliance on only 50 randomly selected charts to evaluate C/MHC performance.

What lessons does this study provide for family physicians? First, it confirms and extends the results of the CPCP: an office system can improve cancer early-detection services in complex community health center practices as long as they have stable clinical leadership. If a practice does not use preventive-service flow sheets and involve office staff in preventive-service routines, doing so can make a difference. Second, implementation of an office system is not a simple task. When all intervention practices as a group were compared with controls, the combination of workshops, ongoing support from an intermediary, and free office system tools failed to increase early-detection services above the encouraging secular trend observed across all 62 community health centers. If a practice is undergoing substantial staff turnover or other major changes, implementation of a preventive-services office system should wait.

What lessons does this study provide for policy makers, administrators, and researchers? Many of the C/MHCs participating in the study were dynamic environments showing a high degree of medical director, clinician, and staff turnover. Any contemplated quality-improvement initiatives need to take the dynamic nature of practices into account. More fundamental understanding of the practice environment may be required to plan and achieve quality improvements across a broad range of practices.
For example, control practices with a change in medical director showed improvement, which was contrary to intervention practices, where medical stability was required for early-detection improvement. More research is needed to determine the relationship of practice environments to changes in the quality of care, in response both to externally driven interventions as in this study and to internally driven initiatives such as a new medical director might launch. Future studies should assess the practice environment, including the role of clinical leadership, in more detail before implementing interventions.The office systems approach to improving clinical practice services continues to show promise, but it is not a panacea. Studies of representative community practices need to tailor interventions to be tested to the range of study practice environments, including the ability and motivation of the practices to cooperate.RCBohrerUS Public Health Service, Bureau of Primary Health Care, Division of Community/Migrant Health, Annual Fact Sheet 1995.Bethesda, Md: Public Health Service; 1996.JLeiferCommunity and Migrant Health Centers: Legal Handbooks for Program Planners.Washington, DC: Association of Maternal and Child Health Programs; 1991.ASardellThe US Experiment in Social Medicine: The Community Health Center Program, 1965-1986.Pittsburgh, Pa: University of Pittsburgh Press; 1988.ASardellClinical networks and clinician retention: the case of CDN.J Community Health.1996;21:437-451.BStarfieldNPoweJWeinerCost versus quality in primary care.JAMA.1994;272:1903-1908.US Dept of Health and Human Services, Public Health ServiceHealthy People 2000: National Health Promotion and Disease Prevention Objectives.Washington, DC: US Dept of Health and Human Services; 1990.PSFrameHealth maintenance in clinical practice: strategies and barriers.Am Fam Physician.1992;45:1192-1200.HFreemanRace, poverty, and cancer.J Natl Cancer 
Inst. 1991;83:526-527.
9. McPhee SJ, Bird JA, Fordham D, Rodnick JE, Osborn EH. Promoting cancer prevention activities by primary care physicians. JAMA. 1991;266:538-544.
10. Frame PS, Zimmer JG, Werth PL, Hall WJ, Eberly SW. Computer-based vs manual health maintenance tracking: a controlled trial. Arch Fam Med. 1994;3:581-588.
11. Dietrich AJ, O'Connor GT, Keller A, Carney PA, Levy D, Whaley FS. Improving cancer early detection and prevention: a community practice randomized trial. BMJ. 1992;304:687-691.
12. Davis DA, Thompson MA, Oxman AD, Haynes RB. Evidence for the effectiveness of CME: a review of 50 randomized controlled trials. JAMA. 1992;268:1111-1117.
13. Lomas J, Haynes RB. A taxonomy and critical review of tested strategies for the application of clinical practice recommendations from "official" to "individual" policy. Am J Prev Med. 1988;4(suppl 4):77-94.
14. Davis DA, Thompson MS, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education. JAMA. 1995;274:700-705.
15. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995;153:1423-1431.
16. Gemson DH, Ashford AR, Dickey LL. Putting prevention into practice: impact of a multifaceted physician education program on preventive services in the inner city. Arch Intern Med. 1995;155:2210-2216.
17. Dickey LL, Kamerow DB. The Put Prevention Into Practice Campaign: office tools and beyond. J Fam Pract. 1994;39:321-323.
18. Carney P, Dietrich AJ, Keller A, Landgraf J, O'Connor GT. Tools, teamwork, and tenacity: elements of a cancer control office system for primary care. J Fam Pract. 1992;35:388-394.
19. Dickey LL, Petitti D. Assessment of a patient-held minirecord for adult health maintenance. J Fam Pract. 1990;31:431-438.
20. Put Prevention Into Practice Education and Action Kit. Washington, DC: Dept of Health and Human Services, Public Health Service, Office of Disease Prevention and Health Promotion; 1994.
21. Dietrich AJ, Woodruff CB, Carney PA. Changing office routines to enhance preventive care: the preventive GAPS approach. Arch Fam Med. 1994;3:176-183.
22. Working Guidelines for Early Cancer Detection: Rationale and Supporting Evidence to Decrease Mortality. Bethesda, Md: National Cancer Institute; 1987.
23. Liang KY, Zeger S. Longitudinal data analysis using generalized linear models. Biometrika. 1986;73:13-22.
24. Dietrich AJ, O'Connor G, Keller A. Will community physicians participate in rigorous studies of cancer control? The methodology and recruitment in a randomized trial of physician practices. In: Engstrom PF, Rimer B, Mortenson LE, eds. Progress in Clinical and Biological Research. New York, NY: Wiley-Liss Inc; 1990:373-381.
25. Guide to Clinical Preventive Services: An Assessment of the Effectiveness of 169 Interventions: Report of the US Preventive Services Task Force. 2nd ed. Baltimore, Md: Williams & Wilkins; 1989.
26. Steckler A, Goodman RM, McLeroy KR, David S, Koch G. Measuring the diffusion of innovative health promotion programs. Am J Health Promot. 1992;6:214-224.
27. Goodman RM, McLeroy KR, Steckler AB, Hoyle RH. Development of level of institutionalization scales for health promotion programs. Health Educ Q. 1993;20:161-178.
28. Kralewski JE, Wingert TD, Barbouche MH. Assessing the culture of medical group practices. Med Care. 1996;34:377-388.

Accepted for publication June 9, 1997.
Supported by grants CA54300, CA53631, and CA23108 from the National Cancer Institute, Bethesda, Md, and a grant from the Bureau of Primary Health Care, Rockville, Md.
We wish to thank the following C/MHCs for their participation: East Harlem Council for Human Services, New York, NY; Dr Martin Luther King, Jr Health Center, New York; Comprehensive Community Development Corporation/Soundview Health Center, Bronx, NY; Montefiore Comprehensive Health Care Center, Bronx; Morris Heights Health Center, Bronx; Betances Health Unit, New York; East New York Neighborhood Family Care Center, Brooklyn; Bronx Lebanon Ambulatory Care Network, Bronx; LBJ Health Complex Inc, Brooklyn; Montefiore Family Health Center, Bronx; Sunset Park Family Health Center Network, Brooklyn; Segundo
Ruiz Belvis Diagnostic and Treatment Center, Bronx; Gouverneur Diagnostic and Treatment Center, New York; South Brooklyn Health Center, Brooklyn; Chinatown Health Clinic, New York; Montefiore Comprehensive Family Care Center, Bronx; Settlement Health and Medical Services, New York; Valentine Lane Family Health Center, Yonkers, NY; Bedford Stuyvesant Family Health Center, Brooklyn; Joseph P. Addabbo Family Health Center, Arverne, NY; Jamaica Hospital Family Care Center, Jamaica, NY; Whitney M. Young Jr Health Center, Albany, NY; Geneva B. Scruggs Health Center, Buffalo, NY; Peekskill Area Health Center, Peekskill, NY; Northwest Buffalo Community Health Center, Buffalo; Schenectady Community Health Center, Schenectady, NY; Family Health Network, Cortland, NY; Family Health Center–Orange & Ulster Counties, Newburgh, NY; Genesee Health Service, Rochester, NY; Anthony L. Jordan Health Corp, Rochester; Hudson Headwaters Health Network, Glens Falls, NY; Oak Orchard Health Center, Brockport, NY; Ossining Open Door Health Center, Ossining, NY; Westside Health Services, Rochester; Middletown Community Health Center, Middletown, NY; Greenburgh Neighborhood Family Center, White Plains, NY; CAMCare Health Corp, Camden, NJ; Plainfield Health Center, Plainfield, NJ; Jersey City Family Health Center, Jersey City, NJ; Paterson Community Health Center, Paterson, NJ; Newark Community Health Center, Newark, NJ; Dr Myra Smith Kearse Family Health Center, Vauxhall, NJ; Bridgeport Community Health Center, Bridgeport, Conn; Community Health Center Inc, Middletown, Conn; Stay Well Health Center, Waterbury, Conn; Hill Health Corp, New Haven, Conn; Fairhaven Community Health Center, New Haven; Southwest Community Health Center, Bridgeport; Community Health Services Inc, Hartford, Conn; and Charter Oak/Rice Heights Community Health Center, Hartford.
The following individuals contributed substantially to the conception and implementation of this project: Alan Meyer Perla; Michael Stehney, MD;
Anita Vaughn, MD; Barbara Menendez, PhD; Gilberto Cardona-Perez, MD, MPH; Raymond Porfilio; Marilyn Gaston, MD; David Stevens, MD; Cynthia Schlachter-Koren, MPH; Catherine Dale; Marcia Titus; Kwame Alexander; Judit Pousada; Wayne Kawadler, MPH; Merrilly Calabrese; Victor Kamensky, PhD; Laura DeMateo; Chee Jen Chang, PhD; Jennifer Lee, RN; and Joseph Stamm, MBA. Susanna Reed prepared the manuscript.
Reprints: Allen J. Dietrich, MD, Department of Community and Family Medicine, Dartmouth Medical School, Hanover, NH 03755 (e-mail: allen.dietrich@dartmouth.edu).
Archives of Family Medicine, American Medical Association

Publisher: American Medical Association
Copyright: Copyright 1998 American Medical Association. All Rights Reserved. Applicable FARS/DFARS Restrictions Apply to Government Use.
ISSN: 1063-3987
DOI: 10-1001/pubs.Arch Fam Med.-ISSN-1063-3987-7-4-foc6107
How to enhance preventive care in more representative primary care settings serving special populations is not known. The Cancer Prevention in Community Practice Project (CPCP) found that an office system intervention significantly increased the provision of 6 of 10 target cancer early-detection and prevention services. The 98 participating community practices, all in New Hampshire and Vermont, included no more than 4 clinicians each, the majority of whom were family physicians. Two of the 98 sites were C/MHCs. The Community Health Center Cancer Control Project tested an intervention derived from the CPCP office system in 62 C/MHCs and other sites providing primary care to special populations. The intervention was delivered by an intermediary organization that could potentially serve as a dissemination agent after the research trial was complete. This report summarizes the findings of this study.

PARTICIPANTS AND METHODS

After recruitment and baseline data collection in 1992, 62 centers were first stratified by urban vs nonurban location, then randomly assigned to intervention or control status. In 1994, after the intervention had been provided to centers so assigned, follow-up data were collected. Rates of cancer early-detection services determined from randomly selected patient records at each interval provided the dependent variables of interest.

THE INTERMEDIARY

The intermediary organization responsible for center recruitment and intervention delivery was Clinical Directors Network Inc (CDN). Established in 1985, CDN is a not-for-profit membership organization consisting of the medical, dental, and nursing directors as well as other clinicians practicing in federal, state, and city-funded community, migrant, homeless, and other health centers in New York, New Jersey, Puerto Rico, the US Virgin Islands, and elsewhere.
Clinical Directors Network provides training on clinical leadership and management issues, as well as clinical updates for C/MHC providers and others who serve low-income, disadvantaged, and minority populations. Clinical Directors Network is also active in practice-based clinical research.

THE C/MHCs

Participation was sought from New York, New Jersey, and western Connecticut centers. The medical directors of all 97 C/MHCs open in 1992, as well as the directors of other centers supported by states and the New York City Health and Hospitals Corporation, were approached to determine eligibility. To be eligible, a center had to have opened at least 2 years earlier and provide a broad range of primary care services to adults. Sixty-two sites were eligible and agreed to participate, 27 were eligible but declined, and the balance were ineligible because they provided a limited range of services.

THE INTERVENTION

Centers assigned to intervention status received ongoing external assistance from CDN to implement the office system for cancer early detection shown to be efficacious in the earlier CPCP. The system consisted of establishing preventive care goals and practice routines to promote meeting these goals; shared responsibility among all center clinicians and office staff for identifying patients overdue for preventive services at all feasible visits; and use of various tools such as preventive care flow sheets in the medical record, patient education materials, and external chart identifiers. In addition, patient-held health diaries were adapted from Dickey and Petitti, and removable adhesive notes were modified from CPCP. Similar tools are included in the Put Prevention Into Practice kit.

During 20 months, CDN provided office system implementation assistance to intervention centers in 3 ways: a workshop for key personnel; 1 or 2 visits to each center by CDN personnel who facilitated development and implementation of the office system; and ongoing support with free professional education materials, office system tools, and advice. The workshop described the components of the office system as well as an implementation process called the preventive GAPS approach, which allowed each center to develop an office system that fit its unique resources, patient population, and preventive service goals. Professional education materials available for loan included clinical breast examination teaching models (Mammotech Corp, Gainesville, Fla), oral cavity clinical pathology slides, and office staff training videos on communication.

THE PATIENT EVALUATION SAMPLING STRATEGY

The records of established patients who were at least 42 years of age, not terminally ill, and not previously diagnosed as having cancer provided data about cancer early detection. Further patient eligibility criteria included a first visit to the practice at least 366 days before a record review date, as well as an additional visit within the most recent 365 days. With the use of a computerized random selection procedure, records for 50 eligible patients were identified at both baseline and 2-year follow-up. Patients of the first record review were not excluded from the second. All eligible patients had an equal chance of being selected; thus, patients with many visits were no more likely to be selected than patients with just 2 visits. Patients were also selected without regard to whether they had 1 usual or assigned clinician.

DESCRIPTIVE EVALUATION DATA

For each C/MHC, a center survey was completed at baseline and follow-up by the medical director or designee. This survey ascertained characteristics of the center and the population it served. Descriptive data on the patient samples were ascertained from medical records of patients in the evaluation samples.
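The chart-eligibility and equal-probability sampling rules described under The Patient Evaluation Sampling Strategy can be sketched as follows. This is a minimal Python sketch; the record fields and function names are illustrative assumptions, not the study's actual data system.

```python
import random
from datetime import date, timedelta

def is_eligible(patient, review_date):
    """Chart eligibility for the evaluation sample: at least 42 years old,
    not terminally ill, no prior cancer diagnosis, a first visit at least
    366 days before the review date, and an additional visit within the
    most recent 365 days. Dictionary fields are hypothetical."""
    age_years = (review_date - patient["birth_date"]).days // 365
    if age_years < 42:
        return False
    if patient["terminally_ill"] or patient["prior_cancer"]:
        return False
    if (review_date - patient["first_visit"]).days < 366:
        return False
    # An additional visit (beyond the first) within the most recent 365 days.
    return any(v != patient["first_visit"]
               and 0 <= (review_date - v).days <= 365
               for v in patient["visits"])

def draw_evaluation_sample(patients, review_date, n=50, seed=0):
    """Equal-probability sample of eligible charts: patients with many
    visits are no more likely to be chosen than patients with just 2."""
    eligible = [p for p in patients if is_eligible(p, review_date)]
    return random.Random(seed).sample(eligible, min(n, len(eligible)))
```

Sampling from the filtered list of eligible charts (rather than from visit records) is what gives every eligible patient the same selection probability regardless of visit count or usual clinician.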
After patient eligibility was confirmed, records were reviewed to determine patient demographics, insurance status, and office visit history.

OUTCOME DATA

Evaluation end points included cancer early-detection services recommended by the National Cancer Institute Working Guidelines for Practicing Physicians at the time this project began. For women, these included clinical breast examination, breast self-examination advice, mammography (age 50 years and older), and Papanicolaou test. For both men and women aged 50 years and older, these included oral cavity examination, digital rectal examination, home fecal occult blood testing, and sigmoidoscopy. At both baseline and follow-up record reviews, each record was checked in reverse chronological order back 24 months (or until the patient's initial visit if the patient had registered less than 24 months earlier) for each service.

QUALITY CONTROL

Record reviews were completed by independent professional medical record abstractors on contract from a regional professional standards review organization (MedReview Inc, New York, NY). Reviewers were blinded to hypotheses and center study group. In all practices, a randomly selected subsample of 10% of the records was reviewed independently by a second reviewer and the reviews were compared. Reviewer agreement was high. Disagreements between reviewers varied from 0.2% to 5.9% across individual demographic characteristics. For target services, disagreement on whether a service was up to date ranged from 3.2% for clinical breast examination to 9.1% for sigmoidoscopy.

STATISTICAL METHODS

The center is the unit of analysis unless otherwise noted. Urban vs nonurban location was determined by ZIP code. Characteristics of centers in each study group are summarized as means or medians and compared by means of appropriate statistical tests.
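The double-review comparison described under Quality Control amounts to a simple per-item disagreement rate over the 10% subsample. A minimal sketch, with the data layout assumed for illustration:

```python
def disagreement_rate(review_a, review_b, items):
    """Percentage disagreement between two independent abstractions of the
    same charts. review_a and review_b map each item (a demographic field
    or a target service) to a list of per-chart determinations, in the
    same chart order; this layout is hypothetical."""
    rates = {}
    for item in items:
        pairs = list(zip(review_a[item], review_b[item]))
        rates[item] = 100.0 * sum(a != b for a, b in pairs) / len(pairs)
    return rates
```

Computing the rate per item mirrors how the study reports agreement separately for demographic characteristics and for each target service.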
Three centers (2 assigned to the intervention and 1 a control) were very large, hospital-based programs, employing over 50 clinicians each, and were excluded from the calculation of mean full-time equivalent clinicians, office staff, and number of examination rooms.

For the primary analysis, services were considered up-to-date if the service was provided or recommended at least once during the 24-month period before the record review date, or during the time since enrollment at the center. For each service, the performance of individual centers was determined and then combined with the results from other centers in the same study group. Thus, centers were weighted equally, although the number of evaluation sample patients at each center who were eligible for the specific individual services varied.

Nonparametric tests were used for statistical comparisons of the primary outcomes. Median proportions of patients up-to-date for target services were compared between baseline and follow-up intervals within study groups by Wilcoxon signed rank tests and between study groups by the Wilcoxon rank sum test. An alternative analysis demonstrated similar results with the use of generalized estimating equations for longitudinal data based on Poisson distributions with overdispersed variances and odds ratios calculated by means of generalized linear models. (This analysis is not reported here.) In addition to the intention-to-treat analysis, subsample analyses explored the relationship to outcomes of various center characteristics, such as a change in medical director. We considered P values less than .05 statistically significant.

RESULTS

Thirty-one centers were assigned to each study group and provided complete data. About 10% of the records reviewed for eligibility at each interval were unavailable, usually because an appointment was pending. Of the records found, the most common reason for ineligible status was too short a time as an established patient of the center.
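The up-to-date definition and the equal weighting of centers described under Statistical Methods can be sketched as follows; this is illustrative Python with hypothetical dates and counts, not the study's analysis code.

```python
from datetime import date, timedelta
from statistics import median

def up_to_date(service_dates, review_date, enrollment_date):
    """True if the service was provided or recommended at least once in
    the 24 months before the review date, or at any time since enrollment
    for patients enrolled less than 24 months earlier."""
    window_start = max(review_date - timedelta(days=730), enrollment_date)
    return any(window_start <= d <= review_date for d in service_dates)

def group_median_proportion(center_results):
    """center_results holds one (n_up_to_date, n_eligible) pair per
    center. Each center contributes a single proportion, so centers are
    weighted equally even though their eligible-patient counts differ."""
    return median(n_up / n_elig for n_up, n_elig in center_results)
```

A Wilcoxon signed rank test on the paired per-center proportions at baseline and follow-up would then correspond to the within-group comparison used in the primary analysis.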
For most centers, 50 eligible records provided usable data at each evaluation interval.

CENTER CHARACTERISTICS

Characteristics of the centers and their patient populations are provided in Table 1. A majority of patients for whom ethnicity was known were African American (35%) or Hispanic (29%). Across centers, more than one fourth of the patient sample received Medicaid. At least 11% overall had no insurance. For an additional 20%, insurance status was not recorded in the record, suggesting that they also may have lacked coverage. Continuity of care, as determined by the proportion of visits for each patient with the provider seen most often, was high. Family physicians were on the staff of 61% of the intervention C/MHCs and 71% of controls, with the remaining centers depending on clinicians trained in internal medicine, pediatrics, or obstetrics and gynecology.

Table 1. Baseline Characteristics of Community Health Centers and Their Patient Samples by Study Group

Table 2 describes the practice environments during the study. In intervention practices, medical director turnover was higher, clinicians were busier, and moves or expansions were more common, although none of these comparisons was statistically significant.

Table 2. Environmental Characteristics Present in Centers During the Study Period

Of the intervention centers, all cooperated with the 3 components of the intervention. All were represented at a workshop, 30 (97%) of 31 cooperated with at least 1 visit from CDN staff, and all made use of at least some form of follow-up assistance. Twelve (39%) of 31 centers implemented new or redesigned flow sheets.

MAIN EFFECTS ANALYSIS

Table 3 summarizes the proportion of age- and sex-appropriate patients provided indicated services within the previous 24 months overall and by study group.
Comparing baseline and follow-up proportion changes between study groups, only breast self-examination advice increased more for the intervention group (P=.009).

Table 3. Median Proportions of Patients by Center Who Received Age- and Sex-Indicated Services During the Previous 24 Months: Baseline vs Follow-up Intervals

For the 62 centers overall, 7 of 8 services showed increased proportions of patients who were up-to-date for indicated services (P<.05); only sigmoidoscopy proportions did not change. At baseline and follow-up, oral cavity examination and mammography were the most frequently provided services. Home fecal occult blood testing and sigmoidoscopy for patients aged 50 years and older and breast self-examination advice for women were the services least frequently provided.

Between baseline and follow-up in the intervention group, the median proportion of patients up-to-date increased by 0.10 or more for clinical breast examination, home fecal occult blood testing, and oral cavity examination. Within the intervention group, comparisons of baseline and follow-up proportions for clinical breast examination, breast self-examination advice, and oral cavity examination were statistically significant. The median proportion of women up-to-date for mammography increased from 0.58 to 0.65 (P=.06).

Between baseline and follow-up in the control group, the median proportion of patients up-to-date for Papanicolaou tests, fecal occult blood tests, and oral cavity and digital rectal examinations increased by 0.10 or more. Within the control group, comparisons of proportions between baseline and follow-up were significant for 4 services: clinical breast examination, Papanicolaou test, digital rectal examination, and oral cavity examination.
In the control group, the proportions of patients up-to-date for breast self-examination advice and home fecal occult blood testing improved by 0.06 (P=.06) and 0.12 (P=.06), respectively.

SUBGROUP ANALYSES

We explored the relationship of various practice characteristics to changes in early-detection services provided in intervention and control practices. No pattern of change across the 8 target services between study groups was apparent according to size, urban vs nonurban location, higher vs lower clinician or staff turnover, or center moves and expansions, based on the Wilcoxon signed rank test. For comparisons in this hypothesis-generating consideration, statistical power is limited because of small cell sizes.

Because the medical director played a central role in the intervention and medical director change occurred in almost half the centers, we explored the relationship between medical director change and the impact of the intervention. Table 4 shows median differences for indicated services between baseline and follow-up according to whether there was a change in medical director (n=26) vs whether the same medical director remained (n=36). Intervention centers with the same medical director (n=14) and control centers with a change in medical director (n=9) showed greater improvement than the other groups. For intervention centers with the same medical director, the median change in proportion between baseline and follow-up was greatest for mammography (P=.05). For control centers with medical director change, the largest median change was for Papanicolaou tests (P=.10).

Table 4. Median Change in Provision of Early-Detection Services at Intervention and Control Centers: Impact of Medical Director Change During the Study Period

Figure 1 shows the change in proportions of patients up-to-date for each service at individual centers between baseline and follow-up according to subgroups determined by combining intervention and medical director change status. Changes in sigmoidoscopy over time were small across all the groups. For the other 7 services, there was a considerable range of change in proportions within each subgroup. In general, most centers improved over time on each service. Improvements across all groups are most striking for oral cavity examination.

Figure 1. Change in proportion of patients up-to-date for the indicated services between baseline and follow-up for each community health center. The position of the median in each group is indicated by the long bar.

Some changes in individual center results indicated in Figure 1 deserve specific comment. One center in the control–same medical director subgroup improved the proportion of women up-to-date for mammography by 0.75. The same center improved Papanicolaou tests by more than 0.50. Because of the age and sex distribution of patients at this center, fewer than 10 patients were eligible for these services at each interval, suggesting that these results may be unstable because of the small denominator.

For changes in breast self-examination advice, Papanicolaou test, and home fecal occult blood testing proportions indicated in Figure 1, a different single center is an outlier in the intervention–same medical director subgroup, showing decreases in the proportions of all 3 services, each exceeding 0.50.
This was the 1 center among the 31 in the intervention group that did not allow any site visit by CDN staff to assist in office system implementation.

The intervention–different medical director subgroup had 1 center that increased digital rectal examinations by more than 0.50 (from 0.24 at baseline to 0.89 at follow-up) and another center that increased oral cavity examination by more than 0.50 (from 0.40 to 0.96). At each interval, these centers had at least 25 patients eligible for these services. These centers also showed substantial improvements in other target services, suggesting that the large improvements that appear to represent outlier data may in fact reflect substantial changes in preventive services overall.

COMMENT

Seven of 8 cancer early-detection services increased in both study groups, but intervention centers improved more than control centers for only 1 service, breast self-examination advice. While these results indicate cancer early-detection progress at C/MHCs, the intervention's impact was disappointing.

Why did this study, based on the same office system shown to be effective in the CPCP, demonstrate so modest an impact here? Certain differences between the studies deserve mention in terms of the practices involved, the evaluation methods, and the national health care environment.

Almost two thirds of the eligible centers participated in the current study. This participation rate differs from that in the CPCP, in which fewer than 25% of eligible practices participated. The CPCP thus addressed whether an office system intervention can work in highly motivated practices that were not necessarily representative of others in the region. The current project explores the impact of the intervention when disseminated across more representative practices that may have had a broader range of motivation, resources, and ability to cooperate.

Another difference concerns practice size. In CPCP, most practices included 1 or 2 clinicians.
The largest practice had 4. In the current project, 50 (81%) of the 62 participating practices included more than 4 clinicians. Sample size limitations preclude exploring whether smaller practices benefited more from the intervention than larger practices, but there is no question that the intervention implementation process faced more obstacles in larger centers. As an example, the implementation process in CPCP allowed research staff to work directly over time with the entire staff of each practice. The "train the trainer" approach followed here limited ongoing direct contact to practice leaders only.

Another difference concerns practice stability. In CPCP, practices had the same clinical leadership throughout. During the current study, almost half of the practices changed leadership. Those intervention practices with no leadership change performed better. The importance of medical director stability to the impact of the intervention comes as no surprise, since the implementation process was directed at center leaders. If the medical director left, the office system intervention was compromised. The timeline of the study precluded starting over. In addition, the competing demands faced by a new medical director would make participation in an intervention project of this type a low priority.

Evaluation also differed between studies. The evaluation of the CPCP intervention focused on patients seen in the practice near the time of follow-up record review and well after the intervention was established, not on a random sample drawn from all active records as in the current study. Thus, the current study sets a higher outcome standard by looking for benefit in all patients seen during the past year, not just those most recently seen.
Patients seen recently are more likely to be high utilizers, and they are more likely to receive the full benefit of an intervention established a year earlier; in the current study, more representative patients may have been seen only once during the previous year, potentially early in the intervention period, before the intervention was firmly established.

Differences in the national health care environment between 1987 to 1989 and 1992 to 1994 deserve note as well. The first edition of the US Preventive Services Task Force Guide was published in 1989, so both intervention and control centers in the current study, but not CPCP, benefited from the resultant momentum toward preventive service guidelines. In addition, debate raged in 1987 about what services were indicated, and the current emphasis on quality improvement was in its early stages. By 1992, these developments had created a different health care environment that may have contributed to the strong secular trend observed across intervention and control practices seen here.

A surprising finding of the subgroup analysis concerns the increases in services in those control sites that had a change in medical director. This finding has several explanations worth exploring. New medical directors may be inclined to promote new initiatives, such as more preventive services, and to have the strong support of colleagues in doing so. As such, it is plausible that services would increase, at least for a time, after medical director change. Perhaps medical director change and the intervention studied here interfere with each other, with each effective alone but not together. Thus, in control sites, a change in medical director improved services, whereas with no change in medical director these centers continued to provide the same level of services. In intervention sites, medical director stability was required for the intervention to have an impact.
If the medical director changed, intervention implementation may have lost its driving force. These and other explanations regarding the impact of the practice environment on interventions in community practice require exploration in future studies.

Medical director turnover was higher in intervention sites, as were the number of patients seen per hour by clinicians and the number of changes in practice location or expansions. Were these changes related to the intervention? This seems implausible, because the clinical demands on the centers dwarf the demands of the intervention. A more likely explanation concerns the complexity of these practice environments, our limited understanding of what environmental factors influence the ability of a practice to change, and the dearth of measures that allow rigorous assessment of these factors. Typical descriptors of practice environments, such as the number of clinicians or the age and sex distribution of the patients, may not go far enough to ensure that the 2 center study groups were equivalent to begin with and, even if they were, that the vicissitudes and changes in their operations during the study period did not interfere with the study. New measures that address the practice environment in detail are needed.

Strengths of this study include the high center participation rate; an intervention that shares important similarities with the Put Prevention Into Practice program; rigorous evaluation based on quality-controlled record review; and the setting in community health centers, which are important but seldom-studied clinical practices that provide vital primary care services to special populations.

Limitations include the variable speed of implementation among intervention centers. Some practices implemented an office system within 4 months, allowing the intervention more than 18 months to show an impact in patient records.
Others required 18 months, allowing less than 6 months for changes to be reflected in the records. In addition, only 12 intervention centers (39%) initiated a new flow sheet, compared with 100% of intervention practices in the CPCP. A further limitation of our evaluation design was the focus on each center overall, hence our inability to address the performance of individual clinicians, and the reliance on only 50 randomly selected charts to evaluate C/MHC performance.

What lessons does this study provide for family physicians? First, it confirms and extends the results of the CPCP, showing that an office system can improve cancer early-detection services in complex community health center practices as long as they have stable clinical leadership. If a practice does not use preventive-service flow sheets and involve office staff in preventive-service routines, doing so can make a difference. Second, implementation of an office system is not a simple task. When all intervention practices as a group were compared with controls, the combination of workshops, ongoing support from an intermediary, and free office system tools failed to increase early-detection services above the encouraging secular trend observed across all 62 community health centers. If a practice is undergoing substantial staff turnover or other major changes, implementation of a preventive-services office system should wait.

What lessons does this study provide for policy makers, administrators, and researchers? Many of the C/MHCs participating in the study were dynamic environments showing a high degree of medical director, clinician, and staff turnover. Any contemplated quality-improvement initiatives need to take the dynamic nature of practices into account. A more fundamental understanding of the practice environment may be required to plan and achieve quality improvements across a broad range of practices.
For example, control practices with a change in medical director showed improvement, in contrast to intervention practices, where medical director stability was required for early-detection improvement. More research is needed to determine the relationship of practice environments to changes in the quality of care, in response both to externally driven interventions, as in this study, and to internally driven initiatives such as a new medical director might launch. Future studies should assess the practice environment, including the role of clinical leadership, in more detail before implementing interventions.

The office systems approach to improving clinical practice services continues to show promise, but it is not a panacea. Studies of representative community practices need to tailor the interventions to be tested to the range of study practice environments, including the ability and motivation of the practices to cooperate.

References

1. Bohrer RC. US Public Health Service, Bureau of Primary Health Care, Division of Community/Migrant Health, Annual Fact Sheet 1995. Bethesda, Md: Public Health Service; 1996.
2. Leifer J. Community and Migrant Health Centers: Legal Handbooks for Program Planners. Washington, DC: Association of Maternal and Child Health Programs; 1991.
3. Sardell A. The US Experiment in Social Medicine: The Community Health Center Program, 1965-1986. Pittsburgh, Pa: University of Pittsburgh Press; 1988.
4. Sardell A. Clinical networks and clinician retention: the case of CDN. J Community Health. 1996;21:437-451.
5. Starfield B, Powe N, Weiner J. Cost versus quality in primary care. JAMA. 1994;272:1903-1908.
6. US Dept of Health and Human Services, Public Health Service. Healthy People 2000: National Health Promotion and Disease Prevention Objectives. Washington, DC: US Dept of Health and Human Services; 1990.
7. Frame PS. Health maintenance in clinical practice: strategies and barriers. Am Fam Physician. 1992;45:1192-1200.
8. Freeman H. Race, poverty, and cancer. J Natl Cancer Inst. 1991;83:526-527.
9. McPhee SJ, Bird JA, Fordham D, Rodnick JE, Osborn EH. Promoting cancer prevention activities by primary care physicians. JAMA. 1991;266:538-544.
10. Frame PS, Zimmer JG, Werth PL, Hall WJ, Eberly SW. Computer-based vs manual health maintenance tracking: a controlled trial. Arch Fam Med. 1994;3:581-588.
11. Dietrich AJ, O'Connor GT, Keller A, Carney PA, Levy D, Whaley FS. Improving cancer early detection and prevention: a community practice randomized trial. BMJ. 1992;304:687-691.
12. Davis DA, Thompson MA, Oxman AD, Haynes RB. Evidence for the effectiveness of CME: a review of 50 randomized controlled trials. JAMA. 1992;268:1111-1117.
13. Lomas J, Haynes RB. A taxonomy and critical review of tested strategies for the application of clinical practice recommendations from "official" to "individual" policy. Am J Prev Med. 1988;4(suppl 4):77-94.
14. Davis DA, Thompson MS, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education. JAMA. 1995;274:700-705.
15. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995;153:1423-1431.
16. Gemson DH, Ashford AR, Dickey LL. Putting prevention into practice: impact of a multifaceted physician education program on preventive services in the inner city. Arch Intern Med. 1995;155:2210-2216.
17. Dickey LL, Kamerow DB. The Put Prevention Into Practice Campaign: office tools and beyond. J Fam Pract. 1994;39:321-323.
18. Carney P, Dietrich AJ, Keller A, Landgraf J, O'Connor GT. Tools, teamwork, and tenacity: elements of a cancer control office system for primary care. J Fam Pract. 1992;35:388-394.
19. Dickey LL, Petitti D. Assessment of a patient-held minirecord for adult health maintenance. J Fam Pract. 1990;31:431-438.
20. Put Prevention Into Practice Education and Action Kit. Washington, DC: Dept of Health and Human Services, Public Health Service, Office of Disease Prevention and Health Promotion; 1994.
21. Dietrich AJ, Woodruff CB, Carney PA. Changing office routines to enhance preventive care: the preventive GAPS approach. Arch Fam Med. 1994;3:176-183.
22. Working Guidelines for Early Cancer Detection: Rationale and Supporting Evidence to Decrease Mortality. Bethesda, Md: National Cancer Institute; 1987.
23. Liang KY, Zeger S. Longitudinal data analysis using generalized linear models. Biometrika. 1986;73:13-22.
24. Dietrich AJ, O'Connor G, Keller A. Will community physicians participate in rigorous studies of cancer control? The methodology and recruitment in a randomized trial of physician practices. In: Engstrom PF, Rimer B, Mortenson LE, eds. Progress in Clinical and Biological Research. New York, NY: Wiley-Liss Inc; 1990:373-381.
25. Guide to Clinical Preventive Services: An Assessment of the Effectiveness of 169 Interventions: Report of the US Preventive Services Task Force. 2nd ed. Baltimore, Md: Williams & Wilkins; 1989.
26. Steckler A, Goodman RM, McLeroy KR, David S, Koch G. Measuring the diffusion of innovative health promotion programs. Am J Health Promot. 1992;6:214-224.
27. Goodman RM, McLeroy KR, Steckler AB, Hoyle RH. Development of level of institutionalization scales for health promotion programs. Health Educ Q. 1993;20:161-178.
28. Kralewski JE, Wingert TD, Barbouche MH. Assessing the culture of medical group practices. Med Care. 1996;34:377-388.

Accepted for publication June 9, 1997.

Supported by grants CA54300, CA53631, and CA23108 from the National Cancer Institute, Bethesda, Md, and a grant from the Bureau of Primary Health Care, Rockville, Md.

We wish to thank the following C/MHCs for their participation: East Harlem Council for Human Services, New York, NY; Dr Martin Luther King, Jr Health Center, New York; Comprehensive Community Development Corporation/Soundview Health Center, Bronx, NY; Montefiore Comprehensive Health Care Center, Bronx; Morris Heights Health Center, Bronx; Betances Health Unit, New York; East New York Neighborhood Family Care Center, Brooklyn; Bronx Lebanon Ambulatory Care Network, Bronx; LBJ Health Complex Inc, Brooklyn; Montefiore Family Health Center, Bronx; Sunset Park Family Health Center Network, Brooklyn; Segundo
Ruiz Belvis Diagnostic and Treatment Center, Bronx; Gouverneur Diagnostic and Treatment Center, New York; South Brooklyn Health Center, Brooklyn; Chinatown Health Clinic, New York; Montefiore Comprehensive Family Care Center, Bronx; Settlement Health and Medical Services, New York; Valentine Lane Family Health Center, Yonkers, NY; Bedford Stuyvesant Family Health Center, Brooklyn; Joseph P. Addabbo Family Health Center, Arverne, NY; Jamaica Hospital Family Care Center, Jamaica, NY; Whitney M. Young F. Health Center, Albany, NY; Geneva B. Scruggs Health Center, Buffalo, NY; Peekskill Area Health Center, Peekskill, NY; Northwest Buffalo Community Health Center, Buffalo; Schenectady Community Health Center, Schenectady, NY; Family Health Network, Cortland, NY; Family Health Center–Orange & Ulster Counties, Newburgh, NY; Genesee Health Service, Rochester, NY; Anthony L. Jordan Health Corp, Rochester; Hudson Headwaters Health Network, Glens Falls, NY; Oak Orchard Health Center, Brockport, NY; Ossining Open Door Health Center, Ossining, NY; Westside Health Services, Rochester; Middletown Community Health Center, Middletown, NY; Greenburgh Neighborhood Family Center, White Plains, NY; CAMCare Health Corp, Camden, NJ; Plainfield Health Center, Plainfield, NJ; Jersey City Family Health Center, Jersey City, NJ; Paterson Community Health Center, Paterson, NJ; Newark Community Health Center, Newark, NJ; Dr Myra Smith Kearse Family Health Center, Vauxhall, NJ; Bridgeport Community Health Center, Bridgeport, Conn; Community Health Center Inc, Middletown, Conn; Stay Well Health Center, Waterbury, Conn; Hill Health Corp, New Haven, Conn; Fairhaven Community Health Center, New Haven; Southwest Community Health Center, Bridgeport; Community Health Services Inc, Hartford, Conn; and Charter Oak/Rice Heights Community Health Center, Hartford.

The following individuals contributed substantially to the conception and implementation of this project: Alan Meyer Perla; Michael Stehney, MD;
Anita Vaughn, MD; Barbara Menendez, PhD; Gilberto Cardona-Perez, MD, MPH; Raymond Porfilio; Marilyn Gaston, MD; David Stevens, MD; Cynthia Schlachter-Koren, MPH; Catherine Dale; Marcia Titus; Kwame Alexander; Judit Pousada; Wayne Kawadler, MPH; Merrilly Calabrese; Victor Kamensky, PhD; Laura DeMateo; Chee Jen Chang, PhD; Jennifer Lee, RN; and Joseph Stamm, MBA. Susanna Reed prepared the manuscript.

Reprints: Allen J. Dietrich, MD, Department of Community and Family Medicine, Dartmouth Medical School, Hanover, NH 03755 (e-mail: allen.dietrich@dartmouth.edu).


Archives of Family Medicine, American Medical Association

Published: Jul 1, 1998
