Introduction: Physicians are vital to health-care delivery, but assessing their impact on care can be challenging given limited data. Historically, health services researchers have obtained physician characteristics data from the American Medical Association (AMA) Physician Masterfile. The Center for Medicare and Medicaid Services' Medicare Data on Provider Practice and Specialty (MD-PPAS) file was assessed as an alternative source of physician data, particularly in the context of cancer health services research.

Methods: We used physician National Provider Identifiers in the MD-PPAS data (2008–2014) to identify physicians in the AMA data current as of July 18, 2016. Within each source, we grouped physicians into six broad specialty groups. Percent agreement and Cohen's kappa coefficient (κ) were calculated for age, sex, specialty, and practice state.

Results: Among the 698 202 included physicians, there was excellent agreement for age (percent agreement = 97.7%, κ = 0.97) and sex (99.4%, κ = 0.99) and good agreement for specialty (86.1%, κ = 0.80). Within specialty, using AMA as the reference, agreement was lowest for oncologists (77%). Approximately 85.9% of physicians reported the same practice state in both data sets.

Conclusion: Although AMA data have been commonly used to account for physician-level factors in health services research, MD-PPAS data provide researchers with an alternative option depending on study needs. MD-PPAS data may be optimal if nonphysicians, provider utilization, practice characteristics, and/or temporal changes are of interest. In contrast, the AMA data may be optimal if more granular specialty, physician training, and/or a broader inclusion of physicians is of interest.
Received: September 19, 2019; Revised: November 4, 2019; Accepted: November 18, 2019
Published by Oxford University Press 2020. This work is written by US Government employees and is in the public domain in the US.
J Natl Cancer Inst Monogr, 2020, Vol. 2020, No. 55. D. P. White et al.
Downloaded from https://academic.oup.com/jncimono/article/2020/55/66/5837287 by DeepDyve user on 19 July 2022

Physicians are vital members of health-care delivery teams and have a sizeable influence on patient treatment decisions and, by extension, outcomes (1). Using their knowledge of available local resources, such as cancer specialists and high-volume cancer treatment centers offering clinical trials, physicians can make recommendations that affect cancer outcomes. Patients with access to specialty care have better outcomes than patients lacking access (2,3). Therefore, to more completely understand care delivery, health services researchers often want to study characteristics of the treating physician(s), especially physician specialty, in addition to patient and organizational factors. However, information related to physicians and their treatment patterns is limited.

Analyses of the National Cancer Institute's Surveillance, Epidemiology, and End Results (SEER)-Medicare data, which include detailed information about tumor characteristics, cancer treatments, patient-specific factors, aggregate socioeconomic data, and descriptors of the care setting, have greatly contributed to the understanding of cancer health-care delivery in the United States (4). For cancer health services researchers interested in including physician specialty in their analyses, specialty codes are included on Medicare claims, but previous studies have indicated that this source of specialty information is suboptimal (5,6). As a result, researchers using SEER-Medicare typically have linked to the American Medical Association's
(AMA) Physician Masterfile (herein referred to as "AMA" data) when physician characteristics were of interest (7). Another resource for physician data, the Medicare Data on Provider Practice and Specialty (herein referred to as "MD-PPAS" data), is now available (8). The MD-PPAS data contain physician information collected by the Center for Medicare and Medicaid Services (CMS). To date, there has been limited assessment of the utility of the MD-PPAS data for health services research and, to our knowledge, no assessment involving SEER-Medicare data (9).

The primary objective of this study was to provide a comparison of the MD-PPAS and AMA data sources, with a focus on the utility of these data sources for cancer health services research. The study aims included 1) identification of common and unique variables in each data source; 2) the quantification of agreement for the common variables, particularly physician specialty; and 3) an exploration of when each data source may be the optimal choice for an analysis. We used the results of the primary analysis to determine the merits of including the MD-PPAS data in the SEER-Medicare database.

Methods

Data Sources

MD-PPAS.
The MD-PPAS file was initially created in 2008 through a collaboration between CMS and the Office of the Assistant Secretary for Planning and Evaluation to assist researchers with assigning individual providers to tax identification number (TIN)-based group practices and determining physician specialty (8). Subsequently, annual files have been created. Three data sources are used to create the MD-PPAS files: the National Plan and Provider Enumeration System, which is the national directory through which CMS assigns each health-care provider a unique National Provider Identifier (NPI); the Provider Enrollment, Chain, and Ownership System (PECOS), which health-care providers use to enroll as a Medicare provider; and claims submitted to Medicare by providers.

Physician and nonphysician providers are eligible for inclusion in the MD-PPAS file if they have a valid NPI and have submitted at least one Medicare fee-for-service (FFS) physician claim during a given calendar year. Specifically, only providers who submitted Medicare claims for evaluation and management visits, procedures, imaging, or nonlaboratory tests are included in the MD-PPAS data (8).

The MD-PPAS files include information pertaining to each provider's birth date, sex, specialty, practice TIN, geographic location, and utilization summary measures. Specialty is determined from two sources: the self-validated PECOS data and, if specialty is missing in PECOS data, the most frequent specialty reported on the Medicare claims. The utilization measures are aggregate measures of the amount of Medicare services billed during the previous 2 years, including the number of line items, total allowed charges, and number of unique beneficiaries cared for (Table 1). For this analysis, we included MD-PPAS data (version 2.0) for the years 2008 through 2014; this was the latest version available at study commencement. Compared with the earlier version, MD-PPAS version 2.0 included additional individual providers (all nonphysician providers and all providers residing in Puerto Rico), annual provider-level measures (number of Part B line items, total Medicare allowed charges, and the number of unique patients overall and by TIN), and monthly TIN indicators to help distinguish between a change in provider practice location vs maintenance of concurrent positions in multiple practice locations (8). There was also a change in methodology used to assign providers to TINs, their primary geographic location, and to the hospitalist specialty group.

AMA Physician Masterfile.
The AMA Physician Masterfile was established in 1906 for AMA membership record-keeping (7). Over time it has become a database including information on all living and deceased medical doctors and doctors of osteopathy who are AMA members, are currently in or have completed an accredited residency training program, or have a valid US state license. The AMA also collects information for foreign and international medical graduates who live in the United States and have a valid US state license.

The AMA file includes information pertaining to each resident and attending physician's birth date, sex, specialty, primary type of practice, geographic location, medical education, postgraduate training, and professional certification (Table 1). Physician data are obtained or verified through credentialing institutions and organizations. Specialty information is obtained from self-report on physician surveys and information provided to the AMA from Graduate Medical Education programs. Geographic information is provided at the time the physician is included in the AMA data, although individual physicians can update practice data, including practice state and primary type of practice, at any time through the AMA website. For this analysis, we used AMA data current as of July 18, 2016.

Study Population

We identified over 1.2 million unique providers, by their NPIs, in the MD-PPAS data (Figure 1). Given our interest in assessing physician characteristics, we excluded nonphysician providers (MD-PPAS broad specialty code "7") and retained only physicians (n = 734 318) (8). We then requested AMA data for these physicians, of which 36 043 (5%) were not found in the AMA data. Next, we excluded from the analytic data set physicians who had inconsistent specialty information across the annual MD-PPAS files (n = 73) because this would have precluded accurately assigning their specialty. The final sample included 698 202 physicians.

Variables of Interest

The analysis focused on physician demographics, state where the physician practiced, and specialty. These variables were assessed because they were found in both data sources ("common" variables). Table 1 lists the common variables as well as other variables available in each data source that may be of interest to researchers.

Table 1. Comparison of variables included in the MD-PPAS file and AMA Physician Masterfile

Common variables
  Birth date
    MD-PPAS: physician report*; updated at least every 5 y
    AMA: physician survey§, medical and training institutions, state and federal agencies, ECFMG; updated continuously
  Sex
    MD-PPAS: physician report†; updated continuously
    AMA: medical school (AAMC); updated continuously
  Primary and secondary specialties
    MD-PPAS: physician report*‡; updated at least every 5 y
    AMA: physician survey§; updated continuously
  Practice state
    MD-PPAS: derived from Zip code reported by physician on Medicare claims; updated continuously
    AMA: physician survey§, correspondence from hospitals, government agencies, medical societies, specialty boards, licensing agencies; updated continuously

Unique variables
  Utilization measures‖ (MD-PPAS only)
    Medicare claims; updated annually based on prior 2 y
  Primary type of practice¶ (AMA only)
    Physician survey§; updated continuously or daily
  Medical school ID** (AMA only)
    Medical school (AAMC); updated as needed to correct mistakes
  Medical school year of graduation (AMA only)
    Medical school (AAMC); updated as needed to correct mistakes or if a physician has a delay in graduation
  Medical training institution code†† (AMA only)
    Training institutions (ACGME, National GME Census); updated as needed to correct mistakes or if a physician begins another residency program
  US trained (AMA only)
    Training institutions (ACGME, National GME Census, ECFMG); updated as needed to correct mistakes

AAMC = American Association of Medical Colleges; ACGME = Accreditation Council for Graduate Medical Education; AMA = American Medical Association; ECFMG = Educational Commission for Foreign Medical Graduates; GME = Graduate Medical Education; MD-PPAS = Medicare Data on Provider Practice and Specialty; N/A = not applicable; NPI = National Provider Identifier; NPPES = National Plan and Provider Enumeration System; PECOS = Provider Enrollment, Chain, and Ownership System.
*PECOS. †NPPES. ‡If not enrolled in PECOS, based on the specialty most frequently (primary) or the second most frequently (secondary) reported in Medicare out-patient (Part B) claims. §Annual census of physicians. ‖Number of line items billed by an NPI, total allowed charges billed by NPI, and number of unique beneficiaries for whom the NPI billed. ¶Indicating whether the physician's primary activity is direct patient care, teaching, administration, etc. **The code for the medical school where the physician graduated. ††The institution where the physician is or was in graduate training.

Demographics.
The MD-PPAS and AMA data include two demographic variables that are frequently used in health services research: birth date, which is used to determine physician age, and sex. We calculated age as 2014 minus the year of birth in each data set because 2014 was the last year of data available in the version 2.0 MD-PPAS data. Then we created the following age groups: younger than 40 years, 40–49 years, 50–59 years, 60–69 years, age 70 years and older, and missing. Sex was categorized as male, female, or missing.

Practice State.
Both data sources collect information on practice state. For the MD-PPAS data, state information may change across the annual data files. Therefore, we assigned state according to the most recent MD-PPAS data year available for each physician.

Physician Specialty.
Both the MD-PPAS and AMA data have primary and secondary specialty information.
Only one specialty is reported per specialty field in the MD-PPAS data (eg, specialty 1: oncology; specialty 2: internal medicine). However, physicians can report one or more specialties per specialty field in the AMA data (eg, specialty 1: infectious diseases; specialty 2: internal medicine or emergency medicine or critical care medicine). Given the different specialty classification systems, for comparison, we created a broad specialty classification scheme that included oncology, surgery (not oncology), radiology (not oncology), primary care, other, and missing (Supplemental Table 1). Oncologists were singled out because they are a frequent focus in cancer health services research.

[Figure 1. Flow diagram: MD-PPAS provider NPIs, N = 1 225 719; nonphysician provider NPIs removed, n = 491 401; MD-PPAS unique physician NPIs, n = 734 318; NPIs not found in AMA data, n = 36 043; NPIs in both MD-PPAS and AMA data, n = 698 275; inconsistent MD-PPAS specialty over time, n = 73; total matching NPIs remaining, n = 698 202. Caption: Process of linking the physician National Provider Identifiers (NPIs) included in the Medicare Data on Provider Practice and Specialty (MD-PPAS) files and the American Medical Association (AMA) Physician Masterfile. MD-PPAS files were from 2008–2014 and AMA data was current as of July 18, 2016. N = all observations available; n = subset of observations included/excluded.]

Physicians were assigned to a broad specialty within each data source based on the following hierarchical scheme that considered all primary and secondary specialty codes equally. Any physician reporting any type of oncology specialty was assigned to the oncology category. For example, a physician who reported the specialties medical oncology and internal medicine would be classified as an oncologist. For the remaining physicians, any physician with a surgery specialty was assigned to the surgery category. Next, of those who remained, any physician with a radiology specialty was assigned to the radiology category. Any remaining physicians who reported a primary care specialty were assigned to the primary care category. Finally, all remaining physicians with a specified specialty were assigned to the other category. Physicians who had an unspecified or undefined specialty were categorized as missing.

Statistical Analyses

The percent agreement and Cohen's kappa coefficient (κ) were calculated for each of the common variables. We included missing values in the calculations for all variables except sex because there were no missing sex values in the AMA data. We performed a sensitivity analysis restricting the data to only physicians who were classified as providing direct patient care in the AMA data (type of practice code = 020; N = 596 547). All statistics were calculated using SAS version 9.4 (SAS Institute, Cary, NC).
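The hierarchical broad-specialty assignment and the agreement statistics described above were computed in SAS; purely as an illustration, the same logic can be sketched in Python. The small specialty-to-group crosswalk below is a hypothetical stand-in for the study's actual crosswalk (its Supplemental Table 1):

```python
from collections import Counter

# Hypothetical fragment of a specialty-to-broad-group crosswalk; the study's
# full mapping is given in its Supplemental Table 1.
BROAD_GROUP = {
    "medical oncology": "oncology",
    "radiation oncology": "oncology",
    "general surgery": "surgery",
    "diagnostic radiology": "radiology",
    "internal medicine": "primary care",
    "family medicine": "primary care",
}

# Hierarchy applied over all primary and secondary codes, considered equally.
HIERARCHY = ["oncology", "surgery", "radiology", "primary care"]

def assign_broad_specialty(specialties):
    """Return one broad group for a physician's reported specialties."""
    groups = {BROAD_GROUP.get(s.strip().lower(), "other")
              for s in specialties if s and s.strip()}
    if not groups:
        return "missing"        # unspecified or undefined specialty
    for group in HIERARCHY:
        if group in groups:     # highest-priority group wins
            return group
    return "other"

def agreement_and_kappa(pairs):
    """Percent agreement and Cohen's kappa for paired classifications,
    one (source_a, source_b) tuple per physician."""
    n = len(pairs)
    observed = sum(a == b for a, b in pairs) / n
    counts_a = Counter(a for a, _ in pairs)
    counts_b = Counter(b for _, b in pairs)
    # Chance agreement from the two marginal distributions.
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return observed, (observed - expected) / (1 - expected)
```

Consistent with the rule stated in the text, a physician reporting both medical oncology and internal medicine is assigned to the oncology group because oncology sits highest in the hierarchy.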
Results

Of the 698 202 physicians found in both MD-PPAS and AMA data, more than one-half were between 40 and 59 years old (MD-PPAS: 51.9%; AMA: 52.3%) and more than 70% were male (MD-PPAS: 70.4%; AMA: 70.6%) (Tables 2 and 3). There was excellent agreement between the two data sources for age group (percent agreement: 97.7%, κ = 0.97) and sex (percent agreement: 99.4%, κ = 0.99). In both data sources, primary care was the most common specialty (MD-PPAS: 39.3%; AMA: 42.6%) and oncology the least common specialty (MD-PPAS: 2.9%; AMA: 3.4%) (Table 4). Overall, specialty agreement between the two data sources was good (percent agreement: 86.1%, κ = 0.80). Agreement within specialty category (eg, among physicians identified as having a given specialty in AMA, the percentage of physicians who were similarly classified in MD-PPAS) was lower for oncologists (77%) and primary care physicians (82%) than for surgeons (94%) and radiologists (98%). The same practice state was reported in MD-PPAS and AMA data for 85.9% of physicians (Table 5). Of note, 7.9% of physicians were missing practice state information in AMA data vs only 0.02% in MD-PPAS data. In the sensitivity analysis including only physicians involved in direct patient care, the level of agreement was similar (data not shown).

Table 2. Agreement of physician age as classified using the Medicare Data on Provider Practice and Specialty (MD-PPAS) file compared to the American Medical Association (AMA) Physician Masterfile (n = 698 202)*

AMA age, y   MD-PPAS age, y
             <40       40–49     50–59     60–69     70+      Missing   Row total n (%)
<40          135 578   106       23        12        <11      >253      135 983 (19.5)
40–49        133       183 449   82        >11       <11      1231      184 916 (26.5)
50–59        83        106       178 095   >121      <11      2127      180 543 (25.9)
60–69        62        35        80        133 765   83       3760      137 785 (19.7)
70+          86        101       81        154       51 567   5538      57 527 (8.2)
Missing      945       311       113       51        >17      <11       1448 (0.2)
Column total n (%): 136 887 (19.6); 184 108 (26.4); 178 474 (25.6); 134 118 (19.2); 51 695 (7.4); 12 920 (1.9); overall 698 202 (100)

*MD-PPAS files from 2008–2014 and AMA data as of July 18, 2016. Cell sizes <11 masked for confidentiality concerns. Percent agreement: 97.7%; kappa: 0.97.

Discussion

Physician information from the two data sources was found to have excellent agreement with respect to physician age and sex. Despite the MD-PPAS data being limited to a select group of physicians (eg, those who submitted a Medicare FFS claim for direct patient care), the distribution of physician demographics did not appear to be exceptionally biased. A 2014 census of active physicians by the Federation of State Medical Boards showed 48% of responding physicians were 40–59 years old and 66% were male (10). There was also 85.9% agreement between the two data sources for state of practice. This was somewhat surprising given that updates to practice geographic information are obtained by voluntary physician report to the AMA (11). In a 2015 physician workforce study, less than one-half of physicians were reported to be practicing in the state where they completed their residency (12).

There was good agreement for overall specialty and within specialty category for surgery and radiology. However, the results indicate underascertainment of oncologists and primary care physicians in the MD-PPAS data compared with AMA data. Although the level of ascertainment of oncologists was lower in the MD-PPAS data, the inclusion of specialty data from PECOS appears to capture more oncologists than if ascertainment relied solely on physician Medicare claims data. A previous study indicated that only 60% of physicians classified as oncologists in the AMA data would be similarly classified based on Medicare claims data alone (6).

When deciding which data source would be best for an analysis, researchers should consider factors such as the completeness of common variables and data acquisition costs. Given the good agreement between the two sources on physician age and sex, if these are the only variables of interest, the MD-PPAS may be a more economical option, especially if investigators are requesting only a few years of data. The price structure for obtaining AMA data is based on the number of NPIs compared with the number of years for MD-PPAS data. If practice state is germane to the research question, MD-PPAS may also be the better choice because practice state is likely more current and missing data were less common than in the AMA data. However, AMA data may be a better option if the focus of the study is to ascertain a complete cohort of oncologists or primary care physicians given the lower potential to identify oncologists and primary care physicians in the MD-PPAS data.
Table 3. Agreement of physician sex as classified using the Medicare Data on Provider Practice and Specialty (MD-PPAS) file compared to the American Medical Association (AMA) Physician Masterfile (n = 698 202)*

AMA sex   MD-PPAS sex
          Female    Male      Missing   Row total n (%)
Female    203 000   1826      111       204 937 (29.4)
Male      2094      489 934   1237      493 265 (70.6)
Missing   0         0         0         0 (0)
Column total n (%): 205 094 (29.4); 491 760 (70.4); 1348 (0.2); overall 698 202 (100)

*MD-PPAS files from 2008–2014 and AMA data as of July 18, 2016. When physicians with missing sex classification in the MD-PPAS files were excluded, percent agreement: 99.4%; kappa: 0.99.

Table 4. Agreement of physician specialty as classified using the Medicare Data on Provider Practice and Specialty (MD-PPAS) file compared to the American Medical Association (AMA) Physician Masterfile (n = 698 202)*

AMA specialty   MD-PPAS specialty
                Oncology   Surgery   Radiology   Primary care   Other     Missing   Row total n (%)
Oncology        18 394     1074      160         3297           >938      <11       23 874 (3.4)
Surgery         542        106 706   236         2908           3422      59        113 873 (16.3)
Radiology       98         60        33 902      264            <424      >19       34 767 (5.0)
Primary care    1003       1013      301         243 908        50 897    148       297 270 (42.6)
Other           116        1086      442         18 746         197 921   127       218 438 (31.3)
Missing         117        1597      431         5189           >2635     <11       9980 (1.4)
Column total n (%): 20 270 (2.9); 111 536 (16.0); 35 472 (5.1); 274 312 (39.3); 256 248 (36.7); 364 (0.1); overall 698 202 (100)

*MD-PPAS files from 2008–2014 and AMA data as of July 18, 2016. Cell sizes <11 masked for confidentiality concerns. Percent agreement: 86.1%; kappa: 0.80.

Table 5. Agreement of physician practice location as classified using the Medicare Data on Provider Practice and Specialty (MD-PPAS) file compared to the American Medical Association (AMA) Physician Masterfile (n = 698 202)*

Same state          n (%)
Yes                 600 022 (85.9)
No                  42 848 (6.1)
Missing (MD-PPAS)   133 (0.0)
Missing (AMA)       55 163 (7.9)
Missing (both)      36 (0.0)

*MD-PPAS files from 2008–2014, location determined based on most recent information available for each physician; AMA classification as of July 18, 2016.

In addition to common variables and data acquisition costs, unique provider information in the AMA and MD-PPAS data may influence a researcher's choice of which data source to use. The MD-PPAS data capture both physician and nonphysician providers who submit claims to Medicare. Although we focused on physicians in this study, being able to assess characteristics of nonphysician providers may be key to a specific research question. This is of increasing importance as nonphysician providers are more intricately involved in patient care, especially primary care (13). Furthermore, the MD-PPAS data include provider utilization summary measures (eg, number of unique Medicare beneficiaries treated and total allowed Medicare charges), which could be used to characterize provider workload and/or treatment patterns (5). MD-PPAS data can also provide insights into practice characteristics. For example, providers can be assigned to group practices (eg, if two providers are affiliated with the same practice, they will have a common TIN). Practice size can then be calculated as the number of providers assigned the same TIN. Additional practice characteristics (eg, academic vs community) can then be determined through linkages to other data sources via the unencrypted TIN (14). Finally, the MD-PPAS files are created annually, thus allowing for assessments of temporal changes (eg, physicians who relocate and fluctuations in Medicare utilization measures). Therefore, if information on nonphysicians, provider utilization, practice characteristics, and/or temporal changes is of interest, then the MD-PPAS would appear to be a better data source.

Conversely, AMA data include physician medical education and postgraduate training information, which is not available in the MD-PPAS data (4). Compared with the MD-PPAS data, the AMA data also provide more granular specialty information, which could allow investigators to more extensively categorize and compare types of physicians. Additionally, the AMA data are less restrictive about which physicians are included; physicians are included regardless of insurance affiliation. In contrast, to be included in the MD-PPAS data, physicians must accept Medicare patients and submit select types of Medicare FFS claims (8). Therefore, if information on physician training, more granular specialty, and/or a broader inclusion of physicians is of interest, then the AMA would appear to be a better data source.

All of the above factors were considered when determining the merits of including the MD-PPAS data in the SEER-Medicare database. Physician specialty, particularly the ability to accurately identify oncologists, is of great interest to many SEER-Medicare researchers. Importantly, the observed underascertainment of oncologists in MD-PPAS data made us reconsider the benefit of releasing the data through SEER-Medicare. Additionally, although researchers might be unaware that relevant data exist, the demand for the unique variables (eg, nonphysician provider characteristics and provider utilization measures) in the MD-PPAS data is currently low in the SEER-Medicare research community. Furthermore, some of the possible insights gained from the MD-PPAS data can already be gleaned through the claims data.
For example, TINs are available on the claims; therefore, if desired, SEER-Medicare researchers can already determine if providers included in their subset of the data belong to the same group practices. It should be noted that the TINs included in the SEER-Medicare data are encrypted, like all individual provider identifiers; therefore, it is not feasible to link data obtained through SEER-Medicare to an external data source via TINs. For these reasons, it was decided that there was not enough justification to release the MD-PPAS data with the SEER-Medicare database at this time. We are aware that the MD-PPAS file is evolving and physician specialty ascertainment, especially for oncologists, may improve in the future (eg, through the development of novel algorithms that account for diagnosis and/or procedure codes in claims data or utilization of additional data sources). We are also mindful that additional unique variables may be added to the MD-PPAS and/or that demand for the current unique variables may increase. As a result, we plan to stay abreast of such developments.
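The TIN-based practice-size calculation discussed above, counting how many provider NPIs share a tax identification number, can be sketched as follows; the NPI-to-TIN mapping shown is illustrative only, not the MD-PPAS record layout:

```python
from collections import Counter

def practice_sizes(npi_to_tin):
    """Practice size per TIN, computed as the number of provider NPIs
    assigned the same TIN (hypothetical NPI -> TIN mapping)."""
    return Counter(npi_to_tin.values())

# Providers sharing a TIN are treated as members of one group practice,
# so TIN-A below corresponds to a two-provider practice.
sizes = practice_sizes({"1111": "TIN-A", "2222": "TIN-A", "3333": "TIN-B"})
```

In the SEER-Medicare data themselves the TINs are encrypted, so this kind of grouping is possible within the data but cannot be linked outward, as noted above.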
As a result, we plan to stay study: David M. Bott, PhD; Jennifer Lloyd, PhD, MA, MS; and abreast of such developments. Arpit Misra PhD, MA. Although this study had strengths, namely it compared two large resources of provider characteristics in a systematic way, References there were limitations that should be discussed. Given the dif- 1. Klabunde CN, Ambs A, Keating NL, et al. The role of primary care physicians ferent specialty classification systems between the MD-PPAS in cancer care. J Gen Intern Med. 2009;24(9):1029–1036. and AMA data, we created a hierarchical broad specialty classi- 2. Gage-Bouchard EA, Rodriguez LM, Saad-Harfouche FG, Miller A, Erwin DO. fication scheme to categorize physicians within each data Factors influencing patient pathways for receipt of cancer care at an NCI- designated comprehensive cancer center. PLoS ONE. 2014;9(10):e110649. source. It is possible that we differentially misclassified physi- 3. Institute of Medicine (US) and National Research Council (US) National cians, particularly given the variation in granularity between Cancer Policy Board; Hewitt M, Simone JV, eds. Ensuring Quality Cancer Care. the two data sources. It is also possible that variations in spe- Washington, DC: National Academies Press (US); 1999. Bookshelf ID: NBK230937. cialty were oversimplified using the scheme. Comparison of cat- 4. National Cancer Institute. Division of Cancer Control & Population Sciences. egorical age and practice location, based on state alone, also SEER-Medicare: Brief Description of the SEER-Medicare Database. 2017. https:// may have meant that potential age and geographical variations healthcaredelivery.cancer.gov/seermedicare/overview/. Accessed March 19, on more granular levels were not identified. Although AMA data 5. Baldwin LM, Adamache W, Klabunde CN, Kenward K, Dahlman C, Warren JL. 
are commonly used in studies as the standard for physician Linking physician characteristics and Medicare claims data: issues in data specialty, we did not designate a gold standard between the availability, quality, and measurement. Med Care. 2002;40(suppl 8): IV-82–IV-95. AMA data and the MD-PPAS data to measure sensitivity, specif- 6. Pollack LA, Adamache W, Eheman CR, Ryerson AB, Richardson LC. icity, and positive predictive value. When we began our analy- Enhancement of identifying cancer specialists through the linkage of sis, MD-PPAS version 2.0 was available. CMS continues to Medicare claims to additional sources of physician specialty. Health Serv Res. improve the MD-PPAS data and version 2.3 is now available. 2009;44(2, pt 1):562–576. 7. American Medical Association. AMA. AMA Physician Masterfile. 1995–2017. Assessments of data available in the most recent version may https://www.ama-assn.org/life-career/ama-physician-masterfile. Accessed show greater agreement and concordance between the two data September 25, 2017. sources. Additionally, comparison of other sources of physician 8. Center for Medicare & Medicaid Services. Research Data Assistance Center (ResDac). Medicare Data on Provider Practice and Specialty (MD-PPAS). 2016. data were not considered but may be of interest for future https://www.resdac.org/cms-data/files/md-ppas/data-documentation. investigation. Accessed September 25, 2017. In conclusion, this analysis compared variables available in 9. Welch PW, Stearns SC, Cuellar AE, Bindman AB. Use of hospitalists by Medicare beneficiaries: a national picture. Medicare Medicaid Res Rev. 2014;4(2): the MD-PPAS and AMA data and the agreement of the common ecollection 2014. variables. We found comparable physician information on age, 10. Young A, Chaudhry HJ, Pei X, Halbesleben K, Polk DH, Dugan M. A census of sex, overall specialty, and practice state is available in the MD- actively licensed physicians in the United States, 2014. J Med Regul. 
2015; 101(2):7–23. PPAS and AMA data. There was some indication that oncolo- 11. Bindman AB. Using the National Provider Identifier for health care workforce gists were underascertained in the MD-PPAS data compared evaluation. Medicare Medicaid Res Rev. 2012;3(3):E1–E10. with the AMA data. Each data source has unique physician in- 12. Association of American Medical Colleges. Center for Workforce Studies. 2015 State Physician Workforce Data Book. 2015. https://www.aamc.org/ formation (eg, granular specialty information in AMA and prac- data/workforce/reports/442830/statedataandreports.html. Accessed October tice characteristics in MD-PPAS) that may influence which data 20, 2017. source to include in an analysis. Although AMA data have been 13. Association of American Medical Colleges. The complexities of physician commonly used to account for physician-level factors in health supply and demand: projections from 2013 to 2025. 2015. https://www.aamc. org/download/426242/data/ihsreportdownload.pdf. Accessed September 25, services research, the availability of MD-PPAS data provides researchers with an alternative option depending on study 14. Welch WP, Bindman AB. Town and gown differences among the 100 largest needs. Physicians are at the heart of cancer health-care medical groups in the United States. Acad Med. 2016;91(7):1007–1014.
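As a supplementary illustration of the agreement comparisons discussed above, the sketch below shows how percent agreement and Cohen's kappa could be computed for broad specialty assignments drawn from two provider data sources keyed by NPI. The data, variable names, and category labels are entirely hypothetical and are not taken from the study; this is a minimal sketch of the general technique, not the authors' actual analysis code.

```python
from collections import Counter

# Hypothetical broad specialty assignments for the same physicians (keyed by
# NPI) in two data sources; illustrative only, not data from the study.
source_a = {"npi1": "oncology", "npi2": "primary care", "npi3": "surgery", "npi4": "oncology"}
source_b = {"npi1": "oncology", "npi2": "primary care", "npi3": "oncology", "npi4": "oncology"}

def percent_agreement(a, b):
    """Share of physicians present in both sources with matching categories."""
    common = a.keys() & b.keys()
    matches = sum(1 for npi in common if a[npi] == b[npi])
    return matches / len(common)

def cohens_kappa(a, b):
    """Chance-corrected agreement between the two categorical assignments."""
    common = sorted(a.keys() & b.keys())
    pairs = [(a[k], b[k]) for k in common]
    n = len(pairs)
    p_obs = sum(1 for x, y in pairs if x == y) / n       # observed agreement
    counts_a = Counter(x for x, _ in pairs)
    counts_b = Counter(y for _, y in pairs)
    p_exp = sum(counts_a[c] * counts_b[c]               # expected agreement
                for c in counts_a.keys() | counts_b.keys()) / n**2
    return (p_obs - p_exp) / (1 - p_exp)

print(percent_agreement(source_a, source_b))   # 0.75
print(round(cohens_kappa(source_a, source_b), 3))
```

In practice such a comparison would first require harmonizing the two sources' specialty codes into a common hierarchical scheme, as the study describes; kappa then corrects the raw agreement for matches expected by chance alone.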
JNCI Monographs – Oxford University Press
Published: May 1, 2020
Keywords: cancer; health services research; medicare; physician demographics; oncologists; health care; health services; datasets; health care financing administration