Deborah M. Fournier is Emeritus Associate Clinical Professor of Medical Sciences and Education. During her 28 years with Boston University Medical Campus (BUMC), she worked to strengthen evaluation methods, practices, and infrastructure in support of research on research, program evaluation, grant application submissions, continuous quality improvement, and accreditation. She served as BUMC Assistant Provost for Institutional Research and Evaluation and as Executive Director of Evaluation for the Boston University Clinical and Translational Science Institute. In these leadership roles she assessed the productivity, influence, and impact of biomedical research and collaborated with administrators and faculty to build and sustain capacity for institutional assessment of research, accreditation compliance, comprehensive evaluation of biomedical research training programs, and the use of outcomes assessment data to guide operational and strategic decision-making. Her expertise and field experience centered on applied research study methods, educational program evaluation practices, and research evaluation and policy. Her primary contributions to the field of evaluation include exploring evaluative reasoning, the warrantability of claims, and the inferences drawn from types of evidence in applied field studies. She has edited volumes of New Directions for Evaluation, including Reasoning in Evaluation: Inferential Links and Leaps and Progress and Future Directions in Evaluation: Perspectives on Theory, Practice, and Methods. She is actively involved with the American Evaluation Association, having served on the Editorial Advisory Board of the American Journal of Evaluation, established and chaired the Theories of Evaluation Topical Interest Group, and served as national conference program chair. Since her retirement in 2023, she has continued to serve on the Editorial Review Board of the Journal of Clinical and Translational Science and on External Advisory Boards for Clinical and Translational Science Institutes, and to collaborate with faculty on grant proposal writing.
Dr. Fournier advocates for pluralism and for democratizing, inclusive approaches to planning and conducting evaluations of programs and interventions in Boston University’s academic medicine setting. In applying research and evaluation methods and models to enhance programming and organizational change, she used participatory evaluation practices. Such practices emphasize including diverse audiences in discussions and decisions at every stage of the evaluation process, from design and methods through data collection, interpretation, and use of the data to guide change. Participatory evaluation fosters a sense of inclusion among participants, which in turn increases the likelihood that they will use the data to enact change, because they understand the findings and feel empowered to act and advance innovations.
In planning and conducting evaluation, she is guided by the question, “Who is and isn’t at the table?” She emphasizes the need for diverse voices to feed into evaluation processes, especially groups who may be voiceless or marginalized. She views her evaluator role as that of a facilitator who is mindful of how decisions at each stage of an evaluation will affect the different individuals and groups impacted by the evaluation and its use. Responsible evaluation practice means being intentional about identifying audiences and analyzing how best to involve them. It strives to ensure inclusive engagement in meaningful dialogue, reflection, and deliberation to catalyze changes in awareness, understanding, behaviors, practices, and policies. It further means being mindful of context: the psychosocial and sociopolitical factors in the setting that influence evaluation design, implementation of methods, outcomes, and audiences’ willingness to act on results. Responsible evaluation practice serves the university well when it is grounded in participatory and culturally responsive approaches.
U54 RR025771, NIH/NCRR, 5/01/08-04/30/13, Clinical and Translational Science Award; Center, D., PI.
Role: Co-investigator; Design and implementation of program evaluation and outcomes assessment.
The goal of this grant is to develop infrastructure for conducting clinical and translational science research.
K30 HL004124-07, NIH/NHLBI, 6/01/05-05/31/10, BU Clinical Research Training Program (CREST); Felson, D., PI
Role: Co-investigator; Design and implementation of program evaluation and outcomes assessment.
The goal of this training grant is to provide educational support and training opportunities for persons pursuing a career in clinical research.
K30 HL004124-07, NIH/NHLBI, 6/01/99-05/31/05, BU Clinical Research Training Program (CREST); Felson, D., PI
Role: Co-investigator; Design and implementation of program evaluation and outcomes assessment.
The goal of this training grant is to provide educational support and training opportunities for persons pursuing a career in clinical research.
Robert Wood Johnson Foundation, 8/30/02-7/31/07, New England Dental Access Project; Frankl, S., PI
Role: Co-investigator; Design and implementation of program evaluation and outcomes assessment. The goals of this grant are to (i) enhance and enlarge community-based clinical education programs that provide care to underserved populations in the New England area and (ii) develop, implement, and monitor programs to increase recruitment and retention of underrepresented minority and low-income students.