Deborah Fournier, PhD
Emeritus Clinical Associate Professor
Boston University Chobanian & Avedisian School of Medicine
Dean’s Office

PhD, Syracuse University
MS, Syracuse University
BS, University of Maryland



Deborah M. Fournier is Assistant Provost for Institutional Research and Evaluation at Boston University Medical Campus and Executive Director of Evaluation for the Boston University Clinical and Translational Science Institute. Her role is to assess biomedical research productivity, influence, and impact, and to collaborate with administrators and faculty to build and sustain capacity for institutional assessment of research, accreditation compliance, comprehensive evaluation of biomedical research training programs, and the use of outcomes assessment data to guide operational and strategic decision-making. She has more than 25 years of field experience in applied research methods, educational program evaluation practices, and research evaluation and policy. Her primary contributions to the field of evaluation include exploring evaluative reasoning, the warrantability of claims, and the inferences drawn from types of evidence in applied field studies. She has edited such volumes as Reasoning in Evaluation: Inferential Links and Leaps (New Directions for Evaluation) and Progress and Future Directions in Evaluation: Perspectives on Theory, Practice, and Methods (New Directions for Evaluation). She is actively involved with the American Evaluation Association, having served on the Editorial Advisory Board of the American Journal of Evaluation, established and chaired the Topical Interest Group on Theories of Evaluation, and served as national program conference chair. She serves on External Advisory Boards for Clinical and Translational Science Institutes.

U54 RR025771, NIH/NCRR, 5/01/08-04/30/13, Clinical and Translational Science Award; Center, D., PI.
Role: Co-investigator; Design and implementation of program evaluation and outcomes assessment.
The goal of this grant is to develop infrastructure for conducting clinical and translational science research.

K30 HL004124-07, NIH/NHLBI, 6/01/05-05/31/10, BU Clinical Research Training Program (CREST); Felson, D., PI
Role: Co-investigator; Design and implementation of program evaluation and outcomes assessment.
The goal of this training grant is to provide educational support and training opportunities for persons pursuing a career in clinical research.

K30 HL004124-07, NIH/NHLBI, 6/01/99-05/31/05, BU Clinical Research Training Program (CREST); Felson, D., PI
Role: Co-investigator; Design and implementation of program evaluation and outcomes assessment.
The goal of this training grant is to provide educational support and training opportunities for persons pursuing a career in clinical research.

Robert Wood Johnson Foundation, 8/30/02-7/31/07, New England Dental Access Project; Frankl, S., PI
Role: Co-investigator; Design and implementation of program evaluation and outcomes assessment. The goals of this grant are to (i) enhance and enlarge community-based clinical education programs that provide care to underserved populations in the New England area and (ii) develop, implement, and monitor programs to increase recruitment and retention of underrepresented minority and low-income students.
Deborah M. Fournier is Assistant Provost for Institutional Research and Evaluation at Boston University Medical Campus and Director of Program Evaluation for the Boston University Clinical and Translational Science Institute (BU CTSI). She has more than 20 years of field experience in applied social science research, educational program evaluation, and research evaluation and policy. Her primary contributions to the field of evaluation include exploring evaluative reasoning, the warrantability of claims, and the inferences drawn from types of evidence in applied field studies. She has edited such volumes as Reasoning in Evaluation: Inferential Links and Leaps, New Directions for Evaluation (68), and Progress and Future Directions in Evaluation: Perspectives on Theory, Practice, and Methods, New Directions for Evaluation (76). She is actively involved with the American Evaluation Association, having served on the Editorial Advisory Board of the American Journal of Evaluation, established and chaired the Topical Interest Group on Theories of Evaluation, and served as national program conference chair. Her applied scholarship at Boston University includes collaborating with administrators and faculty to build and sustain capacity for institutional assessment of research; accreditation compliance; comprehensive program evaluation plans for educational programs, research institutes, and research grant proposals; outcomes assessment processes and dashboards to guide operational and strategic decision-making; faculty annual evaluation and development systems; and numerous faculty workshops on theories, methods, and practices in program evaluation, student/trainee career outcomes, institutional research, and competency-based assessment.

Diversity, Equity, Inclusion and Accessibility

I advocate for pluralism and for democratizing, inclusive approaches to planning and conducting evaluations of programs and interventions in Boston University’s academic medicine setting. When using research and evaluation methods and models to enhance programming and organizational change, I rely on participatory evaluation practices. Such practices emphasize including diverse audiences in discussions and decisions at every stage of the evaluation process, from design to methods to data collection to interpretation and use of the data to guide change. Participatory evaluation fosters a sense of inclusion, which increases the likelihood that individuals will use the data to enact change because they understand the findings and feel empowered to act and advance innovations.

In planning and conducting evaluation, I am guided by the question, “Who is and isn’t at the table?” Diverse voices need to feed into an evaluation process, especially those of groups who may be voiceless and marginalized. I view my role as a facilitator of evaluations that are mindful of how factors at each evaluation stage will affect the different individuals and groups impacted by the evaluation and its use. As I see it, responsible evaluation practice means being intentional about planning audience identification analyses to determine how best to involve each audience. It strives to ensure inclusive engagement in meaningful dialogue, reflection, and deliberation to catalyze changes in awareness, understanding, behaviors, practices, and policies. It further means being mindful of evaluating context: the psychosocial and sociopolitical factors in the setting that influence evaluation design, implementation of methods, outcomes, and audience willingness to act on results. Sometimes inclusive participatory evaluation approaches devolve into participation limited to administrative leadership because it seems easier and faster to convene a few leaders than a more far-reaching group of staff, students, and faculty. Such backsliding unintentionally reinforces the existing power structure and undermines the credibility of a truly inclusive process that can strengthen our culture of inclusive excellence at Boston University. Responsible evaluation practice serves our university well when it is grounded in participatory, equity-focused, and culturally responsive practices.

Director, Program Evaluation for the BU Clinical and Translational Science Institute
Boston University Medical Campus




Boston University Clinical and Translational Science Award (CTSA) Program UL1
05/01/2008 - 04/30/2013 (Key Person)
PI: David M. Center, MD
NIH/National Center for Research Resources
3UL1RR025771-04




  1. Fournier DM. Book Review of What Counts as Credible Evidence in Applied Research and Evaluation Practice. American Journal of Evaluation. 2009; 30:255-258.
  2. Lamster IB, Tedesco LA, Fournier DM, Goodson JM, Gould AR, Haden NK, Howell TH, Ship JA, Wong TW. New opportunities for dentistry in diagnosis and primary health care: Report of a panel of the Macy Study. American Dental Education Association. 2008; 1-5.
  3. Fournier DM. Logic of evaluation. Encyclopedia of Evaluation. Sage Publications. Thousand Oaks, CA. 2005; 238-242.
  4. Fournier DM. Evaluation defined. Encyclopedia of Evaluation. Sage Publications. Thousand Oaks, CA. 2005; 139-140.
  5. Falk-Nilsson E, Walmsley D, Brennan M, Fournier DM, Junfin Glass B, Haden K, Kersten H, Neumann L, Lian GO, Petersson K. 1.2 Cognition and learning. Eur J Dent Educ. 2002; 6 Suppl 3:27-32. PMID: 12390256
  6. Frankl SN, Boustany FG, Fournier DM. New directions in the evolving design of an experiential education program. J Dent Educ. 1997 Sep; 61(9):746-52. PMID: 9316595
  7. Rog DJ, Fournier DM (eds). Progress and future directions in evaluation: Perspectives on theory, practice, and methods. New Directions for Evaluation. 1997; 76.
  8. Fournier DM. Establishing evaluative conclusions: A distinction between general and working logic. New Directions for Evaluation. 1995; 68:15-32.
  9. Fournier DM (ed). Reasoning in evaluation: Inferential links and leaps. New Directions for Evaluation. 1995; 68.
  10. Fournier DM. Book Review of The Joint Committee on Standards for Educational Evaluation, The Program Evaluation Standards: How to Assess Evaluations of. Journal of Educational Measurement. 1994; 4(31):363-368.
Showing 10 of 15 results.

Publications by year (first, middle/unknown, or last author; 15 total): 1990: 1; 1991: 1; 1992: 2; 1993: 1; 1994: 1; 1995: 2; 1997: 2; 2002: 1; 2005: 2; 2008: 1; 2009: 1.

DEIA Statements Added to BU Profiles

BU School of Medicine 9/28/2022

Program Evaluation
Evaluation Logic
Research on Research
Outcomes Assessment
Evaluation Theories, Models and Methods
Contact for Mentoring:

75 E. Newton St Evans Building
Boston MA 02118

