Rubrics and direct measures in accreditation [From the Editor]

J. Enderle

IEEE Engineering in Medicine and Biology Magazine, vol. 128, no. 1, pp. 4-5, 16 July 2007. DOI: 10.1109/MEMB.2007.384083
Citations: 0
Abstract
This column is a continuation of my biomedical engineering (BME) accreditation experience at the University of Connecticut. During the first part of this year, I have been rewriting the self-study based on input from the program evaluator's mock visit and adding new information. The data and evaluation results from the BME ABET Committee have been tabulated and summarized. One of the most important items to include in a self-study is a summary of the direct and indirect measures (data and evaluation) and the changes in the curriculum that have evolved from the evaluation (commonly referred to as "closing the loop"). For indirect measures, we use senior exit and EBI surveys. We have used rubrics as direct measures of program outcomes a-k and program criteria 8. There are at least two direct measures for almost every program outcome in our self-study. Most of the direct measures come from the final reports or presentations in our senior design courses. I first became aware of the use of rubrics from a presentation by John Gassert (Milwaukee School of Engineering) at the ASEE conference in 2005. Done right, creating rubrics is a time-consuming task, as the goal is to describe, ahead of time, the characteristics of each outcome at the different levels of performance. We selected novice, apprentice, proficient, and expert as the performance levels. Within each level of performance are descriptions of those characteristics, sometimes with as few as one characteristic [e.g., (k) demonstrates an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice has only one characteristic for each performance level, which for novice is "No use of computational tools beyond word processing and presentation"].
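The rubric structure described above can be sketched as a simple data structure. This is only an illustrative sketch: the four performance levels and the novice-level descriptor for outcome (k) come from the column, but the other descriptors, the field names, and the numeric scoring scheme are assumptions added here for demonstration.

```python
# Hypothetical sketch of a four-level assessment rubric following the
# novice/apprentice/proficient/expert scale described in the column.
# Only the novice descriptor for outcome (k) is quoted from the column;
# the remaining descriptors are illustrative placeholders.

LEVELS = ("novice", "apprentice", "proficient", "expert")

rubric_k = {
    "outcome": "(k) an ability to use the techniques, skills, and modern "
               "engineering tools necessary for engineering practice",
    "characteristics": [
        {
            "novice": "No use of computational tools beyond word "
                      "processing and presentation",          # from the column
            "apprentice": "Basic use of standard tools",      # placeholder
            "proficient": "Appropriate use of modern tools",  # placeholder
            "expert": "Advanced, well-justified tool use",    # placeholder
        }
    ],
}

def score(level: str) -> int:
    """Map a performance level to a numeric score (1-4) for tabulation."""
    return LEVELS.index(level) + 1

# Averaging rubric scores across evaluated senior-design reports is one
# way to tabulate a direct-measure summary for a single outcome.
ratings = ["proficient", "expert", "apprentice", "proficient"]
average = sum(score(r) for r in ratings) / len(ratings)
```

A numeric mapping like this makes it straightforward to summarize rubric results across many reports, which is the kind of tabulation a self-study's direct-measure summary requires.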