Background: The field of clinical ethics is examining ways of determining competency. The Assessing Clinical Ethics Skills (ACES) tool offers a new approach that identifies a range of skills necessary in the conduct of clinical ethics consultation and provides a consistent framework for evaluating these skills. Through a training website, users learn to apply the ACES tool to clinical ethics consultants (CECs) in simulated ethics consultation videos. The aim is for users to recognize competent and incompetent clinical ethics consultation skills by watching and evaluating a videotaped CEC performance. We report how we set a criterion cut score (i.e., a minimally acceptable score) for judging the ability of users of the ACES tool to evaluate simulated CEC performances.

Methods: A modified Angoff standard-setting procedure was used to establish the cut score for an end-of-life case included on the ACES training website. The standard-setting committee viewed the Futility Case and estimated the probability that a minimally competent CEC would correctly answer each item on the ACES tool. The committee then adjusted these estimates by reviewing data from 31 pilot users of the Futility Case before determining the cut score.

Results: Averaged over all 31 items, the proposed proportion-correct score for minimal competency was 80%, corresponding to a cut score between 24 and 25 points of a possible 31. The standard-setting committee subsequently set the minimal-competency cut score at 24 points.

Conclusions: The cut score identifies the number of correct responses a user of the ACES tool training website must attain to “pass,” that is, to demonstrate minimal competency in recognizing competent and incompetent skills of the CECs in the simulated ethics consultation videos. Applying the cut score to live training of CECs and to other areas of practice requires further investigation.
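
As a rough illustration of the arithmetic behind an Angoff-derived cut score (not code or data from the study), the sketch below averages hypothetical judge estimates for each item and sums them over 31 items; a mean proportion-correct of 80% on 31 items yields an expected raw score of 24.8, which a committee would then translate into a whole-number cut score such as the 24 reported here. The judge names and probability values are invented for illustration only.

```python
# Minimal sketch of the modified Angoff cut-score computation,
# assuming hypothetical per-item probability estimates from three
# invented judges. Only the 31-item length and the averaging/summing
# logic reflect the procedure described in the abstract.

def angoff_cut_score(ratings):
    """ratings: list of per-judge lists, each giving the estimated
    probability that a minimally competent candidate answers the
    corresponding item correctly (one value per item)."""
    n_judges = len(ratings)
    n_items = len(ratings[0])
    # Average each item's probability estimate across judges.
    item_means = [
        sum(judge[i] for judge in ratings) / n_judges
        for i in range(n_items)
    ]
    # The expected raw score of a minimally competent candidate is the
    # sum of the item-level probabilities (equivalently, the mean
    # proportion correct multiplied by the number of items).
    expected_score = sum(item_means)
    proportion_correct = expected_score / n_items
    return proportion_correct, expected_score

# Hypothetical estimates from three judges over 31 items: an average
# proportion of 0.80 gives an expected score of 24.8, i.e., between
# 24 and 25 points out of 31.
judges = [[0.80] * 31, [0.75] * 31, [0.85] * 31]
p, score = angoff_cut_score(judges)
print(f"proportion correct = {p:.2f}, expected raw score = {score:.1f}")
```

In this toy example the expected raw score falls between two whole numbers, so the final cut score is a policy decision; the abstract reports that the committee, after reviewing pilot-user data, chose the lower value of 24.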