Surgical complications often occur due to lapses in judgment and decision-making. Advances in artificial intelligence (AI) have made it possible to train algorithms that identify anatomy and interpret the surgical field. These algorithms can potentially be used for intraoperative decision support and for postoperative video analysis and feedback. Despite the early success of proof-of-concept algorithms, it remains unknown whether this innovation meets the needs of end-users or how best to deploy it. This study explores end-users' opinions on the value, usability, and design considerations for adopting AI in the operating room (OR). A device-agnostic, web-accessible software platform was developed to provide AI inference either (1) intraoperatively on a live video stream (synchronous mode) or (2) postoperatively on an uploaded video or image file (asynchronous mode) for feedback. A validated AI model (GoNoGoNet), which identifies safe and dangerous zones of dissection during laparoscopic cholecystectomy, served as the use case. Surgeons and trainees performing laparoscopic cholecystectomy interacted with the AI platform and completed a 5-point Likert scale survey evaluating the educational value, usability, and design of the platform. Twenty participants (11 surgeons and 9 trainees) evaluated the platform intraoperatively (n = 10) and postoperatively (n = 11). The majority agreed or strongly agreed that AI is an effective adjunct to surgical training (81%; neutral = 10%), effective for providing real-time feedback (70%; neutral = 20%) and postoperative feedback (73%; neutral = 27%), and capable of improving surgeon confidence (67%; neutral = 29%). Only 40% (neutral = 50%) believed the tool is effective in improving intraoperative decisions and performance, and 57% (neutral = 43%) believed it is beneficial for patient care. Overall, 38% (neutral = 43%) reported that they would use the platform consistently if it were available. The majority agreed or strongly agreed that the platform was easy to use (81%; neutral = 14%) and had acceptable resolution (62%; neutral = 24%), while 30% (neutral = 20%) reported that it disrupted the OR workflow and 20% (neutral = 0%) reported significant time lag. All respondents reported that such a system should be available on demand, with the ability to turn it on or off at their discretion. Most participants found AI to be a useful tool for providing support and feedback to surgeons, despite several implementation obstacles. These findings will inform the future design and usability of this technology to optimize its clinical impact and adoption by end-users.
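The abstract does not describe the platform at the implementation level. As a rough illustration only, the sketch below shows how a frame-level segmentation model such as GoNoGoNet might be wrapped to serve the two modes described above: synchronous inference on a live video feed and asynchronous inference on an uploaded recording. The `predict_go_nogo_zones` function is a hypothetical stand-in for the actual model, and the use of OpenCV for frame capture and overlay is an assumption, not the authors' implementation.

```python
# Illustrative sketch only: a hypothetical wrapper exposing the two inference
# modes described in the abstract. predict_go_nogo_zones() stands in for the
# real GoNoGoNet model, whose interface is not specified in the source.
import cv2
import numpy as np


def predict_go_nogo_zones(frame: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for the GoNoGoNet inference call.

    Assumed to return a BGR mask (same shape as the frame) highlighting safe
    ("go") and dangerous ("no-go") dissection zones. Here it returns an empty
    mask so the sketch runs end to end.
    """
    return np.zeros_like(frame)


def run_synchronous(camera_index: int = 0) -> None:
    """Synchronous mode: overlay predictions on a live laparoscopic feed."""
    capture = cv2.VideoCapture(camera_index)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        mask = predict_go_nogo_zones(frame)
        # Blend the predicted zones onto the live frame for display.
        overlay = cv2.addWeighted(frame, 0.7, mask, 0.3, 0)
        cv2.imshow("Go/No-Go overlay", overlay)
        # Pressing 'q' stops the overlay, loosely mirroring the "on-demand"
        # toggle that respondents asked for.
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    capture.release()
    cv2.destroyAllWindows()


def run_asynchronous(video_path: str) -> list[np.ndarray]:
    """Asynchronous mode: batch inference on an uploaded recording for feedback."""
    capture = cv2.VideoCapture(video_path)
    masks = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        masks.append(predict_go_nogo_zones(frame))
    capture.release()
    return masks
```

Keeping the two entry points separate reflects the usage pattern reported in the study: the synchronous loop can be started and stopped at the surgeon's discretion during a case, while the asynchronous path runs entirely outside the OR workflow on recorded video for postoperative feedback.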