Medical students have previously been shown to be as effective as experts at rating videos of surgical performance. We aimed to compare medical students with experienced surgeons as video assessors of simulated robot-assisted radical prostatectomy (RARP) performance. Video recordings of three RARP modules on the RobotiX (formerly Simbionix) simulator from a previous study were used. Five novice surgeons, five experienced robotic surgeons (intermediate group), and five experienced RARP surgeons performed a total of 45 video-recorded procedures. The videos were assessed with the modified Global Evaluative Assessment of Robotic Skills tool, both as full-length videos and as edited versions that included only the first 5 minutes of each procedure. Fifty medical students and two experienced RARP surgeons (ES) performed a total of 680 ratings of the full-length and 5-minute videos (2-9 ratings per video). Medical students and ES showed poor agreement for both full-length and 5-minute videos (0.29 and -0.13, respectively). Medical students could not discriminate between the skill levels of the surgeons in either the full-length or the 5-minute videos (P = 0.053-0.36 and P = 0.21-0.82, respectively), whereas ES could discriminate between novice and experienced surgeons (full-length, P < 0.001; 5-minute, P = 0.007) and between intermediate and experienced surgeons (full-length, P = 0.001; 5-minute, P = 0.01) in both video formats. We found that medical students cannot be used to assess simulated RARP performance: they showed poor agreement with the ES ratings for both full-length and 5-minute videos and could not discriminate between surgical skill levels.