Computer code plays a vital role in modern science, from the conception and design of experiments through to final data analyses. Open sharing of code has been widely discussed as advantageous to the scientific process, allowing experiments to be replicated more easily, helping with error detection, and reducing wasted effort and resources. In psychology, the code used to present stimuli is a fundamental component of many experiments. The degree to which researchers share this type of code, however, is not known. To estimate this, we surveyed 400 psychology papers published between 2016 and 2021, identifying those that used the open-source tools Psychtoolbox and PsychoPy and openly shared their stimulus presentation code. For papers that did, we tested whether the code ran after download and appraised its usability in terms of style and documentation. Only 8.4% of papers shared stimulus code, compared with 17.9% sharing analysis code and 31.7% sharing data. Of the shared code, 70% ran directly or after minor corrections; for code that did not run, the most common problem was a missing dependency (66.7%). Usability was moderate, with low levels of code annotation and minimal documentation provided. These results suggest that stimulus presentation code sharing lags behind other forms of code and data sharing, potentially because such code receives less emphasis in open-science discussions and journal policies. They also highlight a need for improved documentation to maximize code utility.