Literature reviews show that stated-preference studies, used to understand the values individuals place on health and health care, are increasingly administered online, potentially maximising respondent access and allowing for enhanced response quality. Online respondents may often choose whether to complete the survey on a desktop or laptop personal computer (PC), tablet or smartphone, each with a different screen size and mode of data entry. To avoid differences in measurement error, respondents are frequently asked to complete surveys on a PC, despite evidence that handheld devices are increasingly used for internet browsing. It remains unknown whether, or how, the device used to access a survey affects responses and/or the valuations subsequently derived.

This study uses data from a discrete choice experiment (DCE) administered online to elicit the preferences of a general population sample of females for a national breast screening programme. The analysis explores differences in key outcomes, such as completion rates, engagement with the survey materials, respondent characteristics, response time, failure of an internal validity test and health care preferences, between (1) handheld-device users and (2) PC users. Preferences were analysed using a fully correlated random parameter logit (RPL) model to allow for unexplained scale and preference heterogeneity.

One thousand respondents completed the survey in its entirety. The most popular access devices were PCs (n = 785), including Windows machines (n = 705) and MacBooks (n = 69). Two hundred and fifteen respondents accessed the survey on a handheld device. Most outcomes related to survey behaviour, including failure of a dominance check, 'flat lining', self-reported attribute non-attendance (ANA) and respondent-rated task difficulty, did not differ by device type (p > 0.100).
Respondents accessing the survey on a PC were generally quicker (median time to completion 14.5 min, compared with 16 min for those using handheld devices) and were significantly less likely to speed through a webpage. Although there was evidence of heterogeneity in preference intensity (taste) and variability (scale) across respondents in the sample, it was not driven by the access device. Overall, we find that neither preferences nor choice behaviour is associated with the type of access device, provided respondents are presented with question formats that are easy to use on small touchscreens. Health preference researchers should optimise preference instruments for a range of devices and encourage respondents to complete surveys on their preferred device. We also suggest that access-device characteristics be collected and reported alongside results.
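The abstract does not describe how the fully correlated RPL model was estimated. As a purely illustrative, minimal sketch, the core of such a model is a simulated likelihood in which each respondent's attribute coefficients are drawn from a multivariate normal with a full (Cholesky-factored) covariance, choice probabilities are logit conditional on the draw, and the panel likelihood is averaged over draws. All dimensions, attribute counts and data below are invented; this is not the paper's specification or software.

```python
import numpy as np

# Hypothetical sketch of a fully correlated random parameter logit (mixed
# logit) simulated log-likelihood. Dimensions and data are invented.
rng = np.random.default_rng(0)

N, T, J, K = 200, 8, 2, 3   # respondents, choice tasks, alternatives, attributes
R = 100                      # simulation draws per respondent

# Simulated design matrix: X[n, t, j, k] = level of attribute k for
# alternative j in task t faced by respondent n.
X = rng.normal(size=(N, T, J, K))

# "True" population parameters used only to generate synthetic choices.
beta_mean = np.array([1.0, -0.5, 0.8])
L_true = np.array([[0.5, 0.0, 0.0],
                   [0.2, 0.4, 0.0],
                   [0.1, 0.1, 0.3]])   # Cholesky factor of the full covariance

def simulate_choices(X, beta_mean, L, rng):
    """Generate choices under correlated normal tastes + EV1 errors."""
    N, T, J, K = X.shape
    beta_n = beta_mean + rng.normal(size=(N, K)) @ L.T   # correlated tastes
    V = np.einsum('ntjk,nk->ntj', X, beta_n)             # systematic utility
    U = V + rng.gumbel(size=V.shape)                     # add type-I EV error
    return U.argmax(axis=2)                              # chosen alternative

y = simulate_choices(X, beta_mean, L_true, rng)          # y[n, t] in {0..J-1}

def simulated_loglik(theta, X, y, R, rng):
    """Panel likelihood averaged over R draws of correlated coefficients."""
    N, T, J, K = X.shape
    mean = theta[:K]
    L = np.zeros((K, K))
    L[np.tril_indices(K)] = theta[K:]                    # lower-triangular factor
    beta = mean + rng.normal(size=(R, N, K)) @ L.T       # (R, N, K) taste draws
    V = np.einsum('ntjk,rnk->rntj', X, beta)
    P = np.exp(V - V.max(axis=3, keepdims=True))
    P /= P.sum(axis=3, keepdims=True)                    # conditional logit probs
    chosen = np.take_along_axis(P, y[None, :, :, None], axis=3)[..., 0]
    panel = chosen.prod(axis=2)                          # product over tasks
    return np.log(panel.mean(axis=0)).sum()              # sum over respondents

# Evaluate at the data-generating parameters (an optimiser would maximise this).
theta_true = np.concatenate([beta_mean, L_true[np.tril_indices(K)]])
ll = simulated_loglik(theta_true, X, y, R, np.random.default_rng(1))
```

In an actual analysis, `simulated_loglik` would be maximised over `theta` (typically with Halton rather than pseudo-random draws), and device type could enter the mean or scale to test the heterogeneity questions the study addresses.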