Abstract

Open innovation platforms that enable organizations to crowdsource ideation to parties external to the firm are proliferating. In many cases, these platforms run open contests that allow the free exchange of ideas with the goal of improving the ideation process. In open contests, participants (“solvers”) observe the ideas of others as well as the feedback received from the contest sponsor (“seeker”). The open nature of such contests generates incentives for later solvers to imitate successful early designs at the expense of the original solvers. This creates the possibility of the platform unraveling as original solvers strategically withdraw, expecting their ideas to be copied without recompense. To investigate agent behavior in such a setting, we analyze publicly accessible micro-data on more than 6,000 design contests, along with their submissions and participants, from crowdsourced open ideation platforms, and augment this analysis with field and online experiments. These data include the original image files submitted to the contests, which enable us to measure how similar one image is to another using a customized ensemble image comparison algorithm. We find that better-rated designs are likely to be imitated by later-entering solvers, generating a significant risk for early entrants that their ideas will be appropriated without recompense. As a countervailing force, we document that seekers tend to reward original designs and avoid picking as winners those that appear to imitate and free-ride. Seekers perceive original designers as more competent, and this perception informs their choice of winning design. These patterns suggest that market behavior on such platforms has a self-policing component that disincentivizes excessive imitation, rewards originality, and prevents unraveling.
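The abstract does not specify the components of the authors' ensemble image comparison algorithm. As a purely illustrative sketch (not the paper's method), an "ensemble" similarity score can be formed by averaging several simple metrics; the toy version below combines mean pixel difference with histogram intersection for grayscale images represented as 2D lists of 0-255 integers. All function names and the equal-weight combination are assumptions for illustration.

```python
# Illustrative sketch only -- NOT the authors' algorithm. Images are 2D lists
# of grayscale intensities in [0, 255]; both metrics return values in [0, 1].

def pixel_similarity(a, b):
    """1.0 for identical images, approaching 0.0 as pixels diverge."""
    flat_a = [p for row in a for p in row]
    flat_b = [p for row in b for p in row]
    diff = sum(abs(x - y) for x, y in zip(flat_a, flat_b))
    return 1.0 - diff / (255.0 * len(flat_a))

def histogram_similarity(a, b, bins=16):
    """Histogram intersection of the two images' intensity distributions."""
    def hist(img):
        counts = [0] * bins
        pixels = [p for row in img for p in row]
        for p in pixels:
            counts[min(p * bins // 256, bins - 1)] += 1
        return [c / len(pixels) for c in counts]
    return sum(min(x, y) for x, y in zip(hist(a), hist(b)))

def ensemble_similarity(a, b):
    """Equal-weight average of the component metrics (assumed weighting)."""
    return 0.5 * pixel_similarity(a, b) + 0.5 * histogram_similarity(a, b)
```

In practice such an ensemble would use stronger components (e.g., perceptual hashing or structural similarity) and tuned weights; the value of combining metrics is that designs that evade one measure of imitation are still likely to be flagged by another.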
