In May 2020, Meta, parent company of the social media platforms Facebook and Instagram, established an Oversight Board to provide some recourse against the company's content moderation decisions. The Oversight Board is an experiment in self-regulation (where ‘self’ refers to the company, not the industry) by one of the global platforms to address content regulation. The Board is structured on the ‘arm's length principle’ to ensure independence from Meta. This paper analyses the structure for legitimacy and takes a bird's eye view of the decisions made to date for efficacy. In roughly 80% of the more than 100 cases the Board has decided, it has overturned Meta's original decision, and the share of overturned decisions has been rising. This suggests that the company was indeed not able to handle content decisions well on its own. Further, the Board has reported the need for Meta to respond to its recommendations more accountably and comprehensively. The paper notes a shift in the tone of the global debate on content moderation, from a near-absolutist position against moderation to a willingness to consider content governance. It concludes that while there is room for improvement, the structure could serve as a model for other companies facing a similar need to moderate content globally.