In the age of Big Data Analytics and COVID-19 Apps, the conventional conception of privacy, which focuses excessively on the identification of the individual, is inadequate to safeguard the individual’s identity and autonomy. An individual’s autonomy can be impaired and their control over their social identity diminished even without infringing the anonymity surrounding their personal identity. The century-old individualistic conception of privacy, aimed at safeguarding a person from unwarranted social interference, cannot protect their autonomy and identity when they are targeted on the basis of their interdependent social and algorithmic group affiliations. To overcome these limitations, in this article I develop a theoretical framework in the form of a triumvirate model of the group right to privacy (GRP), grounded in privacy as a social value (Pv). An individual has an interest in protecting the social identity that arises from their participation in social groups. The panoptic sorting of individuals by Big Data Analytics for behavioral targeting gives rise to epistemic bubbles and echo chambers that impede the formation of an individual’s social identity. I formulate GRP1 to protect an individual’s interest in their social identity and their socially embedded autonomous self. I then formulate GRP2 to secure an individual’s right to informational self-determination and against algorithmic grouping. Lastly, I highlight instances where GRP3 entitles an organized group to privacy in its own right. Drawing on a Razian formulation, I argue that Big Data Analytics’ constant surveillance and monetization of human existence is an infringement of individual autonomy. The violation of GRP subjects an individual to behavioral targeting (including hyper-targeted political advertising) and distorts their weltanschauung, or worldview.
As regards COVID-19 Apps, I assert that the extraordinary circumstances surrounding the pandemic do not provide an everlasting justification for reducing an individual’s identity to that of a potential disease carrier. I argue that ambivalence regarding the existence of surveillance over an individual’s social identity can leave them in a perpetual state of simulated surveillance (simveillance). I further assert that it is in the long-term best interests of Big Tech corporations to respect privacy. In conclusion, I highlight that our privacy is not only interdependent in nature but also cumulatively interlinked: it increases in force with each successive protection. The privacy challenge posed by COVID-19 Apps has helped us realize that while limited exceptions to privacy may be carved out in grave emergencies, there is no moral justification for round-the-clock surveillance of an individual’s existence by Big Data Analytics. Similarly, the threat posed by Big Data Analytics has helped us realize that conceptions of privacy have wrongly focused on what distinguishes the individual; it is our similarities that are truly worth protecting. To protect these similarities, I formulate the concept of mutual or companion privacy, which holds, counterintuitively, that in the age of Big Data Analytics we have more privacy together than individually.