Introduction

This article is a study in anxiety with regard to social online spaces (SOS) conceived of as dark. There are two possible ways to define ‘dark’ in this context. The first is that communication is dark because it has limited distribution, is not open to all users (closed groups are a case example), or is hidden. The second definition, which follows from the first, concerns the way that communication via these means is interpreted and understood. Dark social spaces disrupt the accepted top-down flow by the ‘gazing elite’ (data aggregators including social media), but anxious users might need to strain to notice what is out there, and this in turn destabilises one’s reception of the scene. In an environment where surveillance technologies are proliferating, this article examines contemporary, dark, interconnected, and interactive communications for the entangled affordances that might be brought to bear. A provocation is that resistance through counter-surveillance or “sousveillance” is one possibility. An alternative (or addition) is retreating to or building ‘dark’ spaces that are less surveilled and (perhaps counterintuitively) less fearful. This article critically considers the notion of dark social online spaces via four broad socio-technical concerns connected to the big social media services that have helped increase a tendency toward fearful anxiety produced by surveillance and the perceived implications for personal privacy. It also shines light on the aspect of darkness whereby some users are spurred to actively seek alternative, dark social online spaces.

Since the 1970s, public-key cryptosystems have typically preserved security for websites, emails, and sensitive health, government, and military data, but this protection is now reduced (Williams). We have seen such systems exploited via cyberattacks and misappropriated data acquired by affiliations such as Facebook-Cambridge Analytica for targeted political advertising during the 2016 US elections. Via the notion of “parasitic strategies”, such events can be described as news/information hacks “whose attack vectors target a system’s weak points with the help of specific strategies” (von Nordheim and Kleinen-von Königslöw, 88). In accord with Wilson and Serisier’s arguments (178), emerging technologies facilitate rapid data sharing, collection, storage, and processing, wherein subsequent “outcomes are unpredictable”. This would also include the effect of acquiescence. In regard to our digital devices, for some, being watched overtly—from cameras encased in toys, computers, and closed-circuit television (CCTV) to digital street ads that determine the resonance of human emotions in public places including bus stops, malls, and train stations—is becoming normalised (McStay, Emotional AI). It might appear that consumers immersed within this Internet of Things (IoT) are themselves comfortable interacting with devices that record sound and capture images for easy analysis and distribution across the communications networks. A counter-claim is that mainstream social media corporations have cultivated a sense of digital resignation “produced when people desire to control the information digital entities have about them but feel unable to do so” (Draper and Turow, 1824). Careful consumers’ trust in mainstream media is waning, with readers observing a strong presence of big media players in the industry and carefully picking the publications and public intellectuals they follow (Mahmood, 6).
A number now also avoid the mainstream internet in favour of alternate dark sites. This is done by users with “varying backgrounds, motivations and participation behaviours that may be idiosyncratic (as they are rooted in the respective person’s biography and circumstance)” (Quandt, 42). By way of connection with dark internet studies via Biddle et al. (1; see also Lasica), the “darknet” is

a collection of networks and technologies used to share digital content ... not a separate physical network but an application and protocol layer riding on existing networks. Examples of darknets are peer-to-peer file sharing, CD and DVD copying, and key or password sharing on email and newsgroups.

As we note from the quote above, the “dark web” uses existing public and private networks that facilitate communication via the Internet. Gehl (1220; see also Gehl and McKelvey) has detailed that this includes “hidden sites that end in ‘.onion’ or ‘.i2p’ or other Top-Level Domain names only available through modified browsers or special software. Accessing I2P sites requires a special routing program ... . Accessing .onion sites requires Tor [The Onion Router]”.
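To make concrete what Gehl’s “modified browsers or special software” involve in practice, the short sketch below (illustrative only, and not drawn from the works cited here) shows one common way a script reaches a ‘.onion’ hidden service: rather than connecting directly, it routes its traffic through a locally running Tor daemon exposed as a SOCKS5 proxy. The default port (9050), the requests/PySocks dependencies, and the placeholder onion address are assumptions made for demonstration.

```python
# Illustrative sketch only: reaching a Tor hidden service through a local Tor daemon.
# Assumes Tor is running with its default SOCKS5 proxy on 127.0.0.1:9050 and that
# the 'requests' library is installed with SOCKS support (pip install requests[socks]).
# The .onion address below is a placeholder, not a real service.
import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # 'socks5h' resolves hostnames inside Tor, not via local DNS
PROXIES = {"http": TOR_PROXY, "https": TOR_PROXY}

def fetch_onion(url: str) -> str:
    """Request a hidden-service page via the Tor SOCKS proxy and return its body."""
    response = requests.get(url, proxies=PROXIES, timeout=60)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    # Hypothetical v3 onion address, for illustration only.
    print(fetch_onion("http://exampleonionaddressplaceholderxxxxxxxxxxxxxxxxxxxxxxxx.onion/")[:200])
```

Without the proxy entries, the same request would travel over the open internet and fail to resolve the .onion name at all; it is this routing layer, rather than any separate physical network, that renders the space ‘dark’ to ordinary observers.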
For some, this gives rise to social anxiety, read here as stemming from that which is not known, and an exaggerated sense of danger which makes fight or flight seem the only options. This is often justified or exacerbated by the changing media and communication landscape and depicted in popular documentaries such as The Social Dilemma or The Great Hack, which affect public opinion on the unknown aspects of internet spaces and the uses of personal data. The question for this article remains whether the fear of the dark is justified. Consider that most often one will choose to make one’s intimate bedroom space dark in order to have a good night’s rest. We might pleasurably escape into a cinema’s darkness for the stories told therein, or walk along a beach at night enjoying unseen breezes. Most do not avoid these experiences, choosing instead to actively seek them out. Drawing this thread through, the case made here is that agency can also be found in the dark by resisting socio-political structural harms.

1. Digital Futures and Anxiety of the Dark

Fear of the dark
I have a constant fear that something's always near
Fear of the dark
Fear of the dark
I have a phobia that someone's always there

In the lyrics to the song “Fear of the Dark” (1992) by British heavy metal group Iron Maiden is a sense that that which is unknown and unseen causes fear and anxiety. Holding a fear of the dark is not unusual and varies in degree for adults as it does for children (Fellous and Arbib). Such anxiety connected to the dark does not always concern darkness itself. It can also be a concern for the possible or imagined dangers that are concealed by the darkness itself as a result of cognitive-emotional interactions (McDonald, 16). Extending this claim is this article’s non-binary assertion that while for some, technology and what it can do are frequently misunderstood and shunned as a result, for others who embrace the possibilities and actively take it on, engagement becomes a form of learning by attentive partaking. Mistakes, solecisms, and frustrations are part of the process. Such conceptual theorising falls along a continuum of thinking. Global interconnectivity of communications networks has certainly led to consequent concerns (Turkle, Alone Together). Much focus for anxiety has been on the impact upon social and individual inner lives, levels of media concentration, and power over and commercialisation of the internet. Of specific note is that increasing commercial media influence—such as Facebook and its acquisition of WhatsApp, Oculus VR, Instagram, CTRL-labs (translating movements and neural impulses into digital signals), LiveRail (video advertising technology), Chainspace (blockchain)—regularly changes the overall dynamics of the online environment (Turow and Kavanaugh).
This provocation was borne out recently when Facebook disrupted the delivery of news to Australian audiences via its service. Mainstream social online spaces (SOS) are platforms which provide more than the delivery of media alone and have been conceptualised predominantly in a binary light. On the one hand, they can be depicted as tools for the common good of society through notional widespread access and as places for civic participation and discussion, identity expression, education, and community formation (Turkle; Bruns; Cinque and Brown; Jenkins). This end of the continuum of thinking about SOS seems set hard against the view, on the other hand, that SOS operate as businesses with strategies that manipulate consumers to generate revenue through advertising, data, venture capital for advanced research and development, and company profit. In between the two polar ends of this continuum lies a range of other possibilities, the shades of grey, that add contemporary nuance to understanding SOS in regard to what they facilitate, what the various implications might be, and for whom.

By way of a brief summary, anxiety of the dark is steeped, first, in the practices of privacy-invasive social media giants such as Facebook and its ancillary companies. Second are the advertising technology companies, surveillance contractors, and intelligence agencies that collect and monitor our actions and related data, as well as the increased ease of use and interoperability brought about by Web 2.0, which has seen a disconnection between technological infrastructure and social connection that acts to limit user permissions and online affordances. Third are concerns for the negative effects associated with depressed mental health and wellbeing caused by “psychologically damaging social networks”, through sleep loss, anxiety, poor body image, effects on real-world relationships, and the fear of missing out (FOMO; Royal Society for Public Health (UK) and the Young Health Movement). Here the harms are both individual and societal. Fourth is the intended acceleration toward post-quantum IoT (Fernández-Caramés), as quantum computing’s digital components are continually being miniaturised. This is coupled with advances in electrical battery capacity and interconnected telecommunications infrastructures. The result is that the ontogenetic capacity of such powerfully advanced networks affords supralevel surveillance. What this means is that through devices and the services they provide, individuals’ data is commodified (Neff and Nafus; Nissenbaum and Patterson). Personal data is enmeshed in ‘things’, requiring that decisions, whether overt, subtle, or hidden (dark), are scrutinised for the various ways they shape social norms and create consequences for public discourse, cultural production, and the fabric of society (Gillespie). Data and personal information are retrievable from devices, sharable in SOS, and potentially exposed across networks. For these reasons, some have chosen to go dark by being “off the grid”, judiciously selecting their means of communications and their ‘friends’ carefully.

2. Is There Room for Privacy Any More When Everyone in SOS Is Watching?

An interesting turn comes through counterarguments against overarching institutional surveillance that underscore the uses of technologies to watch the watchers.
This involves a practice of counter-surveillance whereby technologies are tools of resistance used to go ‘dark’, deployed by political activists in protest situations both for communication and for avoiding surveillance. This is not new and has long existed in an increasingly dispersed media landscape (Cinque, Changing Media Landscapes). For example, counter-surveillance video footage has been accessed and made available via live-streaming channels, with commentary in SOS augmenting networking possibilities for niche interest groups or micropublics (Wilson and Serisier, 178). A further example is the WordPress site Fitwatch, appealing for an end to what the site claims are issues associated with police surveillance (fitwatch.org.uk and endpolicesurveillance.wordpress.com). Users of these sites are called on to post police officers’ identity numbers and photographs in an attempt to identify “cops” that might act to “misuse” UK anti-terrorism legislation against activists during legitimate protests. Others that might be interested in doing their own “monitoring” are invited to reach out to identified personal email addresses or other private (dark) messaging software and application services such as Telegram (freeware and cross-platform). In their work on surveillance, Mann and Ferenbok (18) propose that there is an increase in “complex constructs between power and the practices of seeing, looking, and watching/sensing in a networked culture mediated by mobile/portable/wearable computing devices and technologies”. By way of critical definition, Mann and Ferenbok (25) clarify that “where the viewer is in a position of power over the subject, this is considered surveillance, but where the viewer is in a lower position of power, this is considered sousveillance”. It is the aspect of sousveillance that is empowering to those using dark SOS. One might consider that not all surveillance is “bad”, nor is it institutionalised. It is neither overtly nor formally regulated—as yet. Like most technologies, many of the surveillant technologies are value-neutral until applied towards specific uses, according to Mann and Ferenbok (18). But this is part of the ‘grey area’ for understanding the impact of dark SOS in regard to which actors or what nations are developing tools for surveillance, where access and control lie, and with what effects into the future.

3. Big Brother Watches, So What Are the Alternatives: Whither the Gazing Elite in Dark SOS?

By way of conceptual genealogy, contemporary perceptions of surveillance in a visually networked society (Cinque, Changing Media Landscapes) might be usefully explored through a revisitation of Jeremy Bentham’s panopticon, applied here as a metaphor for contemporary surveillance. Arguably, this is a foundational theoretical model for integrated methods of social control (Foucault, Surveiller et Punir, 192-211), realised in the “panopticon” (prison) in 1787 by Bentham (Bentham and Božovič, 29-95) during a period of social reformation aimed at the improvement of the individual. Like the power for social control over the incarcerated in a panopticon, police power, in order that it be effectively exercised, “had to be given the instrument of permanent, exhaustive, omnipresent surveillance, capable of making all visible ... like a faceless gaze that transformed the whole social body into a field of perception” (Foucault, Surveiller et Punir, 213–4). In grappling with the impact of SOS for the individual and the collective in post-digital times, we can trace out these early ruminations on the complex documentary organisation through state-controlled apparatuses (such as inspectors and paid observers including “secret agents”) via Foucault (Surveiller et Punir, 214; Subject and Power, 326-7) for comparison to commercial operators like Facebook. Today, artificial intelligence (AI), facial recognition technology (FRT), and closed-circuit television (CCTV) for video surveillance are used for social control of appropriate behaviours. Exemplified by governments and the private sector is the use of combined technologies to maintain social order, from ensuring citizens cross the street only on green lights, to putting rubbish in the correct recycling bin or being publicly shamed, to making cashless payments in stores. These actions see advantages for individual and collective safety, sustainability, and convenience, but also register forms of behaviour and attitudes with predictive capacities. This gives rise to suspicions about a permanent account of individuals’ behaviour over time. Returning to Foucault (Surveiller et Punir, 135), the impact of this finds a dissociation of power from the individual, whereby they become unwittingly impelled into pre-existing social structures, leading to a ‘normalisation’ of, and acceptance of, such systems. If we are talking about the dark, anxiety is key for a Ministry of SOS. Following Foucault again (Subject and Power, 326-7), there is the potential for a crawling, creeping governance that was once distinct but is itself increasingly hidden and growing. A blanket call for some form of ongoing scrutiny of such proliferating powers might be warranted, but with it comes regulation that, while offering certain rights and protections, is not without consequences.

For their part, a number of SOS platforms had little to no moderation of explicit content prior to December 2018, and, in terms of power, notwithstanding important anxiety connected to arguments that children and the vulnerable need protection from those who would seek to take advantage, this was a crucial aspect of community building and self-expression that resulted in a freedom of expression. In unearthing the extent to which individuals are empowered by the capacity to post sexual self-images, Tiidenberg ("Bringing Sexy Back") considered that through dark SOS (read here as unregulated) some users could work in opposition to the mainstream consumer culture that provides select and limited representations of bodies and their sexualities. This links directly to Mondin’s exploration of the abundance of queer and feminist pornography on dark SOS as a “counterpolitics of visibility” (288). This work resulted in a reasoned claim that the technological structure of dark SOS created a highly political and affective social space that users valued. What also needs to be underscored is that many users also believed that such a space could not be replicated on other mainstream SOS because of the differences in architecture and social norms. Cho (47) worked with this theory to claim that dark SOS are modern-day examples in a history of queer individuals having to rely on “underground economies of expression and relation”.

Discussions such as these complicate what dark SOS might now become in the face of ‘adult’ content moderation and emerging tracking technologies used to close sites or locate individuals who transgress social norms. Further, broader questions are raised about how content moderation fits with the public space conceptualisations of SOS more generally. Increasingly, “there is an app for that” where being able to identify the poster of an image or the author of an unknown text is seen as crucial. While there is presently no standard approach, models combining instance-based and profile-based features, such as support vector machines (SVM) for determining authorship attribution, are in development, with the result that potentially far less content will remain hidden in the future (Bacciu et al.).
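To give a sense of how such attribution can work, the sketch below is a minimal, hypothetical pipeline rather than the specific model described by Bacciu et al.: character n-gram frequencies, which capture habits of spelling and punctuation, feed a linear support vector machine that predicts which known author most plausibly wrote an unseen post. The toy corpus, the scikit-learn components, and the parameter choices are assumptions made for illustration.

```python
# Minimal, hypothetical authorship-attribution sketch (not Bacciu et al.'s model).
# Character n-gram TF-IDF features feed a linear Support Vector Machine (SVM)
# that predicts which known author most likely wrote an unseen text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy training corpus of (text, author) pairs; a real system would use many posts per author.
train_texts = [
    "late night thoughts on privacy and the watchers watching us",
    "privacy is dead, they said, but the watchers never sleep",
    "my cat knocked the router off the shelf again lol",
    "router died again, cat-related incident, send help",
]
train_authors = ["author_a", "author_a", "author_b", "author_b"]

# Character n-grams (2-4 characters) pick up stylistic habits such as punctuation and spelling.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LinearSVC(),
)
model.fit(train_texts, train_authors)

unseen = ["the watchers are out tonight, privacy be damned"]
print(model.predict(unseen))  # expected to lean towards 'author_a' in this toy setup
```

Scaled up to large corpora and richer profile-based features, it is routine classification of this kind that threatens to make pseudonymous writing in dark SOS rather less hidden than its authors might assume.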
4. There’s Nothing New under the Sun (Ecclesiastes 1:9)

For some, “[the] high hopes regarding the positive impact of the Internet and digital participation in civic society have faded” (Schwarzenegger, 99). My participant observation over some years in various SOS, however, finds that critical concern has always existed. Views move along the spectrum of thinking from deep scepticism (Stoll, Silicon Snake Oil) to wondrous techno-utopian promises (Negroponte, Being Digital). Indeed, concerns about the (then) new technologies of wireless broadcasting can be compared with today’s anxiety over the possible effects of the internet and SOS. Inglis (7) recalls:

here, too, were fears that humanity was tampering with some dangerous force; might wireless waves be causing thunderstorms, droughts, floods? Sterility or strokes? Such anxieties soon evaporated; but a sense of mystery might stay longer with evangelists for broadcasting than with a laity who soon took wireless for granted and settled down to enjoy the products of a process they need not understand.

As the analogy above makes clear, just as audiences came to use ‘the wireless’ and later the internet regularly, it is reasonable to argue that dark SOS will also gain widespread understanding and find greater acceptance. Dark social spaces are simply a recent development of internet connectivity and communication more broadly. The dark SOS afford a choice to be connected beyond mainstream offerings, which some users avoid for their perceived manipulation of both content and users. As part of the wider array of dark web services, the resilience of dark social spaces is reinforced by the proliferation of users as opposed to decentralised replication. Virtual Private Networks (VPNs) can be used for anonymity in parallel to Tor access, but they guarantee anonymity only for the client; a VPN cannot guarantee anonymity to the server or the internet service provider (ISP). While users may use pseudonyms rather than actual names as seen on Facebook and other SOS, users continue to bring to the virtual spaces they inhabit their off-line, ‘real’ foibles, problems, and idiosyncrasies (Chenault). To varying degrees, however, people also take their best intentions to their interactions in the dark. The hyper-efficient tools now deployed can intensify this, which is the great advantage attracting some users. On balance, however, in regard to online information access and dissemination, critical examination of what is in the public’s interest, and of whether content should be regulated or controlled versus allowing a free flow of information where users self-regulate their online behaviour, is fraught.
O’Loughlin (604) was one of the first to claim that there will be a voluntary loss of negative liberty, or ‘freedom from’ (freedom from unwanted information or influence), and an increase in positive liberty, or ‘freedom to’ (freedom to read or say anything); hence, freedom from surveillance and interference is a kind of negative liberty, consistent with both libertarianism and liberalism.

Conclusion

The early adopters of initial iterations of SOS were hopeful and liberal (utopian) in their beliefs about universality and ‘free’ spaces of open communication between like-minded others. This was a way of virtual networking using a visual motivation (led by images, text, and sounds) for consequent interaction with others (Cinque, Visual Networking). The structural transformation of the public sphere in a Habermasian sense—now also found in SOS and their darker, hidden or closed social spaces that might ensure a counterbalance to the power of those with influence—towards all having equal access to platforms for presenting their views, and doing so respectfully, remains as problematised as ever. Broadly, however, this is no more so than for mainstream SOS or for communicating in the world.
