How do we keep Big Tech in check?
That was the key question during the second Digital Participation Symposium at the House of Connections on 27 November. In break-out sessions, participants explored possible solutions, reaching an early consensus on the need to prioritize ‘the middle’ and to impose stricter regulations.
Today, the vast majority of public debate in Europe takes place on free apps like Facebook, WhatsApp, Instagram, X and TikTok. These apps are offered by a small number of tech giants, namely Meta, X Corp and ByteDance. In his speech before the symposium, co-organiser David Langley, professor of Digital Transformation and Strategy, pointed out this dominance of Big Tech companies.
According to Langley, the widely used apps are ‘polarisation machines’ that put great pressure on democratic processes and institutions. Regulating traditional media, such as newspapers and television, was relatively easy. But to date, policymakers have failed to contain social media, Langley says. As a result, he adds, large amounts of ‘nonsense’ have been poured over citizens in recent years.
Led by three experts, symposium participants explored solutions to this troublesome dominance. The experts each gave a short pitch to introduce their topic. First, Lisa Gaufman, assistant professor of Russian Discourse and Politics, took the floor. In her pitch, she placed misinformation at the centre. According to Gaufman, fake news is nothing new. As an example, she showed the old, famous picture of ‘the monster’ of Loch Ness, and compared it to a photoshopped photo of the pope wearing a thick white coat.
Although fake news has always existed, social media have caused an exponential increase in its supply. Popular platforms serve their users the content most likely to hold their attention, creating online bubbles. To combat misinformation, European Union policymakers came up with ‘fact-checking’. But this method has had little success to date, Gaufman says. In fact, ‘correcting’ a group narrative proves almost impossible, and fact-checkers are often distrusted because they are paid by governments.
The seemingly unbridgeable gap this has created between groups in society was the starting point of the break-out session led by Gaufman. How do we discuss controversial topics with people who hold diametrically opposed views, such as conspiracy theorists? Attendees’ strategies proved diverse. Some entered the debate and took up ‘the fight’, while others used ‘salami tactics’, breaking delicate topics into smaller pieces. Still others observed or asked questions. Creating an emotional connection was also suggested: ‘Research shows that this helps to release people from a cult-like situation.’
Several apps were also explored during the discussion. One notable example is a political ‘anti-Tinder’, designed to match users with individuals outside their usual social circles to initiate conversations. Could this be part of the solution? Ultimately, it is about having a moderate discussion and searching for ‘the middle’, one participant felt. In an era where social media tend to amplify differences, the focus should shift towards identifying common ground. ‘We are being destroyed between two poles, whereas we should be feeding the middle,’ concluded one attendee.
Back in the plenary session, the results of the break-out discussions were presented. Led by Jeanne Mifsud Bonnici, professor of European Technology Law and Human Rights, participants then addressed the question of which institutions are responsible for regulating the public sphere. A dilemma emerged: censorship clashes with the principle of freedom of expression, yet institutions also have a responsibility to safeguard society against the perils of misinformation.
Given its impact on our daily lives, we cannot let digital discourse depend on the market alone, the attendees felt. Yet how can institutions intervene, given that trust in those same institutions is waning? Education might have an important role to play, suggested one participant. Technical solutions were also put forward, such as mandatorily resetting recommendation algorithms, giving users a ‘clean slate’ on their timeline every year. In the end, it all comes down to trust, the attendees concluded. That is why the design of digital services needs to provide trustworthy checks and balances, so that even untrustworthy parties, both governments and Big Tech, cannot misuse their power. So far, those checks and balances are not in place.
In the break-out session led by Cleo Silvestri, assistant professor of Innovation Management and Strategy, the public good was discussed. Many technologies, such as AI, can benefit humanity. Silvestri likened the situation to a scale, with the public good on one side and profit maximization on the other. The consensus among symposium participants was that, for Big Tech, shareholder interests currently tip the scale towards profit maximization. Having a non-profit foundation oversee for-profit digital firms might work. Yet OpenAI shows how difficult this is: in November, its board focused on the public good was dismissed and profit became the number one priority.
This leads to the question: how can we encourage Big Tech giants to commit to the public good? It became clear that market players are failing to do so themselves. According to participants, policymakers should therefore intervene more strictly, for instance by fining companies that damage the public good. Another suggestion: ensure that public organizations use only ethical platforms for their own citizen participation activities, for example Signal instead of WhatsApp.
The final part of the symposium, which opened with an on-the-spot digital participation experiment, focused on the use of digital technologies to support democracy. Annet Witvoet and Sirin Yildiz from the municipality of Groningen presented several initiatives from the region. Among other things, they described the online platform ‘Stem van Groningen’, where citizens can participate and vote on issues in their neighbourhood. As an example, Witvoet and Yildiz mentioned an initiative from the Lewenborg neighbourhood that resulted in a successful film house.
However, deploying digital technologies brings challenges, acknowledged Witvoet and Yildiz. For example, how do you weigh digital votes against offline ones? When is an online vote successful, and what about fraud? A panel of three UG experts (Davide Grossi, Oskar Gstrein and Alexander Smit) took a closer look at these questions. Smit researches digital literacy. According to him, many citizens are not skilled enough to participate in digital democracy, which can lead to exclusion. One possible solution is to diversify the options to participate, for example by offering alternative platforms for online voting.
Oskar Gstrein pointed to the relationship between citizens and Big Tech companies. Do we want to use platforms from Google and Meta for our democracy? The programmes often work smoothly, but what about the underlying values? ‘The public’s and companies’ values can be very different.’ To maintain citizens’ trust in digital democracy, it is essential to involve them in every step of the decision-making process and the design of the platform, the attendees concluded.
The Digital Participation Symposium at the House of Connections was organised by the Public Participation Centre and the Wubbo Ockels School for Energy and Climate in collaboration with the New Energy Coalition.