Latin America is part of a global bandwagon moving towards “digital transformation”, which is being hailed by various private and public actors as the silver bullet for fixing structural societal problems. However, behind promises of economic development and public sector efficiency, the deployment of technology reinforces neoliberal approaches, ignores differences between people, and worsens the living conditions of traditionally marginalized groups, including women and gender-diverse populations.
Technologies based on exploiting humans as sources of data ignore differences between people and promote standardization as a form of control. Furthermore, colonial logics – disguised as techno-optimism and embedded in Global North logics – restrict the possibility of local developments creating what the World Social Forum’s slogan refers to as “another possible world”. Against such a backdrop, this essay explores the narratives behind public sector initiatives implemented by various states and private companies in Latin America. We attempt to show how feminist perspectives can help question technology and put it at the service of a paradigmatic change that furthers equality and social justice.
Furthermore, to explore the biopolitics of gender in datafied welfare systems, we look into some of the shortcomings and consequences of the datafication of Latin American welfare policies from an intersectional feminist perspective. As indicated by Derechos Digitales, the region has become an international reference in the implementation of social protection programs in the past decades. However, the increasing introduction of explicit and implicit conditionalities for access to welfare, usually associated with digital mechanisms of control, not only represents a risk to human rights, but may also intensify inequalities instead of eradicating them.
It is thus pertinent to question the power dynamics behind the dominant narratives of “digital transformation” and their promise of fixing structural societal problems in Latin American states.
Narratives from Platformized States: Mistaking Data for Bodies
Of late, with the idea of digital transformation gaining steam in Latin America, several data entry points have been created to collect information that maps traditionally marginalized groups who are insufficiently understood or prioritized in welfare policies. The problem is that, while doing so, states more often than not mistake data for bodies. Built on the same racist and patriarchal foundations that made Latin America one of the most unequal regions of the world, such initiatives risk creating new and worse forms of control if they remain silent on the biases encoded in the way data is generated, collected, and used to build the technical systems and narratives that contribute to decision making.
There are several examples in which initiatives that attempt to automate the identification of vulnerable groups have guided states’ social protection policies, sometimes at the risk of creating further vulnerability. They range from the implementation of artificial intelligence (AI) to predict and prevent school dropouts or teenage pregnancies in Salta, Argentina, to a similar program implemented in Chile that seeks to guide state action to protect children whose rights have been violated. Such initiatives are rapidly spreading to other countries as well. They not only violate the privacy and autonomy of affected people, generating wide potential for discrimination, but also expose individuals to differentiated state interventions without their consent, reinforcing previous inequalities.
As data becomes the sole means of legitimizing eligibility for social protection – as in the case of Venezuela, where biometric identification is necessary to access basic services – it facilitates the surveillance and control of historically marginalized groups and creates great risks of deepening prejudice and stigmatization. In such a context, promises of inclusion, usually limited to economic development and mostly based on data processing, do not challenge existing structures and the power imbalances that sustain them. Rather, they force marginalized groups ‘to fit in’ and be organized and controlled by the welfare state.
This approach uses data to obscure reality, rather than to represent those to whom it belongs or to challenge the injustices perpetrated on the basis of such data. It ignores the differences between people and the deteriorating quality of life of traditionally marginalized groups, including women and gender-diverse populations. Exchanging reality for data as the only truth leaves policymakers ill-equipped to comprehend the shortcomings of traditional systems and to address the needs of those who have not been included in them. Forcefully representing people by collecting their data so as to slot them into a pre-existing hierarchy of priorities is as invisibilizing as not capturing any data at all, as Díaz and Venturini point out.
Juliana Guerra reflects on the impacts of the pandemic on gender-based violence in Latin America. She concludes that data has the power to challenge reality only when its collection is aligned with the desire to capture numbers that do not necessarily fit the current structures, rather than only those that comfortably reinforce the presumptions of a given model. She gives the example of how several Latin American states, including Colombia and Chile, initially interpreted the decrease in complaints about gender-based and domestic violence at the beginning of the pandemic. A drop in the number of complaints did not reflect a drop in the number of cases; it was more likely indicative of the failure of public policies to create safe channels for victims locked in with their aggressors.
The impacts of the narratives built around data collection do not stop at the ‘objective’ reality that the data intends to depict. Following D’Ignazio and Klein’s principle of ‘elevate emotion and embodiment’ – which points to the value of multiple forms of knowledge, including the knowledge that comes from people as living, feeling bodies in the world, beyond what is traditionally considered tangible or objective – we reflect on how data collection by Latin American states usually, if not always, falls short of recognizing autonomy and consent as ways for the depicted subjects to regain agency over their representation. Such recognition could help disrupt the pre-established models of what it means to be disadvantaged and live with poverty. The pre-established knowledge of those represented in a technical system will not be debunked if, for instance, unemployed workers are classified and matched with employment opportunities in ways that do not account for their autonomy in pursuing new careers. Instead, their data will continue to be captured statically, from a standpoint that does not embrace their humanity.
This technocratic approach to technologies, which does not account for pluralism in Latin American states, is well-documented in Derechos Digitales’ recent research on AI and inclusion. In this study, Velasco and Venturini reveal how the absence of meaningful participation from those potentially affected by automated decision-making systems leads to presumed solutions that ignore the actual needs of the populations they intend to serve. They state,
“All technological deployment takes place in a space of political tensions, where the danger is to pretend that the systems considered can obviate, hide or mitigate such tensions without any mediation other than the very assumption of efficiency that sustains them. In a word: digital technocracy is not enough, neither to protect the human rights of citizens, nor, much less, to claim legitimacy for the state action.”
Finally, framing technologies to exploit humans as sources of data erodes diversity and promotes context-less standardization as a form of control. As the Data for Black Lives Movement states, “Data must be reclaimed and reimagined. For those of us living on the margins, who have found a way to thrive in a country that has made Black life dispensable and Black survival optional, data is protest, data is accountability and data is collective action…We knew that it would be irresponsible and potentially harmful to solely share the data – we also needed to shape the narrative by putting the voices of Black epidemiologists, policymakers, activists, and organizers at the center.”
Furthermore, it matters who tells the narratives built on data, and Latin American attempts at data-driven welfare policy have paid insufficient attention to this. Decision-making is usually centralized in a group of technocrats, more often than not white men, in charge of designing models for data collection, choosing the systems the data is fed into, and building the narratives those systems will serve. Ways to mitigate the reproduction of bias, and of the traditional power structures that benefit from it, have not been explored. As Ruha Benjamin reflects,
“One need not harbor any racial animus to exercise racism in this and so many other contexts; rather, when the default settings have been stipulated, simply doing one’s job—clocking in, punching out, turning the machine on and off—is enough to ensure the consistency of white domination over time. Likewise, changing individual sentiment from animus to tolerance, or even affection, will not transform the status quo so long as the underlying design of our socio-technical world is left in place.”
So far, Latin American authorities have not undertaken the self-critical exercise of assessing who is and who is not in the room when welfare policy interventions are drafted. They are not even close to acknowledging the absence of open and public consultations to define priorities, areas of concern, and potential providers. To move away from reinforcing pre-existing social structures through data, and towards challenging structural inequalities with it, we must make room for a diversity of perspectives. This demands transparency, guarantees of meaningful participation by concerned communities in policy-making processes, and policies to diversify the people in charge of tech development.
In the following sections we will further explore existing challenges when it comes to tech development in Latin America and lessons from feminist and trans activists in resisting and creating alternatives to Silicon Valley models.
Narratives from Platformized Tech Development
When states rush to adopt models of governing ‘by algorithms’ and decide to partner with global tech corporations or import foreign technologies to do so, they reproduce privatized logics of data collection and processing that reflect and reinforce the commercial interests of Big Tech, as detailed by Shoshana Zuboff, among others. This is the case, for instance, when they adopt automated or semi-automated systems for decision-making on issues such as employment, health, justice or social welfare. Furthermore, an analysis of mobile applications launched during the Covid-19 pandemic in Latin America also evidenced a trend of excessive data collection. By embracing this form of governance, states often take the data exploitation model for granted and forfeit the possibility of governing algorithms and creating alternatives for technological development and use.
However, governments are not alone in blindly adopting the Silicon Valley model. Collaborations with the region’s strong free-software movement once flourished in the tech industry; in the recent past, however, the “startup logic” has grown, as can be seen in Brazil. This is partially due to Big Tech co-opting activists and professionals. For instance, local startups in Brazil are flourishing by facilitating the intrusion of global corporations into local environments, whether through training on the adoption of specific software, such as Google’s educational tools, or by reselling surveillance and biometric systems. Others locally replicate the services of the once celebrated ‘sharing economy’ and build their platforms on the same exploitative models. Sometimes they even benefit from differentiated regulatory standards, justified by the claim that higher guarantees could hinder local ‘innovation’. The unasked question, then, is how innovative these technologies intended to transform lives really are, when what they mainly do is mint wealth for a few local and global capitalists.
While such entrepreneurship is celebrated, investments for fostering local autonomous infrastructures and capacities are lacking. Despite the post-Snowden hype, when Latin America tried to lead a global reaction to the extensive US-led global surveillance scheme, little changed. The region not only continues to depend on foreign companies, but also allows them to handle an increasing amount of strategic data, deepening such dependency and replicating a well-documented colonial relationship. A concrete example can be found in Brazil, which initially led the international resistance to Big Tech, but in the last few years has been relying on them to manage sensitive information. This has resulted in the adoption of corporate cloud and communications services at universities, which have replaced the previously installed national infrastructure.
Such initiatives represent a setback to the ideas of national development and sovereignty that mobilized a few Latin American governments in the past to create initiatives in the digital sector. With few exceptions, policymakers are accepting a dependent role in tech development, acting as privileged clients of foreign corporations. In doing so, they miss several opportunities to foster alternative models of technology that could be designed and used to overcome inequality and injustice. The idea of inclusion, as we pointed out, becomes rhetoric used to justify further exploitation through data.
However, grassroots feminist movements are at the forefront of proposing bottom-up alternatives and of deeply questioning how gender and racial relations and representations are embedded not only in technological design, but also in the norms that govern it. This is also observed by scholars DeNardis and Hackl, who reflect on how conflicts over LGBTQ+ rights manifest within the internet’s architecture and governance: “political and economic conflicts increasingly materialize within arrangements of Internet governance rather than at the surface level of content”. The ability of women and gender-diverse people to exercise their freedom of expression and information, and to assert their sexual and reproductive rights, is greatly affected by the quality and quantity of public and private regulations governing the technical infrastructures of platforms.
Sometimes these questions boil down to the very operation of the basic infrastructures that women and gender-diverse people need in order to exercise their rights. A case in point is the blocking of the abortion rights website Women on Waves in Brazil (and other countries).
Women on Waves facilitates access to abortion in countries where it is illegal. The organization provides abortion services on a ship in international waters, where women can legally receive them, and even delivers abortion pills using robots and drones. It also offers critical information about abortion-related issues, such as the use of pills like misoprostol, through its hotlines, mobile app, training manuals, and several other resources. Women on Waves is a concrete example of meaningful usage of technologies for promoting women’s autonomy.
At a higher level of the internet architecture, content rules have led to the removal of a sexual and reproductive health app called Lunar from the Google Play Store. This app was developed to “reclaim sovereignty of our own bodies […] and develop new meanings to the concepts of menstruation and menstrual health”. Lunar is a particularly relevant app in a context where menstruation apps incorporate conservative perspectives on women’s reproduction, and extensively share sensitive data with third parties for commercial purposes.
However, its vision of offering diverse perspectives on menstruation that are not limited to reproduction clashed with Google Play Store’s policies, which state that they “Don’t allow apps that contain or promote sexual content, such as pornography, or any content or services intended to be sexually gratifying. Content that contains nudity may be allowed if the primary purpose is educational, documentary, scientific or artistic, and is not gratuitous”. Despite the alleged exception for content with educational purposes, images on Lunar explaining the effects of menstruation on women’s bodies, along with other artistic representations of those bodies, were interpreted as infringing upon the Play Store’s policy.
The Lunar case reveals how the cultural taboo around menstruation – as something shameful, disgusting or uncomfortable to discuss – is encoded into technological design and architecture. It also shows the consequences of depending on Silicon Valley companies and their mentality towards women’s privacy, autonomy, freedom of expression, and social, economic and reproductive rights. The lack of institutional frameworks and public programs to guarantee access to information in Latin America makes the internet a key element for the effective exercise of women’s rights. We must therefore question how technology development can be driven in a different direction through alternative visions and sets of values rooted in feminist principles.
Final Reflections
In this brief essay we have tried to highlight how current trends in the digitization of the public sector in Latin America rely on a rhetoric of inclusion and a model of technology that are in themselves contradictory. Digitized policy initiatives in sensitive areas such as welfare, education or employment risk deepening existing inequalities and thereby generating new forms of discrimination.
By blindly embracing the Silicon Valley data exploitation model as the only way, Latin American states, mainstream media, and academia, have uncritically incorporated and validated the novel techniques of classification and control imbricated in the current narratives of development and progress. These narratives are not only based on exclusion, but also create a “temporal penitentiary in which oppressed people are locked in to a dystopic present”, as Ruha Benjamin states. Under the guise of an inclusive future, they strengthen the roots of a colonial past to create a colonial present, deepening dependency on foreign states and corporations and reproducing historical inequalities.
We have also emphasized how existing patriarchal and racist power structures are imbricated in policymaking and tech development and underscored the challenges involved in overcoming them. Existing policies for digital transformation in various countries miss the opportunity to challenge unfair models of tech for society. The contribution brought by Latin American feminist and trans activists is key in resisting the white and masculine socio-technical imaginary deeply embedded in Silicon Valley narratives and culture. Such groups have gone beyond the liberal demand for women’s participation in the tech industry, or the use of private platforms to share their views and demands. Instead, they have adopted a techno-feminist perspective to develop their own way of interacting with technologies. As academic Judy Wajcman proposes, techno-feminists expose “how the concrete practices of design and innovation lead to the absence of specific users, such as women” and dispute the imaginary of computer hackers, marked by a “macho culture” based on “winning and of subjecting oneself to increasingly violent tests”.
In a hostile digital environment that perpetuates the structural misogyny and homophobia affecting most Latin American societies, feminists are taking back the tech at various levels. They have developed manuals on how to protect themselves against surveillance and online violence and safely navigate the web to access relevant information, including on sexual and reproductive rights. They have also used the networks built in digital spaces to claim their rights. More importantly, they have built autonomous infrastructures according to feminist principles. As Goldsman and Venturini summarize, “Women and LGBTQ+ groups have adopted strategies to protect themselves online, to shed light on social networks’ flawed policies and request accountability”.
Challenges remain and keep expanding, but the good news is the growing body of well-articulated academic work that feminists are intensively promoting in the region. That said, we need to continue working towards incorporating a feminist, emancipatory perspective into the technologies deployed by Latin American states and into narratives of technology currently grounded in private innovation. It is imperative that narratives of technology deployment shift to embrace the value that feminist principles and imaginaries can bring to building alternative futures based on autonomy, justice, and equality for all in the Global South.
This is the eleventh piece from our special issue on Feminist Re-imagining of Platform Planet.