Whose data is it anyway?
- Tebogo Keitheile
- Jan 31
- 26 min read

Copyright © 2026
Print ISSN: 2960-1541
Online ISSN: 2960-155X
Inclusive Society Institute
PO Box 12609
Mill Street
Cape Town, 8000, South Africa
235-515 NPO

All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means without permission in writing from the Inclusive Society Institute.

DISCLAIMER
Views expressed in this report do not necessarily represent the views of the Inclusive Society Institute or those of its respective Board or Council members.

JANUARY 2026

Image credit: AI-generated illustration produced with OpenAI (DALL·E), 2026.
Abstract
This article explores data privacy and data protection rights and obligations through the lens of social contracts, human rights and ethics, whether in commercial endeavours or public service provision. While not a new human asset, digital data sits at the centre of sustainable development: it allows accurate statistical oversight when setting developmental goals, monitoring progress or assessing impact after the fact. Data availability is accordingly an important joint goal that benefits individuals, corporations and governments alike. Given the slow pace of regulation across jurisdictions, viewing data protection through the lens of Ubuntu and similar social contract frameworks allows a human-centric approach that goes beyond the letter of the law, mitigating inefficiencies and limiting risks to data subjects, controllers and the broader data ecosystem as a whole.
Keywords: Data privacy, Data protection, Social contract, Human rights, Ubuntu ethics, Digital governance, Big Data, Consent and choice, Artificial intelligence, Fiduciary duty
Introduction
“The social contract emerges from the interaction between a) expectations that a given society has of a given state; b) state capacity to provide services, including security, and to secure revenue from its population and territory to provide these services (in part a function of economic resources); and c) élite will to direct state resources and capacity to fulfil social expectations. It is crucially mediated by d) the existence of political processes through which the bargain between state and society is struck, reinforced and institutionalised. Finally, e) legitimacy plays a complex additional role in shaping expectations and facilitating political processes. Legitimacy is also produced and replenished, or, conversely, eroded by the interaction among the other four factors ... Taken together, the interaction among these factors forms a dynamic agreement between state and society on their mutual roles and responsibilities – a social contract” (OECD, 2008).
When the European Union’s General Data Protection Regulation (Regulation (EU) 2016/679) (GDPR) came into effect in 2018, African regional bodies had already recognised the growing imperative, given the increasing reliance on computers and the internet, to enact standards protecting personal data in their member countries. This was demonstrated by the 2010 ECOWAS Act (Supplementary Act A/SA.1/01/10 on Personal Data Protection) governing the West African region, SADC’s 2013 Model Law on Data Protection governing southern African countries, and the African Union’s 2014 Convention on Cyber Security. In terms of benchmarking, however, the EU instrument served as a reference point, allowing implementation of standards complementary to what many considered the gold standard in data protection law. Spurred in part by the extraterritoriality of claims under the GDPR, its sizeable penalties and the limitations it placed on data transfers to less compliant jurisdictions, recent years have seen the regulation inspire similar data protection progress across the continents, with countries seeking to ensure continuity of data transfers and, as a result, to secure commercial, governmental, personal and other internet access to markets that might otherwise become increasingly restricted by regulatory disparities.
The COVID-19 pandemic is often identified as an unofficial marker (particularly in less developed economies) of when digital data truly moved from being a mere fact of life to a genuine societal and economic commodity, due in large part to the increase in internet reliance during the global shutdown. By pushing communications and media into largely internet-based spaces (web-based meetings, greater reliance on social media, the explosion of the online marketplace, reliance on health tracking data for pandemic response, etc.), the shutdown catapulted reliance on internet interactions and, with it, the use of personal data by corporations and governments alike for their various objectives. As a result, between 2019 and 2021, internet use increased by 23% in Africa, by 24% in the Asia-Pacific region, and by 20% in the least developed countries. Globally, connectivity rose from 54% to 63% of the world’s population, leaving 2.9 billion people unconnected as at 2021, of whom 96% lived in developing countries (ITU, 2021). By 2024, 68% of the global population, or 5.5 billion people, was online, with the correlation between internet access and economic development still stark: 93% of the population of high-income nations was online, compared with just 27% internet penetration in low-income nations, again concentrated in Africa and Asia, with Africa having the lowest regional access rate (ITU, 2021).

Figure 1: Percentage of individuals using the Internet by region, 2019 and 2024 (ITU, 2021)
As increased internet access becomes the norm, it is also increasingly a corporate and national policy objective, as well as a distinct objective set, for various reasons, by regional intergovernmental organisations such as the AU and UN, their sub-regional organisations, and regional regulators. However, given the many risks posed by growing digital presences and the limited binding regulations implemented to date, particularly in Africa, it is ever more urgent to explore the social contract underpinning the relationship between data subjects and the various entities with which they share their data, including the means by which and reasons for which they do so, in order to determine how data privacy and protection approaches and governance models should be viewed to ensure greater protection of personal data rights. It is through this assessment that certain questions become increasingly critical: to whom does the data created through human interaction with the internet ultimately belong; what are the parameters of its continued use, extrapolation and referencing, whether in serving the data subject or in pursuit of broader national interests or economic objectives; and where does it all end?
As human beings increasingly interact with one another, their governments, corporations and other entities through digital means, we directly create ever greater and more complex digital identities. To these direct digital disclosures and connections is added the data shared indirectly through the ever-more-constant data collection spurred by the Internet of Things (IoT) and the pervasive reliance on digital connectivity: information about our preferences and habits; data from our cell phones, cars, TVs and health watches; CCTV footage (including Safe City initiatives); unauthorised recordings by other internet users in social media videos; data leaks and breaches; data from previously non-digital household appliances such as fridges, toasters and washing machines; and, more and more commonly, personal data shared between data collecting organisations.
In this data-driven bartering system, where data subjects trade their data in exchange for access to digital solutions, lies the mutual understanding that data controllers will not only, at minimum, handle the data with the requisite care, but will further guard it from unauthorised access by ensuring appropriate security measures are in place. It is from this broad mutual understanding between data subjects and the entities that collect and derive value from their personal data that the data privacy principles are upheld: that data collection will be limited to what is gathered lawfully, fairly or consensually; that data quality will be relevant to the collection objective and as accurate as possible; that both parties are clear on why the data is being collected; that confidentiality or limited use of the data continues unless required by law or a greater good; that unauthorised access is prevented through appropriate security safeguards; that openness and transparency exist between the parties; that individual participation allows the data subject to retain some control over their personal data; and that both parties remain accountable for their obligations to one another regarding the personal data (OECD, 2002).
The value of data to corporations and governments alike has increased exponentially in the last two decades, transforming data collection from direct means into ever-expanding sharing arrangements such as data brokerage and data sharing ecosystems. Per McKinsey & Company, data creates tangible, real-world commercial value for organisations, with 40% more revenue generated from personalised services and over $1 trillion in additional value among top-quartile performers (McKinsey, 2021). From a public authority perspective, personal data supports governments in delivering their agreed policies and services to citizens, as well as helping them attain data-driven goals such as e-government and smart cities (Löfgren & Webster, 2020). From a civil society perspective, access to data supports transparency, allowing oversight of national statistics and independent means of measuring real-world issues affecting the citizenry, such as government policy implementation, whether on service delivery or other human rights obligations (World Bank, 2021).
Data Privacy as a Human Right
While, like other mammals, humans are a social species, we still need privacy. Alan Westin points out that in our daily endeavours we seek out both privacy and community: we seek company from others while at times setting limits on proximity through boundaries, creating family, friendship or other community sub-units that exclude other community members in some respect, self-identifying with these various groups while at the same time distinguishing ourselves and others from them in some way (Westin, 1967). He highlights that from the time of primitive societies, communing with the spirits or gods required one to withdraw from the perception of fellow humans. Westin further highlights the human need, in more modern democratic societies, for privacy as groups away from governmental perception, allowing open and unhindered exchange of ideas and association, including on topics that deviate from dominant opinion, as demonstrated through the privacy of voting and protection from unlawful intrusions on one’s privacy even by state organs such as the police.
The overall right to privacy gained one of its earlier forms of legal recognition through Article 12 of the United Nations’ 1948 Universal Declaration of Human Rights (UN, 1948). Later, owing to the need to address the growing digitisation of human lives and experiences, and with them their data, coupled with the increasing reliance on computing and the risk that interruptions in data flows posed to the global economy given the then disparity of legislation on the topic globally, the Organisation for Economic Co-operation and Development (OECD) in 1980 devised its Recommendation on the Protection of Privacy and Transborder Flows of Personal Data, which was further updated in 2013 (OECD, 1980; 2013). The organisation noted the futility of regulating data privacy at a purely national level, given the rate at which data was transferred between nations as a function of the increased interconnectedness brought by the internet.
From an African perspective, while some scholars have argued that the concept of “Ubuntu” (or “Botho” in Setswana) excludes a personal right to privacy, the argument falters when one considers that Ubuntu, as a function of mutual respect and empathy, cannot reject as a material feature the duty of confidence that one owes to one’s chosen sub-community groups, much in the same way highlighted above by Westin. Accordingly, Ubuntu, including within more formal settings, is itself a social contract creating a variety of obligations amongst the members of the community, directly between the human individuals on either end, and thus appropriately underpins such wider subjects as business ethics and human rights (whether as a function of trust or mutual obligation) in communities for whom it is central. It is thus unfortunate that the 1981 African Charter on Human and Peoples’ Rights did not specifically mention privacy as a distinct right in its final draft, leaving debate rife on the subject. Notwithstanding this, even prior to the advent of modern data privacy laws, at least 25 African countries[1] contained the right to privacy within their constitutions, limited, where so done, primarily for national security reasons and always under the standard of reasonableness and proportionality (Mavedzenge, 2020).
Since these foundational moments in the international legal recognition of personal data privacy, further variables continue to add complexity through advancements in algorithms, machine learning and artificial intelligence (AI), the Internet of Things (IoT) and social network systems (Sutherland, 2021), as more and more of our lives come to require the internet (ITU, 2021) and, correspondingly, more and more of our personal data enters and becomes essential to the functioning of the broader digital ecosystem. In its 2015 General Assembly sitting, the United Nations noted the need for data in gauging implementation success and ultimately evaluating the real-life outcomes of the goals set through the 2030 Agenda for Sustainable Development (2030 Agenda), stating:
“Quality, accessible, timely and reliable disaggregated data will be needed to help with the measurement of progress (SDGs) and to ensure that no one is left behind. Such data is key to decision-making. UN member states committed to, amongst others ...”
Big Data, which has many definitions, can be characterised as everything from “call logs, mobile-banking transactions, online user-generated content such as blog posts and Tweets, online searches, satellite images, etc.” to GPS locations, social media interactions, purchase history and many other forms of information about ourselves that we share, consciously and unconsciously, and leave a trail of while engaging (directly or indirectly) with a device or service that requires the internet (UN Global Pulse, 2012). Considering the age of the internet, the number of users, and the type and amount of data collected at any given moment, Big Data is so large, so varied and evolving at such speed that it cannot be processed by ordinary data processing tools, hence BIG data. Commercially, Big Data is credited with helping corporations make more accurate and precise decisions, providing better and more detailed insights about internet users and delivering more acutely personalised customer experiences, whereas from a public service provision perspective, Big Data can support policymaking, developmental target creation and monitoring, and accurate statistical collection and interpretation (UNDP, 2017). From a development perspective, the UN states that Big Data can support nations in achieving their development goals in a myriad of ways, including efficient traffic control using GPS data, spending data used to determine poverty levels and habits, tracking deforestation by combining satellite imagery and crowdsourced data, and disaster management through social media monitoring (UN Global Pulse, 2012).
In addition to the role it plays in attaining sustainable development goals, internet access as a whole has a direct impact on economic growth in developing and least-developed countries, due to increased access not only to information (locally and internationally) but also to potential export markets and innovative technologies. Advancements in internet access thus benefit citizens through improved financial inclusion (greater access to alternative financial systems such as mobile money or internet-enabled financial solutions); improved social services through functional, efficient and reliable e-government solutions and digitised civil society access; improved healthcare administration (whether by public or private providers) through data availability for early warnings and e-health solutions; improved education through distance and online education, access to scholarships and general access to information by the broader population; and, in agriculture, support for critical processes such as supply chain management, price transparency and climate data (Guerriero, 2015).
Various studies conducted in Sub-Saharan African countries have demonstrated a positive correlation between basic internet connectivity and a reduction in poverty, and ultimately an improvement in economic growth, as a result of improved market access, including e-commerce (Hjort, 2025). The World Bank has highlighted that while data (particularly Big Data) is an essential enabler not only for achieving the UN’s SDGs but for economic growth as a whole, developing countries continue to be plagued by delays in general access to the internet and limited opportunities not only to submit their data but to benefit from it, due to obstacles such as limited infrastructure (World Bank, 2021).
To bridge this infrastructure deficit, the African Union (AU) and African Development Bank (AfDB) have identified digital transformation of the continent as a major priority underpinning the objective of propelling African industrialisation (AU, 2020), devising, respectively, a continent-wide Digital Transformation Strategy for 2020 to 2030 (AU-DTS) and a Digital Transformation Action Plan for 2024 to 2028 (AfDB-DTAP). The AfDB-DTAP aims to contribute to the international digital economy and thus support the African Continental Free Trade Area (AfCFTA). It is premised on the overall pillars of improved digital infrastructure to bridge the digital gap; investment in digital entrepreneurs and innovation (creating end-to-end tech ecosystems in African countries); integration of digital and other emerging technologies into various sectors, including government, agriculture, education, commerce and sustainable energy (AfDB, 2024); and the creation of an enabling environment for policy and regulatory advancements in line with the ultimate objectives (AU, 2020).
Over and above its pivotal role in economic development, commerce and communication, access to the internet plays an increasingly essential role in maintaining democratic institutions, by virtue of its function as one of the increasingly central means through which citizens exercise their right to freedom of expression. However, in the transition from in-person, pen-and-paper, analogue economies and personal lives lived primarily outside the digital universe to the modern age of digital ubiquity and IoT permanence, data subjects who wish to maintain relatively whole and efficient lives outside the digital universe find themselves with fewer and fewer alternatives, as analogue and manual service provision is continually neglected and phased out by corporations and governments alike. As an extension of the right to opt out or be forgotten, corporations and governments have an obligation to allow alternative service provision outside of digital and internet-connected means, allowing space for a right to use (or limit one’s presence on) the internet. By having a right to be forgotten, data subjects likewise have a right not to be known digitally at all where possible and, to whatever extent is reasonable, to opt out of a digital presence without compromising their overall quality of life (Terzis, 2025); this right can only be exercised where access (to the economy, to other communication, to essential services or to the world as a whole) is not contingent on waiving other rights.
Currently, with the limitation of physical processes and non-digital solutions, anyone choosing to live outside the digital realm, for whatever reason, likewise forgoes a degree of access (Kloza, 2021). Accordingly, it can reasonably be argued that by limiting life outside of digital solutions, the right to be forgotten or opt out is compromised; the right to analogue alternatives is therefore not only functionally necessary to data privacy, but further critical in guarding against digital discrimination, where opting out means losing out. This need, together with the growing risk that the unavailability of non-digital alternatives consolidates all internet-borne functions into a single potential point of failure, creates an obligation on governments to ensure that basic services remain accessible outside of internet-reliant means.
Data in Commerce
The internet began its transformation into the more commercial form we now know between the late 1980s and mid-1990s, shifting its primary function from an instrument of communication by creating wholly novel markets to which it could bring value. This transformation was achieved through innovations such as the introduction of HTTP and HTML, as well as the opening up of web access through the decommissioning of the US National Science Foundation’s (NSF) NSFNET under the National Information Infrastructure Act of 1993, which introduced privately owned internet infrastructure and funded the first freely available web browser, Mosaic (NSF, N.d.). The arrival of internet service providers (ISPs) widened the internet’s reach to private citizens, transforming the almost exclusively academic and military monopoly that had existed before then, allowing businesses and their customers to interact in novel ways beyond physical and printed marketing and transactions, and creating new digital spaces for individual users to interact socially. In the period since, its value to users as a tool of international, real-time communication has been surpassed only by its value to commerce (McKinsey, 2021).
The modern world has turned everyday experiences, appliances and devices into data collection opportunities for countless, often faceless and at times interconnected corporate entities. From Siri and Alexa, to smart TVs, fridges and printers perpetually connected to Wi-Fi and capable of voice commands, to health-monitoring wearable tech and location sharing, the digital age is making it increasingly impossible to live outside the ever-expanding Internet of Things, all the while underpinned by assurances from data controllers that our most sensitive personal data is safe in their hands (Balkin, 2020). These corporations, comparatively better resourced than the average human internet user, often rely on user consent in their data collection exercises, backed by at times difficult-to-read policies that the average user must agree to in order to benefit from the services. As a result, the average internet user is expected to have read, understood and agreed to be bound by a data privacy policy for each stand-alone service that requires their personal data before deriving reciprocal value from that service, irrespective of the user’s background, level of education, or the sheer number of internet services (and thus policies to review and consent to) (Ibdah, 2021).
In Botswana, as a regional example, prior to the Data Protection Act of 2025, data protection rights were espoused primarily, particularly in a commercial setting, in legislation such as the Constitution of the Republic of Botswana [Cap 00:00], the Financial Intelligence Act, 2019, the Banking Act [Cap 46:04] and the common law (Bookbinder Law, 2021). These laws, at least on the face thereof, sought to regulate privacy primarily through reference to the inherent fiduciary duty of client confidentiality, particularly in financial and business affairs. Notwithstanding the Data Protection Act subsequently coming into effect, these legal instruments and the obligations they introduced still stand, resulting in parallel (complementary) legal obligations. Similarly, across comparable jurisdictions, privacy between corporations and their customers (or patients) precedes data protection legislation, through instruments relating to elements such as cybersecurity, medical confidentiality and commercial confidentiality.
Retaining the view of data privacy and data protection as a fiduciary duty (at least where personal data is utilised in commercial endeavours), and thus as an ethical obligation to the data subject, places greater obligation not only on the corporate controller entity but also directly on management and ultimately the board to ensure that data subjects’ rights are considered and prioritised beyond a matter of pure regulatory compliance, thus aligning it with cybersecurity and other digital fiduciary obligations. This further ensures alignment with King IV and ultimately King V (as well as the Pula Code [BAOA, 2020]) by addressing the required oversight over technology and information governance and the obligation upon the governing body to ensure ethical use of information, deletion of obsolete information, management of third-party and service provider risks relating to information, integration of information risk into the organisation’s overall risk management processes, and policy creation (Institute of Directors in Southern Africa, 2016).
Defining data protection as a fiduciary duty also mitigates, and reasonably so, the pervasive reliance on data subject consent over other lawful reasons for processing personal data. Being the weaker party, with no realistic means to verify the controller’s claims, and being the party whose consent was given on the back of policies they at times do not fully grasp (Ibdah, 2021), at times without legal advice, and under the threat of losing access to any service for which they do not give this consent (thus making the consent not truly, wholly freely given), data subjects should at minimum be able to rely on the controller’s “duties of care, confidentiality and loyalty” as emanating from the business relationship, the controller’s position of trust towards the user, and the data subject’s inherently vulnerable position (Balkin, 2020).
From a governance perspective, between the size of the potential fines often carried by data protection laws (particularly those modelled after the GDPR), the extraterritoriality of privacy claims, the cascading risks brought by transfers to corporations with poorer compliance records and to jurisdictions with weaker laws or regulators, and even the trust and goodwill that underpin data protection undertakings and the giving of consent, it increasingly falls to the board to fully embrace its obligations as data privacy fiduciary, obligations which exist outside of strictly legal compliance, particularly given the operational and reputational risks posed by data breaches (McKee, 2024). It is not hard to see that this umbrella of concerns motivated the increased emphasis on data governance in King V, whose recently released executive summary emphasises, more distinctly than previous versions, the critical nature of information governance and the protection of stakeholder interests (of which the data subject is now a prominent feature, given the increasingly material need for good information governance) as a function of Ubuntu and overall responsible governance practices over and above pure shareholder concerns (Institute of Directors in Southern Africa, 2025).
The overreliance on consent as a basis for processing personal data, and the shifting of onus away from well-resourced corporate entities and onto the uninformed and ill-resourced individual data subject, is further exacerbated by the widespread use of Artificial Intelligence (AI), not only in standard operating software but also in carrying out tasks relating to personal data across industries from healthcare and marketing to insurance, amongst many others. AI systems not only collect and process Big Data at a scale previously not possible but, through internet scraping, cross-referencing, extrapolation and perpetual processing, leave foundational data privacy principles uncertain, including purpose limitation, data minimisation, transparency, accuracy and even the ability of data subjects to receive comprehensive reports of the data held by controllers or to wholly withdraw consent (and require that all their personal data be deleted) (Trott, 2024). In addition to AI’s incredible capacity to process large amounts of personal data, AI remains largely hard for the average citizen to understand. Accordingly, how it does what it does (including how it further processes the data it has been given to make inferences about the data subject, particularly when profiling user habits and preferences for marketing purposes), the bounds of its abilities, and where it sources the information it uses to perform its functions are largely not fully grasped by average data subjects, even when they consent to the incorporation of AI technologies in the processing of their personal data (Trott, 2024).
Due to increased general reliance on AI, as evidenced by its introduction into numerous websites such as Google and its integration as a standard feature in software by corporations such as Microsoft, the speed with which it continues to advance, its potential to support economic growth, increased innovation and employment creation, and its role in generally keeping African countries somewhat on par with the rest of the world, the AU Executive Council endorsed a Continental Artificial Intelligence Strategy in July 2024 (the “AU AI Strategy”) (AU, 2024).
As of December 2024, however, no African country had introduced a binding instrument relating to AI, whether governing developers in-country or the roll-out of the technology to citizens within its borders. At the end of 2024, only Algeria, Benin, Côte d'Ivoire, Egypt, Ethiopia, Ghana, Kenya, Lesotho, Libya, Mauritius, Mauritania, Nigeria, Rwanda and Senegal had implemented a national AI strategy or policy, with others such as Botswana, South Africa, Tanzania and Uganda having initiated stakeholder consultations and/or published a draft document (Alayande, 2025); that number still represents less than half of the 54 countries on the continent. Comparatively, 36 African countries have implemented data protection laws (of which 25 speak in some regard to AI), with Ethiopia, Namibia and Malawi’s laws still at draft stage (ALT Advisory, 2024). Given the slow pace at which AI and other regulations are being rolled out, not only in Africa but internationally, the onus falls on the corporations involved to ensure that their own values and cultures determine the approach they take to new technologies and the use of personal data, as a function of their business and professional ethics and of corporate governance, including under the subject of Environmental, Social and Governance (ESG). In its 2025 report, the OECD commented on this, lamenting:
“Given the pace of change, governments and regulators themselves are continually trailing technological and scientific progress and urgently need to strengthen their capacities for horizon scanning and regulatory foresight. This will build knowledge to better anticipate emerging and future challenges and avoid harms playing out due to regulatory vacuums or institutional inertia, or having burdensome legacy regulations. In addition to increasing institutional foresight capacity, regulators on the frontline will need equipping with sufficient powers and resources to act on their insights. At times when the last resort of sanctions is reached, these may pale in comparison to the size and cross-border nature of regulated entities, calling into question the very efficacy of enforcement regimes” (OECD, 2025).
It is of further concern that in a majority of instances governments obtain data through private corporate data collectors. A growing percentage of the data that is useful to a government, whether for policy purposes (accurate information, statistics and other data forming the Big Data used for decision-making) or national security purposes (political opinions, movement, etc.), is no longer sourced directly by the government itself. Private entities such as Meta, TikTok and Google – through location tracking, voice commands, online behaviour monitoring and similar applications – hold the majority of the information that would prove useful to governments in their policymaking or surveillance efforts. However, while we may know (at least on paper) how these private entities intend to store and use our data, “national security risk” and “public safety” remain blanket exceptions that can easily be manipulated and adapted to meet whatever a current regime's objectives are (official and otherwise). Accordingly, the digital nature of current data collection, storage and use, along with the now critical role of personal data in society, poses particular challenges (Darwin & Wa Nkongolo, 2023). As a result, governments, in their regulatory efforts, need to ensure that the rights enjoyed in the physical world are, as far as possible, mirrored in online environments, particularly as online environments have increasingly significant real-world effects.
In addition to these challenges, the right to a legal identity, included in the 2030 Sustainable Development Goals, underpins the right to access basic services. Being able to produce a legal identity document is, for example, a key requirement for participating in the formal commercial economy in many jurisdictions as a function of anti-money laundering legislation, premised largely on the need to formally identify counterparties (Know Your Counterparty). With the digitisation of commerce and of a large share of human interaction, a digital version of legal identity has come to be seen as increasingly urgent. Justifications for this growing need include limiting false identities online, ensuring regulatory compliance, reducing bottlenecks in service provision through a reliable means of identification, and allowing a uniform approach to identity submission and verification online (World Economic Forum, 2016). Critics of the roll-out of digital identities, however, point to the privacy risks posed, particularly the reliance on biometrics for functionality and, most concerningly, the potential for abuse, whether through limiting access to basic services in certain scenarios or through outright weaponisation and surveillance.
Conclusion
“Everything is solidly anchored within a pedagogic space. A painting ‘shows’ a drawing that ‘shows’ the form of a pipe; a text written by a zealous instructor ‘shows’ that a pipe is really what is meant. We do not see the teacher's pointer, but it rules throughout – precisely like his voice, in the act of articulating very clearly, ‘This is a pipe’. From painting to image, from image to text, from text to voice, a sort of imaginary pointer indicates, shows, fixes, locates, imposes a system of references, tries to stabilize a unique space. But why have we introduced the teacher's voice? Because scarcely has he stated, ‘This is a pipe’, before he must correct himself and stutter, ‘This is not a pipe, but a drawing of a pipe’, ‘This is not a pipe but a sentence saying that this is not a pipe’, ‘The sentence “this is not a pipe” is not a pipe’, ‘In the sentence “this is not a pipe”, this is not a pipe: the painting, written sentence, drawing of a pipe – all this is not a pipe’” (Foucault, 1983).
While regulators and legislators ostensibly wish to create a more open and safe internet, such efforts will fall short without a human-centric premise – whether Ubuntu, good governance or another social contract ethic – underpinning their foundations. Whether approached from a governmental or a commercial perspective, the solution must at all times be human-centric and centred on protecting critical social contracts such as Ubuntu and the duty of care we owe one another as members of the broader human community. Corporations, governments and citizens alike ultimately stand to benefit greatly from the use of data to propel humanity forward, whether by closing gaps in development and ending inequalities currently exacerbated by lack of access, or through innovation and commercial success. For this to be achieved, mutual trust in the collection, use and sharing of data must be built and protected.
A growing body of literature points to the need for self-regulation by corporations and governments alike. Corporations must determine their position vis-à-vis their clients, employees, corporate counterparts and other stakeholders, devising rules of conduct premised on good governance and implementing policies based not merely on what is legal but on what their duty of care requires of them. Likewise, governments – as the counterparty to the primary social contract, whose role is to regulate human relations and protect the greater community from abuse, whether external or internal through inequality and discrimination – have an obligation to protect this relationship with their citizens by playing an active part in ensuring that regulations are appropriate, responsive and enforceable, and by ensuring that future administrations inherit checks and balances that ultimately secure the success of the citizenry.
References
AfDB. 2024. African Development Bank Group Digital Transformation Action Plan 2024 - 2028. [Online] Available at: https://vcda.afdb.org/en/system/files/report/DTAP_Detailed%20Version.pdf [accessed 16 October 2025].
African Union (AU). 2020. The Digital Transformation Strategy for Africa (2020-2030). [Online] Available at: https://au.int/sites/default/files/documents/38507-doc-dts-english.pdf [accessed 16 October 2025].
African Union (AU). 2024. Continental Artificial Intelligence Strategy. [Online] Available at: https://au.int/sites/default/files/documents/44004-doc-EN-_Continental_AI_Strategy_July_2024.pdf [accessed 16 October 2025].
Alafaa, P. 2022. Data Privacy and Data Protection: The Right of User’s and the Responsibility of Companies in the Digital World. [Online] Available at: https://ssrn.com/abstract=4005750 [accessed 16 October 2025].
Alayande, S. E. 2025. African Countries Are Racing to Create AI Strategies - But Are They Putting the Cart Before the Horse? [Online] Available at: https://www.globalcentre.ai/research/african-countries-are-racing-to-create-ai-strategies-but-are-they-putting-the-cart-before-the-horse [accessed 16 October 2025].
ALT Advisory. 2024. Which African data protection laws regulate AI? [Online] Available at: https://dataprotection.africa/ai-and-data-protection-regulation/ [accessed 16 October 2025].
Balkin, J. M. 2020. The Fiduciary Model of Privacy. Harvard Law Review Forum, 134(1), Yale Law & Economics Research Paper Forthcoming. [Online] Available at: https://ssrn.com/abstract=3700087 [accessed 16 October 2025].
BAOA. 2020. [Online] Available at: https://www.baoa.org.bw/standards-setting-adopted-standards/ [accessed 16 October 2025].
Bookbinder Law. 2021. Data Protection in Botswana. [Online] Available at: https://bookbinderlaw.co.bw/data-protection-in-botswana/ [accessed 14 October 2025].
Darwin, V. & Wa Nkongolo, M. 2023. Data Protection for Data Privacy-A South African Problem? https://doi.org/10.48550/arXiv.2306.09934.
Davis, T. & Trott, W. 2024. The regulation of artificial intelligence through data protection laws: Insights from South Africa. African Journal on Privacy & Data Protection, 207-219.
EU. 2016. General Data Protection Regulations - Lawful Processing. [Online] Available at: https://gdpr-info.eu/art-6-gdpr/ [accessed 16 October 2025].
Foucault, M. 1983. This is Not a Pipe. University of California Press. [Online] Available at: https://monoskop.org/images/9/99/Foucault_Michel_This_Is_Not_a_Pipe.pdf [accessed 16 October 2025].
Galal, S. 2024. Internet usage in Africa - statistics & facts. [Online] Available at: https://www.statista.com/topics/9813/internet-usage-in-africa/ [accessed 16 October 2025].
Guerriero, M. 2015. The impact of Internet connectivity on economic development in Sub-Saharan Africa (p19-27). [Online] Available at: https://assets.publishing.service.gov.uk/media/57a0899b40f0b652dd0002f4/The-impact-of-internet-connectivity-on-economic-development-in-Sub-Saharan-Africa.pdf [accessed 16 October 2025].
Ibdah, D., Lachtar, N., Raparthi, S.M. & Bacha, A. 2021. “Why Should I Read the Privacy Policy, I Just Need the Service”: A Study on Attitudes and Perceptions Toward Privacy Policies. IEEE Access, 9, 166465-166487. https://doi.org/10.1109/ACCESS.2021.3130086.
Institute of Directors in Southern Africa. 2016. King IV. [Online] Available at: https://www.iodsa.co.za/page/king-iv [accessed 16 October 2025].
Institute of Directors in Southern Africa. 2025. King V Code on Corporate Governance for South Africa Executive Summary of Fundamental Concepts. [Online] Available at: https://cdn.ymaws.com/www.iodsa.co.za/resource/collection/7DAE15BF-07FA-4922-879E-6788368F5DB4/KingV_code.pdf [accessed 14 October 2025].
International Telecommunication Union (ITU). 2021. Facts and Figures 2021 Internet uptake has accelerated during the pandemic. [Online] Available at: https://www.itu.int/itu-d/reports/statistics/2021/11/15/internet-use/ [accessed 16 October 2025].
International Telecommunication Union (ITU). 2024. Measuring digital development Facts and Figures 2024. [Online] Available at: https://www.itu.int/hub/publication/d-ind-ict_mdd-2024-4/ [accessed 16 October 2025].
Hjort, J. & Tian, L. 2025. The Economic Impact of Internet Connectivity in Developing Countries. [Online] Available at: https://eprints.lse.ac.uk/129143/1/annurev-economics-081224-102352.pdf [accessed 16 October 2025].
Kloza, D. 2021. It’s All About Choice: The Right Not to Use the Internet. [Online] Available at: https://cris.vub.be/ws/portalfiles/portal/77048349/dk2021_Its_All_About_Choice.pdf [accessed 30 September 2025].
Kovalenko, Y. 2022. The Right to Privacy and Protection of Personal Data: Emerging Trends and Implications for Development in Jurisprudence of European Court of Human Rights. Masaryk University Journal of Law and Technology, 16(1), 37–58.
Löfgren, K. & Webster, C. W. R. 2020. The value of Big Data in government: The case of ‘smart cities’. Big Data & Society, 7(1). https://doi.org/10.1177/2053951720912775.
Mavedzenge, J. A. 2020. The Right to Privacy v National Security in Africa: Towards a Legislative Framework Which Guarantees Proportionality in Communications Surveillance. African Journal of Legal Studies, 12(3-4), 360-390. https://doi.org/10.1163/17087384-12340056.
McKinsey. 2021. The value of getting personalization right—or wrong—is multiplying. [Online] Available at: https://www.mckinsey.com/capabilities/growth-marketing-and-sales/our-insights/the-value-of-getting-personalization-right-or-wrong-is-multiplying [accessed 16 October 2025].
NSF. N.d. Birth of the Commercial Internet. [Online] Available at: https://www.nsf.gov/impacts/internet [accessed 16 October 2025].
OECD. 1980; 2013. OECD Guidelines Governing the Protection of Privacy and Transborder Flow of Personal Data. [Online] Available at: https://www.refworld.org/policy/legalguidance/oecd/1980/en/14534 [accessed 13 October 2025].
OECD. 2002. OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. [Online] Available at: https://doi.org/10.1787/9789264196391-en [accessed 16 October 2025].
OECD. 2008. Governance and Peace for Development. [Online] Available at: https://www.oecd.org/en/topics/governance-and-peace-for-development.html [accessed 16 October 2025].
OECD. 2023. Explanatory memoranda of the OECD Privacy Guidelines, OECD Digital Economy Papers, No. 360. Paris: OECD Publishing. https://doi.org/10.1787/ea4e9759-en.
OECD. 2025. OECD Regulatory Policy Outlook 2025. [Online] Available at: https://www.oecd.org/en/publications/2025/04/oecd-regulatory-policy-outlook-2025_a754bf4c/full-report/regulating-for-the-future_e948d334.html [accessed 16 October 2025].
Riduan, S., Leonard, S. & Muhammad, I.H. 2023. Human Rights in The Digital Era: Online Privacy, Freedom of Speech, and Personal Data Protection. Journal of Digital Learning and Distance Education, 2(4), 513-523. https://doi.org/10.56778/jdlde.v2i4.149.
Robles-Carrillo, M. 2024. Digital identity: an approach to its nature, concept, and functionalities. International Journal of Law and Information Technology, 32(1). https://doi.org/10.1093/ijlit/eaae019.
Sutherland, E. 2021. The Governance of Data Protection in South Africa. [Online] Available at: https://ssrn.com/abstract=3922218 [accessed 16 October 2025].
Terzis, G. 2025. Ethical meditations for a human right to an analogue life. In Kloza, D. et al. (Eds), The Right Not To Use The Internet - Concept, Context, Consequences. UK: Routledge.
UN. 1948. UN General Assembly, Resolution 217A (III), Universal Declaration of Human Rights, A/RES/217(III). [Online] Available at: https://www.un.org/en/about-us/universal-declaration-of-human-rights [accessed 16 October 2025].
UN Global Pulse. 2012. Big Data for Development: Challenges & Opportunities. [Online] Available at: https://unstats.un.org/unsd/trade/events/2014/beijing/documents/globalpulse/Big%20Data%20for%20Development%20-%20UN%20Global%20Pulse%20-%20June2012.pdf [accessed 15 October 2025].
UNDP. 2017. Data Privacy, Ethics and Protection Guidance Note on Big Data for Achievement of the 2030 Agenda. [Online] Available at: https://unsdg.un.org/resources/data-privacy-ethics-and-protection-guidance-note-big-data-achievement-2030-agenda [accessed 10 October 2025].
UNHRC. 2021. A/HRC/47/L.22. [Online] Available at: https://documents.un.org/doc/undoc/ltd/g21/173/56/pdf/g2117356.pdf [accessed 16 October 2025].
Warner, D. & McKee, L. 2024. Board of Directors Role in Data Privacy Governance: Making the Transition from Compliance Driven to Good Business Stewardship. Journal of Cybersecurity Education, Research and Practice, 2024(1), Article 14.
Westin, A. 1967. Privacy and Freedom. [Online] Available at: https://scholarlycommons.law.wlu.edu/wlulr/vol25/iss1/20 [accessed 16 October 2025].
World Bank. 2021. World Development Report 2021: Data for Better Lives. [Online] Available at: https://wdr2021.worldbank.org/the-report/ [accessed 16 October 2025].
World Economic Forum. 2016. A Blueprint for Digital Identity: The Role of Financial Institutions in Building Digital Identity. [Online] Available at: https://www3.weforum.org/docs/WEF_A_Blueprint_for_Digital_Identity.pdf [accessed 16 October 2025].
[1] Including: Zimbabwe, South Africa, Namibia, Botswana, Zambia, Nigeria, Liberia, Côte d'Ivoire, Kenya, Guinea, Gambia, Senegal, Togo, Niger, Benin, Guinea-Bissau, Ghana, Tanzania, Uganda, Ethiopia, Rwanda, Somalia, Lesotho, and Burundi.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

This report has been published by the Inclusive Society Institute
The Inclusive Society Institute (ISI) is an autonomous and independent institution that functions independently from any other entity. It is founded for the purpose of supporting and further deepening multi-party democracy. The ISI’s work is motivated by its desire to achieve non-racialism, non-sexism, social justice and cohesion, economic development and equality in South Africa, through a value system that embodies the social and national democratic principles associated with a developmental state. It recognises that a well-functioning democracy requires well-functioning political formations that are suitably equipped and capacitated. It further acknowledges that South Africa is inextricably linked to the ever transforming and interdependent global world, which necessitates international and multilateral cooperation. As such, the ISI also seeks to achieve its ideals at a global level through cooperation with like-minded parties and organs of civil society who share its basic values. In South Africa, ISI’s ideological positioning is aligned with that of the current ruling party and others in broader society with similar ideals.
Email: info@inclusivesociety.org.za
Phone: +27 (0) 21 201 1589
Web: www.inclusivesociety.org.za



