
Data responsibility, corporate social responsibility, and corporate digital responsibility

Published online by Cambridge University Press:  08 April 2022

Joanna van der Merwe*
Affiliation:
Centre for Innovation, Leiden University, Leiden, The Netherlands
Ziad Al Achkar
Affiliation:
Carter School for Peace and Conflict Resolution, George Mason University, Arlington, Virginia, USA
*Corresponding author. E-mail: j.s.van.der.merwe@sea.leidenuniv.nl

Abstract

This commentary looks at the use of corporate social responsibility (CSR) mechanisms for implementing responsible data use. The commentary offers an overview of CSR theory and the discourse on a growing phenomenon known as corporate digital responsibility (CDR). The commentary links these theories to the historical debates on the nature of technology, ethics, and society. The aim is to reflect on CSR and CDR mechanisms and to ignite discussion on their adequacy for the pursuit of data responsibility. Through our discussion and brief case studies, the paper reveals the gaps in relying on CSR and CDR and the need for a broader societal and comprehensive approach.

Type
Commentary
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press

Policy Significance Statement

The commentary identifies that corporate social responsibility and corporate digital responsibility mechanisms are not adequate for managing data responsibility. These findings are key for identifying suitable paths forward as data-driven technology is further integrated into society, and for shaping the discussion as the field develops. The commentary suggests the need for a comprehensive approach to ensure that data are managed responsibly and that consumers are protected.

1. Introduction

Data and data-driven technologies are seen to hold great promise for overcoming the complex challenges facing society (e.g., climate change and food insecurity), as well as the potential to improve the lives of all, and particularly those living in poverty. However, the data being generated and the technology to make sense of it are mainly controlled by the private sector, with governments having to form public–private partnerships even to improve the use of the data that they themselves generate and use. Technological progress is largely driven by the private sector, with the public sphere typically playing catch-up, especially when legislating or contracting the private sector for new projects.

Although the impact and role that these private companies have on societal issues are not new, especially with regard to environmental protection, climate change, and the role of corporations in developing countries, technological development adds a new dimension to that impact. As digital transformation expands, and data technologies play a larger role in people's day-to-day lives, demand is growing for data-responsible approaches, be it through private governance or through regulatory means. For many in the private sector, data responsibility is increasingly seen through the lens of corporate social responsibility (CSR) and the emerging related idea of corporate digital responsibility (CDR).

This paper aims to examine the status of CSR and its relation to data responsibility. It also aims to examine the adequacy of CSR, and the emerging CDR, as an approach to governing and implementing data responsibility and protecting the interests of wider society. The paper engages with the debates about technology, data responsibility, and governance. By reflecting on CSR and CDR, the paper showcases the current trajectory of the discussion regarding responsible use of data and digital tools. In doing so, the paper aims to ensure that the lessons from CSR are considered as digital transformation expands and governments grapple with the societal implications. It must be noted that this commentary is limited to the Global North notion of CSR/CDR and does not reflect on the way the concept is being implemented by governments and corporations in the Global South.

2. Corporate Social Responsibility: The Evolution

In the broadest sense, CSR can be described as the idea that corporations should go beyond their minimum legal obligations and consider their impact on society when making strategic and operational decisions and taking actions (Russell et al., 2016). It is a concept developed in the Global North with a heavy focus on the impact of multinational corporations as they expanded globally (Bernard, 2021; Marques and Utting, 2010). In searching for a precise definition of CSR, it becomes clear that, despite it being a heavily researched field, there is still no common definition. Nor is there consensus on what the core principles are, how a company achieves social responsibility, or, in fact, whether a company is even obliged to be socially responsible (Crane et al., 2008). This may be due to the multitude of theories that form the foundation of our understanding of CSR, theories coming from varying fields of knowledge such as economics, political science, and sociology.

Despite the lack of consensus, the existing theories and examinations of CSR can be categorized into groupings (Melé, 2008). Each of these theories is descriptive of the ways in which businesses are approaching their social responsibilities, but can also be understood from a normative perspective.

(1) Corporate Social Performance: This theory holds that companies have four responsibilities, namely, wealth creation, economic responsibilities, legal responsibilities, and addressing societal problems.

(2) Shareholder Value Theory: It holds that a business's only responsibilities are to generate profit and increase economic value for shareholders while upholding its legal obligations. This may be seen as the "classical" approach to CSR, captured in Friedman's definition: "the social responsibility of the business is to increase profits" (Friedman, 1962; Grigore et al., 2017, as cited by Melé, 2008).

(3) Stakeholder Theory: Under stakeholder theory, a business has a responsibility to anybody who has a "stake" in its activities, beyond just those described legally.

(4) Corporate Citizenship: Rather than a single complete theory, corporate citizenship can be seen as an umbrella term for several theories emerging around the role of businesses within society. Rather than treating a business's responsibility as an external affair, this theoretical group argues that a business is an integral part of society. Views here have been derived from instances where a corporation enters the realm of citizenship, fulfilling roles similar to those of governments.

Corporate responsibility has evolved over the past few decades as different practical approaches and theoretical thinking have emerged and as awareness of corporations' impact has grown. The most dominant of these is the classical approach, shareholder value theory, in which a corporation's ultimate responsibility is to maximize profits. Corporate citizenship is the most recent approach, and a full theory has yet to take shape (Grigore et al., 2017; Melé, 2008).

One of the tensions within CSR is that it can be applied as a set of discrete corporate activities or as an expansive guiding ethos that is holistic in nature and reflective of the corporation's "personality." This dialectic can create challenges when corporations attempt to obscure their intentions by using CSR activities performatively. In essence, they can render CSR a public relations tool rather than a normative and principle-driven governing ethos (Bernard, 2021).

2.1. CSR and practical implementation

In practice, CSR is not one mechanism, but rather several implementation styles and accountability mechanisms that are not mutually exclusive. The result is a complex notion of CSR, of how it is and should be implemented, and of how it should be enforced. There is no "one-size-fits-all" (Maon et al., 2017). This creates a challenge for governments that aim to enforce CSR through soft or hard law or through self-regulation. As the concept of CSR and its foundations are expanded to address emerging data-driven impacts, it is imperative that a solution to these challenges be found.

2.1.1. Self-regulation

One of the mechanisms for upholding CSR is self-regulation through mechanisms and instruments in place at the firm or sector level, including codes of ethics and conduct, responsible investment, and so forth. These aim to establish and uphold industry standards and norms, which can be enforced through hard and/or soft law. Self-regulation is argued to be an important mechanism for sectors experiencing a growth of emerging practices and technologies that fall outside, or in the gray zones of, existing government regulations. It is seen as a way to allow continued innovation and development while countering the risk that harder governance mechanisms would hinder innovation (Berkowitz and Souchaud, 2019).

Self-regulated industry norms and standards are often set through meta-organizations (MOs), organizations that are composed of other organizations. The role of these MOs is not only to generate CSR solutions, but also to facilitate and ensure reporting, accountability, and capacity building. MOs can be divided into three main categories (Berkowitz et al., 2017):

(1) Traditional MOs, such as trade associations.

(a) American Petroleum Institute—"Advocacy, research and statistics, standards, certification, and education."

(2) Specialized MOs dealing with specific challenges for which businesses can work together to research a solution. Members are usually made up of businesses only.

(a) Conservation of Clean Air and Water in Europe—Research on environmental issues related to the oil industry.

(b) Asistencia Reciproca Petrolera Empresarial Latinoamericana—Developing and sharing best practices; research and building competency.

(c) World Ocean Council—Collaborative stewardship.

(3) Multistakeholder MOs, which gather business, government, and civil society actors.

(a) UN Global Compact—"Framework for development, implementation, and disclosure" of policies and practices.

(b) Voluntary Principles on Security and Human Rights—Principles.

These organizations are also not necessarily restricted to a specific issue or a specific sector. They can be infrasectoral, focusing on segments of a sector; suprasectoral, bringing together companies from related sectors (e.g., oil and gas, mining); or cross-sectoral, bringing together companies from unrelated sectors (Berkowitz et al., 2017).

MOs are vital for companies aiming not only to uphold their CSR objectives, but also to determine how existing legal and societal obligations apply to their emerging technologies; however, they cannot replace government regulation. They may act as a precursor to a future regulatory framework, facilitating cooperation with the government as it explores the implications of emerging technologies and practices (Berkowitz and Souchaud, 2019). We acknowledge that there are other types of self-regulation that sectors and industries impose, such as bar association requirements for lawyers who want to practice law in the United States and expectations to conduct pro bono work in a calendar year. These are a form of CSR activity.

Self-regulation approaches are by nature self-designed and sometimes voluntary. Corporations engage in them as a means to thwart hard regulations that would otherwise be imposed by governments (Lock and Seele, 2016). CSR mechanisms allow firms or industries to signal their intentions to policymakers, stakeholders, and the public, reducing potential information asymmetries (Su et al., 2016). Firms engage in these activities as a tool to maximize their economic profit potential and to signal their capacity to implement such policies. How a firm engages with the political arena reflects its perspective on governments and institutions (Anastasiadis, 2014). Corporations regularly engage in political lobbying when they see opportunities either to enshrine in hard law regulations that would benefit their sectors, or to thwart the establishment of new laws that they oppose (McGreal, 2021). A real tension, however, emerges when there are discrepancies between the stated CSR platforms of a firm or industry and the political lobbying in which they engage (Lock and Seele, 2016).

However, many companies have pushed to reduce potential regulatory requirements, especially hard law obligations, on their business operations, allowing flexibility in how they conduct their business, which is argued to be key for innovation and technological development (Berkowitz and Souchaud, 2019). As a result, companies might pursue strategies that mimic government regulation by engaging in self-regulatory practices, indicating to regulators that they are adhering to the best practices or behaviors that would form the foundation of regulation. Often, these approaches are couched in a CSR framing in which companies showcase to regulators, policymakers, and society that they are good stewards and that there is no need for hard laws, or that, if there are to be any, they should be based on the practices the companies already follow. We are mindful that companies and even industries may pursue or encourage government regulation to secure their interests and advantage while hurting competitors or limiting entry into their market (see the case study below).

2.1.2. Social pressure

Another mechanism through which businesses can be pushed to adhere to social responsibility obligations is social pressure through stakeholder actions, either individual or collective. Stakeholders can be employees, consumers, shareholders, or competitors. Research has found that the group with the strongest influence on the decision-making of organizations is the employees themselves. Employees can force their management to adhere to established CSR policies or to recognize new aspects of social responsibility (Helmig et al., 2016). Within the tech sector, this push by employees has been seen at Google, Amazon, and Palantir with varying levels of success (MacMillan and Dwoskin, 2019; Paul, 2020; The Guardian, 2021). Research on consumer behavior in the United States has shown that when corporations violate government or voluntary standards, both highly conscious and less conscious consumers take punitive action (Russell et al., 2016). As such, CEOs are increasingly mindful of the impact of aligning CSR strategies with their business models in order to keep stakeholders and consumers satisfied (Henderson, 2018).

3. The Emergence of Corporate Digital Responsibility

Over the past decade, as society has experienced a major shift toward digitalization, there have been increasing calls for companies to engage in "CDR." The concept has broadly been understood as the desired responsibilities of corporations to their users, societies, and governments when it comes to the use of digital tools and data collection (Lobschat et al., 2021). The basic implementation of CDR is the same as CSR, and it may be viewed as one part of an overall CSR model. It is a voluntarily established set of policies and self-governing principles, developed, implemented, and overseen by corporations themselves, going beyond what is mandated by regulation. As a result, a cynical interpretation of CDR is that it offers corporations an opportunity to build a cover for unethical behaviors and practices. More optimistically, it can be interpreted as organizations aiming to ensure the responsible development of technology in a realm that is severely underregulated.

3.1. Understanding CDR

As an emerging field of practice and study, defining CDR is challenging. Conceptually, Richard Mason, writing in the 1980s, articulated the essence of CDR best: "Our moral imperative is clear. We must ensure that information technology, and the information it handles, are used to enhance the dignity of mankind" (Mason, 1986). Competing definitions of CDR have emerged over the past few years, focused on who the constituents of CDR are, arguing that special attention needs to be given to artificial actors (Lobschat et al., 2021), identifying the different branches of responsibility CDR aims to target (Herden et al., 2021), or making the case for merging sustainability and digitalization (Wade, 2020).

We propose to define CDR as the set of practices, policies, and governance structures of corporations as they relate to the digital transformation. CDR must be centered around accountable digital practices, enforcement mechanisms, sustainable growth and development, and the promotion of trust across the digital ecosystem. CDR practices must engage with how digitalization shapes society and the environment and with the impacts that it has on individuals, communities, and states.

States, for their part, are beginning to recognize CDR practices as critical. For example, the French and German governments have both articulated how CDR and building trust in a digital ecosystem will be imperative for corporations moving forward. Both countries articulate the need for CDR to go beyond minimum expected standards and regulations (see German CDR Initiative and France Corporate Digital Responsibility for more details).[1]

As part of the discussion on digitalization and ethical and responsible approaches to it, many have begun applying a critical perspective to the impact of data and digitization on society. As a result, we have witnessed a rise in critical studies that use data colonialism and data capitalism as lenses through which to highlight how existing power dynamics now manifest themselves in the digital realm. CDR will have to reckon with this and with the continued drive toward commodifying and quantifying human behaviors and digital interactions (Sadowski, 2019; Thatcher et al., 2016).

3.2. Separate but overlapping: Why CDR must go beyond CSR

Commentators often tend to think of CDR as part of a CSR mechanism; we argue that it is best to think of CDR as a separate mechanism that overlaps with CSR in multiple areas. CDR and CSR are both voluntary and self-governed approaches to responsible business practices that aim to go beyond the mandatory legal and regulatory minimums established by the states where companies operate. CDR and CSR both share, in theory, a corporate citizenship ethos in that they reflect a deeper interest in the impact of business practices on consumers and broader society. CDR and CSR both maintain that implementing these policies provides an economic and business advantage for firms.

We argue that CDR is separate due to the scale and the impact of the digital transformation, and the unique influence and power dynamics that arise. Moreover, our concern is that if CDR develops along similar lines to CSR, as voluntary "soft" measures rather than government-enforced obligatory laws, it risks failing in the same way that CSR has failed to be an effective mechanism in the realm of environmental and climate protection (Eavis and Krauss, 2021). As CDR is an emerging mechanism, an opportunity exists to incorporate the effective parts of CSR while including stronger enforcement and accountability mechanisms. We have an opportunity to learn from the last time technologies with such profound impacts emerged, when we did not effectively protect long-term societal interests. CDR will have to manage the impacts associated with environmental degradation resulting from unsustainable practices and increased digital waste. Moreover, CDR will have to target the growing concerns regarding data collection and the problems related to corporate surveillance, and the deep implications these have for societies across the world. Examples of the challenges CDR will have to address are already emerging, with clear implications for the future. In the education sector, there is a need to balance the push for digitalization against concerns about perpetuating a dehumanizing culture that fosters mistrust between education institutions and students. The criminal justice system is increasingly using algorithms that enshrine unjust and biased practices. Such examples can be seen throughout different sectors, and they are not necessarily new phenomena; rather, they are exacerbated by the turn toward digital tools and algorithms for decision-making support.

The concern is that CDR will be relegated to a secondary or tertiary consideration by organizations that continue to prioritize a business-as-usual approach. More worryingly, without strong accountability mechanisms and independent government regulatory development, it can be used for whitewashing or regulatory capture. Additionally, the challenges that plagued CSR in other sectors will be the same for CDR if they are not actively addressed. CDR must reckon with the fact that it is dealing with a phenomenon that is reshaping human relations and that the responsibility threshold is therefore higher.

4. Data Responsibility and the Limits of CDR

4.1. New tools, old problems

Governments, communities, and corporations are all trying to grasp the full scale and impact of digitization (Dufva and Dufva, 2019) and to assess whether they are well suited to respond to this digital transformation (Cheng et al., 2021). This is happening as a gap emerges between societal expectations and the impact of the proliferation of new technologies and digital tools. This, however, is a phenomenon that recurs throughout history as new material forms and tools emerge. The sociologist William Ogburn argued, a century ago, that a cultural gap arises when new tools and technologies come into the world and societies assess their impact. Ogburn called this a period of maladjustment, in which tools and products exist and proliferate through the economy and society, but society itself has yet to figure out what the norms and regulations ought to be (Ogburn, 1922). We are living through such a period of maladjustment with digital and data responsibility.

One of the challenges we face when regulating or thinking about technology is the debate about whether technology shapes societal norms and values, what proponents refer to as "technological determinism," or whether societal and cultural norms, developed by individuals and groups, go on to shape technologies, control them, and place them at the service of society (Dafoe, 2015; Winner, 1980). These debates have been ongoing for decades, and we suspect they will continue for many more years to come.

We argue that it is important for policymakers, academics, and corporate executives to be aware of these tensions and to understand how these debates shape competing perspectives on the role of regulation. These debates help frame the conversations around agency and the degree to which technology exists within the realm of control and governance. Critically, the debates about the nature of technology, or the separation of business and ethics, have real-life implications. They help shape how we go about designing governance structures, be they norms, regulations, or frameworks, to operate these technologies, to manage them, or to conclude that the harms far outweigh the potential benefits.

4.2. Tech and data responsibility

For the purposes of this commentary, we adopt the United Nations Office for the Coordination of Humanitarian Affairs (UNOCHA) definition when we think about data responsibility: "a set of principles, processes, and tools that support the safe, ethical, and effective management of data" (OCHA Centre for Humanitarian Data, 2021).

Data responsibility and data protection regulations have existed for some time: the first national data privacy law dates back to Sweden's Data Protection Act of 1973, preceded by the 1970 law of the German state of Hesse (Hofverberg, 2012). These laws first established broad protections and rules about the management of data by entities. Today, more than 66% of countries around the world have passed legislation focused on data protection and privacy, varying in breadth and scope (UNCTAD, n.d.).

The struggle around data and tech responsibility is a struggle between three groups: governments seeking to exert their power and control, corporations seeking to maintain their ability to generate profits and expand market share, and people looking to protect their rights while making use of the modern-day benefits that technological advancement brings.

There is no escaping being part of the data ecosystem. If you live in a country with even a minimal level of data and technological penetration, you are affected by the scope and scale of data collection, by the data-generating apparatus, and by the ways in which they have an impact on you and your community (Thompson and Warzel, 2019). Every individual who is part of the interconnected economic system generates new data every day simply by existing and participating in it; there is no escaping it, because, as Paola Ricaurte puts it, "refusing to generate data means exclusion" (Ricaurte, 2019).

Concurrently, the commodification of everyday data is here for all to see (Thatcher et al., 2016). Financial reports of tech companies highlight the dollar values companies expect to generate from user data and engagement. Corporations often resist the push toward hard laws, viewing the costs associated with implementation as too high to bear or as a threat to their business models (Crain, 2018; Sherman, 2020). Interestingly, this clash also plays out between corporations: advertising firms (and Facebook), for example, have reported how Apple's new privacy model has cut deep into their revenues and have voiced their dissatisfaction with Apple's move toward a more privacy-conscious model.

The lack of a global regulatory system or globally accepted standards and principles means that gaps in implementation and inadequate responsible data practices will remain. Corporations adopting CDR mechanisms may be a first step, but our concern is that it will remain a piecemeal or ad hoc approach limited to one company or a portion of a sector. The challenge is how to design new approaches that can be scaled across sectors and across continents, that leverage the potential of digital technologies while empowering users, and that usher in a new era of transparency, agency, and innovation that is sustainable and responsible (Mastercard, 2019).

4.2.1. The “Big-Tech” perspective on data responsibility: Self-governance

Commentators have argued that Big Tech companies push for private governance in order to avoid mandates or rules imposed by government regulators (Schaake, 2021). Analysts have made the case that it is in the best interest of these companies and of the sector to preempt government regulation by establishing clear rules and guidelines to govern the sector. Private governance is seen as a tool to maintain consumer trust, pursue sector-wide strategies, and avoid a "digital tragedy of the commons" (Cusumano et al., 2021). An extreme version of this is being proposed in Nevada, where local tech companies would essentially become their own government, proposing rules and regulations in their "innovation zones," levying taxes on constituents, and managing schools and other government functions (Nevett, 2021).

Over the past few years, "Big Tech" companies have pursued self-regulation strategies and developed their own oversight boards in response to increasing public scrutiny of their actions. Oversight boards are, in theory, adjudicatory bodies that review and provide guidance on issues relating to the digital policies and practices of a corporation. They are designed to be external bodies whose recommendations company boards and upper management ought to implement. The most notable example has been the Facebook oversight board, first introduced in November 2018 and later formed in October 2019 (Oversight Board, n.d.). The board is often referred to as Facebook's Supreme Court; it listens to appeals and is tasked with providing commentary on Facebook's policies and evaluating appeal processes vis-a-vis content moderation. The concern is that an oversight board or similar mechanism, while in theory a welcome process, can in practice end up as a regulatory dodge, utilizing a CDR mechanism as a facade of accountability while shielding a corporation from true regulation.

The Facebook Oversight Board has been criticized as not having any teeth. The rulings of the board apply only to the specific case in question and do not lead to the establishment of companywide policies or precedents. The oversight board is open for expansion should it prove successful (Hatmaker, 2021). The board is only able to issue recommendations, and its rulings can be ignored by the company (Klonick, 2021). Moreover, this approach remains ad hoc and limited to one company at a time, rather than providing industrywide standards and practices, and there is a lack of transparency into these processes. Critically, the scope of the board, and much of the discussion on the regulation of Big Tech, is focused on content moderation, leaving gaps on issues related to data collection and tracking, the environmental and labor costs of collecting, managing, and using the data, and the extent to which data are sold and bought by data brokers, in the United States for example (Melendez and Pasternack, 2019). Furthermore, the lack of independent access and investigation makes it hard to evaluate the true impact of the Facebook Oversight Board, but the signs are not promising (Bass, 2021; Hegelich, 2020; Wall Street Journal, 2021).

4.2.2. The nonprofit perspective on data responsibility: Guidelines, frameworks, and operationalized approaches

One sector that deals with data responsibility in a volatile environment daily is the humanitarian sector. Humanitarian actors collect (and generate) a wide range of data related to the conflict or disaster areas in which they operate, including data on some of the most vulnerable groups. Humanitarian actors increasingly use sophisticated tools such as remote sensing, biometric data collection, and information communication technologies to conduct their work, and see connectivity and digitalization as a central component of aid. Due to the nature of their work, humanitarians are increasingly wary of the dangers and potential risks of digitization and increased data collection: individuals' lives and livelihoods could be in danger if data are leaked or misused, for example.

As a result, the humanitarian sector (broadly defined) has pushed toward the development of sector-wide guidelines, frameworks, and ethical codes to guide the work of humanitarians. Certain codes, like the Signal Code, apply existing international human rights law and humanitarian law to the rights of individuals in crises to information, and establish clear responsibilities for actors responding to crises and collecting data on those affected.

The International Committee of the Red Cross Data Protection Guidelines, the UNOCHA Data Responsibility Guidelines, and the Inter-Agency Standing Committee Operational Guidance on Data Responsibility in Humanitarian Action look at how data protection guidelines can be operationalized and how organizations across the sector can implement them. The Sphere Handbook, which establishes the Humanitarian Charter and Minimum Standards and is relied upon by hundreds of organizations around the world, has continuously expanded its guidelines and focus on data protection and responsible digital use.

It must be noted that many humanitarian agencies operate under agreements that provide them with immunities and protections from prosecution, some of which private corporations might very much desire. The purpose here is to showcase the sector's approach and thinking on these issues. The authors believe that the humanitarian sector has in many respects been leading the way in developing data responsibility approaches that reduce risks and harms while maximizing the benefit of digital tools.

4.2.3. Clash of the Titans

The limits of a CDR approach were exposed as governments clashed with tech companies while seeking to combat monopolies and impose antitrust regulations, institute fines (Graham, 2017), threaten to break up companies (Staff, 2007), or impose new rules on companies (Perrigo, 2020). In early 2021, Facebook and Google both clashed with the government of Australia over its proposed media law, which would have required companies such as Google and Facebook to share profits emanating from search results with media companies (Kaye, 2021). Both companies threatened to limit their service offerings in the country before stepping back and reaching settlements with the government (Kaye and Packham, 2021).

This episode underlines the power struggle that will be ongoing over the next decades, and the limits of CDR models. As governments reckon with the continued digital transformation and look for new ways of retaining their control and power, tech companies will continuously push against regulations while lobbying for concessions or reaching settlements that do not affect their profit margins. Concurrently, tech companies may lobby governments to impose certain regulations in order to undermine a competitor. Microsoft, for example, backed the Australian government in its row with Google, seeing an opportunity to gain a larger share of the search market (Shead, 2021). This dynamic underscores that striking the balance between government regulation, tech companies' private interests (individually and as a sector), and maintaining the rights and trust of the general population will require tremendous effort, one that is not achieved simply by pursuing a CSR or CDR strategy.

5. Implications

5.1. Data responsibility must be made the core of business

Data responsibility must be understood as one of the most important principles for the protection of the rights of individuals and communities moving forward. Otherwise, as new technologies and new forms of extraction are developed, the guardrails will become harder and harder to establish. Data responsibility and the ethical use of technology have to be embedded throughout the entire business cycle (Martin and Freeman, 2004). Therefore, treating data responsibility exclusively through a CSR or a CDR lens does not meet the seriousness and critical nature of the issue. That approach undermines the higher ideal of developing widespread responsible practice while also understating the true value of data and the actual impact that data have at the individual, societal, and corporate levels. Viewing data responsibility as a CSR or CDR policy shifts the focus from centering data responsibility in the processes, design, and implementation of corporate practices to treating it as a secondary concern or, worse, an afterthought completed to complement the business operations of a company.

5.2. Connecting digital and environmental

There is a growing divergence between digital strategies and environmental/climate change strategies. Many of the proposed solutions to digital technology challenges, such as biased algorithms, involve collecting more and more data, which currently conflicts with strategies to become more environmentally friendly. Furthermore, as Kate Crawford highlights in her groundbreaking book Atlas of AI, "calls for labor, climate, and data justice are at their most powerful when they are united" (Crawford, 2021). Therefore, the social responsibilities of these technology companies must be assessed, and the companies held accountable, in a holistic manner.

5.3. Acknowledging and addressing the societal impact in a holistic manner rather than piecemeal

Designing an approach that balances the protection of individual rights with technological growth and innovation is a challenge. Ignoring or undermining one component for the sake of another will lead to further inequity and the continued accumulation of power and control by a seemingly smaller group of entities that control the data ecosystem. Furthermore, due to the foundational role that data-driven technologies will have in society, this disproportionate power will inadvertently translate into many, if not all, other areas of society.

Data responsibility and CSR/CDR must grapple with issues linked to data exploitation, data justice, and data colonialism. Data responsibility must also reckon with the growing digital divide and with how the rush toward a digital transformation that is not equitable will leave hundreds of millions of people behind and outside the economic system.

5.4. Understanding new stakeholders and the level of impact

We are particularly concerned with companies that contribute to the growth of data brokers by selling data to these third-party groups. Data brokers facilitate the ability of companies and states to subvert CDR commitments or, in the case of governments, allow them to curtail certain rights (Roderick, 2014). Data brokers take advantage of a lack of regulation and gray areas to collect massive amounts of information on people, package the data, and then sell them.

Funding Statement

This work received no specific grant from any funding agency, commercial, or not-for-profit sectors.

Competing Interests

The authors declare no competing interests exist.

Author Contributions

The authors have contributed equally to the design, research, writing, and editing of the manuscript.

Data Availability Statement

There are no primary data used in this manuscript, and all references are linked below, some of which may require institutional access.

Abbreviations

CDR, corporate digital responsibility; CSR, corporate social responsibility; IASC, Inter-Agency Standing Committee; ICRC, International Committee of the Red Cross; MOs, meta-organizations; UNOCHA, United Nations Office for the Coordination of Humanitarian Affairs

Footnotes

1 Germany: “Businesses should be encouraged to go above and beyond the minimum statutory requirements—to ensure that social values and the individual are not overlooked in the process of digital transformation. This should translate into real market advantages for those businesses that cultivate a trustworthy reputation. And what is more: consumers will in future have a better overview of how businesses are handling their personal data” (German CDR Initiative).

France: “CDR is a new and unavoidable extension of CSR, which is based on the same principles of trust, accountability, ethics, and exchanges with companies’ stakeholders. The cross-cutting nature of digital technology and its omnipresence mean that the value creation it generates must be understood and shared by all, with regard to democratic, social and societal issues. It is a question of trust, a trust that needs to be renewed in view of the constant changes in technology” (France Corporate Digital Responsibility).

References

Anastasiadis, S (2014) Toward a view of citizenship and lobbying: Corporate engagement in the political process. Business & Society 53(2), 260–299. https://doi.org/10.1177/0007650313483495
Bass, F (2021) Limited Access to Facebook Is Only "Tip of the Iceberg" for Disinformation Researchers, Decode Democracy, 18 May 2021. Available at https://decode.org/news/limited-access-to-facebook-is-only-tip-of-the-iceberg-for-disinformation-researchers/
Berkowitz, H, Bucheli, M and Dumez, H (2017) Collectively designing CSR through meta-organizations: A case study of the oil and gas industry. Journal of Business Ethics 143(4), 753–769. https://doi.org/10.1007/s10551-016-3073-2
Berkowitz, H and Souchaud, A (2019) (Self-)regulation of sharing economy platforms through partial meta-organizing. Journal of Business Ethics 159(4), 961–976. https://doi.org/10.1007/s10551-019-04206-8
Bernard, T (2021) Corporate social responsibility in postcolonial contexts: A critical analysis of the representational features of South African corporate social responsibility reports. Critical Discourse Studies 18(6), 619–636. https://doi.org/10.1080/17405904.2020.1798797
Cheng, JY-J, Frangos, C and Groysberg, B (2021) Is Your C-Suite Equipped to Lead a Digital Transformation?, Harvard Business Review, 12 March 2021. Available at https://hbr.org/2021/03/is-your-c-suite-equipped-to-lead-a-digital-transformation (accessed 21 March 2021).
Crain, M (2018) The limits of transparency: Data brokers and commodification. New Media & Society 20(1), 88–104. https://doi.org/10.1177/1461444816657096
Crane, A, McWilliams, A, Matten, D, Moon, J and Siegel, DS (2008) The corporate social responsibility agenda. In The Oxford Handbook of Corporate Social Responsibility. Oxford: Oxford University Press.
Crawford, K (2021) Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. New Haven: Yale University Press.
Cusumano, MA, Gawer, A and Yoffie, DB (2021) Social Media Companies Should Self-Regulate. Now, Harvard Business Review, 15 January 2021. Available at https://hbr.org/2021/01/social-media-companies-should-self-regulate-now (accessed 21 March 2021).
Dafoe, A (2015) On technological determinism: A typology, scope conditions, and a mechanism. Science, Technology, & Human Values 40(6), 1047–1076. https://doi.org/10.1177/0162243915579283
Dufva, T and Dufva, M (2019) Grasping the future of the digital society. Futures 107, 17–28. https://doi.org/10.1016/j.futures.2018.11.001
Eavis, P and Krauss, C (2021) What's Really Behind Corporate Promises on Climate Change?, The New York Times, 22 February 2021. Available at https://www.nytimes.com/2021/02/22/business/energy-environment/corporations-climate-change.html (accessed 16 December 2021).
Graham, L (2017) Here Are Some of the Largest Fines Dished Out by the EU, CNBC, 27 June 2017. Available at https://www.cnbc.com/2017/06/27/the-largest-fines-dished-out-by-the-eu-commission-facebook-google.html (accessed 24 March 2021).
Grigore, G, Molesworth, M and Watkins, R (2017) New corporate responsibilities in the digital economy. In Theofilou, A, Grigore, G and Stancu, A (eds), Corporate Social Responsibility in the Post-Financial Crisis Era. Cham: Springer International Publishing, pp. 41–62. https://doi.org/10.1007/978-3-319-40096-9_3
Hatmaker, T (2021) Facebook Oversight Board Says Other Social Networks "Welcome to Join" if Project Succeeds, TechCrunch. Available at https://social.techcrunch.com/2021/02/11/facebook-oversight-board-other-social-networks-beyond-facebook/ (accessed 21 March 2021).
Hegelich, S (2020) Facebook needs to share more with researchers. Nature 579(7800), 473. https://doi.org/10.1038/d41586-020-00828-5
Helmig, B, Spraul, K and Ingenhoff, D (2016) Under positive pressure: How stakeholder pressure affects corporate social responsibility implementation. Business & Society 55(2), 151–187. https://doi.org/10.1177/0007650313477841
Henderson, RM (2018) More and More CEOs Are Taking Their Social Responsibility Seriously, Harvard Business Review, 12 February 2018. Available at https://hbr.org/2018/02/more-and-more-ceos-are-taking-their-social-responsibility-seriously (accessed 25 March 2021).
Herden, CJ, Alliu, E, Cakici, A, Cormier, T, Deguelle, C, Gambhir, S, Griffiths, C, et al. (2021) Corporate digital responsibility. Sustainability Management Forum [NachhaltigkeitsManagementForum] 29(1), 13–29. https://doi.org/10.1007/s00550-020-00509-x
Hofverberg, EP (2012) Online Privacy Law: Sweden | Law Library of Congress. Available at https://www.loc.gov/law/help/online-privacy-law/2012/sweden.php (accessed 18 March 2021).
Kaye, B (2021) Australia Takes on Google Advertising Dominance in Latest Big Tech Fight, Reuters, 28 January 2021. Available at https://www.reuters.com/article/us-australia-media-regulator-idUSKBN29X02X (accessed 24 March 2021).
Kaye, B and Packham, C (2021) Facebook "Refriends" Australia After Changes to Media Laws, Reuters, 23 February 2021. Available at https://www.reuters.com/article/us-australia-media-idUSKBN2AN07E (accessed 17 August 2021).
Klonick, K (2021) Inside the Making of Facebook's Supreme Court, The New Yorker, 12 February 2021. Available at https://www.newyorker.com/tech/annals-of-technology/inside-the-making-of-facebooks-supreme-court (accessed 21 March 2021).
Lobschat, L, Mueller, B, Eggers, F, Brandimarte, L, Diefenbach, S, Kroschke, M and Wirtz, J (2021) Corporate digital responsibility. Journal of Business Research 122, 875–888. https://doi.org/10.1016/j.jbusres.2019.10.006
Lock, I and Seele, P (2016) Deliberative lobbying? Toward a noncontradiction of corporate political activities and corporate social responsibility? Journal of Management Inquiry 25(4), 415–430. https://doi.org/10.1177/1056492616640379
MacMillan, D and Dwoskin, E (2019) The War Inside Palantir: Data-Mining Firm's Ties to ICE Under Attack by Employees, Washington Post, 22 August 2019. Available at https://www.washingtonpost.com/business/2019/08/22/war-inside-palantir-data-mining-firms-ties-ice-under-attack-by-employees/ (accessed 13 December 2021).
Maon, F, Swaen, V and Lindgreen, A (2017) One vision, different paths: An investigation of corporate social responsibility initiatives in Europe. Journal of Business Ethics 143(2), 405–422. https://doi.org/10.1007/s10551-015-2810-2
Marques, JC and Utting, P (eds) (2010) Business, Politics and Public Policy. London: Palgrave Macmillan. https://doi.org/10.1057/9780230277243
Martin, KE and Freeman, RE (2004) The separation of technology and ethics in business ethics. Journal of Business Ethics 53(4), 353–364. https://doi.org/10.1023/B:BUSI.0000043492.42150.b6
Mason, RO (1986) Four ethical issues of the information age. MIS Quarterly 10(1), 5–12. https://doi.org/10.2307/248873
McGreal, C (2021) How a Powerful US Lobby Group Helps Big Oil to Block Climate Action, The Guardian, 19 July 2021. Available at https://www.theguardian.com/environment/2021/jul/19/big-oil-climate-crisis-lobby-group-api (accessed 15 December 2021).
Melé, D (2008) Corporate social responsibility theories. In The Oxford Handbook of Corporate Social Responsibility. Oxford: Oxford University Press.
Melendez, S and Pasternack, A (2019) Here Are the Data Brokers Quietly Buying and Selling Your Personal Information, Fast Company, 2 March 2019. Available at https://www.fastcompany.com/90310803/here-are-the-data-brokers-quietly-buying-and-selling-your-personal-information (accessed 21 March 2021).
Nevett, J (2021) Nevada Smart City: A Millionaire's Plan to Create a Local Government, BBC News, 18 March 2021. Available at https://www.bbc.com/news/world-us-canada-56409924 (accessed 21 March 2021).
OCHA Centre for Humanitarian Data (2021) OCHA Data Responsibility Guidelines, October 2021. Available at https://data.humdata.org/dataset/2048a947-5714-4220-905b-e662cbcd14c8/resource/60050608-0095-4c11-86cd-0a1fc5c29fd9/download/ocha-data-responsibility-guidelines_2021.pdf (accessed 10 November 2021).
Ogburn, WF (1922) Social Change with Respect to Culture and Original Nature. New York: B.W. Huebsch, Inc.
Oversight Board (n.d.) Independent Judgment. Transparency. Legitimacy. Available at https://oversightboard.com/ (accessed 21 March 2021).
Paul, K (2020) Hundreds of Workers Defy Amazon Rules to Protest Company's Climate Failures, The Guardian, 28 January 2020. Available at https://www.theguardian.com/technology/2020/jan/27/amazon-workers-climate-protest (accessed 13 December 2021).
Perrigo, B (2020) European Union Announces New Big Tech Regulations, Time, 30 December 2020. Available at https://time.com/5921760/europe-digital-services-act-big-tech/ (accessed 24 March 2021).
Ricaurte, P (2019) Data epistemologies, the coloniality of power, and resistance. Television & New Media 20(4). https://doi.org/10.1177/1527476419831640
Roderick, L (2014) Discipline and power in the digital age: The case of the US consumer data broker industry. Critical Sociology 40(5), 729–746. https://doi.org/10.1177/0896920513501350
Russell, CA, Russell, DW and Honea, H (2016) Corporate social responsibility failures: How do consumers respond to corporate violations of implied social contracts? Journal of Business Ethics 136(4), 759–773. https://doi.org/10.1007/s10551-015-2868-x
Sadowski, J (2019) When data is capital: Datafication, accumulation, and extraction. Big Data & Society 6(1), 2053951718820549. https://doi.org/10.1177/2053951718820549
Schaake, M (2021) Big Tech Is Trying to Take Governments' Policy Role, Financial Times, 27 January 2021. Available at https://www.ft.com/content/7f85a5ff-326f-490c-9873-013527c19b8f (accessed 21 March 2021).
Shead, S (2021) Microsoft Says It Would Never "Threaten to Leave" Australia After Google Said It Could Withdraw Search Engine, CNBC, 3 February 2021. Available at https://www.cnbc.com/2021/02/03/microsoft-backs-australia-after-google-threatened-to-withdraw-search-.html (accessed 24 March 2021).
Sherman, J (2020) Oh Sure, Big Tech Wants Regulation—On Its Own Terms, Wired, 28 January 2020. Available at https://www.wired.com/story/opinion-oh-sure-big-tech-wants-regulationon-its-own-terms/ (accessed 22 March 2021).
Staff (2007) TIMELINE: 17 Years of EU, U.S. Tussles with Microsoft, Reuters, 16 September 2007. Available at https://www.reuters.com/article/us-microsoft-eu-chronology-idUSL1075898720070916 (accessed 24 March 2021).
Su, W, et al. (2016) The signaling effect of corporate social responsibility in emerging economies. Journal of Business Ethics 134(3), 479–491. https://doi.org/10.1007/s10551-014-2404-4
Thatcher, J, O'Sullivan, D and Mahmoudi, D (2016) Data colonialism through accumulation by dispossession: New metaphors for daily data. Environment and Planning D: Society and Space 34(6), 990–1006. https://doi.org/10.1177/0263775816633195
The Guardian (2021) Google Fires Margaret Mitchell, Another Top Researcher on Its AI Ethics Team, 20 February 2021. Available at https://www.theguardian.com/technology/2021/feb/19/google-fires-margaret-mitchell-ai-ethics-team (accessed 13 December 2021).
Thompson, SA and Warzel, C (2019) Opinion | Twelve Million Phones, One Dataset, Zero Privacy, The New York Times, 19 December 2019. Available at http://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html (accessed 22 March 2021).
UNCTAD (n.d.) Data Protection and Privacy Legislation Worldwide. Available at https://unctad.org/page/data-protection-and-privacy-legislation-worldwide (accessed 18 March 2021).
Wade, M (2020) Corporate Responsibility in the Digital Era, MIT Sloan Blogs. Cambridge, MA: Massachusetts Institute of Technology. Available at http://www.proquest.com/docview/2395339489/citation/BF37F43CDB684B00PQ/1 (accessed 28 July 2021).
Wall Street Journal (2021) The Facebook Files, 1 October 2021. Available at https://www.wsj.com/articles/the-facebook-files-11631713039 (accessed 10 December 2021).
Winner, L (1980) Do artifacts have politics? Daedalus 109(1), 121–136.