New digital technologies have transformed the way people live in many ways, driving economic growth and job creation, forging scientific breakthroughs, empowering human rights activism and providing new opportunities in every part of society, according to Lene Wendland, head of the Business and Human Rights Division at UN Human Rights. Digital technologies can also help advance all the Sustainable Development Goals (SDGs), she added.
At the same time, the dark side of these same innovations can pose serious risks to people’s rights, including by violating privacy, spreading hate speech and disinformation, undermining democratic processes, and fuelling online violence against women and LGBTI people, according to Wendland.
That is why it was vital for UN Human Rights, which leads on business and human rights within the UN system, to establish the Business and Human Rights in Technology Project (B-Tech Project) in 2019. The project addresses these issues by providing an authoritative roadmap for applying the UN Guiding Principles on Business and Human Rights (UNGPs) to the development and use of digital technologies.
The Office has played an important role in advocating for business enterprises to be held accountable for human rights violations in accordance with the UNGPs. Twelve years ago, the UNGPs changed expectations about how companies should do business. Before the Guiding Principles, the extent to which human rights apply to business was a divisive and polarizing topic. The UNGPs have quickly become a global standard that countries and companies use as their guide to preventing, addressing and correcting human rights violations related to business activities.
“The UNGPs have led to game-changing progress on how to do business with respect for human rights and provide the perfect framework to prevent, address and remedy the potential and actual dangers associated with digital technologies,” Wendland said.
Human Rights Framework for Tech
Through the project, the Office works directly with companies such as Microsoft, Hewlett Packard Enterprise, Google and Meta. The project engages not only with private partners, but also with governments, academia and civil society, providing a safe space to engage and learn from each other.
The B-Tech Project focuses on four key areas: addressing human rights risks in business models, human rights and end-use due diligence, accountability and remedy, and exploring regulatory and policy responses to digital human rights challenges. There is also a focus on the role of technology investors.
“We can’t do this alone. We need to have these strategic engagements and strategic partnerships for this project to have any impact.”
LENE WENDLAND, HEAD, BUSINESS AND HUMAN RIGHTS DIVISION, UN HUMAN RIGHTS
Wendland applauds Google for its strong commitment to working together through B-Tech to address human rights challenges, including in the area of generative AI. The Google Human Rights Program develops Google’s company-wide civil and human rights strategy.
“Our team is responsible for working across Google to develop our human rights policies, processes and tools, including conducting human rights due diligence, providing guidance to our product teams on potential risks and mitigations, and more,” said Shahla Naimi, Global Head of Human Rights at Google.
In Brussels, Google’s human rights team recently joined a meeting organized by B-Tech and the Global Network Initiative to explore AI, human rights and the evolving regulatory environment alongside civil society organizations, academics, policy makers and other companies.
“At Google, we’ve been doing human rights due diligence on AI products for years,” Naimi said. “We undertake this work to identify actual and potential adverse impacts and opportunities for appropriate action to avoid, prevent or mitigate such impacts. Our generative AI due diligence efforts are an extension of these long-term efforts.”
For example, Naimi said, Google has conducted due diligence on specific products to inform a rights-based approach to future products or services that will integrate or implement generative AI with significant scope, scale and likelihood of impact. Other work includes product-agnostic analysis of generative AI that considers long-term human rights risks to individuals and society to inform Google’s understanding of potential harms and opportunities in their products, she added.
Naimi said the Office has provided instrumental guidance, support and feedback as Google works to meet its human rights commitments and implement the UNGPs throughout its business. It is also a useful forum for engaging with peers and civil society on the challenges and opportunities for human rights due diligence in the technology sector.
“UN Human Rights not only provides us with authoritative guidance on the implementation of the UNGPs, but also provides us with a place where we can learn together with other companies in important information-sharing and training calls,” Naimi said. “The B-Tech Community of Practice, for example, helps us engage in difficult scenarios with other companies and get practical guidance from UN Human Rights in the process.”
The Business and Human Rights Resource Centre (BHRRC), B-Tech’s civil society partner, tracks over 10,000 companies worldwide, with a primary focus on the intersection of business and its impact on human rights.
Gayatri Khandhadai, Head of Technology and Human Rights at the BHRRC, helps promote the implementation of the UNGPs in technology companies around the world. As a digital rights defender, the work she does with the B-Tech Project is deeply personal to her: she has witnessed the impact of online violence and privacy violations firsthand and worked with many human rights defenders who have been targeted.
“There is a complete sense of helplessness that comes over us when we are faced with online attacks. Neither the state nor the companies are fast enough, even when they want to be, to prevent harm,” she said.
Her team consistently monitors the news and civil society reports for allegations of human rights abuses by tech companies across the sector’s various sub-fields.
“The tech sector is relatively young compared to other sectors, so there’s still a lot of reckoning to be done,” she said. “It often feels like countries are constantly just trying to catch up with these companies. Too often this means that the policy response to the technology sector is reactive and not necessarily well thought out. In some ways, the cycle of accountability that other sectors have gone through is only now manifesting itself for the technology sector. There is a learning curve for tech companies to figure out how to behave responsibly.”
Khandhadai believes that the B-Tech project is making the technology space safer for people in their daily lives.
“The way they articulate the standards and expectations of companies is rooted in international law, which in turn helps us in advocacy,” she said. “They enable conversations between technology companies and civil society about key business and human rights challenges at the international level. There aren’t many places where that happens. I find their analysis and guidance innovative and collaborative.”