DIGITAL DANGER OP-ED

Grok’s undress feature violates South Africans’ rights, and must be stopped now

Moxii Africa (formerly Media Monitoring Africa) has issued letters of demand to the South African government and to X Corp, demanding that the ‘undressing’ feature of Grok, and any other feature that allows users to create and share non-consensual intimate images and child sexual abuse material, be disabled immediately.

Moxii Africa is saying that the use of the Grok AI undress feature violates several rights entrenched in South Africa’s Constitution (Photo: Andrey Rudakov / Bloomberg / Getty Images)

A global story has exposed how X Corp’s (formerly Twitter) AI tool Grok has been allowing users to instantly generate images of adults and children in various states of undress. With simple commands, Grok could put any subject into a bikini, or underwear, or indeed practically no underwear at all, and share the result. It is a clear illustration of the danger of AI tools designed with few built-in safety features and apparently no regard for fundamental rights.

X owner Elon Musk initially made light of the feature, posting images of himself in a bikini. When the feature is used to poke fun at yourself, as Musk did here, consent is implicit.

It is quite a different matter to take images of another person and, without their awareness or any form of consent, place them in skimpy clothes and in sexualised positions.

In other words, there may be scenarios where placing a subject in a bikini isn’t a rights violation. An advertising agency working with swimwear models, for example, where a model has given full, informed consent to the use and alteration of their images, may be making legitimate use of this feature.

Unfortunately, where there are no safeguards, it’s only too easy to create and share these images of adults and young people without their consent, which is an egregious rights violation. Thousands have been subjected to the abuse. Sadly, it is extremely difficult to get accurate data as to exactly how many, but the UK government and regulator are both investigating and taking action.

It isn’t just us who are deeply angered by this. Many locally known social media users, including Phumzile Van Damme, raised the alarm and encouraged users to report the feature.

As a result of the outrage, the BBC has reported recently that Grok AI will no longer “be able to edit photos of real people to show them in revealing clothing in jurisdictions where it is illegal”.

There are critical elements to consider here. X acted only in response to the threat of substantial fines and widespread public outrage.

Musk’s response, beyond ridicule, was, as reported by Bloomberg and others, to refer to the UK government as “fascist” for demanding action. X has, however, now acted in certain places to limit Grok’s ability to undress people. It is a mystery, and possibly a deeply cynical response, to limit this feature of Grok only in jurisdictions where such actions are illegal.

Why, in the face of an abuse that is so clearly and patently harmful, such as NCII (non-consensual intimate image) and CSAM (child sexual abuse material), would you not apply your changes globally?

Even if South Africa had not criminalised such conduct under the Cybercrimes Act and other legislation, the sharing of such images remains an egregious rights violation.

Letter of demand to Government Agencies. (Image: Supplied)
Letter of demand to X. (Image: Supplied)

Rights apply to all

This is important because rights apply to all people, whether you support MAGA, AfriForum, the ANC, DA, EFF or MK, whether you are local or foreign, Christian, Jewish, Muslim, Hindu or a Scientologist.

Even if you believe human rights are part of a left-wing extremist antifa agenda, all people have fundamental rights. So when a technology tool is shown to violate the rights of possibly millions of people so easily, and can so easily be used to create and distribute child sexual abuse material, surely the reasonable response would be to act immediately to limit the harm everywhere you operate and wherever you know people use your tool.

It is a bitter irony that Musk continues to lie about a white genocide taking place in South Africa, and yet his company seemingly has no problem with violating the rights of the people of South Africa. With no need to sign in, users can still make use of the undress feature of Grok AI. We say this must stop.

That is why we have sent a letter of demand to X Corp’s legal representatives to take immediate steps in relation to the Grok AI undress feature. We have also sent letters of demand to the Minister of Communications and Digital Technologies, the Minister of Justice, the Minister of Women, Children and People with Disabilities, the Film and Publications Board, and Icasa (Independent Communications Authority of South Africa) to stand with the people of the country and act to ensure the Grok AI undress feature is disabled immediately.

Specifically, Moxii Africa is saying that the use of the Grok AI undress feature violates several rights entrenched in South Africa’s Constitution, including:

  • the right to human dignity (section 10);
  • the right to privacy (section 14);
  • the right to bodily and psychological integrity (section 12); and
  • the right of children to be protected from abuse and degradation (section 28(1)(d)).

The use of the feature also contravenes the following South African criminal prohibitions, among others:

  • The unlawful, intentional, and non-consensual disclosure of a data message of an intimate image (Cybercrimes Act 19 of 2020);
  • The unlawful possession, creation, production or any contribution to, or assistance in, the creation of CSAM (Films and Publications Act 65 of 1996);
  • The harmful disclosure of an image, however created, or any description, of a person, real or simulated, of an explicit or sexual nature (Criminal Law (Sexual Offences and Related Matters) Amendment Act 32 of 2007); and
  • The exposure or display of CSAM and causing the exposure or display to children of any image, however created, or any description or presentation, of a person, real or simulated, of a pornographic nature (Criminal Law (Sexual Offences and Related Matters) Amendment Act 32 of 2007).

With gender-based violence a declared national emergency, the path is clear. We demand that X Corp act with the same urgency as it has shown in other countries. DM
