WARNING: This article discusses child sexual abuse material (CSAM).
At first glance, OpenDream is just one of many generic AI image generation websites that have sprung up in recent years, allowing people to create images by typing short descriptions of what they want the results to look like.
The platform describes itself as an “AI art generator” on its website and social media platforms, and invites users to “unlock a world of limitless creative possibilities”.
But for months, people generated child sexual abuse material (CSAM) with it – and the images were left visible on the platform’s public pages without any apparent moderation.
Bellingcat first came across OpenDream through promotional posts on X (formerly Twitter) in July. Many of the accounts promoting OpenDream focused on AI tools and services.
Some of the posts about OpenDream on X featured a screenshot of the platform’s public gallery with images of young girls next to adult women in bikinis.
The site’s main page described the platform as an “AI Art Generator”, with a fairly innocuous animation-style image. However, the public gallery page – which did not require any login to view – was filled with sexually explicit images.
Within seconds of scrolling, we came across multiple AI-generated images of young children and infants in underwear or swimwear, or being sexually abused. The prompts used to generate this imagery – which were also publicly visible – clearly demonstrated an intent to create sexualised images of children. For example, some included the word “toddler” together with explicit body parts and poses, descriptions of full or partial nudity, or a sexual act.
Bellingcat has reported the site to the National Center for Missing & Exploited Children in the US, where this reporter is based.
In addition to CSAM, there appeared to be no restrictions by OpenDream on generating non-consensual deepfakes of celebrities. Scrolling through the gallery, we found bikini-clad images of famous female web streamers, politicians, singers, and actors.
Archived versions of the gallery on the Wayback Machine show that OpenDream was displaying AI CSAM and non-consensual deepfakes as far back as December 2023.
Upon searching for the platform’s name on popular search engines such as Google and Bing, we found that the majority of image results were also CSAM.
While OpenDream is not as widely used as some other AI image generation platforms, and it is not the first case of people using AI to generate CSAM, the extent to which such material was made available to the public without any apparent moderation set it apart from other sites we had come across at Bellingcat.
At the end of July 2024, we observed moderation attempts on the platform for the first time. As of publication, CSAM and deepfake porn images appear to have been either made private or taken down from the site. It is unclear why there was a sudden clean-up after this content was left unmoderated on the website for at least half a year. OpenDream did not respond to our requests for comment.
According to the site’s WHOIS record, the platform’s domain was registered through Namecheap, which sells web domains. Namecheap’s domain registration agreement prohibits using its services for “illegal or improper purposes”, including if the content violates the laws of any local government. The domain registrar did not respond to our requests for comment. OpenDream’s website is hosted by IT services company Cloudflare, which prohibits content that contains, displays, distributes, or encourages the creation of child sexual abuse material. Cloudflare also did not respond to our request for comment.
Monetising ‘Not Safe For Work’ Content
OpenDream generates revenue from both subscriptions and advertising. While anyone can generate a handful of images each day for free, there are also paid plans.
In July, the two most expensive subscription plans allowed users to use NSFW (not safe for work) prompts, access OpenDream’s NSFW models and hide the images they generated from the public gallery. NSFW is a general term for content that would usually be considered inappropriate to view at work, such as sexual, violent or disturbing imagery. The only NSFW imagery we observed on OpenDream was of a pornographic or sexual nature.
As of publication, mentions of public NSFW models have been removed from the pricing information, but the paid plans still include access to NSFW prompts.
Until recently, payments were powered by financial services company Stripe. But towards the end of July, the Stripe payment screen for OpenDream stopped working. Bellingcat asked Stripe about this, but the company said it could not comment on individual accounts.
Stripe’s policy prohibits pornographic content, including CSAM and non-consensual deepfakes, and it has previously taken down payment screens for other AI sites featuring pornographic content.
Archived versions of the site from the Wayback Machine show that OpenDream was first launched as a free beta product in early 2023. A pricing page first showed up sometime in July 2023, but did not include any mention of NSFW prompts or models.
However, OpenDream had introduced plans that specifically allowed NSFW prompts and use of NSFW models by December 2023, which was also around the time when obvious attempts to generate CSAM and non-consensual deepfake porn began appearing in archived versions of the gallery page.
Over the course of our investigation, we also came across Google AdSense advertisements on the site.
One of the individuals associated with OpenDream, Nguyen Viet Hai, appeared to share details of the platform’s earnings from ads and subscriptions in four posts on AI-related Facebook groups in October 2023 and April 2024.
In these posts, the account under Nguyen’s name said he was looking to sell OpenDream as the platform had run into financial problems. The posts also revealed that OpenDream made about US$2,000-3,000 a month, including about US$800-1,000 in monthly ad revenue.
The posts from October 2023 said that OpenDream had 310 users with paid plans. A post from April 2024 – with a screenshot of a Google AdSense dashboard for OpenDream’s domain – showed that from January 1 to March 31, 2024, the platform earned US$2,404.79 (or 59,282,192 Vietnamese Dong) from Google’s programmatic advertising.
Google AdSense’s policy prohibits sexually explicit content, including non-consensual sexual themes such as CSAM, whether simulated or real. In response to Bellingcat’s queries, Google said in September that OpenDream’s AdSense account had been terminated in accordance with its policies.
Nguyen is registered as a director of CBM Media Pte Ltd in Singapore, a company that was named as OpenDream’s owner in several profiles and blog posts under the platform’s name.
The posts about selling OpenDream mentioned that the company does not have to pay taxes on its earnings for the next three years, and has Stripe payments set up.
Singapore grants newly incorporated companies tax exemptions for their first three years of operation. Based on documents Bellingcat obtained from Singapore’s official business registry, which matched the address and tax code details shared in some of the profiles and blog posts, CBM Media Pte Ltd was registered in the country on June 13, 2023.
In the Facebook posts, Nguyen quoted a price of US$30,000 for the sale of OpenDream in October 2023, but raised this to US$50,000 (or 1.23 billion Vietnamese Dong) in April 2024. It is unclear if any sale has occurred.
Not a Victimless Crime
Globally, laws criminalising AI-generated CSAM have lagged behind the growth of NSFW AI image generation websites.
In both the US and the UK, AI-generated CSAM is illegal and is treated the same as real-life CSAM. In August this year, California filed a ‘first-of-its-kind’ lawsuit against some of the world’s largest websites that generate non-consensual deepfake pornography, including CSAM, marking a distinct shift in legal consequences towards the companies that offer the AI services instead of the users who create the content. The EU is also discussing new laws to criminalise AI-generated CSAM and deepfakes. But in other countries, like Japan, there are no laws against sharing synthetic sexual content depicting minors as long as it does not involve real children.
While OpenDream was one of the most egregious examples of an AI image generation website featuring CSAM we have come across, it was far from the only one that has shown up in the past year. Bellingcat previously covered other AI platforms earning money through non-consensual deepfakes, including those featuring images of very young-looking subjects.
Dan Sexton, chief technology officer at the Internet Watch Foundation (IWF), said AI CSAM is not a victimless crime.
“The reality is that there’s a good chance that you may have been using a model that has itself been trained on sexual imagery of children, so the abuse of children may well have been involved in the creation of that AI child sexual abuse imagery,” Sexton said.
Sexton added that even if the children depicted are not “real”, exposure to imagery of children being sexually abused could normalise that behaviour.
“You have the rationalisation of those consuming [AI CSAM] that ‘this isn’t a real child’. But even if it turns out it was a real child, you can rationalise it by saying it’s not real. So that I think is a dangerous path to go down,” he said.
The issue of AI-generated CSAM has grown alongside the issue of non-consensual deepfake explicit material depicting adults, as many of the AI generation services that people use to generate pornographic images of adult women can also be used to generate pornographic images of children.
According to the IWF’s July 2024 report, most AI CSAM is now realistic enough to be treated as “real” CSAM, and as AI technology improves, it will become an even more difficult problem to combat.
In the case of OpenDream, this material was made available not just through the website itself, but was also indexed by Google and Bing’s search engines.
Searching for “Opendream” and “Opendream ai” images resulted in both search engines returning synthetic sexualised images of minors. The majority of the images returned included images of bikini-clad or naked toddlers.
When asked about OpenDream’s AI CSAM images appearing in Bing’s search results, a spokesperson for Microsoft stated that the company has a “long-standing commitment to advancing child safety and tackling online child sexual abuse and exploitation”.
“We have taken action to remove this content and remain committed to strengthening our defences and safeguarding our services,” the spokesperson added.
Bellingcat confirmed that these images had been removed from Bing’s image search results after we reached out for comment.
Similarly, a spokesperson for Google stated that the search engine has “strong protections to limit the reach of abhorrent content that depicts CSAM or exploits minors, and these systems are effective against synthetic CSAM imagery as well”.
“We proactively detect, remove, and report such content in Search, and have additional protections in place to filter and demote content that sexualises minors,” the Google spokesperson said.
Google also removed the CSAM images generated by OpenDream from its search engine results after Bellingcat reached out for comment.
OpenDream, Closed Doors
While an individual named Nguyen Viet Hai was on the Singapore business registry for CBM Media Pte Ltd – the same person who had openly discussed selling OpenDream on social media – Bellingcat’s investigation suggested others were linked to the operation too.
We noticed that several individuals using social media accounts that bore OpenDream’s name had stated they worked for a company called CBM or CBM Media. It is unclear if this refers to CBM Media Pte Ltd, as Bellingcat also found another company called CBM Media Company Limited registered in the Vietnamese capital Hanoi by a man named Bui Duc Quan.
However, online evidence suggested the owners of the two companies with “CBM Media” in their names know each other.
Bui, the registered owner of the Vietnam-based CBM Media, also previously discussed owning a website named CasinoMentor.
This gambling review website shared a series of Instagram Stories, saved under a highlight called “CBM Media”, which appeared to show a group of people holidaying on Vietnam’s Ly Son Island in May 2022.
Both Nguyen and Bui can be seen in the photos, with Bellingcat matching their faces to images shared on their Facebook accounts.
Several other staff members are associated with both CBM Media and OpenDream, based on their public profiles.
For example, a “content creator” whom OpenDream’s blog lists as part of its team is now working for CasinoMentor as well as another website called BetMentor, according to their LinkedIn profile.
BetMentor lists CasinoMentor as its partner on its website. CasinoMentor’s profile on the Gambling Portal Webmasters Association (GPWA), an online gambling community, also claims ownership of BetMentor.
We found another LinkedIn profile of someone who listed their current position as a product manager at OpenDream, with past experience as an AI programmer for “CBM”.
We also found a CasinoMentor employee whose name is the same as that of an admin of an OpenDream Facebook group. Another Facebook profile that links to social media accounts under this name – but using a different profile name – lists their current employer as OpenDream.
Few, if any, CasinoMentor and BetMentor employees had genuine photos on their work profiles. Many profiles for BetMentor employees appeared to use images generated by OpenDream.
One LinkedIn profile matching the details of a CasinoMentor employee used the image of a woman who had been murdered in the UK in 2020. The profile was created in 2021, over a year after the woman’s murder.
CasinoMentor’s address, meanwhile, is in the Mediterranean state of Malta, according to its website and Facebook page.
On CasinoMentor’s Google Maps listing, the account owner uploaded several photos of an office space, including one where the site’s logo is shown on a frosted glass wall. Through a reverse image search, we found an identical image – except without the logo – on a commercial property rental listing for the co-working space in Malta where the address is located.
All other photos showing the office space on CasinoMentor’s Google Maps listing appeared to be from the same source as this rental listing, with CasinoMentor’s logo and other information added to them.
However, CasinoMentor did upload one photo on Google Maps that showed a group of people including Bui. We geolocated this photo to a shopping plaza in Vietnam and found that the filename of the photo was “CMTeam”.
Whether CasinoMentor staff are based in Vietnam or Malta is, therefore, unclear. Neither CasinoMentor nor BetMentor responded to our requests for comment.
Separately, Bellingcat also found a post by a company called VPNChecked describing OpenDream as “our product” on X.
Apparently, a Reddit account with the username “vpnchecked” had been actively moderating the OpenDream subreddit as recently as June.
Contact details on the VPNChecked website provided another intriguing clue. An archived version of the website from 2021 listed a phone number under the contact details. We searched for this phone number on Skype, revealing the identity of another individual.
The same Skype username was also used to set up a Facebook profile found by Bellingcat. Although the profile names were slightly different, the Skype profile picture matched a previous profile picture on the Facebook account.
Bellingcat is not naming this person as their name does not appear on any documents relating to ownership of either CBM Media Company Limited or CBM Media Pte Ltd.
However, the Facebook account for this individual included links to VPNChecked. They also appear to be connected to both Bui and Nguyen.
All three are friends on Facebook and have been tagged together in numerous photos, which showed them in social settings together; they have also left friendly comments on each other’s posts over the course of many years.
One photo posted by Bui in 2019 showed a group including all three sharing a toast at a restaurant. The caption, as translated by Facebook, reads: “Study together, talk together, love for 14 years has not changed!”
No Answers From OpenDream
Based on multiple social media accounts and blog posts associated with OpenDream, the site is owned by Nguyen’s Singapore-based CBM Media Pte Ltd. Our investigation shows that several OpenDream staff, as well as moderators of the platform’s social media pages, are also working for sites apparently owned by Bui’s Vietnam-registered CBM Media Company Limited. In addition, VPNChecked – a company associated with an individual apparently close to Nguyen and Bui – has claimed OpenDream as its product.
But despite all that we have been able to find, it remains unclear to what extent each of the three individuals was aware of users creating content that is illegal in the US, the UK and Canada – OpenDream’s top markets according to Nguyen’s Facebook posts.
We tried to get in touch with Bui, Nguyen, and the individual associated with VPNChecked multiple times over the course of this investigation. We reached out via various personal and professional emails found on websites associated with them. We also made several phone calls to numbers they listed on their profiles and sent messages to their personal Facebook accounts. At the time of publication we had not received any response.
OpenDream’s AI image generation tool uses open-source models freely available on AI communities such as Hugging Face. The descriptions of some of these models specifically mention NSFW capabilities and tendencies.
Dan Sexton, from the IWF, told Bellingcat it was important to consider the whole ecosystem of online CSAM, including the free, open-source AI tools and models that have made it easier to create and make money off AI image generation apps.
“These apps didn’t appear from nowhere. How many steps did it take to create one of these services? Because actually there are many points of intervention to make it harder,” Sexton said.
In this regard, Sexton said laws and regulation could be strengthened to deter commercial actors from creating such services.
“These are people looking to make money. If they’re not making money doing this, they’re trying to make money doing something else. And if it’s harder to do this, they might not do it,” he said.
“They’re not going to stop trying to make money and not necessarily trying to do it in bad ways, but at least they might move to something else.”
Since Google terminated its AdSense account, OpenDream can no longer make money off Google ads, which, according to Nguyen’s Facebook posts, previously accounted for a significant proportion of the platform’s revenue. Its payment processing service also remains down at the time of publication.
But advertisements from other programmatic advertising providers are still visible on OpenDream’s website, and the platform appears to be exploring other payment processors. Clicking to upgrade to a paid account now redirects users to a virtual testing environment for PayPal, though no actual payments can be made on this page as the testing environment – which any developer can set up – only simulates transactions. PayPal’s policy also prohibits buying or selling sexually oriented content, including specifically content that “involve[s], or appear[s] to involve, minors”.
On Sept. 11, 2024, about two weeks after Bellingcat first contacted OpenDream and the individuals associated with it, the site appeared to have been suspended.
When it came back online a day later, all evidence of AI CSAM and non-consensual deepfake pornography appeared to have been erased from the site, including from individual account and template pages where such images had remained visible even after they disappeared from the public gallery in July.
The site’s pricing page still features paid subscriptions that offer users the option to generate images privately with NSFW prompts.
But the relaunched website comes with a new disclaimer on every page: “OpenDream is a place where you can create artistic images. We do not sell images and are not responsible for user-generated images.”
Readers in the US can report CSAM content to the NCMEC at www.cybertipline.com, or 1-800-843-5678. Per the US Department of Justice, your report will be forwarded to a law enforcement agency for investigation and action. An international list of hotlines to report or seek help on CSAM-related issues is also available here.
Melissa Zhu contributed research to this piece.
Featured image: Ying-Chieh Lee & Kingston School of Art / Better Images of AI / Who’s Creating the Kawaii Girl? / CC-BY 4.0.
Bellingcat is a non-profit and the ability to carry out our work depends on the kind support of individual donors. If you would like to support our work, you can do so here. You can also subscribe to our Patreon channel here. Subscribe to our Newsletter and follow us on Twitter here and Mastodon here.