WARNING: This article discusses child sexual abuse material (CSAM).
At first glance, OpenDream looks like just one of the many generic AI image generation websites that have sprung up recently, allowing people to create pictures by typing short descriptions of what they want the results to look like.
The platform describes itself as an “AI art generator” on its website and social media channels, and invites users to “unlock a world of limitless creative possibilities”.
But for months, people used it to generate child sexual abuse material (CSAM) – and the images were left visible on the platform’s public pages without any apparent moderation.
Bellingcat first came across OpenDream through promotional posts on X (formerly Twitter) in July. Many of the accounts promoting OpenDream focused on AI tools and services.
Some of the posts about OpenDream on X featured a screenshot of the platform’s public gallery, with images of young girls next to adult women in bikinis.
The site’s main page described the platform as an “AI Art Generator”, with a fairly innocuous animation-style image. However, the public gallery page – which did not require any login to view – was filled with sexually explicit images.
Within seconds of scrolling, we came across multiple AI-generated images of young children and babies in underwear or swimwear, or being sexually abused. The prompts used to generate this imagery – which were also publicly visible – clearly demonstrated an intent to create sexualised images of children. For example, some included the word “toddler” together with specific body parts and poses, descriptions of full or partial nudity, or a sexual act.
Bellingcat has reported the site to the National Center for Missing & Exploited Children in the US, where this reporter is based.
In addition to CSAM, OpenDream appeared to place no restrictions on generating non-consensual deepfakes of celebrities. Scrolling through the gallery, we found bikini-clad images of well-known female web streamers, politicians, singers, and actors.
Archived versions of the gallery on the Wayback Machine show that OpenDream was displaying AI CSAM and non-consensual deepfakes as far back as December 2023.
Searching for the platform’s name on popular search engines such as Google and Bing, we found that the majority of image results were also CSAM.
While OpenDream is not as widely used as some other AI image generation platforms, and it is not the first case of people using AI to generate CSAM, the extent to which such material was made accessible to the public without any apparent moderation set it apart from other sites we had come across at Bellingcat.
At the end of July 2024, we noticed moderation attempts on the platform for the first time. As of publication, CSAM and deepfake porn images appear to have been either made private or taken down from the site. It is unclear why there was a sudden clean-up after this content was left unmoderated on the website for at least half a year. OpenDream did not respond to our requests for comment.
According to the site’s WHOIS record, the platform’s domain was registered through Namecheap, which sells web domains. Namecheap’s domain registration agreement prohibits using its services for “illegal or improper purposes”, including if the content violates the laws of any local government. The domain registrar did not respond to our requests for comment. OpenDream’s website is hosted by IT services company Cloudflare, which prohibits content that contains, displays, distributes, or encourages the creation of child sexual abuse material. Cloudflare also did not respond to our request for comment.
Monetising ‘Not Safe For Work’ Content
OpenDream generates income from both subscriptions and advertising. While anyone can generate a handful of images each day for free, there are also paid plans.
In July, the two most expensive subscription plans allowed users to use NSFW (not safe for work) prompts, access OpenDream’s NSFW models and hide the images they generated from the public gallery. NSFW is a general term for content that would usually be considered inappropriate to view at work, such as sexual, violent or disturbing imagery. The only NSFW imagery we saw on OpenDream was of a pornographic or sexual nature.
As of publication, mentions of public NSFW models have been removed from the pricing information, but the paid plans still include access to NSFW prompts.
Until recently, payments were powered by financial services company Stripe. But towards the end of July, the Stripe payment screen for OpenDream stopped working. Bellingcat asked Stripe about this, but the company said it could not comment on individual accounts.
Stripe’s policy prohibits pornographic content, including CSAM and non-consensual deepfakes, and it has previously taken down payment screens for other AI sites featuring pornographic content.
Archived versions of the site from the Wayback Machine show that OpenDream was first launched as a free beta product in early 2023. A pricing page first showed up sometime in July 2023, but did not include any mention of NSFW prompts or models.
However, by December 2023 OpenDream had introduced plans that specifically allowed NSFW prompts and use of NSFW models – which was also around the time when obvious attempts to generate CSAM and non-consensual deepfake porn began appearing in archived versions of the gallery page.
Over the course of our investigation, we also came across Google AdSense advertisements on the site.
One of the individuals associated with OpenDream, Nguyen Viet Hai, appeared to share details of the platform’s earnings from ads and subscriptions in four posts on AI-related Facebook groups in October 2023 and April 2024.
In these posts, the account under Nguyen’s name said he was looking to sell OpenDream as the platform had run into financial problems. The posts also revealed that OpenDream made about US$2,000-3,000 a month, including about US$800-1,000 in monthly ad revenue.
The posts from October 2023 said that OpenDream had 310 users with paid plans. A post from April 2024 – with a screenshot of a Google AdSense dashboard for OpenDream’s domain – showed that from January 1 to March 31, 2024, the platform earned US$2,404.79 (or 59,282,192 Vietnamese Dong) from Google’s programmatic advertising.
Google AdSense’s policy prohibits sexually explicit content, including non-consensual sexual themes such as CSAM, whether simulated or real. In response to Bellingcat’s queries, Google said in September that OpenDream’s AdSense account had been terminated in accordance with its policies.
Nguyen is registered as a director of CBM Media Pte Ltd in Singapore, a company that was named as OpenDream’s owner in several profiles and blog posts under the platform’s name.
The posts about selling OpenDream mentioned that the company does not need to pay taxes on its earnings for the next three years, and has Stripe payments set up.
Singapore grants newly incorporated companies tax exemptions for their first three years of operation. Based on documents Bellingcat obtained from Singapore’s official business registry, which matched the address and tax code details shared in some of the profiles and blog posts, CBM Media Pte Ltd was registered in the country on June 13, 2023.
In the Facebook posts, Nguyen quoted a price of US$30,000 for the sale of OpenDream in October 2023, but raised this to US$50,000 (or 1.23 billion Vietnamese Dong) in April 2024. It is unclear if any sale has taken place.
Not a Victimless Crime
Globally, laws criminalising AI-generated CSAM have lagged behind the growth of NSFW AI image generation websites.
In both the US and the UK, AI-generated CSAM is illegal and is treated the same as real-life CSAM. In August this year, California filed a ‘first-of-its-kind’ lawsuit against some of the world’s largest websites that generate non-consensual deepfake pornography, including CSAM, marking a distinct shift in legal consequences towards the companies that offer the AI services rather than the users who create the content. The EU is also discussing new laws to criminalise AI-generated CSAM and deepfakes. But in other countries, like Japan, there are no laws against sharing synthetic sexual content depicting minors as long as it does not involve real children.
While OpenDream was one of the most egregious examples of an AI image generation website featuring CSAM we have come across, it was far from the only one to have shown up in the past year. Bellingcat previously covered other AI platforms earning money through non-consensual deepfakes, including those featuring images of very young-looking subjects.
Dan Sexton, chief technology officer at the Internet Watch Foundation (IWF), said AI CSAM is not a victimless crime.
“The reality is that there’s a good chance that you may have been using a model that has itself been trained on sexual imagery of children, so the abuse of children may well have been involved in the creation of that AI child sexual abuse imagery,” Sexton said.
Sexton added that even if the children depicted are not “real”, exposure to imagery of children being sexually abused could normalise that behaviour.
“You have the rationalisation of those consuming [AI CSAM] that ‘this is not a real child’. But even if it turns out it was a real child, you could rationalise it by saying it’s not real. So that I think is a dangerous path to go down,” he said.
The issue of AI-generated CSAM has grown alongside the problem of non-consensual deepfake explicit material of adults, as many of the AI generation services that people use to generate pornographic images of adult women can also be used to generate pornographic images of children.
According to the IWF’s July 2024 report, most AI CSAM is now realistic enough to be treated as “real” CSAM, and as AI technology improves, it will become an even more difficult problem to combat.
In the case of OpenDream, this material was made accessible not just through the website itself, but was also indexed by Google’s and Bing’s search engines.
Searching for “Opendream” and “Opendream ai” images resulted in both search engines returning synthetic sexualised images of minors. The majority of the images returned included pictures of bikini-clad or naked toddlers.
When asked about OpenDream’s AI CSAM images appearing in Bing’s search results, a spokesperson for Microsoft stated that the company has a “long-standing commitment to advancing child safety and tackling online child sexual abuse and exploitation”.
“We have taken action to remove this content and remain committed to strengthening our defences and safeguarding our services,” the spokesperson added.
Bellingcat confirmed that these images were removed from Bing’s image search results after we reached out for comment.
Similarly, a spokesperson for Google stated that the search engine has “strong protections to limit the reach of abhorrent content that depicts CSAM or exploits minors, and these systems are effective against synthetic CSAM imagery as well”.
“We proactively detect, remove, and report such content in Search, and have additional protections in place to filter and demote content that sexualises minors,” the Google spokesperson said.
Google also removed the CSAM images generated by OpenDream from its search engine results after Bellingcat reached out for comment.
OpenDream, Closed Doors
While an individual named Nguyen Viet Hai was on the Singapore business registry for CBM Media Pte Ltd – the same person who had openly discussed selling OpenDream on social media – Bellingcat’s investigation suggested others were linked to the operation too.
We noticed that several individuals using social media accounts that bore OpenDream’s name had stated they worked for a company called CBM or CBM Media. It is unclear if this refers to CBM Media Pte Ltd, as Bellingcat also found another company called CBM Media Company Limited registered in the Vietnamese capital Hanoi by an individual named Bui Duc Quan.
However, online evidence suggested the owners of the two companies with “CBM Media” in their names know each other.
Bui, the registered owner of the Vietnamese CBM Media, also previously discussed owning a website named CasinoMentor.
This gambling review website shared a series of Instagram Stories, saved under a highlight called “CBM Media”, which appeared to show a group of people holidaying on Vietnam’s Ly Son Island in May 2022.
Both Nguyen and Bui could be seen in the photos, with Bellingcat matching their faces to photos shared on their Facebook accounts.
Several other staff members are associated with both CBM Media and OpenDream, based on their public profiles.
For example, a “content creator” whom OpenDream’s blog lists as part of its team is now working for CasinoMentor as well as another website called BetMentor, according to their LinkedIn profile.
BetMentor lists CasinoMentor as its partner on its website. CasinoMentor’s profile on the Gambling Portal Webmasters Association (GPWA), an online gambling community, also claims ownership of BetMentor.
We found another LinkedIn profile of someone who listed their current position as a product manager at OpenDream, with past experience as an AI programmer for “CBM”.
We also found a CasinoMentor employee whose name is the same as an admin of an OpenDream Facebook group. Another Facebook profile that links to social media accounts under this name – but using a different profile name – lists their current employer as OpenDream.
Few, if any, CasinoMentor and BetMentor employees had genuine photos on their work profiles. Many profiles for BetMentor employees appeared to use images generated by OpenDream.
One LinkedIn profile matching the details of a CasinoMentor employee used the photo of a woman who had been murdered in the UK in 2020. The profile was created in 2021, over a year after the woman’s murder.
CasinoMentor’s address, meanwhile, is in the Mediterranean state of Malta, according to its website and Facebook page.
On CasinoMentor’s Google Maps listing, the account owner uploaded several photos of an office space, including one where the site’s logo is shown on a frosted glass wall. Through a reverse image search, we found an identical photo – except without the logo – on a commercial property rental listing for the co-working space in Malta where the address is located.
All other photos showing the office space on CasinoMentor’s Google Maps listing appeared to come from the same source as this rental listing, with CasinoMentor’s logo and other information added to them.
However, CasinoMentor did upload one photo on Google Maps that showed a group of people including Bui. We geolocated this photo to a shopping plaza in Vietnam and found that its filename was “CMTeam”.
Whether CasinoMentor staff are based in Vietnam or Malta is, therefore, unclear. Neither CasinoMentor nor BetMentor responded to our requests for comment.
Separately, Bellingcat also found a post by a company called VPNChecked describing OpenDream as “our product” on X.
Interestingly, a Reddit account with the username “vpnchecked” had been actively moderating the OpenDream subreddit as recently as June.
Contact details on the VPNChecked website provided another intriguing clue. An archived version of this website from 2021 listed a phone number under the contact details. We searched for this phone number on Skype, revealing the identity of another individual.
The same Skype username was also used to set up a Facebook profile found by Bellingcat. Although the profile names were slightly different, the Skype profile picture matched a previous profile picture on the Facebook account.
Bellingcat is not naming this person as their name does not appear on any documents relating to ownership of either CBM Media Company Limited or CBM Media Pte Ltd.
However, the Facebook account for this individual included links to VPNChecked. They also appear to be connected to both Bui and Nguyen.
All three are friends on Facebook and have been tagged together in numerous photos. These photos showed them in social settings together, as well as leaving friendly comments on each other’s posts over the course of many years.
One photo posted by Bui in 2019 showed a group including all three sharing a toast at a restaurant. The caption, as translated by Facebook, reads: “Study together, talk together, love for 14 years has not changed!”
No Answers From OpenDream
Based on several social media accounts and blog posts associated with OpenDream, the site is owned by Nguyen’s Singapore-based CBM Media Pte Ltd. Our investigation shows that several OpenDream staff, as well as moderators of the platform’s social media pages, are also working for sites apparently owned by Bui’s Vietnam-registered CBM Media Company Limited. In addition, VPNChecked – a company associated with an individual apparently close to Nguyen and Bui – has claimed OpenDream as its product.
But despite all that we have been able to find, it remains unclear to what extent each of the three individuals was aware of users creating content that is illegal in the US, the UK and Canada – OpenDream’s top markets according to Nguyen’s Facebook posts.
We tried to get in touch with Bui, Nguyen, and the individual associated with VPNChecked several times over the course of this investigation. We reached out via various personal and professional emails found on websites associated with them. We also made several phone calls to numbers they listed on their profiles and sent messages to their personal Facebook accounts. At the time of publication we had not received any response.
OpenDream’s AI image generation tool uses open-source models freely available on AI communities such as Hugging Face. The descriptions of some of these models specifically mention NSFW capabilities and tendencies.
Dan Sexton, from the IWF, told Bellingcat it was important to consider the whole ecosystem of online CSAM, including the free, open-source AI tools and models that have made it easier to create and monetise AI image generation apps.
“These apps didn’t appear from nowhere. How many steps did it take to create one of these services? Because actually there are lots of points of intervention to make it harder,” Sexton said.
In this regard, Sexton said laws and regulation could be strengthened to discourage commercial actors from creating such services.
“These are people looking to make money. If they’re not making money doing this, they’re trying to make money doing something else. And if it’s harder to do this, they might not do it,” he said.
“They’re not going to stop trying to make money and not necessarily trying to do it in bad ways, but at least they might move to something else.”
Since Google terminated its AdSense account, OpenDream can no longer make money from Google ads which, according to Nguyen’s Facebook posts, previously accounted for a significant proportion of the platform’s revenue. Its payment processing service also remains down at the time of publication.
But advertisements from other programmatic advertising providers are still visible on OpenDream’s website, and the platform appears to be exploring other payment processors. Clicking to upgrade to a paid account now redirects users to a virtual testing environment for PayPal, although no actual payments can be made on this page as the testing environment – which any developer can set up – only simulates transactions. PayPal’s policy also prohibits buying or selling sexually oriented content, including specifically content that “involve[s], or appear[s] to involve, minors”.
On Sept. 11, 2024, about two weeks after Bellingcat first contacted OpenDream as well as the individuals associated with it, the site appeared to have been suspended.
When it came back online a day later, all evidence of AI CSAM and non-consensual deepfake pornography appeared to have been erased from the site, including from the individual account and template pages where such images were visible even after they disappeared from the public gallery in July.
The site’s pricing page still features paid subscriptions that offer users the option to generate images privately with NSFW prompts.
But the relaunched website comes with a new disclaimer on every page: “OpenDream is a place where you can create artistic images. We do not sell images and are not responsible for user-generated images.”
Readers in the US can report CSAM content to the NCMEC at www.cybertipline.com, or 1-800-843-5678. Per the US Department of Justice, your report will be forwarded to a law enforcement agency for investigation and action. An international list of hotlines to report or seek help on CSAM-related issues is also available here.
Melissa Zhu contributed research to this piece.
Featured image: Ying-Chieh Lee & Kingston School of Art / Better Images of AI / Who’s Creating the Kawaii Girl? / CC-BY 4.0.
Bellingcat is a non-profit and the ability to carry out our work is dependent on the kind support of individual donors. If you would like to support our work, you can do so here. You can also subscribe to our Patreon channel here. Subscribe to our Newsletter and follow us on Twitter here and Mastodon here.