In the U.S., operating a child sexbot would seem to be illegal under the PROTECT Act, which prohibits "virtual child porn":
"In the United States, the PROTECT Act of 2003 made significant changes to the law regarding virtual child pornography. Any realistic appearing computer generated depiction that is indistinguishable from a depiction of an actual minor in sexual situations or engaging in sexual acts is illegal under 18 U.S.C. § 2252A."[1]
Maybe the government could get away with it, but if independent researchers did it, they could be charged with producing child porn.
Yes, Orwell's "thoughtcrime" is an actual criminal offense in the U.S.
FWIW, simulated child porn laws aren't just about "thought crimes". They're motivated by the fact that defendants in cases involving real child porn sometimes claim the child porn is simulated, and it can be difficult and expensive to prove that false.
Which is why making pictures illegal is rather screwy.
Make causing actual harm to an actual child illegal. Make profiting from it (e.g., selling pictures of said actual harm) illegal. Make possessing pictures not illegal, but something that will get you asked where the pictures came from.
But really it is about thoughtcrime. Society (or at least politicians) has decided that having certain interests/desires is wrong (or rather, that those people are creepy and dangerous and can't possibly have the self-control not to act on said thoughts). And so anything that may indicate someone having said thoughts is made illegal, even if it doesn't hurt anyone, and even if it indicates that they know not to act in harmful ways.
Sure, and the supposedly-stolen things I'm selling really just fell off the back of a truck.
I would think that sellers having ties to producers would have a harder time arguing they were selling fakes. And if they don't have those ties -- if they're not supporting the people causing the harm -- then their actions are only illegal for the sake of simplifying the prosecution's work. Which means their likely being able to get off at trial is less of an issue.
That something can make law enforcement's job harder, even though it's otherwise completely innocuous*, is a terrible reason for outlawing it. Especially considering that whole First Amendment thing.
* In this scenario it's something creepy but harmless. In another it might be fake guns or fake drugs.**
** Consider the implications of outlawing bags of sugar.
I'm not positive, but it doesn't look to me like they are really operating a child sexbot; they are operating a childbot that is soliciting sex work, which is very different. I find it unlikely (though, maybe?) that they are doing any nudity rendering or anything like that; they're just getting people to agree to pay money for some sex acts from the robot.
I was with you til the last sentence. Yes, making pornography that appears to be child porn is illegal. Even taking a real adult porn star and dressing her up as a child is illegal. Why is this Orwellian?
Because the reason child pornography is illegal is not that viewing child pornography is bad, it's that producing it is clearly wrong. The producers are hard to hit directly, but one can reduce their revenue by making it hard to consume their product.
Producing virtual child pornography is not obviously wrong. But no politician will speak up, because THINK OF THE CHILDREN (what children? DOESN'T MATTER, THINK OF THEM).
Thus orwellian thoughtcrime.
EDIT: I'm willing to agree that leaving real producers an escape path by letting them claim everything was virtual is bad. So the US clause for 'realistic' virtual porn is okay in my book.
But in the UK stylised drawings of child pornography are illegal. That's just silly.
'Realistic' virtual porn producers should just be asked about the source of the generated scene, just as producers of real porn are asked for actors' age verification (https://en.wikipedia.org/wiki/Adult_film_industry_regulation...). There is no reason to ban it on this assumption alone. I think the real reason is different.
> Even taking a real adult porn star and dressing her up as a child is illegal.
Is it really? Isn't "catholic schoolgirl" stereotypically one of the most popular 'fetishes', right up there with "pizza delivery guy" (not sure I'd call roleplaying a fetish per se)?
And at the same time, the adult porn business continues chugging on. I have to wonder what the fuck we are thinking as a human race: if it's totally OK for adults to buy, distribute, and sell porn, can't we see that some people might not understand that it's not OK to watch porn with children, and that maybe they were driven to it originally by watching adult porn?
Would we have so many men interested in having sex with children if the porn industry wasn't so big?
The problem is they are not soliciting a bot, they are soliciting children and are going to lengths to ensure that they are getting what they are paying for.
If an MMORPG had "100 year old elvish" NPCs which had childlike features and full nudity, then playing the game might be distasteful but probably should fall under constitutional protection.
The law is very cognizant of intent and willfulness. If you willfully intend to break the law, but fail to actually break the law, you can still be charged and convicted of a crime. I don't have a problem with that.
I think you must agree that if you solicit a bot then there is no crime, just as if you "murder" a bot there is no crime. Ultimately we have juries to decide whether the accused is full of shit in claiming they were soliciting bots when they were actually soliciting children. The burden of proof falls on the government to prove this beyond a reasonable doubt.
As bots get increasingly realistic and entertaining enough to create a market for that service (regardless of the physical characteristic of the avatar) it will be interesting to watch jurisprudence evolve.
A convincing avatar that can be synchronized with a human's text input, and/or speech alteration (not text-to-speech), is one way to approach 'the singularity' without having to synthesize human thoughts and emotions. Such a system, if it can cross the uncanny valley, would radically improve quality of life for millions of (and I hate this term) 'genetic lottery losers'.
> If you willfully intend to break the law, but fail to actually break the law, you can still be charged and convicted of a crime.
No, you can't.
If you take concrete action in an attempt to commit a crime, that is often itself a crime (usually a lesser offense than the one attempted; e.g., "attempted murder" is a distinct crime, and a lesser offense than murder.) But if you are not proven to have broken the law (and, more specifically, committed a crime -- not all lawbreaking is criminal), you cannot be convicted of a crime.
Do attempted murder charges typically have a less extreme penalty than actual murder charges? It is my impression that is the case, but I can't find much that actually supports that notion.
This is a concern to me because it doesn't make sense that somebody who tries to kill somebody but does a poor job of it should be punished less harshly than somebody who tries to kill somebody and manages to pull it off. Nobody should get off lighter just because their victim was particularly hardy.
Of course on the opposite end of the spectrum, if I try to break the speed limit with my beat up '65 Volkswagen beetle, but fail to do so, obviously I don't deserve a speeding ticket. Attempted speeding isn't a crime, not all "attempted [whatevers]" need to be punished.
> Do attempted murder charges typically have a less extreme penalty than actual murder charges?
Yes.
> This is a concern to me because it doesn't make sense to me that somebody who tries to kill somebody but does a poor job of it should be punished less harshly than somebody who tries to kill somebody and manages to pull it off.
Typically, both the degree of "wrongness" of the act and the degree of harm inflicted by it are factors in setting criminal punishments. While attempted murder might be as morally wrong as actual murder, the harm inflicted is different.
What if we view sentencing not as punishment or revenge, but rather as something that we do for the safety of society?
It isn't clear to me that an attempted murderer is less dangerous to society than a successful murderer (unless we assume that all failed murder attempts failed due to incompetence, but I don't think that is safe to say); surely they should both be kept off the streets for the same amount of time.
> What if we view sentencing not as punishment or revenge, but rather as something that we do for the safety of society?
It could be argued that someone with both demonstrated bad intent and demonstrated capacity is still more dangerous than someone with equally bad intent who fails to demonstrate the capacity to successfully carry that intent to fruition.
> This is a concern to me because it doesn't make sense to me that somebody who tries to kill somebody but does a poor job of it should be punished less harshly than somebody who tries to kill somebody and manages to pull it off.
Your friend calls you. The police have discovered you tried to hire a hitman to kill your wife. You despair, but then realize you won't get punished any worse for killing her before they show up...
Is hiring a hitman who successfully kills somebody the same crime as successfully killing somebody yourself?
I understand not increasing the punishment of certain types of crimes so that we don't create incentives to escalate (for example, don't give the death penalty for armed robberies, because then all armed robberies would turn into murders as well), but in the case of attempted murder, the person doing it has already committed themselves to killing somebody. There aren't really many ways that can escalate. If they tried to kill somebody, realized that they failed, and had the opportunity to try again, I don't think there is much incentive currently for them to not try again.
> Is hiring a hitman who successfully kills somebody the same crime as successfully killing somebody yourself?
Yes.
(Well, hiring the hitman may itself be other crimes, but once you've done it, it's the same set of crimes for "hiring the hitman to kill somebody" + "hitman kills them" and "hiring the hitman to kill somebody" + "you kill them yourself". For you, at least. Fewer crimes for the hitman, though.)
Hmm, that seems strange, but on the other hand felony murder makes sense to me, so I'm not sure why I should find this strange. My intuition on what punishments should be doled out for extreme crimes is probably a little wonky.
"If you willfully intend to break the law, but fail to actually break the law, you can still be charged and convicted of a crime"
Suppose I try to kill you using my specially crafted voodoo doll, but fail to do so (my magic powers are not what they used to be) then I can still be convicted? What if I curse someone, is that enough to be convicted? I doubt it.
There is something called "moral luck" in philosophy, which relates to this.
I should probably give up, because clearly my wording was poor...
By 'willfully intend' I meant basically what jlgreco said, 'take concrete action in an attempt to commit a crime' and I fully understand that the charge for that is different than the charge if you were successful in the act.
If you 'willfully intend' to commit murder by sticking pins in a voodoo doll, you are crazy but probably not guilty of attempted murder.
I'm not exactly sure what the jury instructions would say. Is it whether the act would reasonably result in death, or whether the defendant believed at the time that the act would result in the death of their chosen victim?
Oftentimes, trying to kill someone in a way that is extremely unlikely to actually result in their death doesn't save you from getting charged. A good example is Ross Ulbricht hiring a hitman online for bitcoins. Twice.
> The law is very cognizant of intent and willfulness.
Not in strict liability offenses, such as statutory rape. Which isn't entirely relevant to playing a game, until the legislature makes 'possession of child pornography' strict liability and includes drawings as being child pornography.
But what you suggest is unlikely to occur because the Supreme Court has consistently found strict liability for first amendment issues is unconstitutional. See: http://www.volokh.com/posts/1218485530.shtml
My personal position is that "soliciting" an "underage" sexbot should not ever be a prosecutable crime, but that they can be used in the current way to determine probable cause for warrants searching suspected pedophiles.
If there are actual human victims, then prosecute with all the harshness possible; but if some pedophile gets off on 100% computer-generated images, well, then that's probably a win for society that reduces violence and exploitation.
I think a lot of the probable cause comes down to the venue and context. After all, someone going on a rampage in a MMORPG could not be pursued for being a potential murderer. Yet a random entity in a public, online chatroom is assumed to be human by default (despite all the bots out there), so if someone wants to get kicks from a robotic kid, a public venue like that is not the place and they need to engage in these activities in a suitable forum (like a game).
Would you apply this standard to all potential crimes?
For instance, if I try to hire a hitman and the deal falls through (maybe I do not offer enough money, maybe he is too busy to take the contract, maybe he is an undercover cop), would you say that my actions should not be a prosecutable crime?
This is not a fair comparison. It would be if the original poster had said "personally I have no problem with people trying to have sex with minors, only if they succeed should they be punished." What he said was more along the lines of, I have no problem with people killing computer generated characters in video games, just don't kill real people.
The comparison is between trying to hire a hitman, and failing because the hitman is an undercover cop, and trying to seduce a child, and failing because the child is an algorithm.
In both cases the person fully believes they're interacting with the genuine article.
Here's the key difference: If I play a video game I KNOW the characters aren't real people. If "they" are soliciting a child thinking that it's a real child, then yeah I think that should be a crime.
It's already well established and detailed how attempted/failed crimes are handled; I won't go into details (did it fail because of external circumstances or your own choice? what was the intent? etc.).
But if I try to hire a hitman to kill a virtual 3D character or the Easter Bunny, it shouldn't be a prosecutable crime. In an attempted crime there should be a victim that I attempted to harm.
I can agree that the entrapment / undercover cop scenario might make it properly prosecutable, depending on how it's done. I'd be willing to bet that in most of those cases, if you got a warrant you'd get evidence of actually harmed kids, making the case very clear. However, if the perpetrator knows that the "target" is virtual and no one is harmed, then it shouldn't be illegal.
Yes, it's correct. Giving it more thought, I agree that the scenario of original article should be prosecuted, barring issues of entrapment, unless the perpetrator understood that it's a fake (and he didn't).
That still leaves the point that such "avatars" should be legal if made/bought/sold for recreational purposes when everyone knows that no one is getting harmed; i.e., treating child porn as evil not in itself, but because it's evidence of, and financing for, actual evil acts.
> but if some pedophile gets off on 100% computer-generated images, well, then that's probably a win for society that reduces violence and exploitation.
More research is needed. Some people suggest that users of pornography need to collect more and more of it. Some people who collect images of child sexual abuse are caught with hundreds of thousands of images.
A person collecting pseudo-photograph images of child sexual abuse might be a risk to children. (Data is tricky to find.)
But yes, if people can stay away from children and real images of child sexual abuse then pseudo photographs would be great.
There is a reason to think that virtual images of child sexual abuse cause real world child sexual abuse. People need more images, and there aren't enough virtual images and so the collection is 'topped up' with real images. And there's a possibility that people viewing images, even virtual, of child sexual abuse will go on to abuse real world children.
Obviously it's very hard to test that, but it's at least credible.
The other point about pseudo-photographs is that they're a historical remnant from the time before we had photo-realistic rendering.
People would take real images of children, and manipulate those into images of child sexual abuse. Sometimes combining real world children's heads onto real world images of child sexual abuse, or images of adults having sex.
In that case two images are created from one instance of abuse. Two children are shown, but only one has suffered abuse. Or children are shown, but adults were used for the sexual parts of the photograph. Since people thought strongly that possession of images of child sexual abuse was a signal that the person would go on to commit actual abuse they made sure to make these images illegal.
I find this very hard to talk about. I strongly want to protect children from harm. Images of child sexual abuse cover a wide range of harm, from sexually suggestive clothed posing of 15 year olds at one end to brutal rape of small infants at the other. My instinct is to keep these images illegal, but I realise that the demonisation of paedophiles does not help them seek help to stop their offending behaviour, and drives them to "support groups" that attempt to normalise their behaviour. And if we did proper research and found that virtual images reduced real world offending it'd be hard to keep those images illegal.
>There is a reason to think that virtual images of child sexual abuse cause real world child sexual abuse. People need more images, and there aren't enough virtual images and so the collection is 'topped up' with real images. And there's a possibility that people viewing images, even virtual, of child sexual abuse will go on to abuse real world children.
Or you could just as well argue that people use the imagery to fulfill their needs, and if they didn't have that they'd turn to actual child abuse.
'credible' should not be good enough. (Of course in the current political systems, it doesn't even have to be credible if you can spin it scarily enough: see war on terror, war on drugs, sex offender registrations for urinating in public etc.) You need evidence that actually supports your theory.
So far we have people who collect images of child sexual abuse, and some of those people go on to abuse children more directly.
We don't have groups of people telling us that their collections of images of child sexual abuse help prevent their offending behaviour. (For some obvious reasons, of course.)
I don't have any of the studies on hand to link to, but IIRC simulating negative desires (CP, rape, violence, etc) doesn't always reduce a person's incident rate and might actually increase it in some cases.
I think we should worry less about the criminals and more about the victims. The objective should never be to put every sex offender behind bars; the objective should be that no child is the victim of such crimes.
Once we get that clear, putting all potential offenders behind bars becomes one of the ways to get to zero victims. The question is how effective that method is, whether it is achievable, and whether it can backfire.
I would say that a bot that can detect a vulnerable child is far more effective at preventing a crime than one that detects a potential offender.
You are primarily attacking the supply side of the equation. Locking up those looking for this is working on the demand side. Both are valid means that should be done in concert.
The more (or less, who the hell knows) horrible aspect of this is that I foresee a cottage industry springing up where "good enough" digital replicas of children are used to provide sex services in the same way that phone sex operators work now -- only, it'll be known (or at least assumed) that the service is being provided by adults pretending to be children with digital avatars, and the quality of the service will be measured by how convincingly they portray their parts.
This will lead to all sorts of legal gray areas, wherein "I just assumed she was digital" could well work as an affirmative defense, or, even more eerily, a faux-pedophile service provider could staff enough adults to cover for the non-adults that they sometimes use.
> And there's stuff in the photography bit that means pseudo-photographs still count as illegal.
That's where America differs. We expressly allow pseudo-imagery under the first amendment. Logic being that the harm comes in the exploitation, but for digital representations of images that are pornographic, nobody was exploited in the process of their creation (at least, assuming nobody was exploited in the process of their creation).
The UK laws though seem to both be more tangible than the situation discussed in the article, at least assuming that 'sexual grooming' has been defined and is fairly narrowly interpretable.
I'm not the person to say whether pretend pedo-porn is a net good or evil for the world; I can see arguments in either direction. But if they're actually grooming children whom they then meet, clearly that's a net bad.
Edit: I had it exactly backwards, and now I'm wondering how in the hell I remember it so wrongly. Apparently we do not condone digital representations of child porn under the PROTECT Act of 2003[1].
> This will lead to all sorts of legal gray areas, wherein "I just assumed she was digital" could well work as a positive defense, or, even more eerily, a faux-pedophile service provider could staff enough adults to cover for the non-adults that they sometimes use.
Why would they use non-adults at all, if they have software that can cover the visuals? Adults are easier to control and completely legal.
That's a good question actually -- my initial thought was for authenticity, but cost could be another reason.
That said, I was only hypothesizing, so it's very possible that kids would be more problematic than they're worth in such an endeavor, not even counting the increased legal risk.
And I was bullied at school. That's why I spend my day browsing child-beating websites.
But seriously, I have to disagree. I'm aware of the absurdities that arise from populist demonisation of paedophiles. (E.g., the 17-year-old girl who was arrested for child porn because she sent a picture of her tits to her boyfriend.) But I really think we should take steps to choke off the demand for live images of child abuse.
I don't claim we can entirely eliminate it, but we can (and do) make it a helluva lot harder than it might be if we took a laissez-faire attitude. I think that has a real effect on children's lives.
Somewhere, in deep, dark, cold Russia, an army of Hackers is already building an army of undetectable Pedophile Man Bots just to screw with these guys ..
Seriously though, wouldn't the creators be creating a virtual child designed to encourage pedophilia? So who would they be to judge?
I think the first comment says you have to purposefully be trying to break the law. If you went into a situation where you knew what you were seeing was a bot and not a person then I don't think that would count as attempting to break any laws. Or am I misreading that?
"In the United States, the PROTECT Act of 2003 made significant changes to the law regarding virtual child pornography. Any realistic appearing computer generated depiction that is indistinguishable from a depiction of an actual minor in sexual situations or engaging in sexual acts is illegal under 18 U.S.C. § 2252A."
While in both scenarios a crime is technically committed (I would guess something like "using a computer with the intent of accessing/distributing child pornography"), can't you appreciate how it bothers some to think that one could be convicted without any victim? Let's use a computer analogy: what if I put up a server running an SSH-like service that presents a false shell, allows commands to be entered, and even gives reasonable-yet-bogus responses, and I put the username/password information on a pastebin and post it everywhere? Should anyone who connects to it and attempts to run "rm -rf /" be convicted of a crime for intending to destroy computer data, even when their action actually has no effect?
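The "false shell" in that analogy can be sketched in a few lines of Python. This is purely illustrative (the command names, canned responses, and `fake_shell`/`attempt_log` names are my own invention, not from any real honeypot): commands are logged but never executed, so a destructive command like `rm -rf /` has no effect on anything.

```python
# Hypothetical sketch of a honeypot "false shell": it parses commands,
# records them, and returns plausible canned output without executing
# anything. Names and responses are illustrative assumptions.

attempt_log = []  # records every command an intruder tried to run

def fake_shell(command: str) -> str:
    """Return a plausible-looking response; never execute anything."""
    attempt_log.append(command)
    parts = command.strip().split()
    if not parts:
        return ""
    name = parts[0]
    if name == "ls":
        return "bin  etc  home  var"  # canned directory listing
    if name == "whoami":
        return "root"  # flatter the intruder
    if name == "rm":
        return ""  # pretend the deletion succeeded; nothing happens
    return f"bash: {name}: command not found"

if __name__ == "__main__":
    fake_shell("rm -rf /")  # no data is harmed
    print(attempt_log)      # the "attempt" is all that was captured
```

The point of the analogy survives in code: the only artifact of the visitor's action is the log entry recording their intent.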
Crime is a combination of intentions, actions and results. Just because no-one is harmed doesn't mean there's no crime. If I try and murder someone and fail (perhaps I'm a poor shot?) then I'm hardly an innocent.
I'd recommend the webcomic 'The illustrated guide to law' to everyone, especially many of the commentators on here. It is disturbing to read some of these posts.
I know there is a crime committed here, my first line was "While in both scenarios a crime is technically committed..." And of course I don't question the general case that attempting to commit crimes should itself be a crime, I was merely pointing out that applying that reasoning can sometimes lead to questionably-just outcomes.
Code is not an undercover cop. Undercover cops have cost restrictions that prevent their casual use. An undercover cop is also still a real person who may still be having a crime committed against them.
There's too many differences for this argument to hold. It is, at best, a good start of an argument, but one that needs to address the practical differences between a program and a human, in terms of cost, morality, sentience, etc. And I think at the end of that process you'd find that you can't just treat code as an undercover cop, and it wasn't actually that great a starting point after all.
Here's a philosophical question: as technology progresses to the point that virtual reality is good enough to replace real children in cases like these and satisfy child predators, thus eliminating victims, is this a fundamentally morally good thing?
Unbelievably interesting. I'm working on studies of human behaviour in various channels, including darknets like those. I am curious how psychological manipulation can be used to subconsciously counter this; the CG sexbot was the missing piece.
Are you really so blind to the fact of poverty, or the sheer numbers of people involved, that you think one computer graphics entrapment tool is driving any serious part of the "supply"? This is poor people, and victims of sociopathic criminals. And one program.
One of the major problems with law enforcement today is that so long as they carefully skirt the edges of what is legally defined as "entrapment", they can bait people into crimes, and arrest them for it even if otherwise they would not have committed any crime at all.
This really isn't good, and that's what the anti-entrapment statutes are for, but law enforcement finds its way around them... That's one of the best ways to catch "criminals", and catching criminals is their job, right?
Surely you don't think having a bot introduce itself as a 10-year-old Filipina counts as entrapment. That's not even skirting the line: based on what they've shown, the bot is not the one that brings up sex.
[1] https://en.wikipedia.org/wiki/Simulated_child_pornography#Vi...