X offices raided in France as UK opens fresh investigation into Grok (bbc.com)
283 points by vikaveri 20 hours ago | 485 comments
- utopiah 46 minutes ago To people claiming a physical raid is pointless from the standpoint of gathering data:
- you are thinking about a company doing good things the right way. You are thinking about a company abiding by the law, storing data on its own server, having good practices, etc.
The moment a company starts to do dubious stuff, good practices start to go out the window. People write emails with cryptic analogies, people start deleting emails... Then, as the circumventions become more numerous and complex, there still needs to be a trail for things to remain understandable. That trail will be in written form somehow, and it must be hidden. It might be paper, it might be shadow IT, but the point is that if you are doing more than just forgetting to keep track of coffee pods at the social corner, you will leave traces.
So yes, raids do make sense BECAUSE it's about recurring complex activities that are just too hard to keep in the mind of one single individual over long periods of time.
- miki123211 59 minutes agoThis vindicates the pro-AI censorship crowd I guess.
It definitely makes it clear what is expected of AI companies. Your users aren't responsible for what they use your model for, you are, so you'd better make sure your model can't ever be used for anything nefarious. If you can't do that without keeping the model closed and verifying everyone's identities... well, that's good for your profits I guess.
[-]- culi 29 minutes agoIt's not really different from how we treat any other platform that can host CSAM. I guess the main difference is that it's being "made" instead of simply "distributed" here
- themafia 17 minutes agoHolding corporations accountable for their profit streams is "censorship?" I wish they'd stop passing models trained on internet conversations and hoarded data as fit for any purpose. The world does not need to boil oceans for hallucinating chat bots at this particular point in history.
- Altern4tiveAcc 17 hours ago> Prosecutors say they are now investigating whether X has broken the law across multiple areas.
This step could come before a police raid.
This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.
- bawolff 2 hours ago > and no crime was prevented by harassing local workers.
Seizing records is usually a major step in an investigation. It's how you get evidence.
Sure, it could just be harassment, but this is also how normal police work looks. France has a reasonable judicial system, so absent other evidence I'm inclined to believe this was legit.
- giancarlostoro 23 minutes ago> This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.
I wouldn't even consider this a reason if it weren't for the fact that OpenAI and Google, and hell, literally every image model out there, all have the same "this guy edited this underage girl's face into a bikini" problem (that was the most public example I've heard, so I'm going with it). People still jailbreak ChatGPT, and how much money have they poured into that?
- moolcool 17 hours ago> This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.
The company made and released a tool with seemingly no guard-rails, which was used en masse to generate deepfakes and child pornography.
[-]- pdpi 49 minutes agoI'm of two minds about this.
On the one hand, it seems "obvious" that Grok should somehow be legally required to have guardrails stopping it from producing kiddie porn.
On the other hand, it also seems "obvious" that laws forcing 3D printers to detect and block attempts to print firearms are patently bullshit.
The thing is, I'm not sure how I can reconcile those two seemingly-obvious statements in a principled manner.
- _trampeltier 28 minutes ago It is very different. It is YOUR 3D printer; no one else is involved. You might print a knife and kill somebody with it, and you go to jail; no third party is involved.
If you use a service like Grok, then you are using somebody else's computer / things. X is the owner of the computer that produced CP. So of course X is at least also a bit liable for producing CP.
[-]- pdpi 25 minutes agoHow does that mesh with all the safe harbour provisions we've depended on to make the modern internet, though?
- cubefox 28 minutes ago> The company made and released a tool with seemingly no guard-rails, which was used en masse to generate deepfakes and child pornography.
Do you have any evidence for that? As far as I can tell, this is false. The only thing I saw was Grok changing photos of adults into them wearing bikinis, which is far less bad.
[-]- scott_w 13 minutes agoDid you miss the numerous news reports? Example: https://www.theguardian.com/technology/2026/jan/08/ai-chatbo...
For obvious reasons, decent people are not about to go out and try to generate child sexual abuse material to prove a point to you, if that's what you're asking for.
- cubefox 2 minutes ago First of all, the Guardian is known to be heavily biased against Musk. They always try hard to make everything about him sound as negative as possible. Second, last time I tried, Grok even refused to create pictures of naked adults. The claim that they released a tool with "seemingly no guardrails" is therefore clearly false. I think what instead happened here is that some people found a hack to circumvent some of those guardrails via something like a jailbreak.
- ChrisGreenHeur 2 hours agoadobe must be shaking in their pants
- gulfofamerica 16 hours ago[dead]
- trhway 2 hours ago Internet routers, network cards, computers, operating systems, and various application software have no guardrails and are used for all the nefarious things. Why aren't those companies raided?
- protocolture 35 minutes agoCarriage services have long been exempt from liability for the services they carry, as long as they follow other laws like lawful intercept, so that criminals can be detected.
Sorry but I feel this needs to be said: DUHHHHHHHH!!!!!!!!!
Also I need you to understand that the person who creates the child porn is the ultimate villain; transferring it across a carriage service or an unrelated OS is only a crime if they can detect and prevent it. In this case, Grok is being used as an automated, turnkey child porn creation system. The OS, following your logic, would only be at fault if Grok were so thoroughly bad it could not be removed through other means and OS-level functions were required to block it. Ditto, it's very possible that Grok might find its way onto an internet filter, if the outcome of this investigation leads to its blacklisting but the US government continues to permit it to seed the planet with indecent images of young people. In which case a router might be taken as evidence against an ISP that failed to implement the ban.
Sorry again, but this is just so blindingly obvious: DERRRRRRRRRRRRRR!!!!!!!!!
I am doing my best to act in keeping with the requirements of this website; unfortunately you have just made some statements so patently ridiculous that it's a moral imperative that they be immediately and thoroughly ridiculed. Ridicule is the only possible response, because there's no argument or supposition behind these statements, only a complete leaden lack of thought, foresight or understanding.
If you want to come up with something better than the world's worst combination non sequitur/whataboutism, I will do my best to take it seriously. Until then, you should reflect on why you made such an overwhelmingly dense statement. Duh.
- sirnicolaz 2 hours agoThis is like comparing the danger of a machine gun to that of a block of lead.
- trothamel 2 hours agoDon't forget polaroid in that.
- orwin 13 hours ago French prosecutors use police raids way more than other western countries. Banks, political parties, ex-presidents, corporate HQs, worksites... Here, while white-collar crimes are punished as much as in the US (i.e. very little), we do at least investigate them.
- t0lo 2 hours ago[flagged]
- aaomidi 17 hours ago Lmao, they literally made a broadly accessible CSAM maker.
- Playboi_Carti 2 hours ago>Car manufacturers literally made a broadly accessible baby killer[-]
- ilogik 1 hour agoCar manufacturers are required to add features to make it less likely that cars kill babies.
What would happen if Volvo made a special baby-killing model with extra spikes?
- _trampeltier 26 minutes ago Tesla did; that's the main reason why there are no Cybertrucks in Europe. They are not allowed, because they are too dangerous.
- techblueberry 18 hours ago I'm not saying I'm entirely against this, but just out of curiosity, what do they hope to find in a raid of the French offices, a folder labeled "Grok's CSAM Plan"?
- rsynnott 17 hours ago> what do they hope to find in a raid of the french offices, a folder labeled "Grok's CSAM Plan"?
You would be _amazed_ at the things that people commit to email and similar.
Here's a Facebook one (leaked, not extracted by authorities): https://www.reuters.com/investigates/special-report/meta-ai-...
- afavour 18 hours ago It was known that Grok was generating these images long before any action was taken. I imagine they'll be looking for internal communications on what they were doing, or deciding not to do, during that time.
- direwolf20 10 hours agoMaybe emails between the French office and the head office warning they may violate laws, and the response by head office?
- arppacket 10 hours agoThere was a WaPo article yesterday, that talked about how xAI deliberately loosened Grok’s safety guardrails and relaxed restrictions on sexual content in an effort to make the chatbot more engaging and “sticky” for users. xAI employees had to sign new waivers in the summer, and start working with harmful content, in order to train and enable those features.
I assume the raid is hoping to find communications to establish that timeline, maybe internal concerns that were ignored? Also internal metrics that might show they were aware of the problem. External analysts said Grok was generating a CSAM image every minute!!
https://www.washingtonpost.com/technology/2026/02/02/elon-mu...
[-]- chrisjj 6 hours ago> External analysts said Grok was generating a CSAM image every minute!!
> https://www.washingtonpost.com/technology/2026/02/02/elon-mu...
That article has no mention of CSAM. As expected, since you can bet the Post has lawyers checking.
- reaperducer 12 hours agoout of curiosity, what do they hope to find in a raid of the french offices, a folder labeled "Grok's CSAM Plan"?
You're not too far off.
There was a good article in the Washington Post yesterday about many many people inside the company raising alarms about the content and its legal risk, but they were blown off by managers chasing engagement metrics. They even made up a whole new metric.
There were also prompts telling the AI to act angry or sexy or other things just to keep users addicted.
- Mordisquitos 18 hours agoWhat do they hope to find, specifically? Who knows, but maybe the prosecutors have a better awareness of specifics than us HN commenters who have not been involved in the investigation.
What may they find, hypothetically? Who knows, but maybe an internal email saying, for instance, 'Management says keep the nude photo functionality, just hide it behind a feature flag', or maybe 'Great idea to keep a backup of the images, but must cover our tracks', or perhaps 'Elon says no action on Grok nude images, we are officially unaware anything is happening.'
[-]- cwillu 17 hours agoOr “regulators don't understand the technology; short of turning it off entirely, there's nothing we can do to prevent it entirely, and the costs involved in attempting to reduce it are much greater than the likely fine, especially given that we're likely to receive such a fine anyway.”[-]
- bawolff 2 hours ago Wouldn't surprise me, but they would have to be very incompetent to say that outside of an attorney-client privileged conversation.
Otoh it is Musk.
- pirates 16 hours agoThey could shut it off out of a sense of decency and respect, wtf kind of defense is this?[-]
- cwillu 13 hours agoYou appear to have lost the thread (or maybe you're replying to things directly from the newcomments feed? If so, please stop it.), we're talking about what sort of incriminating written statements the raid might hope to discover.
- chrisjj 15 hours ago[flagged][-]
- wasabi991011 9 hours agoI don't understand your point.
In a further comment you are using a US-focused organization to define an English-language acronym. How does this relate to a French investigation?
[-]- chrisjj 6 hours agoUS uses English - quite a lot actually.
As for how it relates, well if the French do find that "Grok's CSAM Plan" file, they'll need to know what that acronym stands for. Right?
- rsynnott 14 hours agoItem one in that list is CSAM.[-]
- chrisjj 14 hours agoYou are mistaken. Item #1 is "images of children of a pornographic nature".
Wheras "CSAM isn’t pornography—it’s evidence of criminal exploitation of kids." https://rainn.org/get-informed/get-the-facts-about-sexual-vi...
[-]- anigbrowl 5 hours agoA distinction without a difference.
Even if some kid makes a video of themselves jerking off for their own personal enjoyment, unprompted by anyone else, if someone else gains access to that (eg a technician at a store or an unprincipled guardian) and makes a copy for themselves they're criminally exploiting the kid by doing so.
- guerrilla 2 hours ago Seems like a pretty big difference. It's got to be worse to actually do something to someone in real life than not do that.
- lysp 2 hours ago Not really, otherwise perpetrators will just say "I was just looking at it, I didn't do anything as bad as creating it." Their act is still illegal.
There was a cartoon picture I remember seeing around 15+ years ago of Bart Simpson performing a sex act. In some jurisdictions (such as Australia), this falls under the legal definition.
[-]- guerrilla 1 hour ago> Not really, otherwise perpetrators
You don't think it's worse to molest a child than to not molest a child?
- chrisjj 4 hours ago> A distinction without a difference.
Huge difference here in Europe. CSAM is a much more serious crime. That's why e.g. Interpol runs a global database of CSAM but doesn't bother for mere child porn.
- ffsm8 10 hours agoYou're wrong - at least from the perspective of the commons.
First paragraph on Wikipedia
> Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn,[1][2][3] is erotic material that involves or depicts persons under the designated age of majority. The precise characteristics of what constitutes child pornography vary by criminal jurisdiction.[4][5]
Honestly, reading your link got me seriously facepalming. The whole argument seems to be centered around the fact that sexualizing children is disgusting, hence it shouldn't be called porn. While I'd agree that sexualizing kids is disgusting, denying that it's porn on those grounds feels kinda... childish? Like someone holding their ears closed and shouting loudly in order not to hear the words the adults around them are saying.
[-]- bawolff 2 hours agoI think the idea is that normal porn can be consensual. Material involving children never can be.
Perhaps similar to how we have a word for murder that is different from "killing" even though murder always involves killing.
- chrisjj 9 hours ago> First paragraph on Wikipedia
"...the encyclopedia anyone can edit." Yes, there are people who wish to redefine CSAM to include child porn - including even that between consenting children committing no crime and no abuse.
Compare and contrast Interpol. https://www.interpol.int/en/Crimes/Crimes-against-children/A...
> The whole argument seems to be centered around the fact that sexualizing children is disgusting, hence it shouldn't be called porn.
I have no idea how anyone could reasonably draw that conclusion from this thread.
[-]- ffsm8 6 hours ago> have no idea how anyone could reasonably draw that conclusion from this thread.
> > Honestly, reading your link got me seriously facepalming. The whole argument seems to be centered around the fact that sexualizing children is disgusting, hence it shouldn't be called porn.
Where exactly did you get the impression that I made this observation from this comment thread?
Your Interpol link seems to be literally using the same argument again, from a very casual glance btw.
> We encourage the use of appropriate terminology to avoid trivializing the sexual abuse and exploitation of children.
> Pornography is a term used for adults engaging in consensual sexual acts distributed (mostly) legally to the general public for their sexual pleasure.
[-]- chrisjj 4 hours ago> Where exactly did you get the impression from I made this observation from this comment thread?
I assumed you expected us to know what you were referring to.
- direwolf20 10 hours agoWell, RAINN are stupid then.
CSAM is the woke word for child pornography, which is the normal word for pornography involving children. Pornography is defined as material aiming to sexually stimulate, and CSAM is that.
[-]- chrisjj 9 hours ago> CSAM is the woke word for child pornography
I fear you could be correct.
[-]- direwolf20 7 hours agoCSAM is to child pornography as MAP is to pedophile. Both words used to refer to a thing without the negative connotation.[-]
- FireBeyond 3 hours agoI'd say it was the other way around, MAP is an attempt at avoiding the stigma of pedophile, while CSAM is saying "pornography can be an entirely acceptable, positive, consensual thing, but that's not what 'pornography' involving children is, it's evidence of abuse or exploitation or..."
- stickfigure 12 hours agoHonest question: What does it mean to "raid" the offices of a tech company? It's not like they have file cabinets with paper records. Are they just seizing employee workstations?
Seems like you'd want to subpoena source code or gmail history or something like that. Not much interesting in an office these days.
[-]- ChuckMcM 7 hours agoSadly the media calls the lawful use of a warrant a 'raid' but that's another issue.
The warrant will have detailed what it is they are looking for, French warrants (and legal system!) are quite a bit different than the US but in broad terms operate similarly. It suggests that an enforcement agency believes that there is evidence of a crime at the offices.
As a former IT/operations guy I'd guess they want on-prem servers with things like email and shared storage, stuff that would hold internal discussions about the thing they were interested in, but that is just my guess based on the article saying this is related to the earlier complaint that Grok was generating CSAM on demand.
[-]- chrisjj 6 hours ago> I'd guess they want on-prem servers with things like email and shared storage
For a net company in 2026? Fat chance.
- ChuckMcM 6 hours ago Agreed it's a stretch. My experience comes from Google: when I worked there and they set up a Chinese office, they were very carefully trying to avoid anything on premises that could be searched/exploited. It was a huge effort, one that wasn't done for the European and UK offices where the government was not an APT. So did X have that level of hygiene in France? Were there IT guys in the same vein as the folks that Elon recruited into DOGE? Was everyone in the office "loyal"?[1] I doubt X was paranoid "enough" in France not to have some leakage.
[1] This was also something Google did which was change access rights for people in the China office that were not 'vetted' (for some definition of vetted) feeling like they could be an exfiltration risk. Imagine a DGSE agent under cover as an X employee who carefully puts a bunch of stuff on a server in the office (doesn't trigger IT controls) and then lets the prosecutors know its ready and they serve the warrant.
- Barrin92 5 hours ago Under GDPR, if a company processes European user data they're obligated to make a "Record of Processing Activities" available on demand (an umbrella term for a whole bunch of user-data / identity-related stuff). They don't necessarily need to store the records onsite, but they need to be able to produce them. Saying you're an internet company doesn't mean you can just put the stuff on a server in the Caribbean and shrug when the regulators come knocking on your door.
That's aside from the fact that they're a publicly traded company under obligation to keep a gazillion records anyway like in any other jurisdiction.
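For concreteness, GDPR Article 30(1) lists what such a record of processing activities has to contain (controller, purposes, categories of data and data subjects, recipients, third-country transfers, retention, security measures). A minimal sketch of one entry, using entirely hypothetical values rather than anything X actually files, might look like this in Python:

    # Illustrative sketch of a single GDPR Article 30(1) record entry.
    # Field names paraphrase the regulation; every value here is hypothetical.
    ropa_entry = {
        "controller": "Example Controller SAS (hypothetical)",
        "purposes": "Training and serving a generative image model",
        "data_subject_categories": ["EU users", "people depicted in uploaded images"],
        "personal_data_categories": ["account data", "uploaded images", "prompts"],
        "recipient_categories": ["internal ML teams", "cloud hosting provider"],
        "third_country_transfers": "US (standard contractual clauses)",
        "retention_period": "until account deletion + 90 days",
        "security_measures": "encryption at rest, access logging, role-based access",
    }

    def print_ropa(entries):
        """Print records in a form that could be produced on demand to a regulator."""
        for entry in entries:
            for field, value in entry.items():
                print(f"{field}: {value}")
            print("-" * 40)

    print_ropa([ropa_entry])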
- niemandhier 10 hours agoGather evidence against employees, use that evidence to put them under pressure to testify against their employer or grant access to evidence.
Sabu was put under pressure by the FBI, they threatened to place his kids into foster care.
That was legal. Guess what, similar things would be legal in France.
We all forget that money is nice, but nation states have real power. Western liberal democracies just rarely use it.
The same way the president of the USA can order a drone strike on a Taliban warlord, the president of France could order Musk's plane to be escorted to Paris by 3 fighter jets.
[-]- xoxolian 8 hours ago> We all forget that money is nice, but nation states have real power.
Interesting point. There's a top gangster who can buy anything in the prison commissary; and then there's the warden.
[-] - ChrisMarshallNY 9 hours ago> We all forget that money is nice, but nation states have real power.
I remember something (probably linked from here), where the essayist was comparing Jack Ma, one of the richest men on earth, and Xi Jinping, a much lower-paid individual.
They indicated that Xi got Ma into a chokehold. I think he "disappeared" Ma for some time. Don't remember exactly how long, but it may have been over a year.
[-]- kshacker 8 hours agoFrom what I hear, Ma made 1 speech critical of the government and Xi showed him his place. It was a few years, a year of total disappearance followed by slow rehab.
But China is different. Not sure most of western europe will go that far in most cases.
[-]- SanjayMehta 7 hours agoTrump kidnapped Maduro to show the latter his place, but then the US is neither China nor Western Europe so that does not count.[-]
- almosthere 7 hours agoArrested and the vast majority of Venezuela love that it happened.
https://www.cbsnews.com/miami/news/venezuela-survey-trump-ma...
[-]- isr 30 minutes agoAh, so the daily LARGE protests, in Venezuela, against his kidnapping are not indicative of "the vast majority of Venezuela".
But the celebratory pics, which were claimed to be from Venezuela but were actually from Miami and elsewhere (including, I kid you not, an attempt to pass off Argentines celebrating a Copa America win)... that is indicative of "the vast majority of Venezuela"?
If I were smarter, I might start to wonder why, if President Maduro was so unpopular, his abductors would have to resort to fake footage, which was systematically outed and destroyed by independent journalists within 24 hours. I mean, surely enough real footage should exist.
Probably better not to have inconvenient non-US-approved independent thoughts like that.
- SanjayMehta 6 hours agoRand Paul asked Rubio what would happen if the shoe was on the other foot. Every US President from Truman onwards is a war criminal.
https://www.tampafp.com/rand-paul-and-marco-rubio-clash-over...
- foolserrandboy 5 hours ago The people of the US mostly wouldn't like it; the people of VZ mostly did, and consider Maduro a thug who lost and stayed in power, not their president. Ideologues like Paul have trouble with exceptions to their world view.
- MYEUHD 2 hours ago> the people of VZ mostly did and consider Maduro a thug who lost and stayed in power not their president.
You got this information from American media (or their allies')
In reality, Venezuelans flooded the streets in marches demanding the return of their president.
- SanjayMehta 2 hours agoAh, the "rules based disorder" on display: we do dis, you no do dis.
Hypocrisy at its finest.
- wanderer2323 2 hours agoAccording to USA sources, USA actions are universally approved.
Color me surprised.
- tyre 6 hours agoI mean, come on, we kidnapped him. Yes, he was arrested, but we went into another sovereign nation with special forces and yoinked their head of state back to Brooklyn.[-]
- mrkstu 3 hours ago To be fair, he isn't a legitimate head of state: he lost an election and is officially recognized as a usurper, and the US had the support of those who actually won.
- platevoltage 3 hours ago Large numbers of people call Joe Biden's election illegitimate. You could even say that's the official position of the current government. Would his kidnapping by a foreign nation be okay with you too?
- ImJamal 3 hours agoHe is not a legitimate head of state. He lost the election.
- hiprob 8 hours agoIt's legal to just put kids in foster care for no reason but to ruin someone's life?[-]
- rvnx 8 hours ago In France it's possible without legal consequences (though immoral): if you call 119, you can push to have a baby taken from a family for no reason except that you do not like someone.
Claim that you suspect there may be abuse, it will trigger a case for a "worrying situation".
Then it's a procedural lottery:
-> If you get lucky, they will investigate, meet the people, and dismiss the case.
-> If you get unlucky, they will take the baby, and it's only then, after a long investigation and a "family assistant" (who will check on you every day), that you can recover your baby.
Typically, ex-wife who doesn't like the ex-husband, but it can be a neighbor etc.
One worker explains that they don't really have time to investigate when processing reports: https://www.youtube.com/watch?v=VG9y_-4kGQA and they have to act very fast, and by default it is safer to remove the child from the family.
The head of such an agency doesn't even take the time to answer the journalists there...
-> Example of such case (this man is innocent): https://www.lefigaro.fr/faits-divers/var-un-homme-se-mobilis...
but I can't blame them either, it's not easy to make the right calls.
- agoodusername63 8 hours ago I can't believe there's a country out there that has recreated the DMCA but for child welfare
- SanjayMehta 7 hours agoCanada and Germany are no different.
[0] https://www.cbc.ca/news/canada/manitoba/winnipeg-mom-cfs-bac...
[1] https://indianexpress.com/article/india/ariha-family-visit-t...
- gf000 8 hours agoI mean, that's surely not as simple as you make it out to be.[-]
- Normal_gaussian 7 hours ago It's not.
If you call 119 it gets assessed and potentially forwarded to the right department, which then assesses it again and might (quite likely will) trigger an inspection. The people who turn up have broad powers to seize children from the home in order to protect them from abuse.
In general this works fine. Unfortunately in some circumstances this does give a very low skilled/paid person (the inspector) a lot of power, and a lot of sway with judges. If this person is bad at their job for whatever reason (incompetence/malice) it can cause a lot of problems. It is very hard to prove a person like this wrong when they are covering their arse after making a mistake.
afaik similar systems are present in most western countries, and many of them - like France - are suffering with funding and are likely cutting in the wrong place (audit/rigour) to meet external KPIs. One of the worst ways this manifests is creating 'quick scoring' methods which can end up with misunderstandings (e.g. said a thing they didn't mean) ranking very highly, but subtle evidence of abuse moderate to low.
So while this is a concern, this is not unique to France, this is relatively normal, and the poster is massively exaggerating the simplicity.
- belorn 6 hours ago In Sweden there is an additional review board that goes through the decisions made by the inspector. The idea is to limit the power that a single inspector has. In practice, however, the review board tends to rubber-stamp decisions, so incompetence/malice still happens.
There was a huge mess right after metoo when an inspector went against the court's rulings. The court had given the father sole custody in an extremely messy divorce, and the inspector did not agree with the decision. As a result they removed the child from his father, in direct contradiction of the court's decision, and put the child through 6 years of isolation and abuse with no access to school. It took investigative journalists a while, but the result of the case getting highlighted in the media was that the inspector and supervisor are now fired, with two additional workers under investigation for severe misconduct. Four more workers would be under investigation, but too much time has passed. The review board should have prevented this, as should the supervisor of the inspector, but those safety nets failed in this case, in part because of the cultural environment at the time.
- MichaelZuo 6 hours ago“ If this person is bad at their job for whatever reason (incompetence/malice) it can cause a lot of problems. It is very hard to prove a person like this wrong when they are covering their arse after making a mistake.”
This seems guaranteed to occur every year then… since incompetence/malice will happen eventually with thousands upon thousands of cases?
[-]- chrisjj 6 hours ago> This seems guaranteed to occur every year then…
Not at all. This job will go to an "AI" any moment now.
/i
- rvnx 8 hours ago I've seen that during harassment; in one YouTube live the woman claimed: "today it's my husband to take care of him because sometimes my baby makes me angry that I want to kill him"
But she was saying it normally, like any normal person does when they are angry. -> Whoops, someone talked to 119 to report a "worrying" situation, baby removed. It's already been two years.
There are some non-profit fighting against such: https://lenfanceaucoeur.org/quest-ce-que-le-placement-abusif...
That being said, it's a very small % obviously, let's not exaggerate, but it's quite sneaky.
- ricudis 3 hours agoI heard there's a country where they can even SWAT you out of existence with a simple phone call, but it sounds so outrageous this must be some evil communist dictatorship third-world place. I really don't remember.
- chrisjj 6 hours ago> Gather evidence against employees
I'm sure they have much better and quieter ways to do that.
Whereas a raid is #1 choice for max volume...
- projektfu 8 hours agoWait, Sabu's kids were foster kids. He was fostering them. Certainly if he went to jail, they'd go back to the system.
I mean, if you're a sole caretaker and you've been arrested for a crime, and the evidence looks like you'll go to prison, you're going to have to decide what to do with the care of your kids on your mind. I suppose that would pressure you to become an informant instead of taking a longer prison sentence, but there's pressure to do that anyway, like not wanting to be in prison for a long time.
- kps 9 hours ago> We all forget that money is nice, but nation states have real power.
Elon has ICBMs, but France has warheads.
- speed_spread 9 hours ago France has Ariane, which was good enough to send the James Webb Telescope to some Lagrange point with extra precision. It's all fun and games until the French finish their cigarette, arm French Guiana and fire ze missiles.
- cadamsdotcom 7 hours agoYes but using such power unscrupulously is a great way to lose it.
- mmooss 9 hours ago> Western liberal democracies just rarely use it.
Also, they are restricted in how they use it, and defendants have rights and due process.
> Sabu was put under pressure by the FBI, they threatened to place his kids into foster care.
Though things like that can happen, which are very serious.
- VBprogrammer 9 hours ago > defendants have rights and due process.
As they say: you can beat the rap but not the ride. If a state wants to make your life incredibly difficult for months or even years they can, the competent ones can even do it while staying (mostly) on the right side of the law.
[-]- colechristensen 8 hours agoWe are not entirely sure the rule of law in America isn't already over.
People are putting a lot of weight on the midterm elections, which are more or less the last line of defense besides a so-far tepid response by the courts, and even then consequence-free defiance of court orders is now rampant.
We're really near the point of no return and a lot of people don't seem to notice.
[-]- 5upplied_demand 8 hours ago> We're really near the point of no return and a lot of people don't seem to notice.
A lot of people are cheering it (some on this very site).
- nilamo 8 hours ago> Also, they are restricted in how they use it, and defendents have rights and due process.
It's a nice sentiment, if true. ICE is out there, right now today, ignoring both individual rights as well as due process.
[-]- generic92034 8 hours agoThey were talking about western liberal democracies, though.
/s
- toss1 9 hours ago>> they are restricted in how they use it, and defendents have rights and due process.
That due process only exists to the extent the branches of govt are independent, have co-equal power, and can hold and act upon different views of the situation.
When all branches of govt are corrupted to serve the executive, as in autocracies, that due process exists only if the executive likes you, or accepts your bribes. That is why there is such a huge push by right-wing parties to take over the levers of power, so they can keep their power even after they would lose at the ballot box.
- mschuster91 8 hours ago> Also, they are restricted in how they use it, and defendents have rights and due process.
As we're seeing with the current US President... the government doesn't (have to) care.
In any case, CSAM is the one thing other than Islamist terrorism that will bypass a lot of restrictions on how police are supposed to operate (see e.g. Encrochat, An0m) across virtually all civilized nations. Western nations also will take anything that remotely smells like Russia as a justification.
[-]- gf000 8 hours ago> As we're seeing with the current US President
Well, that's particular to the US. It just shows that checks and balances are not properly implemented there; it's just that previous presidents weren't exploiting that maliciously for their own gain.
- SpaceManNabs 8 hours ago> Sabu was put under pressure by the FBI, they threatened to place his kids into foster care.
This is pretty messed up btw.
Social work for children systems in the USA are very messed up. It is not uncommon for minority families to lose rights to parent their children for very innocuous things that would not happen to a non-oppressed class.
It is just another way for the justice/legal system to pressure families that have not been convicted / penalized under the supervision of a court.
And this isn't the only lever they use.
Every time I read crap like this I just think of Aaron Swartz.
- pastage 7 hours ago One can also say we do too little for children who get mistreated. Taking care of other people's children is never easy; the decision needs to be fast and effective, and no one wants to take the decision to end it. Because there are those rare cases where children die because of a reunion with their parents.
- rhetocj23 8 hours ago[dead]
- gruez 8 hours ago>Sabu was put under pressure by the FBI, they threatened to place his kids into foster care.
>That was legal. Guess what, similar things would be legal in France.
lawfare is... good now? Between Trump being hit with felony charges for falsifying business records (lawfare is good?) and Lisa Cook getting prosecuted for mortgage fraud (lawfare is bad?), I honestly lost track at this point.
>The same way the president of the USA can order a Drone strike on a Taliban war lord, the president of France could order Musks plane to be escorted to Paris by 3 Fighter jets.
What's even the implication here? That they're going to shoot his plane down? If there's no threat of violence, what does the French government even hope to achieve with this?
[-]- knallfrosch 8 hours agofighter jets ARE a threat of violence, and it is widely understood and acknowledged.
Again: the threat is so clear that you rarely have to execute on it.
[-]- gruez 8 hours ago>fighter jets ARE a threat of violence, and it is widely understood and acknowledged.
That's not a credible threat because there's approximately 0% chance France would actually follow through with it. Not even Trump would resort to murder to get rid of his domestic adversaries. As we've seen with the Fed, the best he could muster was some spurious prosecutions. France murdering someone would put them on par with Russia or India.
[-]- niemandhier 26 minutes agoIn the USA they would be allowed to down any aircraft not complying with national air interception rules, that would not be murder. It would be equivalent to not dropping a gun once prompted by an officer and being shot as a result.
https://www.faa.gov/air_traffic/publications/atpubs/aim_html...
- anigbrowl 6 hours agoI think the implication of the fighter jets is that they force the plane to land within a particular jurisdiction (where he is then arrested) rather than allowing it to just fly off to somewhere else. Similar to the way that a mall security guard might arrest a shoplifter; the existence of security guards doesn't mean the mall operators are planning to murder you.[-]
- zzrrt 6 hours agoGuards can plausibly arrest you without seriously injuring you. But according to https://aviation.stackexchange.com/a/68361 there are no safe options if the pilot really doesn’t want to comply, so there is no “forcing” a plane to land somewhere, just making it very clear that powerful people really want you to stop and might be able to give more consequences on the ground if you don’t.[-]
- arcologies1985 4 hours agoPlanes are required to comply with instructions; if they don't they're committing a serious crime and the fighters are well within their international legal framework to shoot the plane down. They would likely escalate to a warning shot with the gun past the cockpit, and if the aircraft is large enough they might try to shoot out one engine instead of the wing or fuselage.
- anigbrowl 5 hours agoI suspect fighter pilots are better than commercial pilots at putting their much-higher-spec aircraft so uncomfortably close that your choices narrow down to complying with their landing instructions or suicidally colliding with one - in which case the fighter has an ejector seat and you don't.[-]
- zzrrt 2 hours agoI felt like you ruled out collision when you said they're not going to murder, though, granted, an accidental but predictable collision after repeatedly refusing orders is not exactly murder. I think the point stands, they have to be willing to kill or to back down, and as others said I'm skeptical France or similar countries would give the order for anything short of an imminent threat regarding the plane's target. If Musk doesn't want to land where they want him to, he's going to pay the pilot whatever it takes, and the fighter jets are going to back off because whatever they want to arrest him for isn't worth an international incident.
- ozim 7 hours agoDon’t forget that captain of the plane makes decisions not Elon.
If captain of the plane disobeyed direct threat like that from a nation, his career is going to be limited. Yeah Elon might throw money at him but that guy is most likely never allowed again to fly near any French territory. I guess whole cabin crew as well .
Being clear for flying anywhere in the world is their job.
Would be quite stupid to loose it like truck driver DUI getting his license revoked.
[-]- gruez 7 hours ago>Don’t forget that captain of the plane makes decisions not Elon.
>If captain of the plane disobeyed direct threat like that from a nation, his career is going to be limited. Yeah Elon might throw money at him but that guy is most likely never allowed again to fly near any French territory. I guess whole cabin crew as well .
Again, what's France trying to do? Refuse entry to France? Why do they need to threaten shooting down his jet for that? Just harassing/pranking him (eg. "haha got you good with that jet lmao")?
- ricudis 3 hours ago> Not even Trump would resort to murder to get rid of his domestic adversaries
Don't give them ideas
- lcnPylGDnU4H9OF 7 hours ago> lawfare is... good now?
Well, when everything is lawfare it logically follows that it won't always be good or always be bad. It seems Al Capone being taken down for tax fraud would similarly be lawfare by these standards, or am I missing something? Perhaps lawfare (sometimes referred to as "prosecuting criminal charges", as far as I can tell, given this context) is just in some cases and unjust in others.
- beart 12 hours agoOffline syncing of outlook could reveal a lot of emails that would otherwise be on a foreign server. A lot of people save copies of documents locally as well.[-]
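To make the parent's point concrete: Outlook's cached mode keeps a local copy of the mailbox in .ost/.pst files, so a seized workstation can carry most of a mailbox even when the mail server is abroad. A rough sketch of how an investigator might enumerate those caches; the paths are common Windows defaults and only assumptions, since actual locations vary by Outlook version and policy:

    # Rough sketch: list local Outlook cache files that a forensic image would capture.
    # The directories below are typical Windows defaults, treated here as assumptions.
    import os
    from pathlib import Path

    CANDIDATE_DIRS = [
        Path(os.environ.get("LOCALAPPDATA", "")) / "Microsoft" / "Outlook",      # .ost caches
        Path(os.environ.get("USERPROFILE", "")) / "Documents" / "Outlook Files", # .pst archives
    ]

    def find_mail_caches():
        hits = []
        for directory in CANDIDATE_DIRS:
            if not directory.is_dir():
                continue
            for path in directory.rglob("*"):
                if path.suffix.lower() in {".ost", ".pst"}:
                    hits.append((path, path.stat().st_size))
        return hits

    for path, size in find_mail_caches():
        print(f"{path}  ({size / 1_048_576:.1f} MiB)")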
- cm2187 8 hours ago Most enterprises have fully encrypted workstations, when they don't use VMs where the desktop is just a thin client that doesn't store any data. So there should be really nothing of interest in the office itself.
- paxys 12 hours agoWhether you are a tech company or not, there's a lot of data on computers that are physically in the office.[-]
- ramuel 11 hours agoExcept when they have encryption, which should be the standard? I mean how much data would authorities actually retrieve when most stuff is located on X servers anyways? I have my doubts.[-]
- throw3e98 41 minutes ago I knew someone who was involved in an investigation (the company and person were the victim, not the target of the investigation); their work laptop got placed into a legal hold, the investigators had access to all of their files, and they weren't allowed to delete anything (even junk emails) for several years.
You don't get to say no to these things.
- BrandoElFollito 11 hours agoThe authorities will request the keys for local servers and will get them. As for remote ones (outside of France jurisdiction) it depends where they are and how much X wants to make their life difficult.[-]
- ramuel 10 hours agoMusk and X don't seem to be the type to care about any laws or any compelling legal requests, especially from a foreign government. I doubt the French will get anything other than this headline.[-]
- Retric 10 hours agoGetting kicked out of the EU is extremely unattractive for Twitter. But the US also has extradition treaties so that’s hardly the end of how far they can escalate.[-]
- okanat 10 hours agoI don't think US will extradite anybody to EU. Especially not white people with strong support of the current government.[-]
- Retric 10 hours agoWhite people already extradited to the EU during the current administration would disagree. But this administration has a limited shelf life, even hypothetically just under 3 years of immunity isn’t enough for comfort.[-]
- wongarsu 8 hours ago> But this administration has a limited shelf life, even hypothetically just under 3 years of immunity isn’t enough for comfort.
Depends on how much faith you have in the current administration. Russia limits presidents to two 6-year terms, yet Putin is in power since 2000.
- JumpCrisscross 10 hours ago> don't think US will extradite anybody to EU
EU, maybe not. France? A nuclear state? Paris is properly sovereign.
> people with strong support of the current government
Also known as leverage.
Let Musk off the hook for a sweetheart trade deal. Trump has a track record of chickening out when others show strength.
[-]- krisoft 9 hours ago> France? A nuclear state? Paris is properly sovereign.
That is true. But nukes are not magic. Explain to me how you imagine the series of events where Paris uses their nukes to get the USA to extradite Elon to Paris. Because i’m just not seeing it.
[-]- JumpCrisscross 7 hours ago> nukes are not magic. Explain to me how you imagine the series of events where Paris uses their nukes to get the USA to extradite Elon to Paris
Paris doesn’t need to back down. And it can independently exert effort in a way other European countries can’t. Musk losing Paris means swearing off a meaningful economic and political bloc.
- rvnx 8 hours agoNo need for nukes. France can issue an Interpol Red Notice for the arrest of Elon Musk, for whatever excuse is found.
- fmajid 9 hours agoFrance doesn't extradite its citizens, even absolute scumbags like Roman Polanski. Someone like Musk has lots of lawyers to gum up extradition proceedings, even if the US were inclined to go along. I doubt the US extradition treaty would cover this unless the French could prove deliberate sharing of CSAM by Musk personally, beyond reckless negligence. Then again, after the Epstein revelations, this is no longer so far-fetched.
- shawabawa3 9 hours ago If I'm an employee working in the X office in France, and the police come in and show me they have a warrant for all the computers in the building and tell me to unlock my laptop, I'm probably going to do that, no matter what Musk thinks.
- formerly_proven 9 hours agoWitnesses can generally not refuse in these situations, that's plain contempt and/or obstruction. Additionally, in France a suspect not revealing their keys is also contempt (UK as well).[-]
- rvnx 8 hours ago100%. Only additional troubles for yourself personally, for practically no benefit (nobody in the company is going to celebrate you).
- Teever 10 hours agoThe game changed when Trump threatened the use of military force to seize Greenland.
At this point a nuclear power like France has no issue with using covert violence to produce compliance from Musk and he must know it.
These people have proven themselves to be existential threats to French security and France will do whatever they feel is necessary to neutralize that threat.
Musk is free to ignore French rule of law if he wants to risk being involved in an airplane accident that will have rumours and conspiracies swirling around it long after he’s dead and his body is strewn all over the ocean somewhere.
[-]- ronsor 10 hours agoYou're implying that France is going to become a terrorist state? Because suspicious accidents do not sound like rule of law.[-]
- bulbar 9 hours ago Killing foreigners outside of their own country has always been deemed acceptable by governments that are (or were until recently) considered to generally follow the rule of law, as well as by the majority of their citizens. It also doesn't necessarily contradict the rule of law.
It's just that the West has avoided doing that to each other, because they were all essentially allied until recently and because the political implications were deemed too severe.
I don't think, however, that France has anything to win by doing it or has any interest whatsoever, and I doubt there's a legal framework the French government can or wants to exploit to conduct something like that legally (like calling something an emergency situation or a terrorist group, for example).
- hunterpayne 9 hours ago Counter-point. France has already kidnapped another social media CEO and forced him to give up the encryption keys. The moral difference between France (historically or currently) and a 3rd world warlord is very thin. Also, look at the accusations. CP and political extremism are the classic go-tos when a government doesn't really have a reason to put pressure on someone but they really want to anyway. France has a very questionable history of honoring the rule of law in politics. Putting political enemies in prison on questionable charges has a long history there.
- rvnx 8 hours ago We are also talking about a country that wants to ban anonymous VPNs in the name of protecting the children and asks everyone to hand over their ID card to register an account on Instagram, TikTok, etc.
OpenDNS is censored in France... so imagine
- anigbrowl 6 hours agoPeople were surprised when the US started just droning boats in the Caribbean and wiping out survivors, but then the government explained that it was law enforcement and not terrorism or piracy, so everyone stopped worrying about it.
Seriously, every powerful state engages in state terrorism from time to time because they can, and the embarrassment of discovery is weighed against the benefit of eliminating a problem. France is no exception : https://en.wikipedia.org/wiki/Sinking_of_the_Rainbow_Warrior
- myko 7 hours ago No difference between a strike like that and the strikes against fishing boats near Venezuela Trump has ordered.
- cyberax 9 hours ago> You're implying that France is going to become a terrorist state? Because suspicious accidents do not sound like rule of law.
Why not? After all, that's in vogue today. Trump is ignoring all the international agreements and rules, so why should others follow them?
- Teever 9 hours agoBecome? https://en.wikipedia.org/wiki/Sinking_of_the_Rainbow_Warrior
The second Donald Trump threatened to invade a nation allied with France is the second anyone who works with Trump became a legitimate military target.
Like a cruel child dismembering a spider one limb at a time France and other nations around the world will meticulously destroy whatever resources people like Musk have and the influence it gives him over their countries.
If Musk displays a sufficient level of resistance to these actions the French will simply assassinate him.
[-]- hunterpayne 9 hours agoYou got that backwards. Greenpeace for all its faults is still viewed as a group against which military force is a no-no. Sinking that ship cost France far more than anything they inflicted on Greenpeace. If anything, that event is evidence that going after Musk is a terrible idea.
PS Yes, Greenpeace is a bunch of scientifically-illiterate fools who have caused far more damage than they prevented. Doesn't matter because what France did was still clearly against the law.
- anigbrowl 6 hours ago If you're a database administrator or similar working at X in France, are you going to go to jail to protect Musk from police with an appropriate warrant for access to company data? I doubt it.
- jimbo808 8 hours agoIt sounds better in the news when you do a raid. These things are generally not done for any purpose other than to communicate a message and score political points.
- bsimpson 12 hours agoI had the same thought - not just about raids, but about raiding a satellite office. This sounds like theater begging for headlines like this one.[-]
- direwolf20 10 hours agoThey do what they can. They obviously can't raid the American office.
- ronsor 12 hours agoThese days many tech company offices have a "panic button" for raids that will erase data. Uber is perhaps the most notorious example.[-]
- caminante 11 hours ago>notorious
What happened to due process? Every major firm should have a "dawn raid" policy to comply while preserving rights.
Specific to the Uber case(s), if it were illegal, then why didn't Uber get criminal charges or fines?
At best there's an argument that it was "obstructing justice," but logging people off, encrypting, and deleting local copies isn't necessarily illegal.
[-]- pyrale 8 hours ago> if it were illegal, then why didn't Uber get criminal charges or fines?
They had a sweet deal with Macron. Prosecution became hard to continue once he got involved.
[-]- caminante 7 hours agoMaybe.
Or they had a weak case. Prosecutors even drop winnable cases because they don't want to lose.
[-]- pyrale 10 minutes agoMacron's involvement with Uber is public information at this point.
[1]: https://www.lemonde.fr/pixels/article/2022/07/10/uber-files-...
[2]: https://www.radiofrance.fr/franceinter/le-rapport-d-enquete-...
- intrasight 10 hours agoIt is aggressive compliance. The legality would be determined by the courts as usual.[-]
- caminante 9 hours ago> aggressive compliance
Put this up there with nonsensical phrases like "violent agreement."
;-)
- fragmede 8 hours ago Violent agreement is when you're debating something with someone, and you end up yelling at each other because you think you disagree on something, but then you realize that you (violently, as in "are yelling at each other") agree on whatever it is. Aggressive compliance is when the corporate drone over-zealously follows stupid/pointless rules when they could just look the other way, to the point of it being aggressively compliant (with stupid corporate mumbo jumbo).
- caminante 7 hours agoWho knows.
I don't see aggressive compliance defined anywhere. Violent agreement has definitions, but it feels like it's best defined as a consulting buzzword.
- wasabi991011 10 hours ago It wasn't erasing as far as I know, but locking all computers.
Covered here: https://www.theguardian.com/news/2022/jul/10/uber-bosses-tol...
- BrandoElFollito 11 hours agoThis is a perfect way for the legal head of the company in-country to visit some jails.
They will explain that it was done remotely and whatnot but then the company will be closed in the country. Whether this matters for the mothership is another story.
[-]- chrisjj 6 hours ago> but then the company will be closed in the country. Whether this matters for the mothership is another story.
Elon would love it. So it won't happen.
- amelius 8 hours agoOf course they will not lock the data but hide it, and put some redacted or otherwise innocent files in their place.[-]
- acdha 8 hours agoThat sounds awfully difficult to do perfectly without personally signing up for extra jail time for premeditated violation of local laws. Like in that scenario, any reference to the unsanitized file or a single employee breaking omertà is proof that your executives and IT staff conspired to violate the law in a way which is likely to ensure they want to prosecute as maximally as possible. Law enforcement around the world hates the idea that you don’t respect their authority, and when it slots into existing geopolitics you’d be a very tempting scapegoat.
Elon probably isn’t paying them enough to be the lightning rod for the current cross-Atlantic tension.
[-]- amelius 8 hours agoThese days you can probably ask an LLM to redact the files for you, so expect more of it.[-]
- acdha 6 hours agoTrue, but that’s going to be a noisy process until there are a few theoretical breakthroughs. I personally would not leave myself legally on the hook hoping that Grok faked something hermetically.
- BrandoElFollito 8 hours agoNobody does that. It is either cooperation with law enforcement or remote lock (and then there are consequences for the in-country legal entity, probably not personally for the head but certainly for its existence).
This was a common action during the Russian invasion of Ukraine for companies that supported Ukraine and closed their operations in Russia.
- digiown 6 hours agoOr they just connect to a mothership with keys on the machine. The authorities can have the keys, but alas, they're useless now, because there is some employee watching the surveillance cameras in the US, and he pressed a red button revoking all of them. What part of this is illegal?
Obviously, the government can just threaten to fine you any amount, close operations or whatever, but your company can just decide to stop operating there, like Google after Russia imposed an absurd fine.
[-]- anigbrowl 6 hours agoYou know police are not all technically clueless, I hope. The French have plenty of experience dealing with terrorism, cybercrime, and other modern problems as well as the more historical experience of being conquered and occupied, I don't think it's beyond them to game out scenarios like this and preempt such measures.
As France discovered the hard way in WW2, you can put all sorts of rock-solid security around the front door only to be surprised when your opponent comes in by a window.
- politelemon 12 hours ago It's sad to see incentives perverted to this degree, instead of just adhering to local laws.
- mr_mitm 10 hours agoHow do you know this?[-]
- stronglikedan 9 hours agoFrom HN, of course! https://news.ycombinator.com/item?id=32057651
- anigbrowl 6 hours agoThey do have some physical records, but it would be mostly investigators producing a warrant and forcing staff to hand over administrative credentials to allow forensic data collection.[-]
- chrisjj 6 hours ago> forcing staff to hand over administrative credentials to allow forensic data collection.
What, thinking HQ wouldn't cancel them?
- aucisson_masque 9 hours ago> Are they just seizing employee workstations?
Yes.
- eli 8 hours agoWhy don't you think they have file cabinets and paper records?
- KaiserPro 10 hours agoGather evidence.
I assume that they have opened a formal investigation and are now going to the office to collect/purloin evidence before it's destroyed.
Most FAANG companies have training specifically for this. I assume X doesn't anymore, because they are cool and edgy, and staff training is for the woke.
[-]- niemandhier 10 hours agoIf that training involves destroying evidence or withholding evidence from the prosecution, you are going to jail if you follow it.[-]
- hn_go_brrrrr 10 hours agoWhat a strange assumption. The training is "summon the lawyers immediately", "ensure they're accompanied at all times while on company premises", etc.[-]
- niemandhier 9 hours agoThat can start with self-deleting messages if you are under a court order, and it has happened before:
“Google intended to subvert the discovery process, and that Chat evidence was ‘lost with the intent to prevent its use in litigation’ and ‘with the intent to deprive another party of the information’s use in the litigation.’”
https://storage.courtlistener.com/recap/gov.uscourts.cand.37...
VW is another case where similar things happened:
https://www.bloomberg.com/news/articles/2017-01-12/vw-offici...
The thing is: companies don't go to jail, employees do.
[-]- kitsune1 8 hours ago[dead]
- free652 9 hours ago>withholding evidence from the prosecution, you are going to jail if you follow.
The prosecution must present a valid search warrant for *specific* information. They don't get carte blanche, so the Uber way is correct: lock the computers and let the courts decide.
- KaiserPro 9 hours agoThe training is very much the opposite.
Mine had a scene where some bro tried to organise the resistance. A voice-over told us that he was arrested for obstructing a lawful investigation and was liable to be fired for the reputational damage.
X's training might be like you described, but everywhere else that is vaguely beholden to law and order would be opposite.
- Aurornis 10 hours ago> Seems like you'd want to subpoena source code or gmail history or something like that.
This would be done in parallel for key sources.
There is a lot of information on physical devices that is helpful, though. Even discovering additional apps and services used on the devices can lead to more discovery via those cloud services, if relevant.
Physical devices also hold a lot: files people are actively working on, saved snippets and screenshots of important conversations, and synced data that might be easier to get offline than through legal means against the providers.
In outright criminal cases it's not uncommon for individuals to keep extra information on their laptop, phone, or a USB drive hidden in their office as an insurance policy.
This is yet another good reason to keep your work and personal devices separate, as hard as that can be at times. If there's a lawsuit you don't want your personal laptop and phone to disappear for a while.
[-]- charcircuit 9 hours agoSure it might be on the device, but they would need a password to decrypt the laptop's storage to get any of the data. There's also the possibility of the MDM software making it impossible to decrypt if given a remote signal. Even if you image the drive, you can't image the secure enclave so if it is wiped it's impossible to retrieve.[-]
- Aurornis 5 hours ago> Sure it might be on the device, but they would need a password to decrypt the laptop's storage to get any of the data.
In these situations, refusing to provide those keys or passwords is an offense.
The employees who just want to do their job and collect a paycheck aren’t going to prison to protect their employer by refusing to give the password to their laptop.
The teams that do this know how to isolate devices to avoid remote kill switches. If someone did throw a remote kill switch, that’s destruction of evidence and a serious crime by itself. Again, the IT guy isn’t going to risk prison to wipe company secrets.
- alex1138 10 hours agoWhy is this the most upvoted question? Obsessing over pedantry rather than the main thrust of what's being discussed
- nebula8804 9 hours agoI read somewhere that Musk (or maybe Thiel) companies have processes in place to quickly offload data from a location to other jurisdictions (and destroy the local data) when they detect a raid happening. Don't know how true it is though. The only insight I have into their operations was the amazing speed by which people are badged in and out of his various gigafactories. It "appears" that they developed custom badging systems when people drive into gigafactories to cut the time needed to begin work. If they are doing that kind of stuff then there has got to be something in place for a raid. (This is second hand so take with a grain of salt)
EDIT: It seems from other comments that it may have been Uber I was reading about. The badging system I have personally observed outside the Gigafactories. Apologies for the mixup.
[-]- malfist 9 hours agoThat is very much illegal in the US[-]
- int_19h 9 hours agoIt wouldn't be the first time a Musk company knowingly does something illegal.
I think as far as Musk is concerned, laws only apply in the "don't get caught" sense.
[-]- scottyah 6 hours agoEveryone defines their own moral code and trusts that more than the laws of the land. Don't tell me you've never gone over the speed limit, or broken one of the hundreds of crazy laws people break in everyday life out of ignorance.
- rvnx 8 hours agogive any country a gift / investment of 100B USD
-> crimes ? what crimes ?
- ta9000 11 hours agoGuess that will be a SpaceX problem soon enough. What a mess.[-]
- nebula8804 8 hours agoI wonder if the recent announcement spurred them into making a move now rather than later.[-]
- tyre 6 hours agoThe merger was most likely now because they have to do it before the IPO. After the IPO, there’s a whole process to force independent evaluation and negotiation between two boards / executives, which would be an absolute dumpster fire where Musk controls both.
When they’re both private, fine, whatever.
[-]- justaboutanyone 6 hours agoFirst thing a public spacex would want to do is sell off all the non-spacex crap
- mschuster91 8 hours agoHow was that move legal anyway? Like... a lot of people and governments gave Musk money to develop, build and launch rockets. And now he's using it to bail out his failing social media network and CSAM peddling AI service.[-]
- wmf 8 hours agoOnce he launched the rockets he can do whatever he wants with the profit. And he wants to train Grok.[-]
- stubish 3 hours agoMoney comes with strings, such as when forming an ongoing relationship with a company you expect them to not merge with other companies you are actively prosecuting. I suspect the deal is going so fast to avoid some sort of veto being prepared. Once SpaceX and xAI are officially the same, you lose the ability to inflict meaningful penalties on xAI without penalizing yourself as an active business partner with SpaceX.
- Psillisp 8 hours agoCSAM in space! At least he isn’t reinventing the cross town bus.
- justaboutanyone 6 hours agoThis sort of thing will be great for the SpaceX IPO :/[-]
- stubish 3 hours agoEspecially if contracts with SpaceX start being torn up because the various ongoing investigations and prosecutions of xAI are now ongoing investigations and prosecutions of SpaceX. And next new lawsuits for creating this conflict of interest by merger.
- verdverm 12 hours agoFrance24 article on this: https://www.france24.com/en/france/20260203-paris-prosecutor...
lol, they summoned Elon for a hearing on 420
"Summons for voluntary interviews on April 20, 2026, in Paris have been sent to Mr. Elon Musk and Ms. Linda Yaccarino, in their capacity as de facto and de jure managers of the X platform at the time of the events,
[-]- miltonlost 12 hours agoI wonder how he'll try to get out of being summoned. Claim 4/20 is a holiday that he celebrates?[-]
- flohofwoe 12 hours ago> Claim 4/20 is a holiday that he celebrates?
Given his recent "far right" bromance that's probably not a good idea ;)
[-]- verdverm 12 hours agoIt hadn't occurred to me that might be the reason they picked 420[-]
- layer8 10 hours agoIt’s unlikely, because putting the month first is a US thing. In France it would be 20/04, or “20 avril”.[-]
- embedding-shape 10 hours agoStill, stoner cultures in many European countries celebrate 4-20; there are definitely a bunch of Frenchies getting extra stoned that day. It's probably the de facto "international cannabis day" in most places in the world, at least the ones influenced by US culture, which reached pretty far in its heyday.
- miltonlost 12 hours agoOh, that was 100% in my mind when I wrote that. I was wondering how explicit to be with Musk's celebrating being for someone's birthday.
- sophacles 9 hours agoWouldn't celebrating hitler's birthday be good for his far-right bromance?
- LAC-Tech 10 hours agoWe'll know he's gone too far if he has to take another "voluntary" trip to Israel
- GuinansEyebrows 9 hours agoyou would perhaps be shocked to learn how right-leaning the money folks behind the legal and legacy cannabis markets actually are. money is money.
- inquirerGeneral 11 hours ago[dead]
- why_at 10 hours ago>The Paris prosecutor's office said it launched the investigation after being contacted by a lawmaker alleging that biased algorithms in X were likely to have distorted the operation of an automated data processing system.
I'm not at all familiar with French law, and I don't have any sympathy for Elon Musk or X. That said, is this a crime?
Distorted the operation how? By making their chatbot more likely to say stupid conspiracies or something? Is that even against the law?
[-]- int_19h 8 hours agoHolocaust denial is illegal in France, for one, and Grok did exactly that on several occasions.[-]
- pyrale 8 hours agoAlso, csam and pornographic content using the likeness of unwilling people. Grok’s recent shit was bound to have consequences.[-]
- chrisjj 6 hours agoIf the French suspected Grok/X of something as serious as CSAM, you can bet they would have mentioned it their statement. They didn't. Porn, they did.[-]
- pyrale 13 minutes agoThe first two points of the official document, which I re-quote below, are about CSAM.
> complicité de détention d’images de mineurs présentant un caractère pédopornographique [complicity in possessing images of minors of a child sexual abuse nature]
> complicité de diffusion, offre ou mise à disposition en bande organisée d'image de mineurs présentant un caractère pédopornographique [complicity in distributing, offering or making available, as part of an organised group, images of minors of a child sexual abuse nature]
[1]: https://www.tribunal-de-paris.justice.fr/sites/default/files...
- mschuster91 8 hours ago> I'm not at all familiar with French law, and I don't have any sympathy for Elon Musk or X. That said, is this a crime?
GDPR and DMA actually have teeth. They just haven't been shown yet because the usual M.O. for European law violators is first, a free reminder "hey guys, what you're doing is against the law, stop it, or else". Then, if violations continue, maybe two or three rounds follow... but at some point, especially if the violations are openly intentional (and Musk's behavior makes that very very clear), the hammer gets brought down.
Our system is based on the idea that we institute complex regulations, and when they get introduced and stuff goes south, we assume that it's innocent mistakes first.
And in addition to that, there's the geopolitical aspect... basically, hurt Musk to show Trump that, yes, Europe means business and has the means to fight back.
As for the allegations:
> The probe has since expanded to investigate alleged “complicity” in spreading pornographic images of minors, sexually explicit deepfakes, denial of crimes against humanity and manipulation of an automated data processing system as part of an organised group, and other offences, the office said in a statement Tuesday.
The GDPR/DMA stuff was just the opener anyway. CSAM isn't liked by authorities at all, and genocide denial (we're not talking about Palestine here, calm your horses y'all, we're talking about Holocaust denial) is a crime in most European jurisdictions (as are the straight-arm salute and other displays of fascist insignia). We actually learned something from WW2.
- DaSHacka 10 hours ago[flagged]
- BrandoElFollito 11 hours agoWhy "lol"?[-]
- verdverm 11 hours ago420 is a stoner number, stoners lol a lot, thought of Elmo's failed joint smoking on JRE before I stopped watching
...but then other commenters reminded me there is another thing on the same date, which might have been more the actual troll at Elmo to get him all worked up
[-]- BrandoElFollito 10 hours agoWell yes, if France24 was using "20 April 2026" as we write here, there would be no misunderstanding.
I believe people are looking too much into 20 April → 4/20 → 420
[-]- Findecanor 3 hours agoI believe the French format the date 20/4 ... and the time 16 h 20
- LightBug1 9 hours agoApril 20th most definitely is international stoners day. And I like what the French have done here![-]
- thaumasiotes 4 hours agoI assume in France international stoners' day falls on the 4th of Duodevigintiber.
- verdverm 10 hours agoThanks for the cultural perspective / reminder, yes that is definitely an American automatic translation
- xdennis 8 hours ago> lol, they summoned Elon for a hearing on 420
No. It's 20 April in the rest of the world: 204.
- robtherobber 20 hours ago> The prosecutor's office also said it was leaving X and would communicate on LinkedIn and Instagram from now on.
I mean, perhaps it's time to completely drop these US-owned, closed-source, algo-driven controversial platforms, and start treating the communication with the public that funds your existence in different terms. The goal should be to reach as many people as possible, of course, but also to ensure that the method and medium of communication is in the interest of the public at large.
[-]- Mordisquitos 17 hours agoI agree with you. In my opinion it was already bad enough that official institutions were using Twitter as a communication platform before it belonged to Musk and started to restrict visibility to non-logged in users, but at least Twitter was arguably a mostly open communication platform and could be misunderstood as a public service in the minds of the less well-informed. However, deciding to "communicate" at this day and age on LinkedIn and Instagram, neither of which ever made a passing attempt to pretend to be a public communications service, boggles the mind.[-]
- chrisjj 15 hours ago> official institutions were using Twitter as a communication platform before it belonged to Musk and started to restrict visibility to non-logged in users
... thereby driving up adoption far better than Twitter itself could. Ironic or what.
- nonethewiser 16 hours ago>I mean, perhaps it's time to completely drop these US-owned, closed-source, algo-driven controversial platforms
I think we are getting very close to the EU's own great firewall.
There is currently a sort of identity crisis in the regulation. Big tech companies are breaking the laws left and right. So which is it?
- fine harvesting mechanism? Keep as-is.
- true user protection? Blacklist.
[-]- lokar 12 hours agoOr the companies could obey the law
- morkalork 12 hours agoIn an ideal world they'd just have an RSS feed on their site and people, journalists, would subscribe to it. Voilà!
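As an aside, consuming such a feed needs nothing but a URL and a few lines of standard-library code; the feed address below is hypothetical, purely for illustration:

    # Minimal RSS consumption sketch: fetch a (hypothetical) feed and print the
    # latest items. Standard library only; no account, no algorithm, no platform.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "https://example.gouv.fr/presse/rss.xml"  # hypothetical feed URL

    with urllib.request.urlopen(FEED_URL) as resp:
        root = ET.fromstring(resp.read())

    for item in root.iter("item"):
        title = item.findtext("title", default="(no title)")
        link = item.findtext("link", default="")
        date = item.findtext("pubDate", default="")
        print(f"{date}  {title}\n  {link}")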
- spacecadet 19 hours agoThis. What a joke. I'm still waiting on my tax refund from NYC for plastering "twitter" stickers on every publicly funded vehicle.
- valar_m 18 hours ago>The goal should be to reach as many people, of course, but also to ensure that the method and medium of communication is in the interest of the public at large.
Who decides what communication is in the interest of the public at large? The Trump administration?
[-]- robtherobber 16 hours agoYou appear to have posted a bit of a loaded question here, apologies if I'm misinterpreting your comment. It is, of course, the public that should decide what communication is of public interest, at least in a democracy operating optimally.
I suppose the answer, if we're serious about it, is somewhat more nuanced.
To begin, public administrations should not get to unilaterally define "the public interest" in their communication, nor should private platforms for that matter. Assuming we're still talking about a democracy, the decision-making should happen democratically, via a combination of law + rights + accountable institutions + public scrutiny, with implementation constraints that maximise reach, accessibility, auditability, and independence from private gatekeepers. The last bit is rather relevant, because the private sector's interests and the citizen's interests are nearly always at odds in any modern society, hence the state's roles as rule-setter (via democratic processes) and arbiter. Happy to get into further detail regarding the actual processes involved, if you're genuinely interested.
That aside - there are two separate problems that often get conflated when we talk about these platforms:
- one is reach: people are on Twitter, LinkedIn, Instagram, so publishing there increases distribution; public institutions should be interested in reaching as many citizens as possible with their comms;
- the other one is dependency: if those become the primary or exclusive channels, the state's relationship with citizens becomes contingent on private moderation, ranking algorithms, account lockouts, paywalls, data extraction, and opaque rule changes. That is entirely and dangerously misaligned with democratic accountability.
A potential middle position could be to use commercial social platforms as secondary distribution instead of the authoritative channel, which in reality is often the case. However, due to the way societies work and how individuals operate within them, the public won't actually come across the information until it's distributed on the most popular platforms. Which is why some argue that they should be treated as public utilities since dominant communications infrastructure has quasi-public function (rest assured, I won't open that can of worms right now).
Politics is messy in practice, as all balancing acts are - a normal price to pay for any democratic society, I'd say. Mix that with technology, social psychology and philosophies of liberty, rights, and wellbeing, and you have a proper head-scratcher on your hands. We've already done a lot to balance these, for sure, but we're not there yet and it's a dynamic, developing field that presents new challenges.
[-]- direwolf20 7 hours agoPublic institutions can use any system they want and make the public responsible for reading it.
- TZubiri 10 hours agoWhy would X have offices in France? I'm assuming it's just to hire French workers? Probably leftover from the Pre Acquisition era.
Or is there any France-specific compliance that must be done in order to operate in that country?
[-]- mike-the-mikado 10 hours agoX makes its money selling advertising. France is the obvious place to have an office selling advertising to a large European French-speaking audience.[-]
- joshuaheard 8 hours agoYes, Paris is an international capital and centrally located for Europe, the Middle East, and Africa. Many tech companies have sales offices there.
- isodev 2 hours agoGood and honestly it’s high time. There used to be a time when we could give corps the benefit of the doubt but that time is clearly over. Beyond the CSAM, X is a cesspool of misinformation and generally the worst examples of humanity.
- lukasm 7 hours agoThis is a show of resolve.
"Uh guys, little heads up: there are some agents of federal law enforcement raiding the premises, so if you see that. That’s what that is."
- r721 11 hours agoAnother discussion: https://news.ycombinator.com/item?id=46872894
- darepublic 5 hours agoI remember encountering questionable hentai material (by accident) back in the Twitter days. But back then twitter was a leftist darling[-]
- nemomarx 5 hours agoI think there's a difference between "user uploaded material isn't properly moderated" and "the sites own chatbot generates porn on request based on images of women who didn't agree to it", no?[-]
- nailer 3 hours agoBut it doesn't. Grok has always had aggressive filters on sexual content, just like every other generative AI tool.
People have found exploits, just like with every other generative AI tool.
- fumar 2 hours agoDefine "leftist" for back in the Twitter days? I used Twitter early after its release. I don't recall it being a faction-specific platform.
- techblueberry 4 hours agoDid you report it or just let it continue doing harm?
- jongjong 8 hours agoOnce you've worked long enough in the software industry, you start to understand it's all just a fully planned economy.
- sleepybrett 8 hours agoI guess this means that building the neverending 'deepfake CSAM on demand machine' was a bad idea.
- scotty79 11 hours agoFacebook offices should routinely be raided for aiding and profiting from the various scams propagated through ads on that platform.[-]
- bluescrn 9 hours agoGovernments don't care about minor scams. Political speech against them, on the other hand...
- DaSHacka 10 hours agoThat would apply to any and all social media though[-]
- ToucanLoucan 10 hours agoSounds awesome, when do we start?
- mkoubaa 6 hours agoGovernments prosecute violations of laws in ways that suit their interest. News at 11
- pogue 20 hours agoFinally, someone is taking action against the CSAM machine operating seemingly without penalty.[-]
- tjpnz 5 hours agoIt's also a massive problem on Meta. Hopefully this action isn't just a one-off.
- chrisjj 20 hours agoI am not a fan of Grok, but there has been zero evidence of it creating CSAM. For why, see https://www.iwf.org.uk/about-us/[-]
- mortarion 19 hours agoCSAM does not have a universal definition. In Sweden, for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response. If you take a picture of a 14-year-old girl (the age of consent is 15) and use Grok to give her a bikini, or make her topless, then you are most definitely producing and possessing CSAM.
No abuse of a real minor is needed.
[-]- logicchains 18 hours ago[flagged][-]
- ffsm8 16 hours agoHe made no judgement in his comment, he just observed the fact that the term CSAM - in at least the specified jurisdiction - applies to generated pictures of teenagers, whether or not real people were subjected to harm.
I suspect none of us are lawyers with enough legal knowledge of the French law to know the specifics of this case
[-]- yafinder 16 hours ago[flagged]
- chrisjj 18 hours ago> CSAM does not have a universal definition.
Strange that there was no disagreement before "AI", right? Yet now we have a clutch of new "definitions" all of which dilute and weaken the meaning.
> In Sweden for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response.
No corroboration found on web. Quite the contrary, in fact:
"Sweden does not have a legislative definition of child sexual abuse material (CSAM)"
https://rm.coe.int/factsheet-sweden-the-protection-of-childr...
> If you take a picture of a 14 year old girl (age of consent is 15) and use Grok to give her bikini, or make her topless, then you are most definately producing and possessing CSAM.
> No abuse of a real minor is needed.
Even the Google "AI" knows better than that. CSAM "is considered a record of a crime, emphasizing that its existence represents the abuse of a child."
Putting a bikini on a photo of a child may be distasteful abuse of a photo, but it is not abuse of a child - in any current law.
[-]- lava_pidgeon 18 hours ago" Strange that there was no disagreement before "AI", right? Yet now we have a clutch of new "definitions" all of which dilute and weaken the meaning. "
Are you from Sweden? Why do you think the definition was clear across the world and not changed "before AI"? Or is it some USDefaultism where Americans assume their definition was universal?
[-]- chrisjj 18 hours ago> Are you from Sweden?
No. I used this interweb thing to fetch that document from Sweden, saving me a 1000-mile walk.
> Why do you think the definition was clear across the world and not changed "before AI"?
I didn't say it was clear. I said there was no disagreement.
And I said that because I saw only agreement. CSAM == child sexual abuse material == a record of child sexual abuse.
[-]- lava_pidgeon 17 hours ago"No. I used this interweb thing to fetch that document from Sweden, saving me a 1000-mile walk."
So you can't speak Swedish, yet you think you grasped the Swedish law definition?
" I didn't say it was clear. I said there was no disagreement. "
Sorry, there are lots of different judicial definitions of CSAM in different countries, each with different edge cases and ways of handling them. I very much doubt it; there is disagreement.
But my guess about your post is that an American has to learn, again, that there is a world outside of the US with different rules and different languages.
[-]- chrisjj 16 hours ago> So you cant speak Swedish, yet you think you grasped the Swedish law definition?
I guess you didn't read the doc. It is in English.
I too doubt there's material disagreement between judicial definitions. The dubious definitions I'm referring to are the non-judicial fabrications behind accusations such as the root of this subthread.
[-]- lava_pidgeon 15 hours ago" I too doubt there's material disagreement between judicial definitions. "
Sources? Sorry, your gut feeling does not matter, especially if you are not a lawyer.
[-]- chrisjj 14 hours agoI have no gut feeling here. I've seen no disagreeing judicial definitions of CSAM.
Feel free to share any you've seen.
- rented_mule 17 hours ago> Even the Google "AI" knows better than that. CSAM "is [...]"
Please don't use the "knowledge" of LLMs as evidence or support for anything. Generative models generate things that have some likelihood of being consistent with their input material, they don't "know" things.
Just last night, I did a Google search related to the cell tower recently constructed next to our local fire house. Above the search results, Gemini stated that the new tower is physically located on the Facebook page of the fire department.
Does this support the idea that "some physical cell towers are located on Facebook pages"? It does not. At best, it supports that the likelihood that the generated text is completely consistent with the model's input is less than 100% and/or that input to the model was factually incorrect.
[-]- chrisjj 16 hours agoThanks. For a moment I slipped and fell for the "AI" con trick :)
- fmbb 18 hours ago> - in any current law.
It has been since at least 2012 here in Sweden. That case went to our highest court and they decided a manga drawing was CSAM (maybe you are hung up on this term though, it is obviously not the same in Swedish).
The holder was not convicted but that is beside the point about the material.
[-]- chrisjj 16 hours ago> It has been since at least 2012 here in Sweden. That case went to our highest court
This one?
"Swedish Supreme Court Exonerates Manga Translator Of Porn Charges"
https://bleedingcool.com/comics/swedish-supreme-court-exoner...
It has zero bearing on the "Putting a bikini on a photo of a child ... is not abuse of a child" you're challenging.
> and they decided a manga drawing was CSAM
No they did not. They decided "may be considered pornographic". A far lesser offence than CSAM.
- lawn 18 hours agoIn Swedish:
https://www.regeringen.se/contentassets/5f881006d4d346b199ca...
> Även en bild där ett barn t.ex. genom speciella kameraarrangemang framställs på ett sätt som är ägnat att vädja till sexualdriften, utan att det avbildade barnet kan sägas ha deltagit i ett sexuellt beteende vid avbildningen, kan omfattas av bestämmelsen.
Which, translated, means that the child does not have to be a part of sexual acts, and indeed undressing a child using AI could be CSAM.
I say "could" because all laws are open to interpretation in Sweden and it depends on the specific image. But it's safe to say that many images produced by Grok are CSAM by Swedish standards.
- freejazz 12 hours agoWhere do these people come from???
- drcongo 14 hours agoThe lady doth protest too much, methinks.[-]
- direwolf20 10 hours agoThat's the problem with CSAM arguments, though. If you disagree with the current law and think it should be loosened, you're a disgusting pedophile. But if you think it should be tightened, you're a saint looking out for the children's wellbeing. And so laws only go one way...
- tokai 18 hours ago"Sweden does not have a legislative definition of child sexual abuse material (CSAM)"
Because that is up to the courts to interpret. You can't use your common law experience to interpret the law in other countries.
[-]
- moolcool 17 hours agoAre you implying that it's not abuse to "undress" a child using AI?
You should realize that children have committed suicide before because AI deepfakes of themselves have been spread around schools. Just because these images are "fake" doesn't mean they're not abuse, and that there aren't real victims.
[-]- chrisjj 15 hours ago> Are you implying that it's not abuse to "undress" a child using AI?
Not at all. I am saying just it is not CSAM.
> You should realize that children have committed suicide before because AI deepfakes of themselves have been spread around schools.
Its terrible. And when "AI"s are found spreading deepfakes around schools, do let us know.
[-]- enaaem 4 hours agoWhy do you want to keep insisting that undressing children is not CSAM? It's a weird hill to die on..
- mrtksn 11 hours agoCSAM: Child Sexual Abuse Material.
When you undress a child with AI, especially publicly on Twitter or privately through DM, that child is abused using the material the AI generated. Therefore CSAM.
[-]- chrisjj 5 hours ago> When you undress a child with AI,
I guess you mean pasting a naked body on a photo of a child.
> especially publicly on Twitter or privately through DM, that child is abused using the material the AI generated.
In which country is that?
Here in UK, I've never heard of anyone jailed for doing that. Whereas many are for making actual child sexual abuse material.
- secretsatan 19 hours agoIt doesn't mention grok?[-]
- chrisjj 18 hours agoSure does. Twice. E.g.
Musk's social media platform has recently been subject to intense scrutiny over sexualised images generated and edited on the site using its AI tool Grok.
[-]
- tehjoker 9 hours agoIt's cool that not every law enforcement agency in the world is under the complete thumb of U.S. based billionaires.
- tomlockwood 10 hours agoElon's in the files asking Epstein about "wild parties" and then doesn't seem to care about all this. Easy to draw a conclusion here.[-]
- guywithahat 8 hours agoAll I've seen is Elon tried to invite himself to the "wild parties" and they told him he couldn't come and that they weren't doing them anymore lol. It's possible he went but, from what I've seen, he wasn't ever invited.
- alex1138 9 hours agoElon is literally in the files, talking about going to the island. It's documented[-]
- yodsanklai 8 hours agoWho knows who did what on this island, and I hope we'll figure it out. But in the meantime, going to this island and/or being friends with Epstein doesn't automatically make someone a pedo or rapist.[-]
- fatherwavelet 7 hours agoAs part of the irrational mob that is out to find the witch, you are just being too rational. Down vote![-]
- anigbrowl 5 hours agoIt's odd to be so prim about someone who is notorious for irrational trolling for the sake of mob entertainment.
https://www.theguardian.com/technology/2018/jul/15/elon-musk...
- jjkaczor 8 hours agoNeither does your wife divorcing you at about the same time things started to go through legal process...
Oops... yeah, in retrospect it was even worse... no... you can and should be judged by the friends you keep and hang-out with... The same ones who seem to be circling the wagons with innocuous statements or attempts to find other scapegoats (DARVO)... hmm, what was that quote again:
"We must all hang together or we will all hang separately"
- fatbird 7 hours agoNo, but they all knew he was a pedo/rapist, and were still friends with him and went to the island of a pedo/rapist, and introduced the pedo/rapist to their friends...
We don't know how many were pedo/rapists, but we know all of them liked to socialize with one and trade favours and spread his influence.
- tomlockwood 6 hours agoYes yes such a complex situation and so hard to tell whether the guy with the pedo non-con site wanted to go to the pedo non-con island.
- hunterpayne 6 hours agoYou know the flight logs are public record and have been for a decade right? We know (and have known for awhile), exactly who was and wasn't there. Who was there: Obama, Bill Clinton, and Bill Gates (his frequency of visits cost him his marriage). Who wasn't there? Trump and Elon because at the time they weren't important enough to get an invite. All of this is a matter of public record.[-]
- tzs 4 hours agoObama is not in the flight logs and there is no evidence he was ever on the island.
- anigbrowl 5 hours agoElon Musk has his own planes, he would not have needed a ride had Epstein invited him. Recently released emails also show people (like commerce secretary Howard Lutnick, who asserted at great length last year that he hadn't had any contact with Epstein since meeting him in 2005) arranging to visit Epstein at his island and taking their own yacht over there.
- chihuahua 9 hours agoHe was only going to the island to get rid of bots on Twitter. Just like OJ spent the rest of his life looking for the real killer.[-]
- alex1138 9 hours agoIt's timestamped like 2013, I think. Years before he bought Twitter (yes, I know you're joking)[-]
- andrewflnr 8 hours agoHe was planning way ahead, like a real genius.
- etchalon 12 hours ago[flagged][-]
- jayGlow 11 hours agoIf a user uses a tool to break the law, it's on the person who broke the law, not the people who made the tool. Knife manufacturers aren't to blame if someone gets stabbed, right?[-]
- 4887d30omd8 8 hours agoThis seems different. With a knife the stabbing is done by the human. That would be akin to a paintbrush or camera or something being used to create CSAM.
Here you have a model that is actually creating the CSAM.
It seems more similar to a robot that is told to go kill someone and does so. Sure, someone told the robot to do something, but the creators of the robot really should have to put some safeguards to prevent it.
- KaiserPro 10 hours agoIf the knife manufacturer willingly broke the law in order to sell it, then yes.
If the manufacturer advertised that the knife is not just for cooking but also stabbing people, then yes.
if the knife was designed to evade detection, then yes.
- irl_zebra 10 hours agoText on the internet and all of that, but you should have added the "/s" to the end so people didn't think you were promoting this line of logic seriously.
- plagiarist 10 hours agoIf a knife manufacturer constructs an apparatus wherein someone can simply write "stab this child" on a whim to watch a knife stab a child, that manufacturer would in fact discover they are in legal peril to some extent.
- ToucanLoucan 10 hours agoI mean, no one's ever made a tool whose scope is "making literally anything you want," including, apparently, CSAM. So we're in a bit of uncharted waters, really. Like mostly, no, I would agree it's a bad idea to hold the makers of a tool responsible for how it's used. And this is an especially egregious offense on the part of said tool-maker.
Like how I see this is:
* If you can't restrict people from making kiddie porn with Grok, then it stands to reason at the very least, access to Grok needs to be strictly controlled.
* If you can restrict that, why wasn't that done? It can't be completely omitted from this conversation that Grok is, pretty famously, the "unrestrained" AI, which in most respects means it swears more, quotes and uses highly dubious sources of information that are friendly to Musk's personal politics, and occasionally spouts white nationalist rhetoric. So as part of their quest to "unwoke" Grok did they also make it able to generate this shit too?
- kouteiheika 12 hours agoThis is really amusing to watch, because everything that Grok is accused of is something which you can also trigger in currently available open-weight models (if you know what you're doing).
There's nothing special about Grok in this regard. It wasn't trained to be a MechaHitler, nor to generate CSAM. It's just relatively uncensored[1] compared to the competition, which means it can be easily manipulated to do what the users tell it to, and that is biting Musk in the ass here.
And just to be clear, since apparently people love to jump to conclusions - I'm not excusing what is happening. I'm just pointing out the fact that the only special thing about Grok is that it's both relatively uncensored and easily available to a mainstream audience.
[1] -- see the Uncensored General Intelligence leaderboard where Grok is currently #1: https://huggingface.co/spaces/DontPlanToEnd/UGI-Leaderboard
[-]- JumpCrisscross 12 hours ago> everything that Grok is accused of is something which you can also trigger in currently available open-weight models (if you know what you're doing)
Well, yes. You can make child pornography with any video-editing software. How is this exoneration?
[-]- kouteiheika 12 hours agoI'm not talking about video editing software; that's a different class of software. I'm talking about other generative AI models, which you can download today onto your computer, and have it do the same thing as Grok does.
> How is this exoneration?
I don't know; you tell me where I said it was? I'm just stating a fact that Grok isn't unique here, and if you want to ban Grok because of it then you need to also ban open weight models which can do exactly the same thing.
[-]- JumpCrisscross 10 hours ago> that's a different class of software. I'm talking about other generative AI models
And the article is talking about a social media site. A different class of software and company.
> if you want to ban Grok
Straw man. Nobody has suggested this.
[-]- Pedro_Ribeiro 10 hours agoI think he is talking about France who does very much seem like they want to ban X and Grok?[-]
- JumpCrisscross 9 hours ago> France who does very much seem like they want to ban X and Grok?
Source? I’m not seeing that in the French-language press.
- jdross 12 hours agoWell, you could not sue the maker of the video-editing software for someone making child pornography with it. You would, quite sanely, go after the pedophiles themselves.
- ls612 12 hours agoWe don't go after Adobe for doing that. We go after the person who did it.
- Marsymars 11 hours agoMaybe tying together an uncensored AI model and a social network just isn't something that's ethical / should be legal to do.
There are many things where each is legal/ethical to provide, and where combining them might make business sense, but where we, as a society have decided to not allow combining them.
- throwaway132448 12 hours agoWhataboutism on CSAM, classy. I hope this is the rock bottom for you and that things can only look up from here.[-]
- kouteiheika 12 hours agoNo. I'm just saying that people should be consistent and if they apply a certain standard to Grok then they should also apply the same standard to other things. Be consistent.
Meanwhile what I commonly see is people dunking on anything Musk-related because they dislike him, but give a free pass on similar things if it's not related to him.
[-]- brahma-dev 11 hours agoEvery island is capable of hosting pedophiles, but they don't. The one island that's famous for pedos is the one Musk wanted to be invited to. Find me more pedo islands, I'll dunk on them too very consistently. Whether it's AI with CSAM or islands with pedos, Musk is definitely consistent.
- MBlume 11 hours ago[flagged]
- lingrush4 12 hours agoEvery AI system is capable of generating CSAM and deep fakes if requested by a savvy user. The only thing this proves is that you can't upset the French government or they'll go on a fishing expedition through your office praying to find evidence of a crime.[-]
- NewsaHackO 12 hours ago>Every AI system is capable of generating CSAM and deep fakes if requested by a savvy user.
There is no way this is true, especially if the system is PaaS only. Additionally, the system should have a way to tell if someone is attempting to bypass their safety measures and act accordingly.
- Lalabadie 12 hours ago> if requested by a savvy user
Grok brought that thought all the way to "... so let's not even try to prevent it."
The point is to show just how aware X were of the issue, and that they chose to repeatedly do nothing against Grok being used to create CSAM and probably other problematic and illegal imagery.
I can't really doubt they'll find plenty of evidence during discovery, it doesn't have to be physical things. The raid stops office activity immediately, and marks the point in time after which they can be accused of destroying evidence if they erase relevant information to hide internal comms.
[-]- lingrush4 12 hours agoGrok does try to prevent it. They even publicly publish their safety prompt. It clearly shows they have disallowed the system from assisting with queries that create child sexual abuse material.
The fact that users have found ways to hack around this is not evidence of X committing a crime.
https://github.com/xai-org/grok-prompts/blob/main/grok_4_saf...
- robbru 12 hours agoGrok makes it especially easy to do so.[-]
- grunder_advice 12 hours agoWhat makes Grok special compared to random "AI gf generator 9001" which is hosted specifically with the intent of generating NSFW content?[-]
- JumpCrisscross 12 hours ago> What makes Grok special
X. xAI isn’t being raided. X is. If Instagram bought a girlfriend generator and built it into its app, it would face liability as well.
- plagiarist 9 hours agoIf AI GF Generator 9001 is producing unwilling deepfake pornography of real people, especially if of children, feel free to raid their offices as well.
- etchalon 12 hours ago[dead]
- tw04 12 hours ago>Every AI system is capable of generating CSAM and deep fakes if requested by a savvy user. The only thing this proves is that you can't upset the French government or they'll go on a fishing expedition through your office praying to find evidence of a crime.
If every AI system can do this, and every AI system is incapable of preventing it, then I guess every AI system should be banned until they can figure it out.
Every banking app on the planet "is capable" of letting a complete stranger go into your account and transfer all your money to their account. Did we force banks to put restrictions in place to prevent that from happening, or did we throw our arms up and say: oh well the French Government just wants to pick on banks?
[-]- fourseventy 12 hours agoYou can use photoshop to create CSAM too, should that be banned?
- kalterdev 10 hours agoYet another nail
- ChrisMarshallNY 8 hours ago> They have also summoned billionaire owner Elon Musk for questioning.
Good luck with that...
- dathinab 8 hours agoThe thing is, a lot of the recent legal proceedings surrounding X are about whether X fulfilled the legally required due diligence and, if not, what level of negligence we are talking about.
And the thing about negligence which causes harm to humans (instead of e.g. just financial harm) is that:
a) you can't opt out of responsibility; it doesn't matter what you put into your TOS or other contracts
b) executives who are found responsible for the negligent actions of a company can be held _personally_ liable
And independent of what X actually did, Musk, as its highest-level executive, personally:
1) frequently made statements that imply gross negligence (to be clear, that isn't necessarily how X actually acted, which is the part that actually matters)
2) claimed that all major engineering decisions etc. are his and no one else's (because he loves bragging about how good an engineer he is)
This means summoning him for questioning is, legally speaking, a must-have, independent of whether you expect him to show up or not. And he probably should take it seriously, even if that just means sending a different higher-level executive from X instead.
- sleepybrett 8 hours agoI guess he could just never enter the EU ever again. Maybe he can buy Little St. James.
- afavour 18 hours agoI’m sure Musk is going to say this is about free speech in an attempt to gin up his supporters. It isn’t. It’s about generating and distributing non consensual sexual imagery, including of minors. And, when notified, doing nothing about it. If anything it should be an embarrassment that France are the only ones doing this.
(it’ll be interesting to see if this discussion is allowed on HN. Almost every other discussion on this topic has been flagged…)
[-]- rsynnott 17 hours ago> If anything it should be an embarrassment that France are the only ones doing this.
As mentioned in the article, the UK's ICO and the EC are also investigating.
France is notably keen on raids for this sort of thing, and a lot of things that would be basically a desk investigation in other countries result in a raid in France.
[-]- chrisjj 15 hours agoFull marks to France for addressing its higher than average rate of unemployment.
/i
- cbeach 18 hours ago> when notified, doing nothing about it
When notified, he immediately:
* "implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing" - https://www.bbc.co.uk/news/articles/ce8gz8g2qnlo
* locked image generation down to paid accounts only (i.e. those individuals that can be identified via their payment details).
Have the other AI companies followed suit? They were also allowing users to undress real people, but it seems the media is ignoring that and focussing their ire only on Musk's companies...
[-]- afavour 18 hours agoYou and I must have different definitions of the word "immediately". The article you posted is from January 15th. Here is a story from January 2nd:
https://www.bbc.com/news/articles/c98p1r4e6m8o
> Have the other AI companies followed suit? They were also allowing users to undress real people
No they weren’t? There were numerous examples of people feeding the same prompts to different AIs and having their requests refused. Not to mention, X was also publicly distributing that material, something other AI companies were not doing. Which is an entirely different legal liability.
[-]- chrisjj 15 hours ago> Which is an entirely different legal liability.
In UK, it is entirely the same. Near zero.
Making/distributing a photo of a non-consenting bikini-wearer is no more illegal when originated by a computer in a bedroom than when taken by a camera on a public beach.
[-]- lokar 12 hours agoI thought this was about France[-]
- chrisjj 12 hours agoIt was... until it diverted. https://news.ycombinator.com/item?id=46870196
- bonesss 18 hours agoThe part of X’s reaction to their own publishing I’m most looking forward to seeing in slow-motion in the courts and press was their attempt at agency laundering by having their LLM generate an apology in first-person.
“Sorry I broke the law. Oops for reals tho.”
- freejazz 12 hours agoKiddie porn but only for the paying accounts!
- techblueberry 18 hours ago[flagged]
- gulfofamerica 18 hours ago[dead]
- SilverElfin 14 hours agoSurprised the EU hasn’t banned it yet given that the platform is manipulated by Musk to destabilize Europe and move it towards the far right. The child abuse feels like a smaller problem compared to that risk.[-]
- Bender 14 hours agoIn my opinion, the reason they raided the offices over CSAM would be that there are laws on the books for CSAM and not for social manipulation. If people could be jailed for manipulation there would be no social media platforms, lobbyists, political campaign groups or advertisements. People are already being manipulated by AI.
On a related note given AI is just a tool and requires someone to tell it to make CSAM I think they will have to prove intent possibly by grabbing chat logs, emails and other internal communications but I know very little about French law or international law.
[-]- caminante 12 hours agoIt's broader and mentioned in the article:
>French authorities opened their investigation after reports from a French lawmaker alleging that biased algorithms on X likely distorted the functioning of an automated data processing system. It expanded after Grok generated posts that allegedly denied the Holocaust, a crime in France, and spread sexually explicit deepfakes, the statement said.
[-]- chrisjj 5 hours agoBroader still.
and fraudulent data extraction by an organised group.
- trcarney 8 hours agoHold on, are you saying that you should be able to be jailed for manipulation? Where would that end? Could I be jailed for posting a restaurant review that you feel manipulated you? Anyone stating an opinion could be construed as manipulating. That is beyond a slippery slope; that is an authoritarian nightmare.[-]
- Bender 6 hours agoI believe the context I was proposing would be at the scale of world-wide manipulation. Rigging elections and such. There is a Netflix documentary called "The Great Hack" that gets into what I am discussing though from the perspective of social media algorithm. This only gets more effective when people are chatting with an AI bot that mimics a human and they think is their significant other that laughs at all their jokes and strokes their ego.
I think your interpretation would be more along the line of making 1984, Brave New World, Fahrenheit 451 and The Handmaid's Tale a reality.
[-]- trcarney 4 hours agoYeah, I get that. I just hesitate to give any government even more power than it has now to silence people, which it would definitely use any law like that to do.
I will have to check that out, it sounds interesting. It was also pretty obvious how all the social media companies pushed the same narrative through COVID.
I don't like how these social networks and the media try to manipulate things but I don't think giving the government even more power will fix anything. It will probably make it worse. I think even if you had those laws on the books, you would still get manipulation through selective enforcement.
I think the only solution is education and individuals saying no to these platforms and their algorithmic feeds. I think we are already seeing a growing movement towards people either not using social media or using it way less than they did previously. I know for me personally, I use X but only follow tech people I like and only look at the "following" tab. It is a much better experience than the "for you" tab.
- gf000 8 hours agoSo you think writing a review is somehow on the same magnitude as social media platforms with 300 million-3 billion users?
And how is that different from TV channels/media en large having laws to abide by? Slippery slope arguments are themselves slippery slopes..
- trcarney 6 hours agoThe TV station thing (talking about the US here) only applies to broadcast TV, and it is a condition of getting a frequency allotment from the government.
No, I am not saying that it is the same. I am saying that it would start as "we are just going after the tech companies," but if you give the government an inch they will take a mile. They would take that, and expand upon the hate speech stuff you already see around the world, as an excuse to arrest whoever they wanted.
I am a free-market person, so I think these sites are providing something to the market that people like or they wouldn't be there. If you want to rein them in, fine, but you have to be careful how you word stuff or it gets pretty scary pretty quickly.
- chrisjj 5 hours ago> I think the reason they raided the offices for CSAM
Sigh. The French raid statement makes no mention of CSAM.
- FireBeyond 13 hours agoI had to make a choice to not even use Grok (I wasn't overly interested in the first place, but wanted to review how it might compare to the other tools), because even just the Explore option shows photos and videos of CSAM, CSAM-adjacent, and other "problematic" things in a photorealistic manner (such as implied bestiality).
Looking at the prompts below some of those images shows that even now, there's almost zero effort at Grok to filter prompts that are blatantly looking to create problematic material. People aren't being sneaky and smart and wordsmithing subtle cues to try to bypass content filtering, they're often saying "create this" bluntly and directly, and Grok is happily obliging.
- SilverElfin 14 hours agoGiven America passed PAFACA (intended to ban TikTok, which Trump instead put in the hands of his friends), I would think Europe would also have a similar law. Is that not the case?[-]
- Bender 14 hours agoAre you talking about this [1]? I don't know the answer to your question whether or not the EU has the same policy. That is talking about control by a foreign adversary.
I think that would delve into whether or not the USA would be considered a foreign adversary to France. I was under the impression we were allies since like the 1800s or so despite some little tiffs now and again.
[1] - https://www.congress.gov/bill/118th-congress/house-bill/7521
[-]- direwolf20 13 hours agoEngineerUSA needs to vastly change his tone to avoid being flagged. I vouched it because it's broadly true but the wording could be a LOT better.
- EngineerUSA 13 hours ago[flagged]
- q3k 9 hours agoThere's no tool, technological or legal, to block/ban a website EU-wide.[-]
- blell 14 minutes agoThey banned Russia Today EU-wide.
- tjpnz 5 hours agoThe EU can declare a company a criminal enterprise and the financial industry must then prevent EU citizens from transacting with them.
- IOT_Apprentice 9 hours agoThey will configure their DNS servers to blackhole X's domains. That can be done in each country. They can use deep packet inspection tools and go from there. If the decision is EU-wide then they will roll that out.[-]
- q3k 7 hours agoThere is no law that would permit the EU to do this. This would be a huge thing to introduce and implement, probably a 2-3 year project, and would almost certainly be strongly opposed by multiple member countries.
- fluoridation 9 hours agoDeep packet inspection? What do you mean? Are you talking about domain name confiscation or building a Great Firewall of EU?
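For concreteness, a minimal sketch of what resolver-level blocking (the DNS approach suggested a couple of comments up) amounts to in practice. The blocklist here is hypothetical, and real deployments would use resolver policy such as RPZ zones or ISP-level configuration rather than a script:

    # Illustrative sketch of resolver-level blocking: a lookup function that answers
    # from a blocklist with a sinkhole address and passes everything else through to
    # normal resolution. Purely hypothetical policy, for illustration only.
    import socket

    SINKHOLE = "0.0.0.0"
    BLOCKLIST = {"x.com", "twitter.com"}   # hypothetical blocklist

    def resolve(hostname: str) -> str:
        # Normalize the name the way resolvers do.
        name = hostname.rstrip(".").lower()
        if name in BLOCKLIST or any(name.endswith("." + d) for d in BLOCKLIST):
            return SINKHOLE                 # blocked: answer with the sinkhole address
        return socket.gethostbyname(name)   # otherwise resolve normally

    print(resolve("x.com"))        # -> 0.0.0.0
    print(resolve("example.org"))  # -> a real address (requires network access)

The point of the sketch is that this is policy on the resolver, not a firewall dropping connections, which is also why it is trivially bypassed by switching to a non-participating resolver.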
- BrandoElFollito 10 hours agoI am not surprised at all. Independent of whether this is true, such a decision from the EU would never be acted upon. The number of layers between the one who says "ban it" somewhere in Brussels and the operator blackholing the DNS and filtering traffic is enormous; crossing them would take decades.[-]
- bulbar 9 hours agoWhy do you think that? It can take a few years for national laws to be put in place, but that also depends on how much certain countries push it. Regarding internet traffic, I assume a few specific countries that route most of the traffic would be enough to stop operation for the most part.[-]
- BrandoElFollito 8 hours agoHave you ever seen an actual EU-wide decision on such matters and an actual application?
The closest I can think of is GDPR, which has its great aspects, and also the cookies law (which is incorrectly interpreted). And some things like private IPs being PII, which promotes nonsensical "authorities notifications" that are not used afterwards.
We have consulting companies doing yearly audits on companies to close the books. And yet hacks happen all the time. Without consequences.
There is an ocean between what is announced and lives on paper vs. the reality of the application. If you work in compliance and cybersecurity you see this every day.
- lowkey_ 12 hours ago> The child abuse feels like a smaller problem compared to that risk.
I think we can and should all agree that child sexual abuse is a much larger and more serious problem than political leanings.
It's ironic as you're commenting about a social media platform, but I think it's frightening what social media has done to us with misinformation, vilification, and echo chambers, to think political leanings are worse than murder, rape, or child sexual abuse.
[-]- preisschild 8 hours agoThose innocent "political leanings" get people killed. See the ICE killings in Minneapolis.
- lingrush4 12 hours agoIn fairness, AI-generated CSAM is nowhere near as evil as real CSAM. The reason why possession of CSAM was such a serious crime is because its creation used to necessitate the abuse of a child.
It's pretty obvious the French are deliberately conflating the two to justify attacking a political dissident.
[-]- lowkey_ 12 hours agoDefinitely agree on which is worse! To be clear, I'm not saying I agree with the French raid. Just that statements about severe crimes (child sexual abuse for the above poster - not AI-generated content) being "lesser problems" compared to politics are a concerning measure of how people are thinking.
- chrisjj 5 hours ago> The reason why possession of CSAM was such a serious crime is because its creation used to necessitate the abuse of a child.
Used to? Still does. A convincing fake is still only a fake.
> It's pretty obvious the French are deliberately conflating the two to justify attacking a political dissident.
Agreed. But the same conflation in the comments hereabouts is ... puzzling.
I mean, abuse of a photo == abuse of a child? Like, voodoo dolls? Creepy.
- jjkaczor 8 hours agoIt may not be worse "objectively" and in direct harm.
However - it has one big problem that is rarely discussed... the normalization of behaviour, interests, and attitudes. It just becomes a thing that Grok can do - for paid accounts - and people think, "ok, no harm, no problem"... Long-term, there will be harm. This has been demonstrated over decades of investigation of CSAM.
- sunshine-o 12 hours agoSimply because if you were to ban this type of platform you wouldn't need Musk to "move it towards the far right" because you would already be the very definition of a totalitarian regime.
But whatever zombie government France is running can't "ban" X anyway because it would get them one step closer to the guillotine. Like in the UK or Germany it is a tinderbox cruising on a 10-20% approval rating.
If "French prosecutor" want to find a child abuse case they can check the Macron couple Wikipedia pages.
[-]- bulbar 9 hours agoWhat do you mean with "this type of platform"? Platforms that don't follow (any) national laws have been banned in multiple countries over the years.
By itself this isn't extraordinary in a democracy.
[-]- rvnx 8 hours agoand France is known for filtering internet access: domains are blocked (over 4,000 added per year), including porn sites but also news websites
- JumpCrisscross 10 hours ago> if you were to ban this type of platform you wouldn't need Musk to "move it towards the far right" because you would already be the very definition of a totalitarian regime
Paradox of tolerance. (The American right being Exhibit A for why trying to let sunlight disinfect a corpse doesn’t work.)
- blell 12 hours agoBig platforms and media are only good if they try to move the populace to the progressive, neoliberal side. Otherwise we need to put their executives in jail.
- direwolf20 13 hours agoAlmost like the EU can't just ban speech on a whim the way US far right people keep saying it can.
- 936966931646863 9 hours ago[flagged]
- Uhhrrr 12 hours ago[flagged][-]
- latexr 9 hours ago> fairly open platform where people can choose what to post and who to follow.
It is well known Musk amplifies his own speech and the words of those he agrees with on the platform, while banning those he doesn’t like.
https://www.theguardian.com/commentisfree/2024/jan/15/elon-m...
> could you clarify what the difference is between the near right and the far right?
It’s called far-right because it’s further to the right (starting from the centre) than the right. Wikipedia is your friend, it offers plenty of examples and even helpfully lays out the full spectrum in a way even a five year old with a developmental impairment could understand.
[-]- Uhhrrr 8 hours ago[flagged][-]
- antiframe 8 hours agoI was surprised by your claim that Wikipedia would categorize mild restrictions on immigration as an element of far-right politics, so I read that article to see it for myself. I didn't see anything about mild restrictions. Would you care to point out where you saw that?[-]
- Uhhrrr 8 hours ago[flagged][-]
- antiframe 7 hours agoWell, far right is a spectrum, obviously. But a party that equates immigration by members of a particular religion with terrorism is not "mild immigration restrictions" in my reading.
I cross-checked Wikipedia's information with another source: https://www.connexionfrance.com/news/french-election-is-it-c...
[-]- Uhhrrr 7 hours agoI don't know about that party, but National Rally doesn't say that, and also polls around 34% of French people. So it remains that the Wikipedia "far right" definition is a very wide spectrum.[-]
- antiframe 7 hours agoUm, the article I posted was about the same party. The BBC considers them far-right [1], Politico considers them far-right [2], Reuters considers them far-right [3], AP News considers them far-right [4], NBC News considers them far-right [5], the New York Times considers them far-right [6], Deutsche Welle considers them far-right [7].
I don't think the Wikipedia characterization is far off a pretty commonly held sentiment. You are, of course, able to disagree and consider them far-left, center, or whatever label you want.
You stated earlier that because Wikipedia called mild immigration reform far-right (which it did not, to my reading, which is why you pointed to National Rally as an example), words don't mean anything. But words do mean things by consensus, and from my reading the consensus is that National Rally is far-right.
Of course, many far-right (and far-left) thinkers consider themselves centrists or mild, so there will be disagreement.
[1]: https://www.bbc.com/news/articles/cxeee385en1o [2]: https://www.politico.eu/article/france-far-right-faces-inter... [3]: https://www.reuters.com/world/europe/le-pens-far-right-waiti... [4]: https://apnews.com/article/france-election-le-pen-national-r... [5]: https://www.nbcnews.com/world/europe/france-raid-far-right-n... [6]: https://www.nytimes.com/2024/07/02/world/europe/france-natio... [7]: https://www.dw.com/en/france-far-right-rally-after-marine-le...
[-]- Uhhrrr 6 hours agoThe article you posted said, "we just call them that because everyone else does".
But there's also an obvious semantic fail when 34% of the electorate is "far right". This means (16% - half the moderate percentage) is on the non-far right. It implies that "far" is just meaningless cant.
[-]- antiframe 5 hours agoWhere are you getting 34% of the electorate identify as far right from? I tried to find numbers and failed.
- 10xDev 12 hours agoThis is obviously a diversion, but anyway: a bunch of the "American and European" "patriots" that he retweets 24/7 turned out to be people from Iran, Pakistan, India and Russia. These accounts get likes by default from accounts with "wife of vet" in the bio and a generic old_blonde_women.jpeg - aka bots.[-]
- Uhhrrr 12 hours ago[flagged][-]
- gyudin 10 hours agoPeople having opinions other than those of the globalist elites is destabilizing to their reign :))[-]
- p_j_w 4 hours agoAre we implying that Musk isn’t part of the global elite?
- preisschild 8 hours agoYou meant to write "literal Russian state-sponsored bots"
- nemo44x 10 hours agoThey can’t fathom that their opinions are unpopular and probably wrong.
- rienbdj 12 hours agoElon fiddles with the algorithm to boost certain accounts. Some accounts are behind an auth wall and others are not. It’s open but not even.[-]
- Uhhrrr 12 hours ago[flagged][-]
- aucisson_masque 9 hours agoIt's pretty obvious; the media is called the fourth power (i.e. the fourth estate) for a reason.
Control the media and you control the information that a significant part of Europeans get. Elections aren't won by convincing 50% of people; you only need to convince 4 or 5% of the population that the far right is great.
- sunaookami 8 hours agoSchrödinger's social network: it's somehow irrelevant but somehow "destabilizes our democracy" ;)
- gmd63 10 hours agoIt gives people who aren't aware of the bot accounts / thumb on the scale the perception that insane crackpot delusions are more popular than they are.
There is a reason Musk paid so much for Twitter. If this stuff had no effect he wouldn't have bought it.
[-]- 936966931646863 9 hours ago[flagged][-]
- javascriptfan69 8 hours agoSocial media should not allow algorithms to actively AMPLIFY disinformation to the public.
If people want to post disinformation, that's fine, but the way these companies push that information onto users is the problem. There either needs to be accountability for platforms or a ban on behavior-driven content feeds.
People lying on the internet is fine. Social media algorithms amplifying the lie because it has high engagement is destroying our society.
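(As a rough illustration of the dynamic being described: a minimal, hypothetical engagement-weighted ranker in Python. The weights and field names are invented for this sketch and are not any platform's actual algorithm; the point is simply that nothing in the score looks at whether a post is true, only at how much reaction it provokes.)

    # Minimal sketch of an engagement-driven feed ranker (hypothetical weights,
    # not any real platform's algorithm). Truthfulness never enters the score.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int
        replies: int
        reshares: int

    def engagement_score(p: Post) -> float:
        # Replies and reshares are weighted more heavily than likes because they
        # generate further impressions; outrage tends to maximise exactly these.
        return 1.0 * p.likes + 3.0 * p.replies + 5.0 * p.reshares

    def rank_feed(posts):
        return sorted(posts, key=engagement_score, reverse=True)

    if __name__ == "__main__":
        feed = rank_feed([
            Post("Calm, sourced correction", likes=120, replies=10, reshares=15),
            Post("Outrageous false claim", likes=80, replies=400, reshares=250),
        ])
        for p in feed:
            print(round(engagement_score(p)), p.text)

Under a scoring rule like this, the false claim ranks first (2530 vs 225), which is the amplification problem in a nutshell.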
- javascriptfan69 10 hours agoThe same way that social media has destabilized the USA.
By exposing people to a flood of misinformation and politically radicalizing content designed to maximize engagement via emotion (usually anger).
Remember when Elon Musk claimed that he was going to find a trillion dollars (a year) in waste, fraud, and abuse with DOGE? Did he ever issue a correction on that statement after catastrophically failing to do so? Do you think that kind of messaging might damage trust in our institutions?
[-]- bulbar 9 hours ago> Did he ever issue a correction on that statement after catastrophically failing to do so?
To be 'fair', finding fraud was never the real purpose of DOGE, just a fake argument that enough citizens would find plausible.
- verdverm 12 hours ago> where people can choose
How true is this really?
We certainly have data points to show Musk has put his thumb on the scale
[-]- Uhhrrr 12 hours ago[flagged][-]
- verdverm 12 hours agoWhile there may be some feeds on Xitter that are basic algorithms, (1) it's not the only feed, (2) there may still be less mechanical algorithmic choices within Following (what order, what mix, how much), and (3) evidence to the contrary exists; are you freeing yourself of facts?
I haven't dug into whatever they open sourced about the algorithm to make definitive statements. Regardless, there are many pieces out there where you can learn about the evidence for direct manipulation.
[-]- Uhhrrr 11 hours ago[flagged][-]
- verdverm 11 hours ago> You can just go on the app yourself and verify this
That's not how science and statistics work. Comprehensive evidence and analysis is a search or chatbot away. The legal cases will go into the details as well, by nature of how legal proceedings work.
[-]- Uhhrrr 9 hours ago[flagged]
- mcintyre1994 12 hours agoIn case you're not playing dumb, the term you're looking for would be centre right.
- SilverElfin 11 hours agoFar right to me is advocating for things that discriminate based on protected traits like race, sex, etc. So if you’re advocating for “white culture” above others, that’s far right. If you’re advocating for the 19th amendment (women’s right to vote) to be repealed (as Nick Fuentes and similar influencers do), that’s also far right. Advocating for ICE to terrorize peaceful residents, violate constitutional rights, or outright execute people is also far right.
Near right to me is advocating for things like lower taxes or different regulations or a secure border (but without the deportation of millions who are already in the country and abiding by laws). Operating the government for those things while still respecting the law, upholding the constitution, defending civil rights, and avoiding the deeply unethical grifting and corruption the Trump administration has normalized.
Obviously this is very simplified. What are your definitions out of curiosity?
[-]- Uhhrrr 8 hours agoI think your definition is mostly fine, although deporting illegal immigrants is a moderate position, not near right.
And I would agree with the other reply that Musk is not far right by that definition.
- phasnox 10 hours agoBy your definition Musk is not far right.
> Avoiding the deeply unethical grifting and corruption the Trump administration has normalized.
Care to give examples of these?
[-]- uep 10 hours agoI hate to wade into this cesspool. How about some of the real obvious ones:
* Cryptocurrency rug pulls (World Liberty Financial)
* Donations linked with pardons (Binance)
* Pardoning failed rebels of a coup that favored him (Capitol rioters)
* Bringing baseless charges against political enemies and journalists (Comey, Letitia James, Don Lemon)
* Musk (DOGE) killing government regulatory agencies that had investigations and cases against his companies
This is with two minutes of thought while waiting for a compile. I'm open to hearing how I am wrong.
- causalscience 12 hours ago[dead]
- lm28469 12 hours ago[flagged][-]
- ahmeneeroe-v2 12 hours agode Gaulle would be considered insanely far right today. Many aspects of Bush (assuming GW here) would be considered not in line with America's far-right today.
Assume good intent. It helps you see the actually interesting point being made.
[-]- rkomorn 12 hours agoThey wrote "Bush was right wing" (unless it was edited), so what's your point in saying "Many aspects of Bush (assuming GW here) would be considered not in line with America's far-right today." ?[-]
- ahmeneeroe-v2 12 hours agoNope no stealth edit, my bad.
My point still stands, "politics change and assessments of politicians change accordingly".
Bill Clinton's crime bill would be considered far right today.
Ronald Reagan's amnesty bill would be considered far left today.
[-]- southerntofu 11 hours agoEven at the time Bill Clinton was already very much right-wing. When he was in power, he oversaw the destruction of public services and the introduction of neoliberalism. Is that not right-wing?
It's not just me saying this. Ask anyone who was politically active (as a leftist) in the 90s. I'm not sure what was the equivalent of the Democratic Socialists of America (center-left) at that time, but i'm sure there was an equivalent and Bill Clinton was much more right-wing. That's without mentioning actual left-wing parties (like communists, anarchists, black panthers etc).
[-]- 5upplied_demand 8 hours ago> Even at the time Bill Clinton was already very much right-wing.
He raised taxes, lowered military spending, and pursued universal healthcare. Those are not, and have never been, right-wing stances in the US.
- ahmeneeroe-v2 10 hours ago>Is that not right-wing?
I don't think many self-described "right-leaning" people would have called Clinton "right wing" in the 90s.
I 100% see your point and agree with you that he had major policies that I would call right wing today.
- southerntofu 11 hours ago> de Gaulle would be considered insanely far right today
As much as it pains me to say this, because i myself consider de Gaulle to be a fascist in many regards, that's far from a majority opinion (disclaimer: i'm an anarchist).
I think de Gaulle was a classic right-wing authoritarian ruler. He had to take some social measures (which some may view as left-wing) because the workers at the end of WWII were very organized and had tens of thousands of rifles, so such was the price of social peace.
He was right-wing because he was rather conservative, for private property/entrepreneurship and strongly anti-communist. Still, he had strong national planning for the economy, much State support for private industry (Elf, Areva, etc) and strong policing on the streets (see also, Service d'Action Civique for de Gaulle's fascist militias with long ties with historical nazism and secret services).
That being said, de Gaulle to my knowledge was not really known for racist fear-mongering or hate speech. The genocides he took part in (eg. against Algerian people) were very quiet and the official story line was that there was no story. That's in comparison with far-right people who already at the time, and still today, build an image of the ENEMY towards whom all hate and violence is necessary. See also Umberto Eco's Ur-fascism for characteristics of fascist regimes.
In that sense, and it really pains me to write this, but de Gaulle was much less far-right than today's Parti Socialiste, pretending to be left wing despite ruling with right-wing anti-social measures and inciting hatred towards french muslims and binationals.
[-]- constantius 10 hours agoWhile de Gaulle being far-right is not a majority opinion (except in some marginal circles), he would undoubtedly be considered far-right if he was governing today, which is what GP seems to have meant.
I think that, for most Western people today, far-right == bad to non-white people, independent of intention (as you demonstrated with your remark about the PS), so de Gaulle's approach to Algeria, whether he's loud about it or not, would qualify him as far-right already.
All this to say, the debate is based on differing definitions of far-right (for example you conflate fascism and far-right and use Eco, while GP and I seem to think it's about extremely authoritarian + capitalist), and has started from an ignorant comment by an idiot who considers Bush (someone who is responsible for the death of around a million Iraqis, the creation of actual torture camps, large-scale surveillance, etc.) not far-right because, I assume, he didn't say anything mean about African-Americans.
- throwaway132448 12 hours agoBad assumptions are just another form of stupidity.
- lm28469 9 hours agoNo one can assume good intent with such a question; at best it's bait.
But then again, people on this very forum will argue Sanders is a literal communist, so we circle back to the sub-70-IQ problem
- 762236 12 hours agoIt used to be a principle of the left to believe in free speech. Now that is called right wing.[-]
- JohnTHaller 11 hours agoMAGA talks about free speech but doesn't believe in or practice it.
- southerntofu 12 hours agoBelieving in free speech is neither left nor right; it's on the freedom/authority axis, which is perpendicular. Most people on the left never advocated legalizing libel, defamation, or racist campaigns, although the minority that did still does today.
The "free-speechism" of the past you mention was about speaking truth to power, and this movement still exists on the left today, see for example support for Julian Assange, arrested journalists in France or Turkey, or outright murdered in Palestine.
When Elon Musk took over Twitter and promised free speech, he very soon banned accounts he disagreed with, especially leftists. Why free speech may increasingly be perceived as right wing is that, despite engaging in outright criminal speech with criminal consequences (such as inciting violence against harmless individuals like Mark Bray), billionaires have weaponized propaganda on a scale never seen before through their ownership of the major media outlets and social media platforms, arguing it's all a matter of free speech.
- throwaway132448 12 hours agoThere’s no such thing as free speech and there never has been. To believe there is, is to fundamentally fail to understand what a society even is.
- pu_pe 18 hours agoI suppose those are SpaceX offices now that they merged.[-]
- omnimus 18 hours agoSo France is raiding offices of US military contractor?[-]
- mkjs 18 hours agoHow is that relevant? Are you implying that being a US military contractor should make you immune to the laws of other countries that you operate in?
The onus is on the contractor to make sure any classified information is kept securely. If by raiding an office in France a bunch of US military secrets are found, it would suggest the company is not fit to have those kind of contracts.
- hermanzegerman 16 hours agoI know it's hard for you to grasp, but in France, French law and jurisdiction apply, not those of the United States.
- fanatic2pope 16 hours agoEven if it is, being affiliated with the US military doesn't make you immune to local laws.
https://www.the-independent.com/news/world/americas/crime/us...
- hereme888 6 hours agoThat's one way to steal the intellectual property and trade secrets of an AI company more successful than any French LLM. And maybe accidentally leak confidential info.
- mhh__ 2 hours agoI think the Grok incident(s) were distasteful, but I can't honestly think of a reason to ban Grok and not any other AI product, or even Photoshop.
I barely use it these days and think adding it to Twitter is pretty meh, but I view this as regulators exploiting an open goal to attack the infrastructure itself rather than Grok. E.g. the prune-juice-drinking sandal wearers in Britain (many of whom are now government backbenchers) have absolutely despised Twitter and wanted to ban it ever since their team lost control. Similar vibe across the rest of Europe.
They have (astutely, if they realise it at least) found one of the last vaguely open/mainstream spaces for dissenting thought and are thus almost certainly plotting to shut it down. Reddit is completely captured. The right is surging dialectically at the moment, but it is genuinely reliant on Twitter. The centre-left is basically dead, so it doesn't get the same value from Bluesky / their parts of Twitter.
- vessenes 19 hours agoInteresting. This is basically the second enforcement action on speech / images that France has taken - the first was Pavel Durov @ Telegram. He eventually made changes to Telegram's moderation infrastructure and I think was allowed to leave France sometime last year.
I don't love heavy-handed enforcement on speech issues, but I do really like a heterogeneous cultural situation, so I think it's interesting and probably to the overall good to have a country pushing on these matters very hard, just as a matter of keeping a diverse set of global standards, something that adds cultural resilience for humanity.
LinkedIn is not a replacement for Twitter, though. I'm curious if they'll come back post-settlement.
[-]- tokai 18 hours agoIn what world is generating CSAM a speech issue? It's really doing a disservice to actual free speech issues to frame it as such.[-]
- direwolf20 7 hours agoif pictures are speech, then either CSAM is speech, or you have to justify an exception to the general rule.
CSAM is banned speech.
- logicchains 18 hours agoThe point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI or human generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration. That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.[-]
- cwillu 17 hours agoIf libeling real people is a harm to those people, then altering photos of real children is certainly also a harm to those children.[-]
- whamlastxmas 15 hours agoI'm strongly against CSAM, but I will say this analogy doesn't quite hold (though the values behind it do)
Libel must be an assertion that is not true. Photoshopping or AI-ing someone isn't an assertion of something untrue. It's more the equivalent of saying "What if this is true?", which is perfectly legal
- cwillu 14 hours ago“298 (1) A defamatory libel is matter published, without lawful justification or excuse, that is likely to injure the reputation of any person by exposing him to hatred, contempt or ridicule, or that is designed to insult the person of or concerning whom it is published.
Marginal note: Mode of expression
(2) A defamatory libel may be expressed directly or by insinuation or irony (a) in words legibly marked on any substance; or (b) by any object signifying a defamatory libel otherwise than by words.”
It doesn't have to be an assertion, or even a written statement.
[-]- 93po 13 hours agoYou're quoting Canadian law.
In the US it varies by state but generally requires:
* A false statement of fact (not opinion, hyperbole, or pure insinuation without a provably false factual core)
* Publication to a third party
* Fault
* Harm to reputation
----
In the US it is required that it is written (or in a fixed form). If it's not written (fixed), it's slander, not libel.
[-]- cwillu 12 hours agoThe relevant jurisdiction isn't the US either.
- chrisjj 14 hours ago> The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI or human generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration.
Quite.
> That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.
Really? By what US definition of CSAM?
https://rainn.org/get-the-facts-about-csam-child-sexual-abus...
"Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. "
- tokai 18 hours agoThat's not what we are discussing here. Even less so when a lot of the material here consists of edits of real pictures.
- duckbilled2 15 hours ago[dead]
- StopDisinfo910 18 hours agoVery different charges however.
Durov was held on suspicion Telegram was willingly failing to moderate its platform and allowed drug trafficking and other illegal activities to take place.
X has allegedly illegally sent data to the US in violation of GDPR and contributed to child porn distribution.
Note that both are directly related to violations of data protection law or association with separate criminal activities; neither is about speech.
[-]- vessenes 15 hours agoI like your username, by the way.
CSAM was the lead in the 2024 news headlines in the French prosecution of Telegram also. I didn't follow the case enough to know where they went, or what the judge thought was credible.
From a US mindset, I'd say that generation of communication, including images, would fall under speech. But then we classify it very broadly here. Arranging drug deals on a messaging app definitely falls under the concept of speech in the US as well. Heck, I've been told by FBI agents that they believe assassination markets are legal in the US - protected speech.
Obviously, assassinations themselves, not so much.
[-]- direwolf20 7 hours agoIn some shady corners of the internet I still see advertisements for child porn through Telegram, so they must be doing a shit job at it
- f30e3dfed1c9 4 hours ago"I've been told by FBI agents that they believe assassination markets are legal in the US - protected speech."
I don't believe you. Not sure what you mean by "assassination markets" exactly, but "Solicitation to commit a crime of violence" and "Conspiracy to murder" are definitely crimes.
- StopDisinfo910 14 hours agoThe issue is still not really speech.
Durov wasn't arrested because of things he said or things that were said on his platform; he was arrested because he refused to cooperate with criminal investigations into activities he allegedly knew were happening on a platform he manages.
If you own a bar, you know people are dealing drugs in the backroom and you refuse to assist the police, you are guilty of aiding and abetting. Well, it's the same for Durov except he apparently also helped them process the money.
- logicchains 18 hours ago>but I do really like a heterogenous cultural situation, so I think it's interesting and probably to the overall good to have a country pushing on these matters very hard
Censorship increases homogeneity, because it reduces the amount of ideas and opinions that are allowed to be expressed. The only resilience that comes from restricting people's speech is resilience of the people in power.
[-]- vessenes 15 hours agoYou were downvoted -- a theme in this thread -- but I like what you're saying. I disagree, though, on a global scale. By resilience, I mean to reference something like a monoculture plantation vs a jungle. The monoculture plantation is vulnerable to anything that figures out how to attack it. In a jungle, a single plant or set might be vulnerable, but something that can attack all the plants is much harder to come by.
Humanity itself is trending more toward monoculture socially; I like a lot of things (and hate some) about the cultural trend. But what I like isn't very important, because I might be totally wrong in my likes; if only my likes dominated, the world would be a much less resilient place -- vulnerable to the weaknesses of whatever it is I like.
So, again, I propose for the race as a whole, broad cultural diversity is really critical, and worth protecting. Even if we really hate some of the forms it takes.
[-]- direwolf20 7 hours agoThey were downvoted for completely misunderstanding the comment they replied to.
- moolcool 16 hours agoI really don't see reasonable enforcement of CSAM laws as a restriction on "diversity of thought".
- AureliusMA 16 hours agoThis is precisely the point of the comment you are replying to: a balance has to be found and enforced.
- derrida 18 hours agoI wouldn't equate the two.
There's someone who was being held responsible for what was in encrypted chats.
Then there's someone who published depictions of sexual abuse and minors.
Worlds apart.
[-]- direwolf20 7 hours agoTelegram isn't end-to-end encrypted by default. For all the marketing about security, it has essentially none beyond transport encryption to Telegram's servers, plus an optional "secret chat" feature that you have to explicitly select, that only works between 2 participants, and that doesn't work very well.
Telegram can read all regular messages, so they don't have an excuse for not helping in a criminal case. Their platform had a reputation for being safe for crime, which is because they just... ignored the police. Until Durov got arrested for that. They still turn a blind eye, but not to the police.
[-]- derrida 3 hours agook, thank you! I did not know that, I'm ashamed to admit! It's a bit like studying physics at university and then, a decade later, forgetting V=IR when I actually needed it for a solar install. I took a "technical hiatus" of about 5 years and am only recently coming back.
Anyway, to cut to the chase: I just checked out Matthew Green's post on the subject. He is on my default "trust what he says about cryptography" list, along with others like djb and Nadia Heninger.
Embarrassed to say I did not realise; I should have known! 10+ years ago I used to lurk the IRC dev channels of every relevant cypherpunk project, including TextSecure and otr-chat. I watched Signal being made, and before that witnessed chats with the devs and Ian Goldberg and so on. I just assumed Telegram was multiparty OTR.
OOPS!
Long-winded post because this is embarrassing. I studied cryptography as a mathematics undergrad in 2009, did a postgrad wargames and computer security course in 2010, and, worse, around 2012-2013 my word on these matters was taken by activists, journalists, and researchers with pretty gnarly threat models (for instance some Guardian stories, and a former researcher into torture). I'm also the person who wrote the bits of "how to hold a crypto party" that made it a protocol without an organisation and made clear the threat model was that anyone could be there. Oops, oops, oops.
Yes, thanks for letting me know. I hang my head in shame for missing that one, or somehow believing it without much investigation. Thankfully it was just my own personal use, to contact the odd friend in the States who isn't already on Signal etc.
EVERYONE: DON'T TRUST TELEGRAM AS END TO END ENCRYPTED CHAT https://blog.cryptographyengineering.com/2024/08/25/telegram...
Anyway, as they say, "use it or lose it": my assumptions here are no longer valid, and I shouldn't be considered to have an educated opinion on this if I got something that basic wrong.
- cbeach 18 hours ago[flagged][-]
- techblueberry 18 hours agoIn November 2012, Epstein sent Musk an email asking “how many people will you be for the heli to island”.
“Probably just Talulah and me. What day/night will be the wildest party on your island?” Musk replied, in an apparent reference to his former wife Talulah Riley.
https://www.theguardian.com/technology/2026/jan/30/elon-musk...
I think there's just as much evidence Clinton did as Musk. Gates on the other hand.
[-]- antonymoose 18 hours agoTo my knowledge Musk asked to go but never actually went. Clinton, however, went a dozen or so times with Epstein on his private jet?
Has the latest release changed that narrative?
[-]- lawn 17 hours agoMusk did ask to go after Epstein was sentenced.[-]
- antonymoose 8 hours agoI hate to be the “source” guy but can I get one?[-]
- direwolf20 7 hours agoThe Epstein files
- whamlastxmas 15 hours agoAdditionally, Clinton is listed several times in the Lolita Express flight logs; Elon never is.
Elon didn't ask to go; he was invited multiple times
[-]- direwolf20 7 hours agoIf Elon never asked to go, why do the Epstein files have an email from Elon to Jeff where Elon asks to go? Was it fabricated?
- rsynnott 17 hours ago... Eh? This isn't about Musk's association with Epstein, it's about his CSAM generating magic robot (and also some other alleged dodgy practices around the GDPR etc).