Telegram repeatedly refuses to join child protection schemes (bbc.com)
Here's a cartoon I found years ago that says it better than a thousand words. I've posted it half a dozen times, and it keeps being spot on.
https://starecat.com/content/wp-content/uploads/control-of-i...
I guess when you label your mendacious, snooping, encryption-breaking, backdoor sneak schemes created for the sake of easier mass surveillance as "child protection measures", moral alchemy turns them into wholesome good programs that only monsters would object to.
Yeah, it's all true, except 99% of Telegram is not encrypted by design and the majority of content is publicly accessible to anyone with an account.
I would agree with you if Telegram actually had E2EE like Signal. But it doesn't. No encryption breaking is required to moderate public content.
So wider encryption should be a prerequisite for resisting mass surveillance pushed in the name of "protecting children"? It's amazing how far so many people have gone toward normalizing the idea that the state, or some supranational organization in the case of the EU, deserves the right to monitor and access reams of private communications and media at its whim.
Bad and criminal behavior has always existed. I see no evidence that it has been made any rarer or less frequent by giving massively powerful, legally empowered organizations the right to monitor whatever they like at their self-righteously couched discretion.
When Stalin did it, it was good. When the "western" "democracies" do it, it is "to protect the children".
The state shouldn't be allowed to censor chat services no matter how much encryption is used. E2EE might make the censorship less practical but it doesn't make it OK.
Note that only US companies are required by (US, I guess) law to join these programs.
Also note this part:
> IWF said that the company did remove CSAM once material was confirmed but said it was slower and less responsive to day-to-day requests.
So in the end Telegram removed the content.
I think it would be better if Telegram used the hash lists; however, I think they should use manual review and not remove content automatically, because this is a US platform that could theoretically be misused to remove legal content that the US government doesn't like.
Hash lists aren't that hard to defeat. They'll stop the amateurs endlessly tossing stuff around, but the real problem, the creators, will know to keep changing things up a little bit.
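To see the brittleness the parent comment is pointing at, here's a minimal sketch (plain Python, with hypothetical file bytes, not any real matching system): a cryptographic hash changes completely when even one bit of the file changes, so an exact-match hash list only catches byte-identical re-uploads. Real matching schemes therefore typically rely on perceptual hashes (PhotoDNA-style fingerprints that survive small edits), though larger alterations can still defeat those too.

```python
import hashlib

# Hypothetical stand-in for a file's bytes (not real data).
original = b"example file payload " * 64

# Flip a single bit, roughly what any trivial re-encode or metadata tweak does.
modified = bytearray(original)
modified[0] ^= 0x01

# The two digests share nothing recognizable, so a list of known hashes
# no longer matches the altered copy.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(modified)).hexdigest())
```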
And the capability to remove anything means they have to respond to secret orders from the government to remove something.
The easiest thing is to say that you are protecting children. End-to-end encryption is the technology which ensures that when your wife sends you the shopping list on WhatsApp, the comrade policeman cannot eavesdrop and see that you ran out of toilet paper and liquid soap. But they must see! What if you accidentally dropped some pedo porn onto that list? It's for the children's protection!
So with this attack on Telegram's encryption, it's definitely not that the EU wants to see what political opponents are doing or who's organizing which protest so it can undermine it before it happens. We're just hunting pedophiles, what's your problem?
From the French government's perspective, Telegram is a worldwide web of underground tunnels that are inaccessible to the government. And the government, being a paranoid control freak, gets really upset when you're hiding something from it.
> Another norm that Telegram does not conform to in the usual way
Seems beyond a "norm" if your CEO is jailed for not "conforming"
"He was jailed for not following norms and we know he was not following norms because he was jailed" is maybe slightly circular
Good thing they have judges in France, no?
But let's face it, many on HN deny the norm and couldn't care less if Telegram is used for criminal content. It's undeniable that the app has a certain reputation.
I’ll take this bait.
For those who have built or participated in building large-scale social networks (never mind global scale), you learn the Tim Ferriss rule very quickly: 1 in a million is a common occurrence.
As soon as you have a social network, you'll experience a massive industry, trained over decades and with plenty of financial backers, farming it for victims.
Government bodies are hand-waving in the same way companies are: moderation is a difficult, unsolved problem, and if you solved it you'd have a new set of difficult problems (opposing sides feeling they are censored more than others). No one has a solution to this, which is why regulations amount to demanding you "do enough to prevent the problem." It can't be done.
What I've seen expressed on HN has been the central premise of information systems since the beginning: if you require your information systems to be crime & abuse free, you will not have an information system anyone can use.
My personal stance is that an issue does not become a moral one until there is an actual solution on the table or the will to fund its development for the public good.
Where are the EU grants to solve this problem?
It seems fairly clear that the EU has invented a revenue engine that collects rent from tech to relieve the pressure of its own unpopular cost burden on its member countries. It's smart, probably inevitable, and a reality for tech companies to contend with for the foreseeable future.
> if you require your information systems to be crime & abuse free, you will not have an information system anyone can use.
No one does and that’s not why Durov was arrested. It’s fine if there is crime on your system providing you are ready to work with law enforcement so that the space can be policed to the best of your ability.
If you don’t, you are basically voluntarily harming society and will be prosecuted. I’m personally fine with that.
I find it very hard to take the discussion about Telegram here seriously anyway because I just opened Telegram right now to check and sure enough the third contact in people nearby right now is called “Weed, Coke, Viagra - Buy now”. At some point, if you don’t see the issue, I think you might be intentionally blind.
> I just opened Telegram right now to check and sure enough the third contact in people nearby right now is called “Weed, Coke, Viagra - Buy now”. At some point, if you don’t see the issue, I think you might be intentionally blind.
Or we just see these discussions as something that shouldn't be criminalized. I'm happy that those who wish to buy drugs can do so on a safer platform than the street corner.
Well then lobby for drugs to be legalised because arguing for free expression when you are actually sad that your drug dealer got busted is completely hypocritical.
Which part is hypocritical?
I do support drug legalization and decriminalization of all "victimless" crimes.
But I also realize that libertarian policies are unlikely to be adopted in today's political climate, so I also support technology that gets the government out of personal lives, regardless of the law.
>No one does and that’s not why Durov was arrested. It’s fine if there is crime on your system providing you are ready to work with law enforcement so that the space can be policed to the best of your ability. If you don’t, you are basically voluntarily harming society and will be prosecuted. I’m personally fine with that.
What you say is absurd because you're assuming that what the government requests in order to count as "ready to work with law enforcement" is actually reasonable, fair, and respectful of rights that most of Western society ostensibly takes seriously. You assume that the measures proposed under the guise of protecting some vulnerable group won't be used for other, much more self-serving things that also very much harm society.
Both of these assumptions are visibly false in so many such state requests that you can't just be "fine" with that unless you flat out don't give a shit about people's fundamental rights against a powerful state.
It seems he was jailed for being in France, really.
Backdooring a social media platform to undermine encryption "in the name of children"... It's for child safety alone. Sure. ::insert eye roll::
Telegram is not really encrypted. No one blames Signal for what you can send on it.
It's Telegram's groups and channels that make it popular.
I'm not sure if Signal has that feature?
I think a more valid comparison is WhatsApp, where Zuckerberg allows many governments the ability to "peek" into user metadata at will.
Hot take: the way to end CSAM (child sexual abuse material), aka CP (child pornography), for those who don't know the acronyms: it is to legalize (((AI generated))) CSAM. No child harmed. As we see the internet is already full of AI slop, there can be no question that an infinite amount of CSAM slop may be generated. The reason why this is good, is because anyone looking to profit or market REAL such material, now there is no longer a market to sell it, nor is there a market for giving it for free to get some sort of fucked up pedo kudos. For people who are actual victims and their data shared, their data is but a drop in a vast ocean.
What do the three parentheses mean? I've seen two in bash, but rarely more than one in regular online chat.