UK Home Secretary says encryption on messaging services is unacceptable (reuters.com)
We're a very long way from being a totalitarian state and likely to remain so for quite some time, but make no mistake, this is the thin end of a very long and ultimately very fat wedge. It therefore behooves us to hold the government to account when they try to get us to swallow more of that wedge.
Sure, encryption helps terrorists as well as ordinary citizens but it's my belief that freedom and privacy are more important than that. The work of police and security services has never been easy in a free society, but protecting and upholding that free society is the very essence of the job. Dilution of that freedom is therefore counter to the purpose for which these agencies exist, and so when the government tries to move in that direction we, as citizens, should voice our resistance, and keep voicing it until they understand.
Every time a person in a position of power calls for the intentional weakening of cryptographic systems - be it via backdoors, limits on key length or whatever - I long for a gutsy interviewer to ask them - preferably live - whether they advocate that position out of ignorance or malice.
There really aren't many other alternatives.
I would rather interviewers asked if that means that Jo Public should also be allowed the tools to read the politicians' private messages.
I'm all for a Utopian society where nobody needs to encrypt private messages, but so long as there are people in power who feel they need special treatment, I will continue to demand the same level of privacy as them.
I believe they do it because they've been elected to do so.
It takes too much to reply: "Tell you what, I'm going to educate you better instead." Because you can educate all you want, but you will not get results that help your re-election 4-5 years down the road.
No properly-informed voter would want this kind of legislation.
"No true Scotsman..."
I have a feeling that some top level FBI people are properly informed, but still want this kind of legislation.
Simply because it wouldn't apply to them.
More likely they would say they want to do it because it would be "helping the police and security services to do their job" and be "protecting the people".
For sure, but wouldn't it be nice if the interviewer at least challenged the assumption that such measures do protect people overall? Leaving aside the question of whether there would be any benefit in the case of attacks like the one we saw last week, which is debatable in itself, there's also the question of whether the reduced security would do more harm than good.
The interviewer could explore the impact of online fraud or identity theft or cyber-bullying, and look at how fast these problems have been growing in recent years. Then challenge the advocate of weakened encryption or mass surveillance over why they want to require security vulnerabilities or create huge databases that will make great targets for criminals. If they claim everything would be securely held and strictly for police use or similar, go with Snowden and Wikileaks.
If the advocate brings up their other favourite argument about protecting children, the interviewer could ask whether it's really a good idea to make it easier to intercept private picture messages between teenagers.
They could ask why the government wants measures that would inevitably undermine investigative journalism that holds the government and the police and the security services to account. Then start listing past controversies relating to the behaviour of those groups to demonstrate why that public interest reporting matters.
It's not as if privacy and security advocates only think these things are important because they don't like the government or something. There are real, serious consequences in play several different ways here.
The media can't really challenge the government any more, unless the government has done something truly unpopular. If they do challenge on something which doesn't have strong public opinion then they [the media] risk losing interview and access rights to the politicians.
On the other hand, if you're just being a free mouthpiece for whatever politician is on your show anyway because you're not allowed to actually interview them, you might as well tell them to take a hike and present real news and analysis instead, or go for satire. Heck, if the alternative was just letting politicians repeat whatever they want unchallenged, you could do a whole show presenting "alternative facts" and you still wouldn't be any worse off.
We're a very long way from being a totalitarian state and likely to remain so for quite some time
The scary thing is that the difference between the UK and the kind of place we might describe using words like "totalitarian state" is now more about how our laws are used in practice than what the laws actually say. The government and its agents already have very broad powers, our courts have already taken surprisingly illiberal positions when some of those powers have been challenged, and we lack the constitutional checks and balances often found elsewhere, more so if the government uses Brexit as a mechanism to remove those deriving from Europe without replacing them. We're basically just trusting that the government and its agents will be decent people and use the powers they have responsibly at this point, but as we've seen with the likes of Trump, that's a dangerous strategy when you don't know who the government will be in the future.
I agree with this. The last thing we need is more laws and new powers for the surveillance state.
For example, Australia's Lindt Cafe siege - the guy was already under "24 hour surveillance" by ASIO (the Australian Security Intelligence Organisation) - which did nothing to prevent the attack. Despite this, AFAIK there was not much blame placed on ASIO. I'm sure there are many other examples. I'm not saying it's an easy problem to solve, just that more surveillance is probably not the answer.
Sure, the attacker is the real culprit, but adding more laws and surveillance will not prevent crazies from doing crazy stuff.
Man Monis was not under surveillance by ASIO. He was considered a "serial pest" and "not a threat".
Sorry, yes I was mistaken.
> Dr Barrett placed Monis on anti-psychotic medication after he told her he was under surveillance 24 hours a day from ASIO and Iranian authorities, including in the bathroom of his home. [1]
> Mr Abbott conceded Man Haron Monis was not on a security watchlist, despite his long criminal history and known "infatuation with extremism". [2]
Still, the whole thing demonstrates how useless more surveillance would be.
If they can't even watch someone with a long criminal history and infatuation with extremism, why do they expect us to believe that looking at my dick pics will somehow stop terrorism?
[1] http://www.smh.com.au/nsw/lindt-cafe-siege-inquest-man-haron...
[2] http://www.abc.net.au/news/2014-12-17/sydney-siege3a-could-i...
At some point, some politician is going to have to man up and speak the truth. The public will have to either accept that the government will invade every aspect of our lives in an attempt to protect us from low-probability events, or accept the idea that there is a sub-1% chance of being killed by a crazy person.
Corbyn has already spoken up:
Labour leader Jeremy Corbyn said authorities already had "huge powers". There had to be a balance between the "right to know" and "the right to privacy", he said. [1]
Unfortunately when it came to actually doing something he provided practically no opposition to the Investigatory Powers Act.
That's how it's always been. Northern Irish terrorism was used to justify many restrictions of civil liberties in the UK, and while opposition figures made disapproving noises nothing actually changed and their opposition was limited to expressing their opinions while sitting in comfy chairs.
It's one reason I left the UK in the first place; not because of facing personal discrimination for being Irish (I did face some but most British people are pragmatic and fair-minded), but because of the whole securitized atmosphere with security cameras everywhere - it was like being in prison.
However, I have little hope of this changing. As far as I can tell the vast majority of people value security over privacy or autonomy, and of course they never think Bad Things are going to happen to them because they're Good People - like the woman in the news the other day who voted for Trump and is now surprised that her husband is being deported even though he's not a 'bad hombre.'
It's unlikely that there will ever be a mass movement for privacy and autonomy, because the genius and failure of democracy is that it's harder to blame your problems on some antipathetic Other - foreign invaders, aristocrats, an elite social class or whatever. Democracy really depends on people thinking about issues, and most people want to be firmly embedded in a social context, perhaps because we're a eusocial species. It's hard for them to conceptualize an oppressive state in the same way that it's hard for most people to imagine hating their parents or the people in their community.
Happily, as far as digital technology is concerned there is not a whole lot the UK government can actually do about the issue, and May's speechifying is more directed at appeasing the drooling tabloid-reading class than it is reflective of any serious policy initiative. The likely effect is that operating systems on computing devices sold in the UK won't be allowed to have built-in encryption and the smart set will be using mods of some sort. Pretty much how it was with PGP 25 years ago.
I suspect that will prove to be Corbyn's "tuition fees" moment. If he's still there by the next general election, the Lib Dems will surely attack him for being all talk and no action.
Lib Dems are experts at all talk and no action. Oh irony.
Unsubstantive partisan comments are a kind we really need fewer of, so please don't post them here.
> sub 1% chance of being killed by a crazy person.
Please don't attribute homicidal political views to insanity. These people don't have schizophrenia, and people with schizophrenia aren't terrorists.
There is also a wide spectrum of mental illnesses that aren't schizophrenia that may make people act in these ways. While OP was being flippant, I don't think the essence of what they were saying is out of line: doing something extreme and causing a lot of harm and damage could justifiably be called crazy, even if it's not medically accurate, without being a slur on those with a diagnosed mental illness.
Terrorists are politically motivated murderers. So are soldiers in any army.
The only difference is in who they answer to.
You just called a lot of really honorable people murderers.
Murder is unjust killing. Killing civilians to make a point or to cause chaos is, by definition, murder. Let's not equate that with all the legitimate things armed forces are for.
Sorry, many soldiers are murderers. There's clear intention to kill. And justification does not happen simply because some idiot in power tells someone to go kill other people for political reasons.
And the coalition is also killing civilians for political reasons, more than a thousand of them a month at the moment. It is obviously calculated. It's just that someone determined that murdering perhaps 5-10 thousand civilians is an acceptable price for giving power over Mosul back to the Iraqi government, and gave it the go-ahead.
Murder is, by definition, an unlawful killing.
And the definition as it exists in a given body of law is not an absolute definition nor how people typically use the term.
The Catholic Church is a good example. The fifth commandment is thou shall not murder. And the Catechism clarifies this by saying
> The fifth commandment forbids direct and intentional killing as gravely sinful. The murderer and those who cooperate voluntarily in murder commit a sin that cries out to heaven for vengeance.
Source: http://www.vatican.va/archive/ccc_css/archive/catechism/p3s2...
There's no single law. Is there a law in Somalia that allows the US to kill arbitrary people there, unilaterally? Is there such a law in Yemen? Syria? In Syria, if you follow the letter of the law and the intention of the (idiot) leader there, who is against US involvement, it would be murder. What law was there in Vietnam that somehow justified the killing?
In the international arena, there's no universal law.
Some idiots killed a few thousand Americans, and then some Americans went on a killing rampage as revenge, murdering tens of thousands of mostly unrelated people. That's about how it looks from the outside.
> You just called a lot of really honorable people murderers.
Pretty much every single person involved in, say, the Iraq war was indeed a murderer. Most of the battles the West has engaged in since WWII were unjust and ill-founded. The people involved in those wars are murderers.
'Crazy' refers to more than schizophrenia... nobody uses that word to refer to an exclusive group anymore
There's also a grey area where someone does something horrible and clearly politically motivated (shooting Reagan or Senator Giffords, shooting at Ft. Hood) and that person has mental health issues.
There's a semantic discussion about whether "lone gunmen" type attacks count as "terrorism", but categorizing motivations in real life as politics or insanity isn't so simple.
A long way? Real democracies do not spy on their people, because it isn't a democracy anymore when you do. Whether that was always just a... well, it is another matter. Business had a reasonable expectation of safe passage. Who will want their traffic coming through the UK now? What multi-national/trans-national wants all their IP to belong to the UK government? They will just go elsewhere. It's the economy, stupid. Rudd is using the recent murders for her own twisted ends.
On the other hand: China.
China just reinforces the parent comment. How many online services do you use from China?
China is doing just fine, economically speaking. When they first implemented the Great Firewall, a lot of people predicted it wouldn't last under economic pressure. 20 years later, and it's still going strong. That was my point.
The lack of adoption of Chinese services elsewhere has more to do with cultural and technological issues than anything related to censorship.
It doesn't seem very thin from here. Encrypted messaging is a mathematical fact that your government is attempting to deny.
If I need a communication to be secret, I will encrypt it, and I don't need special software.
Not all people who need it can write their own encryption.
You don't have to, because it is already written and fairly easy to use. You don't have to be a rocket scientist to encrypt a message using asymmetric cryptography that will be difficult, if not impossible, to break. People who want to conceal the contents of their messages could still send them encrypted via WhatsApp or another platform regardless of whether there is a backdoor or not. The UK Home Secretary has just displayed an unprecedented level of incompetence.
> People who want to conceal the contents of their messages could still send them encrypted via WhatsApp or another platform
Even if possession of encrypted messages without the ability to decrypt carries a 10-year prison sentence?
(Yes, this has serious enforceability problems, but that doesn't mean it can't become law)
How will you prove what I possess is encrypted data and not random bits?
No, how will you prove that it's random data and not encrypted data? What if you're ordered to provide a password to decrypt it or go to jail?
https://arstechnica.co.uk/tech-policy/2017/02/justice-naps-m...
Deniable encryption
https://link.springer.com/chapter/10.1007%2F978-3-540-77566-...
It's up to the prosecution to prove their case against me, not vice versa. (At least in the USA.)
If you have a suicide-bomber, I doubt they will care much about the 10 year prison sentence.
The "Terrorists" really don't care about the law or the prison sentence behind breaking it.
And you have just displayed a high level of social incompetence. How this will manifest in practice is not the UK government attempting to outlaw mathematics, but UK judges being instructed to accept the fact of encryption usage as circumstantial evidence of criminal liability in prosecutions. That is to say, if all your communications are encrypted, that can be treated as probable cause for arrest and detention, or could be enough to swing the decision in an otherwise undecidable court case.
Law is social code and it runs not on logic but on the belief of a sufficient majority of the public. If technological factors cannot be overcome, social ones can. You are very naive to think that a governmental entity has to care about logic with regard to individual humans, just as ants would be naive to think they could dissuade you from stepping on them when you walk through the garden. The fact that encryption is technically possible under almost any circumstances (even in prison you could conceivably exchange encrypted messages tapped out in morse code through the walls, say) doesn't matter, because the calculus of criminal responsibility doesn't depend on some objective process in the way that an encryption algorithm does.
Nerds are very logical, but people in general are not, and appealing to their sense of logic or consistency is dangerous because you cannot rely on them to change their behavior or attitude for cognitive reasons. Organized religion epitomizes this; people may or may not believe in the actuality of an inaccessible personal divinity, but a) the social rewards for professing to do so may vastly outweigh other considerations, and b) the people who do believe will abandon logic before they'll abandon a belief structure that makes them feel good about themselves.
The UK Home Secretary isn't applying for a job in network security or at a tech company. She's telling people what sort of trouble they're going to be in if they insist on deploying or using strong encryption. And since she's in charge of the police, she is capable of making good on those claims. She is perfectly competent - not at the consistent management of information systems, but at wielding political power.
Your idea of winning an argument is a logical demonstration that would be accepted by your peers. A social entity's idea of winning the argument (by social entity I mean an organized collective intelligence, from a village to a superstate) is to simply remove you physically from the field of play. Societies are coordinated in the same manner as insect swarms or other eusocial structures; they are no less distinct for being distributed, and logical arguments have no meaning to them except insofar as they impact the swarm's environment, which is not at all the same thing as the environment of the individual swarm members, even the most senior ones.
This is why a 'privacy first' app/platform/protocol will never succeed on those merits alone. The social body can always make arguments against privacy, for exactly the same reason that you don't care about the feelings of any cancerous or invasive cells that spring to life inside your physical body. What's needed are tools that are built to include privacy from the ground up, but whose use case is better speed and functionality, such that people cannot bear to go without them because they confer an overwhelming economic advantage.
Thus, fax machines were more 'private' than telex machines insofar as fax transmissions were harder to decrypt, plus they could just be plugged into any telephone socket. But if that had been their only advantage they'd have been banned. The overwhelming benefit of a fax machine was that you could just feed a sheet of paper into it - almost any kind of paper - and send it to someone else by pushing a single button. This was a massive time-saver for business - much cheaper and simpler than installing a Telex system, much cheaper and faster than sending documents around by courier, and much more practical than relying on verbal agreements and notes from telephone conversations.
(I'd like to make it clear that fax machines were never designed or marketed to be secure comms channels, but as a purely practical matter they filled that function for many people, and people who still use faxes often do so because they feel somehow more 'secure' than email.)
"John Oliver: Edward Snowden dick pic". That's how you explain it to the public.
I'm sure a six-year-old kid can learn how to use an OTP (https://en.wikipedia.org/wiki/One-time_pad) with friends.
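For the sake of illustration, the mechanics really are that simple: XOR each byte of the message with a byte of a truly random pad, never reuse any pad byte, and exchange the pad in person beforehand. A minimal sketch in Python; the hard part in practice is generating, exchanging and protecting enough genuinely random key material, not the code:

    import os

    def otp_encrypt(message: bytes, pad: bytes) -> bytes:
        # The pad must be truly random, at least as long as the message,
        # and must never be reused for another message.
        if len(pad) < len(message):
            raise ValueError("pad must be at least as long as the message")
        return bytes(m ^ k for m, k in zip(message, pad))

    # Decryption is the same XOR applied a second time.
    otp_decrypt = otp_encrypt

    pad = os.urandom(32)                      # exchanged in person beforehand
    ciphertext = otp_encrypt(b"meet at noon", pad)
    assert otp_decrypt(ciphertext, pad) == b"meet at noon"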
> The work of police and security services has never been easy in a free society,
I think the work of the security services has never been easier than it is now, thanks to the massive use of social networks and mobile phones, CCTV everywhere, GEOINT, etc. At least for the Five Eyes.
Just my opinion, but I think now they are having problems because they are getting too much information in, so separating the wheat from the chaff is difficult.
I'm not sure that's even true. This seems more like a clipper chip moment, and rather than being the beginning of a slippery slope, it made the government a laughing stock and probably sped up the pivot to encryption.
This too will fail spectacularly.
Even looking at the recent Westminster attack, is it realistic to think that monitoring of his WhatsApp messages would have prevented anything from happening? Was he already under surveillance? Could the attacker really have written anything so specific and unequivocal that it would have actually made police go to Westminster and stop his car? Police with limited resources can't just go investigating everyone who sends messages, so I doubt it would have helped anyway.
I hope you're right, but I don't think so.
The next time there is an attack, I don't think there will be an outcry that the all-pervasive surveillance has failed us, only an outcry against the terrorists (who, let's face it, are the real offenders).
I wrote a letter to Amber Rudd based in part on your comment: https://medium.com/@flukes1/my-letter-to-amber-rudd-on-encry...
Suppose we came upon a time where some new technology had emerged allowing anyone with average intelligence to create a virulent bioweapon from materials so common and available that their restriction is impossible. In this world, we are almost surely doomed, but perhaps the only hope for survival on earth (ignore space diaspora for the moment) would be instituting an all-pervasive surveillance presence. In this scenario, I do not see the value of privacy trumping survival.
One might counter that the scenario I lay out above is not possible. However, I would posit that technology enables both our capacity to create and preserve and our capacity to destroy. Perhaps stemming from the laws of thermodynamics, it does seem that our capacity to destroy is always outpacing our capacity to create or preserve, and eventually the gap between these capacities will unsettle the centre, which cannot hold.
> I do not see the value of privacy trumping survival.
Survival is not a value. Survival is a prerequisite for a lot of other values, but it's not a value in and of itself.
As many people have difficulty grasping what living in a world without privacy would be like, let me propose a different solution: We'll put everyone into solitary confinement, to ensure everyone's survival, as I do not see the value of freedom of movement trumping survival.
Would you agree with that as well? If not, why not?
Also, you might want to realize that surveillance does not ensure that your set of values gets enforced. It's the values of whoever manages to obtain that power, and whose power as a result of the surveillance might be impossible to challenge. The idea that you could create such a power structure and then guarantee that it's going to be used exclusively to prevent that bio weapon from being built and used is extremely naive. You would instead most likely find yourself alive, living in a world that makes you constantly wish for being dead, but thanks to the surveillance unable to kill yourself.
You presuppose many things about my intentions. I frequently post on the internet about the need for freedom and privacy today. However, as I briefly laid out, I do wonder whether technological innovations will at one point allow lone individuals to kill not a few dozen (with semiautomatics), but tens of thousands, and if at that point our current models will be insufficient for the purpose of survival...that is, no survival without giving up many freedoms. This is not a value judgement, indeed I would applaud the many who would rather die than be controlled completely.
That's a happy Sunday afternoon thought, but we are not at that point yet. The point we are at is one in which corrupt governments seek to control us not for our own safety, but because they crave power and control.
I agree. However, I frequently see absolute declarations about how freedom should not be curtailed for security, and while I believe that is true today, I can envision a possible future where that is not possible without leading to near or complete extinction. Therefore I find it a useful foil to shake up absolutisms.
If I can make a super weapon from air and water, how will extra surveillance stop a terrorist from doing that?
I'm not sure what point you're trying to make, are you saying we need more surveillance to protect us because weapons are becoming easier to produce?
Disagree should not equal downvote.
Obviously there are some really passionate freedom lovers here verging on fundamentalists, who will brook no alternative viewpoints even being hypothesized.
Amber Rudd is the UK's Home Secretary not just any minister.
"We need to make sure that our intelligence services have the ability to get into situations like encrypted Whatsapp."
She has said she is "calling in" technology companies this week to try to "deliver a solution".
Marr asks: if they refuse to do that, will you legislate to force them to change? She won't be drawn on that.
Interview is here:
http://www.bbc.co.uk/iplayer/episode/b08l62r7/the-andrew-mar... [from 45:18]
I understood that the UK IP Bill already means she has the ability to e.g. demand a backdoored version of WhatsApp be sent to a target device, but that's not covered in the interview.
https://www.theregister.co.uk/2016/11/30/investigatory_power...
The thing that concerns me about this perspective that everyday use of encryption is bad is that it makes no damn sense. Pandora's box is opened; if you force software solutions to backdoor their technology, someone will just step up who doesn't care about your laws. There are no global treaties on software development, and we aren't going to be signing any such thing any day soon. Even if we could force something like that down the throats of every country on Earth, the knowledge exists, and anyone can roll out their own solution with a high-school level understanding of the topic.
It's absurd to think this can be resolved through legislation or cajoling companies into cooperation. But what really bothers me about this whole issue is that we already have laws in place that handle this situation, at least in the USA. In the USA, if you refuse to hand over an encryption key (or can't) and are being compelled to by a court, you can and will be held in contempt of court, and possibly convicted of destruction of evidence. The only thing that forcing people to backdoor their crypto does is allow government entities to investigate people without having sufficient evidence to compel them to give up their keys, and destroy the marketability of large scale, centralized end-to-end encryption solutions.
I mean, you could make the argument that end-to-end encryption restricts the ability to wiretap people, sure, but a wiretap warrant should require a decent amount of evidence, and at that point, there are most likely other options.
People can simply send random binary data to keep authorities busy.
That would be somewhat entertaining, and reminds me of someone I read about years ago who was accidentally added to a watchlist, and no one was willing to remove him, so he started generating as much data as he could, just to keep them busy. I can't find a link, sadly.
I don't think it's a good solution, though. This is going to sound like an argument that you hear about privacy by people who don't understand why privacy matters, but I think on this issue, it holds a little more truth: To be honest, I think the people who are likely to care to do this are people who are advocates for privacy and are tech savvy, and people who have something to hide. I don't think it's terribly difficult, with the resources the global Intelligence Community has to build a profile and dump people into those two buckets with a fair degree of accuracy. Everyone who's not generating random garbage data would still be observable, so it wouldn't really change much outside of the group that's acting. But for those acting, suddenly there's a red flag that they can look for, and then when they see it, start building a profile to figure out if you're a tech person with an interest in privacy, or someone doing something they don't want seen, and act accordingly. In the mean time, we waste time generating garbage instead of just using good enough encryption, and making it so easy to use that people don't even have to know they're using it.
Now, if we could somehow implant babies with a chip that causes them to generate random noise (something babies are already pretty good at, mind you) from birth, that might be worth something. There's no profile building if the noise is just something that humans make by being alive.
The UK's best shot at surviving Brexit is to become stronger in value-added industries. They have a very good head start over any other EU country in IT and in some research areas.
Amber Rudd seems hell bent on destroying their only chance.
They all are intent on crippling any chance the UK has at becoming stronger in high tech industries.
I'm so sick of getting "this is an adult resource and you can't view it" anytime I search for information about a drug (pharmaceutical, not just "weed LSD and lols").
Great fucking way to encourage your future chemists. Maybe ban keywords like JavaScript, PHP and SQL while at it, them's the powerful drugs maaan.
> I'm so sick of getting "this is an adult resource and you can't view it" anytime I search for information
Switch ISP, or contact your current one to disable this. They don't all do it by default, or at all.
As far as I'm aware all new accounts in the U.K. do this unless you call to have it disabled (and give personal information). I mean, how else would you protect the children...
>> As far as I'm aware all new accounts in the U.K. do this unless you call to have it disabled (and give personal information).
Usually you can have it disabled while you sign up (and they usually ask if you want it). As for giving personal info - they're your ISP. They have your name, address, billing details, have done a credit check... what else are they going to ask for that you haven't already supplied?
I can buy a 'burner' SIM when landing at Heathrow without giving any information - paying in cash. However, to disable the content filter requires giving up personal information.
I think all mobile providers do, but the filters are not required on all ISPs, only the big 7 IIRC.
You can find MVNOs without filtering, such as Andrews & Arnold
I was going to suggest them as well. My boss used to use them back when I was in the UK ten years ago and now I would definitely pay the extra for what they provide. Unfortunately I am in Spain now and the choice of ISP's is far more limited.
Andrews & Arnold is unfortunately many times more expensive than the more "mainstream" ISPs. Much cheaper to get a Sky/Virgin connection, and then route all your traffic off their networks over a VPN to a cheap couple of quid VPS per month.
they aren't "many times more expensive"
compare BT[1] (£44.99 for 12 months, then £53.99) with AAISP[2] (£45)
once you add on "a couple of quid VPS a month", you could even go for the AAISP 1TB package at £60/month
[1]: https://www.productsandservices.bt.com/products/broadband-pa...
My point was more about them offering a mobile service - many here know of them as a good ISP, but it is less well known that they offer the same quality of Internet access via mobile.
I certainly fall into that bucket. Thanks for the note, I had no idea they did mobile stuff too.
Or use a VPN service.
> "this is an adult resource and you can't view it"
Those filters, if they're ISP supplied, are optional. You can turn them off.
It's completely optional, turn it off. It also tends to only be mobile networks that do that kind of filtering by default, ISP's ask you when you sign up if you want 'parental controls' or not.
Funnily enough, I just signed up to 3 mobile broadband last week, and I was pleasantly surprised that I didn't have to call and get it turned off (always have before)
One word, my friend: VPN.
> One word, my friend: VPN.
A VPN is a way to circumvent surveillance but make no mistake: We must press with all our power for legislation which guarantees privacy, all over the world. This is a battle that in the long run, we can't win with tech. We need to become more privacy-aware.
"Those who surrender freedom for security will not have, nor do they deserve, either one."
+1. I agree with this.
With the current trend, how long until VPNs are made illegal? "For the children!"
To circumvent bad policy about encryption... use encryption?
Being ahead of all other EU countries in some field can be something significant while they're inside the EU, in those cases when an investment or purchase needs to be made inside the EU for any reason. But when they're out, what difference does that make?
My point is that in case British businesses have to overcome barriers (e.g. tariffs), their best bet at not going bankrupt is selling non-physical goods and services with as much added value as possible. That implies high technology and investment in research.
Non-physical, so that selling to America, Africa or Asia is as cheap as selling to Europe, and added value to absorb the new cost of doing business (in case the UK does not remain in the Single Market).
If e2e encryption is outlawed in the UK, their businesses would be less trusted (harder to sell expensive things) and would be at the mercy of other countries' intelligence and espionage services.
It's not like relations with other European countries will immediately cease on leaving the EU.
That's not what I meant or wish for. But it's a (for me sad) fact that at that point the expression "compared to the other EU countries" won't make any sense anymore. And if May goes for a hard Brexit, as many predict, it will even become easier to trade with e.g. Canada than with the UK.
You can say "compared to the other European countries" instead. It has the bonus of being more inclusive! No longer will your statement exclude Norway, Switzerland etc.
Depending on context, this comparison may or may not make sense. Keep in mind that Norway and Switzerland are both part of the single market, which the UK will probably opt out of (as it would require the free movement of workers).
From [1], linked elsewhere in this discussion, and referring to a failed plot in India:
"The Hindi-speaking handler guiding the men in Hyderabad also insisted on using a kaleidoscope of encrypted messaging applications, with Mr. Yazdani instructed to hop between apps so that even if one message history was discovered and cracked, it would reveal only a portion of their handiwork."
"the handler taught Mr. Yazdani how to use the Tails operating system, which is contained on a USB stick and allows a user to boot up a computer from the external device and use it without leaving a trace on the hard drive."
Even if the British government is successful with WhatsApp, can they do much against free, open source tools?
[1] https://www.nytimes.com/2017/02/04/world/asia/isis-messaging...
"Even if the British government is successful with WhatsApp, can they do much against free, open source tools?"
Why would they care about open source tools and niche use of encryption? Of course they don't. They are after mass surveillance and use fear of terrorism to push for it. It's very logical of them.
I feel countries will ultimately make it a crime to use any non approved encryption.
The UK actually tried to enforce key escrow as far back as 1999 (http://www.cyber-rights.org/crypto/ukpolicy.htm), but backed down due to conflicts with EU law.
It's not entirely by accident that the UK's current ramp-up of legalized government surveillance coincides with Brexit, as the UK doesn't actually have a democratic constitution limiting government surveillance outside of the ECHR treaty it signed as a prerequisite for joining the EU/EEC.
> ECHR treaty they signed as a prerequisite for joining the EU/EEC
The UK signed the ECHR in 1950 (and were involved in writing it); the EEC did not exist until 1957.
Thanks for correcting my sometimes failing memory, though the reality is more complex than everything ECHR-related being defined by 1950: the active component of the ECHR (the court) did not come into existence until 1959 and was not acknowledged as a superior court by the UK prior to the UK joining the EU.
The court (http://www.coe.int/t/democracy/migration/bodies/echr_en.asp) is what makes the ECHR significantly more effective than the unenforceable UN Declaration of Human Rights signed in 1948; it only came into effect in 1959 and was only explicitly acknowledged as superior in British law with the still-controversial Human Rights Act of 1998 (https://www.supremecourt.uk/about/the-supreme-court-and-euro...).
The fact that EU membership demands actual rather than pretend ECHR compliance is a fairly big deal in the anti-EU Tory circles currently running the show in the UK, and some of them seem to presume that leaving the EU will absolve the UK of any duty to submit to the ECHR court (http://www.telegraph.co.uk/news/2017/01/26/theresa-may-prepa...), even though I am sure they think otherwise in Strasbourg.
But you are correct in stating that officially the ECHR came into de jure effect in 1950 under the Council of Europe, of which the UK, unlike for the ECSC (1951) and the EEC (1957), was a founding member. It's worth noting here that the Council of Europe is a far more toothless organization (like the UN) than either the ECSC or the EEC.
Edit: fixed links
That's the beauty of it; even with their more restrictive measures and massively increased surveillance, they won't make a significant dent in these sorts of attacks.
So they'll never run out of reasons to push further. Hooray.
I'm surprised I haven't heard the IP bill mentioned other than here in this whole matter. Why all the fanfare? Especially if it's something they can already just do quietly; the public controversy from that law has mostly passed over.
I watched Amber Rudd interviewed by Andrew Marr this morning and the scariest thing about it was that Marr completely agreed with her. Rather than providing an opposing viewpoint and counteracting her points, he agreed with the idea that it was unacceptable for people to be allowed to use encryption and that it was terrible these companies were using it as a selling point. All he pushed her on was if she would enforce cooperation from tech companies.
This isn't all that surprising, given that the government has repeatedly threatened the BBC with the loss/alteration of their charter for being critical of the government and not sufficiently jubilant about Brexit. ITV owning the BBC would arguably be a greater disaster than watered down coverage.
For a corollary see the paucity of coverage on the mass demonstration in London yesterday.
It isn't all that surprising to me either. But for different reasons. They have no idea what they are talking about.
Do we think they know our online banking software uses the same kind of encryption? Probably not. Andrew Marr not knowing this is annoying. But an entire government being ignorant of it is deeply worrying.
Why do you think the government doesn't know? Why do you think they believe what they say?
This is just populist bullshit. It follows on from other populist bullshit.
I think they don't know because I've never seen any of them describe encryption in a way that would suggest they know what they are talking about.
I don't doubt that someone in government knows. Probably an entry level staffer. But whenever technology comes up (such as blocking types of content from the internet) the policy is always utterly ham fisted.
I agree that the policies are always ham fisted, but I'm not sure how you could have a policy that achieves the goal of blocking specific "bad sites" without it being ham fisted.
How would you achieve the goal of coming up with a non-ham fisted technical solution to a ham fisted problem?
We saw prominent features on the BBC, furious headlines from the Daily Mail and self-congratulation from the Guardian. What would you have considered sufficient coverage for yesterday's demonstration?
He actually asked a very unprofessional leading question.
Given that the BBC is a state funded organisation, why would you expect anything else?
How do they want to prevent someone from creating his own end-to-end encryption app? It may use other protocols to encode content (images, tweets, fb posts etc.).
For me it seems to be more in a direction of so called "Big Brother" than real counter-terrorism.
She's probably being led by the intelligence services on this, and of course they have ulterior motives, their ultimate project is to collect all signals, or as many as they can manage, which apps like whatsapp are thwarting at present.
Why can't we collect all the signals all the time?
This is incredibly dangerous for our society, no-one should have that much power. That power isn't about terrorism (or even very useful against terrorism), but about subverting governments, judiciary and businesses.
Yes, this: she might as well just be ignorant (not that it is any justification), but her supposedly competent advisors are actually frauds, fakes and spooks, that's what truly scary to me.
An intelligence led response to terrorism is a result of learning from Northern Ireland. Internment caused harm; talking to the terrorists brought peace.
Don't forget that while they were talking to the IRA politicians were saying in public "we don't talk to terrorists".
Ironically the early stages of such talks were held in a chip shop via a senior civil servant as a proxy, because it was vital to keep this "collusion" away from the awareness of security forces.
(There was a great documentary about this on TV, but I cannot remember the name of it)
> How do they want to prevent someone from creating his own end-to-end encryption app?
It's basically impossible. One can also use steganography to hide messages in lolcat pictures, or music files. The only way to prevent this, I think, is to start a totalitarian surveillance state where using Free or custom software or hardware is punishable by death. Even then, I'm not sure this will be enough.
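For illustration only, here is a rough sketch of the classic least-significant-bit trick using the Pillow imaging library; the helper names are made up, and real steganography tools take far more care over capacity, file formats and detectability:

    # pip install Pillow
    from PIL import Image

    def embed(cover_path: str, out_path: str, message: bytes) -> None:
        """Hide `message` in the least significant bits of an RGB image's channels."""
        img = Image.open(cover_path).convert("RGB")
        payload = len(message).to_bytes(4, "big") + message          # length prefix
        bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
        channels = [value for pixel in img.getdata() for value in pixel]
        if len(bits) > len(channels):
            raise ValueError("cover image is too small for this message")
        for i, bit in enumerate(bits):
            channels[i] = (channels[i] & ~1) | bit                   # overwrite the LSB
        pixels = [tuple(channels[i:i + 3]) for i in range(0, len(channels), 3)]
        img.putdata(pixels)
        img.save(out_path, "PNG")                                    # lossless format only

    def extract(stego_path: str) -> bytes:
        img = Image.open(stego_path).convert("RGB")
        bits = [value & 1 for pixel in img.getdata() for value in pixel]
        data = bytes(int("".join(map(str, bits[i:i + 8])), 2)
                     for i in range(0, len(bits) - 7, 8))
        length = int.from_bytes(data[:4], "big")
        return data[4:4 + length]

The point is not that this particular sketch is robust (it isn't), only that the bar for hiding content inside innocuous-looking files is very low.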
Given that even the most totalitarian states eventually fail and don't ever have complete and total control over the entire populace, I think you're correct that it will not be enough.
What they really need is to invent time travel, and murder Ada Lovelace.
That should be a canonical test for the implausibility of any policy. "Do we need to invent a time machine for this to work?"
> How do they want to prevent someone from creating his own end-to-end encryption app?
They can't. The US tried it in the 90's when SSL sites could not use strong encryption outside the US and you'd need a license to "export" PGP... That went well! :-/
https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...
Ultimately this type of "lone wolf" attacker will not communicate at all, and what are they going to do next?
Install a device on one's head?
Don't give them ideas!
You're forgetting that this has always been possible, and not all adversaries are capable of, bother with, or are even aware of what they could do.
I expect it's quite likely this one was using WhatsApp because that's what he used; not because he read about its end-to-end encryption.
The same way you prevent anything.. they make it illegal for people in the UK to make, or use, such products.
Don't think we can "tech" our way out of this.
Exactly, and given we live in a "walled garden" society now, all they need to do is require google or apple to remove from the app stores any app that implements encryption for messaging.
It's actually easier than ever to ban encryption for messaging.
Would that stop determined people? No, but it's never been about that anyway. Just make the pool small enough and it becomes too difficult to use. (See PGP / email).
Also, if you genuinely legislate against encrypted messaging then it's easy to pick up on the relative handful of people who go outside the app stores to get encrypted messaging applications.
And it shouldn't come to technical solutions; we should have people challenge the notion that two people should never be allowed to share a private message, because that's what Rudd and the government are suggesting.
> we should have people challenge the notion that two people should never be allowed to share a private message, because that's what Rudd and the government are suggesting.
+1. This is the crux of the matter, although unfortunately I don't think the average person realises it.
Not "we" as in "we the general public with no specific interest in staying hidden". But of course criminals, terrorists or secret services do have the right incentive structure to always benefit from circumventing encryption bans.
Which shows who the real target is.
Tell that to the pirate bay.
Just because something is illegal doesn't mean it is enforceable.
Well, the Pirate Bay probably never really caught the interest of totalitarian states that really wanted to suppress its existence.
Now, the UK isn't at that point obviously, but if they really wanted to use draconian measures against encryption, it probably would be somewhat effective.
They wouldn't prevent you from making encryption apps. It would be about regulation. You can regulate kinds of encryption (the strength of the algorithms/keyspace etc), and you can regulate who can use it (licensed copies only, or specific businesses only, or types of businesses, non-messaging platforms only, etc).
Then there's how you use it. They could mandate all of X businesses could only use encryption that could be inspected by the state, so either weak encryption, or PKI where you send the government your site's private key or use the state's CA or something. They can also mandate backdoors in encryption used in certain ways. And they can mandate that weak encryption be used outside their country's borders.
All of these are real parts of US laws on cryptography from WWII to 2000 to prevent "export" of "strong encryption", because of course evildoers around the world might make use of these "munitions". US law still regulates how we can use or distribute cryptography around the world. It is illegal in the US to release open source crypto on the internet without notifying the Bureau of Industry and Security. And 41 other countries (including the UK) have similar laws.
The one thing the US has going for it is the 1st Amendment, which makes it illegal for the US to prevent its citizens from making or using crypto within the US.
If you ban encryption and monitor all traffic in the world then you can easily flag messages you can't read as suspicious. You can then hunt down people using the encryption.
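To make that concrete, the detection step amounts to something like estimating the Shannon entropy of a payload and flagging anything close to 8 bits per byte. A rough sketch in Python; the obvious catch is that compressed images, video and archives look just as random as ciphertext, so the false-positive pile would be enormous:

    import math
    import os
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        """Estimated bits of entropy per byte (8.0 is the maximum, i.e. uniformly random)."""
        if not data:
            return 0.0
        counts = Counter(data)
        total = len(data)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    def looks_encrypted(payload: bytes, threshold: float = 7.5) -> bool:
        # Ciphertext sits near 8 bits/byte; English text is usually around 4-5.
        # Compressed media also scores high, hence the false-positive problem.
        return shannon_entropy(payload) > threshold

    print(looks_encrypted(b"hello, just confirming dinner at eight tonight"))  # False
    print(looks_encrypted(os.urandom(4096)))  # True: random bytes stand in for ciphertext here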
How do they want to prevent someone from creating his own end-to-end encryption app?
That's not an issue. Writing solid encryption software is very difficult on its own. You will hear "do not roll your own crypto" all the time from security experts. We don't live in a James Bond universe and it's beyond the reach of terrorist organisations.
You don't have to roll your own crypto to create your own end-to-end encryption app. You can use existing crypto. Writing a user interface around it is not so difficult.
Beyond the reach of terrorist organisations? We have already seen pretty sophisticated operations by relatively small crime organizations (like exploiting pseudorandom generators in casino slot machines). There's an established black market for exploits. I think writing an end-to-end encryption app is not much more difficult compared to this. What's more, it will even be perfectly legal in many countries, meaning you could legally hire professionals to do the job. Terrorist organisations won't need to establish a development office in SV to write the app, they will only need to know how to use Tor and wire money to the app producer. Which isn't such a huge competence to ask for.
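As a rough indication of how little "using existing crypto" involves, here is a minimal sketch using the PyNaCl bindings to libsodium for authenticated public-key encryption; key distribution, identity verification and a usable interface are where the real work lies:

    # pip install pynacl
    from nacl.public import PrivateKey, Box

    # Each party generates a keypair once and shares only the public key.
    alice_sk = PrivateKey.generate()
    bob_sk = PrivateKey.generate()

    # Alice encrypts to Bob: authenticated, with the random nonce handled for us.
    sending_box = Box(alice_sk, bob_sk.public_key)
    ciphertext = sending_box.encrypt(b"meet at the usual place at nine")

    # Bob decrypts with his private key and Alice's public key.
    receiving_box = Box(bob_sk, alice_sk.public_key)
    assert receiving_box.decrypt(ciphertext) == b"meet at the usual place at nine"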
I think writing an end-to-end encryption app is not much more difficult compared to this.
Plus, they don't have to write it. They could just 'pivot' from an existing open source messenger.
> small crime organizations (like exploiting pseudorandom generators in casino slot machines)
If all you do is pushing the buttons of the slot machine in the right order with the right timing, that's hardly a crime —and I don't care about court judgements to the contrary. If a slot machine has a crappy pseudo random number generator, they're just asking for it. I'd rather sue the slot machine's maker for providing a machine that's not fit for its intended purpose.
It is a crime as in "unlawful act punishable by a state or other authority". You are welcome to disagree, but I'd be interested in your definition of "crime" then.
My point is, there is nothing wrong with pushing a slot machine's buttons, even if it is done in a way that defeats the RNG. The difference between this and counting cards in a game of Black Jack is, the RNG can easily be fixed to prevent this.
You provided it as an example of "organized crime", and doing so heavily suggests that it is wrong.
We tend to conflate "wrong" and "unlawful", and for good reason: the law is supposed to prevent wrong things from being committed. There are exceptions however, and this is one of them. I'd rather use another example if possible.
My argument is that even small organized crime groups were capable of sophisticated operations, so it's absolutely not far-fetched to assume that they may be able to implement end-to-end encryption apps. Whether this "organized crime" is "wrong" does not make much difference. It might be even easier to find people to implement it, as not many would have moral issues.
Speaking of moral issues, cheating a casino is pretty much off limits on my personal moral compass. That the attack was possible within the normal mode of operation does not make it less of a fraud. Imagine if the casino reverse-engineered a slot machine and found a way to abuse it within the normal mode of operation, making the odds (even more) in their favor. That would be fraud, plain and simple, and I don't see why a player should be held to a different standard.
You are absolutely right, not everything unlawful is wrong. But I fail to see what benefit we as a society would get from allowing the exploitation of technical deficiencies in slot machines for profit. It is a crime and it is wrong in my book.
OK, so our disagreement is very simple: exploiting the flaws of a slot machine is not cheating in my book. Neither is counting cards now that I think of it.
The rules for slot machines are ostensibly very simple. As long as you're only pushing the buttons that are supposed to be pushed without deteriorating them, you are acting within the rules of the slot machine, and as such cannot cheat.
The presence of hidden rules such as "don't push the buttons in this particular order and timing", or "don't push the buttons in a way that reliably causes you to win", are just silly and unfair. Especially considering casinos are exploiting gamblers' minds in the first place. Don't like slot machine exploiters? Fix your slot machines.
Likewise for counting cards: the player is merely acting upon information naturally gathered by observation and play. Asking players not to act upon such information is intrusive, and unheard of in competitive play. Don't like card counters? Invest in a continuous shuffling machine.
"The presence of hidden rules..." except they are not hidden. IANAL and don't have link at hand, but there was recently a legally pretty well based argument in a case of a player who had an assitant able of recognizing cards from the back pattern.
"you're only pushing the buttons" except they were not only pushing a button, they were also recording sequences and sending them abroad for analysis.
But as you directly say that exploiting the flaws of a slot machine is not cheating and that it's fair, I guess I won't be able to persuade you otherwise.
> that's hardly a crime —and I don't care about court judgements to the contrary
Either you meant something along the lines of "that shouldn't be a crime" or you're essentially saying "it's not crime even though it is a crime" - which doesn't make terribly much sense.
> like exploiting pseudorandom generators in casino slot machines
Anyone have a link about this story? I'm curious to read about it.
You don't really have to roll your own crypto to create such an app. There's always openssl and the signal protocol, which you'd only need to implement without designing anything.
Sure that can go wrong as anything can, but it's far from rolling your own crypto and makes things a lot easier.
> Writing solid encryption software is very difficult on its own.
It's not. You can use existing software, reuse existing protocols, and stick to safe languages as much as possible. Even implementing your own crypto isn't all that difficult¹. I have written my own crypto library², and I can almost recommend it for production use.
[1]: http://loup-vaillant.fr/articles/rolling-your-own-crypto
I think you've missed the point. I'm not talking about writing your own crypto, I'm talking about not using applications on which the western security services have the ability to force backdoors.
Are you suggesting gpg has been backdoored? A simple wrapper around gpg is not beyond terrorist organisations.
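To put "a simple wrapper around gpg" in concrete terms, something like the following sketch is roughly all it takes, assuming gpg is installed and the recipient's public key has already been imported (the recipient address and filename here are made up for illustration):

    import subprocess

    def gpg_encrypt_file(path: str, recipient: str) -> str:
        """Encrypt `path` to `recipient`'s public key; returns the output filename."""
        out = path + ".asc"
        subprocess.run(
            ["gpg", "--batch", "--yes", "--armor",
             "--recipient", recipient, "--output", out, "--encrypt", path],
            check=True,
        )
        return out

    # Hypothetical usage: the key for this address must already be in the keyring.
    gpg_encrypt_file("notes.txt", "someone@example.org")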
You vastly overestimate the difficulty. The reason we're commonly told not to roll our own crypto is because it's easy, and also easy to get wrong and possibly catastrophic if you do. But many perfectly serviceable algorithms are simple and public knowledge. Arguably a scenario where everyone's using their own crypto and half of it's broken is still better than everyone using the "industry standard, pre-backdoored for your convenience" version.
Of course it's utterly trivial to make a one-time-pad cryptosystem, and more practical in 2017 than ever. So what if the keylength must match the message length, my phone has a 32gb uSD. That's a lot of text messages.
This.
If we outlaw encryption, then only outlaws will have encryption.
>do not roll your own crypto
Sure, but what's to prevent someone from building something on top of OpenSSL or PGP or whatever? Can't be that hard.
Nobody is saying write your own crypto just your own app. Plenty of excellent crypto libraries out there.
We hear most terrorists ate with forks so all forks are now banned.
Also, we were shocked to discover that virtually ALL criminals rely on something called Oxygen to perform their work so this is now a controlled substance that will be heavily regulated.
We were then terrified to learn that after banning forks, terrorists were able to successfully eat with spoons or even their hands.
/s
Seriously, you cannot ban tools. Lawmakers have to approach this with a firm grounding in statistics (how LIKELY is a risk, relative to the magnitude of the measures to prevent it?). They also have to realize that some things are just necessary for society to function. Stop being paranoid.
It's perfectly legal to use a fork, if you choose to take that risk. We absolutely haven't banned forks, we just urge special precautions due to the clear and obvious terrorism risks. If you want to use a fork or fork-like object (various devices that can be used for stabbing crimes like pencils or pens), just put your name on this watchlist...
More to the point, terrorists lock their doors! We need to ban deadbolts!
Seriously though. Next step would be to force people to install locks that have a master key that only intelligence services have and can only use for good reasons.
If you're ok with encryption back doors you should also be ok with govt master keys for all your stuff (house, car, bank account, etc)
So they get a backdoor into WhatsApp and terrorists just move onto some other non-compromised tool. Rinse and repeat. You can't ban maths ffs.
TBH I am surprised attackers don't do a better job of destroying their electronic equipment just before they carry out their attack. Popping your phone and SSD/flash drives in the microwave on high for a few minutes will pretty much destroy all the evidence on them, and if not, chances are you're dead anyway, so whatever data investigators manage to recover will most likely be of little use.
It's in the interest of the terrorists to get WhatsApp banned.
Terrorists just use something else while the populace feels gradually more oppressed/controlled/...
In a way they get something for nothing.
I think this is not about terrorists (that is just a side effect), but about the state's ability to know what people think and talk about. That is a very powerful thing to have.
Not sure about banning maths, but back when I was in school I had to suffer through a couple of maths teachers who were so bad that they might as well have been trying to instill a lifelong hatred for maths.
Wow, that sentence got away from me.
Wouldn't putting a bunch of metal in a microwave be risky? I'm picturing something exploding and sending shards of glass into bystanders.
(Then again, a 4 Lions moment where an intrepid terrorist slits his own throat with a molten SSD wouldn't be the worst thing in the world...)
That's why the justice system sort of works. Criminals are even dumber than law enforcement.
Only the ones that get caught...
>Referring to Whatsapp's system of end-to-end encryption, she said: "It is completely unacceptable. There should be no place for terrorists to hide.
That's it guys. Mommy says no more maths.
> "It is completely unacceptable. There should be no place for terrorists to hide."
Agreed. I'm terrorized when I hear gov representatives talking like that. Who's the terrorist, I wonder.
I seem to remember that in pre-internet news, rogue actors with mental disorders seldom made national headlines. Now an individual with no affiliation with organized hate-based groups and some twisted logic can dream of international recognition for their actions if they manage to fulfill loose criteria under the "terrorists" FUD umbrella. Dead or alive, they want to be significant in some way different from their previously banal existence, consequences be damned.
I was discussing this with friends. The fact that people are acting in the name of Islam and there is a common ideology behind the attacks is why they are considered "terrorism" rather than lone wolf attacks. Admittedly this one seems to be somewhere in between. Mass shootings in the US, by contrast, do seem to be done by rogue actors in general, and there doesn't seem to be a common element to such attacks. As with everything, such definitions aren't black and white and are more of a sliding scale, so they can be interpreted differently depending on your viewpoint.
No common element in US attacks? I think in the majority of cases that's just not true. People weren't killed, but the PizzaGate shooter was motivated by what can only be described as a large cult of a particular persuasion, the Planned Parenthood shooter certainly had his political motives, the Charleston shooter had motives that have existed in the US almost continuously, the Oregon standoff participants were all anti-government types, I could go on and on. I would say the main difference is we are not actively at war with people who share their faith or look like them overseas. As far as I'm aware the Boston bombers and the Orlando shooter were only inspired by, but received no support from, groups such as ISIS. In that way they were just as self-radicalized as Dylann Roof, the Charleston shooter.
Obviously it's all more complicated than I could quickly write, but to me there's a big difference between the self-radicalized generally disconnected persons in the US or U.K. versus those on the ground overseas.
The British gov is looking more and more like the Finger from V for Vendetta. The US president more and more like the one from Idiocracy. That we tend to live up to caricatures should be an alarming sign, but I only see worries on sites like HN. Most people still don't see the catastrophe in it.
> I only see worries on sites like HN. Most people still don't see the catastrophe in it.
I'm not in the US. I have actually been very impressed by the outspoken actions of anti-Trump people in the US, with the massive protests and constant (well-deserved) media scrutiny. Also I never knew I could have so much respect for Hawaiian judges.
Why they didn't bother to vote is beyond me, though. Trump is a buffoon, but he was able to successfully motivate other buffoons to actually vote.
I did hear their vote described as being forced to choose "between a disaster and a catastrophe" though, so that might go some way to explaining it.
If I may ask a very naive question: Do politicians like her really think encryption is dangerous or is it a devious way to expand mass surveillance?
Attacks of the past have shown that terrorists don't have a need to resort to encryption. The people involved in the Berlin attack last year, for instance, were monitored. Authorities knew they would strike but they didn't have sufficient incriminating evidence that would count in court to lock those guys up.
Even if encryption on messaging services were forbidden (which would make millions of law abiding people vulnerable in some way), terrorists could use throwaway email accounts from internet cafés and wrap their messages in password protected attachments.
> really think encryption is dangerous or is it a devious way to expand mass surveillance
The latter. She and the previous home secretary (now PM) have been banging on about how under threat we are from the terrorist hordes for years now - all so they can erode freedoms and increase mass surveillance under the guise of 'keeping Britain safe'.
The idea that banning encryption of private conversations will prevent these few crazy people from causing damage is of course ridiculous.
It's an interesting question though, isn't it.
They must know enough to know that this won't actually fix the problem, so I would have to surmise that they are just trying to do something and stay somehow relevant before their term comes to an end.
"Never mind the collateral damage, I'll be retired on a government pension by then."
It would help the UK government's argument if they didn't grossly abuse every single surveillance power they have: https://www.theguardian.com/world/2016/dec/25/british-counci...
Coming from the same government that wants all ISPs to keep a log of all the sites you visit. These people are beasts, and as dangerous as, if not more dangerous than, the perils they are supposed to save us from.
If people knew the damage these idiots do, they would be in the streets.
Oh wait, they already are in the streets...
I'm surprised it took this long for her to bring up the subject – Theresa May would've had her soundbites prepared in advance and released within hours of the attack if she was still Home Sec.
> That is my view - it is completely unacceptable
You know what else is completely unacceptable? Technologically illiterate, authoritarian jobsworths capitalising on tragedy to push through their agendas. But that's just my view.
Home Office always seems to attract the nastiest and dumbest of politicians, but this is a whole new level of dumb, and sadly will only gain her more support, because the general public either have no idea about the implications of backdoored crypto, or simply don't have any expectation of privacy and are happy to give up what little they have left in order to feel safe.
Sure she'll get support and they may even be stupid enough to attempt to enact this stuff. But rest assured there'll be a massive backlash from big business and various political fallouts from the scandals that will ensue.
Then some genius will come up with what's essentially an "encryption is illegal for terrorists" bill and we'll have the best of both worlds: full use of encryption where we need it, whilst the terrorists can't use it because it's illegal!!
Yeah that'll be totally humane, just like how it's illegal to commit crimes while wearing body armor in most of the US! Or like how street dealers and felons aren't allowed to carry guns or vote! We are being declawed.
Listening to some old episodes of 'The News Quiz', and I'm struck by the fact that this is not a whole new level of anything at all. If anything, this is par for the course, and Theresa May being in power at all is a result of that.
It is the duty of the Home Secretary (and the UK's various nosey institutions - e.g. intelligence agencies, police, etc) to continuously badger us for this information - unfortunately, it's pretty much part of the job description.
It is our duty, as the public, to continuously say "no".
Disregarding any negative consequences, their motivations are pretty transparent - there's little doubt that being able to read everyone's private messages will enable the intelligence services to better do their jobs. However, as Edward Snowden and others have already shown to us many times over the last few years, the UK government can't be trusted with this responsibility - and that this is probably the thin end of the wedge. Britain is already the closest thing that Europe has to a surveillance state, and the number of people killed in the UK by terrorism is vanishingly small - we are hundreds of times more likely to die in a car accident. Is it really worth giving up the last vestiges of our privacy for a little bit more security?
> It is the duty of the Home Secretary (and the UK's various nosey institutions - e.g. intelligence agencies, police, etc) to continuously badger us for this information - unfortunately, it's pretty much part of the job description.
On the contrary. The Home Secretary is literally the holder of the ministerial authority that is required for police and security services to use a lot of the powers they have, and is supposed to be providing oversight and ensuring that those powers are used responsibly.
Unfortunately, that means the Home Secretary spends several hours every day just looking at cases presumably involving some very nasty people. You have to wonder how anyone could keep a balanced perspective if they're doing that for 20, 30, 40 hours every week for months or years. Everyone who becomes HS in the UK turns into a severe authoritarian within a few months of taking the job, regardless of their prior political views or how reasonable they might be about other matters.
Perhaps I phrased it poorly - what I meant was that one can view the Home Secretary's requests for less privacy as a fact of life (just as death and taxes), and could consider refusing these requests as part of civic duty. You're correct, the HS usually turns somewhat authoritarian (regardless of whether it is their job to or not) - it is simply the public's duty to resist.
> there's little doubt that being able to read everyone's private messages will enable the intelligence services to better do their jobs
[citation needed]
Seriously, this argument is FUD. I'm sorry for picking on this quote, as I agree with the rest of your post, but allow me to go on a short rant..
We've seen this argument used many times over. It was used to introduce surveillance cameras on every UK street. What has it achieved? Less parking lot crimes[1].
The EU used it when introducing the data retention directive. Which was "nullified" eight years later due to violating fundamental human rights[2]. Of course, the infrastructure is still in place, and everyone is still using it. What has it achieved? AFAICT nothing except a blatant danger to society. The ability to know everything about anyone and actively take over their private devices is not something that should be taken lightly.
The GCHQ even admitted that the London terrorist was "on their radar". Well duh, who isn't. If that's not admitting mass surveillance is ineffective, I don't know what is.
It is impossible to prevent all crime before it occurs. The world isn't NP complete. Get over it. Or, to paraphrase Gödel: "I would rather live in a world that is inconsistent, than one that is incomplete"[3].
The intelligence agencies are just bored. They have no wars, except drugs and "terror". They use this "downtime" to get more data sources by influencing politicians.
Guess what, gathering more of the same shit data won't increase your signal.
[1] https://www.aclu.org/files/images/asset_upload_file708_35775...
[2] https://en.wikipedia.org/wiki/Data_Retention_Directive
[3] Not an actual quote, but I'm sure he would agree.
In the 1970s, an American president had to resign over some planted bugs.
Now, private conversation is illegal.
I guess it leads to "ownlife".
> She said it was a case of getting together "the best people who understand the technology, who understand the necessary hashtags"
Our Government is an absolute disgrace; and unfortunately, one to which there is currently no credible, strong opposition.
(from https://www.buzzfeed.com/matthewchampion/necessary-hashtags)
If you are referring to the current state of the Labour party then that's irrelevant. Even when Labour were "strong" opposition, or were in power with large majorities, they have had very authoritarian positions on this kind of thing.
Labour were supporters of the recent IP Bill (it actually applied restrictions to some of the crazy powers the last Labour government gave to the police, which gives you an indication of their general position on these things). Labour have had authoritarian positions on crime and policing issues since Blair became shadow Home Secretary (1992). It has been part of their 'tough on crime' strategy of attacking the Conservatives from the right since that point and was a core part of the New Labour strategy.
The only thing a "stronger" Labour opposition would get you in this situation is a parliament even more united in support for restrictions on encryption.
As a Communist myself, it's a damn shame to agree with Labour on various issues but so vehemently disagree with them over their abstention on, or even outright support of, spying bills.
And if the best people who understand the technology and 'all the necessary hashtags' say it can't be done?
Because that's where we're at currently.
The emperor's new hashtags..
It's an incredibly foolish thing for a minister to suggest. She demonstrates a complete lack of understanding on the subject and has committed political seppuku. Has she never read Orwell, Huxley, seen articles about tyrannical governments or even heard about the reasons the US constitution was drawn up?
has committed political seppuku
Since the current prime minister supports her, I doubt it. It's an absurd position, but not without support in the current administration, just like her outspoken views on immigration.
> Since the current prime minister supports her
And, I'm sorry to say, a large chunk of the public, who have for years been force fed rubbish from politicians and the media alike about the huge terrorist threat that threatens to destroy our country (when in reality just about anything else you can think of is more of a threat than the odd crazy with a knife and car...)
Specifically the current PM is as bad or worse.
Poorly timed opportunism. Police have said the attacker acted alone, so he wasn't using encrypted comms to talk to anyone.
I really hate this "going dark" narrative.
They can track his purchases via his debit card, his movements via CCTV + cell tower records, intercept his emails... but there's one bit of his digital life that's inaccessible and we're "going dark?"
We are burning bright with data. More data does not necessarily mean less terrorism.
The English might be better served by posting some armed officers in high value areas. The French do this at major train stations and tourist spots like the Eiffel Tower. This doesn't stop terrorism, but vastly reduces the body count.
Frankly, I think it's laughable that countries which resisted the Nazis will let 10 people dying make them consider rolling back civil liberties.
I posted about this on twitter this morning, and it seems appropriate to repeat it here:
For most of history, governments have not had the ability to easily monitor the communications of their citizens. Widely available, user friendly encryption tools are just returning us to normal. Well, except for the massive trail of metadata everyone now leaves.
He sent a message 2 mins before the attack, and they want to know what that message was and who it was sent to, in case he wasn't acting alone.
However, can't they already find out who the message was sent to? Whatsapp obviously has to have that information, and it appears they will give it to law enforcement:
https://www.forbes.com/sites/thomasbrewster/2017/01/22/whats...
I'm not sure that knowing the contents of that message will really help more than knowing the person who it was sent to.
It seems that they had this guy on their radar a few years ago, but didn't think he was worth keeping an eye on, so even if they could decrypt whatsapp messages it wouldn't really have helped them.
Shhh! You're letting the "facts" get in the way of a good power grab.
They arrested 8 people so the "acted alone" theory might have changed.
http://www.aljazeera.com/news/2017/03/uk-police-arrests-west...
That link is old. They've mostly if not all been released.
There are a few reasons to laugh at her position.
* The UK government leads the "free world" in ignoring its own warrant process, and pursuing a "collect it all" strategy for commsec. UK citizens have no reason to trust that their government, given such access, would not abuse it. They've abused all their other access thus far.
* Privacy and Security help normal citizens and criminals alike. This is as true for a locked front door as it is for an encrypted message. We grant governments the ability to violate privacy under warrant - they may snoop, spy, enter our homes, and read our mail. We do not grant them the ability to violate security, however. They still have to pick the lock, steam the envelope, and crack the safe. These are important distinctions. We do not engineer a backdoor into all encrypted messages, for the same reason we don't mandate a government master key for all doors.
* The idea that you can legislate math out of existence is a joke.
There is one reason to cry at her position.
* They will eventually legislate this way anyway.
"He sent an encrypted message from whatsapp"
Yes, and then he went and did something stupid with easily accessible tools and acted alone.
You might have an argument if he was part of a coordinated attack against something but lone-wolf terrorism has always been defined as unpreventable by security services such as SIS. Once radicalised it's impossible to prevent individuals doing stupid stuff.
The only thing she has revealed is the Conservative party's desire for totalitarian control. :(
> You might have an argument if he was part of a coordinated attack
No.
Even ignoring the erosion of privacy angle, this just doesn't work. Outlaw encryption, and only outlaws will use encryption. Provide government backdoors into the popular commercial messaging apps, and people coordinating terror attacks will just use custom, unknown, private encrypted messaging apps.
I'm going to play devil's advocate here; I also dislike the erosion of privacy (enough that I even left the UK).
But you _can_ make the argument that if only outlaws use encryption then they're painting a target on their back, which leads to greater scrutiny by security services.
This is reasonably achieved by the current dragnet surveillance systems in place, along with ISPs logging everything.
I don't agree with it, of course I don't, but that's probably an angle people could take- But the angle Amber Rudd took is even more starved of sense.
It's like she didn't ask the appropriate question: "What could we have done to prevent this attack" and the follow up "If we had direct access to his phone and all of his communication information, what could we have caught" and the answer is _nothing_. He used tools commonly available to him, acted alone, probably told nobody.
Anyway, tell the bad guys you're watching the comms and they'll figure out how to talk, they're motivated and smart.
"What could we have done to prevent this attack"
Actually there is a lot they could have done to help him in his obviously troubled life but that doesn't fit with conservative ideology.
We can't help everyone. It's naïve to think we can, 100% of the time, help all people.
Even the most socially progressive system on the planet will have people slipping through the cracks- we have to be able to deal with that eventuality too.
True but at the moment we have a particularly poor record of helping people with mental illness. This guy didn't slip through the cracks, he was totally ignored along with many others who are struggling.
There's absolutely no evidence he had mental illness, and you do harm to people with mental illness when you incorrectly link violent behaviour to mental illness.
More important is his time in prison - where most UK terrorists were radicalised - and if you were saying that UK prisons don't rehabilitate I'd agree.
Is brutal violence against others not associated with mental disturbance?
Violent people are violent. They may also have mental illness, but usually it's coincidental.
In this specific case there's no suggestion he had mental illness, and it's ignorant to suggest he did.
Violent people aren't violent because they are violent. There is an underlying cause for someone to use force against another, especially when it is socially unacceptable.
Perhaps it doesn't fit under the "common" mental illnesses of depression, anxiety, etc. but it lines up well with thought disorders. A sane and well person would not jeopardize themselves, and their fellow species.
> A sane and well person would not jeopardize themselves, and their fellow species.
Sane (by the usual definition, though it's possible you are using an unusual definition of your own) people jeopardize themselves to harm other members of the species all the time.
In fact, societies tend to have organized groups of people who are expected to do this when the targets are enemies of the group, and who are honored for it; they also not infrequently honor people who independently do it against people their society has decided are "the enemy".
There are some mental illnesses that are associated with violence, but much violence is not associated with mental illness.
"absolutely no evidence"... except the random crazy violent act against innocent strangers that made international headlines.
It's a bit odd to pretend those are the actions of a well-adjusted, sane person.
Most violent people do not have mental illness.
You cause harm to people with mental illness when you ignorantly link violence to mental illness.
>Outlaw encryption, and only outlaws will use encryption.
Well, if only outlaws used encryption and you sent a non-plaintext message then the police would knock on your door at 04:00 the next morning. That's what happens in Morocco if you like something related to terrorism on Facebook. A bit extreme, yes, but that's how some countries do it.
Sure, technically sophisticated enemies know not to like things on Facebook and know to use steganography, but most don't know and those that learn it through terrorist networks have a long vulnerable period where they are malicious but before they become sophisticated.
Hmm, this definitely brings up an interesting discussion I don't think HN has had since the Apple + San Bernardino fiasco, at least not in a similar vein.
Obviously privacy is something that HN holds very close to its heart. But I'm interested in what people here have to say about privacy features being used by terrible people to do terrible things.
And I want to share something that I think is one of the best arguments for privacy, complete privacy. I do agree with this completely: https://moxie.org/blog/we-should-all-have-something-to-hide/
The thing is, banning encryption to fight terrorism makes as much difference as banning knives to fight murder. Encryption is simple enough (in theory at least) and prevalent enough that banning it will only stop legitimate usage (e.g. personal privacy) without having any real impact on those who want to break the law. Modern steganography techniques make it easy for enterprising individuals to tunnel encrypted data over unencrypted channels; there is simply too much natural randomness in real-world data to exploit as cover (a toy sketch follows below).
Regardless of what delusional politicians want, encryption is here to stay. It's just a matter of how much people are willing to give up to feel safe.
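As a rough illustration of how little machinery the steganography point above needs, here is a minimal sketch assuming the Pillow imaging library; the function names are just for illustration, and the scheme is trivially detectable by statistical analysis, but the payload it carries could just as well be ciphertext riding inside an innocuous, "unencrypted" picture message:

    # Toy least-significant-bit steganography over the red channel.
    from PIL import Image

    def embed(cover: Image.Image, payload: bytes) -> Image.Image:
        """Hide a length-prefixed payload in the LSB of the red channel."""
        data = len(payload).to_bytes(4, "big") + payload
        bits = [(byte >> i) & 1 for byte in data for i in range(8)]   # LSB-first per byte
        out = cover.copy()
        pixels = out.load()
        w, h = out.size
        assert len(bits) <= w * h, "cover image too small for payload"
        for idx, bit in enumerate(bits):
            x, y = idx % w, idx // w
            r, g, b = pixels[x, y]
            pixels[x, y] = ((r & ~1) | bit, g, b)
        return out

    def extract(stego: Image.Image) -> bytes:
        pixels = stego.load()
        w, _ = stego.size

        def read_bytes(count: int, bit_offset: int) -> bytes:
            out = bytearray()
            for byte_idx in range(count):
                value = 0
                for bit_idx in range(8):
                    idx = bit_offset + byte_idx * 8 + bit_idx
                    x, y = idx % w, idx // w
                    value |= (pixels[x, y][0] & 1) << bit_idx
                out.append(value)
            return bytes(out)

        length = int.from_bytes(read_bytes(4, 0), "big")
        return read_bytes(length, 32)          # payload starts after the 32-bit length prefix

    cover = Image.new("RGB", (200, 200), "white")
    stego = embed(cover, b"meet at noon")      # the payload would normally be ciphertext
    assert extract(stego) == b"meet at noon"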
Terrorism, and crime in general is a nuisance that we have to live with. You can have a society with no crime. All you need is a super repressive totalitarian state, total transparency with citizens reporting on each others, state surveillance everywhere. It will work. But first I don't want to live in such a state. And second these totalitarian states slide invariably toward corruption and state crimes.
So we have to live with some level of crime. It doesn't mean we shouldn't be tough on criminals, but we have to accept that it is not possible in a free society to reach zero criminality.
I think the paradox is that people are reasonably relaxed with some level of criminality but are absolutely intolerant to any form of terrorism. And this intolerance is a new phenomenon. Terrorism isn't new. There isn't more terrorism in Europe than 20 or 40 years ago. In fact a few months ago I compiled the number of incidents and victims from a wikipedia page [1]:
https://zbpublic.blob.core.windows.net/public/Deads.png
https://zbpublic.blob.core.windows.net/public/Injured.png
https://zbpublic.blob.core.windows.net/public/Incidents.png
As you can tell, the 70s and 80s were rather more brutal, with far-left, IRA and Palestinian terrorism. And our democracies resisted much better the temptation to introduce more surveillance.
Now why have we become intolerant to terrorism? There are literally tens of thousands of knife attacks every year just in London. Most don't even make it to the local news. Why would this particular incident be treated as a state affair? Terrorism is the buzz of a mosquito. In itself pretty much harmless. But most people will not sleep in a room where they can hear the buzz. I don't have a good explanation. The only thing I can think of is the 24h news cycle where the media will make a big deal of anything that can push the audience up. But that doesn't explain everything. They do the same with plane crashes, but still repeat over and over that though spectacular, plane crashes are extremely rare and flying is extremely safe. Whereas when there is a terrorist attack, the message is "this could happen to YOU!"
> "... total transparency with citizens reporting on each others, state surveillance everywhere."
Like this? https://www.youtube.com/watch?v=RIuf1V1FhpY
(Tom Scott's "Oversight" from 2013)
That gave me actual shivers. I can't believe the general populace (especially in the USA and UK) fall for this garbage. The bread and circuses must be really good. Ok maybe just circuses, I think Trump wants to cut SNAP.
I completely agree. You are one of the very few!
I have to ask, what's up with the domain name? Is that some sort of public windows share folder?
It's Azure Blob Storage - Microsoft's online file storage service.
So we have to live with some level of crime. It doesn't mean we shouldn't be tough on criminals, but we have to accept that it is not possible in a free society to reach zero criminality.
As the saying goes, "insecurity is freedom." I've always found it somewhat disturbing that people have welcomed the walled-garden ecosystems popular today, which are essentially the cyber-equivalent.
> ... plane crashes ...
One difference is that airlines advertise in mass media, terrorist organizations don't.
> total transparency
Baxter and Clarke: The Light of Other Days: https://en.wikipedia.org/wiki/The_Light_of_Other_Days
People should be careful what they wish for.
In some sort of abstract theoretical vacuum I might be persuaded to support some form of limited targeted warrant-based intercept.
However, this is the real world, and I'd want the serious trust issues fixed first. Surveillance of journalists. Invasion of privacy by journalists with the complicity of corrupt police. Surveillance of peaceful left-wing and environmentalist groups.
Let's not be ignorant of history either, of secret prisons and unaccountable courts. Let Martin McGuinness' death remind us of H block and the Maze. Who here is old enough to remember the bizarre compromise where Gerry Adams appeared on TV with his words read by an actor, because he was deemed too dangerous to listen to?
Then there is the business of foreign intelligence agencies. If some communication isn't completely private, can it be compromised by the Russians? Remember the US election?
We need to have a conversation about radicalisation, but much of it happens in public or verbally, and it's not at all limited to Islamic fundamentalism. It needs to include the far-right too.
We all hope for a perfect world. But the problem is none of our perfect worlds are the same, so everyone of us end up living in our imperfect world.
The expectations you have are something I'd agree with too, but many others don't. So how do you reconcile this? Again, the reconciliation process you come up with is perfect according to you, but most likely not according to others.
No one can ever win I guess.
I strongly believe that tools are ethically neutral.
A hammer, a knife or a government can be pretty useful, or pretty violent - depending on how you use it. This alone does not imply that a hammer, a knife or a government should not exist or be banned.
edit: words
> I strongly believe that tools are ethically neutral.
Well, that's a very evasive argument in my opinion. It's absolutely true that objects are neutral, but you can't make a blanket argument with topics like these. That argument has been made lots of times by many (including me).
But it eventually breaks down. You can't give a child a gun and when the kid shoots someone say it was the kid's fault. Whose fault is it? I'm guessing you're going to say the adult responsible for putting the gun within the reach of a child. You're still taking away an object from the kid. In this case it was a kid who didn't know better.
Now this isn't a narrow argument. This becomes interesting when you get to powerful things, like, say, nuclear weapons. They aren't inherently evil either. But if you look around, the UN is trying to ban them[1]. Shouldn't the UN ban them?
What I'm trying to get at is that you can't always put the blame on people. Just like you can't blame a child for not knowing better, you can't blame a person for knowing better (although people do). Sometimes you just have to take the gun (or nuclear weapons or encryption) away.
I always thought that putting the blame on actual agents instead of stuff is the direct opposite of being evasive.
You rightly said that I will answer that it was the adult's fault. The bad act done with the weapon was giving it to a child, not the child's shooting someone. It gets clearer if the adult hands the weapon to a monkey or a randomized shooting machine. All three scenarios change nothing in regard to the moral responsibility of the adult.
I do not know whether the UN should ban nuclear weapons, but if someone uses a nuclear weapon and we would check out who might be morally responsible for the attack, I'd point at the attacker.
> I always thought that putting the blame on actual agents instead of stuff is the direct opposite of being evasive.
Well I meant evasive argument as in statements which directly avoid the question. Because the thing is, we don't live in a perfect world. So putting the blame on agents doesn't solve the problem. That's why I brought up that sometimes things have to be taken away from people. The question is when and how.
> The bad act done with the weapon was giving it to a child
I didn't say the adult gave the weapon. I just said the adult left it within the reach of the child. One is direct, the other is indirect. I wouldn't even say indirect, but lets go with it.
> I'd point at the attacker
It gets blurry deciding who the attacker is, depending on whose side you are on. Things aren't always so clear.
In a perfect world, government would be good, capable not oppressive and there would be zero reason for anyone not to trust them with private data - but government would not want it, since they would not be immoderate and power hungry but perfect.
But we both believe that the world is not perfect. This is why we need to talk about moral responsibility in the first place. Blurry definitions might only make us even more humble while proposing new, disruptive power centralizations like an encryption ban.
Let's turn the analogy around to the other side.
With their utterly poor understanding of encryption and the harms of compromising it, people in governments who want to do so might as well be monkeys.
By providing backdoored encryption products, we would as you put it be handing out randomized shooting machines to monkeys.
on the other hand a huge nuclear arsenal maintained a long period of peace during a troubled time, so there's that
> long period of peace
What if the huge nuclear arsenal was in the wrong hands? That raises the question: what is 'wrong'?
Your definition of peace was probably characterized as oppression/dominance by the people who did want to revolt. You don't account for the extremely subjective nature of things, especially when it comes to nationalistic actions of people.
It arguably has so far, but let's not get too smug about it. The end of that story has not yet been written.
>or a government should not exist or be banned.
I was with you up until 'government'. I regard states as exercising unjust authority over people and defenders of private property which is why I'm an anarcho-Communist. The way in which the modern world is divided up means that one must be a subject of some state, which I believe makes there no way to provide proper consent to be governed.
I believe the same. But what if these tools have too great an impact if everything goes south? It happened with dynamite before; now look how jittery we are about nuclear weapons.
You can flip this for privacy too. The more governments can spy on everyone, well sure we may catch more terrorists and terrorism might even decrease. But at what cost? Totalitarianism? Shudder.
It's interesting when you extend that discussion to guns, in the UK guns are quite difficult to get hold of, I wonder what the incident at Westminster would have looked like if they were as available as they are in the US.
As always it's a trade off: some people lose the right to arm themselves at home, but that means other people may not lose their lives to a shooting.
I had this thought as well. However, the comparison to knives seems better. It's pretty hard to make your own gun. It's pretty straightforward to make your own encryption app on top of existing OSS libraries. Like it's pretty easy to make your own makeshift, large knife. Maybe I just see it like that because I'm a software developer though.
It is not. Smash "home made gun" into youtube sometime. $30 at any hardware store and you have a functioning shotgun.
That would mean the 1st cop he stabbed, who was unarmed, might have stopped it. It's dumb to play what-if in these situations.
There is no other way to use a gun than shooting with it. (Useful! Of course you could use it like a hammer, but... Srsly.)
Its whole purpose is to hurt someone; I think that's a point to acknowledge first.
You can go hunting with it or shoot for fun or make yourself feel safer. What can I do with encryption other than encrypt messages I want to hide?
//Playing devil's advocate
Well, there are very good reasons to hide messages.
First of all, I don't want any company in transit, from my ISP, to the message broker, to the receiver's ISP, to mine my data and use/sell it for profit.
Secondly, there are a lot of messages which are not illegal, but can be personally embarrassing if they were to become public. Think of sexually-tinted messages, psychological help, a kid who lives in a very conservative community and has doubts about their religion, discussions about a candidate for a job position, etc.
The problem with backdoors is that the question is not if they will be exploited, but when. And this is all assuming that the organization with backdoor access is not of ill will.
What about guns, tanks or hydrogen bombs?
tools designed to directly cause destruction are not similar to tools designed to provide privacy.
However they were treated as such by US export regulations.
How does this counter my point?
>> But I'm interested in what people here have to say about privacy features being used by terrible people to do terrible things.
Giving up such a valuable right to possibly stop attacks which, in the grand scheme of things actually harm very few people, is idiotic. Terrorism is obviously awful but the number of people in the UK actually affected by it is far, far too small to consider forgoing such an important right. And IMO, once you do that, the terrorists have won.
Take the attack in London last week for example. It doesn't require planning. Anyone could get in a car and mow down a lot of people in seconds. It doesn't need discussion on WhatsApp. It doesn't require purchase of weapons. It doesn't require you to do anything shady that could give you away more than a second before you do it. No amount of intelligence gathering could figure it out. You could force every citizen to wear a mic and body cam and you still wouldn't be able to stop it.
How about tackling the actual problem - terrorists seem to have resorted to using cars and trucks to kill people. Let's put up some metal/concrete bollards along the edge of pavements that have no 'escape route', such as the one on Westminster Bridge.
> How about tackling the actual problem - terrorists seem to have resorted to using cars and trucks to kill people. Let's put up some metal/concrete bollards along the edge of pavements that have no 'escape route', such as the one on Westminster Bridge.
Nice post. I agree almost entirely with you, but you can't put a bollard everywhere, and even if you could, bad people would find a way around or between the bollards, or simply another way to hurt people. It would be like playing a futile game of whack-a-mole.
At the end of the day, there are people who are so mean-spirited that they want to hurt innocent people for no reason, and they will find a way to do that no matter what we do. Honestly I think a lot of it is mental health more than anything we can really protect against.
It's not possible to wrap everyone in cotton wool, and in order to have some freedom we risk a small percentage of harm. There is no way around that. Without that freedom, there's also the IMO much larger risk of harm from the authorities themselves.
There's no way around it, living in the world involves some risk. It's unrealistic to not accept that risk and fantasize that all outcomes are preventable.
Like another comment mentioned, there are literally tens of thousands of stabbings in the UK every year. Why are we even talking about removing fundamental freedoms (the right to privacy) in order to probably not prevent a few unfortunate deaths per year? The payoff is so small and the cost is much too great.
>> you can't put a bollard everywhere
I agree, this wasn't my suggestion. I was thinking more of areas like a bridge where if a car does start speeding along the pavement, even if you are further along and see it, there is nowhere for you to go. Your choices are stay put (and get hit), run into the road and probably get hit by traffic, or jump off the bridge (dangerous). Bollards along pavements like that would be useful. Even just one at each end and one in the middle would cut the damage by at least half. My greater point though is that something like bollards tackles the problem directly and is much more effective than SIGINT for these types of attacks, yet nobody is talking about it.
The real problem is not the fact that we have places where pedestrians can be hit. It's the fact that there are people who would want to hurt innocent people for no reason. TBH I don't really understand why these crimes should be treated in any special way other than any other murder/assault crime. What difference does it make what ideology they are claimed to identify with? I am not convinced that the volume of such crimes (assault with relevant ideology) is enough to warrant putting them in a special category in the first place.
I do think that the bollard solution is a bit unnecessary though, as there'll always be a place where pedestrians would be vulnerable, and many other ways that people could be hurt besides.
At least it's something that could have an effect though. Snooping on emails would have almost no effect, and I hope everyone knows that. Snowden/William Binney, etc, should have made it patently obvious to everyone that there is no shortage of data flowing in, and I'm sure any successful preventative efforts would have been trumpeted to the rooftops with the way those agencies love to pat themselves on the back to justify bigger budgets.
More crap data, from millions of law-abiding innocent citizens, is not going to make it any easier for them to separate out the signal from the massive amount of noise.
The fact that the media is not presenting real solutions to either of the actual problems - people being run down by cars or trucks, and people wanting to hurt other innocent people - or even questioning the imaginary solutions, makes me strongly suspect there are ulterior motives at play.
To be quite blunt, this is such a blatant and transparent power grab by the authorities that I can't help but think that if the average person cannot see that our media is not interested in presenting the true story, with real facts that make sense, and our government representatives aren't addressing any of the real issues and just trying to remove our freedoms at every turn in order to not even solve imagined problems, then our society is already doomed, and not at the hands of terrorists.
I personally think if there is a warrant the company should provide the information. In the same way, my house is private but if there is any kind of problem the police have access to it.
Although we should have mechanisms to protect against mass, indiscriminate surveillance.
So, no matter how bad the government, and how ridiculous its laws (let's say they ban music, and dole out harsh punishments to people who dare listen to music), then we as technology providers should enable them to catch such people and punish them for their "crimes"?
Saudi Arabia punishes rape victims. We should help with that?
China punishes people who try to air grievances about government abuse and corruption. Again, we should enable them to be more effective in their invasive prying into those individuals than they already are?
In North Korea, your entire family can be punished if you dare be disobedient to the government.
In the US, we recently elected Donald Trump.
Etc. etc... why do you think governments can be trusted with this power?
On top of all that, once the technical means exist, they will also be discovered, cracked, and used by fraudsters, extortionists, and anyone else who can figure out a way to abuse the information.
I can ask the same. Why should a suspected rapist, assassin, drug dealer, corrupt politician, be free or unjustly imprisoned, because of lack of evidence? Do you think we should help criminals?
Nowadays there are other, more effective ways than encrypted WhatsApp (secrecy) to fight bad governments and ridiculous laws.
North Korea and Saudi Arabia are obviously very extreme examples. Internet encryption must be the least of their worries.
Governments with working justice systems should be trusted with power to provide security.
There should be no globally accessible technical means or backdoors. Information should be provided on a per-request basis, based on a warrant for that suspect. And the data stored should follow data protection laws.
If you think your country's justice system is not working properly, there are ways to fight that. And there are probably people and institutions already doing so.
I like how you leap from "suspected" to "Do you think we should help criminals?"
North Korea and Saudi Arabia and China and the UK would claim that their justice systems are working just fine.
As would my local sheriff jurisdiction where they can't even manage to hire anybody who bothers to do so much as use turn signals. If they can't even manage to do that tiny thing, greatest country on earth or no, I don't trust them with the temptations of the kind of power you're talking about.
>If you think you country justice system is not working properly there are ways to fight that.
LOL! Good luck! Strictly speaking, you are correct that "there are ways to fight that" but the consequences are brutal! It's a pretty big ask for most people. And that possibility will be eroded and, more likely, wholly negated by such systems.
It was the same leap you made with > "Saudi Arabia punishes rape victims. We should help with that?"
There is no temptation if they need a warrant issued by a judge authorizing the officer to request from the company information related to that suspect. Upon which the company might charge administrative costs to handle the request.
I mean a suspect in a crime that has been committed, or someone who has had a complaint filed against them. As stated before, I am against mass surveillance and crime "prediction". Even terrorism is a small problem compared with economic/political corruption.
Otherwise governments might block applications. Or secretly spy on us, with the company's help if needed. All done without any supervision.
>Why should a suspected rapist, assassin, drug dealer, corrupt politician, be free or unjustly imprisoned, because of lack of evidence?
"suspected"
That includes you, so... be careful.
If the definition of a suspect is wrong, it should be changed. Of course I am not sure how.
But I cannot feel safe just because one application company is saying everything is encrypted.
I am just afraid of the day security is so good that there are no more corruption leaks, and the ruling classes can do whatever they want with total privacy.
My unfair imprisonment is less important than the fair imprisonment of a corrupt politician. I think...
It's just reverse psychology.
They have the means to break, degrade or bypass the encryption and they emit statements like these so people remain confident that they're not being spied on.
This routinely happens after leaks reveal that certain types of traffic are being targeted. In this particular case, Wikileaks.
In the past, after all the PRISM collusion was revealed, all the PRISM partners started their PR campaigns showing their "commitment to privacy", along with the soap opera of law enforcement agencies claiming they couldn't decrypt devices. In reality they have many tricks they have used for years now, like setting up a fake cell antenna or impersonating a phone carrier to take over a device.
Precisely. I was around when only tinfoil-hatters and nutjobs talked about Echelon, getting ridiculed on the internet and the outernet, only to be proven correct decades later.
people have very short memory, it seems.
How can it be acceptable to say shit like this when you have such a position within the government?
Because we've accepted it. We know the government is watching us 24/7 and no one cares. This is the new norm.
Stories like this fill me with a slight bit of hope that encryption works
I wonder why they aren't busy explaining how this could happen with all that surveillance already in place. I mean, no camera jumped down off its pole to stop the car...
She did admit during the interview that these incidents couldn't be completely stopped.
The point is, that they can't be stopped at all.
There is a way to suspect that someone may act at some point, which is what the UK security apparatus is aiming for, but this kind of profiling will end up being a psychological analysis of whole groups of the population. The results won't be great for any of us.
Britain is well known around the world for not caring about privacy/surveillance.
And the most CCTV-ed society.
Seriously, have you seen any US news lately? You Brits look like geniuses compared to what we have.
For two decades I've been waiting for popular support for a complete or at least Clipper-chip-style encryption ban in the "free world". It always sat at the far end of the spectrum of concerns, directly opposite questions like IV/nonce choice, PRNG initialization flaws, and RSA attack vectors. I have great fear for the freedom and living standard of my kids when I read these top-level news pieces. We face a real test, and we will have to argue against hatred, fear and terrorism. Let's just hope our leaders have no-nonsense advisors as well as those that inspire such news.
This is complete nonsense. Such a move would simply encourage the "bad guys" to find other means of secure communication while exposing everyone else.
When you take away our freedom in order to stop terrorism, then the terrorists win. This is one guy in an estate car. Amber Rudd is not a democrat if she really believes this
Best ban everything that can be misused by terrorists... cars, knives, encrypted messaging.
That's the blind spot, isn't it? What's the one common thing about recent terrorist acts? Vehicles used as weapons. Yet nobody's calling for a ban on the use of vehicles.
Even one level up... Since when is a crazy individual with a knife in a car a terrorist? He's just a crazy idiot and we should call him so. If this is modern day terrorism we have nothing to be worried about.
Only because it is not yet feasible. Once autonomous vehicles become common there will be pressure to make driving a car manually difficult, expensive, and ultimately a crime.
Politicians.
Reading all of the comments I am deeply concerned. Everyone who is opposed to this is doing 'their side' a disservice.
Comments are about how stupid, or ill informed the Home Secretary and advisors are, or that they are being blackmailed by the intelligence services. Seriously? These kinds of comments are not going to get the broader public to support your ideals.
I think you misunderstand why she (and law enforcement) believe that they should have access to the messages. If the terrorist called someone, they can get a warrant for the metadata and see who he called and whether it is relevant to the investigation. If the terrorist sent an SMS, they can get a warrant for it. However, if the terrorist sends a WhatsApp message, what can they get? Why should a WhatsApp message be treated differently from an SMS?
That is what we as the tech community need to explain, why backdoors, weak encryption, and escrow are not a solution.
I value my privacy. I want my messages to be secure. But if the tech community keep acting like most of the comments on this, we will lose.
So when they discover that he wrote and sent actual letters, will they then demand access to open our mail?
Also: Will breaking encryption stop a man grabbing a knife and jumping into his car? No.
> So when they discover that he wrote and sent actual letters, will they then demand access to open our mail?
Except, like, with a warrant, they can already open our mail. That's a pre-existing power.
The difference is under the current legislation a warrant doesn't get them the ability to read WhatsApp conversations; that's the point of contention here, and the difference with the above is perceived to be the problem.
If the govt. was to force WhatsApp's hand, I'm sure we'd see democracy in action if they prevented everyone using the app for 24 hours, replacing the facility to message with a note telling users to contact their local MP (with clickable email / phone numbers - and maybe links to the ORG).
Or even just put a banner at the top of the app.
And then suddenly Facebook find themselves legally responsible for the things published on their platform, perhaps. They're not going to stand up to the government, they're in too weak a position.
By the same logic, if we ban freedom of speech, terrorists won't be able to speak with each other.
Seriously, who voted for these idiots?
Well, she could start by publishing all her own emails - how can we know she isn't a covert terrorist?
Even though the article mentions specifically about UK, there are many in the US who hold the same belief. If you want to ban encryption because terrorists might misuse it, what about Guns? Then it is a matter of "freedom".
Dear minister, can I whisper in my wife's ear while having sex or do I have to get permission from government?
can I talk to myself using my inner monologue please ?
only if it involves something deemed legal by the Thought Police and the Party.
Smartness should be banned. It's too much of a problem! Everyday disruption disruption disruption...
Evolution should be banned too and all those books about biology or astronomy. God made it all!
I'm morbidly curious how many terrorist attacks we are away from actual laws that will attempt to outlaw encryption as used by WhatsApp (even if it wouldn't make sense to do that). Resistance against such measures outside of the tech scene would probably be low. The "I've got nothing to hide" mentality is actually quite widespread among the population, so I don't even think it would be a risky move politically.
I assume someone has already brought this up, but it is late and I can't read through 300 comments. From what I recall and have read, this individual has been on the radar of the security services since 2010 and so was a known potential threat. With a history of violence and criminal behavior. Yet effective monitoring of such individuals WAS NOT DONE and apparently isn't. Instead, there is this post-hoc demand that all of the public must give up their right to privacy because the idea of 'pre-crime' prevention is actually viable...
complete and utter bollocks.
So a blanket violation of law-abiding citizens' rights is more important than actually keeping closer and more effective tabs on known threats. Pedophiles are viewed with less disdain than terrorists, it seems. And the threat of terrorism is trumpeted to the heavens while pedophilia is apparently more rampant in UK society...
It is quite illogical that law abiding people suddenly snap and decide to drive their cars into groups of tourists. How prevalent are the actual potential terrorists - i.e. those with a history of violence, trouble with the law, radicalization, etc? If I knew those stats, then I personally would be better able to judge the claims of the authorities. But I don't have those stats and so the logical assumption is that their claims are exaggerated shite designed to drum up fear and etc etc. Meanwhile idiotic claims that all encryption must be banned or tapped, even for law abiding businesses (does no one remember Cameron's proposals?) are floated... nothing but Band-aids all the way down.
I could move back to America, but at this point, that is like jumping out of the frying pan. I really need to learn a second language, preferably Mongolian.
How difficult would it be for these so-called terrorists to develop their own end-to-end encrypted app? Perhaps something masquerading as something common, like traffic on any port under 1000? It is feasible that the elimination of whatsapp/telegram/signal encryption would just lead to far more complicated encryption systems developed internally by these organisations.
Not that difficult, but it'd almost certainly have major security flaws that GCHQ and the NSA could exploit, because homebrew end-to-end encryption tends to. The bigger problem is that merely using an Isis-branded chat application is the equivalent of sending a text message to the security services saying "hello, I'm a potential terrorist, please pay special attention to my every movement". Even the Tor project hasn't actually managed to reliably disguise their protocol as something else, and they have a bunch of smart people working on it, with the advantage that countries tend to reveal the fact it's been detected through blocking rather than just quietly monitoring the people using it and rounding them up the moment they try something.
I generally agree, but intelligence groups have to first find the signal they want to monitor. Modern steganography techniques coupled with the free randomness you get from the physical world gives you much to work with. And that's assuming common tech will be used.
Also, designing a secure general-purpose messaging system is much harder than designing a system tailored for a specific use case.
Banning encryption by law is like demanding, loudly, that people not talk behind your back. Some will listen, and some will not. Only legitimate users and use cases will suffer.
Well, ISPs could implement a whitelist of communication methods. Maybe all content would have to be signed by a whitelist of apps before it's allowed through the network. Images would have to be signed by the camera app, so no steganography would be possible.
From a business perspective it would be like going back to before the internet, but many of the services we associate with the internet like Facebook, Netflix etc. would survive.
What about using latency or throughput as a signal? What about tunneling data over seemingly normal pictures taken by said camera app? What about switching the "User Available online" indicators at a seemingly natural rate? What about all three at the same time?
I assure you, steganography will always be possible. The only thing a ban on encryption would do is badly hurt society, personal privacy, and those who want to follow the law.
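To make that concrete, here's a minimal sketch of least-significant-bit image steganography - not something any real app is known to do, just an illustration, assuming the Pillow library is installed and with made-up filenames and message. The carrier image looks completely normal to anyone who inspects it:

    # Hide a short message in the lowest bit of each pixel's red channel.
    # Illustrative sketch only; assumes Pillow (PIL) is available.
    from PIL import Image

    def hide(src_path, dst_path, message):
        img = Image.open(src_path).convert("RGB")
        payload = message.encode() + b"\x00"            # NUL marks the end
        bits = [int(b) for byte in payload for b in f"{byte:08b}"]
        pixels = list(img.getdata())
        if len(bits) > len(pixels):
            raise ValueError("message too long for this image")
        out = []
        for i, (r, g, b) in enumerate(pixels):
            if i < len(bits):
                r = (r & ~1) | bits[i]                  # tweak only the lowest bit
            out.append((r, g, b))
        img.putdata(out)
        img.save(dst_path, "PNG")                       # lossless, so the bits survive

    def reveal(path):
        img = Image.open(path).convert("RGB")
        bits = [r & 1 for (r, g, b) in img.getdata()]
        data = bytearray()
        for i in range(0, len(bits), 8):
            byte = int("".join(map(str, bits[i:i + 8])), 2)
            if byte == 0:                               # hit the NUL terminator
                break
            data.append(byte)
        return data.decode()

    # Example usage (placeholder filenames):
    #   hide("holiday.png", "holiday_hidden.png", "meet at noon")
    #   reveal("holiday_hidden.png")   # -> "meet at noon"

And that's the crudest possible scheme; the timing and presence-indicator channels mentioned above are far harder to even notice.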
Sure, it would be a bit of a whack-a-mole, but in the end sending concealed messages would become extremely difficult.
It's technically impossible to do perfectly, but as we all know, perfect is the enemy of good enough ;)
You just described a rather common programming assignment at most schools..
How about they look into their business partner Saudi Arabia first? It sounds as if they let that country poison the minds of mentally ill people in the hope that the attacks they carry out can be used as an excuse to expand control over society. Using this tragedy to do just that is simply disgusting and puts into question what the government is actually doing.
How would that have prevented anything? As if they'd have responded within 2 minutes to some guy sending weird messages.
The message he sent right before the attack will likely not have been his first communication about it. There's been a pattern of Isis operatives abroad guiding and supporting terrorists in detail via end-to-end encrypted messaging for weeks or months, right up until the moment they attack: https://www.nytimes.com/2017/02/04/world/asia/isis-messaging... Presumably the British police are assuming that this is like those previous attacks, but they haven't managed to obtain the actual message contents after the fact this time around for some reason.
Come on. If this was really ISIS, then we plainly have nothing to fear.
My bet is that he's just a random crazy, but of course these days it suits the political narrative to brand such people 'terrorists' to stoke public fear
Stop tossing out mental health as a reason. There's nothing to say he had mental illness, and you cause harm to people with mental illness when you incorrectly link violence to mental illness.
I used the word 'crazy' rather than mentioning mental health issues. 'Crazy' does not just refer to people with mental health issues; it refers to irrational acts.
To expand on my point, some of these small-scale 'terrorist' attacks show very little evidence of being coherently planned, and it's difficult to believe that an organised terrorist group is behind them. It seems more likely that some of these acts were performed by people acting alone, or vulnerable people provoked to it - and yes, some of these people may well be mentally ill; that hardly means they cannot be capable of violence.
Suppose they intercept WhatsApp. They need to react to each suspicious message within the minute, and be perfectly efficient at catching terrorists while not sweeping up innocents.
Of course they would need to intercept all other communication services, including home-made ones.
Some people will simply refuse to let all and sundry (we have no idea who reads and acts on intercepted emails) read their private emails, and they will therefore turn to steganography or one-time pads with a seemingly ambiguous pre-arranged code. Good luck reading the latter, or even realising it has a hidden message.
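For anyone unfamiliar with why the one-time pad case is hopeless for an interceptor, here's a minimal sketch (the messages and key are invented). Because the key is random, secret, and as long as the message, the ciphertext is equally consistent with every possible plaintext of that length:

    # One-time pad sketch: XOR with a truly random key as long as the message.
    import secrets

    def otp(data: bytes, key: bytes) -> bytes:
        assert len(key) == len(data)
        return bytes(a ^ b for a, b in zip(data, key))

    message = b"the package arrives tuesday"
    key = secrets.token_bytes(len(message))       # random, secret, never reused
    ciphertext = otp(message, key)

    # Decryption is the same XOR with the same key.
    assert otp(ciphertext, key) == message

    # A different key of the right length "decrypts" to some other plausible text,
    # so the ciphertext alone tells an eavesdropper nothing at all.
    decoy = b"happy birthday to you uncle"        # same length as the real message
    fake_key = otp(ciphertext, decoy)
    assert otp(ciphertext, fake_key) == decoy

The catch is key distribution - both parties need the same random key in advance - which is exactly where the pre-arranged ambiguous code comes in.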
Isn't it weird that drastically restrictive, all-encompassing rules are hastily pushed after attacks? Blanket decryption of messages and other privacy-suppressing rules will turn intelligence agencies into superpowers with too much control at a much reduced cost (fewer messy assassinations or physical threats needed).
That's the perfect time to hastily push them through!
So a statement by the Met Police that he acted alone is being utterly ignored. Ironically there is no mention of banning 4x4 cars, and that frankly puts this whole situation into perspective - government ignorance of encryption, once again.
If we could bundle a 4WD car ban in with the new legislation, I might actually change my mind about the massive loss of basic freedoms. :-)
One reason it's good that governments cannot force WhatsApp to disable end-to-end encryption is that different governments have different definitions of nefarious activity. While the British Government could arguably use a backdoor to stop terrorist attacks, what would stop Pakistan or Saudi Arabia from using the same back door to enforce blasphemy laws? The issue is the same: should a private company help law enforcement by disabling encryption?
It's nice to know WhatsApp can help people break the law in places where the law itself is immoral.
What if the guy read a book and agreed with it because he was a sad, angry teenager with no life?
Possession of The Anarchist Cookbook has been used as evidence against terrorist suspects in Britain, and also as a crime itself, "Possession of terrorist material". In the latter case, the 17 year old's argument was pretty much what you suggest, and he was cleared.
http://news.bbc.co.uk/2/hi/uk_news/7030096.stm
https://en.wikipedia.org/wiki/The_Anarchist_Cookbook#Legalit...
Then the obvious answer is to ban books, right?
Right?
This is a thing in France.
They have access to library records because of that scenario. They would be able to access online bookstore records. The book may not have been purchased of course, or may have been banned for sale in a country - many illegal copies of books were made in Communist states in the 80s for that reason.
"Home Secretary Amber Rudd told Sky News it was "completely unacceptable" that police and security services had not been able to crack the heavily encrypted service."
This is great news, actually. It means that WhatsApp's encryption works, and stonewalls the efforts of state actors (or at least, hers) to break it.
That said, we don't know if she's lying about this, or not.
They don't need to touch encryption in any way. It's way simpler to subvert the endpoints, as most people use closed-source operating systems such as iOS and Android which offer closed-source applications.
All they need to do is to pressure Apple and Google to keep some backdoors open, which is more than realistic, as Snowden's revelations have shown a couple of years ago.
Even better, why not introduce backdoors at the hardware level?
Shameful way to capitalize on the recent Westminster attack. See Naomi Klein's "The Shock Doctrine" for more.
Leaving aside the fact that what they want isn't actually achievable, what does the UK risk by starting down this road? What consequences could this potentially have for its domestic tech sector?
My intuition says that they stand to lose more than they could possibly gain, but I'm curious to hear a more knowledgeable perspective.
Thought-experimentally: could we scan message databases for the presence of certain phrases, using something like [1], but in a probabilistic manner akin to a Bloom filter? Law enforcement would then be able to flag certain keywords, but with a nonzero (and nontrivial) false-positive rate. That way, repeated flags end up identifying potentially interesting members of society, without proof and with data inadmissible as reliable evidence in a court of law.
Of course, one runs the risk of the existence of false positives being forgotten, TLA/government pressure to reduce the false positive rate, and so on. But I think this is a slightly interesting way to (partially) preserve privacy while satisfying lawmakers who demand that there be some way for them to listen in on (what should ideally be completely private) data. (This is, of course, only possible once one drops the axiom of privacy being an absolute right: I don't personally support doing this at all.)
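For what it's worth, here's a rough sketch of that kind of probabilistic keyword flag (the watched words, messages, and parameters are all invented; this only illustrates the false-positive property, it isn't a proposal). A tiny Bloom-filter-style bit array can only ever answer "maybe contains a watched word", and shrinking the array raises the false-positive rate:

    # Bloom-filter-style keyword flag with deliberate false positives.
    import hashlib

    M = 256          # bits in the filter; smaller M -> more false positives
    K = 3            # hash functions per word

    def _positions(word):
        for i in range(K):
            digest = hashlib.sha256(f"{i}:{word}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % M

    def build_filter(keywords):
        bits = [0] * M
        for w in keywords:
            for pos in _positions(w):
                bits[pos] = 1
        return bits

    def maybe_flagged(bits, message):
        # True means "possibly contains a watched keyword" - never a certainty.
        return any(all(bits[p] for p in _positions(tok))
                   for tok in message.lower().split())

    bits = build_filter(["detonator", "rendezvous"])
    maybe_flagged(bits, "see you at the rendezvous")    # True
    maybe_flagged(bits, "picking up groceries later")   # usually False; false
                                                        # positives occur by design

Note that the filter never stores the message text itself, which is the only part of the idea that preserves any privacy at all.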
Theoretically, I'm sure some crafty solution like that exists, but then that is no longer true end-to-end encryption. It's 'leaky' E2E, which in the eyes of most crypto enthusiasts, is practically worse than open channel, because it gives a false sense of security.
In these digital, always-online times, it's like claiming no one should be allowed to have a private conversation.
I wonder how she felt about private communication during her directorships in off shore tax havens?
In a similar vein, to prevent corruption and bribery we should require Ms. Rudd et al to post all email exchanges (official or otherwise) they engage in publicly, along with their bank statements.
After all, we can't allow corrupt politicians ANYWHERE TO HIDE. ;)
It's going to be a total clusterf*ck when the UK leaves the EU and starts introducing draconian intelligence gathering laws that go further than the EU regulations permit. Think Privacy Shield style problems but much worse...
The relevant discussion is here: https://www.youtube.com/watch?v=8yIPuHsB8q8
I assume that the UK government has been doing these extremely pro-surveillance, anti-encryption, and anti-porn stances because they detect sufficient support from the UK population?
Theresa May is just using the tragic deaths of some innocents to push her own political agenda. Pathetic political games at their worst.
Is it technically feasible to have a back door and still be `end to end` encrypted?
The definition of end-to-end encryption is that the decryption keys are only available to the clients. So your question really becomes: where would the backdoor go?
It is feasible. The backdoor would have to sit at a very low level (not, say, in a sandboxed application), from which basically nothing on the device can hide.
Yes, this was revealed by the recent agency leaks. Cracking end-to-end encryption is still extremely difficult (currently impossible?). It's much easier to get root on a target's phone and run a keylogger or break into the app. The messages are still end-to-end encrypted, but you can sniff them before they're sent and after they're received, since they're shown to the user in plaintext.
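To illustrate (purely conceptual; every name here is invented, and the "encryption" is a trivial stand-in): an endpoint compromise doesn't touch the cryptography at all, it just reads the plaintext before the crypto ever runs.

    # Conceptual sketch: the crypto stays sound, the endpoint leaks the plaintext.
    import secrets

    captured = []                             # what the compromised endpoint sees

    def e2e_encrypt(plaintext, key):
        # stand-in for real end-to-end crypto; the point is it's never broken
        return bytes(p ^ k for p, k in zip(plaintext, key))

    def send_message(plaintext, key):
        captured.append(plaintext)            # plaintext leaks here, pre-encryption
        return e2e_encrypt(plaintext, key)    # the wire still carries ciphertext

    key = secrets.token_bytes(16)
    wire = send_message(b"secret plan, 9am", key)
    # `wire` looks like noise to anyone tapping the network,
    # but `captured` already holds the plaintext.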
But then the side channel (the pre-encryption info) also has to be sent somewhere, using different encryption? Otherwise it's just as broken.
The recent WikiLeaks documents from the CIA say that yes, it is possible, but instead of a blanket backdoor, targets have to be specifically compromised. Whether that's the current state of affairs or not, I don't know.
Yes, but the back door must be placed on either `end`, so an eavesdropper needs to intercept communication before encryption or after decryption.
Yes, you can encrypt messages yourself, and that way it doesn't matter if the means of communication is insecure.
First global warming deniers then mathematics deniers. Where do we go from here?
Messaging services without encryption are unacceptable - TRUTH
> "That is my view - it is completely unacceptable, there should be no place for terrorists to hide."
I am sure a ban on encryption would work.
Hey, guys, I just had a great idea. Let's ban bombs, knives, and driving into people. That would fix the terrorism problem. Once it is illegal, no terrorist would dare do it!!!
I'm wondering why Churchill didn't think to ban the Enigma machine. If only England was led by smart people like the British interior minister...
...we've all been there: in the US before 1999, exporting strong crypto carried a $1m penalty and long prison terms, basically classified together with chemical/biological weapons.
It was recognised that this is impossible to enforce, i.e. PGP is out there and in general the tech is just available to anyone. And what about research papers and academia - treat them as criminals? Even if it's banned in the UK, it's available in the rest of the world.
It would be nice if politicians were banned from saying stupid things.
While we're at it, can we also ban crime? That'd be great.
Couldn't we just ban everything illegal outright? Why take so many detours?
While we're at it, let's just ban living. Everyone born could become a criminal, right? </s>
The difference is that it's relatively easy to detect when people use encrypted communication. You can just arrest them. It's not so easy to detect whether someone is planning to drive into people.
Compare like with like. It's plenty easy to detect whether someone is driving. Whether they're planning to drive into people has the same level of difficulty as whether their encrypted communications are about terrorist activity, so shouldn't we ban cars? And buses and planes and anything else with wheels or an engine?
If we banned planes that would certainly have prevented 9/11. What better argument could there be?
They forgot to ban steganography, didn't they?
It's, in my view, more akin to trying to ban mathematics in a world in which mathematics exists. Tools and weapons don't exist independently of us; mathematics does.
> Hey, guys, I just had a great idea. Let's ban bombs, knifes, and driving into people. That would fix the terrorism problem.
They started banning guns almost a century ago. While it probably did reduce the number of gun murders, it certainly didn't make terrorist organisations like the IRA any less effective, and it probably made non-gun crime worse.
Possessing encryption tools, lockpicks, knives, guns &c. is a fundamental human right of free men.
Shouldn't possessing surveillance tools be a fundamental human right in this hypothetical scenario?
No, because we also have the right to privacy.
I beg to differ. My (and everyone else's) right to have a copy of tcpdump doesn't violate your right to privacy.
Sure, but how do you feel about a person's right over their personal information being systematically collected by their government, several large corporate entities, and a dozen or so international spy agencies? What about when this information is used to economically bind them into submission?
Ah, but that's about how the tools are used, not what they are and who has them.
I believe, any entity, from a mere individual to a government agency or multinational corporation should be free to possess any software tools they may desire. How they use those is another matter.
I do "spy" on my own traffic on my own network running on my own hardware on my own premises. I've had some audits for possible malware/spyware, did reverse engineering protocol analysis, etc - and believe everyone should be able to do that. I shouldn't be able to do this on someone's else traffic without their informed consent - nor technically, neither legally (I believe, both of those aspects are important).
As for your question - I don't like this, of course. I don't think anyone well-informed and in their right mind does.
This doesn't extend well to other domains.
We're talking about what are essentially cyberweapons and surveillance tools.
Why can I not own a nuclear warhead, as long as I promise not to do anything with it?
Why can I not put cameras and 3D radio imaging equipment up across the street on a small private plot of land and spy on you and your children's home without being visited by the police at your request?
If you don't have a problem with how those two cases are legislated then you might be able to understand how this could relate back to software.
Should you have access to any program you want? I guess so, if you can get it in a licensed manner. If it's FOSS or something similar, that will be easy. But using these tools to collect surveillance data and PII is an entirely other issue because of the potential weaponization of collected data and the harm that can result from it.
I suppose I did not fully understand what the OP was asking, because I do believe you should be able to own these tools and test them on your own equipment, and that the issue is that we need to ensure proper protections against these tools are in place for the average uninformed user.
Well, he wants to ban secret hiding places, not bombs. I am sure that might work.
Come to think of it...
Sorry, he is a she.
I'll be surprised if you don't get a full HN ban for this.
as usual she has no idea.
If I have to choose between end-to-end encryption and security, I will choose security. I don't mind if my WhatsApp chats are scanned by the police's software, if it can reduce terrorism. Of course, we need to make sure it is used for anti-terrorism only.
Update: One way to 'make sure' is that the source code of the monitoring software must be reviewed by independent and trusted software engineers/experts.
PS. Downvoting my post doesn't solve any problem. If you have a better idea, feel free to post it. Thanks.
> Of course, we need to make sure it is used for anti-terrorism only.
See, that's the problem everyone is talking about. The thing is, it turns out you can't. That was the ENTIRE point of the Snowden revelations.
No sane person is okay with terrorism, but at what point are you going to stop relinquishing your rights?
First, texts with Whatsapp. Then your phone calls. Then your bags and notes when you go through airport security. Then bugs in your house. All of these will help curb terrorism. But where will you stop? Will you lose all your private life in the name of law?
One solution in my mind is that the source code of the monitoring software must be reviewed by independent and trusted software engineers/experts.
... and everybody who uses different software not approved by the state will be flagged as a criminal. This will make the job of police and spooks easier; we know there was order and security in East Germany and other countries of the Soviet bloc.
Now consider the cost of such a 'solution'. Free speech gets redefined, and most people get divided into informants, opportunists, naive state suckers and a silent, fragmented opposition. Is that kind of security and police state an acceptable cost for preventing a small number of violent deaths each year?
There are much bigger problems in Western societies than a bunch of lunatics killing a small number of people, but those can't be used so easily to make a power grab.
How do you ensure that what was the reviewed source is what is actually being used? Also, how do you encode in source code who is the correct target to use this against, for now and in the future?
That's a perfect solution actually. But sadly, we aren't there just yet. There are nuances with these things that software can't (yet) pick up.
So humans have to do it till then. We were maybe born too early. But I think it makes things interesting.
That means there are still problems for you and me to solve.
Actually it's a horrible non-solution.
Assuming these experts are perfect and infallible (a bad assumption), then what does it prove?
That only an authorized government agent can have access?
Can you not think of any problem with that whatsoever?
I actually didn't suggest a complete solution. You seem to judge the proposition without any further questions.
I said the monitoring software having access to the data was a solution. But you're probably thinking of a case where there is a master encryption key which we just hand to the government. But have you thought of a solution where we can be sure of the access that the software will have?
Something like an infallible way to ensure that only the software can view the data. Sure, you're quick to dismiss it because it doesn't exist. That's why I said it didn't exist.
There needn't be the centralized means of communication you're thinking of now. It can be public software that people can choose to run.
> Assuming these experts are perfect and infallible
Well, you can have the same skepticism for the end-to-end encrypted software you use. How can you assume that it isn't broken?
>I actually didn't suggest a complete solution.
Nobody is saying you did. You yourself said "that is a perfect solution actually" in response to vinceyuan, who had a one-liner comment about "the source code of the monitoring software must be reviewed by independent and trusted software engineers/experts."
Maybe we are interpreting this in different ways.
How do you envision this "solution" working? It is a bit vaguely specified.
Who is doing the monitoring? What or who is being monitored? For example are we talking about monitoring the authorities to see if their access is done properly? Or are we talking about something / someone monitoring communications, on behalf of the authorities? Not sure what you had in mind. Can you explain how what you called "perfect" might work, were it to be developed at some point in the future?
I'll say up front that I'm skeptical, but let's see if we are even talking about the same thing. As long as you're being super vague, you don't have a solution at all.
And if you're just saying: there's no solution now but maybe one can be developed, fine (I believe you're wrong) but please clarify how you think it might work.
> That means there are still problems for you and me to solve.
This was my last sentence. With which I tried to say that we have to still solve the problem and come up with the solution. My comment "that's a perfect solution" was about the answer "software that can effectively monitor communications with proper privacy" to the question about properly reconciling privacy and security, in a situation where the people are okay with their communications being monitored.
But you are expecting an answer to the question "How will the software work?" from me.
I have no clue as to exactly how it'll work. But since you're so interested, I'll take a stab:
> Who is doing the monitoring?
The software. No humans will ever see raw communications that haven't been flagged. Now, this is obviously the tricky part. This is not a backdoored system with a magic decryption key. What I had in mind was software, possibly built into the communications protocol, which would flag suspicious communications with near-perfect accuracy. This would need a leap of tech in machine learning with NLP.
> What or who is being monitored?
All the communications (through the node) are being monitored.
> For example are we talking about monitoring the authorities to see if their access is done properly?
'They' have no access. Only the software does. How that is done is up to the "engineers/experts" to figure out. This will obviously need a change in communications architecture. When it comes to properly securing the physical part (the servers), I'm sure something can be figured out there.
> As long as you're being super vague, you don't have a solution at all.
See my first line in this comment. I don't have a solution, but I do believe that a solution exists to a problem. They're very different things.
As an analogy, in mathematics, that's similar to me saying the problem is solvable, but you're talking about the actual solution.
And sure, this is a 'perfect' solution where monitoring communications is even a possibility. I don't even support that possibility. The first comment I replied to does, which said:
"If I have to choose one from end-to-end encryption and security, I will choose security. I don't mind my WhatsApp chats are scanned by police's software, if it can reduce terrorism. Of course, we need to make sure it is used for anti-terrorism only."
So in the first place, monitoring is something that will be done. Now in that scenario, there's a solution (In retrospect, I don't think I should've said perfect).
I don't think you are going to be happy with this solution. I don't expect everyone to be. I probably will be, because while I want privacy, I'm amenable to a solution I can trust in a situation where there has to be some kind of monitoring.
Since we live in a democracy (I hope you don't live in an oppressive monarchy), this can happen when the majority of the people (senators, actually, because it is a Republic) agree that monitoring is okay in a given situation.
Your opinion or my opinion is not enough to change everyone else's opinions. So we might have to learn to live with it.
We live in a world, not a democracy. There are many different countries, with many different systems.
Any proposed solution has to deal with that reality, not with the little bubble of one democracy which may arguably in the questionable opinions of some subset of people have a good government.
The reality includes police states where the police are truly evil.
It also includes police states where the software is written by truly evil people, to do evil things, with evil experts overseeing it all and approving evil behavior in the software they are checking.
Please tell me how you can be confident that there can be a solution that addresses this reality while protecting the privacy of users. Sometimes all a user wants to do is send a message to their boyfriend without getting thrown off a building, burned, flogged, or killed, possibly having several generations of their family killed as well (see North Korea).
The system has to work for this reality. I'm pretty sure that simply drawing a line and fully protecting the privacy of users' messages, full stop, is a better solution than whatever you and your senators will come up with.
And yes, the security of a crypto system can be verified. If it's designed to be secure. Not if it's designed to be monitored. Even if the experts are perfect angels and absolutely competent, if there is a way to monitor, hackers will find a way to get access to it.
>When it comes to properly securing the physical part (the servers), I'm sure something can be figured out there.
You're dreaming. Remember, the authorities will have full power over that system, and even in countries where the authorities are not evil, the authorities as a rule are inevitably corruptible if not corrupt. This isn't just cynicism, it's reality. Look around.
> And yes, the security of a crypto system can be verified. If it's designed to be secure.
Theoretical security and actual security are two very different things. One is mathematical and can be verified with proofs. The other deals with software and imperfect developers. Software can't be verified for perfect security in a deterministic way, no matter how hard you try; vulnerabilities pop up all the time. Your expectation that theoretical security translates to real-world security is something I believe you need to think about again.
>Not if it's designed to be monitored. Even if the experts are perfect angels and absolutely competent, if there is a way to monitor, hackers will find a way to get access to it.
You seem to miss the part where I said a new protocol, not something which is modified, or backdoored. I'm surprised at you being so sure about the failure of a non-existent protocol. Do you have anything to back up your claim that any such protocol wouldn't work? Remember, it doesn't exist yet.
I honestly didn't find most of your post very coherent. There is no avenue for free speech in North Korea and other authoritarian regimes so it is a waste of time talking about working around the existing government for privacy and free speech rights. The only place where the masses can bring about change is in a democracy.
>not with the little bubble of one democracy
Last time I checked, most countries are democratic. Please show me the case where democratic countries vastly differ in how their government is organized.
> The reality includes police states where the police are truly evil.
Again, I talked about a democracy since we really can't do anything to help them with encryption and code. If there are no rights, strong encryption doesn't really matter. Look up rubber-hose cryptanalysis.
> The system has to work for this reality. I'm pretty sure that simply drawing a line and fully protecting the privacy of users' messages, full stop, is a better solution than whatever you and your senators will come up with.
It is of course a better solution for individual privacy; I thought I talked about this at the end of my last comment. I don't have much control over my senators.
>>>When it comes to properly securing the physical part (the servers), I'm sure something can be figured out there.
>You're dreaming. Remember, the authorities will have full power over that system, and even in countries where the authorities are not evil, the authorities as a rule are inevitably corruptible if not corrupt. This isn't just cynicism, it's reality. Look around.
Full power? I don't believe you have understood what I said.
At this point it feels like you're arguing for the sake of an argument.
Looking forward to learning more about this new perfect future protocol that you think will solve the problems.
/s
Freedom and openness come at a cost; this applies just as much to individuals as it does to countries. The ends of the spectrum, then, are "I can take anything that comes at me, let the chips fall where they may" and "Mummy, make the bad man go away".
Unfortunately maths doesn't work that way, there is not much in the way of a spectrum when it comes to encryption, there is a very steep cliff from secure to insecure.
So then you are faced with a very stark contrast; the security afforded by a surveillance state, or freedom with the possibility of terrorism. Personally I prefer the latter.
The trick is there is no spoon, just like there is no control; only influence.
The better idea is to achieve security by encouraging people of different cultures to get along with each other. So, don't give free housing and support to people who fight against harmony and promote anger.
Two questions for you.
>One way to 'make sure' is that the source code of the monitoring software must be reviewed by independent and trusted software engineers/experts.
What does that help with? (Because trusted experts aren't perfect, right?)
And second question, what if the police and government are evil, then how does your plan help?
> if it can reduce terrorism

It can't. If these supposed hordes of terrorists have half a brain cell between them, they'll simply communicate using something else.
> Of course, we need to make sure it is used for anti-terrorism only

Hah, not likely in the UK - if sweeping powers exist, there will be a creeping escalation of their use by different government bodies, and for purposes not related to terrorism.
How are you going to make sure?