> As recently seen with Intel, there seems to be a trend where developers will do this pointless client-side decryption. When the client has the key, it’s strange that anyone would think that would be secure.
I live and work in India. Yesterday, as part of a VAPT audit by a third-party auditor, the auditors "recommended" that we do exactly this. I wonder if this directive comes from some outdated cybersecurity guidelines that get passed around here? Not entirely sure.
When I asked how I'd deliver the secret to the client for this client-side encryption/decryption without the key also being accessible to anyone who can MITM our HTTPS-only API calls anyway, the guy basically couldn't understand the question and fumbled around in Burp Suite, pointing exasperatedly at how he could see the JSON body of POST requests.
Most of the security people we've met here, from what I can tell, are really clueless. Internally, we call these guys "burp babies" (worse than "script kiddies") because they only seem to know how to follow some cookie-cutter instructions for using Burp Suite.
I am a pretty cookie-cutter developer. We just make glorified CRUD apps, and I have tried to convince the engineering director hundreds of times that "there is no point in encrypting and decrypting localStorage with a key that's sitting right inside the client code." Yet they keep insisting on it in the code-quality checklist.
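For what it's worth, here is a minimal sketch of what that checklist item amounts to (TypeScript against the Web Crypto API; the hard-coded key value and function names are hypothetical, not anyone's actual code). The key ships inside the bundle, so anyone who can read localStorage can also read the key:

```ts
// The "encryption" the checklist asks for: AES-GCM over localStorage values,
// with a key that ships inside the client bundle.
const HARDCODED_KEY = "0123456789abcdef0123456789abcdef"; // plainly visible in the shipped JS

async function importKey(): Promise<CryptoKey> {
  const raw = new TextEncoder().encode(HARDCODED_KEY); // 32 bytes -> AES-256
  return crypto.subtle.importKey("raw", raw, "AES-GCM", false, ["encrypt", "decrypt"]);
}

async function saveEncrypted(name: string, value: string): Promise<void> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh IV per write
  const key = await importKey();
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(value),
  );
  localStorage.setItem(
    name,
    JSON.stringify({ iv: Array.from(iv), data: Array.from(new Uint8Array(ciphertext)) }),
  );
}

async function loadEncrypted(name: string): Promise<string | null> {
  const stored = localStorage.getItem(name);
  if (!stored) return null;
  const { iv, data } = JSON.parse(stored);
  const key = await importKey();
  const plaintext = await crypto.subtle.decrypt(
    { name: "AES-GCM", iv: new Uint8Array(iv) },
    key,
    new Uint8Array(data),
  );
  return new TextDecoder().decode(plaintext);
}
// Anyone with the access needed to read localStorage (XSS, devtools, disk access)
// also has the bundle, and therefore HARDCODED_KEY, so the ciphertext adds
// obscurity rather than security.
```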
My guess: he’s avoiding political risk. If something goes bad, it’s better to say “it was encrypted but they got the keys” than to defend why the data wasn’t encrypted.
It’s semantics in terms of actual difference to an attacker, but it’s a world of difference when explaining to executives.
I guess they think it results in some kind of security by obscurity... maybe to ward off lazy beginner hackers.
You’re right, of course, but this reminds me of when Chrome didn’t obscure your passwords when you looked at its autofill settings. The developers argued that it would just be security by obscurity -- if somebody has access to your computer while it’s unlocked, they can do anything they want, so obscuring your passwords does nothing.
The counter-argument is that even if it’s not perfectly secure, that extra bit of friction before the passwords can be seen is useful, and may just save your bacon if a casual thief has access to your computer for a few seconds.
The Chrome team eventually saw sense and added some client-side password protection.
As long as you don’t only have client-side protections, of course (and maybe your clueless auditors were making that mistake).
He's definitely wrong. If you want to see why, look at what Kaspersky had to do to unravel Operation Triangulation. They did eventually succeed, but the absolute nightmare they went through should tell you why it's a good thing.
Assuming that you've been MITM'd is a different violation of trust. And when you break your own assumptions, well, of course nothing makes sense. Were I the burp baby, I would've asked why you think we should not defend against literally any other side channel just because maybe they broke TLS.
Appreciate the insight!
lmao
burp suite babies is crazy work