> It should be illegal

It should be illegal to host insecure services, especially when you're dealing with PII. Breaches keep happening and nobody gives a fuck, because the worst that'll happen is you might lose a handful of customers and buy some "credit monitoring".

Incidents like this should be followed by an audit and charges being laid. Send corp officers to jail for negligent security failures. If you can go to jail for accounting fraud, you should be able to go to jail for cybersecurity-promises-fraud.

They claim to be compliant with a number of security standards [1]. I would love to see a postmortem audit of how much of this they actually implemented.

[1] https://www.instructure.com/en-au/trust-center/compliance

I don't think that criminal negligence is the most helpful legal tool for incentivizing improved security. It's too hard to prove negligence.

Instead, there should be standard civil penalties for leaking various degrees of PII paid as restitution to the affected individual. Importantly, this must be applied REGARDLESS of "certification" or whether any security practices were "incorrect" or "insufficient". Even if there's a zero-day exploit and you did everything right, you pay. That's the cost of storing people's secrets.
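The proposed schedule could be sketched roughly like this. Everything here is hypothetical: the PII categories and dollar amounts are invented for illustration, not taken from any actual statute or proposal.

```python
# Hypothetical strict-liability schedule: a fixed per-person payout by
# PII category, owed regardless of fault or certification status.
# Categories and amounts are made up for illustration only.
PENALTY_PER_PERSON = {
    "email_address": 10,
    "home_address": 50,
    "government_id": 500,
    "medical_record": 2000,
}

def restitution_owed(leaked_categories, affected_people):
    """Total restitution: per-person penalties summed across the
    leaked categories, multiplied by the number of people affected."""
    per_person = sum(PENALTY_PER_PERSON[c] for c in leaked_categories)
    return per_person * affected_people

# e.g. a breach exposing home addresses and government IDs of 10,000
# people would owe (50 + 500) * 10,000 = $5.5M under this schedule.
total = restitution_owed(["home_address", "government_id"], 10_000)
```

The point of a fixed schedule is exactly that it needs no negligence finding: the only facts to establish are which categories leaked and how many people were affected.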

This would make operating services whose whole "thing" is storing a bunch of information about individuals (like Canvas) much more expensive. Good! It's far too cheap to stockpile a ticking time bomb of private info and then walk away paying no damages just because you complied with some out-of-date list of rules or got the stamp of approval from a certification org that's incentivized to give out stamps of approval.

And this strict liability will come with an expectation of insurance. The insurance policies will necessitate audits, which will actually improve security.


I feel like there’s a tendency here to seriously overestimate how damaging these leaks are to individuals.

For most individuals impacted by these hacks, appropriate restitution would be $0. Anything more than that would go beyond making them whole.

It's not a popular opinion, but I agree. I live in a country that has a very extensive principle of public records, and oftentimes these leaks disclose much less than you would get by simply calling the authorities and asking. Now, whether that's good or bad is a different story.

Leaking school or medical records can have serious personal consequences that can't even be enumerated.

We used to hand out whole books of this information to as many people as possible (phone books).

The only right answer.

Let's do this.

How could you possibly make it illegal to host insecure services? Is any service 100% secure? And if it were how would we know?

I do agree with the audit and punishments for clear failure to adhere to established standards.

This is a solved problem in pretty much every other domain of life: if you are following best practices but something that wasn't reasonably foreseeable happens, then you're fine, but if the bad thing happens as a result of negligence, then you're in trouble.

Criminal law isn't about making things alright for the victim. That's what insurance is for.

Even if you leave your door unlocked, if someone walks in and steals your stuff, it's a crime. The state has an interest in prosecuting crimes even if the victim didn't do everything they could to prevent it.

> Criminal law isn't about making things alright for the victim

Restitution and retribution are the components of justice [1] entirely about "making things alright for the victim."

[1] https://www.unodc.org/e4j/en/crime-prevention-criminal-justi...

The company is not the victim here. Its users are. [I suppose my previous comment was a bit ambiguous; I meant something bad happening to someone else, not to yourself.]

A better version of your analogy would be if your landlord failed to repair your front door in a reasonable period of time and as a result someone walked in and stole your stuff. Yes, the thief is the primary responsible party, but the landlord's negligence in maintaining the property probably also exposes them to some liability.

P.S. This is neither here nor there, but restitution is a part of criminal law.

"Best practice" in cybersecurity is largely vendor-driven with little to no independent empirical validation.

That standard is likely to lock people into buying some pretty bad software, but it does little to ensure that they're running reasonably secure systems.

I like to relate it to operating an automobile. You can follow every traffic law and still be liable in an accident, because you owned the vehicle that caused the damage. This is why you have insurance.

In civil law maybe, but you aren’t allowed to blame a rape victim for choosing to walk down rape alley…

"established standards" - now who has the incentive to run shitty services? those big enough to control the "established standards".

No building has a 100% chance of not caving in, yet somehow I think charges would be laid if a skyscraper caved in.

The equivalent analogy is charging lock/door/drywall/timber makers and suppliers for lapses if a thief entered the house by picking a lock or drilling/sawing through the wall.

No, it’s more like me storing my money at a bank, and then someone stealing from the bank, who told me they were secure. And turns out they had shitty locks.

This analogy seems to be portraying 'ransomware hackers' as an unstoppable force of nature akin to gravity.

I'm not sure that's a fair analogy.

I think it’s a very fair analogy. The _only_ way to stop them is to make your stuff secure. That’s literally the only way.

Your analogy portrays gravity as a thing that buildings cannot be built to withstand. There are plenty of structurally sound buildings and while there are plenty of secure apps the problem is there’s no incentive to build the latter.

The other side of that spectrum portrays the service providers as pure, negligence-free victims. The truth is probably somewhere in the middle.

If Boeing claimed a plane was airworthy, but it crashed because basic engineering controls were skipped, we have collectively put our faith in the NTSB to preserve evidence, run an independent technical investigation, etc. There is no such authority for software - most security auditors (SOC2, HITRUST, etc) are just looking at self-reported data.

Just take a look at the recent Epic vs. Health Gorilla lawsuit to see how nonexistent the protection is around exchanging your medical records, one of the most sensitive types of PII.

Edit: I was incorrect / non-American, I was thinking of your FAA.

People who haven’t been hacked just haven’t been looked at. If someone wants to hack you, they will hack you. It’s really unfortunate that people have this level of confidence in their ability.

Here’s an example. https://hacks.mozilla.org/2026/05/behind-the-scenes-hardenin...

Has a corporate officer ever gone to jail or been meaningfully fined for a data breach?

> Incidents like this should be followed by an audit and charges being laid

What? Why? Who died? This whole thing is perfectly dealt with through civil process.