That is the bonkers thing about this story. Why take on the liability? Get what you need and toss the responsibility. If you must store it (which seems unlikely), put that extra-bad-if-leaked information behind a separate append-only service with heavily restricted read access.
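Roughly what I have in mind (a minimal sketch, with made-up names and an in-memory stand-in for the actual append-only storage): the app can append a verification record, but the only read path answers "is this user verified?", so even a full compromise of the main application never yields the raw ID scans.

```python
import hashlib
import hmac
import os
import time

class AppendOnlyVerificationStore:
    """Write-mostly store: append records, expose only a yes/no read."""

    def __init__(self, secret: bytes):
        self._secret = secret
        self._records = []  # stand-in for an append-only log / WORM storage

    def append_verification(self, user_id: str, document_blob: bytes) -> None:
        # Keep only an HMAC of the document -- enough to detect re-submission
        # or fraud, but the readable scan itself is never retained here.
        digest = hmac.new(self._secret, document_blob, hashlib.sha256).hexdigest()
        self._records.append({
            "user_id": user_id,
            "doc_hmac": digest,
            "verified_at": time.time(),
        })

    def was_verified(self, user_id: str) -> bool:
        # The only read operation exposed to the rest of the system.
        return any(r["user_id"] == user_id for r in self._records)


# Usage: verification flow appends once; everything else can only ask yes/no.
store = AppendOnlyVerificationStore(secret=os.urandom(32))
store.append_verification("user-123", b"<scanned ID bytes>")
print(store.was_verified("user-123"))  # True
print(store.was_verified("user-999"))  # False
```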
Because there is no liability.
If they were fined $10k per leaked ID, then there would be a serious liability.
Right now, they publish a press release, go 'oopsie poopsie', maybe have to pay for some anti-fraud services from Equifax if someone asks, and call it a day.
> Right now, they publish a press release, go 'oopsie poopsie', maybe have to pay for some anti-fraud services from Equifax if someone asks, and call it a day.
Don't forget the usual Press Release starting with "At [Company], we take security very seriously..."
Because it's free training data and great for building profiles on users, so you can make money showing them targeted ads.
Discord isn't really monetized through 'traditional' targeted advertising, though.
Discord, no, but my credit card issuer, Advanzia Bank, actually changed their TOS to allow AI training on your submitted documents for their anti-fraud model.
I complained to the CNPD of Luxembourg and sent a GDPR request, as they defaulted to doing this WITHOUT asking for consent (super illegal, since training AI on your data is definitely not the minimum required to offer the service).
The data is valuable to sell or to train AI on. You can use that data to train AI HR agents or whatever.