We need some kind of group like "tech people with morals". I'm done with these people and their corruption and garbage.

It's why I think "software engineer" is a misnomer. We don't have a license, we don't have an ethics code, and we don't sign off on anything. In other disciplines, an engineer could topple a project they feel is unsafe or against code, and be backed by their union if they're replaced for it. A software engineer just says yes while their stock is still vesting, and gets replaced if they don't.

I just looked this up, so this might not be fully accurate, but it seems most private-sector "engineers" don't require a license. You only need a PE license when providing services directly to the public. That's quite a narrow restriction on the title.

Where can I read more about all the licensed engineers toppling unethical military projects?

Not a group per se, but I maintain an index of 'good' people in tech, and their opposites, here: https://goodindex.org

Sam has -6/100? How does that work? If you can go into the negatives, how low can you get?

Nowhere is it stated that it is a score out of 100. It is a baseline of 0, good actions make it go up, bad actions make it go down.

> Nowhere is it stated that it is a score out of 100.

It says it right on the homepage. Twice. Once for people, once for organisations. It’s right there in green: “BEST (SCORED OUT OF 100)”. And if you go into any of them, you see a score like N/100.

Found the methodology page, which clarifies that it goes from -100 to 100.

https://goodindex.org/methodology#:~:text=How%20Scoring%20Wo...
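If the methodology works the way it's described here (baseline of 0, good actions add points, bad actions subtract them, bounded to -100..100), the scoring is just a clamped running sum. A minimal sketch, with the clamping and point values being my assumptions rather than anything the site documents:

```python
def good_index_score(actions):
    """Hypothetical scoring per the methodology described above.

    actions: iterable of signed point values
             (positive = good action, negative = bad action).
    Starts from a baseline of 0 and clamps the total to [-100, 100].
    """
    total = sum(actions)  # baseline 0, actions shift it up or down
    return max(-100, min(100, total))

# A mix of actions netting out below the baseline:
good_index_score([+4, -10])   # -6
# Totals past the bounds get clamped:
good_index_score([+200])      # 100
```

Which would explain how a score like -6 appears alongside a "scored out of 100" label: 100 is the cap, not the denominator.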

[deleted]

A union?

Unfortunately most engineers irrationally hate unions

Unions would've been useful at a time when CEOs are salivating at the idea of slashing jobs and replacing SWEs with AI.

I think it would still be useful. Call me cynical, but gone are the days when the individual comp and benefits available to SWEs outweighed the benefits of collective bargaining.

[dead]

A guild. Control who learns the trade.

Yeah, some new banner to organise around. The hard part is easily communicating that you're an ethical technologist, and finding others.

Also, it's probably tricky to find a Schelling point that a broad range of people can agree to.

* no military use

* no lethal use

* no use in support of law enforcement

* no use in support of immigration enforcement

* no use in mass surveillance

* no use in domestic mass surveillance (but mass surveillance of foreigners is OK)

* no use in domestic surveillance

* no use in surveillance

* require independent audits

* require court oversight

* require company to monitor use

* require company to monitor use and divulge it to employees

* some other form of human rights monitoring or auditing

* some other form of restriction on theaters/conflicts/targets

* company will permit some of these uses (not purport to forbid them by license, contract, or ToS) but not customize software to facilitate them

* company can unilaterally block inappropriate uses

* company can publicly disclose uses it thinks are inappropriate

* some other form of remedy

* government literally has to explain why some uses are necessary or appropriate to reassure people developing capabilities, and they have some kind of ongoing bargaining power to push back

It feels normal to me that a lot of people would want some of those things, but kind of unlikely that they would readily agree on exactly which ones.

I even think there's a different intuition about the baseline because one version is "nobody works on weapons except for people who specifically make a decision to work for an arms company because they have decided that's OK according to their moral views" (working on weapons is an abnormal, deliberate decision) and another version is "every company might sell every technology as part of a weapons system or military application, and a few people then object because they've decided that's not OK according to their moral views" (refusing to work on weapons is an abnormal, deliberate decision). I imagine a fair number of people in computing fields effectively thought that the norm or default for their industry was the latter, because of the perception that there are "special" military contractors where people get security clearances and navigate military procurement processes, and most companies are not like that, so you were not working on any form of weapon unless you intentionally chose to do so.

But, having just been to the Computer History Museum earlier this week, I also see that a lot of Silicon Valley companies have actually been making weapons systems for as long as there has been a Silicon Valley.

It'd probably be kind of "big tent" and a bit fuzzy at the edges like most big movements are.

There is definitely a muddle on so many levels about signaling and agreeing on ethics in technology.

But as innovation slows globally, it is implementation, ethics, and ideology that will once again be the dominant metrics of progress, so there's a new window emerging to push for this social/moral change in technology once again.

So it's still critically important that we actively work towards finding a meaningful, socially contagious differentiator other than "ethical technologist", even if it's difficult; look at what OpenAI gets away with under that flimsy banner.

"Starting today I will be asking prominent members of the tech community to sign their name onto this. A code of conduct, authored by me, that pledges them to a universal ethos, which I created, that I call tech ethics or Tethics for short."

[deleted]

This, honestly. Seeing all those billionaires on inauguration day lined up to kiss the ring was utterly pathetic. Like what is the fucking point of having billions of dollars if you're just going to be someone else's bitch. And for what? A couple more billion dollars. Oof

[dead]