BASIC was slow in the 80s. Games for the C64 (and similar machines) were written in machine code.

> By comparison, given how they optimized games for 8- and 16-bit machines, I should have been able to compile Cataclysm DDA:BN on my potato netbook, and yet it needs GIGABYTES of RAM to compile. It's crazy that you need damn swap for something that required far less RAM 15 years ago with the same features.

That’s not crazy. You’re comparing an interpreted, line-delimited language with a compiler that converts structured source into machine code.

The two processes are as different from one another as driving a bus is from being a passenger on it.

I don’t understand what your point is in the next two paragraphs, or what Go, Tcl, UNIX or Inferno have to do with the C64 or modern software. So you’ll have to help me out there.

Compare Limbo+Tk under Inferno with current C#/Java. Or C++ against Plan 9’s C.

We have impressive CPUs running really crappy software.

Remember Claude Code asking for 66GB for a damn CLI AI agent, when NetBSD on a VAX (real or emulated) from 1978 could do something similar with ncurses in milliseconds every time you spawn NetHack or any other ncurses tool/game.

On speed, Forth for the ACE was faster than BASIC running on the ZX80. So it wasn't about using a text-parsed language. Forth was fast, but people were ready for neither RPN nor managing the stack; people thought in an algebraic way.

But that was an 'obsolete' mindset, because once you hit high school you were supposed to split big problems into smaller tasks (equations). To implement a second-degree equation solver in Forth you wouldn't juggle the stack; you'd create discrete functions (words) for the discriminant and so on.

In the end you just managed two stack items per step.

If Forth had won instead of BASIC, then instead of allowing spaghetti code as normal procedure, we would have been asked from the start to decompose code into small functions as the right thing to do.

Most dialects of BASIC actually had functions too. They just weren’t popularised because line numbers were still essential for line editing on home micros.

> On speed, Forth for the ACE was faster than BASIC running on the ZX80. So it wasn't about using a text-parsed language.

Forth and BASIC are completely different languages and you’re arguing a different point to the one I made too.

Also I don’t see much value in hypothetical arguments like “if Forth won instead of BASIC” because it didn’t and thus we are talking about actual systems people owned.

I mean, I could list a plethora of technologies I’d have preferred to dominate: Pascal and Lisp being two big examples. But the C64 wasn’t a Lisp machine and people aren’t writing modern software in Pascal. So they’re completely moot to the conversation.

They were different languages, but both came in ROM and with similar storage options (cassette/floppy).

On Pascal: Delphi was used for tons of RAD software in the 90s, both for the enterprise and for home users, with zillions of shareware (and shovelware) titles. And Lazarus/FPC plus SQLite3 today is not bad at all.

On Lisp... it has been used in niche places such as game engines, Emacs (Org Mode today is a beast), a whole GNU-backed distro built on Scheme (Guix), and Maxima, among others.

Still, so-called low-level C++ is an example of things taking the wrong route. C++ and Qt 5/6 can be performant enough. But for a roguelike, compile-time performance is atrocious, and by design Go with its GC would fix 90% of the problems while gaining portability.

I’m very aware of Lazarus, Delphi and Emacs. But they’re exceptions rather than industry norms.

And thus pointing them out misses the point I was making when, ironically, I was pointing out how you’re missing the original point of this discussion.

My point was about performance. Yes, BASIC winning over Forth was the worse choice back in the day, and you could say low-level stuff was done in assembly anyway.

Fine. But today the default choice for 'low-level' stuff is C++, and I maintain that most C++ compilers either have huge compile times (GCC), or do better but still eat RAM like crazy (Clang). And except for a few pieces of software, the performance boost compared to Go doesn't look that huge for most tasks, outside Chromium/Electron and Qt.

For what software is doing 90% of the time, Go plus a nice UI toolkit would be enough to cover most tasks while offering a safe language. Even for bloated proprietary IM apps such as Discord and Slack.

Because, ironically, most of that optimized C++ code is used to run bloated runtimes like Electron, throwing away everything C++ gives you, since most Electron software ships half an OS with every application.

With KDE and Qt you are at least sharing code, even when using Flatpak, which deduplicates things a little. With Electron you are running separate, isolated silos with no awareness of each other. You are basically running several 'desktop environments' at once.

You could say: hey, Go statically links everything, so there's no gain from shared libraries then... until you find that the Go toolchain still does a better job, using less RAM on average than tons of other stuff.

With Electron you are often shipping the whole debugging environment with your app, loaded and running, and the graphical software performs far worse than the 'bloated' KDE 3 software that was doing bells and whistles in a Kopete chat window on an AMD Athlon back in the day. Qt 3 tools felt snappy. Seeing Electron-based software everywhere has the appeal of running every GUI under Tcl/Tk on a Pentium, modulo video decoders and the like. It would crawl next to pure Win32/Xlib on a Pentium 90 if every window were a Tk window with debugging options enabled.

So these are our current times: you've got an i7 with 16GB of RAM and barely see any improvement with modern 'apps' over an i3 with 2GB of RAM running native software.

You’re talking about compiler footprint and runtime footprint in the same conversation but they’re entirely different processes (obviously) and I don’t think it makes any sense to compare the two.

C++ is vastly more performant than Go. I love Go as a language but let’s not get carried away about Go’s performance.

It also makes no sense to talk about Electron as C++. The problem with Electron isn’t that it was written in C++; it’s that it’s ostensibly an entire operating system running inside a virtual machine executing JIT code.

You talked about using Go for UI work, but have you actually tried it? I’ve written a terminal emulator in Go and UI performance was a big problem. Almost everything requires either CGO (causing portability problems) or tricks like WASM or dynamic calls that introduce huge performance overheads. This is something I benchmarked with SDL, so I have first-hand experience.

Then you have the issue that GUI operations need to run on the thread that owns the toolkit, which causes problems when writing idiomatic Go that calls GUI widgets.

And then you have a crap-load of edge cases for memory leaks: Go’s GC will collect the Go pointers, but any allocations made outside of Go need to be deallocated manually.

In the end I threw out all the SDL code. It was slow to develop, hard to make pretty, and hard to maintain. It worked well but it was far too limiting. So I switched to Wails, which basically displays a WebKit window (on macOS), so it has a lower footprint than Electron, lets you write native Go code, and is super easy to build UIs with. I hate myself for doing this, but it was by far the best option available, depressingly.

I know C++ is far more performant than Go, but for some games and software C++ isn't needed at all; take nchat with tdlib (the library could be a native Go one by itself, it's not rocket science). These would run nearly as well on low-end machines with barely any performance loss. In those cases there's nothing to gain from C++, because even compared to C, most C++ software (save Dillo and niche cases) won't run as snappily as the C equivalents. Running them in Go won't make them unusable, for sure.

On the GUI side, there's Fyne; but what Go truly needs is a default UI toolkit promoted by the Go developers and written in the spirit of Tk. Tk itself would be good enough. Even Limbo on Inferno (Go's inspiration) borrowed it from Tcl. Nothing fancy, but fast and usable enough for most everyday tasks.

Python ships it by default (Tkinter) because it weighs next to nothing, and the widget-packing syntax is similar across platforms. It's not fancy, and on mobile you need dedicated code and theming, but again: if people managed to get Androwish working as a proof of concept, Go could do it better...

Another good use case for Go would be Mosh. C++ and Protobuf? Go would have been fine for this. The C++ Mosh would be snappier (you can feel it with software like Bombadillo and Amfora vs Telescope), but on 'basic' modern machines (the first 64-bit machines with Core Duos or AMD64 processors) there would be almost no perceptible delay for the user.

Yes, sorry for 32-bit machines, but by 2030 and beyond I expect using them to be like using 16-bit DOS machines in 1999. Everyone moved on, and 32-bit machines were cheap enough. Nowadays it's the same: I own an Atom N270 and I love it, but I don't expect to use it as a client or for Go programming (modulo eForth) in four years; I'd expect to compute everything on the low-end 64-bit machines I own.

But it will be a good Go test case, for sure: if it runs fast on the Atom, it will shine on amd64. With the current crisis, everyone should expect to refurbish and keep 'older' machines just in case. And be sure that long compile times will need to be cut in half, even if you use ccache. RAM and storage will be expensive and current practices will largely be discarded. Yes, C++ will still be used in those times, but Go too. Forget Electron/Chromium being used as a standalone toolkit outside of being the engine of a browser.

And if oil/gas usage is throttled for the common folk, EVs and electric heating will reach crazy numbers. Telecoms and data centres will have their prices skyrocket so the rise in power draw doesn't black out a whole country/state. Expect computing power caps, throttled resolutions for internet media/video/RDP content, even bandwidth caps (unless you pay a premium price, that is) and tons of changes. React developers using 66GB of RAM for Claude Code... forget it. Either they rebase their software onto Go... or they've already lost.