Great, but unfortunately, even when compiled, the startup overhead is about half a second, which makes it unsuitable for many applications. Still, I applaud it: shell scripting is finicky, people tend to rely on bash-specific features, and Perl is kind of over. Ruby was, and still is, my go-to language for this purpose, but I've recently migrated some scripts over to Swift.
Swift does a much better job at this: it interprets by default, and a compiled version starts instantaneously. I made a transparent caching layer for Swift CLI apps. Result: instant native tools in one of the best languages out there.
Not GP, but can confirm on my M3 Max using the hello world sample:
$ time dotnet run hello-world.cs > /dev/null
real 0m1.161s
user 0m0.849s
sys 0m0.122s
$ time dotnet run hello-world.cs > /dev/null
real 0m0.465s
user 0m0.401s
sys 0m0.065s
There are a lot of optimizations that we plan to add to this path. The intent of this preview was getting a functional version of `dotnet run app.cs` out the door. Items like startup optimization are going to be coming soon.
Ah, I didn't manage to find anything that talked about what was planned for this, so I opened an issue asking for that.
Is there a doc somewhere that talks about it?
This is nuts. More than a decade ago Microsoft made a big deal of startup optimisations they had made in the .NET Framework.
I had some Windows command-line apps written in C# that always took at least 0.5s to run. It was an annoying distraction. After Microsoft's improvements the same code was running in 0.2s. Still perceptible, but a great improvement. This was on a cheap laptop bought in 2009.
I'm aware that .NET is using a different runtime now, but I'm amazed that it's so slow on a high-end modern laptop.
This is also a preview feature at the moment. They mention in the embedded video that it is not optimized or ready for production scenarios. They release these features very early in preview to start getting some feedback as they prepare for final release in November.
For comparison, skipping dotnet run and running the compiled program directly:
time "/Users/bouke/Library/Application Support/dotnet/runfile/hello-world-fc604c4e7d71b490ccde5271268569273873cc7ab51f5ef7dee6fb34372e89a2/bin/debug/hello-world" > /dev/null
real 0m0.051s
user 0m0.029s
sys 0m0.017s
So yeah, the overhead of dotnet run is pretty high in this preview version.
IME, Windows Defender hates compilers. When I build my big C++ project, Defender consumes at least 60% of the CPU, even when exempting every relevant file, directory, and process.
Task manager doesn't show it, but process explorer shows kernel processes and the story is quite clear.
I run in a Debian arm64 container. I get 500ms consistently. It is using a cached binary, because when I add --no-build, it used the previous version. I'm not sure where it stores cached versions, though.
I’ll try to compare with explicitly compiling to a binary later today.
But that’s the thing. It’s a JIT, running a VM. Swift emits native code. Big difference.
Maybe I'll add AOT compilation for dotnet then. Strange they didn't incorporate that, though.
> But that’s the thing. It’s a JIT, running a VM. Swift emits native code. Big difference.
It's not only a JIT: you can pre-JIT with R2R if you need to, precompile with NativeAOT, or, I think, fully interpret with Mono.
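For reference, the precompilation options mentioned here are exposed as publish-time switches. Flag names to the best of my knowledge; treat this as a sketch, not an authoritative reference:

```shell
# ReadyToRun (R2R): pre-JIT to native code, keeping the IL for fallback
dotnet publish -c Release -p:PublishReadyToRun=true

# NativeAOT: fully ahead-of-time compiled, no JIT at runtime
dotnet publish -c Release -p:PublishAot=true
```

Both trade longer build times and larger (or more platform-specific) output for faster startup.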
Edit: it looks like the issue is with the dotnet CLI itself, which until now was not on a 'hot path'. `dotnet help` also takes half a second to show up. When running a DLL directly, I think it doesn't load the CLI app and just runs the code necessary to start the DLL.
Tangential, but Windows PowerShell kept nagging me to download PS6, so I did, and then I had to revert to 5.1, because running a script had a ~1 second overhead. Very annoying. For one-off runs it's often the startup time that matters, and PowerShell just got worse at that. (In the end, I settled for .bat files in a cmd.exe window; ChatGPT can write any of them anyway.)
Does dotnet install the script's dependencies all over again every time you run it? The quoted part was about the 0.5 second startup overhead, which I figured did not include installing the dependencies.
Anyway, lots of Python scripting can be done with the standard library, without installing any dependencies. I rarely use third-party dependencies in my Python scripts.
Ruby can be slow as hell as well. Try starting the Ruby console for GitLab. Of course, this only happens when tons of packages are loaded, which will probably never happen for a CLI tool, right?
It may be that they are speeding it up by keeping the .NET runtime resident in memory. They used to do this with the Visual Basic runtime.
I ran Norton Utilities on my PC yesterday and noticed a new service: the .NET runtime. Note that I am a developer, so this may just be there to help launch the tools.
> I’d say anything except long-running processes or scripts that take 5+ seconds to complete.
I don't think you're presenting a valid scenario. I mean, the dotnet run file.cs workflow is suited for one-off runs, but even if you're somehow assuming that these hypothetical cold start times are impossible to optimize (big, unrealistic if) then it seems you're missing the fact that this feature also allows you to build the app and generate a stand-alone binary.
These cold start times are not hypothetical, as shown by multiple commenters in this thread. They also have been demonstrably impossible to optimize for years. Cold start times for .NET lambda functions are still an order of magnitude greater than that of Go (which also has a runtime). AOT compilation reduces the gap somewhat but even then the difference is noticeable enough on your monthly bill.
This dismissive “startup time doesn’t matter” outlook is why software written in C# and Java feels awful to use. PowerShell ISE was a laughingstock until Microsoft devoted thousands of man-hours over many years to make the experience less awful.
But it doesn’t. It still seems to run in JIT mode, instead of AOT. That’s exactly why I made swift-scc (interpret vs compile, but essentially the same problem)
I recommend you read the article. They explicitly address the usecases of "When your file-based app grows in complexity, or you simply want the extra capabilities afforded in project-based apps".
> Imagine cat, ls, cd, grep, mkdir, etc. would all take 500ms.
Those are all compiled C programs. If you were to run a C compiler before you ran them, they would take 500 milliseconds. But you don't, you compile them ahead of time and store the binaries on disk.
The equivalent is compiling a C# program, which you can, of course, do.
Does this recompile each time? It should be simple to cache the binary keyed on a hash of the input, right? A sub-second first run followed by instant reruns seems acceptable.
The dotnet run command caches. However, even with the cached version, you have a startup overhead of about half a second on my M1.
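That appears to be roughly what happens: the cache path quoted upthread contains a long hex component that looks like a content hash. A minimal sketch of the idea in shell, with the compile step simulated by a copy — the function name and paths are made up, and `sha256sum` assumes GNU coreutils:

```shell
cached_run() {
  # Rebuild only when the source file's content hash changes
  local src="$1"
  local cache="${TMPDIR:-/tmp}/script-cache"
  local hash bin
  hash=$(sha256sum "$src" | awk '{print $1}')
  bin="$cache/$hash/$(basename "${src%.*}")"
  if [ ! -x "$bin" ]; then
    mkdir -p "$(dirname "$bin")"
    cp "$src" "$bin" && chmod +x "$bin"   # stand-in for the real compile
  fi
  "$bin"
}
```

The remaining half second, per the comments above, is the dotnet CLI's own startup rather than compilation.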
My "Swift Script Caching Compiler" compiles and caches, but will stay in interpreted mode the first three runs when you're in an interactive terminal. This allows for a faster dev-run cycle.
It's interesting that they're actively promoting using it with a shebang. I find it pretty appealing.
Go prior to modules worked really well this way, and I believe Ubuntu was using it like this, but the Go authors came out against using Go as a scripting language.
The authors didn't come out against it; they came out in favor of using it as a programming language first and foremost. Tools like gorun https://github.com/erning/gorun have existed for almost as long as Go has, so if you want to use Go that way, it is easy to do.
They recently added in support to do this:
go run github.com/kardianos/json/cmd/jsondiff@v1.0.1
Which pulls a tag and runs it directly, which is also kinda cool.
I used to work for a .NET shop that randomly wrote some automation scripts in bash. The expertise to maintain them long term (and frankly, write them half-decently to begin with) simply wasn't there. Never understood why they didn't just write their tooling in C#.
Maybe this will make it seem like a more viable approach.
Here's the really annoying thing with PowerShell: it doesn't have a module distribution model. Like, at all. A C# script, on the other hand, gets NuGet out of the box.
It is reasonably unlikely that bash scripts are easily replaceable by powershell scripts.
There's a fair argument that complex scripts require a complex scripting language, but you have to have a good reason to pick PowerShell.
Typically, on non-Windows, there is not one.
It's the same "tier" as bash: the thing you use because it's there, and then reach past for hard things.
There's no realistic situation where I would (and I've written a lot of PowerShell) go, "gee, this bash script/makefile is too big and complex, let's rewrite it in PowerShell!"
To this day, error handling in most Unix shells just sucks. Background commands, pipes, command substitutions, functions, if/while conditions, subshells, etc. are all "special cases" when errors are involved. You can make them suck less but the more you try to handle all the possible ways things can fail, the vastly more complex your code becomes, which the Bourne lineage languages just aren't ergonomically suitable for.
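A quick demonstration of a couple of those special cases, assuming bash — every line below runs to completion despite `set -e`:

```shell
#!/usr/bin/env bash
set -e

# A failure on the left of a pipe is ignored: only the last command's
# exit status counts by default
false | true
echo "pipeline swallowed the failure"

# A failure inside an if condition never triggers set -e
if false; then :; fi
echo "condition swallowed the failure"

# pipefail fixes pipes specifically, but nothing else
set -o pipefail
false | true || echo "pipefail caught it"
```

Command substitutions, subshells, and functions each add their own variants of this, which is the complexity explosion described above.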
I think PowerShell was totally right to call this out and do it better, even though I don't particularly love the try-catch style of exception handling. False is not an error condition, exceptions are typed, exceptions have messages, etc.
The problem with PowerShell coming from bash etc. is that the authors took one look at Unix shells and ran away screaming. So features that were in the Bourne shell since the late 1970s are missing and the syntax is drastically different for anything non-trivial. Other problems like treating everything as UTF-16 and otherwise mishandling non-PowerShell commands have gotten better though.
Sure, but these days so is Python. And you've got a dozen other languages available after a single command. You wouldn't recommend using a kitchen knife instead of a chainsaw to cut down a tree just because it's already there, would you?
Bash has a huge number of footguns, and a rather unusual syntax. It's not a language where you can safely let a junior developer tweak something in a script and expect it to go well. Using a linter like ShellCheck is essentially a hard requirement.
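One of the classic footguns in question, as a toy example (ShellCheck flags the unquoted expansion as SC2086):

```shell
cd "$(mktemp -d)"
f="my file.txt"
touch "$f"

# Unquoted: $f undergoes word splitting into "my" and "file.txt",
# neither of which exists as a file
ls $f >/dev/null 2>&1 || echo "unquoted: ls failed"

# Quoted: the filename survives intact
ls "$f" >/dev/null 2>&1 && echo "quoted: found it"
```

Harmless with `ls`; the same mistake in an `rm` or a deploy script is how production incidents happen.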
If you're working in a team where 99.9% of the code you're maintaining is C#, having a handful of load-bearing Bash scripts lying around is probably a Really Bad Idea. Just convert it to C# as well and save everyone a lot of trouble.
> Bash has a huge number of footguns, and a rather unusual syntax. It's not a language where you can safely let a junior developer tweak something in a script and expect it to go well.
As someone who started with shell/bash around 2000 to look after Linux systems, I'd say it's quite a usual syntax, and I believe that's true for many sysadmins.
No way I'd want to deal with opaque .NET or even Go stuff, incapable of doing "bash -x script.sh" while debugging production systems at 3AM. And non-production as well: just losing my time (and the team's time) on unusual syntax, getting familiar with NuGet, ensuring internet access to those repos, and pinging the ITSec guys to open access to them.
> let a junior developer tweak something in a script and expect it to go well
Let developers do their job; writing bash scripts is something extraordinary for a dev team to do. Where would they even be expected to apply it? I can only imagine "lonely dev startup" situations where it may reasonably be needed.
Not an old hat: in the dotnet world for 3 years, bash for over a decade. I agree with bash not being easy to maintain.
Argument about bash being always there breaks down quickly. Even if you limit yourself to bash+awk+grep, they don't work consistently across different bash flavors or across platforms (mac/win/linux).
My approach now is to have my programs, in the language most convenient to me, compiled to a bunch of archs and have that run across systems. One time pain, reduces over time.
> Argument about bash being always there breaks down quickly. Even if you limit yourself to bash+awk+grep, they don't work consistently across different bash flavors or across platforms
IMO this is why Perl was invented and you should just use Perl. Bash isn't portable and isn't very safe either. If you're only going to use a couple of commands, there's really no reason to use bash. The usecase, in my head, for bash is using a variety of unix utils. But then those utils aren't portable. Perl is really great here because it's available on pretty much every computer on Earth and is consistent.
Except Windows, of course. You can install it there but it's plainly obvious it's in a foreign environment.
A lot of people focus on Perl's bad reputation as one of the first web languages instead of its actual purpose, a better sh/awk/sed. If you're writing shell scripts Perl's a godsend for anything complex.
>Dotnet is a pig with its dependencies by comparison.
How much time does it take to get dotnet with all its dependencies? Like 6 minutes, including all the preparations? So the difference between "already there" and dotnet is 6 minutes. It's hard to imagine a case where this difference matters.
Knowing stuff is what we're paid for, otherwise we'd all be making minimum wage. Besides, there are plenty of non-bash shell scripting resources out there, which is good to learn if you want to work with the BSDs or proprietary UNIX systems.
Powershell on UNIX is like Perl on Windows. It works, but it's weird and alien. But the same can be said for .NET, really.
I wonder where the divide of "shebang" vs "hashbang" lands geographically and chronologically. During college and for many years in the early 90s and 2000s in the South, it was commonly called hashbang; I didn't hear shebang until C# became a thing. I know the term predates that, I just never heard it before then.
I believe the dividing moment came with Ricky Martin circa 2000.
Lame joke aside, I only heard "shebang" prior to around that time, then "hashbang" and now I get a mix of it. Google trends indicates "shebang" always dominated.
In my head I translate an old Swedish term: "timber yard" (from brädgård). Everything to make interacting with others hard, I guess. As a kid we also called it staket, which translates to "fence".
I always found the sharp naming, including C#, odd because # isn't the sharp symbol, which is ♯. All of them use the # hash character, so calling it sharp always seemed strange to me, though "C-hash" admittedly doesn't roll off the tongue either. It is also interesting how hash is correctly used in some places ("hashtag") but not others.
It's supposed to be the sharp symbol; it's just that it was a hassle for them to use it consistently in paths etc, so they defaulted to # as a stand-in.
It's "sharp" (i.e. higher tone) because it's a higher-level language compared to C and C++.
In 2010 I met a person from India who pronounced it "C pound", and they were as confused by my reaction as I was by their pronunciation. I guess somehow that pronunciation became popular enough that it acquired momentum, and everyone in their circle (or maybe all of India?) assumed it was correct. The # key on the phone is called the "pound key" in India, which is where it would've started, and I guess they never heard any foreign YouTube video etc. pronouncing it.
I don't know if they still pronounce it that way or not.
I started using C# towards the end of the 1.0 beta or maybe just after RTM...I embarrassingly called it "C pound" for quite a while. Because, even as someone born and raised in the US, pretty much my only exposure to the symbol was in the context of phones. "Call me at blah, pound one-two-three" as in the extension is "#123".
Remember, it was originally released 20+ years ago (goddamn, I feel old now); recorded video or even audio over the internet was much, much, MUCH rarer then, when "high-speed" internet for a lot of people meant a 56K modem.
Back then, most developers' first exposure to C# was likely in print form (books, or maybe MSDN Magazine).
It still is, to this day: if you call an automated system such as voicemail, you may be prompted to "press 'pound'". This is really standardized, AFAICT, and no telephone system has told me to "press hash" or "press the 'number' key" [because that's ambiguous]
I was trying to recall what we called it. I used SVR3, so I would've been using "#!/bin/sh" as early as 1990, and even more on SunOS 4 and other Unix servers.
I can't recall having a name for it until "hash-bang" gained currency later. We knew it was activated by the magic(5) kernel interpretations. I often called "#" as "pound" from the telephone usage, and I recall being tempted to verbalize it in C64 BASIC programming, or shell comment characters, but knowing it was not the same.
"The whole shebang" is a Civil-War-era American idiom that survived with my grandparents, so I was familiar with that meaning. And not really paying attention to Ricky Martin's discography.
Wikipedia says that Larry Wall used it in 1989. I was a fervent follower of Larry Wall in the mid-90s, and Perl was my #1 scripting language. If anyone would coin and/or popularize a term like that, it's Just Another Perl Hacker.
Likewise, "bang" came from the "bang path" of UUCP email addresses, or it stood for "not" in C programming, and so "#!/bin/sh" was ambiguously nameless for me, perhaps for a decade.
Come to think of it, vi and vim have a command "!" where you can filter your text through a shell command, or "shell out" from other programs. This is the semantic that makes sense for hash-bangs, but which came first?
> "bang" came from the "bang path" of UUCP email addresses
"Bang" was in common use by computer users around 1970 when I was working at Tymshare. On the SDS/XDS Sigma 7, there was a command you could use from a Teletype to send a message to the system operator on their Teletype in the computer room. I may have this detail wrong, but I seem to recall that it included your username as a prefix, maybe like this:
GEARY: CAN YOU LOAD TAPE XYZ FOR ME?
What I do remember clearly is that there were also messages originated by the OS itself, and those began with "!!", which we pronounced "bang bang". Because who would ever want to say "exclamation point exclamation point"?
The reason this is vivid in my mind is that I eventually found the low-level system call to let me send "system" messages myself. So I used it to prank the operator once in a while with this message:
!! UNDETECTABLE ERROR
I was proud of calling it an "undetectable" error. If it was undetectable, how did the OS detect it?
The builtin subcommand is unstable, but a long time ago a dev I knew at the time maintained an external subcommand for it. https://crates.io/crates/cargo-script
Although it's old and its dependencies are old, it doesn't depend on cargo as a library (it spawns cargo as a subprocess instead of using the cargo API directly), so it might still work fine with the latest toolchain. I haven't tried it myself.
On the BUILD talk I certainly don't remember that, and I am wondering if that was part of the original blog post, or a reaction to several people complaining since then.
What do you mean by incompatible? If you mean syntax, that's intentionally different for clarity, and also we don't want to create a new C# scripting dialect; hence it isn't possible to "import" other files (that's not how C# works).
You were already using F#'s syntax for C# NuGet references in .NET Interactive. Even NuGet itself labels this syntax "Script & Interactive" in its picker.
.NET interactive uses a dialect of C# - the script dialect. With file based apps, we strive to use the standard C#, so you can convert file based apps to full projects at any time.
That doesn't explain why standard C# had to deviate further from the script dialect just for import directives. These directives don't interfere with the rest of the grammar.
Is #:package <pkg> really so much nicer than #r "nuget: <pkg>" as to deserve deviating from .NET Interactive, F# and the existing NuGet picker? (It could be, if properly argued!) Has there been any effort to have the rest of .NET adopt this new syntax or share concerns?
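For readers who haven't seen both, the two directive styles under discussion look like this (the package name and version are just examples):

```
// file-based app (dotnet run app.cs):
#:package Humanizer@2.14.1

// .NET Interactive / C# script / F# style:
#r "nuget: Humanizer, 2.14.1"
```

The question is whether the first form's readability gain justifies diverging from the second, which the rest of the ecosystem already uses.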
On that note, is any language other than C# supported for `dotnet run`? Is the machinery at least documented for other MSBuild SDKs to use? Considering how dotnet watch keeps breaking for F#, I suspect not.
Assembly references are not really a mainline scenario, hence we support NuGet package references only for now. And `#r` is just too opaque (it comes from the old days when assembly references were the norm).
I have to agree with others that it's strange this is not in line with .NET Interactive and its usage of "#r". I think "#r" should at least be an option.
You say that importing other files is not how C# works, but I think that's not entirely true. If you now treat a file as a project, then an import is pretty much a project reference, just to a .cs file instead of a .csproj file.
So I'd love to see something like that working:
#:reference ./Tools.cs
Which should be the same as a ProjectReference in csproj or for a "real" project something like this:
#:reference ./Tools/Tools.csproj
That would also enable things like this in a csproj file:
<ProjectReference Include="..\Tools\Helpers.cs" />
Referencing other projects is currently out of scope (and might always be, file based programs are intended to be simple), you can convert file based programs to full projects though to regain full functionality.
I disagree; this is the kind of decision that makes the CLR the "C# Language Runtime", instead of serving its original purpose, where there was an ecosystem of common approaches.
Unless you plan to also support the same approach on VB and F# scripts.
F# and C# are intentionally different languages, I don't think it makes sense to have 100% equivalent functionality in both. AFAIK, F# already has scripting functionality built in. VB support could be added, but that seems unlikely given https://learn.microsoft.com/en-us/dotnet/fundamentals/langua...
janjones, can you please advocate inside your team for taking F# seriously? It should have been a first-class language inside .NET.
I dread the moment C# introduces sum types (long overdue) and then has them be incompatible with F#. On a general note, the total disregard toward F# is very much undeserved. It is far superior to the likes of Python and would easily be the best in the areas of machine learning and scripting, but apparently nobody inside MS is that visionary.
> It is far superior to the likes of Python, and would easily be the best in the areas of machine learning...
The fact that Python is the language of choice for ML makes it clear that technical superiority was not necessary or relevant. Python is the worst language around. The lack of better languages was never the problem. Making F# technically better wasn't going to make it the choice for ML.
Agreed. But MS has certainly some weight to throw around in their own ecosystem. They have their own offerings for data processing and ambitions for AI. If they want to integrate with .net, they are already set. But that requires vision.
Note that until the layoffs of the week predating BUILD, Microsoft was exactly one of the companies pushing for improving CPython performance.
This is why I always say that when the .NET team complains about adoption among new generations not going as desired, they should look at their own employer first.
This feature is probably a big deal for .NET developer productivity. It's quite a shame that it only arrived now.
There is another thing I'm really missing for .NET projects. The possibility to easily define project-specific commands. Something like "npm run <command>"
Until now, everyone needed to do `dotnet tool install -g dotnet-script` before running them. That was the most annoying part, which is where the .NET 10 announcement is really appreciated.
But then each script has an individual list of dependencies so there should be no need for further scoping like in npm (as in, the compilation of the script is always scoped behind the scenes). In this regard, both should be similar to https://docs.astral.sh/uv/guides/scripts/#declaring-script-d... which I absolutely love.
I don't think make and task are suited very well for running scripts that are not part of the build process. Something like scaffolding, or starting a dev dependency. Also it requires the developers to install additional tools to get started with the project.
Nah, we've been doing this for years. At least ten years ago I built myself a little console based on Roslyn that would evaluate any C# snippets I gave it. C# scripts were pretty well supported, and it was not hard to throw one together. The engine of my console was a couple dozen lines mostly just processing the string.
I'm sure the tooling is better now, though. I seem to recall Visual Studio and/or Rider also supporting something like this natively at the time.
PowerShell is the ultimate ChatGPT language, for better or worse. Usually worse, as most shops end up with "write-only" PowerShell scripts running all the glue/infrastructure stuff.
It sounds like it can potentially replace way more than PowerShell. I mean, why would a .NET shop even bother with Python or any form of shell script if they can attach a shebang on top of an ad-hoc snippet? And does anyone need to crank out an express.js test service if they can simply put together an ASP.NET minimal API in a script?
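As a sketch of that last point: judging by the announcement's directives, a single-file minimal API should look roughly like this (untested here; assumes the .NET 10 preview SDK, and that `#:sdk` works as documented):

```csharp
#!/usr/bin/env -S dotnet run
#:sdk Microsoft.NET.Sdk.Web

var app = WebApplication.CreateBuilder(args).Build();
app.MapGet("/", () => "Hello from a C# script");
app.Run();
```

One file, no csproj, and it still converts to a full project later if it outgrows the script form.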
I've been writing shell scripts in PHP for more than 20 years for this reason. Don't work on a lot of PHP sites any more but I still do most of my shell scripting in it. I think this'll be a big win for C# users once they get used to the paradigm shift.
I notice another poster said it's a bit slow but for many common use cases even half a second startup time is probably a small price to pay to be able to write in the language you're familiar with, use your standard libraries, etc.
> Ecosystem, people keep forgetting syntax, grammar and standard library isn't everything.
Ecosystem means nothing if you have comparable or even better alternatives in a framework of choice.
Also, it's not like the likes of Python don't have their warts. Anyone with even cursory experience with Python is aware of all the multiplatform gotchas it has with basic things like file handling.
For me, every time I have to use python it's the package handling that leaves my head spinning. It still feels like the bad old days of npm.
I think it's a popular language with scientists despite that, because they don't have to care about portability, reproducibility, or your replacement needing to be able to run it without ever speaking to you.
I use python infrequently enough that every time it's a pain point.
Not sure I follow. I wish python was as good as npm/node_modules. And how is nuget better than npm? Is it just package quality or something else? I rarely use npm and I'm not a webdev but whenever I use it I think it's pretty great.
Ah sorry, that makes sense! Yeah, that's exactly how I feel too. It's sad that npm has improved so much in the same time frame while Python's packaging hasn't (not by default, at least, whereas npm is basically the default in JS projects by now).
It certainly is, unless the folks at the .NET shop get to be the ones writing the missing libraries, related tools, books, tutorials, conference talks,....
Oh God. I hadn't considered that windows sysadmins are likely the most prolific ChatGPT scripters. If I was still one, given the state of the MS docs, I would be guilty for sure.
I think most people would be shocked to know how much glue code is in powershell or bash/perl that kind of keeps everything running.
I remember looking on in horror as a QA person who was responsible for the install/deploy of some banking software scrolled through the bash/perl script that installed this thing. I think it had to be 20k+ lines of the gnarliest code I've ever seen. I was the java/.net integration guy who worked with larger customers to integrate systems.
My group insisted we should do single sign-on in a Perl script, but I couldn't get the CPAN package working. I had a prototype in Java done in an afternoon. I never understood why people loved Perl so much. Same with PowerShell. Shell scripters are a different breed.
On the other hand, I was once tasked to rewrite a couple of Korn shell scripts, initially written for Aix, ported to Red-Hat, into Java.
The reason being the lack of UNIX skills on the new team, and apparently it was easier to pay for the development effort.
Afterwards there were some questions about the performance decrease.
I had to patiently explain the difference between having a set of scripts orchestrating a workflow of native code applications written in C, and having the JVM do the same work, never long enough for the JIT C2 to kick in.
> PowerShell scripts running all the glue/infrastructure stuff.
I'm pleased to report it is usually not possible to do that. It would only create a huge mess. C# is more conducive for anything more than a few methods. And there is almost no barrier. PS is great for smaller ad-hoc stuff, and it is the "script that is on every Windows platform" component similar to what VBScript was a few years ago.
That's great; now I can finally have scripts with type-safety. Note that on macOS the shebang either reads `#!/usr/local/share/dotnet/dotnet run` or `#!/usr/bin/env -S dotnet run`.
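Putting that together, a self-contained executable script might look like this (a sketch; requires the .NET 10 preview SDK on PATH, and `env -S` support in your coreutils):

```csharp
#!/usr/bin/env -S dotnet run
Console.WriteLine("typed scripting, at last");
```

Mark it executable with `chmod +x hello.cs` and run it as `./hello.cs`.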
Kind of obsoletes NetPad, and as soon as there's debugging, LINQPad can be put out to pasture. LINQPad was instrumental for me many years ago, and I appreciate that, but that stone-age text editor is not usable for actually writing/editing code in this decade.
LINQPad is super cool in certain .NET ways, but oh man, the text editor component is the worst one I interact with on a regular basis. I wish that part would be replaced by Neovim or Monaco or, basically, anything. The snappiness, the table visualizations and so forth: very nice.
It’s also been unable to keep up with notebook tech for many potential use cases. I guess it’s a one man show and it shows.
Still, massive hat tip - I use Linqpad every day because it’s super useful for playing with your SQL data.
I dunno about obsoleting LINQPad yet. Half the power on LINQPad is its UI; I'd like to see how comparable VSCode / VS are with dotnet run vs LINQPad.
IE: LINQPad has a great way to visualize results. If dotnet run only outputs text, or otherwise requires a lot of plugins to visualize an object graph, there will still be quite a niche for LINQPad.
In contrast, if all you're using LINQPad for is to double-check syntax, then dotnet run might be a better option. (Sometimes if I'm "in the zone" and unsure about syntax that I use infrequently, I'll write a test LINQPad script.)
My main usecase for linqpad is database interactive stuff or exploratory code with .dump().
I see this as more of a complement to that. However I have worked at places that HATED powershell and we used linqpad for almost all scripting. It worked ok.
Looks like they will be adding support for VS Code, including for debugging.
"In upcoming .NET 10 previews we’re aiming to improve the experience of working with file-based apps in VS Code, with enhanced IntelliSense for the new file-based directives, improved performance, and support for debugging."
I'm excited for this one. I can see it replacing some of the powershell scripts I have in CI/CD pipelines.
As much as I like PowerShell and Bash, there are some tasks that my brain is wired to solve more efficiently with a C-like syntax language, and this fills that gap for me.
Finally! Perfect for making a quick utility to use in a script. There have been third-party solutions, but having to install something to do this was always another obstacle/headache.
I feel like top level statements were created specifically to open the door to C# scripts. The syntactic sugar of top level statements doesn't make sense in regular .NET production apps that you need to build and run, but for quick scripts they do represent important improvements in DX. Well done.
Their argumentation for both is accessibility for learning environments. Python and Node were eating their cake there with ease of first use (python file.py). And what academia uses today is next year's company language.
As someone mostly focused on JVM, .NET and nodejs ecosystems, this won't cut it.
The problem with UNIX culture shops not picking up .NET has everything to do with the Microsoft stigma, and everything the management keeps doing against .NET team efforts, like VSCode vs VS tooling features, C# DevKit license, what frameworks get to be on GNU/Linux, and the current ongoing issues with FOSS on .NET and the role of .NET Foundation.
Minimal APIs and now scripting, which already existed as third party solutions (search for csx), won't sort out those issues.
They can even start by going into Azure and checking why so many projects are now choosing other languages instead of .NET when working in the open.
This would already be the first place to promote .NET adoption.
I mean I don't want the viability of my business to be dependent on the whims of Microsoft either. Unfortunately your business is going to have counterparty risk no matter what your tech stack is but Microsoft's record is pretty mixed.
Installing the full dotnet sdk was very high friction, in addition to the full csproj scaffolding. Running a single cs file is a HUGE ux improvement.
If python hadn't (nearly) caught up to c# in typing support, I'd seriously consider moving or at least running it...but as it stands, python has established itself too well for me.
Python typing hasn’t nearly caught up with C#. I regularly use both and c#’s type system is pragmatic and very helpful in avoiding bugs, whereas Python has no types by default and doesn’t check them by default when it does have them. It’s worse than typescript because at least typescript fails to compile if you have a type error.
You still need the full dotnet SDK; the tool merely automates the csproj scaffolding, which is then compiled with the full SDK, Roslyn, NuGet, the compiler server and everything, and it lets you customize MSBuild project properties with pragmas in code. The only difference is that before you had sdk+folder+my.cs+my.csproj; now you have sdk+folder+my.cs.
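The pragmas in question look roughly like this. A minimal sketch based on the preview announcement; the `#:package` and `#:property` directive spellings are from the preview and may still change, and the Humanizer package is just an illustration:

```csharp
#!/usr/bin/env dotnet run
#:package Humanizer@2.14.1
#:property LangVersion preview

using Humanizer;

// Top-level statements: no Main, no namespace, no csproj on disk.
// `dotnet run app.cs` scaffolds a project in memory from the
// directives above before compiling and running the file.
Console.WriteLine(TimeSpan.FromDays(3).Humanize());
```

The directives are effectively the project file inlined as comments, which is why `dotnet project convert` can later expand them into a real csproj.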
Very cool. Will probably still stick with LINQPad for my every day experimentation needs but if i decide to write actual scripts for ~production use, this would be a better alternative to Powershell in most cases I think.
It's good to finally see this being 'a thing' in C#. In my opinion, it is 10 years overdue!
This feature is likely added to compete with Python, Ruby, etc. The fact you just create a file, write some code and run it.
However, I don't see C# being a competitor of said languages even for simple command lines. If anything, it could be a viable replacement to Powershell or maybe F# especially if you need to link it to other .NET DLLs and 'do things'
I am also interested in the performance difference compared to other languages. Even Dlang has had a script-like feature for some time now: rdmd. Not sure of its current status, but it still compiles and runs the program. It just seems overkill for something rather simple.
This is about education and mindset, not about entering that space. There might be some people working with it, if they are more comfortable with C# than with bash.
Performance right now is horrible (500ms). They promised to improve it, but let us be honest: they are scaffolding a project in memory, running an MSBuild restore to resolve and reuse dependencies, and then compiling a DLL and loading it into a JIT. That is many more steps than an interpreter has to do.
I couldn't help noticing that none of the autocomplete suggestions in his terminal or VS Code were relevant or correct. It looked really annoying. Is that the usual quality Copilot delivers?
It's slightly better for established code but it's like all of them. If it's well-known problem, it's decent. If it's esoteric or unique to your business, it's confused.
Since C# single file is new, there is not a ton of code for Copilot to reference so it's probably confused.
.NET Interactive had already added a directive for C# NuGet references, compatible with F#'s, and NuGet's picker had already labeled that syntax "Script & Interactive". But no, they had to invent a new directive.
I've been developing with .NET since version 1.1. I feel like it's always been pretty easy to use for beginners. You install Visual Studio, create a new project and BOOM you've got a program that builds and runs.
Having used Turbo Pascal and Turbo C prior to Visual Basic and Visual C, Microsoft's "Visual" IDEs and other Windows based IDEs of the 90's were a step up in ease of use even if they did require more files to build a project.
They indeed started years ago. For some time it's been possible to run a project with a Program.cs file that doesn't contain a static class. There's also the so-called ASP.NET minimal API. And .NET scripting got official support a few releases ago.
With `uv` you can even put a magic comment atop the script saying, "Use this version of Python, use these dependencies," and uv will fault the runtime and deps into a cache folder before launching the script. 2025 might be the year of Python on the desktop.
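The magic comment format uv reads here is the PEP 723 inline script metadata block. A sketch of a self-contained script (the dependency list is empty so the body stays runnable with plain Python; real scripts would list packages there):

```python
#!/usr/bin/env -S uv run
# /// script
# requires-python = ">=3.11"
# dependencies = []
# ///

# The metadata block above is plain comments, so this remains an
# ordinary Python file; uv parses it to provision the interpreter
# and dependencies into its cache before launching the script.
import json


def count_keys(payload: str) -> int:
    """Count top-level keys in a JSON object string."""
    return len(json.loads(payload))


if __name__ == "__main__":
    print(count_keys('{"a": 1, "b": 2}'))
```

Because the header is inert comments, the same file works with `python script.py` (assuming the dependencies happen to be installed) and with `uv run script.py`.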
When Astral ships the ability to package your Python script + its dependencies in a single, statically linked executable, it's game over for everyone else.
Yes, but it is not without cost. The time poured on all the failed projects to make Python into what it fundamentally isn't is quite wasteful in my opinion. But people are free to do whatever they want with their own time.
Some 20y ago I used "scratchpad" or whatever it was called to write short "bash" scripts and run them directly from that "notepad"... and it was a blast.
I never understood why it has to be so hard on Windows to enable users to do just a little bit of scripting like it's not 80's anymore.
LINQPad [1] is a terrific utility for this workflow with dotnet. I was part of a team where we built all kinds of runbooks, support utilities, and debugging tools with LINQPad using a common set of utility functions to dump (pretty-print) system-specific entities and even charts.
I'm building a dotnet job orchestrator called Didact (https://www.didact.dev), and this is the sort of thing I was looking for years ago when I was first dreaming it up. Class libraries is the approach I am taking now, but this is still extremely interesting. Could see some interesting use cases for this...
&& in a shebang isn't portable; having more than one argument in a shebang isn't portable; a subshell in a shebang isn't portable; a semicolon in a shebang isn't portable; the basename usage isn't portable. Not littering the current working directory is important (it might not even be RW, or you might not have permissions to write to it), and your usage won't work with symlinks or with the script added to $PATH. Caching the compiled output is a nice speedup and reduces startup costs, and correct (and optimal) cache invalidation is one of the known tricky problems in CS.
Shell scripts are ugly :)
(If you didn't mean to use them in a shebang but rather as the script body, that's fine but that wouldn't be possible without the polyglot syntax and #allow abuse I posted.)
Give me some sane subset of TypeScript with native support and I'll use it everywhere. But C# is far from as convenient as TypeScript so far. Just a few more years of convenience language features and APIs added, and I think it'll be set.
Interpreted general purpose languages have been fairly similar for a while. I worked with C# for a decade, Go replaced it for us (for human reasons not technical ones) but these days everything is basically Python and then some C/Zig for parts that require efficiency.
Didn't Mono implement something like this ages ago? And I mean ages ago, before C# even had standardized code outside "static void Main" in the language spec, IIRC they had an interpreter that probably used Reflection.Emit or something and executed the output and you could #! it.
I've been coding in C# for more than a decade and a half now, and I see myself using this rather than creating an entire project structure to test some new approach to something.
No, I do absolutely remember them integrating dotnet-script-like functionality into the runtime, and I remember experimenting with it.
Was it only in a beta version and removed, to reappear now? It used the Roslyn compiler to compile on demand and then execute, and used pragma directives that were compatible with dotnet-script's. I cannot remember what the shebang was set to, but it wasn't the same one as dotnet-script's.
On the topic of scripts, it would be nice if they had a more seamless way to invoke processes. The whole Process.Start() way can be convoluted when I just want something simple.
Either way, this is a dream come true. I always hated Powershell and its strange fixation with OOP. Modern C# seems to acknowledge that sometimes all we want are functions, scripts, LINQ and records.
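For reference, the ceremony being complained about: capturing a command's output via Process.Start takes roughly this much setup (a sketch; assumes `git` is on PATH, error handling omitted):

```csharp
using System.Diagnostics;

// Compared to a shell's `$(git status --short)`, the .NET way needs
// an explicit ProcessStartInfo with redirection configured by hand.
var psi = new ProcessStartInfo
{
    FileName = "git",
    Arguments = "status --short",
    RedirectStandardOutput = true,
    UseShellExecute = false,
};

using var proc = Process.Start(psi)!;
string output = proc.StandardOutput.ReadToEnd();
proc.WaitForExit();
Console.WriteLine(output);
```

There are third-party wrappers that shorten this, but nothing this terse ships in the BCL today.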
There's a rule that if something is easy to invent, a bunch of people will whip up their own versions independently, sometimes even within the same organisation.
.NET Interactive is a kernel-based approach for notebooks. CSI is more like a traditional REPL (extracted from a VS feature). This here is more like traditional scripting: it is coding without a project file. There is no attempt at interactivity here.
What would be really useful is to include one of these with Windows, so that it simply works and can be used for scripting without needing to write a batch file to bootstrap a PowerShell script to bootstrap the .NET runtime to bootstrap the C# scripting environment.
> There's a rule that if something is easy to invent, (...)
You seem to be confused. There is nothing being invented here. What they are announcing is basically an update to the dotnet command-line app to support building and running plain vanilla C# programs.
Maybe this iteration it won't be a bolt-on, who knows. Because of the origin of C# (Microsoft's Java replacement) it's all still very MSBuild/IDE-magic-ish instead of being its own thing where you can decide your own dependency resolution, your own compiler and your own linker.
It's similar to cmd.exe and conhost etc. It's all tied to decades old legacy baselines that Microsoft just won't or can't let go of.
I used to think there's nothing wrong with them either. For simple setups it is very painless. However, I recently stumbled into a use case where setting up the project file is a huge hassle and very error prone as well.
Mainly I wanted another project to be referenced only at the "analyzer" step, and this appeared to work just fine at first, but after the 3rd/4th compilation it failed for some reason. Digging into this opened a huge can of worms of partially/falsely documented features for something I sincerely believed should be easy to achieve. Now the only thing I can do is copy the project into the analyzer, or let the build fail with the dependency disabled just so that the next build works again.
There's also some issues regarding restoring other projects, but that doesn't appear to be the fault of .csproject files.
P.S.: Having a project be referenced (i.e.: marker attributes) at both the analyzer step _and_ later in the final artifact is something I never got working reliably so far. From what I've read a nuget project could make this easier, but I don't want to do that and would expect there to be a way without using the package-management system.
> I used to think there's nothing wrong with them either.
From your complaint, it doesn't seem you're pointing out anything wrong with .csproj files. You struggled with a use case that's far from normal and might not even be right for you. Without details, it's hard to tell if you're missing something or blaming the tool instead of focusing on getting things to work.
I found a solution I am happy with. Based on my research into this it is a problem I'm not alone with.
I do concede that writing your own analyzers is unusual (which is why I wrote that it's fine for simple setups).
At the same time I deem having a common library to be referred by more than one analyzer project something that should be possible without running into strange errors.
If a tool (dotnet build) tells me something is wrong I am fine with it. If the same tool works after I added something to the project description and then fails at a random time after without having changed the referenced stuff, then I will happily blame the tool.
Especially when commenting out the reference, recompiling until the error, and then uncommenting it fixes the issue. While this behavior doesn't necessarily indicate an issue with the files per se, there is only one entity consuming them, so to me there is no distinction.
Back when I last used C#, which admittedly is over 10 years ago, .csproj files were written in XML so they were annoying to edit. It was difficult to understand the structure of them, and I'm not sure if they were documented very well.
Just compare a .csproj to something modern like a Cargo.toml and you'll see why someone might think .csproj is awful. It is immediately obvious what each section of a Cargo.toml does, and intuitive how you might edit or extend them.
Also, just talk to a C# developer; I bet over half of them have never even edited a .csproj and only use Visual Studio as a GUI to configure their projects.
Very outdated view. Csproj files got hugely simplified with .NET Core, and thank God they kept the XML format instead of the JSON they tried at first.
(It's named "SDK-style" because that Sdk="Sdk.Name" attribute on the Project tag does a ton of heavy lifting and sets a lot of smart defaults.)
In the "SDK-style" format, files are included by default via wildcards; you no longer need to list every file in the csproj. Adding NuGet references by hand is now as easy as adding <PackageReference Include="Package.Name" Version="1.0.0" /> under an <ItemGroup>. (For a while NuGet references were in their own file, and there was also a dance of "assembly binding redirects" NuGet sometimes needed to include in a csproj. All of that is gone today, simplified to a single tag that's easy to write by hand.) Other previously "advanced" things you'd only trust the UI to do are more easily done by hand now, too.
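For contrast with the old verbose format, a complete SDK-style project file today can be this small (the package name and version are just illustrative):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  </ItemGroup>

</Project>
```

Everything else (source globs, default references, output paths) comes from the `Sdk="Microsoft.NET.Sdk"` attribute's defaults.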
> Also, just talk to a C# developer, I bet over half of them have never even edited a .csproj and only use visual studio as a GUI to configure their projects.
Depends on the era, but in the worst days of the csproj verbosity where every file had to be mentioned (included as an Item) in the csproj I didn't know a single C# developer that hadn't needed to do some XML surgery in a csproj file at some point, though most of that was to fix merge conflicts because back then it was a common source of merge conflicts. Fixing merge conflicts in a csproj used to be a rite of passage for any reasonably sized team. (I do not miss those days and am very happy with the "SDK-Style csproj" today.)
Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible. Isn't that one of the most basic things that should be easy.
Half of the properties are XML attributes <tag ID=5> while the other half are child tags <tag><hello>5</hello></tag>
It's okay to read, but basically impossible to write by hand.
The tooling support is nothing like a JSON with JSON schema that gives you full intellisense with property autocomplete plus attribute description. (Imagine VSCode settings)
> Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible.
Is it, though? The guidelines seem to be pretty straightforward: unless you want to target a new language feature, you can just target netstandard or whatever target framework your code requires at the moment. This barely registers as a concern.
> It's okay to read, but basically impossible to write by hand.
Are you sure about that? I mean, what's the best example you can come up with?
This complaint is even more baffling considering a) the number of XML editor helpers out there, and b) the fact that most mainstream build systems out there are already XML-based.
I personally wouldn't bother with .NET Standard anymore. .NET Standard 2.0 is nicely frozen and still useful to some legacy shops, but at this point I think for all new code the only decision is to support LTS or not, and then just pick the latest version.
.NET Standard 2.0 will break? I think it's frozen for good. That's also the biggest problem with it: there are so many performance improvements in .NET 7+ that .NET Standard can't opt into but come "free" with retargeting. (Among other things, a lot more Span<T> overloads throughout the BCL. There is a compatibility shim for Memory<T> and Span<T> access in .NET Standard 2.0 and that also includes some of the new overloads, but not all of them.)
Targeting a specific .NET version will break in a year? LTS versions specifically have two years of support. But also .NET has very rarely broken backward compat and you can still easily load libraries built targeting .NET 5 in .NET 9 today. You don't necessarily have to "keep up with the treadmill". It's a good idea: see "free performance boosts when you do". But it isn't required and .NET 5 is still a better target from today's perspective than .NET Standard 2.0. (The biggest instance I know of a backwards compatibility break in .NET was the rescoping between .NET [Framework] 3.5.x and .NET [Framework] 4.0 on what was BCL and what was out/no longer supported and that was still nothing like Python 2 versus 3. I know a lot of people would also count the .NET Framework 4.x and .NET Core 1.0 split, too, which is the reason for the whole mess of things like .NET Standard, but also .NET Standard was the backward compatibility guarantee and .NET Standard 2.0 was its completion point, even though yes there are versions > 2.0, which are even less something anyone needs to worry about today.)
AFAIK, net5 and net9 are incompatible with each other and you can't run one on another, major versions and all that. You need netstandard to run on different versions, but it's only for libraries, you can't run a process on netstandard.
I don't think things are quite that bad. I'd take a csproj files over many Maven files or Makefiles. The three or four ways I've seen Python manage dependencies didn't improve things either. I'm quite comfortable with Rust's toml files these days but they're also far from easy to write as a human. I still don't quite understand how Go does things, it feels like I'm either missing something or Go just makes you run commands manually when it comes to project management and build features.
I don't think there are any good project definition files. At least csproj is standardised XML, so your IDE can tell if you're allowed to do something or not before you try to hit build.
As for targeting frameworks and versions, I think that's only a problem on Windows (where you have the built in one and the one(s) you download to run applications) and even then you can just target the latest version of whatever framework you need and compile to a standard executable if you don't want to deal with framework stuff. The frameworks themselves don't have an equivalent in most languages, but that's a feature, not a bug. It's not even C# exclusive, I've had to download specific JREs to run Java code because the standard JRE was missing a few DLLs for instance.
The "built-in to Windows" one is essentially feature frozen and "dead". It's a bit like the situation where a bunch of Linux distros for a long while included a "hidden" Python 2 for internal scripts and last chance backwards compatibility even despite Python 3 supposed to be primary in the distro and Python 2 out of support.
Except this is also worse because this is the same Microsoft commitment to backwards compatibility of "dead languages" that leads to things like the VB6 runtime still being included in Windows 11 despite the real security support for the language itself and writing new applications in it having ended entirely in the Windows XP era. (Or the approximately millions of side-by-side "Visual C++ Redistributables" in every Windows install. Or keeping the Windows Scripting Host and support for terribly old dialects of VBScript and JScript around all these decades later, even after being known mostly as a security vulnerability and malware vector for most of those same decades.)
Exactly the reason why The Year of Desktop Linux has become a meme, and apparently it is easier to translate Win32 calls than convince game devs already targeting POSIX like platforms to take GNU/Linux into account.
> Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible.
It's gotten real simple in the last few years, and is basically the exact same flowchart as Node.JS:
Do you need LTS support? --> Yes --> target the most recent even number (.NET 8.0 today)
^--> No --> target the most recent version (.NET 9.0 today)
Just like Node, versions cycle every six months (.NET 10, the next LTS, is in Preview [alpha/beta testing] today; the feature being discussed is a part of this preview) and LTS add an extra year and a half security support safety net to upgrade to the next LTS.
Everything else can be forgotten. It's no longer needed. It's no longer a thing. It's a dead version for people that need deep legacy support in dark brownfields and nothing more than that.
> Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible. Isn't that one of the most basic things that should be easy.
What do you think happens when you try to use Python 3.9 to run a script that depends on features added in 3.10? This is inherent to anything that requires an interpreter or runtime. Most scripting tools just default "your stuff breaks if you get it wrong" whereas .Net requires you to explicitly define the dependency.
It is insane to me how long it takes people to realize that low barriers for execution and experimentation are important.
Imagine if the Commodore 64 or Microsoft Basic required you to scaffold your program with helper files containing very specific and correct information in order to run those programs.
Which 10 extra keywords? It is a statically typed language (bringing things like types, type parameters, ...), object oriented (bringing things like abstract, override, class, ...) and high performance (bringing things like ref, in, out, ...).
And it is expressive instead of "mathematical" (like ML languages), which makes productivity in code reviews a thing.
Maybe functional language programmers will one day pull the stick out of their arses and get a bit less supercilious, and realise that expressiveness and function name length has absolutely nothing to do with runtime performance, especially in native and properly JITed runtimes. Maybe they'll realise it makes things easier to approach, read, reason about, and hence write more correctly.
> OOP is a scam, a useful scam but still a scam, it is in no way easier or better to force _everything_ to be a class
You're in luck - C# doesn't force everything to be a class and has many functional programming features. Hell, even C++ has had functions as data for, like, 10 years now.
Well, to do anything new in a C# project one typically ends up making a class inside a new namespace. If I just want to write a solver with no state, it's quite jarring and distracts from an otherwise pretty good language that I like working in. So 2 new nouns and a handful of keywords.
That's it. No namespaces (which were never required anyway, not even in C# 1.0), no classes, no functions even. If you want to define a function, you can just do so in global scope, and if it's a single expression you don't even need the braces:
int Fib(int n) => (n <= 1) ? n : Fib(n - 1) + Fib(n - 2);
They've been supported for decades at this point, they're not new. Your suggestion would require more changes to the language than what's actually implemented, afaiu.
Yes, but I would have thought attributes were the canonical "think you might want to reflect over" rather than this imperative solution. But you're all happy with it and it's not my circus, so enjoy.
This cannot be something you reflect over, because these directives encode build arguments.
They need to be parsed before the code is compiled, which is exactly what preprocessor directives allow.
Java's annotations (which are mostly equivalent to C#'s attributes) can be consumed at compile time using annotation processors. If C# had this feature for attributes, could attributes have been used here?
C# has this feature; it's called a Roslyn source generator.
I believe Java annotations cannot change the build parameters of the package currently being compiled, which is why you wouldn't be able to do that in Java.
BTW, anything present in the AST could be used for that, but I think the preprocessor directive is the most sensible choice.
From what I've read of this preview `dotnet list` works as expected. Under the hood it supposedly builds a mini-project file so that most of the `dotnet` cli works as expected. It also provides an "eject button" to build the csproj from the script file `dotnet project convert`.
The motivation was almost certainly compatibility with UNIX shebang lines for shell scripts:
#!/usr/bin/dotnet run
Console.WriteLine("Hello from a C# script!");
Which just looks like a crime against nature and the right order of things.
It's really hard to explain to anyone who wasn't born in a monoculture, but I come from an eastern european country where in my childhood there may have been maybe a few hundred truly foreign people living there at any one time, mostly diplomats and their families. Decades later I visited and saw a chinese immigrant speaking my native tongue. You can't imagine how... disconcerting that is. Not bad, I'm not against it, it's just... weird.
Great, but unfortunately, even when compiled, the startup overhead is about half a second, which makes it unsuitable for many applications. Still I applaud it, as shell scripting is finicky, people tend to rely on bash features, and perl is kind of over. Ruby was, and still is, my go-to language for this purpose, but I've recently migrated some scripts over to Swift.
Swift does a much better job at this as it interprets by default, and a compiled version starts instantaneously. I made a transparent caching layer for your Swift CLI apps. Result: instant native tools in one of the best languages out there.
Swift Script Caching Compiler (https://github.com/jrz/tools)
dotnet run doesn't need it, as it already caches the compiled version (you can disable with --no-build or inspect the binaries with --artifacts-path)
Where does your half-a-second number come from? I ran a hello world to test; the overhead is 63ms.
neuecc ran benchmark on CLI libs overhead, none reach half a second: https://neuecc.medium.com/consoleappframework-v5-zero-overhe...
> Swift does a much better job at this as interprets by default
The .NET JIT is a tiered JIT; it doesn't emit fully optimized code immediately.
Not GP, but can confirm on my M3 Max using the hello world sample:
There are a lot of optimizations that we plan to add to this path. The intent of this preview was getting a functional version of `dotnet run app.cs` out the door. Items like startup optimization are going to be coming soon.
I’m not really into the whole dotnet space. Except during the beta early on BSD ‘00.
It’s good that it allows scripts to run, and does packages. Simple is good
I was just curious and then surprised that it already caches compiled binaries, but that the time remained the same.
I opened an issue since I couldn't find docs that indicate what they were working on to improve the start time, and they replied:
https://github.com/dotnet/sdk/issues/49197
Ah, I didn't manage to find anything that talked about what was planned for this, so I opened an issue asking for that. Is there a doc somewhere talking about it?
maybe it's worth adding that info to the blog post :)
thanks Jared and team, keep up the great work.
If you're trying things like that in a Mac, watch out for the notary check delay https://eclecticlight.co/2025/04/30/why-some-apps-sometimes-...
It can easily add hundreds of milliseconds in various situations you can't easily control.
This is nuts. More than a decade ago Microsoft made a big deal of startup optimisations they had made in the .Net framework.
I had some Windows command-line apps written in C# that always took at least 0.5s to run. It was an annoying distraction. After Microsoft's improvements the same code was running in 0.2s. Still perceptible, but a great improvement. This was on a cheap laptop bought in 2009.
I'm aware that .Net is using a different runtime now, but I'm amazed that it's so slow on a high-end modern laptop.
To be fair, this is timing early bits of a preview feature. Compiled .NET apps have much better startup perf.
This is also a preview feature at the moment. They mention in the embedded video that it is not optimized or ready for production scenarios. They release these features very early in preview to start getting some feedback as they prepare for final release in November.
They are not on Windows. As someone else pointed out, there are very likely Apple-specific shenanigans going on.
For comparison, skipping dotnet run and running the compiled program directly:
So yeah, the overhead of dotnet run is pretty high in this preview version. I can also confirm the overhead on Windows 11 (it's a simple Console.WriteLine).
First run was around 2500ms, consecutive runs around 1300ms. IME, Windows Defender hates compilers. When I run my big C++ project, Defender consumes at least 60% of the CPU, even when exempting every relevant file, directory, and process.
Task manager doesn't show it, but process explorer shows kernel processes and the story is quite clear.
I run in a Debian arm64 container. I get 500ms consistently. It is using a cached binary, because when I add --no-build, it used the previous version. I'm not sure where it stores cached versions, though.
I’ll try to compare with explicitly compiling to a binary later today.
But that’s the thing. It’s a JIT, running a VM. Swift emits native code. Big difference.
Maybe I'll add AOT compilation for dotnet then... Strange they didn't incorporate that, though.
What command do you use?
> But that’s the thing. It’s a JIT, running a VM. Swift emits native code. Big difference.
It's not only a JIT: you can pre-JIT with R2R if you need to, or precompile with NativeAOT, or I think you can fully interpret with Mono.
Edit: it looks like the issue is with the dotnet CLI itself, which until now was not on a 'hot path'. `dotnet help` also takes half a second to show up. When running a DLL directly, I think it doesn't load the CLI app and just runs the code necessary to start the DLL.
Tangential, but Windows Powershell kept nagging me to download PS6, so I did, then I had to revert to 5.1 because running a script had a ~1 second overhead. Very annoying. For one-off runs it's often the startup time that matters, and Powershell just got worse at that. (In the end, I settled for .bat files in a cmd.exe window; ChatGPT can write any of them anyway.)
Powershell 7 was released 5 years ago, was this not an option?
On my M2 Air:
508ms with caching, 1090ms with `--no-cache`
But as others already mentioned, optimizing this seems to be pretty high priority...
Dotnet is getting a fully interpreted mode in 10 or 11 so I wonder if they'll switch to that for things like this
https://github.com/dotnet/runtime/issues/112748
>Great, but unfortunately, even when compiled, the startup overhead is about half a second, which makes it unsuitable for many applications
So why is Python this popular in this domain?
Python has replaced Java in many universities, and is used in other domains than CS.
Python caches compiled versions by default. (__pycache__) and simply starts faster.
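You can watch that caching happen. A minimal sketch (assumes `python3` is on PATH; the module name and temp dir are just for the demo):

```shell
# Create a module, import it once, and observe the cached bytecode that
# makes subsequent imports skip compilation.
dir=$(mktemp -d)
cd "$dir"
printf 'x = 42\n' > mod.py
python3 -c 'import mod; print(mod.x)'   # first import compiles mod.py
ls __pycache__                          # mod.cpython-*.pyc is reused next time
```

Note the cache only applies to imported modules; the top-level script you pass to `python3` is still compiled on every run, it's just fast because compilation is cheap.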
Python is more a scripting language, similar to Ruby. Swift was late to the game, and is quite strict.
*performance is a feature*
And in particular perceived performance and startup performance (“lag”)
It’s one of the reasons for the success of Chrome, MySQL, Mongodb, and many others.
A Python script can start and finish in less than 10 ms on my machine.
Even with installing dependencies?
Because that's what dotnet run does
Does dotnet install the script's dependencies all over again every time you run it? The quoted part was about the 0.5 second startup overhead, which I figured did not include installing the dependencies.
Anyway, lots of Python scripting can be done with the standard library, without installing any dependencies. I rarely use third-party dependencies in my Python scripts.
1. I don't think so, but it may check if it's needed? I don't know.
2. The same can be said about C#, which has a really strong and well-designed standard library.
Ruby can be slow as hell as well. Start the Ruby shell for GitLab. Of course this only happens when tons of packages are loaded, which will probably never happen for a CLI tool, right?
This is still in early preview. They acknowledged the startup speed in several presentations I saw and said that they are working on speeding it up.
It may be that they are speeding it up by keeping the .net runtime resident in memory. They used to do this with Visual Basic runtime
I ran norton utilities on my pc yesterday and noticed a new service - it was .net runtime. Please note that I am a developer so this may be just to help launch the tools.
That’s probably just a non-Visual Studio version of what VS already does with reusing .NET-hosting processes when you run an app in debug mode.
If you want it to start up quickly, you can easily convert it to native code using https://learn.microsoft.com/en-us/dotnet/core/deploying/
> Great, but unfortunately, even when compiled, the startup overhead is about half a second, which makes it unsuitable for many applications.
Which applications are those? I mean, one example they showcase is launching a basic web service.
I'd say anything except long-running processes or scripts that take 5+ seconds to complete.
Cli scripts/apps should simply be responsive, just like websites and apps shouldn’t be slow to open.
Slower scripts result in distraction, which ultimately leads to hackernews being in front of your nose again.
A few of my examples:
Resizing/positioning windows using shortcut keys. Unbearable if it’s not instantaneous.
Open a terminal based on some criteria. Annoying if I have to wait.
> I’d say anything except long running processes or scripts that take 5+ second to complete.
I don't think you're presenting a valid scenario. I mean, the dotnet run file.cs workflow is suited for one-off runs, but even if you're somehow assuming that these hypothetical cold start times are impossible to optimize (big, unrealistic if) then it seems you're missing the fact that this feature also allows you to build the app and generate a stand-alone binary.
So exactly what's the problem?
These cold start times are not hypothetical, as shown by multiple commenters in this thread. They also have been demonstrably impossible to optimize for years. Cold start times for .NET lambda functions are still an order of magnitude greater than that of Go (which also has a runtime). AOT compilation reduces the gap somewhat but even then the difference is noticeable enough on your monthly bill.
This dismissive “startup time doesn’t matter” outlook is why software written in C# and Java feels awful to use. PowerShell ISE was a laughingstock until Microsoft devoted thousands of man-hours over many years to make the experience less awful.
But it doesn’t. It still seems to run in JIT mode, instead of AOT. That’s exactly why I made swift-scc (interpret vs compile, but essentially the same problem)
> But it doesn’t.
I recommend you read the article. They explicitly address the usecases of "When your file-based app grows in complexity, or you simply want the extra capabilities afforded in project-based apps".
Complexity is not the problem. It's a simple hello world example.
Imagine cat, ls, cd, grep, mkdir, etc. would all take 500ms.
It’s the same as the electron shit. It’s simply not necessary
> Imagine cat, ls, cd, grep, mkdir, etc. would all take 500ms.
Those are all compiled C programs. If you were to run a C compiler before you ran them, they would take 500 milliseconds. But you don't, you compile them ahead of time and store the binaries on disk.
The equivalent is compiling a C# program, which you can, of course, do.
> hypothetical cold start times
Long standing complaint about .NET / .NET Core
2017 Github issue: https://github.com/dotnet/core/issues/1060
2018 Github issue: https://github.com/dotnet/core/issues/1968
Regular people complaining, asking, and writing about it for years: https://duckduckgo.com/?t=ffab&q=cold+start+NET.&ia=web
Right up to this thread, today.
Why are you denying that this exists?
Yep, little can be done about it, even grep suffers from cold start.
Does this recompile each time? It should be simple to cache the binary based on a hash of the input. A sub-second first run followed by instant reruns seems acceptable.
Binaries are cached and next runs are much faster
The dotnet run command caches. However, even with the cached version, you have a startup overhead of about half a second on my M1.
My "Swift Script Caching Compiler" compiles and caches, but will stay in interpreted mode the first three runs when you're in an interactive terminal. This allows for a faster dev-run cycle.
I use “tcc -run” instead. It’s instantaneous.
Python seems to fit the old Perl niche.
It's interesting that they're actively promoting using it with a shebang. I find it pretty appealing.
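The shebang flow they promote looks roughly like this. Actually executing the file requires the .NET 10 preview SDK; creating it works anywhere:

```shell
# Write a single C# file with a shebang and mark it executable.
cat > hello.cs <<'EOF'
#!/usr/bin/env dotnet
Console.WriteLine("Hello from a C# script!");
EOF
chmod +x hello.cs
head -n 1 hello.cs
# ./hello.cs   # would run it on a machine with the preview SDK on PATH
```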
Go prior to modules worked really well this way and I believe Ubuntu was using it like this, but the Go authors came out against using it as a scripting language like this.
The authors didn't come out against it; they came out in favor of using it as a programming language first and foremost. Tools like gorun https://github.com/erning/gorun have existed for almost as long as Go has, so if you want to use Go that way, it is easy to do. They recently added support to do this: go run github.com/kardianos/json/cmd/jsondiff@v1.0.1 which pulls a tag and runs it directly, which is also kinda cool.
I used to work for a .NET shop that randomly wrote some automation scripts in bash. The expertise to maintain them long term (and frankly, write them half-decently to begin with) simply wasn't there. Never understood why they didn't just write their tooling in C#.
Maybe this will make it seem like a more viable approach.
Or just use powershell. It has some idiosyncrasies but its a pretty nice platform for scripting
Especially for C# developers. You can use any CLR (e.g. C#) objects in PowerShell, for prototyping, automation, proof of concept, etc.
Here’s the really annoying thing with powershell: it doesn’t have a module distribution model. Like, at all. A csharp script, on the other hand, gets NuGet out of the box.
What is the PowerShellGet module[1] missing? It comes preinstalled in all versions post 5.1 (though 5.1 itself is limited).
[1]: https://learn.microsoft.com/en-us/powershell/module/powershe...
TFW where you're dead wrong and have to upvote everyone who told you so...
False: https://www.powershellgallery.com/
Not only does it have modules, there are several NuGet-like repos for PowerShell.
It is reasonably unlikely that bash scripts are easily replaceable by powershell scripts.
There's a fair argument that complex scripts require a complex scripting language, but you have to have a good reason to pick PowerShell.
Typically, on non-Windows, there is not one.
It's the same "tier" as bash: the thing you use because it's there, and then reach past for hard things. The same reason as bash.
There's no realistic situation where I would (and I've written a lot of PowerShell) go, "gee, this bash script/makefile is too big and complex, let's rewrite it in PowerShell!"
To this day, error handling in most Unix shells just sucks. Background commands, pipes, command substitutions, functions, if/while conditions, subshells, etc. are all "special cases" when errors are involved. You can make them suck less but the more you try to handle all the possible ways things can fail, the vastly more complex your code becomes, which the Bourne lineage languages just aren't ergonomically suitable for.
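A couple of those "special cases" in a runnable bash sketch; even with `set -e` active, failures in pipelines and in `local` assignments sail right through:

```shell
set -e

# 1. A pipeline's status is the LAST command's, so the failure of `false`
#    is invisible to set -e:
false | true
echo "set -e did not fire"

# 2. pipefail makes the pipeline report the failure (run in a subshell so
#    the demo script itself keeps going):
if ! (set -o pipefail; false | true); then
  echo "pipefail reports the failure"
fi

# 3. `local var=$(cmd)` masks cmd's exit status with local's own (0):
f() {
  local v=$(false)
  echo "exit status seen: $?"   # prints 0, not 1
}
f
```

ShellCheck flags case 3 (SC2155) precisely because it is so easy to miss.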
I think PowerShell was totally right to call this out and do it better, even though I don't particularly love the try-catch style of exception handling. False is not an error condition, exceptions are typed, exceptions have messages, etc.
The problem with PowerShell coming from bash etc. is that the authors took one look at Unix shells and ran away screaming. So features that were in the Bourne shell since the late 1970s are missing and the syntax is drastically different for anything non-trivial. Other problems like treating everything as UTF-16 and otherwise mishandling non-PowerShell commands have gotten better though.
Bash is the tool that’s already there; that is always a real good reason to use it.
Dotnet is a pig with its dependencies by comparison.
Sure, but these days so is Python. And you've got a dozen other languages available after a single command. You wouldn't recommend using a kitchen knife instead of a chainsaw to cut down a tree just because it's already there, would you?
Bash has a huge number of footguns, and a rather unusual syntax. It's not a language where you can safely let a junior developer tweak something in a script and expect it to go well. Using a linter like ShellCheck is essentially a hard requirement.
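The classic footgun ShellCheck exists to catch, as a runnable sketch: an unquoted variable expansion is split on whitespace before the command ever sees it.

```shell
name="My Report.txt"

set -- $name            # unquoted: word splitting yields TWO arguments
echo "unquoted: $# args"

set -- "$name"          # quoted: one argument, as intended
echo "quoted: $# args"
```

`rm $name` would try to delete two files named `My` and `Report.txt`; that is the kind of thing a junior tweaking a script will not anticipate.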
If you're working in a team where 99.9% of the code you're maintaining is C#, having a handful of load-bearing Bash scripts lying around is probably a Really Bad Idea. Just convert it to C# as well and save everyone a lot of trouble.
> Bash has a huge number of footguns, and a rather unusual syntax. It's not a language where you can safely let a junior developer tweak something in a script and expect it to go well.
I'd say, as someone who started with shell/bash around 2000 to look after Linux systems, that it's quite a usual syntax, and I believe that's true for many sysadmins.
No way I'd like to deal with opaque .NET or even Go stuff, incapable of doing "bash -x script.sh" while debugging production systems at 3AM. And non-production as well: just losing my time (and the team's time) on unusual syntax, getting familiar with NuGet, ensuring internet access to those repos, and pinging ITSec guys to open access to them.
> let a junior developer tweak something in a script and expect it to go well
Let developers do their job. Writing bash scripts is something extraordinary for a dev team to do, just because: where are they expected to apply it? I can only imagine "lonely dev startup" situations where it may be reasonably needed.
> Bash is the tool that’s already there; that is always a real good reason to use it.
If you are a dotnet dev shop, it is quite likely that dotnet is also a tool that is already there the places you need automation.
Plus, it's also the tool that is already there in your team's skillset.
I am an old hat. I've been in dotnet dev land for decades and still use bash (git bash).
Pros:
Syntax + access to familiar shell commands
Cons:
Bash scripts are not easy to maintain.
Not an old hat; in the dotnet world for 3 years, bash for over a decade. Agree with bash not being easy to maintain.
The argument about bash always being there breaks down quickly. Even if you limit yourself to bash+awk+grep, they don't work consistently across different bash flavors or across platforms (mac/win/linux).
My approach now is to have my programs, in the language most convenient to me, compiled for a bunch of archs, and have those run across systems. One-time pain, which reduces over time.
> Argument about bash being always there breaks down quickly. Even if you limit yourself to bash+awk+grep, they dont work consistently across different bash flavors or across platforms
IMO this is why Perl was invented and you should just use Perl. Bash isn't portable and isn't very safe either. If you're only going to use a couple of commands, there's really no reason to use bash. The usecase, in my head, for bash is using a variety of unix utils. But then those utils aren't portable. Perl is really great here because it's available on pretty much every computer on Earth and is consistent.
Except Windows, of course. You can install it there but it's plainly obvious it's in a foreign environment.
A lot of people focus on Perl's bad reputation as one of the first web languages instead of its actual purpose, a better sh/awk/sed. If you're writing shell scripts Perl's a godsend for anything complex.
I worked at a dotnet shop a few years ago back. We still used bash over powershell unless we required COM
There's still version/dependency hell (one thing wants 6, one thing wants 8, one thing wants 8.1, etc.).
This can be managed but you have to manage it.
(A fanatical commitment to backwards compatibility can make this a lot easier, but it doesn't seem to me that dotnet has that.)
>Dotnet is a pig with its dependencies by comparison.
Dotnet with all dependencies you will get in how much time? Like 6 minutes, including all the preparations? So the difference between "already there" and dotnet is 6 minutes. It's hard to imagine a case where this difference matters.
It's trivial to install a consistent PowerShell environment on the major OSes. Not so for Bash.
If you write to POSIX it's a lot more portable. If you're doing something outside POSIX, it's probably best to use something else, like Perl.
...Or powershell!
Besides, the trick is knowing what is POSIX compliant and portable, and what isn't. A lot of things will almost work.
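A small concrete example of "almost works": `echo` with escape sequences behaves differently per shell, while `printf` is pinned down by POSIX.

```shell
# bash's builtin echo prints the backslash literally here; dash's echo
# expands \t into a real tab. Same script, different output per shell:
echo "a\tb"

# printf's format-string escapes are specified by POSIX, so this is a
# real tab everywhere:
printf 'a\tb\n'
```

Rule of thumb from the POSIX spec itself: new applications should use `printf` rather than `echo` whenever the output contains anything beyond plain text.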
Knowing stuff is what we're paid for, otherwise we'd all be making minimum wage. Besides, there are plenty of non-bash shell scripting resources out there, which is good to learn if you want to work with the BSDs or proprietary UNIX systems.
Powershell on UNIX is like Perl on Windows. It works, but it's weird and alien. But the same can be said for .NET, really.
I wonder where the divide of "shebang" vs "hashbang" lands geographically and chronologically. During college, and for many years in the early 90s and 2000s in the south, it was commonly called hashbang; I didn't hear shebang until C# became a thing. I know it predates that, I'd just never heard it before then.
I believe the dividing moment came with Ricky Martin circa 2000.
Lame joke aside, I only heard "shebang" prior to around that time, then "hashbang" and now I get a mix of it. Google trends indicates "shebang" always dominated.
It's kind of the same with the # symbol. I call it the pound sign but some people call it hash.
In my head I translate an old Swedish term: "timber yard" (from Brädgård). Everything to make interacting with other hard I guess. As a kid we also called it staket, which translates to "fence".
I can see the fence and timber yard imagery.
Interesting, I’ve never even heard it called “hashbang” until you just did.
California, 40yo fwiw
I was a Unix sysadmin back in the late 90’s in east coast US and we called it a shebang when writing shell or perl scripts
I always found the sharp naming interesting, because C# doesn't use the sharp symbol, which is ♯. All of them use the # hash character, so calling it sharp always seemed odd to me, though "C-hash" admittedly doesn't roll off the tongue either. It's also interesting how hash is correctly used in some places ("hashtag") but not others.
It's supposed to be the sharp symbol; it's just that it was a hassle for them to use it consistently in paths etc, so they defaulted to # as a stand-in.
It's "sharp" (i.e. higher tone) because it's a higher-level language compared to C and C++.
In 2010 I met a person from India who pronounced it "C pound", and they were as confused by my reaction as I was by their pronunciation. I guess somehow that pronunciation became popular enough that it acquired momentum, and everyone in their circle (or maybe all of India?) assumed it was correct. The # key on the phone is called the "pound key" in India, which is where it would have started, and I guess they never heard any foreigner, YouTube video, etc. pronouncing it.
I don't know if they still pronounce it that way or not.
I started using C# towards the end of the 1.0 beta or maybe just after RTM...I embarrassingly called it "C pound" for quite a while. Because, even as someone born and raised in the US, pretty much my only exposure to the symbol was in the context of phones. "Call me at blah, pound one-two-three" as in the extension is "#123".
Remember, it was originally released 20+ years ago (goddamn, I feel old now); recorded video or even audio over the internet was much, much, MUCH rarer then, when "high-speed" internet for a lot of people meant a 56K modem.
Back then, most developers' first exposure to C# was likely in print form (books or maybe MSDN Magazine).
Not alone; also got those CDs, for MSFT partners only, with the draft documentation written in red, before it became known to the outside world?
Also got a few of those magazine CDs, in some box.
I was always partial to "C octothorpe".
https://en.m.wiktionary.org/wiki/octothorpe
The "#" key on the phone is called the "pound key" in the US and Canada.
https://en.wikipedia.org/wiki/Number_sign#Names
It still is, to this day: if you call an automated system such as voicemail, you may be prompted to "press 'pound'". This is really standardized, AFAICT, and no telephone system has told me to "press hash" or "press the 'number' key" [because that's ambiguous]
cks has a history of #! but not an etymology: https://utcc.utoronto.ca/~cks/space/blog/unix/ExecAndShebang...
He links to Wikipedia which documents a good history: https://en.wikipedia.org/wiki/Shebang_(Unix)#History
I was trying to recall what we called it. I used SVR3, so I would've been using "#!/bin/sh" as early as 1990, and even more on SunOS 4 and other Unix servers.
I can't recall having a name for it until "hash-bang" gained currency later. We knew it was activated by the magic(5) kernel interpretations. I often called "#" as "pound" from the telephone usage, and I recall being tempted to verbalize it in C64 BASIC programming, or shell comment characters, but knowing it was not the same.
"The whole shebang" is a Civil-War-era American idiom that survived with my grandparents, so I was familiar with that meaning. And not really paying attention to Ricky Martin's discography.
Wikipedia says that Larry Wall used it in 1989. I was a fervent follower of Larry Wall in the mid-90s and Perl was my #1 scripting language. If anyone would coin and/or popularize a term like that, it's Just Another Perl Hacker.
Likewise, "bang" came from the "bang path" of UUCP email addresses, or it stood for "not" in C programming, and so "#!/bin/sh" was ambiguously nameless for me, perhaps for a decade.
Come to think of it, vi and vim have a command "!" where you can filter your text through a shell command, or "shell out" from other programs. This is the semantic that makes sense for hash-bangs, but which came first?
> "bang" came from the "bang path" of UUCP email addresses
"Bang" was in common use by computer users around 1970 when I was working at Tymshare. On the SDS/XDS Sigma 7, there was a command you could use from a Teletype to send a message to the system operator on their Teletype in the computer room. I may have this detail wrong, but I seem to recall that it included your username as a prefix, maybe like this:
What I do remember clearly is that there were also messages originated by the OS itself, and those began with "!!", which we pronounced "bang bang". Because who would ever want to say "exclamation point exclamation point"?
The reason this is vivid in my mind is that I eventually found the low-level system call to let me send "system" messages myself. So I used it to prank the operator once in a while with this message:
I was proud of calling it an "undetectable" error. If it was undetectable, how did the OS detect it?
Indeed. I joked on Slashdot that C# was pronounced "cash" back when it was first announced, which seemed appropriate given the company.
The joke didn’t land, sadly.
It lands for me, today :)
This can also be done with Rusts cargo. Though it's not yet stabilized: https://rust-lang.github.io/rfcs/3424-cargo-script.html
The builtin subcommand is unstable, but a long time ago a dev I knew at the time maintained an external subcommand for it. https://crates.io/crates/cargo-script
Although it's old and its dependencies are old, it doesn't depend on cargo as a library (it spawns cargo as a subprocess instead of using the cargo API directly), so it might still work fine with the latest toolchain. I haven't tried myself.
I find the lack of acknowledgement of the CSX/VBX effort a pity.
https://ttu.github.io/dotnet-script/
Or that, on the wisdom of the C# Language Runtime, they decided on an incompatible approach to how F# references dependencies in its scripts.
https://learn.microsoft.com/en-us/dotnet/fsharp/tools/fsharp...
> I find it a pity the lack of acknowledgement of the CSX/VBX effort.
They do acknowledge it and other efforts:
https://devblogs.microsoft.com/dotnet/announcing-dotnet-run-...
On the BUILD talk I certainly don't remember that, and I am wondering if that was part of the original blog post, or a reaction to several people complaining since then.
They added it after the reactions I believe. Source: https://bsky.app/profile/davidfowl.com/post/3lq3zfjo6gk2d
Thanks for sharing. Most likely, then, given that some of the people commenting are well known in the community.
What do you mean by incompatible? If you mean syntax, that's different intentionally for clarity and also we don't want to create a new C# scripting dialect hence it isn't possible to "import" other files (that's not how C# works).
You were already using F#'s syntax for C# NuGet references in .NET Interactive. Even NuGet itself labels this syntax "Script & Interactive" in its picker.
https://github.com/dotnet/interactive/blob/main/docs/nuget-o...
Now you've created a new dialect.
.NET interactive uses a dialect of C# - the script dialect. With file based apps, we strive to use the standard C#, so you can convert file based apps to full projects at any time.
That doesn't explain why standard C# had to deviate further from the script dialect just for import directives. These directives don't interfere with the rest of the grammar.
Is #:package <pkg> really so much nicer than #r "nuget: <pkg>" as to deserve deviating from .NET Interactive, F# and the existing NuGet picker? (It could be, if properly argued!) Has there been any effort to have the rest of .NET adopt this new syntax or share concerns?
On that note, is any language other than C# supported for `dotnet run`? Is the machinery at least documented for other MSBuild SDKs to use? Considering how dotnet watch keeps breaking for F#, I suspect not.
The # declarations syntax for Assembly references.
Assembly references are not really a mainline scenario, hence we support nuget package references only for now. And `#r` is just too opaque (it comes from old times when assembly references were the norm).
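For readers following along, the two directive styles under discussion side by side. The package name and version are illustrative, and actually running the file needs the .NET 10 preview SDK:

```shell
# New file-based-app directive, consumed by `dotnet run app.cs`:
cat > app.cs <<'EOF'
#:package Humanizer@2.14.1
using Humanizer;
Console.WriteLine("PackageReference".Humanize());
EOF

# The C# scripting / .NET Interactive / F# form is instead:
#   #r "nuget: Humanizer, 2.14.1"

grep '^#:' app.cs
# dotnet run app.cs   # requires the .NET 10 preview SDK
```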
I have to agree with others that it's strange this has not been in line with .net interactive and the usage of "#r" there. I think "#r" should at least be an option.
You say that importing other files is not how C# works, but I think that's not entirely true. If you now treat a file as a project, then an import is pretty much a project reference, just to a .cs file instead of a .csproj file.
So I'd love to see something like that working: #:reference ./Tools.cs
Which should be the same as a ProjectReference in csproj or for a "real" project something like this: #:reference ./Tools/Tools.csproj
That would also enable things like this in a csproj file: <ProjectReference Include="..\Tools\Helpers.cs" />
Referencing other projects is currently out of scope (and might always be, file based programs are intended to be simple), you can convert file based programs to full projects though to regain full functionality.
I disagree, this is the kind of decisions that make CLR the C# Language Runtime, instead of its original purpose, where there was an ecosystem of common approaches.
Unless you plan to also support the same approach on VB and F# scripts.
F# and C# are intentionally different languages, I don't think it makes sense to have 100% equivalent functionality in both. AFAIK, F# already has scripting functionality built in. VB support could be added, but that seems unlikely given https://learn.microsoft.com/en-us/dotnet/fundamentals/langua...
Thing is, for quick and dirty throwaway scripts, an assembly reference is sometimes exactly what one wants.
E.g. let's say I have a compiled app and I just want to reach inside of it and invoke some method.
janjones, can you please advocate inside your team for taking F# seriously? It should have been a first class language inside .net.
I dread the moment C# introduces sum types (long overdue) and then have them incompatible with F#. On a general note, the total disregard towards F# is very much undeserved. It is far superior to the likes of Python, and would easily be the best in the areas of machine learning and scripting, but apparently nobody inside MS is that visionary.
> It is far superior to the likes of Python, and would easily be the best in the areas of machine learning...
The fact that Python is the language of choice for ML makes it clear that technical superiority was not necessary or relevant. Python is the worst language around. The lack of better languages was never the problem. Making F# technically better wasn't going to make it the choice for ML.
Agreed. But MS has certainly some weight to throw around in their own ecosystem. They have their own offerings for data processing and ambitions for AI. If they want to integrate with .net, they are already set. But that requires vision.
Note that until the layoffs of the week predating BUILD, Microsoft was exactly one of the companies pushing for improving CPython performance.
This is why I always mention that when the .NET team complains about adoption among new generations not going as desired, they should look at their own employer first.
Why people still beat the dead horse of Python (performance) is beyond me.
This feature is probably a big thing for .NET developer productivity. It's quite a shame that it only came now.
There is another thing I'm really missing for .NET projects: the possibility to easily define project-specific commands, something like "npm run <command>".
> This feature is probably a big thing for .NET developer productivity. It's quite a shame, that it only came now.
I am using https://github.com/dotnet-script/dotnet-script without any issues. Skipping an extra step would be cool though.
Is it possible to add those scripts project-scoped like with npm run? They should work for everyone who checks out the repo.
I think everyone needs to do `dotnet tool install -g dotnet-script` before running them. This is the most annoying part, and where the .NET 10 feature is really appreciated.
But then each script has an individual list of dependencies so there should be no need for further scoping like in npm (as in, the compilation of the script is always scoped behind the scenes). In this regard, both should be similar to https://docs.astral.sh/uv/guides/scripts/#declaring-script-d... which I absolutely love.
You can use project-scoped tool manifests. Then you can call dotnet tool restore to load all tools specified in the manifest.
https://learn.microsoft.com/en-us/dotnet/core/tools/global-t...
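The manifest flow from those docs looks roughly like this (commands per the `dotnet tool` documentation; shown commented out since they need an SDK and a repo to act on):

```shell
# One-time setup in the repo root:
#   dotnet new tool-manifest            # creates .config/dotnet-tools.json
#   dotnet tool install dotnet-script   # recorded in the manifest, not -g
#
# Everyone else, after checkout:
#   dotnet tool restore                 # installs everything in the manifest
```

Since the manifest is checked in, tool versions are pinned per project rather than floating globally.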
TIL, thank you!
Are you talking about this?: https://learn.microsoft.com/en-us/dotnet/core/tools/dotnet-r...
This would be great! I'm currently using make files to accomplish this: https://www.gnu.org/software/make/
I remember using LINQPad when I needed this functionality filled.
you’re better off using a separate task runner like make or task.
I don't think make and task are suited very well for running scripts that are not part of the build process, something like scaffolding or starting a dev dependency. Also, it requires developers to install additional tools to get started with the project.
Nah, we've been doing this for years. At least ten years ago I built myself a little console based on Roslyn that would evaluate any C# snippets I gave it. C# scripts were pretty well supported, and it was not hard to throw one together. The engine of my console was a couple dozen lines mostly just processing the string.
I'm sure the tooling is better now, though. I seem to recall visual studio and/or Rider also supporting something like this natively at the time.
This also exists for Kotlin: https://github.com/Kotlin/kotlin-script-examples/blob/master... (note that the file extension is load bearing in this case, if you don't name it "*.main.kts", it will not work).
It's excellent for writing small scripts/prototyping where you need access to some Kotlin/JVM feature.
Ruby is still my preferred language for small scripts though - the backticks for running external programs is remarkably ergonomic
It's not going away entirely, but its future is also not so shiny:
https://blog.jetbrains.com/kotlin/2024/11/state-of-kotlin-sc...
I think it also exists for Java itself now.
Yes. You can just run java myfile.java
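Right, single-file source launch has been in Java since JDK 11 (JEP 330). A quick sketch; the run itself is guarded since it needs a JDK on PATH:

```shell
# Write a single-file Java program and run the source directly; the compiler
# is invoked in memory and no .class file is left on disk.
cat > Hello.java <<'EOF'
public class Hello {
    public static void main(String[] args) {
        System.out.println("hello from single-file java");
    }
}
EOF
command -v java >/dev/null && java Hello.java || true
```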
Seems like a great PowerShell replacement.
PowerShell is the ultimate Chatgpt language. For better or worse. Usually worse as most shops end up with "write only" PowerShell scripts running all the glue/infrastructure stuff.
> Seems like a great PowerShell replacement.
It sounds like it can potentially replace way more than PowerShell. I mean, why would a .NET shop even bother with Python or any form of shell scripts if they can attach a shebang on top of an ad-hoc snippet? And does anyone need to crank out express.js for a test service if they can simply put together a ASP.NET minimal API in a script?
I've been writing shell scripts in PHP for more than 20 years for this reason. Don't work on a lot of PHP sites any more but I still do most of my shell scripting in it. I think this'll be a big win for C# users once they get used to the paradigm shift.
I notice another poster said it's a bit slow but for many common use cases even half a second startup time is probably a small price to pay to be able to write in the language you're familiar with, use your standard libraries, etc.
Python has a lot of libraries (AI, machine learning, data stuff, whatever) that no one has bothered porting to .NET (or other platforms).
.NET is usually a second-tier target, but Python ALWAYS has first-tier support (along with Java and usually Go).
Ecosystem, people keep forgetting syntax, grammar and standard library isn't everything.
That is why even the languages I dislike and am not a big fan of, have a little place on my toolbox.
> Ecosystem, people keep forgetting syntax, grammar and standard library isn't everything.
Ecosystem means nothing if you have comparable or even better alternatives in a framework of choice.
Also, it's not like the likes of Python don't have their warts. Anyone with a cursory experience with Python is aware of all the multiplatform gotchas it has with basic things like file handing.
If you work alone, maybe.
Some of us have to take other devs into consideration.
Human collaboration is also part of the ecosystem.
For me, every time I have to use python it's the package handling that leaves my head spinning. It still feels like the bad old days of npm.
I think it's a popular language with scientists despite that, because they don't have to care about portability, reproducibility, or your replacement needing to be able to run it without ever speaking to you.
I use python infrequently enough that every time it's a pain point.
Not sure I follow. I wish python was as good as npm/node_modules. And how is nuget better than npm? Is it just package quality or something else? I rarely use npm and I'm not a webdev but whenever I use it I think it's pretty great.
When I said "bad old days" I mean a previous iteration, not the state today. I'm talking about the early days of npm.
In the early days of npm a lot of install examples would do global installs, you'd often end up with a confusing mess in npm.
Nowadays people are much better at only doing project level installs and even correctly telling you whether to have it as a dev dependency or not.
Ah sorry, that makes sense!! Yeah, that's exactly how I feel too. It's sad that npm has improved so much in the same time frame while Python's packaging hasn't (not by default at least, whereas npm is basically a default in JS projects by now).
I don't think ecosystem or people forgetting syntax would be an issue in a .NET shop.
It certainly is, unless the folks at the .NET shop get to be the ones writing the missing libraries, related tools, books, tutorials, conference talks,....
You can invoke C# from PS if you want:
https://learn.microsoft.com/en-us/powershell/module/microsof...
I love PowerShell. It’s amazing the things I’ve been able to accomplish with it. Hands down my favorite language.
I think you're the only one.
On polyglot, OS agnostic agencies, I bet there are plenty of Powershell folks.
BS. There are dozens of us ;)
Oh God. I hadn't considered that Windows sysadmins are likely the most prolific ChatGPT scripters. If I were still one, given the state of the MS docs, I would be guilty for sure.
I think most people would be shocked to know how much glue code is in powershell or bash/perl that kind of keeps everything running.
I remember looking on in horror as a QA person who was responsible for the install/deploy of some banking software scrolled through the bash/perl script that installed this thing. I think it had to be 20k+ lines of the gnarliest code I've ever seen. I was the java/.net integration guy who worked with larger customers to integrate systems.
My group insisted we should do single sign-on in a Perl script, but I couldn't get the CPAN package working. I had a prototype done in Java in an afternoon. I never understood why people loved Perl so much. Same with PowerShell. Shell scripters are a different breed.
On the other hand, I was once tasked with rewriting a couple of Korn shell scripts, initially written for AIX and ported to Red Hat, into Java.
The reason being the lack of UNIX skills on the new team, and apparently it was easier to pay for the development effort.
Afterwards there were some questions about the performance decrease.
I had to patiently explain the difference between having a set of scripts orchestrating a workflow of native code applications written in C, and having the JVM do the same work, never long enough for the JIT C2 to kick in.
> PowerShell scripts running all the glue/infrastructure stuff.
I'm pleased to report it is usually not possible to do that. It would only create a huge mess. C# is more conducive for anything more than a few methods. And there is almost no barrier. PS is great for smaller ad-hoc stuff, and it is the "script that is on every Windows platform" component similar to what VBScript was a few years ago.
Seeing as Powershell can run .net code I wonder if this actually augments Powershell.
Only if you enjoy coding the low level stuff, instead of the higher level cmdlets.
You can also use shebang to run C# scripts like bash scripts https://devblogs.microsoft.com/dotnet/announcing-dotnet-run-...
I couldn't get the shebang to work by running the file directly with the .NET 10 preview 4 SDK image.
It works if I run `dotnet run <file>`.
Update: It is working now, the file was using CRLF instead of LF.
That's great; now I can finally have scripts with type-safety. Note that on macOS the shebang either reads `#!/usr/local/share/dotnet/dotnet run` or `#!/usr/bin/env -S dotnet run`.
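A minimal sketch of such a script, assuming the .NET 10 preview SDK is on PATH and, per the sibling comment, the file uses LF line endings:

```csharp
#!/usr/bin/env -S dotnet run
// hello.cs — make executable with: chmod +x hello.cs && ./hello.cs
Console.WriteLine($"Hello from C# at {DateTime.Now:T}");
```

The `env -S` form splits "dotnet run" into two arguments, which works around the one-argument shebang limitation on Linux.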
Kind of obsoletes NetPad, and as soon as there's debugging, LINQPad can be put out to pasture. LINQPad was instrumental for me many years ago, and I appreciate that, but that stone-age text editor is not usable for actually writing/editing code in this decade.
Linqpad is super cool in certain .NET ways, but oh man, the text editor component is the worst one I interact with on a regular basis. I wish that part were replaced by Neovim or Monaco or, basically, anything. The snappiness, the table visualizations and so forth - very nice.
It’s also been unable to keep up with notebook tech for many potential use cases. I guess it’s a one man show and it shows.
Still, massive hat tip - I use Linqpad every day because it’s super useful for playing with your SQL data.
I dunno about obsoleting LINQPad yet. Half the power on LINQPad is its UI; I'd like to see how comparable VSCode / VS are with dotnet run vs LINQPad.
IE: LINQPad has a great way to visualize results. If dotnet run only outputs text, or otherwise requires a lot of plugins to visualize an object graph, there will still be quite a niche for LINQPad.
In contrast, if all you're using LINQPad for is to double-check syntax, then dotnet run might be a better option. (Sometimes if I'm "in the zone" and unsure about syntax that I use infrequently, I'll write a test LINQPad script.)
My main usecase for linqpad is database interactive stuff or exploratory code with .dump().
I see this as more of a complement to that. However I have worked at places that HATED powershell and we used linqpad for almost all scripting. It worked ok.
Only if they do all the GUI features and extension points, which I very much doubt.
Looks like they will be adding support for VS Code, including for debugging.
"In upcoming .NET 10 previews we’re aiming to improve the experience of working with file-based apps in VS Code, with enhnanced IntelliSense for the new file-based directives, improved performance, and support for debugging."
Still, it can't graphically dump a data structure the way LINQPad does.
There's very useful stuff there for sure, just gated behind an unfathomably deficient code editor. IMO Albahari should just embed Monaco in it.
I'm excited for this one. I can see it replacing some of the powershell scripts I have in CI/CD pipelines.
As much as I like PowerShell and Bash, there are some tasks that my brain is wired to solve more efficiently with a C-like syntax language, and this fills that gap for me.
There is even more info in the proposal itself https://github.com/dotnet/sdk/blob/main/documentation/genera... specifically things about multiple files and more specifics of the implementation and capabilities (implicit project file etc)
>> Install Visual Studio Code (recommended) If you’re using Visual Studio Code, install the C# Dev Kit
I wish Microsoft would just provide a LSP server for C#. Not just a half proprietary extension for VS Code.
would they still be able to charge for it if it was an lsp?
Finally! Perfect for making a quick utility to use in a script. There have been third party solutions but having to install something to do this was always another obstacle / headache
I feel like top level statements were created specifically to open the door to C# scripts. The syntactic sugar of top level statements doesn't make sense in regular .NET production apps that you need to build and run, but for quick scripts they do represent important improvements in DX. Well done.
Their argument for both is accessibility in learning environments. Python and Node were eating their lunch there with ease of first use (python file.py). And what academia uses today is next year's company language.
As someone mostly focused on JVM, .NET and nodejs ecosystems, this won't cut it.
The problem with UNIX culture shops not picking up .NET has everything to do with the Microsoft stigma, and everything the management keeps doing against .NET team efforts, like VSCode vs VS tooling features, C# DevKit license, what frameworks get to be on GNU/Linux, and the current ongoing issues with FOSS on .NET and the role of .NET Foundation.
Minimal APIs and now scripting, which already existed as third party solutions (search for csx), won't sort out those issues.
They could even start by going into Azure and checking why so many projects are now choosing other languages instead of .NET when working in the open.
This would already be the first place to promote .NET adoption.
>UNIX culture shops
Jesus christ, it sounds like a religion
Welcome to technology flamewars.
The only change through the times is where they take place and how hard, or low, they might get.
I mean I don't want the viability of my business to be dependent on the whims of Microsoft either. Unfortunately your business is going to have counterparty risk no matter what your tech stack is but Microsoft's record is pretty mixed.
AFAIK, you could always dotnet run app/app.csproj since 2016. Obviously you needed dotnet sdk installed.
Installing the full dotnet sdk was very high friction, in addition to the full csproj scaffolding. Running a single cs file is a HUGE ux improvement.
If python hadn't (nearly) caught up to c# in typing support, I'd seriously consider moving or at least running it...but as it stands, python has established itself too well for me.
> Installing the full dotnet sdk was very high friction, in addition to the full csproj scaffolding.
What's the difference between installing .NET or, say, Python or Node?
Python typing hasn’t nearly caught up with C#. I regularly use both and c#’s type system is pragmatic and very helpful in avoiding bugs, whereas Python has no types by default and doesn’t check them by default when it does have them. It’s worse than typescript because at least typescript fails to compile if you have a type error.
You still need full dotnet sdk, the tool merely automates full csproj scaffolding, which is then compiled with full dotnet sdk, roslyn, nuget, compiler server and everything, and allows to customize msbuild project properties with pragmas in code. The only difference is that before you had sdk+folder+my.cs+my.csproj, now you have sdk+folder+my.cs
Yes, but this change now also allows you to do the same with, say, script.cs file.
Very cool. Will probably still stick with LINQPad for my every day experimentation needs but if i decide to write actual scripts for ~production use, this would be a better alternative to Powershell in most cases I think.
And then you move to Linux. Not kidding when I say that the only reason why I have a W11 VM is to run LinqPad, at least until November.
Well, my organization still isn't mature/large enough to administer linux machines for employee use and I prefer Win+WSL to Mac, so.
It's good to finally see this being 'a thing' in C#. In my opinion, it is 10 years overdue!
This feature is likely added to compete with Python, Ruby, etc. The fact you just create a file, write some code and run it.
However, I don't see C# being a competitor of said languages even for simple command lines. If anything, it could be a viable replacement to Powershell or maybe F# especially if you need to link it to other .NET DLLs and 'do things'
I am also interested in the performance difference compared to other languages. I mean, even Dlang has had a script-like feature for some time now: rdmd. Not sure of its current status, but it still compiles and runs the program. Just seems overkill for something rather simple.
This is about education and mindset. Not to enter the space. There might be some people working with it, if they are more relaxed with C# than bash.
Performance right now is horrible (500 ms). They promised to improve it, but let's be honest: they are scaffolding a project in memory, running msbuild restore (reassessing/reusing dependencies), then compiling a DLL and loading it into the JIT. That is so many more steps than an interpreter does.
> They are in-memory scaffolding a project...
Absolutely...100% !!
The oldest one: https://github.com/oleg-shilo/cs-script
Used it 10+ years in production, but usually no one I talk to in .net world has ever heard about it.
I used this MANY years ago for a reporting system for a French supermarket chain. Their developers were really pleased with it!
I heard about alintex script http://web.archive.org/web/20051104042052/http://www.alintex...
I couldn't help noticing that none of the autocomplete suggestions in his terminal or VS Code were relevant or correct. It looked really annoying. Is that the usual quality Copilot delivers?
It's slightly better for established code but it's like all of them. If it's well-known problem, it's decent. If it's esoteric or unique to your business, it's confused.
Since C# single file is new, there is not a ton of code for Copilot to reference so it's probably confused.
I guess this continues the trend of c# eventually getting all the features from f#
And the trend of doing so in an incompatible way.
.NET Interactive had already added a directive for C# NuGet references, compatible with F#'s, and NuGet's picker had already labeled that syntax "Script & Interactive". But no, they had to invent a new directive.
It seems Microsoft is trying to make C# more flexible and attractive to Data Scientists and beginners. They should have done this years ago.
I've been developing with .NET since version 1.1. I feel like it's always been pretty easy to use for beginners. You install Visual Studio, create a new project and BOOM you've got a program that builds and runs.
Having used Turbo Pascal and Turbo C prior to Visual Basic and Visual C, Microsoft's "Visual" IDEs and other Windows based IDEs of the 90's were a step up in ease of use even if they did require more files to build a project.
They indeed started years ago. For some time it's been possible to run a project with a Program.cs file that doesn't contain a static class. There's also the so-called ASP.NET minimal API. And .NET scripting got official support a few releases ago.
Seems like a great Python replacement.
And F# which is more pythonlike already supports this via dotnet fsi - see https://learn.microsoft.com/en-us/dotnet/fsharp/tools/fsharp...
Any number of these small papercut-like things is what has made a tool like Python ubiquitous.
Absolutely. I guess this is mostly relevant for those who work on larger dotnet/mssql/ms-tech projects where it fits into the workflows in general.
With `uv` you can even put a magic comment atop the script saying like, "Use this version of Python, use these dependencies" and uv will fault the runtime and deps into a cache folder before launching the script. 2025 might be the year of Python on the desktop
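That magic comment is PEP 723 inline script metadata. A minimal sketch — the dependency list is left empty here so the file also runs under plain `python`; in practice you'd list real packages:

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
# uv reads the TOML block above, provisions a matching interpreter and
# any listed dependencies into its cache, then runs the script.
import platform

print(f"Hello from Python {platform.python_version()}")
```

With `chmod +x` and uv installed, the script becomes directly executable with its environment resolved on first run.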
When Astral ships the ability to package your Python script + its dependencies in a single, statically linked executable, it's game over for everyone else.
But don't tell the people they could have that right now, but performant and type safe.
Having the thing is better than not having the thing, even if it's not the best version of the thing.
Yes, but it is not without cost. The time poured on all the failed projects to make Python into what it fundamentally isn't is quite wasteful in my opinion. But people are free to do whatever they want with their own time.
[dead]
Some 20 years ago I used "scratchpad" or whatever it was called to write short "bash" scripts and run them directly from that "notepad"... and it was a blast.
I never understood why it has to be so hard on Windows to enable users to do just a little bit of scripting like it's not 80's anymore.
LINQPad [1] is a terrific utility for this workflow with dotnet. I was part of a team where we built all kinds of runbooks, support utilities, and debugging tools with LINQPad using a common set of utility functions to dump (pretty-print) system-specific entities and even charts.
[1]: https://www.linqpad.net/
I thought it was Windows only, but looks like there's a macOS version now!
I'm building a dotnet job orchestrator called Didact (https://www.didact.dev), and this is the sort of thing I was looking for years ago when I was first dreaming it up. Class libraries is the approach I am taking now, but this is still extremely interesting. Could see some interesting use cases for this...
Java SE 11 (Sep 2018) introduced the ability to run source files directly. So instead of the old `javac Main.java && java Main`, you can do `java Main.java`. https://openjdk.org/projects/jdk/11/ , https://openjdk.org/jeps/330
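Concretely (filename assumed; any JDK 11+ works):

```java
// Main.java — run directly with `java Main.java`; no javac step needed (JEP 330)
public class Main {
    public static void main(String[] args) {
        System.out.println("Hello from single-file Java");
    }
}
```

The launcher compiles the source in memory and executes it, so there is no .class file left behind.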
Furthermore, there's ongoing work ([0] and related JEPs) to allow running single Java files without an enclosing class and the static main method:
This is currently (as of JDK 24) a preview feature, so such a file needs to be run with `--enable-preview` [0]. With the latest EA release (build 24) of JDK 25 [1], you no longer need `--enable-preview`. You can also simplify the println.
JEP 445 has been followed by JEP 512, which contains minor improvements and finalizes the feature [2]. If you want to use 3rd-party libraries, I highly recommend trying JBang [3].
[0] https://openjdk.org/jeps/445
[1] https://jdk.java.net/25/release-notes
[2] https://openjdk.org/jeps/512
[3] https://www.jbang.dev
As nobody mentioned before, Java has had JBang for a while now: https://www.jbang.dev/
As someone just getting into C# from PowerShell, this is huge.
C# is my second-favorite language. I made a shebang to do the same for rust (my number one) many years ago with zero (external) dependencies: https://neosmart.net/blog/self-compiling-rust-code/
Seems a little complicated when you can do:
`&&` in a shebang isn't portable; having more than one argument in a shebang isn't portable; a subshell in a shebang isn't portable; a semicolon in a shebang isn't portable; the basename usage isn't portable; not littering the current working directory is important (it might not even be read-write, or you might not have permission to write to it); your version won't work with symlinks or with the script added to $PATH; caching the compiled output is a nice speedup and reduces startup costs; and correct (and optimal) cache invalidation is one of the known tricky problems in CS.
Shell scripts are ugly :)
(If you didn't mean to use them in a shebang but rather as the script body, that's fine but that wouldn't be possible without the polyglot syntax and #allow abuse I posted.)
Thanks, I looked it up and it's indeed quite ugly, and my shebang isn't portable. A little saddened that even 'env -S' didn't make it to POSIX.
No worries and believe me, I feel your pain (I moonlight as a (fish-)shell dev).
I would have liked to use `exec -a` to set the executable name (as mentioned in the comments on the article) but alas, that too is not portable.
With Java allowing void main() {} like C,
and now shebang C# scripts,
is everything converging into one meta language?
Give me some sane subset of TypeScript with native support and I'll use it everywhere. But C# is far from as convenient as TypeScript so far. Just a few more years of convenience language features and APIs added, and I think it'll be set.
Interpreted general purpose languages have been fairly similar for a while. I worked with C# for a decade, Go replaced it for us (for human reasons not technical ones) but these days everything is basically Python and then some C/Zig for parts that require efficiency.
Every subset of Lisp is growing into a full Lisp :)
Didn't Mono implement something like this ages ago? And I mean ages ago, before C# even had standardized code outside "static void Main" in the language spec, IIRC they had an interpreter that probably used Reflection.Emit or something and executed the output and you could #! it.
I've been coding in C# for more than a decade and a half now, and I see myself using this rather than creating an entire project structure to test some new approach to something.
How is it going with native AOT? I'd prefer:
dotnetc app.cs
./app
>Because of the implicit project file, other files in the target directory or its subdirectories are included in the compilation.
Errm, so how is this different from a folder with a project file?
> Errm, so how is this different from a folder with a project file?
It requires neither a folder nor a project file. Just pass the source file as the argument and you have everything up and running.
If you don't put it in a folder, then the entire folder tree is compiled into your script.
That doesn't seem true based on the introduction. Where are you seeing that info?
At the end, the post links to the spec.
>By default, file-based apps use the Microsoft.NET.Sdk SDK.
This default behavior is provided by the SDK to keep project files small.
Probably, you can have multiple scripts in one folder. Eg script1.cs, script2.cs...
That's it, and they are all compiled into one executable as implicit items, along with everything in subfolders.
this will make scripting with C# so much easier
Would be cute to alias dnx to get ‘dnx app.cs’
Announcing proudly that botnet can now compile without a csproj file seems a bit over excited.
It is basically:

    gcc test.c -o test.exe && test.exe

and

    #!/compile-and-run gcc test.c -o test.exe && test.exe
On the contrary, if C++ had something like csproj and its own package manager by *default*, it would be five times less painful to use.
You know that 99% of uses are not botnets.
Sounds like argument for not sharpening knives because criminals can stab people.
I think it's reasonable to assume the poster meant "dotnet" and that you're responding to a typo
This isn't a new feature, and was announced for... I think 8.
Why is it being re-announced now?
Maybe you're confusing it with top-level statements? This new feature doesn't even require a csproj.
No, I do absolutely remember them integrating dotnet-script-like functionality into the runtime, and I remember experimenting with it.
Was it only in a beta version and removed to re-appear now? It used the Roslyn compiler to compile on demand and then execute it, and used pragma directives that were compatible with dotnet-script's. I cannot remember what the shebang was set to, but it wasn't the same one as dotnet-script.
On the topic of scripts, it would be nice if they had a more seamless way to invoke processes. The whole Process.Start() way can be convoluted when I just want something simple.
Either way, this is a dream come true. I always hated PowerShell and its strange fixation with OOP. Modern C# seems to acknowledge that sometimes all we want are functions, scripts, LINQ, and records.
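For comparison, here is roughly what "run a command and capture its output" takes today — a minimal sketch with error handling omitted, and `git` as an assumed example command:

```csharp
using System.Diagnostics;

// Run an external command and capture its stdout.
var psi = new ProcessStartInfo
{
    FileName = "git",
    Arguments = "rev-parse --short HEAD",
    RedirectStandardOutput = true,
    UseShellExecute = false,
};

using var proc = Process.Start(psi)!;
string stdout = proc.StandardOutput.ReadToEnd();
proc.WaitForExit();

Console.WriteLine($"exit={proc.ExitCode}, out={stdout.Trim()}");
```

Each capture/redirect needs its own flag, which is exactly the ceremony a shell gives you for free with backticks or `$(...)`.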
Very cool, but one thing comes to mind: the new .VBS!
There's a rule that if something is easy to invent, a bunch of people will whip up their own versions independently, sometimes even within the same organisation.
So far there's...
C# scripting (CSI.exe): https://learn.microsoft.com/en-us/archive/msdn-magazine/2016...
PowerShell Add-Type that allows one-file inline C# scripting: https://learn.microsoft.com/en-us/powershell/module/microsof...
The Roslyn .CSX script files (RCSI.exe): https://devblogs.microsoft.com/visualstudio/introducing-the-...
.NET Interactive: https://github.com/dotnet/interactive
... and now this.
.NET Interactive is a kernel-based approach for notebooks. CSI is more like a traditional REPL (extracted from a VS feature). This here is more like traditional scripting: coding without a project file. There is no attempt at interactivity here.
What would be really useful is to include one of these with Windows, so that it simply works and can be used for scripting without needing to write a batch file to bootstrap a PowerShell script to bootstrap the .NET runtime to bootstrap the C# scripting environment.
This would basically be VbScript 2.0, and MS killed VbScript because it had become a massive security hole.
> There's a rule that if something is easy to invent, (...)
You seem to be confused. There is nothing being invented in here. What they are announcing is basically an update to the dotnet command line app to support building and running plain vanilla C# programs.
Maybe this iteration it won't be a bolt-on, who knows. Because of the origin of C# (Microsoft's Java replacement) it's all still very MSBuild/IDE-magic-ish instead of being its own thing where you can decide your own dependency resolution, your own compiler and your own linker.
It's similar to cmd.exe and conhost etc. It's all tied to decades old legacy baselines that Microsoft just won't or can't let go of.
Reading all the above, this is nothing like that.
It's just standard C#, no new dialect; it works like a proper 'shell' script and it's not a REPL. What am I missing?
Indeed; what I find somewhat dishonest is the lack of acknowledgement of previous efforts.
The whole presentation at BUILD was done as if it were a great idea they only just thought of.
.csproj and Visual Studio are the main reason I avoid C#. The whole setup feels dumb for no reason.
What's wrong with .csproj files?
I used to think there's nothing wrong with them either. For simple setups it is very painless. However, I recently stumbled into a use case where setting up the project file is a huge hassle and very error prone as well. Mainly I wanted another project to be referenced only at the "analyzer" step, and this appeared to work just fine at first, but after the 3rd/4th compilation it failed for some reason. Digging into this opened a huge can of worms of partially/falsely documented features for something I sincerely believed should be easy to achieve. Now the only thing I can do is copy the project into the analyzer, or let the build fail with the dependency disabled just so that the next build works again.
There are also some issues regarding restoring other projects, but that doesn't appear to be the fault of .csproj files.
P.S.: Having a project be referenced (i.e.: marker attributes) at both the analyzer step _and_ later in the final artifact is something I never got working reliably so far. From what I've read a nuget project could make this easier, but I don't want to do that and would expect there to be a way without using the package-management system.
> I used to think there's nothing wrong with them either.
From your complaint, it doesn't seem you're pointing out anything wrong with .csproj files. You struggled with a use case that's far from normal and might not even be right for you. Without details, it's hard to tell if you're missing something or blaming the tool instead of focusing on getting things to work.
I found a solution I am happy with. Based on my research into this it is a problem I'm not alone with. I do concede that writing your own analyzers is unusual (which is why i wrote that it's fine for simple setups). At the same time I deem having a common library to be referred by more than one analyzer project something that should be possible without running into strange errors.
If a tool (dotnet build) tells me something is wrong I am fine with it. If the same tool works after I added something to the project description and then fails at a random time after without having changed the referenced stuff, then I will happily blame the tool. Especially when commenting out the reference, recompiling until error, and then uncommenting it fixes the issue. While this behavior doesn't necessitate an issue with the files per-se, there is only entity consuming it, so to me there is no distinction.
Back when I last used C#, which admittedly is over 10 years ago, .csproj files were written in XML so they were annoying to edit. It was difficult to understand the structure of them, and I'm not sure if they were documented very well.
Just compare a .csproj to something modern like a Cargo.toml and you'll see why someone might think .csproj is awful. It is immediately obvious what each section of a Cargo.toml does, and intuitive how you might edit or extend them.
Also, just talk to a C# developer, I bet over half of them have never even edited a .csproj and only use visual studio as a GUI to configure their projects.
Very outdated view. Csproj files got hugely simplified with .NET Core, and thank God they kept the XML format instead of the JSON they tried at first.
"SDK-Style csproj" has been around since .NET 5. They are pretty great and not bad to hand edit. Starts as just:
(It's named "SDK-style" because that Sdk="Sdk.Name" attribute on the Project tag does a ton of heavy lifting and sets a lot of smart defaults.)In the "SDK-Style", files are included by default by wildcards, you no longer need to include every file in the csproj. Adding NuGet references by hand is now just as easy as under an <ItemGroup> add a <PackageReference Include"Package.Name" Version="1.0.0" />. (For a while NuGet references were in their own file, but there would also be a dance of "assembly binding redirects" NuGet used to also need to sometimes include in a csproj. All of that is gone today and much simplified to a single easy to write by hand tag.) Other previously "advanced" things you'd only trust the UI to do are more easily done by hand now, too.
> Also, just talk to a C# developer, I bet over half of them have never even edited a .csproj and only use visual studio as a GUI to configure their projects.
Depends on the era, but in the worst days of the csproj verbosity where every file had to be mentioned (included as an Item) in the csproj I didn't know a single C# developer that hadn't needed to do some XML surgery in a csproj file at some point, though most of that was to fix merge conflicts because back then it was a common source of merge conflicts. Fixing merge conflicts in a csproj used to be a rite of passage for any reasonably sized team. (I do not miss those days and am very happy with the "SDK-Style csproj" today.)
The immediately obvious Cargo.toml is
Why dumb? Sure there is 1-2 syntax warts left, but the files are very expressive nowadays.
Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible. Isn't that one of the most basic things that should be easy?
Half of the properties are XML attributes <tag ID=5> while the other half are child tags <tag><hello>5</hello></tag>
It's okay to read, but basically impossible to write by hand.
The tooling support is nothing like a JSON with JSON schema that gives you full intellisense with property autocomplete plus attribute description. (Imagine VSCode settings)
> Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible.
Is it, though? The guidelines seem to be pretty straight forward: unless you want to target a new language feature, you can just target netstandard or whatever target framework your code requires at the moment. This barely registers as a concern.
https://learn.microsoft.com/en-us/dotnet/standard/net-standa...
> It's okay to read, but basically impossible to write by hand.
Are you sure about that? I mean, what's the best example you can come up with?
This complaint is even more baffling considering a) the number of XML editor helpers out there, and b) the fact that most mainstream build systems out there are already XML-based.
I personally wouldn't bother with .NET Standard anymore. .NET Standard 2.0 is nicely frozen and still useful to some legacy shops, but at this point I think for all new code the only decision is to support LTS or not, and then just pick the latest version.
It will break in a year when a new shiny version appears.
.NET Standard 2.0 will break? I think it's frozen for good. That's also the biggest problem with it: there are so many performance improvements in .NET 7+ that .NET Standard can't opt into but come "free" with retargeting. (Among other things, a lot more Span<T> overloads throughout the BCL. There is a compatibility shim for Memory<T> and Span<T> access in .NET Standard 2.0 and that also includes some of the new overloads, but not all of them.)
Targeting a specific .NET version will break in a year? LTS versions specifically have two years of support. But also .NET has very rarely broken backward compat and you can still easily load libraries built targeting .NET 5 in .NET 9 today. You don't necessarily have to "keep up with the treadmill". It's a good idea: see "free performance boosts when you do". But it isn't required and .NET 5 is still a better target from today's perspective than .NET Standard 2.0. (The biggest instance I know of a backwards compatibility break in .NET was the rescoping between .NET [Framework] 3.5.x and .NET [Framework] 4.0 on what was BCL and what was out/no longer supported and that was still nothing like Python 2 versus 3. I know a lot of people would also count the .NET Framework 4.x and .NET Core 1.0 split, too, which is the reason for the whole mess of things like .NET Standard, but also .NET Standard was the backward compatibility guarantee and .NET Standard 2.0 was its completion point, even though yes there are versions > 2.0, which are even less something anyone needs to worry about today.)
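And when a library really must serve both legacy and modern consumers, multi-targeting handles it without any flowchart. A sketch:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- one build for legacy reach, one that opts into the newer BCL overloads -->
    <TargetFrameworks>netstandard2.0;net8.0</TargetFrameworks>
  </PropertyGroup>
</Project>
```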
AFAIK, net5 and net9 are incompatible with each other and you can't run one on another, major versions and all that. You need netstandard to run on different versions, but it's only for libraries, you can't run a process on netstandard.
I don't think things are quite that bad. I'd take a csproj file over many Maven files or Makefiles. The three or four ways I've seen Python manage dependencies didn't improve things either. I'm quite comfortable with Rust's toml files these days but they're also far from easy to write as a human. I still don't quite understand how Go does things; it feels like I'm either missing something or Go just makes you run commands manually when it comes to project management and build features.
I don't think there are any good project definition files. At least csproj is standardised XML, so your IDE can tell if you're allowed to do something or not before you try to hit build.
As for targeting frameworks and versions, I think that's only a problem on Windows (where you have the built in one and the one(s) you download to run applications) and even then you can just target the latest version of whatever framework you need and compile to a standard executable if you don't want to deal with framework stuff. The frameworks themselves don't have an equivalent in most languages, but that's a feature, not a bug. It's not even C# exclusive, I've had to download specific JREs to run Java code because the standard JRE was missing a few DLLs for instance.
The "built-in to Windows" one is essentially feature frozen and "dead". It's a bit like the situation where a bunch of Linux distros for a long while included a "hidden" Python 2 for internal scripts and last chance backwards compatibility even despite Python 3 supposed to be primary in the distro and Python 2 out of support.
Except this is also worse because this is the same Microsoft commitment to backwards compatibility of "dead languages" that leads to things like the VB6 runtime still being included in Windows 11 despite the real security support for the language itself and writing new applications in it having ended entirely in the Windows XP era. (Or the approximately millions of side-by-side "Visual C++ Redistributables" in every Windows install. Or keeping the Windows Scripting Host and support for terribly old dialects of VBScript and JScript around all these decades later, even after being known mostly as a security vulnerability and malware vector for most of those same decades.)
Exactly the reason why The Year of Desktop Linux has become a meme, and apparently it is easier to translate Win32 calls than convince game devs already targeting POSIX like platforms to take GNU/Linux into account.
JScript is still a proper programming language and isn't Electron-sized. Also, HTA did the Electron thing before Google even planned it.
> Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible.
It's gotten real simple in the last few years, and is basically the exact same flowchart as Node.JS:
Just like Node, versions cycle every six months (.NET 10, the next LTS, is in Preview [alpha/beta testing] today; the feature being discussed is a part of this preview) and LTS versions add an extra year and a half of security-support safety net to upgrade to the next LTS.

Everything else can be forgotten. It's no longer needed. It's no longer a thing. It's a dead version for people that need deep legacy support in dark brownfields and nothing more than that.
> Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible. Isn't that one of the most basic things that should be easy.
What do you think happens when you try to use Python 3.9 to run a script that depends on features added in 3.10? This is inherent to anything that requires an interpreter or runtime. Most scripting tools just default to "your stuff breaks if you get it wrong", whereas .Net requires you to explicitly declare the dependency.
Try to find any language with more than 20 years production deployment across the planet without similar issues.
I can’t believe it took this long.
It is insane to me how long it takes people to realize that low barriers for execution and experimentation are important.
Imagine if the Commodore 64 or Microsoft Basic required you to scaffold your program with helper files containing very specific and correct information in order to run those programs.
Maybe one day they'll bite the bullet and let you write a function without extra 10 keywords and a sprinkling of nouns
I really want to like C# but there is a reason why it has no ecosystem outside of enterprise and gaming slop
Which 10 extra keywords? It is a statically typed language (brings things like types, type parameters, ...), object oriented (brings things like abstract, override, class, ...), and high performance (brings things like ref, in, out, ...).
And it is expressive instead of "mathematical" (like ML languages), which makes productivity in code reviews a thing.
It is exactly where it should be.
Maybe functional language programmers will one day pull the stick out of their arses and get a bit less supercilious, and realise that expressiveness and function name length has absolutely nothing to do with runtime performance, especially in native and properly JITed runtimes. Maybe they'll realise it makes things easier to approach, read, reason about, and hence write more correctly.
When I dream I don't think about public static class IDream
OOP is a scam, a useful scam but still a scam, it is in no way easier or better to force _everything_ to be a class
P.S. not a functional programmer at all - except in dotnet (F#) because there's less stuff to get in my way.
> OOP is a scam, a useful scam but still a scam, it is in no way easier or better to force _everything_ to be a class
You're in luck - C# doesn't force everything to be a class and has many functional programming features. Hell, even C++ has had functions as data for, like, 10 years now.
A scam that powers all major mainstream programming language ecosystems, and their toolchains.
I'd much rather be writing in C# than going back to C++ any day of the week.
What's there besides enterprise and gaming?
Systems programming, AI, and startups' CRUD apps?
> Maybe one day they'll bite the bullet and let you write a function without extra 10 keywords and a sprinkling of nouns
I have no idea what you are talking about. Care to present a single example?
well to do anything new in a C# project one typically ends up making a class inside a new namespace, if I just want to write a solver with no state it's quite jarring and distracts from otherwise a pretty good language that I like working in. So 2 new nouns and a handful of keywords.
> well to do anything new in a C# project one typically ends up making a class inside a new namespace (...)
You don't need to. A few years ago, C# introduced top-level statements. They were introduced in 2020, I think.
https://learn.microsoft.com/en-us/dotnet/csharp/fundamentals...
Hello world in C# today looks like this:
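A single top-level statement is the whole program (a minimal sketch):

```csharp
Console.WriteLine("Hello, world!");
```

And an expression-bodied function at global scope needs no braces either:

```csharp
int Square(int x) => x * x;
Console.WriteLine(Square(4)); // prints 16
```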
That's it. No namespaces (which were never required anyway, not even in C# 1.0), no classes, no functions even. If you want to define a function, you can just do so in global scope, and if it's a single expression you don't even need the braces.

It's clear you haven't run `dotnet new console` in years or you wouldn't be saying this.
You want to write something in C# smaller than 10 words?
A namespace isn't required
I know C# people like to think up three new forms of syntax before breakfast, but it feels like these magic comments could have been attributes?
Those are not comments:
https://learn.microsoft.com/en-us/dotnet/csharp/language-ref...
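In this preview they are file-level directives with a `#:` prefix. A sketch based on the preview announcement (Humanizer is just the example package used there; the exact syntax may still change before release):

```csharp
#:package Humanizer@2.14.1

using Humanizer;

Console.WriteLine(DateTime.UtcNow.AddHours(-3).Humanize());
```

Because `#:` is illegal in ordinary C#, these lines stay unambiguous to the `dotnet run` front end while a classic compiler would simply reject the file.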
They've been supported for decades at this point, they're not new. Your suggestion would require more changes to the language than what's actually implemented, afaiu.
Apologies, I left C# years ago when there were only 5,000 things to keep in your head at any one time.
I find it odd you feel that way.
I find C# to be remarkably better than javascript, for example, of having a 'right' way to do something.
Even when there are multiple ways the community tends to coalesce around one way.
C# has always supported preprocessor directives.
Yes, but I would have thought attributes were the canonical "think you might want to reflect over" rather than this imperative solution. But you're all happy with it and it's not my circus, so enjoy.
This cannot be something you reflect over, because they encode build arguments. They need to be parsed before the file is compiled, which is exactly what preprocessor directives allow.
Java's annotations (which are mostly equivalent to C#'s attributes) can be consumed at compile time using annotation processors. If C# had this feature for attributes, could attributes have been used here?
C# has this feature; it's called a Roslyn source generator.
I believe your Java annotations cannot change the build parameters of the package currently being compiled, which is why you wouldn't be able to do that in Java.
btw, anything that is present in the AST could be used for that, but I think the preprocessor directive is the most sensible choice.
Fair enough. Presumably I can query a script first somehow to see what the dependency tree is going to look like?
Well, for a regular dotnet project there is `dotnet list`, which allows that; I don't know how it will work for a C# script file, probably the same.
From what I've read of this preview `dotnet list` works as expected. Under the hood it supposedly builds a mini-project file so that most of the `dotnet` cli works as expected. It also provides an "eject button" to build the csproj from the script file `dotnet project convert`.
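So the workflow, assuming the preview commands behave as documented, would be roughly:

```
# run the single file directly
dotnet run app.cs

# graduate it to a full project when it outgrows one file
dotnet project convert app.cs
```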
Yup; from v1
The motivation was almost certainly compatibility with UNIX shebang lines for shell scripts:
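i.e. something like this at the top of the file (a sketch; the exact interpreter path depends on your install):

```csharp
#!/usr/bin/dotnet run
Console.WriteLine("Hello from an executable C# file");
```

after which `chmod +x hello.cs && ./hello.cs` runs it like any other script.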
Which just looks like a crime against nature and the right order of things.

It's really hard to explain to anyone who wasn't born in a monoculture, but I come from an eastern European country where in my childhood there may have been maybe a few hundred truly foreign people living there at any one time, mostly diplomats and their families. Decades later I visited and saw a Chinese immigrant speaking my native tongue. You can't imagine how... disconcerting that is. Not bad, I'm not against it, it's just... weird.
This feels exactly like that.
This comes from C++ preprocessor directives, and has been around since the first version of C#.
Being able to use a shebang as a preprocessing directive is really creative. I'm really glad C# isn't a design-by-committee language.
Well, shebang is a comment.