What's wrong with .csproj files?
I used to think there was nothing wrong with them either. For simple setups they're very painless. However, I recently stumbled into a use case where setting up the project file is a huge hassle and very error-prone as well. Mainly, I wanted another project to be referenced only at the "analyzer" step. This appeared to work just fine at first, but after the third or fourth compilation it failed for no apparent reason. Digging into this opened a huge can of worms of partially or falsely documented features, for something I sincerely believed should be easy to achieve. Now the only thing I can do is copy the project into the analyzer, or let the build fail with the dependency disabled just so that the next build works again.
There are also some issues with restoring other projects, but those don't appear to be the fault of .csproj files.
P.S.: Having a project referenced (i.e. marker attributes) both at the analyzer step _and_ later in the final artifact is something I have never gotten working reliably so far. From what I've read, a NuGet package could make this easier, but I don't want to do that and would expect there to be a way without going through the package-management system.
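(For context, the commonly documented shape of an analyzer-only reference is a ProjectReference with extra metadata, sketched below with a placeholder path; this is the standard incantation, not necessarily the setup described above:)

    <!-- consume the project as an analyzer/source generator only,
         without making it a runtime dependency of this project -->
    <ItemGroup>
      <ProjectReference Include="..\MyAnalyzers\MyAnalyzers.csproj"
                        OutputItemType="Analyzer"
                        ReferenceOutputAssembly="false" />
    </ItemGroup>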
> I used to think there was nothing wrong with them either.
From your complaint, it doesn't seem like you're pointing out anything wrong with .csproj files. You struggled with a use case that's far from normal and might not even be right for you. Without details, it's hard to tell whether you're missing something or just blaming the tool instead of focusing on getting things to work.
I found a solution I am happy with. Based on my research, it is a problem I'm not alone in having. I do concede that writing your own analyzers is unusual (which is why I wrote that it's fine for simple setups). At the same time, I think having a common library referenced by more than one analyzer project should be possible without running into strange errors.
If a tool (dotnet build) tells me something is wrong, I am fine with it. If the same tool works after I add something to the project description and then fails at some random later time without the referenced stuff having changed, I will happily blame the tool. Especially when commenting out the reference, recompiling until it errors, and then uncommenting it fixes the issue. While this behavior doesn't prove an issue with the files per se, there is only one entity consuming them, so to me there is no distinction.
Back when I last used C#, which admittedly was over 10 years ago, .csproj files were written in XML, so they were annoying to edit. It was difficult to understand their structure, and I'm not sure they were documented very well.
Just compare a .csproj to something modern like a Cargo.toml and you'll see why someone might think .csproj is awful. It is immediately obvious what each section of a Cargo.toml does, and intuitive how you might edit or extend them.
Also, just talk to a C# developer; I bet over half of them have never even edited a .csproj and only use Visual Studio as a GUI to configure their projects.
Very outdated view. Csproj files got hugely simplified with .NET Core, and thank God they kept the XML format instead of the JSON they tried at first.
"SDK-Style csproj" has been around since .NET 5. They are pretty great and not bad to hand edit. Starts as just:
(It's named "SDK-style" because that Sdk="Sdk.Name" attribute on the Project tag does a ton of heavy lifting and sets a lot of smart defaults.)

In the SDK style, files are included by default via wildcards; you no longer need to list every file in the csproj. Adding NuGet references by hand is now as easy as adding a <PackageReference Include="Package.Name" Version="1.0.0" /> under an <ItemGroup>. (For a while NuGet references lived in their own file, and there was also the dance of "assembly binding redirects" that NuGet sometimes needed to add to a csproj. All of that is gone today, simplified to a single tag that's easy to write by hand.) Other previously "advanced" things you'd only trust the UI to do are more easily done by hand now, too.
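For example (package name and version are placeholders):

    <ItemGroup>
      <PackageReference Include="Package.Name" Version="1.0.0" />
    </ItemGroup>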
> Also, just talk to a C# developer; I bet over half of them have never even edited a .csproj and only use Visual Studio as a GUI to configure their projects.
Depends on the era, but in the worst days of csproj verbosity, when every file had to be mentioned (included as an Item) in the csproj, I didn't know a single C# developer who hadn't needed to do some XML surgery on a csproj file at some point. Most of that was fixing merge conflicts, because back then the csproj was a common source of them; fixing merge conflicts in a csproj used to be a rite of passage for any reasonably sized team. (I do not miss those days and am very happy with the "SDK-style csproj" today.)
The immediately obvious Cargo.toml is:
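    # a minimal manifest for comparison; real ones vary
    [package]
    name = "example"
    version = "0.1.0"
    edition = "2021"

    [dependencies]
    serde = "1.0"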
Why dumb? Sure, there are one or two syntax warts left, but the files are very expressive nowadays.
Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible. Isn't that one of the most basic things that should be easy?
Half of the properties are XML attributes (<tag ID="5">) while the other half are child tags (<tag><hello>5</hello></tag>).
It's okay to read, but basically impossible to write by hand.
The tooling support is nothing like JSON with a JSON Schema, which gives you full IntelliSense with property autocomplete plus attribute descriptions. (Imagine VS Code settings.)
> Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible.
Is it, though? The guidelines seem pretty straightforward: unless you want to target a new language feature, you can just target netstandard or whatever target framework your code requires at the moment. This barely registers as a concern.
https://learn.microsoft.com/en-us/dotnet/standard/net-standa...
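For illustration, picking a target is a single property in the csproj (TFMs below chosen arbitrarily):

    <PropertyGroup>
      <TargetFramework>netstandard2.0</TargetFramework>
      <!-- or multi-target; note the plural property name: -->
      <!-- <TargetFrameworks>netstandard2.0;net9.0</TargetFrameworks> -->
    </PropertyGroup>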
> It's okay to read, but basically impossible to write by hand.
Are you sure about that? I mean, what's the best example you can come up with?
This complaint is even more baffling considering a) the number of XML editor helpers out there, and b) the fact that most mainstream build systems are already XML-based.
I personally wouldn't bother with .NET Standard anymore. .NET Standard 2.0 is nicely frozen and still useful to some legacy shops, but at this point I think for all new code the only decision is to support LTS or not, and then just pick the latest version.
It will break in a year when a new shiny version appears.
.NET Standard 2.0 will break? I think it's frozen for good. That's also the biggest problem with it: there are so many performance improvements in .NET 7+ that .NET Standard can't opt into but come "free" with retargeting. (Among other things, a lot more Span<T> overloads throughout the BCL. There is a compatibility shim for Memory<T> and Span<T> access in .NET Standard 2.0 and that also includes some of the new overloads, but not all of them.)
Targeting a specific .NET version will break in a year? LTS versions specifically get three years of support. But also, .NET has very rarely broken backward compat, and you can still easily load libraries built targeting .NET 5 in .NET 9 today. You don't necessarily have to keep up with the treadmill; it's a good idea (free performance boosts when you do), but it isn't required, and .NET 5 is still a better target from today's perspective than .NET Standard 2.0. (The biggest backwards-compatibility break in .NET that I know of was the rescoping between .NET [Framework] 3.5.x and .NET [Framework] 4.0 of what was in the BCL and what was out/no longer supported, and even that was nothing like Python 2 versus 3. I know a lot of people would also count the .NET Framework 4.x and .NET Core 1.0 split, which is the reason for the whole mess of things like .NET Standard; but .NET Standard was the backward-compatibility guarantee, and .NET Standard 2.0 was its completion point, even though yes, there are versions > 2.0, which are even less something anyone needs to worry about today.)
AFAIK, net5 and net9 are incompatible with each other and you can't run one on the other, major versions and all that. You need netstandard to run on different versions, but that's only for libraries; you can't run a process on netstandard.
.NET 9 will absolutely load a library that was built targeting .NET 5.
For instance, GraphQL was built targeting both .NET 5 and .NET Standard 2.0, as you can see towards the top of the NuGet page: https://www.nuget.org/packages/GraphQL
.NET 9 will use the .NET 5 build, not .NET Standard 2.0. (.NET Framework 4.8.x would use .NET Standard 2.0.) Because .NET 5 > .NET Standard 2.0.
Or AutoMapper 14 targets only .NET 8: https://www.nuget.org/packages/AutoMapper
It runs on .NET 9 and .NET 10.
Just to pick two examples at mostly random from the top packages on NuGet.
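(Roughly how that resolution works: the consumer picks the nearest compatible target from the package's lib/ folders. The sketch below is illustrative, not the actual contents of either package.)

    lib/
      netstandard2.0/   <- picked by .NET Framework 4.8.x
      net5.0/           <- picked by .NET 5 through .NET 9+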
I don't think things are quite that bad. I'd take csproj files over many Maven files or Makefiles. The three or four ways I've seen Python manage dependencies don't improve things either. I'm quite comfortable with Rust's TOML files these days, but they're also far from easy to write as a human. I still don't quite understand how Go does things; it feels like I'm either missing something or Go just makes you run commands manually when it comes to project management and build features.
I don't think there are any good project definition files. At least csproj is standardised XML, so your IDE can tell if you're allowed to do something or not before you try to hit build.
As for targeting frameworks and versions, I think that's only a problem on Windows (where you have the built-in one and the one(s) you download to run applications), and even then you can just target the latest version of whatever framework you need and compile to a standalone executable if you don't want to deal with framework stuff. The frameworks themselves don't have an equivalent in most languages, but that's a feature, not a bug. It's not even C#-exclusive: I've had to download specific JREs to run Java code because the standard JRE was missing a few DLLs, for instance.
The "built-in to Windows" one is essentially feature frozen and "dead". It's a bit like the situation where a bunch of Linux distros for a long while included a "hidden" Python 2 for internal scripts and last chance backwards compatibility even despite Python 3 supposed to be primary in the distro and Python 2 out of support.
Except it's also worse, because this is the same Microsoft commitment to backwards compatibility for "dead languages" that leads to things like the VB6 runtime still being included in Windows 11, despite real security support for the language (and writing new applications in it) having ended entirely in the Windows XP era. (Or the approximately millions of side-by-side "Visual C++ Redistributables" in every Windows install. Or keeping the Windows Scripting Host and support for terribly old dialects of VBScript and JScript around all these decades later, even after being known mostly as a security vulnerability and malware vector for most of those same decades.)
Exactly the reason why The Year of Desktop Linux has become a meme: apparently it is easier to translate Win32 calls than to convince game devs already targeting POSIX-like platforms to take GNU/Linux into account.
JScript is still a proper programming language and isn't Electron-sized. Also, HTA was doing what Electron does before Google even existed.
> Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible.
It's gotten real simple in the last few years, and is basically the exact same flowchart as Node.js:

    Do you need LTS support? --> Yes --> target the most recent even number (.NET 8.0 today)
                             ^--> No --> target the most recent version (.NET 9.0 today)

Just like Node, versions cycle every six months (.NET 10, the next LTS, is in preview [alpha/beta testing] today; the feature being discussed is part of that preview), and LTS versions add an extra year and a half of security support as a safety net to upgrade to the next LTS.

Everything else can be forgotten. It's no longer needed. It's no longer a thing. It's a dead version for people who need deep legacy support in dark brownfields, and nothing more than that.
> Deciding whether something is netFramework 4.8 or .NET Core 5 or NET3000 or nEt5 or NET Common 2.0 is impossible. Isn't that one of the most basic things that should be easy?
What do you think happens when you try to use Python 3.9 to run a script that depends on features added in 3.10? This is inherent to anything that requires an interpreter or runtime. Most scripting tools just default to "your stuff breaks if you get it wrong", whereas .NET requires you to explicitly declare the dependency.
Try to find any language with more than 20 years of production deployment across the planet without similar issues.