All you need for that is the ability to read and write binary blobs to and from files, which Windows gives you, and to know what "text files" means for the other programs on that platform. Windows itself doesn't care much about text; but the other programs share a convention that ASCII text files have CRLF-separated variable-length lines of text, and that Unicode text files store text in UTF-16LE (including the CRLF pairs, so those look like "\x0D\x00\x0A\x00" as raw bytes).
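As a sketch of that convention (Python used purely for illustration), here is the same CRLF-terminated line under both encodings:

```python
line = "hello\r\n"

# "ASCII text file": one byte per character, CRLF at the end of each line.
ascii_bytes = line.encode("ascii")      # b'hello\r\n'

# "Unicode text file": UTF-16LE, two bytes per code unit, so the CRLF pair
# comes out as \x0D\x00\x0A\x00 in the raw bytes on disk.
utf16_bytes = line.encode("utf-16-le")
```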
All of this is left to user space to sort out, just as it is on Linux, so I am not entirely sure why you demand that Windows do more for you than Linux does.
The OS is the one providing the filesystem; it should define and support how it's used (including providing standard utilities for manipulating it, both from programs and by the operator) rather than leaving the programs to figure it out among themselves. (After all, if the text storage format didn't matter to the OS, why would we bother using the CRLF format on Windows at all? I submit that third-party programs did not spontaneously come up with an arbitrary convention that everyone would use a different text format on Windows; rather, programs use CRLF when running on Windows precisely because the standard utilities that ship as part of DOS/Windows expect that format.)
As already stated multiple times here, CRLF is actually the "correct" way (at least in the telex days, when CR and LF had the literal meanings "return carriage to home" and "feed a new line"), while LF-only is a Unix "hack"/abstraction (and was actually converted back into CRLF when fed to a telex or a terminal). It is not really a surprise that DOS, which was inspired by CP/M, simply copied what was originally a physical signal; it's the same reason ASCII/ANSI has a BEL character for ringing a bell. In short, CRLF was the way to handle newlines at the time DOS was designed. You would expect CRLF as the line ending because that's how terminals work (unlike Unix's magic, which smooshes two distinct operations into one character).
If you are writing a developer suite, whether you're Delphi developing for MS-DOS or Microsoft developing for the Apple II, you kind of already know how things should work (because you have the reference book for the platform, not just the compiler/language). There was no assumption that the OS provides an abstraction for text; in those days, everyone just implemented it from scratch ("code page" comes from literal code pages, where each character had a well-defined byte). This is manifested in command-line handling on Windows: the platform convention is that the command line is just a flat string, and the C runtime determines how to chop it up (MSVC and Intel C have historically disagreed heavily here). Windows-only CRLF looks like an aberration only because Unix-based designs took over the world: macOS is Unix, Linux was inspired by Unix, *BSD is Unix-derived.
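To make the "flat string" point concrete, here is a rough sketch of the MSVC-style splitting rules (simplified; the real CRT has more edge cases, e.g. around doubled quotes, and other runtimes differ, which is exactly the point):

```python
def split_cmdline(cmd: str) -> list[str]:
    """Simplified MSVC-style command-line splitting: 2n backslashes before
    a quote become n backslashes; 2n+1 become n plus a literal quote;
    an unescaped quote toggles whether spaces separate arguments."""
    args, cur, in_quotes = [], [], False
    i = 0
    while i < len(cmd):
        c = cmd[i]
        if c == "\\":
            n = 0
            while i < len(cmd) and cmd[i] == "\\":
                n += 1
                i += 1
            if i < len(cmd) and cmd[i] == '"':
                cur.append("\\" * (n // 2))
                if n % 2:           # odd count: the quote is literal
                    cur.append('"')
                    i += 1
                # even count: quote toggles mode on the next iteration
            else:
                cur.append("\\" * n)  # backslashes not before a quote are literal
        elif c == '"':
            in_quotes = not in_quotes
            i += 1
        elif c in " \t" and not in_quotes:
            if cur:
                args.append("".join(cur))
                cur = []
            i += 1
        else:
            cur.append(c)
            i += 1
    if cur:
        args.append("".join(cur))
    return args
```

A runtime that picks different rules here produces a different argv from the very same flat string, with no "wrong" answer as far as the platform is concerned.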
It still shows up in IETF-style textual network protocols, which evolved on non-Unix systems (HTTP, SMTP, etc.).
MTA-STS, a very recent standard (RFC 8461), only allows CRLF as the line terminator (to the chagrin of *nix lovers, and despite the fact that a majority of mail systems are operated on *nix systems).
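A minimal sketch of what that means in practice, assuming a hypothetical helper that emits the policy file served at mta-sts.&lt;domain&gt; (field names per RFC 8461; a bare "\n".join() here would produce a non-conforming policy):

```python
def mta_sts_policy(mode: str, mx_hosts: list[str], max_age: int) -> bytes:
    """Serialize an MTA-STS policy; each line must end with CRLF."""
    lines = ["version: STSv1", f"mode: {mode}"]
    lines += [f"mx: {mx}" for mx in mx_hosts]
    lines.append(f"max_age: {max_age}")
    return ("\r\n".join(lines) + "\r\n").encode("ascii")
```

So even a policy generated on a Linux mail host has to write out Windows-style line endings.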
The peril of writing protocols with an eye to debugging them on the cheapest terminal you can find on campus, with a grad student paid in coffee.