As a static typing advocate I do find it funny how all the popular dynamic languages have slowly become statically typed, after decades of people saying it's not at all necessary and being so critical of statically typed languages.
When I was working on a fairly large TypeScript project it became the norm for dependencies to have type definitions in a relatively short space of time.
People adapt to the circumstances. A lot of Python use is no longer about fast iteration in the REPL. Instead, we are shipping Python to execute on clusters in very long-running jobs, or inside servers. It's not only about having to start all over after hours; it's that concurrent and distributed execution environments are simply hostile to interactive programming. You can't afford to wait for an exception and launch the debugger for a post-mortem, and even when you can, it's not very useful.
And now my personal opinion: if we are going the static typing way, I would prefer to simply use Scala or similar instead of Python with types. Unfortunately, in the same way that high-performance languages like C attract premature optimizers, static types attract premature "abstracters" (C++ attracts both). I also think that dynamic languages have the largest libraries for technical-merit reasons: being more "fluid" makes them easier to mix, and in the long term the ecosystem converges organically on certain interfaces between libraries.
And so here we are with the half-baked approach of gradual typing and # type: ignore everywhere.
Here we are because:
* Types are expensive and don't tend to pay off on spiky/experimental/MVP code, most of which gets thrown away.
* Types are incredibly valuable on hardened production code.
* Most good production code started out spiky, experimental, or as an MVP, and transitioned.
And so here we are with gradual typing, because "throwing away all the code and rewriting it to be 'perfect' in another language" has been known for years to be a shitty way to build products.
I'm mystified that more people here don't see that the value and cost of types is NOT binary ("they're good!" "they're bad!") but exists on a continuum that is contingent on the status of the app, and sometimes even the individual feature.
> Types are expensive and don't tend to pay off on spiky/experimental/MVP code, most of which gets thrown away.
I’ve spent so much time writing typed code that I now find it harder to write POC code in dynamic languages, because I use types to help reason about how I want to architect something.
E.g. “this function should calculate x and return it”: well, if you already know what you want the function to do, then you know what types you want. And if you don’t know what types you want, then you haven’t actually decided what that function should do ahead of building it.
Now you might say “the point of experimental code is to figure out what you want functions to do”. But even if you’re writing an MVP, you should know what each function should do by the time you’ve finished writing it. Because if you don’t know how to build a function, then how do you even know that the runtime will execute it correctly?
Python doesn’t have “no types”; in fact, it is strict about types. You just don’t have to waste time reading and writing them early on.
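A minimal illustration of that runtime strictness (just a sketch):

    # Python is dynamically but strongly typed: values carry their types at
    # runtime, and mismatched operations fail loudly instead of coercing.
    try:
        "1" + 1
    except TypeError as e:
        print(e)  # can only concatenate str (not "int") to str

Contrast with JavaScript, where "1" + 1 silently yields the string "11".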
While a boon during prototyping, a project may need more structural support as the design solidifies, it grows, or a varied, growing team takes responsibility.
At some point those factors dominate, to the extent “may need” support approaches “must have.”
My point is: if you don’t know what types you need, then you can’t be trusted to write the function to begin with. So you don’t actually save that much time in the end. Typing out type names simply isn’t the time-consuming part of prototyping.
But when it comes to refactoring, having type safety makes it very easy to use static analysis (typically the compiler) to check for type-related bugs during that refactor.
I’ve spent a fair number of years in a great many different PL paradigms, and I’ve honestly never found loosely typed languages any faster for prototyping.
That all said, I will say that a lot of this also comes down to what you’re used to. If you’re used to thinking about data structures then your mind will go straight there when prototyping. If you’re not used to strictly typed languages, then you’ll find it a distraction.
Right after hello world you need a list of arguments or a dictionary of numbers to names. Types.
Writing map = {} is a few times faster than map: dict[int, str] = {}. Now multiply by ten instances. Oh wait, I’m going to change that to a tuple of pairs instead.
It takes me about three times longer to write equivalent Rust than Python, and sometimes it’s worth it.
Rust is slower to prototype than Python because Rust is a low level language. Not because it’s strictly typed. So that’s not really a fair comparison. For example, assembly doesn’t have any types at all and yet is slower to prototype than Rust.
Let’s take Visual Basic 6, for example. That was very quick to prototype in even with “option explicit” (basically forcing type declarations) defined. Quicker, even, than Python.
Typescript isn’t any slower to prototype in than vanilla JavaScript (bar setting up the build pipeline — man does JavaScript ecosystem really suck at DevEx!).
Writing map = {} only saves you a few keystrokes. And unless you’re typing really slowly with one finger, like an 80-year-old using a keyboard for the first time, you’ll find the real input bottleneck isn’t how quickly you can type your data structures into code, but how quickly your brain can turn a product spec / Jira ticket into a mental abstraction.
> Oh wait, I’m going to change that to a tuple of pairs instead
And that’s exactly when you want the static analysis of a strict type system to jump in and say “hang on mate, you’ve forgotten to change these references too” ;)
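To make that concrete, a hypothetical sketch (mypy is assumed as the checker, and the names are made up):

    # Before: a dict keyed by id.
    scores: dict[int, str] = {42: "answer"}

    def describe(scores: dict[int, str]) -> str:
        return scores[42]

    # After switching the structure to a list of pairs...
    pairs: list[tuple[int, str]] = [(42, "answer")]

    # ...the checker flags every stale reference. Uncommenting this call
    # makes mypy report: incompatible type "list[tuple[int, str]]";
    # expected "dict[int, str]"
    # describe(pairs)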
Having worked on various code bases across a variety of different languages, the refactors that always scare me the most aren’t the ones in large code bases; they’re the ones in Python or JavaScript, because I don’t have a robust type system providing me with compile-time safety.
There’s an old adage that goes something like this: “don’t put off to runtime what can be done in compile time.”
As computers have gotten exponentially faster, we seem to have forgotten this rule. And to our own detriment.
I've found the transition point where types become useful starts even within a few hundred lines of code, and I've found types are not that restrictive, if at all, especially if the language started out typed. An escape hatch to discard types is usually available for the rare case I need it, and needing it is a code smell that you're doing something wrong.
Even within a recent toy one-hour Python interview question, having types would've saved me some issues and caught an error that wasn't obvious. It probably would've saved 10 minutes in the interview.
Yep, depends on your memory context capacity.
For me, I often don't feel any pain points when working below about 1kloc (when doing JS). However, if a project is above 500loc, it's often a tad painful to resume it months later, when I've started to forget why I used certain data structures that aren't directly visible (adding types at that point is usually the best choice, since it gives a refresher of the code at the same time as doing a soundness check).
The transition point where type hints become valuable or even necessary isn't about how many lines of code you have; it's about how much you rely upon their correctness.
Type strictness also isn't binary. A program with lots of dicts that should be classes doesn't get much safer just because you wrote : dict[str, dict] everywhere.
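For instance (a hypothetical sketch; the failing lines are commented out so the snippet runs):

    from dataclasses import dataclass

    # Technically annotated, but the checker can't see the real structure:
    user: dict[str, dict] = {"profile": {"name": "Ada", "age": 36}}
    # user["profile"]["nmae"]  # typo: KeyError at runtime, yet no checker complaint

    # A small class pins the shape down:
    @dataclass
    class Profile:
        name: str
        age: int

    profile = Profile(name="Ada", age=36)
    # profile.nmae  # the checker flags this: "Profile" has no attribute "nmae"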
> * Types are expensive and don't tend to pay off on spiky/experimental/MVP code, most of which gets thrown away.
Press "X" to doubt. Types help _a_ _lot_ by providing autocomplete, inspections, and helping with finding errors while you're typing.
This significantly improves iteration speed, as you don't need to run the code to detect that you mistyped a variable somewhere.
PyCharm, pyflakes, et al. can do most of these without written types.
Autocomplete can’t help with the more interesting questions, like “should I use itertools or collections?”
In some fields throwing away and rewriting is the standard, and it works, more or less. I'm thinking about scientific/engineering software: prototype in Python or Matlab and convert to C or C++ for performance/deployment constraints. It happens frequently with compilers too. I think migrating languages is actually more successful than writing second versions.
> * Types are expensive and don't tend to pay off on spiky/experimental/MVP code, most of which gets thrown away.
This is what people say, but I don't think it's correct. What is correct is that say, ten to twenty years ago, all the statically typed languages had other unacceptable drawbacks and "types bad" became a shorthand for these issues.
I'm talking about C (a nonstarter for obvious reasons), C++ (a huge mess, footguns, very difficult, presumably requires a cmake guy), Java (very restrictive, slow iteration and startup, etc.). Compared to those, just using Python sounds decent.
Nowadays we have Go and Rust, both of which are pretty easy to iterate in (for different reasons).
> Nowadays we have Go and Rust, both of which are pretty easy to iterate in (for different reasons).
It's common for Rust to become very difficult to iterate in.
https://news.ycombinator.com/item?id=40172033
I think Java was the main one. C/C++ are (relatively) close to the metal, system-level languages with explicit memory management - and were tacitly accepted to be the "complicated" ones, with dynamic typing not really applicable at that level.
But Java was the high-level, GCed, application development language - and more importantly, it was the one dominating many university CS studies as an education language before python took that role. (Yeah, I'm grossly oversimplifying - sincere apologies to the functional crowd! :) )
The height of the "static typing sucks!" craze was more like a "The Java type system sucks!" craze...
For me it was more the “java can’t easily process strings” craze that made it impractical to use for scripts or small to medium projects.
Not to mention boilerplate BS.
Recently, Java has improved a lot on these fronts. Too bad it’s twenty-five years late.
The issue with moving the ship to where its passengers want it to be is that it makes it more difficult for new passengers to get on.
This is clearly seen with TypeScript and the "just use JS" movement.
Furthermore, with LLMs, it should be easier than ever to experiment in one language and use another language for production loads.
I don't think types are expensive for MVP code unless they're highly complicated (but why would you do that?). Primitives and interfaces are super easy to type and worth the extra couple of seconds.
Software quality only pays off in the long term. In the short term, garbage is quick and gets the job done.
Also, in my experience, the long term for software arrives in a couple of weeks.
PHP is a great example of the convergence of interfaces. Now they have different “PSR” standards for all sorts of things. There is one for HTTP clients, formatting, cache interfaces, etc. As long as your library implements the spec, it will work with everything else and then library authors are free to experiment on the implementation and contribute huge changes to the entire ecosystem when they find a performance breakthrough.
Types seem like a “feature” of mature software. You don’t need to use them all the time, but for the people stuck on legacy systems, having the type system as a tool in their belt can help to reduce business complexity and risk as the platform continues to age because tooling can be built to assert and test code with fewer external dependencies.
Python is ubiquitous in ML; often you have no choice but to use it.
> slowly become statically typed
They don't. They become gradually typed, which is a thing of its own.
You can keep the advantages of dynamic languages - the ease of prototyping - but also lock down stuff when you need to.
It is not a perfect union. Generally the trade-off is that you either can't achieve the same safety level as in a purely statically typed language, because you need to provide some escape hatches, or you need an extremely complex type system to capture the expressiveness of the dynamic side. Most of the time it is a mixture of both.
Still, I think this is the way to go. Not "dynamic typing won" or "static typing won", but both are useful, and having a language support both is a huge productivity boost.
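A minimal sketch of what that mixing looks like in Python (the function names are invented):

    from typing import Any

    def explore(blob):  # prototype code: no annotations, full dynamism
        return blob["items"][0]

    def harden(blob: dict[str, list[int]]) -> int:  # locked down once the shape settles
        return blob["items"][0]

    data: Any = {"items": [1, 2, 3]}  # Any is the escape hatch that weakens the guarantees
    explore(data)  # unchecked
    harden(data)   # checked at the call boundary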
> how all the popular dynamic languages have slowly become statically typed
Count the number of `Any` / `unknown` / `cast` / `var::type` in those codebases, and you'll notice that they aren't particularly statically typed.
The types in dynamic languages are useful for checking validity in the majority of cases, but can easily be circumvented when the types become too complicated.
It is somewhat surprising that dynamic languages didn't go the pylint way, i.e. checking the codebase via auto-determined types (inferred from actual usage).
Julia (by default) does the latter, and it's terrible. It makes it a) slow, because you have to do nonlocal inference through entire programs, b) impossible to type check generic library code where you have no actual usage, c) very hard to test that some code works generically, as opposed to just with these concrete types, and finally d) break down whenever you have an Any anywhere in the code, so the chain of type information is broken.
In the discussion of static vs dynamic typing, solutions like TypeScript or annotated Python were not really considered.
IMHO, the idea of a complex and inference-heavy type system that is mostly useless at runtime and compilation, but focused on essentially interactive linting, is relatively recent, and its popularity is due to TypeScript's success.
I think that static typing proponents were thinking of something more along the lines of Haskell/OCaml/Java rather than a type-erased system on top of a language where [1,2] > "0" is true because it is evaluated as "1,2" > "0".
OTOH, I only came to realize that I actually like duck typing in some situations when I tried to add type hints to one of my Python projects (and then removed them again, because the actually important types consisted almost entirely of sum types, and what's the point of static typing if everything is a variant anyway? See the sketch below).
E.g. when Python is used as a 'scripting language' instead of a 'programming language' (like for writing small command-line tools that mainly process text), static typing often just gets in the way. For bigger projects where static typing makes sense I would pick a different language. Because tbh, even with type hints Python is a lousy programming language (but a fine scripting language).
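To illustrate the sum-type point (a hypothetical sketch, assuming Python 3.10+ syntax): when the domain is genuinely "any of these", every signature degenerates into the same wide union, and the checker can rule out very little:

    # One recursive alias ends up annotating nearly everything:
    Value = int | float | str | list["Value"] | dict[str, "Value"] | None

    def process(v: Value) -> Value:
        ...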
> Because tbh, even with type hints Python is a lousy programming language (but a fine scripting language).
I'd be interested in seeing you expand on this, explaining the ways you feel Python doesn't make the cut for programming language while doing so for scripting.
The reason I say this is because, intuitively, I've felt this way for quite some time but I am unable to properly articulate why, other than "I don't want all my type errors to show up at runtime only!"
Learn how to use the tools to prevent that last paragraph.
Note 1: Type hints are hints for the reader. If you cleverly discovered that your function handles any type of data, hint that!
Note 2: From my experience, in Java, I have NEVER seen a function that explicitly consumes an Object. In Java, you always name things, maybe with parametric polymorphism to capture complex typing patterns.
Note 3: Unfortunately, you cannot subclass String to capture the semantics of its content.
> In Java, I have NEVER seen a function that explicitly consumes an Object
So you did not see any Java code from before version 5 (in 2004) then, because the language did not have generics for the first several years it was popular. And of course many were stuck working with older versions of the language (or variants like mobile Java) without generics for many years after that.
Exactly: I have never seen such code [*].
Probably because the adoption of generics has been absolutely massive over the last 20 years. And I expect the same thing to eventually happen with TypeScript and [typed] Python.
[*]: nor have I seen EJB1 or even EJB2. Spring simply stormed past them over the last 20 years.
An example of a function in Java that consumes a parameter of type Object is System.out.println(Object o)
Many such cases.
Sounds more like a symptom of the kinds of programs and functions you have written than something inherent about types or Python. I've never encountered the kind of gerry-mangled scenario you describe, no matter how throwaway the code is.
If you like dynamic typing, have you considered using protocols? They exist precisely to type duck-typed code.
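A short sketch of how protocols type duck-typed code structurally (the class names are illustrative):

    from typing import Protocol

    class Quacks(Protocol):
        def quack(self) -> str: ...

    class Duck:
        def quack(self) -> str:
            return "quack"

    class Robot:
        def quack(self) -> str:
            return "beep"

    def speak(it: Quacks) -> str:  # anything with a matching quack() is accepted;
        return it.quack()          # no inheritance from Quacks is required

    speak(Duck())
    speak(Robot())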
AI tab-complete & fast LSP implementations made typing easy. The tools changed, and people changed their minds.
JSON's interaction with types is still annoying. A deserialized JSON value could be any type. I wish there were a standard Python library that deserialized all JSON into dicts, with opinionated coercion of the other types. Yes, a custom normalizer is 10 lines of code. But custom implementations run into the '15 competing standards' problem.
Actually, there should be a popular type-coercion library that deals with a bunch of these annoying scenarios. I'd evangelize it.
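For the record, a minimal sketch of such a normalizer (the coercion rules here are one opinionated choice among many):

    import json
    from typing import Any

    def load_normalized(text: str) -> dict[str, Any]:
        """Parse JSON and coerce the result into a dict, opinionatedly."""
        value = json.loads(text)
        if isinstance(value, dict):
            return value
        if isinstance(value, list):
            return {str(i): v for i, v in enumerate(value)}  # wrap arrays
        return {"value": value}  # wrap scalars

    load_normalized('[1, 2]')  # {'0': 1, '1': 2}
    load_normalized('"hi"')    # {'value': 'hi'}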
> all the popular dynamic languages have slowly become statically typed
I’ve heard this before, but it’s not really true. Yes, maybe the majority of JavaScript code is now statically typed, via TypeScript. Some percentage of Python code is (I don’t know the numbers). But that’s about it.
Very few people are using static typing in Ruby, Lua, Clojure, Julia, etc.
Types become very useful when the code base reaches a certain level of sophistication and complexity. It makes sense that for a little script they provide little benefit, but once you are working on a code base with 5+ engineers and no longer understand every part of it, having stricter guarantees and defined interfaces is very, very helpful - both for communicating with other devs and for eradicating a good chunk of the errors that happen when interfaces are unclear.
How many people are using Ruby, Lua, Clojure, Julia, etc.?
Fair enough, apart from Ruby they’re all pretty niche.
OTOH I’m not arguing that most code should be dynamically-typed. Far from it. But I do think dynamic typing has its place and shouldn’t be rejected entirely.
Also, I would have preferred it if Python had concentrated on being the best language in that space, rather than trying to become a jack-of-all-trades.
I have my doubts about majority of JavaScript being TypeScript.
You’re probably right. RedMonk [0] shows JavaScript and TypeScript separately and has the former well above the latter.
[0] https://redmonk.com/sogrady/2025/06/18/language-rankings-1-2...
Even if projects aren't written in TypeScript, there are usually add-on definitions like "@types/prettier" and the like.
I disagree for Julia, but that probably depends on the definition of static typing.
For the average Julia package, I would guess that most types are statically known at compile time, because dynamic dispatch is detrimental to performance. I consider that to be the definition of static typing.
That said, Julia functions seldom use concrete types and are generic by default. So the function signatures often look similar to untyped Python, but in my opinion this is something entirely different.
At least in Ruby there are major code bases using Stripe's Sorbet and the official RBS standard for type hints. Notably, it's big code bases with large numbers of developers, fitting the trend most people in this discussion point to.
My last job was working at a company that is notorious for Ruby and even though I was mostly distant from it, there seemed to be a big appetite for Sorbet there.
The big difference between static typing in Python and Ruby is that Guido et al have embraced type hints, whereas Matz considers them to be (the Ruby equivalent of) “unpythonic”. Most of each language’s community follows their (ex-)BDFL’s lead.
PHP as well has become statically typed.
All the languages you name are niche languages compared to Python, JS (/ TS) and PHP. Whether you like it or not.
I think you're ignoring how, for some of us, gradual typing is a far better experience than languages with static types.
For example, what I like about PHPStan (tacked-on static analysis through comments) is that it offers so much flexibility when defining type constraints. You can even specify the literal values a function accepts besides the base type, and subtype nested array structures (basically support for comfortably typing out the nested structure of a JSON document the moment I decode it).
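For comparison, Python's gradual typing has grown similar constructs (a sketch; the names are made up):

    from typing import Literal, TypedDict

    class Payload(TypedDict):  # types the nested shape of a decoded JSON document
        kind: Literal["user", "admin"]  # only these literal values are accepted
        tags: list[str]

    def handle(p: Payload, mode: Literal["strict", "lax"]) -> None: ...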
Not ignoring, I just didn't write an essay. In all that time working with TypeScript there was very little that I found to be gradually typed; it was either nothing or everything, hence my original comment. Sure, some things might throw in a bunch of any/unknown types, but those were very much the rarity, and often some libraries were using incredibly complicated type definitions to make them as tight as possible.
Having worked with Python, TypeScript, and now PHP: it seems that PHPStan allows this gradual typing, while TypeScript kind of forces you to start with strict mode in serious projects.
Coming from Java's extreme verbosity, I just loved the freedom of Python 20 years ago. Working with complex structures with mixed types was a breeze.
Yes, it was your responsibility to keep track of correctness, but that also taught me to write better code, and better tests.
Writing tests is harder work than writing the equivalent number of type hints, though.
Type hints and/or stronger typing in other languages are not good substitutes for testing. I sometimes worry that teams with strong preferences for strong typing have a false sense of security.
People write tests in statically typed languages too, it's just that there's a whole class of bugs that you don't have to test for.
Hints are not sufficient, you’ll need tests anyway. They somewhat overlap.
Writing and maintaining tests that just do type checking is madness.
Dynamic typing also gives tooling such as LSPs and linters a hard time figuring out completions, reference lookups, etc. I can't imagine how people work on moderate-to-big projects without type hints.
Type hints / gradual typing is crucially different from full static typing though.
It’s valid to say “you don’t need types for a script” and “you want types for a multi-million LOC codebase”.
Static typing used to be too rigid and annoying, to the point of being counterproductive. After decades of improvement in parsers and IDEs, statically typed languages finally became usable for rapid development.
Everything goes in cycles. It has happened before and it will happen again. The software industry is incredibly bad at remembering lessons once learned.
That's because many are doing small things that don't really need it; sure, there are some people doing larger stuff who are happy to be the sole maintainer of a codebase, or to replace the language's types with unit-test type checks.
And I think they can be correct to reject it: banging out a small useful project (preferably below 1000 loc) flows much faster if you just build code doing things rather than start annotating (which can quickly become a mind-sinkhole of naming decisions that interrupts a building flow).
However, even less complex 500-loc+ programs without typing can become a pita to read after the fact, and approaching 1kloc they can become a major headache to pick up again.
Basically, you can't beat the speed of going nude, but size + complexity is always an exponential factor in how hard continuing and/or resuming a project is.
Thing is, famous dynamic languages of the past - Common Lisp, BASIC, Clipper, FoxPro - all got type hints for a reason. Then came a new generation of scripting languages made into application languages, and everyone had to relearn why the fence was in the middle of the field.
I think both found middle ground. In Java you don’t need to define the type of variables within a method. In Python, people have learned that types on method arguments are a good thing.
> After decades of people saying
You have to admit that the size and complexity of the software we write has increased dramatically over the last few decades. Looking back at MVC "web applications" I created in the early 2000s, and comparing them to the giant workflows we deal with today... it's not hard to imagine how dynamic typing was/is OK to get started, but when things exceed one's "context", type hints help.
I like static types, but advocating for enforcing them in every situation is different. Adding them when you need them (Python's current approach) seems a better strategy than being forced to set them always (TypeScript is in between, as it can often infer them).
Many years ago I felt Java's typing could be overkill (some types could have been deduced from context, and they were too long to write), so that was probably more an issue of the maturity of the tooling than anything else.
What I would need is a statically typed language that has first class primitives for working with untyped data ergonomically.
I do want to be able to write a dynamically typed function or subsystem during the development phase, and "harden" it with types once I'm sure I got the structure down.
But the dynamic system should fit well into the language, and I should be able to easily and safely deal with untyped values and convert them to typed ones.
So… Typescript?
Yes. The sad part is that some people experienced early TypeScript, which for some reason had the idea of forcing "class" constructs into a language where most people weren't using or needing them (and still aren't).
Somewhere around TypeScript 2.9, it finally started adding constructs that made gradual typing of real-world JS code sane, but by then there was a stubborn perception of it being bad/bloated/Java-ish, etc., despite it maturing into something fairly great.
The need for typing changed when the way the language is used changed.
When JavaScript programs were a few hundred lines to add interactivity to some website, type annotations were pretty useless. Now the typical JavaScript project is far larger and far more complex. The same goes for Python.
Trends change. There is still no hard evidence that static types are net positive outside of performance.
Dynamically typed languages were typically created for scripting tasks, but ended up going viral (in part due to dynamic typing). The community stretched each language to its limits and pushed it into markets it wasn't designed/thought for (embedded Python, server-side JS, distributed teams, dev outsourcing, etc.).
Personally, I like the dev-sidecar approach to typing that Python and JS (via TS) have taken to mitigate the issue.
JavaScript is no longer just scripting. Very large and complex billion-dollar apps were being written in pure JavaScript. It grew up.
I guess Python is next.
Next stop is to agree that JSON is really NOT the semantic data exchange serialization for this "properly typed" world.
Then what is?
Everybody knows the limitations of JSON. Don't state the obvious problem without stating a proposed solution.
The RDF structure is a graph of typed instances of typed objects, serializable as text.
Exchanging RDF, more precisely its [more readable] "RDF/turtle" variant, is probably what will eventually come to the market somehow.
Each object in an RDF structure has a globally unique identifier, is typed, maintains typed links with other objects, and has typed values.
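A tiny illustrative Turtle snippet (the IRIs are made up):

    @prefix ex: <http://example.org/> .
    @prefix foaf: <http://xmlns.com/foaf/0.1/> .

    ex:barack a foaf:Person ;        # globally identified and typed
        foaf:name "Barack Obama" ;   # typed literal value
        foaf:knows ex:michelle .     # typed link to another object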
For an example of RDF being exchanged between a server and a client, you can test
https://search.datao.net/beta/?q=barack%20obama
Open your javascript console, and hover the results on the left hand side of the page with your mouse. The console will display which RDF message triggered the viz in the center of the page.
Update: you may want to FIRST select the facet "DBPedia" at the top of the page, for more meaningful messages exchanged.
Update 2: the console does not do syntax highlighting, so here is the highlighted RDF https://datao.net/ttl.jpg linked to the 1st item of " https://search.datao.net/beta/?q=films%20about%20barack%20ob... "
That's a circular argument. What serialization format would you recommend? JSON?
Turtle directly.
JSON forces you to fit your graph of data into a tree structure, which poorly captures the cardinalities of the original graph.
Plus, of course, the concept of an object type does not exist in JSON.
Thank you, I did not realize that RDF has its own serialization format. I'm reading about it now.
Huh. It's almost like these people didn't know what they were talking about. How strange.
I think that the practically available type checkers have evolved to a point where many of the common idioms can be expressed with little effort.
If one thinks back to some of the early statically typed languages, you had a huge rift: either the entirely weird world of Caml and Haskell (which could express most of what Python type hints have, and had been able to for many years), or something like C, in which types are merely compiler hints, tbh. Early Java may have been a slight improvement, but eh.
Now, especially with decent union types, you can express a lot of idioms of dynamic code easily. It's a fairly painless way to get type completion in an editor, so one does that.
Well, we do coalesce on certain things... some statically typed languages are dropping type requirements (Java and `var` in certain places) :D
There's no dropping of type requirements in Java, `var` only saves typing.
When you use `var`, everything is as statically typed as before, you just don't need to spell out the type when the compiler can infer it. So you can't (for example) say `var x = null` because `null` doesn't provide enough type information for the compiler to infer what's the type of `x`.
> `var` only saves typing.
this is a lovely double entendre
var does absolutely nothing to make Java a less strictly typed language. There is absolutely no dropping of the requirement that each variable has a type which is known at compile time.
Automatic type inference and dynamic typing are totally different things.
I have not written a line of Java in at least a decade, but does Java not have any 'true' dynamic typing like C# does? Truth be told, the 'dynamic' keyword in C# should only be used in the most niche of circumstances. Typically, only practitioners of Dark Magic use the dynamic type. For the untrained, it often leads one down the path of hatred, guilt, and shame. For example:
    dynamic x = "Forces of Darkness, grant me power";
    Console.WriteLine(x.Length); // Dark forces flow through the CLR
    x = 5;
    Console.WriteLine(x.Length); // Runtime error (RuntimeBinderException): CLR consumed by darkness.
C# also has the statically typed 'object' type which all types inherit from, but that is not technically a true instance of dynamic typing.
The same nonsense repeated over and over again... There are no "dynamic languages"; that's not a thing. The static types aren't what you think they are... You just don't know what you are saying, and your conclusion is just a word salad.
What happened to Python is that it used to be a "cool" language, whose community liked to make fun of Java for its obsession with red tape, which included the love of specifying unnecessary restrictions everywhere - just like you'd expect from a poorly functioning government office.
But then everyone wanted to be cool, and Python was adopted by the programming analogue of government bureaucrats: large corporations which treat programming as a bureaucratic mill. They don't want fun or creativity or one-off bespoke solutions. They want an industrial process that works on as large a scale as possible, one that can employ thousands of the worst-quality programmers and still reliably produce slop.
And incrementally, Python was made into Java. Because, really, Java is great for producing slop on an industrial scale. But the "cool" factor was important for attracting talent back when there was a shortage, so now you have a Python that was remade into a Java. People who didn't enjoy Java left Python over a decade ago, so the Python of today has nothing in common with what it was when it was "cool". It's still a worse Java than Java, but people don't like to admit defeat, and... well, there's also the sunk cost fallacy: so much effort was already spent on making Python into a Java that it seems like a good idea to spend even more effort trying to make it a better Java.
Yeah, this is the lens through which I view it. It's a sort of colonization that happens, when corporations realize a language is fit for plunder. They start funding it, then they want their people on the standards boards, then suddenly the direction of the language is matched very nicely to their product roadmap. Meanwhile, all the people who used to make the language what it was are bought or pushed out, and the community becomes something else entirely.