Conversely, whenever I see people talking about server-side JS, I can't find any reason why I wouldn't use PHP instead.
PHP has a vastly simpler toolchain (making it much more effective for rapid iteration), much more consistent and well-thought-out syntax, a more extensive standard library, type safety without having to transpile code from another language (so no build processes that rival C++ in complexity just to still have interpreted code at the end), native and full-featured object orientation, generally better runtime performance, and a package ecosystem with Composer that isn't overrun with inane vanity projects and supply-chain vulnerabilities.
The only major downside to PHP is that it's not great at multithreading, but if you're building microservices where parallelization is handled by an external orchestrator, then you can design around that pretty effectively.
PHP's performance can be significantly lower than JS, because it doesn't have application state (in a standard runtime/setup) and needs to re-run the entire application for every request. Now there are a whole bunch of tricks both in the language and with tooling to alleviate that, but still it's inherently there. It's an advantage for other reasons though.
There are advantages to the lack of application state, though. Memory leaks and similar bugs become largely irrelevant, for instance. Regarding performance, a simple LAMP stack on a dedicated machine can easily give you <250ms page loads for many web apps. If that's not fast enough, or you're averaging dozens or hundreds of requests per second, you're probably big enough that you can use parallelization or more exotic architectures to speed things up.
A simple LAMP stack can give you <1ms page loads for most applications. Even 15 years ago you could serve over 100k requests per second of lightweight PHP pages from a low-end server.
> PHP's performance can be significantly lower than JS, because it doesn't have application state (in a standard runtime/setup) and needs to re-run the entire application for every request.
Sure, in PHP, the reality is that after your request is processed, all the state is garbage and is thrown out. But once you embrace that reality and stop trying to make sculpture from garbage, you can make some pretty damn fast pages that get straight to the point. Of course, a lot of people look at my fast PHP and say that it too is garbage, but at least it's fast garbage :P
I agree that PHP's request oriented "shared nothing" approach has its advantages.
That being said, there are very decent options for long-running processes/application servers these days - see RoadRunner, Swoole, and FrankenPHP.
That ceased to be true a while ago since the ecosystem gravitated towards FrankenPHP, a stateful application server written in Go as a Caddy module. The performance is amazing, the Go Bridge allows easy extension, and it’s rock solid.
> because it doesn't have application state (in a standard runtime/setup) and needs to re-run the entire application for every request.
Where "application" is basically a single page with less code than a typical React page. Even 20 years ago you'd run into DB struggling to give you data fast enough before you hit any issues with the "re-running the entire app".
And you have to screw up your database really badly to see any issues early. Hell, phpBB was horrendously bad, running dozens of heavy DB queries on each page, and it still powered some of the internet's busiest forums.
> Now there are a whole bunch of tricks both in the language and with tooling to alleviate that, but still it's inherently there. It's an advantage for other reasons though.
Yes. It is an enormous advantage: it's fire and forget. You don't need to "SSR" your app (getting all data and state), ship it to the client with a bundle, then "re-hydrate" it (once again pulling data and state from the server) etc.
> type safety without having to transpile code from another language (so no build processes that rival C++ in complexity just to still have interpreted code at the end)
All major server-side JS runtimes are capable of executing TypeScript without transpilation these days. Complex build processes are only really a thing for client-side JS. The normal state of affairs these days is to run server-side code with no build step whatsoever.
That's true to some extent, but working with many modern frameworks often still requires a build step. Next.js in particular has quite a heavy toolchain.
It's true that the lack of multithreading in PHP has been a persistent pain. I love PHP and I've done PHP-centric projects for 20 years, but I end up falling back to Go when it's time to write a background worker that will handle lots of tasks in parallel. I really wish my PHP apps could include native components that process background tasks in parallel.
On the other hand, JavaScript's parallelization is one of the hardest-to-understand things I've ever seen: to really grasp the async/await model, you have to know the whole story of callback hell and how modern JavaScript papers over it while carefully preserving backwards compatibility.
> It's true that the lack of multithreading in PHP has been a persistent pain.
No actually it's a joy to have no multithreading. It keeps the complexity budget lower. You can do multithreading through a custom PHP module if you have a very specific need. Maybe my requirements were too simple, but I've never really felt the need. The shared-nothing PHP architecture really helps you get away with this.
Anyways as the parent comment said:
> but if you're building microservices where parallelization is handled by an external orchestrator, then you can design around that pretty effectively.
I feel like I'm on a different planet when I see this kind of comment.
What if you need to call multiple external APIs at once with complex JSON? Sure, you can call them one after another, but if each takes (say) 2s to return (not uncommon IME), then you are in real trouble with only one thread - even if it is just for one "request".
I guess I'm spoilt in .NET with Task.WhenAll which makes it trivial to do this kind of stuff.
This can be done with curl_multi_exec(), or with $client->getAsync() in Guzzle.
https://www.php.net/manual/en/function.curl-multi-exec.php
https://docs.guzzlephp.org/en/stable/quickstart.html#concurr...
ReactPHP is another good option.
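For anyone who hasn't used curl_multi before, here's a minimal dependency-free sketch (the `fetchAll` helper and its URLs are illustrative, not from any library): the handles are registered up front, driven to completion together, and then read back, so total wall time is roughly the slowest call rather than the sum of all of them.

```php
<?php
// Fetch several URLs concurrently with curl_multi (PHP stdlib, no Composer).
function fetchAll(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until they finish, sleeping in
    // curl_multi_select() instead of busy-looping.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

Guzzle's `getAsync()` gives you the same shape with a nicer promise API layered over the same machinery.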
> What if you need to call multiple external APIs at once with complex json?
A few years ago, I had a PHP project that had grown by accretion from taking a single complex input and triggering 2-3 external endpoints to eventually making calls to about 15 sequentially. Processing a single submission went from taking 5-10 seconds to over five minutes.
This was readily solved by moving to ReactPHP (https://reactphp.org/), which implements async via event loops. I was able to reduce the 15 sequential external API calls to four sequential loops (which was the minimum number due to path dependencies in the sequence of operations). That reduced the five minutes back to an average of 20-30 seconds for the complete process.
It wasn't using true multithreading, but in a situation where most of the time was just waiting for responses from remote servers, an event loop solution is usually more than sufficient.
Yeah, though AFAIK these event loops still suffer from blocking on (e.g.) complex JSON parsing or anything CPU-bound (where real multithreading shines).
But regardless, I agree. I'm just saying that these kinds of patterns _are_ needed in any moderately complex system, and taking the view that "it's great not to even have it" in the core framework is really strange to me. Especially given that every machine I have these days has >10 CPU threads, and it won't be long before 100+ is normal.
> Yeah, though AFAIK these event loops still suffer from blocking on (e.g.) complex JSON parsing or anything CPU-bound (where real multithreading shines).
This is only a problem if the JSON parsing is being done inside the event loop itself. The idea here is that you'd have a separate JSON-parser service that the code in the event loop passes the JSON into, then continues executing the other operations in the loop while it awaits the response from the JSON parser.
Just translate anything you'd spawn a parallel thread for into something you'd pass to a separate endpoint -- that's what I was referring to when I said that the poor multithreading can be easily worked around if you're achieving parallelization by orchestration of microservices.
> No actually it's a joy to have no multithreading.
To build CPU-bound applications in PHP, you have to install a bunch of packages, rely on Redis, and try to approximate what Python or Go can do in a dozen lines of code. Can that really be enjoyable?
> It's true that the lack of multithreading in PHP has been a persistent pain.
That's... not necessarily a bad thing to lack. Entire classes of bugs that are common in Java, C/C++, .NET and other true multi-threaded environments simply cannot exist in the PHP world at all.
It seems like you’re talking about something similar: https://true-async.github.io/en/docs/components/thread-pool....
I’ve also wanted parallelism in PHP for a long time. Especially the kind of parallelism that works together with concurrent I/O. That’s when it becomes truly useful, not just a toy.
Asynchrony will always be a complex tool because of the number of states involved. That’s true. But if it’s made convenient and used properly, it becomes enjoyable to work with.
I think FrankenPHP checks a lot of boxes here: Go handles the threads, with PHP built in. https://frankenphp.dev/
As someone who prefers PHP in general and finds that the TC39 committee has kneecapped the JS language in the past few years...
> PHP has a vastly simpler toolchain
Firmly disagree.
You can install Node and have a basic server running in a few seconds.
PHP requires installing and setting up a server tied into FPM and then reconfiguring a slurry of bad defaults. If you don't avoid the footgun of "traditional" deployments, you get to deal with mixed versions of source. If you don't avoid the footgun of "native" sessions, you get to deal with __PHP_Incomplete_Class errors galore.
And if you want a dynamic frontend, you're still going to want to bust out JS.
> I can't find any reason why I wouldn't use PHP instead
Using a single language for both frontend and backend with (largely) the same availability of tooling and the ability to share code (i.e. type definitions).
> generally better runtime performance
I find this hard to believe? Intuitively, I would assume that the Node/Bun engines are significantly faster than PHP - which doesn't even come with its JIT enabled by default, on the (perfectly valid) grounds that backends are almost always constrained by DB times.
> a package ecosystem with Composer that isn't overrun with inane vanity projects and supply-chain vulnerabilities.
Functionally, Composer is worse than any of the NPM package managers IMO. PHP's global, shared namespace - which rules out monkey patching and nested dependencies - is a huge burden when you need to use Lib A and Lib B but both have conflicting dependencies on Lib C.
But the only reason it doesn't suffer (as many) supply chain issues is two-fold:
1. Packagist's source of truth is the repo and tags. It's much easier to notice a GitHub account being compromised than an NPM one, partly because GitHub has always had better account security expectations. But this comes at a cost - such as being unable to rename a package in-place, because Composer gets really confused when it clones a trunk that has a different name than what you asked for. And it's not intrinsically more secure, since tags are not immutable and people can host on less secure VCS platforms.
2. But more than that... it's just less used? The PHP ecosystem is noticeably smaller and has way less happening.
So it's very much a set of trade-offs.
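To make the Lib C problem concrete (the package names here are hypothetical): Composer installs exactly one version of each package into the shared vendor/ tree, so if `acme/lib-a` requires `acme/lib-c:^1.0` while `acme/lib-b` requires `acme/lib-c:^2.0`, a root composer.json like this fails dependency resolution outright, where npm would quietly nest two copies of lib C:

```json
{
    "require": {
        "acme/lib-a": "^1.0",
        "acme/lib-b": "^1.0"
    }
}
```

Your options at that point are pinning older releases, forking one of the libraries, or waiting on upstream - there's no per-dependency isolation to fall back on.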
> You can install Node and have a basic server running in a few seconds. PHP requires installing and setting up a server tied into FPM...
Not to mention, the PHP equivalent of your Node example is just `php -S`.
Or FrankenPHP, or hell, there's still even good old Apache. Or avoid the SAPI interface entirely with servers in PHP like Workerman, AMPHP, or Swoole. FPM is entirely too fussy for me to bother with: its error handling is atrocious (restarting in an infinite loop with no backoff is common), and no one really knows how to tune it.
Those are great solutions for production deployment, but to the previous commenter's point, for iterating on your local machine during development work, nothing beats just running `php -S`.
Launch the interpreter's built-in dev server in your project directory, load up localhost in your browser, and work on your code; testing incremental changes locally is just a matter of hitting F5.
By default the dev server is single-threaded, but since PHP 7.4 you can add more workers with an environment variable: `PHP_CLI_SERVER_WORKERS`.
I’ve been using PHP for over a decade and have never used FPM.
“Using a single language for both frontend and backend with (largely) the same availability of tooling and the ability to share code”
Is a negative, in my view. I want the backend world and the frontend world to be different, because they do very, very different things.
“But more than that... it's just less used? The PHP ecosystem is noticeably smaller and has way less happening.”
That’s not true. PHP is less about resume-driven development and more about actual productivity. I’m really happy narcissists hate PHP and don’t burden it with their garbage and slopwork.