> It's true that the lack of multithreading in PHP has been a persistent pain.
No, actually it's a joy to have no multithreading. It keeps the complexity budget lower. You can do multithreading through a custom PHP extension if you have a very specific need. Maybe my requirements were too simple, but I've never really felt the need. PHP's shared-nothing architecture really helps you get away with this.
Anyway, as the parent comment said:
> but if you're building microservices where parallelization is handled by an external orchestrator, then you can design around that pretty effectively.
I feel like I'm on a different planet when I see this kind of comment.
What if you need to call multiple external APIs at once with complex JSON? Sure, you can call them one after another, but if each takes (say) 2s to return (not uncommon IME), then you are in real trouble with only one thread, even if it is just for one "request".
I guess I'm spoilt in .NET with Task.WhenAll which makes it trivial to do this kind of stuff.
This can be done with curl_multi_exec(), or with $client->getAsync() in Guzzle.
https://www.php.net/manual/en/function.curl-multi-exec.php
https://docs.guzzlephp.org/en/stable/quickstart.html#concurr...
ReactPHP is another good option.
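For instance, here's a minimal curl_multi sketch (`fetch_all` is my own helper name, not a library function, and error handling is omitted) that drives several requests concurrently, so three 2s endpoints finish in roughly 2s rather than 6s:

```php
<?php
// Sketch: fetch several URLs concurrently with curl_multi.
// fetch_all() is a hypothetical helper name; error handling is minimal.
function fetch_all(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers; total wall time is roughly the slowest
    // single request, not the sum of all of them.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait until there is activity
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

Guzzle's `getAsync()` plus `Utils::all()` gives you roughly the same thing with a nicer API, built on this same curl_multi machinery.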
> What if you need to call multiple external APIs at once with complex JSON?
A few years ago, I had a PHP project that had grown by accretion from taking a single complex input and triggering 2-3 external endpoints to eventually making calls to about 15 sequentially. Processing a single submission went from taking 5-10 seconds to over five minutes.
This was readily solved by moving to ReactPHP (https://reactphp.org/), which implements async via event loops. I was able to reduce the 15 sequential external API calls to four sequential loops (which was the minimum number due to path dependencies in the sequence of operations). That reduced the five minutes back to an average of 20-30 seconds for the complete process.
It wasn't using true multithreading, but in a situation where most of the time was just waiting for responses from remote servers, an event loop solution is usually more than sufficient.
Yeah, though AFAIK these event loops still block on (e.g.) complex JSON parsing or anything CPU-bound (which is where real multithreading shines).
But regardless, I agree. I'm just saying that these kinds of patterns _are_ needed in any moderately complex system, and taking the view that "it's great not to even have it" in the core framework is really strange to me. Especially given that every machine I have these days has >10 CPU threads, and it won't be long before 100+ is normal.
> Yeah, though AFAIK these event loops still block on (e.g.) complex JSON parsing or anything CPU-bound (which is where real multithreading shines).
This is only a problem if the JSON parsing is being done inside the event loop itself. The idea here is that you'd have a separate JSON-parser service that the code in the event loop passes the JSON into, then continues executing the other operations in the loop while it awaits the response from the JSON parser.
Just translate anything you'd spawn a parallel thread for into something you'd pass to a separate endpoint -- that's what I was referring to when I said that the poor multithreading can be easily worked around if you're achieving parallelization by orchestration of microservices.
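A local stand-in for that hand-off, to make the idea concrete: push the CPU-heavy decode into a child PHP process and only block when you actually need the result. This is just a sketch with made-up helper names (`parse_json_in_child`, `collect_result`); in a real microservices setup the child process would be a separate parsing service or worker queue.

```php
<?php
// Sketch: offload CPU-heavy JSON parsing to a child PHP process
// so the caller can keep doing other work in the meantime.
// parse_json_in_child()/collect_result() are hypothetical names.
function parse_json_in_child(string $json): array
{
    $spec = [
        0 => ['pipe', 'r'], // child's stdin
        1 => ['pipe', 'w'], // child's stdout
    ];
    // The child decodes the JSON and serializes the result back to us.
    $code = 'echo serialize(json_decode(stream_get_contents(STDIN), true));';
    $proc = proc_open([PHP_BINARY, '-r', $code], $spec, $pipes);
    fwrite($pipes[0], $json);
    fclose($pipes[0]);
    return [$proc, $pipes[1]];
}

function collect_result(array $handle)
{
    [$proc, $stdout] = $handle;
    $out = stream_get_contents($stdout); // blocks only here, when we need the result
    fclose($stdout);
    proc_close($proc);
    return unserialize($out);
}
```

Usage would look like `$h = parse_json_in_child($payload); /* keep servicing the event loop */ $data = collect_result($h);` -- same shape as awaiting a response from a separate JSON-parser endpoint.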
> No, actually it's a joy to have no multithreading.
To build CPU-bound applications in PHP, you have to install a bunch of packages, rely on Redis, and try to approximate what Python or Go can do in a dozen lines of code. Can that really be enjoyable?