All the examples I find are like the trivial ones here, where it feels like, instead of jamming a bunch of code into a single messy function that yields, you'd be better off (particularly from a static-analysis standpoint) just having ordered method calls, or chained callables where each step returns the next step as a callable.
I've yet to see a use case where I can't come up with a safer way to do it without Fibers, but I would love if someone could enlighten me because I feel like I am absolutely missing something.
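To make the alternative concrete, here is a minimal sketch of the "each step returns the next step" pattern. All the names here are invented for illustration:

```php
<?php
// Hypothetical sketch: each step is an ordinary named function that
// returns the next step as a callable (or null when done), so static
// analysis can follow the whole chain.
function stepOne(array &$log): ?callable {
    $log[] = 'step one';
    return 'stepTwo';      // hand control to the next step
}

function stepTwo(array &$log): ?callable {
    $log[] = 'step two';
    return null;           // null ends the pipeline
}

$log  = [];
$next = 'stepOne';
while ($next !== null) {
    $next = $next($log);   // run the current step, get the next one
}
// $log now holds ['step one', 'step two']
```

Each step is individually testable and type-checkable, which is the static-analysis advantage over one long yielding function.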
That was the first and only time they were kinda useful to me.
They were added to PHP by the maintainers of amphp (https://amphp.org), which is the best library for async PHP out there, providing a clean, object-oriented and async API for files, databases and everything that can make use of async I/O.
There is however the 'yield from' statement.
$a = 5;
$b = fn ($c) => $a * $c; // arrow functions capture $a from the enclosing scope by value
print($b(2)); // 10
print($b(4)); // 20
function numbers() {
yield from [1,2,3];
yield from [4,5,6];
}
foreach (numbers() as $k => $v) { echo "$k => $v\n"; }
0 => 1
1 => 2
2 => 3
0 => 4
1 => 5
2 => 6
It's literally in the interface.
Having Fibers in PHP is a nice addition but it definitely feels more like plumbing for other PHP extensions/frameworks to use, rather than something the average dev would use themselves directly day to day.
There's an evaporative cooling effect on languages that don't have certain features where all the people who need those features leave after some number of years of not having them, leaving behind only the people who don't need them. There's a survivorship bias as a result.
I worked with dynamic scripting languages primarily for the first 15 years of my career but the definitive split for me was precisely when I had a problem they couldn't solve because they couldn't handle tens of thousands of simultaneous connections in any reasonable way.
This reply applies to all the commenters posting "but I've never needed this".
That doesn't mean PHP was useless without this feature, as it observably has solved a lot of problems. It just means that you shouldn't draw out too many conclusions from "I've never needed it", because you're still using it precisely because you're working in a space that doesn't need it, and your community is not constantly complaining about it because the people who would be complaining are no longer in your community. It means if you do develop a need for it in the future you won't have to leave PHP.
As a result this is a bad metric to measure the utility of a feature with.
On the flip side, it is valid to say "we've come this far without it and maybe we should be focusing on what we can do and continuing to invest in making that better rather than chasing the things we can't", especially since features like adding true threading add a lot of constraints to an implementation and make everything else you ever do in that implementation more expensive. Personally I am of the opinion that all of the dynamic scripting languages really need to just settle down, accept that they are what they are, and stop adding feature after feature after feature to 25-30 year-old languages.
To be clear, my point wasn’t that I think fibers are useless or that people shouldn’t use them. I think it was a great addition.
Just that they wouldn’t be directly useful to the average PHP dev until they’re being used by frameworks/libraries/extensions: say, for example, an HTTP library that can fire off multiple requests asynchronously.
"Just that they wouldn’t be directly useful to the average PHP dev,"
But you're restating my point, whether you realize it or not. They're not useful to the "average PHP dev" because if the "average PHP dev" needed them, they would cease to be a PHP dev, just as I ceased to be a Perl dev when I needed something it couldn't do for similar reasons. All the use cases that PHP could have with fibers have evaporatively-cooled out of the community because PHP couldn't do them, leaving behind precisely that set of people who don't have problems that could be solved with fibers.
https://stackoverflow.com/questions/12939319/coroutines-in-p...
Looks like generators were released in PHP 5.5 in 2013:
https://versionlog.com/php/5.5/
I was interested in coroutines because most backend server logic eventually devolves into a sea of state machines where each request/response advances the state by updating database rows. This becomes unmanageable by humans, which is why server code can only reach a certain level of complexity, perhaps 1 million lines, before it becomes "enterprise" and triaging overtakes architecting as the main mode of operation for developers.
This is akin to how in the 1990s, object-oriented programming (OOP) limited the size of most desktop programs to around 1 million lines, due to similar state management limits under imperative programming.
I had hoped to replace the state-machine soup of backend API endpoints with coroutines that guided users through things like their onboarding steps, as one-shot functions made up of mainstream conditional logic and higher-order methods.
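A rough sketch of what I was aiming for, using a plain PHP generator. The flow and step names are invented for illustration; each yield pauses the function until the next user input arrives:

```php
<?php
// Hypothetical sketch: an onboarding flow written as a generator instead
// of a state-machine table. Each yield suspends until the next request.
function onboarding(): Generator {
    $email = yield 'ask_email';   // pause: wait for the email step
    $plan  = yield 'ask_plan';    // pause: wait for the plan choice
    return "registered $email on $plan";
}

$flow  = onboarding();
$first = $flow->current();              // primes the generator: 'ask_email'
$flow->send('user@example.com');        // resumes past the first yield
$flow->send('pro');                     // resumes past the second yield
$result = $flow->getReturn();           // 'registered user@example.com on pro'
```

The missing piece, as described below, is persisting the suspended function between requests, which plain generators cannot do on their own.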
This history explains why most websites and apps today have so little actual business logic. Most would be considered entry-level or semester projects for desktop developers in the olden days, who were forced to wrangle the complexities of C++ and Java.
Today, most time is lost to the idiosyncrasies of managing build pipelines, version updates, boilerplate, etc. We're too mired in babysitting our tools to see how old-school approaches, like using spreadsheets and batch files in office environments, would make a mockery of our work. Now it's mostly all a waste of time, and we can feel it, but I digress.
Anyway, I had attempted to make this state machine <-> coroutine bridge by saving php functions and their state using the jeremeamia/super_closure package:
https://packagist.org/packages/jeremeamia/superclosure
https://github.com/jeremeamia/super_closure
Which gave way to opis/closure:
https://packagist.org/packages/opis/closure
https://github.com/opis/closure
Which gave way to laravel/serializable-closure:
https://packagist.org/packages/laravel/serializable-closure
https://github.com/laravel/serializable-closure
That way the coroutine would get resurrected during each user request and proceed through its logic, thereby removing the mental-load complexity limit imposed by state machines and allowing one or two developers to compete with larger enterprise teams at big companies.
Since then, I've abandoned these types of approaches and moved towards pure functional programming (less state management and fewer side effects), declarative programming (repeatable processes that eventually meet constraints imposed by integration tests), and data-driven development (higher-order methods on trees and graphs). So lots of work with spreadsheets, Terraform, Firebase, etc. I find that programming languages mostly get in the way now.
After a career mostly spent hacking on legacy code and tearing my hair out, I yearn to be free to get real work done. This would look like abandoning most approaches people are pursuing today. For example, most of the async/await stuff in JavaScript is an evolutionary dead end, because we already went down the cooperative-threading road in the 1990s and discovered that there was no there, there. Async is today's goto. Same with absurdities like the "final" keyword, which is a self-flagellation habit born from difficulties around name mangling when exporting C++ methods in object files. The compiler should reorder structures and classes to match the constraints of the runtime, not humans. Otherwise we break Postel's Law:
https://martinfowler.com/bliki/Seal.html
https://martinfowler.com/bliki/SoftwareDevelopmentAttitude.h...
https://martinfowler.com/bliki/DesignedInheritance.html
https://martinfowler.com/bliki/OpenInheritance.html
https://martinfowler.com/bliki/TolerantReader.html
With this context, we can see how large powerful companies have doubled down on the Directing Attitude and Designed Inheritance to the point that we're handcuffed to our tools. They've done little or nothing to advance the state of the art of our languages and frameworks from first principles to be more freeing by providing more leverage. For every revelation like Erlang and Go, there are countless AWSs and Reacts. Forcing us to focus on specifics like regions and edges, side effects and performance, etc. Because nobody did the real work of designing distributed systems that "just work" via techniques like true multiprocessing, memoization.. I could rant forever. It's all bare hands work now, by us serfs under neofeudalism.
Sorry this got long, it's a passion project for me, a dream I may never have time to live if the rest of my life gets lost to making rent.
To say nothing of the standard library being an unfixable mess if they want to preserve backwards compatibility. What they would need is a clean break. Something like: "PHP 10 is a breaking change; we are modernizing the base language and standard library and getting rid of decades of cruft." Which many PHP users would hate.
No, PHP is not a language whose design has what it takes. A library that claims to have such advanced stuff implemented in PHP is not to be easily trusted to be free of hassles and tripwires, because the language itself has those built in. For example, with Fibers, one has to manually call suspend and resume. Other languages get fibers done without this hassle.
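For reference, the manual suspend/resume dance looks like this (a minimal sketch; requires PHP 8.1+):

```php
<?php
// Minimal sketch of PHP's Fiber API: the caller must drive the fiber
// explicitly with start() and resume(), and the fiber body must call
// Fiber::suspend() itself to yield control.
$fiber = new Fiber(function (): string {
    $input = Fiber::suspend('paused'); // yields control back to the caller
    return "resumed with $input";
});

$state = $fiber->start();              // runs until the first suspend: 'paused'
$fiber->resume('hello');               // runs the fiber to completion
$result = $fiber->getReturn();         // 'resumed with hello'
```

Nothing here is automatic: forget a resume() and the fiber simply never finishes.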
EDIT: For all the downvoters who don't care to actually give reasons or discuss: the cracks are already showing in the Fibers manual itself. Look at the first comment, for example: https://www.php.net/manual/en/language.fibers.php#127282 which links to a bug in the Fibers repo. It is a bug (recognized via issue label), it is verified (also via label), and it is ... closed. Apparently it will not be fixed, and has simply been closed.
The point is that the designers of PHP decide again and again to add another wart rather than designing the language in an extensible way. Here a little extra, there a little extra; here a new keyword, there a new thingy. Compare that with other languages. Even JavaScript has managed to keep the same syntax for functions and anonymous functions, although it did also introduce an arrow syntax. But its arrow syntax is not really needed (even though it's shorter), in contrast to PHP, where you need it unless you want to list all the parts of the environment you need in the anonymous function. At least in JS they didn't introduce a new extra thing like "fn".
I don't think the design of PHP is done with good oversight. It is done by tacking things onto what exists in the easiest way possible, without any goal of keeping the language's concepts and keywords minimal. Choosing something like "fn" or "use" is a hack to make implementation easier: they were previously unused keywords, which makes it easier to adapt the parser, but these decisions offload mental load onto the user, which is ultimately bad design.
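For comparison, the two closure syntaxes in question side by side (a small sketch):

```php
<?php
// The classic closure syntax requires listing every captured variable
// with `use`; the arrow-function syntax (`fn`) captures from the
// enclosing scope implicitly, by value.
$factor = 3;

$classic = function (int $n) use ($factor): int {
    return $n * $factor;               // $factor must be named in `use`
};

$arrow = fn (int $n): int => $n * $factor; // $factor captured implicitly

$a = $classic(2);   // 6
$b = $arrow(2);     // 6
```

Both produce the same result; the difference is purely how the environment is captured.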
PHP 10 with a clean slate is a cute idea that would wreak havoc. Perl tried this with Perl 6 (now Raku), and it was a total disaster.
And PHP has deprecated a lot of things along the way.
What I see in PHP is a missed opportunity: it has neither native lightweight multithreading capabilities nor a robust built-in HTTP server.
I wish the situation would change.
The shared-nothing architecture of PHP makes that really a non-issue for me. Requests never share any state with each other. Something like RabbitMQ can handle communication between systems.
It is kinda funny that you mention RabbitMQ, which is written in Erlang, which is famous for its lightweight processes. Compare that approach with the thread pools built into the standard libraries of other languages; even many of those are heavyweight compared to Erlang's lightweight processes.
In other systems once you get beyond a single machine you need that external communication mechanism anyway, and now you have multiple classes of comms which introduces bugs and complexity and performance cliffs.
In PHP you just throw another server at it, it'll act the same as if you just added another process. Nice linear scaling and simple to understand architectures.
Man I miss working in PHP
You generally do not implement efficient systems in PHP; they are easy to debug, fast to code, and quick to fix, though.
Long-running processes and async I/O are a great tool to have, though. They have been present in PHP for almost two decades now, and despite many incarnations (select(), libevent, etc.) and even frameworks (amp, reactphp, etc.), the knowledge is highly transferable between them if you understand the fundamentals.
The first logical step after PHP is NodeJS, which has the fast iteration cycles of PHP without the low-level memory management or the enterprisey application server headaches of other languages, AND it has the advantages of a persistent service without needing to worry too much about parallelism because it's still single process.
But if you still need more performance you have a few options. But if you're at that point, you're already lucky. Most people wish they needed the performance or throughput of a language/environment like Go.
Most people do need the performance and throughput offered by modern languages like Go, though. Time to market is the most important consideration for most. Maybe at Facebook scale you can spend eons crafting perfection in a slow-to-develop language/ecosystem like PHP or NodeJS, but most people have to get something out the door ASAP.
... not really, you still have to deal with bundlers in real-world applications.
Although now that the PHP Foundation is officially supporting FrankenPHP maybe things will be evolving into a new paradigm.
https://www.reddit.com/r/PHP/comments/1lqpkfq/frankenphp_any...
I'm not suggesting that PHP is the ideal tool for every use case. The goal is to share a concept that might be unfamiliar to some developers, using PHP as the context.
Sometimes learning about a concept in a familiar language helps you recognise where it might be useful elsewhere or apply it in a language that supports it better.
Terms like coroutines, concurrency, promises, etc, can be confusing; I just like to demystify them with easy-to-grasp examples. That does mean that examples can be contrived or very simple, but they are designed to get the point across.
Thanks for all the comments so far!
You just discovered what happens when you talk about PHP, and there are similar "Godwin's laws" for other topics, such as IPv6 (we needed IPv4 with extra octets), Google/Apple/Microsoft (any company trying to achieve a commercial objective is always equivalent to "enshittification") etc. Don't mind these too much - it's a good article!
The difficulty with these examples is that they are very different from the actual tasks that everyday web developers would like to parallelize, such as long-running database queries and API requests. These things often take orders of magnitude longer than any pure-PHP loop that a typical webapp might contain.
An example that fires off an async query and yields the result when it's ready will probably produce the right "click" in the minds of many more people. (mysqli can do this, but the interface is convoluted and badly in need of a Promise-like wrapper. I'm not sure whether PDO/PostgreSQL even supports async queries.)
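For reference, the convoluted interface in question looks roughly like this. This is an untested sketch: the host, credentials, and queries are placeholders, and error handling is omitted.

```php
<?php
// Untested sketch of mysqli's async query interface: fire several
// queries with MYSQLI_ASYNC, then poll for whichever connection
// finishes first. Connection details below are placeholders.
$queries = ['SELECT SLEEP(1), 1', 'SELECT SLEEP(1), 2'];
$links   = [];
foreach ($queries as $sql) {
    $link = new mysqli('127.0.0.1', 'user', 'pass', 'db');
    $link->query($sql, MYSQLI_ASYNC);        // returns immediately
    $links[] = $link;
}

$pending = $links;
while ($pending) {
    $read = $error = $reject = $pending;
    if (!mysqli::poll($read, $error, $reject, 1)) {
        continue;                            // nothing ready yet
    }
    foreach ($read as $link) {
        $result = $link->reap_async_query(); // collect the finished result
        var_dump($result->fetch_row());
        $pending = array_filter($pending, fn ($l) => $l !== $link);
    }
}
```

The bookkeeping around poll() and reap_async_query() is exactly the boilerplate a Promise-like wrapper would hide.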