If the culture normalized such that a much larger proportion of research were conducted by permanent, non-faculty research employees, this would reduce the need for so many students, increase the jobs available to graduates, and create a new employment niche with a different balance of teaching/administration/research. It would basically turn "post doc" into an actual career rather than a stopover.
This would be better for everyone involved, at the admitted cost of being quite a bit more expensive. My guess is that the market would naturally converge on this equilibrium if information about job placement rates on a per-program (or even per-lab/advisor) basis were more readily available.
This isn't really a culture problem, IMO, as much as a funding one.
My group currently employs two people who fit your description, and it does reduce the need for students (and, honestly, increases productivity).
It's also by far the most stressful part of my job. Funding them involves writing multiple grants per year (because the expected yield of any particular grant is low, even with a decent hit rate), and I am constantly worried that I won't be able to keep them employed.
If one of them leaves this year, I'm not likely to replace them, simply because in the current funding environment I can't look someone in the eye and promise them a long-term position. There are so many more ways to fund a student, and student positions are inherently time limited, so even if things collapse, there are ways to white-knuckle through it in a way there aren't for staff scientists.
The funding problem is a cultural problem though. Religious right-wing politicians in the US have attacked science and education funding at every opportunity. Science and education produce ideas that are at odds with right-wing religious orthodoxy, so those things must not be allowed in society.
I don't think it's that simple.
I'm not religious, but I think a lot of academic funding is wasteful.
It's not just that science contradicts orthodox religious views. It's also that humanities education and exposure to a diversity of people and thought can "deprogram" students away from traditional ways of thinking, which is a threat to traditional power hierarchies.
The funding problem is an economic demand problem. There is not enough market demand for research, particularly some... questionable research. Yes, sometimes seemingly useless research can lead to breakthroughs. No, that doesn't make it economically attractive. You are effectively doing the same thing as gambling on crypto.
Also, consider what the postdoc is.
A person arrives on an 18-month funded postdoc (believe me, plenty exist). They have just completed a PhD, which means they probably have a couple of papers published and maybe another one or two in the pipeline. So as they spin up their time with you, they are also finishing those papers from their previous job. By six months in, they are done with that and fully onboarded to the project. So they spend six months working. But now they only have six months left on their contract. You don't have money to keep them, or perhaps your country will require you to offer a permanent contract if the position is renewed, so you cannot offer to extend it. So they spend the final six months of their postdoc looking for a job. In the end, for 18 months of salary, you get six to eight months of work. It's unreasonable. Things need to change.
Or let's say you have a mission-critical project that must be done by a postdoc. You offer them a three-year contract that is grant funded; it is three years because most grant agencies work on three-year cycles. The project requires a year of commitment to building an apparatus (maybe it's a lab experiment, maybe it's training some foundation model, whatever). After that year, the apparatus can be used for science. Your postdoc comes to you in year 2, month 3 and says, well, I have been offered a faculty position at university X, so I am leaving in the fall. So you get 18 months of work out of them and now cannot hire anyone else, because you only have 18 months of funding left but your country requires you to offer a minimum 24-month contract. Things need to change.
It's important to note that academics often keep projects from their former positions going at their new ones. But as soon as someone leaves to industry, this falls apart. Because industrial positions expect the person to work on the project they specify, they rarely hire someone to work as an academic, pursuing their own research directions.
I think the solution here is as others have suggested, spend more money on hiring people for longer term and with higher salaries. But we shall see if anyone listens to that advice.
Notably, even the role of the professor has drastically changed in the last few decades. The "publish or perish" paradigm has really taken over and changed the type of research being done. Higgs famously said he wouldn't make it as non-tenured faculty in today's academic culture.
Not to mention that the type of research being done has drastically changed too. There are many more projects that require wide collaboration. You're not going to do something like CERN, DESI, LIGO, or many other scientific mega-projects from a single lab, or even a single field of study.
The academic deal has changed. It used to be that by becoming a professor you were granted facilities and time to carry out your research; in return, you had to help educate and foster the next generation. It was mutually beneficial. There were definitely abusers of the system, but it is generally not too difficult to tell who in your own department is trying to take advantage of it, while it's incredibly difficult to identify those people from the perspective of a university administration. There's been more centralization in university administration, and I'm afraid Goodhart's Law is in full force now.
What I'd like to see is more of a return to the laissez-faire approach. It shouldn't be completely relaxed, but to summarize Mervin Kelly (who ran Bell Labs): "You don't manage a bunch of geniuses; they already know what needs to be worked on. That's what makes them experts in the first place." At the end of the day, we can't run academia like a business, and it really shouldn't be one. The profits generated by academia are less direct and more distributed through society. Evaluating universities by focusing on their expenditures and direct profits alone is incredibly naive. We're better able to make less naive evaluations today, but we still typically don't (it is still fairly complex).
Your suggestion would mean fewer fresh eyes looking at the problem. If the scientific enterprise were just about churning out widgets, then yes, it would be better to have permanent staff.
But having a strong training pipeline for the globe is a huge plus for US prestige, and the top people are still offered jobs in faculty or industry within the country, so it's still a net gain for the USA. But it's brutally competitive for the individual scientists.
While I'm more skeptical than you are of the value of a string of new students coming through, as opposed to just keeping the very best students, I'm also not suggesting we mandate or force this change. I'm suggesting that we give people more information to make better-informed decisions. If students decide that they are comfortable with a sub-20% job placement rate, then great, nothing needs to change. If they aren't satisfied with that, and we decide that they were actually performing a valuable service, then it behooves society to pay them enough that they become willing to make that gamble again.
The current information asymmetry is exploitative. One of two things would happen under my proposed system: either nothing would change, because students think they are getting a good deal as is, or students would decide the deal isn't worth it, which would mean the current system only works because the reality of the job market is being hidden from students.
AI in industry was basically made by PhD grads. Without that pipeline there would be no AI, and I am not exaggerating much at all.
I think a mix of the current system with more permanent researchers makes sense.
There is a lot of work in research that fits the permanent worker better than the fresh 22-year-old. But having that fresh talent is really beneficial to science.
> If students decide that they are comfortable with a sub 20% job placement rate, then great, nothing needs to change.
The problem is in my opinion not this low job placement rate per se (it is very easy to find out that this is the case for basically every prospective researcher). The problem rather is the "politics" involved in filling these positions, and additionally the fact that positions are commonly filled by what is currently "fashionable". If you, for some (often good) reason, did good research in an area that simply did not become "fashionable": good luck finding an academic position.
But the current system has a problem of training people for a job and then sending them to do something else. Even a professorship is a very different job than a graduate researcher or postdoc. Most professors do little research themselves these days, instead managing research. Don't you think that's a little odd, not to mention wasteful? We definitely should have managers, and managers with research backgrounds themselves, but why not let people continue honing their research skills?
It is. But this is also a social choice, dictated by how much we as a country want to fund research. In a practical sense, I would argue the scientific enterprise is primarily about churning out grants and papers.
That's interesting; I don't know if I have ever seen this kind of labor-market logic applied to science before. Is this an agreed-upon idea? In my mind, science and the kind of focused research it entails is kind of definitionally distinct from something like "innovation." Like, frankly, yes, I want a stream of widgets, if that means consistent units of research done to contribute to an important area or problem, reviewed and judged by peers.
Like what's even the alternative? We want a Steve Jobs of science? That's really what we are going for?
Are you suggesting science and innovation are distinct?
Scientific progress is largely driven by the “Steve Jobs” of science.
Only a tiny fraction of papers remain relevant. So that means the quality of the average paper doesn’t matter as much as the quality of the best paper.
There is actually a lot of debate as to whether scientific discovery is driven by "heroes and geniuses" (as you argue) or by multiple people simultaneously and independently coming up with the same idea [1], often called "multiple discovery". Certainly both have occurred many times over.
That said, multiple discovery seems to be more common nowadays due to the rapid diffusion of information, which means that most people are operating in roughly the same information environment (initial conditions) when they start their research. It is interesting how often multiple discovery happens when you start to look closely at this.
[1] https://en.wikipedia.org/wiki/Multiple_discovery
What you’re describing sounds a lot like the Department of Energy national labs. They have (or had) many permanent-track research roles without teaching obligations, where scientists can have long stable research careers.
The problem, as always, is funding. In the US, the federal govt is essentially the only “customer” of basic research. There’s some private funding, often from kooky millionaires who want someone to invent a time machine, but it’s the exception that proves the rule. Universities sometimes have pure research roles, but they’re generally dependent on the employee paying themselves with a constant stream of grants. It’s a stressful and precarious position.
What all is tuition paying for anyway? It's not paying for the professors, since they have to fund themselves with grants. It's not paying for research overhead, because that also gets claimed from grants. It's not paying for extracurriculars, since those get funded by donations, student contributions, and revenue. It's not paying for new facilities, since those all get named after donors.
It certainly doesn't seem to be paying a lot for post docs, grad students (who are either contributing their own tuition or getting it contributed by someone else anyway), adjuncts, or other non-professor faculty, since they famously make starvation wages.
I'm being a bit facetious, since tuition has "transparent" line items stating how much goes to what, but university revenue streams are a bit baffling. Mountains of money go in, and mountains of money go out, but the two seem to have a very indirect relationship at times.
And I know, the common answer is that it goes to some nebulous "administration." But executive administrative staff, while reasonably well compensated, make up a pretty small portion of the overall budget, and the rest of the admin seems like it could more reasonably be split into the actual services and departments they're administering, which, again, seem adequately funded between grants, donors, and tuition. So I'm not clear what all this ambiguous "administration" — the part that's not executive staff and not tied directly to, say, health insurance (which gets paid for as part of tuition) or research (grants) — is actually doing. What are they administering?