i've been telling this to my coworker, whose only use case he can conjure up for AI is simply "im going to give claude snowflake cortex, our integration code, all our documentation, jira tickets and its gonna make everything so much better. we'll be able to ask him anything and get the answer" and he's just lost the plot because there wasn't much of a plot. sci-fi has sold him on how great it would be to have something to answer any question he had. he's hung up on this possibility of having his own tony stark jarvis at his disposal; in his head this is going to be the thing that speeds him up.
i'd say it's been a huge distraction for him, and the obsession with using LLMs as Big Wikiz hasn't yielded anything near what he thought the tech was for. on a few occasions now he's learned the hard way how imperfect the technology is.
between that and everyone's grand visions for agentic workflows, i've mostly just receded into being one of the few who is still regularly delivering stuff. i'm using AI to speed my delivery up quite a bit; i'm just not wasting my time taking it on some big grand adventure. the irony is that a lot of people pushed back on companies that wanted to implement chatbots, and now they spend most of their credits/tokens making their own chatbots by collecting six trillion .md files and adding skill files.
my real takeaway is this: i've come to reason that there is some sort of loss of real institutional knowledge when we attempt to take shortcuts to growing the breadth of our own knowledge. i don't mean "hey claude give me some examples of how companies typically design x to solve for y" or "golang is new to me, what are the benefits of a compiled language versus something that requires a runtime".
no, i'm talking about these kinds of questions:
"/somePersonalBigWikiProjectInvokedBySkill.md claude review our current tooling and infrastructure, how can we 5x our deployment speed, then search the web for <some SaaS company> and put a proposal together to get it implemented at the organization and include a 5 year cost benefit analysis and ... "
i look around and it feels like everyone is nerfing themselves. that latter question? people are just sending claude proposals left and right, and my eyes have completely glazed over. is it really that hard to do some digging yourself? we're already ceding the ability to just go grab an architect or senior engineer and ask them what they think about how <some SaaS company> will fit with the broader suite of technologies and the visions on the horizon. we're skipping the part where we do a little discovery together and work toward an outcome together. we're walking away with a surface-level understanding of many things.
this clearly has visible impacts on how we engage with each other. there's something there that I'm noticing and don't have the words for. it's mostly that people are less able to explain what they're talking about when pressed for deeper details, but everyone's behavior is different now too, because AI sort of... makes them feel like they have definitive answers and strategies, and they're no longer willing to have their ideas challenged. they no longer see that as a learning experience, a chance to learn from someone with wisdom, someone who's already a walking wikipedia on something. it's the perfect technology for people who hate when someone with way more experience than them says "maybe not a good idea, and here's why"
i've met some interesting people who are just... walking encyclopedias on some or many domains. incredibly smart people who have so much knowledge and wisdom and so many years of experience not just with tech but with people and failures and successes. i don't doubt for a second that the human brain is capable of holding an unbelievable index of information in a natural way that marries well with decision making processes that come from experience. i'm not sure what gap people are trying to close building themselves some proverbial great library here, but i would encourage people to just sit back and trust that their brain is still one of the greatest technologies at their disposal.
I feel the exact same way, it helps speed up development a lot (and eliminates a lot of really annoying grunt work). But I see people I work with doing shit with it that doesn't make any sense, e.g. writing 50k lines of code for a "compiler" when it's really just an interpreter under the hood. They never take the time to understand the domain more deeply; they just use claude to sling some shit that barely works.
> i'm not sure what gap people are trying to close building themselves some proverbial great library here, but i would encourage people to just sit back and trust that their brain is still one of the greatest technologies at their disposal.
Culturally I think this is going to fuck things up significantly. If I take the time to read all of the latest papers in the LLM space, I'm damn well not going to summarize them or document what I've learned for anyone. (Maybe this is why there are so few high quality books aggregating the latest papers, all of the advancements, etc. The people doing this work would rather (smartly) milk the cash cow and maintain the information asymmetry.)
Or think about open source: this will kill it for people trying to make money off a product while keeping it open source, because someone could spin up a competitor overnight.
AI is going to make information easier to acquire for cheap. But it's going to absolutely destroy the incentive structure and trust required for an open exchange of information. It was already bad enough, since industry isn't incentivized to produce quality educational literature the way academia is. After this, it'll be a complete shit show.
> i look around and it feels like everyone is nerfing themselves

> this clearly has visible impacts on how we engage with each other
> there's something there that I'm noticing and don't have the words for.
Welcome to ASI takeoff!
> im going to give claude snowflake cortex, our integration code, all our documentation, jira tickets and its gonna make everything so much better. we'll be able to ask him anything and get the answer
This is actually a good idea because it's a very cheap way to build your own industrial-strength search engine. We've forgotten how cool search engines are because Google's is so shit now.
(Although you don't need Claude, you can self-host this with minimal effort now.)
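To make the "self-host it" point concrete: the core of a doc search engine is embarrassingly small. Here's a minimal sketch of TF-IDF ranking over a pile of internal docs, with made-up filenames and contents standing in for "all our documentation" (in practice you'd point this at real files, or just use SQLite's FTS5):

```python
import math
import re
from collections import Counter

# Toy corpus standing in for internal docs (contents invented for illustration).
DOCS = {
    "deploy.md": "deployment pipeline uses blue green deploys via the ci runner",
    "oncall.md": "oncall rotation escalation policy and paging setup",
    "snowflake.md": "snowflake cortex integration notes and warehouse sizing",
}

def tokenize(text):
    return re.findall(r"[a-z0-9]+", text.lower())

def build_index(docs):
    """Precompute per-doc term frequencies and corpus-wide IDF weights."""
    tfs = {name: Counter(tokenize(text)) for name, text in docs.items()}
    df = Counter()
    for tf in tfs.values():
        df.update(tf.keys())
    n = len(docs)
    idf = {term: math.log(n / count) + 1.0 for term, count in df.items()}
    return tfs, idf

def search(query, tfs, idf):
    """Rank docs by the summed TF-IDF weight of the query terms."""
    terms = tokenize(query)
    scores = {
        name: sum(tf[t] * idf.get(t, 0.0) for t in terms)
        for name, tf in tfs.items()
    }
    return sorted(
        (name for name, s in scores.items() if s > 0),
        key=lambda name: -scores[name],
    )

tfs, idf = build_index(DOCS)
print(search("snowflake cortex sizing", tfs, idf))
```

That's the whole idea; everything beyond this (stemming, phrase queries, embeddings for semantic matching) is refinement, not a prerequisite. You get the "ask anything about our docs" experience without routing every lookup through an LLM.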