Nah it’s just extra protein.
This is the hardest habit to un-train in new employees: be honest about your mistakes. I will not get mad about a mistake. Everyone makes them. The best thing to do is call it out so we can move to fix it. If you keep making the same mistake, maybe we have a talk about your process to see if there are any blind spots.
So many people try to hide their mistakes or reframe them as successes and please do not do that. Own it, see if you can learn anything from it, and let everyone know so we can help you fix it.
Ugh.
Yeah. Also, superficially good looking people can still be sketchy weirdos. Vibe, context, and prior relationship are much more important than looks. Of course, some people can’t get their head around this and start blaming literally anything else: their height, their bone structure, a worldwide conspiracy against them. It’s crazy.
Oh good it’s not just me.
I became a dad late (around middle age) and was telling dad jokes way before that. My theory is it’s less about becoming a father and more about getting older and just wanting to annoy people for my own amusement.
As someone whose employer is strongly pushing them to use AI assistants in coding: no. At best, it’s like being tied to a shitty intern that copies code off Stack Overflow and then blows me up on Slack when it magically doesn’t work. I still don’t understand why everyone is so excited about them. The only tasks they can handle competently are tasks I can easily do on my own (and with a lot less re-typing).
Sure, they’ll grow over the years, but Altman et al are complaining that they’re running out of training data. And even with an unlimited body of training data for future models, we’ll still end up with something about as intelligent as a kid that’s been locked in a windowless room with books their whole life and can either parrot opinions they’ve read or make shit up and hope you believe it. I think we’ll get a series of incompetent products with increasing ability to make wrong shit up on the fly until the C-suite moves on to the next shiny bullshit.
That’s not to say we’re not capable of creating a generally-intelligent system on par with or exceeding human intelligence, but I really don’t think LLMs will allow for that.
tl;dr: there’s a lot of woo in the tech community that the Linux community isn’t as on board with
A thing that hallucinates uncompilable code but somehow convinces your boss it’s a necessary tool.