ai-christianson 6 days ago

Aren't the jobs they'll get going to expect them to use AI?

nkrisc 6 days ago

If you’re hiring humans just to use AI, why even hire humans? Either AI will replace them or employers will realize that they prefer employees who can think. In either case, being a human who specializes in regurgitating AI output seems like a dead end.

throwaway290 6 days ago

> If you’re hiring humans just to use AI, why even hire humans

You hire humans to help train AI and when done you fire humans.

TimorousBestie 6 days ago

“Prompt Engineer” as a serious job title is very strange to me. I don’t have an explanation for why it would be a learnable skill—there’s a little, but not a lot, of insight into why an LLM does what it does.

jonfw 6 days ago

> there’s a little, but not a lot of insight into why an LLM does what it does.

That's a "black box" problem, and I think those are some of the most interesting problems in the world.

Outside of technology, the most interesting jobs in the world operate on a "black box". Salespeople and psychologists work on the human mind. Politicians and market makers try to predict the behavior of large populations. Doctors operate on the human body.

Technology has been getting more complicated, and I think distributed systems and high-level frameworks are starting to resemble a "black box" problem. LLMs even more so!

I agree that "prompt engineer" is a silly job title, but not because it's not a learnable skill. It's just not accurate to call yourself an engineer when you're consuming an LLM.

Aerroon 6 days ago

It's an experience thing. It's not about knowing what LLMs/diffusion models specifically do, but rather about knowing the pitfalls that the models you use have.

It's a bit like an audio engineer setting up your compressors and other filters. It's not difficult to fiddle with the settings, but knowing what numbers to input is not trivial.

I think it's a kind of skill that we don't really know how to measure yet.

TimorousBestie 6 days ago

When an audio engineer tweaks the pass band of a filter, there’s a direct causal relationship between inputs and outputs. I can imagine an audio engineer learning what different filters and effects sound like. Almost all of them are linear systems, so composing effects is easy to understand.

None of this is true of an LLM. I believe there’s a little skill involved, but it’s nothing like tuning the pass band of a filter. LLMs are chaotic systems (they kinda have to be to mimic humans); that’s one of their benefits, but it’s also one of their curses.

Now, what a human can definitely do is convince themselves that they can somewhat control the outputs of a chaotic system. Rain prognostication is perhaps a better model of the prompt engineer than the audio mixer.

ai-christianson 6 days ago

Employers are employees too

myaccountonhn 6 days ago

Even if you just use AI, you need to know the right prompts to ask.

Ekaros 6 days ago

And how to verify the output and think it through. I hear time after time that someone asked an AI something, it came up with an answer, and then when corrected it apologized and admitted it was wrong...

But how do you correct it if you do not know what is right or wrong...

throwaway290 6 days ago

> how do you correct it if you do not know what is right or wrong...

You keep human employees and require them to use the LLM so that it's constantly being corrected by their input. Then you fire them.

__loam 6 days ago

Would you rather be the guy using AI as a crutch or the guy who actually knows how to do things without it?