For most of the last year I thought it was something only hobbyists were doing. But I'm starting to learn how many established companies, with very experienced engineering teams, are leveraging OpenAI and its relatives by just sending paragraphs of text to them. Paragraphs of instructions they've spent massive amounts of time tweaking.
It seems to me that this is not the best way to get insights out of LLMs in a desired format. Calling a function where you can spell out the criteria for its input and the schema for its output seems much more efficient.
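As a rough sketch of what that looks like in practice: instead of burying format instructions in a long prompt, you can declare the output shape as a function/tool schema (this assumes an OpenAI-style chat-completions tools API; the function name `extract_company_facts`, the fields, and the model name are all illustrative, and the actual network call is omitted).

```python
import json

# Hypothetical tool schema: the desired output shape is declared explicitly
# as JSON Schema, rather than described in prose inside the prompt.
tool = {
    "type": "function",
    "function": {
        "name": "extract_company_facts",  # illustrative name
        "description": "Pull structured facts out of a company description.",
        "parameters": {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "industry": {"type": "string"},
                "founded_year": {"type": "integer"},
            },
            "required": ["name", "industry"],
        },
    },
}

# The request body that would go to a chat-completions style endpoint.
# The client setup and API key handling are left out of this sketch.
request = {
    "model": "gpt-4o",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Acme Corp makes industrial robots."}
    ],
    "tools": [tool],
    # Forcing this tool means the reply comes back as arguments matching
    # the schema above, instead of free-form paragraphs you have to parse.
    "tool_choice": {
        "type": "function",
        "function": {"name": "extract_company_facts"},
    },
}

print(json.dumps(request["tools"][0]["function"]["parameters"], indent=2))
```

The model's reply then arrives as JSON arguments you can validate against the schema, rather than prose you have to re-parse and hope stays in the format your prompt asked for.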
Treat it like a search engine! Search engines eliminated the need for some skills (scouring physical books for information, using the yellow pages to find a business's phone number, etc.), but they didn't eliminate the need for:
- journals & newsletters for curating & opining on said information
- new media formats for helping us understand the information
- and much more
We just moved "up the stack," so to speak. Information finding became a commodity. There will be new sources of precious information and work that may now be even more valuable because LLMs commoditized the basic stuff.