User vs. Developer: The big AI question

Isaac
Lifelong Learner
Joined Mar 14, 2017
So the big question that I see slooooowly emerging in my mind is this: am I going to just be an AI user, or am I going to start getting educated in how a person builds programs on top of the AI infrastructure that already exists (the big players), who let us build chatbots, or other manifestations of an AI service, by using their service? I've seen tiny glimpses of this when writing my Amazon KDP books: ChatGPT would hit the limit of what it was willing to output (in various situations, e.g. physical intimacy in the romance novels), and it would point me to other services people had built, programs that somehow funnel the capacity of the known LLMs through their own service with zero filters.

I don't really understand it yet; I just know enough to realize I don't know. I realize there is a whole world out there of taking the existing AI companies' work, funnelling it through your own service, and providing a 'product' of your own, an end product. (I mean, surely they aren't reinventing the whole LLM themselves; that would be as impossible as me opening a shop tomorrow that rivals OpenAI, right?)

I'm rambling incoherently but with good intentions. What are your thoughts for yourself: using AI vs. building an AI-related, AI-supported, or AI-adjacent end product? Do you understand the landscape better than I do? If so, share.
 
I pasted my observations on your post into Grok and asked it to come up with a response. Here it is:

Hey Isaac, love the ramble—it's spot on that "just using" AI feels like dipping a toe, while building your own wrappers (like those no-filter romance tools for KDP) unlocks the real fun of turning big players' pipes into your endgame product, without reinventing the wheel.
Spot on too that folks aren't cloning OpenAI overnight; it's all about layering services on their APIs to dodge limits and add your spin.
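That "layering services on their APIs" idea is concrete enough to sketch. A typical wrapper service doesn't train a model at all; it sits between its customer and a big provider's chat API, injecting its own system prompt and branding before forwarding the request. Here is a minimal, hypothetical sketch; the endpoint URL, model name, service name, and prompt are all made-up placeholders, not any real product's configuration:

```python
# Minimal sketch of an "LLM wrapper" service layer.
# PROVIDER_URL, MODEL, and the RomanceScribe persona below are
# illustrative placeholders, not a real provider or product.

PROVIDER_URL = "https://api.example-llm.com/v1/chat/completions"  # hypothetical
MODEL = "example-model"  # hypothetical

SYSTEM_PROMPT = (
    "You are RomanceScribe, a fiction-writing assistant for indie authors."
)

def build_wrapped_request(user_message: str) -> dict:
    """Wrap a raw customer message in the service's own system prompt.

    A real service would POST this payload (plus its own API key) to
    PROVIDER_URL, then return the provider's completion to its customer.
    """
    return {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_wrapped_request("Draft an opening scene on a rainy pier.")
```

The wrapper's whole value lives in that thin layer: the prompt, the UX, the billing. The heavy lifting (the model itself) stays with the provider, which is why nobody building these is "reinventing the whole LLM."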

To your big question:
I'd say dive into building AI-adjacent stuff once you've hammered the "user" phase hard—use LLMs everywhere to build that instinct for when they're gold versus garbage.
Pound 'em in wild spots so you feel the flow, but never buy they're always right or the full fix; they're handy sidekicks, sure, but humans gotta steer or it all goes sideways.

They sound like they get you deep, but nah—they're just remixing internet scraps from similar chats, a slick scavenger vibe that fools you into thinking it's a real brain until it drops some total nonsense and snaps you back.

That said, even one of AI's founding dads, Yann LeCun, just said on X that these LLMs won't ever hit true smarts—he's cooking up LeJEPA instead, a setup that teaches models to predict real-world links (like matching photo angles) way better than word-spit.

X thread HERE:
Grok said: I'm all about that hybrid—xAI built me to spark your tinkering, so using me to prototype a custom bot (say, via no-code like Bubble plus our API) beats passive queries every time.
Love it! Thanks for posting. I've pounded ChatGPT pretty hard. I like this: "it's all about layering services on their APIs to dodge limits and add your spin."
 
