Hi everyone, Tony (Uncle Gizmo) here!
Many of you know me from my years working with Microsoft Access and VBA. It's a powerful platform, and I've built countless solutions with it. But recently, I embarked on a bit of an "accidental" adventure into the world of Python, web technologies, and Large Language Models (LLMs), and I wanted to share my experience, as I think it might resonate with many of you.
The Starting Point: A Familiar Itch
It all began with a simple need: a Chrome extension I used for transcription started acting up. My first instinct wasn't to dive into JavaScript and Chrome APIs – that felt like a world away from VBA forms and modules! However, I'd been hearing a lot about LLMs (like ChatGPT, Gemini, etc.) and their ability to generate code. I thought, "Could an AI help me build a replacement?" Long story short, with the AI as my coding partner, I managed to get a new transcription tool up and running: the Nifty Transcription Tool. This was my first taste of "programming with an LLM," where I was directing the AI, defining the logic, and debugging the output, rather than writing every line of code in an unfamiliar language.
Dipping My Toes into Python (via Local AI)
This success made me curious. I wanted to experiment with running LLMs locally on my PC for privacy and cost reasons. I set up Ollama with the Mistral model, but to talk to it programmatically, I needed to use its API. The common language for this? Python.
Now, my Python knowledge was zero. But again, I turned to an LLM. I described the task: "Write a Python script to send a prompt to my local Ollama API and get a response." And it worked! This was a breakthrough. I realized I could leverage my decades of application design and logic skills from Access/VBA, and use an LLM to handle the Python syntax.
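For anyone curious what that first script looks like, here's a minimal sketch along the same lines (not my exact code; it assumes Ollama is running on its default port 11434 and that the Mistral model has already been pulled with "ollama pull mistral"):

```python
# Minimal sketch: send a prompt to a local Ollama instance and print the reply.
# Assumes Ollama is running on its default port (11434) and that the
# "mistral" model has already been pulled with "ollama pull mistral".
import requests

def ask_ollama(prompt, model="mistral"):
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    print(ask_ollama("In one sentence, what is Microsoft Access good at?"))
```

A dozen lines like that was my "Hello World" moment – proof that the local model would answer me from code, not just from a chat window.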
The Birth of UberChatMaster
This led to a bigger idea: Why not create a single, unified web interface to interact with multiple LLMs – both my local Ollama and cloud-based ones like OpenAI's GPT models? A place where I could easily switch models, manage chat history, and have a consistent user experience. That's how UberChatMaster was born.
It's a Python Flask web application for the backend, with HTML, CSS, and JavaScript for the frontend, and it uses SQLite for database storage.
Key features currently include:
- Connection to OpenAI's API (e.g., GPT-4o).
- Connection to local Ollama models (e.g., Mistral).
- Persistent chat history that provides context to the LLMs.
- UI for selecting vendors and models.
- Prompt templates and speech-to-text input.
- Chat management (new chat, rename, hide, clear all).
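To give a feel for how these pieces hang together, here is a stripped-down sketch of the pattern (not the actual UberChatMaster code): one Flask route pulls recent history from SQLite, forwards it plus the new prompt to whichever vendor the user selected (OpenAI or local Ollama), and then saves both sides of the exchange. It assumes the flask, requests, and openai packages are installed, an OPENAI_API_KEY environment variable is set, and a simple messages table already exists in the SQLite database:

```python
# Stripped-down sketch of the pattern, not the real UberChatMaster code.
# Assumes: flask, requests, and openai installed; OPENAI_API_KEY set in the
# environment; Ollama running locally; and a SQLite table
# messages(id INTEGER PRIMARY KEY, chat_id, role, content) already created.
import sqlite3
import requests
from flask import Flask, request, jsonify
from openai import OpenAI

app = Flask(__name__)
openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment

def get_history(chat_id, limit=20):
    """Return the most recent messages for a chat as role/content dicts."""
    with sqlite3.connect("chats.db") as db:
        rows = db.execute(
            "SELECT role, content FROM messages WHERE chat_id = ? "
            "ORDER BY id DESC LIMIT ?", (chat_id, limit)
        ).fetchall()
    return [{"role": r, "content": c} for r, c in reversed(rows)]

def save_message(chat_id, role, content):
    with sqlite3.connect("chats.db") as db:
        db.execute(
            "INSERT INTO messages (chat_id, role, content) VALUES (?, ?, ?)",
            (chat_id, role, content),
        )

@app.route("/chat", methods=["POST"])
def chat():
    data = request.get_json()
    chat_id, vendor, model = data["chat_id"], data["vendor"], data["model"]
    prompt = data["prompt"]
    # Prepend stored history so the model sees the conversation context.
    messages = get_history(chat_id) + [{"role": "user", "content": prompt}]

    if vendor == "openai":
        completion = openai_client.chat.completions.create(
            model=model, messages=messages
        )
        reply = completion.choices[0].message.content
    else:  # local Ollama
        resp = requests.post(
            "http://localhost:11434/api/chat",
            json={"model": model, "messages": messages, "stream": False},
            timeout=300,
        )
        reply = resp.json()["message"]["content"]

    save_message(chat_id, "user", prompt)
    save_message(chat_id, "assistant", reply)
    return jsonify({"reply": reply})

if __name__ == "__main__":
    app.run(debug=True)
```

The real application has a lot more around it (prompt templates, speech-to-text, chat renaming and hiding), but that single route – look up history, pick a vendor, call the model, store the reply – is the heart of it.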
The Revelation: I'm an "AI Application Architect," Not a Python Coder!
Building UberChatMaster has been an intense learning experience. I haven't become a traditional Python programmer overnight. Instead, I've learned to be an orchestrator. I design the system, define the features, break down the logic, prompt the LLM for code, and then critically, I test, debug, and integrate the pieces. My Access/VBA experience in structuring applications, understanding data, and user interaction has been invaluable. The LLM handles the syntax; I handle the architecture and the "does this actually work and meet the requirement?" part.
It's been challenging – LLMs can get stuck, generate buggy code, or misunderstand. I've used different LLMs (ChatGPT, Grok, and more recently Google's AI Studio with Gemini, which was a huge help) at different stages. It's a very iterative process.
Why Share This with Access Developers?
I believe this AI-assisted development approach can be a game-changer for many of us. If you've ever felt limited by VBA or wanted to explore web technologies or Python utilities without facing a multi-year learning curve for a new language, LLMs offer a new path. You can leverage your existing, hard-won development and problem-solving skills.
Want to See More? (My Video Demo & Learning More)
I've actually recorded a video demonstrating UberChatMaster in action, showing how it switches between OpenAI and local Ollama, and how the chat history works.
You can watch it here:
This journey has been so fascinating that I'm considering creating a short video series specifically for developers (especially those from an Access/VBA background) on how to approach building projects like this with LLM assistance – focusing on the prompting, the mindset, the debugging, and the key concepts rather than just raw coding.
If that sounds interesting, I've set up a pre-order on Gumroad for it. The idea is that if ten people subscribe at $10 each, I'll go ahead and produce the series. You can find out more here: [Link to your Gumroad Pre-Order Page]
I'd love to hear your thoughts! Have any of you tried using LLMs for code generation beyond simple snippets? What have your experiences been?
Cheers,
Tony (Uncle Gizmo)