How? Other than calling utility functions that C++ doesn't have, you can't just skip understanding what you're coding by using Python. If you're importing libraries that do stuff for you, that wouldn't be any different than if someone had written those libs in C++.
Are you saying I was incorrect for feeling that way?
The reason is that you no longer really know what's going on. (And yes, that feeling would be the same if C++ had as rich a library of packages as Python for numerical analysis.)
If you are doing something that requires precision you need to know everything that is happening in that library. Also, IIRC, not knowing what type something was bothered me at the time.
Reminds me of all the parables about kings who "pretend to be a common man" for a day and walk among their subjects and leave with some new enlightenment.
The idea that you lose a ton of knowledge when you experience things through intermediaries is an old one.
In my experience, you don't come away with a strong mental model after agentically coding something.
It isn't an abstraction like assembly -> C.
If you code something like "extract the raw audio data from an audio container", it doesn't matter if you write it in assembly, C, JavaScript, whatever. You will be able to visualize how the data is structured when you are done. If you had an agent generate the code, the data would just be an abstraction.
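For example, a rough sketch of what that looks like in Python (illustrative only - it assumes a plain uncompressed-PCM .wav and skips error handling):

    import struct

    def extract_pcm(path):
        # Walk the RIFF chunks of a .wav container and return the raw PCM bytes.
        with open(path, "rb") as f:
            riff, riff_size, wave_id = struct.unpack("<4sI4s", f.read(12))
            assert riff == b"RIFF" and wave_id == b"WAVE"
            fmt = None
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break
                chunk_id, chunk_size = struct.unpack("<4sI", header)
                body = f.read(chunk_size)
                if chunk_id == b"fmt ":
                    # channels, sample rate, bits per sample all live here
                    fmt = struct.unpack("<HHIIHH", body[:16])
                elif chunk_id == b"data":
                    return fmt, body          # the actual samples
                if chunk_size % 2:            # chunks are word-aligned
                    f.read(1)
        return fmt, b""

Writing even that much by hand forces you to learn that a container is just headers plus sized chunks; having an agent spit it out doesn't.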
It just isn't worth it to me. If I am working with audio and I get a strong mental model for what different audio formats/containers/codecs look like, who knows what creative idea that will trigger down the line. If I have an agent just fix it, then my brain will never even know how to think in that way. And it takes like... a day...
So I get it as an optimized search engine, but I will never just let it replace understanding every line I commit.
Maybe in order to get the most out of agentic programming one needs to either a) already master the domain in question, or b) become a master in the process.
With (b), the process can be the thing that triggers "thinking hard", and with (a), one's mastery can be the reason one "thinks hard" when driving the agent.
Does this help TFA? Idk. Maybe if the author of TFA tries doing (a) a lot or (b) a lot, it might. Or maybe agentic programming is going to drive out of the business those who stop thinking hard because they have agents to help them.
I don't think it will ever really happen because of ownership.
Sure, this is awesome now, and maybe he shipped it in a week using AI or something, but he now owns a critical part of his wife's business. Five years from now he's gonna be working 50 hrs/week and not want to deal with this project he barely remembers even doing; whenever an SSL cert goes bad, or the credit card he was paying the server bills with expires, or actual bugs happen, he is on the line for it.
It is lame to let family/friends pay $20/mo for something you could build in a few weeks, but then you own the product forever, and I don't want to.
It doesn't have to be this messy. If I were the maker I would treat this as a good first version and transfer the ownership to the business slowly. This is just like working with any consultant.
The business is run by his wife, and if they had a SWE(-like) already, that person would’ve made this. But instead, the husband did and now owns it. He also open sourced it, so he has to live with the inevitable consequences of that too.
I wonder if a very simple moltbot can do the ongoing development on its own. I mean, the hard work is mostly out of the way. This isn't so out of the realm of capability.
This isn't about capturing the audio, it is about transcribing it. Transcribing whispered/garbled speech in the background is really really really hard.
I agree, being able to transcribe low quality audio would be an amazing new feature. What I was disputing was the notion that even an old iPhone is incapable of capturing crystal clear audio from an entire room while in your pocket. It has been able to do that forever.
It might not be in our lifetimes... the frontier models are using terabytes of RAM. In 10 years iPhones went from ~2GB to ~8GB.
2012 MacBook Pros had up to 16GB; the 2026 ones max out at 64GB. So a 4x increase in 14 years.
A 1996 Mac desktop had 16MB of RAM, so from 1996 to 2012 there was a ~1000x increase.
We won't see gains like we did from the 80s-2000s again.
"in the 1920s and 1930s, to be able to drive a car you needed to understand things like spark advance, and you needed to know how to be able to refill the radiator halfway through your trip"
A car still feels weirdly grounded in reality though, and the abstractions needed to understand it aren't too removed from nature (metal gets mined from rocks, forged into engine, engine blows up gasoline, radiator cools engine).
The idea that as tech evolves humans just keep riding on top of more and more advanced abstractions starts to feel gross at a certain point. That point is some of this AI stuff for me. In the same way that driving and working on an old car feels kind of pure, but driving the newest auto pilot computer screen car where you have never even popped the hood feels gross.
I was having almost this exact same discussion with a neighbor who's about my age and has kids about my kids' ages. I had recently sold my old truck, and now I only have one (very old and fragile) car left with a manual transmission. I need to keep it running a few more years for my kids to learn how to drive it since it's really hard to get a new car with a stick now...or do I?
Is learning to drive stick as outdated as learning how to do spark advance on a Model T? Do I just give in and accept that all of my future cars, and all the cars for my kids, are just going to be automatic? When I was learning to drive, I had to understand how to prime the carburetor to start my dad's Jeep. But I only ever owned fuel-injected cars, so that's a "skill" I never needed in real life.
It's the same angst I see in AI. Is typing code in the future going to be like owning a carbureted engine or manual transmission is now? Maybe? Likely? Do we want to hold on to the old way of doing things just because that's what we learned on and like?
Or is it just a new (and more abstracted) way of telling a computer what to do? I don't know.
Right now, I'm using AI like when I got my first automatic transmission. It does make things easier, but I still don't trust it and like to be in control because I'm better. But now automatics are better than even the best professional driver, so do I just accept it?
Technology progresses, so at what point do we "accept it" and learn the new way? How much of holding on to the old way is just our "identity"?
I don't have answers, but I have been thinking about this a lot lately (both in cars for my kids, and computers for my job).
The reasons I can think of for learning to drive stick shift are subtle. Renting a stick shift car in Europe is cheaper. You might have to drive a friend's car. My kids both learned to drive our last stick shift car, which is now close to being junked. Since our next car will probably be electric, it's a safe bet that it won't be stick.
The reasons for learning to drive a manual transmission aren't really about the transmission, they're about the learning and the effects on the learner. The more you get hands on with the car and in touch with the car the more deeply you understand it. Once you have the deepish understanding, you can automate it for convenience after that. It's the same reason we should always teach long division before we give students calculators, not after.
I agree with all of those statements - I always told my wife that I'd get our kids an underpowered manual so that they're always busy rowing gears and can't text and drive.
But in the bigger picture, where does it stop?
You had to do manual spark advance while driving in the '30s.
You had to set the weights in the distributor to adjust spark advance in the '70s.
Now the computer has a programmed set of tables for spark advance.
I bet you never think of spark advance while you're driving now. Does that take away from deeply understanding the car?
I used to think about the accelerator pump in the carburetor when I drove one; now I just know that the extra fuel enrichment comes from another lookup table in the ECU when I press the gas pedal down. Am I less connected to the car now?
My old Jeep would lean out and cut when I took my foot off the gas because the throttle shut so quickly. My early fuel-injected car from the '80s had a damper to slow the throttle closing and prevent extreme lean-out when you lift off the gas. Now that's all tables in the ECU.
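To make it concrete, those "tables" are basically just interpolation over calibration points - here's a toy sketch in Python with made-up numbers (real ECUs use RPM x load maps with far more breakpoints):

    from bisect import bisect_right

    # Toy spark-advance map: degrees BTDC indexed by RPM (made-up numbers).
    RPM_POINTS  = [800, 1500, 2500, 4000, 6000]
    ADVANCE_DEG = [ 10,   18,   26,   32,   34]

    def spark_advance(rpm):
        # Linearly interpolate from the calibration table, clamping at the
        # edges - roughly what an ECU lookup does every few milliseconds.
        if rpm <= RPM_POINTS[0]:
            return ADVANCE_DEG[0]
        if rpm >= RPM_POINTS[-1]:
            return ADVANCE_DEG[-1]
        i = bisect_right(RPM_POINTS, rpm) - 1
        frac = (rpm - RPM_POINTS[i]) / (RPM_POINTS[i + 1] - RPM_POINTS[i])
        return ADVANCE_DEG[i] + frac * (ADVANCE_DEG[i + 1] - ADVANCE_DEG[i])

    print(spark_advance(3000))  # ~28 degrees with these made-up points

The mechanism didn't disappear, it just moved from weights and springs into a lookup you never see.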
I don't disagree with you that a manual transmission lets you really understand the car, but that's really just the latest thing we're losing; we don't even remember all of the other "deep connections" to a car that were there 50-100 years ago. What makes this one different? Is it just the one that's salient now?
To bring it back on topic: I used to hand-tune assembly for high-performance stuff; now the compilers do better than me and I haven't looked at assembly in probably 10 years. Is moving to AI-generated code any different? I still think about how I write my C so that the compiler gets the best hints to make good assembly, but I don't touch the assembly. In a few years will we be clever with how we prompt so that the AI generates the best code? Is that a fundamentally different thing, or does it just feel weird to us because of where we are now? How did the generation of programmers before me feel about giving up assembly and handing it over to the compilers?
EVs don't have variable gearboxes at all, so when EVs become popular, it doesn't make sense to learn stick. It would be a fake abstraction, like the project featured on HN, where kids had floppy disk shells with NFC tags in them that tell the TV which video file to load from a hard disk.
I have been programming for 40 years, but I still have not dipped into this brave new world of Shakespeare-taught coding LLMs.
IMO there's one basic difference with this new "generative" stuff: it's not deterministic. Or not yet. All previous generations of "AI" were deterministic... but died.
Generating is not a problem. I have made medium-ish projects - say 200+ kloc of Python/JS - with 50-70% of the code generated (by other code, so you maintain that meta-code and the "language" recipe-code it interprets) - but it has all been deterministic. If shit happens - or some change is needed anywhere along the requirements-down-to-deployment chain - someone can eventually figure out where and what. It is reasoned. And most importantly, once done, it stays done. And if I regenerate it 1000 times, it will be the same.
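A toy illustration of the recipes-plus-meta-code idea (hypothetical names, hugely simplified - the point is just that the output is a pure function of the recipe):

    # Hypothetical, heavily simplified "meta-code": a generator that turns a
    # declarative recipe into Python source. Same recipe in -> same code out.
    RECIPE = {
        "class": "Invoice",
        "fields": [("customer", "str"), ("amount", "float"), ("paid", "bool")],
    }

    def generate(recipe):
        lines = [f"class {recipe['class']}:"]
        args = ", ".join(f"{name}: {typ}" for name, typ in recipe["fields"])
        lines.append(f"    def __init__(self, {args}):")
        for name, _ in recipe["fields"]:
            lines.append(f"        self.{name} = {name}")
        return "\n".join(lines) + "\n"

    print(generate(RECIPE))  # regenerate it 1000 times, it's byte-identical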
Did this make me redundant? Not at all. Producing software is much easier this way, the recipes are much shorter, there's less room for errors, etc. But still, higher abstractions are even harder to grasp than boilerplate, which has quite a cost: you cannot throw any newbie at it and expect results.
So fine-tuning assembly - or a manual transmission - might be a soon-to-be-obsolete skill, as it is not required... except in rare conditions. But it is helpful to learn stuff. To flex your mind/body about alternatives, possibilities, shortcuts, wear-and-tear, fatigue, aha-moments and whatnot. And then carry these over as concepts onto other domains, which are not as commoditized yet.
Another thing: Exupéry, in Land of Men (Terre des hommes), talks about technology (airplanes in his case), and how without technology mankind works around or avoids places and things that are not "friendly", like twisting roads around hellscapes, while technology cuts straight through all of that - flies above it all - perfect when it works, and a nightmare when it breaks right in the middle of such an unfriendly area.
Probably a vanishingly small number of people who drive stick actually understand how and why it works. My kids do, of course, because I explained it to them. Most drivers just go through the motions.
> In the same way that driving and working on an old car feels kind of pure
I can understand working on it feeling pure, but driving it certainly isn't, considering how much lower emissions are now, even for ICE cars. One of the worst driving experiences of my life was riding in my friend's Citroën 2CV. The restoration of that car was a labour of love that he did together with his dad. For me as a passenger, I was surprised just how loud it was, and how you could smell oil and gasoline in the cabin.
"Based on patent filings, Q.ai has built machine-learning tech for audio and “silent” voice input, including systems designed to understand to improve communication in noisy or difficult environments."
Good? Based on my recent experience, the native iOS functionality around voice activity detection, echo cancellation and transcription is horrendous. If they could ship devices with anywhere near the quality you can get from a bespoke stack handling the audio data, it would be impressive.
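(To be clear about what "bespoke" buys you: even a crude frame-energy VAD - a toy sketch like the one below, with made-up thresholds, nothing like a production detector - is something you can inspect and tune, unlike the built-in black box.)

    import math

    def simple_vad(samples, rate, frame_ms=30, threshold_db=-35.0):
        # Toy energy-based voice activity detector over 16-bit PCM samples.
        # Frame size and threshold are made-up defaults, not tuned values.
        frame_len = int(rate * frame_ms / 1000)
        flags = []
        for start in range(0, len(samples) - frame_len + 1, frame_len):
            frame = samples[start:start + frame_len]
            rms = math.sqrt(sum(s * s for s in frame) / frame_len) + 1e-12
            db = 20 * math.log10(rms / 32768.0)  # dBFS vs. int16 full scale
            flags.append(db > threshold_db)
        return flags                             # True = "probably speech"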
If my C compiler sometimes worked and sometimes didn't I would just mash compile like an ape until it started working.