Reading some documentation to figure out a format is something you do once, and it takes you a few minutes.
Are you a developer? Then this is something you probably do a couple times a day. Prompting the correct version will take longer and will leave you with much less understanding of the system you just implemented. So once it fails you don't know how to fix it.
I love that the posture is "I have a problem I need you to fix," haha.
I don't need you to fix my problems. I'm reporting that the LLM-based solution beats the dogshit out of the old "become a journeyman on one of 11 billion bullshit formats or processes" practice.
I'm not trying to help you, I'm just wondering how the LLM actually helps you.
You don't need to become a journeyman at understanding a format, you just need to see a schema, or find an open source utility. I just can't comprehend the actual helplessness that a developer would have to experience in order to have to ask an LLM to do something like this.
If I were that daunted by parsing a standardized file format for a workflow, I would have to be experiencing a major burnout. How could I ever assume I could do any actual technical work if I'm overwhelmed by a parsing problem that has out-of-the-box solutions available?
I’ll give you a real concrete example. I had to build an app on the Mac, which needed to be signed. I did not want to learn Apple signing procedures in order to do this. It turns out I did not have to, because I got the robot to learn it. So then I was able to finish doing what it was I intended to do without having to spend an afternoon or a day misunderstanding the Apple signing procedures.
Could I have learned these and become a more virtuous person by knowing Apple's signing rules? Maybe. What's much more likely is that I might've just stopped doing this rather than deal with that particular difficulty. Instead, I was able to work on other problems that arose in the building of this application.
What I am suggesting to you is that I don’t have to fucking feel bad for being daunted anymore. And neither does anyone else. Folks that want to do that on their own time are free to, but I’m never going back.
There are a lot of projects where this is gonna start to be the operative situation. Folks who might have gotten stuck on an early stumbling block are now just moving ahead and are learning about different and frankly more interesting problems to solve. I'm still beating my head on things, but not on "did I get this format just right?"
This shift is analogous to how we took having to do computer arithmetic out of the hands of programmers in the 80s. There used to be a substantial part of programming that was just computer arithmetic. Now, almost nobody does that. Nobody in this thread could build a full adder if their life depended on it or produce an accurate sin function. It used to be that that would've stopped you cold in trying to answer an engineering problem on a computer. Now it doesn't. We do not run around telling people that they're not engineers or that they're not learning because we have made this affordance.
A full adder is literally one of the easier theoretical computer science concepts, and a sine approximation is a simple Maclaurin series. And yes, if you can't do a simple series expansion, you are not an engineer. You may be a developer, but not an engineer.
These are both first or second year bachelors topics. Just because you're unable to work through simple math problems doesn't mean any semi-competent computer professional would be.
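For anyone rusty, the "full adder" in question really is just a few gates. A minimal sketch in Python (the function names and the ripple-carry wrapper are mine, purely for illustration):

```python
# One-bit full adder: sum is XOR of the three inputs,
# carry-out is the majority function of the three inputs.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (a & carry_in) | (b & carry_in)
    return s, carry_out

# Chain full adders bit by bit to add two unsigned integers.
def ripple_add(x, y, bits=8):
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

# e.g. ripple_add(13, 29) == 42
```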
Was it a good thing for anyone writing software which included those things to have to work out not only how they behave on a blackboard, but how they behave on the real machine in question? And on the next machine over?
Do you yearn to return to that world? I suspect most people don't. It's not just knowing your own machine, but any machine the code could run on. It's also not just reaching for some 2nd year bachelor topics when the matter at hand is much more complicated. Where does your sine approximation fail? How do you know? Can you prove that? Does the compiler or the hardware decide to do things behind your back which vitiate any of those claims?
Knowing the answer to all that every time you need a sine is not something 99.99% of engineers need to worry about. IT USED TO BE. But now it's not. No one is going back to that.
I don't know what world you live in, but I still definitely need to know the approximation error of the methods I use.
sin(x) has one of the simplest Maclaurin series:
sin(x) = x - x^3/3! + x^5/5! - x^7/7! ...
For any partial sum of that alternating series, once the terms are shrinking in magnitude (which happens immediately for |x| <= 1, and eventually for any x), the error is strictly less than the absolute value of the first omitted term. The fact that this was your example of a "difficult" engineering problem is, uh, embarrassing.
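To make that concrete, here's a rough Python sketch of the partial sum and the alternating-series error bound (the term count and names are mine, just illustrative):

```python
import math

def maclaurin_sin(x, terms=8):
    """Partial sum of sin's Maclaurin series: x - x^3/3! + x^5/5! - ..."""
    total, term = 0.0, x
    for n in range(terms):
        total += term
        # Next odd-power term: multiply by -x^2 / ((2n+2)(2n+3)).
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))
    return total

# Five terms cover powers up through x^9; the first omitted term is
# x^11/11!, and the true error stays below its magnitude.
x = 1.0
approx = maclaurin_sin(x, terms=5)
assert abs(approx - math.sin(x)) < x**11 / math.factorial(11)
```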
For good measure, I would of course fuzz any component involving numerical methods to ensure it stays within bounds. _As any competent engineer would_.
And I absolutely work things out on pen and paper or a white board before implementing them. How else would I verify designs? I'm sure you're aware that fixing bugs is cheapest in the design phase.
Are you living in an alternate reality where software quality does not matter? I'm still living in the world where engineers need to know what the fuck they're doing.
Oh, IEEE 754 double precision floating point accuracy? Rule of thumb is 15-17 significant digits. You will probably get catastrophic cancellation if you evaluate the raw alternating series at large |x|, so you range-reduce to one period first. As I said earlier, the easiest solution in this case is just to measure. You don't really need to fuzz a sine approximation; you can scan over one period and compare against exactly calculated tables. I would probably add a cutoff around zero and just return x there if there are accuracy issues.
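A sketch of that measurement, using the library sin as a stand-in for exactly calculated tables (the 12-term partial sum is just a hypothetical implementation under test, not anyone's production code):

```python
import math

# Hypothetical implementation under test: a 12-term Maclaurin partial sum,
# which converges well on one period centered at zero, [-pi, pi].
def approx_sin(x, terms=12):
    total, term = 0.0, x
    for n in range(terms):
        total += term
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))
    return total

# Scan one period on a fine grid and record the worst absolute error.
samples = 10_000
worst = max(
    abs(approx_sin(x) - math.sin(x))
    for x in (-math.pi + 2 * math.pi * i / samples for i in range(samples + 1))
)
# Truncation alone predicts an error around 1e-13 at the endpoints here.
```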
And if the measurement shows the approximation has too much floating point error, you can always move to Kahan sums or quad precision. This comes up fairly often.
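For reference, Kahan (compensated) summation is only a few lines. A minimal sketch, with variable names of my choosing:

```python
# Kahan summation: carry a running correction for the low-order
# bits that plain floating-point addition throws away.
def kahan_sum(values):
    total = 0.0
    c = 0.0                  # compensation for lost low-order bits
    for v in values:
        y = v - c            # apply the correction from the last round
        t = total + y        # low-order bits of y may be lost here
        c = (t - total) - y  # recover exactly what was lost
        total = t
    return total
```

Summing 0.1 a million times shows the point: the naive running sum drifts off the true value by roughly a millionth, while the compensated sum stays within a couple of ulps.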
If I really had to _prove_ formally an exact error bound, that would take me some time. This is not something you would be likely to have to do unless you're building software for airplanes, or some other safety critical domain. And an LLM would absolutely not be helpful in that case. You would use formal verification methods.