What most people forget is that as a programmer/designer/etc., your job is to listen to what your client/customer tells you they want, then try to give them what they ACTUALLY NEED, and that distinction deserves to be highlighted. Most people making requests to programmers don't really know what they want, or why they want it. They had some meeting where people decided, 'Yes, we need the program to do X!' without realizing that what they're asking for won't actually get them the result they want.
AI will be great at giving people exactly what they ask for… but that doesn't mean it's what they actually needed…
Yesterday the test team asked me for 3 new features to help them. I thought about it for a few minutes and realized the features are mutually incompatible: you can have one and only one. Good luck finding an AI that understands this.
Great points. Also:
… AI will be great at giving people exactly what they ask for …

Honestly, I'm not even sure about this. With hallucinations and increasingly complex prompts that it fails to handle, it's just as likely to regurgitate crap. I don't even know if AI will get to a better state before all this dev-firing starts to backfire and sours most companies on touching AI for development at all.
Humans talk with humans and do their best to come up with solutions. AI takes prompts and looks at historical human datasets to try to determine what a human would do. It's bound to run into something novel eventually, especially if there are no new datasets to pull from because human-generated development solutions have become scarce.
AI will never stop requiring a human to hand-hold it, because AI can never know what's true.
It doesn't "know" anything. All it has is ratios of usage between the connected entities we call "words" (there's a toy sketch below).
Sure, you can run it and hope for the best. But that will fail sooner or later.
AI will. LLMs won't.
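To picture what "ratios of usage between words" means, here's a toy sketch. It's a hypothetical bigram model over a made-up corpus, nothing like a real LLM in scale or architecture, but the core idea is the same: the model stores how often words follow each other and samples from those ratios. There is no notion of truth anywhere in it.

```python
from collections import Counter, defaultdict
import random

# Toy "language model": nothing but usage ratios between adjacent words.
corpus = "the dough rises because the yeast eats the sugar".split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(prev: str) -> str:
    # Sample the next word in proportion to how often it followed `prev`
    # in the training text. No facts, no truth checks: just ratios.
    counts = following[prev]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

print(next_word("the"))  # "dough", "yeast", or "sugar", weighted by frequency
```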
Also, an LLM usually has no memory or experience. It's the first page of a Google search every time you put in your tokens: a forever-trainee who never leaves that stage of their career.
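A quick hypothetical sketch of why a chatbot only *seems* to remember: the model call itself is stateless, so the client resends the entire transcript on every turn. The `complete` function here is a stand-in for any text-in/text-out model call, not a real API.

```python
# Hypothetical sketch: chat "memory" is just a growing prompt.
# `complete` stands in for any stateless text-in/text-out model call;
# it is not a real library function.
history: list[str] = []

def ask(user_message: str, complete) -> str:
    history.append(f"User: {user_message}")
    prompt = "\n".join(history)   # the whole conversation, resent every time
    reply = complete(prompt)      # stateless: forgets as soon as it returns
    history.append(f"Assistant: {reply}")
    return reply
```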
Human abilities like pattern recognition, intuition, and the accumulation of proven knowledge combine to make us more and more effective at finding the right solution to anything.
The LLM bubble can't replace that, and it actively hurts it, because people get distanced from actual knowledge behind the closed door of the LLM. They learn how to formulate their requests instead of learning how to do the stuff they actually need. That outsourcing makes sense when you need a cookie recipe once a year; it doesn't when you work in a bakery. What makes the dough behave each way? You never need to ask, so you'll never know.
And the difference between asking somewhere like Lemmy and asking a chatbot is the utterly convincing manner in which the chatbot tells you things, while forums, Q&A boards, and blogs run by people usually have some human qualities behind the replies, plus the option for someone else to throw a bag of dicks at the suggestion of formatting your system partition or turning it off and on again.
That stuff should really get worked out in the agile process, as the customer reacts to each phase of the project.
Getting the real requirements nailed down from the start is critical, not just doing the work the customer asked for. Otherwise you get 6 months into a project and realize you have to scrap 90% of the completed work because the requirements were bad from the get-go. The customer never fundamentally understood the problem, and you never bothered to ask. Everyone is mad, and you've lost a repeat customer.
Yeah, but with agile they should be checking the product out when it's a barely working PoC to determine whether the basic idea is what they expect, and they should be seeing each stage as it advances. You'll never get the proper requirements by second-guessing what they say.