What are we doing to our kids?
-
Now, when ChatGPT can write essays better than school kids, and has answers to lots of questions, it seems to me that it could also answer the exam questions kids are getting in school or college. Of course, the AI proponents are going to praise this as proof of how "intelligent" ChatGPT is - it's so good, it could pass a college exam! But is it? Isn't it rather a poor comment on what nonsense we are doing in schools? Is schooling really meant to be repeating random facts, regurgitating what you have been told so you can spit it out again on an exam paper? Is this "learning"? If you think that's learning, THEN of course ChatGPT is "intelligent". Even Einstein apparently said, "Most of my work came from imagination, not logical thinking. And if you have problems with mathematics, I assure you mine are still greater."

A school should prepare kids for life, give them some competence they can use, some knowledge they can apply, make them curious to create and use their imagination. Cramming data down their throats is, in my opinion, NOT what a school should do.

It's just another example of how "automation" takes something away from humans. But is it really taking something away, or is it rather pointing out that this was, after all, not really human work to begin with? Was it human to die as a slave while carrying stones to the pyramids in Egypt, or rowing the Roman boats? Certainly it wasn't - and now it's replaced by machines. It certainly created some unemployment, I guess - the really stupid people were then unemployed. But what business does anyone have to be stupid? That's where schools come in. But they now just make kids into parrots, easily replaced by chatbots. Maybe ChatGPT just points out that "robotic" repetition really does not have a place in our schools. Something needs to change here, doesn't it?
-
nepdev wrote:
Something needs to change here, doesn't it?
Yes, and it's the belief that ChatGPT is somehow any good. You can blame the mainstream media (once more) for inciting mass hysteria. Someone else has already summarized ChatGPT's fundamental problem succinctly: it is confidently wrong. I find it hard to disagree with that. It's very little more than the sort of parlor trick BS artists manage to pull off.
-
Not being an anthropologist or a sociologist or whatever, I have recently been thinking (because I can) about the nature of "intelligence" (and sapience). Particularly when people assert that many of the other animals on our planet are also sapient and intelligent -- which I don't deny. But that means that "something else" must separate humans (and probably our extinct proto-human ancestors) from "the lower animals" -- but not "intelligence". Because certainly "something" separates us from even our closest opposable-thumbed relatives. Either that or there is disagreement on what constitutes intelligence -- I'm probably too stupid to have it explained to me.

While we humans have amassed a large collection of knowledge and technical ability, I don't think we have actually become more intelligent than our stone-age forebears. In particular, we have the technology to transfer knowledge to others. Consider: One day a cave man -- who did not have a stone hammer -- made a stone hammer, the first stone hammer ever conceived in the history of mankind. He did not become more intelligent by making the stone hammer, he already had the intelligence required to do it. But he gained knowledge and technical skill. He could then teach others to do the same, and those others gained knowledge and technical skill, but not intelligence -- they also already had the required intelligence to grasp the concept and the benefits to their lifestyles.

Regardless of what the CIA thinks, I don't think knowledge and skill equate to intelligence. In short, I don't think an AI has intelligence at all, it has only knowledge -- and logic programmed into it about how best to manipulate and present that knowledge. The only way to win is not to play the game.
-
nepdev wrote:
A school should prepare kids for life, give them some competence they can use, some knowledge they can apply, make them curious to create and use their imagination.
This is why I sent my kid to a Waldorf School K-12; fortunately, I had the ability to do so. I was stunned when, a couple of months ago, he called me simply to say how much he appreciated that I had done that. That was nice. :)
Latest Article:
SVG Grids: Squares, Triangles, Hexagons with scrolling, sprites and simple animation examples -
PIEBALDconsult wrote:
But that means that "something else" must separate humans
Does it? Is it really a categorical difference? Could it be so simple as elephants not ending up with opposable thumbs, for example? Or rats never losing their fur and developing oral communication? I'd argue that if there is a thing that separates us from animals, it's probably a combination of things. Opposable thumbs, highly articulated language, the inclination to alter our environment rather than adapt to it. All of these things led us to the top of the proverbial pecking order in nature's hierarchy of life. I don't know if there's any one thing that's fundamentally different from other animals, at least in terms of category. Other animals have language (dolphins), but not as articulate. Other animals alter their environment (beavers), other primates have opposable digits, as do many birds. With the disclaimer that I am not religious: I remember reading the fall of man story in the Old Testament and coming away thinking that (based on my exegesis) part of what it was saying is that what makes us different from the rest of the animals is the ability to define and articulate elaborate moral frameworks ("the tree of knowledge of good and evil"). I thought that was interesting.
To err is human. Fortune favors the monsters.
-
PIEBALDconsult wrote:
But that means that "something else" must separate humans (and probably our extinct proto-human ancestors) from "the lower animals" -- but not "intelligence".
The simplest concept of what that "something else" is, is the ability to choose. Animals generally respond instinctively (they can be trained to not respond instinctively, but that's still not choice.) We humans are unique in that we can choose not to respond by instinct. Ooh, that cake looks delicious, I'm going to eat it. Or, nice cake, but I'm watching my calories. Or I'm lactose intolerant so eating that would not be a good idea. Conversely, my cat loves to chew on certain plant leaves regardless of how many times he barfs them up later. Therefore, I would say that intelligence is making good choices based on knowledge and skill, and also making poor choices for reasons we are conscious of but choose to ignore.
I believe if you keep digging you'll find that the core issue is: they teach them _what_ to think, not _how_ to think. That's the difference between creative humans and mindless minions. ChatGPT will "tell you the answer" and you cannot argue with that. It is the genius in the room and knows far more than you. It's bureaucracy fully played out. But the media darlings love this, because then there are no arguments.
-
I've been saying this ever since I was in elementary school. The school system is inherently flawed. Curiosity is punished rather than rewarded. It does some things well, given that most of us are at least somewhat functional adults (stretching the definition of functional a bit there :laugh:). But a recent study, for example, showed that kids in the Netherlands stop reading when they go to school. It has to do with the mandatory reading. Apparently, you're allowed to read LOTR or Harry Potter, but all schools talk about is rather ancient reading, because that's what the teachers know. The books that are discussed in detail today are the same as the books in my time, which were the same books my parents had to read. Once you've read a few of those books you'll learn to hate reading.

I may not be the best example, since I've always hated school, but I loved programming. Until I went to school for it. During that time I didn't touch Visual Studio in my spare time for months. When I quit, it took a while for me to get active again. Eventually, I got some of my enthusiasm back. All in all, kids are getting worse in almost every subject, especially language and math.
Best, Sander Azure DevOps Succinctly (free eBook) Azure Serverless Succinctly (free eBook) Migrating Apps to the Cloud with Azure arrgh.js - Bringing LINQ to JavaScript
-
So I think we are basically in agreement. :-D By "something else" I don't necessarily mean it has to be one thing, it may be a subtle variety of smaller things combined. Yes, there are "lower animals" which do some of the things we do, such as communicating, using simple tools, making shelters, altering the environment (e.g. beavers making ponds). But we're the only ones who do all of them and more and to a mind-boggling extent. Surely our ancestors learned from them. I don't think we can make assumptions about the morals of other animals, especially cats.
-
You're a music aficionado and you can find the entire explanation in The Logical Song (1979) by SuperTramp:
Logical Song:
When I was young, it seemed that life was so wonderful
A miracle, oh it was beautiful, magical
And all the birds in the trees, well they'd be singing so happily
Oh joyfully, playfully watching me
But then they send me away to teach me how to be sensible
Logical, oh responsible, practical
And they showed me a world where I could be so dependable
Oh clinical, oh intellectual, cynical

There are times when all the world's asleep
The questions run too deep
For such a simple man
Won't you please, please tell me what we've learned
I know it sounds absurd
Please tell me who I am

I said, watch what you say or they'll be calling you a radical
Liberal, oh fanatical, criminal
Won't you sign up your name, we'd like to feel you're acceptable
Respectable, oh presentable, a vegetable
Oh, take it take it yeah

But at night, when all the world's asleep
The questions run so deep
For such a simple man
Won't you please tell me what we've learned
I know it sounds absurd
Please tell me who I am, who I am, who I am, who I am
'Cause I was feeling so logical
D-d-digital
One, two, three, five
Oh, oh, oh, oh
It's getting unbelievable
Listen here: The Logical Song ( Lyrics ) Supertramp - YouTube[^]
-
dandy72 wrote:
It's very little more than the sort of parlor trick BS artists manage to pull off.
Um, not really - at all. It could not be further from the truth.
ChatGPT passes MBA exam given by a Wharton professor - University Business[^]
https://healthitanalytics.com/news/chatgpt-passes-us-medical-licensing-exam-without-clinician-input[^]
ChatGPT passes exams from law and business schools | CNN Business[^]
ChatGPT is very real and will get better, faster, and more accurate every day going forward. The real danger of AI bots like ChatGPT is their eventual use as a weapon for political and social control.
-
I'll need to give that more thought. Personally, I'm unclear on what constitutes instinct anyway, so I may be a bit lost. As to choice, I'd still be unsure where to draw the line. For instance: when a pack of predators attacks the weakest members of a herd of prey, is that instinct or choice? Wouldn't instinct demand they attack the largest/meatiest? Is attacking the weakest members a learned strategy? This reminds me of "A Beautiful Mind". I think humans have probably lost much of the instinct our ancestors must have had and replaced it with learned knowledge. Maybe that's what makes the difference today, but there still must have been a chooser-zero who had the ability and acted on it. Probably some bratty kid refusing to eat his mammoth.
-
I think you only have to look at how ChatGPT actually works - it tries to figure out the "best" next word in the sentence. That is NOT how to reason or think. Do you think that way? Certainly not - I would guess you have a CONCEPT first, before you open your mouth. ChatGPT has no concept. It is just word babble. Thinking is not talking, no matter how many "scientists" may tell you that the way we think is through words. Einstein did not. And what about musicians? They don't think "now I need to put an F# semiquaver here in this position" (and if they do, their music is balderdash).
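To make the "best next word" point concrete, here is a deliberately tiny sketch of my own (an illustration only - real ChatGPT is a vastly larger neural model, not a bigram table): a toy model that repeatedly picks the word that most often follows the previous one in its training text. The output looks vaguely fluent, yet there is no concept anywhere, just statistics.

```python
# Toy next-word babbler: a bigram frequency table, greedily chained.
# Illustrative only - NOT how ChatGPT is implemented, just the
# "pick the statistically likely next word" idea taken to its minimum.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word, and how often.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def babble(start, length=5):
    """Greedily chain the most frequent next word - fluent-looking, concept-free."""
    words = [start]
    for _ in range(length):
        options = follows[words[-1]]
        if not options:
            break  # dead end: no word ever followed this one
        words.append(options.most_common(1)[0][0])
    return " ".join(words)

print(babble("the"))  # grammatical-looking word salad, no meaning intended
```

Scale the table up to billions of parameters and the babble becomes eerily convincing, but the mechanism being criticized above is the same: next-word selection, not a prior concept.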
-
But that's my exact point - what kind of questions do they ask, if they can be answered by a mindless robot? No matter how "reputable" that exam is.
nepdev wrote:
mindless robot
wow, just wow. :confused: :doh: :sigh:
-
nepdev wrote:
Is schooling really meant to be repeating random facts, regurgitate what you have been told so you can spit it out again on an exam paper? Is this "learning"?
For the most part the answer is yes. Despite what many might claim I see little evidence that formal education in its entirety teaches people to 'think'. Might get lucky in some cases but for the most part it is just a matter of learning the exact date of '1492', where the comma goes in the sentence and how to add and subtract.
nepdev wrote:
Cramming data down their throat is, in my opinion, NOT what a school should do.
Many others have that opinion as well. And there have been many attempts to provide an alternative way. But none of those work (they seem to work in very, very small tests but in larger rollouts they change nothing or even lead to more problems.) So certainly if you know a way that can provide revolutionary change to education then you should step up and start proving it and then popularizing it.
nepdev wrote:
Even Einstein apparently said "most of my work came from imagination, not logical thinking. And if you have problem with mathematics, I assure you mine are still greater."
The educational system targets the average not the above average. That is necessary because, by definition, most of the users (students) of the system are and always will be average.
nepdev wrote:
Something needs to change here, doesn't it?
Discard the calculator? Discard the keyboard? Discard the slide rule? Discard pencil and paper? Should students be taught solely by making new copies of religious texts? Or recognize those are changes that do in fact help students. If AI agents were capable of, in general, answering all questions correctly then why not use such a useful tool. Doing so does not directly lead to no one thinking - it is just another tool. However nothing suggests that AI agents are even close to being capable of providing correct answers to even most things. They provide answers to many things but the validity is often in doubt and is in fact often wrong. But in the same way if you rely solely on the results of a calculator to build a bridge you should expect that it will fall down (probably even before it starts being used.)
-
dandy72 wrote:
It's very little more than the sort of parlor trick BS artists manage to pull off.
um not really, at all - could not be further from the truth.
ChatGPT passes MBA exam given by a Wharton professor - University Business
ChatGPT passes US medical licensing exam without clinician input: https://healthitanalytics.com/news/chatgpt-passes-us-medical-licensing-exam-without-clinician-input
ChatGPT passes exams from law and business schools | CNN Business
ChatGPT is very real and will get better, faster, and more accurate every day going forward. The real danger of AI bots like ChatGPT is their eventual use as a weapon of people/political power control.
Slacker007 wrote:
um not really, at all - could not be further from the truth.
Not true. For example, from the first link:
1. It was one test.
2. It was one class.
3. It did not score perfectly.
Then the second link:
1. The 'passing' score was only just reached, and that was 60%.
2. It is hardly the only thing that goes into becoming certified.
3. The text suggests this is not something new. They have run this test before, and this is just the first time it got a score that high.
4. Why would it matter? There are studies that suggest medical errors are in the top 10 causes of death in the US - and it could be as high as the top three. So are you worried that the software might make the wrong choice?
-
Don't you see that ChatGPT is only going to get "smarter" with time? Don't you see that? It's passing all the tests - barely, but passing. It won't be long at all before it passes all the tests with 100% scores. Humans make silly mistakes, like forgetting to remove all the gauze from a site before sewing it up. AI bots will not forget. I will be laughing at all of this, especially at you haters and doubters, every day till I die.
-
I dunno. Take out loans. Go to grad school. Have debts forgiven. Live long and prosper.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
-
I think you only have to look at how ChatGPT actually works - it tries to figure out the "best" next word in the sentence. That is NOT how to reason or think. Do you think that way? Certainly not - I would guess you have a CONCEPT first, before you open your mouth. ChatGPT has no concept. It is just word babble. Thinking is not talking, no matter how many "scientists" may tell you that the way we think is through words. Einstein did not. And what about musicians? They don't think "now I need to put an F# semiquaver here in this position" (and if they do, their music is balderdash).
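To make the "best next word" point concrete: here is a toy sketch of that prediction loop, not how ChatGPT is actually built (real models use huge neural networks, not bigram counts), just a minimal illustration with a made-up corpus and function names of my own choosing:

```python
from collections import Counter, defaultdict

# Hypothetical tiny "training corpus".
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which (a bigram table).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    """Greedily return the most frequent continuation of `prev`, or None."""
    counts = follows.get(prev)
    return counts.most_common(1)[0][0] if counts else None

def generate(start, n):
    """Chain greedy next-word picks to 'babble' up to n more words."""
    out = [start]
    for _ in range(n):
        nxt = next_word(out[-1])
        if nxt is None:
            break
        out.append(nxt)
    return out

print(generate("the", 3))  # chains one most-likely word after another
```

Notice there is no "concept" anywhere in this loop: each word is chosen only from statistics over what tended to follow it before, which is exactly the babble objection above, writ small.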
That is how it works now. You have to look past the now, and at the future. New versions, new updates, new branches. It will eventually become what we all fear and know to be true.