What are we doing to our kids?
-
Don't you see that ChatGPT is only going to get "smarter" with time? Don't you see that? It's passing all the tests, barely, but passing. It won't be long at all before it passes all the tests with 100% scores. Humans make silly mistakes, like forgetting to remove all the gauze from a surgical site before sewing it up. AI bots will not forget. I will be laughing at all of this, especially at you haters and doubters, every day till I die.
"Reinforced Learning with Feedback" is the algorithm that ChatGPT is based on. With a huge dataset taken from the internet and from the feedback of everyone that uses it telling it correct or incorrect it will "learn" but it won't learn the same way humans do. It won't have imagination.It won't have intuition. It won't have any of the human characteristics that make what we humans call Intelligence. That's why it'll always be ARTIFICIAL INTELLIGENCE. An IMMITATION of intelligence. Never the real thing. Let's get real. It's no more than just a system running an algorithm. If we treat is as such and it's no more than just a fancy toy. Treat it like your new "GOD" and well "It'll become faster, smarter" and all of that lah-dee-dah. Time to choose, humans.
-
Now, when ChatGPT can write essays better than school kids, and has answers to lots of questions, it seems to me that it could also answer the exam questions kids are getting in school or college. Of course, the AI proponents are going to praise this as proof of how "intelligent" ChatGPT is - it's so good, it could pass a college exam! But is it? Isn't it rather a poor comment on the nonsense we are doing in schools? Is schooling really meant to be repeating random facts, regurgitating what you have been told so you can spit it out again on an exam paper? Is this "learning"? If you think that's learning, then of course ChatGPT is "intelligent". Even Einstein apparently said "most of my work came from imagination, not logical thinking", and "if you have problems with mathematics, I assure you mine are still greater."

A school should prepare kids for life, give them some competence they can use, some knowledge they can apply, make them curious to create and use their imagination. Cramming data down their throats is, in my opinion, NOT what a school should do.

It's just another example of how "automation" takes something away from humans. But is it really taking something away, or is it rather pointing out that this was, after all, never really human work to begin with? Was it human to die as a slave carrying stones to the pyramids in Egypt, or rowing the Roman galleys? Certainly it wasn't - and now it's replaced by machines. It certainly created some unemployment, I guess - the really stupid people were then unemployed. But what business does anyone have being stupid? That's where schools come in. But they, now, just make kids into parrots, easily replaced by chatbots. Maybe ChatGPT just points out that this "robotic" repetition really has no place in our schools. Something needs to change here, doesn't it?