this post was submitted on 13 Oct 2023
35 points (100.0% liked)
Solarpunk
5463 readers
31 users here now
The space to discuss Solarpunk itself and Solarpunk related stuff that doesn't fit elsewhere.
Join our chat: Movim or XMPP client.
founded 2 years ago
Things my kid learnt through "socializing and learning things naturally":
Notwithstanding the fact that we are talking about future tech, and that the pace at which AI is advancing right now is crazy, I think that even today, on these metrics, they do better than we do as a society without them.
The examples you provide are negatively biased. You don't know all of the normal and useful things they learn, because those don't stand out. Also, two of those examples (church and school buses) come from current cultural biases, something a solarpunk society would hopefully mitigate.
I think AI is not suited for discussion. It might be good at conversation, but discussion isn't just conversation. Discussion requires understanding others to a degree I don't think AI can achieve.
I concede my point about resources, but I will add that the model will get outdated and will need retraining every once in a while.
Textbooks are bad, I agree. I just think they should be replaced with a human who knows what they are talking about, and that the topics learnt should be things the kid actually wants to know instead of what people think they should know.
Also, I can't help but notice you ignored one of my core arguments: that solarpunk societies are about strong human connections, and that replacing one of the main sources of these connections is a bad idea.
I also think that the process of finding information is as important as the information itself. If all of your questions are answered just by typing them into a computer, you never learn the importance of checking information for accuracy, accounting for bias, and other very useful skills.
AI lets you shortcut straight to the information you seek, which means you never learn how to actually think for yourself.
I'll address that one first, then.
I have had people tell me that we should not automate cashiers because it removes human connections. I believe that humans want human connections, and that if you remove the obligation to get something out of these connections, they will become richer and more meaningful. I sometimes feel I am tricking my kid to trigger their interest in "useful" things. I'd rather play with him than force him to rewrite the same word in pretty, round letters a hundred times.
When you remove painful obligations from a human connection, you may sever some connections that you were forcing yourself into, but you also make room for many more of them. If teaching were taken care of, I would spend more time showing my kid what I really love in life: places that make me feel at peace, events that make me feel alive, techniques that I find interesting despite their practical uselessness.
Humans are social animals. We make social connections and may even die for them. Don't worry, removing an obligation will not remove the need for more meaningful ones.
Now for the rest:
Why do you assume that a society that manages to mitigate biases central to our current problems in social relations would have a hard time mitigating bias in an AI's training dataset?
Look at what AI mentors do today, right now. Ask them follow-up questions; ask them about what you don't understand. Several conversations with GPT-4 (which is the best right now, but trailed by more open models) convinced me otherwise. Even if you argue that such models are currently not as good as a human tutor, I find it hard to argue that they don't beat the "conversation" of a class of 30 with a lone teacher.
True in every day and age, and in every medium. I had to teach it to several kids in my family, as school seems to do a pretty poor job of it. Interestingly enough, that is something LLMs can teach well too, despite being (currently) pretty poor at it themselves.
Okay, I agree: LLMs might be useful for education. But they should not replace practical experience.
I'm just skeptical of everyone who says we should replace something with AI, because most of the time these decisions seem to be motivated by profit and aren't actually better.
I think 99% of the companies that hope to replace something with AI for profit are going to fail, as these models become lighter and lighter, gain open-source equivalents, and have a VERY motivated community behind them working to prevent corporate lock-in.