Eh, LLMs do have a significant problem in that they can generate false information by themselves. Every prior tool required a person to create that false information, but LLMs can just generate it when asked a question.
You're being downvoted, but you're right that at least some do this.
ToS agreements are generally not binding, as the average person isn't expected to actually read through the dense language. There is precedent for this.
I take my ADHD meds and put on some breakcore
My hope is that Russia barely manages to defeat the Wagner group, then is left so weakened that their corrupt government completely collapses and real change can start
Even power users do not like him one bit
Icalasari
Last I checked, you kinda need lungs to breathe. And also last I checked, arms and legs aren't organs