AI language models are trained on vast amounts of text created by scientists, writers, thinkers, entertainers and more. Those human contributions, Lanier says, should not be erased from the output of language models. That erasure is what gives people a perception of AI as “this new super alien angel who's going to come and save us or kill us” rather than what it really is or could be: “a new, very high-level, very large-scale form of cooperation [that] is even more glorious than having a big alien angel. I think it's really cool.”
...
“It’s essentially doing statistics on word order and distance in a big amount of data, and then using that to create a new thing that's informed by it,” Lanier said. “Isn't it surprising that natural language can support this rather simple thing and get this result? Shouldn't we be thinking to ourselves, ‘Wow, there's kind of more going on with natural language than we realize?’”
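As a toy illustration of the idea (my sketch, not Lanier's description or an actual language model), "statistics on word order and distance" at its very simplest can mean counting how often one word follows another within a short window of text:

```python
from collections import Counter

# Toy sketch: count ordered word pairs that appear within `window` words
# of each other. Real language models are vastly more sophisticated; this
# only illustrates the bare notion of word-order-and-distance statistics.
def cooccurrence_counts(text, window=2):
    """Count ordered word pairs (w1, w2) where w2 appears within
    `window` words after w1."""
    words = text.lower().split()
    counts = Counter()
    for i, w in enumerate(words):
        for j in range(i + 1, min(i + 1 + window, len(words))):
            counts[(w, words[j])] += 1
    return counts

counts = cooccurrence_counts("the cat sat on the mat the cat slept")
print(counts[("the", "cat")])  # "cat" follows "the" twice in this text
```

From counts like these, a model can estimate which words are likely to come next; the surprising point in Lanier's quote is how far that basic statistical move can go when scaled up.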