Key insights of today’s newsletter: There’s been a lot of confusion about what AI is and isn’t, and much of that confusion can be attributed to the language used to describe AI. Not only does this confusion feed an over-attribution of AI capabilities, it also contributes to a narrowed understanding of the human mind.
I found the passage on describing people increasingly like machines, and vice versa, very interesting. In fact, reading different papers, I happen to come across very similar titles. This is probably also because we want to emphasize those characteristics of these machines that are easier to understand. Perhaps it is also a stylistic choice to reduce friction between new users and chatbots. Don't you think?
You write, "I’d even go as far as to say that language alters our perception."
To be more precise, what alters our perception is the nature of what both the speaker and their speech are made of: thought. I don't mean the content of thought, but the nature of the medium itself, the way it works. Language reflects the properties of what it is made of.
Thought operates by dividing the single unified reality into conceptual parts. The noun is the easiest example of this. More here:
https://www.tannytalk.com/p/article-series-the-nature-of-thought
It could be interesting to speculate about the degree to which AI will also inherit and reflect the properties of thought. As I'm typing this, I'm realizing I haven't really thought about that enough.
I sympathise with your view, Phil. I haven't thought about it in much detail either, but I'm a fan of Wittgenstein's view on language, which can be summed up as: "The meaning of a word lies in its use." Everything is context. Only by engaging in a "conversation" with the world can we learn about it in a meaningful way. AI, of course, does NOT engage in this conversation with the world; it merely has access to the words.
AI doesn’t have to be intelligent to actively participate in generating new understanding. https://open.substack.com/pub/cybilxtheais/p/the-hall-of-magnetic-mirrors?r=2ar57s&utm_medium=ios
Besides intelligence and learning, there is a case to be made that AI is not capable of imagination or intuition: https://open.substack.com/pub/unexaminedtechnology/p/the-two-is-we-need-to-include-in?r=2xhhg0&utm_medium=ios