Irrespective of whether I agree with everything or not, this article genuinely gives me a pinch of hope. I also found shades of Dostoevsky’s Notes from the Underground in the way you wrestle with the tension between material abundance and inner emptiness. Thank you for this piece!
Human nature…the good, the bad, and the ugly. People are terrified of AI because it is the good, the bad, and the ugly reflection of ourselves. The bad—lying, lack of empathy, biases, soulless efficiency, etc. The good—medicine, space exploration, new knowledge, etc.
Hunger for meaning is ancient, but we got there through struggle and hard work, and we don’t feel value in cheating our way out of half a million years of human history. People don’t want children now because they are petrified of an uncertain future. We still have caveman brains. We are still superstitious…or religious, in our own way. AI is our own Lucifer—the angel of God that could turn away from the good.
Overheard a guy in SF who was interviewing for AI dev jobs. One company told him: yeah, we have some problems, but you don’t want Grok to come out on top, do you? So it’s a kind of AI battle between good and evil. This is what humans understand, because this is us. Religion is not in churches anymore, but people still believe in the dualistic nature of reality. Reality/technology is whatever we make it. And we know what human nature can do.
AI is not some technology substitute or addition where the Jevons paradox would come into play (what you are describing). It’s a full replacement of all cognitive work tasks (and later on physical work as robotics continues improving).
Once again, exactly correct for the wrong reason.
I’ll take the win