Yuval Noah Harari Reveals the Real Dangers Ahead
Simple ideas...explained complexly to sell books and seem mystical.
Yuval Noah Harari is an author. He wrote Sapiens and 21 Lessons for the 21st Century.
When you talk to the Fourth Industrial Yuppies, they’re basically parroting Yuval’s ideas with child-like wonder. These are basic ideas…explained mystically.
So before you go to your next meeting, learn to identify the ideas he’s putting forth and how to respond to them. It’ll make people think you’re smart…so you can charge them more money or whatever.
Overview
In this video, Yuval Noah Harari discusses the dangers of a world in which technology is replacing human jobs and humans are becoming more reliant on information. He also discusses the dangers of surveillance capitalism and how it could lead to totalitarian regimes.
Response: Surveillance Capitalism is a phrase that Fourth Industrial Yuppies will often parrot…even as they spend 100% of their time surveilling data. They’ll talk about everything wrong with capitalism while feeding their families with it. When someone mentions surveillance capitalism, let them know that:
Risk cannot be destroyed; merely transformed. - Hoffstein
The Theory of Narrative Causality
00:00:00 Yuval Noah Harari discusses how humans have gotten to where we are in large part because of our ability to create and believe fictional stories, and how this power has led to our current situation where we don't have a narrative to explain what is happening in the world.
Response: Yes yes yes the Theory of Narrative Causality by Terry Pratchett. What Yuval is really getting at is that we worship the pie chart instead of the crucifix because the pie chart aims at simplifying the hopelessly complex. What could be more divine than a tool that simplifies the complex?
People Think The Past Is Better Than The Present
00:05:00 Yuval Noah Harari discusses the history of the term "liberal," which covers a broad range of political, economic, and personal values. He argues that since the 1990s, the world has seen numerous advances that were not possible in previous decades, but that this has created a false sense of security and has led some people to nostalgically recall a "better" past.
Response: Yup. And?
Humans Are Being Left Behind
00:10:00 Yuval Noah Harari discusses the disillusionment and backlash against the liberal order, which he believes is due to a sense of being left behind in the story of the 20th century. He argues that this is due to the idea that the big heroes of the story are the common people, rather than all people, and that this has led to the current dissatisfaction.
Response: This feeling of disillusionment has never not been with us.
Will AI Replace Humans?
00:15:00 Yuval Noah Harari discusses the potentially negative consequences of advances in artificial intelligence and machine learning. He warns that many jobs will be replaced, and that many humans will be left behind. However, he believes that many humans will be able to adapt and find new and exciting jobs.
00:20:00 Yuval Noah Harari discusses the dangers of technology replacing human jobs, and offers ways to avert these scenarios. He argues that society will need to redefine what it means to work and live in order to avoid drastic consequences.
Response: Yes yes yes - society advances by the number of operations it can perform without thinking. This is a whole pile of duh. Risk cannot be destroyed, only transformed (Hoffstein).
The Data Economy Cannibalizes
00:25:00 Yuval Noah Harari discusses the dangers that arise from the current data economy, in which people give up their data for free services. He recommends that we think about ways to attribute the value of data back to its individual owners, which could lead to growth in the economy.
Response: You can see that what he’s doing is pointing out problems without really saying much. It’s easier to describe a problem than solve it (Thiel).
The Information Tax
00:30:00 Yuval Noah Harari discusses the dangers of a world in which only information is exchanged, and suggests that we may need to adopt an information tax to reflect the value of data.
Response: Finally! An idea! I’m sure he’s not doing anything about any of these things. What a stupid thing to be - a critic who does nothing.
Surveillance Capitalism and Totalitarian Regimes
00:35:00 Yuval Noah Harari discusses the dangers of surveillance capitalism, which could lead to totalitarian regimes. He points out that the technology is also capable of providing us with better healthcare and guidance in our lives. However, there are also dangers associated with the technology, such as abuse.
Response: Wow - this guy is basically a liberal arts student with footnotes.
Self-Reflection Is Important
00:40:00 Yuval Noah Harari discusses how algorithms can often be better at making decisions than humans, and how this can pose a challenge to humans' understanding of what is good and bad. He suggests that we need to develop guidelines for making decisions about what is important to us, and that self-reflection and self-exploration are often more difficult than we imagine.
Response: Oh yes, you think we should balance Christianity with Taoism….duh.
The Importance of Meditation
00:45:00 Yuval Noah Harari discusses the benefits of meditation and how it can help us to better understand our own attention and how it is often misleading. He goes on to say that without regularly practicing meditation, we will be unable to understand the current state of the world and its attention economy.
Response: Oh yes, you think we should balance Christianity with Taoism….duh. If there’s one thing Fourth Industrial Yuppies love to misunderstand, it’s Buddhism and Hinduism. If they talk about mindfulness, they’re mindless.
Consciousness in Tech
00:50:00 Yuval Noah Harari discusses how technology can be used to exploit and manipulate people, and how we need to be careful not to let it do so. He offers the example of how Google is not conscious, and suggests that we may be able to write ourselves back into the story of consciousness as the only things in the universe, as far as we know, capable of what matters most.
Response: Let me guess - everything is bad. Yeah yeah yeah I get it. AI is replacing humans. AI is replacing what humans do to make meaning from their lives - there’s a difference.
Stories…Again
00:55:00 Yuval Noah Harari discusses the dangers of global cooperation without a shared story, pointing to examples like the power of science and the American Dream. He argues that while some stories may be useful in the short term, over time they may become truer or more useful.
Response: What a complicated simple man.