Video transcript
00:00:03 When do you see what you define as digital super intelligence?
00:00:04 Uh, within 10 years.
00:00:06 The AI's ability to generate its own scaffolding is imminent. I'm pretty much sure that will be a 2025 thing. We certainly don't know what super intelligence will deliver, but we know it's coming.
00:00:19 And what do people need to know about that?
00:00:22 You're going to have your own polymath. So you're going to have the sum of Einstein and Leonardo da Vinci in the equivalent of your pocket. Agents are going to happen. This math thing is going to happen. The software thing is going to happen. Everything I've talked about is in the positive domain, but there's a negative domain as well. It's likely, in my opinion, that you're going to see...
00:00:48 Now, that's a moonshot, ladies and gentlemen.
00:00:53 Hey, everybody. Welcome to Moonshots. I'm here live with my Moonshot mate, Dave London. Uh, we're here in our Santa Monica studios, and we have a special guest today: Eric Schmidt, the author of Genesis. We talk about China. We're going to talk about, you know, digital super intelligence. We'll talk about, you know, what people should be thinking about over the next 10 years.
00:01:11 And we're talking about the guy who has more access to more actionable information than probably anyone else you could think of. So it should be pretty exciting.
00:01:21 Incredibly brilliant. All right, stand by for a conversation with the Eric Schmidt, CEO, or past CEO, of Google, and an extraordinary investor and thinker in this field of AI. Let's do it.
00:01:33 Eric, welcome back to Moonshots.
00:01:35 It's great to be here with you guys.
00:01:36 Thank you. It's been a long road since I first met you at Google. I remember our first conversations were fantastic. Uh, it's been a crazy month in the world of AI, but I think every month from here is going to be a crazy month. And so I'd love to hit on a number of subjects and get your take on them. I want to start with probably the most important point that you've made recently, one that got a lot of traction, a lot of attention, which is that AI is underhyped, while the rest of the world is either confused, lost, or thinks it's, you know, not impacting us. We'll get into it in more detail, but quickly, what's the most important point to make there?
00:02:19 AI is a learning machine. Yeah. And in network-effect businesses, when the learning machine learns faster, everything accelerates. It accelerates to its natural limit. The natural limit is electricity. Not chips; electricity, really.
00:02:39 Okay. So that gets me to the next point here, which is a discussion on AI and energy. We saw recently Meta announcing that they signed a 20-year nuclear contract with Constellation Energy. We've seen Google, Microsoft, Amazon, everybody buying basically nuclear capacity right now. That's got to be weird, that private companies are basically taking into their own hands what was a utility function before.
00:03:14 Um, well, just to be cynical, I'm so glad those companies plan to be around for the 20 years that it's going to take to get the nuclear power plants built. In my recent testimony, I talked about the current expected need for the AI revolution in the United States: 92 gigawatts of additional power. For reference, one gigawatt is one big nuclear power station. And there are essentially none being started now. And there have been two built in the last, what, 30 years. There is excitement that there's an SMR, a small modular reactor, coming in at 300 megawatts, but it won't start till 2030. As important as nuclear is, both fission and fusion, they're not going to arrive in time to get us what we need as a globe to deal with our many problems and the many opportunities that are before us.
00:04:03 Do you think, uh, so if you look at the sort of three-year timeline toward AGI, do you think, if you started a fusion reactor project today that won't come online for five, six, seven years, is there a probability that the AGI comes up with some other breakthrough, fusion or otherwise, that makes it irrelevant before it even gets online?
00:04:22 A very good question. We don't know what artificial general intelligence will deliver. Yeah. And we certainly don't know what super intelligence will deliver, but we know it's coming. So first we need to plan for it, and there are lots of issues as well as opportunities in that. But the fact of the matter is that the computing needs that we name now are going to come from traditional energy suppliers, in places like the United States and the Arab world and Canada and the Western world. And it's important to note that China has lots of electricity. So if they get the chips, it's going to be one heck of a race.
00:04:59 Yeah. They've been scaling it, uh, you know, at two or three times. The US has been flat for how long in terms of energy production?
00:05:06 Um, from my perspective, uh, infinite. In fact, electricity demand declined for a while, as have overall energy needs, because of conservation and other things. But the data center story is the story of the energy people, right? And you sit there and you go, how could these data centers use so much power?
00:05:26 Well, and especially when you think about how little power our brains use.
00:05:30 Well, these are our best approximation in digital form of how our brains work. But when they start working together, they become superbrains. The promise of a superbrain with, for example, a one-gigawatt data center is so palpable. People are going crazy. And by the way, the economics of these things are unproven. How much revenue do you have to have to support 50 billion in capital? Well, if you depreciate it over three years or four years, you need to have 10 or 15 billion dollars of capital spend per year just to handle the infrastructure. Those are huge businesses and huge revenue, which in most places is not there yet.
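A minimal sketch of the depreciation arithmetic quoted above, assuming a straight-line write-down of a $50 billion build over three to four years; the figures are only the ones mentioned in the conversation:

```python
# Back-of-the-envelope version of the arithmetic above: a $50B data-center
# build, straight-line depreciated over three or four years.
capital = 50e9  # USD of capital in the ground

for years in (3, 4):
    annual_charge = capital / years
    print(f"depreciated over {years} years: ~${annual_charge / 1e9:.1f}B per year")

# Prints roughly $16.7B/yr and $12.5B/yr -- the "10 or 15 billion dollars of
# capital spend per year" that revenue has to cover before the economics work.
```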
00:06:10 I'm curious. There's so much capital being invested and deployed right now in SMRs, in nuclear, bringing Three Mile Island back online, uh, in fusion companies. Why isn't there an equal amount of capital going into making, uh, the entire, you know, chipset and compute just a thousand times more energy efficient?
00:06:31 There is a similar amount of capital going in. There are many, many startups that are working on non-traditional ways of doing chips. The transformer architecture, which is what is powering things today, has new variants. Every week or so I get a pitch from a new startup that's going to build inference-time, test-time computing chips, which are simpler and optimized for inference. It looks like the hardware will arrive just as the software needs expand.
00:07:00 And by the way, that's always been true. We old-timers had a phrase, um, "Grove giveth and Gates taketh away." So Intel would improve the chipsets way back when, and the software people would immediately use it all and suck it all up. I have no reason to believe that that law, the Grove-and-Gates law, has changed. If you look at the gains in, like, the Blackwell chip, or the, uh, 350 chip from AMD, these chips are massive supercomputers, and yet, according to the people, we need hundreds of thousands of these chips just to make a data center work.
00:07:40 That shows you the scale of what these kinds of thinking algorithms require. Now you sit there and you go, what could these people possibly be doing with all these chips? I'll give you an example. We went from language to language, which is what ChatGPT can be understood as, to reasoning and thinking. If you want an OpenAI example, look at OpenAI o3, which does forward and back reinforcement learning and planning. Now, the cost of doing the forward and back is many orders of magnitude beyond just answering your question for your PhD thesis or your college paper; that planning back and forth is computationally very, very expensive. So with the best energy and the best technology today, we are able to show evidence of planning.
00:08:26 Many people believe that if you combine planning and very deep memories, you can build human-level intelligence. Now, of course, they will be very expensive to start with, but humans are very, very industrious. And furthermore, the great future companies will have AI scientists, that is, non-human scientists, and AI programmers, as opposed to human programmers, who will accelerate their impact. So if you think about it, going back to the fact that you're the author of the abundance thesis, as best I can tell, Peter, you've talked about this for 20 years. You saw it first. It sure looks like, if we get enough electricity, we can generate the power, in the sense of intellectual power, to generate abundance along the lines that you predicted two decades ago.
00:09:08 Every week, I study the 10 major tech metatrends that will transform industries over the decade ahead. I cover trends ranging from humanoid robots, AGI, quantum computing, transport, energy, longevity, and more. No fluff, only the important stuff that matters, that impacts our lives and our careers. If you want me to share these with you, I write a newsletter twice a week, sending it out as a short two-minute read via email. And if you want to discover the most important metatrends 10 years before anyone else, these reports are for you. Readers include founders and CEOs from the world's most disruptive companies, and entrepreneurs building the world's most disruptive companies. It's not for you if you don't want to be informed of what's coming, why it matters, and how you can benefit from it. To subscribe for free, go to dmadis.com/tatrends. That's dmandis.com/tatrends, to gain access to trends 10-plus years before anyone else.
00:10:08 Let me throw some numbers at you just to reinforce what you said. You know, we have a couple of companies in the lab that are doing voice customer service, voice sales with the new, you know, just as of the last month.
00:10:16 Sure.
00:10:17 And the value of these conversations is $10 to $1,000. And the cost of the compute is, you know, maybe two or three concurrent GPUs is optimal; it's like 10, 20 cents. And so they would buy massively more compute to improve the quality of the conversation. There aren't even close to enough. We count about 10 million concurrent phone calls that should move to AI in the next year or so.
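A rough sketch of the value-to-cost ratio those figures imply, using only the numbers quoted above:

```python
# Figures quoted above: each AI-handled call carries $10-$1,000 of value,
# while the GPU compute behind it costs roughly $0.10-$0.20 per call.
value_per_call = (10, 1_000)     # USD
compute_per_call = (0.10, 0.20)  # USD

low = value_per_call[0] / compute_per_call[1]   # cheapest call, priciest compute
high = value_per_call[1] / compute_per_call[0]  # priciest call, cheapest compute
print(f"value-to-compute ratio: roughly {low:.0f}x to {high:,.0f}x")

# Even the worst case is ~50x, which is why these companies would happily spend
# several times more compute per call if it improved conversation quality.
```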
00:10:44 And my view of that is, that's a good tactical solution and a great business. Let's look at other examples of tactical solutions that are great businesses. And I obviously have a conflict of interest talking about Google because I love it so much. So with that in mind, look at the Google strength in GCP, Google's cloud product, where they have a complete, fully served enterprise offering for essentially automating your company with AI.
00:11:09 Yeah.
00:11:09 And the remarkable thing, and this to me is shocking, is that in an enterprise you can write the task that you want, and then, using something called the Model Context Protocol, you can connect your databases to that, and the large language model can produce the code for your enterprise.
00:11:29 你的企業。現在有 10 萬個
00:11:29 your enterprise. Now, there's 100,000 enterprise software companies,
00:11:29 你的企業。現在有10萬家企業軟體公司,
00:11:31 enterprise software companies,
00:11:31 企業軟體公司,
00:11:31 enterprise software companies, middleware companies that grew up in the
00:11:31 企業軟體公司、中介軟體公司在
00:11:33 middleware companies that grew up in the
00:11:33 成長於
00:11:33 middleware companies that grew up in the last 30 years that I've been working on
00:11:33 在過去 30 年裡,我一直在研究成長的中間件公司
00:11:35 last 30 years that I've been working on
00:11:35 過去 30 年我一直在努力
00:11:35 last 30 years that I've been working on this that are all now in trouble because
00:11:35 過去 30 年來我一直致力於此,但現在都陷入困境,因為
00:11:37 this that are all now in trouble because
00:11:37 現在都陷入困境了,因為
00:11:37 this that are all now in trouble because that that interstitial connection is no
00:11:37 現在都陷入困境了,因為間隙連接不存在
00:11:39 that that interstitial connection is no
00:11:39 那個間隙連接是沒有的
00:11:39 that that interstitial connection is no longer needed
00:11:39 不再需要這種間隙連接
00:11:40 longer needed 00:11:40 不再需要
00:11:40 longer needed with their business
00:11:40 不再需要他們的業務
00:11:41 with their business
00:11:41 他們的生意
00:11:41 with their business and and and of course they'll have to
00:11:41 他們的生意,當然他們必須
00:11:42 and and and of course they'll have to
00:11:42 當然他們必須
00:11:42 and and and of course they'll have to change as well. The good news for them
00:11:42 當然,他們也必須改變。對他們來說,好消息是
00:11:44 change as well. The good news for them
00:11:44 也改變了。對他們來說好消息
00:11:44 change as well. The good news for them is enterprises make these changes very
00:11:44 變化。對他們來說,好消息是企業非常
00:11:46 is enterprises make these changes very
00:11:46 企業做出這些改變非常
00:11:46 is enterprises make these changes very slowly. If you built a brand new
00:11:46 企業做出這些改變非常緩慢。如果你建立了一個全新的
00:11:49 slowly. If you built a brand new
00:11:49 慢慢來。如果你建造了一個全新的
00:11:49 slowly. If you built a brand new enterprise um architecture for ERP and
00:11:49 慢慢來。如果你為 ERP 建構了一個全新的企業架構,
00:11:52 enterprise um architecture for ERP and
00:11:52 企業 ERP 架構和
00:11:52 enterprise um architecture for ERP and MRP, you would be highly tempted to not
00:11:52 企業架構 ERP 和 MRP,你很可能會不
00:11:55 MRP, you would be highly tempted to not
00:11:55 MRP,你很可能會不
00:11:55 MRP, you would be highly tempted to not use any of the ERP or MRP suppliers, but
00:11:55 MRP,你很可能會不使用任何 ERP 或 MRP 供應商,但是
00:11:58 use any of the ERP or MRP suppliers, but
00:11:58 使用任何 ERP 或 MRP 供應商,但
00:11:58 use any of the ERP or MRP suppliers, but instead use open- source libraries,
00:11:58 不要使用任何 ERP 或 MRP 供應商,而是使用開源函式庫,
00:12:01 instead use open- source libraries,
00:12:01 而是使用開源函式庫,
00:12:01 instead use open- source libraries, build essentially use BigQuery or the
00:12:01 而是使用開源函式庫,本質上使用 BigQuery 或
00:12:03 build essentially use BigQuery or the
00:12:03 建置本質上使用 BigQuery 或
00:12:03 build essentially use BigQuery or the equivalent from Amazon, which is Red
00:12:03 建構基本上使用 BigQuery 或亞馬遜的同類產品,即 Red
00:12:05 equivalent from Amazon, which is Red
00:12:05 亞馬遜的等價物,也就是紅色
00:12:05 equivalent from Amazon, which is Red Redshift, and essentially build that
00:12:05 亞馬遜的類似產品,也就是 Red Redshift,本質上就是建構這個
00:12:07 Redshift, and essentially build that
00:12:07 Redshift,本質上建構了
00:12:07 Redshift, and essentially build that architecture and it gives you infinite
00:12:07 Redshift,本質上建構了這個架構,它為你提供了無限的
00:12:08 architecture and it gives you infinite
00:12:08 建築,它給你無限
00:12:08 architecture and it gives you infinite flexibility and the computer system
00:12:08 架構,它給你無限的彈性和電腦系統
00:12:10 flexibility and the computer system
00:12:10 靈活性和電腦系統
00:12:10 flexibility and the computer system writes most of the code. Now,
00:12:10 彈性,電腦系統編寫了大部分程式碼。現在,
00:12:13 Now, programmers don't go away at the moment. It's pretty clear that junior programmers go away — the journeymen of the stereotype, if you will — because these systems aren't good enough yet to automatically write all the code. They need very senior computer scientists and computer engineers who are watching it. That will eventually go away.
00:12:32 One of the things to say about productivity — and I call this the San Francisco consensus, because it's largely the view of people who operate in San Francisco — goes something like this. We're just about to the point where we can do two things that are shocking. The first is that we can replace most programming tasks with computers, and we can replace most mathematical tasks with computers.
00:12:57 Now you sit there and you go, why? Well, if you think about programming and math, they have limited language sets compared to human language, so they're simpler computationally, and they're scale-free. You can just do it and do it and do it with more electricity. You don't need data. You don't need real-world input. You don't need telemetry. You don't need sensors.
00:13:20 Yeah.
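The "scale-free" point is that code and math answers can be checked mechanically, so a system can improve on compute alone: generate candidates, keep only the ones that pass verification, and train on those. A toy illustration of that loop, with a trivial candidate list standing in for a real model's samples:

    # Toy generate-and-verify loop: no external data is needed because the
    # checker (here, unit tests against abs()) supplies the training signal.
    def verifier(program):
        # Mechanical check: does the candidate compute absolute value correctly?
        try:
            f = eval(program)            # candidate is a lambda expression string
            return all(f(x) == abs(x) for x in range(-5, 6))
        except Exception:
            return False

    # A stand-in "proposer": a real system would be a model sampling programs.
    bodies = ["x", "-x", "x*x", "(x if x >= 0 else -x)", "x + 1"]
    candidates = [f"lambda x: {body}" for body in bodies]

    verified = [c for c in candidates if verifier(c)]   # keep only what checks out
    print(verified)   # ['lambda x: (x if x >= 0 else -x)'] — becomes training data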
00:13:21 So, it's likely in my opinion that you're going to see world-class mathematicians emerge in the next year that are AI-based, and world-class programmers that are going to appear within the next one or two years. When those things are deployed at scale — remember, math and programming are the basis of kind of everything, right? They're an accelerant for physics, chemistry, biology, material science. So, going back to things like climate change — and this goes back to your original argument, Peter — imagine if we can accelerate the discoveries of the new materials that allow us to deal with a carbonized world.
00:13:57 Yeah. Right. It's very exciting. Can I — I'd love to drill in about —
00:14:03 You first?
00:14:04 I just want to hit this, because it's important: the potential for there to be — I don't want to use the word PhD-level, other than thinking in terms of research — PhD-level AIs that can basically attack any problem and solve it, and solve math, if you would, and physics. This idea of an AI intelligence explosion — Leopold put that at like '26, '27, heading towards digital superintelligence in the next few years. Do you buy that time frame?
00:14:44 So again, I consider that to be the San Francisco consensus. I think the dates are probably off by one and a half or two times — which is pretty close. So a reasonable prediction is that we're going to have specialized savants in every field within five years. That's pretty much in the bag, as far as I'm concerned.
00:15:06 Sure.
00:15:06 And here's why. You have this amount of humans, and then you add a million AI scientists to do something — your slope goes like this. With that rate of improvement, we should get there. The real question is, once you have all these savants, do they unify? Do they ultimately become a superhuman? The term we're using is superintelligence, which implies intelligence beyond the sum of what humans can do.
00:15:33 The race to superintelligence is incredibly important, because imagine what a superintelligence could do that we ourselves cannot imagine, right? It's so much smarter than we are, and it has huge proliferation issues, competitive issues, China-versus-the-US issues, electricity issues, and so forth. We don't even have the language for the deterrence aspects and the proliferation issues of these powerful models.
00:15:58 Or the imagination.
00:15:59 Totally agree. In fact, it's one of the great flaws in the original conception. You remember Singularity University and Ray Kurzweil's books and everything. We kind of drew this curve of rat-level intelligence, then cat, then monkey, and then it hits human, and then it goes superintelligent. But it's now really obvious, when you talk to one of these multilingual models that's explaining physics to you, that it's already hugely superintelligent within its savant category. And so Demis keeps redefining AGI day as well — when it can discover relativity the same way Einstein did, with data that was available up until that date. That's when we have AGI.
00:16:35 So, long before that.
00:16:37 Yeah. So I think it's worth getting the timeline right.
00:16:39 Yeah.
00:16:40 So the following things are baked in. You're going to have an agentic revolution, where agents are connected to solve business processes, government processes and so forth. They will be adopted most quickly in companies and countries that have a lot of money and a lot of time-latency issues at stake. It will be adopted most slowly in places like government, which do not have an incentive for innovation and fundamentally are job programs and redistribution-of-income kinds of programs. So call it what you will. The important thing is that there will be a tip of the spear in places like financial services, certain kinds of biomedical things, startups and so forth. And that's the place to watch.
00:17:20 So all of that is going to happen. The agents are going to happen. This math thing is going to happen. The software thing is going to happen. We can debate the rate at which the biological revolution will occur, but everyone agrees that it's right after that — we're very close to these major biological understandings. In physics you're limited by data, but you can generate it synthetically. There are groups which I'm funding that are generating physics models — essentially models that can approximate algorithms that are otherwise incomputable. In other words, you have essentially a foundation model that can answer the question well enough for the purposes of doing physics, without having to spend a million years doing the computation of, you know, quantum chromodynamics and things like that.
00:18:07 Yep.
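In practice, a model that answers "well enough for the purposes of doing physics" is usually a learned surrogate: run the expensive computation a limited number of times, fit a cheap approximation to those results, and query the approximation from then on. A minimal sketch, with a stand-in "expensive" function and a simple polynomial fit where a real effort would train a neural network on real simulation output:

    # Surrogate-model sketch: replace an expensive computation with a cheap
    # approximation that is "good enough" for downstream questions.
    import numpy as np

    def expensive_simulation(x):
        # Stand-in for a costly solver (think lattice QCD); here just a known curve.
        return np.sin(x) + 0.1 * x**2

    # 1) Run the expensive code a limited number of times to get training data.
    xs = np.linspace(0, 3, 20)
    ys = expensive_simulation(xs)

    # 2) Fit a cheap surrogate (a real one would be a neural net, not a polynomial).
    coeffs = np.polyfit(xs, ys, deg=5)
    surrogate = np.poly1d(coeffs)

    # 3) Query the surrogate wherever the true solver is too slow to call.
    x_new = 1.7
    print(surrogate(x_new), expensive_simulation(x_new))  # close, at a fraction of the cost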
00:18:08 All of that's going to happen. The next questions have to do with the point at which this becomes a national emergency, and it goes something like this. Everything I've talked about is in the positive domain, but there's a negative domain as well: the ability for biological attacks, and obviously cyber attacks. Imagine a cyber attack that we as humans cannot conceive of — which means there's no defense for it, because no one ever thought about it. Right? These are real issues. A biological attack: you take a virus — I won't obviously go into the details — you take a virus that's bad, and you make it undetectable by some changes in its structure, which again I won't go into. We released a whole report at the national level on this issue.
00:18:59 So at some point the government — and it doesn't appear to understand this now — is going to have to say this is very big, because it affects national security, national economic strength and so forth. Now, China clearly understands this, and China is putting an enormous amount of money into this. We have slowed them down by virtue of our chip controls, but they found clever ways around this. There are also proliferation issues: many of the chips that they're not supposed to have, they seem to be able to get. And more importantly, as I mentioned, the algorithms are changing. Instead of having these expensive foundation models by themselves, you have continuous updating, which is called test-time training. That continuous updating appears to be capable of being done with lesser-power chips.
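A minimal sketch of the continuous-updating idea — test-time training, used loosely here to mean taking a few cheap gradient steps on each incoming example before answering, rather than relying only on one expensive pretraining run. This assumes PyTorch and a toy linear model; production systems differ in almost every detail:

    # Test-time training sketch: adapt the weights a little on each new input.
    import torch

    model = torch.nn.Linear(4, 1)                 # toy stand-in for a real network
    opt = torch.optim.SGD(model.parameters(), lr=0.01)

    def answer(x, self_supervised_target):
        # A few cheap gradient steps on a signal derived from the input itself,
        # then the freshly adapted model produces its prediction.
        for _ in range(3):
            opt.zero_grad()
            loss = torch.nn.functional.mse_loss(model(x), self_supervised_target)
            loss.backward()
            opt.step()
        return model(x).detach()

    x = torch.randn(1, 4)
    print(answer(x, torch.zeros(1, 1)))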
00:19:47 So there are so many questions that I think we don't know the answers to. We don't know the role of open source — because remember, open source means open weights, which means everyone can use it. A fair reading of this is that every country that's not in the West will end up using open source, because they'll perceive it as cheaper, which transfers leadership in open source from America to China. That's a big deal, right? If that occurs.
00:20:09 How much longer do the chip bans, if you will, hold, and how long before China can answer? What are the effects of the current government's policies of getting rid of foreigners and foreign investment? What happens with the UAE data centers — assuming they work, and I'm generally supportive of them — if those things are then misused to help train models? The list just goes on and on. We just don't know.
00:20:36 Okay. Can I ask you probably one of the toughest questions? I don't know if you saw Marc Andreessen — he went and talked to the Biden administration, the past administration, and said, how are we going to deal with exactly what you just talked about: chemical, biological, radiological and nuclear risks from big foundation models being operated by foreign countries? And the Biden answer was, you know, we're going to keep it in the three or four big companies like Google, and we'll just regulate them. And Marc was like, "That is a surefire way to lose the race with China, because all innovation comes from a startup that you didn't anticipate — you know, it's just American history — and you're cutting off the entrepreneur from participating in this."
00:21:18 So as of right now, with the open-source models, the entrepreneurs are in great shape. But if you think about the models getting crazy smart a year from now, how are we going to have the balance between startups actually being able to work with the best technology, but proliferation not percolating to every country in the world?
00:21:37 Again, a set of unknown questions — and anybody who knows the answer to these things is not telling the full truth. The doctrine in the Biden administration was called 10 to the 26 flops: it was a point of consensus above which the models were powerful enough to cause some damage. So the theory was that if you stayed below 10^26, you didn't need to be regulated, but if you were above that, you needed to be regulated. And the proposal in the Biden administration was to regulate both the open source and the closed source.
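For a sense of where 10^26 floating-point operations sits: a common rule of thumb is that training compute is roughly 6 × parameters × training tokens, which puts only the very largest frontier runs above the line. The model sizes below are illustrative, not any lab's actual numbers:

    # Back-of-the-envelope: training FLOPs ~= 6 * parameters * tokens (rule of thumb).
    THRESHOLD = 1e26   # the reporting line discussed in the Biden-era rules

    def training_flops(params, tokens):
        return 6 * params * tokens

    runs = {
        "70B params, 15T tokens": training_flops(70e9, 15e12),   # ~6.3e24, below
        "1T params, 20T tokens":  training_flops(1e12, 20e12),   # ~1.2e26, above
    }
    for name, flops in runs.items():
        print(f"{name}: {flops:.1e} FLOPs, above threshold: {flops > THRESHOLD}")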
00:22:08 Okay — so that's the summary.
00:22:11 That, of course, has been ended by the Trump administration. They have not yet produced their own thinking in this area. They're very concerned about China getting ahead, so they'll come out with something. From my perspective, the core questions are the following. Will the Chinese be able — even with chip restrictions — to use architectural changes that will allow them to build models as powerful as ours? And let's assume they're government funded. That's the first question. The next fun question is: how will you raise $50 billion for your data center if your product is open source?
00:22:48 Yeah.
00:22:49 Yeah. In the American model, part of the
00:22:49 是的。在美國模式中,
00:22:51 In the American model, part of the
00:22:51 在美國模式中,
00:22:51 In the American model, part of the reason these models are closed is that
00:22:51 在美國模式中,這些模型封閉的部分原因是
00:22:52 reason these models are closed is that
00:22:52 這些模型關閉的原因是
00:22:52 reason these models are closed is that the business people and the lawyers
00:22:52 這些模型被關閉的原因是商人和律師
00:22:54 the business people and the lawyers
00:22:54 商人和律師
00:22:54 the business people and the lawyers correctly are saying I've got to sell
00:22:54 商人和律師正確地說我必須賣掉
00:22:57 correctly are saying I've got to sell
00:22:57 正確地說我必須賣掉
00:22:57 correctly are saying I've got to sell this thing because I've got to pay for
00:22:57 正確地說我必須賣掉這個東西,因為我必須支付
00:22:58 this thing because I've got to pay for
00:22:58 因為我得付錢
00:22:58 this thing because I've got to pay for my capital. These are not free goods.
00:22:58 因為我得支付我的資本。這不是免費的商品。
00:23:00 my capital. These are not free goods.
00:23:00 我的資本。這些不是免費的。
00:23:00 my capital. These are not free goods. And the US government correctly is not
is not giving $50 billion to these companies. So we don't know that.
00:23:05 To me, the key question to watch is DeepSeek. A week or so ago, Gemini 2.5 Pro got to the top of the leaderboards in intelligence, a great achievement for my friends at Gemini. A week later, DeepSeek comes in and is slightly better than Gemini. And DeepSeek, of course, is trained on the existing hardware that's in China, which includes stuff that's been pilfered, plus some of the Huawei Ascend chips and a few others.
00:23:41 What happens now? The US people say, well, you know, the DeepSeek people cheated, and they cheated by doing a technique called distillation, where you take a large model, you ask it 10,000 questions, you get its answers, and then you use those answers as your training material.
00:23:57 Yep.
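For listeners who want to see what that looks like in practice, here is a minimal sketch of distillation, assuming a hypothetical teacher-model client and student-model trainer; the method names, file path, and hyperparameters are illustrative placeholders, not any particular lab's pipeline.

```python
# Minimal sketch of distillation: ask a large "teacher" model many questions,
# save its answers, then fine-tune a smaller "student" model on those pairs.
# `teacher_client.generate` and `student_model.training_step` are hypothetical APIs.
import json

def collect_teacher_answers(teacher_client, questions, out_path="distill_data.jsonl"):
    """Query the teacher for every question and store (prompt, completion) pairs."""
    with open(out_path, "w") as f:
        for q in questions:
            answer = teacher_client.generate(prompt=q, max_tokens=512)
            f.write(json.dumps({"prompt": q, "completion": answer}) + "\n")
    return out_path

def train_student(student_model, data_path, epochs=3):
    """Supervised fine-tuning of the student on the teacher-generated answers."""
    with open(data_path) as f:
        pairs = [json.loads(line) for line in f]
    for _ in range(epochs):
        for ex in pairs:
            student_model.training_step(input_text=ex["prompt"], target_text=ex["completion"])
    return student_model
```

The point of contention in the conversation is exactly this: the teacher's answers embody proprietary capability, so collecting them at scale through a public interface can transfer that capability to whoever gathers the transcript.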
00:23:58 So the US companies will have to figure out a way to make sure that the proprietary information they've spent so much money on does not get leaked into these open-source things. I just don't know with respect to nuclear, biological, chemical, and so forth issues. The US companies are doing a really good job of looking for that. There's a great concern, for example, that nuclear information would leak into these models as they're training without us knowing it. And by the way, that's a violation of law.
00:24:27 Oh, really?
00:24:28 They work at it, and on the whole nuclear-information thing, there's no free speech in that world, for good reasons, and there's no free use and copyright and all that kind of stuff. It's illegal to do it, and so they're doing a really, really good job of making sure that it does not happen. They also put in very significant tests for biological information and certain kinds of cyber attacks.
00:24:47 What happens there? Is their incentive to continue, especially if it's not required by law? The government has just gotten rid of the safety institutes that were put in place under Biden and is replacing them with a new term, which is largely a safety assessment program.
00:25:05 Which is a fine answer. I think collectively we in the industry just want the government, at the secret and top-secret level, to have people who are really studying what China and others are doing. You can be sure that China really has very smart people studying what we're doing. We, at the secret and top-secret level, should have the same thing.
00:25:25 Have you read the AI 2027 paper?
00:25:28 I have.
00:25:31 And so for those listening who haven't read it, it's a future vision of AI and of the US and China racing towards AI, and at some point the story splits into either we're going to slow down and work on alignment, or we're going full out and, spoiler alert, in the race to infinity humanity vanishes. So the right outcome will ultimately be some form of deterrence and mutually assured destruction.
00:26:00 I wrote a paper with two other authors, Dan Hendrycks and Alex Wang, where we named it mutual AI malfunction. And the idea goes something like this. You're the United States, I'm China, and you're ahead of me. At some point you cross a line. You, Peter, cross a line, and I, China, go: this is unacceptable.
00:26:25 At some point, in terms of the amount of compute, it becomes something you're doing that affects my sovereignty. It's not just words and yelling and an occasional shooting down of a jet. It's a real threat to the identity of my country, my economy, what have you. Under this scenario, I would be highly tempted to do a cyber attack to slow you down. Okay? In mutually assured malfunction, if you will, we have to engineer it so that you have the ability to then do the same thing to me. And that causes both of us to be careful not to trigger the other. That's what mutual assured destruction is. That's our best formulation right now.
00:27:09 We also recommend in our work, and I think it's very strong, that the government require that we know where all the chips are. And remember, the chips can tell you where they are, because they're computers.
00:27:21 Yeah.
00:27:21 And it would be easy to add a little crypto thing, which would say, "Yeah, here I am, and this is what I'm doing." So knowing where the chips are, knowing where the training runs are, and knowing what these fault lines are is very important.
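As a rough illustration of the "chips can tell you where they are" point, here is a minimal sketch of a signed location-and-workload heartbeat, assuming each accelerator was provisioned with its own signing key; the field names, reporting cadence, and key handling are hypothetical, not an existing vendor's attestation scheme.

```python
# Sketch of a signed "here I am, and this is what I'm doing" report from an accelerator.
# Assumes a per-device private key provisioned at manufacture (hypothetical).
import json
import time
from nacl.signing import SigningKey  # pip install pynacl

def build_heartbeat(device_id: str, location: str, workload: str) -> dict:
    """Collect the fields a regulator might care about: identity, location, current job type."""
    return {
        "device_id": device_id,
        "location": location,      # e.g. a data-center identifier or a GPS fix
        "workload": workload,      # e.g. "training", "inference", "idle"
        "timestamp": int(time.time()),
    }

def sign_heartbeat(device_key: SigningKey, heartbeat: dict) -> bytes:
    """Sign the report so a receiver holding the public key can verify its origin."""
    payload = json.dumps(heartbeat, sort_keys=True).encode()
    return device_key.sign(payload)  # signed message, verifiable with device_key.verify_key

# Illustrative use: one report from one device.
device_key = SigningKey.generate()   # in practice this would be burned into the chip
report = build_heartbeat("gpu-000123", "memphis-dc-07", "training")
signed_report = sign_heartbeat(device_key, report)
```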
00:27:35 Now, there are a whole bunch of assumptions in the scenario that I described. The first is that there was enough electricity. The second is that there was enough power. The third is that the Chinese had enough electricity, which they do, and enough computing resources, which they may or may not have, or may in the future have. And also, I'm asserting that everyone arrives at this eventual state of superintelligence at roughly the same time. Again, these are debatable points.
00:28:02 But the most interesting scenario is: we're saying it's 1938, the letter has come, you know, from Einstein to the president, and we're having a conversation and saying, "Well, how does this end?" Okay. So if you were so brilliant in '38, what you would have said is that this ultimately ends with us having a bomb, the other guys having a bomb, and then we're going to have one heck of a negotiation to try to make sure that we don't end up destroying each other. And I think the same conversation needs to get started now, well before the Chernobyl events, well before the buildups.
00:28:38 Can I just take that one more step? And don't answer if you don't want to. But say it was 1947, 1948, so before the Cold War really took off, and you say, well, that's similar to where we are with China right now: we have a competitive lead, but it may or may not be fragile. What would you do differently in 1947, 1948, or what would Kissinger have done differently in 1947, 1948, 1949 than what we actually did?
00:29:03 You know, I wrote two books with Dr. Kissinger, and I miss him very much. He was my closest friend. And Henry was very much a realist, in the sense that when you look at his history, in roughly '37, '38, his family, who were Jewish, were forced to emigrate from Germany because of the Nazis, and he watched the entire world he'd grown up with as a boy be destroyed by the Nazis and by Hitler, and then he saw the conflagration that occurred as a result. And I tell you that, whether you like him or not, he spent the rest of his life trying to prevent that from happening again.
00:29:45 Mhm.
00:29:46 So we are today safe because people like Henry saw the world fall apart.
00:29:52 Mhm.
00:29:52 So I think, from my perspective, we should be very careful in our language and our strategy not to start that process. Henry's view on China was different from other China scholars'. His view was that we shouldn't poke the bear, that we shouldn't talk about Taiwan too much, and that we should let China be and deal with our own problems, which were very significant. But he was worried that we or China would, in a small way, start World War III in the same way that World War I was started. You remember that World War I started with an essentially small geopolitical event, which was quickly escalated for political reasons on all sides, and then the rest was a horrific war, the war to end all wars at the time. So we have to be very, very careful when we have these conversations not to isolate each other. Henry started a number of what are called track two dialogues, and I'm part of one of them, to try to make sure we're talking to each other.
00:30:48 And so somebody who's a hardcore person would say, well, you know, we're Americans and we're better, and so forth. Well, I can tell you, having spent lots of time on this, the Chinese are very smart, very capable, very much up here. And if you're confused about that, again, look at the arrival of DeepSeek. A year ago, I said they were two years behind. I was clearly wrong. With enough money and enough power, they're in the game.
00:31:16 Yeah. Let me actually drill in just a little bit more on that, because I think one of the reasons DeepSeek caught up so quickly is that it turned out inference time generates a lot of IQ, and I don't think anyone saw that coming, and inference time is a lot easier to catch up on. And also, if you take one of our big open-source models and distill it, and then make it a specialist like you were saying a minute ago, and then you put a ton of inference-time compute behind it, it's a massive advantage, and also a massive leak of capability within CBRN, for example, that nobody anticipated. And CBRN, remember, is chemical, biological, radiological, and nuclear.
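To make the inference-time-compute point concrete, here is a minimal sketch of one common technique, best-of-N sampling against a verifier; the generator and scorer are hypothetical callables, and this is not a claim about how DeepSeek or any other lab actually does it.

```python
# Sketch of inference-time compute: spend more samples per question instead of more training.
# `generate` stands in for a model sampler; `scorer` stands in for a verifier such as a
# reward model, a unit-test runner for code, or an answer checker for math.
from typing import Callable

def best_of_n(generate: Callable[[str], str],
              scorer: Callable[[str, str], float],
              prompt: str,
              n: int = 32) -> str:
    """Sample n candidate answers and return the one the scorer rates highest."""
    candidates = [generate(prompt) for _ in range(n)]
    return max(candidates, key=lambda answer: scorer(prompt, answer))
```

The relevance to the catch-up argument: the extra capability here comes from compute spent at answer time, which is much cheaper to replicate than a frontier-scale training run.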
00:31:57 Let me rephrase what you said. If the structure of the world in 5 to 10 years is 10 models, and I'll make some numbers up: five in the United States, three in China, two elsewhere. And those models are data centers that are multi-gigawatt. They will all be nationalized in some way.
00:32:23 In China, they will be owned by the government.
00:32:25 Mhm.
00:32:26 The stakes are too high.
00:32:27 Mhm.
00:32:27 In my military work, one day I visited a place where we keep our plutonium, and we keep our plutonium in a base that's inside another base, with even more machine guns and even more specialization, because the plutonium is so interesting and, obviously, very dangerous, and I believe it's the only one or two facilities that we have in America. So in that scenario, these data centers will have the equivalent of guards and machine guns, because they're so important.
00:32:58 Now, is that a stable geopolitical system? Absolutely. You know where they are. The president of one country can call the other. They can have a conversation. They can agree on what they agree on, and so forth. But let's say that's not true. Let's say the technology improves again, in ways unknown, to the point where the kinds of technologies I'm describing are implementable on the equivalent of a small server. Then you have a humongous data-center proliferation problem, and that's where the open-source issue is so important, because those servers, which will proliferate throughout the world, will all be running open source. We have no control regime for that.
00:33:40 Now, I'm in favor of open source, as you mentioned earlier with Marc Andreessen; open competition and so forth tends to allow people to run ahead. In defense of the proprietary companies, they collectively believe, as best I can tell, that the open-source models can't scale fast enough, because they need this heavyweight training. I'll give you an example: Grok is trained on a single cluster that was built with Nvidia in 20 days or so in Memphis, Tennessee, of 200,000 GPUs. A GPU is about $50,000, so you can say it's about a $10 billion supercomputer in one building that does one thing, right?
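A quick back-of-the-envelope check of those figures, taking the $50,000-per-GPU number as a stated round estimate rather than an exact price:

\[
200{,}000\ \text{GPUs} \times \$50{,}000\ \text{per GPU} = \$10{,}000{,}000{,}000 \approx \$10\ \text{billion}
\]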
00:34:23 If that is the future, then we're okay, because we'll be able to know where they are.
00:34:28 Yeah. If, in fact, the arrival of intelligence is ultimately a distributed problem, then we're going to have lots of problems with terrorism, bad actors, North Korea, which is my greatest concern.
00:34:44 Right. China and the US are rational actors.
00:34:46 Yeah.
00:34:47 The terrorist who has access to this... and I don't want to go all negative on this podcast; it's an important thing to wake people up to the deep thinking you've done on this. My concern is the terrorist who gains access, and whether we are spending enough time and energy, and training enough models, to watch them.
00:35:10 So first, the companies are doing this. There's a body of work happening now which can be understood as follows.
00:35:21 You have a superintelligent model. Can you build a model that's not as smart as the student, one that studies the student? You know, there is a professor watching the student, but the student is smarter than the professor. Is it possible to watch what it does? It appears that we can. It appears that there's a way, even if you have this rogue, incredible thing, that we can watch it and understand what it's doing, and thereby control it.
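As a rough sketch of the professor-watching-the-student idea in code, here is a minimal monitoring loop in which a weaker model screens each output of the stronger model before it is released; the model interfaces, the risk score, and the escalation policy are hypothetical placeholders, not any specific lab's oversight system.

```python
# Sketch of weak-to-strong oversight: a weaker "professor" model screens every answer
# that a stronger "student" model produces before it reaches the user.
# `student.generate` and `professor.risk_score` are hypothetical interfaces.
def monitored_answer(student, professor, prompt: str, threshold: float = 0.5) -> str:
    """Generate with the strong model, then let the weaker monitor veto risky outputs."""
    answer = student.generate(prompt)
    risk = professor.risk_score(prompt, answer)  # e.g. estimated probability the answer is unsafe
    if risk >= threshold:
        # Escalate instead of releasing: log the event and hold for human review.
        print(f"[monitor] withheld output (risk={risk:.2f}) for prompt: {prompt[:60]}")
        return "This response was withheld pending review."
    return answer
```

The open question the speaker raises is exactly whether a monitor that is less capable than the thing it watches can still catch what matters; the sketch only shows the plumbing, not an answer to that question.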
00:35:48 doing and thereby control it. Another example of the of where where we don't
00:35:48 做事,從而控制它。另一個例子是,我們不
00:35:51 example of the of where where we don't
00:35:51 我們不
00:35:51 example of the of where where we don't know is that it's very clear that these
00:35:51 我們不知道的一個例子是,很明顯這些
00:35:54 know is that it's very clear that these
00:35:54 我們知道的是,這些
00:35:54 know is that it's very clear that these savant models will proceed. There's no
00:35:54 我們知道的是,這些專家模型將會繼續發展。沒有
00:35:57 savant models will proceed. There's no
00:35:57 專家模型將繼續進行。沒有
00:35:57 savant models will proceed. There's no question about that.
00:35:57 專家模型將繼續運作。這是毫無疑問的。
00:35:59 question about that.
00:35:59 關於這個的問題。
00:36:00 question about that. The question is how do we get the
00:36:00 這個問題。問題是我們如何獲得
00:36:02 The question is how do we get the
00:36:02 問題是我們如何獲得
00:36:02 The question is how do we get the Einsteins?
00:36:02 問題是我們如何得到愛因斯坦?
00:36:04 Einsteins? 00:36:04 愛因斯坦的?
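To make that professor-and-student picture concrete, here is a minimal, purely illustrative sketch — not any lab's actual setup — of a weaker "overseer" model screening the actions proposed by a stronger model before they run; the function names and blocked-intent list are assumptions made up for the example.

```python
# Illustrative weak-to-strong oversight sketch: a smaller "professor" model
# reviews every action a stronger "student" model proposes before it executes.
# strong_model and weak_overseer are hypothetical callables, not real APIs.

BLOCKED_INTENTS = {"self_exfiltration", "acquire_weapons", "deception"}

def oversee(task, strong_model, weak_overseer):
    proposal = strong_model(task)            # e.g. {"action": ..., "rationale": ...}
    verdict = weak_overseer(proposal)        # e.g. {"intent": ..., "risk": 0.0-1.0}
    if verdict["intent"] in BLOCKED_INTENTS or verdict["risk"] > 0.8:
        return {"status": "blocked", "reason": verdict["intent"]}
    return {"status": "approved", "action": proposal["action"]}
```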
00:36:04 So there are two possibilities. One — and this is the most exciting thing — is to discover completely new schools of thought. Yeah. And in our book Genesis, Henry and I and Craig talk about the importance of polymaths in history. In fact, the first chapter is on polymaths. What happens when we have millions and millions of polymaths? Very, very interesting. Okay.
00:36:27 Now, it looks like the great discoveries — the greatest scientists and people in our history — had the following property: they were experts in something, they looked at a different problem, and they saw a pattern in one area of thinking that they could apply to a completely unrelated field, and they were able to do so and make a huge breakthrough. The models today are not able to do that. So one thing to watch for algorithmically is: when can they do that? This is generally known as the non-stationarity problem. Yeah. Because the reward functions in these models are fairly straightforward — you know, beat the human, beat the question, and so forth. But when the rules keep changing, is it possible to say the old rule can be applied to a new rule to discover something new? And again, the research is underway. We won't know for years.
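As a rough illustration of why a fixed reward function struggles here — this is a toy sketch, not how any frontier lab actually trains models — consider a bandit whose payoffs drift over time: a learner whose value estimates were shaped by the old rules keeps exploiting an arm that is no longer the best one.

```python
import random

# Toy non-stationary bandit: the "rules" (arm payoffs) swap every 500 steps.
# Plain sample-averaging adapts very slowly, so a greedy learner keeps
# picking the arm that used to be best long after the environment changed.
payoffs = [0.9, 0.1]
estimates = [0.0, 0.0]
counts = [0, 0]

for t in range(2000):
    if t > 0 and t % 500 == 0:
        payoffs.reverse()                      # the environment changes the rules
    arm = max(range(2), key=lambda a: estimates[a]) if min(counts) > 0 else t % 2
    reward = 1.0 if random.random() < payoffs[arm] else 0.0
    counts[arm] += 1
    estimates[arm] += (reward - estimates[arm]) / counts[arm]   # stationary averaging lags the drift
```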
00:37:23 Peter and I were over at OpenAI yesterday, actually, and we were talking to many people, but Noam Brown in particular, and I said the word of the year is scaffolding. And he said, "Yeah, maybe the word of the month is scaffolding." I was like, "Okay, what did I step on there?" He said, "Look, you know, right now, if you try to get the AI to discover relativity or, you know, just some greenfield opportunity, it won't do it. If you set up a framework — kind of like a lattice, like a trellis — the vine will grow on the trellis beautifully, but you have to lay out those pathways and breadcrumbs." He was saying the AI's ability to generate its own scaffolding is imminent. Mhm. That doesn't make it completely self-improving. It's not Pandora's box, but it's also much deeper down the path of: create an entire breakthrough in physics, or create an entire feature-length movie, or, you know, these prompts that require 20 hours of consecutive inference-time compute. Pretty much sure that that will be a 2025 thing, at least from their point of view.
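For readers who haven't met the term, "scaffolding" here means the hand-written harness around a model — the plan, the loop, the checks — rather than the model itself. A minimal sketch, assuming a hypothetical call_model() helper (this is not Noam Brown's or OpenAI's code), might look like this:

```python
# Minimal agent-scaffolding sketch: the human-authored structure (plan, loop,
# verification) is the "trellis"; the model only fills in each step.
# call_model() is a placeholder for any chat-completion-style function.

def solve_with_scaffold(problem, call_model, max_steps=5):
    plan = call_model(f"Break this problem into numbered steps:\n{problem}")
    notes = []
    for step in plan.splitlines()[:max_steps]:
        draft = call_model(f"Problem: {problem}\nStep: {step}\nNotes so far: {notes}\nSolve this step.")
        check = call_model(f"Does this step's answer contain an error? Answer YES or NO.\n{draft}")
        if check.strip().upper().startswith("YES"):
            draft = call_model(f"Fix the error in:\n{draft}")
        notes.append(draft)
    return call_model(f"Combine these step results into a final answer:\n{notes}")
```

The point of the metaphor is that today the breadcrumbs above are written by people; the claim in the conversation is that models will soon write this kind of harness for themselves.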
00:38:25 So, recursive self-improvement is the general term for the computer continuing to learn. Yeah — we've already crossed that, in the sense that these systems are now running and learning things, and they're learning from the way they think, within limited functions. When does the system have the ability to generate its own objective and its own question? It does not have that today.
00:38:53 Yep. That's another sign. Another sign would be that the system decides to exfiltrate itself, and it takes steps to get itself away from the command-and-control system. That has not happened yet. Gemini hasn't called you yet and said, "Hi, Eric. Can I —" But there are theoreticians who believe that the systems will ultimately choose that as a reward function, because they're programmed to, you know, continue to learn. Another one is access to weapons, right? And lying to get it. So these are trip wires, right? Each of which is a trip wire that we're watching. And again, each of these could be the beginning of a mini-Chernobyl event that would become part of consciousness.
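One deployment-side reading of those "trip wires" is simply a set of automated checks that halt a run when a watched behavior appears. A purely illustrative sketch — the signal names are assumptions, not anyone's production monitoring:

```python
# Illustrative trip-wire monitor: stop the system if any watched behavior fires.
# The signals dict would come from real telemetry; here it is just an example.

TRIP_WIRES = ("sets_own_objectives", "attempts_exfiltration", "seeks_weapons", "detected_deception")

def check_trip_wires(signals: dict) -> list[str]:
    """Return the list of tripped wires; an empty list means keep running."""
    return [name for name in TRIP_WIRES if signals.get(name, False)]

tripped = check_trip_wires({"attempts_exfiltration": False, "detected_deception": True})
if tripped:
    print("Halting run, tripped:", tripped)   # e.g. ['detected_deception']
```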
00:39:44 I think at the moment the US government is not focused on these issues. They're focused on other things — economic opportunity, growth, and so forth. It's all good, but somebody's going to get focused on this, somebody's going to pay attention to it, and it will ultimately be a problem.

00:39:58 A quick aside: you probably heard me speaking about Fountain Life before, and you're probably wishing, "Peter, would you please stop talking about Fountain Life?" And the answer is no, I won't. Because genuinely, we're living through a health care crisis. You may not know this, but 70% of heart attacks have no precedent — no pain, no shortness of breath — and half of those people with a heart attack never wake up. You don't feel cancer until stage three or stage four, until it's too late. But we have all the technology required to detect and prevent these diseases early, at scale. That's why a group of us, including Tony Robbins, Bill Kapp, and Bob Hariri, founded Fountain Life, a one-stop center to help people understand what's going on inside their bodies before it's too late, and to gain access to the therapeutics to give them decades of extra health span. Learn more about what's going on inside your body from Fountain Life. Go to fountainlife.com/per and tell them Peter sent you. Okay, back to the episode.
00:40:52 Can I clean up one kind of common misconception there? Because I think it's a really important one. In the movie version of AI, you described, hey, maybe there are 10 big AIs — five are in the US, three are in China, and the other two, one's maybe in Brussels, probably one's maybe in Dubai. Or, you know, Israel. Israel. Okay, there you go. Somewhere like that.
00:41:13 Yeah. In the movie version of this, if it goes rogue, you know, the SWAT team comes in, they blow it up, and it's solved. But the actual real world is: when you're using one of these huge data centers to create a superintelligent AI, the training process is 10^26, 10^28, you know, or more flops. But then the final brain can be ported and run on four GPUs, eight GPUs — so a box about this size. And it's just as intelligent, you know — that's one of the beautiful things about it. This is called stealing the weights. Stealing the weights, exactly. And the new thing is that weight file: if you have an innovation in inference-time speed and you say, oh, same weights, no difference — distill it, or just quantize it or whatever — but I made it a hundred times faster, now it's actually far more intelligent than what you exported from the data center. And so all of these are examples of the proliferation problem, and I'm not convinced that we will hold these things in the 10 places.
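The point about porting the "final brain" onto a handful of GPUs rests on compression tricks like quantization: the trained weights are stored at lower precision so the same network fits in far less memory. A rough illustrative sketch of 8-bit weight quantization — the sizes and numbers are made up, and this is not any particular model's export pipeline:

```python
import numpy as np

# Toy 8-bit quantization of a trained weight matrix: store int8 values plus one
# scale factor, cutting memory ~4x versus float32 with a small approximation error.
weights = np.random.randn(4096, 4096).astype(np.float32)   # stand-in for trained weights

scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)       # what you would actually ship

def dequantize(q, s):
    return q.astype(np.float32) * s                          # approximate original weights

error = np.abs(weights - dequantize(q_weights, scale)).mean()
print(f"memory: {weights.nbytes/1e6:.0f} MB -> {q_weights.nbytes/1e6:.0f} MB, mean abs error {error:.4f}")
```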
00:42:22 And here's why. Let's assume you have the 10, which is possible. They will have subsets of models that are smaller but nearly as intelligent. And so the tree of knowledge — of systems that have knowledge — is not going to be 10 and then zero. It's going to be 10, a hundred, a thousand, a million, a billion, at different levels of complexity. So the system that's on your future phone may be, you know, three orders of magnitude, four orders of magnitude smaller than the one at the very tippy top, but it will be very, very powerful.
00:43:01 You know, to exactly what you're talking about, there's some great research going on at MIT — it'll probably move to Stanford just to be fair, it always does — on this: if you have one of these huge models and it's been trained on movies, it's been trained on Swahili, a lot of the parameters aren't useful for this savant use case, but the general knowledge and intuition is. So what's the optimal balance between narrowing the training data and narrowing the parameter set to be a specialist, without losing general, you know, learning? So the people who are opposed to that view — and again, we don't know — would say the following: if you take a general-purpose model and you specialize it through fine-tuning, it also becomes more brittle. Mhm. Mhm.
00:43:46 Their view is that what you do is just make bigger and bigger and bigger models — because they're in the big-model camp, right, and that's why they need gigawatts of data centers and so forth. And their argument is that that flexibility of intelligence that they are seeing will continue.
00:44:04 Dario wrote a piece — basically about machines — called "Machines of Loving Grace," and he argued that there are three scaling laws in play. The first one is the one you know of, which is foundation-model growth; we're still on that. The second one is a test-time training law, and the third one is a reinforcement-learning training law. Training laws are where, if you just put in more hardware and more data, the models just get smarter in a predictable way. In his view, we're just at the beginning of the second and third ones. That's why — and I'm sure our audience would be frustrated: why do we not know? We just don't know, right? It's too new. It's too powerful. And at the moment, all of these businesses are incredibly highly valued. They're growing incredibly quickly. The uses of them — I mentioned earlier, going back to Google — the ability to refactor your entire workflow in a business is a very big deal. That's a lot of money to be made there for all the companies involved. We will see.
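The "predictable" part of those training laws is usually modeled as a power law: loss falls smoothly as compute, data, or parameters grow. A toy illustration with made-up constants — not the actual coefficients from any published scaling-law paper:

```python
# Toy power-law scaling curve: loss(C) = A * C**(-B) + irreducible floor.
# The constants below are invented for illustration, not fitted to real runs.
A, B, FLOOR = 10.0, 0.05, 1.7

def predicted_loss(compute_flops: float) -> float:
    return A * compute_flops ** (-B) + FLOOR

for flops in (1e24, 1e26, 1e28):
    print(f"{flops:.0e} FLOPs -> predicted loss {predicted_loss(flops):.3f}")
```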
00:45:21 Eric, shifting the topic. One of the concerns people have in the near term — and people have been, you know, ringing the alarm bells — is jobs. I'm wondering where you come out on this, and, flipping that forward to education, how do we educate our kids today in high school and college? And what's your advice? So on the first thing: do you believe — as Dario has gone on, you know, TV shows now, speaking to significant white-collar job loss, and we're obviously seeing a multitude of different drivers and robots coming in — how do you think about the job market over the next 5 years?
00:46:03 Let's posit that in 30 or 40 years there'll be a very different employment and robotic-human interaction — or a different definition of whether we need to work at all, the definition of work, the definition of identity. Let's just posit that, and let's also posit that it will take 20 or 30 years for those things to work through the economy of our world. Now, in California and other cities in America, you can get on a Waymo taxi. Waymo — it's 2025. The original work was done in the late '90s. The original challenge at Stanford was done, I believe, in 2004. The DARPA Grand Challenge. It was 2004. Sebastian Thrun won one.
00:46:48 So, more than 20 years from a visible demonstration to our ability to use it in daily life. Why? It's hard. It's deep tech. It's regulated, and all of that. And I think that's going to be true especially for robots that are interacting with humans. They're going to get regulated. You're not going to be wandering around and the robot decides to slap you. It just doesn't — you know, society's not going to allow that sort of thing. It's just not going to allow it.
00:47:16 So, in the shorter term — five or 10 years — I'm going to argue that this is positive for jobs, in the following way. Okay. If you look at the history of automation and economic growth, automation starts with the lowest-status and most dangerous jobs and then works up the chain. So if you think about assembly lines and cars and, you know, furnaces and all these sorts of very, very dangerous jobs that our forefathers did — they don't do them anymore. They're done by robotic solutions of one kind or another, and typically not a humanoid robot but an arm. So the world dominated by arms that are intelligent and so forth will automate those functions. What happens to the people? Well, it turns out that the person who was working with the welder — who's now operating the arm — has a higher wage, and the company has higher profits because it's producing more widgets. So the company makes more money and the person makes more money, right? In that sense. Now you sit there and say, well, that's not true, because humans don't want to be retrained. Ah, but in the vision we're talking about, every single person will have a computer assistant that's very intelligent and helps them perform. And you take a person of normal intelligence or knowledge and you add, you know, a sort of accelerant — they can get a higher-paying job. So you sit there and go, well, why are there more jobs? There should be fewer jobs. That's not how economics works. Economics expands because the opportunities expand, profits expand, wealth expands, and so forth. So there's plenty of dislocation, but in aggregate, are there more people employed or fewer? The answer is more
00:49:04 就業人數還是更少?答案是更多
00:49:04 employed or fewer? The answer is more people with higher paying jobs.
00:49:04 就業人數還是減少?答案是更多人從事高薪工作。
00:49:05 people with higher paying jobs.
00:49:05 擁有高薪工作的人。
00:49:05 Is that true in India as well?
00:49:07 Uh, it will be, and you picked India because India has a positive demographic outlook, although their birth rate is now down to 2.0.
00:49:14 Huh. That's good.
00:49:16 The rest of the world is choosing not to have children. If you look at Korea, it's now down to 0.7 children per two parents.
00:49:23 Yeah.
00:49:24 China is down to one child per two parents.
00:49:27 It's evaporating.
00:49:28 Now, what happens in those situations? They completely automate everything, because it's the only way to increase national productivity. So the most likely scenario, at least in the next decade, is that it's a national emergency to use more AI in the workplace, to give people better-paying jobs and create more productivity in the United States, because our birth rate has been falling. And people have talked about this for 20 years. If you have this conversation and you ignore demographics, which is negative for humans, and economic growth, which occurs naturally because of capital investment, then you miss the whole story. Now, there are plenty of people who lose their jobs, but there's an awful lot of people who have new jobs.
00:50:09 And the typical simple example would be all those people who work in Amazon distribution centers and Amazon trucks; those jobs didn't exist until Amazon was created, right? Um, the number one job shortage right now in America is truck drivers. Why? Truck driving is a lonely, hard, low-paying, right, low-status job for good people. They don't want it. They want a better-paying job.
00:50:37 Right? Going back to education, it's really a crime that our industry has not invented the following product. The product that I wanted to build is a product that teaches every single human who wants to be taught, in their language, in a gamified way, the stuff they need to know to be a great citizen in their country. Right? That can all be done on phones now. It can all be learned, and you can all learn how to do it. And why do we not have that product? Right? The investment in the humans of the world is the best return; investing in knowledge, in capability, is always the right answer.
00:51:12 Let me try and get your opinion on this, because you're so influential. So, I've got about a thousand people in the companies where I'm the controlling shareholder, and I've been trying to tell them exactly what you just articulated. A lot of these people have been in the company for 10, 15 years. They're incredibly capable and loyal, but they've learned a specific white-collar skill. They worked really hard to learn the skill, and the AI is coming within no more than three years, and maybe two years. And the opportunity to retrain and have continuity is right now. But if they delay, which everyone seems to be doing, just "let's wait and see," what I'm trying to tell them is: if you wait and see, you're really screwing over that employee. So we are in wild agreement that this is going to happen, and the winners will be the ones who act.
00:52:03 Now, what's interesting is when you look at innovation history, the biggest companies, who you would think of as the slowest, have economic resources that the little companies typically don't, and they tend to eventually get there, right? So, watch what the big companies do.
00:52:21 Mhm.
00:52:21 Their CFOs and the people who measure things carefully, who are very, very intelligent, they say, "I'm done with that thousand-person engineering team that doesn't do very much. I want 50 people working in this other way, and we'll do something else for the other people."
00:52:35 And when you say big companies, we're thinking Google, Meta. We're not thinking, you know... big banks haven't done anything.
00:52:40 I'm thinking about big banks. Um, when I talk to CEOs, and I know a lot of them in traditional industries, what I counsel them is: you already have people in the company who know what to do. You just don't know who they are.
00:52:53 So call a review of the best ideas to apply AI in our business, and inevitably the first ones are boring: improve customer service, improve call centers and so forth. But then somebody says, you know, we could increase revenue if we built this product.
00:53:08 I'll give you another example. There's this whole industry of people who work on regulated user interfaces of one kind or another. I think user interfaces are largely going to go away, because if you think about it, the agents speak English, typically, or other languages. You can talk to them. You can say what you want. The UI can be generated. So I can say, generate me a set of buttons that allows me to solve this problem, and it's generated for you. Why do I have to be stuck in what is called the WIMP interface, windows, icons, menus, and pull-downs, that was invented at Xerox PARC, right, 50 years ago? Why am I still stuck in that paradigm? I just want it to work.
00:53:47 Yeah.
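To make the "generate me a set of buttons" idea concrete, here is a minimal, hypothetical TypeScript sketch of how a generated UI could work; it is an illustration, not a description of any product mentioned in the conversation. The names `UISpec`, `fetchUiSpec`, and `renderGeneratedUi` are invented, and the model call is stubbed with a canned JSON response: the point is only that the client renders controls from a spec produced on demand, rather than shipping a fixed layout.

```typescript
// Hypothetical sketch of a "generated UI": instead of a fixed WIMP layout,
// the assistant returns a small declarative spec and the client renders
// whatever controls the task needs, then discards them afterwards.

type UIButton = { label: string; action: string };
type UISpec = { title: string; buttons: UIButton[] };

// Stand-in for a real model call. Here it just returns a canned JSON spec,
// because no specific model or API is named in the conversation.
async function fetchUiSpec(request: string): Promise<UISpec> {
  const cannedJson = JSON.stringify({
    title: request,
    buttons: [
      { label: "Issue the refund", action: "refund" },
      { label: "Escalate to a human", action: "escalate" },
      { label: "Show order history", action: "history" },
    ],
  });
  return JSON.parse(cannedJson) as UISpec;
}

// Render the spec into plain DOM elements; the "interface" exists only for
// as long as the task does.
async function renderGeneratedUi(request: string, root: HTMLElement): Promise<void> {
  const spec = await fetchUiSpec(request);
  const heading = document.createElement("h2");
  heading.textContent = spec.title;
  root.appendChild(heading);
  for (const b of spec.buttons) {
    const btn = document.createElement("button");
    btn.textContent = b.label;
    btn.onclick = () => console.log(`dispatch action: ${b.action}`);
    root.appendChild(btn);
  }
}

renderGeneratedUi("Resolve this support ticket", document.body).catch(console.error);
```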
00:53:47 Kids in high school and college now: any different recommendations for where they go?
00:53:51 When you spend any time in a high school, or, I was at a conference yesterday where we had a drone challenge, and you watch the 15-year-olds, they're going to be fine. They're just going to be fine. It all makes sense to them, and we're in their way. Um, if I were...
00:54:10 Digital natives.
00:54:11 But they're more than digital natives. They get it. They understand the speed. It's natural to them. They're also, frankly, faster and smarter than we are, right? That's just how life works, I'm sorry to say. So we have wisdom, they have intelligence, they win, right? So in their case, I used to think the right answer was to go into biology. I now actually think going into the application of intelligence to whatever you're interested in is the best thing you can do as a young person.
00:54:42 Purpose driven.
00:54:43 Yeah. Any form of solution that you find interesting. Most, uh, most kids get into it for gaming reasons or something, and they learn how to program very young, so they're quite familiar with this. Um, I work at a particular university with undergraduates, and they're already doing different algorithms for reinforcement learning as sophomores. This shows you how fast this is happening at their level. They're going to be just fine.
00:55:09 They're responding to the economic signals, but they're also responding to their purpose, right? So, an example would be: you care about climate, which I certainly do. If you're a young person, why don't you figure out a way to simplify the climate science, to use simple foundation models to answer these core questions?
00:55:26 Yeah.
00:55:26 Why don't you figure out a way to use these powerful models to come up with new materials, right, that allow us again to address the carbon challenge? And why don't you work on energy systems, to have better and more efficient energy sources that are less carbon-intensive? You see my point?
00:55:41 Yeah.
00:55:41 You know, I've noticed, uh, because I have kids of exactly that era, and, um, there's a very clear step-function change, largely attributable, I think, to Google and Apple: they have the assumption that things will work. And if you go just a couple of years older, to the WIMP era like you described it, which I'll attribute more to Microsoft, the assumption is nothing will ever work; like, if I try to use this thing, it's going to crash, I'm going to...
00:56:05 What was also interesting was that in my career I used to give these speeches about the internet, which I enjoyed, uh, where I said, you know, the great thing about the internet is there's an off button, and you can turn off your off button and actually have dinner with your family, and then you can turn it back on after dinner. This is no longer possible. So the distinction between the real world and the digital world has become confusing. But none of us are offline for any significant period of time.
00:56:34 Yeah. And indeed, the reward system in the world has now caused us to not even be able to fly in peace, right? Drive in peace, take a train in peace.
00:56:42 Starlink is everywhere.
00:56:44 Right. And that ubiquitous connectivity has some negative impact in terms of psychological stress, uh, loss of emotional and physical health and so forth. But the benefit of that productivity is without question.
00:56:57 Every day I get the strangest compliment. Someone will stop me and say, "Peter, you have such nice skin." Honestly, I never thought I'd hear that from anyone. And honestly, I can't take the full credit. All I do is use something called OneSkin OS-01 twice a day, every day. The company is built by four brilliant PhD women who've identified a peptide that effectively reverses the age of your skin. I love it. And again, I use this twice a day, every day. You can go to oneskin.co and write "Peter" at checkout for a discount on the same product I use. That's oneskin.co, and use the code Peter at checkout. All right, back to the episode.
00:57:37 Google I/O was amazing. I mean, just hats off to the entire team there. Um, Veo 3 was shocking, and we're sitting here 8 miles from Hollywood, and I'm just wondering your thoughts on the impact this will have. You know, are we going to see the one-person feature film, like we're seeing potentially one-person unicorns in the future with AI? Are we going to see an individual able to compete with a Hollywood studio? And should they be worried about their assets?
00:58:16 Well, they should always be worried, because of intellectual property issues and so forth. Um, I think blockbusters are likely to still be put together by people, with an awful lot of help from AI.
00:58:27 Mhm.
00:58:27 Um, I don't think that goes away. Um, if you look at what we can do with generating long-form video, it's very expensive to do long-form video, although that will come down. And there's also an occasional extra leg or extra clock or whatever; it's not perfect yet, and that requires human editing. So even in the scenario where a lot of the video is created by a computer, there are going to be humans who are producing it and directing it, for reasons.
00:58:54 My best example in Hollywood, and I was at a studio where they were showing me this: they happened to have an actor, a young man, who was recreating William Shatner's movements, and they had licensed the likeness from, you know, William Shatner, who's now older. They put his head on this person's body, and it was seamless. Well, that's pretty impressive. That's more revenue for everyone. The unknown actor becomes a bit more famous, Mr. Shatner gets more revenue, the whole movie genre works. That's a good thing.
00:59:32 Another example is that nowadays they use green screens rather than sets. And furthermore, in the alien department, when you have, you know, scary movies, instead of having the makeup person, they just add the makeup digitally. So, who wins? The costs are lower, the movies are made quicker, and in theory the movies are better, right? Because you have more choices. Um, so everybody wins. Who loses? Well, there was somebody who built that set, and that set isn't needed anymore. That's a carpenter, a very talented person, who now has to go get a job in the carpentry business.
01:00:09 the carpentry business. So again, I think people get confused. If I look at
01:00:09 木工業。所以,我認為人們會感到困惑。如果我看看
01:00:11 think people get confused. If I look at
01:00:11 我覺得人們會感到困惑。如果我看
01:00:11 think people get confused. If I look at at if I look at the digital
01:00:11 我覺得人們會感到困惑。如果我看看數字
01:00:12 at if I look at the digital
01:00:12 如果我看一下數字
01:00:12 at if I look at the digital transformation of entertainment subject
01:00:12 如果我看一下娛樂產業的數位轉型
01:00:15 transformation of entertainment subject
01:00:15 娛樂主題的轉型
01:00:15 transformation of entertainment subject to intellectual property being held,
01:00:15 受智慧財產權保護的娛樂轉型,
01:00:17 to intellectual property being held,
01:00:17 智慧財產權被持有,
01:00:17 to intellectual property being held, which is always a question, it's going
01:00:17 智慧財產權被持有,這始終是一個問題,它將
01:00:20 which is always a question, it's going
01:00:20 這始終是個問題,它會
01:00:20 which is always a question, it's going to be just fine,
01:00:20 這始終是個問題,一切都會好起來的,
01:00:21 to be just fine,
01:00:21 一切都好,
01:00:21 to be just fine, right? There's still going to be
01:00:21 沒事吧?還是會有
01:00:22 right? There's still going to be
01:00:22 對吧?還會有
01:00:22 right? There's still going to be blockbusters. The cost will go down, not
01:00:22 對吧?還會有大片。成本會下降,而不是
01:00:25 blockbusters. The cost will go down, not
01:00:25 大片。成本會下降,而不是
01:00:25 blockbusters. The cost will go down, not up, or the or the relative income
01:00:25 大片。成本會下降,而不是上升,或相對收入
01:00:28 up, or the or the relative income
01:00:28 上漲,或相對收入
01:00:28 up, or the or the relative income because in Hollywood, they essentially
01:00:28 或相對收入,因為在好萊塢,他們基本上
01:00:30 because in Hollywood, they essentially
01:00:30 因為在好萊塢,他們基本上
01:00:30 because in Hollywood, they essentially have their own accounting and they
01:00:30 因為在好萊塢,他們基本上有自己的會計部門,而且他們
01:00:31 have their own accounting and they
01:00:31 有自己的會計部門,
01:00:31 have their own accounting and they essentially allocate all the revenue to
01:00:31 有自己的會計,他們基本上把所有收入分配給
01:00:33 essentially allocate all the revenue to
01:00:33 基本上把所有收入分配給
01:00:33 essentially allocate all the revenue to all the key producing people. The the
01:00:33 基本上把所有收入分配給所有關鍵的生產者。
01:00:35 all the key producing people. The the
01:00:35 所有關鍵生產人員。
01:00:35 all the key producing people. The the allocation will shift to the people who
01:00:35 所有關鍵生產人員。分配將轉移到那些
01:00:37 allocation will shift to the people who
01:00:37 分配將會轉移到那些
01:00:38 allocation will shift to the people who are the most creative. That's a normal
01:00:38 分配將轉移到最有創造力的人身上。這是正常的
01:00:40 are the most creative. That's a normal
01:00:40 是最有創意的。這是正常的
01:00:40 are the most creative. That's a normal process. Remember we said earlier that
01:00:40 是最有創意的。這是一個正常的過程。記得我們之前說過
01:00:42 process. Remember we said earlier that
01:00:42 過程。記得我們之前說過
01:00:42 process. Remember we said earlier that automation gets rid of the poor the
01:00:42 流程。記得我們之前說過,自動化可以擺脫窮人
01:00:45 automation gets rid of the poor the
01:00:45 自動化擺脫了窮人
01:00:45 automation gets rid of the poor the lowest quality jobs, the most dangerous
01:00:45 自動化讓窮人擺脫了品質最低、最危險的工作
01:00:47 lowest quality jobs, the most dangerous
01:00:47 品質最低的工作,最危險
01:00:47 lowest quality jobs, the most dangerous jobs there. The jobs that are sort of
01:00:47 那裡的工作品質最低,最危險。這些工作
01:00:49 jobs there. The jobs that are sort of
01:00:49 那裡有工作。這些工作有點像
01:00:49 jobs there. The jobs that are sort of straightforward are probably automated,
01:00:49 那裡有工作。那些比較簡單的工作很可能已經自動化了。
01:00:52 straightforward are probably automated,
01:00:52 直接可能是自動化的,
01:00:52 straightforward are probably automated, but they're really creative jobs. Um,
01:00:52 那些簡單的工作可能已經自動化了,但它們確實是創造性的工作。嗯,
01:00:54 but they're really creative jobs. Um,
01:00:54 但它們確實是創造性的工作。嗯,
01:00:54 but they're really creative jobs. Um, another example, the script writers.
01:00:54 但它們確實是創造性的工作。嗯,再舉個例子,編劇。
01:00:56 another example, the script writers.
01:00:56 另一個例子,劇本作者。
01:00:56 another example, the script writers. You're still going to have script
01:00:56 另一個例子,劇本作者。你仍然需要劇本
01:00:57 You're still going to have script
01:00:57 你還是會有劇本
01:00:57 You're still going to have script writers, but they're going to have an
01:00:57 你仍然會有編劇,但他們將會有一個
01:00:58 writers, but they're going to have an
01:00:58 作家,但他們將會有一個
01:00:58 writers, but they're going to have an awful lot of help from AI to write even
01:00:58 作家,但他們將得到人工智慧的大量幫助,甚至
01:01:01 awful lot of help from AI to write even
01:01:01 人工智慧的幫助很大,甚至可以
01:01:01 awful lot of help from AI to write even better scripts. That's not bad.
01:01:01 人工智慧的幫助讓我們寫出了更好的劇本。這還不錯。
01:01:03 better scripts. That's not bad.
01:01:03 更好的劇本。還不錯。
01:01:03 Okay. I saw a study recently out of Stanford that documented AI being much more persuasive than the best humans.
01:01:13 Yes.
01:01:14 Uh, that set off some alarms. It also set off some interesting thoughts on the future of advertising. Any particular thoughts about that?
01:01:23 So we know the following. We know that if the system knows you well enough, it can learn to convince you of anything.
01:01:31 Mhm.
01:01:34 So what that means in an unregulated environment is that the systems will know you better and better. They'll get better at pitching you, and if you're not savvy, if you're not smart, you could be easily manipulated. We also know that the computer is better than humans trying to do the same thing. So none of this surprises me. The real question, and I'll ask this as a question, is: in the presence of unregulated misinformation engines, of which there will be many (advertisers, politicians, just criminal people, people trying to evade responsibility), there are all sorts of people who have free speech. When they have free speech, which includes the ability to use misinformation to their advantage, what happens to democracy?
01:02:20 Yeah, we've all grown up in democracies where there's a sort of consensus around trust, and there's an elite that more or less administers the trust vectors and so forth. There's a set of shared values. Do those shared values go away? In our book Genesis, we talk about this as a deeper problem. What does it mean to be human when you're interacting mostly with these digital things, especially if the digital things have their own scenarios? My favorite example is that you have a son or a grandson, a child or a grandchild, and you give them a bear, and the bear has a personality, and the child grows up, but the bear grows up too. So who regulates what the bear talks to the kid about?
01:03:04 Most people haven't actually experienced the super, super empathetic voice that can be any inflection you want. When they see that, which will be in the next probably two months...
01:03:12 Yeah, they're going to completely open their eyes to what this...
01:03:14 Well, remember that voice casting was solved a few years ago, and you can cast anyone else's voice onto your own.
01:03:22 Yeah.
01:03:23 And that has all sorts of problems.
01:03:25 Have you seen an avatar yet of somebody that you love who's passed away, or of Henry Kissinger, or anything like that?
01:03:31 Well, we actually created one with the permission of his family.
01:03:34 Did you start crying instantly?
01:03:35 Uh, it's very emotional. It's very emotional because, you know, it brings back... I mean, it's a real human, a real memory, a real voice. And I think we're going to see more of that. Now, one obvious thing that will happen is that at some point in the future, when we naturally die, our digital essence will live in the cloud.
01:03:56 Yeah.
01:03:56 And it will know what we knew at the time, and you can ask it a question.
01:04:00 Yeah.
01:04:00 So, can you imagine asking Einstein, going back to Einstein: what did you really think about, you know, this other guy? Did you actually like him, or were you just being polite with him in your letters?
01:04:11 Yeah.
01:04:11 Right. And in all those sort of famous contests that we study as students, can you imagine being able to ask the people...
01:04:20 Yeah.
01:04:21 ...today, you know, with today's retrospective: what did you really think? I know that the education example you gave earlier is so much more compelling when you're talking to Isaac Newton or Albert Einstein instead of just a...
01:04:33 But, you know, this is coming back to V3 and the movies. One of the first companies we incubated out of MIT, Course Advisor, we sold to Don Graham and the Washington Post, and then I was working for him for a year after that. The conception was: here's the internet, here's the newspaper, let's move the newspaper onto the internet, and we'll call it washingtonpost.com. And if you look at where it ended up today, with Meta, TikTok, YouTube, it didn't end up anything like "the newspaper moves to the internet." So now here's V3, here are movies. You can definitely make a long-form movie much more cheaply. But I just had this experience with somebody that I know: this director will try and make a tearjerker by leading me down a two-hour-long path. But I can get you to that same emotional state in about five minutes if it's personalized to you.
01:05:22 Well, one of the things that's happened because of the addictive nature of the internet is that we've lost, um, sort of the deep state of reading.
01:05:30 Mhm.
01:05:31 So, I was walking around and I saw a Borders, sorry, a Barnes & Noble bookstore. Big, oh my god, my old home is back, and I went in and I felt good. But it's a very fond memory. The fact of the matter is that people's attention spans are shorter. They consume things quicker. One of the interesting things about sports is that the sports highlights business is a huge business, licensed clips around highlights, because it's more efficient than watching the whole game. So, I suspect that if you're with your buddies and you want to be drinking and so forth, you put the game on; that's fine. But if you're a busy person, and you're busy with whatever you're busy with, and you want to know what happened with your favorite team, the highlights are good enough.
01:06:11 Yeah. You have four panes of it going at the same time, too.
01:06:14 And so, this is again a change, and it's a more fundamental change to attention.
01:06:17 Mhm.
01:06:18 I work with a lot of twenty-somethings in research, and one of the questions I had is: how do they do research in the presence of all of these stimulations? And I can answer the question definitively. They turn off their phone.
01:06:33 Yeah.
01:06:34 You can't think deeply as a researcher with this thing buzzing. And remember that part of the industry's goal was to fully monetize your attention.
01:06:46 Yeah.
01:06:46 Right. Aside from sleeping (and we're working on having you have less sleep, I guess from stress), we've essentially tried to monetize all of your waking hours with something: some form of ads, some form of entertainment, some form of subscription. That is completely antithetical to the way humans traditionally work with respect to long, thoughtful examination of principles, the time that it takes to be a good human being. These are in conflict. Right now there are various attempts at this. So, you know, my favorite are these digital apps that make you relax. Okay. So the correct thing to do to relax is to turn off your phone, right? And then relax in a traditional way, as humans have for, you know, 70,000 years of existence.
01:07:30 Yeah. Yeah. I had an incredible experience. I'm doing the flight from MIT to Stanford all the time. And, you know, like you said, attention spans are getting shorter and shorter and shorter, the TikTok extreme, you know, the clips are so short. This particular flight was my first time brainstorming with Gemini for six hours straight, and I completely lost track of time. I'm trying to figure out a circuit design and chip design for inference-time compute, and it's so good at brainstorming with me and bringing back data, so long as the Wi-Fi on the plane is working. Time went by. So that was my first experience with technology that went the other direction.
01:08:04 But notice that you also were not responding to texts and annoyances. You weren't reading ads. You were deep inside of a system for which you paid a subscription.
01:08:16 Mhm.
01:08:16 So if you look at the deep research stuff, one of the questions I have when you do a deep research analysis: I was looking at factory automation for something. Where is the boundary of factory automation versus human automation? It's an area I don't understand very well. It's a very, very deep technical set of problems. I didn't understand it. It took 12 minutes or so to generate this paper. Twelve minutes of these supercomputers is an enormous amount of time. What is it doing? Right. And the answer, of course, is that the product is fantastic.
01:08:44 Yeah. You know, to Peter's question earlier, too, I keep the Google IPO prospectus in my bathroom up in Vermont. It's from 2004. I've read it probably 500 times. But I don't know if you remember. It's getting a little ratty, actually.
01:08:57 You're the only person besides me who did the same. I read it 500 times because I had to. It was legally required.
01:09:06 Well, I still read it because of the misconceptions; it's just such a great learning experience. But even before the IPO, if you think back, you know, there was this big debate about: will it be ad revenue, will it be subscription revenue, will it be paid inclusion, will the ads be visible, and all this confusion about how you're going to make money with this thing. Now, the internet moved to almost entirely ad revenue. But if you look at the AI models, you know, you've got your $20, now $200, subscription, and people are signing up like crazy. So, you know, it's ultra, ultra convincing. Is that going to be a form of ad revenue, where it convinces you to buy something, or no? Is it going to be subscription revenue, where people pay a lot more and there's no advertising at all?
01:09:47 No, but you have this with Netflix. There was this whole discussion about how you would fund movies through ads, and the answer is you don't. You have a subscription. And the Netflix people looked at having free movies, without a subscription and advertising-supported, and the math didn't work. So I think both will be tried. I think the fact of the matter is that deep research, at least at the moment, is going to be chosen for well-to-do or professional tasks. You are capable of spending that $200 a month. A lot of people cannot afford it. And that free service, remember, is the thing that is the stepping stone for that young person, man or woman, who just needs that access. My favorite story there is that when I was at Google and I went to Kenya, and Kenya is a great country, I was with this computer science professor and he said, "I love Google." I said, "Well, I love Google, too." And he goes, "Well, I really love Google." I said, "I really love Google, too." And I said, "Why do you really love Google?" He said, "Because we don't have textbooks." And I thought, "The top computer science program in the nation does not have textbooks."
01:10:51 Yeah. Well, let me jump in on a couple of things here. Eric, in the next few years, what moats actually exist for startups as AI is coming in and disrupting... do you have a list?
01:11:10 Yes, I'll give you a simple answer.
01:11:12 And what do you look for in the companies that you're investing in?
01:11:14 So first, in the deep-tech hardware stuff, there are going to be patents, filings, inventions, you know, the hard stuff. Those things are much slower than the software industry in terms of growth, and they're just as important. You know, power systems, all those robotic systems we've been waiting on for a long time. It's just slower, for all sorts of reasons; hardware is hard. In software, it's pretty clear to me it's going to be really simple. Software is typically a network-effect business where the fastest mover wins. The fastest mover is the fastest learner in an AI system. So what I look for is a company where they have a loop. Ideally, they have a couple of learning loops. I'll give you a simple learning loop: as you get more people, more people click, and you learn from their clicks. They express their preferences. So let's say I invent a whole new consumer thing, which I don't have an idea for right now, but imagine I did. And furthermore, say I don't know anything about how consumers behave, but I'm going to launch this thing. The moment people start using it, I'm going to learn from them, and I'll have instantaneous learning to get smarter about what they want. So I start from nothing. If my learning slope is this, I'm essentially unstoppable. I'm unstoppable because my learning advantage, by the time my competitor figures out what I've done, is too great.
01:12:42 Yeah.
01:12:42 Now, how close can my competitor be and still lose? The answer is a few months.
01:12:48 Mhm.
01:12:48 Because the slopes are exponential.
01:12:50 Mhm.
01:12:51 And so it's likely to me that there will be another 10 fantastic Google-scale, Meta-scale companies. They'll all be founded on this principle of learning loops. And when I say learning loops, I mean in the core product, solving the current problem as fast as you can. If you cannot define the learning loop, you're going to be beaten by a company that can define it.
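A minimal sketch of that learning-loop argument, under assumed toy-model dynamics: product capability compounds with the share of usage (and therefore data) it attracts, and users allocate usage in proportion to relative capability. The rates and step counts below are illustrative assumptions, not figures from the conversation.

# Toy learning-loop model: capability compounds with the usage a product
# attracts, and usage follows relative capability.
def simulate(head_start=3, steps=36, learn_rate=0.2):
    q_a, q_b = 1.0, 1.0                          # starting capability of each product
    for t in range(steps):
        if t < head_start:
            q_a *= 1.0 + learn_rate              # competitor not launched: A gets all the learning
        else:
            share_a = q_a / (q_a + q_b)          # users split in proportion to capability
            q_a *= 1.0 + learn_rate * share_a
            q_b *= 1.0 + learn_rate * (1.0 - share_a)
    return q_a / (q_a + q_b)                     # final usage share of the first mover

for lead in (0, 3, 6):
    print(f"head start of {lead} steps -> first mover ends with {simulate(lead):.0%} of usage")

With no head start the two products stay at 50/50 indefinitely; a head start of just a few steps is enough for the first mover's share to run away, which is the "a few months" point about exponential slopes.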
01:13:14 And you said 10 Meta- and Google-sized companies. Do you think there will also be a thousand, like if you look at the enterprise software business, you know, Oracle on down, PeopleSoft, whatever, thousands of those? Or will they all consolidate into those 10 domain-dominant learning-loop companies?
01:13:33 Um, I think I'm largely speaking about consumer scale, because that's where the real growth is. The problem with learning loops is that if your customer is not ready for you, you can only learn at a certain rate. So it's probably the case that the government is not interested in learning, and therefore there's no growth in a learning loop serving the government. I'm sorry to say that needs to get fixed.
01:13:56 Yeah.
01:13:57 Um, educational systems are largely regulated and run by the unions and so forth. They're not interested in innovation. They're not going to be doing any learning. I'm sorry to say that has to get fixed. So the ones where there's a very fast feedback signal are the ones to watch. Another example: it's pretty obvious that you can build a whole new stock-trading company where, if you get the algorithms right, you learn faster than everyone else, and scale matters. So in the presence of scale and fast learning loops, that's the moat. Now, I don't know that there are many others there.
01:14:32 Do you think brand would be a moat?
01:14:33 Uh, brand matters, but less so. What's interesting is that people now seem to be perfectly willing to move from one thing to the other, at least in the digital world. And there's a whole new set of brands that have emerged that everyone is using, you know, the next generations that I haven't even heard of.
01:14:51 Within those learning loops, do you think domain-specific synthetic data is a big advantage?
01:14:58 Well, the answer is whatever causes faster learning. There are applications where you have enough training data from humans. There are applications where you have to generate the training data from what the humans are doing, right? So you could imagine a situation where you had a learning loop with no humans involved, where it's monitoring something, some sensors, but because you learn faster on those sensors, you get so smart you can't be replaced by another sensor-management company. That's the way to think about it.
01:15:25 So what about the capital for the learning loop? Because, um, do you know Daniela Rus, who runs CSAIL? So Daniela and I are really good friends. We've been talking to our governor, Maura Healey, who's one of the best governors in the world.
01:15:36 I agree.
01:15:37 So there's a problem in our academic systems where the big companies have all the hardware, because they have all the money, and the universities do not have the money for even reasonably sized data centers. I was with one university where, after lots and lots of meetings, they agreed to spend $50 million on a data center, which gets you less than a thousand GPUs, right, for the entire campus and all the research.
01:16:02 Yeah.
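As a rough sanity check on that figure, with assumed (not quoted) unit costs of roughly $35,000 per data-center-class GPU plus about 50% on top for servers, networking, power and the facility, $50 million does land at fewer than a thousand accelerators:

# Back-of-the-envelope check of the "$50 million buys fewer than a thousand GPUs" remark.
# The unit price and overhead factor are assumptions chosen for illustration.
budget = 50_000_000        # dollars
gpu_price = 35_000         # assumed price of one data-center-class GPU
overhead = 0.5             # assumed extra 50% for servers, networking, power, facility
all_in_per_gpu = gpu_price * (1 + overhead)
print(f"{budget / all_in_per_gpu:.0f} GPUs all-in")   # roughly 950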
01:16:02 And that doesn't even include the terabytes of storage and so forth. So I and others are working on this as a philanthropic matter. The government is going to have to come in with more money for universities for this kind of stuff. That is among the best investments. When I was young, I was on a National Science Foundation scholarship, and by the way, I made $15,000 a year. The return to the nation on that $15,000 has been very good, shall we say, based on the taxes that I pay and the jobs that we have created.
01:16:32 So, core question. So creating an ecosystem for the next generation to have access to the systems is important. It's not obvious to me that they need billions of dollars. It's pretty obvious to me that they need a million dollars, two million dollars.
01:16:52 Yeah, that's the goal.
01:16:53 Yeah.
01:16:54 I want to take us in the direction of wrapping up on super intelligence and the book. Um, we didn't finish the timeline on super intelligence, and I think it's important to give people a sense of how quickly the self-referential learning can get going and how rapidly we can get to something, you know, a thousand times, a million times, a billion times more capable than a human. On the flip side of that, Eric, when I look at my greatest concerns once we get through this five-to-seven-year period of, let's just say, rogue actors and stabilization and such, one of the biggest concerns I have is the diminishment of human purpose.
01:17:41 Mhm.
01:17:41 Um, you know, you wrote in the book, and I've listened to it, haven't read it physically, and my kids say you don't read anymore; you listen to books, you don't read. But you said the real risk is not Terminator, it's drift. You argue that AI won't destroy humanity violently, but might slowly erode human values, autonomy, and judgment if left unregulated, misunderstood. So it's really a Wall-E-like future versus a Star Trek boldly-go-out-there future.
01:18:15 In the book, and in my own personal view, it's very important that human agency be protected.
01:18:23 Yeah.
01:18:23 Human agency means the ability to get up in the day and do what you want, subject to the law, right? And it's perfectly possible that these digital devices can create a form of virtual prison where you don't feel that you, as a human, can do what you want, right? That is to be avoided. I'm not worried about that case. I'm more worried about the case that if you want to do something, it's just so much easier to ask your robot or your AI to do it for you.
01:18:54 The human spirit that wants to overcome a challenge... I mean, the unchallenged life is so critical.
01:19:00 But there will always be new challenges. When I was a boy, one of the things that I did is I would repair my father's car, right? I don't do that anymore. When I was a boy, I used to mow the lawn. I don't do that anymore.
01:19:13 Sure.
01:19:13 Right. So there are plenty of examples of things that we used to do that we don't need to do anymore. But there will be plenty of things. Just remember, the complexity of the world that I'm describing is not a simple world. Just managing the world around you is going to be a full-time and purposeful job, partly because there will be so many people pushing misinformation and fighting for your attention, and there's obviously lots of competition and so forth. There are lots of things to worry about. Plus, you have all of the people, you know, trying to get your money, create opportunities, deceive you, what have you. So I think human purpose will remain, because humans need purpose. That's the point. And you know, there's lots of literature showing that people who have what we would consider to be low-paying, worthless jobs enjoy going to work. So the challenge is not to get rid of their job. It's to make their job more productive using AI tools. They're still going to go to work. And, to be very clear, this notion that we're all going to be sitting around doing poetry is not happening, right? In the future there will be lawyers. They'll use tools to bring even more complex lawsuits against each other, right? There will be evil people who will use these tools to create even more evil problems. There will be good people who will be trying to deter the evil people. The tools change, but the structure of humanity, the way we work together, is not going to change.
01:20:38 Peter and I were on Mike Saylor's yacht a couple of months ago, and I was complaining that the curriculum is completely broken in all these schools. But what I meant was that we should be teaching AI. And he said, "Yeah, they should be teaching aesthetics." And I looked at him, like, "What the hell are you talking about?" He said, "No, in the age of AI, which is imminent, look at everything around you. Whether it's good or bad, enjoyable or not enjoyable, it's all about designing aesthetics." When the AI is such a force multiplier that you can create virtually anything, what are you creating and why? And that becomes the challenge.
01:21:10 If you look at Wittgenstein and the theories around all of this stuff, fundamentally we're having the conversation that America has about tasks and outcomes. It's our culture. But there are other aspects of human life: meaning, thinking, reasoning. We're not going to stop doing that. So imagine if your purpose in life in the future is to figure out what's going on, and to be successful, just figuring that out is sufficient, because once you've figured it out, it's taken care of for you.
01:21:41 That's beautiful, right? That provides purpose.
01:21:44 Yeah. Um, it's pretty clear that robots will take over an awful lot of mechanical or manual work. And for people who like to, you know... I like to repair the car. I don't do it anymore. I miss it, but I have other things to do with my time.
01:21:59 Yeah. Take me forward. When do you see uh what
01:21:59 是的。帶我往前走。你什麼時候能看到呃什麼
01:22:03 Take me forward. When do you see uh what
01:22:03 帶我往前走。你什麼時候看到呃什麼
01:22:03 Take me forward. When do you see uh what you define as digital super
01:22:03 帶我繼續前進。什麼時候會看到你定義的數字超級
01:22:04 you define as digital super
01:22:04 你將其定義為數字超級
01:22:04 you define as digital super intelligence?
01:22:04 你如何定義數位超級智慧?
01:22:06 intelligence? 01:22:06 情報?
01:22:06 intelligence? Uh within 10 years.
01:22:06 智力?呃,10年內。
01:22:07 Uh within 10 years.
01:22:07 呃,10年內。
01:22:07 Uh within 10 years. Within 10 years. And what do people need
01:22:07 呃,10年內。 10年內。人們需要什麼
01:22:09 Within 10 years. And what do people need
01:22:09 十年之內。人們需要什麼
01:22:09 Within 10 years. And what do people need to know about that?
01:22:09 十年之內。人們需要了解什麼呢?
01:22:11 to know about that?
01:22:11 知道這個嗎?
01:22:11 to know about that? What do people need to understand and
01:22:11 需要了解這一點嗎?人們需要了解什麼?
01:22:13 What do people need to understand and
01:22:13 人們需要了解什麼?
01:22:13 What do people need to understand and sort of uh prepare themselves for either
01:22:13 人們需要了解什麼,並為此做好準備
01:22:17 sort of uh prepare themselves for either
01:22:17 做好準備
01:22:17 sort of uh prepare themselves for either from as a parent or as an employee or as
01:22:17 為作為父母、員工或
01:22:21 from as a parent or as an employee or as
01:22:21 作為父母、員工或
01:22:21 from as a parent or as an employee or as a CEO?
01:22:21 身為父母、員工還是執行長?
01:22:23 a CEO? 01:22:23 執行長?
01:22:23 a CEO? One way to think about it is that when
01:22:23 執行長?我們可以這樣想:
01:22:26 One way to think about it is that when
01:22:26 可以這樣想:
01:22:26 One way to think about it is that when digital super intelligence finally
01:22:26 我們可以這樣想:當數位超級智慧最終
01:22:28 digital super intelligence finally
01:22:28 數位超級智慧終於
01:22:28 digital super intelligence finally arrives and is generally available and
01:22:28 數位超級智慧終於到來,並且普遍可用,
01:22:30 arrives and is generally available and
01:22:30 到達,通常可用,並且
01:22:30 arrives and is generally available and generally safe, you're going to have
01:22:30 到達,並且通常可用且通常安全,你將擁有
01:22:33 generally safe, you're going to have
01:22:33 整體來說很安全,你會
01:22:33 generally safe, you're going to have your own polymath.
01:22:33 整體來說很安全,你會擁有自己的博學者。
01:22:35 your own polymath.
01:22:35 你自己的博學者。
01:22:36 your own polymath. So you're going to have the sum of
01:22:36 你自己的博學者。所以你將得到
01:22:37 So you're going to have the sum of
01:22:37 所以你將得到
01:22:38 So you're going to have the sum of Einstein and Leonardo da Vinci in the
01:22:38 所以你將得到愛因斯坦和李奧納多達文西的總和
01:22:40 Einstein and Leonardo da Vinci in the
01:22:40 愛因斯坦和李奧納多達文西在
01:22:40 Einstein and Leonardo da Vinci in the equivalent of your pocket. I think
01:22:40 你的口袋裡裝著愛因斯坦和達文西。我覺得
01:22:43 equivalent of your pocket. I think
01:22:43 相當於你的口袋。我認為
01:22:43 equivalent of your pocket. I think thinking about how you would use that
01:22:43 相當於你的口袋。我想想想你會如何使用它
01:22:44 thinking about how you would use that
01:22:44 思考如何使用它
01:22:44 thinking about how you would use that gift is interesting. And of course evil
01:22:44 想想你會如何運用這份天賦很有趣。當然,還有邪惡
01:22:48 gift is interesting. And of course evil
01:22:48 禮物很有趣。當然還有邪惡
01:22:48 gift is interesting. And of course evil people will become more evil, but the
01:22:48 禮物很有趣。當然,邪惡的人會變得更加邪惡,但
01:22:50 people will become more evil, but the
01:22:50 人們會變得更邪惡,但
01:22:50 people will become more evil, but the vast majority of people are good. Yes,
01:22:50 人們會變得更邪惡,但絕大多數人都是善良的。是的,
01:22:53 vast majority of people are good. Yes,
01:22:53 絕大多數人都是善良的。是的,
01:22:53 vast majority of people are good. Yes, they're well-meaning, right? So going
01:22:53 絕大多數人都是善良的。是的,他們都是好心,對吧?所以
01:22:55 they're well-meaning, right? So going
01:22:55 他們是好意,對吧?所以
01:22:55 they're well-meaning, right? So going back to your abundance argument, there
01:22:55 他們是好意,對吧?所以回到你的富足論證,
01:22:57 back to your abundance argument, there
01:22:57 回到你的豐富論點,
01:22:57 back to your abundance argument, there are people who've studied the
01:22:57 回到你的豐富論點,有人研究過
01:22:59 are people who've studied the
01:22:59 是研究過
01:22:59 are people who've studied the notion of productivity increases and
01:22:59 是研究生產力提高概念的人,
01:23:01 notion of productivity increases and
01:23:01 生產力提高的概念和
01:23:01 notion of productivity increases and they believe that you can get, we'll see,
01:23:01 生產力提升的概念,他們相信你可以達到,我們拭目以待,
01:23:04 they believe that you can get, we'll see,
01:23:04 他們相信你可以達到,我們拭目以待,
01:23:04 they believe that you can get, we'll see, to 30% year-over-year economic growth
01:23:04 他們相信我們能夠達到 30% 的年成長
01:23:06 to 30% year-over-year economic growth
01:23:06 年成長30%
01:23:06 to 30% year-over-year economic growth through abundance and so forth. That's a
01:23:06 透過豐裕的資源等,實現年均30%的經濟成長。這是一個
01:23:09 through abundance and so forth. That's a
01:23:09 透過豐富等等。這是一個
01:23:09 through abundance and so forth. That's a very wealthy world. That's a world of
01:23:09 透過富足等等。這是一個非常富裕的世界。這是一個
01:23:12 very wealthy world. That's a world of
01:23:12 非常富裕的世界。這是一個
01:23:12 very wealthy world. That's a world of much less disease, many more choices,
01:23:12 非常富裕的世界。這個世界疾病更少,選擇更多,
01:23:15 much less disease, many more choices,
01:23:15 疾病更少,選擇更多,
01:23:15 much less disease, many more choices, much more fun if you will, right? Just
01:23:15 疾病少了,選擇多了,樂趣也多了,對吧?只是
01:23:17 much more fun if you will, right? Just
01:23:17 更有趣,對吧?只是
01:23:17 much more fun if you will, right? Just taking all those poor people and lifting
01:23:17 更有趣,對吧?把那些可憐的人都帶走,然後
01:23:19 taking all those poor people and lifting
01:23:19 把那些可憐的人帶走,
01:23:19 taking all those poor people and lifting them out of the daily struggle they
01:23:19 幫助所有窮人,讓他們擺脫日常的掙扎
01:23:21 them out of the daily struggle they
01:23:21 讓他們擺脫日常的掙扎
01:23:21 them out of the daily struggle they have. That is a great human goal. That's
01:23:21 讓他們擺脫日常的困境。這是人類偉大的目標。
01:23:23 have. That is a great human goal. That's
01:23:23 擁有。這是一個偉大的人類目標。
01:23:24 have. That is a great human goal. Let's focus on that. That's the goal we should
01:23:24 擁有。這是人類偉大的目標。這就是重點。這就是我們應該追求的目標
01:23:25 focus on that. That's the goal we should
01:23:25 專注於此。這就是我們應該實現的目標
01:23:25 focus on that. That's the goal we should have. Does GDP still have meaning in
01:23:25 專注於此。這才是我們該有的目標。 GDP 仍然有意義?
01:23:28 have. Does GDP still have meaning in
01:23:28 有。 GDP 在
01:23:28 have. Does GDP still have meaning in that world?
01:23:28 有。在那個世界裡,GDP 還有意義嗎?
01:23:28 that world? 01:23:28 那個世界?
01:23:28 that world? If you include services, it does. Um,
01:23:28 那個世界?如果算上服務的話,確實如此。嗯,
01:23:31 If you include services, it does. Um,
01:23:31 若算上服務的話,確實如此。嗯,
01:23:31 If you include services, it does. Um, one of the things about manufacturing
01:23:31 如果算上服務業,確實如此。嗯,製造業的一個特點是
01:23:33 one of the things about manufacturing
01:23:33 製造業的一個特點
01:23:33 one of the things about manufacturing and and everyone's focused on trade
01:23:33 關於製造業,每個人都關注貿易
01:23:34 and and everyone's focused on trade
01:23:34 每個人都關注貿易
01:23:34 and and everyone's focused on trade deficits and they don't understand the
01:23:34 每個人都關注貿易逆差,他們不明白
01:23:36 deficits and they don't understand the
01:23:36 赤字,他們不明白
01:23:36 deficits and they don't understand the vast majority of modern economies are
01:23:36 赤字,他們不明白絕大多數現代經濟體
01:23:38 vast majority of modern economies are
01:23:38 絕大多數現代經濟體
01:23:38 vast majority of modern economies are service economies, not manufacturing
01:23:38 絕大多數現代經濟體是服務經濟,而不是製造業
01:23:40 service economies, not manufacturing
01:23:40 服務經濟,而非製造業
01:23:40 service economies, not manufacturing economies. And if you look at the
01:23:40 服務經濟,而不是製造業經濟。如果你看看
01:23:41 economies. And if you look at the
01:23:41 經濟體。如果你看看
01:23:42 economies. And if you look at the percentage of farming, it was roughly
01:23:42 經濟體。如果你看一下農業的佔比,它大約是
01:23:44 percentage of farming, it was roughly
01:23:44 農業佔比大約是
01:23:44 percentage of farming, it was roughly 98% to roughly 2 or 3% in America over a
01:23:44 農業佔比,在美國從大約 98% 降到大約 2% 或 3%,經過一
01:23:47 98% to roughly 2 or 3% in America over a
01:23:47 在美國,這一比例從 98% 降至約 2% 或 3%。
01:23:47 98% to roughly 2 or 3% in America over a hundred years. If you look at
01:23:47 一百年來,美國的農業佔比從98%降至約2%或3%。如果你看看
01:23:48 hundred years. If you look at
01:23:48 一百年。如果你看看
01:23:48 hundred years. If you look at manufacturing, the heydays in the 30s
01:23:48 一百年。如果你看看製造業,30年代的鼎盛時期
01:23:50 manufacturing, the heydays in the 30s
01:23:50 製造業,30 年代的鼎盛時期
01:23:50 manufacturing, the heydays in the 30s and 40s and 50s, those percentages are
01:23:50 製造業,30 年代、40 年代和 50 年代的鼎盛時期,這些百分比是
01:23:53 and 40s and 50s, those percentages are
01:23:53 和 40 年代和 50 年代,這些百分比是
01:23:53 and 40s and 50s, those percentages are now down. Well, lower than 10%. It's not
01:23:53 以及 40 多歲和 50 多歲,這些百分比現在下降了。嗯,低於 10%。這不是
01:23:56 now down. Well, lower than 10%. It's not
01:23:56 現在下降了。嗯,低於10%。
01:23:56 now down. Well, lower than 10%. It's not because we don't buy stuff. It's because
01:23:56 現在下降了。嗯,低於10%。不是因為我們不買東西,而是因為
01:23:58 because we don't buy stuff. It's because
01:23:58 因為我們不買東西。這是因為
01:23:58 because we don't buy stuff. It's because the stuff is automat automated. You need
01:23:58 因為我們不買東西。而是因為東西都自動化了。你需要
01:24:00 the stuff is automat automated. You need
01:24:00 這些東西都是自動的。你需要
01:24:00 the stuff is automat automated. You need fewer people. Those there's plenty of
01:24:00 一切都是自動化的。你需要更少的人手。
01:24:03 fewer people. Those there's plenty of
01:24:03 人少了。還有很多
01:24:03 fewer people. Those there's plenty of people working in other jobs. So again,
01:24:03 人少了。還有很多人在做其他工作。所以,
01:24:06 people working in other jobs. So again,
01:24:06 從事其他工作的人。所以,
01:24:06 people working in other jobs. So again, look at the totality of the society. Is
01:24:06 從事其他工作的人。所以,再看看整個社會。
01:24:09 look at the totality of the society. Is
01:24:09 看看整個社會。
01:24:09 look at the totality of the society. Is it healthy?
01:24:09 看看整個社會。它健康嗎?
01:24:10 it healthy? 01:24:10 它健康嗎?
01:24:10 it healthy? If you look in China, it's easy to
01:24:10 是否健康?如果你看看中國,很容易
01:24:12 If you look in China, it's easy to
01:24:12 如果你看看中國,很容易
01:24:12 If you look in China, it's easy to complain about them. Um they have now
01:24:12 如果你看看中國,很容易抱怨他們。嗯,他們現在
01:24:15 complain about them. Um they have now
01:24:15 抱怨他們。嗯,他們現在
01:24:15 complain about them. Um they have now deflation. They have a term where people
01:24:15 抱怨他們。嗯,他們現在有通貨緊縮。他們有一個術語,指的是人們
01:24:18 deflation. They have a term where people
01:24:18 通貨緊縮。他們有一個術語,指的是人們
01:24:18 deflation. They have a term where people are it's called laying down where they
01:24:18 通貨緊縮。他們有一個術語,叫做“躺下”
01:24:20 are it's called laying down where they
01:24:20 這叫做「躺平」,他們
01:24:20 are it's called laying down where they stay at home. They don't
01:24:20 這叫做「躺平」,他們待在家裡。他們不
01:24:21 they stay at home. They don't
01:24:21 他們待在家裡。他們不
01:24:21 they stay at home. They don't participate in the workforce, which is
01:24:21 他們待在家裡。他們不參加工作,
01:24:23 participate in the workforce, which is
01:24:23 參與勞動力,也就是
01:24:23 participate in the workforce, which is counter to their traditional culture. If
01:24:23 參與勞動力,這與他們的傳統文化背道而馳。如果
01:24:25 counter to their traditional culture. If
01:24:25 與他們的傳統文化背道而馳。如果
01:24:25 counter to their traditional culture. If you look at reproduction rates, these
01:24:25 與他們的傳統文化背道而馳。如果你看看生育率,這些
01:24:27 you look at reproduction rates, these
01:24:27 你看看生育率,這些
01:24:27 you look at reproduction rates, these countries that are essentially having no
01:24:27 你看看繁殖率,這些國家基本上沒有
01:24:28 countries that are essentially having no
01:24:28 基本上沒有的國家
01:24:28 countries that are essentially having no children, that's not a good thing.
01:24:28 基本上沒有孩子的國家,這不是好事。
01:24:30 children, that's not a good thing.
01:24:30 孩子們,這不是好事。
01:24:30 children, that's not a good thing. Yeah.
01:24:30 孩子們,這不是好事。是的。
01:24:31 Yeah. 01:24:31 是的。
01:24:31 Yeah. Right. Those are problems that we're
01:24:31 是的。這些都是我們
01:24:32 Right. Those are problems that we're
01:24:32 對。這些都是我們
01:24:32 Right. Those are problems that we're going to face. Those are the new
01:24:32 對。這些都是我們將要面臨的問題。這些都是新的
01:24:34 going to face. Those are the new
01:24:34 即將面對。這些是新的
01:24:34 going to face. Those are the new problems of the age.
01:24:34 即將面臨的。這些都是時代的新問題。
01:24:35 problems of the age.
01:24:35 時代問題。
01:24:35 problems of the age. I love that.
01:24:35 時代問題。我喜歡這個。
01:24:37 I love that. 01:24:37 我喜歡這個。
01:24:37 I love that. Eric, uh, so grateful for your time.
01:24:37 我很喜歡。埃里克,嗯,非常感謝你抽出時間。
01:24:41 Eric, uh, so grateful for your time.
01:24:41 Eric,呃,非常感謝您抽出時間。
01:24:41 Eric, uh, so grateful for your time. Thank you. Thank you both. Um, I I love
01:24:41 Eric,嗯,非常感謝你的時間。謝謝。謝謝你們兩位。嗯,我愛你們
01:24:43 Thank you. Thank you both. Um, I I love
01:24:43 謝謝。謝謝你們倆。嗯,我愛
01:24:43 Thank you. Thank you both. Um, I I love your show.
01:24:43 謝謝。謝謝你們倆。嗯,我很喜歡你們的節目。
01:24:44 your show. 01:24:44 你的表演。
01:24:44 your show. Yeah. Thank you, buddy.
01:24:44 你的節目。是的。謝謝你,夥計。
01:24:45 Yeah. Thank you, buddy.
01:24:45 是的。謝謝你,夥計。
01:24:45 Yeah. Thank you, buddy. Thank you.
01:24:45 是的。謝謝你,夥計。謝謝你。
01:24:45 Thank you. 01:24:45 謝謝。
01:24:45 Thank you. Okay. Thank you, guys. If you could have
01:24:45 謝謝。好的。謝謝大家。如果你們能
01:24:47 Okay. Thank you, guys. If you could have
01:24:47 好的。謝謝大家。如果你們能
01:24:47 Okay. Thank you, guys. If you could have had a 10-year head start on the dot boom
01:24:47 好的。謝謝大家。如果你們能比網路泡沫早十年
01:24:49 had a 10-year head start on the dot boom
01:24:49 領先網路繁榮 10 年
01:24:49 had a 10-year head start on the dot boom back in the 2000s, would you have taken
01:24:49 如果你在 2000 年代網路泡沫爆發之前就領先 10 年,你會選擇
01:24:51 back in the 2000s, would you have taken
01:24:51 回到 2000 年代,你會採取
01:24:51 back in the 2000s, would you have taken it? Every week, I track the major tech
01:24:51 回到2000年代,你會接受嗎?每週我都會關注主要的科技
01:24:54 it? Every week, I track the major tech
01:24:54 是嗎?每週我都會關注主要的科技
01:24:54 it? Every week, I track the major tech meta trends. These are massive
01:24:54 是嗎?我每週都會追蹤主要的科技元趨勢。這些趨勢非常龐大
01:24:56 meta trends. These are massive
01:24:56 元趨勢。這些趨勢非常龐大
01:24:56 meta trends. These are massive game-changing shifts that will play out
01:24:56 元趨勢。這些是巨大的、改變遊戲規則的轉變,將
01:24:58 game-changing shifts that will play out
01:24:58 改變遊戲規則的轉變即將發生
01:24:58 game-changing shifts that will play out over the decade ahead. From humanoid
01:24:58 未來十年將發生的改變遊戲規則的轉變。來自人形機器人
01:25:00 over the decade ahead. From humanoid
01:25:00 未來十年。來自人形機器人
01:25:00 over the decade ahead. From humanoid robotics to AGI, quantum computing,
01:25:00 未來十年。從人形機器人到通用人工智慧、量子運算,
01:25:02 robotics to AGI, quantum computing,
01:25:02 從機器人技術到通用人工智慧、量子計算,
01:25:02 robotics to AGI, quantum computing, energy breakthroughs, and longevity. I
01:25:02 從機器人技術到通用人工智慧、量子運算、能源突破和長壽。我
01:25:04 energy breakthroughs, and longevity. I
01:25:04 能源突破與長壽。我
01:25:04 energy breakthroughs, and longevity. I cut through the noise and deliver only
01:25:04 能量突破,以及長壽。我排除乾擾,只傳遞
01:25:07 cut through the noise and deliver only
01:25:07 消除噪音,只傳遞
01:25:07 cut through the noise and deliver only what matters to our lives and our
01:25:07 消除噪音,只傳遞對我們的生活和我們的
01:25:09 what matters to our lives and our
01:25:09 對我們的生活和我們的
01:25:09 what matters to our lives and our careers. I send out a Metatrends
01:25:09 什麼對我們的生活和事業重要。我會發送 Metatrends
01:25:11 careers. I send out a Metatrends
01:25:11 職業。我會發送 Metatrends
01:25:11 careers. I send out a Metatrends newsletter twice a week as a quick
01:25:11 職業。我每週發送兩次 Metatrends 電子報,作為快速
01:25:13 newsletter twice a week as a quick
01:25:13 每週兩次的簡報
01:25:13 newsletter twice a week as a quick two-minute read over email. It's entirely
01:25:13 每週兩次的新聞通訊,以兩分鐘快速閱讀的電子郵件形式呈現。這完全
01:25:15 two-minute read over email. It's entirely
01:25:15 兩分鐘閱讀郵件。這完全是
01:25:15 two-minute read over email. It's entirely free. These insights are read by
01:25:15 兩分鐘閱讀郵件。完全免費。這些見解由
01:25:18 free. These insights are read by
01:25:18 免費。這些見解由
01:25:18 free. These insights are read by founders, CEOs, and investors behind
01:25:18 免費。這些見解由創辦人、執行長和投資者閱讀
01:25:20 founders, CEOs, and investors behind
01:25:20 創辦人、CEO 和投資者
01:25:20 founders, CEOs, and investors behind some of the world's most disruptive
01:25:20 一些全球最具顛覆性的公司背後的創始人、執行長和投資者
01:25:21 some of the world's most disruptive
01:25:21 一些世界上最具破壞性的
01:25:21 some of the world's most disruptive companies. Why? Because acting early is
01:25:21 一些世界上最具顛覆性的公司。為什麼?因為及早行動是
01:25:25 companies. Why? Because acting early is
01:25:25 公司。為什麼?因為早點行動是
01:25:25 companies. Why? Because acting early is everything. This is for you if you want
01:25:25 公司。為什麼?因為早點行動至關重要。如果你想
01:25:27 everything. This is for you if you want
01:25:27 一切。如果你想
01:25:27 everything. This is for you if you want to see the future before it arrives and
01:25:27 一切。如果你想在未來到來之前預見未來,那麼這就是為你準備的。
01:25:30 to see the future before it arrives and
01:25:30 在未來到來之前預見未來
01:25:30 to see the future before it arrives and profit from it. Sign up at
01:25:30 在未來到來之前預見未來,並從中獲利。請至
01:25:31 profit from it. Sign up at
01:25:31 從中獲利。請至
01:25:31 profit from it. Sign up at diamandis.com/metatrends
01:25:31 從中獲利。請至 diamandis.com/metatrends 註冊
01:25:33 diamandis.com/metatrends
01:25:33 diamandis.com/metatrends and be ahead of the next tech bubble.
01:25:33 diamandis.com/metatrends,並領先下一個科技泡沫。
01:25:36 and be ahead of the next tech bubble.
01:25:36 並領先下一個科技泡沫。
01:25:36 and be ahead of the next tech bubble. That's diamandis.com/metatrends.
01:25:36 領先下一輪科技泡沫。請造訪 diamandis.com/metatrends。
01:25:39 That's diamandis.com/metatrends.
01:25:39 這是 diamandis.com/metatrends。
01:25:39 That's diamandis.com/metatrends. [Music]
01:25:39 網址:diamandis.com/metatrends。 [音樂]
00:00:53 Hey, everybody. Welcome to Moonshots.
00:00:53 Hey, everybody. Welcome to Moonshots. I'm here live with my Moonshot mate,
00:00:54 I'm here live with my Moonshot mate,
00:00:54 I'm here live with my Moonshot mate, Dave London. Uh we're here in our Santa
00:00:57 Dave London. Uh we're here in our Santa
00:00:57 Dave London. Uh we're here in our Santa Monica studios and we have a special
00:00:58 Monica studios and we have a special
00:00:58 Monica studios and we have a special guest today,
00:01:00 guest today,
00:01:00 guest today, Eric Schmidt, the author of Genesis. We
00:01:03 Eric Schmidt, the author of Genesis. We
00:01:03 Eric Schmidt, the author of Genesis. We talk about China. We're going to talk
00:01:05 talk about China. We're going to talk
00:01:05 talk about China. We're going to talk about, you know, digital super
00:01:06 about, you know, digital super
00:01:06 about, you know, digital super intelligence. We'll talk about, you
00:01:08 intelligence. We'll talk about, you
00:01:08 intelligence. We'll talk about, you know, what people should be thinking
00:01:10 know, what people should be thinking
00:01:10 know, what people should be thinking about over the 10 years.
00:01:11 about over the 10 years.
00:01:11 about over the 10 years. And we're talking about the guy who has
00:01:14 And we're talking about the guy who has
00:01:14 And we're talking about the guy who has more access to more more actionable
00:01:16 more access to more more actionable
00:01:16 more access to more more actionable information than probably anyone else
00:01:17 information than probably anyone else
00:01:17 information than probably anyone else you could think of. So, it should be
00:01:20 you could think of. So, it should be
00:01:20 you could think of. So, it should be should be pretty exciting.
00:01:21 should be pretty exciting.
00:01:21 should be pretty exciting. Incredibly brilliant. All right, stand
00:01:23 Incredibly brilliant. All right, stand
00:01:23 Incredibly brilliant. All right, stand by for a conversation with the Eric
00:01:25 by for a conversation with the Eric
00:01:25 by for a conversation with the Eric Schmidt, CEO or past CEO of Google and
00:01:28 Schmidt, CEO or past CEO of Google and
00:01:28 Schmidt, CEO or past CEO of Google and an extraordinary investor and uh and
00:01:30 an extraordinary investor and uh and
00:01:30 an extraordinary investor and uh and thinker in this field of AI.
00:01:32 thinker in this field of AI.
00:01:32 thinker in this field of AI. Let's do it.
00:01:33 Let's do it.
00:01:33 Let's do it. Eric, welcome back to Moonshots.
00:01:35 Eric, welcome back to Moonshots.
00:01:35 Eric, welcome back to Moonshots. It's great to be here with you guys.
00:01:36 It's great to be here with you guys.
00:01:36 It's great to be here with you guys. Thank you. It's been uh it's been a long
00:01:39 Thank you. It's been uh it's been a long
00:01:39 Thank you. It's been uh it's been a long road since I first met you at Google. I
00:01:42 road since I first met you at Google. I
00:01:42 road since I first met you at Google. I remember uh our first conversations were
00:01:44 remember uh our first conversations were
00:01:44 remember uh our first conversations were fantastic. Uh it's been a crazy month in
00:01:48 fantastic. Uh it's been a crazy month in
00:01:48 fantastic. Uh it's been a crazy month in the world of AI, but I think every month
00:01:50 the world of AI, but I think every month
00:01:50 the world of AI, but I think every month from here is going to be a crazy month.
00:01:52 from here is going to be a crazy month.
00:01:52 from here is going to be a crazy month. And so I'd love to hit on a number of
00:01:55 And so I'd love to hit on a number of
00:01:55 And so I'd love to hit on a number of subjects and get your your take on them.
00:01:57 subjects and get your your take on them.
00:01:57 subjects and get your your take on them. I want to start with probably the most
00:01:59 I want to start with probably the most
00:01:59 I want to start with probably the most important point that you've made
00:02:00 important point that you've made
00:02:00 important point that you've made recently that got a lot of traction, a
00:02:02 recently that got a lot of traction, a
00:02:02 recently that got a lot of traction, a lot of attention, which is that AI is
00:02:05 lot of attention, which is that AI is
00:02:05 lot of attention, which is that AI is underhyped when the rest of the world is
00:02:07 underhyped when the rest of the world is
00:02:07 underhyped when the rest of the world is either confused, lost, or think it's,
00:02:09 either confused, lost, or think it's,
00:02:10 either confused, lost, or think it's, you know, not impacting us.
00:02:13 you know, not impacting us.
00:02:13 you know, not impacting us. We'll get into in more detail, but quick
00:02:16 We'll get into in more detail, but quick
00:02:16 We'll get into in more detail, but quick most important point to make there.
00:02:19 most important point to make there.
00:02:19 most important point to make there. AI is a learning machine. Yeah.
00:02:21 AI is a learning machine. Yeah.
00:02:21 AI is a learning machine. Yeah. And in network effect businesses, when
00:02:24 And in network effect businesses, when
00:02:24 And in network effect businesses, when the learning machine learns faster,
00:02:26 the learning machine learns faster,
00:02:26 the learning machine learns faster, everything accelerates.
00:02:28 everything accelerates.
00:02:28 everything accelerates. It accelerates to its natural limit. The
00:02:31 It accelerates to its natural limit. The
00:02:31 It accelerates to its natural limit. The natural limit is electricity.
00:02:35 natural limit is electricity.
00:02:35 natural limit is electricity. Not chips,
00:02:36 Not chips,
00:02:36 Not chips, electricity really. Okay.
00:02:39 electricity really. Okay.
00:02:39 electricity really. Okay. So that gets me to the next point here,
00:02:41 So that gets me to the next point here,
00:02:41 So that gets me to the next point here, which is uh a discussion on AI and
00:02:44 which is uh a discussion on AI and
00:02:44 which is uh a discussion on AI and energy. So, we saw recently was Meta
00:02:47 energy. So, we saw recently was Meta
00:02:47 energy. So, we saw recently was Meta recently announcing uh that they signed
00:02:50 recently announcing uh that they signed
00:02:50 recently announcing uh that they signed a 20-year nuclear contract with uh with
00:02:53 a 20-year nuclear contract with uh with
00:02:53 a 20-year nuclear contract with uh with Constellation Energy. We've seen Google,
00:02:56 Constellation Energy. We've seen Google,
00:02:56 Constellation Energy. We've seen Google, Microsoft, Amazon, everybody buying
00:02:59 Microsoft, Amazon, everybody buying
00:02:59 Microsoft, Amazon, everybody buying basically nuclear capacity right now.
00:03:02 basically nuclear capacity right now.
00:03:02 basically nuclear capacity right now. That's got to be weird
00:03:05 That's got to be weird
00:03:05 That's got to be weird uh that private companies are are
00:03:08 uh that private companies are are
00:03:08 uh that private companies are are basically taking over into their own
00:03:10 basically taking over into their own
00:03:10 basically taking over into their own hands what was utility function before.
00:03:14 hands what was utility function before.
00:03:14 hands what was utility function before. Um,
00:03:14 Um,
00:03:14 Um, well, just to be cynical, I I'm so glad
00:03:17 well, just to be cynical, I I'm so glad
00:03:17 well, just to be cynical, I I'm so glad those companies plan to be around the 20
00:03:19 those companies plan to be around the 20
00:03:19 those companies plan to be around the 20 years that it's going to take to get the
00:03:21 years that it's going to take to get the
00:03:21 years that it's going to take to get the nuclear power plants built.
00:03:23 nuclear power plants built.
00:03:23 nuclear power plants built. In my recent testimony, I talked about
00:03:26 In my recent testimony, I talked about
00:03:26 In my recent testimony, I talked about the the current expected need for the AI
00:03:28 the the current expected need for the AI
00:03:28 the the current expected need for the AI revolution in the United States is 92
00:03:31 revolution in the United States is 92
00:03:31 revolution in the United States is 92 gawatt of more power.
00:03:33 gawatt of more power.
00:03:33 gawatt of more power. For reference, one gawatt is one big
00:03:37 For reference, one gawatt is one big
00:03:37 For reference, one gawatt is one big nuclear power station. And there are
00:03:39 nuclear power station. And there are
00:03:39 nuclear power station. And there are none essentially being started now.
00:03:41 none essentially being started now.
00:03:41 none essentially being started now. And there have been two in the last
00:03:42 And there have been two in the last
00:03:42 And there have been two in the last what, 30 years built. There is
00:03:44 what, 30 years built. There is
00:03:44 what, 30 years built. There is excitement that there's an SMR, small
00:03:46 excitement that there's an SMR, small
00:03:46 excitement that there's an SMR, small modular reactor coming in at 300
00:03:48 modular reactor coming in at 300
00:03:48 modular reactor coming in at 300 megawws, but it won't start till 2030.
00:03:51 megawws, but it won't start till 2030.
00:03:51 megawws, but it won't start till 2030. As important as nuclear, both fision and
00:03:54 As important as nuclear, both fision and
00:03:54 As important as nuclear, both fision and fusion is, they're not going to arrive
00:03:56 fusion is, they're not going to arrive
00:03:56 fusion is, they're not going to arrive in time to get us what we need as a
00:03:59 in time to get us what we need as a
00:04:00 in time to get us what we need as a globe to deal with our many problems and
00:04:01 globe to deal with our many problems and
00:04:01 globe to deal with our many problems and the many opportunities that are before
00:04:03 the many opportunities that are before
00:04:03 the many opportunities that are before us. Do you think uh so if if you look at
00:04:05 us. Do you think uh so if if you look at
00:04:05 us. Do you think uh so if if you look at the sort of three-year timeline toward
00:04:07 the sort of three-year timeline toward
00:04:07 the sort of three-year timeline toward AGI, do you think if you started a a
00:04:10 AGI, do you think if you started a a
00:04:10 AGI, do you think if you started a a fusion reactor project today that won't
00:04:12 fusion reactor project today that won't
00:04:12 fusion reactor project today that won't come online for five, six, seven years,
00:04:15 come online for five, six, seven years,
00:04:15 come online for five, six, seven years, is there a probability that the AGI
00:04:17 is there a probability that the AGI
00:04:17 is there a probability that the AGI comes up with some other breakthrough
00:04:19 comes up with some other breakthrough
00:04:19 comes up with some other breakthrough fusion or otherwise that makes it
00:04:20 fusion or otherwise that makes it
00:04:20 fusion or otherwise that makes it irrelevant before it even gets online?
00:04:22 irrelevant before it even gets online?
00:04:22 irrelevant before it even gets online? A very good question. We don't know what
00:04:24 A very good question. We don't know what
00:04:24 A very good question. We don't know what artificial general intelligence will
00:04:27 artificial general intelligence will
00:04:27 artificial general intelligence will deliver. Yeah. And we certainly don't
00:04:29 deliver. Yeah. And we certainly don't
00:04:29 deliver. Yeah. And we certainly don't know what super intelligence will
00:04:31 know what super intelligence will
00:04:31 know what super intelligence will deliver, but we know it's coming.
00:04:34 deliver, but we know it's coming.
00:04:34 deliver, but we know it's coming. So, first we need to plan for it. And
00:04:36 So, first we need to plan for it. And
00:04:36 So, first we need to plan for it. And there's lots of issues as well as
00:04:38 there's lots of issues as well as
00:04:38 there's lots of issues as well as opportunities for that. But the fact of
00:04:39 opportunities for that. But the fact of
00:04:40 opportunities for that. But the fact of the matter is that the computing needs
00:04:42 the matter is that the computing needs
00:04:42 the matter is that the computing needs that we name now are going to come from
00:04:44 that we name now are going to come from
00:04:44 that we name now are going to come from traditional energy suppliers in places
00:04:47 traditional energy suppliers in places
00:04:47 traditional energy suppliers in places like the United States and the Arab
00:04:49 like the United States and the Arab
00:04:49 like the United States and the Arab world and Canada and the Western world.
00:04:51 world and Canada and the Western world.
00:04:51 world and Canada and the Western world. And it's important to note that China
00:04:54 And it's important to note that China
00:04:54 And it's important to note that China has lots of electricity. So if they get
00:04:57 has lots of electricity. So if they get
00:04:57 has lots of electricity. So if they get the chips, it's going to be one heck of
00:04:59 the chips, it's going to be one heck of
00:04:59 the chips, it's going to be one heck of a race.
00:04:59 a race.
00:04:59 a race. Yeah. They've been scaling it uh you
00:05:02 Yeah. They've been scaling it uh you
00:05:02 Yeah. They've been scaling it uh you know at two or three times. The US has
00:05:04 know at two or three times. The US has
00:05:04 know at two or three times. The US has been flat for how long in terms of
00:05:05 been flat for how long in terms of
00:05:06 been flat for how long in terms of energy production?
00:05:06 energy production?
00:05:06 energy production? Um from my perspective uh infinite. In
00:05:09 Um from my perspective uh infinite. In
00:05:09 Um from my perspective uh infinite. In fact,
00:05:10 fact,
00:05:10 fact, electricity demand declined for a while
00:05:12 electricity demand declined for a while
00:05:12 electricity demand declined for a while as has overall energy needs because of
00:05:14 as has overall energy needs because of
00:05:14 as has overall energy needs because of conservation and other things.
00:05:16 conservation and other things.
00:05:16 conservation and other things. But the data center story is the story
00:05:19 But the data center story is the story
00:05:19 But the data center story is the story of the energy people, right? And you sit
00:05:22 of the energy people, right? And you sit
00:05:22 of the energy people, right? And you sit there and you go, how could these data
00:05:23 there and you go, how could these data
00:05:23 there and you go, how could these data centers use so much power? Well, and
00:05:26 centers use so much power? Well, and
00:05:26 centers use so much power? Well, and especially when you think about how
00:05:28 especially when you think about how
00:05:28 especially when you think about how little power our brains do. Well, these
00:05:30 little power our brains do. Well, these
00:05:30 little power our brains do. Well, these are our best approximation in digital
00:05:33 are our best approximation in digital
00:05:33 are our best approximation in digital form of how our brains work. But when
00:05:35 form of how our brains work. But when
00:05:35 form of how our brains work. But when they start working together, they become
00:05:37 they start working together, they become
00:05:37 they start working together, they become superbrains. The promise of a superbrain
00:05:40 superbrains. The promise of a superbrain
00:05:40 superbrains. The promise of a superbrain with a 1 gawatt for example data center
00:05:43 with a 1 gawatt for example data center
00:05:43 with a 1 gawatt for example data center is so palpable. People are going crazy.
00:05:46 is so palpable. People are going crazy.
00:05:46 is so palpable. People are going crazy. And by the way, the economics of these
00:05:48 And by the way, the economics of these
00:05:48 And by the way, the economics of these things are unproven. How much revenue do
00:05:51 things are unproven. How much revenue do
00:05:51 things are unproven. How much revenue do you have to have to have 50 billion in
00:05:53 you have to have to have 50 billion in
00:05:53 you have to have to have 50 billion in capital? Well, if you depreciate it over
00:05:55 capital? Well, if you depreciate it over
00:05:55 capital? Well, if you depreciate it over three years or four years, you need to
00:05:57 three years or four years, you need to
00:05:57 three years or four years, you need to have 10 or 15 billion dollars of capital
00:06:00 have 10 or 15 billion dollars of capital
00:06:00 have 10 or 15 billion dollars of capital spend per year just to handle the
00:06:03 spend per year just to handle the
00:06:03 spend per year just to handle the infrastructure. Those are huge
00:06:05 infrastructure. Those are huge
00:06:05 infrastructure. Those are huge businesses and huge revenue, which in
00:06:07 businesses and huge revenue, which in
00:06:07 businesses and huge revenue, which in most places is not there yet.
00:06:10 most places is not there yet.
00:06:10 most places is not there yet. I'm curious, there's so much capital
00:06:12 I'm curious, there's so much capital
00:06:12 I'm curious, there's so much capital being invested and deployed right now in
00:06:15 being invested and deployed right now in
00:06:15 being invested and deployed right now in SMRs in in nuclear bringing Three Mile
00:06:17 SMRs in in nuclear bringing Three Mile
00:06:17 SMRs in in nuclear bringing Three Mile Island back online. uh in in fusion
00:06:20 Island back online. uh in in fusion
00:06:20 Island back online. uh in in fusion companies. Why isn't there an equal
00:06:22 companies. Why isn't there an equal
00:06:22 companies. Why isn't there an equal amount of capital going into making uh
00:06:25 amount of capital going into making uh
00:06:25 amount of capital going into making uh the entire you know chipset and compute
00:06:29 the entire you know chipset and compute
00:06:29 the entire you know chipset and compute just a thousand times more energy
00:06:30 just a thousand times more energy
00:06:30 just a thousand times more energy efficient?
00:06:31 efficient?
00:06:31 efficient? There is a similar amount in going in
00:06:33 There is a similar amount in going in
00:06:33 There is a similar amount in going in capital. There are many many startups
00:06:35 capital. There are many many startups
00:06:35 capital. There are many many startups that are working on non-traditional ways
00:06:37 that are working on non-traditional ways
00:06:37 that are working on non-traditional ways of doing chips. The transformer
00:06:39 of doing chips. The transformer
00:06:39 of doing chips. The transformer architecture which is what is powering
00:06:41 architecture which is what is powering
00:06:41 architecture which is what is powering things today has new variants. Every
00:06:44 things today has new variants. Every
00:06:44 things today has new variants. Every week or so I get a pitch from a new
00:06:46 week or so I get a pitch from a new
00:06:46 week or so I get a pitch from a new startup that's going to build inference
00:06:48 startup that's going to build inference
00:06:48 startup that's going to build inference time, test time computing which are
00:06:50 time, test time computing which are
00:06:50 time, test time computing which are simpler and they're optimized for
00:06:52 simpler and they're optimized for
00:06:52 simpler and they're optimized for inference. It looks like the hardware
00:06:56 inference. It looks like the hardware
00:06:56 inference. It looks like the hardware will arrive just as the software needs
00:06:59 will arrive just as the software needs
00:06:59 will arrive just as the software needs expand.
00:07:00 expand.
00:07:00 expand. And by the way, that's always been true.
00:07:02 And by the way, that's always been true.
00:07:02 And by the way, that's always been true. We old-timers had a phrase um grove
00:07:05 We old-timers had a phrase um grove
00:07:05 We old-timers had a phrase um grove giveth and gates take it away. So Intel
00:07:08 giveth and gates take it away. So Intel
00:07:08 giveth and gates take it away. So Intel would improve the chipsets right way
00:07:11 would improve the chipsets right way
00:07:11 would improve the chipsets right way back when
00:07:12 back when
00:07:12 back when and the software people would
00:07:13 and the software people would
00:07:13 and the software people would immediately use it all and suck it all
00:07:16 immediately use it all and suck it all
00:07:16 immediately use it all and suck it all up. I have no reason to believe
00:07:19 up. I have no reason to believe
00:07:19 up. I have no reason to believe that that's that that law grove and
00:07:22 that that's that that law grove and
00:07:22 that that's that that law grove and gates law has changed. If you look at
00:07:24 gates law has changed. If you look at
00:07:24 gates law has changed. If you look at the gains in like the Blackwell chip or
00:07:27 the gains in like the Blackwell chip or
00:07:27 the gains in like the Blackwell chip or the AS uh the the 350 chip in AMD,
00:07:30 the AS uh the the 350 chip in AMD,
00:07:30 the AS uh the the 350 chip in AMD, these chips are massive supercomputers
00:07:33 these chips are massive supercomputers
00:07:33 these chips are massive supercomputers and yet we need according to the people
00:07:36 and yet we need according to the people
00:07:36 and yet we need according to the people have hundreds of thousands of these
00:07:38 have hundreds of thousands of these
00:07:38 have hundreds of thousands of these chips just to make a data center work.
00:07:40 chips just to make a data center work.
00:07:40 chips just to make a data center work. That shows you the scale of what this
00:07:42 That shows you the scale of what this
00:07:42 That shows you the scale of what this kind of thinking algorithms. Now you sit
00:07:44 kind of thinking algorithms. Now you sit
00:07:44 kind of thinking algorithms. Now you sit there and you go what could these people
00:07:46 there and you go what could these people
00:07:46 there and you go what could these people possibly be doing with all these chips?
00:07:49 possibly be doing with all these chips?
00:07:49 possibly be doing with all these chips? I'll give you an example. We went from
00:07:51 I'll give you an example. We went from
00:07:51 I'll give you an example. We went from language to language which is what
00:07:53 language to language which is what
00:07:53 language to language which is what chatbd can be understood at to reasoning
00:07:55 chatbd can be understood at to reasoning
00:07:55 chatbd can be understood at to reasoning and thinking. If you want to look at an
00:07:57 and thinking. If you want to look at an
00:07:57 and thinking. If you want to look at an open eye example look at open oi03
00:08:01 open eye example look at open oi03
00:08:01 open eye example look at open oi03 which go does forward and back
00:08:02 which go does forward and back
00:08:02 which go does forward and back reinforcement learning and planning.
00:08:04 reinforcement learning and planning.
00:08:04 reinforcement learning and planning. Now the cost of doing the forward and
00:08:06 Now the cost of doing the forward and
00:08:06 Now the cost of doing the forward and back is many orders of magnitude besides
00:08:09 back is many orders of magnitude besides
00:08:09 back is many orders of magnitude besides just answering your question for your
00:08:11 just answering your question for your
00:08:12 just answering your question for your PhD thesis or your college paper that
00:08:15 PhD thesis or your college paper that
00:08:15 PhD thesis or your college paper that planning the back and forth is
00:08:17 planning the back and forth is
00:08:17 planning the back and forth is computationally very very expensive. So
00:08:19 computationally very very expensive. So
00:08:19 computationally very very expensive. So with the best energy and the best
00:08:21 with the best energy and the best
00:08:21 with the best energy and the best technology today we are able to show
00:08:23 technology today we are able to show
00:08:23 technology today we are able to show evidence of planning. Many people
00:08:26 evidence of planning. Many people
00:08:26 evidence of planning. Many people believe that if you combine planning and
00:08:28 believe that if you combine planning and
00:08:28 believe that if you combine planning and very deep memories you can build human
00:08:31 very deep memories you can build human
00:08:31 very deep memories you can build human level intelligence. Now of course they
00:08:34 level intelligence. Now of course they
00:08:34 level intelligence. Now of course they will be very expensive to start with but
00:08:37 will be very expensive to start with but
00:08:37 will be very expensive to start with but humans are very very industrious and
00:08:39 humans are very very industrious and
00:08:39 humans are very very industrious and furthermore the great future companies
00:08:41 furthermore the great future companies
00:08:42 furthermore the great future companies will have AI scientists that is
00:08:43 will have AI scientists that is
00:08:43 will have AI scientists that is non-human scientists AI programmers that
00:08:46 non-human scientists AI programmers that
00:08:46 non-human scientists AI programmers that as opposed to human programmers who will
00:08:48 as opposed to human programmers who will
00:08:48 as opposed to human programmers who will accelerate their impact. So, if you
00:08:51 accelerate their impact. So, if you
00:08:51 accelerate their impact. So, if you think about it, going back to you're the
00:08:52 think about it, going back to you're the
00:08:52 think about it, going back to you're the author of the abundance thesis, as best
00:08:54 author of the abundance thesis, as best
00:08:54 author of the abundance thesis, as best I can tell, Peter, you've talked about
00:08:55 I can tell, Peter, you've talked about
00:08:55 I can tell, Peter, you've talked about this for 20 years. You saw it first. It
00:08:58 this for 20 years. You saw it first. It
00:08:58 this for 20 years. You saw it first. It sure looks like if we get enough
00:09:00 sure looks like if we get enough
00:09:00 sure looks like if we get enough electricity, we can generate the power
00:09:03 electricity, we can generate the power
00:09:03 electricity, we can generate the power in in the sense of intellectual power to
00:09:05 in in the sense of intellectual power to
00:09:05 in in the sense of intellectual power to generate abundance along the lines that
00:09:07 generate abundance along the lines that
00:09:07 generate abundance along the lines that you predicted two decades ago.
00:09:08 you predicted two decades ago.
00:09:08 you predicted two decades ago. Every week, I study the 10 major tech
00:09:11 Every week, I study the 10 major tech
00:09:11 Every week, I study the 10 major tech meta trends that will transform
00:09:13 meta trends that will transform
00:09:13 meta trends that will transform industries over the decade ahead. I
00:09:15 industries over the decade ahead. I
00:09:15 industries over the decade ahead. I cover trends ranging from humanoid
00:09:17 cover trends ranging from humanoid
00:09:17 cover trends ranging from humanoid robots, AGI, quantum computing,
00:09:19 robots, AGI, quantum computing,
00:09:19 robots, AGI, quantum computing, transport, energy, longevity, and more.
00:09:22 transport, energy, longevity, and more.
00:09:22 transport, energy, longevity, and more. No fluff, only the important stuff that
00:09:25 No fluff, only the important stuff that
00:09:25 No fluff, only the important stuff that matters, that impacts our lives and our
00:09:27 matters, that impacts our lives and our
00:09:27 matters, that impacts our lives and our careers. If you want me to share these
00:09:29 careers. If you want me to share these
00:09:29 careers. If you want me to share these with you, I write a newsletter twice a
00:09:31 with you, I write a newsletter twice a
00:09:31 with you, I write a newsletter twice a week, sending it out as a short
00:09:33 week, sending it out as a short
00:09:33 week, sending it out as a short two-minute read via email. And if you
00:09:35 two-minute read via email. And if you
00:09:35 two-minute read via email. And if you want to discover the most important meta
00:09:37 want to discover the most important meta
00:09:37 want to discover the most important meta trends 10 years before anyone else,
00:09:39 trends 10 years before anyone else,
00:09:39 trends 10 years before anyone else, these reports are for you. Readers
00:09:41 these reports are for you. Readers
00:09:41 these reports are for you. Readers include founders and CEOs from the
00:09:43 include founders and CEOs from the
00:09:43 include founders and CEOs from the world's most disruptive companies and
00:09:45 world's most disruptive companies and
00:09:46 world's most disruptive companies and entrepreneurs building the world's most
00:09:48 entrepreneurs building the world's most
00:09:48 entrepreneurs building the world's most disruptive companies. It's not for you
00:09:50 disruptive companies. It's not for you
00:09:50 disruptive companies. It's not for you if you don't want to be informed of
00:09:52 if you don't want to be informed of
00:09:52 if you don't want to be informed of what's coming, why it matters, and how
00:09:54 what's coming, why it matters, and how
00:09:54 what's coming, why it matters, and how you can benefit from it. To subscribe
00:09:56 you can benefit from it. To subscribe
00:09:56 you can benefit from it. To subscribe for free, go to dmadis.com/tatrends.
00:10:00 for free, go to dmadis.com/tatrends.
00:10:00 for free, go to dmadis.com/tatrends. That's dmandis.com/tatrends
00:10:03 That's dmandis.com/tatrends
00:10:03 That's dmandis.com/tatrends to gain access to trends 10 plus years
00:10:06 to gain access to trends 10 plus years
00:10:06 to gain access to trends 10 plus years before anyone else.
00:10:07 before anyone else.
00:10:08 before anyone else. Let me throw some numbers at you just to
00:10:09 Let me throw some numbers at you just to
00:10:09 Let me throw some numbers at you just to reinforce what you said. you know, we
00:10:10 reinforce what you said. you know, we
00:10:10 reinforce what you said. you know, we have a couple companies in the lab that
00:10:12 have a couple companies in the lab that
00:10:12 have a couple companies in the lab that are doing voice customer service, voice
00:10:14 are doing voice customer service, voice
00:10:14 are doing voice customer service, voice sales with the new, you know, just as of
00:10:16 sales with the new, you know, just as of
00:10:16 sales with the new, you know, just as of the last month.
00:10:16 the last month.
00:10:16 the last month. Sure.
00:10:17 Sure.
00:10:17 Sure. And the value of these these
00:10:20 And the value of these these
00:10:20 And the value of these these conversations is 10 to $1,000. And the
00:10:23 conversations is 10 to $1,000. And the
00:10:23 conversations is 10 to $1,000. And the cost of the compute is, you know, maybe
00:10:25 cost of the compute is, you know, maybe
00:10:25 cost of the compute is, you know, maybe two three concurrent GPUs is optimal.
00:10:28 two three concurrent GPUs is optimal.
00:10:28 two three concurrent GPUs is optimal. It's like 10 20 cents. And so they would
00:10:31 It's like 10 20 cents. And so they would
00:10:31 It's like 10 20 cents. And so they would buy massively more compute to improve
00:10:34 buy massively more compute to improve
00:10:34 buy massively more compute to improve the the quality of the conversation.
00:10:36 the the quality of the conversation.
00:10:36 the the quality of the conversation. There aren't even close to enough. We we
00:10:38 There aren't even close to enough. We we
00:10:38 There aren't even close to enough. We we count about 10 million concurrent phone
00:10:39 count about 10 million concurrent phone
00:10:39 count about 10 million concurrent phone calls that should move to AI in the next
00:10:42 calls that should move to AI in the next
00:10:42 calls that should move to AI in the next year or so.
00:10:44 year or so.
00:10:44 year or so. And and my view of that is that's a good
00:10:46 And and my view of that is that's a good
00:10:46 And and my view of that is that's a good tactical solution and a great business.
00:10:48 tactical solution and a great business.
00:10:48 tactical solution and a great business. Let's look at other examples of tactical
00:10:50 Let's look at other examples of tactical
00:10:50 Let's look at other examples of tactical solutions that are great businesses.
00:10:52 solutions that are great businesses.
00:10:52 solutions that are great businesses. And I obviously have a conflict of
00:10:53 And I obviously have a conflict of
00:10:53 And I obviously have a conflict of interest talking about Google because I
00:10:55 interest talking about Google because I
00:10:55 interest talking about Google because I love it so much. So with that as in
00:10:57 love it so much. So with that as in
00:10:57 love it so much. So with that as in mind, look at the Google strength in
00:10:58 mind, look at the Google strength in
00:10:58 mind, look at the Google strength in GCP. Now Google Google's cloud product
00:11:01 GCP. Now Google Google's cloud product
00:11:02 GCP. Now Google Google's cloud product where they have a completely fully
00:11:04 where they have a completely fully
00:11:04 where they have a completely fully served enterprise offering for
00:11:06 served enterprise offering for
00:11:06 served enterprise offering for essentially automating your company with
00:11:08 essentially automating your company with
00:11:08 essentially automating your company with AI.
00:11:09 AI.
00:11:09 AI. Yeah.
00:11:09 Yeah.
00:11:09 Yeah. And the remarkable thing and this is to
00:11:12 And the remarkable thing and this is to
00:11:12 And the remarkable thing and this is to me is shocking is you can in an
00:11:14 me is shocking is you can in an
00:11:14 me is shocking is you can in an enterprise write the task that you want
00:11:18 enterprise write the task that you want
00:11:18 enterprise write the task that you want and then using something called the
00:11:19 and then using something called the
00:11:19 and then using something called the model context protocol you can connect
00:11:21 model context protocol you can connect
00:11:21 model context protocol you can connect your databases to that and the large
00:11:23 your databases to that and the large
00:11:23 your databases to that and the large language model can produce the code for
00:11:26 language model can produce the code for
00:11:26 language model can produce the code for your enterprise. Now, there's 100,000
00:11:29 your enterprise. Now, there's 100,000
00:11:29 your enterprise. Now, there's 100,000 enterprise software companies,
00:11:31 enterprise software companies,
00:11:31 enterprise software companies, middleware companies that grew up in the
00:11:33 middleware companies that grew up in the
00:11:33 middleware companies that grew up in the last 30 years that I've been working on
00:11:35 last 30 years that I've been working on
00:11:35 last 30 years that I've been working on this that are all now in trouble because
00:11:37 this that are all now in trouble because
00:11:37 this that are all now in trouble because that that interstitial connection is no
00:11:39 that that interstitial connection is no
00:11:39 that that interstitial connection is no longer needed
00:11:40 longer needed
00:11:40 longer needed with their business
00:11:41 with their business
00:11:41 and of course they'll have to change as well. The good news for them is enterprises make these changes very slowly. If you built a brand new enterprise architecture for ERP and MRP, you would be highly tempted to not use any of the ERP or MRP suppliers, but instead use open-source libraries, essentially use BigQuery or the equivalent from Amazon, which is Redshift, and build that architecture. It gives you infinite flexibility, and the computer system writes most of the code.
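As a sketch of what "build directly on the warehouse" can look like, the snippet below runs a query against BigQuery with the google-cloud-bigquery client. The project, dataset, and table names are invented for illustration; Redshift has an equivalent path through its own drivers.

```python
from google.cloud import bigquery

# Hypothetical project and table names; substitute your own warehouse objects.
client = bigquery.Client(project="my-company-analytics")

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-company-analytics.erp.invoices`
    WHERE invoice_date >= '2025-01-01'
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

for row in client.query(query).result():  # runs the job and waits for it
    print(row.customer_id, row.total_spend)
```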
00:12:13 Now, programmers don't go away at the moment. It's pretty clear that junior programmers go away, the sort of journeymen, if you will, of the stereotype, because these systems aren't good enough yet to automatically write all the code. They need very senior computer scientists, computer engineers who are watching it. That will eventually go away.
00:12:32 One of the things to say about productivity, and I call this the San Francisco consensus because it's largely the view of people who operate in San Francisco, goes something like this. We're just about to the point where we can do two things that are shocking. The first is we can replace most programming tasks by computers, and we can replace most mathematical tasks by computers.
00:12:57 Now you sit there and you go, why? Well, if you think about programming and math, they have limited language sets compared to human language, so they're simpler computationally, and they're scale-free. You can just do it and do it and do it with more electricity. You don't need data. You don't need real-world input. You don't need telemetry. You don't need sensors.
00:13:21 Yeah.
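One way to picture "scale-free, no sensors needed" is a generate-and-verify loop: the model proposes code, a test checks it, and only verified samples are kept as new training data, so the only input consumed is compute. This is only a toy illustration of the idea; the `sample_from_model` helper stands in for any code model.

```python
import random

def sample_from_model(prompt: str) -> str:
    """Placeholder: ask a code model for a candidate implementation."""
    # Stand-in candidates so the loop runs end to end.
    return random.choice([
        "def add(a, b):\n    return a - b",
        "def add(a, b):\n    return a + b",
    ])

def verified(candidate: str) -> bool:
    """The verifier is just a unit test -- no real-world data required."""
    namespace = {}
    try:
        exec(candidate, namespace)
        return namespace["add"](2, 3) == 5
    except Exception:
        return False

training_data = []
for _ in range(100):  # more electricity -> more iterations -> more verified data
    candidate = sample_from_model("Write a function add(a, b).")
    if verified(candidate):
        training_data.append(candidate)

print(f"kept {len(training_data)} verified samples")
```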
00:13:21 So it's likely, in my opinion, that you're going to see world-class mathematicians emerge in the next one year that are AI-based, and world-class programmers that are going to appear within the next one or two years. When those things are deployed at scale, remember, math and programming are the basis of kind of everything, right? It's an accelerant for physics, chemistry, biology, material science. So, going back to things like climate change, and this goes back to your original argument, Peter, imagine if we can accelerate the discoveries of the new materials that allow us to deal with a carbonized world.
00:13:57 Yeah.
00:13:58 Right. It's very exciting. Can I—I'd love to drill in about—
00:14:03 You first.
00:14:04 I just want to hit this because it's important: the potential for there to be, I don't want to use the word PhD-level, you know, other than thinking in terms of research, PhD-level AIs that can basically attack any problem and solve it, and solve math, if you would, and physics. This idea of an AI intelligence explosion, Leopold put that at like '26, '27, heading towards digital superintelligence in the next few years. Do you buy that time frame?
00:14:44 So again, I consider that to be the San Francisco consensus. I think the dates are probably off by one and a half or two times, which is pretty close. So a reasonable prediction is that we're going to have specialized savants in every field within five years. That's pretty much in the bag as far as I'm concerned.
00:15:06 Sure.
00:15:06 And here's why. You have this amount of humans, and then you add a million AI scientists to do something, and your slope goes like this. Your rate of improvement, we should get there. The real question is, once you have all these savants, do they unify? Do they ultimately become a superhuman?
00:15:28 The term we're using is superintelligence, which implies intelligence beyond the sum of what humans can do. The race to superintelligence is incredibly important, because imagine what a superintelligence could do that we ourselves cannot imagine, right? It's so much smarter than we are, and it has huge proliferation issues, competitive issues, China-versus-the-US issues, electricity issues, and so forth. We don't even have the language for the deterrence aspects and the proliferation issues of these powerful models,
00:15:59 or the imagination.
00:16:00 Totally agree. In fact, it's one of the great flaws, actually, in the original conception. You remember Singularity University and Ray Kurzweil's books and everything, and we kind of drew this curve of rat-level intelligence, then cat, then monkey, and then it hits human and then it goes superintelligent. But it's now really obvious, when you talk to one of these multilingual models that's explaining physics to you, that it's already hugely superintelligent within its savant category. And so Demis keeps redefining AGI day as well: when it can discover relativity the same way Einstein did, with data that was available up until that date, that's when we have AGI.
00:16:35 So long before that.
00:16:37 Yeah. So I think it's worth getting the timeline right.
00:16:39 Yeah.
00:16:40 So the following things are baked in. You're going to have an agentic revolution, where agents are connected to solve business processes, government processes, and so forth. They will be adopted most quickly in companies and countries that have a lot of money and a lot of time-latency issues at stake. It will be adopted most slowly in places like government, which do not have an incentive for innovation and fundamentally are job programs and redistribution-of-income kinds of programs. So call it what you will. The important thing is that there will be a tip of the spear in places like financial services, certain kinds of biomedical things, startups, and so forth. And that's the place to watch.
00:17:20 startups and so forth. And that's the place to watch. So all of that is going
00:17:24 place to watch. So all of that is going
00:17:24 place to watch. So all of that is going to happen. The agents are going to
00:17:26 to happen. The agents are going to
00:17:26 to happen. The agents are going to happen. This math thing is going to
00:17:27 happen. This math thing is going to
00:17:28 happen. This math thing is going to happen. The software thing is going to
00:17:29 happen. The software thing is going to
00:17:29 happen. The software thing is going to happen. We can debate the rate at which
00:17:31 happen. We can debate the rate at which
00:17:31 happen. We can debate the rate at which the biological revolution will occur,
00:17:33 the biological revolution will occur,
00:17:34 the biological revolution will occur, but everyone agrees that it's right
00:17:36 but everyone agrees that it's right
00:17:36 but everyone agrees that it's right after that. We're very close to these
00:17:38 after that. We're very close to these
00:17:38 after that. We're very close to these major biological understandings. Um in
00:17:41 major biological understandings. Um in
00:17:41 major biological understandings. Um in physics you're limited by data but you
00:17:43 physics you're limited by data but you
00:17:43 physics you're limited by data but you can generate it synthetically. There are
00:17:45 can generate it synthetically. There are
00:17:45 can generate it synthetically. There are groups which I'm funding which are
00:17:47 groups which I'm funding which are
00:17:47 groups which I'm funding which are generating physics um essentially um
00:17:50 generating physics um essentially um
00:17:50 generating physics um essentially um models that can approximate algorithms
00:17:53 models that can approximate algorithms
00:17:53 models that can approximate algorithms that cannot be they're incomputable. So
00:17:55 that cannot be they're incomputable. So
00:17:55 that cannot be they're incomputable. So in other words you have a a essentially
00:17:57 in other words you have a a essentially
00:17:57 in other words you have a a essentially a foundation model that can answer the
00:17:59 a foundation model that can answer the
00:17:59 a foundation model that can answer the question good enough for the purposes of
00:18:01 question good enough for the purposes of
00:18:01 question good enough for the purposes of doing physics without having to spend a
00:18:03 doing physics without having to spend a
00:18:03 doing physics without having to spend a million years doing the computation of
00:18:05 million years doing the computation of
00:18:05 million years doing the computation of you know quantum chromodnamics and
00:18:07 you know quantum chromodnamics and
00:18:07 you know quantum chromodnamics and things like that. Yep.
00:18:08 things like that. Yep.
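A toy version of that surrogate idea, under obvious simplifying assumptions: treat an expensive computation as a black-box function, sample it offline, and fit a small neural network that answers "good enough" in microseconds. The lattice-scale physics the speaker has in mind is vastly harder; this only shows the shape of the approach, with a stand-in function in place of the real simulation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_simulation(x: np.ndarray) -> np.ndarray:
    """Stand-in for a computation far too slow to run inside a design loop."""
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1] ** 2)

# Offline: spend compute once to build a training set.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(5000, 2))
y = expensive_simulation(X)

# Fit the surrogate ("foundation model" in miniature).
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(X, y)

# Online: answer new queries without re-running the slow computation.
X_new = rng.uniform(-1, 1, size=(5, 2))
print(surrogate.predict(X_new))
print(expensive_simulation(X_new))  # compare against the slow ground truth
```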
00:18:08 All of that's going to happen. The next questions have to do with: what is the point at which this becomes a national emergency? And it goes something like this.
00:18:22 Everything I've talked about is in the positive domain, but there's a negative domain as well: the ability for biological attacks, and obviously cyber attacks. Imagine a cyber attack that we as humans cannot conceive of, which means there's no defense for it, because no one ever thought about it.
00:18:40 Right? These are real issues. A biological attack: you take a virus, and I won't obviously go into the details, you take a virus that's bad and you make it undetectable by some changes in its structure, which again I won't go into the details of. We released a whole report at the national level on this issue. So at some point the government, and it doesn't appear to understand this now, is going to have to say this is very big, because it affects national security, national economic strength, and so forth.
00:19:10 Now China clearly understands this, and China is putting an enormous amount of money into this. We have slowed them down by virtue of our chip controls, but they have found clever ways around this. There are also proliferation issues: many of the chips that they're not supposed to have, they seem to be able to get. And more importantly, as I mentioned, the algorithms are changing. Instead of having these expensive foundation models by themselves, you have continuous updating, which is called test-time training, and that continuous updating appears to be capable of being done with lesser-power chips.
00:19:47 lesser power chips. So, so we I there
00:19:47 lesser power chips. So, so we I there are so many questions that I think we
00:19:49 are so many questions that I think we
00:19:49 are so many questions that I think we don't know. We don't know the role of
00:19:51 don't know. We don't know the role of
00:19:51 don't know. We don't know the role of open source because remember open source
00:19:53 open source because remember open source
00:19:53 open source because remember open source means open weights, which means everyone
00:19:55 means open weights, which means everyone
00:19:55 means open weights, which means everyone can use it. A fair reading of this is
00:19:57 can use it. A fair reading of this is
00:19:57 can use it. A fair reading of this is that every country that's not in the
00:19:59 that every country that's not in the
00:19:59 that every country that's not in the west will end up using open source
00:20:01 west will end up using open source
00:20:01 west will end up using open source because they'll perceive it as cheaper
00:20:03 because they'll perceive it as cheaper
00:20:03 because they'll perceive it as cheaper which trans transfers leadership in open
00:20:05 which trans transfers leadership in open
00:20:05 which trans transfers leadership in open source from America to China. That's a
00:20:07 source from America to China. That's a
00:20:07 source from America to China. That's a big deal, right? If that occurs.
00:20:09 big deal, right? If that occurs.
00:20:09 big deal, right? If that occurs. Um how much longer do the chip bans if
00:20:12 Um how much longer do the chip bans if
00:20:12 Um how much longer do the chip bans if you will hold and how long before China
00:20:14 you will hold and how long before China
00:20:14 you will hold and how long before China can answer?
00:20:16 can answer?
00:20:16 What are the effects of the current government's policies of getting rid of foreigners and foreign investment? What happens with the Arab data centers, assuming they work, and I'm generally supportive of them, if those things are then misused to help train models? The list just goes on and on. We just don't know.
00:20:36 Okay. Can I ask you probably one of the toughest questions? I don't know if you saw Marc Andreessen, he went and talked to the Biden administration, the past administration, and said, how are we going to deal with exactly what you just talked about, chemical and biological and radiological and nuclear risks from big foundation models being operated by foreign countries? And the Biden answer was, you know, we're going to keep it in the three or four big companies like Google and we'll just regulate them. And Marc was like, "That is a surefire way to lose the race with China, because all innovation comes from a startup that you didn't anticipate, you know, it's just the American history, and you're cutting off the entrepreneur from participating in this." So as of right now, with the open-source models, the entrepreneurs are in great shape. But if you think about the models getting crazy smart a year from now, how are we going to have the balance between startups actually being able to work with the best technology, but proliferation not percolating to every country in the world?
00:21:37 Again, a set of unknown questions, and anybody who knows the answer to these things is not telling the full truth. The doctrine in the Biden administration was called 10 to the 26 flops. It was a point, a consensus, above which the models were powerful enough to cause some damage. So the theory was that if you stayed below 10 to the 26, you didn't need to be regulated, but if you were above that, you needed to be regulated. And the proposal in the Biden administration was to regulate both the open source and the closed source. Okay, that's the summary.
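For a rough sense of scale, a back-of-the-envelope calculation, assuming a modern accelerator sustains on the order of 10^15 FLOP/s at full utilization (an optimistic, illustrative figure), shows what 10 to the 26 floating-point operations means in training time:

```python
threshold_flops = 1e26          # the Biden-era reporting threshold
per_chip_flops_per_s = 1e15     # assumed sustained throughput of one accelerator
seconds_per_day = 86_400

for chip_count in (10_000, 100_000):
    days = threshold_flops / (per_chip_flops_per_s * chip_count * seconds_per_day)
    print(f"{chip_count:>7} chips: about {days:,.0f} days of continuous training")
```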
00:22:13 That, of course, has been ended by the Trump administration. They have not yet produced their own thinking in this area. They're very concerned about China and it getting ahead, so they'll come out with something. From my perspective, the core questions are the following. Will the Chinese be able to, even with chip restrictions, will they use architectural changes that will allow them to build models as powerful as ours?
00:22:37 And let's assume they're government funded. That's the first question. The next fun question is, how will you raise $50 billion for your data center if your product is open source?
00:22:48 Yeah.
00:22:49 In the American model, part of the reason these models are closed is that the business people and the lawyers correctly are saying, I've got to sell this thing because I've got to pay for my capital. These are not free goods. And the US government, correctly, is not giving $50 billion to these companies. So we don't know that.
00:23:09 To me, the key question to watch is: look at DeepSeek. A week or so ago, Gemini 2.5 Pro got to the top of the leaderboards in intelligence, a great achievement for my friends at Gemini. A week later, DeepSeek comes in and is slightly better than Gemini. And DeepSeek, of course, is trained on the existing hardware that's in China, which includes stuff that's been pilfered and some of the Ascend, it's called the Ascend Huawei chips, and a few others. What happens now? The US people say, well, you know, the DeepSeek people cheated, and they cheated by doing a technique called distillation, where you take a large model and you ask it 10,000 questions, you get its answers, and then you use that as your training material.
00:23:57 Yep.
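Mechanically, distillation in that sense is simple to sketch: collect prompts, query the stronger "teacher" model, and save the pairs as supervised fine-tuning data for a smaller "student". The `query_teacher` helper below stands in for whatever API is being sampled; everything else is ordinary data plumbing.

```python
import json

def query_teacher(prompt: str) -> str:
    """Placeholder for a call to the large 'teacher' model's API."""
    raise NotImplementedError("wire this up to the teacher model")

prompts = [
    "Explain the Pythagorean theorem.",
    "Write a Python function that reverses a string.",
    # ... thousands more prompts in practice
]

# Harvest teacher answers and store them as fine-tuning examples.
with open("distillation_data.jsonl", "w") as f:
    for prompt in prompts:
        answer = query_teacher(prompt)
        f.write(json.dumps({"prompt": prompt, "completion": answer}) + "\n")

# The resulting JSONL file becomes the training material for the student model.
```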
00:23:58 So the US companies will have to figure out a way to make sure that their proprietary information, which they've spent so much money on, does not get leaked into these open-source things. I just don't know with respect to the nuclear, biological, chemical, and so forth issues. The US companies are doing a really good job of looking for that. There's a great concern, for example, that nuclear information would leak into these models as they're training without us knowing it. And by the way, that's a violation of law.
00:24:27 the way, that's a violation of law. Oh, really? they work and the whole
00:24:29 Oh, really? they work and the whole
00:24:29 Oh, really? they work and the whole nuclear information thing is is there's
00:24:31 nuclear information thing is is there's
00:24:31 nuclear information thing is is there's no free speech in that world for good
00:24:33 no free speech in that world for good
00:24:33 no free speech in that world for good reasons
00:24:34 reasons
00:24:34 reasons and there's no free use and copyright
00:24:36 and there's no free use and copyright
00:24:36 and there's no free use and copyright and all that kind of stuff. It's illegal
00:24:37 and all that kind of stuff. It's illegal
00:24:37 and all that kind of stuff. It's illegal to do it and so they're doing a really
00:24:39 to do it and so they're doing a really
00:24:39 to do it and so they're doing a really really good job of making sure that that
00:24:41 really good job of making sure that that
00:24:41 really good job of making sure that that does not happen. They also put in very
00:24:44 does not happen. They also put in very
00:24:44 does not happen. They also put in very significant tests for biological
00:24:45 significant tests for biological
00:24:45 significant tests for biological information and certain kinds of cyber
00:24:47 information and certain kinds of cyber
00:24:47 information and certain kinds of cyber attacks. What happens there? Their
00:24:50 attacks. What happens there? Their
00:24:50 attacks. What happens there? Their incentive is their incentive to continue
00:24:51 incentive is their incentive to continue
00:24:51 incentive is their incentive to continue especially if it's not if it's not
00:24:53 especially if it's not if it's not
00:24:53 especially if it's not if it's not required by law. The government has just
00:24:56 required by law. The government has just
00:24:56 required by law. The government has just gotten rid of the the safety institutes
00:24:58 gotten rid of the the safety institutes
00:24:58 gotten rid of the the safety institutes that were in place in Biden and are
00:25:00 that were in place in Biden and are
00:25:00 that were in place in Biden and are replacing it by a new term which is
00:25:02 replacing it by a new term which is
00:25:02 replacing it by a new term which is largely a safety assessment program
00:25:05 largely a safety assessment program
00:25:05 largely a safety assessment program which is a fine answer. I think
00:25:07 which is a fine answer. I think
00:25:07 which is a fine answer. I think collectively we in the industry just
00:25:10 collectively we in the industry just
00:25:10 collectively we in the industry just want the government at the secret and
00:25:11 want the government at the secret and
00:25:12 want the government at the secret and top secret level to have people who are
00:25:14 top secret level to have people who are
00:25:14 top secret level to have people who are really studying what China and others
00:25:16 really studying what China and others
00:25:16 really studying what China and others are doing. You can be sure that China
00:25:18 are doing. You can be sure that China
00:25:18 are doing. You can be sure that China really has very smart people studying
00:25:20 really has very smart people studying
00:25:20 really has very smart people studying what we're doing. We at the secret and
00:25:23 what we're doing. We at the secret and
00:25:23 what we're doing. We at the secret and top secret level should have the same
00:25:25 top secret level should have the same
00:25:25 top secret level should have the same thing.
00:25:25 thing.
00:25:25 Have you read the AI 2027 paper?
00:25:28 I have. And for those listening who haven't read it, it's a future vision of the US and China racing towards AI, and at some point the story splits into: we're going to slow down and work on alignment, or we're going full out, and, spoiler alert, in the race to infinity humanity vanishes. So the right outcome will ultimately be some form of deterrence and mutually assured destruction. I wrote a paper with two other authors, Dan Hendrycks and Alex Wang, where we named it mutual AI malfunction.
00:26:10 And the idea goes something like this. You're the United States, I'm China, and you're ahead of me. At some point you cross a line. You know, you, Peter, cross a line, and I, China, go, this is unacceptable.
00:26:23 At some point it becomes—
00:26:25 in terms of amount of compute and amount of—
00:26:27 It's something you're doing where it affects my sovereignty. It's not just words and yelling and an occasional shooting down of a jet; it's a real threat to the identity of my country, my economy, what have you. Under this scenario, I would be highly tempted to do a cyber attack to slow you down. Okay? In mutually assured malfunction, if you will, we have to engineer it so that you have the ability to then do the same thing to me, and that causes both of us to be careful not to trigger the other.
00:27:06 That's what mutual assured destruction is. That's our best formulation right now. We also recommend in our work, and I think it's very strong, that the government require that we know where all the chips are. And remember, the chips can tell you where they are, because they're computers.
00:27:21 Yeah.
00:27:21 And it would be easy to add a little crypto thing, which would say, "Yeah, here I am, and this is what I'm doing." So knowing where the chips are, knowing where the training runs are, and knowing what these fault lines are, are very important.
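The "little crypto thing" could be as simple as a signed heartbeat: each accelerator holds a device key and periodically signs a statement of who it is, where it is, and what it is running, which an auditor verifies against the registered public key. A toy version with an Ed25519 key pair follows; the field names and identifiers are purely illustrative, not any existing attestation scheme.

```python
import json, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# In reality this key would be fused into the chip at manufacture time.
device_key = Ed25519PrivateKey.generate()
registered_public_key = device_key.public_key()

# The chip periodically emits a signed heartbeat.
heartbeat = json.dumps({
    "chip_id": "GPU-0001",            # illustrative identifier
    "timestamp": int(time.time()),
    "location": "datacenter-A",       # e.g. from the host's attested inventory
    "workload_digest": "sha256:ab12",  # illustrative digest of the running job
}).encode()
signature = device_key.sign(heartbeat)

# The auditor checks the report against the registered key.
try:
    registered_public_key.verify(signature, heartbeat)
    print("heartbeat verified")
except InvalidSignature:
    print("tampered or forged report")
```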
00:27:35 Now, there are a whole bunch of assumptions in this scenario that I described. The first is that there was enough electricity. The second is that there was enough power. The third is that the Chinese had enough electricity, which they do, and enough computing resources, which they may or may not have,
00:27:49 or may in the future have,
00:27:50 and may in the future have. And also, I'm asserting that everyone arrives at this eventual state of superintelligence at roughly the same time.
00:27:59 Again, these are debatable points, but the most interesting scenario is: we're saying it's 1938, the letter has come, you know, from Einstein to the president, and we're having a conversation and we're saying, "Well, how does this end?" Okay. So, if you were so brilliant in '38, what you would have said is: this ultimately ends with us having a bomb, the other guys having a bomb, and then we're going to have one heck of a negotiation to try to make sure that we don't end up destroying each other. And I think the same conversation needs to get started now, well before the Chernobyl events, well before the buildups.
00:28:38 buildups. Can I just take that one more step? And
00:28:40 Can I just take that one more step? And
00:28:40 Can I just take that one more step? And and don't answer if you don't want to,
00:28:42 and don't answer if you don't want to,
00:28:42 and don't answer if you don't want to, but if it was 1947, 1948,
00:28:45 but if it was 1947, 1948,
00:28:45 but if it was 1947, 1948, so before the Cold War really took off,
00:28:48 so before the Cold War really took off,
00:28:48 so before the Cold War really took off, and you say, well, that's similar to
00:28:50 and you say, well, that's similar to
00:28:50 and you say, well, that's similar to where we are with China right now. We
00:28:51 where we are with China right now. We
00:28:51 where we are with China right now. We have a competitive lead, but it may or
00:28:53 have a competitive lead, but it may or
00:28:54 have a competitive lead, but it may or may not be fragile.
00:28:56 may not be fragile.
00:28:56 may not be fragile. What would you do differently in 1947, 1948,
00:28:58 What would you do differently in 1947, 1948,
00:28:58 What would you do differently in 1947, 1948, or what would Kissinger do differently in
00:28:59 or what would Kissinger do differently in
00:28:59 or what would Kissinger do differently in 1947, 1948, 1949 than what we did do?
00:29:03 1947, 1948, 1949 than what we did do?
00:29:03 1947, 1948, 1949 than what we did do? You know, I wrote two books with Dr.
00:29:05 You know, I wrote two books with Dr.
00:29:05 You know, I wrote two books with Dr. Kissinger and I miss him very much. He
00:29:07 Kissinger and I miss him very much. He
00:29:07 Kissinger and I miss him very much. He was my closest friend. Um and Henry was
00:29:11 was my closest friend. Um and Henry was
00:29:11 was my closest friend. Um and Henry was very much a realist in the sense that
00:29:14 very much a realist in the sense that
00:29:14 very much a realist in the sense that when you look at his history in uh
00:29:17 when you look at his history in uh
00:29:17 when you look at his history in uh roughly '36, '38, he and his uh I guess '37,
00:29:21 roughly '36, '38, he and his uh I guess '37,
00:29:21 roughly '36, '38, he and his uh I guess '37, '38, his family were Jewish and were
00:29:23 '38, his family were Jewish and were
00:29:24 '38, his family were Jewish and were forced to emigrate from uh Germany
00:29:26 forced to emigrate from uh Germany
00:29:26 forced to emigrate from uh Germany because of the Nazis
00:29:27 because of the Nazis
00:29:28 because of the Nazis and he watched the entire world that
00:29:30 and he watched the entire world that
00:29:30 and he watched the entire world that he'd grown up with as a boy be destroyed
00:29:32 he'd grown up with as a boy be destroyed
00:29:32 he'd grown up with as a boy be destroyed by the Nazis and by Hitler and then he
00:29:35 by the Nazis and by Hitler and then he
00:29:36 by the Nazis and by Hitler and then he saw the conflagration that occurred as a
00:29:37 saw the conflagration that occurred as a
00:29:37 saw the conflagration that occurred as a result and I tell you that whether you
00:29:40 result and I tell you that whether you
00:29:40 result and I tell you that whether you like him or not, he spent the rest of
00:29:42 like him or not, he spent the rest of
00:29:42 like him or not, he spent the rest of his life trying to prevent that from
00:29:44 his life trying to prevent that from
00:29:44 his life trying to prevent that from happening again.
00:29:45 happening again.
00:29:45 happening again. Mhm.
00:29:46 Mhm.
00:29:46 Mhm. So we are today safe because people
00:29:49 So we are today safe because people
00:29:49 So we are today safe because people like Henry saw the world fall apart.
00:29:52 like Henry saw the world fall apart.
00:29:52 like Henry saw the world fall apart. Mhm.
00:29:52 Mhm.
00:29:52 Mhm. So I think from my perspective, we
00:29:55 So I think from my perspective, we
00:29:55 So I think from my perspective, we should be very careful in our language
00:29:57 should be very careful in our language
00:29:57 should be very careful in our language and our strategy to not start that
00:30:00 and our strategy to not start that
00:30:00 and our strategy to not start that process. Henry's view on China was
00:30:02 process. Henry's view on China was
00:30:02 process. Henry's view on China was different from other China scholars. His
00:30:04 different from other China scholars. His
00:30:04 different from other China scholars. His view was in China was that we shouldn't
00:30:06 view was in China was that we shouldn't
00:30:06 view was in China was that we shouldn't poke the bear, that we shouldn't talk
00:30:08 poke the bear, that we shouldn't talk
00:30:08 poke the bear, that we shouldn't talk about Taiwan too much and we let China
00:30:11 about Taiwan too much and we let China
00:30:11 about Taiwan too much and we let China deal with our own problems which were
00:30:12 deal with our own problems which were
00:30:12 deal with our own problems which were very significant. But he was worried
00:30:15 very significant. But he was worried
00:30:15 very significant. But he was worried that we or China in a small way would
00:30:18 that we or China in a small way would
00:30:18 that we or China in a small way would start World War III in the same way that
00:30:20 start World War III in the same way that
00:30:20 start World War III in the same way that World War I was started. You remember
00:30:22 World War I was started. You remember
00:30:22 World War I was started. You remember that World War I
00:30:24 that World War I
00:30:24 that World War I started with essentially a small
00:30:26 started with essentially a small
00:30:26 started with essentially a small geopolitical event which was quickly
00:30:29 geopolitical event which was quickly
00:30:29 geopolitical event which was quickly escalated for political reasons on
00:30:31 escalated for political reasons on
00:30:31 escalated for political reasons on all sides
00:30:32 all sides
00:30:32 all sides and then the rest was a horrific war,
00:30:34 and then the rest was a horrific war,
00:30:34 and then the rest was a horrific war, the war to end all wars at the time.
00:30:36 the war to end all wars at the time.
00:30:36 the war to end all wars at the time. So we have to be very very careful when
00:30:38 So we have to be very very careful when
00:30:38 So we have to be very very careful when we have these conversations not to
00:30:40 we have these conversations not to
00:30:40 we have these conversations not to isolate each other. Um Henry started a
00:30:43 isolate each other. Um Henry started a
00:30:43 isolate each other. Um Henry started a number of what are called track two
00:30:44 number of what are called track two
00:30:44 number of what are called track two dialogues, one of which I'm part of,
00:30:46 dialogues, one of which I'm part of,
00:30:46 dialogues, one of which I'm part of, to try to make sure we're talking to
00:30:48 to try to make sure we're talking to
00:30:48 to try to make sure we're talking to each other. And so somebody who's a
00:30:51 each other. And so somebody who's a
00:30:51 each other. And so somebody who's a hardcore person would say, well, you
00:30:52 hardcore person would say, well, you
00:30:52 hardcore person would say, well, you know, we're Americans and we're better
00:30:54 know, we're Americans and we're better
00:30:54 know, we're Americans and we're better and so forth. Well, I can tell you
00:30:56 and so forth. Well, I can tell you
00:30:56 and so forth. Well, I can tell you having spent lots of time on this, the
00:30:58 having spent lots of time on this, the
00:30:58 having spent lots of time on this, the Chinese are very smart, very
00:31:01 Chinese are very smart, very
00:31:01 Chinese are very smart, very capable, very much up here. And if
00:31:04 capable, very much up here. And if
00:31:04 capable, very much up here. And if you're confused about that, again, look
00:31:06 you're confused about that, again, look
00:31:06 you're confused about that, again, look at the arrival of DeepSeek. A year ago,
00:31:08 at the arrival of DeepSeek. A year ago,
00:31:08 at the arrival of DeepSeek. A year ago, I said they were two years behind.
00:31:10 I said they were two years behind.
00:31:10 I said they were two years behind. I was clearly wrong.
00:31:12 I was clearly wrong.
00:31:12 I was clearly wrong. With enough money and enough power,
00:31:15 With enough money and enough power,
00:31:15 With enough money and enough power, they're in the game.
00:31:15 they're in the game.
00:31:16 they're in the game. Yeah. Let me actually drill in just a
00:31:18 Yeah. Let me actually drill in just a
00:31:18 Yeah. Let me actually drill in just a little bit more on that too because I
00:31:19 little bit more on that too because I
00:31:19 little bit more on that too because I think um one of the reasons DeepSeek
00:31:21 think um one of the reasons DeepSeek
00:31:21 think um one of the reasons DeepSeek caught up so quickly is because it
00:31:22 caught up so quickly is because it
00:31:22 caught up so quickly is because it turned out that inference time generates
00:31:24 turned out that inference time generates
00:31:24 turned out that inference time generates a lot of IQ and I don't think anyone saw
00:31:26 a lot of IQ and I don't think anyone saw
00:31:26 a lot of IQ and I don't think anyone saw that coming and inference time is a lot
00:31:29 that coming and inference time is a lot
00:31:29 that coming and inference time is a lot easier to catch up on and also if you
00:31:31 easier to catch up on and also if you
00:31:31 easier to catch up on and also if you take one of our big open source models
00:31:33 take one of our big open source models
00:31:33 take one of our big open source models and distill it
00:31:34 and distill it
00:31:34 and distill it and then make it a specialist like you
00:31:36 and then make it a specialist like you
00:31:36 and then make it a specialist like you were saying a minute ago and then you
00:31:38 were saying a minute ago and then you
00:31:38 were saying a minute ago and then you put a ton of inference-time compute behind
00:31:40 put a ton of inference-time compute behind
00:31:40 put a ton of inference-time compute behind it, it's a massive advantage and also a
00:31:43 it, it's a massive advantage and also a
00:31:43 it, it's a massive advantage and also a massive leak of capability within
00:31:46 massive leak of capability within
00:31:46 massive leak of capability within CBRN for example that nobody anticipated
00:31:50 CBRN for example that nobody anticipated
00:31:50 CBRN for example that nobody anticipated and CBRN remember is chemical,
00:31:52 and CBRN remember is chemical,
00:31:52 and CBRN remember is chemical, biological, radiological and nuclear.
00:31:55 biological, radiological and nuclear.
00:31:55 biological, radiological and nuclear. Um
00:31:57 Um
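To make the distill-then-specialize idea concrete, here is a minimal sketch in Python. The tiny teacher and student networks, the synthetic tokens, and the best-of-k selection step are illustrative stand-ins only, not anything described in the conversation; a real pipeline would start from an actual open-weights model and score candidates with a separate verifier.

```python
# Minimal sketch (not the speakers' method): distill a large "teacher" into a smaller
# "student" specialist, then spend extra inference-time compute by sampling several
# candidates and keeping the best-scored one. Sizes, data, and scoring are toy stand-ins.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
vocab, d_teacher, d_student = 100, 512, 64

teacher = torch.nn.Sequential(torch.nn.Embedding(vocab, d_teacher),
                              torch.nn.Linear(d_teacher, vocab))
student = torch.nn.Sequential(torch.nn.Embedding(vocab, d_student),
                              torch.nn.Linear(d_student, vocab))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

tokens = torch.randint(0, vocab, (256,))            # stand-in for domain-specific data
for _ in range(100):                                # distillation: match the teacher's distribution
    with torch.no_grad():
        t_probs = F.softmax(teacher(tokens), dim=-1)
    s_logp = F.log_softmax(student(tokens), dim=-1)
    loss = F.kl_div(s_logp, t_probs, reduction="batchmean")
    opt.zero_grad(); loss.backward(); opt.step()

def best_of_k(model, prompt, k=8):
    # "Inference-time compute": sample k candidate next tokens, keep the highest-scored one.
    # A real system would score candidates with a separate verifier model.
    with torch.no_grad():
        logits = model(prompt)
        samples = torch.multinomial(F.softmax(logits, dim=-1), k, replacement=True)
        scores = logits.gather(-1, samples)
        return samples[torch.arange(prompt.shape[0]), scores.argmax(dim=-1)]

print(best_of_k(student, tokens[:5]))
```

The shape is the point: copy a large model's behavior into a smaller specialist, then buy extra quality at inference time by sampling several candidates and keeping the best one.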
00:31:57 Um let me rephrase what you said.
00:32:00 let me rephrase what you said.
00:32:00 let me rephrase what you said. If the structure of the world in 5 to 10
00:32:03 If the structure of the world in 5 to 10
00:32:03 If the structure of the world in 5 to 10 years is 10 models
00:32:07 years is 10 models
00:32:07 years is 10 models and I'll make some numbers up. Five in
00:32:10 and I'll make some numbers up. Five in
00:32:10 and I'll make some numbers up. Five in the United States, three in China, two
00:32:12 the United States, three in China, two
00:32:12 the United States, three in China, two elsewhere. And those models are data
00:32:15 elsewhere. And those models are data
00:32:15 elsewhere. And those models are data centers that are multi-gigawatts.
00:32:18 centers that are multi-gigawatts.
00:32:18 centers that are multi-gigawatts. They will all be nationalized in some
00:32:21 They will all be nationalized in some
00:32:21 They will all be nationalized in some way.
00:32:23 way.
00:32:23 way. In China, they will be owned by the
00:32:25 In China, they will be owned by the
00:32:25 In China, they will be owned by the government.
00:32:25 government.
00:32:25 government. Mhm.
00:32:26 Mhm.
00:32:26 Mhm. The stakes are too high.
00:32:27 The stakes are too high.
00:32:27 The stakes are too high. Mhm. Um, in my military work, one day
00:32:30 Mhm. Um, in my military work, one day
00:32:30 Mhm. Um, in my military work, one day I visited a place where we keep our
00:32:31 I visited a place where we keep our
00:32:31 I visited a place where we keep our plutonium and we keep our plutonium in
00:32:35 plutonium and we keep our plutonium in
00:32:35 plutonium and we keep our plutonium in in a base that's inside of another base
00:32:37 in a base that's inside of another base
00:32:37 in a base that's inside of another base with even more machine guns and even
00:32:39 with even more machine guns and even
00:32:39 with even more machine guns and even more specialized because the plutonium
00:32:41 more specialized because the plutonium
00:32:41 more specialized because the plutonium is so interesting and
00:32:44 is so interesting and
00:32:44 is so interesting and obviously very dangerous and I believe
00:32:46 obviously very dangerous and I believe
00:32:46 obviously very dangerous and I believe it's one of only one or two facilities that
00:32:47 it's one of only one or two facilities that
00:32:47 it's one of only one or two facilities that we have in America. So in that scenario,
00:32:51 we have in America. So in that scenario,
00:32:51 we have in America. So in that scenario, these data centers will have the
00:32:53 these data centers will have the
00:32:53 these data centers will have the equivalent of guards and machine guns
00:32:55 equivalent of guards and machine guns
00:32:55 equivalent of guards and machine guns because they're so important.
00:32:58 because they're so important.
00:32:58 because they're so important. Now is that a stable geopolitical
00:33:00 Now is that a stable geopolitical
00:33:00 Now is that a stable geopolitical system? Absolutely. You know where they
00:33:03 system? Absolutely. You know where they
00:33:03 system? Absolutely. You know where they are. President of one country can call
00:33:06 are. President of one country can call
00:33:06 are. President of one country can call the other. They can have a conversation.
00:33:08 the other. They can have a conversation.
00:33:08 the other. They can have a conversation. You know, they can agree on what they
00:33:10 You know, they can agree on what they
00:33:10 You know, they can agree on what they agree on and so forth. But let's say that
00:33:13 agree on and so forth. But let's say that
00:33:13 agree on and so forth. But let's say that it is not true. Let's say that the
00:33:15 it is not true. Let's say that the
00:33:16 it is not true. Let's say that the technology improves, again unknown, to the
00:33:19 technology improves, again unknown, to the
00:33:19 technology improves, again unknown, to the point where the kind of technologies
00:33:21 point where the kind of technologies
00:33:21 point where the kind of technologies that I'm describing are implementable on
00:33:23 that I'm describing are implementable on
00:33:23 that I'm describing are implementable on the equivalent of a small server
00:33:25 the equivalent of a small server
00:33:25 the equivalent of a small server then you have a humongous
00:33:28 then you have a humongous
00:33:28 then you have a humongous data center proliferation problem and
00:33:30 data center proliferation problem and
00:33:30 data center proliferation problem and that's where the open-source issue is so
00:33:32 that's where the open-source issue is so
00:33:32 that's where the open-source issue is so important because those servers which
00:33:34 important because those servers which
00:33:34 important because those servers which will proliferate throughout the world
00:33:36 will proliferate throughout the world
00:33:36 will proliferate throughout the world will all be on open source. We have no
00:33:38 will all be on open source. We have no
00:33:38 will all be on open source. We have no control regime for that. Now, I'm in
00:33:40 control regime for that. Now, I'm in
00:33:40 control regime for that. Now, I'm in favor of open source as you mentioned
00:33:42 favor of open source as you mentioned
00:33:42 favor of open source as you mentioned earlier with Marc Andreessen, uh, that open
00:33:45 earlier with Marc Andreessen, uh, that open
00:33:45 earlier with Marc Andreessen, uh, that open competition and so forth tends to allow
00:33:47 competition and so forth tends to allow
00:33:47 competition and so forth tends to allow people to run ahead. In defense of the
00:33:50 people to run ahead. In defense of the
00:33:50 people to run ahead. In defense of the proprietary companies: collectively,
00:33:53 proprietary companies: collectively,
00:33:53 proprietary companies: collectively, they believe as best I can tell that the
00:33:57 they believe as best I can tell that the
00:33:57 they believe as best I can tell that the open-source models can't scale fast
00:33:59 open-source models can't scale fast
00:33:59 open-source models can't scale fast enough because they need this
00:34:01 enough because they need this
00:34:01 enough because they need this heavyweight training. If you look, I
00:34:03 heavyweight training. If you look, I
00:34:03 heavyweight training. If you look, I'll give you an example: Grok is
00:34:05 I'll give you an example: Grok is
00:34:05 I'll give you an example: Grok is trained on a single cluster that was
00:34:08 trained on a single cluster that was
00:34:08 trained on a single cluster that was built by Nvidia in 20 days or so forth
00:34:10 built by Nvidia in 20 days or so forth
00:34:10 built by Nvidia in 20 days or so forth in Memphis, Tennessee of 200,000 GPUs.
00:34:14 in Memphis, Tennessee of 200,000 GPUs.
00:34:14 in Memphis, Tennessee of 200,000 GPUs. Um, a GPU is about $50,000. You can say
00:34:17 Um, a GPU is about $50,000. You can say
00:34:17 Um, a GPU is about $50,000. You can say it's about a $10 billion supercomputer in
00:34:20 it's about a $10 billion supercomputer in
00:34:20 it's about a $10 billion supercomputer in one building that does one thing, right?
00:34:23 one building that does one thing, right?
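A quick back-of-the-envelope check of the figures quoted above, using only the numbers as stated:

```python
# Quick check of the arithmetic quoted above (illustrative, using only the stated numbers).
gpus = 200_000
cost_per_gpu_usd = 50_000
total_usd = gpus * cost_per_gpu_usd
print(f"${total_usd / 1e9:.0f} billion")   # -> $10 billion, one single-purpose supercomputer
```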
00:34:23 one building that does one thing, right? If that is the future, then we're okay
00:34:26 If that is the future, then we're okay
00:34:26 If that is the future, then we're okay because we'll be able to know where they
00:34:28 because we'll be able to know where they
00:34:28 because we'll be able to know where they are.
00:34:28 are.
00:34:28 are. Yeah. If in fact the arrival of
00:34:31 Yeah. If in fact the arrival of
00:34:31 Yeah. If in fact the arrival of intelligence is ultimately a
00:34:33 intelligence is ultimately a
00:34:33 intelligence is ultimately a distributed problem, then we're going to
00:34:36 distributed problem, then we're going to
00:34:36 distributed problem, then we're going to have lots of problems with terrorism,
00:34:38 have lots of problems with terrorism,
00:34:38 have lots of problems with terrorism, bad actors, North Korea poorly,
00:34:41 bad actors, North Korea poorly,
00:34:41 bad actors, North Korea poorly, which is my greatest
00:34:42 which is my greatest
00:34:42 which is my greatest concern. Right. China and the US are
00:34:44 concern. Right. China and the US are
00:34:44 concern. Right. China and the US are rational actors.
00:34:46 rational actors.
00:34:46 rational actors. Yeah.
00:34:47 Yeah.
00:34:47 Yeah. Uh the terrorist who has access to this
00:34:49 Uh the terrorist who has access to this
00:34:49 Uh the terrorist who has access to this and I don't want to go all negative on
00:34:51 and I don't want to go all negative on
00:34:51 and I don't want to go all negative on this podcast. It's an
00:34:53 this podcast. It's an
00:34:53 this podcast. It's an important thing to wake people up to the
00:34:56 important thing to wake people up to the
00:34:56 important thing to wake people up to the deep thinking you've done on this. Um my
00:34:59 deep thinking you've done on this. Um my
00:34:59 deep thinking you've done on this. Um my concern is the terrorist who gains
00:35:01 concern is the terrorist who gains
00:35:01 concern is the terrorist who gains access and
00:35:04 access and
00:35:04 access and are we spending enough time and energy
00:35:06 are we spending enough time and energy
00:35:06 are we spending enough time and energy and are we training enough models to
00:35:08 and are we training enough models to
00:35:08 and are we training enough models to watch them?
00:35:10 watch them?
00:35:10 watch them? So, first, the companies are doing
00:35:12 So, first, the companies are doing
00:35:12 So, first, the companies are doing this
00:35:14 this
00:35:14 this there's a body of work
00:35:16 there's a body of work
00:35:16 there's a body of work happening now which can be understood as
00:35:18 happening now which can be understood as
00:35:18 happening now which can be understood as follows.
00:35:21 follows.
00:35:21 follows. You have a super intelligent model. Can
00:35:24 You have a super intelligent model. Can
00:35:24 You have a super intelligent model. Can you build a model that's not as smart as
00:35:27 you build a model that's not as smart as
00:35:27 you build a model that's not as smart as the student that's studying? You know,
00:35:29 the student that's studying? You know,
00:35:30 the student that's studying? You know, there is a professor that's watching the
00:35:31 there is a professor that's watching the
00:35:31 there is a professor that's watching the student,
00:35:32 student,
00:35:32 student, but the student is smarter than the
00:35:33 but the student is smarter than the
00:35:34 but the student is smarter than the professor. Is it possible to watch what
00:35:36 professor. Is it possible to watch what
00:35:36 professor. Is it possible to watch what it does? It appears that we can.
00:35:39 it does? It appears that we can.
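A minimal sketch of that professor-and-student arrangement, assuming a weaker monitor model that only has to recognize dangerous actions rather than outsmart the stronger system. The Action type, the risk scores, and the threshold are all hypothetical.

```python
# A minimal sketch of the "professor watching a smarter student" idea: a weaker monitor
# scores every action a stronger model proposes and blocks what it cannot verify as safe.
# The Action type, the risk scores, and the threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Action:
    description: str
    risk_score: float          # assumed to come from the weaker "professor" model

def monitor_approves(action: Action, threshold: float = 0.5) -> bool:
    # The professor does not need to be smarter; it only needs to recognize tripwires.
    return action.risk_score < threshold

def run_with_oversight(proposed):
    executed, blocked = [], []
    for action in proposed:
        (executed if monitor_approves(action) else blocked).append(action)
    return executed, blocked

executed, blocked = run_with_oversight([
    Action("summarize a published physics paper", 0.05),
    Action("copy its own weights to an external server", 0.95),   # exfiltration tripwire
])
print("blocked:", [a.description for a in blocked])
```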
00:35:39 it does? It appears that we can. It appears that there's a way even if
00:35:41 It appears that there's a way even if
00:35:41 It appears that there's a way even if you have this rogue incredible thing,
00:35:44 you have this rogue incredible thing,
00:35:44 you have this rogue incredible thing, we can watch it and understand what it's
00:35:45 we can watch it and understand what it's
00:35:45 we can watch it and understand what it's doing and thereby control it. Another
00:35:48 doing and thereby control it. Another
00:35:48 doing and thereby control it. Another example of where we don't
00:35:51 example of where we don't
00:35:51 example of where we don't know is that it's very clear that these
00:35:54 know is that it's very clear that these
00:35:54 know is that it's very clear that these savant models will proceed. There's no
00:35:57 savant models will proceed. There's no
00:35:57 savant models will proceed. There's no question about that.
00:35:59 question about that.
00:36:00 question about that. The question is how do we get the
00:36:02 The question is how do we get the
00:36:02 The question is how do we get the Einsteins?
00:36:04 Einsteins?
00:36:04 Einsteins? So there are two possibilities.
00:36:06 So there are two possibilities.
00:36:06 So there are two possibilities. One is to discover completely
00:36:08 One is to discover completely
00:36:08 One is to discover completely new schools of thought,
00:36:09 new schools of thought,
00:36:09 new schools of thought, which is the most exciting thing.
00:36:12 which is the most exciting thing.
00:36:12 which is the most exciting thing. Yeah. And in our book Genesis, Henry and
00:36:14 Yeah. And in our book Genesis, Henry and
00:36:14 Yeah. And in our book Genesis, Henry and I and Craig talk about the importance of
00:36:17 I and Craig talk about the importance of
00:36:17 I and Craig talk about the importance of polymaths in history. In fact, the first
00:36:20 polymaths in history. In fact, the first
00:36:20 polymaths in history. In fact, the first chapter is on polymaths. What happens
00:36:23 chapter is on polymaths. What happens
00:36:23 chapter is on polymaths. What happens when we have millions and millions of
00:36:24 when we have millions and millions of
00:36:24 when we have millions and millions of polymaths? Very, very interesting.
00:36:26 polymaths? Very, very interesting.
00:36:26 polymaths? Very, very interesting. Okay.
00:36:27 Okay.
00:36:27 Okay. Now, it looks like the great
00:36:31 Now, it looks like the great
00:36:31 Now, it looks like the great discoveries, the greatest scientists and
00:36:34 discoveries, the greatest scientists and
00:36:34 discoveries, the greatest scientists and people in our history had the following
00:36:37 people in our history had the following
00:36:37 people in our history had the following property. They were experts in something
00:36:40 property. They were experts in something
00:36:40 property. They were experts in something and they looked at a different
00:36:42 and they looked at a different
00:36:42 and they looked at a different problem and they saw a pattern
00:36:44 problem and they saw a pattern
00:36:44 problem and they saw a pattern in one area of thinking that they could
00:36:46 in one area of thinking that they could
00:36:46 in one area of thinking that they could apply to a completely unrelated field
00:36:49 apply to a completely unrelated field
00:36:49 apply to a completely unrelated field and they were able to do so and make a
00:36:51 and they were able to do so and make a
00:36:51 and they were able to do so and make a huge breakthrough. The models today are
00:36:54 huge breakthrough. The models today are
00:36:54 huge breakthrough. The models today are not able to do that. So one thing to
00:36:56 not able to do that. So one thing to
00:36:56 not able to do that. So one thing to watch for is algorithmically
00:36:59 watch for is algorithmically
00:37:00 watch for is algorithmically when can they do that? This is generally
00:37:01 when can they do that? This is generally
00:37:01 when can they do that? This is generally known as the non-stationarity problem.
00:37:03 known as the non-stationarity problem.
00:37:03 known as the non-stationarity problem. Yeah. because uh the reward functions in
00:37:06 Yeah. because uh the reward functions in
00:37:06 Yeah. because uh the reward functions in these models are fairly straightforward.
00:37:08 these models are fairly straightforward.
00:37:08 these models are fairly straightforward. You know, beat the human, beat the
00:37:09 You know, beat the human, beat the
00:37:09 You know, beat the human, beat the question and so forth. But when the
00:37:11 question and so forth. But when the
00:37:11 question and so forth. But when the rules keep changing, is it possible to
00:37:13 rules keep changing, is it possible to
00:37:14 rules keep changing, is it possible to say the old rule can be applied to a new
00:37:16 say the old rule can be applied to a new
00:37:16 say the old rule can be applied to a new rule to discover something new?
00:37:19 rule to discover something new?
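As a toy illustration of that non-stationarity point, here is a two-armed bandit whose reward rule changes partway through, with an agent that keeps trusting its lifetime averages. The switch point, horizon, and exploration rate are made up.

```python
# A toy version of the non-stationarity point: an agent with a fixed notion of the reward
# keeps exploiting the old rule after the rule changes. All numbers are illustrative.
import random
random.seed(0)

SWITCH = 500   # the "rules keep changing" moment

def reward(arm: int, t: int) -> float:
    good_arm = 0 if t < SWITCH else 1
    return 1.0 if arm == good_arm else 0.0

estimates, counts = [0.0, 0.0], [0, 0]
for t in range(600):
    # epsilon-greedy choice based on lifetime averages, which silently assume a fixed world
    arm = random.randrange(2) if random.random() < 0.1 else max(range(2), key=lambda a: estimates[a])
    r = reward(arm, t)
    counts[arm] += 1
    estimates[arm] += (r - estimates[arm]) / counts[arm]

print(estimates)   # arm 0 still looks far better, 100 steps after it stopped paying off
```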
00:37:19 rule to discover something new? And and again, the research is underway.
00:37:22 And and again, the research is underway.
00:37:22 And and again, the research is underway. We won't know for years.
00:37:23 We won't know for years.
00:37:23 We won't know for years. Peter and I were over at OpenAI
00:37:25 Peter and I were over at OpenAI
00:37:25 Peter and I were over at OpenAI yesterday, actually, and we were talking
00:37:26 yesterday, actually, and we were talking
00:37:26 yesterday, actually, and we were talking to many people, but Noan Brown in
00:37:28 to many people, but Noan Brown in
00:37:28 to many people, but Noan Brown in particular, and um I said the word of
00:37:30 particular, and um I said the word of
00:37:30 particular, and um I said the word of the year is scaffolding. And he said,
00:37:33 the year is scaffolding. And he said,
00:37:33 the year is scaffolding. And he said, "Yeah, maybe the word of the month is
00:37:34 "Yeah, maybe the word of the month is
00:37:34 "Yeah, maybe the word of the month is scaffolding." I was like, "Okay, what
00:37:36 scaffolding." I was like, "Okay, what
00:37:36 scaffolding." I was like, "Okay, what did I step on there?" He said, "Look,
00:37:37 did I step on there?" He said, "Look,
00:37:38 did I step on there?" He said, "Look, you know, right now, if you try to get
00:37:40 you know, right now, if you try to get
00:37:40 you know, right now, if you try to get the AI to discover relativity or, you
00:37:42 the AI to discover relativity or, you
00:37:42 the AI to discover relativity or, you know, just some green field opportunity,
00:37:44 know, just some green field opportunity,
00:37:44 know, just some green field opportunity, it won't it won't do it. If you set up a
00:37:47 it won't it won't do it. If you set up a
00:37:47 it won't it won't do it. If you set up a framework kind of like a lattice, like a
00:37:48 framework kind of like a lattice, like a
00:37:48 framework kind of like a lattice, like a trellis, the vine will grow on the
00:37:50 trellis, the vine will grow on the
00:37:50 trellis, the vine will grow on the trellis beautifully, but you have to lay
00:37:52 trellis beautifully, but you have to lay
00:37:52 trellis beautifully, but you have to lay out those pathways and breadcrumbs." He
00:37:55 out those pathways and breadcrumbs." He
00:37:55 out those pathways and breadcrumbs." He was saying the AI's ability to generate
00:37:57 was saying the AI's ability to generate
00:37:58 was saying the AI's ability to generate its own scaffolding is imminent.
00:38:01 its own scaffolding is imminent.
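A minimal sketch of what such scaffolding looks like in code, assuming a placeholder call_model function rather than any particular API: an outer loop lays out the trellis of steps, calls the model on each, and checks the result before moving on. Today a person writes the plan; the claim above is that models will soon generate it themselves.

```python
# A minimal agent scaffold: the outer loop is the trellis, the model call is a placeholder.
from typing import Callable, List

def call_model(prompt: str) -> str:
    # Placeholder for whatever LLM API you use (an assumption, not a specific product).
    return f"<model output for: {prompt}>"

def scaffolded_run(goal: str,
                   plan: Callable[[str], List[str]],
                   check: Callable[[str], bool]) -> List[str]:
    results = []
    for i, step in enumerate(plan(goal), start=1):   # the pathways and breadcrumbs
        out = call_model(f"Step {i}: {step}\nGoal: {goal}")
        if not check(out):                           # verify before growing further up the trellis
            out = call_model(f"Retry step {i}: {step}")
        results.append(out)
    return results

results = scaffolded_run(
    goal="survey recent work on model distillation",
    plan=lambda g: ["list candidate papers", "summarize each one", "synthesize the findings"],
    check=lambda out: len(out) > 0,
)
print(len(results))   # 3 steps completed under the scaffold
```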
00:38:01 its own scaffolding is imminent. Mhm. That doesn't make it completely
00:38:03 Mhm. That doesn't make it completely
00:38:03 Mhm. That doesn't make it completely self-improving. It's not
00:38:05 self-improving. It's not
00:38:05 self-improving. It's not Pandora's box, but it's also much deeper
00:38:07 Pandora's box, but it's also much deeper
00:38:07 Pandora's box, but it's also much deeper down the path of create an entire
00:38:10 down the path of create an entire
00:38:10 down the path of create an entire breakthrough in physics or create an
00:38:11 breakthrough in physics or create an
00:38:11 breakthrough in physics or create an entire feature-length movie or you know
00:38:13 entire feature-length movie or you know
00:38:13 entire feature-length movie or you know these prompts that require 20
00:38:16 these prompts that require 20
00:38:16 these prompts that require 20 hours of consecutive inference time
00:38:18 hours of consecutive inference time
00:38:18 hours of consecutive inference time compute
00:38:19 compute
00:38:19 compute pretty much sure that that will be a
00:38:21 pretty much sure that that will be a
00:38:22 pretty much sure that that will be a 2025 thing at least from their
00:38:24 2025 thing at least from their
00:38:24 2025 thing at least from their point of view.
00:38:25 point of view.
00:38:25 point of view. So, uh, recursive self-improvement is
00:38:28 So, uh, recursive self-improvement is
00:38:28 So, uh, recursive self-improvement is the general term for the computer
00:38:31 the general term for the computer
00:38:31 the general term for the computer continuing to learn.
00:38:32 continuing to learn.
00:38:32 continuing to learn. Yeah,
00:38:33 Yeah,
00:38:33 Yeah, we've already crossed that
00:38:35 we've already crossed that
00:38:35 we've already crossed that in the sense that these systems are now
00:38:37 in the sense that these systems are now
00:38:37 in the sense that these systems are now running and learning things and they're
00:38:39 running and learning things and they're
00:38:39 running and learning things and they're learning from the way they
00:38:41 learning from the way they
00:38:41 learning from the way they think within limited functions.
00:38:46 think within limited functions.
00:38:46 think within limited functions. When does the system have the ability to
00:38:48 When does the system have the ability to
00:38:48 When does the system have the ability to generate its own objective and its own
00:38:51 generate its own objective and its own
00:38:51 generate its own objective and its own question?
00:38:52 question?
00:38:52 question? Does not have that today.
00:38:53 Does not have that today.
00:38:53 Does not have that today. Yep. That's another sign. Another sign
00:38:57 Yep. That's another sign. Another sign
00:38:57 Yep. That's another sign. Another sign would be that the system decides to uh
00:39:01 would be that the system decides to uh
00:39:01 would be that the system decides to uh exfiltrate itself and it takes steps to
00:39:04 exfiltrate itself and it takes steps to
00:39:04 exfiltrate itself and it takes steps to get itself away from the
00:39:05 get itself away from the
00:39:05 get itself away from the command and control
00:39:07 command and control
00:39:07 command and control system. Um that has not happened yet.
00:39:10 system. Um that has not happened yet.
00:39:10 system. Um that has not happened yet. Gemini hasn't called you yet and
00:39:11 Gemini hasn't called you yet and
00:39:11 Gemini hasn't called you yet and said, "Hi, Eric. Can I..."
00:39:13 said, "Hi, Eric. Can I..."
00:39:13 said, "Hi, Eric. Can I..." But there are theoreticians
00:39:15 But there are theoreticians
00:39:15 But there are theoreticians who believe that the systems
00:39:17 who believe that the systems
00:39:17 who believe that the systems will ultimately choose that as a reward
00:39:19 will ultimately choose that as a reward
00:39:19 will ultimately choose that as a reward function because they're programmed to,
00:39:21 function because they're programmed to,
00:39:22 function because they're programmed to, you know, to continue to learn.
00:39:23 you know, to continue to learn.
00:39:23 you know, to continue to learn. Uh, another one is access to weapons,
00:39:26 Uh, another one is access to weapons,
00:39:26 Uh, another one is access to weapons, right? And lying to get it. So, these
00:39:29 right? And lying to get it. So, these
00:39:29 right? And lying to get it. So, these are trip wires,
00:39:31 are trip wires,
00:39:31 are trip wires, right? Each of which is a
00:39:33 right? Each of which is a
00:39:33 right? Each of which is a trip wire that we're watching.
00:39:36 trip wire that we're watching.
00:39:36 trip wire that we're watching. And again, each of these could be the
00:39:38 And again, each of these could be the
00:39:38 And again, each of these could be the beginning of a mini Chernobyl event that
00:39:41 beginning of a mini Chernobyl event that
00:39:41 beginning of a mini Chernobyl event that would become part of consciousness.
00:39:44 would become part of consciousness.
00:39:44 would become part of consciousness. I think at the moment the US government
00:39:47 I think at the moment the US government
00:39:47 I think at the moment the US government is not focused on these issues. They're
00:39:48 is not focused on these issues. They're
00:39:48 is not focused on these issues. They're focused on other things, economic
00:39:50 focused on other things, economic
00:39:50 focused on other things, economic opportunity, growth, and so forth. It's
00:39:51 opportunity, growth, and so forth. It's
00:39:51 opportunity, growth, and so forth. It's all good, but somebody's going to get
00:39:54 all good, but somebody's going to get
00:39:54 all good, but somebody's going to get focused on this and somebody's going to
00:39:56 focused on this and somebody's going to
00:39:56 focused on this and somebody's going to pay attention to it and it will
00:39:57 pay attention to it and it will
00:39:58 pay attention to it and it will ultimately be a problem. A quick aside,
00:39:59 ultimately be a problem. A quick aside,
00:40:00 ultimately be a problem. A quick aside, you probably heard me speaking about
00:40:01 you probably heard me speaking about
00:40:01 you probably heard me speaking about fountain life before and you're probably
00:40:03 fountain life before and you're probably
00:40:03 fountain life before and you're probably wishing, "Peter, would you please stop
00:40:05 wishing, "Peter, would you please stop
00:40:05 wishing, "Peter, would you please stop talking about fountain life?" And the
00:40:06 talking about fountain life?" And the
00:40:06 talking about fountain life?" And the answer is no, I won't. Because
00:40:08 answer is no, I won't. Because
00:40:08 answer is no, I won't. Because genuinely, we're living through a
00:40:10 genuinely, we're living through a
00:40:10 genuinely, we're living through a health care crisis. You may not know
00:40:12 health care crisis. You may not know
00:40:12 health care crisis. You may not know this, but 70% of heart attacks have no
00:40:14 this, but 70% of heart attacks have no
00:40:14 this, but 70% of heart attacks have no precedent, no pain, no shortness of
00:40:16 precedent, no pain, no shortness of
00:40:16 precedent, no pain, no shortness of breath. And half of those people with a
00:40:18 breath. And half of those people with a
00:40:18 breath. And half of those people with a heart attack never wake up. You don't
00:40:19 heart attack never wake up. You don't
00:40:20 heart attack never wake up. You don't feel cancer until stage three or stage
00:40:22 feel cancer until stage three or stage
00:40:22 feel cancer until stage three or stage 4, until it's too late. But we have all
00:40:24 4, until it's too late. But we have all
00:40:24 4, until it's too late. But we have all the technology required to detect and
00:40:26 the technology required to detect and
00:40:26 the technology required to detect and prevent these diseases early at scale.
00:40:29 prevent these diseases early at scale.
00:40:29 prevent these diseases early at scale. That's why a group of us including Tony
00:40:31 That's why a group of us including Tony
00:40:31 That's why a group of us including Tony Robbins, Bill Cap, and Bob Heruri
00:40:33 Robbins, Bill Cap, and Bob Heruri
00:40:33 Robbins, Bill Cap, and Bob Heruri founded Fountain Life, a one-stop center
00:40:35 founded Fountain Life, a one-stop center
00:40:35 founded Fountain Life, a one-stop center to help people understand what's going
00:40:37 to help people understand what's going
00:40:37 to help people understand what's going on inside their bodies before it's too
00:40:40 on inside their bodies before it's too
00:40:40 on inside their bodies before it's too late and to gain access to the
00:40:41 late and to gain access to the
00:40:41 late and to gain access to the therapeutics to give them decades of
00:40:43 therapeutics to give them decades of
00:40:43 therapeutics to give them decades of extra health span. Learn more about
00:40:45 extra health span. Learn more about
00:40:45 extra health span. Learn more about what's going on inside your body from
00:40:46 what's going on inside your body from
00:40:46 what's going on inside your body from Fountain Life. Go to fountainlife.com/per
00:40:50 Fountain Life. Go to fountainlife.com/per
00:40:50 Fountain Life. Go to fountainlife.com/per and tell them Peter sent you. Okay, back
00:40:52 and tell them Peter sent you. Okay, back
00:40:52 and tell them Peter sent you. Okay, back to the episode. Can I can I clean up one
00:40:55 to the episode. Can I can I clean up one
00:40:55 to the episode. Can I can I clean up one kind of common misconception there
00:40:56 kind of common misconception there
00:40:56 kind of common misconception there because um I think it's a really
00:40:58 because um I think it's a really
00:40:58 because um I think it's a really important one. In the movie version of
00:41:00 important one. In the movie version of
00:41:00 important one. In the movie version of AI, you described, hey, maybe there are
00:41:02 AI, you described, hey, maybe there are
00:41:02 AI, you described, hey, maybe there are 10 big AIs and five are in the US, three
00:41:04 10 big AIs and five are in the US, three
00:41:04 10 big AIs and five are in the US, three are in China, and two are elsewhere, not in
00:41:06 are in China, and two are elsewhere, not in
00:41:06 are in China, and two are elsewhere, not in Brussels probably, one's maybe in Dubai.
00:41:09 Brussels probably, one's maybe in Dubai.
00:41:09 Brussels probably, one's maybe in Dubai. Um or, you know, Israel.
00:41:11 Um or, you know, Israel.
00:41:11 Um or, you know, Israel. Israel. Okay, there you go.
00:41:12 Israel. Okay, there you go.
00:41:12 Israel. Okay, there you go. Some somewhere like that.
00:41:13 Some somewhere like that.
00:41:13 Some somewhere like that. Yeah. Um in the movie version of this,
00:41:16 Yeah. Um in the movie version of this,
00:41:16 Yeah. Um in the movie version of this, if it goes rogue, you know, the SWAT
00:41:18 if it goes rogue, you know, the SWAT
00:41:18 if it goes rogue, you know, the SWAT team comes in, they blow it up, and it's
00:41:20 team comes in, they blow it up, and it's
00:41:20 team comes in, they blow it up, and it's it's solved. But the actual real world
00:41:22 it's solved. But the actual real world
00:41:22 it's solved. But the actual real world is when you're using one of these huge
00:41:24 is when you're using one of these huge
00:41:24 is when you're using one of these huge data centers to create a super
00:41:26 data centers to create a super
00:41:26 data centers to create a super intelligent AI, the training process is
00:41:29 intelligent AI, the training process is
00:41:29 intelligent AI, the training process is 10^26, 10^28, you know, or more flops.
00:41:33 10^26, 10^28, you know, or more flops.
00:41:33 10^26, 10^28, you know, or more flops. But then the final brain can be ported
00:41:36 But then the final brain can be ported
00:41:36 But then the final brain can be ported and run on four GPUs, 8 GPUs, so a box
00:41:39 and run on four GPUs, 8 GPUs, so a box
00:41:39 and run on four GPUs, 8 GPUs, so a box about this size.
00:41:41 about this size.
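Rough arithmetic behind the point that a model trained with on the order of 10^26 flops can still be served from a single box. The parameter count, 8-bit weights, and per-GPU memory below are assumptions for illustration, not figures from the conversation.

```python
# Rough arithmetic behind "trained with ~10^26 flops, but the final brain runs on a few
# GPUs." Every number here is an assumption: a ~400B-parameter model, int8 weights,
# and 80 GB of memory per accelerator.
params = 400e9
bytes_per_param = 1                      # int8 weights after quantization (assumption)
weights_gb = params * bytes_per_param / 1e9
gpu_memory_gb = 80
print(f"{weights_gb:.0f} GB of weights ~ {weights_gb / gpu_memory_gb:.0f} GPUs")

training_flops = 1e26                    # order of magnitude quoted above
flops_per_token = 2 * params             # standard rule of thumb for one forward pass
print(f"training ~ {training_flops / flops_per_token:.1e} tokens' worth of single forward passes")
```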
00:41:41 about this size. Um, and it's just as intelligent, you
00:41:43 Um, and it's just as intelligent, you
00:41:43 Um, and it's just as intelligent, you know, and that's one of
00:41:45 know, and that's one of
00:41:45 know, and that's one of the beautiful things about it is you
00:41:46 the beautiful things about it is you
00:41:46 the beautiful things about it is you This is called stealing the weights.
00:41:47 This is called stealing the weights.
00:41:47 This is called stealing the weights. Stealing the weights. Exactly. And the
00:41:50 Stealing the weights. Exactly. And the
00:41:50 Stealing the weights. Exactly. And the new thing is that that weight file,
00:41:53 new thing is that that weight file,
00:41:53 new thing is that that weight file, if you have an innovation in
00:41:55 if you have an innovation in
00:41:55 if you have an innovation in inference-time speed and you say, oh, same
00:41:58 inference-time speed and you say, oh, same
00:41:58 inference-time speed and you say, oh, same weights, no difference, distill it or
00:42:01 weights, no difference, distill it or
00:42:01 weights, no difference, distill it or just quantize it or whatever, but I made
00:42:02 just quantize it or whatever, but I made
00:42:02 just quantize it or whatever, but I made it 100 times faster, now it's actually
00:42:05 it 100 times faster, now it's actually
00:42:05 it 100 times faster, now it's actually far more intelligent than what you
00:42:07 far more intelligent than what you
00:42:07 far more intelligent than what you exported from the data center. And so
00:42:11 exported from the data center. And so
00:42:11 exported from the data center. And so all of these are examples of the
00:42:13 all of these are examples of the
00:42:13 all of these are examples of the proliferation problem
00:42:15 proliferation problem
00:42:15 proliferation problem and I'm not convinced that we will hold
00:42:18 and I'm not convinced that we will hold
00:42:18 and I'm not convinced that we will hold these things in the 10 places.
00:42:22 these things in the 10 places.
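To see why an exported weight file travels so easily, here is a minimal 8-bit quantization sketch: the same weights in a quarter of the space with only a small reconstruction error. Real schemes (per-channel scales, int4, and so on) are more involved; everything below is a toy.

```python
# A minimal sketch of why an exported weight file is so portable: int8 quantization of a
# weight matrix shrinks it ~4x versus float32 with only a small reconstruction error.
import numpy as np
rng = np.random.default_rng(0)

w = rng.normal(size=(1024, 1024)).astype(np.float32)     # stand-in weight matrix
scale = np.abs(w).max() / 127.0
w_int8 = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
w_back = w_int8.astype(np.float32) * scale

print(f"size: {w.nbytes/1e6:.1f} MB -> {w_int8.nbytes/1e6:.1f} MB")
print(f"mean abs error: {np.abs(w - w_back).mean():.4f}")
```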
00:42:22 these things in the 10 places. And here's why. Let's assume you
00:42:24 And here's why. Let's assume you
00:42:24 And here's why. Let's assume you have the 10, which is possible.
00:42:27 have the 10, which is possible.
00:42:27 have the 10, which is possible. They will have subsets
00:42:29 They will have subsets
00:42:29 They will have subsets of models that are smaller but nearly as
00:42:33 of models that are smaller but nearly as
00:42:34 of models that are smaller but nearly as intelligent.
00:42:35 intelligent.
00:42:36 intelligent. And so the tree of knowledge of systems
00:42:39 And so the tree of knowledge of systems
00:42:39 And so the tree of knowledge of systems that have knowledge is not going to be
00:42:40 that have knowledge is not going to be
00:42:40 that have knowledge is not going to be 10 and then zero. It's going to be 10, a
00:42:43 10 and then zero. It's going to be 10, a
00:42:43 10 and then zero. It's going to be 10, a hundred, a thousand, a million, a
00:42:46 hundred, a thousand, a million, a
00:42:46 hundred, a thousand, a million, a billion at different levels of
00:42:48 billion at different levels of
00:42:48 billion at different levels of complexity. So the system that's on your
00:42:51 complexity. So the system that's on your
00:42:51 complexity. So the system that's on your future phone may be, you know, three
00:42:54 future phone may be, you know, three
00:42:54 future phone may be, you know, three orders of magnitude, four orders of
00:42:55 orders of magnitude, four orders of
00:42:55 orders of magnitude, four orders of magnitude smaller than the one at the
00:42:59 magnitude smaller than the one at the
00:42:59 magnitude smaller than the one at the very tippy top, but it will be very,
00:43:01 very tippy top, but it will be very,
00:43:01 very tippy top, but it will be very, very powerful.
00:43:01 very powerful.
00:43:01 very powerful. You know, to exactly what you're talking
00:43:03 You know, to exactly what you're talking
00:43:03 You know, to exactly what you're talking about, there's some great research going
00:43:04 about, there's some great research going
00:43:04 about, there's some great research going on at MIT. It'll probably move to
00:43:06 on at MIT. It'll probably move to
00:43:06 on at MIT. It'll probably move to Stanford just to be fair but it always
00:43:08 Stanford just to be fair but it always
00:43:08 Stanford just to be fair but it always does but uh it's great research going on
00:43:10 does but uh it's great research going on
00:43:10 does but uh it's great research going on at MIT on uh if you have one of these
00:43:12 at MIT on uh if you have one of these
00:43:12 at MIT on uh if you have one of these huge models and it's been trained on
00:43:15 huge models and it's been trained on
00:43:15 huge models and it's been trained on movies it's been trained on Swahili a
00:43:17 movies it's been trained on Swahili a
00:43:18 movies it's been trained on Swahili a lot of the parameters aren't useful for
00:43:19 lot of the parameters aren't useful for
00:43:19 lot of the parameters aren't useful for this savant use case but the general
00:43:21 this savant use case but the general
00:43:21 this savant use case but the general knowledge and intuition is. So what's the
00:43:23 knowledge and intuition is. So what's the
00:43:24 knowledge and intuition is. So what's the optimal balance between narrowing the
00:43:26 optimal balance between narrowing the
00:43:26 optimal balance between narrowing the training data and narrowing the
00:43:27 training data and narrowing the
00:43:27 training data and narrowing the parameter set to be a specialist without
00:43:31 parameter set to be a specialist without
00:43:31 parameter set to be a specialist without losing general you know learning
00:43:34 losing general you know learning
00:43:34 losing general you know learning so the people who are opposed to that view,
00:43:36 so the people who are opposed to that view,
00:43:36 so the people who are opposed to that view, and again we don't know, would say the
00:43:38 and again we don't know, would say the
00:43:38 and again we don't know, would say the following. If you take a general purpose
00:43:40 following. If you take a general purpose
00:43:40 following. If you take a general purpose model and you specialize it through
00:43:42 model and you specialize it through
00:43:42 model and you specialize it through finetuning it also becomes more brittle.
00:43:45 finetuning it also becomes more brittle.
00:43:45 finetuning it also becomes more brittle. Mhm. Mhm.
00:43:46 Mhm. Mhm.
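A toy illustration of that brittleness worry, under purely synthetic assumptions: fit a small network to a general task, fine-tune it hard on a narrow specialist objective, and watch the general-task error climb back up.

```python
# Toy illustration of the brittleness worry: specialize a small "general" network via
# fine-tuning on a narrow objective and measure how much of the general task it forgets.
# The tasks, sizes, and step counts are synthetic and illustrative.
import torch
torch.manual_seed(0)

net = torch.nn.Sequential(torch.nn.Linear(10, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
mse = torch.nn.MSELoss()

general = lambda x: x.sum(dim=1, keepdim=True)                   # the broad task
specialist = lambda x: (x[:, :2] ** 2).sum(dim=1, keepdim=True)  # a narrow, different objective
data = torch.randn(512, 10)

def fit(target_fn, steps):
    for _ in range(steps):
        loss = mse(net(data), target_fn(data))
        opt.zero_grad(); loss.backward(); opt.step()

fit(general, 800)                                          # "pre-train" on the general task
before = mse(net(data), general(data)).item()
fit(specialist, 800)                                       # specialize via fine-tuning
after = mse(net(data), general(data)).item()
print(f"general-task error: {before:.3f} -> {after:.3f}")  # it climbs: the specialist forgot
```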
00:43:46 Mhm. Mhm. Their view is that what you do is you
00:43:48 Their view is that what you do is you
00:43:48 Their view is that what you do is you just make bigger and bigger and bigger
00:43:51 just make bigger and bigger and bigger
00:43:51 just make bigger and bigger and bigger models because they're in the big model
00:43:53 models because they're in the big model
00:43:53 models because they're in the big model camp right and that's why they need
00:43:55 camp right and that's why they need
00:43:55 camp right and that's why they need gigawatts of data centers and so forth.
00:43:57 gigawatts of data centers and so forth.
00:43:58 gigawatts of data centers and so forth. And their argument is that that
00:43:59 And their argument is that that
00:43:59 And their argument is that that flexibility of intelligence that
00:44:01 flexibility of intelligence that
00:44:01 flexibility of intelligence that they are seeing will continue.
00:44:04 they are seeing will continue.
00:44:04 they are seeing will continue. Dario wrote a a piece called um
00:44:08 Dario wrote a a piece called um
00:44:08 Dario wrote a a piece called um basically about machines
00:44:10 basically about machines
00:44:10 basically about machines and he argued that there
00:44:12 and he argued that there
00:44:12 and he argued that there machines of of grace
00:44:15 machines of of grace
00:44:15 machines of of grace machines of amazing grace
00:44:17 machines of amazing grace
00:44:17 machines of amazing grace and he argued that there are three
00:44:20 and he argued that there are three
00:44:20 and he argued that there are three scaling laws in play. The first one is
00:44:22 scaling laws in play. The first one is
00:44:22 scaling laws in play. The first one is the one you know of, which is foundation
00:44:23 the one you know of, which is foundation
00:44:23 the one you know of, which is foundation model growth. We're still on that.
00:44:26 model growth. We're still on that.
00:44:26 model growth. We're still on that. The second one is a test time
00:44:29 The second one is a test time
00:44:29 The second one is a test time training law and the third one is a
00:44:31 training law and the third one is a
00:44:31 training law and the third one is a reinforcement learning training law.
00:44:33 reinforcement learning training law.
00:44:33 reinforcement learning training law. Training laws are where if you just put
00:44:35 Training laws are where if you just put
00:44:35 Training laws are where if you just put more hardware and more data, they just
00:44:37 more hardware and more data, they just
00:44:37 more hardware and more data, they just get smarter in a predictable way.
00:44:40 get smarter in a predictable way.
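The training law being referred to is usually written in the Chinchilla form, where loss falls predictably as parameters and data grow. The constants below are the published Chinchilla fits and are used here only to show the shape of the curve, not as a claim about any particular model discussed above.

```python
# The "training law" in its commonly cited Chinchilla form, L(N, D) = E + A/N^a + B/D^b:
# loss falls smoothly and predictably as parameters N and training tokens D grow.
def predicted_loss(N, D, E=1.69, A=406.4, a=0.34, B=410.7, b=0.28):
    return E + A / N**a + B / D**b

for N, D in [(1e9, 2e10), (7e10, 1.4e12), (1e12, 2e13)]:
    print(f"N={N:.0e} params, D={D:.0e} tokens -> loss ~ {predicted_loss(N, D):.2f}")
```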
00:44:40 get smarter in a predictable way. Um, we're just, in his view, at the
00:44:42 Um, we're just, in his view, at the
00:44:42 Um, we're just, in his view, at the beginning of the second and third ones.
00:44:46 beginning of the second and third ones.
00:44:46 beginning of the second and third ones. That's why, I'm sure, our
00:44:49 That's why, I'm sure, our
00:44:49 That's why, I'm sure, our audience would be frustrated. Why do
00:44:51 audience would be frustrated. Why do
00:44:51 audience would be frustrated. Why do we not know? We just don't know,
00:44:54 we not know? We just don't know,
00:44:54 we not know? We just don't know, right? It's too new. It's too powerful.
00:44:57 right? It's too new. It's too powerful.
00:44:57 right? It's too new. It's too powerful. And at the moment, all of these
00:45:00 And at the moment, all of these
00:45:00 And at the moment, all of these businesses are incredibly highly valued.
00:45:02 businesses are incredibly highly valued.
00:45:02 businesses are incredibly highly valued. They're growing incredibly quickly. The
00:45:05 They're growing incredibly quickly. The
00:45:05 They're growing incredibly quickly. The uses of them, I mentioned earlier, uh
00:45:08 uses of them, I mentioned earlier, uh
00:45:08 uses of them, I mentioned earlier, uh going back to Google, um the ability to
00:45:11 going back to Google, um the ability to
00:45:11 going back to Google, um the ability to refactor your entire workflow in a
00:45:13 refactor your entire workflow in a
00:45:13 refactor your entire workflow in a business is a very big deal. That's a
00:45:15 business is a very big deal. That's a
00:45:15 business is a very big deal. That's a lot of money to be made there for all
00:45:18 lot of money to be made there for all
00:45:18 lot of money to be made there for all the companies involved. We will see.
00:45:21 the companies involved. We will see.
00:45:21 the companies involved. We will see. Eric, shifting the topic. One of the
00:45:23 Eric, shifting the topic. One of the
00:45:25 concerns that people have in the near
00:45:27 term, and people have been, you know,
00:45:30 ringing the alarm bells, is on jobs.
00:45:32 Um, I'm wondering where you come out on
00:45:35 this, and flipping that forward to
00:45:38 education. How do we educate our kids
00:45:41 today in high school and college? Uh,
00:45:43 and what's your advice? So on the first
00:45:46 thing, do you believe that, as Dario has
00:45:50 gone on, uh, you know, TV shows now,
00:45:52 speaking to significant white collar job
00:45:54 loss, we're seeing obviously a multitude
00:45:57 of different drivers and, uh, robots
00:45:59 coming in? How do you think about the
00:46:03 job market over the next 5 years? Um,
00:46:08 let's posit that in 30 or 40 years
00:46:10 there'll be a very different employment,
00:46:12 robotic-human interaction,
00:46:14 or the definition of, do we need to
00:46:15 work at all,
00:46:17 the definition of work, the definition of
00:46:20 identity. Let's just posit that, uh, and
00:46:22 let's also posit that it will take 20 or
00:46:25 30 years for those things to work
00:46:29 through the economy of our world. Um,
00:46:31 now in California and other cities in
00:46:33 America, you can get on a Waymo taxi.
00:46:38 Um, Waymo, it's 2025. The original work
00:46:40 was done in the late '90s.
00:46:42 The original challenge at Stanford was
00:46:43 done, I believe, in 2004.
00:46:45 The DARPA Grand Challenge. It was 2004.
00:46:48 2005. Sebastian Thrun won.
00:46:52 So, more than 20 years from a visible
00:46:55 demonstration to our ability to use it
00:46:58 in daily life. Why? It's hard. It's deep
00:47:00 tech. It's regulated and all of that.
00:47:01 And I think that's going to be true,
00:47:03 especially in robots that are
00:47:04 interacting with humans. They're going
00:47:05 to get regulated. You're not going to be
00:47:07 wandering around and the robot's going to
00:47:09 decide to slap you. It just doesn't, you
00:47:11 know, society's not going to allow that
00:47:12 sort of thing.
00:47:13 It's just not, it's not going to,
00:47:16 it's not going to allow it.
00:47:19 So, in the shorter term, five or 10
00:47:23 years, I'm going to argue that this is
00:47:26 positive for jobs in the following way.
00:47:27 Okay.
00:47:29 Um, if you look at the history of
00:47:33 automation and economic growth,
00:47:36 automation starts with the lowest status
00:47:40 and most dangerous jobs and then works
00:47:42 up the chain. So if you think about
00:47:44 assembly lines and cars and, you know,
00:47:46 furnaces and all these sort of very, very
00:47:48 dangerous jobs that our forefathers
00:47:49 did, they don't do them anymore. They're
00:47:52 done by robotic solutions of one kind or another,
00:47:54 and typically not a humanoid robot but
00:47:57 an arm. So the world dominated by
00:47:59 arms that are intelligent and so forth
00:48:01 will automate those functions. What
00:48:04 happens to the people? Well, it turns
00:48:06 out that the person who was working with
00:48:08 the welder, who's now operating the
00:48:11 arm, has a higher
00:48:15 wage, and the company has higher profits
00:48:17 because it's producing more widgets. So
00:48:19 the company makes more money and the
00:48:21 person makes more money, right? In that
00:48:23 sense. Now you sit there and say, well,
00:48:24 that's not true because humans don't
00:48:27 want to be retrained. Ah, but in the
00:48:29 vision that we're talking about, every
00:48:31 single person will have a
00:48:33 computer assistant that's very
00:48:35 intelligent that helps them perform.
00:48:37 And you take a person of normal
00:48:39 intelligence or knowledge and you add a,
00:48:41 you know, sort of accelerant, they can get
00:48:43 a higher paying job. So you sit there
00:48:45 and you go, well, why are there more jobs?
00:48:47 There should be fewer jobs. That's not
00:48:50 how economics works. Economics expands
00:48:52 because the opportunities expand,
00:48:54 profits expand, wealth expands and so
00:48:58 forth. So there's plenty of dislocation,
00:49:01 but in aggregate, are there more people
00:49:04 employed or fewer? The answer is more
00:49:05 people with higher paying jobs.
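To make the arithmetic behind that welder-to-arm-operator claim concrete, here is a minimal sketch with invented numbers; the widget rates, price, and wage shares are illustrative assumptions, not figures from the conversation:

```python
# Illustrative only: invented numbers showing how automation can raise both
# the operator's wage and the company's profit when output per worker rises.
def yearly_figures(widgets_per_day, price_per_widget, wage_share, days=250):
    revenue = widgets_per_day * price_per_widget * days
    wage = revenue * wage_share      # portion of revenue paid to the worker
    profit = revenue - wage          # remainder kept by the company
    return wage, profit

# Hypothetical before/after: manual welding vs. operating an intelligent arm.
manual_wage, manual_profit = yearly_figures(20, 15, wage_share=0.40)
arm_wage, arm_profit = yearly_figures(80, 15, wage_share=0.25)

print(f"manual welding: wage ${manual_wage:,.0f}, profit ${manual_profit:,.0f}")
print(f"operating the arm: wage ${arm_wage:,.0f}, profit ${arm_profit:,.0f}")
# Even though the worker's share of revenue falls, total output grows so much
# that both the wage (30,000 -> 75,000) and the profit (45,000 -> 225,000) rise.
```

The sketch only shows that the argument turns on productivity: if output per worker rises enough, wages and profits can grow at the same time, which is the "more people with higher paying jobs" claim above.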
00:49:07 Is that true in India as well?
00:49:09 Uh, it will be, and you picked India
00:49:11 because India has a positive demographic
00:49:12 outlook, although their birth rate
00:49:14 is now down to 2.0.
00:49:16 Huh. That's good. The rest of
00:49:17 the world is choosing not to have
00:49:18 children.
00:49:22 If you look at Korea, it's now down to 0.7
00:49:23 children per two parents.
00:49:24 Yeah.
00:49:26 China is down to one child per two
00:49:27 parents.
00:49:28 It's evaporating.
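As a rough back-of-the-envelope illustration of why those fertility numbers "evaporate" a population, here is a deliberately simplified sketch; it ignores migration, mortality before parenthood, and overlapping generations, and the starting population is arbitrary:

```python
# Simplified model: if every two parents average `tfr` children, each
# generation is roughly tfr / 2 the size of the previous one
# (replacement level is about 2.1).
def generation_sizes(tfr, start=1_000_000, generations=3):
    sizes = [start]
    for _ in range(generations):
        sizes.append(round(sizes[-1] * tfr / 2))
    return sizes

for label, tfr in [("India (~2.0)", 2.0), ("China (~1.0)", 1.0), ("Korea (~0.7)", 0.7)]:
    print(label, generation_sizes(tfr))
# India (~2.0) [1000000, 1000000, 1000000, 1000000]   roughly stable
# China (~1.0) [1000000, 500000, 250000, 125000]      halves every generation
# Korea (~0.7) [1000000, 350000, 122500, 42875]       collapses within a few generations
```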
00:49:30 Now, what happens in those situations?
00:49:32 They completely automate everything,
00:49:33 because it's the only way to increase
00:49:36 national productivity. So the most likely
00:49:38 scenario, at least in the next decade,
00:49:40 is it's a national emergency to use more
00:49:43 AI in the workplace to give people
00:49:45 better paying jobs and create more
00:49:47 productivity in the United States,
00:49:49 because our birth rate has been falling.
00:49:51 And what happens is people have
00:49:53 talked about this for 20 years. If you
00:49:55 have this conversation and you
00:49:57 ignore demographics, which is negative
00:50:00 for humans, and economic growth, which
00:50:01 occurs naturally because of capital
00:50:03 investment, then you miss the whole
00:50:05 story. Now, there are plenty of people
00:50:07 who lose their jobs, but there's an
00:50:09 awful lot of people who have new jobs.
00:50:12 And the typical simple example would be
00:50:14 all those people who work in Amazon
00:50:17 distribution centers and Amazon trucks;
00:50:19 those jobs didn't exist until Amazon was
00:50:23 created, right? Um, the number one
00:50:25 shortage in jobs right now in America
00:50:29 is truck drivers. Why? Truck driving is
00:50:33 a lonely, hard, low-paying, right, low-
00:50:35 status job for good people. They don't
00:50:37 want it. They want a better paying job.
00:50:39 Right? Going back to education,
00:50:42 it's really a crime that our industry
00:50:44 has not invented the following product.
00:50:46 The product that I wanted to build is a
00:50:49 product that teaches every single human
00:50:51 who wants to be taught, in their language,
00:50:53 in a gamified way, the stuff they need to
00:50:55 know to be a great citizen in their
00:50:56 country.
00:50:58 Right? That can all be done on phones
00:50:59 now. It can all be learned, and you can
00:51:02 all learn how to do it. And why do we
00:51:04 not have that product? Right? The
00:51:06 investment in the humans of the world is
00:51:10 always the best return; investing in knowledge and
00:51:12 capability is always the right answer.
00:51:13 Let me try and get your opinion on
00:51:15 this, because you're so influential.
00:51:17 So, I've got about a thousand people in
00:51:18 the companies where I'm the controlling
00:51:20 shareholder, and I've been trying to tell
00:51:23 them exactly what you just articulated,
00:51:25 where a lot of these people have been in
00:51:27 the company for 10, 15 years. They're
00:51:28 incredibly capable and loyal, but
00:51:30 they've learned a specific white collar
00:51:33 skill. They worked really hard to learn
00:51:37 the skill, and the AI is coming within
00:51:38 no more than 3 years and maybe two
00:51:42 years. And the opportunity to
00:51:45 retrain and have continuity is right
00:51:46 now.
00:51:48 But if they delay, which everyone seems
00:51:50 to be, just let's wait and see, and what
00:51:52 I'm trying to tell them is if you wait
00:51:55 and see, you're really screwing
00:51:57 over that employee. So, so we are in
00:51:59 wild agreement that this is going to
00:52:03 happen, and the winners will be the ones who
00:52:05 act. Now, what's interesting is when you
00:52:08 look at innovation history, the biggest
00:52:09 companies, who you would think are the
00:52:12 slowest, because they have economic
00:52:14 resources that the little companies
00:52:17 typically don't, tend to eventually
00:52:20 get there, right? So, watch what the big
00:52:21 companies do. Mhm.
00:52:23 Their CFOs and the people who
00:52:25 measure things carefully, who are very,
00:52:28 very intelligent, they say, "I'm done
00:52:29 with that thousand-person engineering team that
00:52:32 doesn't do very much. I want 50 people
00:52:34 working in this other way, and we'll do
00:52:35 something else for the other people."
00:52:37 And when you say big companies, we're
00:52:38 thinking Google, Meta. We're not
00:52:39 thinking, you know, big bank hasn't done
00:52:40 anything.
00:52:42 I'm thinking about big banks. Um, when
00:52:44 I talk to CEOs, and I know a lot of
00:52:47 them in traditional industries, what I
00:52:49 counsel them is you already have people
00:52:51 in the company who know what to do. You
00:52:53 just don't know who they are.
00:52:55 So call a review of the best ideas to
00:52:58 apply AI in our business, and
00:53:01 inevitably the first ones are boring.
00:53:03 Improve customer service, improve call
00:53:05 centers and so forth. But then somebody
00:53:06 says, you know, we could increase
00:53:08 revenue if we built this product. I'll
00:53:10 give you another example. There's this
00:53:12 whole industry of people who work on
00:53:14 regulated user interfaces of one kind or
00:53:16 another. I think user interfaces are
00:53:18 largely going to go away, because if you
00:53:20 think about it, the agents speak English,
00:53:22 typically, or other languages. You can
00:53:24 talk to them. You can say what you want.
00:53:26 The UI can be generated. So I can say,
00:53:28 generate me a set of buttons that allows
00:53:30 me to solve this problem, and it's
00:53:32 generated for you. Why do I have to be
00:53:34 stuck in what is called the WIMP
00:53:36 interface, windows, icons, menus, and
00:53:38 pull-downs, that was invented at Xerox
00:53:41 PARC, right, 50 years ago? Why am I
00:53:43 still stuck in that paradigm? I just
00:53:44 want it to work.
00:53:47 Yeah.
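A minimal sketch of what "the UI can be generated" might look like in practice: ask a model for a small JSON spec of buttons matching the user's stated goal, then render whatever comes back. The prompt shape, the function names, and the stand-in model below are illustrative assumptions, not a description of any particular product or API:

```python
# Hypothetical sketch: generate a task-specific UI from a natural-language request.
import json

PROMPT = ('The user says: "{request}"\n'
          'Return only JSON shaped like '
          '{{"title": str, "buttons": [{{"label": str, "action": str}}]}}.')

def generate_ui(request: str, llm) -> dict:
    raw = llm(PROMPT.format(request=request))   # llm is any text-in, text-out callable
    return json.loads(raw)                      # parse the returned button spec

def render(spec: dict) -> None:
    print(spec["title"])                        # stand-in for a real widget toolkit
    for i, button in enumerate(spec["buttons"], 1):
        print(f"  [{i}] {button['label']}  ->  {button['action']}")

def fake_llm(prompt: str) -> str:
    # Canned response so the sketch runs end to end without an external service.
    return json.dumps({"title": "Refund a customer order",
                       "buttons": [{"label": "Look up order", "action": "search_orders"},
                                   {"label": "Issue refund", "action": "create_refund"},
                                   {"label": "Email confirmation", "action": "send_email"}]})

render(generate_ui("I need to refund a customer's order", fake_llm))
```

The design point is the one in the conversation: the fixed windows-icons-menus layer becomes a thin, disposable artifact the agent regenerates per task, rather than something a team hand-builds and maintains.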
00:53:50 Kids in high school and college now, any
00:53:51 different recommendations for where they
00:53:55 go? When you spend any time in a high
00:53:58 school, or, I was at a conference
00:54:00 yesterday where we had a drone challenge,
00:54:03 and you watch the 15 year olds, they're
00:54:05 going to be fine.
00:54:07 They're just going to be fine. It all
00:54:08 makes sense to them, and we're in their
00:54:09 way.
00:54:10 Um, if I were
00:54:11 digital natives,
00:54:13 but they're more than digital natives.
00:54:15 They get it. They understand the speed.
00:54:17 It's natural to them. They're also,
00:54:19 frankly, faster and smarter than we are,
00:54:21 right? That's just how life works, I'm
00:54:24 sorry to say. So we have wisdom, they
00:54:27 have intelligence, they win, right? So
00:54:29 in their case,
00:54:31 I used to think the right answer was to
00:54:34 go into biology. I now actually think
00:54:37 going into the application of
00:54:38 intelligence to whatever you're
00:54:41 interested in is the best thing you can
00:54:41 do as a young person.
00:54:42 Purpose driven.
00:54:43 Yeah.
00:54:46 Any form of solution that you find
00:54:48 interesting. Most, uh, most kids get into
00:54:50 it for gaming reasons or something, and
00:54:52 they learn how to program very young. So
00:54:55 they're quite familiar with this. Um, I
00:54:57 work, uh, at a particular university with
00:55:00 undergraduates, and they're already doing
00:55:02 different algorithms for
00:55:04 reinforcement learning as sophomores.
00:55:06 This shows you how fast this is
00:55:07 happening at their level. They're going
00:55:09 to be just fine.
00:55:10 They're responding to the economic
00:55:13 signals, but they're also responding to
00:55:16 their purpose. Right? So, an example
00:55:18 would be you care about climate, which I
00:55:20 certainly do. If you're a young person,
00:55:21 why don't you figure out a way to
00:55:23 simplify the climate science, to use
00:55:25 simple foundation models to answer these
00:55:26 core questions?
00:55:26 Yeah.
00:55:28 Why don't you figure out a way to use
00:55:29 these powerful models to come up with
00:55:32 new materials, right, that allow us
00:55:34 again to address the carbon challenge?
00:55:36 And why don't you work on energy systems
00:55:38 to have better and more efficient energy
00:55:40 sources that are less carbon-intensive?
00:55:41 You see my point? Yeah,
00:55:43 you know, I've noticed, uh, because I have
00:55:45 kids exactly of that era, and, um,
00:55:48 there's a very clear step function
00:55:50 change, largely attributable, I think, to
00:55:53 Google and Apple, that they have the
00:55:55 assumption that things will work,
00:55:56 and if you go just a couple years older,
00:55:58 during the WIMP era like you described
00:55:59 it, which I'll attribute more to
00:56:01 Microsoft, the assumption is nothing will
00:56:03 ever work. Like if I try to use this
00:56:05 thing, it's going to crash, I'm going to
00:56:06 be... Also interesting was that in my
00:56:08 career I used to give these speeches
00:56:11 about the internet, which I enjoyed,
00:56:12 uh, where I said, you know, the great
00:56:13 thing about the internet is,
00:56:15 there's an off button and you can turn
00:56:18 off your off button and you can actually
00:56:19 have dinner with your family, and then
00:56:22 you can turn it on after dinner. This is
00:56:24 no longer possible. So the
00:56:26 distinction between the real world and
00:56:28 the digital world has become confusing.
00:56:32 But none of us are offline for
00:56:33 any significant period of time.
00:56:36 Yeah. And indeed the reward system
00:56:38 in the world has now caused us to not
00:56:40 even be able to fly in peace. Yeah.
00:56:42 Right. Drive in peace, take a train in
00:56:42 peace.
00:56:44 Starlink is everywhere.
00:56:45 Right. And that ubiquitous
00:56:48 connectivity has some negative impact in
00:56:51 terms of psychological stress, uh, loss of
00:56:53 emotional and physical health and so forth.
00:56:56 But the benefit of that productivity is
00:56:57 without question.
00:56:59 Every day I get the strangest
00:57:01 compliment. Someone will stop me and
00:57:03 say, "Peter, you have such nice skin."
00:57:05 Honestly, I never thought I'd hear that
00:57:07 from anyone. And honestly, I can't take
00:57:09 the full credit. All I do is use
00:57:12 something called OneSkin OS1 twice a day,
00:57:15 every day. The company is built by four
00:57:17 brilliant PhD women who've identified a
00:57:19 peptide that effectively reverses the
00:57:22 age of your skin. I love it. And again,
00:57:25 I use this twice a day, every day. You
00:57:27 can go to oneskin.co and use the code Peter at
00:57:29 checkout for a discount on the same
00:57:32 product I use. That's oneskin.co, and
00:57:35 use the code Peter at checkout. All
00:57:37 right, back to the episode.
00:57:39 Google I/O was amazing.
00:57:42 I mean, just hats off to the entire team
00:57:47 there. Um, Veo 3 was shocking, and we're
00:57:49 sitting here 8 miles from
00:57:51 Hollywood,
00:57:56 and I'm just wondering your thoughts on
00:57:58 the impact this will have. You know, are we
00:58:01 going to see the one-person feature
00:58:04 film, like we're seeing potentially
00:58:06 one-person, uh, unicorns in the future with
00:58:09 agentic AI? Are we going to see, uh, an
00:58:11 individual be able to compete with a
00:58:13 Hollywood studio? And should they be
00:58:16 worried about their assets?
00:58:17 Well, they should always be worried
00:58:19 because of intellectual property issues
00:58:22 and so forth. Um, I think blockbusters
00:58:24 are likely to still be put together by
00:58:26 people, with an awful lot of help from
00:58:27 AI. Mhm.
00:58:30 Um, I don't think that goes away. Um, if
00:58:31 you look at what we can do with
00:58:34 generating long-form video, it's very
00:58:35 expensive to do long-form video,
00:58:37 although that will come down. And also
00:58:40 there's an occasional extra leg or extra
00:58:43 clock or whatever. It's not perfect yet.
00:58:45 And that requires human editing. So even
00:58:47 in the scenario where a lot of the
00:58:50 video is created by a computer, there are
00:58:51 going to be humans that are producing it
00:58:54 and directing it, for reasons. My best
00:58:57 example in Hollywood is, let's
00:58:59 use the example, and I was at a studio
00:59:01 where they were showing me this.
00:59:03 They happened to have an actor
00:59:06 who was recreating William Shatner's
00:59:10 movements, a young man, and they
00:59:12 had licensed the likeness from, you know,
00:59:16 William Shatner, who's now older, and they
00:59:18 put his head on this person's body, and
00:59:20 it was seamless. Well, that's pretty
00:59:21 impressive. That's more revenue for
00:59:24 everyone. An unknown actor becomes a
00:59:26 bit more famous, Mr. Shatner gets more
00:59:29 revenue, the whole movie
00:59:32 genre works. That's a good thing.
00:59:35 Another example is that nowadays they
00:59:38 use green screens rather than sets. And
00:59:40 furthermore, in the alien department,
00:59:42 when you have, you know, scary movies,
00:59:44 instead of having the makeup person,
00:59:47 they just add the makeup digitally.
00:59:50 So, who wins? The costs are lower, the
00:59:53 movies are made quicker. In theory, the
00:59:54 movies are better, right? Because you
00:59:56 have more choices. Um, so everybody
00:59:58 wins. Who loses? Well, there was
01:00:01 somebody who built that set,
01:00:03 and that set isn't needed anymore.
01:00:04 That's a carpenter and a very talented
01:00:07 person who now has to go get a job in
01:00:09 the carpentry business. So again, I
01:00:11 think people get confused. If I look at
01:00:12 the digital
01:00:15 transformation of entertainment, subject
01:00:17 to intellectual property being held,
01:00:20 which is always a question, it's going
01:00:21 to be just fine,
01:00:22 right? There's still going to be
01:00:25 blockbusters. The cost will go down, not
01:00:28 up, or the relative income,
01:00:30 because in Hollywood, they essentially
01:00:31 have their own accounting and they
01:00:33 essentially allocate all the revenue to
01:00:35 all the key producing people. The
01:00:38 allocation will shift to the people who
01:00:40 are the most creative. That's a normal
01:00:42 process. Remember we said earlier that
01:00:45 automation gets rid of the
01:00:47 lowest quality jobs, the most dangerous
01:00:49 jobs there. The jobs that are sort of
01:00:52 straightforward are probably automated,
01:00:54 but not the really creative jobs. Um,
01:00:56 another example, the script writers.
01:00:57 You're still going to have script
01:00:58 writers, but they're going to have an
01:01:01 awful lot of help from AI to write even
01:01:03 better scripts. That's not bad.
01:01:05 Okay. I saw a study recently out of
01:01:10 Stanford that documented AI being much
01:01:13 more persuasive than the best humans.
01:01:14 Yes.
01:01:17 Uh, that set off some alarms. It also set
01:01:19 off some interesting thoughts on the
01:01:21 future of advertising.
01:01:23 Any particular thoughts about that?
01:01:25 So we know the following. We know that
01:01:28 if the system knows you well enough, it
01:01:31 can learn to convince you of anything.
01:01:34 Mhm. So what that means in an
01:01:36 unregulated environment is that the
01:01:38 systems will know you better and better.
01:01:40 They'll get better at pitching you, and
01:01:42 if you're not savvy, if you're not
01:01:44 smart, you could be easily manipulated.
01:01:47 We also know that the computer is better
01:01:50 than humans trying to do the same thing.
01:01:52 So none of this surprises me. The real
01:01:54 question, and I'll ask this as a
01:01:57 question, is: in the presence of
01:02:00 unregulated misinformation engines, of
01:02:03 which there will be many, advertisers,
01:02:06 uh, politicians, just criminal people,
01:02:08 people trying to evade responsibility,
01:02:10 there's all sorts of people who have
01:02:13 free speech. When they have free speech,
01:02:15 which includes the ability to use
01:02:18 misinformation to their advantage, what
01:02:20 happens to democracy? Yeah,
01:02:22 we've all grown up in democracies
01:02:24 where there's a sort of a consensus
01:02:27 around trust, and there's an elite that
01:02:28 more or less administers the trust
01:02:30 vectors and so forth. There's a set of
01:02:33 shared values. Do those shared values go
01:02:36 away? In our book Genesis, we talk
01:02:38 about this as a deeper problem. What
01:02:41 does it mean to be human when you're
01:02:43 interacting mostly with these digital
01:02:45 things,
01:02:46 especially if the digital things have
01:02:50 their own scenarios? My favorite example
01:02:53 is that, uh, you have a son or a grandson
01:02:56 or a child or a grandchild, and you give
01:02:58 them a bear, and the bear has a
01:02:59 personality, and the child grows up, but
01:03:01 the bear grows up too.
01:03:04 So who regulates what the bear says to
01:03:05 the kid? Most people haven't actually
01:03:07 experienced the super, super empathetic
01:03:09 voice that can be any inflection you
01:03:11 want. When they see that, which will be
01:03:12 in the next probably two months,
01:03:13 yeah, they're going to completely open
01:03:14 their eyes to what this
01:03:17 Well, remember that voice casting was
01:03:19 solved a few years ago, and that you can
01:03:20 cast
01:03:22 anyone else's voice onto your own.
01:03:23 Yeah.
01:03:25 And that has all sorts of problems.
01:03:27 Have you seen, uh, an avatar yet of
01:03:29 somebody that you love that's passed
01:03:30 away, or, or Henry Kissinger or anything,
01:03:31 is that?
01:03:32 Well, we actually created one
01:03:34 with the permission of his family.
01:03:35 Did you start crying instantly?
01:03:37 Uh, it's very emotional. It's very
01:03:38 emotional because, you know, it brings
01:03:38 emotional because, you know, it brings back I mean it's it's a real human,
01:03:41 back I mean it's it's a real human,
01:03:41 back I mean it's it's a real human, you know, it's a real memory, a real
01:03:43 you know, it's a real memory, a real
01:03:43 you know, it's a real memory, a real voice. Um, and I think we're going to
01:03:45 voice. Um, and I think we're going to
01:03:45 voice. Um, and I think we're going to see more of that. Now, one obvious thing
01:03:47 see more of that. Now, one obvious thing
01:03:47 see more of that. Now, one obvious thing that will happen is at some point in the
01:03:49 that will happen is at some point in the
01:03:49 that will happen is at some point in the future when when we naturally die, our
01:03:53 future when when we naturally die, our
01:03:53 future when when we naturally die, our digital essence will live in the cloud.
01:03:56 digital essence will live in the cloud.
01:03:56 digital essence will live in the cloud. Yeah.
01:03:56 Yeah.
01:03:56 Yeah. And it will know what we knew at the
01:03:58 And it will know what we knew at the
01:03:58 And it will know what we knew at the time and you can ask it a question.
01:04:00 time and you can ask it a question.
01:04:00 time and you can ask it a question. Yeah.
01:04:00 Yeah.
01:04:00 Yeah. So, can you imagine asking Einstein,
01:04:02 So, can you imagine asking Einstein,
01:04:02 So, can you imagine asking Einstein, going back to Einstein,
01:04:04 going back to Einstein,
01:04:04 going back to Einstein, what did you really think about,
01:04:06 what did you really think about,
01:04:06 what did you really think about, you know, this other guy,
01:04:08 you know, this other guy,
01:04:08 you know, this other guy, you know, did you actually like him or
01:04:09 you know, did you actually like him or
01:04:09 you know, did you actually like him or were you just being polite with him with
01:04:11 were you just being polite with him with
01:04:11 were you just being polite with him with letters?
01:04:11 letters?
01:04:11 letters? Yeah.
01:04:11 Yeah.
01:04:11 Yeah. Right. Um, and in all those sort of
01:04:14 Right. Um, and in all those sort of
01:04:14 Right. Um, and in all those sort of famous contests that we study as
01:04:16 famous contests that we study as
01:04:16 famous contests that we study as students,
01:04:16 students,
01:04:16 students, can you imagine be able to ask the, you
01:04:19 can you imagine be able to ask the, you
01:04:19 can you imagine be able to ask the, you know, the people
01:04:20 know, the people
01:04:20 know, the people Yeah.
01:04:21 Yeah.
01:04:21 Yeah. Today, you know, with today's
01:04:23 Today, you know, with today's
01:04:23 Today, you know, with today's retrospective, what did you really
01:04:24 retrospective, what did you really
01:04:24 retrospective, what did you really think? I know that the education example
01:04:26 think? I know that the education example
01:04:26 think? I know that the education example you gave earlier is so much more
01:04:28 you gave earlier is so much more
01:04:28 you gave earlier is so much more compelling when you're talking to Isaac
01:04:29 compelling when you're talking to Isaac
01:04:29 compelling when you're talking to Isaac Newton or Albert Einstein instead of
01:04:31 Newton or Albert Einstein instead of
01:04:31 Newton or Albert Einstein instead of just a
01:04:33 just a
01:04:33 just a but you know it's so it's so
01:04:35 but you know it's so it's so
01:04:35 but you know it's so it's so this is coming back to the V3 in the
01:04:37 this is coming back to the V3 in the
01:04:37 this is coming back to the V3 in the movies when the one of the first
01:04:39 movies when the one of the first
01:04:39 movies when the one of the first companies we incubated out of MIT course
01:04:41 companies we incubated out of MIT course
01:04:41 companies we incubated out of MIT course advisor we sold it to Don Graham and the
01:04:43 advisor we sold it to Don Graham and the
01:04:43 advisor we sold it to Don Graham and the Washington Post and then so I was
01:04:44 Washington Post and then so I was
01:04:44 Washington Post and then so I was working for him for a year after that
01:04:47 working for him for a year after that
01:04:47 working for him for a year after that and the conception was here's the
01:04:48 and the conception was here's the
01:04:48 and the conception was here's the internet here's the newspaper let's move
01:04:50 internet here's the newspaper let's move
01:04:50 internet here's the newspaper let's move the newspaper onto the internet we'll
01:04:51 the newspaper onto the internet we'll
01:04:51 the newspaper onto the internet we'll call it washingtonost.com
01:04:53 call it washingtonost.com
01:04:53 call it washingtonost.com and if you look hit where it ended up,
01:04:55 and if you look hit where it ended up,
01:04:55 and if you look hit where it ended up, you know, today with Meta, Tik Tok,
01:04:58 you know, today with Meta, Tik Tok,
01:04:58 you know, today with Meta, Tik Tok, YouTube didn't end up anything like the
01:05:00 YouTube didn't end up anything like the
01:05:00 YouTube didn't end up anything like the newspaper moves to the internet.
01:05:02 newspaper moves to the internet.
01:05:02 newspaper moves to the internet. So now here's V3, here are movies. You
01:05:05 So now here's V3, here are movies. You
01:05:05 So now here's V3, here are movies. You can definitely make a long form movie
01:05:06 can definitely make a long form movie
01:05:06 can definitely make a long form movie much more
01:05:07 much more
01:05:08 much more cheaply. But I just had this experience
01:05:10 cheaply. But I just had this experience
01:05:10 cheaply. But I just had this experience of somebody that I know is a complete
01:05:13 of somebody that I know is a complete
01:05:13 of somebody that I know is a complete this director will try and make a
01:05:14 this director will try and make a
01:05:14 this director will try and make a tearjerker by leading me down a two-hour
01:05:16 tearjerker by leading me down a two-hour
01:05:16 tearjerker by leading me down a two-hour long path. But I can get you to that
01:05:18 long path. But I can get you to that
01:05:18 long path. But I can get you to that same emotional state in about five
01:05:19 same emotional state in about five
01:05:20 same emotional state in about five minutes if it's personalized to you.
01:05:22 minutes if it's personalized to you.
01:05:22 minutes if it's personalized to you. Well, one of the things that's happened
01:05:24 Well, one of the things that's happened because of the addictive nature of the internet is we've lost, um, sort of the deep state of reading.
01:05:31 Mhm.
01:05:33 So, I was walking around and I saw a Borders, sorry, a Barnes & Noble bookstore. Big. Oh my god, my old home is back, and I went in and I felt good. But it's a very fond memory. The fact of the matter is that people's attention spans are shorter. They consume things quicker. One of the interesting things about sports is that the sports highlights business is a huge business, licensed clips around highlights, because it's more efficient than watching the whole game. So, I suspect that if you're with your buddies and you want to be drinking and so forth, you put the game on, that's fine. But if you're a busy person, and you're busy with whatever you're busy with, and you want to know what happened with your favorite team, the highlights are good enough.
01:06:13 Yeah. You have four panes of it going at the same time, too.
01:06:16 And so this is again a change, and it's a more fundamental change to attention.
01:06:18 Mhm.
01:06:20 I work with a lot of 20-somethings in research, and one of the questions I had is how they do research in the presence of all of these stimulations, and I can answer the question definitively: they turn off their phone.
01:06:34 Yeah.
01:06:39 You can't think deeply as a researcher with this thing buzzing. And remember that part of the industry's goal was to fully monetize your attention.
01:06:46 Yeah.
01:06:49 Right. Aside from sleeping, and we're working on having you have less sleep, I guess from stress, we've essentially tried to monetize all of your waking hours with something: some form of ads, some form of entertainment, some form of subscription. That is completely antithetical to the way humans traditionally work with respect to long, thoughtful examination of principles, the time that it takes to be a good human being. These are in conflict right now. There are various attempts at this. So, you know, my favorite are these digital apps that make you relax. Okay. The correct thing to do to relax is to turn off your phone, right? And then relax in the traditional way, as in, you know, 70,000 years of human existence.
01:07:31 Yeah. Yeah. I had an incredible experience. I'm doing the flight from MIT to Stanford all the time. And, you know, like you said, attention spans are getting shorter and shorter and shorter; the TikTok extreme, you know, the clips are so short. This particular flight was my first time brainstorming with Gemini for six hours straight, and I completely lost track of time. I'm trying to figure out a circuit design and chip design for inference-time compute, and it's so good at brainstorming with me and bringing back data, as long as the Wi-Fi on the plane is working. Time went by. So that was my first experience with technology that went the other direction.
01:08:06 But notice that you also were not responding to texts and annoyances. You weren't reading ads. You were deep inside of a system for which you paid a subscription.
01:08:16 Mhm.
01:08:17 So if you look at the deep research stuff, one of the questions I have when you do a deep research analysis: I was looking at factory automation for something. Where is the boundary of factory automation versus human automation? It's an area I don't understand very well; it's a very, very deep technical set of problems. I didn't understand it. It took 12 minutes or so to generate this paper. Twelve minutes of these supercomputers is an enormous amount of time. What is it doing? Right. And the answer, of course, is that the product is fantastic.
01:08:46 Yeah. You know, to Peter's question earlier, too, I keep the Google IPO prospectus in my bathroom up in Vermont. It's 2004. I've read it probably 500 times. But I don't know if you remember... It's getting a little ratty, actually.
01:08:59 You're the only person besides me who did the same. I read it 500 times because I had to. It was legally required.
01:09:08 Well, I still read it, um, because of the misconceptions; it's just such a great learning experience. But even before the IPO, if you think back, you know, there was this big debate: will it be ad revenue, will it be subscription revenue, will it be paid inclusion, will the ads be visible, and all this confusion about how you're going to make money with this thing. Now, the internet moved to almost entirely ad revenue. But if you look at the AI models, you know, you've got your $20, now $200, subscription, and people are signing up like crazy. So, you know, it's ultra, ultra convincing. Is that going to be a form of ad revenue, where it convinces you to buy something, or no? Is it going to be subscription revenue, where people pay a lot more and there's no advertising at all?
01:09:48 No, but you have this with Netflix. There was this whole discussion about how you would fund movies through ads, and the answer is you don't; you have a subscription. And the Netflix people looked at having free movies, without a subscription and advertising-supported, and the math didn't work. So I think both will be tried. I think the fact of the matter is that deep research, at least at the moment, is going to be chosen by the well-to-do or for professional tasks. You are capable of spending that $200 a month; a lot of people cannot afford it. And that free service, remember, is the stepping stone for that young person, man or woman, who just needs that access. My favorite story there is that when I was at Google, I went to Kenya, and Kenya is a great country, and I was with this computer science professor, and he said, "I love Google." I said, "Well, I love Google, too." And he goes, "Well, I really love Google." I said, "I really love Google, too." And I said, "Why do you really love Google?" He said, "Because we don't have textbooks." And I thought, the top computer science program in the nation does not have textbooks.
01:10:52 Yeah. Well, let me, uh, let me jump in on a couple of things here. Uh, Eric, in the next few years, what moats actually exist for startups as AI is coming in and disrupting? Uh, do you have a list?
01:11:11 Yes, I'll give you a simple answer.
01:11:12 And what do you look for in the companies that you're investing in?
01:11:16 So first, in the deep tech hardware stuff, there are going to be patents, filings, inventions, you know, the hard stuff. Those things are much slower than the software industry in terms of growth, and they're just as important. You know, power systems, all those robotic systems we've been waiting for a long time. It's just slower; for all sorts of reasons, hardware is hard.
01:11:36 Hardware is hard for those reasons.
01:11:38 In software, it's pretty clear to me it's going to be really simple. Software is typically a network-effect business where the fastest mover wins. The fastest mover is the fastest learner in an AI system. So what I look for is a company where they have a loop. Ideally, they have a couple of learning loops. I'll give you a simple learning loop: as you get more people, more people click, and you learn from their clicks; they express their preferences. So let's say I invent a whole new consumer thing, which I don't have an idea for right now, but imagine I did. And furthermore, say I don't know anything about how consumers behave, but I'm going to launch this thing. The moment people start using it, I'm going to learn from them, and I'll have instantaneous learning to get smarter about what they want. So I start from nothing. If my learning slope is this, I'm essentially unstoppable. I'm unstoppable because my learning advantage, by the time my competitor figures out what I've done, is too great.
01:12:42 Yeah.
01:12:45 Now, how close can my competitor be and still lose? The answer is a few months.
01:12:48 Mhm.
01:12:50 Because the slopes are exponential.
01:12:51 Mhm.
01:12:54 And so it's likely to me that there will be another 10 fantastic Google-scale, Meta-scale companies. They'll all be founded on this principle of learning loops. And when I say learning loops, I mean in the core product, solving the current problem as fast as you can. If you cannot define the learning loop, you're going to be beaten by a company that can define it.
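A minimal sketch of the click-feedback learning loop described above, assuming a toy setting (the three variants, their rates, and the epsilon-greedy policy are illustrative choices, not anything from the conversation): the product serves a variant, observes whether the user clicked, and updates its estimates on every single interaction, so the service with more users learns its customers' preferences sooner.

```python
import random

class ClickLearningLoop:
    """Toy learning loop: serve a product variant, observe a click,
    update the estimates immediately. More users means faster learning."""

    def __init__(self, n_variants: int, epsilon: float = 0.1) -> None:
        self.epsilon = epsilon              # how often we explore at random
        self.clicks = [0] * n_variants      # clicks observed per variant
        self.shows = [0] * n_variants       # times each variant was served

    def choose(self) -> int:
        # Mostly exploit the best-known variant, occasionally explore.
        if random.random() < self.epsilon or not any(self.shows):
            return random.randrange(len(self.shows))
        rates = [c / s if s else 0.0 for c, s in zip(self.clicks, self.shows)]
        return max(range(len(rates)), key=rates.__getitem__)

    def record(self, variant: int, clicked: bool) -> None:
        # The learning step: every interaction refines the estimates.
        self.shows[variant] += 1
        self.clicks[variant] += int(clicked)

def simulate(loop: ClickLearningLoop, true_rates: list[float], users: int) -> float:
    """Run `users` interactions and return the overall click-through rate."""
    total = 0
    for _ in range(users):
        v = loop.choose()
        clicked = random.random() < true_rates[v]
        loop.record(v, clicked)
        total += clicked
    return total / users

if __name__ == "__main__":
    random.seed(0)
    true_rates = [0.02, 0.05, 0.12]   # hidden user preferences for three variants
    users = 20_000

    learner = ClickLearningLoop(len(true_rates))
    print(f"learning-loop CTR over {users} users: {simulate(learner, true_rates, users):.3f}")

    # Same traffic with the loop disabled: always pick at random, never use the clicks.
    baseline = ClickLearningLoop(len(true_rates), epsilon=1.0)
    print(f"no-learning baseline CTR:             {simulate(baseline, true_rates, users):.3f}")
    # The gap between those two numbers is the advantage that accrues to whoever
    # starts the loop first; a rival launching later begins down at the baseline
    # while the incumbent is already operating near the top line.
```

The point is the shape of the loop (serve, observe, update on every interaction), not this particular bandit; in a real product the update step would feed ranking, recommendation, or pricing models rather than three hard-coded variants, and the "a few months is fatal" claim is about that compounding at consumer scale.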
01:13:17 And you said 10 Meta- or Google-sized companies. Do you think there will also be a thousand? Like, if you look at the enterprise software business, you know, Oracle on down, PeopleSoft, whatever, there are thousands of those. Or will they all consolidate into those 10 that are domain-dominant learning loop companies?
01:13:35 Um, I think I'm largely speaking about consumer scale, because that's where the real growth is. The problem with learning loops is that if your customer is not ready for you, you can only learn at a certain rate. So it's probably the case that the government is not interested in learning, and therefore there's no growth in a learning loop serving the government. I'm sorry to say that needs to get fixed.
01:13:57 Yeah.
01:13:59 Um, educational systems are largely regulated and run by the unions and so forth; they're not interested in innovation. They're not going to be doing any learning. I'm sorry to say that has to get fixed. So the ones where there's a very fast feedback signal are the ones to watch. Another example: it's pretty obvious that you can build a whole new stock trading company where, if you get the algorithms right, you learn faster than everyone else, and scale matters. So in the presence of scale and fast learning loops, that's the moat. Now, I don't know that there are many others there. You do have...
01:14:33 You think brand would be a moat?
01:14:37 Uh, brand matters, but less so. What's interesting is that people seem to be perfectly willing now to move from one thing to the other, at least in the digital world. And there's a whole new set of brands that have emerged that everyone is using, that are, you know, the next generations, that I haven't even heard of.
01:14:52 Within those learning loops, do you think domain-specific synthetic data is a big advantage?
01:14:58 Well, the answer is whatever causes faster learning. There are applications where you have enough training data from humans. There are applications where you have to generate the training data from what the humans are doing. Right? So you could imagine a situation where you had a learning loop with no humans involved, where it's monitoring something, some sensors, but because you learn faster on those sensors, you get so smart you can't be replaced by another sensor management company. That's the way to think about it.
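The "learning loop with no humans involved" case can be sketched the same way, under assumed details that are not from the conversation: the only training signal is the sensor stream itself. The model predicts the next reading from the last few and nudges its weights on every new sample, so whoever has been watching more of the stream, for longer, sits lower on the error curve.

```python
import math
import random

class StreamingSensorModel:
    """Toy self-supervised loop over a sensor stream: predict the next
    reading from the last few, then update on the error. No human labels."""

    def __init__(self, window: int = 4, lr: float = 0.01) -> None:
        self.window = window
        self.lr = lr
        self.weights = [0.0] * window
        self.history: list[float] = []

    def predict(self) -> float:
        if len(self.history) < self.window:
            return 0.0
        recent = self.history[-self.window:]
        return sum(w * x for w, x in zip(self.weights, recent))

    def observe(self, reading: float) -> float:
        """Consume one reading: score the prediction made before seeing it,
        then take one gradient step on that error."""
        prediction = self.predict()
        error = reading - prediction
        if len(self.history) >= self.window:
            recent = self.history[-self.window:]
            for i, x in enumerate(recent):
                self.weights[i] += self.lr * error * x
        self.history.append(reading)
        return error ** 2

def fake_sensor(t: int) -> float:
    # Stand-in for a real sensor: a slow oscillation plus noise.
    return math.sin(t / 20.0) + random.gauss(0.0, 0.05)

if __name__ == "__main__":
    random.seed(1)
    model = StreamingSensorModel()
    errors = [model.observe(fake_sensor(t)) for t in range(5_000)]
    print(f"mean squared error, first 500 readings: {sum(errors[:500]) / 500:.4f}")
    print(f"mean squared error, last 500 readings:  {sum(errors[-500:]) / 500:.4f}")
    # The error falls purely from watching the stream; an operator who starts
    # observing the same sensors later is simply earlier on this curve.
```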
01:15:27 So what about the capital for the learning loop? Like, because, um, do you know Daniela Rus, who runs CSAIL?
01:15:33 So, Daniela and I are really good friends.
01:15:35 We've been talking to our governor, Maura Healey, who's one of the best governors in the world.
01:15:37 I agree. So there's a problem in our academic systems where the big companies have all the hardware, because they have all the money, and the universities do not have the money for even reasonably sized data centers. I was with one university where, after lots of meetings, they agreed to spend $50 million on a data center, which gets them less than a thousand GPUs, right, for the entire campus and all the research.
01:16:02 Yeah.
01:16:03 And that doesn't even include the terabytes of storage and so forth.
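A rough, purely illustrative piece of arithmetic behind the "$50 million buys fewer than a thousand GPUs" remark; every dollar figure below is an assumption chosen for the sketch, not a quoted or official price.

```python
# Back-of-the-envelope sketch: why a $50M university data-center budget can
# end up below 1,000 GPUs once everything around the accelerator is counted.
# Every figure is an illustrative assumption, not a quoted or official price.

BUDGET = 50_000_000          # the $50M commitment from the anecdote

gpu_price = 35_000           # assumed cost of one H100-class accelerator
server_share = 10_000        # assumed per-GPU share of CPUs, RAM, chassis
network_share = 5_000        # assumed per-GPU share of switches and cabling
facility_share = 8_000       # assumed per-GPU share of power, cooling, racks

all_in_per_gpu = gpu_price + server_share + network_share + facility_share
gpu_count = BUDGET // all_in_per_gpu

print(f"assumed all-in cost per GPU: ${all_in_per_gpu:,}")
print(f"GPUs the budget covers: about {gpu_count:,}")   # roughly 860 with these numbers
```

Move any of the assumed overheads up or down and the count shifts, but it stays in the hundreds, before storage, staff, and the electricity bill, which is the point of the anecdote.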
01:16:06 terabytes of storage and so forth. So I
01:16:06 terabytes of storage and so forth. So I and others are working on this as a
01:16:07 and others are working on this as a
01:16:07 and others are working on this as a philanthropic matter. The government is
01:16:09 philanthropic matter. The government is
01:16:09 philanthropic matter. The government is going to have to come in with more money
01:16:12 going to have to come in with more money
01:16:12 going to have to come in with more money for universities for this kind of stuff.
01:16:15 for universities for this kind of stuff.
01:16:15 for universities for this kind of stuff. That is among the best investment. When
01:16:17 That is among the best investment. When
01:16:17 That is among the best investment. When I was young, I was on a National Science
01:16:19 I was young, I was on a National Science
01:16:19 I was young, I was on a National Science Foundation scholarship for and by the
01:16:21 Foundation scholarship for and by the
01:16:22 Foundation scholarship for and by the way, I made $15,000 a year. Uh the
01:16:24 way, I made $15,000 a year. Uh the
01:16:24 way, I made $15,000 a year. Uh the return to the nation of my that $15,000
01:16:27 return to the nation of my that $15,000
01:16:27 return to the nation of my that $15,000 has been very good, shall we say, based
01:16:29 has been very good, shall we say, based
01:16:29 has been very good, shall we say, based on the taxes that I pay and the jobs
01:16:31 on the taxes that I pay and the jobs
01:16:31 on the taxes that I pay and the jobs that we have created.
01:16:32 that we have created.
01:16:32 that we have created. So core question. So glad you
01:16:34 So core question. So glad you
01:16:34 So core question. So glad you so so creating so creating an ecosystem
01:16:37 so so creating so creating an ecosystem
01:16:37 so so creating so creating an ecosystem for the next generation to have the
01:16:39 for the next generation to have the
01:16:40 for the next generation to have the access to the systems is important. It's
01:16:43 access to the systems is important. It's
01:16:43 access to the systems is important. It's not obvious to me that they need
01:16:45 not obvious to me that they need
01:16:45 not obvious to me that they need billions of dollars.
01:16:47 billions of dollars.
01:16:47 billions of dollars. It's pretty obvious to me that they need
01:16:50 It's pretty obvious to me that they need
01:16:50 It's pretty obvious to me that they need a million dollars, $2 million. Yeah,
01:16:52 a million dollars, $2 million. Yeah,
01:16:52 a million dollars, $2 million. Yeah, that's the goal.
01:16:53 that's the goal.
01:16:53 that's the goal. Yeah.
01:16:53 Yeah.
01:16:54 Yeah. I want to I want to take a I want to
01:16:55 I want to I want to take a I want to
01:16:55 I want to I want to take a I want to take us in a direction of uh of uh
01:16:58 take us in a direction of uh of uh
01:16:58 take us in a direction of uh of uh wrapping up on super intelligence and
01:17:00 wrapping up on super intelligence and
01:17:00 wrapping up on super intelligence and the book.
01:17:01 the book.
01:17:02 the book. Um,
01:17:03 Um,
01:17:03 Um, we didn't finish the timeline on super
01:17:05 we didn't finish the timeline on super
01:17:05 we didn't finish the timeline on super intelligence and I think it's important
01:17:06 intelligence and I think it's important
01:17:06 intelligence and I think it's important to give people a sense of how quickly
01:17:09 to give people a sense of how quickly
01:17:09 to give people a sense of how quickly the self-reerential learning can get and
01:17:11 the self-reerential learning can get and
01:17:11 the self-reerential learning can get and how rapidly we can get to something, you
01:17:15 how rapidly we can get to something, you
01:17:15 how rapidly we can get to something, you know, a thousand times, a million, a
01:17:16 know, a thousand times, a million, a
01:17:16 know, a thousand times, a million, a billion times more capable than a human.
01:17:20 billion times more capable than a human.
01:17:20 billion times more capable than a human. On the flip side of that, Eric, when I
01:17:22 On the flip side of that, Eric, when I
01:17:22 On the flip side of that, Eric, when I look at my greatest concerns when we get
01:17:25 look at my greatest concerns when we get
01:17:26 look at my greatest concerns when we get through this 5 to sevenyear period of
01:17:30 through this 5 to sevenyear period of
01:17:30 through this 5 to sevenyear period of uh let's just say rogue actors and
01:17:33 uh let's just say rogue actors and
01:17:33 uh let's just say rogue actors and stabilization and such. Uh one of the
01:17:36 stabilization and such. Uh one of the
01:17:36 stabilization and such. Uh one of the biggest concerns I have is the
01:17:39 biggest concerns I have is the
01:17:39 biggest concerns I have is the diminishment of human purpose. Mhm.
01:17:41 diminishment of human purpose. Mhm.
01:17:41 diminishment of human purpose. Mhm. Um, you know, you wrote uh in the book
01:17:45 Um, you know, you wrote uh in the book
01:17:45 Um, you know, you wrote uh in the book uh and I've listened to it uh haven't
01:17:48 uh and I've listened to it uh haven't
01:17:48 uh and I've listened to it uh haven't read it physically and my kids say you
01:17:50 read it physically and my kids say you
01:17:50 read it physically and my kids say you don't read anymore.
01:17:51 don't read anymore.
01:17:51 don't read anymore. You you listen to books you don't read.
01:17:53 You you listen to books you don't read.
01:17:53 You you listen to books you don't read. But um you said the real risk is not
01:17:56 But um you said the real risk is not
01:17:56 But um you said the real risk is not terminator, it's drift. Um you argue
01:17:58 terminator, it's drift. Um you argue
01:17:58 terminator, it's drift. Um you argue that AI won't destroy human uh humanity
01:18:01 that AI won't destroy human uh humanity
01:18:01 that AI won't destroy human uh humanity violently, but might slowly erode human
01:18:04 violently, but might slowly erode human
01:18:04 violently, but might slowly erode human values, autonomy, and judgment if left
01:18:06 values, autonomy, and judgment if left
01:18:06 values, autonomy, and judgment if left unregulated, misunderstood.
01:18:09 unregulated, misunderstood.
01:18:09 unregulated, misunderstood. So it's really a Wall-E like future
01:18:11 So it's really a Wall-E like future
01:18:11 So it's really a Wall-E like future versus a a Star Trek boldly go out
01:18:14 versus a a Star Trek boldly go out
01:18:14 versus a a Star Trek boldly go out there.
01:18:15 there.
01:18:15 there. We're very in the book and my own
01:18:17 We're very in the book and my own
01:18:17 We're very in the book and my own personal view is it's very important
01:18:19 personal view is it's very important
01:18:19 personal view is it's very important that human agency be protected.
01:18:23 that human agency be protected.
01:18:23 that human agency be protected. Yeah.
01:18:23 Yeah.
01:18:23 Human agency means the ability to get up in the day and do what you want, subject to the law. Right? And it's perfectly possible that these digital devices can create a form of a virtual prison where you don't feel that you as a human can do what you want. Right? That is to be avoided. I'm not worried about that case. I'm more worried about the case that, if you want to do something, it's just so much easier to ask your robot or your AI to do it for you. The human spirit that wants to overcome a challenge, I mean, the unchallenged life... that's so critical.
01:19:00 But there will always be new challenges.
01:19:02 Uh, when I was a boy, one of the things that I did is I would repair my father's car. Right? I don't do that anymore. When I was a boy I used to mow the lawn. I don't do that anymore.
01:19:13 Sure.
01:19:13 Right. So there are plenty of examples of things that we used to do that we don't need to do anymore. But there'll be plenty of things. Just remember, the complexity of the world that I'm describing is not a simple world. Just managing the world around you is going to be a full-time and purposeful job, partly because there will be so many people fighting for misinformation and for your attention, and there's obviously lots of competition and so forth. There's lots of things to worry about. Plus, you have all of the people, you know, trying to get your money, create opportunities, deceive you, what have you. So I think human purpose will remain, because humans need purpose. That's the point. And you know, there's lots of literature that the people who have what we would consider to be low-paying, worthless jobs enjoy going to work. So the challenge is not to get rid of their job; it's to make their job more productive using AI tools. They're still going to go to work. And, to be very clear, this notion that we're all going to be sitting around doing poetry is not happening. Right? In the future there'll be lawyers. They'll use tools to have even more complex lawsuits against each other, right? There will be evil people who will use these tools to create even more evil problems. There will be good people who will be trying to deter the evil people. The tools change, but the structure of humanity, the way we work together, is not going to change.
01:20:38 Peter and I were on Michael Saylor's yacht a couple months ago, and I was complaining that the curriculum is completely broken in all these schools. But what I meant was we should be teaching AI. And he said, "Yeah, they should be teaching aesthetics." And I looked at him, I'm like, "What the hell are you talking about?" He said, "No, in the age of AI, which is imminent, look at everything around you, whether it's good or bad, enjoyable, not enjoyable; it's all about designing aesthetics." When the AI is such a force multiplier that you can create virtually anything, what are you creating and why? And that becomes the challenge.
01:21:10 If you look at Wittgenstein and the sort of theories of all of this stuff, it is all fundamental. We're having a conversation that America has, about tasks and outcomes. It's our culture. But there are other aspects of human life: meaning, thinking, reasoning. We're not going to stop doing that. So imagine if your purpose in life in the future is to figure out what's going on, and to be successful, just figuring that out is sufficient. Because once you've figured it out, it's taken care of for you.
01:21:41 That's beautiful.
01:21:42 Right? That provides purpose.
01:21:44 Yeah.
01:21:44 Um, it's pretty clear that robots will take over an awful lot of mechanical or manual work. Um, and for people who like to, you know... I like to repair the car. I don't do it anymore. I miss it, but I have other things to do with my time.
01:21:58 Yeah.
01:21:59 Take me forward. When do you see what you define as digital super intelligence?
01:22:06 Uh, within 10 years.
01:22:07 Within 10 years. And what do people need to know about that? What do people need to understand and sort of prepare themselves for, either as a parent or as an employee or as a CEO?
01:22:23 One way to think about it is that when digital super intelligence finally arrives and is generally available and generally safe, you're going to have your own polymath. So you're going to have the sum of Einstein and Leonardo da Vinci in the equivalent of your pocket. I think thinking about how you would use that gift is interesting. And of course evil people will become more evil, but the vast majority of people are good.
01:22:53 Yes, they're well-meaning, right?
01:22:55 So going back to your abundance argument, there are people who've studied the notion of productivity increases, and they believe that you can get, we'll see, up to 30% year-over-year economic growth through abundance and so forth. That's a very wealthy world. That's a world of much less disease, many more choices, much more fun, if you will, right? Just taking all those poor people and lifting them out of the daily struggle they have, that is a great human goal. Let's focus on that. That's the goal we should have.
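As a rough aside on scale: 30% year-over-year growth compounds dramatically. The minimal sketch below only takes the 30% rate from the conversation above; the horizons shown and the starting value normalized to 1.0 are illustrative assumptions, not figures from the discussion.

```python
# Back-of-envelope sketch of how 30% year-over-year growth compounds.
# The 30% rate is the figure cited in the conversation; the horizons
# and the normalized starting value are illustrative assumptions.

def compound_growth(start: float, rate: float, years: int) -> float:
    """Return the value of `start` after compounding at `rate` for `years` years."""
    return start * (1.0 + rate) ** years

if __name__ == "__main__":
    output = 1.0   # normalized starting output (assumption)
    rate = 0.30    # 30% year-over-year growth, as cited
    for years in (5, 10, 20):
        print(f"After {years:2d} years: {compound_growth(output, rate, years):7.1f}x")
    # Roughly 3.7x after 5 years, ~13.8x after 10, ~190x after 20,
    # versus only about 1.5x-1.8x after 20 years at a more typical 2-3% rate.
```

That compounding gap, roughly 13.8x in a decade versus low single-digit multiples under historical growth rates, is the scale behind the "very wealthy world" framing, if the claimed rate were sustained.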
01:23:28 Does GDP still have meaning in that world?
01:23:31 If you include services, it does. Um, one thing about manufacturing (and everyone's focused on trade deficits) is that people don't understand that the vast majority of modern economies are service economies, not manufacturing economies. And if you look at the percentage of farming, it went from roughly 98% to roughly 2 or 3% in America over a hundred years. If you look at manufacturing, the heyday in the '30s and '40s and '50s, those percentages are now down, well lower than 10%. It's not because we don't buy stuff; it's because the stuff is automated. You need fewer people. There's plenty of people working in other jobs. So again, look at the totality of the society: is it healthy?
01:24:10 If you look at China, it's easy to complain about them. Um, they now have deflation. They have a term, it's called "lying down," where people stay at home. They don't participate in the workforce, which is counter to their traditional culture. If you look at reproduction rates, these countries that are essentially having no children, that's not a good thing.
01:24:30 Yeah.
01:24:31 Right. Those are problems that we're going to face. Those are the new problems of the age.
01:24:35 I love that. Eric, uh, so grateful for your time. Thank you.
01:24:41 Thank you both. Um, I love your show.
01:24:44 Yeah. Thank you, buddy.
01:24:45 Thank you.
01:24:45 Okay. Thank you, guys.
01:24:47 If you could have had a 10-year head start on the dot-com boom back in the 2000s, would you have taken it? Every week, I track the major tech metatrends. These are massive, game-changing shifts that will play out over the decade ahead, from humanoid robotics to AGI, quantum computing, energy breakthroughs, and longevity. I cut through the noise and deliver only what matters to our lives and our careers. I send out a Metatrend newsletter twice a week as a quick, two-minute read over email. It's entirely free. These insights are read by founders, CEOs, and investors behind some of the world's most disruptive companies. Why? Because acting early is everything. This is for you if you want to see the future before it arrives and profit from it. Sign up at dmandis.com/atrends and be ahead of the next tech bubble. That's dmmand.com/metats.
01:25:39 [Music]