1 00:00:00,000 --> 00:00:04,000 You're listening to Daily Shit.
2 00:00:09,000 --> 00:00:11,000 The Daily Shitikas are with you.
3 00:00:11,000 --> 00:00:13,000 And you already know that.
4 00:00:13,000 --> 00:00:17,000 So, the Daily Shit I want to talk to you about today.
5 00:00:17,000 --> 00:00:22,000 Once again, again and again, I'm dragging you into the abyss of artificial intelligence.
6 00:00:22,000 --> 00:00:25,000 And there are actually three things here.
7 00:00:25,000 --> 00:00:33,000 First, I've seen how artificial intelligence has changed the way I use computers and the Internet.
8 00:00:33,000 --> 00:00:42,000 Just before we recorded, I showed you how I make pictures for our coolest course on engineering management.
9 00:00:42,000 --> 00:00:45,000 Huh? How's that for an ad integration?
10 00:00:45,000 --> 00:00:47,000 Yes, that's how I warmed you up, damn it.
11 00:00:49,000 --> 00:00:51,000 Such a thoughtful one, you know?
12 00:00:51,000 --> 00:00:53,000 Soft, soft.
13 00:00:53,000 --> 00:00:54,000 There you go.
14 00:00:55,000 --> 00:00:57,000 If you don't sign up for the course, you'll regret it.
15 00:00:57,000 --> 00:00:59,000 That's all I'm saying.
16 00:00:59,000 --> 00:01:00,000 Okay.
17 00:01:00,000 --> 00:01:06,000 So yes, I'm still generating, well, we're generating some graphic material together.
18 00:01:06,000 --> 00:01:08,000 And how did I do it before?
19 00:01:08,000 --> 00:01:12,000 To find some picture, I'd go looking somewhere on the Internet.
20 00:01:12,000 --> 00:01:16,000 I'm not talking about charts and so on, just a picture that sort of...
21 00:01:16,000 --> 00:01:18,000 You mean Google Images.
22 00:01:18,000 --> 00:01:20,000 Yes, Google Images.
23 00:01:20,000 --> 00:01:22,000 And you're hunting for that picture.
24 00:01:22,000 --> 00:01:24,000 Those pictures are very often protected by some law.
25 00:01:24,000 --> 00:01:26,000 These pictures are very often...
26 00:01:26,000 --> 00:01:28,000 The wrong size.
27 00:01:28,000 --> 00:01:30,000 The wrong size.
28 00:01:30,000 --> 00:01:32,000 And the watermarks are huge.
29 00:01:32,000 --> 00:01:34,000 In short, they're very hard to use.
30 00:01:34,000 --> 00:01:38,000 When we make YouTube videos, we use our own pictures.
31 00:01:38,000 --> 00:01:42,000 We have subscriptions to various stock image sites where you can get them legally.
32 00:01:42,000 --> 00:01:45,000 But it's just a fucking nightmare.
33 00:01:45,000 --> 00:01:49,000 And now I'm using Midjourney.
34 00:01:49,000 --> 00:01:52,000 And I just tell it, dude, here we go.
35 00:01:52,000 --> 00:01:56,000 Imagine, then five or ten words, and paint this thing for me.
36 00:01:56,000 --> 00:02:02,000 Three or four minutes, and I have a picture I can already put on a slide.
37 00:02:02,000 --> 00:02:09,000 And of course there are all sorts of, you know, image glitches, bugs, eight fingers, that kind of thing.
38 00:02:09,000 --> 00:02:12,000 But in general, people don't care.
39 00:02:12,000 --> 00:02:13,000 It's just a presentation.
40 00:02:13,000 --> 00:02:15,000 Well, and stylistically they're all consistent, yes.
41 00:02:15,000 --> 00:02:20,000 Yes, I mean, you can make a different style, you can configure anything there.
42 00:02:20,000 --> 00:02:30,000 And I realized that I use artificial intelligence, well, like a colleague, or rather, honestly, like a slave.
43 00:02:30,000 --> 00:02:32,000 No matter how that sounds.
44 00:02:32,000 --> 00:02:41,000 And now a lot of scandals are coming from artists over the fact that artificial intelligence uses a huge number of pictures for training.
45 00:02:41,000 --> 00:02:44,000 And very often those can be pictures by these artists.
46 00:02:44,000 --> 00:02:47,000 So you've decided to drag that Twitter shitstorm over here?
47 00:02:47,000 --> 00:02:48,000 No, no, no.
48 00:02:48,000 --> 00:02:49,000 Wait.
49 00:02:49,000 --> 00:02:51,000 I'm just telling you.
50 00:02:51,000 --> 00:02:52,000 Yes.
51 00:02:52,000 --> 00:02:57,000 And the ethics of it all, it seems to me, isn't really settled, even legally.
52 00:02:57,000 --> 00:03:02,000 Because, well, no artist can prove that their pictures are being used.
53 00:03:02,000 --> 00:03:05,000 But at the same time, everyone is shouting that their pictures are being used.
54 00:03:05,000 --> 00:03:07,000 Maybe that's the case.
55 00:03:07,000 --> 00:03:10,000 I'm not a judge here, and it's for a judge to decide.
56 00:03:10,000 --> 00:03:13,000 But I realized that I'm using this thing.
57 00:03:13,000 --> 00:03:18,000 At the same time, I sometimes go to ChatGPT, like when I told you we were talking about those werewolves.
58 00:03:18,000 --> 00:03:21,000 Well, I actually asked it a question.
59 00:03:21,000 --> 00:03:31,000 And I sat there thinking, well, ChatGPT has data, but you don't know whether it's real data or hallucinations.
60 00:03:31,000 --> 00:03:33,000 Is the data verified or not verified?
61 00:03:33,000 --> 00:03:35,000 Or is it a ChatGPT hallucination?
62 00:03:35,000 --> 00:03:37,000 Because it doesn't understand what it's talking about.
63 00:03:37,000 --> 00:03:38,000 It doesn't give a shit.
64 00:03:38,000 --> 00:03:40,000 It doesn't give a shit at all what you're asking.
65 00:03:40,000 --> 00:03:46,000 But does it change my life whether I know how many sharks there are?
66 00:03:46,000 --> 00:03:49,000 How many people sharks kill per year or not?
67 00:03:49,000 --> 00:03:51,000 Or where that number came from?
68 00:03:51,000 --> 00:03:58,000 If I google it, I'll get a million data points, and I'll use whichever looks more or less real.
69 00:03:58,000 --> 00:04:01,000 But overall, my life won't change at all.
70 00:04:01,000 --> 00:04:05,000 That's how I see myself using it.
71 00:04:05,000 --> 00:04:15,000 But this week, or maybe last week, depending on when we release this podcast, there were two events.
72 00:04:15,000 --> 00:04:27,000 At one of them, John Hennessy, or Jack Hennessy, I honestly don't remember, but he is one of the pioneers of technology in the world.
73 00:04:27,000 --> 00:04:29,000 He won the Turing Award.
74 00:04:29,000 --> 00:04:31,000 He was at a conference.
75 00:04:31,000 --> 00:04:42,000 And he said, well, guys, the way artificial intelligence is developing, it seems, as he puts it, that the technological singularity is about to arrive.
76 00:04:42,000 --> 00:04:43,000 Yes.
77 00:04:43,000 --> 00:04:44,000 Yes.
78 00:04:44,000 --> 00:04:46,000 And what is the technological singularity, anyway?
79 00:04:46,000 --> 00:04:52,000 It's the moment when humanity creates a computer that can create a better computer without people.
80 00:04:52,000 --> 00:04:58,000 That is, essentially, the moment when we create our own technological civilization.
81 00:04:58,000 --> 00:05:02,000 And he said that earlier we thought it was 40-50 years away.
82 00:05:02,000 --> 00:05:05,000 Now we think it's 10-20 years.
83 00:05:05,000 --> 00:05:12,000 And indeed, I've seen it for myself: I've been using artificial intelligence for the last four months.
84 00:05:12,000 --> 00:05:15,000 And it has progressed so much in those four months.
85 00:05:15,000 --> 00:05:17,000 It's a nightmare.
86 00:05:17,000 --> 00:05:21,000 It's scary.
87 00:05:21,000 --> 00:05:23,000 Well, listen, it's logical.
88 00:05:23,000 --> 00:05:25,000 It just does one thing, and it keeps developing.
89 00:05:25,000 --> 00:05:37,000 If we didn't sleep, shit, eat, drink, watch The Last of Us, just waste time being dumb, maybe we would have developed that fast too.
90 00:05:37,000 --> 00:05:39,000 Well, I don't think so.
91 00:05:39,000 --> 00:05:40,000 Well, I understand.
92 00:05:40,000 --> 00:05:41,000 But I mean...
93 00:05:41,000 --> 00:05:48,000 I mean, not only does it have more data, it's been aimed at this the whole time.
94 00:05:48,000 --> 00:05:50,000 So what did you expect, that it wouldn't develop?
95 00:05:50,000 --> 00:05:57,000 Well, the sheer speed of it, for me, as a techie, well, it's impressive.
96 00:05:57,000 --> 00:06:00,000 It just blows my mind.
97 00:06:00,000 --> 00:06:04,000 Listen, I've already accepted that, well, this is it.
98 00:06:04,000 --> 00:06:05,000 This is it.
100 00:06:06,000 --> 00:06:10,000 I still remember your words when you told me before the war that we live in the best world.
101 00:06:10,000 --> 00:06:12,000 We live in the best time.
102 00:06:12,000 --> 00:06:14,000 And that's such shit.
103 00:06:14,000 --> 00:06:15,000 Ouch.
104 00:06:15,000 --> 00:06:16,000 Well, listen.
105 00:06:16,000 --> 00:06:18,000 Okay, people keep poking us about it in the comments.
106 00:06:18,000 --> 00:06:20,000 The funniest thing is that there's even a video of it.
107 00:06:20,000 --> 00:06:21,000 It's friendly fire.
108 00:06:21,000 --> 00:06:22,000 I don't even know where it is.
112 00:06:25,000 --> 00:06:26,000 Friendly fire.
114 00:06:27,000 --> 00:06:28,000 I'll say it one more time.
115 00:06:28,000 --> 00:06:29,000 You understand?
116 00:06:29,000 --> 00:06:34,000 I'm surprised they didn't make a meme out of it, where you're sitting in the CPZP going, we live in the best time.
117 00:06:34,000 --> 00:06:35,000 Bioprobe.
121 00:06:38,000 --> 00:06:39,000 No, wait.
122 00:06:39,000 --> 00:06:41,000 I'll tell you something even worse.
123 00:06:41,000 --> 00:06:47,000 I understand that, well, this is it.
124 00:06:47,000 --> 00:06:49,000 I'm just starting to get used to it.
125 00:06:49,000 --> 00:06:50,000 You're waiting for it?
126 00:06:50,000 --> 00:06:52,000 I'm not saying I'm waiting for it.
127 00:06:52,000 --> 00:06:58,000 But I mean, I don't really think about my grandchildren, let's say.
128 00:06:58,000 --> 00:07:00,000 Well, that's it.
129 00:07:00,000 --> 00:07:02,000 That's what I think.
130 00:07:02,000 --> 00:07:04,000 The war with the robots is their problem.
131 00:07:04,000 --> 00:07:06,000 Well, like, all these...
132 00:07:06,000 --> 00:07:09,000 Well, the forecasts don't really matter.
133 00:07:09,000 --> 00:07:10,000 You know?
134 00:07:10,000 --> 00:07:21,000 On top of that, well, with the pandemic it's unclear whether it was a warm-up for what awaits us or not.
135 00:07:21,000 --> 00:07:28,000 With the war it's also unclear whether it's already the third world war or not, or what it all turns into.
136 00:07:28,000 --> 00:07:29,000 It's not clear.
137 00:07:29,000 --> 00:07:32,000 Then there's the climate, you know.
138 00:07:32,000 --> 00:07:35,000 Everything is falling apart, shifting, flooding.
139 00:07:35,000 --> 00:07:37,000 You don't know what's coming there either.
140 00:07:37,000 --> 00:07:38,000 The earth is shaking.
141 00:07:38,000 --> 00:07:40,000 So nothing is good there either.
142 00:07:40,000 --> 00:07:42,000 And now Skynet on top of it all.
143 00:07:42,000 --> 00:07:43,000 In short, I...
144 00:07:43,000 --> 00:07:44,000 You know, I cook my dumplings.
145 00:07:44,000 --> 00:07:45,000 I don't deny myself anything.
146 00:07:45,000 --> 00:07:46,000 You see?
147 00:07:46,000 --> 00:07:47,000 I'm staying...
148 00:07:47,000 --> 00:07:48,000 Oh, and one more thing.
149 00:07:48,000 --> 00:07:49,000 Wait.
150 00:07:49,000 --> 00:07:50,000 And then it turns out The Last of Us is about those fucking mushrooms.
163 00:08:02,000 --> 00:08:07,000 If we hadn't made that podcast about mushrooms with you, and I didn't know that there's
164 00:08:07,000 --> 00:08:09,960 a damn mushroom network underground, I might have exhaled a little.
165 00:08:09,960 --> 00:08:12,260 But now I'm realizing that the report...
166 00:08:12,260 --> 00:08:18,920 And on top of recording that mushroom podcast with you, we ran ads in it,
167 00:08:18,920 --> 00:08:20,200 there were dietary supplements.
168 00:08:20,200 --> 00:08:21,200 Yes.
169 00:08:21,200 --> 00:08:22,200 There was cordyceps.
170 00:08:22,200 --> 00:08:23,200 There was cordyceps.
171 00:08:23,200 --> 00:08:25,500 So we'll be the first to see the ads.
172 00:08:25,500 --> 00:08:26,500 Anyway, I...
173 00:08:26,500 --> 00:08:27,640 Like I said, I've made my peace with it.
174 00:08:27,640 --> 00:08:28,640 I...
175 00:08:28,640 --> 00:08:30,100 What can I do?
176 00:08:30,100 --> 00:08:31,600 It's not like I hate stoicism.
177 00:08:31,600 --> 00:08:41,600 That's it. Let it paint slides for us. I don't mind.
178 00:08:41,600 --> 00:08:47,600 And now, to add to the mood...
179 00:08:47,600 --> 00:08:51,600 God, I feel like I'm missing something.
180 00:08:51,600 --> 00:09:08,600 Eric Schmidt at a conference criticized the Pentagon and the United States of America for making too little use of artificial intelligence in weapons.
181 00:09:08,600 --> 00:09:15,600 He basically quoted Einstein, who wrote to the President of the United States, I don't remember which one,
182 00:09:15,600 --> 00:09:20,600 saying, dude, there's a new weapon, it's called nuclear,
183 00:09:20,600 --> 00:09:21,600 and it will change the world.
184 00:09:21,600 --> 00:09:24,600 It will change every kind of war.
185 00:09:24,600 --> 00:09:27,600 And now we see that nuclear weapons really have changed everything.
186 00:09:27,600 --> 00:09:33,600 Russia can do whatever it wants, and everyone just goes, oh, oh, oh, that's not nice.
187 00:09:33,600 --> 00:09:38,600 America can't even recognize them as a terrorist state.
188 00:09:38,600 --> 00:09:44,600 But he said the same thing can now be said about artificial intelligence.
189 00:09:44,600 --> 00:09:50,600 He said artificial intelligence should be used in weapons, in drones, in everything,
190 00:09:50,600 --> 00:09:55,600 because it will be one of the main weapons of the future.
191 00:09:55,600 --> 00:10:00,600 And this, fuck, isn't coming from some senator, isn't coming from some politician.
192 00:10:00,600 --> 00:10:02,600 This is Eric Schmidt saying it.
193 00:10:02,600 --> 00:10:05,600 Eric Schmidt was the CEO of Google.
194 00:10:05,600 --> 00:10:08,600 This is a man who knows his technology.
195 00:10:08,600 --> 00:10:10,600 He has an in-house...
196 00:10:10,600 --> 00:10:17,600 He has a company that helps government agencies with all kinds of technical problems.
197 00:10:17,600 --> 00:10:19,600 And he has a hell of a lot.
198 00:10:19,600 --> 00:10:26,600 He has a hell of a lot of influence he can use to push this thing through.
199 00:10:26,600 --> 00:10:33,600 He says, I came to the Pentagon and told them, you suck at machine learning, and you can't afford to suck.
200 00:10:33,600 --> 00:10:36,600 You have to be good at machine learning.
201 00:10:36,600 --> 00:10:39,600 And I just thought that...
202 00:10:39,600 --> 00:10:40,600 Oh, that's scary.
203 00:10:40,600 --> 00:10:41,600 Well, really.
204 00:10:41,600 --> 00:10:48,600 Right now, with artificial intelligence, we use Starlink, we use a lot of connectivity to run things, drones.
205 00:10:48,600 --> 00:10:53,600 Some tanks, some guided weapons.
206 00:10:53,600 --> 00:10:55,600 Planes.
207 00:10:55,600 --> 00:11:02,600 In America, I read a statistic today that they have fighter jets that have already flown 17 hours.
208 00:11:02,600 --> 00:11:04,600 Not much, but they did fly.
209 00:11:04,600 --> 00:11:07,600 Under the control of artificial intelligence.
210 00:11:07,600 --> 00:11:12,600 So if we take A, how we use artificial intelligence,
211 00:11:12,600 --> 00:11:14,600 B, how quickly it develops,
212 00:11:14,600 --> 00:11:16,600 and, ironically, the more we develop artificial intelligence, the faster it develops.
215 00:11:19,600 --> 00:11:24,600 And here Eric Schmidt says: friends, use artificial intelligence in weapons.
216 00:11:24,600 --> 00:11:31,600 And I'm, you know, maybe I'm no futurist, of course, but...
217 00:11:31,600 --> 00:11:35,600 But you start to believe in Vanga.
218 00:11:35,600 --> 00:11:40,600 I don't believe in Vanga yet, but I've already thought about KISS, you know.
219 00:11:40,600 --> 00:11:41,600 Go hide.
220 00:11:41,600 --> 00:11:43,600 There is no Internet.
221 00:11:43,600 --> 00:11:45,600 You won't get me, artificial intelligence, shit.
222 00:11:45,600 --> 00:11:46,600 There is no Internet in Russia.
225 00:11:48,600 --> 00:11:50,600 Because, let's take Ukraine, right?
226 00:11:50,600 --> 00:11:53,600 People in Transcarpathia, I don't know if their lives have changed.
227 00:11:53,600 --> 00:11:58,600 I don't even know if they had much to worry about.
228 00:11:58,600 --> 00:12:01,600 Probably, depending on which part of Transcarpathia.
229 00:12:01,600 --> 00:12:09,600 I know that in Uzhhorod, in the big cities, there is of course mobilization, and people are worried.
230 00:12:09,600 --> 00:12:11,600 And over by Hungary too.
231 00:12:11,600 --> 00:12:14,600 But for those up in the mountain pastures, I think it's fine.
232 00:12:14,600 --> 00:12:15,600 Well, I'd put it that way.
233 00:12:15,600 --> 00:12:17,600 I also think that those up in the mountain pastures...
234 00:12:17,600 --> 00:12:18,600 I understand.
235 00:12:18,600 --> 00:12:28,600 I'm sure that for my beloved friend from the food experiment, well, her lifestyle hasn't changed.
236 00:12:28,600 --> 00:12:29,600 Well.
237 00:12:29,600 --> 00:12:37,600 But I really think that, maybe, going off to those herdsmen isn't a bad idea at all.
238 00:12:37,600 --> 00:12:40,600 Maybe she's living a wonderful life out there on her own.
239 00:12:40,600 --> 00:12:41,600 Yes, yes.
240 00:12:41,600 --> 00:12:44,600 I'll tell you, globalization turned out to be very overrated.
241 00:12:45,600 --> 00:12:52,600 Meanwhile, we're making a podcast together that people listen to, thanks to technology, in many parts of the world.
242 00:12:52,600 --> 00:12:55,600 And globalization still turned out to be very overrated.
243 00:12:55,600 --> 00:13:03,600 Listen, I don't know if our podcast is a good argument in defense of globalization.
244 00:13:03,600 --> 00:13:07,600 So let's not.
245 00:13:07,600 --> 00:13:13,600 But this whole topic of artificial intelligence, of how it seeps into everything.
246 00:13:13,600 --> 00:13:24,600 Listen, you know, before this episode I wasn't afraid of artificial intelligence in the military.
247 00:13:24,600 --> 00:13:26,600 And now I'm very afraid.
248 00:13:26,600 --> 00:13:28,600 I remember when I told you...
249 00:13:28,600 --> 00:13:30,600 You'll have to pour another drink.
250 00:13:30,600 --> 00:13:37,600 I told you my conspiracy theory about the existence of some very advanced artificial intelligence that we don't know about.
251 00:13:37,600 --> 00:13:39,600 And it created Bitcoin to get people to start...
252 00:13:39,600 --> 00:13:40,600 Yes.
253 00:13:40,600 --> 00:13:41,600 Oh, I'm sure.
254 00:13:41,600 --> 00:13:42,600 I'm sure.
255 00:13:42,600 --> 00:13:45,600 And no one knows where Bitcoin came from.
256 00:13:45,600 --> 00:13:47,600 Well, yes, that Satoshi, Satoshi, Khuyoshi.
257 00:13:47,600 --> 00:13:48,600 Nakamoto.
258 00:13:48,600 --> 00:13:49,600 I think so, yes.
259 00:13:49,600 --> 00:13:51,600 And I'm like...
260 00:13:51,600 --> 00:13:52,600 Can I?
262 00:13:53,600 --> 00:13:54,600 Do you have it?
264 00:13:55,600 --> 00:13:56,600 And do you understand what the deal is?
265 00:13:56,600 --> 00:14:04,600 That artificial intelligence could have raked in money on Bitcoin, and now it's spending that money.
266 00:14:04,600 --> 00:14:06,600 And then Bitcoin crashes.
267 00:14:06,600 --> 00:14:08,600 Oh, that's already...
268 00:14:08,600 --> 00:14:10,600 That's going too far.
269 00:14:10,600 --> 00:14:11,600 That's already...
270 00:14:11,600 --> 00:14:15,600 That's the next level of conspiracy theory.
271 00:14:15,600 --> 00:14:16,600 Well, so what?
272 00:14:16,600 --> 00:14:18,600 What, aren't we the experts here?
273 00:14:20,600 --> 00:14:22,600 So it all adds up, Dima.
274 00:14:22,600 --> 00:14:24,600 It all adds up.
275 00:14:24,600 --> 00:14:25,600 It's all simple.
276 00:14:25,600 --> 00:14:29,600 As we said about conspiracy theories, everything is logical and everything fits together.
277 00:14:29,600 --> 00:14:31,600 And everything immediately becomes clear.
278 00:14:31,600 --> 00:14:34,600 So yes, in general, artificial intelligence.
279 00:14:34,600 --> 00:14:38,600 On the one hand, it's a little scary for me, you know.
280 00:14:38,600 --> 00:14:41,600 And on the other hand, it's a little fascinating.
281 00:14:41,600 --> 00:14:43,600 Where will it be in a year?
282 00:14:43,600 --> 00:14:44,600 Well, give or take.
283 00:14:44,600 --> 00:14:58,600 Well, I think, you know, by now the discussions about whether people are for it or against it, it seems to me, no, not seems, I know that time has long passed.
284 00:14:58,600 --> 00:15:03,600 That is, all that's left for us is to watch how it all ends.
285 00:15:03,600 --> 00:15:04,600 That's all.
286 00:15:04,600 --> 00:15:05,600 Well, because now...
287 00:15:05,600 --> 00:15:06,600 Well, I mean, now what?
288 00:15:06,600 --> 00:15:08,600 Say, for example, society decides that artificial intelligence shouldn't be allowed.
289 00:15:08,600 --> 00:15:09,600 That's it.
291 00:15:10,600 --> 00:15:11,600 Artificial intelligence is dangerous.
292 00:15:11,600 --> 00:15:12,600 And then what?
293 00:15:12,600 --> 00:15:13,600 What happens?
294 00:15:13,600 --> 00:15:14,600 Well, nothing, nobody will do anything.
295 00:15:14,600 --> 00:15:27,600 Well, the funniest thing is that Bill Gates once said that we need to limit artificial intelligence because artificial intelligence could destroy humanity.
296 00:15:27,600 --> 00:15:35,600 And here Microsoft is investing 10 billion dollars in OpenAI to integrate ChatGPT into its search engine.
297 00:15:35,600 --> 00:15:39,600 I'll just remind you that Bill Gates is no longer the CEO of Microsoft.
298 00:15:39,600 --> 00:15:41,600 But he is on the Board of Directors.
299 00:15:41,600 --> 00:15:47,600 And somewhere in there he signed off on that 10-billion-dollar investment with his own signature.
300 00:15:47,600 --> 00:15:51,600 Well, there you go.
301 00:15:51,600 --> 00:15:52,600 And what?
302 00:15:52,600 --> 00:15:53,600 What?
303 00:15:53,600 --> 00:15:54,600 We're in Freemason territory now.
304 00:15:54,600 --> 00:16:07,600 Listen, if you look at Bill Gates, there was an investigation showing he owns the most fertile lands on the planet.
305 00:16:07,600 --> 00:16:09,600 Through subsidiary companies.
306 00:16:09,600 --> 00:16:15,600 He owns almost 90% of all the farmland of the United States of America.
307 00:16:15,600 --> 00:16:17,600 All through subsidiary companies.
308 00:16:17,600 --> 00:16:18,600 And this is Bill Gates.
309 00:16:18,600 --> 00:16:20,600 And Bill Gates is very vocal out there.
310 00:16:20,600 --> 00:16:30,600 He said that Ukraine has the worst political leadership, the worst government.
311 00:16:30,600 --> 00:16:33,600 Oh, he wrote something again recently.
312 00:16:33,600 --> 00:16:34,600 I haven't even seen it.
313 00:16:34,600 --> 00:16:35,600 But the land there, you know.
314 00:16:35,600 --> 00:16:36,600 It's so good there.
315 00:16:36,600 --> 00:16:41,600 I thought, you bitch, don't you open your mouth about Ukrainian land.
316 00:16:41,600 --> 00:16:49,600 Well, as I understand it, the future is artificial intelligence and kolkhozes (collective farms).
317 00:16:49,600 --> 00:16:52,600 Artificial intelligence and kolkhozes.
318 00:16:52,600 --> 00:16:54,600 That's where we're heading.
319 00:16:54,600 --> 00:16:55,600 100%.
320 00:16:55,600 --> 00:17:03,600 You know, artificial intelligence will grow food on the kolkhozes to feed the damned leather bags until they die off.
321 00:17:03,600 --> 00:17:05,600 And all the leather bags will live like the Amish.
322 00:17:05,600 --> 00:17:06,600 No electricity.
323 00:17:06,600 --> 00:17:07,600 No artificial intelligence.
324 00:17:07,600 --> 00:17:08,600 Just robots walking around.
325 00:17:08,600 --> 00:17:09,600 Yes.
326 00:17:09,600 --> 00:17:22,600 Listen, well, it seems to me we're creating a technological civilization that will turn us into some kind of hamster zoo, and it will walk by and go, oh, look, these are our creators.
327 00:17:22,600 --> 00:17:23,600 You see how funny that looks.
328 00:17:23,600 --> 00:17:24,600 But yes, there's a lot of irony in life.
329 00:17:24,600 --> 00:17:25,600 Well, that's it.
342 00:17:37,600 --> 00:17:38,600 And that...
343 00:17:38,600 --> 00:17:39,600 That's it.
344 00:17:39,600 --> 00:17:40,600 ...was my Daily Shit-ika.
345 00:17:40,600 --> 00:17:41,600 Until tomorrow.