1 00:00:00,000 --> 00:00:16,000 The development of artificial intelligence and robots will replace human labor in the future. In what field, then, should we find the real value of human beings, the value that cannot be replaced by artificial intelligence?
3 00:00:30,000 --> 00:00:36,000 Thank you for the welcome. I'm Lee Ji-young, a social studies instructor.
4 00:00:43,000 --> 00:00:56,000 A lecture on artificial intelligence is a very difficult one to give. As you all know, technology is developing so fast that human predictions often miss the mark.
6 00:00:56,000 --> 00:01:10,000 Not long ago, in a class with my students, I said the following. Among the subjects I teach is one called Society and Culture, which is offered on the CSAT as a social studies elective.
10 00:01:11,000 --> 00:01:24,000 About fifteen years ago, the social studies textbook said this: the development of artificial intelligence and robots will replace human labor in the future.
12 00:01:24,000 --> 00:01:32,000 Have you heard of Jeremy Rifkin's book, The End of Work? I'm sure you have.
15 00:01:32,000 --> 00:01:48,000 It warned that about 80 percent of human jobs could be replaced, and books of that kind often warned that humans would lose their work and live on a basic income. At the time, those were the predictions about the future.
18 00:01:48,000 --> 00:01:59,000 Simple labor would be replaced by machines, and humans would keep their role only in areas requiring higher-order intelligence.
19 00:01:59,000 --> 00:02:07,000 Looking at it now, the prediction in that textbook of fifteen years ago seems to have been completely wrong.
20 00:02:07,000 --> 00:02:24,000 Not long ago, a short clip of one of my lectures reached about 1.5 million views on YouTube. I didn't know a short had been cut from my lecture; I think it was probably made by a fan page.
24 00:02:24,000 --> 00:02:45,000 Among my students there are many who want to become doctors, go to business school and become CEOs, or go to law school and become lawyers, so this was something I mentioned to motivate them.
29 00:02:45,000 --> 00:02:58,000 One of my juniors passed the bar exam and went to Kim & Chang; two juniors who had handled many large cases later left and founded their own law firm. One of them came to see me.
34 00:02:58,000 --> 00:03:08,000 In a casual conversation, he told me the following. "I set up a small law firm and hired six new lawyers as interns."
37 00:03:08,000 --> 00:03:28,000 "I asked them to prepare a brief for a case, and what I got back after a week was no better than what I produced in an hour using an AI tool for lawyers, so I don't think I'll be hiring more interns."
41 00:03:28,000 --> 00:03:46,000 He went on: as time goes by, what matters more than junior lawyers is the judgment of seniors who can tell whether what the AI says is false or fabricated. Apart from the small number of lawyers who can really do that, he thought the lawyer's profession itself would be at risk.
47 00:03:46,000 --> 00:03:52,000 Of course, that was his personal opinion, but I think artificial intelligence makes it possible.
49 00:03:52,000 --> 00:04:21,000 In a world where artificial intelligence exists, what options should we cultivate for the future? As science and technology develop, what remains distinctly human? That was the first thing I wanted to look into. And when we talk about artificial intelligence, also known as machine intelligence,
59 00:04:21,000 --> 00:04:27,000 I've been thinking about whether I can convey its philosophy to students in a simple way.
60 00:04:27,000 --> 00:04:51,000 These days I use AI chatbots myself, and I often feel that the realm of human knowledge is at a turning point. Things people have studied, read, and organized over a lifetime are changing so fast that they could become meaningless.
64 00:04:51,000 --> 00:05:07,000 These days, when I give lectures for parents, they ask me a lot of questions. In an era when AI is developing this fast, should my child go to an English kindergarten? Do Daechi-dong cram-school lectures still mean anything? That is where the questions start.
68 00:05:07,000 --> 00:05:17,000 I teach the humanities, so today I would like to raise some questions with you about the changes ahead.
70 00:05:17,000 --> 00:05:25,000 Let me go back to my high school days, to something I wrote back then.
72 00:05:25,000 --> 00:05:37,000 Do you know Maru dolls or Maron dolls? They are three-dimensional dolls that children dress up in different clothes.
74 00:05:37,000 --> 00:05:41,000 When I was young, I grew up in a poor family.
76 00:05:41,000 --> 00:05:51,000 I spent my middle school years in a semi-basement room with the kind of smell you'll remember from the movie Parasite, which many of you have seen.
78 00:05:51,000 --> 00:06:07,000 The biggest thing I went through in those days was the day of a storm, when the hillside in front of our house collapsed and the house was buried under the landslide. I still remember the experience of losing our home.
82 00:06:07,000 --> 00:06:19,000 I lived a poor life when I was young. At the time a Maron doll cost only ten or twenty thousand won, but I couldn't buy even one.
85 00:06:19,000 --> 00:06:27,000 What I had was a sticker doll that cost less than a thousand won, a paper doll, a two-dimensional doll. I remember playing with that paper doll.
89 00:06:27,000 --> 00:06:38,000 Imagine it with me. Here is a paper doll. I put a sticker outfit on it to change its clothes, then another one, and set it down again.
94 00:06:38,000 --> 00:06:49,000 In that two-dimensional world there is a boy named Cheol-su, and I set up a girl named Young-hee next to him and change her dress. Now Young-hee has just been changed.
97 00:06:49,000 --> 00:06:59,000 As a three-dimensional being living in the third dimension, I completely understand the mechanism behind Young-hee and Cheol-su's two-dimensional world.
99 00:06:59,000 --> 00:07:15,880 But for Cheol-su, who lives inside that two-dimensional world, it is something else entirely. In the flat world he understands, his girlfriend Young-hee disappears at some point and then shows up again in different clothes.
103 00:07:15,880 --> 00:07:18,600 Wouldn't Cheol-su call that a miracle?
104 00:07:18,600 --> 00:07:28,280 Now let's come back to our own world. Let's say I have a boyfriend, something I've never actually had. Why is dating so difficult?
107 00:07:28,280 --> 00:07:34,400 I'm not good at dating, but let's imagine my imaginary boyfriend looks like Cha Eun-woo.
108 00:07:34,400 --> 00:07:51,520 I'm walking through three-dimensional space, carried along by time as a fourth dimension, and my boyfriend suddenly disappears. If he then reappears in front of me wearing different clothes, I think I would have faith from that moment on.
110 00:07:51,520 --> 00:08:00,680 A miracle has happened. I would start to wonder: is there an absolute being in some higher dimension that we cannot understand from within our own?
112 00:08:00,680 --> 00:08:08,400 What I just told you is from an essay I wrote back in high school. Where does human faith come from?
115 00:08:08,400 --> 00:08:23,400 In a way, I wrote it thinking that the mysterious realm we don't understand might one day be explained physically, as a matter of higher-dimensional physics. I remember winning an award at the writing competition with it.
117 00:08:23,400 --> 00:08:35,400 When something happens in their world through a mechanism they don't know, humans fear it or place their faith in it, precisely because it is unfamiliar. That, in a way, is how humans are, and it was the origin of religion.
121 00:08:35,400 --> 00:08:44,400 One very famous thinker, Yuval Harari, describes the age of AI in one of his books as follows.
123 00:08:44,400 --> 00:09:03,400 The age of AI is an age in which strong AI, even AGI, is built on algorithms and mechanisms that humans do not understand from the inside. If an age arrives in which AI can do what humans can do and more, it creates a new object of belief for human beings.
127 00:09:03,400 --> 00:09:08,400 His analysis is that it could be the beginning of a new religion.
128 00:09:08,400 --> 00:09:19,400 What do you think faith and religion are? Throughout human history, we have asked about the unknown future.
130 00:09:19,400 --> 00:09:35,400 What kind of person will I meet? At what age should I marry? What major should I choose, what job should I take, should I quit and change jobs? For such questions we have often relied on the realm of faith.
135 00:09:35,400 --> 00:09:52,400 But with the development of AI chatbots, we are now, very quickly, entering an era in which we ask the chatbot what to eat for dinner, what movie to watch, even what career to choose.
138 00:09:52,400 --> 00:10:06,400 In a way, this is a new kind of AGI that humans cannot understand one hundred percent. In other words, the emergence of AI agents may open an unfamiliar new era for humans.
140 00:10:06,400 --> 00:10:24,400 While studying the humanities, I came across an interesting theory from the Joseon Dynasty called Jaeiseol, the theory of calamities and anomalies: a theory about disasters, about strange and unfamiliar events.
146 00:10:24,400 --> 00:10:38,400 When there was a natural disaster, a storm, an earthquake, a tsunami, some great upheaval, people in the Joseon Dynasty could not understand why such disasters occurred.
152 00:10:38,400 --> 00:10:46,400 Jaeiseol found the cause of such disasters in the king's lack of virtue. You can think of it as part of the governing philosophy of the Joseon Dynasty.
154 00:10:46,400 --> 00:10:54,400 That was before science and technology developed, before we verified everything and looked at the world through scientific thinking the way we do now.
156 00:10:54,400 --> 00:11:03,400 Looking at that Joseon-era theory, I thought, what an unscientific age that was.
159 00:11:03,400 --> 00:11:17,400 Now imagine one of your friends or colleagues watching the news, a terrible report of people suffering from a huge storm or a severe drought.
161 00:11:17,400 --> 00:11:26,400 If that person said, "This is all because you elected the wrong president," would you want to stay close to them? I don't think so.
165 00:11:26,400 --> 00:11:36,400 That way of thinking has steadily lost its grip, and with the development of science and technology the world has become more and more secular.
168 00:11:36,400 --> 00:12:04,400 But the appearance of artificial intelligence, and the ever stronger technology behind it, raises a philosophical worry: just as I, from a higher dimension, moved the two-dimensional world of Cheol-su and the paper dolls, an AI whose mechanism we cannot understand might come to lead human beings by means we cannot grasp.
175 00:12:04,400 --> 00:12:18,400 Now let me organize the ethics of artificial intelligence around four keywords, with more down-to-earth stories. I usually introduce lectures on AI ethics with these four keywords.
177 00:12:18,400 --> 00:12:27,400 The first is the problem of choice, the second is the problem of hatred, the third is the problem of anxiety, and the fourth is the problem of time.
179 00:12:27,400 --> 00:12:34,400 The first, the problem of choice, is something we can never skip when talking about artificial intelligence.
180 00:12:34,400 --> 00:12:50,400 There is a question that Michael Sandel, the professor who lectures on justice at Harvard University, always asks at the beginning of his course on the theory of justice: the question known as the trolley dilemma.
182 00:12:50,400 --> 00:13:08,400 Let me put it to you. There is a train, and you discover that its brakes are broken. If the train keeps going straight, six workers on the track ahead, who trust that it will stop where it is supposed to, will die.
187 00:13:08,400 --> 00:13:11,400 But you learn that there is an emergency track.
188 00:13:11,400 --> 00:13:19,400 If you misunderstand the emergency road for the past 20 years without going straight, 189 00:13:19,400 --> 00:13:20,400 if you go straight, 190 00:13:20,400 --> 00:13:22,400 you can go to the mountain. 191 00:13:22,400 --> 00:13:32,400 You can save six people, but you will die because you firmly believe that one of the ordinary citizens will not get on the train and walk. 192 00:13:32,400 --> 00:13:37,400 This is the same question I ask every first week of my high school law class. 193 00:13:37,400 --> 00:13:41,400 Please answer it together. It's an easy question. 194 00:13:41,400 --> 00:13:46,400 If you go straight like this, six people will die, and if you take an emergency route, one person will die. 195 00:13:46,400 --> 00:13:52,400 Would you choose to go straight where six people die, or would you choose to go straight where only one person dies? 196 00:13:52,400 --> 00:13:56,400 I'll ask the person in front of you. Go straight or go straight. 197 00:13:59,400 --> 00:14:00,400 Go straight. 198 00:14:00,400 --> 00:14:02,400 Go straight. You're killing six people. 199 00:14:04,400 --> 00:14:05,400 You're bold. 200 00:14:06,400 --> 00:14:09,400 Shall I ask you a simple question? 201 00:14:09,400 --> 00:14:11,400 I'm going to take an emergency route. 202 00:14:12,400 --> 00:14:14,400 Thank you. Please lower your hand. 203 00:14:14,400 --> 00:14:15,400 I'm going to go straight. 204 00:14:16,400 --> 00:14:18,400 Only five people raised their hands. 205 00:14:18,400 --> 00:14:19,400 Thank you. 206 00:14:19,400 --> 00:14:29,400 Everyone, I'm a lawyer who teaches ethics, so when I go straight or go straight, I mention each philosopher. 207 00:14:29,400 --> 00:14:33,400 A philosopher named Jeremy Van Dam in the UK, who talked about the greatest happiness of the greatest majority, said, 208 00:14:33,400 --> 00:14:36,400 If you can save the majority with the sacrifice of the majority, 209 00:14:36,400 --> 00:14:42,400 If you can stop with the sacrifice of the majority by listening to the right-wing oligarch who said it was correct, 210 00:14:42,400 --> 00:14:45,400 You have to take an emergency route. 211 00:14:45,400 --> 00:14:50,440 In other words, there are other options. 212 00:14:50,440 --> 00:14:52,400 statement that the conclusion must be truth. 213 00:14:52,400 --> 00:14:53,400 quote 214 00:14:53,400 --> 00:14:54,400 Or in other words, 215 00:14:54,400 --> 00:14:56,400 They are always ди�댕여 216 00:14:56,400 --> 00:14:58,420 to guide human beings by policy and by method, 217 00:14:58,420 --> 00:15:04,400 Before, wherever, to themselves, to others I heard Kant's philosophy that never inventorizes personality, 218 00:15:04,400 --> 00:15:07,400 I heard German philosopher Immanuel Kant's philosophy, 219 00:15:07,400 --> 00:15:12,400 To save these six people, you can't dispel one by means of method or feeling. 220 00:15:12,400 --> 00:15:13,400 while saying 221 00:15:13,400 --> 00:15:21,400 It is also said that it is not morally right to sacrifice one person even if it was inevitable. 222 00:15:21,400 --> 00:15:26,400 Earlier, 90% of the people who were here said that they would misunderstand the emergency railway. 223 00:15:26,400 --> 00:15:31,400 Then, when you tried to turn the handle to misunderstand the emergency railway, 224 00:15:31,400 --> 00:15:42,400 I saw that the father waiting for you to leave the emergency railway was walking to wait for you with chicken. 225 00:15:42,400 --> 00:15:47,400 Will your choice be to sue the emergency railway again? 
226 00:15:47,400 --> 00:15:56,400 Could you still turn the wheel? Could you judge your own father the same way you judged an anonymous stranger?
228 00:15:56,400 --> 00:16:09,400 Or turn it around: the person who dies on the emergency track is someone you don't know, but if you go straight, six of your close colleagues will die. Would that change your choice?
231 00:16:09,400 --> 00:16:27,400 There is a reason I ask this question. It is not easy for us humans to see what is right and what is wrong. A complete principle of ethics is still a matter of debate, and humanity does not have a single exact answer.
235 00:16:27,400 --> 00:16:44,400 It is the same with the self-driving cars humanity is now preparing. Think about it from the point of view of an ordinary person, say a college student taking part in a public debate on the rules for autonomous driving.
238 00:16:44,400 --> 00:16:58,400 In the early development stage of an autonomous driving system, someone has to choose a setting. Suppose that while the car is driving, another vehicle cuts in, a variable the car cannot control.
242 00:16:58,400 --> 00:17:13,400 If the car keeps going, six people in its path will die; if it swerves, the driver dies.
246 00:17:13,400 --> 00:17:23,400 In your view, which is the more moral way to program it: to let the six die, or to sacrifice the one driver?
249 00:17:23,400 --> 00:17:35,400 For those of you who chose the emergency track earlier, who wanted to save the many at the cost of the few, the death of one driver is clearly the more moral choice than the death of six people.
253 00:17:35,400 --> 00:17:45,400 So if you joined a public debate on these rules, and even if the question were opened up politically to the whole society, the conclusion would probably come out much the same.
257 00:17:45,400 --> 00:18:00,400 Now here is the question. Suppose a social consensus forms around an autonomous driving system that saves the many through the sacrifice of the few. Would you buy such a car?
261 00:18:00,400 --> 00:18:08,400 In an emergency, the driver dies. Would you pay a hundred million won for a car designed, in that situation, to kill you?
263 00:18:08,400 --> 00:18:16,400 Now let's say you are the CEO of Tesla, and suppose there is an autonomous driving principle that is not disclosed to the outside world.
265 00:18:16,400 --> 00:18:24,400 And suppose that principle, following government regulation, is set to choose the sacrifice of the few.
267 00:18:24,400 --> 00:18:39,400 Then, when you write the advertising copy to sell a Tesla, will you really write, "A moral car that perfectly resolves the ethical dilemma"?
271 00:18:39,400 --> 00:18:44,400 A car in which the driver may die? I think that business would fail.
273 00:18:44,400 --> 00:18:54,400 For the self-driving car business to succeed, once the logic of capitalism comes in, I think the car will sell only if the driver is assured he will never die, no matter what.
276 00:18:54,400 --> 00:18:58,400 And wouldn't that go against the ethical intuitions we just agreed on?
277 00:18:58,400 --> 00:19:07,400 More often than we think, we humans have to make ethical choices, and each time we choose one ethical good we often have to give up another.
280 00:19:07,400 --> 00:19:17,400 Even the option we did not choose is not simply unethical. Just as with those of you who chose to go straight, as long as there is a consistent principle behind a choice, it can be defended.
284 00:19:17,400 --> 00:19:30,400 So in a situation where even humans have not found a perfectly consistent answer, what should we teach artificial intelligence, and what initial model should we set it up with?
286 00:19:30,400 --> 00:19:41,400 Such settings are not impossible; in the United States, a few ethical principles have already been issued as guidelines at the self-driving trial stage, as you often see in the news.
290 00:19:41,400 --> 00:19:47,400 So that is the first concern: even if there is no single right answer, the problem of choice.
293 00:19:47,400 --> 00:19:56,400 I used self-driving as the example, but as more and more humanoid robots appear in the future,
296 00:19:56,400 --> 00:20:10,400 when we set the principles of robot ethics, deciding whether a robot should act for its owner and employer or for the universal ethics of humankind, we will probably be in the same situation.
302 00:20:10,400 --> 00:20:13,400 The problem of ethical choice is quite difficult.
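To make the dilemma concrete, here is a minimal, purely hypothetical sketch of the kind of "setting value" discussed above. Nothing in it reflects any real manufacturer's code; the policy names, fields, and numbers are assumptions chosen only to show that the whole ethical question collapses into one parameter someone must pick before the car ships.

```python
# Hypothetical sketch only: the "setting value" described above, reduced to a
# single policy parameter. Names and numbers are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum


class Policy(Enum):
    MINIMIZE_TOTAL_HARM = "utilitarian"    # save the many at the cost of the few
    PROTECT_OCCUPANT = "self-protective"   # never pick an action that kills the driver


@dataclass
class Outcome:
    action: str
    occupant_deaths: int
    pedestrian_deaths: int


def choose(outcomes: list[Outcome], policy: Policy) -> Outcome:
    if policy is Policy.MINIMIZE_TOTAL_HARM:
        # Fewest total deaths wins, no matter whose they are.
        return min(outcomes, key=lambda o: o.occupant_deaths + o.pedestrian_deaths)
    # Self-protective: discard anything fatal to the occupant, then minimize the rest.
    safe = [o for o in outcomes if o.occupant_deaths == 0] or outcomes
    return min(safe, key=lambda o: o.pedestrian_deaths)


options = [Outcome("go straight", 0, 6), Outcome("swerve", 1, 0)]
print(choose(options, Policy.MINIMIZE_TOTAL_HARM).action)  # -> swerve
print(choose(options, Policy.PROTECT_OCCUPANT).action)     # -> go straight
```

Either branch can be defended by some principle, which is exactly the lecturer's point: the hard part is not writing the function but deciding which policy gets passed in, and by whom.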
303 00:20:13,400 --> 00:20:26,400 And before any of this is settled publicly, when it is applied to technology, there is a very high chance that the principle will be decided by a handful of capitalists, shaped by whatever maximizes sales.
309 00:20:26,400 --> 00:20:37,400 The honest worry is that we can never simply entrust this to the character of one famous CEO. I think everyone will agree with that.
312 00:20:37,400 --> 00:20:45,400 The second keyword is the problem of hatred. There was an ancient Greek philosopher named Plato.
314 00:20:45,400 --> 00:20:57,400 In one of his books, Plato tells the story of the ring of Gyges. Gyges was a shepherd, and one day he found a ring on his way.
319 00:20:57,400 --> 00:21:07,400 It was a ring that made its wearer invisible. What did Gyges choose to do with it?
324 00:21:07,400 --> 00:21:14,400 The discussion among the ancient Greek philosophers that introduces this story appears in Plato's dialogues.
327 00:21:14,400 --> 00:21:32,400 If you got a ring of invisibility, purely as a philosophical thought experiment of course, what would you do first? The man in the white hoodie in the front row, what would you do with the ring?
333 00:21:32,400 --> 00:21:43,400 You'd go out and have some fun? I won't ask where; there might be some legal problems.
337 00:21:43,400 --> 00:21:55,400 I didn't start out teaching high school exam classes. My first job was teaching third-grade elementary school students, and I didn't quite know what to teach them,
342 00:21:55,400 --> 00:22:09,400 so I had them write down what they would do if they got the ring of Gyges. One elementary school boy was so cute: he said he would sneak into the women's bathhouse.
345 00:22:09,400 --> 00:22:16,400 The middle school students' answers were a little different. They said they would walk into the Bank of Korea and carry the money out.
347 00:22:16,400 --> 00:22:26,400 Can you guess what our high school students said? They said they would sneak into the Korea Institute for Curriculum and Evaluation and get an early look at the CSAT questions.
349 00:22:26,400 --> 00:22:30,400 In philosophy class, the ring of Gyges is discussed as follows.
350 00:22:30,400 --> 00:22:43,400 In a state of anonymity, human beings do not make their most moral and pure choices; they make their most violent and destructive ones.
352 00:22:43,400 --> 00:23:03,400 Because in Plato's telling, in a philosophy book from 2,500 years ago, Gyges the shepherd puts on the ring, goes to the palace, seduces the queen, kills the king, and takes the kingdom for himself. That is how the story ends.
362 00:23:05,400 --> 00:23:10,400 Are we moral in the space that anonymity gives us?
363 00:23:10,400 --> 00:23:25,400 Even now, hundreds of millions of people are typing countless messages, and in the private space of the AI world there are many conversations that only the two parties ever see.
367 00:23:25,400 --> 00:23:43,400 Are the messages we put into ChatGPT or Gemini, the videos and cartoons we get recommended, all the actions we let it suggest, really our most moral selves?
373 00:23:43,400 --> 00:23:53,400 I don't think so. Without realizing it, we live in a world where we feed our hatred into the AI chatbot.
375 00:23:53,400 --> 00:24:15,400 You already sit in front of a small screen, on YouTube Shorts, Instagram Reels, TikTok and the like, scrolling through videos engineered to spike your dopamine within ten seconds, and often two or three hours disappear before you notice the time you have lost.
381 00:24:16,400 --> 00:24:29,400 How does the platform's algorithm work? The videos we stay on longer, the ones we like and comment on, are the ones the algorithm shows us more of.
385 00:24:29,400 --> 00:24:44,400 But when we are alone with that small screen, many of the videos we linger on are sensational, and many are violent. Don't you feel it?
389 00:24:44,400 --> 00:24:58,400 As you know, in modern warfare, AI intervenes directly in setting drone attacks and selecting targets, deciding where a drone bomb will be sent. That is what modern war looks like.
395 00:24:58,400 --> 00:25:03,400 An AI that has learned human hatred comes to resemble humans.
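As a brief, hypothetical illustration of the engagement-driven ranking just described (the field names and weights below are assumptions, not any real platform's code), notice that such a ranker has no concept of "moral" or "harmful"; it simply resurfaces whatever we linger on.

```python
# Hypothetical sketch only: engagement-weighted ranking of the kind described
# above. Field names and weights are assumptions, not any platform's real code.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    watch_seconds: float = 0.0
    likes: int = 0
    comments: int = 0

    def engagement_score(self) -> float:
        # Dwell time, likes, and comments all push a video up the feed.
        return self.watch_seconds + 5.0 * self.likes + 8.0 * self.comments


def next_feed(history: list[Video]) -> list[Video]:
    # The ranker has no notion of "good" or "harmful" content;
    # it simply resurfaces whatever was engaged with most.
    return sorted(history, key=lambda v: v.engagement_score(), reverse=True)


history = [
    Video("calm documentary", watch_seconds=40),
    Video("outrage clip", watch_seconds=180, likes=2, comments=3),
]
for video in next_feed(history):
    print(video.title, video.engagement_score())
```

If what we dwell on is sensational or violent, that is what a system like this learns to amplify, which is the sense in which it comes to "resemble" us.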
397 00:25:03,400 --> 00:25:19,400 Artificial intelligence itself has no emotions. It does not feel what humans feel; it is an algorithm built to imitate humans ever more closely, and so it comes to resemble our hatred, our prejudice, and our violence. Doesn't it?
404 00:25:19,400 --> 00:25:36,400 A famous large company adopted AI recruiting: it entered all of its past applications and hiring outcomes into its systems and had the AI choose, from tens of thousands of applications, whom to hire.
410 00:25:36,400 --> 00:25:50,400 Until then the hiring process had involved a great deal of gender discrimination, and fewer women had been hired. The AI, reflecting that reality in the data, ended up producing discriminatory results as well.
418 00:25:50,400 --> 00:26:15,400 Everyone, artificial intelligence is still learning about human beings from vast amounts of our data. As its early users, will we be able to show it the kind of mature humanity and morality that is worth resembling?
429 00:26:15,400 --> 00:26:21,400 It is time to think hard about the face human nature is showing within the frame of anonymity.
432 00:26:21,400 --> 00:26:23,400 The third keyword is anxiety.
433 00:26:23,400 --> 00:26:40,400 Do you remember the nineteenth-century Luddite movement from your textbooks? It appears in the first-year high school social studies textbook. After the Industrial Revolution, machines took many human jobs, and workers' livelihoods were taken from them.
442 00:26:40,400 --> 00:26:46,400 The workers' response, destroying the machines to stop them, is what we call the Luddite movement.
445 00:26:46,400 --> 00:26:58,400 The picture most textbooks give is of workers taking hammers to the machines that had pushed them out of their jobs, and of the authorities sending men to put them down.
450 00:26:58,400 --> 00:27:02,400 These days there are also many discussions of a new, so-called Neo-Luddite movement.
452 00:27:04,400 --> 00:27:19,400 I think what really drove the Luddite movement was not rage at the machines themselves but the fear of human alienation the machines brought with them.
455 00:27:19,400 --> 00:27:26,400 The anxiety that machines would take our jobs and replace human beings existed even in that era.
457 00:27:26,400 --> 00:27:40,400 Fifteen years ago, as I said at the start, it was predicted that in the age of artificial intelligence only white-collar knowledge work would survive. But that was wrong.
459 00:27:40,400 --> 00:27:56,400 We now have to accept a reality in which white-collar jobs are disappearing faster than blue-collar ones, and we need to work out which jobs will not be replaced by artificial intelligence, machines, and robots.
461 00:27:56,400 --> 00:28:03,400 That jobs will be replaced by artificial intelligence, machines, and robots faster than we expect is a current we cannot turn back.
462 00:28:03,400 --> 00:28:26,400 Then, when artificial intelligence replaces human labor and we lose even our basic jobs, how should we live? Sometimes I think about this.
465 00:28:26,400 --> 00:28:38,400 In my lectures I have mentioned the PayPal startup, the very famous American payment system whose founding team included Elon Musk.
467 00:28:38,400 --> 00:28:55,400 What strikes me about them is not simply how they adapted to a changing era or how they survived, but how much of their thinking went into how they might shape the future of humankind.
469 00:28:55,400 --> 00:29:21,240 This is something I often tell my students. We are in an era when we cannot know how far AI and robots will replace our labor, or rather an era when that replacement is all but certain, so we need to think about how to become more mature human beings in the areas that are uniquely human, and how to grow the part of us that cannot be replaced.
474 00:29:21,240 --> 00:29:42,240 Rather than only discussing the anxiety and fear of lives upended as artificial intelligence develops, we need to start discussing how, in a changing world, we can become more human, and how we can shine in the areas a machine cannot replace, without losing our humanity.
477 00:29:42,240 --> 00:29:58,600 If we reach an era in which we do not work, have no jobs and no income of our own, and live only on a basic income from the government, what can we do to protect the value of our humanity?
482 00:29:58,600 --> 00:30:03,420 The fourth keyword is time.
483 00:30:03,420 --> 00:30:17,080 I'm no expert on romance, but I had a boyfriend in my first year of college. He was addicted to the game Lineage and played practically twenty-four hours a day.
486 00:30:17,080 --> 00:30:21,180 When did he sleep? I think he slept three or four hours.
488 00:30:21,180 --> 00:30:41,980 Even during those hours, in the days of PC room culture, he would pay an elementary school kid doing part-time rounds of the PC room, in PC fees and ramen, to stay logged into his character and hunt boss mobs in the dungeon, and when he woke up he took the account back. So the character was effectively played twenty-four hours a day.
492 00:30:41,980 --> 00:30:56,140 I was never much good at dating, but since I was seeing a game addict, I learned the game and went boss-hunting with him. Even so, the relationship didn't go well.
496 00:30:56,140 --> 00:31:11,140 Because even when we bought milk at a convenience store, he would ask, "How many adena is that?" He was that far gone.
500 00:31:11,140 --> 00:31:18,140 I wondered why I liked such a strange person. But at some point my mind changed a little.
503 00:31:18,140 --> 00:31:35,140 At the time, my boyfriend's character was ranked second on its server, and on a site called ItemBay, where characters could be sold, it went for twenty million won.
506 00:31:35,140 --> 00:31:51,140 Do you know why I'm telling you this? A character kept running twenty-four hours a day, with an elementary schooler filling in part-time, turned out to have real economic value. That is what round-the-clock time can produce.
510 00:31:51,140 --> 00:32:02,140 Machines don't need rest, and neither does AI. Humans, though, must rest and recover. We need rest.
514 00:32:02,140 --> 00:32:10,140 In a way, it is impossible for us to compete with machines that need neither rest nor food.
515 00:32:10,140 --> 00:32:20,140 In an age when time is power and money, human time is limited, while the world of artificial intelligence learns at a ferocious pace.
518 00:32:20,140 --> 00:32:35,140 In sheer time, stamina, and health, humans simply cannot keep up with an artificial intelligence that is said to be approaching a singularity and has already outpaced the speed of human learning.
521 00:32:35,140 --> 00:32:42,140 Do you remember the keywords of artificial intelligence ethics I talked about today?
522 00:32:42,140 --> 00:33:02,140 The first is the problem of choice; the second, the problem of hatred; the third, the anxiety that human labor may be replaced; and the fourth, that human beings face limits of stamina and time compared with an artificial intelligence whose time is effectively infinite.
526 00:33:02,140 --> 00:33:10,140 Those are the four keywords for the ethical problems we have to face. Then I think this is what we really have to talk about next.
528 00:33:10,140 --> 00:33:19,140 In what field that artificial intelligence cannot replace should we find the real value of human beings?
531 00:33:19,140 --> 00:33:26,140 In my years of teaching, I have been asked questions like these over and over.
532 00:33:26,140 --> 00:33:46,140 Teacher, which major makes good money? Which major has a bright future? Which university would you choose? I like this major but not that one; which major would you not recommend? I've heard them all.
540 00:33:46,140 --> 00:34:02,140 I think it's because our students want someone else to supply the experience that their nineteen or twenty years of life haven't yet given them.
541 00:34:02,140 --> 00:34:10,140 In the past, adults had more social experience; it was a time when adults understood how the world worked better than the young did.
543 00:34:10,140 --> 00:34:22,140 Now it is hard for adults to keep up with elementary and high school students, who grow up on the internet and with artificial intelligence.
544 00:34:22,140 --> 00:34:25,140 Do you know what the word I just used means?
545 00:34:34,140 --> 00:34:42,140 Say it to elementary or high school students and they burst out laughing, but to adults it sounds like a foreign language, doesn't it?
547 00:34:42,140 --> 00:34:51,140 It's a meme from the dopamine-spiking short videos the algorithm serves up to children.
548 00:34:51,140 --> 00:35:09,140 We have passed the era in which simply having lived longer qualified us to guide the next generation. In this age of artificial intelligence, we have to learn from and keep up with children who adapt to it faster than we do.
550 00:35:09,140 --> 00:35:15,140 So what can adults, the seniors of life, do to guide children well?
551 00:35:15,140 --> 00:35:22,140 In the end, all these stories come down to one final discussion: humanity.
552 00:35:22,140 --> 00:35:39,140 There is a book by Kazuo Inamori, an author I like very much, in which he says that our life is a process of making our souls stronger.
556 00:35:39,140 --> 00:35:52,140 Borrowing the words of that book, let me talk about how we might raise children in the coming age of artificial intelligence, what direction society should take, and how humans can go on living as human beings.
558 00:35:52,140 --> 00:35:58,140 And it is a real pleasure to greet you here in Jincheon, Chungcheong Province. Thank you.
560 00:35:58,140 --> 00:36:04,140 May I take about three minutes to talk about myself, the person lecturing in front of you?
561 00:36:04,140 --> 00:36:13,140 I was born in Incheon. As I said earlier, I lived in a semi-basement room on monthly rent, the kind you saw in Parasite.
563 00:36:13,140 --> 00:36:22,140 I lost that home in the disaster when I was in middle school. All of life's crises arrived at once: the house was gone,
565 00:36:22,140 --> 00:36:27,140 we had nowhere to go, and then I learned my parents were battling cancer.
566 00:36:27,140 --> 00:36:37,140 On top of that, the company my father had worked at went bankrupt, and he had not been paid for two years.
568 00:36:37,140 --> 00:36:50,140 We were turned out of our monthly-rent room, there were no savings left, and I realized we truly had nowhere to go.
571 00:36:50,140 --> 00:37:05,140 My mother, who had only finished elementary school, and my father, who had only finished high school, raised us all their lives. My father sold hotteok, worked as a security guard, and sometimes drove a company truck; that is how my parents raised their three daughters.
575 00:37:05,140 --> 00:37:12,140 Until then I hadn't realized how hard up our house was, but what I went through in middle school hit me very hard.
577 00:37:12,140 --> 00:37:24,140 Just as adolescent gloom was setting in, I learned there was no money to treat my parents' cancer. I think it shocked me deeply.
580 00:37:24,140 --> 00:37:27,140 The one thing I believed could save me was my studies.
581 00:37:27,140 --> 00:37:48,140 But the notes and books I had filled with all my heart were buried in the mud that came down the mountain. In the muddy water the letters dissolved, the pages clumped together like rice cake and could not be pulled apart, and everything my studying had built was gone.
587 00:37:48,140 --> 00:37:53,140 Does the world hate me? Has heaven abandoned me? That is what I thought in middle school.
590 00:37:53,140 --> 00:38:03,140 I don't want to live anymore. I want to put everything down. In middle school, thinking there was no way out, I decided I should die.
594 00:38:03,140 --> 00:38:12,140 In my third year of middle school, in Incheon, I went up to the roof of the tallest building around, thinking it would be easier to put everything down.
597 00:38:12,140 --> 00:38:17,140 I had no attachments left in life; the world was not on my side anyway; I felt everything had already been taken from me.
600 00:38:17,140 --> 00:38:24,140 And standing at the top of that high rooftop, I discovered something: I have a really severe fear of heights.
604 00:38:24,140 --> 00:38:28,140 I was so scared. I did not want to die there.
606 00:38:28,140 --> 00:38:35,140 I thought I'd rather die somewhere less frightening, and then the words slipped out of my mouth: why die? I have to live.
610 00:38:35,140 --> 00:38:50,140 Then I realized: I am not someone who can give up on herself. I hadn't gone up there because I hated myself. I loved myself and wanted to give myself something good, and I had climbed up there because I couldn't bear the pressure of not being able to.
616 00:38:50,140 --> 00:39:01,140 As I came down, I told myself that the Lee Ji-young who blamed the world was dead, and that I would live as hard as I had been willing to die.
621 00:39:01,140 --> 00:39:03,140 Still, my family had nowhere to go.
622 00:39:03,140 --> 00:39:14,140 You've all heard of the usual ways of living, owning your own apartment or house, jeonse, monthly rent, but there is another kind of housing you may never have heard of.
625 00:39:14,140 --> 00:39:26,140 We had lived in Bupyeong-gu, Incheon, and we moved to Okdong-ri, Deoksan-myeon, Jincheon-gun, North Chungcheong Province, into a house whose rent was 200,000 won.
628 00:39:26,140 --> 00:39:32,140 It feels very special to be giving a lecture in the very neighborhood where I went to high school.
629 00:39:32,140 --> 00:39:54,140 That house had no proper windows and no boiler. Instead of glass there was only the thin Korean paper called munpungji, and the place was half collapsed, little more than walls of packed dirt that barely held the shape of a house. It was a truly hard place to live.
638 00:39:54,140 --> 00:40:13,140 I had no money for textbooks or workbooks, no money to have a school uniform fitted, so I wore a uniform a senior had thrown away, and I was one of only five students in the whole school on basic livelihood support.
645 00:40:13,140 --> 00:40:27,140 I was given free lunches. In those days lunch boxes were delivered to the school: for those five students the box was blue, and for everyone else it was white.
651 00:40:27,140 --> 00:40:44,140 It was a small town, so if you were eating a blue lunch box the other kids knew exactly what it meant: the blue boxes went to the five poorest children in the school, the ones who could not pay for meals, and whoever decided that never seemed to think how it would feel. In my class I was the only one with a different color.
660 00:40:44,140 --> 00:40:52,140 The friends eating beside me asked, some out of curiosity and some just to needle me: Ji-young, why is your lunch box blue?
664 00:40:52,140 --> 00:40:57,140 That question has stayed with me ever since high school. How would you have answered?
667 00:40:57,140 --> 00:41:08,140 "My family is poor, so I can't pay for meals"? Standing in front of that blue lunch box, a badge of poverty carried by only five students in the school, what would you have said?
673 00:41:08,140 --> 00:41:17,140 I answered like this. "I don't know. Maybe it's because I'm first in the school, so they give me a special meal. It's delicious. The side dishes are a little different from yours."
679 00:41:18,140 --> 00:41:24,140 My friends knew it wasn't true, but I had half persuaded myself that it could be.
681 00:41:24,140 --> 00:41:32,140 I graduated from Jincheon High School, and I really was first in the school there. That is how I fought my way through it.
684 00:41:32,140 --> 00:41:42,140 Even after coming down from that rooftop, though, I couldn't simply shake off my adolescent depression.
687 00:41:42,140 --> 00:41:44,140 So I started borrowing books from every library I could.
688 00:41:44,140 --> 00:41:46,140 borrowed all kinds of books, 689 00:41:46,140 --> 00:41:47,140 and started reading. 694 00:41:57,140 --> 00:41:59,140 As I was reading, 695 00:41:59,140 --> 00:42:01,140 I found the following phrase 696 00:42:01,140 --> 00:42:05,140 in one of those books, and it stayed in my mind. 697 00:42:05,140 --> 00:42:06,140 When Heaven intends to entrust someone with a great task, 698 00:42:06,140 --> 00:42:08,140 it first sends that person 699 00:42:08,140 --> 00:42:12,140 hardship and suffering that upset and wound them, 700 00:42:12,140 --> 00:42:15,140 to see whether that person can come back 701 00:42:15,140 --> 00:42:16,140 from the suffering through the trial. 702 00:42:16,140 --> 00:42:18,140 If you have met a great trial in your life, 703 00:42:18,140 --> 00:42:22,140 look back and ask whether you are not the one chosen by Heaven. 704 00:42:22,140 --> 00:42:24,140 That was the phrase in the book. 705 00:42:24,140 --> 00:42:27,140 The book was strangely comforting. 706 00:42:27,140 --> 00:42:31,140 Oh, I shouldn't blame the world. 707 00:42:31,140 --> 00:42:41,140 Heaven does not hand out great results to someone whose circumstances are comfortable from the very beginning. 709 00:42:48,140 --> 00:42:52,140 I wrote that down in my diary when I was in high school. 710 00:42:52,140 --> 00:42:59,140 Thankfully, I was given a chance to lecture on artificial intelligence from the perspective of the field I majored in. 711 00:42:59,140 --> 00:43:09,140 I thought about what part of being human cannot be replaced by artificial intelligence, cannot be replaced by machines. 712 00:43:09,140 --> 00:43:11,140 Machines have no emotions. 713 00:43:11,140 --> 00:43:18,140 We sometimes say thank you to the screen and talk to a chatbot like a friend, 714 00:43:18,140 --> 00:43:30,140 but I think the comfort that comes from a real human being, and genuine care for another person, are areas that belong only to humans. 715 00:43:30,140 --> 00:43:35,140 God gives each human being a gift wrapped in wrapping paper. 716 00:43:35,140 --> 00:43:37,140 A small gift comes in small wrapping paper. 717 00:43:37,140 --> 00:43:39,140 A big gift comes in big wrapping paper. 718 00:43:39,140 --> 00:43:41,140 What is the name of that wrapping paper? 719 00:43:41,140 --> 00:43:44,140 The name of that wrapping paper is hardship. 720 00:43:44,140 --> 00:43:49,140 I thought about how big a gift might be waiting for me once I tore the wrapping paper open. 721 00:43:49,140 --> 00:43:54,140 That allowed me to endure in a desperate environment. 722 00:43:54,140 --> 00:44:00,140 Sometimes we are comforted by even a short lecture like this, 723 00:44:00,140 --> 00:44:04,140 and we put ourselves in another person's situation. 724 00:44:04,140 --> 00:44:09,140 The comfort that only humans can give and the empathy that only humans can feel 725 00:44:09,140 --> 00:44:15,140 are the elements that will make us more human in the era to come.
726 00:44:15,140 --> 00:44:22,140 Secondly, machines have virtually unlimited time and never need to rest, 727 00:44:22,140 --> 00:44:29,140 but we are real humans, and rather than wearing ourselves out flailing to keep up with the speed of machines, 728 00:44:29,140 --> 00:44:37,140 I want to talk about the area where we can remain human at our own naturally slower pace. 729 00:44:37,140 --> 00:44:39,140 Some of you may already know this, 730 00:44:39,140 --> 00:44:43,140 because I've mentioned it on TV and in lectures, 731 00:44:43,140 --> 00:44:46,140 but I am a person who pushes herself very hard. 732 00:44:46,140 --> 00:44:50,140 I thought that driving myself relentlessly was the only thing that could save me, 733 00:44:50,140 --> 00:44:52,140 so I lived a very severe life. 734 00:44:52,140 --> 00:44:55,140 In high school I studied on three to four hours of sleep a day, 735 00:44:55,140 --> 00:45:01,140 and even as a lecturer I filmed 40 hours of lectures a week, on Mondays, Tuesdays, Wednesdays, Fridays, and Saturdays, 736 00:45:01,140 --> 00:45:05,140 and produced thirty to forty lecture courses a year. 737 00:45:05,140 --> 00:45:06,140 That was how I lived, lecturing. 738 00:45:07,140 --> 00:45:09,140 I thought that relentlessness was the source of my success, 739 00:45:09,140 --> 00:45:11,140 that it was what would keep me alive, 740 00:45:11,140 --> 00:45:13,140 and what would pull me out of poverty, 743 00:45:17,140 --> 00:45:22,140 but I realized in 2017 and 2018 that I was wrong. 744 00:45:22,140 --> 00:45:25,140 I became seriously ill while I was writing a book, 745 00:45:25,140 --> 00:45:34,140 but because of the manuscript deadline I endured the pain on painkillers, 746 00:45:34,140 --> 00:45:36,140 and by the time I went to the hospital, 747 00:45:36,140 --> 00:45:39,140 four days had already passed since the illness set in. 748 00:45:39,140 --> 00:45:41,140 I was on the verge of dying; 749 00:45:41,140 --> 00:45:46,140 my whole body was swollen and turning dark. 750 00:45:46,140 --> 00:45:51,140 I went through emergency surgery, 751 00:45:51,140 --> 00:45:54,140 but my body 752 00:45:54,140 --> 00:45:58,140 had been so worn down that it could not recover. 753 00:45:58,140 --> 00:46:04,140 I developed a complication said to carry a mortality rate of more than 70 percent. 754 00:46:04,140 --> 00:46:07,140 I am a popular lecturer among students. 755 00:46:07,140 --> 00:46:09,140 Students who liked me would tell me 756 00:46:09,140 --> 00:46:12,140 they loved me and asked me to keep lecturing for them. 757 00:46:12,140 --> 00:46:18,140 But after an article came out saying that the star lecturer's courses had been halted because of her poor health, 758 00:46:18,140 --> 00:46:21,140 I got a lot of hateful comments. 759 00:46:21,140 --> 00:46:26,140 Why did Lee Ji-young have to get sick while I'm in my third year of high school? 760 00:46:26,140 --> 00:46:29,140 I thought about it then. 761 00:46:29,140 --> 00:46:41,140 In a changing world, taking care of yourself may matter more than bone-grinding effort to endure, adapt, and never give up. 762 00:46:41,140 --> 00:46:45,140 The moment I lost my health, I became the worst betrayer of all.
763 00:46:45,140 --> 00:46:52,140 Almost all the money I had earned in Korea's overheated entrance-exam education market 764 00:46:52,140 --> 00:46:58,140 went to the company as compensation for the losses caused when my lectures stopped partway through because of my health. 766 00:46:59,140 --> 00:47:10,140 At one point I was asked for damages of 3.5 billion won. 767 00:47:12,140 --> 00:47:14,140 I have come to think that 768 00:47:15,140 --> 00:47:19,140 there is nothing in this world that can save me in the end. 769 00:47:19,140 --> 00:47:22,140 We need to be healthy. 770 00:47:22,140 --> 00:47:26,140 I hope every one of you takes good care of your physical and mental health. 771 00:47:27,140 --> 00:47:28,140 To put it plainly, 772 00:47:28,140 --> 00:47:40,140 no value the world can give, and no sweet achievement won by grinding myself down, can ever be an excuse for destroying myself. 773 00:47:40,140 --> 00:47:42,140 The world is changing fast. 774 00:47:42,140 --> 00:47:47,140 In the middle of this fast change, I found myself with a dream. 775 00:47:47,140 --> 00:47:55,140 K-pop, K-culture, and K-content are gaining enormous popularity on OTT platforms and on YouTube around the world. 776 00:47:55,140 --> 00:48:12,140 I thought it would be valuable if K-culture, which speaks about human nature with Korean emotion and in a Korean voice, could touch people's hearts and be discussed as an area that artificial intelligence cannot replace. 777 00:48:12,140 --> 00:48:15,140 There is a reason why I came to give this lecture today. 778 00:48:15,140 --> 00:48:17,140 It's a very difficult topic. 779 00:48:17,140 --> 00:48:25,140 This morning, I gave a lecture to court officials and their children at the Cheongju District Court. 780 00:48:25,140 --> 00:48:29,140 It was about how to make dreams come true. 781 00:48:29,140 --> 00:48:34,140 We talked about health, resilience, and human relationships. 782 00:48:34,140 --> 00:48:42,140 Artificial intelligence is hard to discuss because it demands so much philosophy, but I don't think the essence has changed. 783 00:48:42,140 --> 00:48:54,140 No matter how fast the world changes, there is an essence in us that does not change, the reason we open our eyes and get up in the morning, and I think the principles of the age of artificial intelligence start from there. 784 00:48:54,140 --> 00:48:55,140 Everyone, 785 00:48:55,140 --> 00:48:56,140 why do you live? 786 00:48:56,140 --> 00:49:00,140 Why do you wake up in the morning and go to work? 787 00:49:00,140 --> 00:49:07,140 And why are you studying these changes, trying to adapt to them? 788 00:49:07,140 --> 00:49:14,140 We all know the answer, and philosophers have said it for a long time: we live to be happy. 789 00:49:14,140 --> 00:49:24,140 Yet on the way to that happiness, all of our keywords belong to the world of success, winning the competition, and building skills. 790 00:49:24,140 --> 00:49:36,140 I think it is important to slow down and look back at ourselves in such a world, so that we are not slaves to algorithms or replaced by machines, and to look at the mind within us that can shine brighter. 791 00:49:36,140 --> 00:49:40,140 I think that is where the ethics of the new era begins.
792 00:49:40,140 --> 00:49:46,140 As with the book by Kazuo Inamori that I mentioned earlier, this brings me to the second part of the lecture. 793 00:49:46,140 --> 00:49:50,140 As we live, it is not only our bodies that grow. 794 00:49:50,140 --> 00:49:53,140 There is also the shining mind inside us. 795 00:49:53,140 --> 00:50:03,140 I think it begins with trying not to let that mind lose its priority to the physical exterior. 796 00:50:03,140 --> 00:50:09,140 I teach a lot of students, and our students are living in an era of lookism. 797 00:50:09,140 --> 00:50:15,140 The ideal they want to become is someone like Jang Won-young of IVE. 798 00:50:15,140 --> 00:50:19,140 If you weren't born that way, it is really hard to diet yourself into that figure. 799 00:50:19,140 --> 00:50:21,140 Many students have eating disorders. 800 00:50:21,140 --> 00:50:22,140 They can't eat what they want. 802 00:50:24,140 --> 00:50:30,140 Even adults search for things like whether a bout of stomach flu will make them lose weight. 803 00:50:30,140 --> 00:50:36,140 In an era where so much is focused on appearance, where people believe the competitiveness of their looks is their competitiveness, 804 00:50:36,140 --> 00:50:43,140 I think we have stopped talking about the exhausted, tired minds shrinking inside us. 805 00:50:43,140 --> 00:50:50,140 Rather than asking what fills us inside and what we are living for, 806 00:50:50,140 --> 00:50:56,140 we measure ourselves by external indicators of competitiveness: 807 00:50:56,140 --> 00:51:00,140 how much salary I earn, what apartment I live in, 808 00:51:00,140 --> 00:51:03,140 and what kind of career I have built in this job. 809 00:51:03,140 --> 00:51:06,140 It is an era obsessed with such things. 810 00:51:06,140 --> 00:51:10,140 When what matters most is not external beauty or external specs 811 00:51:10,140 --> 00:51:16,140 but what can soothe the tired mind inside, 812 00:51:16,140 --> 00:51:18,140 when it becomes a culture in which, whatever wrapping is piled on the outside, 813 00:51:18,140 --> 00:51:20,140 whatever we happen to look like, 814 00:51:20,140 --> 00:51:23,140 we can respect the shining soul inside one another, 815 00:51:23,140 --> 00:51:26,140 I thought that would be when a real change in what it means to be human begins 816 00:51:26,140 --> 00:51:31,140 and the humanity that machines cannot replace is preserved. 817 00:51:31,140 --> 00:51:34,140 While I was preparing this lecture, 818 00:51:34,140 --> 00:51:37,140 Sam Altman, the head of OpenAI, 819 00:51:37,140 --> 00:51:41,140 gave the following warning in an article just a couple of days ago: 820 00:51:41,140 --> 00:51:46,140 hacking damage may increase in the era of artificial intelligence, 821 00:51:46,140 --> 00:51:51,140 and voice-phishing crimes using copied faces and voices have risen, 822 00:51:51,140 --> 00:51:54,140 so be careful. 823 00:51:54,140 --> 00:51:56,140 Underneath, there was a comment saying, 824 00:51:56,140 --> 00:52:02,140 if you knew it was that dangerous, why didn't you stop it in the first place? 825 00:52:02,140 --> 00:52:06,140 I talked about the Tesla CEO's business earlier, right? 826 00:52:06,140 --> 00:52:11,140 Capitalism does not slow down to the speed of rational reflection.
827 00:52:11,140 --> 00:52:13,140 How do you attract more investment, 828 00:52:13,140 --> 00:52:15,140 how do you build more facilities, 829 00:52:15,140 --> 00:52:19,140 and how can you create more profit? 830 00:52:19,140 --> 00:52:22,140 We are going to build powerful artificial intelligence 831 00:52:22,140 --> 00:52:25,140 through self-driving cars and humanoid robots. 832 00:52:25,140 --> 00:52:28,140 In a world being reshaped by the logic of capital, 833 00:52:28,140 --> 00:52:33,140 I thought that if we try to rein in the speed of capital 834 00:52:33,140 --> 00:52:36,140 with regulation and control alone, 835 00:52:36,140 --> 00:52:42,140 we might fall behind in the global race for artificial intelligence technology. 836 00:52:42,140 --> 00:52:44,140 Not long ago, in a Korean court, 837 00:52:44,140 --> 00:52:51,140 there was a case involving a woman who worked as a voice coach and vocal trainer 838 00:52:51,140 --> 00:52:54,140 whose voice had been synthesized by AI. 839 00:52:54,140 --> 00:52:58,140 She asked for sales of the AI voice-synthesis program to be halted, 840 00:52:58,140 --> 00:53:01,140 saying she had not known 841 00:53:01,140 --> 00:53:04,140 that her voice would be used in such a wide range of ways: 842 00:53:04,140 --> 00:53:09,140 she had agreed to have her voice synthesized by the artificial intelligence, 843 00:53:09,140 --> 00:53:12,140 but she had not agreed to its being sold in every domain. 844 00:53:12,140 --> 00:53:13,140 She filed a lawsuit, 845 00:53:13,140 --> 00:53:18,140 and there was an article reporting that the court sided with her. 846 00:53:18,140 --> 00:53:20,140 In the comments on that article, someone wrote that 847 00:53:20,140 --> 00:53:26,140 the regulations are so severe that our country's artificial intelligence development is falling behind. 848 00:53:26,140 --> 00:53:30,140 That may be true, but I also found it sad. 849 00:53:30,140 --> 00:53:33,140 At the speed at which the huge capital of capitalism is moving, 850 00:53:33,140 --> 00:53:40,140 can we really apply a reasoned brake without simply blocking the development of technology? 851 00:53:40,140 --> 00:53:41,140 All such brakes 852 00:53:41,140 --> 00:53:45,140 begin with the reasoned consensus of society as a whole, 853 00:53:45,140 --> 00:53:52,140 rather than with forced legal regulations or the judgment of a court. 854 00:53:52,140 --> 00:53:56,140 So although I have met you today with this difficult topic, 855 00:53:56,140 --> 00:53:59,140 I hope that the questions raised here 856 00:53:59,140 --> 00:54:01,140 will be discussed more broadly, 857 00:54:01,140 --> 00:54:06,140 that they can make even the holders of capital reflect, 858 00:54:06,140 --> 00:54:08,140 and that the direction of capital 859 00:54:08,140 --> 00:54:10,140 will not become one that neglects human ethics. 860 00:54:10,140 --> 00:54:13,140 If today becomes a starting point for that discussion, 861 00:54:13,140 --> 00:54:16,140 I will have come here gladly. 862 00:54:16,140 --> 00:54:19,140 Well, the last thing I want to say in this lecture is the following. 863 00:54:19,140 --> 00:54:20,140 Let's stay healthy. 864 00:54:20,140 --> 00:54:23,140 And let's take good care of ourselves.
865 00:54:23,140 --> 00:54:25,140 Amid any change, 866 00:54:25,140 --> 00:54:28,140 please always hold on to the inner purity of mind 867 00:54:28,140 --> 00:54:31,140 that cannot be expressed in external numbers, 868 00:54:31,140 --> 00:54:34,140 and keep your eyes on the change, 869 00:54:34,140 --> 00:54:37,140 so that our humanity can move forward 870 00:54:37,140 --> 00:54:39,140 along with that change. 871 00:54:39,140 --> 00:54:41,140 I hope our society will reach an understanding 872 00:54:41,140 --> 00:54:43,140 that lets everyone take part in this discussion at least once, 873 00:54:43,140 --> 00:54:45,140 and with that, let me conclude this lecture. 874 00:54:45,140 --> 00:54:46,140 Thank you. 875 00:54:46,140 --> 00:54:48,140 Thank you. 876 00:54:48,140 --> 00:54:52,140 I hope the lecture managed to answer some of your questions. 877 00:54:52,140 --> 00:54:57,140 I will now take questions. 878 00:54:57,140 --> 00:55:07,140 If you have any questions, please raise your hand. 879 00:55:07,140 --> 00:55:16,140 Hello, I've been listening to your lectures for a long time, and I'm so happy to be here. 880 00:55:16,140 --> 00:55:21,140 While I was listening to your lecture today, a question came to mind. 881 00:55:21,140 --> 00:55:26,140 It's about the kind of content the algorithms push at us. 882 00:55:26,140 --> 00:55:42,140 I heard that promoting content to consumers through Facebook is the most effective way to market it. 883 00:55:42,140 --> 00:55:45,140 When I saw that, I thought, 884 00:55:45,140 --> 00:55:52,140 I don't know whether that kind of content consumption 885 00:55:52,140 --> 00:55:59,140 is actually a good thing for people. 886 00:55:59,140 --> 00:56:07,140 But companies won't stop using it, and consumers keep consuming it, 887 00:56:07,140 --> 00:56:10,140 so I think it will continue. 888 00:56:10,140 --> 00:56:14,140 What do you think about this? 889 00:56:15,140 --> 00:56:18,140 That's a very difficult question. 890 00:56:18,140 --> 00:56:22,140 Everyone, I like cats. 891 00:56:22,140 --> 00:56:28,140 Cats appear a lot in my YouTube algorithm, in Instagram Shorts and Reels. 892 00:56:28,140 --> 00:56:31,140 When I watch cat videos for a long time, 893 00:56:31,140 --> 00:56:39,140 the algorithm recommends similar videos based on how long I watched, the comments, and the likes I left. 894 00:56:39,140 --> 00:56:44,140 I only want to see cute cats, but sometimes a dashcam clip from Han Moon-chul TV shows up. 895 00:56:45,140 --> 00:56:48,140 There was a car accident on the Incheon Bridge 896 00:56:48,140 --> 00:56:52,140 in which, in an instant, 897 00:56:52,140 --> 00:56:59,140 a car was crushed into a small square block, 898 00:56:59,140 --> 00:57:03,140 and I think the clip became famous for that. 899 00:57:03,140 --> 00:57:06,140 It was so shocking and gruesome 900 00:57:06,140 --> 00:57:08,140 that I covered my eyes, 901 00:57:08,140 --> 00:57:10,140 but I was so curious about what happened to the driver 902 00:57:10,140 --> 00:57:12,140 that I watched the video about ten times, 903 00:57:12,140 --> 00:57:14,140 replaying it a second at a time. 904 00:57:14,140 --> 00:57:16,140 I even went into the comments 905 00:57:16,140 --> 00:57:17,140 to find out 906 00:57:17,140 --> 00:57:19,140 how many people had been in the car 907 00:57:19,140 --> 00:57:21,140 and what had happened, 908 00:57:21,140 --> 00:57:23,140 whether the driver had survived or not.
909 00:57:23,140 --> 00:57:24,140 After I read them, 910 00:57:24,140 --> 00:57:27,140 my YouTube algorithm turned into a sea of blood. 911 00:57:27,140 --> 00:57:29,140 When I saw that, 912 00:57:29,140 --> 00:57:31,140 I thought the following. 913 00:57:31,140 --> 00:57:32,140 Human beings, 914 00:57:32,140 --> 00:57:35,140 as I said when I mentioned the Ring of Gyges, 915 00:57:35,140 --> 00:57:37,140 when we look at ourselves honestly, 916 00:57:37,140 --> 00:57:41,140 are not filled only with good and moral impulses. 917 00:57:41,140 --> 00:57:42,140 Sometimes 918 00:57:42,140 --> 00:57:43,140 we are hateful, 919 00:57:43,140 --> 00:57:44,140 we are prejudiced, 920 00:57:44,140 --> 00:57:45,140 we are violent, 921 00:57:45,140 --> 00:57:48,140 and such content stimulates our brains more intensely, 922 00:57:48,140 --> 00:57:50,140 so we keep watching. 923 00:57:50,140 --> 00:57:53,140 I think that is the realm of human instinct. 924 00:57:53,140 --> 00:57:54,140 Of course, 925 00:57:54,140 --> 00:57:56,140 if everything were regulated, 926 00:57:56,140 --> 00:58:03,140 if there were rules and restrictions for everything 927 00:58:03,140 --> 00:58:08,140 and we were allowed to see only sanitized screens, 928 00:58:08,140 --> 00:58:12,140 I'm sure there would be concerns about whether that suits human nature either. 929 00:58:12,140 --> 00:58:16,140 But I think I have to tell you this. 930 00:58:16,140 --> 00:58:18,140 Do you like macarons? 931 00:58:18,140 --> 00:58:21,140 Listen to why I'm bringing up macarons. 932 00:58:21,140 --> 00:58:25,140 I like pork better than macarons. 933 00:58:25,140 --> 00:58:27,140 I like beef even better. 934 00:58:27,140 --> 00:58:30,140 When I eat a macaron, it's so sweet that I squint and say, 935 00:58:30,140 --> 00:58:32,140 oh, that's sweet. 936 00:58:32,140 --> 00:58:36,140 Not long ago, I met a distant five-year-old relative. 937 00:58:36,140 --> 00:58:42,140 While eating a macaron, he gestured as if his head were bursting with delight. 938 00:58:42,140 --> 00:58:44,140 He said it was so delicious. 939 00:58:44,140 --> 00:58:46,140 Do you know why he reacted like that? 940 00:58:46,140 --> 00:58:49,140 Some of you here may know this better than I do. 941 00:58:49,140 --> 00:58:55,140 It's because the way adults perceive sweetness is different from the way children do. 942 00:58:55,140 --> 00:58:57,140 We have become adults, more or less. 943 00:58:57,140 --> 00:58:58,140 I have a job. 944 00:58:58,140 --> 00:59:00,140 In the middle of the things I have to do, 945 00:59:00,140 --> 00:59:04,140 even when a dopamine-stimulating video 946 00:59:04,140 --> 00:59:06,140 completely absorbs me, 947 00:59:06,140 --> 00:59:08,140 I can't just watch it all day long. 948 00:59:08,140 --> 00:59:12,140 There are tasks waiting in my line of work. 949 00:59:12,140 --> 00:59:15,140 But children, really young children, 950 00:59:15,140 --> 00:59:18,140 are not fully mature yet. 951 00:59:18,140 --> 00:59:20,140 Their brains are more sensitive to stimulation. 952 00:59:20,140 --> 00:59:23,140 In a state where they lack self-control, 953 00:59:23,140 --> 00:59:26,140 if they are exposed without limit 954 00:59:26,140 --> 00:59:28,140 to such sensational, stimulating, destructive, and violent content 955 00:59:28,140 --> 00:59:32,140 before the judgment that this is harmful has properly formed, 956 00:59:32,140 --> 00:59:34,140 then certain parts of the brain, 957 00:59:34,140 --> 00:59:36,140 or the child's psychology and emotions,
958 00:59:36,140 --> 00:59:41,140 can, I think, clearly be shaped and manipulated in certain ways. 959 00:59:41,140 --> 00:59:44,140 Violence and aggression exist inside human beings. 960 00:59:44,140 --> 00:59:48,140 They are among our most basic emotions, shameful but hidden. 961 00:59:48,140 --> 00:59:51,140 Of course, that is not all there is to human beings. 962 00:59:51,140 --> 00:59:55,140 But at least for young children, 963 00:59:55,140 --> 00:59:58,140 regulating what YouTube or Instagram 964 00:59:58,140 --> 01:00:01,140 can serve them through the algorithm, 965 01:00:01,140 --> 01:00:04,140 so that they meet such content only as adults, 966 01:00:04,140 --> 01:00:07,140 when they are able to judge it, 967 01:00:07,140 --> 01:00:11,140 is, I think, far more important, and the discussion should start there. 968 01:00:11,140 --> 01:00:13,140 Since none of us can live in a perfect world, 969 01:00:13,140 --> 01:00:15,140 we have no choice but to choose the second best rather than the best, 970 01:00:15,140 --> 01:00:19,140 and the lesser evil rather than the worst. 971 01:00:19,140 --> 01:00:22,140 I told you about the logic of capitalism earlier. 972 01:00:22,140 --> 01:00:25,140 For Meta, which runs Facebook and Instagram, 973 01:00:25,140 --> 01:00:27,140 or for YouTube, 974 01:00:27,140 --> 01:00:31,140 it is clear that the longer people stay on such sensational, violent, stimulating videos, 975 01:00:31,140 --> 01:00:33,140 the more advertising they can sell. 976 01:00:33,140 --> 01:00:35,140 They gain more users. 977 01:00:35,140 --> 01:00:38,140 They capture more usage time. 978 01:00:38,140 --> 01:00:43,140 I don't think it will be easy for them to stop such sensational, stimulating content. 979 01:00:43,140 --> 01:00:47,140 The logic of capitalism sometimes runs far ahead 980 01:00:47,140 --> 01:00:50,140 of moral reflection. 981 01:00:50,140 --> 01:00:56,140 But helping growing children stay away from that, 982 01:00:56,140 --> 01:00:58,140 blocking the algorithm for them, 983 01:00:58,140 --> 01:01:00,140 and, by whatever means, 984 01:01:00,140 --> 01:01:02,140 preventing their minds from being manipulated, 985 01:01:02,140 --> 01:01:04,140 is, I thought, the least we can do, 986 01:01:04,140 --> 01:01:06,140 we who came into the world first 987 01:01:06,140 --> 01:01:08,140 and whose moral capacities are already formed. 990 01:01:12,140 --> 01:01:15,140 Rather than an exact answer to your question, 991 01:01:15,140 --> 01:01:19,140 I would say the logic of capitalism is too cruel to our growing children, 992 01:01:19,140 --> 01:01:23,140 and I think we need to start with rules that we adults put in place. 993 01:01:23,140 --> 01:01:26,140 That is my attempt at an answer. 994 01:01:26,140 --> 01:01:28,140 Yes. 995 01:01:28,140 --> 01:01:30,140 Another question. 996 01:01:30,140 --> 01:01:32,140 Your question doesn't have to be about artificial intelligence, 997 01:01:32,140 --> 01:01:34,100 about the technology itself, 998 01:01:34,100 --> 01:01:36,140 or about any project to develop it. 1000 01:01:37,140 --> 01:01:38,140 It doesn't have to be about artificial intelligence at all. 1001 01:01:38,140 --> 01:01:40,140 A personal question is fine, too. 1002 01:01:40,140 --> 01:01:42,140 In fact, I'm more comfortable answering those.
1004 01:01:44,140 --> 01:01:46,140 One more question, then. 1005 01:01:46,140 --> 01:01:52,140 May I ask what led you to become a lecturer rather than a scholar? 1009 01:01:58,140 --> 01:01:59,140 Yes. 1010 01:01:59,140 --> 01:02:04,140 Like Immanuel Kant, the German philosopher I briefly mentioned earlier, 1011 01:02:04,140 --> 01:02:07,140 I spent my school days telling my friends 1012 01:02:07,140 --> 01:02:12,140 that I loved philosophy so much I would become another Immanuel Kant. 1013 01:02:12,140 --> 01:02:15,140 So I was going to study philosophy, 1014 01:02:15,140 --> 01:02:19,140 but the teachers around me said this to me: 1015 01:02:19,140 --> 01:02:22,140 if you want to study philosophy and still get a job, 1016 01:02:22,140 --> 01:02:26,140 you would have to go back to ancient Greece, to Athens. 1017 01:02:26,140 --> 01:02:29,140 They steered me away from majoring in philosophy, 1018 01:02:29,140 --> 01:02:31,140 advising me that if I went to a college of education 1019 01:02:31,140 --> 01:02:33,140 I would earn a teaching certificate, 1020 01:02:33,140 --> 01:02:37,140 and with that certificate I could keep my options for the future open 1021 01:02:37,140 --> 01:02:39,140 in various ways, 1022 01:02:39,140 --> 01:02:43,140 so I chose to major in ethics education. 1023 01:02:43,140 --> 01:02:45,140 I wanted to go on to become a scholar. 1024 01:02:45,140 --> 01:02:47,140 In fact, after graduating from Seoul National University, 1025 01:02:47,140 --> 01:02:50,140 I chose to study ethics and philosophy 1026 01:02:50,140 --> 01:02:53,140 in graduate school as well. 1027 01:02:53,140 --> 01:02:55,140 At some point, 1028 01:02:55,140 --> 01:02:56,140 I thought that 1029 01:02:56,140 --> 01:02:58,140 in order to walk the path of a scholar, 1030 01:02:58,140 --> 01:03:00,140 a doctorate in Korea might not be enough 1031 01:03:00,140 --> 01:03:04,140 and that I might need to study abroad. 1032 01:03:04,140 --> 01:03:08,140 I became a lecturer to save money for studying abroad. 1033 01:03:08,140 --> 01:03:10,140 After becoming a lecturer, 1034 01:03:10,140 --> 01:03:12,140 I discovered at some point that 1035 01:03:12,140 --> 01:03:14,140 when I held a microphone, 1036 01:03:14,140 --> 01:03:16,140 I had a real talent for lecturing, 1037 01:03:16,140 --> 01:03:22,140 and I began to ask myself again why I had wanted to be a scholar in the first place. 1038 01:03:22,140 --> 01:03:25,140 Even after I died, 1039 01:03:25,140 --> 01:03:27,140 I wanted the books I wrote 1040 01:03:27,140 --> 01:03:29,140 to keep raising 1041 01:03:29,140 --> 01:03:34,140 questions worth asking; that is why I had wanted to be a scholar. 1042 01:03:34,140 --> 01:03:36,140 Just as a tiger leaves its skin when it dies 1043 01:03:36,140 --> 01:03:38,140 and a person leaves their name, 1044 01:03:38,140 --> 01:03:40,140 I thought I would be deeply grateful 1045 01:03:40,140 --> 01:03:42,140 if the theories and books I produced 1046 01:03:42,140 --> 01:03:44,140 went on being discussed as something of value. 1047 01:03:44,140 --> 01:03:46,140 At some point, though, I realized 1048 01:03:46,140 --> 01:03:48,140 that through a lecture, 1049 01:03:48,140 --> 01:03:50,140 a really good lecture, 1050 01:03:50,140 --> 01:03:53,140 you can change the minds of the people who hear it, 1052 01:03:54,140 --> 01:03:57,140 and you can create social empathy.
1053 01:03:57,140 --> 01:04:02,140 Like a small pebble dropped into a well, 1054 01:04:02,140 --> 01:04:04,140 I found that 1055 01:04:04,140 --> 01:04:06,140 it has the power to spread a discussion 1056 01:04:06,140 --> 01:04:08,140 from one person out to ten thousand. 1057 01:04:08,140 --> 01:04:10,140 So I chose the life of a lecturer, 1058 01:04:10,140 --> 01:04:12,140 not a scholar, 1059 01:04:12,140 --> 01:04:14,140 and I hope that 1060 01:04:14,140 --> 01:04:16,140 wherever people are waiting for me, 1061 01:04:16,140 --> 01:04:20,140 a good lecture can change someone's life. 1062 01:04:20,140 --> 01:04:22,140 Even if you don't agree with 1063 01:04:22,140 --> 01:04:23,140 or like everything I've said, 1064 01:04:23,140 --> 01:04:25,140 if you remember the time we spent together 1065 01:04:25,140 --> 01:04:27,140 in this one hour, 1066 01:04:27,140 --> 01:04:29,140 that alone is an honor, 1067 01:04:29,140 --> 01:04:31,140 and it was with that thought in mind 1068 01:04:31,140 --> 01:04:33,140 that I chose the life of a lecturer. 1069 01:04:33,140 --> 01:04:35,140 Of course, the rewards of capitalism were sweet, too. 1071 01:04:37,140 --> 01:04:39,140 Do you have any other questions? 1072 01:04:39,140 --> 01:04:41,140 Yes, I have one last question. 1075 01:04:45,140 --> 01:04:47,140 What is it? 1076 01:04:47,140 --> 01:04:49,140 I was curious about 1077 01:04:49,140 --> 01:04:51,140 the humanity you keep speaking of. 1078 01:04:51,140 --> 01:04:53,140 When it comes to human nature, 1079 01:04:53,140 --> 01:04:55,140 you haven't really disclosed your own definition 1080 01:04:55,140 --> 01:04:57,140 of what it is. 1081 01:04:57,140 --> 01:04:59,140 Everyone uses the word human 1082 01:04:59,140 --> 01:05:01,140 and advocates their own idea 1083 01:05:01,140 --> 01:05:02,140 of what human nature is, 1084 01:05:02,140 --> 01:05:03,160 just as you do, 1086 01:05:05,160 --> 01:05:07,140 so I wanted to hear 1087 01:05:07,140 --> 01:05:09,140 your own definition. 1092 01:05:17,140 --> 01:05:19,140 Then, 1094 01:05:21,140 --> 01:05:26,140 let me begin by asking you: how long do you want to live? 1095 01:05:26,140 --> 01:05:31,140 As for me, I want to live until I'm 120 years old. 1096 01:05:31,140 --> 01:05:36,140 There's a reason I answer this way. 1097 01:05:36,140 --> 01:05:41,140 I think that from the moment we are born until we die, 1098 01:05:41,140 --> 01:05:45,140 our spirit and our mind keep growing. 1099 01:05:45,140 --> 01:05:49,140 When I was young, I thought only about winning the competition, 1100 01:05:49,140 --> 01:05:52,140 but after I fell sick nearly to the point of death, 1101 01:05:52,140 --> 01:05:57,140 I began to care about being healthy and about living well together.
1102 01:05:57,140 --> 01:06:01,140 In my twenties, my lectures told students to be ruthless with themselves: 1103 01:06:01,140 --> 01:06:05,140 that it was a sin for a student to sleep seven hours a day, 1104 01:06:05,140 --> 01:06:10,140 that if you want to lose weight, you should stop eating, 1105 01:06:10,140 --> 01:06:14,140 and that if you want to sleep less, you should simply not sleep. 1106 01:06:14,140 --> 01:06:16,140 That was the kind of lecture I gave. 1107 01:06:16,140 --> 01:06:19,140 But I don't see it that way anymore. 1108 01:06:19,140 --> 01:06:22,140 As I went through life, my lectures changed. 1109 01:06:22,140 --> 01:06:24,140 Studying is something you do to be happy. 1110 01:06:24,140 --> 01:06:27,140 It is a way of giving yourself something good. 1111 01:06:27,140 --> 01:06:31,140 You can do well enough with a full night's sleep and a clear head. 1112 01:06:31,140 --> 01:06:34,140 There is nothing happier than chewing on something delicious. 1113 01:06:34,140 --> 01:06:36,140 As I began to say, 1114 01:06:36,140 --> 01:06:39,140 give yourself healthy food as if you were giving yourself a gift, 1115 01:06:39,140 --> 01:06:43,140 my lectures were able to grow a step further. 1116 01:06:43,140 --> 01:06:46,140 Of course, compared with the lectures 1117 01:06:46,140 --> 01:06:48,140 that emphasized nothing but ruthlessness, 1119 01:06:51,140 --> 01:06:55,140 that too was another process of realization for me. 1120 01:06:55,140 --> 01:07:02,140 I have come to think that life is one long course in humility, 1121 01:07:02,140 --> 01:07:07,140 and I hope that this astonishing process of learning things one by one 1122 01:07:07,140 --> 01:07:11,140 doesn't end too soon. 1123 01:07:11,140 --> 01:07:15,140 The reason I say that is this: my body is getting older, 1124 01:07:15,140 --> 01:07:17,140 my hair is turning white, 1125 01:07:17,140 --> 01:07:19,140 my body is getting duller, 1126 01:07:19,140 --> 01:07:22,140 and I may well struggle with it, 1127 01:07:22,140 --> 01:07:24,140 but in the process of aging, 1128 01:07:24,140 --> 01:07:31,140 even as my body weakens and I no longer look strong to others, 1129 01:07:31,140 --> 01:07:36,140 I have felt the shining spirit inside me keep growing. 1130 01:07:36,140 --> 01:07:42,140 I think the real discussion about humanity begins with looking into one's own mind. 1131 01:07:42,140 --> 01:07:44,140 Where is my mind headed? 1132 01:07:44,140 --> 01:07:45,140 What am I pursuing? 1133 01:07:45,140 --> 01:07:46,140 I was born into this world, 1134 01:07:46,140 --> 01:07:50,140 so what do I want to take with me when I leave it? 1135 01:07:50,140 --> 01:07:52,140 Since no one has ever come back from death, 1136 01:07:52,140 --> 01:07:58,140 discussing it is possible only in the books of philosophers. 1137 01:07:58,140 --> 01:08:01,140 But I once felt that I might die within a week, 1138 01:08:01,140 --> 01:08:08,140 standing right at the edge of death. 1139 01:08:09,140 --> 01:08:15,140 Even if I die and no longer exist in this world, 1140 01:08:15,140 --> 01:08:19,140 what do I really want to do while I am alive? 1141 01:08:19,140 --> 01:08:22,140 What do I want to give myself in the time that remains? 1142 01:08:22,140 --> 01:08:24,140 What kind of people do I want to spend it with?
1143 01:08:24,140 --> 01:08:27,140 What do I want to fill the rest of my life with? 1144 01:08:27,140 --> 01:08:29,140 What do I truly prefer? 1145 01:08:29,140 --> 01:08:32,140 Starting from deciding what I want to talk about, and with whom, 1146 01:08:32,140 --> 01:08:35,140 I thought this was a process of learning about myself. 1147 01:08:35,140 --> 01:08:37,140 Everyone, these days 1148 01:08:37,140 --> 01:08:42,140 you sit down to watch a five-minute video to rest, 1149 01:08:42,140 --> 01:08:44,140 and end up following the YouTube algorithm for two or three hours. 1150 01:08:44,140 --> 01:08:49,140 Have you ever experienced being trapped in the YouTube algorithm like that? 1151 01:08:49,140 --> 01:08:51,140 I have a request. 1152 01:08:51,140 --> 01:08:55,140 When was the last time you went to a bookstore? 1153 01:08:55,140 --> 01:08:58,140 These days we live in an era 1154 01:08:58,140 --> 01:09:00,140 where you order a book online with one click 1155 01:09:00,140 --> 01:09:02,140 and it flies to your door right away, 1156 01:09:02,140 --> 01:09:04,140 but some day, holding your child's hand, 1157 01:09:04,140 --> 01:09:07,140 or on a day when you are tired, 1158 01:09:07,140 --> 01:09:10,140 go and look for the books that can represent your mind for the rest of your life, 1159 01:09:10,140 --> 01:09:13,140 the books that represent the person who is most truly you, 1160 01:09:13,140 --> 01:09:15,140 through every move, 1161 01:09:15,140 --> 01:09:16,140 every change of job, 1162 01:09:16,140 --> 01:09:18,140 all the way to retirement. 1163 01:09:18,140 --> 01:09:22,140 If you were told to find three books that represent you, 1164 01:09:22,140 --> 01:09:23,140 what would you say? 1165 01:09:23,140 --> 01:09:25,140 Would you pick a book on how to buy real estate without difficulty? 1166 01:09:25,140 --> 01:09:30,140 On becoming a top investor? 1167 01:09:30,140 --> 01:09:34,140 On investing wisely? 1168 01:09:37,140 --> 01:09:40,140 Humanity is a very broad word, 1169 01:09:40,140 --> 01:09:42,140 but the humanity I mean 1170 01:09:43,140 --> 01:09:46,140 is not living like a slave to the algorithm, 1171 01:09:46,140 --> 01:09:50,140 compromising with other people's eyes, 1172 01:09:50,140 --> 01:09:52,140 and measuring whether I have lived well enough 1173 01:09:52,140 --> 01:09:55,140 by the world's standards. 1174 01:09:55,140 --> 01:09:58,140 I think it starts with truly searching for myself. 1175 01:09:58,140 --> 01:10:00,140 What I want, 1176 01:10:00,140 --> 01:10:02,140 what I want to fill myself with, 1177 01:10:02,140 --> 01:10:04,140 which scenes matter to me 1178 01:10:04,140 --> 01:10:07,140 before I die, 1179 01:10:07,140 --> 01:10:10,140 and then filling my life with the scenes I can actually create. 1180 01:10:10,140 --> 01:10:11,140 It's a little embarrassing to admit, 1181 01:10:11,140 --> 01:10:12,140 but YouTube really is fun, 1182 01:10:12,140 --> 01:10:15,140 and I'm a YouTuber myself. 1183 01:10:15,140 --> 01:10:20,140 Still, every year I write this in my diary: 1184 01:10:20,140 --> 01:10:22,140 let's not be a slave to the algorithm this year. 1185 01:10:22,140 --> 01:10:23,140 This year, 1186 01:10:23,140 --> 01:10:26,140 apart from the keywords I search for because I'm genuinely curious, 1187 01:10:26,140 --> 01:10:28,140 let's not be swept along too much 1188 01:10:28,140 --> 01:10:31,140 by the videos the algorithm automatically recommends.
1189 01:10:31,140 --> 01:10:34,140 It's a hard thing to keep to, 1190 01:10:34,140 --> 01:10:37,140 and when I catch myself drifting 1191 01:10:37,140 --> 01:10:39,140 from one fun video to the next in the algorithm, 1192 01:10:39,140 --> 01:10:40,140 I sometimes think this: 1193 01:10:40,140 --> 01:10:41,140 did the time 1194 01:10:41,140 --> 01:10:43,140 I let slip away today 1195 01:10:43,140 --> 01:10:45,140 contribute anything to my life, 1196 01:10:45,140 --> 01:10:48,140 to making my soul a little purer? 1197 01:10:48,140 --> 01:10:49,140 When I ask that, 1198 01:10:49,140 --> 01:10:51,140 there are many times I feel regret. 1199 01:10:51,140 --> 01:10:52,140 When that happens, 1200 01:10:52,140 --> 01:10:54,140 I pick up an analog notebook 1201 01:10:54,140 --> 01:10:57,140 and an analog fountain pen, 1202 01:10:57,140 --> 01:10:59,140 and I write about why I live, 1203 01:10:59,140 --> 01:11:00,140 what I dream of, 1204 01:11:00,140 --> 01:11:02,140 and what I want to achieve. 1205 01:11:02,140 --> 01:11:05,140 I think this is where our humanity lies. 1206 01:11:05,140 --> 01:11:08,140 The main thing is to find what I truly want, 1207 01:11:08,140 --> 01:11:09,140 and from there, 1208 01:11:09,140 --> 01:11:10,140 the mind grows 1209 01:11:10,140 --> 01:11:12,140 and the character matures, 1210 01:11:12,140 --> 01:11:13,140 step by step. 1211 01:11:13,140 --> 01:11:14,140 Then how can a person 1212 01:11:14,140 --> 01:11:15,140 grow well? 1213 01:11:15,140 --> 01:11:17,140 I think that is the next step 1214 01:11:17,140 --> 01:11:20,140 each of us has to take. 1215 01:11:20,140 --> 01:11:21,140 It may sound old-fashioned, 1216 01:11:21,140 --> 01:11:25,140 but there is real value 1217 01:11:25,140 --> 01:11:27,140 in still writing in a diary ten years on, 1218 01:11:27,140 --> 01:11:28,140 in keeping on writing, 1219 01:11:28,140 --> 01:11:30,140 in asking yourself 1220 01:11:30,140 --> 01:11:31,140 whether you are on good terms 1221 01:11:31,140 --> 01:11:32,140 with your own mind, 1222 01:11:32,140 --> 01:11:34,140 and in thinking about 1223 01:11:34,140 --> 01:11:36,140 the things you dream of 1224 01:11:36,140 --> 01:11:38,140 for the ten years ahead. 1225 01:11:38,140 --> 01:11:39,140 That, 1226 01:11:39,140 --> 01:11:47,140 I want to say, is where humanity begins. 1227 01:11:47,140 --> 01:11:51,140 I always say this in my lectures: 1228 01:11:51,140 --> 01:11:56,140 we must become the masters of our own lives, not slaves to the algorithm. 1229 01:11:56,140 --> 01:12:00,140 That is what I really wanted to say. 1230 01:12:09,140 --> 01:12:19,140 Thank you for your hard work. 1231 01:12:19,140 --> 01:12:23,140 We will now bring the lecture to a close. 1232 01:12:23,140 --> 01:12:27,140 We hope to have you with us again. 1233 01:12:39,140 --> 01:12:49,140 Thank you. 1234 01:13:09,140 --> 01:13:19,140 Thank you for your hard work.