…covering the week's top textbooks like Linux bias.

Facebook has launched a new chatbot that it claims is able to demonstrate empathy, knowledge and personality. Their chatbot, which they've annoyingly named Blender, was trained using available public-domain conversations, which included 1.5 billion examples of human exchanges. But experts say training the artificial intelligence using a platform such as Reddit has its drawbacks. Numerous issues arose: during longer conversations Blender would sometimes respond with offensive language, and at other times it would make up facts altogether. Researchers said they hoped further models would address some of these issues.

Artificial intelligence expert Dave Coplin said that Blender was a step in the right direction, but noted two fundamental issues that still need to be overcome. He told the BBC: "The first is just how complex it is to replicate all of the nuances of a human attribute like the ability to hold a conversation, a skill that most three-year-olds can master. The second is around the relationship with the data used to train the model and the results generated by the model." He went on to explain: "As great a platform as Reddit is, training algorithms based on the conversations you find there is going to get you a lot of chaff amongst the wheat."

Facebook also compared Blender's performance with the latest version of Google's own chatbot, Meena. It showed people two sets of conversations, one made with Blender and the other with Meena. The conversations included a wide range of topics, including movies, music and veganism. Facebook said that 67% of respondents thought Blender sounded more human than Meena. The researchers noted: "We achieved this milestone through a new chatbot recipe that includes improved decoding techniques, novel blending of skills, and a model with 9.4 billion parameters, which is 3.6 times more than the largest existing system. Building a truly intelligent dialogue agent that can chat like a human remains one of the largest open challenges in AI today."

[Music]
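As a side note beyond what the report itself states: a quick arithmetic sketch, assuming the "largest existing system" mentioned refers to Google's Meena (widely reported at roughly 2.6 billion parameters), shows that the two figures quoted in the report are consistent with each other.

```python
# Sanity check of the scale claim in the report. The Meena baseline is an
# assumption, not something stated in the transcript.
blender_params = 9.4e9   # parameters reported for Blender
scale_factor = 3.6       # "3.6 times more than the largest existing system"

implied_baseline = blender_params / scale_factor
print(f"Implied baseline: {implied_baseline / 1e9:.2f} billion parameters")
# Prints roughly 2.61 billion, in line with Meena's reported ~2.6B parameters.
```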