Hello and welcome to this webinar on discovering and delivering AI products. My name is Doug Rose. One of the key things that I want you to get out of this is that with artificial intelligence, you don't build products, you discover them. So we're going to find out a little bit more about what that means. What does it mean to discover a product?

Most organizations are very comfortable building out products: coming up with an idea and then building it out over time. They're considerably less comfortable just exploring different possibilities, having lots of questions, and then trying to discover a new product.

So let's start by talking about what it means to discover a product. I'm going to start at the very beginning with a working definition of artificial intelligence. Artificial intelligence is the ability of a computer to perform tasks that are commonly associated with humans.

There are a couple of common AI tools for doing this. You may have heard of machine learning, which takes massive data sets and has the machine learn from them through machine learning algorithms. Then you have artificial neural networks, which take these little artificial neurons and use the human brain as a kind of map, almost a metaphor, for how to deal with these massive data sets and learn something new from them.

Then you have something called deep learning, which takes these neural networks and creates several layers of neurons. The deeper these layers are, the better your machine learning algorithms are at finding really hard-to-discover patterns. So if you have a really deep neural network, the machine can see patterns in things that humans can't even comprehend.
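To make that idea of stacked layers a little more concrete, here's a minimal sketch of a small multi-layer network using scikit-learn. The data is synthetic and the layer sizes are arbitrary; this is just to show what "several layers of neurons" looks like in code.

```python
# A minimal sketch of a layered ("deep") network, assuming scikit-learn
# is available. The data set here is synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# A toy stand-in for a "massive data set".
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three hidden layers of artificial neurons. Deeper stacks of layers can
# pick up patterns that shallower models miss.
model = MLPClassifier(hidden_layer_sizes=(64, 32, 16), max_iter=500,
                      random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

The point isn't the accuracy number; it's that each entry in hidden_layer_sizes is one more layer of neurons in the stack I've been describing.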
So when you see something like Google Translate using deep learning, what it's doing is using a deep artificial neural network, with machine learning algorithms, to look for patterns in how people speak. And then it translates the words: it sees these patterns and learns them, and it translates by matching new data to the patterns that it's seen in the past. So there's a model, and the machine updates it by itself.

Now, I don't expect you to go out with what I've just told you and start building your own artificial neural networks or start working on deep learning projects. But I think it's important to understand roughly what these things are. So think of it like this: you start out with machine learning algorithms, which can look for patterns. Then you can use artificial neural networks with these machine learning algorithms to find really hard-to-discover patterns. And then you can use deep learning to see really, really difficult-to-detect patterns in massive data sets.

So when you see something like Google's self-driving car, they're using a form of deep learning to collect these massive data sets as the car drives down the road and to figure out whether it can see patterns. When I see someone crossing the street, I know to stop, because I've seen that pattern a million times and I've collected enormous amounts of data. So when you see those cars driving around, what they're doing is collecting massive amounts of data so that deep learning algorithms can find these very difficult-to-discover patterns.
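That phrase, matching new data to the patterns it's seen in the past, is worth pausing on. A nearest-neighbor classifier is about the simplest possible version of the idea, so here's a toy sketch of it; the "situations" and labels below are invented, and a real self-driving system is of course nothing like this.

```python
# A toy illustration of "match new data to patterns seen in the past",
# assuming scikit-learn. Features are invented: [pedestrian_distance_m,
# pedestrian_speed_mps]; labels are what the car did in past situations.
from sklearn.neighbors import KNeighborsClassifier

past_situations = [[2.0, 1.4], [3.0, 1.2], [25.0, 0.0], [40.0, 0.1],
                   [5.0, 1.5], [30.0, 0.0]]
past_actions = ["stop", "stop", "go", "go", "stop", "go"]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(past_situations, past_actions)

# A new situation is handled by finding the most similar past patterns.
print(model.predict([[4.0, 1.3]]))  # closest past patterns all say "stop"
```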
These tools are getting cheaper and easier. But when you think about artificial intelligence in your organization, I don't want you to start thinking about the tools. I don't want you to go out and set up an artificial neural network, or start training everybody in TensorFlow or something like that. Instead, I think the best place to start for organizations that want to discover products is their organizational mindset. That's really the main obstacle that keeps a lot of organizations from getting any value from these new AI tools.

For the last 50 years, most organizations have been focused on operational efficiency. They're creating and meeting management objectives, they're going lean, and they're trying to make sure the operational part of the organization is as efficient as possible. You see this a lot with Peter Drucker, whom everybody's always quoting: the enterprise must have clear and unifying objectives. You must be able to set these objectives and then remove the operational inefficiencies that stand in the way of meeting them.

But to really get value from AI tools, you have to move away from that. You have to start thinking more about science. You have to think about discovery. And when you think about how most organizations operate, that's really not the way a lot of people approach new products.

A typical organization will approach new products using something like the typical project lifecycle. They'll start out by planning something new; this is when they come up with a project and a requirements document. Then they'll analyze how this project or product is going to fit into the organization, and they'll try to map objectives to it. So if you were going to come up with a new tennis shoe, you'd go out and plan it: we're going to build a new tennis shoe. Then you'd analyze it by looking at the market and mapping out some objectives: we want to have the shoe released two years from now, or next quarter, or whatever. Then you design the product: you describe the features and design it out. That's where, with shoes or other manufactured goods, people will create schematics and things like that. Then you'll code the product. If you're working with software, you'll have developers coding out the product. If you're doing something like a shoe, you'll have people manufacturing the product. But it's kind of the same result: this is where you're working to build out the product. And then you'll test it. In software, you have quality assurance testers who will go through and test the product.
In manufacturing and things like that, you'll have someone who just puts on a pair of shoes and goes for a walk, or you'll send them to some customers and see what they think. And then, assuming the tests all clear, you'll deploy the product and deliver it. Software gets deployed to servers; shoes get deployed to customers. So you have a typical project lifecycle: plan, analyze, design, code, test, deploy.

But AI products are different. Some common AI products are a next-generation business agent that pops up and can answer questions you type in on a website. There are object and pattern detection AI products. I once worked for a paper company that was trying to use object and pattern detection to mitigate workplace injuries. They had cameras set up on the shop floor, and the agent was designed to look for spills, or to see if someone left a cup of coffee on top of a piece of equipment. And then the AI would send out an email or a notice to try to mitigate that. And the most common AI products are the AI assistants, like Alexa or Siri, which use natural language processing to do some analysis in real time, learn from people's requests, and give them back the information they want. So these are pretty typical AI products.

You can't really use a standard development lifecycle with AI products. For one, it's difficult to have a plan, because you're going to be learning so much along the way. When the paper company was getting the cameras pointed at the shop floor, they learned a lot about the types of injuries they might be able to mitigate, and they learned a lot about how people might get injured. So it's difficult to plan it all out, because a lot of it is going to be discovery. And because it's difficult to plan, it's difficult to have a scope. It's difficult to know exactly when to stop. What's going to be the scope of your entire product? When does the company say, OK, we've mitigated enough injuries? Are they going for 100 percent, or 90 percent? So you're just trying to tweak and optimize the product over time.
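Here's one way to picture what "tweak and optimize over time" can look like in code. This is a hypothetical sketch with synthetic data (and a recent scikit-learn): the model is updated as each new batch of data arrives, and instead of checking off requirements, you watch a metric.

```python
# A sketch of optimizing a model over time rather than building to a
# fixed spec. Data is synthetic; the "batches" stand in for new data
# arriving as the product runs.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=5000, n_features=10, random_state=0)
batches = np.array_split(np.arange(len(X)), 10)
holdout = batches[-1]                      # kept aside for scoring

model = SGDClassifier(loss="log_loss", random_state=0)
for i, idx in enumerate(batches[:-1]):
    model.partial_fit(X[idx], y[idx], classes=np.unique(y))
    # There's no "done" milestone here; you track whether each new
    # batch of data nudges the metric in the right direction.
    score = model.score(X[holdout], y[holdout])
    print(f"after batch {i + 1}: holdout accuracy = {score:.3f}")
```

Whether you stop at 90 percent or chase 100 is a judgment call about value, not a requirement you can write down up front.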
And since you don't have a plan or a scope, it's also difficult to come up with requirements, again because a lot of these products are going to be learning as you go. You're going to be getting better at natural language processing, so it's very difficult to have very strict requirements of the form: this is what you do to achieve this level of functionality. You don't know what the requirements are. You don't know what tweaks to the machine learning algorithm are going to make the result more effective, because you're running experiments and you're trying to optimize. So requirements are very difficult. Requirements depend on knowing that if you do a certain thing, you'll get a certain outcome, and you don't really have that as much with AI products.

It's also very difficult to understand the quality of an AI product, because things start out less optimized and you optimize them over time. If you watch Google, a lot of times they'll start out with machine learning tools that aren't yet very effective, like when they started out with an Atari 2600 game that played against itself. It was great, and it was neat, but they were just starting out there. Then they optimized it over time, to the point where it could play Go, or play more complex video games. So it's very difficult to judge quality when you're always improving: you don't stop somewhere; instead, you start somewhere and then optimize over time.

And because you don't have the plan, you don't have the scope or the requirements, and you're not sure what quality is going to be or where you're going to stop on quality, it's very difficult to budget. You don't know when your product is going to be optimized to the point where it will be valuable to people. So you kind of have to run these experiments, improve the product over time, and then, if you feel it adds real value, release it.
Again, when you see a lot of these companies like Microsoft or Google or Facebook, what they're doing is creating these deep learning products that might not have that much value yet. You have Microsoft, say, creating a deep learning product that can identify humor or comics, which doesn't really have much commercial value yet. But they know that you start out somewhere and then you optimize and improve over time. So it's very difficult to think about these products the same way you think about a typical project.

And if you think about it, if you list out typical project objectives and compare them to typical AI products, you'll see the difference. A typical project might be something like: develop a customer self-help portal. A typical AI product might be: better understand a customer's needs; remember, we were looking at those agents. A typical project objective might be to create software based on customer feedback, where a typical AI product might be something like a cell phone company trying to create a model to predict customer churn, because keeping an existing customer is less expensive than getting a new one (there's a small sketch of a churn model below). Or a typical project might be something like: create an online course. But a typical AI product might be a machine learning algorithm that helps identify fake news; you hear a lot about that in the news lately, with climate change and things like that. Coming up with an AI product where you're trying to improve the ability of a machine learning agent to identify fake news is something you would optimize and improve over time, which is much different from a typical project. Another typical project might be to convert legacy code and update the server software, while a typical AI product would be something more along the lines of stopping security threats, where you have to be able to anticipate something completely new, or look for and identify patterns that might be hostile. And that's completely different from something where you can scope out the objectives and try to meet them.
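I've named the churn goal but not a method, so here's one hypothetical way such a model could start out, with fabricated features and a made-up churn rule purely for illustration.

```python
# A hypothetical churn-model sketch. The features and the churn rule
# below are fabricated; a real model would train on actual customer data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Invented features: months as a customer, support calls, monthly bill.
X = np.column_stack([rng.integers(1, 60, n),
                     rng.integers(0, 10, n),
                     rng.uniform(20, 120, n)])
# Invented ground truth: short-tenure customers with many support calls
# tend to churn, plus some noise.
y = ((X[:, 0] < 12) & (X[:, 1] > 4)) | (rng.random(n) < 0.05)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Score one customer: 6 months' tenure, 7 support calls, an $80 bill.
risk = model.predict_proba([[6, 7, 80.0]])[0, 1]
print(f"churn risk: {risk:.0%}, test accuracy: {model.score(X_te, y_te):.2f}")
```

A first version like this wouldn't be the product; it's a starting point you optimize, and sometimes what you discover along the way matters more than what you set out to build.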
So there's a big difference between AI products and typical projects where you can use project objectives.

What I like to do with customers when I work on AI products is to use an entirely different lifecycle, one that's based more on discovery, which I call the learning lifecycle, or discovery lifecycle. When you're working on an AI product, first you want to identify the roles: think about the different people who are going to interact with your product. Then ask a bunch of interesting questions about your product: How will we approach this? What are the different ways we could approach it? What are the different ways we can add value? Then research: look at the data, try to get as much data as you can, try to crunch it and get something interesting out of it. If you have data science teams, they can work with big data and see if you can create an agent that does something interesting with it. Then look at the results: share them with other people in the company, discuss reports, and try to see if this AI agent, this AI product, is doing something interesting. Then gain insights from it, draw conclusions, and, in the end, create knowledge.

If you look at how a lot of AI software companies are working with AI products, you can recognize this lifecycle. Like I said, with Google or with Microsoft, they'll start out with these small products that don't have much value. They'll ask some interesting questions.
Can we use a deep learning network to have a video game play against itself? Then do some research: have the machine play against itself a hundred thousand times, a million times, whatever, and see if it's learning something new, see if you're getting any interesting results, see if it's improving the model. Then see what insights you've drawn from it, see how you can improve the product, and maybe turn the product into something that can do something more interesting, like play a more complex game, like Go, or some of the more complex video games. And then see what you've learned. (There's a small sketch of this self-play idea below.) So this is a completely different lifecycle than what you have with a typical product lifecycle.

And what a lot of organizations do is take this lifecycle and run it in small, knowledge-creating sprints, almost similar to how agile software development works. They'll run through every phase of this lifecycle, and every two weeks see if they can produce something interesting. This helps the team learn something new and then quickly pivot if they find something.

I was working with a company once that was trying to create a machine learning algorithm to look through massive data sets to come up with credit card offers. As they were playing with the model and looking at the results, they noticed that the machine was actually pretty good at predicting whether or not someone was having trouble paying their bills. And so they were able to run a few sprints and see if they could come up with a new product based on that feedback and the insights and knowledge they got from one of the shorter sprints. So you want to run these as short little product deliveries and be able to pivot if you learn something new. If you end up tied to a really long lifecycle, it becomes much more difficult for your organization to learn something new, because they're tied into what they're doing and they can't quickly pivot based on new knowledge.
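Here's the sketch of self-play I promised. It's a toy, and it's my stand-in, not Google's actual method: instead of an Atari game it uses the simple game of Nim (21 stones, take 1 to 3 per turn, taking the last stone wins), with both sides sharing one value table and improving by playing against themselves.

```python
# A toy self-play learner for Nim, illustrating the loop of "play
# yourself, score the outcome, update the model, repeat". The game and
# the constants are arbitrary choices for illustration.
import random
from collections import defaultdict

ACTIONS = (1, 2, 3)            # stones a player may take per turn
values = defaultdict(float)    # (stones_left, action) -> learned value
alpha, epsilon = 0.5, 0.1      # learning rate, exploration rate

def best(stones):
    legal = [a for a in ACTIONS if a <= stones]
    return max(legal, key=lambda a: values[(stones, a)])

for episode in range(50_000):
    stones, history = 21, []
    while stones > 0:
        legal = [a for a in ACTIONS if a <= stones]
        a = random.choice(legal) if random.random() < epsilon else best(stones)
        history.append((stones, a))
        stones -= a
    # Whoever took the last stone won. Walk the moves backwards,
    # flipping the reward's sign because the players alternate turns.
    reward = 1.0
    for s, a in reversed(history):
        values[(s, a)] += alpha * (reward - values[(s, a)])
        reward = -reward

print("learned opening move from 21 stones:", best(21))
```

After enough episodes the table typically settles on the mathematically correct opening (take 1, leaving a multiple of 4), which is the same shape of story as the Atari example: the machine gets better purely by playing itself and updating its model.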
OK. So now you've seen a little bit about what an AI product is, and you've seen a little bit about how you can change your lifecycle from something that's focused on objectives to something that's a little more focused on knowledge and learning. Now it's worth thinking a little about how you can change your organization to actually be more exploratory: to be less focused on objectives and more focused on learning something new.

An AI researcher named Ken Stanley came out with a really interesting book called Why Greatness Cannot Be Planned: The Myth of the Objective. He talks about how humans are actually much better at discovering something new when they're not focused on objectives; that if you're able to explore, if you're able to do something close to the scientific method, teams can be more creative when they're able to use their imagination and take a more empirical approach, running small experiments. He talks about how this focus on objectives is actually an impediment to greatness: if you want to discover something new, then you shouldn't focus on objectives; you should really tap into human creativity.

One example of this is that humans are actually really good at being creative when they don't have a lot of information, or when they have massive amounts of it. Humans are very good at making sense out of nonsense, out of huge amounts of data. There was an interesting article I read in The New Yorker which said that when people were asked questions that were nonsensical, they were actually very analytical about it. Researchers took a group and asked them what was more likely to exist, a yeti or a dragon. And people could actually go through the analysis and say, well, I think a yeti is more likely to exist, because it might be smaller and more nimble, living in places where there's a lot of snow, whereas dragons are larger, so we would have seen them flying around, and especially if they're fire-breathing, they're more attention-getting. And they asked what's more likely to exist, a unicorn or a mermaid? And people said, probably a mermaid, because it would be in the ocean, and much more of the ocean is unexplored.
So even though people are being asked about something nonsensical, something fantastical, they're actually able to do some really interesting analysis. And that's very similar to how you want your teams to think when you're working on an AI product. A lot of the data you'll be getting when you're working on an AI product will require some creativity. It will require you to make sense out of nonsense. If you're focused on being completely analytical and not at all creative, and if you're focused on objectives, then you can actually have a lot of trouble making new discoveries.

One of the things Stanley talks about is that you want your organization to embrace serendipity. You want them to be able to ask interesting questions and to pursue interestingness, to pursue novelty. There are examples of organizations that have discovered something new through serendipity. The microwave was discovered because someone working on radar equipment noticed that the chocolate bar in his pocket was melting. It was a serendipitous discovery, and they were creative about it: they thought, OK, maybe we can make an oven out of this. Plastic was discovered serendipitously, as a sort of byproduct of petroleum. Teflon was discovered serendipitously. A lot of these products were not objective-driven; a team was working together, they were creative, they were able to ask interesting questions, and so they were able to discover something new.

And more recently, if you're a fan of Silicon Valley, there was an episode where one of the software developers was trying to develop an AI product that could identify whether or not something was a hot dog, and he called it the Not Hotdog AI product. He created this product, focused on it, doing discovery and crunching data. And it ended up being really good, but it wasn't really commercially viable; not that many people need to identify a hot dog.
So at the end of the episode, he ended up selling it to Instagram as a way to filter out whether or not someone was uploading the wrong kind of pictures. It's a pretty good example of a product that started out going in one direction and then, through creativity, looking for interestingness, and pursuing novelty, was able to pivot and do something else. And remember, you want to run this lifecycle in short sprints so that you can pivot quickly and look for something interesting. If your project is completely focused on objectives, you're going to miss a lot of opportunities to discover something new.

And again, when you look at a lot of the companies that are focused on AI products, this is exactly what they're doing. They're working on products that might not have that much commercial value, but they're learning something, and they're learning how to work with the technology. The machine is updating its model, and they're improving their algorithms.

Professor Stanley describes these discoveries as stepping stones: each time you learn something new, you're taking a step closer to your product. So with Not Hotdog, or with Google creating an Atari 2600 algorithm, each one of these is a stepping stone to creating something new. Now, these companies might not know what the end result is going to be, because you don't really know what all the stepping stones are until you're at the end of the path. But each time they take a step, they're learning something new. You only know your pathway at the end, looking back.

And this is exactly what happened with a lot of the products I've worked on. With the shop floor, they learned something new and were able to optimize their algorithm. With the credit card processing firm, they were able to spin off a new product that they completely didn't anticipate, because each step they took got them a little bit further. They weren't focused on objectives; instead, they were focused on learning. And that let them develop some really interesting AI products. It might seem strange to use words like creativity and serendipity when you're talking about product development.
But if you think about it, a lot of us have had very serendipitous things happen without really thinking about it. Think about your career: maybe when you were in high school or college you had very well-set career objectives, but then something serendipitous happened. You picked up a job you weren't expecting, or you got a promotion you weren't expecting, and your career went in a completely different direction. The fact that you were able to take advantage of that as a person probably really made your career, really changed your career. But organizations and their teams are not very well structured to take advantage of the same thing. They'll say, well, this might be a serendipitous thing that happened, but I've got these set objectives, so I really can't take advantage of it. So you have to think about this when you're trying to develop an AI product: you want to change your organization so it can see these serendipitous stepping stones, learn from them, and build out a product, kind of like you might do as a person. It's just much more difficult to do as a team working in an organization.

Much like in data science, I've found that really small teams working on AI products, structured in a way that's consistent with the scientific method, get much better results. You have a three-person team focused on discovery: a knowledge explorer, a data analyst, and a servant leader, working together in a tight team to find new products. A lot of times, if you have a good data science team, they'll already be structured this way. And then your data science team can also work on your machine learning products as a natural progression, because data science teams are using the scientific method to explore massive data sets, so they might end up using machine learning algorithms or artificial neural networks to crunch that massive data and churn out data products. It makes sense that they would have a very similar team structure. But one of the most important roles on this team is the knowledge explorer, and they should think about things differently than the rest of the team.
There's a book by Daniel Pink called A Whole New Mind, where he goes over some of the new skills that are becoming much more valuable in organizations as they start to invent, do more interesting things, and develop products like these AI products. He puts much more emphasis, which I think is correct, on story over reporting: you want people in your organization who can fashion a compelling narrative instead of focusing completely on reporting what's happening. You want people who favor symphony over detail. Organizations tend to favor people who are very detail-oriented, but for AI products, you want people who can look at the big picture, who can cross boundaries and see how the pieces contribute to the overall whole, instead of focusing on the details. There are a lot of detail-oriented people in your organization, so you might need to train people up in this new skill set, or find someone new. And these people on your team should have empathy over certainty. Instead of focusing on objectives, they should look at what makes their fellow humans tick, and understand how people forge relationships and how people might care for others.

Now, the name for this role that I like the most, and that I've seen in a lot of organizations, is the knowledge explorer. I think it encapsulates how this person should think about themselves: they're trying to gain organizational knowledge, and they're trying to do it by exploring the data with the team. This person won't be in charge of crunching the numbers; that's more the data analyst. But they will be the person who's asking interesting questions, making sure the team isn't just focusing on objectives but is instead looking at these questions and seeing if it can learn something new, and helping the team pivot if it finds something serendipitous, if it makes a discovery it wasn't anticipating, which, when you're working with massive data sets and machine learning, is not uncommon.
Again, you're going to find a lot of new stuff when you start crunching these numbers, and when you start using machine learning algorithms to look through the data you already have.

A really good example of this, from a few years ago: there's a professor at Cornell, Professor Kleinberg, who came up with a really interesting challenge. He wanted to see if his students could discover, on Facebook, whether or not people were in a romantic relationship. At the time, Facebook didn't have you tag whether people were in a relationship, so he was just using the data that was out there to try to figure out if he could make that connection himself. And he used a very non-objective-driven approach; he tried to be creative. He acted as the knowledge explorer, the team acted as data analysts, and they tried to figure out if they could make meaning out of this massive data set.

When they got together, they asked questions and made meaning; they talked it out. Remember: empathy over certainty. So they talked about what it is that makes people tick. How do you know when people might be in a romantic relationship? One of the things that came up is that when people enter a romantic relationship, a lot of times they'll end up becoming friends with their new partner's friends. They'll meet a whole bunch of new people, the people their new romantic partner has been friends with. So they thought about how they could represent that in the data. And they came up with a chart, a little visualization, which shows the dark concentration of new friend connections in the Facebook data. You can see that a lot of people are meeting new people, and that was how they figured out, fairly accurately, whether or not people were in a new romantic relationship with one another. And that's the way you want to think about this data.
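As a toy illustration of the underlying idea, you can score each friend by how densely their friendships overlap with yours, since a new partner often brings a burst of shared connections. This is a deliberately simplified sketch with a made-up graph; the published work by Kleinberg and Backstrom actually used a subtler measure they called dispersion.

```python
# Toy version of "find the partner from the friendship graph".
# The graph is invented; each person maps to the set of their friends.
friends = {
    "alice": {"bob", "carol", "dave", "erin", "frank"},
    "bob":   {"alice", "carol", "dave", "erin"},
    "carol": {"alice", "bob"},
    "dave":  {"alice", "bob"},
    "erin":  {"alice", "bob"},
    "frank": {"alice"},
}

def shared_friends(user, other):
    """How many friends the two people have in common."""
    return len(friends[user] & friends[other])

user = "alice"
scores = {f: shared_friends(user, f) for f in friends[user]}
print(scores)
# The friend with the densest overlap is the candidate partner.
print("likely partner:", max(scores, key=scores.get))   # -> "bob"
```

In the toy graph, "bob" stands out because so many of alice's friends are also his friends, which is the kind of pattern the students went looking for in the real data.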
When you're working on an AI product, you want to be able to have empathy, and you want to be able to pivot the product based on the knowledge that you create, if you find something new. And again, they couldn't create an objective; they couldn't say, here are the steps we're going to take to find out if someone's in a romantic relationship. They had to ask interesting questions, and they had to explore the data to see if they could find something interesting.

So one of the key things I want to emphasize is that when you're building out AI products, a lot of the technology you're going to use is either free or not very expensive. You can use TensorFlow; you can download Python libraries. A lot of the challenge around AI isn't really technical so much as cultural. You have to change your organization. And it's also about having the people on your team embrace the right mindset.

So some of the key things you should keep in mind: Does your organization have an agile mindset? Are they able to think about things they can deliver quickly? That's almost the first step: can they deliver value quickly? Does the organization already make data-driven decisions? Do you have data science teams in place? Hopefully they're using more of these small teams, taking a creative approach to looking at their data. Does the organization tolerate, and even value, change? If you're in a very conservative organization that focuses on structure and certainty, it's going to be very difficult to deliver an AI product, because you're going to be learning something new, and you'll have to be able to pivot, ask interesting questions, and try to discover something, to look for interestingness, as Professor Stanley says. And does it have the right reward environment for people to experiment? If you start out doing things one way, and then you learn something new and there's a better way to do it, are you going to be penalized or rewarded for it? You want people to be rewarded for discovering something new, something interesting.
So you want to take these small steps with these AI products that might not have that much value at the beginning, and then, through serendipitous discoveries, take these stepping stones to learn something new and deliver something valuable. Now, that might sound completely different from how your organization operates, and it might be. But again, if you look at how these AI products are being developed, and at some of the organizations that are using them the most, like Google and Facebook and Microsoft, this is exactly what they're doing. They're developing products that don't have that much value, refining them over time until they do have value, and pivoting based on what they learn. So this is the way you want to deliver these products.

So here are five key takeaways I want you to have from this. First, artificial intelligence tools are getting cheaper and more widely available, so you shouldn't think of delivering an AI product as a technical challenge; you should think of it more as an organizational challenge. The tools are not going to help you unless your organization has the right mindset. Second, do you have people in your organization who are comfortable discovering? Can you create these small teams that can pivot and learn something new to deliver a great product? Third, if you're starting out with AI, have your organization focus on exploration and not on objectives. Look at Ken Stanley's book and see how objectives can actually get in the way of trying to discover something new; he learned this from his own AI research. Fourth, embrace, and don't suppress, serendipitous discovery. A lot of times, when you're in a large organization, there will be a lot of people with different backgrounds doing different things, and so you might discover something serendipitously. You might figure out the way people are getting injured on a manufacturing floor. You might figure out some way to predict that a customer will have trouble paying their credit card bill by looking at the data. You want those people to make these serendipitous discoveries and then roll them into your product.
And finally, much like in data science, you want to work with small teams. You want small teams exploring your data, instead of just having one person who's an AI specialist or a data scientist. Small teams increase the likelihood that you're going to have a serendipitous discovery, and you also don't have to pin everything on hiring one person to take you where you need to be.

So all five of these takeaways will hopefully help your organization get on track to start delivering these AI products. I hope you enjoyed this, and good luck. Thank you for watching.