Episode 31: The Business Leader as Data Consumer


[00:00:00] Dr Genevieve Hayes: Hello and welcome to Value Driven Data Science brought to you by Genevieve Hayes Consulting. I'm Dr. Genevieve Hayes, and today I'm joined by Dr. Howard Stephen Friedman to discuss how adopting a customer mindset can help business leaders capitalize on the hidden value of data. Howard is a data scientist, health economist, and writer with decades of experience leading data modeling teams in the private sector, public sector, and academia.
[00:00:31] He is an adjunct professor teaching data science, statistics, and program evaluation at Columbia University and has authored or co-authored over 100 scientific articles and book chapters in the areas of applied statistics, health economics, and politics. His previous books include Ultimate Price and The Measure of a Nation, which Jared Diamond called the best book of 2012.
[00:01:01] Howard, welcome to the show.
[00:01:03] Dr Howard Friedman: Thank you. It's a pleasure to be here.
[00:01:05] Dr Genevieve Hayes: When data science first became the must have skill of the 21st century, organizations were fighting to recruit the best and brightest data talent. But the glory of having a data scientist on staff was often short lived when many organizations found they didn't know what to do with them.
[00:01:24] Business leaders had been sold the dream of being able to turn their data into business gold, but were unable to maximize the value of the data expertise they had brought in because they couldn't communicate effectively with their new data science teams. And that led to a lot of time and money being wasted and many business leaders becoming disillusioned with data science.
[00:01:49] In some cases, writing it off completely. But as the guests on this podcast have demonstrated, it is possible to create business value from data. Business leaders just need to know the right questions to ask. And those critical questions are something the new book, Winning with Data Science, a handbook for business leaders, aims to equip its readers with.
[00:02:13] So to begin with, Howard, could you provide our listeners with a brief overview of Winning With Data Science and how it came to be?
[00:02:23] Dr Howard Friedman: So thank you very much again for allowing me on the show. I think you framed this really well. A lot of businesses were frustrated and continue to be frustrated by not seeing the returns that they expected for investments in data science. This particular book really started with that deep understanding.
[00:02:44] As you mentioned in the background, I worked in the private sector for many, many years in different industries. My co-author, Akshay Swaminathan, also has a lot of experience in the private sector. And through this experience, one of the things that we saw was that many projects fail because of a lack of communication between the customer and the data science team.
[00:03:09] Now, that communication starts off at the very beginning. That is, with a lack of understanding of really what the customer wants to achieve. What is the problem they're trying to solve? And then a bit of a throwing over the wall, where they're hoping the data science team has picked up on those hints or clues about what they're trying to do, and not really having the direction they need.
[00:03:30] And so then the communication gap persists. The data science team eventually comes back with a solution, perhaps not implementable at the scale that was desired, perhaps not at the cost or time that was expected, and with a very frustrated customer. So that breakdown in communication, in expectations, and even in project prioritization, is a huge frustration.
[00:03:56] The goal of our book was to try and build the customer up in a way that they can demand better work. Work that suits their needs. And to do it in a way that's different. To be honest, I think textbooks are boring. And I teach with textbooks. So we didn't want a textbook. And that's why we took this innovative approach where we have fictitious characters.
[00:04:20] And it's a storytelling approach, and they have narratives and dialogues, and they move across. And if we have time, I'm very happy to tell you about these characters and a little bit about their background experience. But we really tried to treat it like a book, a book that we can all read and enjoy and learn.
[00:04:37] Dr Genevieve Hayes: I haven't had time to read the whole book, but I did skim through it in preparation for this episode and I found it was incredibly easy to read. You start reading one of the case studies in it about these characters and you sort of find yourself wanting to know, okay, what happens to Steve?
[00:04:52] What happens to Kamala?
[00:04:55] Dr Howard Friedman: So, well, first of all, I want to congratulate you. A few interviewers would admit that they're going to get a C-plus, perhaps a B-minus, on preparation. Just joking. No, in all seriousness, we wanted to make it interesting. And so what we did, which is a technique when you write fiction books, is we started
[00:05:14] with a discussion, a long conversation between Akshay and myself on who these characters are. What is their backstory? Where did they come from? How did they get to this point in their corporate lives? And then where are they going? So, for example, Steve, we had his backstory written out before we wrote any of the chapters.
[00:05:33] So we knew who he was, and we knew he was going into consumer finance, and we had the rotations he was going to do because he was an MBA student. He'd come out with his degree, and he wanted to make a difference in the world. A lot of corporations do have these rotation programs, and we had him move across departments, from the fraud department to the risk department; later
[00:05:55] he's in real estate. And then the question was, well, where is Steve going in his career? Remember, he's a business person, and in his particular case, I'll tip you off right now: he got so into the data science itself, the details of it, that he decides he wants to be more of a data scientist himself. So he starts learning programming, and that's kind of his career route.
[00:06:15] And by the way, in my own world, I have had people working for me for whom that became their trajectory. They just loved the programming or AutoML software and seeing what they could do themselves with their hands on the keyboard. So that's a real story for people I've worked with. Now, our other character, Kamala, she's coming from a different end.
[00:06:36] She's this incredible superstar student. She's the first person from her family ever to go to college, and she's got an MD, but she's also got a business background, and she's just trying to fly up the corporate ladder. She took a risk on a startup, it didn't work out, and now she's got this debt and she's got this family that is expecting so much from her.
[00:06:57] She's so incredibly motivated and rising up as she becomes this broader leader who all these data science teams are taking their guidance from, and she's got wonderful mentors above. And if we have a chance, I'd love to talk about our mentors. But she's growing and growing, and she ends with a promotion.
[00:07:14] She's a VP at the end of the story, and she's going up that C-suite ladder. She's going to be that super business leader. Maybe eventually she'll be a chief data officer or a chief information officer. Someone who understands the business, but can talk the data as well.
[00:07:29] Dr Genevieve Hayes: What you just described there, you have described people that I have encountered throughout my career, and there are parts of that that resonate with my own experiences.
[00:07:40] Dr Howard Friedman: Well, they say that writers, even if it's fiction, don't go far from their own truth. And in this book, there's a lot of my own truth. Akshay's lived experiences, my lived experiences. But even a lot of the characters are fictionalized versions of real people that I encountered: wonderful mentors that I experienced, really people who helped shape who I became and people who helped me learn in a positive way.
[00:08:09] I purposely didn't bring too many negative characters into the story. We all live in the real world, right? And there are some people who aren't helping you grow. But I didn't want that to be what this book is about. So she has some wonderful mentors in her experience. And at the same time, Steve's got some people who are also great mentors, but also they're learning as they're going, learning the language, but also learning how to interact with people.
[00:08:34] Dr Genevieve Hayes: So Howard, you yourself are an experienced data scientist, as is your co author, yet you decided to target your book to business leaders rather than to fellow data scientists. What made you decide that?
[00:08:48] Dr Howard Friedman: It's a really interesting question. You know, as we thought about where we see great opportunities and our own experiences, we perhaps from a selfish point of view felt that if we had better prepared customers, we would do our jobs better. Did I just blame the other person? I think I did. Well, anyway, our idea was if we can strengthen the capacity of
[00:09:13] the customer to help us do our job well, then everyone would benefit. So they say that often when you write a book, you're writing for a niche that isn't there. Of course, data scientists, I think, could read this book. And I was just talking to a wonderful data scientist earlier today who's already read the book.
[00:09:30] And she said she read it because she wanted to better understand how she could help the customer by thinking through their vision, seeing things from their world. But they're not the target audience. It's really about empowering those business people so that they demand better services, so that they require the data scientists to answer certain questions, to use certain frameworks, to be prepared.
[00:09:55] And I think that it's to help us do our jobs better. So, you know, we were chatting earlier about movies before we got into the formal podcast, and I almost felt like it was a Jerry Maguire moment of "help me help you." Right? So I really was aiming for that "help me help you" by giving you the language, giving you the frameworks, giving you the questions, so that I can do my job better.
[00:10:18] Dr Genevieve Hayes: Given your extensive experience and knowledge of data science, was it difficult for you to put yourself in the shoes of a business leader who may know very little about data science?
[00:10:30] Dr Howard Friedman: Now, part of, I think, my niche as a data scientist has always been my communication. I think that it's because also I teach, and I teach at different levels. So for me, it didn't take too much of a leap. You know, I work, of course, with these business leaders all the time. And sometimes it's with the teams at the manager level, but sometimes it's the C-suite.
[00:10:52] And that's part of what I bring to the table as a data scientist: going between the technical people and the decision maker. So for me, it was quite natural and something that I've done for decades, straight out of graduate school. That was something that I seemed to do well. Akshay,
[00:11:09] on the other hand, he's just a natural. I mean, my goodness, I've written a lot of books. This isn't my first book. But I had known him for a few years, and when I first saw how good a writer he is, he's the one who taught me to move to more of a narrative approach, to bring in that dialogue.
[00:11:27] So I learned from him how he made his chapters very accessible, and then I started adopting his style. In fact, I'll tell you a funny story with that. You may not have seen this, but the dedication in this book is to my dad. Now, my father, he suffers through reading the first draft of everything I write, and he's suffered for many years.
[00:11:49] You get to read the polished book, after every chapter has had probably 10 or so readers giving me feedback, improving it. But he's reader number one, and his comment to me was, wow, that co-author really knows how to write. He goes, you should pay attention to him and learn how to write better.
[00:12:07] Dr Genevieve Hayes: I actually did notice the dedication on it. I always read the dedications because I find them fascinating.
[00:12:12] Dr Howard Friedman: Oh, that's great. I appreciate that. You know, it's interesting, because having different books, I've dedicated them to different people over time. But this one was really special because, like I said, he's worked so hard for so many years with me to make something that's important to me come to life.
[00:12:27] And I think that's part of the perspective of loving someone, as you say: well, this is something that my son wants to do, so I'm going to take on a task that isn't always that pleasant.
[00:12:38] Dr Genevieve Hayes: Yeah. My dad read my entire PhD and told me it's very good. I don't know what you're talking about, but it's very good.
[00:12:45] Dr Howard Friedman: Well, there we go. And that's really what you wanted: to hear that it's very good.
[00:12:52] Dr Genevieve Hayes: One of the things you talk about in your book is the value of business leaders adopting a customer mindset when working with data science teams. And I really like that concept. So that listeners can get an understanding of what I mean here, here's a quote from early in your book, in the case study about Steve.
[00:13:13] Steve felt overwhelmed. He wanted to understand everything that the data science team members mentioned and not embarrass himself. He wanted them to view him as a well informed future leader of the company. But then Steve remembered that he wasn't building the solution. He was the customer. He needed to act like a customer.
[00:13:34] If he was buying cabinets and having someone install them, he would know how large the cabinets had to be, where they would be installed, and how exactly he would use them. He would let the cabinet professionals build the solutions for him, while he made sure that the products met his needs. In that passage, I think you've done a really great job of capturing the mindset of many business leaders who I've encountered who have faced the prospect of working with a data science team and explaining why they don't need to know everything about data science to be effective in their jobs.
[00:14:11] Obviously, a business leader shouldn't be expected to be able to program or fit a machine learning model, just like they shouldn't be expected to do the business's accounts. But what sort of things should they know in order to be an effective customer of a data science team?
[00:14:29] Dr Howard Friedman: So in order to be a good customer, you have to have some understanding of roughly the types of problems that they could address. But that's roughly because really, when I see a good customer, they start with communicating what are the issues they're having? What are they trying to achieve? What are their goals?
[00:14:49] And then the conversations with the data scientists help the business leader understand which projects she or he should be prioritizing. So it's starting with the problems and then communicating more about, well, how do we prioritize? So I love creating a list of seven, eight, nine potential problems and then starting to think about the prioritization.
[00:15:15] Well, which one should be attacked first? Is there a sequencing? Am I prioritizing it based on a business need, something that's mission critical? Or are we prioritizing it based on the expected return on investment? Or is it a timeline issue? So those are the things that we start thinking about, and then really getting into the question of what is success.
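The prioritization exercise Howard describes, listing candidate problems and weighing business need, expected ROI, and timeline, can be sketched in a few lines of Python. The project names, criteria, and weights below are illustrative assumptions, not examples from the book:

```python
# Candidate data science projects, each scored 1-5 on a few business
# criteria. The projects, criteria, and weights are all hypothetical.
projects = {
    "fraud-alert triage": {"mission_critical": 5, "expected_roi": 3, "time_to_value": 4},
    "churn prediction":   {"mission_critical": 2, "expected_roi": 5, "time_to_value": 3},
    "report automation":  {"mission_critical": 1, "expected_roi": 2, "time_to_value": 5},
}

# Weights reflect what this (hypothetical) business cares about most.
weights = {"mission_critical": 0.5, "expected_roi": 0.3, "time_to_value": 0.2}

def score(criteria):
    """Weighted sum of a project's criterion scores."""
    return sum(weights[k] * v for k, v in criteria.items())

# Rank the candidate list, highest priority first.
ranked = sorted(projects, key=lambda name: score(projects[name]), reverse=True)
for name in ranked:
    print(f"{name}: {score(projects[name]):.2f}")
```

The point is not the arithmetic but the conversation it forces: the customer has to say out loud which criterion matters most before the data science team commits to anything.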
[00:15:39] Because the customer's perception of success may be very different from the data scientist's. The customer may have a whole set of criteria around what is the expected performance. Let's say it's a model, for example. What is the performance I need to call this useful? How soon do I need it? Who is going to execute it?
[00:15:57] Is this going to be automated, or is there going to be a human in the loop? What are the costs involved? And once you start getting those parameters, then the data science team knows what they're trying to deliver, why they're trying to deliver it, the purpose, and what the parameters for success are. And that helps drive the conversation about how they move to the next steps and really creating those solutions.
[00:16:20] So, I think that clarity of conversation matters, but it starts with the customer: what does she or he need? And then moving from there to the definitions of success. And then from there, the data science team thinking through what their potential solutions and options are. And quite frankly, there's often an early version of the solution, which they introduce and they say, this is version number one.
[00:16:42] We have an expectation that we're going to come back over X time period with improved versions, but we're giving you the MVP first, the minimum viable product, so that you have something in-house that's functional. And that's something that we do very often, again, understanding from the customer: is that what they're looking for?
[00:16:58] Do they want a final product right away? Or do they want to see what an early product would look like? And then we continue working, but they stay in the loop at all times. We don't throw it over the wall and then hope they're happy because that never works out well for us.
[00:17:11] Dr Genevieve Hayes: What I find really interesting in what you just described: the advice that's often given to data scientists is for them to ask all those questions of their customer, the business leader, and guide them through that process of answering those questions. If you're a relatively junior or green data scientist, it can be challenging to have that conversation with someone who is much more senior than yourself. What you're describing there is allowing the business leader to lead that conversation, which I suspect might get better results, particularly if you're dealing with more junior data scientists.
[00:17:52] Dr Howard Friedman: So, every dynamic can vary based on the people in the room. And so, in some cases, it could be the data scientists bridging it. In some cases, there's even a cadre of data scientists who are meant as business liaison people, and they're talented not just in communications, but also in having some understanding of the technical solutions.
[00:18:12] So it does depend. My goal in this book is empowering the business person. And if tangentially that also empowers the data scientist, that's great, because it gives this framework, these lists of questions. So, for example, let's say the conversation comes around data sources. The book gives the customer a whole list of possible questions they could ask around data sources, but it also tells the data scientist: you should think in advance
[00:18:38] What the customer might be asking, so that it's not a surprise, because it's very frustrating for the person who's trying to hire a team. And by the way, this isn't always an internal team. So the idea with this book is that this is also a way to vet potential vendors who show up and say, well, we can build this and this for you. It's to have a systematic way to ask questions to assess.
[00:19:00] Are they really qualified? Because very often, external vendors could be extremely well qualified, they could be creating great solutions, or they could be just handing over the same off-the-shelf solution every time, and they're not digging deeply at all. And the customer should understand: are they getting a bespoke solution?
[00:19:17] Do they need a bespoke solution? Or is off-the-shelf absolutely fine? But really making sure they know what they're doing.
[00:19:23] Dr Genevieve Hayes: Another thing I found interesting in your book: you do discuss things like machine learning and different types of algorithms in the later part of the book, but you don't actually discuss new technologies like generative AI or large language models or whatever the next buzzword is.
[00:19:44] How important is it for business leaders to keep up to date with all those new technologies?
[00:19:50] Dr Howard Friedman: So we do have a little bit of mention of generative AI. You know, the problem with books is they're not blogs, so there's a long lead time. But we have a little bit of discussion of generative AI in the book. And in fact, we have a fun little section towards the end where we write about a hackathon where gen AI sneaks its way in as well.
[00:20:06] But it is a challenge. If you're a business leader, from the very top down, your board wants to know how you are using it. Or go back a few years ago: what do you mean you haven't been using big data? Tell us what you're doing with deep learning. So these phrases are out there. And if you're raising money,
[00:20:29] there's a bit of gamesmanship in using those phrases, the gamesmanship associated with funding and financing of a business. And I think we can understand that if we've played in that startup space, and I've done a lot of work in that startup space. So you have to have a foot in that door. But what we very much emphasize in our book, and what I've seen throughout my career, is:
[00:20:49] Start with the basics. Start with problems that you're trying to solve, and understand your data and the way to get to that solution. And that means pulling your data, cleaning your data. It's not as exciting, but starting to do the descriptive statistics and understanding: what is the basis of the solution?
[00:21:08] What features are providing the most information? Because, you know, while it's possible that you could build this incredibly complicated 50-feature model that has amazing results, maybe for almost the same performance you can have a very simple model that could get you most of the value and have extremely high explainability. Back to the customer: which one is more relevant?
[00:21:30] Plus, maybe all those features cost you money to acquire, and you have three or four that get you most of the value just off the shelf, waiting. So that whole question comes up. We focus on: start with the basics first, before you go to anything advanced. Understand the problem, get your data ready, do the descriptive statistics, take the time to understand
[00:21:52] a little bit about your features and the results you're trying to predict before you go all the way into the most advanced modeling possible. And by the way, I'm a huge fan of AutoML software. I think it really does democratize the advanced modeling. And at the same time, I don't tell people, take your data set and just jump into the AutoML.
[00:22:11] I want them to spend a little time understanding their data first. It'll help them find the mistakes in the data. It could be data acquisition errors, it could be just typo errors, whatever it is, but find those errors so that you're not building models off the mistakes.
[00:22:28] And that's very important. I was just having a conversation earlier today with a dear colleague who works at one of the largest banks in the world, and she was telling me her biggest frustration is people skip those steps. They've got huge errors in their data, and they're just jumping right to the most advanced modeling possible.
[00:22:45] And then they have to circle back, whether it's a few days or a week later, to figure out what just went wrong. And she said it's the basics of data analytics that she always encourages her team to do before they jump into the most advanced modeling. And we know it's exciting to see these great techniques.
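The "basics first" workflow Howard outlines, descriptive statistics and simple error checks before any modeling, might look something like this in pandas. The toy dataset and the plausibility bounds are assumptions for illustration:

```python
import pandas as pd

# A hypothetical raw extract with the kinds of problems Howard mentions:
# a typo-like outlier (age 410) and a missing value.
df = pd.DataFrame({
    "age":    [34, 41, 29, 410, 52],
    "income": [58_000, 72_000, None, 61_000, 83_000],
})

# 1. Descriptive statistics: the max age of 410 jumps out immediately.
print(df.describe())

# 2. Missing values per column.
missing = df.isna().sum()
print(missing)

# 3. A simple plausibility rule (bounds are an assumed business rule).
bad_ages = df[(df["age"] < 0) | (df["age"] > 120)]
print(bad_ages)
```

Catching the 410 here costs seconds; catching it after a week of modeling, as in the bank anecdote, costs the circle-back Howard describes.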
[00:23:05] And some of them truly have replaced a lot of the basic work. You opened up the conversation on generative AI, and, you know, this has been such a game changer for us, because I had a problem come to me just a couple of weeks ago. The team in customer service had been extracting customer feedback.
[00:23:21] This was for a real estate company. They had extracted feedback from tens of thousands of customers, and they wanted to systematize this so that they had an understanding of the categories of feedback and the urgency of response, so that they would have an automated system do it instead of having someone manually read the written comments and go through it.
[00:23:43] Now, take you back four or five years ago: we could have built this, and we would have had a sentiment analysis involved, and we would have trained it, and we would have had some categorization. We'd have done some really nice NLP on it. But now, it's just a few lines of good prompt engineering, and the show is over.
[00:24:00] And so the technology has moved, and the bar has changed. And the bar has changed to really, again, democratize who can provide value, but also to understand that a problem that used to be doable but fairly challenging has become quite easy. And that's great. I think that's really wonderful. And as we get to the multimodal models, we're going to be opening that door even further.
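A minimal sketch of the feedback-triage approach Howard describes: rather than training a bespoke sentiment and categorization pipeline, a single structured prompt asks an LLM to assign a category and an urgency. The category list and prompt wording are illustrative, and `call_llm` is a hypothetical stand-in for whichever model API you use (here it returns a canned reply so the sketch runs end to end):

```python
# Illustrative category list for a real-estate feedback triage system.
CATEGORIES = ["maintenance", "billing", "noise complaint", "lease question", "other"]

PROMPT_TEMPLATE = """You are triaging customer feedback for a real-estate company.
Assign exactly one category from: {categories}.
Assign an urgency of low, medium, or high.
Reply as: category=<category>; urgency=<urgency>

Feedback: {feedback}"""

def build_prompt(feedback: str) -> str:
    return PROMPT_TEMPLATE.format(categories=", ".join(CATEGORIES), feedback=feedback)

def call_llm(prompt: str) -> str:
    # Hypothetical placeholder: in practice this would call your model
    # provider's API. It returns a canned reply so the sketch is runnable.
    return "category=maintenance; urgency=high"

def triage(feedback: str) -> dict:
    # Parse the constrained "key=value; key=value" reply into a dict.
    reply = call_llm(build_prompt(feedback))
    return dict(part.strip().split("=") for part in reply.split(";"))

result = triage("The heating in unit 4B has been broken for three days.")
print(result)
```

Constraining the reply format is what makes the output machine-readable, which is the "automated system" the customer service team wanted; in production you would also validate that the returned category is actually in the allowed list.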
[00:24:20] Dr Genevieve Hayes: The analogy I always draw is what's happening now with Data science and machine learning is equivalent to what happened with website programming 10 or so years ago. It used to be that if you wanted a website, someone had to build it for you by writing HTML. Now most websites are just built using drag and drop website builders.
[00:24:44] Dr Howard Friedman: It's a good analogy. I think it's even more subtle than that. So you make a great point. I mean, I remember coding in HTML. It's going to be a lot more than 10 years ago, I promise you, but I was coding in HTML way back when. And now, of course, you've got all these templates, drag and drop, you're ready to go.
[00:25:00] But where we've gotten these days with some of these fantastic LLMs is you can have them draft your programming code for you too. And so you can get, you know, not perfect, but pretty good structure in Python or R with some really clear, simple prompting, and then you can clean it up.
[00:25:19] You take that code, you use it, you clean it up. And that's fantastic, because as you change the bar for what is technical work, it makes you think about what the value add of your contributions is. And I'm okay with that. A lot of people are concerned about it from the point of view of future employment.
[00:25:38] I think it raises the bar of what we define as intelligence. And in fact, I'll take you back in time: when I was growing up, the ability to quickly regurgitate facts was a sign of intelligence. In what year did this and this occur? Well, that got replaced by quick lookups like Google.
[00:25:55] So that's been unnecessary for decades as a measure of intelligence. Calculations: could you multiply this in your head? Still useful, and getting order of magnitude is useful, but that got replaced very easily, the way everyone with a cell phone has a calculator. And now we're seeing these other skills, such as basic programming, are also no longer a major measure of skill.
[00:26:16] Now, you still, of course, need advanced programming, and we really need the data engineers more than ever. Their importance has become more and more critical. But some of those other skills are less critical, and quite frankly, I think that people doing the job interviews need to revise how they conduct job interviews, because they're still doing job interviews as if these other technologies don't exist.
[00:26:37] I think in some cases, they're living quite in the past.
[00:26:40] Dr Genevieve Hayes: It's the same with how universities and schools teach things now. I think the lecturers and teachers in many cases haven't caught up with the fact that students will be using generative AI as an assistant going forward.
[00:26:56] Dr Howard Friedman: So, you make a great point with that. In the school systems and universities, they are obliged to develop policies. Not every university has, by the way, and it's a little frustrating, because I can tell you, I know some very well respected universities, not going to name any, who put it on the individual professor for her or him to decide their own personal policy.
[00:27:18] And to me, that seems highly unfair to the individual teaching. That said, the analogy that I hear outside of the education space, but in the workspace, is there are going to be people who embrace using these generative AI and other tools, and people who don't, and you will find yourself losing to people who effectively embrace these technologies in terms of future jobs.
[00:27:43] So it's less about being outperformed by the technology and more about being outperformed by the person who uses the technology effectively. And again, "effectively" being a key word, because you can't use these things blindly. Errors happen, mistakes happen, biases can get replicated. So you have to use it with open eyes.
[00:28:01] But that is how I use it throughout my work experience.
[00:28:05] Dr Genevieve Hayes: It's just like with a calculator. You can make an error if you're using a calculator, but you're still going to do better if you've got the calculator than if you've not.
[00:28:12] Dr Howard Friedman: Absolutely. And at the same time, your brain should be thinking: is this order of magnitude right? So that's the human check. Does this make sense? Because if it doesn't make sense, maybe I need to go back and look for a mistake, perhaps in the inputs or some other assumption I made. When I look at the outputs, not all of it's relevant.
[00:28:30] Why is it not relevant? Do I need to change my prompt or something else? And the tuning is fascinating as well, because what we like to do, of course, is multi-shot tuning: look at the results, and then we have a number of humans look and say, does this make sense? I had a wonderful experience recently where the original programmer
[00:28:49] was frustrated. He said, you know, it did really well on these four, but on this fifth one, it just totally missed. And when I looked at it, I said, I think it hit that last one out of the park, and let me tell you why. And he paused. He goes, you know something, you're right. It actually assigned it more accurately than I would have done.
[00:29:07] And we talked. You know, I said, that's my opinion. Let's talk to two other people and see if they also agree with you or me. And basically, the original programmer didn't realize the program was doing much better than he'd even thought. Where he thought it had totally missed, it had actually produced a fantastic response.
[00:29:23] Dr Genevieve Hayes: That's interesting. From your experience working as a data scientist, how much do you actually use the more cutting edge models and techniques in your own work, compared to the simpler and more established techniques?
[00:29:35] Dr Howard Friedman: It's a great question. It is 100 percent industry dependent. And in my own experience, the finance industry tends to be much more at the cutting edge than many other industries. To give you an example, I had worked for Capital One back in the 90s. Back in the 90s, we had neural network models. We had random forest models.
[00:29:56] We were doing boosted machine learning back then. Now, people say, wow, that's incredible. The answer is no. Finance always has the money and the edge in terms of trying to have models outperform. So I think that the most advanced models that I built typically are in that industry. I've done a lot of work in the healthcare space, pharma, for example. Now, pharma has a lot of traditional statistical approaches, and they are opening the door more and more to machine learning approaches. But every solution that I build,
[00:30:28] In particular, the pharma space that has a machine learning solution. I also provide a traditional statistical solution for so that they can contrast. Because people have to feel comfortable. Back to that customer. The customer in that case is someone who is typically a scientist. Most people who are working at a pharma company, or many of them who are my counterparts, are PhDs or PharmDs, but they have a classical statistical training usually.
[00:30:55] Some now have more of a machine learning and AI background. And so to reach them in their language, I provide traditional statistical solutions and compare that performance to machine learning solutions. I'll also show them, for example, let's say I was modeling a binary outcome, predicting the probability of mortality: logistic regression might be the traditional solution.
[00:31:20] So they would have the features that are most predictive, and they would have odds ratios. That's their comfort zone. And then I would show the machine learning model's performance, sometimes much better, sometimes not, to be quite honest, along with, for example, the partial dependence plots and the most important features.
[00:31:40] And they would often see a similarity. Sometimes the machine learning model does better and they get more information about the relationships. And sometimes the performance is similar, and their take is that they don't want a lot of extra complexity when a model isn't really substantially better.
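A side-by-side presentation like the one Howard describes, a logistic regression reported with odds ratios next to a machine learning model reported with feature importances, might be sketched roughly like this. The data is synthetic and the specific model choices are illustrative assumptions, not taken from the book:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a binary clinical outcome (e.g. mortality).
X, y = make_classification(n_samples=2000, n_features=8, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Traditional solution: logistic regression, reported as odds ratios,
# the comfort zone for classically trained statisticians.
lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
odds_ratios = np.exp(lr.coef_[0])          # one odds ratio per feature
auc_lr = roc_auc_score(y_te, lr.predict_proba(X_te)[:, 1])

# Machine learning solution: gradient boosting, reported with feature
# importances (partial dependence plots would be the natural next step).
gb = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
importances = gb.feature_importances_
auc_gb = roc_auc_score(y_te, gb.predict_proba(X_te)[:, 1])

print(f"Logistic regression AUC: {auc_lr:.3f}")
print(f"Gradient boosting AUC:   {auc_gb:.3f}")
```

Showing both models' discrimination on the same held-out data, plus each model's native interpretability output, is what lets the customer judge whether the extra complexity is worth it.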
[00:31:57] And that's probably the right decision by the customer. Why add a whole lot of extra effort and information if you're not getting much back, and you then have to continuously educate that customer? So that's one industry. When I'm working in retail, it depends. I've done a lot of work with restaurants.
[00:32:14] I've done a lot of work also with telecom, and there, depending on where I'm working, they might be much closer to finance than to pharma. Now, health insurance has always been an information game, and so there are a lot of really cutting-edge models that I've seen in that space. So again, there's definitely a gradation, and I'd say finance is really where they've got so much money invested, and that's where the battle goes on for so much of the talent and the cutting-edge models.
[00:32:46] And then some of the other industries lag behind.
[00:32:49] Dr Genevieve Hayes: That makes sense. I mean, finance with the quantitative analysts was really the birthplace of data science.
[00:32:56] Dr Howard Friedman: Well, they certainly pushed it, and the difference maker for profitability for many companies, especially when you talk about program trading models and the hedge funds, was all about the game of: can you find, within this mud of data, those gems that are going to make a profit?
[00:33:13] And that is really the rawest form of data science.
[00:33:17] Dr Genevieve Hayes: Earlier in the episode, you were mentioning a person who'd read your book, who was reading it as a data scientist to understand their customers better. If you were rewriting this book or writing another edition of this book with data scientists as your target audience, how would you do it differently?
[00:33:38] What sort of things would you include in that book?
[00:33:41] Dr Howard Friedman: So let's say that this book sells, what's that, five, ten million copies, and then I get the request to make it into a movie, which actor or actress would be playing the roles, we'll deal with that one later, I've got some people in mind. Let's say there is a request to make a sequel. I would do it in a similar fashion: I would still have fictitious characters, but they would be data scientists. One would be a junior data scientist who is learning how to work with different people in the corporate world.
[00:34:15] And one would perhaps be the chief data officer of a company, and she is navigating her way between the C-suite and the team below: knowing how to communicate at that very, very senior level, giving the executives the short, digestible points that they will hold on to, and then providing direction to the team at a detailed enough level that they know what to do, without being totally directive.
[00:34:38] That's what the sequel would be. I wouldn't change this book, because I think it was written exactly for what I wanted, but I would love to have those five, ten million copies and write the sequel for the data science community.
[00:34:50] Dr Genevieve Hayes: And what sort of advice would you give to the junior data scientist and senior data scientist characters in this book?
[00:34:57] Dr Howard Friedman: This is my sequel book.
[00:34:59] Dr Genevieve Hayes: Yeah. The sequel book.
[00:35:00] Dr Howard Friedman: One piece is: don't start with the solution. Listen to the customer, because the customer usually knows what they need. That's back to that nice excerpt you read, with Steve. If he's buying a cabinet, he knows what the cabinet is for. He knows roughly where it's going to be in his apartment, but he's not getting into the details of what the wood is, or the cut, or how many hinges there need to be.
[00:35:25] And that's really where I think it's important, but it's also about figuring out what information you need to communicate and what information you don't. I've sat in horrible meetings where the CEO of a company is on a call with a data scientist, and they are going on and on about the criteria used to reduce their features and what transformations were explored, while the CEO is falling asleep, because it's 100 percent irrelevant to his purpose in life.
[00:36:02] He wants to know: did the model work at a level that was sufficient for his business purpose? Is it going to be implementable? What's it going to cost to maintain? Now let's move on. Everything else, I've got a team for that. I've got people underneath me, and people underneath them, whose purpose that is.
[00:36:19] So that kind of complete mistargeting of information is what to avoid. That's the advice for the junior person. For the senior person, I always gravitate towards that senior person being a woman, and I can tell you why: one of my main mentors was this wonderful leader. Her name is Crystal, and I'll say her name.
[00:36:37] She was a great leader at Capital One, and she was a brilliant scientist, a PhD, I think in aerospace engineering, but she knew how to talk to the C-suite and to the technical people. She was so good at giving the C-suite bullet-point answers to their questions, making sure they knew where we were going, and she would let them know if we were in trouble.
[00:37:00] She'd give them those signs if there was a problem. She would build that confidence, because she really knew her stuff technically, but also because she kept it focused on what was relevant for them. And then she would come back to the team and say, here are the expectations, this is what we need, you go off.
[00:37:16] If you have a problem, you come to me. You want advice, come to me. I'm happy to chat about it, but I don't think you need it. And she would empower you, and you'd go off and do it, and you felt so good. And what also made her great was, when you were ready, she would try to bring you into that very senior meeting so that you had the chance to present three, four levels above you.
[00:37:37] So not hiding the junior people all the way down, but giving those young people a chance. And to me, that's what a great senior data scientist or chief data officer is about: someone who does the communication at the high level, empowers the team below, and makes sure that they know the door is open.
[00:37:54] If they need advice, give them enough rope, but make sure they don't hang themselves. And then keep that kind of conversation open, and when it's ready, let them be the ones to show what they've done. Don't take the stage for yourself. It takes humility to do that, and that's something I love to see in leaders.
[00:38:11] Dr Genevieve Hayes: So to change the topic a bit. In addition to co-authoring Winning with Data Science, you're also the author of Ultimate Price: The Value We Place on Human Life, about how economists and data scientists place a price on human life. I started my career as an actuary, so this is a topic I find particularly interesting, and it's also something I've previously observed in the insurance industry.
[00:38:38] For example, in most Australian workers compensation insurance schemes, if a person dies in a workplace accident and they don't have a dependent partner or children, the payout will just be basic burial costs, whereas a person who dies with dependants will receive hundreds of thousands of dollars in compensation.
[00:39:01] So, that effectively means the life of someone without a family is considered less valuable in dollar terms than the life of someone with one. And even though I don't like it, I understand where that's coming from. Outside of the insurance sector though, what are some other examples of situations where a price tag is effectively being placed on human life?
[00:39:25] Dr Howard Friedman: So interestingly, it's almost pervasive, and you're right, the insurance one is the one that's most intuitive to people, because it's the rare one where you get to decide how much to take out in your own insurance policy. So you get to put that price tag on yourself. Of course, affordability is a consideration, but that one is very special.
[00:39:47] If you look across, first, there's the public sector itself. Regulators actually use a price tag when they're trying to assess whether a new regulation is cost-effective or not, right? Because often there's a burden it might place on industries, and there's a cost involved to the industry, and the question is, is there a benefit related to the reduced mortality and morbidity?
[00:40:10] So there's often a price there, and there are equations used for that, and I question a lot of that in the book. But that's regulators. In the private sector, though, you'll see this in many areas. We talked earlier about health insurance, and in health insurance, they have to make a decision.
[00:40:24] Should a particular policy or procedure or drug be funded? Because that's going to cost money. Will it save enough later? That's a business decision, but it's a data-driven business decision, and I've been on the end of that. There are budgetary impact models that are created that really look at the question of, is it cost-effective?
[00:40:45] And if it is, how many more months of life do you have to add? What is the incremental amount of months or years of life necessary to defend the incremental cost? You have the incremental cost-effectiveness ratios, ICERs. So that's done very commonly in the private sector and public sector.
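The ICER that Howard mentions is simply the incremental cost divided by the incremental health effect. A minimal sketch, with purely hypothetical figures:

```python
# Minimal sketch of an incremental cost-effectiveness ratio (ICER).
# All figures below are hypothetical, for illustration only.
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost per unit of incremental health effect,
    e.g. dollars per quality-adjusted life year (QALY) gained."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: a new drug costs $60,000 vs $20,000 for the comparator,
# and yields 2.0 QALYs vs 1.5, i.e. 0.5 extra QALYs.
ratio = icer(60_000, 20_000, 2.0, 1.5)
print(f"ICER: ${ratio:,.0f} per QALY gained")  # ICER: $80,000 per QALY gained
```

A payer then compares that ratio against a willingness-to-pay threshold to decide whether the incremental benefit justifies the incremental cost.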
[00:41:08] So that's one. The private sector also does a calculation very, very often where they have to balance safety costs against reduced mortality and morbidity. What I mean by that is, it costs more money to put in more safety devices, and it might lower sales because the price of your product goes up. So there are implicit prices in there.
[00:41:30] But someone also has to figure out, well, what is the expected increase in morbidity and mortality if I don't put those safety devices in? And what is the cost per unit of morbidity and mortality? There are a lot of calculations that go on there. It could be financial analysts, and there could be a lot of digging into data with data scientists to help do that calculation as well.
[00:41:50] So that comes up all the time, and in a litigious society like the United States, this is a major, major consideration. In that book, I talk a lot about the notorious Ford Pinto case in the United States. It's a great one for people to Google if you haven't seen it, but it's a famous case where Ford did some calculations
[00:42:09] that turned out to vastly underestimate what the courts were going to value a human life at, and that really changed the landscape. But again, you walk across industries and you'll see that it's really quite pervasive. And a challenge is finding the data. I walked into that book hoping to find a nice, comfortable, single answer that we could all rest on and use.
[00:42:35] But I concluded that there is no single price on life. There are different perspectives, and the perspective of a regulator like the government is very, very different from the perspective of, let's say, a victims compensation fund, where, similar to what you were talking about, Genevieve, they do consider things like the number of dependants.
[00:42:54] They don't have to, but they can. They do consider things like, what was the income the person was earning at the time of their untimely death? Those are considerations, and I did focus in that book on the September 11th Victim Compensation Fund, administered by Kenneth Feinberg, who really in the United States is one of the leaders, I shouldn't say one of, he is the leader, for compensation fund assessments.
[00:43:17] But the fascinating part about that story is that I make an argument that they should have used equal values, that they shouldn't have had these huge differentiations, because it's always going to be political, and I can always make good arguments, but fundamentally it didn't feel like a fair answer. And a few years after Kenneth Feinberg was done with his work on the September 11th Victim Compensation Fund, that was his statement too.
[00:43:39] He said if he could have done it over again, he would have valued all lives equally. Now, he was under constraints; he couldn't have. Congress had rules in place that he had to operate under, so he was a little constrained. He did whatever he could to be as fair as possible. But he realized, and this is literally what he has done his entire life, that valuing lives equally would have been better.
[00:43:58] Pushing forward in time, after the Boston Marathon bombing there was another victims compensation fund. He again was the administrator of the fund, and there were no constraints: he, as the administrator, could do what he thought was the right thing to do. And he did value all lives equally, regardless of income, regardless of age.
[00:44:18] And he said that the distinctions made for valuing one life more than others were arbitrary and not defensible. So he eventually did exactly what he had stated was the right thing to do. Anyway, that's a long, long answer to one of my favorite topics.
[00:44:34] Dr Genevieve Hayes: How were the price tags determined in the September 11th Victim Compensation Fund?
[00:44:40] Dr Howard Friedman: So, in that fund, as I mentioned, he had some constraints. He wasn't allowed to simply pick numbers; the instructions said he had to consider economic benefits. So, income mattered. Now, income in the United States has a tremendous range to it, and so income was a factor: those who earned more would get a higher payout.
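Income-based awards like these are typically grounded in the discounted present value of expected future earnings. As a rough sketch, where the income, horizon, and discount rate are all hypothetical rather than the fund's actual parameters:

```python
# Toy sketch: present value of expected future earnings, a common
# ingredient in income-based compensation formulas. The income,
# horizon, and discount rate here are all hypothetical.
def pv_lifetime_earnings(annual_income, years_remaining, discount_rate=0.03):
    """Present value of a constant annual income stream, discounted yearly."""
    return sum(annual_income / (1 + discount_rate) ** t
               for t in range(1, years_remaining + 1))

# E.g. someone earning $80,000 with 25 working years remaining:
value = pv_lifetime_earnings(80_000, 25)
print(f"${value:,.0f}")  # roughly $1.39 million
```

Real compensation formulas layer on growth assumptions, taxes, consumption offsets, and non-economic components, but this discounting step is why higher earners receive dramatically larger awards.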
[00:44:59] The range for those who died was from $250,000 up to $7,000,000, right? So that's a very large range. But those who died who earned the most didn't accept his offer; they went to court, and those are undisclosed amounts. Presumably they got even more in court, because the lifetime earnings of those people would have vastly exceeded even that $7 million. But the highest payout was not for a death;
[00:45:24] it was for an injury: someone who was unfortunately permanently injured and had major, major medical expenses for the rest of their life, as well as lost income. But fundamentally, what that did is that the life of everyone who died was valued less than the life of someone who survived.
[00:45:42] And that also raised a lot of flags. Dependants: if you had a dependant, there was an incremental amount you were provided. There were other factors at play, too. If you had a life insurance policy, that subtracted from the payout, so if you had gone out of your way to plan for your potential untimely demise, then your family got less money.
[00:46:02] A lot of people felt that was unfair, so there were a lot of concerns about it. Those who were minors were all valued the same. Suddenly this whole idea of looking at income and other factors was cleaned out, because they said, well, they're not generally expected to be working. So they valued all their lives the same, on average lower than the average payout.
[00:46:24] So children effectively became valued lower. And then, in what is perhaps a reflection of how the real world works, those who did not immediately accept the offer that came from the Victim Compensation Fund, but went and personally challenged it, got higher payouts. So if you said, no, this number really doesn't work for me, here are five reasons why, on average you got more money.
[00:46:48] And that also spoke to unfairness, because not everyone would have known that the conversation was still open. That's again back to, perhaps, the business savvy of the families of the victims, things that you don't think should necessarily have been in the conversation at all, but came up anyway.
[00:47:06] Dr Genevieve Hayes: Sort of like negotiating salaries. If you accept the salary, you're going to get a lower salary than the person who doesn't accept it.
[00:47:13] Dr Howard Friedman: Absolutely. And in that realm, if you lack the contacts to get the information to know what you should be offered, you could get a lower salary. Now, I always found that very frustrating, not because I had a lack of contacts, but because, as a manager, the decision on the hiring salary was, you know, I had a range, and then human resources in some cases unfortunately felt like it was a real win if they could pay someone lower than the mark.
[00:47:41] And they felt like they were doing something good for the company, but eventually that employee would learn that they took a much lower number than they should have. And they're angry. They're angry with the company, and they're angry with, perhaps, me as the boss. But they're also going to spread a lot of concern.
[00:48:00] Not only do they have a high likelihood of attriting, but it's also damaging to team morale. So I really don't like it when human resources chooses to do that; they think they're making a win, and they're taking a big loss. I can tell you a quick anecdote. I'm going to leave out the name of the company here.
[00:48:18] I had this situation. I was leading a large team, and we hired someone who did fantastic work. She had an MBA from one of the top five MBA schools in the United States, and she had great experience beforehand. We brought her in, but they brought her in a solid one, if not two, levels below where she should have been brought in.
[00:48:37] It didn't take her more than a week inside the company to look around and say, wait a second, why am I at this level when people straight out of undergraduate who got hired last year were hired at a higher level? And I've got five years' experience plus an MBA from one of the top five programs in the United States.
[00:48:52] So she quickly realized they had lowballed her, when she hadn't realized she was being lowballed at the time. She's unhappy. She comes to me, and I know she's not going to stay with this company long. And it's a source of frustration: again, she came in very talented, and I knew she would do a great job, but this was also going to create dissension amongst my team.
[00:49:09] And I had a fairly large team. I went to human resources. They said, well, she took the offer, so that's where it is, and she's up for promotion in the first year, and if she deserves the promotion, we'll set her right. And I said, she's not going to be around in a year, and I'm going to have a very unhappy team for a year, because you clearly hired her at the wrong level, and you knew this wasn't the right level.
[00:49:31] And they said, well, she accepted the offer. And I said, no, that doesn't fly. So I jumped all the way up to the chief credit officer, the number three person of the company, who was a great person. This was a 20,000-person company where the number three person had an open door: you have a problem, you come to me.
[00:49:47] So I went straight to him. I said, I have a problem. We talked through it, and he said, oh, no, they can't do that; it's going to destroy your team. I said, I know that. He asked, what was their response? And I told him the response: they said, well, she signed the offer, talk to us in a year.
[00:50:01] He goes, oh, that's how they answer you, but that's not how they answer me. And he picked up the phone, dialed the point of contact, introduced himself, and said, I'm sitting here with Howard. Ten seconds later, he said, thank you so much for solving that problem. I know you're going to send an email right now with a timeline of when that gets corrected.
[00:50:19] I expect it'll be within the next couple of days. Thank you. And I looked at him and said, thank you very much. It took a great senior leader to realize that the lowball was really not going to help, and to be willing to challenge HR on it.
[00:50:32] Dr Genevieve Hayes: With these calculations, I mean, going back to putting prices on human lives, every human's got their biases, and it's very hard for humans to set aside their biases, because they often don't even realize they've got them. How do you avoid introducing such biases into these calculations?
[00:50:51] Dr Howard Friedman: So you're right, there are absolutely biases that exist. My suggestion in the book was very straightforward: value all lives equally. Now, a lot of people object to that. They say, well, wait a second, I spent all of these years getting an education and working so hard to do all these things, and you're going to value me the same as someone who didn't put all that in?
[00:51:14] And in my book, I openly say this values the Nobel Prize winner at the same level as the murderer. And that's unfortunate, but I will take that unfortunate situation of equality, because it also takes away all the inequalities that would be there otherwise. Now, in the book, I talk about how income is a driver of so many of these calculations, but income is where so much of that bias comes in.
[00:51:43] And I think that's what you were referring to, Genevieve. Income has so many implicit and explicit biases. There are gender inequalities in income, a well-established fact. There are racial inequalities in income, certainly in the United States, a well-established fact. Additionally, ageism plays a role in many different areas.
[00:52:03] Certainly in the United States, after a certain point, your opportunity for obtaining new employment actually goes down dramatically. You reach a point where companies won't hire you: it's an increasing curve and then a decline. So these factors are already baked into salary. If you use salary as a basis for the value of a life, then by definition it will be biased.
[00:52:24] So my argument is: default to equal values of life. If you need to tweak around the edges, okay, I accept that. But when you do that, you take away all these other biases that would exist and are already in the calculations anyway.
[00:52:40] Dr Genevieve Hayes: On that note, what final advice would you give to data scientists looking to create business value from data?
[00:52:47] Dr Howard Friedman: So for data scientists, I really encourage you to, of course, be curious and want to learn, but not just the technical aspects. Learn what's important to your customer. What do they care about? What is the problem they're trying to solve? What's important to them in terms of the cost of the solution, the time it takes, and who's going to use it? Listen to them, be a good listener, and then encourage them to ask good questions.
[00:53:18] As they ask good questions, you may not know all the answers this time, but next time you'll be far more prepared, and you'll be ready to answer them. So if you're a data scientist reading this book, think about those questions in advance, because that's what your customers should be asking.
[00:53:33] Dr Genevieve Hayes: For listeners who want to learn more about you or get in contact, what can they do?
[00:53:38] Dr Howard Friedman: I love hearing from people. They can reach out to me directly. Of course, I'm active on LinkedIn, and I actually have a few LinkedIn Learning courses, so they're very welcome to join me there. I've got one LinkedIn Learning course, The Data Science Playbook for Private Equity and Venture Capital.
[00:53:53] I have a new one coming out in a few months that's really tied to this book, about how to be a good data science customer. So please contact me on LinkedIn. I know there are a lot of Howard Friedmans out there, but there's only one Howard Friedman teaching at Columbia University, so you'll find me that way.
[00:54:09] Additionally, I have a website, howardfriedman.com, where you can see some excerpts from the book, some fun media, and I think a couple of my original paintings. There's also a contact page, so not just data scientists, but businesses too: if you're interested in having a consultation, reach out via that website.
[00:54:32] So, howardfriedman.com. Use the contact page, and I'm very happy to have a conversation.
[00:54:39] Dr Genevieve Hayes: Thank you. And I'll put links to that in the show notes.
[00:54:49] Dr Howard Friedman: Much appreciated. And Genevieve, I really want to thank you for taking the time to have this discussion with me. It's really been a pleasure.
[00:54:49] Dr Genevieve Hayes: Thank you for joining me. And for those in the audience, thank you for listening. I'm Dr. Genevieve Hayes, and this has been Value Driven Data Science, brought to you by Genevieve Hayes Consulting.
