Episode 81: [Value Boost] How to Frame Data Problems Like a Decision Scientist


[00:00:00] Dr Genevieve Hayes: Hello and welcome to Your Value Boost from Value Driven Data Science, the podcast that helps data scientists transform their technical expertise into tangible business value, career autonomy, and financial reward. I'm Dr. Genevieve Hayes, and I'm here again with Professor Jeff Camm, a decision scientist and the Inmar Presidential Chair in Analytics at the Wake Forest University School of Business, to turbocharge your data science career in less time than it takes to run a simple query.
[00:00:34] In today's episode, we'll explore the specific problem framing techniques decision scientists use to avoid project failure and ensure their work solves the business problems that matter. Welcome back, Jeff.
[00:00:49] Prof Jeff Camm: Thank you, Genevieve. Great to be with you again.
[00:00:52] Dr Genevieve Hayes: In our previous episode together, we talked about the notoriously high failure rate of data science projects, which is often quoted as being around 87%, and how the decision science skillset could be combined with data science techniques to dramatically improve this. One of the key areas in which decision scientists excel is in framing business problems right from the start.
[00:01:15] Which is something that's often glossed over in data science training programs, to the detriment of data science project success. So learning the approach decision scientists take to ensure they're solving the right problem in the right way is likely to be a very high ROI skill for many data scientists.
[00:01:33] Jeff, as someone who's practiced and taught decision science, what is the practical framework you use when framing business problems that you intend to later go on and solve?
[00:01:46] Prof Jeff Camm: Sure. So I often say that a good decision scientist is like a good medical doctor. And the issue is, unless you're reporting to a decision scientist, you're probably reporting to a manager. That manager typically is gonna describe symptoms. So, for example: we have way too much inventory sitting in the warehouse.
[00:02:09] Well, that's a symptom, and it's caused by something else. Maybe the issue is our demand forecasting was bad. Or the issue is that our forecasting is pretty good, but we have limited capacity of some sort, and we make long runs of certain products at a fixed point in time, and that results in a lot of this stuff sitting on the shelf in the warehouse, because it takes time to get rid of that inventory. So try to put yourself in the position of the sponsor, whoever's talking to you, who wants your analysis: what are they really trying to decide? What's the problem that they're trying to solve?
[00:02:49] What's giving them heartburn? So I would tend to ask clients: what is disturbing you? Essentially, give me the list of symptoms. And then try to speak to them and get them to realize those are symptoms. Then, what do they control that might alleviate those symptoms?
[00:03:05] Oh, you get to decide how much we produce? Well, okay, your real problem is maybe you're not producing the right quantities at the right time, and that's why all this inventory is here. So let's model all of that. We can take into account the uncertainties of the forecast, we can take into account the capacity of production, and we can take into account the profit margins.
[00:03:28] Maybe those things that are sitting in the warehouse are the low-profit-margin things, and it's okay for them to be sitting there. So go through that process of essentially putting yourself in the position of the manager.
[00:03:39] They're coming to you for analysis because there's an issue they have. There's something giving them heartburn, and they have to make a hard decision. Put yourself in their shoes and ask what they're going through, and ultimately, I think that'll lead you to how whatever analysis you're doing is gonna be used in the decision process.
[00:03:57] That will give you a lot of benefit. You're gonna understand the metrics they care about. You're gonna understand the constraints, their limitations, and I think it also will enable you to not just provide a forecast, not just provide an answer, but provide some alternatives for them from which to choose to make that decision.
[00:04:18] So in a nutshell, think beyond the data. Think to the business process that you're going to impact and the decisions that are going to be made. And keep asking questions until you fully understand how what you're doing is gonna be used to change or improve somebody's decision. And so that's sort of the process I follow. It can be laborious, but it pays off.
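The production-planning framing Jeff describes, decide quantities, respect capacity, account for margins and forecast demand, maps naturally onto a small linear program. The sketch below is a minimal, hypothetical illustration (all margins, hours, and capacity figures are invented, not from the episode), using SciPy's `linprog`:

```python
# A minimal sketch of the production-planning framing: the decision
# variables are the production quantities, forecast demand and machine
# capacity are constraints, and profit margin drives the objective.
# All numbers here are hypothetical.
from scipy.optimize import linprog

margins = [40.0, 25.0]            # profit per unit for products A and B
hours_per_unit = [2.0, 1.0]       # machine hours needed per unit
capacity_hours = 1000.0           # total machine hours available
forecast_demand = [300.0, 500.0]  # don't produce beyond forecast demand

# linprog minimises, so negate the margins to maximise profit.
res = linprog(
    c=[-m for m in margins],
    A_ub=[hours_per_unit],        # 2*xA + 1*xB <= 1000
    b_ub=[capacity_hours],
    bounds=list(zip([0.0, 0.0], forecast_demand)),
)

print(res.x)     # optimal production quantities
print(-res.fun)  # maximum profit
```

The decision variables are exactly what the manager said they control: how much of each product to produce.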
[00:04:40] Dr Genevieve Hayes: The question that really stood out to me when you were giving all those examples was: what do you control that could alleviate those symptoms? Because that really gets to the heart of the decision that they're going to make.
[00:04:53] Prof Jeff Camm: Exactly. And unless they have control over something, they're stuck. And ultimately, as I think like a decision scientist, when I ask, what can you control? That's actually the decision. Whatever they say they control, that's the decision process right there. You know, how much are we going to produce?
[00:05:10] I can control that. I can't control the demand, but I need to forecast the demand to help me decide how much I'm going to produce.
[00:05:18] Dr Genevieve Hayes: What happens if you have a situation where they tell you the symptoms, and then they tell you what they can control, and there isn't a way of alleviating the symptoms from what they can control?
[00:05:29] Prof Jeff Camm: Yeah, well, in my experience, usually when that happens, we haven't really gotten to the real decision. Or the other thing that can happen, and this happens sometimes: the system is so constrained that in fact they have no control. But you find that out in the process, right?
[00:05:49] Maybe you do some optimization and you find out that you should produce so much of this product, so much of that product, et cetera. But then you find out: oh, really, we don't have the capacity to do that. So maybe we left something out of the model.
[00:06:06] And in fact, if you did put that in the model, you'd realize you don't really have any decision to make, that it's all locked down. And another thing, going back to the optimization mindset: sometimes you can build a model and you find out you cannot satisfy the constraints of the model. It's infeasible.
[00:06:24] That's actually really good information for a manager. So you might have the correct model, all the right inputs, maybe forecasts of the demand, which are uncertain, but you run this model (by that we mean we solve for the optimal decision or set of decisions), and the computer tells us it's infeasible.
[00:06:43] You cannot meet those constraints. And now what you can do is a theory-of-constraints kind of thing: well, something has to be relaxed, and you can use those models to determine what needs to be relaxed and try to figure out a strategy for how to make the problem feasible. And then they do have things they can control.
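Jeff's point about infeasibility being useful information can be made concrete. In the hypothetical sketch below (all numbers invented), a management-imposed minimum on total output exceeds what capacity allows, the solver reports infeasibility, and relaxing the requirement restores a solvable decision problem:

```python
# A sketch of diagnosing infeasibility: require more total output than
# capacity allows, see the solver report infeasible, then relax the
# binding requirement until a feasible plan exists. Numbers hypothetical.
from scipy.optimize import linprog

def solve(min_total_output):
    # Maximise profit 40*xA + 25*xB subject to machine capacity and a
    # management-imposed minimum on total output.
    return linprog(
        c=[-40.0, -25.0],
        A_ub=[[2.0, 1.0],       # machine hours: 2*xA + xB <= 1000
              [-1.0, -1.0]],    # total output:  xA + xB >= min_total_output
        b_ub=[1000.0, -min_total_output],
        bounds=[(0.0, 300.0), (0.0, 500.0)],
    )

print(solve(900.0).status)  # status 2 means infeasible: capacity can't reach 900 units
print(solve(700.0).status)  # status 0 means optimal: the relaxed target is achievable
```

The infeasible run is not a failure; it tells the manager exactly which requirement the current capacity cannot support.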
[00:07:01] Dr Genevieve Hayes: So you're still providing a solution, but it's a solution to the right problem rather than the wrong one.
[00:07:07] Prof Jeff Camm: Exactly. That's something I try to teach my students when I teach optimization: you come up with a model, and you have to say it's infeasible. Then the challenge just becomes: which constraints are so binding, so tight, that if you can get more resources here, or relax some of these constraints, then you have a decision problem you can solve?
[00:07:27] Dr Genevieve Hayes: So it's not just a matter of saying, I can't do this, throwing your hands up in the air and walking off and having lunch. It's a matter of saying, I can't do this, but how can I help you to find a situation where this can be done?
[00:07:41] Prof Jeff Camm: Right. Let's suppose the manager above your manager has specified: we will achieve this market share. And that's a constraint in the model. Well, you run the model as defined and you find out it's infeasible. So slowly walk down and relax the market share requirement. Now you have to go back to your manager and say: with our capacity, your market share requirement is not realistic, but here's what we can achieve, based on our model.
[00:08:11] You said we want to achieve 25% market share. The model says that's not feasible with our resources, but it looks like we can achieve 22%. And the manager might say: no, I want 25%. Tell me how much more production capacity we need to get to 25%. You can tweak your model to do that, which is, I think, a really important point: an optimization model, especially, is not just a model to solve for the answer.
[00:08:38] It's like a laboratory, especially when it's infeasible, because you can tweak it in all kinds of different directions to get it to be feasible. And the question is, what really maps onto something that is a good strategy, for either bringing in more resources or cutting your expectation on market share, to use the example we were talking about.
[00:08:58] So when what first appears like a real problem comes up and your model appears to be broken, tweak the data, experiment with the model itself in terms of its parameters, and see what kind of information you can glean from it. There was Art Geoffrion, who was a professor at UCLA, who's retired now, I believe.
[00:09:16] I've probably misquoted it a little bit, but he had a paper where he said that the purpose of, basically, math programming or optimization is insight, not numbers. So his point was: use these models to understand your system and the decision you're gonna make. Tweak them, tweak the data, so that you can learn how all these things work in tandem, constraints and objective, and what the trade-offs are.
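The "insight, not numbers" idea, using the model as a laboratory, can be sketched as a parameter sweep. The example below is hypothetical (the market size, capacity, and product data are invented, chosen so the achievable share happens to echo the 22% in Jeff's story): it walks the share target down until the plan is feasible, then walks capacity up until the original 25% target becomes feasible:

```python
# A sketch of using the model as a laboratory: sweep the market-share
# target and the production capacity, and report what's feasible,
# rather than solving once for a single answer. All figures hypothetical.
from scipy.optimize import linprog

MARKET_SIZE = 4000.0  # total units the market buys; share = output / 4000

def feasible(share_target, capacity_hours):
    """Can we reach this market share with this much capacity?"""
    res = linprog(
        c=[0.0, 0.0],           # feasibility check only, no objective
        A_ub=[[2.0, 1.0],       # machine hours used <= capacity
              [-1.0, -1.0]],    # total output >= share_target * market size
        b_ub=[capacity_hours, -share_target * MARKET_SIZE],
        bounds=[(0.0, 600.0), (0.0, 800.0)],
    )
    return res.status == 0

# Walk the share target down until the plan becomes feasible...
share = 0.25
while not feasible(share, 1000.0):
    share -= 0.01
print(f"achievable share at current capacity: {share:.0%}")

# ...or walk capacity up until the original 25% target is feasible.
capacity = 1000.0
while not feasible(0.25, capacity):
    capacity += 100.0
print(f"hours needed for a 25% share: {capacity:.0f}")
```

Each run of the sweep answers a managerial what-if question, which is where the insight comes from, not from any single optimal solution.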
[00:09:41] Dr Genevieve Hayes: So use the model as a tool, not as an infallible oracle.
[00:09:46] Prof Jeff Camm: Yeah. It's not the holy grail. It's a really, really good recommendation engine. That's what I would say is true of decision science: a really good recommendation engine.
[00:09:57] Dr Genevieve Hayes: And what's one specific thing you'd recommend our listeners do when they receive their next project request to start applying this framework?
[00:10:04] Prof Jeff Camm: Think beyond the data. They're gonna come with data. Hopefully it's the data you need for whatever it is and the problem you're gonna solve. But by thinking beyond the data, again, I'm gonna say: think about what deployment means for any model that you build or any analysis that you do.
[00:10:22] And refrain from giving answers. Give analysis that captures uncertainty, and give analysis that, if plugged into the decision process later, is robust. It doesn't matter if you're a data scientist or a decision scientist, having that mindset is very valuable.
[00:10:42] Dr Genevieve Hayes: And that's a wrap for today's value boost. But if you want more insights from Jeff, you're in luck. We've got a longer episode with Jeff where we dive deeper into exactly what decision science is and how adding decision science skills to your toolkit can increase your value as a data scientist, and it's packed with no-nonsense advice for turning your data skills into serious clout, cash, and career freedom.
[00:11:09] You can find it now wherever you found this episode, or on your favorite podcast platform. Thanks for joining me again, Jeff,
[00:11:17] Prof Jeff Camm: Thanks for having me, Genevieve. I've really enjoyed it.
[00:11:20] Dr Genevieve Hayes: and for those in the audience, thanks for listening. I'm Dr. Genevieve Hayes, and this has been Value Driven Data Science.
