Episode 70: How to Interpret Data Like a Pro in the Age of AI


[00:00:00] Dr Genevieve Hayes: Hello and welcome to Value Driven Data Science, the podcast that helps data scientists transform their technical expertise into tangible business value, career autonomy, and financial reward. I'm Dr. Genevieve Hayes, and today I'm joined by Nicholas Kelly. Nick is the founder of Delivering Data Analytics, a consultancy focused on helping organizations enable their teams to make smarter, faster, and more confident decisions through data and AI.
[00:00:31] He is also the author of Delivering Data Analytics and the recently released How to Interpret Data. In this episode, you'll learn practical frameworks for transforming raw data into business impact through the art of data interpretation. So get ready to boost your impact, earn what you're worth, and rewrite your career algorithm.
[00:00:54] Nick, welcome to the show.
[00:00:56] Nicholas Kelly: Thanks for having me on Dr. Hayes. It's a pleasure and honor.
[00:01:00] Dr Genevieve Hayes: "Data is a cornerstone of our decision making, but the ability to effectively interpret data is challenging to master." So begins the blurb on the back cover of How to Interpret Data, and it's difficult to disagree. We live in a century where more data is produced every year than existed in the prior history of humanity and where data science degrees have become commonplace.
[00:01:25] Yet, despite this abundance of data and skills, even the most experienced data professionals can still find themselves drawing wrong conclusions, missing critical insights, or failing to communicate their findings in ways decision makers can act upon. Because at the end of the day, it all comes down to how well these professionals can interpret data.
[00:01:47] Nick, as the title suggests, your book tackles the critical challenge of data interpretation. So to begin with, could you help our listeners understand what makes data interpretation so difficult, even with all the technical advances we've seen in recent years?
[00:02:05] Nicholas Kelly: Yeah, Dr. Hayes, I think you nailed it there. There's the proliferation of knowledge and access to data science skills, and certainly the technical skills needed to interpret data, but I think a lot of the challenges still remain on the human side and in conveying value to people.
[00:02:24] So if I was to try and condense the biggest challenges we see with our own clients: I have a recent example where I was working on a project for a very large client, with great access to data, lots of information there, but we were really struggling to find anything actually useful to convey to them.
[00:02:45] And it's not the biggest problem, but it's a problem we come across fairly regularly: we might be looking for a problem that doesn't really exist. So that's part one, which leads into part two, which is maybe we just don't have all the data we need, and we could make better insights and better interpretations if there was more data.
[00:03:07] So let me give a quick example. I was working with a client that owns many grocery stores, and we got all their sales data: the number of units they sell, which stores sell what products, and all the good things you would expect from retail data. But we were finding there wasn't really any "there there", no "hey, you need to change your operations in X and Y way".
[00:03:29] And it's like we weren't seeing that. If anything, we were gonna go back and say keep doing business as usual, which is obviously not what they want to hear. But what we did find is there were some unusual things going on in the data, but we didn't have enough information to interpret what they actually meant.
[00:03:46] And it turned out we were missing some obvious things. And I think this is a challenge that you can come across if you don't have good subject matter experts available to you. So they would say, well, where's the promotional data? So there might be certain promotions going on, and that's not visible in the data 'cause we don't have it.
[00:04:03] So that's a secondary part of this problem in interpreting data: maybe you don't have all the data available to you. Now, the next two, I think, are the real human aspect of it, which is we might go looking for something to confirm a certain assumption that maybe the CEO has. So the CEO's tasked us with, hey, I have this hunch.
[00:04:22] I think there's X, Y, and Z things going on at the stores. And it's like, okay, well, you're the data scientist on their team or you're their consultant, you're being told to go find this thing. It's quite human to try and just look for that in the data. So you hear the term, of course, confirmation bias, but there are other types of psychological factors, like loss aversion, for example,
[00:04:43] being another one that you'll often come across with executives when you're trying to work with them and help them interpret the data. So that can be another challenge. And I would say a final one would be the data literacy aspects of working with data. One of the things you highlighted is conveying value to people; the flip side of that is we also have to convey to them the data process. And often people think, especially in the business of course, that working with data is a trivial thing: oh, you have the data, then just tell us what to do. Give us the insights. Tell us the next best action
[00:05:20] or just make some recommendations. And though you wouldn't think it's your role, often as the data expert you have to educate people on the data process and how to interpret data as well. So that was really the thinking behind the book. I would say those are the biggest ones, and if I was to rank them and group them into one category, I'd say it's mostly the human factors of interpreting data.
[00:05:45] It's less on the technical side, though the two I mentioned there are major issues. But if you don't have the human factor covered, the others are almost immaterial. If you've got the technical side covered but you can't bring people on a journey, then you're not really gonna get there a hundred percent.
[00:06:01] Dr Genevieve Hayes: One of the questions I have written in my notes is: why not just use ChatGPT? I can understand from what you're saying that this is such a big thing that you couldn't just use an AI to cover off all these human aspects, but how has the existence of generative AI changed the data interpretation challenge?
[00:06:23] Nicholas Kelly: That's really interesting. I think we're starting to see it already. I'll give you two angles on that. One is we've had a client who will trust ChatGPT more than their data analysts, 'cause it's coming from ChatGPT. And that's like, well, okay, but obviously it can make mistakes.
[00:06:41] And secondly, did you expose your data to ChatGPT? And you can argue, well, they could stand up their own enterprise version, right, and train it on their own data, which is great, and there are certainly arguments to be made for doing that. So there's a subset of the business that will
[00:06:56] be biased towards trusting the computer over the analyst. So there's that angle of it. And the other part of it is we've actually had clients, not the large clients, but maybe medium-sized clients, that have said, why would I hire a consultant if I can just use ChatGPT, and it's gonna be 50, 60, 70% as good as you?
[00:07:15] And, well, it's hard to argue with that if they're the one paying the bills and they get a ChatGPT license for like 30, 40 bucks, and at least in their perception it can do quite a lot of that. And maybe a third piece there, Dr. Hayes, is how could a data scientist potentially leverage ChatGPT to help them with the human factors. It was interesting: a year ago I was speaking at a conference, and a lady said, much to her surprise, she was working in the business and she was made the chief data and analytics officer for their organization, not really having much experience working with data. And we did it live in the session.
[00:07:58] She asked me, what's the first thing I should do? Well, you need a data strategy, right? Maybe do an eight-week sprint, come up with an overall roadmap for what you do with your data. She's like, well, I dunno how to do that. Well, let's just ask ChatGPT,
[00:08:08] just for fun, right? For the heck of it, let's do it on the spot. And it came up with something that was a lot better than what she would've come up with, so it gave her a huge confidence boost that, okay, there is at least some support there. Is it exactly what their organization needs?
[00:08:22] Probably not, but it's way better than what she would've come up with in isolation. So I think it's sort of a gray area right now; it's hard to really say whether ChatGPT can replace the human factors of working with data, to a degree. But I will say there are always gonna be people who will need to have that human interaction. Like,
[00:08:43] I remember years ago when I first started doing workshops with Deloitte Analytics and I had to do work with their priority clients, so we'd have their senior execs come into our building, and we had this really nice space: lots of technology, touch screens, everything high tech, and I love the high-tech stuff.
[00:09:01] So I was like, okay, we're gonna work off the touchscreens, they're gonna show you all these cool analytics and dashboards and all this cool stuff. And invariably, what I noticed over time was there were usually a few people disengaged in that workshop. These were the folks that did not want to interact with a touchscreen, did not wanna use technology,
[00:09:21] and then over time I started to realize, oh, I see why a lot of the really good facilitators just used sticky notes and whiteboards. There's just that part of it: if you want to include everyone and bring them on that journey, low tech actually wins. Which wasn't a message I was really open to hearing, but if I wanted to be successful, then I had to be open to that.
[00:09:42] And I think maybe there's some analogy here, where we could look at that and say, yep, ChatGPT is probably gonna do some of that work for us, and then there are still gonna be people that need that human interaction and to deal with those human factors.
[00:09:56] Dr Genevieve Hayes: In your experience, what are the most common ways you see even technically skilled data professionals misinterpreting data?
[00:10:05] Nicholas Kelly: So there's the first challenge that I mentioned, which is you're looking to deliver a solution to your boss, to your company, to your client, and maybe there's nothing, or there's something there, but you want to make it interesting: the desire to find something where there's nothing really there. So I think that's very tempting, especially if it's your job, and especially now when you're hearing, oh, AI could take my role, or AI can do it better, so I have to find something more interesting than ChatGPT would.
[00:10:36] So I'd say there's a strong temptation to do that. But also, historically, when I've worked with data scientists, well, generally to be a data scientist you're pretty smart, the problem has been less on the technical interpretation of the data; it's been on the delivery of that interpretation to the business.
[00:10:54] That is probably the consistent area where I see work needs to be done, 'cause it's just hard. It's hard to be very technically capable and also have the mindset and personality to work well with the business. 'Cause if we look at the personality archetypes, if we could generalize, with lots of outliers and lots of exceptions to this, but let's just say, at least in my experience, Dr.
[00:11:20] Hayes, a lot of the data scientists I have worked with historically tend to be more introverted, and that is a characteristic that is good for someone working with data at that level. The obvious challenge then is that the business, and very successful business people, tend to be more extroverted.
[00:11:41] Not always, but as a general rule, right, they tend to be more extroverted and want to work with more extroverted people. So when you bring these two folks together, there tends to be a little bit of an imbalance. Okay, you know, the business person wants bang, bang, bang.
[00:11:55] Like, just give it to me, right? I don't need to hear all the details and go into the weeds. But that's the stuff that's really exciting for the data scientists. They love that, and it's this amazing journey they've been on to get there. But the business person is like, tell it to me in 10 seconds.
[00:12:09] It's like, what do I need to do? And I think that's the real challenge, because some folks are gonna be able to do that and some are not. And then it's gonna be a question of, well, who can you partner with in the business? Or who can you partner with in your own team, maybe,
[00:12:23] so that you still get the credit for the work that you've done, but you're also making sure it is being successfully delivered. Because you can do brilliant work with your data and build fantastic models all day long, but if they're not delivering value over time, it's gonna wear thin.
[00:12:39] So it's in your best interest to figure that out. I would say for data scientists, at least in my experience, the biggest challenge is how you're getting it across the finish line, making sure it's impacting the business, and then measuring the impact that your work has had on the business, because most people are not gonna care.
[00:12:57] The business probably is not gonna care. Let me give an example. Some years ago we worked on a predictive dashboard for HR to estimate the cost of attrition in the organization. It had loads of features in the data, it was very rich, and one of the features that came out was the size of an employee's internal network, which seemed to have some level of
[00:13:25] weight to it. If the network size was low, they were more likely to attrite in the next six months; if it was high, it tended to suggest longer tenure in the organization. So one of the things that model was able to do was come up with a predicted cost of attrition per
[00:13:45] HR manager. So an HR manager is responsible for, let's say, 30 people in the organization, and I would say, over the next 12 months we're predicting you're gonna have a $6 million cost of attrition. Here are the top 10 reasons why, and you can look at it per employee: oh, these are the top three reasons this employee is gonna attrite.
[00:14:05] So if that HR manager is not taking action against those reasons, then come end of year, when they're being compensated, let's say bonused, on this KPI or this metric, it's: why didn't you do it? You had this list of things you could do. Now that's something that's highly measurable. So you can come out of that and say, okay, let's measure the behavior and the impact we've had on the organization by this KPI that's coming out of our model.
[00:14:31] Okay, how much did we reduce the predicted cost of attrition? That's a win, right? If you can measure it, you can show it's reducing, and you can quantify it, 'cause it's so hard to quantify a lot of these things. So as a means of building that bridge for a technical person who's gonna struggle to convey value to the business: maybe upfront, start thinking about how you're gonna measure that,
[00:14:52] how you're gonna measure the impact. 'Cause most people aren't gonna think about it; the business is gonna think about it, and if you can put it in terms they care about, they're absolutely gonna think, hey, you can save me 2 million a year? Done, right. They don't care about all the technical details.
[00:15:08] Think about savings and value delivered, and if you can go in with that mindset, you're setting yourself up for success.
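As a rough illustration of the roll-up Nick describes, here is a minimal Python sketch. The column names, probabilities, and replacement costs are hypothetical placeholders, not figures from the episode; the point is simply how per-employee attrition predictions become a per-manager cost KPI whose reduction can be tracked over time.

```python
import pandas as pd

# Hypothetical per-employee model output: attrition probability over the
# next 12 months, the cost of replacing that employee, and the top driver
# behind the prediction. All values are illustrative.
scores = pd.DataFrame({
    "employee_id":      [101, 102, 103, 104],
    "hr_manager":       ["Lee", "Lee", "Patel", "Patel"],
    "attrition_prob":   [0.62, 0.15, 0.48, 0.08],
    "replacement_cost": [120_000, 95_000, 150_000, 80_000],
    "top_reason":       ["small internal network", "compensation gap",
                         "small internal network", "long commute"],
})

# Expected cost of attrition = probability of leaving x cost of replacement.
scores["expected_cost"] = scores["attrition_prob"] * scores["replacement_cost"]

# Roll it up to the number each HR manager is accountable for.
per_manager = (scores.groupby("hr_manager")["expected_cost"]
                     .sum()
                     .sort_values(ascending=False))
print(per_manager)

# Re-running the same roll-up after interventions gives the measurable KPI:
# how much did the predicted cost of attrition come down?
```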
[00:15:14] Dr Genevieve Hayes: It's what they say about marketing. Focus on the benefits, not on the features.
[00:15:17] Nicholas Kelly: It sounds simple, right? It sounds very simple and we always lose sight of it. I think for technical people, the preference is to go deep technically because it's what they're comfortable with, what they love doing.
[00:15:29] Dr Genevieve Hayes: And it's what gets rewarded in university essays and reports.
[00:15:33] Nicholas Kelly: Yeah, I know, right? It's tough. In my first book, Delivering Data Analytics, one of the stories I told was about an experience I had with a group of data scientists who did wonderful work, but they just could not translate it to the business. And the work kind of went to waste, which was so unfortunate, 'cause they did tremendous work, but they just couldn't get it across the line,
[00:15:51] and maybe part of the challenge with this, Dr. Hayes, is, arrogance might be too strong a word, but I think when working with the business, they might feel like these people don't know much about data, so they should really just do what I tell them.
[00:16:06] Right? But it's like if someone just comes along and tells you what to do, there's friction,
[00:16:10] so I think we have to appreciate that any of this stuff, working with people, is a journey and it takes time. I remember the first time I started working with data, we built this cool dashboard and just assumed everyone would adopt it right away. It's like, no, no. It took about a year.
[00:16:28] And I had the good fortune to be working with a very impressive change management professional, and she knew: no, this is just gonna take time. Anything to do with people is just gonna take time. Sorry, but that's how it is.
[00:16:41] Dr Genevieve Hayes: So you've painted a clear picture of the challenges in data interpretation. Now I'd like to shift to solution mode. What frameworks have you found most effective for improving how professionals interpret data?
[00:16:54] Nicholas Kelly: It's a challenge to put things in simple terms to follow step by step. Let's say we put it in two ways: there are the data professionals, who regularly work with data, and then there are the business professionals who maybe want to work with data.
[00:17:09] And the first group, I would say, don't really need to focus much more on the technical side. Let's assume they've got that, but let's ensure they're answering the right question. One of the techniques I've found works well for that is the Five Whys, developed at Toyota way back in the day to understand manufacturing defects.
[00:17:32] So, what's going on on the production line that's causing issues? And to go back to that HR example, let's say a question you might think of is, okay, why is attrition so high? Why are we having problems with attrition? Why are people leaving the organization?
[00:17:46] And you could start looking in the data and try to answer that, you could, but I bet there's probably someone in there who could give you an answer in five minutes, so you don't have to spend your time spinning your wheels figuring it out. And the answer might be that employees are leaving for higher-paying jobs.
[00:18:00] Everyone knows that, so don't spin your wheels on it. Then the next question might be, well, why are competitors offering more money? And the answer might be, well, they've adjusted their compensation packages, they've seen the market trends, they know they have to pay more. Okay, so they're paying more.
[00:18:16] So then we ask, why didn't we do the same? And we can keep digging down until maybe we get to the very last why, and we find out that we're not covering compensation KPIs, our HR managers are not aware of it, our recruiters are not compensated on compensation.
[00:18:31] Right. So we finally get down to the core of it. And let's just say, in a few cases we might find it's not answered by data at all, which is okay, we've probably saved ourselves a lot of time. But more often than not, we're gonna be asking questions of the data that our people can't answer, that nobody has an answer for.
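A minimal sketch of how that Five Whys drill-down could be captured, using the attrition example Nick just walked through. The wording of the questions and answers is illustrative; the final answer in particular is an assumption standing in for "the question worth taking to the data".

```python
# A Five Whys chain recorded as (why, answer) pairs. The content mirrors the
# hypothetical attrition example; it is illustrative, not a template.
five_whys = [
    ("Why is attrition so high?",
     "Employees are leaving for higher-paying jobs."),
    ("Why are competitors offering more money?",
     "They adjusted their compensation packages to market trends."),
    ("Why didn't we do the same?",
     "We're not covering compensation KPIs."),
    ("Why isn't anyone covering them?",
     "HR managers and recruiters aren't measured or compensated on it."),
    ("Why aren't they measured on it?",
     "Unknown - this is the question worth taking to the data."),
]

for depth, (why, answer) in enumerate(five_whys, start=1):
    print(f"{depth}. {why}\n   -> {answer}")
```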
[00:18:50] And that's a good place to start. So that'd be one thing people could do right away to start getting stuck in. That's on the technical side. Then for business professionals, a good framework, and I have several in the How to Interpret Data book, but one of them is just the basic data process.
[00:19:07] Right? Just understanding: okay, ask the right question, sure. Is it aligned to business goals and business value? Check. If it's not, then don't bother, don't waste your time. You can ask the right question, but if it's not aligned to organizational goals or business strategy, then don't waste your time.
[00:19:27] And the next part of that is, okay, let's figure out what data we have and how we're gonna gather it, what data sources we need, all those good things. And then, across the entire process end to end, quite often where it stops is: okay, let's do some data storytelling,
[00:19:41] so we found our insights, we did all that, we're telling our story, which is super important. But the final step I throw in there is: how do you influence decision making? You can tell the story, and that's great, you have to do that, but how are you making sure it's influencing decisions?
[00:20:01] Because if you are asking the right question, and it is aligned to business strategy, and you gathered all the data and found something interesting, and you're telling the story, then there's something there. If it's not influencing decisions, why is it not influencing decisions? More often than not, and this is a real challenge for data people, it's an organizational challenge.
[00:20:22] Are there certain people that are blocking it? It's like, okay, that contradicts what so-and-so said. And again, we get down to these human factors, these challenges. So for the non-technical folks, I think it's good for them to understand the entire process, and then for everyone, it's good to make sure you're asking the right questions.
[00:20:41] And then the tail end of it is: are we actually influencing decisions? That last mile, that last kilometer, are we actually making sure it's having an impact on the business? We're not done when we tell the story. We do our presentation, or we tell people, or we share an email, and it's not done.
[00:20:58] Actually, I would argue you're only 50% there, because the really hard part is getting people to act on it and change their behavior. That's the really hard part, and people have made careers out of change management. So that for sure is the hardest thing to do: working with data versus working with people.
[00:21:17] There's a reason why lots of people focus on the working-with-data piece, 'cause working with the people can be very, very hard and it can be unrewarding. But if you really wanna make an impact with your career and the work that you've done, you have to make sure it influences decision making.
[00:21:33] Dr Genevieve Hayes: You mentioned asking the right questions, but what's an example of a wrong question to ask?
[00:21:38] Nicholas Kelly: There are a few ways we could go down a rabbit hole here. We could say, okay, if you're asking the wrong question, is it aligned to the business strategy? Because if it's not, it's the wrong question, so why are we asking it? Now, you could argue that maybe it should be part of the business strategy, and there's certainly an argument to be made there.
[00:21:56] It could also be that we're answering a question, but at a surface level. So that Five Whys example, just checking in on that: maybe it's the right question, but maybe it's way too shallow and not getting deep enough. I think the temptation can be, I'm gonna ask and try to answer questions that
[00:22:14] are within my domain of expertise. So I'm gonna answer it with data; it's kind of assumed I'm gonna ask questions that are answered with data, and maybe that is not the biggest, highest-priority question to be answering. And that sucks for the person who wants to be proactive and use data to answer those questions,
[00:22:34] but we have to be open to that. Because I can wax lyrical about certain types of biases of non-technical people working with data, and I can say all those things, but what about our own bias towards only answering questions that we can answer, as if they're the highest possible priority?
[00:22:49] And I think we also have to be open to that. It is entirely possible to ask the wrong question because we want to answer questions with data; that is entirely possible. But then, of the other two, strategic alignment, I would think, is the biggest one,
[00:23:03] probably, Dr. Hayes. The biggest way you can answer a wrong question is that it's just not aligned to the business, and you could spend time finding, well, maybe we need to adjust our business strategy. Let me give an example. With another client, we help them forecast their quarterly sales, and
[00:23:20] they've always done it that way, by quarter. And we started to come back to them and say, well, you know what? It's kind of hard to forecast a quarter out in advance with the degree of accuracy you're looking for. So let's say they wanted an estimate within one to two percent, right?
[00:23:37] So, say, a rough margin of error of 2% in the forecasting, and they would then align their budgets to that. What we were finding was, actually, four weeks out we can get under 1%. After that, it starts to go up: 1, 2, 3, 4, 5%. So what we actually came back with was a recommendation to change their forecasting approach and said, don't do a quarterly forecast, do it on a rolling four-week window.
[00:24:04] So what we said is we can't stand over a 13-week forecast with the level of accuracy you're looking for. And it's funny, 'cause historically, when you look at their own manual forecasting, they're like five, six percent off, and understandably, right?
[00:24:19] It's human. I can't say the model was doing tremendously better. It was getting around 4% maybe going a quarter out, but it was getting way more accurate when we were doing it four weeks out.
[00:24:30] Dr Genevieve Hayes: So the wrong question was: what is the forecast one quarter out? And the right question was: what is the rolling forecast four weeks out?
[00:24:38] Nicholas Kelly: And to change their business approach to how they're doing the forecasting: make it a four-week rolling window rather than forecasting a quarter out.
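As a rough sketch of the evidence behind that kind of recommendation, here is how forecast error by horizon might be compared in Python. The data, the simulated forecasts, and the error figures are all placeholders, not the client's numbers; the point is that measuring accuracy at each horizon is what turns "quarterly or rolling four weeks?" into a question the data can answer.

```python
import numpy as np
import pandas as pd

# Simulated history of actual vs forecast weekly sales, tagged with the
# horizon (weeks ahead) at which each forecast was issued.
rng = np.random.default_rng(0)
history = pd.DataFrame({
    "horizon_weeks": [1, 2, 3, 4, 8, 13] * 4,
    "actual": rng.uniform(900, 1100, 24),
})
# Make the simulated forecast error grow with horizon, mimicking the pattern
# Nick describes: tight four weeks out, drifting by the end of the quarter.
history["forecast"] = history["actual"] * (
    1 + rng.normal(0, 0.005, 24) * history["horizon_weeks"]
)

# Mean absolute percentage error by horizon. A table like this is what backs
# a recommendation to forecast on a rolling four-week window.
history["ape"] = (history["forecast"] - history["actual"]).abs() / history["actual"]
print(history.groupby("horizon_weeks")["ape"].mean().mul(100).round(2))
```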
[00:24:46] Dr Genevieve Hayes: And the other thing you were talking about was the importance of getting people to act on your insights. Earlier in the episode you mentioned how data scientists have a tendency, if they can't find anything interesting, to maybe misinterpret something in order to make it interesting. And you often see that with a lot of academic research papers.
[00:25:06] There's been research into how the statistical analysis in research papers is often inaccurate in order to find things. So what happens if your findings are boring? What do you do then?
[00:25:21] Nicholas Kelly: I know, it's tough, isn't it? I think it's part of building trust. If you're an employee in a company or you're working with a client, and it's boring and not particularly interesting or revelatory, I think we just have to be honest about it and say, this is how it is, this time there's just not much there.
[00:25:39] Because if we look at it from the other side, and that example you gave of what's going on in academia, it erodes trust very quickly, and we're just not gonna trust those people in future when they publish something. So the case here would be, okay, well, I tell them something interesting and people just know that not to be the case,
[00:25:56] and I'm probably not gonna have a job for too much longer, or they go looking elsewhere, or there's just no trust, and then I don't get to work on cool, fun projects in the future.
[00:26:06] Okay, maybe there are two ways to answer that. If it's boring because it's boring and there's nothing interesting, yeah, I think we have to own up to it and just say, sorry, that's how it is. Or maybe it's boring, but it is interesting, and I think that comes down to how you're gonna tell the story to the business and how you're gonna make it interesting.
[00:26:25] I think there are always ways to do it. Even if it's been a very boring, uninteresting process and journey to get there, there will be a way to jazz it up. It depends on the situation, but I would think there's a way. So it's either one or the other, Dr.
[00:26:40] Hayes. But if it's the first one, let's just be honest and say, yeah, hands up. If it's just boring, make it interesting. There are ways: use AI, use ChatGPT. Here's my insight, how do I make it exciting for my business audience?
[00:26:54] Dr Genevieve Hayes: That's actually a really good piece of advice. So what's one immediate change our listeners could make tomorrow to improve how they interpret data?
[00:27:03] Nicholas Kelly: People need to focus on the human side. To try to be concise about it: understand people, understand their motivations. And going back to that comment you made about marketing, focus on the benefits; people really, ultimately, only care about what transformation they're gonna get out of anything.
[00:27:21] How's it gonna help me? How's it gonna improve me, my situation, my career? If we can put things in their terms and walk in their shoes, that's the biggest thing you should do: in your interpretation of data, and in how you're sharing it and influencing people, understand where those folks are, how this is gonna help them, and how you can convey that it's gonna help them.
[00:27:43] Dr Genevieve Hayes: For listeners who wanna get in contact with you, Nick, what can they do?
[00:27:46] Nicholas Kelly: We are at www.deliveringdataanalytics.com, that's our website, and on LinkedIn. If you just search for Nicholas Kelly, I'm the one holding my book; there are a few Nicholas Kellys, but I'm pretty active on LinkedIn. They're the two places people can find me.
[00:28:04] Dr Genevieve Hayes: There you have it: another value-packed episode to help turn your data skills into serious clout, cash, and career freedom. If you enjoyed this episode, why not make it a double? Next week, catch Nick's Value Boost, a five-minute episode where he shares one powerful tip for getting real results real fast.
[00:28:25] Make sure you're subscribed so you don't miss it. Thanks for joining me today, Nick,
[00:28:29] Nicholas Kelly: It's a pleasure. Thanks for having me on.
[00:28:31] Dr Genevieve Hayes: and for those in the audience, thanks for listening. I'm Dr. Genevieve Hayes, and this has been Value Driven Data Science.
