Episode 44: Designing Data Products People Actually Want to Use

[00:00:00] Dr Genevieve Hayes: Hello and welcome to Value Driven Data Science, brought to you by Genevieve Hayes Consulting. I'm Dr. Genevieve Hayes, and today I'm joined by Brian T. O'Neill to discuss how data scientists can use UI and UX principles to create data products people actually want to use. Brian is the founder and principal of Designing for Analytics, an independent data product UI/UX design consultancy
[00:00:28] that helps data leaders turn ML and analytics into usable, valuable data products. He also advises on product and UI/UX design for startup founders in MIT's Sandbox Innovation Fund, hosts the podcast Experiencing Data, founded the Data Product Leadership Community, and maintains a career as a professional percussionist, performing in Boston and internationally.
[00:00:57] Brian, welcome to the show.
[00:00:59] Brian T O'Neill: Yeah, thanks for having me.
[00:01:01] Dr Genevieve Hayes: As a data scientist, there's nothing worse than devoting months of your time to building a data product that appears to meet your stakeholders' every need, only to find it never gets used. It's depressing, demotivating, and can be devastating to your career. But as the old saying goes, you can lead a horse to water, but you can't make it drink.
[00:01:25] Or can you? In today's episode, you'll learn how to apply the best techniques from software product management and UI/UX design to create ML and AI products your stakeholders will love. Now, UI/UX design is not a topic that's commonly taught in most data science training programs, or in the training programs for other disciplines, such as engineering or statistics, which many data scientists have a background in.
[00:01:58] But then again, Brian, your own background is not typical of someone working in the data industry. Your background is in music, and to this day, in addition to your work in data science and analytics, you continue to lead a double life as a professional musician. So how did you get from music to UI/UX design for data science and analytics?
[00:02:23] Brian T O'Neill: Well, I don't want to bore your listeners with my life history, but yes, I am a percussionist and drummer in my other life. So this weekend I was actually playing cartoon music with a symphony. There's a touring show called Bugs Bunny Live at the Symphony, and it's all the original Bugs Bunny cartoons set to live orchestra,
[00:02:40] playing on a click track and playing some great music by Carl Stalling. So yeah, it's quite a shift moving between these two things. But I really like both of my professional lives, and they're kind of intertwined in my personal life as well. Being, you know, self-employed, as you know, the two overlap a lot more than they would in a traditional career.
[00:02:58] But how did I get started there? I have a design background and I have a music background. And my design world, which is tied to the data thing, actually came out of music school.
[00:03:10] So back in music school, I got exposed to the Internet in the late nineties, and actually the neighbor next door to me in the dorm was a bassoonist who spent most of his time on telnet, doing stuff at a command line interface. And then Netscape came out, I'm dating myself now, but the Internet was just kind of blossoming at that point.
[00:03:31] The graphical UI Internet that we think of today through the browser was kind of in its infancy there. And he was constantly showing me all this stuff that he was doing and building webpages and stuff like that. And I found that interesting, so I built a webpage for the drumline, which is the portion of the marching band that plays at the American football games and all that, and I kind of just got really interested in that space.
[00:03:53] So long story short, I ended up getting an internship at an agency that did mostly marketing website designs in the late 90s, primarily for the northern Arizona tourism industry.
[00:04:03] So I kind of cut my teeth there and got a second training while I was in music school. And by the time I graduated, I had a few clients of my own. And I was working a summer job there while also, you know, gigging and playing concerts and stuff at nighttime. So I kind of had those two lives and then after music school, I moved to Boston to kind of further both of those things.
[00:04:22] I was primarily focused on music, but I'm like, well, I'll just get a day job and then I'll gig at night. So I got a job with a startup and that was kind of my first foray into business and being a designer professionally, you know, in a company with a job.
[00:04:36] And so that was the start of that side of my career. And eventually, after moving more from web design into software, I ended up at banks. I was doing a lot of financial services UI and UX design at Fidelity, JP Morgan, Yahoo, places like those. I spent some time there.
[00:04:53] And then eventually I kind of stepped back and I was like, oh, I'm doing all this work with information products and financial services, designing portfolios and stock screeners and kind of like data stuff. But I didn't know anything about analytics, and I didn't know there was an industry for that.
[00:05:07] And I ended up doing a lot of work mid-career consulting in the storage and infrastructure software industry. I had consulting work at Dell EMC and at NetApp and another storage performance startup. They were all doing stuff around, like, root cause analysis for troubleshooting performance issues in the data center.
[00:05:29] So you can imagine this is a very data-rich environment, cause there's so much telemetry happening in the data center. And I was coming at all this from a design background. But I felt the challenges were really fun, and even though I have an arts background, I have a very analytical, logical side, and it was a fun domain to work in. And so I kind of intentionally decided to focus on that in my design consulting work, to build a practice that was no longer kind of like, I'll do design for anyone that needs it. Instead, I'm going to focus on this data product space, which I kind of call data products, but it's this work of machine learning and analytics, cause I enjoy it.
[00:06:06] I feel like I know how to relate to data professionals. While I am not a data scientist and I'm not an analyst and I've never been one, I feel like I've got this weird chemistry with that industry. It's funny, you said, like, our data industry; I kind of feel like I'm not in it.
[00:06:20] I kind of feel like I'm helping it, but I'm on the outside of it. And I don't mean that in a negative way, like I've been excluded. I just see it as, like, part of my mission here is to drive more design and human-centeredness into that work. I know that's not the goal of most data scientists, but ultimately this will actually be good for them, because what's going to happen is
[00:06:41] their work is going to matter more, because it's going to get used.
[00:06:45] Dr Genevieve Hayes: When you were talking just there, you were talking about the challenge of producing products with low uptake. And that's the worst possible thing that can happen if you're a data scientist. Whenever that's happened to me, I've always tried to figure out why this was happening, in the hope of fixing it. And from my experience of my own behavior and that of others, usually it comes down to the data scientist concluding one of two things: either that it's their own fault because, I don't know, they had bugs in their code or misinterpreted the stakeholder, or, in some cases, that it's just the stakeholder's fault for not asking for the right thing.
[00:07:24] Either of those happens. Now, obviously, your solutions center around design solutions to low uptake of products. But in general, what have you seen as being the main causes of low adoption rates for data products, even assuming the products are bug-free and exactly what the stakeholder requested?
[00:07:48] Brian T O'Neill: You said exactly what the stakeholder requested.
[00:07:51] Dr Genevieve Hayes: What they said they wanted, not what they necessarily wanted deep down.
[00:07:56] Brian T O'Neill: Well, you're getting it.
[00:07:57] Dr Genevieve Hayes: Yeah.
[00:07:58] Brian T O'Neill: Yeah, you're getting where I'm going with this. So there's a dance here, right? Which is, you want to balance giving somebody what they want with what's actually going to help them. And if you're in a service arm of the business and they look at you as a group of people who fulfills service-oriented tickets,
[00:08:21] this is a dance that has nothing to do with your data science skills at all. How do you help people with what they actually need versus what they wanted? Because inside the request is often a solution, not a problem. Or the problem is kind of messily included in there. I need Gen AI so that we can resolve our customer support tickets faster.
[00:08:44] Really? Maybe you need Gen AI, I don't know. This is at the crux of the problem. And part of this is also, we may not understand the problem space well enough. And the answer to that is not to just say, well, you're wrong.
[00:08:55] That's not really what you need. What you need is this thing. No one likes to be told they're wrong. So this direct approach also is not going to work. So now we're really in this like realm of like feelings and all this hand wavy, mushy stuff that a lot of analytically minded people don't particularly care for.
[00:09:14] But the reality is, especially with like how we make decisions, we all make decisions very emotionally. So if you're an analytical type, it might be frustrating, but there's plenty of like, studies now about why people do this kind of stuff. And you can go read up on behavioral economics and all the different biases that are out there and cognitive biases and how people make decisions.
[00:09:33] The important thing to realize, though, is that this work of figuring out what's actually needed, and how we design solutions with a user to try to fail faster and figure out whether we're on the right track, all of this kind of work is mostly not machine learning engineering and data science work.
[00:09:51] It's other work. And so I know you go to school and you get the credentials and you take the test, you do your juries or exams or whatever they call it in the data science world. But so often, like, that's not what's needed inside the organization. They don't care about best practices. They don't care about whether the model is 72 percent accurate or 74 percent accurate,
[00:10:14] because they can't understand how, for me as someone in marketing, a 2 percent accuracy difference is going to impact my life and the decisions I make as a CMO. You're talking about, like, how does a carburetor work in a car? I don't give an F how the carburetor works. Will it get me to work every day? Because I cannot miss work, or whatever their goal is, right?
[00:10:37] So we need to understand the problem space better. And probably the number one cause here is a lack of routine exposure to the actual end users of the solution. And routine means it's not something you do just right before the, let's figure out what needs to be built, phase. And I'm talking to someone who's probably an employee on a data science team inside of a large organization right now. That's sort of my main audience, as well as, you know, the data product people that work at software companies.
[00:11:10] Those are kind of the two audiences I serve. But thinking about this person that's inside of a large enterprise, this routine exposure to the people who are using this stuff means understanding: what is it like to be Genevieve? What is it like to be Mark in marketing? What is it like to be Arnold in accounting?
[00:11:28] What does Arnold care about? What would make Arnold look great to his boss, to his manager? Why is Arnold asking for this thing in the first place? If we have very little exposure to what it's like to be Arnold in accounting, it's going to be really hard to consistently produce high-value solutions for them, because we don't know what it's like to be them.
[00:11:50] And it doesn't mean we need to know everything about everybody's job. I mean, within even an hour a week of just getting exposure to how the people that are supposed to be using data currently do their work, whether it's with analytics and machine learning solutions or without them, or with a competitor's solution or whatever it may be, you can learn a lot of stuff through direct observation.
[00:12:16] Dr Genevieve Hayes: So are you suggesting sitting next to them and watching what they do? Because I can imagine a lot of people would get frustrated by this, or is it having a weekly catch up with the person?
[00:12:25] Brian T O'Neill: Yeah, in design nerd talk, they might call that shadowing or contextual inquiry or a ride-along. This very much comes from ethnographic research, but absolutely, that's exactly what I'm talking about. Now, there's ways to get permission to do that.
[00:12:40] You might need to build up a relationship to do that. I was telling you kind of about my career trajectory, and I worked a lot on, like, trading for brokerages and stuff like that, at E*Trade and BrownCo and JP Morgan, all this kind of stuff. When I was at JP Morgan, I was working on BrownCo, which was their active trader brand.
[00:12:58] So it was a brokerage for people that really like to trade quite frequently, and options traders, things like this. One of our requirements there was we had to spend, like, an hour a week at the trading desk next to a trader, listening to phone calls come in, watching them use the system, but also hearing from the direct customer, who's calling in for a reason, and understanding what it's like to be both the trader and also the customer that's paying for that stuff, and just getting to know these people, their attitudes, their frustrations, what they like, what they don't like. And this is how you get to insights such as, I remember this very specific insight around options traders.
[00:13:38] We found for our audience that options traders don't want help. So they already know what a butterfly spread is, or they want to believe that they know what a butterfly spread is, and they don't want helpful content on how to place a butterfly spread order into an options ticket, the trading ticket.
[00:13:55] That was almost offensive to them, because they're like, we are the ones that know more than anybody. And this is a relatively small thing, and it's more something that would affect the brand perception, but it's the kind of thing that analytics data is never going to tell you.
[00:14:14] So a lot of teams, when I ask them, how do you measure adoption today? Most of them jump to analytics on analytics. How many people logged into the dashboard and looked at it, how long did they look at it, how many new people looked at it? They're counting stuff, because analytics is good at counting stuff.
[00:14:31] But counting stuff doesn't ever tell you why people do it. It also doesn't tell you, did they spend two hours on Monday because they were getting two hours' worth of value out of it, or was it so hard to use that it took them two hours and maybe they left in frustration? That data will never tell you that stuff.
[00:14:50] And so therefore you really can't just look at how long people are using the dashboard or the tool or whatever the thing is. That is not a good indicator of what to change. It doesn't tell you how you're doing. It doesn't tell you what to change. It provides no why information. So at best, it's a rough barometer in the sand.
[00:15:08] I guess if you see zero, then you know, wow, we're definitely doing something wrong, because nobody's looking at it. Okay, but what should we change? You can't twist that out of the analytics. It's never going to tell you that. So yes, I am saying you do need routine exposure to these end users. And I think the positioning for that, especially if the team has had a track record of producing stuff that didn't get used, is to frame this
[00:15:35] in a perspective that the stakeholder would care about, which is, would you like us to waste another 200,000 of your money? Well, of course not. Like, Genevieve, why are you saying that to me? Well, because last year we built a model for this other team, or your team, and it didn't get used. And here's what happened.
[00:15:53] We didn't do a good job figuring out exactly what you needed. The sales team didn't trust these forecast models, because they were done with a black box technique and there was no explainability, and they couldn't figure out why you wanted them, the salespeople, to call this set of prospects instead of that set of prospects.
[00:16:11] And so they basically just tuned it out and didn't use it. And you guys spent 200,000 of, you know, inside-the-business funny money, but someone paid for that. And we don't want to put out work that doesn't get used, because it's expensive for you, and you probably don't look good to your boss, and the sales team didn't get anything out of it.
[00:16:27] The customers didn't get any benefit; everybody lost. So in order for us to do a better job this time, we need to be able to talk to the salespeople. Oh, they don't have time for that. Well, but that's what happened last time: they didn't have time for it. So what feels slow to you, this is the fast way to do it.
[00:16:44] The cheap, fast way is to let us interface with the people who really need this solution so that we can make sure we build it right or more right the first time. And we need to design it with them. Otherwise, there's a really high chance that we're going to screw this up again. And it's expensive to work with us because there's only four of us.
[00:17:04] And we have a million projects, and we want you to be successful. So that's just reframing it in a different way, which is about the stakeholder realizing, oh, this person can help me, and they actually care about my interest, which is, I need to look good to my boss, I need to cut 5 percent on cost, or I need to get 20 percent more leads in the funnel, or whatever the heck the thing is. But we're reframing it in a way where they understand that direct access to users is critical.
[00:17:32] Otherwise we are designing on guesses and we're not designing on fact. And qualitative data is still data. We want data to drive our decision making, and qualitative data is data. It's just a different kind.
[00:17:48] Dr Genevieve Hayes: So you've got your client, they come to you with a request. You say, okay, we've got to spend time with the end users. You've done your time sitting next to brokers or whoever your end users are. What's the next step in fulfilling that request that they made to you,
[00:18:05] I don't know, a few weeks back?
[00:18:06] Brian T O'Neill: So there's lots of levels to this, right? And it's usually not one conversation. It's going to be setting up multiple different conversations and you might need to talk to stakeholders and to end users. And maybe to managers of end users who maybe don't directly touch the thing, but maybe they have some influence on the people who actually do touch the solution.
[00:18:28] There can be a lot of different conversations that need to happen, but usually at some point you have all this raw data, and that data needs to be synthesized. So we need to look for patterns in that. And a lot of times people will say, well, how many people do you need to talk to? And this is like one of the most classic UX research questions out there.
[00:18:46] How many people do I have to talk to? And the general answer is, until you stop hearing new information. And a lot of times, it seems like under 10 people, you're going to stop hearing new stuff. You're going to start hearing repetition, at which point you probably have a good sense: all right, this is how this group sees the world.
[00:19:04] They see it like this. Now we have some insight. Now we have some direction, because they feel this way. They hate doing end-of-whatever, end-of-the-quarter reporting. The accountants can't stand it. It takes forever. Like, there's bugs in the data, or then they've got to merge it with this CSV,
[00:19:20] and then they've got to double-check it against last quarter's numbers. Oh, now I've got to load it into this tool and do all this stuff. And then they run a model on it and blah, blah, blah. They can't stand that work. All they want to do is get that tooling stuff done so they can get to the projections piece, which they have to put into a PowerPoint deck to make their manager happy.
[00:19:39] All right, look at all that stuff there. Well, could we use AI for that? Well, maybe, but what if we just took the thing they hate the most? Like, we could clean up all the data in step four and push that out in the next two weeks, within this overall process of, like, help the accounting team be more efficient.
[00:19:59] That's the business objective. And instead of biting off the whole thing, which is like, completely transform how the accountants do their work, it's like, well, what if we just solve that pain point they have, which is meshing last quarter's data with this quarter's data and all this tooling, and then fixing all the gaps in the data?
[00:20:18] Maybe your machine learning algorithm can go in and clean all that stuff up. Or it could use some synthetic data in place of wherever there are nulls. We're going to load in the synthetic data that we think is valid and they already know exists; it's just too hard to use, blah, blah, blah. The impact is that you have a happier accountant.
[00:20:34] They're bought into the process now, because they know that you had their interest in mind. So when I say there's business outcomes and there's user experience outcomes, the UX outcome is: the accountant is a lot happier, because I don't have to spend as much time doing tool-time work that I hate. I can be an accountant, and mingling data together is not really the accounting part.
[00:20:53] That's bookkeeping and that's just prep. It's like prep work. That's not the actual cooking. It's the chopping of the vegetables. I don't want to chop vegetables. I am too important. I am too senior to be doing that. You just made their life better. But the business promise is also like, did you know your accounting team is now spending more time doing actual accounting and less time doing data engineering work, because we built this little thing to fix that.
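To make that concrete, here is a minimal sketch of the kind of narrow fix Brian is describing: merge the two quarterly extracts and fill the gaps so the accountant can skip straight to the projections step. The column names, the pandas-based approach, and the median imputation are illustrative assumptions, not something prescribed in the conversation.

```python
import pandas as pd

def prepare_quarterly_data(last_q: pd.DataFrame, this_q: pd.DataFrame) -> pd.DataFrame:
    """Merge last quarter's extract into this quarter's and fill the gaps."""
    merged = this_q.merge(
        last_q[["account_id", "amount"]].rename(columns={"amount": "amount_last_q"}),
        on="account_id",
        how="left",
    )
    # Flag the rows the accountant would otherwise have to chase down by hand
    merged["amount_was_missing"] = merged["amount"].isna()
    # Stand-in imputation: per-category median (a trained model or a business
    # rule could replace this step without changing the overall workflow)
    merged["amount"] = merged["amount"].fillna(
        merged.groupby("category")["amount"].transform("median")
    )
    return merged
```

The imputation technique barely matters here; the point is that the output lands already merged, flagged, and gap-free, which is the pain point the accountant actually cared about.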
[00:21:15] Now you can assign a business value to that. You can quantify that value in cost savings, in terms of labor or whatever. And generally, a data product manager should be owning the creation of benefits. They don't own a piece of technology or a product or a platform, especially; they are there to
[00:21:33] generate benefits for users and for the business. You own the outcome, not the output. And this is a core thing in my thinking. There's a book on this topic, but I'm a huge believer in outcomes over outputs, and a lot of data teams and engineering teams, and also design teams, are very focused on generating features, models, dashboards, the nouns.
[00:21:55] They're very focused on producing nouns, because nouns are what we're asked for. And it's easy to deliver a noun, because you can look at it and say, there it is. It wasn't there yesterday, and now it's there. I did what I was supposed to do. For leaders and people in management, and especially in senior management, that's not good enough.
[00:22:12] Dr Genevieve Hayes: So what they say is I want a dashboard, but what they really mean is I want the information needed to make this decision. Is that what you're getting at?
[00:22:21] Brian T O'Neill: And a great one is, like, you put a "so that" at the end of it. You want a dashboard? Great. We are the dashboard kings and queens. What do you want it to do for you? You want a dashboard so that, what? Like, tell me what it's going to do for you. Like, let's say I could just snap my fingers and make one for you right now.
[00:22:39] Like what would make your life better if you had this accounting dashboard? What would it do? How would your life be better?
[00:22:46] Dr Genevieve Hayes: Dashboards are actually something that I was wanting to get your opinion on, because in one of the emails you sent to your mailing list recently, you made a comment about how self serve analytics was often promoted as a feature of dashboarding tools, whereas a lot of end users consider it to be a cost.
[00:23:04] And I thought that was a really interesting point, because I've always had my doubts about that whole, you know, look, it's great, you can do your own analytics and slice and dice your own data. And what senior executive in their right mind wants to slice and dice their own data? It's time consuming.
[00:23:23] And a lot of people just don't feel comfortable doing it. They want the data team to do it so that they can trust the outputs. And so, yeah, what are your thoughts about self serve analytics and dashboarding?
[00:23:38] Brian T O'Neill: Okay. This is a big conversation. My bottom line on it is that there are times where customization makes sense. And the idea of empowering people to do this stuff themselves is a noble one. In practice, most of the time it's used to not have to make hard decisions about what the design of the solution should be and what the scope of it should be.
[00:24:01] So by saying we're going to allow it to be self service, we can serve the blank page to the user and say, here's a blank page. Have at it. Whatever you want, the world is yours. And when I talk about it right there, it sounds like a benefit. Wow. Who wouldn't want that? And guess what?
[00:24:20] If you ask users what they want, if you say, would you like it if you could customize this Tableau dashboard? Of course. Oh my God, Genevieve, that would be amazing. I would love to be able to do that. They will, of course, tell you that, because it sounds like if they said no, you're taking away power from them.
[00:24:35] Well, who wants to feel like they're not empowered? Nobody. So, of course, they're going to say, yes, it sounds great because they think it means whatever I need, I can just, like, tell the tool and then it will do it. Until they actually try to do it, and then reality kicks in. And now we're in the world of user experience design and user interface design.
[00:24:55] Well, how do we make it easy for them to do self-service? It's hard enough to build good dashboards and data products that are not self-service or heavily customizable; building a customizable one is even harder. It's hard enough for professional design teams, because you have so many variables. What levers and knobs and switches are we going to give access to?
[00:25:15] Which ones are we not? How can they shoot themselves in the foot with this thing? There are so many considerations there. Now, some of this has gotten better, like natural language interfaces can be really interesting for this, assuming that they're actually spitting back reliable information. So I think there's some promise there with what large language models, for example, are doing.
[00:25:35] And if you combine that with, like, RAG, now we're getting into some interesting spaces here, where maybe I can interrogate the system and get back business intelligence types of responses, quantifiable, numerical types of information that's not just guessed.
[00:25:51] But that's also an experience that I think needs some design treatment. I think if you just simply say, here's all the SQL tables, or here's the data lake, or whatever your big oceanic reference to data is at your company, a lake, an ocean, or whatever you call it. They're always water.
[00:26:06] I don't know why they're always water, but here's your big body of water with data in it, and here's an LLM, and you slap it on top and you have self-service? I don't think so. I'm pretty sure you're going to find out that doesn't work for some reason. But the goal is noble, and there may be a way to provide a lot of value, maybe on some discrete use cases.
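For a sense of what that "LLM plus retrieval over your data" shape could look like, here is a very rough sketch. The helper functions retrieve_table_snippets and call_llm are hypothetical placeholders rather than any real library; the only point is that retrieval grounds the answer in actual warehouse data instead of letting the model guess.

```python
from typing import List

def retrieve_table_snippets(question: str, top_k: int = 3) -> List[str]:
    # Hypothetical placeholder: look up the schema, column descriptions, or
    # pre-aggregated figures most relevant to the question, e.g. via
    # embeddings over your warehouse metadata.
    raise NotImplementedError

def call_llm(prompt: str) -> str:
    # Hypothetical placeholder: call whichever LLM endpoint you use.
    raise NotImplementedError

def answer_business_question(question: str) -> str:
    """Retrieval-augmented answering: ground the model in retrieved data."""
    snippets = retrieve_table_snippets(question)
    prompt = (
        "Answer the question using ONLY the data below. "
        "If the data does not contain the answer, say so.\n\n"
        "Data:\n" + "\n".join(snippets) + "\n\n"
        "Question: " + question
    )
    return call_llm(prompt)
```

Even in this toy form, most of the design work Brian is pointing at sits outside the code: which questions the retrieval layer can actually support, and how the answer earns the user's trust.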
[00:26:25] But I do think design is still going to be required to actually get people on board with this, to trust the information that's coming back. Trust means that they need to understand... well, you need to understand what would make them not trust it. Again, analytics won't tell you this.
[00:26:42] You need to talk to them about why they didn't believe last year's numbers. Like, what was wrong with it? Was it missing data? Was it that the model predicted stuff that's way out of bounds with their idea of what reality is? Like, there's so many human factors things that get in the way here. And these are not data science problems.
[00:27:01] These are human factors types of issues. So, dashboards: I like the idea of customizing. It takes a lot of effort to do a really great job with self-service. It doesn't mean you can't, but it's hard enough to build a non-self-servicey tool. Or, I mean, a non-customizable one too. I'm not saying self-service is bad.
[00:27:22] I'm sorry, I'm mixing metaphors. I like the idea of self-service. I want people to be able to go in and see the information themselves and not have to go ask an analyst a question, which is very slow. But if we're talking about, you build it yourself with all the Lego bricks, like, I've had conversations in the DPLC about this, I'm a firm believer that data platforms are not data products.
[00:27:42] A platform is an enabling technology on which a product could be made. It's not a data product. It's a platform. It's an enablement tool. Data products are more about benefits. Data products are more about the verb of applying product-oriented ways of doing the work than they are about defining the final output. Is it a dashboard? Is that a data product? I don't know. Let's stop asking that question.
[00:28:06] What's important is, did we deliver a benefit or not? So if we have this in mind, it opens up our possibilities to, like, well, maybe just developing the model is not enough. Maybe the model needs to be expressed in a tool, and we need to get the right skills on board to make sure that the tool properly embodies the model, so that the user can get the benefit that we found out they need during our research. It's not enough to just ship the model. And maybe you, dear data science manager listening to the show, you're like, that's not my job.
[00:28:38] That's okay. But you should recognize that that work needs to happen, whether your team does it or not. If you're taking the product-oriented definition of data products and you want to generate business value, then adoption has to happen before you can get to business value. So someone needs to be overseeing that work of ensuring utility, usability, ease of use, maybe even delight.
[00:29:03] Someone needs to do that work if you want to create data products. And a data product for me means an end-to-end, human-in-the-loop decision support solution that's so good someone would pay for it or exchange something of value to use it. If you're listening out there and you work on an internal data science team, exchange something of value to use it is probably the most important part for you.
[00:29:31] What does that mean? I'm willing to give up my spreadsheet. I'm a user and I'm willing to give up my spreadsheet. I'm willing to stop doing this thing the old way. I'm willing to take the time to use this solution and change the way I used to do stuff. Because this way is so much better.
[00:29:48] Genevieve, I see the value of this machine learning model that you built for us. This is awesome. And it fits what I need to do. You've made my life better. I'm willing to put away my old spreadsheet and use this thing now because it's that good. That to me is the test of whether you built a data product or not.
[00:30:06] Dr Genevieve Hayes: And ChatGPT would definitely meet that definition because people have actually turned away from traditional search engines like Google in order to use ChatGPT instead. So they are effectively giving away something of value in order to go over to ChatGPT.
[00:30:23] Brian T O'Neill: That's a good example of, yes, they were willing to switch their behavior, especially something like, stop using Google, for example, which is a very default search behavior for many people. So yeah, I would say that that happens, and it doesn't mean we all need to go out and build a giant large language model in order to have a data product.
[00:30:39] It doesn't mean that. The principle is more important here, this idea of benefits generation, this idea that there has to be an exchange. There has to be money exchanged, because money is this, like, placeholder of value. It's like, I'm willing to give up something of value to me to get access to that thing that you made for me. Or their time, or their habits, or their reputation, or something.
[00:31:03] Those are good signs to me that you've done a great job. Because they're excited, they're feeling the impact, there's money exchanging hands, those are the signs you're looking for as a data product manager that you're creating benefits.
[00:31:16] Dr Genevieve Hayes: And if we use the same analogy, for most people, so I'll exclude data scientists, computer programmers, etc., the GPT model that underpins ChatGPT wouldn't be of value, because they don't have the skills necessary to convert it into a ChatGPT or use it. So it's only once it's got that chatbot interface on top of it that it actually becomes something of value.
[00:31:41] Brian T O'Neill: Yeah, and it's not even so much just, like, oh, just add a UI. It's whatever's necessary to allow the end user to get something, a benefit, from the underlying data science stuff. In this case, you're right. It's a UI wrapper. It's basically a chatbot interface.
[00:31:58] It's fairly rudimentary. It's a CLI, basically, if you think about it, with nicer fonts, cause we're not limited to Courier. And we're at the infancy right now with these systems. I think there's a lot of challenges with that kind of interface, but we're very much at the beginning with all this LLM stuff.
[00:32:15] And that interface is fine. Like, right now it's version one, and part of it's like, well, there's this infinite number of things you can type into the box. Does that mean there's an infinite number of experiences? Sort of. Right now, that's a really hard problem. Like, how would you customize the interface for every possible interrogative that someone types into this thing?
[00:32:34] Wow! Talk about a design challenge. Like, that's a really hard one to do. But yes, you're getting the idea. We have to connect it to a value that they're willing to pay for or change their behavior to get access to the new thing. And they're feeling that delight. Now we're on the right track. Now you're playing the game, the data product game.
[00:32:53] The right way, in my opinion.
[00:32:56] Dr Genevieve Hayes: So for people who aren't used to design, I mean, I understand what you're getting at, but how do you actually produce a solution that does tick those boxes?
[00:33:08] Brian T O'Neill: Well, I mean, I have a whole course called Designing Human-Centered Data Products that goes into the nuts and bolts, and I've tried to break that down into eight modules. There's a lot of stuff out there, just like with data science. If you tried to, like, build a course on data science for somebody,
[00:33:24] it's like, where do you begin? It seems like such a huge world of stuff. I've tried to make mine like it's for data people. It's not for designers. I mean, some designers take the course, but I've really tried to make it for data professionals. You don't need every technique in the world. You need a few things.
[00:33:39] You need these guiding principles. You need a recipe to follow for a few things. So my course tries to do that and to distill it down just to the essentials. Cause I feel like if you can get from zero to one with changing your behavior and adopting these product design and product management methodologies from the software world, if you can just get from zero to one, inactive to active, there's such a reward there. Because I know, especially with all the people that have PhDs and extensive degrees,
[00:34:10] Y'all are really smart. You can get from one to two. The hard part is the zero to one. So that's kind of what I focused on with the course. When I designed it, it's like, what's the minimum level of effort people need to take to do this.
[00:34:25] And I will say it again: customer exposure time, user exposure time. Which helps us then define the problem space correctly. Because I think a lot of data solutions tend to solve a different problem than the one the person actually had. This is the way to get better at solving the right problem.
[00:34:44] Part of that is because we do it with them and not for them. So this is another design principle, and this is not my idea. I forget where I heard it, but I love this. We don't design data products for users. We design them with them. They are part of how we make stuff. It's not a weird thing. It's just the normal way to do it.
[00:35:02] And we have to frame it like it's just normal. Like, I would almost play stupid with a stakeholder. Like, wait a second. You want us to build something for the sales team and never talk to them? I'm like, I've never sold anything in my life. How am I possibly going to do a good job telling the salespeople?
[00:35:16] Like, how do we build a predictive model? Like use this really crazy, complicated technology. How are we going to hit a home run for them? My team doesn't sell stuff. Oh, well, that's a good point. I guess. Fine. We'll let you talk to them next week. Great. Cause it's in their benefit to have something that works for them, right?
[00:35:35] Again, getting back to that thing. I want to create a benefit for you, dear head of sales. I can't do that. If I don't know anything about sales, we need to go in and talk to them. We need to understand how they do stuff today. How do they sell today? What's wrong with the way they do it now?
[00:35:48] What's wrong with the CRM? Why can't you just do it the way you're doing it? Where are they annoyed? Where do they use data right now? Do they use it at all? Do they even want to use it? We can build models out the wazoo. It doesn't mean anyone's going to use them.
[00:36:01] Dr Genevieve Hayes: So one of the things you discuss on your website, which I find interesting is the CED framework for designing useful analytics solutions. How does that fit into all of this?
[00:36:12] Brian T O'Neill: So when we get into, like, the creation of the interfaces, particularly dashboards and information interfaces and stuff like this, you can think of it as almost like a module in my overall course, or in my overall thinking about how we design human-centered data products, right?
[00:36:27] So this is really getting into more of the visual design piece, not graphics and fonts, but kind of like, what's the overall user experience strategy or approach that we would take with this data product? If you go to designingforanalytics.com/ced, just the letters CED, lowercase, you can read the framework there.
[00:36:45] There's also a podcast recording that will go over it. But it's pretty simple. It's three letters, and they just stand for three words: conclusions, evidence, and data. And the idea here is that data products need to get used before they will produce behavior change, such as making a decision. At least in the analytical space, we're designing stuff so that people make better decisions.
[00:37:11] Like, the underpinning of a lot of this is decision support. We want the interfaces, as much as possible, to lead in the experience with a conclusion, or, as one of my past podcast guests called them, opinions. He's like, if a machine generated it, I like to think of it as more of an opinion than a conclusion.
[00:37:30] That was Gadi Oren. The point there is leading with the conclusion. Is there a way to produce a meaningful conclusion from the data here?
[00:37:38] And the second stage, this is like drilling down, or cutting the data this way, or a pivot table, or all this kind of stuff that data people think is a benefit. This is effectively evidence for the conclusion. And where a lot of dashboards get it wrong is, when I look at them, it's like, great, you show me all this evidence.
[00:37:57] For what? What is this evidence for? It doesn't support any conclusions. And it's like, so what is the user supposed to get out of this? And sometimes this is really hard. Maybe the data isn't so basic that it just supports an obvious logical conclusion. But I almost always feel like there's room for improvement.
[00:38:17] And if we're just shoveling evidence at people, we're going to lose them. Because no user of these services really wants machine learning and analytics. They just want to be empowered to make better decisions. They want to move faster, work faster, look better to their boss, make more money.
[00:38:34] They want some kind of value or benefit to themselves from looking at and using machine learning solutions and AI and analytics. For 90 percent of the people out there, it's not that; maybe there's some that really like it, but they want some benefit from it. So we need to support that idea of, what does a successful outcome look like for that user?
[00:38:54] Were they able to grok the conclusion from the dashboard, or did they have to eyeball and analyze all of the evidence and come up with something themselves? That is putting a tremendous amount of tax on the user to do work that a computer is probably better at doing. So how can we design with that conclusion mindset first, then gradually give access to the evidence?
[00:39:18] And there are tools sometimes where you may find, over time, the user starts to trust the conclusions so much that they don't need to be convinced with the evidence. They don't even care about the evidence anymore. I just get an email alert when the sales forecast goes one deviation outside of the band. I'm going to get an email from this system, and I don't get those very often, but I pay attention when they come. What's the proof?
[00:39:41] Over time, that salesperson may not care what the evidence is, because he's learned to trust it. He was part of the design decisions; he knows what generates the alerts, because he worked with Genevieve and her data team. He knows what data went into the system, or maybe the email tells him, like, hey, we've checked this and we ran a forecast against this.
[00:40:02] And this system does not know anything about Salesforce, but it does know about this. And our conclusion is that sales are going to underperform by 27 percent in the next quarter if you don't do anything. The proof? Again, maybe I don't need to click through and look at the Tableau dashboard of the tool, because the email, with a couple of sentences and some whatever, gave me enough of a conclusion there to take an action, which is, I need to call my sales team and have a meeting right now and figure out what's going on. Pat, you did your job,
[00:40:30] in that it told me that I need to have a meeting, I need to call somebody, I need to call the police or whatever it is, whoever you call when you have a sales emergency. That's what it was supposed to do: not let this hit me in the face in six months, but tell me right now, ahead of time, when there's still time to do something about it. That's a great data experience to me.
[00:40:47] Cause I didn't have to go look at all the evidence. The salesperson was able to make a decision without digging through the evidence to try to figure out what does it mean for me as a sales leader? They don't want to do that work. They want the tool to do that work. And by the way, AI sounds like it does all of that stuff.
[00:41:04] You put this thing called AI on it and it's just like, wow, that's going to finally do all this stuff for me. It sounds like magic, and it is magic. Like, I barely have my head around how these LLMs work. And then I find a lot of the data people don't even know how they work, and then I feel a little bit better about myself.
[00:41:20] But we're talking about really magical stuff here. And I think it creates this expectation of, look at what's possible. I can do it with ChatGPT, so why can't your team build something like that for us? I'm tired of looking at Tableau. I don't have time for that. So anyhow, back to CED.
[00:41:36] That's basically the framework. D is for data. Most of the time, no user wants to go in there, but if you really need to show them what all the underlying data is that drove the evidence, that drove the conclusions, then maybe you give them access to that stuff, download a CSV or whatever. This is a conceptual framework for how we design experiences, dashboard and data product experiences. CED: conclusions, evidence, data.
[00:41:59] It's more conceptual than it is like a recipe. So I don't know if that's helpful.
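As a toy illustration of the CED ordering, here is what the sales forecast alert Brian described might look like in code: the conclusion leads, the evidence sits one layer down, and the raw data is only a link away. The one-standard-deviation threshold, the field names, and the numbers are assumptions made up for the example, not anything specified in the conversation.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CEDAlert:
    conclusion: str                               # what the user should take away
    evidence: dict = field(default_factory=dict)  # the numbers behind it, one layer down
    data_link: str = ""                           # drill-down to the underlying data

def build_forecast_alert(forecast: float, target: float, stdev: float) -> Optional[CEDAlert]:
    shortfall = target - forecast
    # Only interrupt the user when the forecast is more than one deviation under target
    if shortfall <= stdev:
        return None
    pct = 100 * shortfall / target
    return CEDAlert(
        conclusion=(
            f"Sales are on track to miss next quarter's target by about {pct:.0f}% "
            "unless something changes."
        ),
        evidence={"forecast": forecast, "target": target, "std_dev": stdev},
        data_link="https://example.com/dashboards/sales-forecast",  # placeholder link
    )

alert = build_forecast_alert(forecast=730_000, target=1_000_000, stdev=120_000)
if alert:
    print(alert.conclusion)  # the email leads with this line; evidence and data follow on demand
```

The code is trivial on purpose; the design decision it encodes, lead with the conclusion and make the evidence optional, is the part that comes out of understanding the user.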
[00:42:04] Dr Genevieve Hayes: So if we had a sort of trading type example, it might be, you should buy shares in this particular stock. That would be the conclusion. The evidence might be because this thing has gone up and this thing has gone down and here are some nice pretty graphs.
[00:42:22] And then the data would be the underlying...
[00:42:26] Brian T O'Neill: Correct. And part of this, too, is that evidence thing. This is, again, where doing a good job with this means you really need to understand, in your case, the trader. Like, what is it that they would need to believe that a machine could properly predict how to place the trade? Because they know that it doesn't factually know what's going to happen in the future.
[00:42:48] The users all know this, right? It's a prediction. But if they understand that the team that made it for us factored in stuff like my trading behavior, and I don't want it to factor in my team's trading behavior, I only want my stuff, cause I am kind of playing this game against my fellow colleagues that are also traders.
[00:43:09] I'm totally making this up, by the way, but you get the idea, right? I know that it did that. The interface told me that it did that. Why did it tell me? Because during the research process, we found out it was really important, with any of this predictive analytics stuff, that they know what's being predicted and what's not, and what went into the prediction.
[00:43:28] The transparency of the model is really important to this audience. So we have chosen to display some of that information that maybe we wouldn't show to the head of sales, cause they don't care as much about when the data was refreshed, and, like, which system, and is it real time and streaming, or is it just delayed real time?
[00:43:45] Whatever, blah, blah, blah. They don't care. We wouldn't know that, though, unless we spend time with that trader to really understand what it's like to be them. What's their mentality? Oh, they're actually kind of like gamblers. For them, this is a game. There's real money, but it's almost like a game.
[00:44:00] They love this stuff. It's adrenaline. It's like, you watch Billions or one of these shows, and you can see that mentality of, like, a hedge fund trader. This is like a game. It's basically like adult gaming in a lot of ways. I mean, there's real money on the line here, but
[00:44:15] there's a personality type and attitudes about that. And our designs and our solutions should encapsulate some of that, because there's no such thing as one solution that's right for everybody. There isn't. There's just favoring some people over others. And I think the design should be opinionated.
[00:44:32] Because there's no such thing as like the average user.
[00:44:35] But if you want the data science stuff to get used, the work that you did, the models and all this, you have to think about all this other work. You might need to do some design, some research, some product management work.
[00:44:47] You may need to know something about legal and ethics, and you need to know what's hard about how we price stuff, how we price the products. What's the legacy there? What's everyone scared of that they're not talking about? These things all become determining factors in whether or not our little AI work that was just a Jira ticket actually gets used or not.
[00:45:05] Dr Genevieve Hayes: On that note, what final advice would you give to data scientists looking to create business value from data?
[00:45:12] Brian T O'Neill: Well, again, I'm a big fan of getting access to the people that are going to use those solutions. I think that underpins all of this. So you've got to increase that exposure time. So if you're a manager, I think mandating some regular cycle of exposure is worth it. I mean, you'll hear this in the software world too: you don't want to, like, protect your engineers from the business.
[00:45:37] Keep them over there, keep the lights down, do not let anyone get access to them. No, you want the engineers in front of the customer. You want them to see people using the current products and solutions in the wild, because great stuff can happen when a maker, a talented maker and engineer, and you can replace this with data scientists too, actually sees what it's like in the wild for a real person,
[00:46:01] Kate, Roger, or whatever their name is, actually doing something and struggling with the current Tableau dashboard, the current app or LLM or whatever the heck it is. That's when magic can happen, because now they're like, wow, I know what it's like for Kate. Like, these stories are powerful. Stories move us.
[00:46:17] And when we actually witness someone using a frustrating solution, or you see someone, like, taking 10 steps to do something, you're like, oh my gosh, we could have solved this, like, a year ago in a couple of weeks. We could have built a model to clean up that data there. They're manually going through Excel and typing numbers in.
[00:46:37] Oh, my gosh. And when a data scientist sees that, or anyone who's a maker, it creates this real empathy, which is like, wow, I had no idea the accounting team wasted this much time doing this stuff. Let us help. Please let us help. But they're never going to do that if they just read stuff in Jira tickets, because Kate's not going to say, I'm so frustrated because, as an accountant, each month I go and do blah, blah, blah.
[00:47:00] Nobody talks like that. They just tell you what they want. They think they're being helpful by saying, I want an AI model that can predict sales for next quarter. They think they're being helpful to you by telling you the thing they need, but they're not telling you the benefit.
[00:47:14] They're telling you the thing. And so we run off and build the thing. And then we wonder why they're not happy at the end. Cause the details matter.
[00:47:24] Dr Genevieve Hayes: And for listeners who want to learn more about you or get in contact, what can they do?
[00:47:30] Brian T O'Neill: Well, I publish an insights mailing list, little dispatches every Tuesday, and I've been doing that for seven years, something like that. That's at designingforanalytics.com/list. I also have a podcast called Experiencing Data, where I mostly interview data product management professionals, product management professionals, people working at this intersection of data science, design, and product.
[00:47:50] That's kind of my thing. It's a non-technical show, so if you're looking for how to build the latest, you know, whatever technical stuff, it's completely not a good place for you to go. If you do care about all this human factors stuff, my podcast is a good place. I have lots of resources on the website as well, designingforanalytics.com:
[00:48:07] the CED framework, a bunch of different services there. And I did launch the community last year, so we have this Data Product Leadership Community, which is connecting data product professionals who are really applying these product- and design-oriented methods to doing machine learning and analytics work.
[00:48:24] So we have a community, we meet every month and do webinars and meet-and-greets and stuff like that. And we're possibly doing a conference; there's been a little talk about maybe doing a little symposium in real life coming up.
[00:48:34] Dr Genevieve Hayes: Sounds awesome. So thank you for joining me today, Brian.
[00:48:38] Brian T O'Neill: Yeah, it's been great. Thanks for letting me share my ideas, and hopefully I didn't offend anybody too badly, but I really appreciate the time to talk here. I am not one of these designers or product people that thinks, like, only people that have this training can do this stuff. I think data professionals can totally do this stuff.
[00:48:57] Most of it is being willing to change how you work and to kind of change your framing a little bit there. Some people are not wired for this work. That's okay. But if you're on the hook to really provide value and, like, show the value of your team, this stuff really matters, because if no one's using it, there's no way you're getting business value. You have to solve the adoption problem somehow, and there are proven ways to do this.
[00:49:22] There's no need to go it alone. There's no need to start from scratch. This is what the software product design and product management industry has been doing for years. Take it, steal it, make it your own. You will have a better impact. It's good for you. It's good for them. It's good for the users.
[00:49:38] Dr Genevieve Hayes: Okay. Thank you again. And for those in the audience, thank you for listening. I'm Dr. Genevieve Hayes, and this has been Value Driven Data Science brought to you by Genevieve Hayes Consulting.
