Episode 21: Responsible Data Sourcing for AI Model Building


00:00:00 Dr Genevieve Hayes
Hello and welcome to Value Driven Data Science, brought to you by Genevieve Hayes Consulting. I'm your host, Dr Genevieve Hayes, and today I'm joined by Dr Kate Bower to discuss the responsible sourcing of data for AI model building. Kate is a consumer data advocate for Australian consumer advocacy group CHOICE,
00:00:22 Dr Genevieve Hayes
following a previous career in academia where her focus was on qualitative health research. Kate, welcome to the show.
00:00:30 Dr Kate Bower
Thank you for having me.
00:00:30 Dr Genevieve Hayes
Now, the saying goes that if you're not paying for the product, then you are the product, and people are becoming increasingly aware of the fact that every time they interact with the digital world, there's a good chance their data is going to be harvested for some alternative use. I don't think anyone likes the idea of being the product sold
00:00:52 Dr Genevieve Hayes
by a Silicon Valley tech giant. But when you're just one person, there's nothing really you can do.
00:00:58 Dr Genevieve Hayes
Or maybe, as a data scientist, you think being able to access huge volumes of data online is a dream come true.
00:01:07 Dr Genevieve Hayes
But just because you can harvest large quantities of data from the Internet doesn't necessarily mean that this is something that you should do.
00:01:15 Dr Genevieve Hayes
We're going to be considering data sourcing from both perspectives in this episode, but particularly from the perspective of the consumer, because that's what CHOICE is all about.
00:01:27 Dr Genevieve Hayes
In Australia, CHOICE is a very respected organisation.
00:01:32 Dr Genevieve Hayes
I remember, back when I was in primary school, my mum taking me to the library to access CHOICE magazine for use in my assignments.
00:01:40 Dr Genevieve Hayes
I'm not sure if the paper copy is still around, but it's definitely still in existence on the Internet.
00:01:45 Dr Kate Bower
Yeah, that's right. So CHOICE has been around for more than 60 years. Many people came to know CHOICE through the publication of our magazine, which is still in circulation,
00:01:54 Dr Kate Bower
but most of our members are now digital-only. We were created because consumers felt there was a need to be able to speak back against big business.
00:02:06 Dr Kate Bower
So we've had the same mission since the beginning, since 1959, which is to fight for fair, safe and just markets for Australian consumers.
00:02:14 Dr Kate Bower
And that's what we've been doing. Many people know us for our product testing. I'm sitting right now, in fact, in one of our labs.
00:02:21 Dr Kate Bower
We're known for our product testing and our labs: we test fridges and washing machines and soundbars and laptops, just about any consumer product that you can buy in the market, we test it here in our purpose-built labs. But we've also started considering
00:02:36 Dr Kate Bower
other types of products and services that consumers come into contact with in their everyday lives, and that's kind of where my job comes in.
00:02:43 Dr Genevieve Hayes
And that's a good segue into my first question, which is: you have the coolest-sounding job ever, consumer data
00:02:52 Dr Genevieve Hayes
advocate. What exactly does your role involve?
00:02:55 Dr Kate Bower
Well, I'm glad you like it, because I actually came up with the job title myself, and I think I might be the only one in the world.
00:03:02 Dr Kate Bower
So I'm glad that it sounds exciting to other people, and it is a pretty exciting job. So, when my team was started, and I say team, it's actually just the two of us working hard
00:03:12 Dr Kate Bower
on this particular issue. A couple of years ago, CHOICE's board started thinking
00:03:16 Dr Kate Bower
through: where do we think the harms are going to be happening for consumers into the future, and how do we prepare ourselves now, and prepare consumers now, to grapple with those sorts of harms?
00:03:28 Dr Kate Bower
And that's where the idea of investigating data misuse came out. So, interestingly for me, it's not simply digital spaces, it's not
00:03:38 Dr Kate Bower
the Internet or digital platforms, even though many of those things come into the space, but actually thinking through
00:03:45 Dr Kate Bower
data. Everyone likes to say data is the new oil; more frequently now, I think, we say it's the new uranium, the toxic asset in your organisation.
00:03:57 Dr Kate Bower
But data has become almost like a secondary economy. I think of it as the byproduct of almost every product and service. Every business is dealing in data
00:04:06 Dr Kate Bower
in the same way that they might be selling fridges, but now even your fridge collects data on you; your car collects data on you.
00:04:13 Dr Kate Bower
There are really very, very few products or services in the market that don't involve some kind of data collection.
00:04:19 Dr Kate Bower
And with those changes in the market, we've seen a whole range of different interactions that consumers have to navigate.
00:04:27 Dr Kate Bower
Suddenly, things like terms and conditions and privacy policies become really important for consumers, but we know that they're incredibly hard to navigate. So really, my team was set up to make an impact in this space,
00:04:40 Dr Kate Bower
to think about what harms are happening to consumers and how we can make the market fair, safe and just, in the same way that we do for other products.
00:04:49 Dr Genevieve Hayes
One of the things I feel as a consumer, when it comes to my own personal data, is that I basically have no rights whatsoever.
00:04:57 Dr Genevieve Hayes
I have to use the Internet; every organisation, including the government, is basically trying to force you onto the Internet these days. But if you touch the Internet, I suspect that every click,
00:05:10 Dr Genevieve Hayes
every word you write, is being recorded and stored in a database somewhere for future analysis. What rights do I, as an individual consumer, actually have when it comes to my own personal data?
00:05:22 Dr Kate Bower
Well, I think.
00:05:23 Dr Kate Bower
this is one of the critically important issues that we're looking at: this issue of access, even, to use these products or services.
00:05:31 Dr Kate Bower
As I mentioned, it's not even just the Internet anymore; literally every product you buy is an Internet of Things or smart product and wants to collect some kind of data on you. For example,
00:05:42 Dr Kate Bower
new cars are coming with their own 4G or 5G installed and are sending data back to manufacturers without you even knowing. They're not even using your device to do that; they're sending telemetry back to manufacturers. So really, it is everywhere,
00:06:00 Dr Kate Bower
and it goes back to the fundamentals of being an informed consumer. Traditionally, we've been able to vote with our feet: if we have good price transparency, if we can compare products (and this is what CHOICE has been good at, comparing products in the market), then we can choose which one we think is suitable for us. But the space we've
00:06:20 Dr Kate Bower
entered into, because data is being collected by every product and every service, is one where we've lost that choice.
00:06:26 Dr Kate Bower
And really, consumers currently have very few rights. There are some rights, and we can go into them a bit more, in the Privacy Act, for example.
00:06:33 Dr Kate Bower
But basically, at the moment, the Privacy Act has been under review, and that review has suggested 116 improvements to the Privacy Act to deal with the complexity
00:06:45 Dr Kate Bower
of where data protection is at currently. And really, you needn't even stop there; there are that many more improvements that you could make to the situation for consumers.
00:06:55 Dr Kate Bower
So it's really, really hard for consumers to actively choose differently, whether that's online or in physical space.
00:07:03 Dr Genevieve Hayes
Is it better in other countries? Because I've heard that Europe has very strict laws when it comes to data access and data privacy.
00:07:11 Dr Kate Bower
Yeah. So certainly this is not a problem that's unique to Australia, and we have seen other jurisdictions move much earlier than Australia has. With the various, I guess, political
00:07:23 Dr Kate Bower
vagaries of the ebb and flow of politics, some things are priorities and some things are not. But certainly the European Union was one of the first to put in some very, very strong data protection laws with the General Data Protection Regulation,
00:07:37 Dr Kate Bower
the GDPR, and the California Consumer Privacy Act is another kind of groundbreaking standard for data protection and for privacy.
00:07:47 Dr Kate Bower
But we're certainly not done yet, I think. Canada has just introduced an AI and Data Act; that's an interesting coupling of AI with data,
00:07:58 Dr Kate Bower
and there are lots of reasons why you might want to couple those things together. And certainly Australia, in addition to the Privacy Act, is now starting to look at what AI regulation might look like as well. But certainly other jurisdictions are substantially more progressed, particularly when it comes to individual rights.
00:08:14 Dr Genevieve Hayes
I would assume that this is something that we will end up with in Australia at some point in time, but what's your estimate as to when we could see something like that?
00:08:25 Dr Kate Bower
Well, I mean, it's how long is a piece of string at the moment. I think there is some complexity involved. As I said, there are 116 recommendations from the Privacy Act Review
00:08:34 Dr Kate Bower
report. But we're also in a kind of unique political situation here, where we've had a change of government, and they have a substantial backlog of reforms that they took to the election that they're intending to put in place.
00:08:47 Dr Kate Bower
The short answer is they're very, very busy. They have a lot of election promises that they committed to, and if they want to get re-elected, they need to get those things done. You know, we've got a referendum coming up, and the Attorney-General in particular has a number of things in the portfolio that are likely to come before privacy reform.
00:09:06 Dr Kate Bower
We have seen quite a few indications from the Attorney-General, Mark Dreyfus, that he's very motivated to act on privacy reform. He had a go at it last time he was in office,
00:09:16 Dr Kate Bower
when he was briefly Attorney-General during the Rudd-Gillard years, and he is personally motivated to enact privacy reform.
00:09:25 Dr Kate Bower
But I do think that there is a range of complexities: both the political backlog, but also
00:09:31 Dr Kate Bower
a number of, I guess, complexities in some of the reforms themselves, and quite a number of quite powerful lobby groups opposing key aspects, which will make that political process a bit more complicated.
00:09:44 Dr Genevieve Hayes
That's interesting, because I would have thought that this would be something that everyone would be on board with. You don't have to name any groups, but what sort of people, or what sort of groups, would be opposed to privacy laws when it comes to their data?
00:10:01 Dr Kate Bower
So I think this is really interesting. Of the key reforms in the Privacy Act that CHOICE is interested in, I'd say it really boils down to two things that we're keen to see get across the line, because we think they will have a big impact.
00:10:14 Dr Kate Bower
One of them is simply the definition of personal information. Currently, in the Act, it is information
00:10:20 Dr Kate Bower
about a person,
00:10:22 Dr Kate Bower
whereas the proposal is to have information that relates to a person. That seems like a very small wording change, but what it actually means is that things like technical identifiers or device identifiers and MAC addresses, for example, would be included in personal information,
00:10:43 Dr Kate Bower
which means the Privacy Act would then apply to all of that. So if you follow that thread through, many of the identifiers that larger data analytics and ad tech businesses in particular are currently using
00:11:02 Dr Kate Bower
are how they're able to do much of what they do and make a lot of money: they're using identifiers that are outside the Privacy Act.
00:11:11 Dr Genevieve Hayes
OK, so data about me: that would be my name, address, date of birth,
00:11:17 Dr Genevieve Hayes
credit card number, etcetera. And that's distinct from data that relates to whoever was using my computer, which is probably me anyway.
00:11:27 Dr Kate Bower
That's right. I think if you work in the field of data science, as your listeners do, you know that there are lots of different ways to identify a customer that are not their name. In fact, the name is probably the least interesting thing about a customer.
00:11:40 Dr Kate Bower
There are many ways of identifying them, right down to, even when you remove cookies, browser fingerprinting, other types of device identification, and practices like identity resolution, where you're able to bring different data points together to identify that it's the same customer, and then do things to them, whether that's show them
00:12:01 Dr Kate Bower
certain ads, or push them through certain customer
00:12:04 Dr Kate Bower
journeys. So all of those things could then be included in the Privacy Act if we change this definition.
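A minimal sketch of the browser fingerprinting point above: clear your cookies, and a handful of ordinary browser attributes, hashed together, can still act as a stable "name" for you. All attribute values below are invented for illustration.

```python
import hashlib
import json

def fingerprint(attrs: dict) -> str:
    """Hash browser/device attributes into a stable identifier."""
    canonical = json.dumps(attrs, sort_keys=True)  # order-independent form
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "2560x1440",
    "timezone": "Australia/Sydney",
    "language": "en-AU",
    "fonts_hash": "a41f0c",   # hash of the installed-font list
    "canvas_hash": "9b7e2d",  # hash of a rendered canvas test
}

# The same attributes tomorrow give the same ID, with no cookie involved.
print(fingerprint(visitor))
```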
00:12:12 Dr Kate Bower
So the reason why CHOICE wants that to happen is because that's in line with consumers' expectations. Consumers don't really care how you identify them, whether it's through a device identifier or through some other kind of unique
00:12:25 Dr Kate Bower
identifier that you've attached
00:12:27 Dr Kate Bower
to their browsing behaviour. What they care about is the effect on them: control over the ads
00:12:33 Dr Kate Bower
they see, control over the prices they see, control over the experience that they have online and in the physical world.
00:12:41 Dr Kate Bower
So that's what makes sense for consumers; they really don't care whether it's their name or some other identifier. But businesses
00:12:47 Dr Kate Bower
whose business models rely on many of these things, on not having to worry about that compliance
00:12:52 Dr Kate Bower
aspect of the Privacy Act,
00:12:54 Dr Kate Bower
are opposing many of these changes, particularly the ones that either relate to that definition change or to some of the recommendations around targeted advertising and audience segmentation.
00:13:08 Dr Genevieve Hayes
What I'm thinking of when you're saying this is, I don't know, say you wanted to buy a new toaster.
00:13:14 Dr Genevieve Hayes
If you start Googling toasters, then for the next month you get advertisements for toasters popping up everywhere.
00:13:21 Dr Genevieve Hayes
Would that be the sort of thing that would be covered by this legislation?
00:13:24 Dr Kate Bower
Yeah, exactly right. It's those things where we know that businesses are currently using identifiers that are not your name, but the effect for you as a consumer is identical, in that you still see ads for toasters popping up.
00:13:36 Dr Kate Bower
So it's really about having a Privacy Act that's fit for purpose for the modern digital world that we live in, one that recognises that there's more than one way to skin a cat,
00:13:44 Dr Kate Bower
so to speak; that there are many different ways that businesses can identify you. And what's important is returning the balance to consumers so that they can actively choose, so they retain some of that consumer choice that we've traditionally had in the physical marketplace: being able to compare products fairly, knowing what's out there, having price
00:14:04 Dr Kate Bower
transparency. And I think it goes back to the point,
00:14:07 Dr Kate Bower
too, that companies need to have good products and good services, and that should be the basis on which people are choosing your product or your service, not how well you were able to use dark patterns to manipulate them into a customer journey that they didn't know they were getting into.
00:14:22 Dr Kate Bower
For example, it can really be a good thing for businesses who want to do the right thing,
00:14:27 Dr Kate Bower
but it certainly will present some compliance challenges for large ad tech and data analytics businesses.
00:14:35 Dr Genevieve Hayes
Yeah, I can imagine that. The other thing that came to mind while we were talking about this is
00:14:41 Dr Genevieve Hayes
the data that many of these organisations seem to be harvesting. It's not just, as you said before, the data from your fridge or from your interactions with the Internet.
00:14:50 Dr Genevieve Hayes
I've seen signs in front of Kmart and Officeworks and places like that warning you that they're using facial recognition software to track your actual journey into the physical store, or using your phone to work out where you are with GPS or something.
00:15:07 Dr Kate Bower
Yeah, that's right. We've done a number of investigations on this. One was the investigation that identified Bunnings, Kmart and The Good Guys using facial recognition technology, about which we made a complaint to the Office of the Australian Information Commissioner,
00:15:18 Dr Kate Bower
and that is still under investigation by them; we're expecting a determination fairly soon. But we also recently did some work on the other
00:15:27 Dr Kate Bower
types of technologies being used in retail spaces, things like Bluetooth beacons and Wi-Fi tracking. Often when you speak to these retailers, they say: oh yeah, but it's anonymised, it's just using
00:15:38 Dr Kate Bower
a device identifier, we don't know your name. But in actual fact, if you have signed up to a loyalty programme with that business, or you have the app already on your phone, it is very easy for them to match the device identifier on your loyalty programme with the one getting pinged by the Bluetooth beacon. So in reality,
00:15:59 Dr Kate Bower
it's very easy to match up these various different data sets, and in fact there are whole businesses specialising in matching up these data sets for you
00:16:06 Dr Kate Bower
if you're not able to do it yourself. So the practice of data enrichment is something that is very opaque for consumers, and it's not something that they're aware of.
00:16:15 Dr Kate Bower
And more and more of these technologies are operating covertly. For example, you're unlikely to know if a Bluetooth beacon is operating unless you maybe have the app for that shopping centre on your phone.
00:16:26 Dr Kate Bower
You might get a "Would you like to open the app? We've noticed you're nearby", but otherwise it could just be sending out pings to your phone and sending data back to that beacon,
00:16:35 Dr Kate Bower
completely unbeknownst to you. And it's the same with facial recognition: unless you happen to look at that sign on the way
00:16:40 Dr Kate Bower
in, or you're one of the probably minuscule number of people who happen to read a privacy policy before going into a store,
00:16:47 Dr Kate Bower
and I think nobody's doing that, then you're not going to know about it. So actually, there is an increasing number of covert ways that businesses are collecting data about people, and again, it's just eroding people's choice and their right to
00:17:00 Dr Kate Bower
engage meaningfully in how much data is collected and what effect it might have on them.
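A toy illustration, in pandas, of the matching Kate describes: beacon pings that are "anonymised" down to a device identifier become personal data the moment they are joined to a loyalty-programme table holding the same identifier. Every record here is invented.

```python
import pandas as pd

# "Anonymised" movement data: no names, just a device identifier.
beacon_pings = pd.DataFrame({
    "device_id": ["d-42", "d-42", "d-77"],
    "beacon":    ["entrance", "aisle-3", "entrance"],
    "ts":        pd.to_datetime(["2023-05-01 10:02",
                                 "2023-05-01 10:09",
                                 "2023-05-01 10:15"]),
})

# Loyalty-programme table: the same identifier, with a name attached.
loyalty = pd.DataFrame({
    "device_id": ["d-42"],
    "member":    ["Jane Citizen"],
    "email":     ["jane@example.com"],
})

# One merge and the anonymous movement trail has a name on it.
print(beacon_pings.merge(loyalty, on="device_id", how="left"))
```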
00:17:05 Dr Genevieve Hayes
Once these businesses have collected that data, obviously they'd be using it for marketing
00:17:11 Dr Genevieve Hayes
purposes. But are there any other things that they're using that data for?
00:17:15 Dr Kate Bower
So again, this is another area which is very opaque to consumers.
00:17:20 Dr Kate Bower
We did an investigation into Tinder, who we found were using personalised pricing. Now, it's not illegal to offer different prices to different people,
00:17:31 Dr Kate Bower
but it appeared to us that there was some pricing based on people's protected characteristics: their age, their gender and where they were from. For example, when we did the research,
00:17:45 Dr Kate Bower
a queer single woman living in an urban area paid the least, and an older heterosexual man who lived in the country paid the most, and the difference was up to seven times.
00:17:57 Dr Kate Bower
This is for their premium service; the basic version of Tinder is free for all users, but this is for their premium subscription
00:18:02 Dr Kate Bower
service. So that's a really significant
00:18:06 Dr Kate Bower
price difference, but completely unbeknownst to the users: the people who were seeing those different prices had no idea that they were seeing a different price from
00:18:14 Dr Kate Bower
other people. And those are the sorts of issues that we're concerned about. First, that it could be based on discriminatory characteristics: it might not be illegal to show people different prices, but it is to price things based on people's gender or sexuality.
00:18:28 Dr Kate Bower
And second, the fact that it was completely opaque to people; they had no idea that this was happening. So I'm pleased to say that,
00:18:35 Dr Kate Bower
following that, we partnered with Consumers International, and they replicated the study in seven countries and found the same thing. And Tinder committed to no longer using, well, what they said was age-based
00:18:46 Dr Kate Bower
discounts, using age as a factor. They claim that they weren't using sexuality, and that it must have been some other indicator that had that effect.
00:18:54 Dr Kate Bower
We just have to believe them. But they haven't committed to stepping back from personalised pricing, or to showing people how and on what basis they're seeing that price. So, those kinds of practices... yes, Tinder
00:19:06 Dr Kate Bower
is kind of a niche service, in a way; not everyone's necessarily going to be impacted.
00:19:11 Dr Kate Bower
But when companies are able to do those things, and it's hard for people to know that it's happening, what will it mean for our basic groceries, or the basic services that we need in our everyday lives, to start having personalised pricing? That's why we need some stronger consumer protections and data protections before we start to see those sorts of harms.
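For readers curious about the shape of such a pricing audit: record the price quoted to each test persona, then compare. The figures below are invented and the personas simplified; CHOICE's actual methodology was more involved, but the study did find differences of up to seven times.

```python
import pandas as pd

# Hypothetical quotes collected by test accounts with different profiles.
quotes = pd.DataFrame({
    "persona": ["urban queer woman, 25", "urban man, 35",
                "regional woman, 50", "regional heterosexual man, 62"],
    "monthly_price_aud": [6.99, 14.49, 23.99, 48.99],
})

ratio = quotes["monthly_price_aud"].max() / quotes["monthly_price_aud"].min()
print(quotes.sort_values("monthly_price_aud"))
print(f"highest/lowest price ratio: {ratio:.1f}x")  # ~7x in this toy data
```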
00:19:33 Dr Genevieve Hayes
That's fascinating about the personalised pricing. I have a background in the insurance industry.
00:19:38 Dr Genevieve Hayes
Obviously you have rating factors with insurance, but you also have non-rating-factor-based price optimisation. But it never occurred to me that you could apply that sort of thing to something like Tinder.
00:19:52 Dr Kate Bower
Yeah, so this is kind of what we said. That's pretty standard in insurance; it's kind of the principle of insurance, in a way, in that it is risk-
00:19:58 Dr Kate Bower
based pricing. So I think we accept that in insurance, but insurance is also quite heavily regulated, so there are very strict rules around this. Whereas when we see it happen in other sorts of spaces, for example dating apps, and who knows where else, we're just relying on tip-offs, really, to find out that kind of information.
00:20:17 Dr Kate Bower
It's concerning, really, that there isn't strong regulation of, say, dating apps, or any other type of app with a subscription model.
00:20:25 Dr Kate Bower
And I think many of us are now signed up to multiple subscriptions; we could be paying very different prices from our neighbour and we wouldn't even know.
00:20:34 Dr Genevieve Hayes
And there are certain things that you can accept, like
00:20:37 Dr Genevieve Hayes
most software companies having individual pricing versus business pricing. Fair enough, I
00:20:43 Dr Genevieve Hayes
accept that. But the idea that one person gets charged one thing by Tinder versus another, that's crazy.
00:20:51 Dr Kate Bower
Yeah, I mean, it's pretty concerning when you see it like that. And, as I said, dating apps are one area,
00:20:58 Dr Kate Bower
but I think when you start to think: if you're paying a different price for a loaf of bread than someone else, based on something like where you live, or your name, or your gender, or whether you have two cats or three cats... that kind of
00:21:12 Dr Kate Bower
idea of testing the elasticity of what people are willing to pay is a real possibility as we move into this more digitised pricing space. Businesses certainly have the amount of data required to do that kind of price experimentation, and at the moment we're really just relying on the fact
00:21:32 Dr Kate Bower
that they're acting in good faith and not doing that, and on the fact that if people found out about it, it would be incredibly bad for the
00:21:37 Dr Kate Bower
business. But there could be companies out there who think maybe it's worth the risk.
00:21:42 Dr Genevieve Hayes
Well, location-based price differentiation has been happening for decades.
00:21:47 Dr Genevieve Hayes
You know, you have Woolworths in one city charging one price versus Woolworths in another city, or even just different suburbs in the same city.
00:21:52 Dr Kate Bower
That's right.
00:21:55 Dr Kate Bower
Yeah, and a lot of this is to do with, again, this idea of consumer choice and consumer power. I think there's a reason why we named our organisation CHOICE: at the centre of consumers being engaged fairly in the market is being able to exercise that choice.
00:22:10 Dr Kate Bower
And when those things are not transparent, that's where you have a problem. There can be good reasons: obviously it costs more to get fresh food on a truck and send it out to, say, Bourke in NSW than it does in Wollongong, where it's coming right off the ship. So obviously there are legitimate
00:22:30 Dr Kate Bower
factors involved in pricing food and groceries. People can accept those, provided businesses are transparent about them. So it's more: do I know that I'm being served a different price, and if it is a different price, can you explain to me on what basis that is happening? So if I'm seeing a personalised price,
00:22:49 Dr Kate Bower
I'd like to know: how did you get to this price, and what is it about me that means I'm seeing it?
00:22:55 Dr Genevieve Hayes
And now that we've got all these AI tools coming out, that's adding an additional layer of complexity to this whole discussion. Recently we've had ChatGPT, Stable Diffusion and GitHub Copilot
00:23:10 Dr Genevieve Hayes
coming out,
00:23:11 Dr Genevieve Hayes
all of which, I believe, are built on data that was harvested from the Internet. From what I've read, ChatGPT was trained using text data scraped from millions of public websites.
00:23:22 Dr Genevieve Hayes
Stable Diffusion, which is an AI image generator, was trained using images harvested, presumably in a similar way, from the Internet.
00:23:32 Dr Genevieve Hayes
And GitHub Copilot was trained using computer code harvested from public GitHub repositories. Is there any way that you can protect the copyright on your data if you put it on a public website?
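As an aside on the mechanics: about the only machine-readable signal a public website gives scrapers is robots.txt, and it is advisory, not a copyright licence. A minimal courtesy check, using only the Python standard library (the URL is a placeholder):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the site's crawling rules

url = "https://example.com/gallery/image-123.jpg"
if rp.can_fetch("MyResearchBot/1.0", url):
    print("robots.txt permits fetching", url)
else:
    print("robots.txt disallows", url, "- skipping")
```

Respecting it is good practice, but as the discussion that follows makes clear, it settles none of the copyright questions.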
00:23:47 Dr Kate Bower
Yeah, I think it's a great question. I think generative AI is making us grapple with some of these big questions that a lot of people working in this space have probably been thinking about for a long time.
00:23:59 Dr Kate Bower
When we were talking before about what personal information is protected under the Privacy Act, we talked a little bit about what I call consumer data,
00:24:09 Dr Kate Bower
so our behavioural data, basically, what the loyalty programme card is collecting. But then there's also what's available publicly. So we've seen, for example,
00:24:19 Dr Kate Bower
the Clearview AI case. The OAIC here investigated and said that the collection of people's faces from Face-
00:24:28 Dr Kate Bower
book in order to build a facial recognition database was an illegal collection. So that was a very clear determination that, actually, no, this is people's sensitive information.
00:24:40 Dr Kate Bower
People put their photos on Facebook for the purposes of being a profile photo, or to share their kid's birthday party, or whatever it was.
00:24:48 Dr Kate Bower
To scrape it and then use it for a different purpose is a breach of the Privacy
00:24:52 Dr Kate Bower
Act. When it comes to generative AI, though, we're talking not about people's faces or even people's personal details, even though I think we've all seen that there is personal information in these large language
00:25:04 Dr Kate Bower
models, and in fact secure information and all sorts of things that probably shouldn't be in there. But even if we put that issue to one side for a moment, we haven't thought through very well what the consequences are, I guess, of having all of this information. The Internet was built with a kind of idealistic worldview
00:25:25 Dr Kate Bower
of the free and open Internet. We used to talk about the information superhighway; it was going to open up the world, and look, it has, no doubt. I think Wikipedia is just one of the best things out there.
00:25:38 Dr Kate Bower
I also teach ethics to a group of primary school students, as a volunteer job at my daughter's school, and they still get told, "Don't use Wikipedia."
00:25:47 Dr Kate Bower
And I'm like, no, no, trust me: go and put a mistake on Wikipedia and you'll see how quickly it gets corrected.
00:25:53 Dr Kate Bower
Go in and edit a page and put something false on there. I get the students to do it, and they'll come back and they're like: it took 20 minutes
00:25:59 Dr Kate Bower
for someone to correct that. The power of the free and open Internet, I think, is amazing.
00:26:05 Dr Kate Bower
But what we have seen since really the advent of the larger digital platforms is an accumulation of market power.
00:26:14 Dr Kate Bower
That has changed the fundamental nature of the Internet and also what we think of as people's intellectual property and their data.
00:26:23 Dr Kate Bower
The reality is that these large language models can only be created by the largest of digital platforms, the largest companies. They're the only ones who have compute power
00:26:34 Dr Kate Bower
big enough. You'd need to own that much cloud storage in order to be able to run and field them, and even handling that volume of data is a very large technical challenge.
00:26:45 Dr Kate Bower
So I think we need to think through what it means for maybe fewer than 10 companies in the world
00:26:52 Dr Kate Bower
to have this power to scrape up every single thing on the Internet and then spit it back out to us as a commercial product.
00:27:01 Dr Genevieve Hayes
And they have the power to influence public debate, because if they choose to side with one side of a debate versus another, they can literally cancel
00:27:12 Dr Genevieve Hayes
the other side that they don't agree with.
00:27:14 Dr Kate Bower
Exactly. So I think we're only just coming to terms with that kind of power, and I think it's important to think about these things in the context of that large accumulation of tech power within a small number of companies, but also a huge amount of accumulated wealth. We're talking about 20 or so individuals, who are all from Silicon Valley,
00:27:36 Dr Kate Bower
all men, who all got similar educations, and one of whom is wealthier than an entire nation state.
00:27:45 Dr Kate Bower
You know, so yes, the influence is huge. So I think all of this needs to be part of the conversation.
00:27:51 Dr Kate Bower
I think any copyright lawyer will tell you that a large language model is breaching copyright.
00:27:56 Dr Genevieve Hayes
Yes, and there are a couple of class actions happening at the moment in America about that.
00:28:00 Dr Kate Bower
So this is actually more about how prepared our legal system is and how prepared our regulatory frameworks are to deal with this.
00:28:07 Dr Kate Bower
And that's why I think you can't talk about it without talking about the large accumulation of power and influence that these tech companies have, because they are able to act like nation states. They do have that level of power, they do have that
00:28:21 Dr Kate Bower
level of wealth, and they do have that level of influence.
00:28:24 Dr Kate Bower
If Amazon wanted to shut down Amazon Web Services tomorrow, they'd kill 40% of the Internet. They could just flick the switch tomorrow and say: OK, you don't give us the regulation we want,
00:28:35 Dr Kate Bower
we'll literally flick the switch on 40% of the world's cloud storage. I mean, that would collapse stock markets.
00:28:44 Dr Kate Bower
It would disrupt electricity networks. It would cripple the
00:28:47 Dr Kate Bower
web. You know, I mean, I don't think they're going to do that; that would probably be a bit of a dumb move. But the fact that they can,
00:28:53 Dr Kate Bower
and governments are
00:28:54 Dr Kate Bower
actually powerless to stop them, is the real problem here.
00:28:57 Dr Kate Bower
So with generative AI, there's been a big hype cycle about it, but what I'm kind of glad about is that it is bringing these issues out into the open. It's like: OK, so now
00:29:07 Dr Kate Bower
these companies have everything that's on the Internet, and they're feeding it back to us as a product that we're supposed to buy, which is also feeding the model again every time we use it.
00:29:16 Dr Genevieve Hayes
What I'd be really interested to know is this. With these things like Stable Diffusion, the argument that the big tech companies make is that if code or information or images are publicly available through a public website, then anyone has the right to harvest that data and use it however they want. That
00:29:37 Dr Genevieve Hayes
doesn't make much sense to me, because out there somewhere, there is undoubtedly a Disney website that contains lots of pictures of Mickey Mouse, and that doesn't give me personally the right to harvest images of Mickey Mouse and put them on T-shirts to
00:29:53 Dr Genevieve Hayes
sell. I mean, Disney's lawyers would be knocking on my door
00:29:57 Dr Genevieve Hayes
before I could actually get one of those T-shirts
00:29:59 Dr Genevieve Hayes
out the front door.
00:30:02 Dr Genevieve Hayes
But I'd love to know: if one of these big companies messed with another big company, say a Disney or Marvel or DC, one of those, what would happen?
00:30:15 Dr Kate Bower
Yeah, I mean, I think it will be really interesting, and I think a lot of this will be played out at that big corporate scale, because they're the ones who are going to have the resources to take those cases. I think that argument,
00:30:28 Dr Kate Bower
to be frank, is BS. We all know it's BS. We all went to school: you couldn't get a book from
00:30:33 Dr Kate Bower
the library, just
00:30:35 Dr Kate Bower
cut and paste it however you wanted, put it into your school essay, and have that be OK. We all understand plagiarism.
00:30:41 Dr Kate Bower
We all understand copyright and intellectual property. This idea that just because it's publicly available, you can own it:
00:30:50 Dr Kate Bower
it's just not true. But the narrative of innovation, you know, that you couldn't put any rules around this because, ohh, it's innovative,
00:31:00 Dr Kate Bower
that is really what's allowing them to get away with it. And the other thing
00:31:05 Dr Kate Bower
is the fact
00:31:06 Dr Kate Bower
that OpenAI, for example, haven't released any information about their model, which is the complete opposite of how they started out.
00:31:13 Dr Kate Bower
They started out as an open-source AI company. They were like: we know that the ethical way to do this
00:31:20 Dr Kate Bower
is to be open, to tell people what our data sets were, to tell people how we came up with the models, to share the code. Again, this whole idea of "we're all in this
00:31:31 Dr Kate Bower
together". And then the second there was a profit motive involved, it became: actually, we're not going to tell anybody. We don't have to tell you.
00:31:38 Dr Kate Bower
No one's making us tell you, so we don't have to tell you what's in our data set.
00:31:42 Dr Kate Bower
So it'd be quite interesting if we ever do get to see what's in the data set that built ChatGPT or GPT-4, because we don't know. We do know, for example,
00:31:51 Dr Kate Bower
The Washington Post did this really great story on Google's LaMDA and the C4 data set, which I think is quite a few years old now, but it's one that is publicly available. And the top website in the C4 data set, which is the one that Google uses, was patents.google.
00:32:07 Dr Kate Bower
com. So patents.google.com is a public repository of patents, and Google's like: we own this website, because it says patents.google.com. But actually, that's all of the intellectual property, the blood, sweat and tears of every scientist and every data scientist,
00:32:28 Dr Kate Bower
every bioethicist, every climate change scientist,
00:32:32 Dr Kate Bower
who've put all of their collective scientific knowledge together and put in for a patent in order to own that information, in order to put the flag in the ground and say: we came up with this idea, we own it. Yet it's the number one website used in the C4 data set
00:32:50 Dr Kate Bower
to build Google's
00:32:52 Dr Kate Bower
large language model. So what's going on there? Clearly, there's been a big shift in how we think about that type of data and who owns it.
00:33:01 Dr Kate Bower
And I just think that the big tech companies are just trying their luck. They're like: come and
00:33:06 Dr Kate Bower
stop me.
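The Washington Post analysis counted which domains contributed the most text to C4. A hedged sketch of reproducing a tiny slice of that, streaming the public allenai/c4 copy from Hugging Face (the dataset name and its url field are assumptions based on its public catalogue entry):

```python
from collections import Counter
from urllib.parse import urlparse

from datasets import load_dataset  # pip install datasets

# Stream rather than download: the full English split is ~365M documents.
stream = load_dataset("allenai/c4", "en", split="train", streaming=True)

domains = Counter()
for i, record in enumerate(stream):
    domains[urlparse(record["url"]).netloc] += 1
    if i >= 100_000:  # a small sample is enough for a rough ranking
        break

print(domains.most_common(10))
```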
00:33:07 Dr Genevieve Hayes
And the fact is, they've got bigger lawyers than anyone who's going to tell them to stop.
00:33:12 Dr Kate Bower
I mean, that's exactly right. Here in Australia, the only fine that's ever been issued for a breach of the Privacy Act is against Facebook,
00:33:19 Dr Kate Bower
and it was for the Cambridge Analytica scandal, which dates back to 2014, and they still haven't paid a dollar. It's still tied up in legal proceedings. So this is how hard it is,
00:33:31 Dr Kate Bower
even when you have a law, even when you find out that they've breached the law. And I think we can all agree that was probably the most egregious privacy invasion that we can think of.
00:33:40 Dr Kate Bower
That scandal was shocking at the time, but we still haven't seen any remuneration, any payback, any kind of fine paid
00:33:50 Dr Kate Bower
for that really egregiously bad behaviour. I just think regulators need to get on top of this and get quite heavy-handed.
00:34:00 Dr Kate Bower
We need to see some of these companies start paying the fines. Meta in particular has been racking up the fines in Europe; they're getting over the billion-dollar mark now. So where is the point where,
00:34:11 Dr Kate Bower
eventually, the appeals run out, the legal avenues run out, and they actually have to pay the fines? We're yet to see that happen, but it's got to happen sooner or later, right? At that
00:34:22 Dr Kate Bower
point, we're going to have to see some of these companies either start paying the fines and decide that's the cost of doing business and they can get away with this, because
00:34:31 Dr Kate Bower
they're making enough money that they can afford to pay a few hundred million dollars here or there, or they change their behaviour.
00:34:38 Dr Genevieve Hayes
I hate to say it, but I suspect that the former will happen rather than the latter.
00:34:43 Dr Kate Bower
Yeah. And this is why, you know, sometimes I have these conversations and I hate to push the big stick rather than the carrot, but the tech industry definitely prefers
00:34:54 Dr Kate Bower
a soft-law, incentive-based model of regulation. But I think it's just not working when you're talking about these large companies with a huge amount of power. And Australia has limited capacity, really; we have quite lean
00:35:13 Dr Kate Bower
regulators, I would say, in that they don't have limitless funds and they don't have limitless capacity to investigate and to take cases to court.
00:35:22 Dr Kate Bower
So we are already limited by that, but these tech companies know that, and so they're willing to try their luck. And until we can get a big enough stick to make them do the right thing, they're going to continue to
00:35:35 Dr Kate Bower
do so.
00:35:36 Dr Genevieve Hayes
And the problem we have in Australia is we have a relatively small population, so less than 30 million people.
00:35:43 Dr Genevieve Hayes
If they lost every single person in Australia, it wouldn't be a big deal. It's not like losing every single person in India or China.
00:35:50 Dr Genevieve Hayes
You know.
00:35:51 Dr Kate Bower
Yeah, that's right. And we've seen them make use of that power. We saw it with the News Media Bargaining Code, where they just switched off all the news sites in Australia on Facebook overnight.
00:36:03 Dr Kate Bower
That took down with them a whole bunch of NGOs, a whole bunch of government sites; a lot of people were affected by that. Whether that was deliberate or not,
00:36:12 Dr Kate Bower
we don't know. There is some suggestion that Facebook knew that they were also turning off things that weren't strictly news sites,
00:36:18 Dr Kate Bower
but it's a flexing of the muscle. We saw a similar thing here when The Guardian released the Uber files last year; they were able to get a whole bunch of internal communications.
00:36:27 Dr Kate Bower
They showed that senior levels of Uber's management knew that their operating model was illegal when they came to Australia, but said: we'll just go, we'll set up, we'll become so popular,
00:36:38 Dr Kate Bower
then we'll lobby the government to change the law to make us legal retrospectively. That's the kind of economic power that we're talking about. In a relatively small country like Australia, we're sitting ducks.
00:36:51 Dr Genevieve Hayes
Yeah, well, that's a very chilling thought.
00:36:53 Dr Kate Bower
We're taking a bit of a depressing turn in this conversation; I'm sorry about that.
00:36:59 Dr Genevieve Hayes
From the point of view of a data scientist, most data scientists that I've spoken to and worked with are not deliberately setting out to break the law.
00:37:09 Dr Genevieve Hayes
Most of those people, I think all of the ones that I've actually spoken to are fundamentally good people who do not set out to do the wrong thing.
00:37:19 Dr Genevieve Hayes
and who are, in fact, looking for guidance as to what the right thing to do is. If you're one of those data scientists, someone who wants to build a model
00:37:29 Dr Genevieve Hayes
without doing anything that's morally bankrupt, what should you be aware of when sourcing data for your work, so that you don't harm anyone or get yourself into any trouble?
00:37:41 Dr Kate Bower
I know that it takes more work, but really, let's look at areas where people have been using data for good purposes: let's look at academic research, or, say, medical research. Academic and medical research have ethics committees for a reason, because they got it really wrong
00:38:01 Dr Kate Bower
in the beginning. I think anyone who's been to university and done any kind of social science course is aware of the Milgram experiments and various other
00:38:08
Ohh yeah.
00:38:09 Dr Kate Bower
experiments where the ethics were way out there, right? But I think the same principles apply,
00:38:15 Dr Kate Bower
which is that you
00:38:16 Dr Kate Bower
have to tell people what you're doing and why you're doing it, have a good reason for doing what you're doing, and then ask people's permission. It's really that simple. You just go back to basic principles of: what would I
00:38:29 Dr Kate Bower
want when someone is using my information?
00:38:32 Dr Kate Bower
And we're a long way from that now, to be frank. There's first-party data, which is certainly the topic du jour, but that's by force;
00:38:43 Dr Kate Bower
that's come with the removal
00:38:44 Dr Kate Bower
of third-party data collection, because of pushback from consumers and from privacy advocates. But instead of just saying:
00:38:53 Dr Kate Bower
well, it's first-party data, I got them to tick a box when they first came to the website and now I can do whatever I want,
00:38:58 Dr Kate Bower
actually think about what an ethical relationship with the people whose data I want to use looks like, particularly if it does relate to
00:39:06 Dr Kate Bower
individuals. And again, there are public training data sets available, but one of the other hats that I wear is on the AI Standards Committee, and another is on the Diversity and Inclusion think tank with the National AI Centre. So in both of those roles, I'm also thinking about the
00:39:25 Dr Kate Bower
kind of
00:39:26 Dr Kate Bower
effect of bad data sets on the type of bias and the type of discrimination, and on what you're going to get out the other end.
00:39:35 Dr Kate Bower
So ideally, in an ideal world, and I realise this isn't an ideal world, but if it were, we would be thinking about carefully curated data sets where people have consented and are aware of what the data is being used for when they give that information, and not relying on publicly available training data sets, which we know are biased. We know that there are many issues in terms of inclusion, particularly
00:39:58 Dr Kate Bower
around race, particularly around gender.
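A minimal sketch of what "carefully curated and consented" could mean mechanically: store the purposes each person actually agreed to alongside their records, and filter on purpose before any model sees the data. The schema is invented for illustration.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ConsentedRecord:
    subject_id: str
    payload: dict
    consented_uses: set   # purposes the person explicitly agreed to
    consent_date: date

def usable_for(records, purpose: str):
    """Keep only records whose subjects consented to this purpose."""
    return [r for r in records if purpose in r.consented_uses]

corpus = [
    ConsentedRecord("p1", {"text": "..."}, {"research"}, date(2023, 3, 1)),
    ConsentedRecord("p2", {"text": "..."}, {"research", "model_training"},
                    date(2023, 4, 9)),
]

training_set = usable_for(corpus, "model_training")  # only p2 survives
print([r.subject_id for r in training_set])
```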
00:40:01 Dr Kate Bower
And every data scientist knows garbage in, garbage out. If you do care about ethics, if you do care about having an appropriately diverse and representative data set, then you really have to go back to first principles and collect that from the get-go. And it's laborious and slow, and it maybe means you're not innovating
00:40:21 Dr Kate Bower
as fast as your peers. And I think where we're at now is that kind of critical
00:40:24 Dr Kate Bower
moment of the innovation race. My view is that it's incredibly unethical for OpenAI
00:40:31 Dr Kate Bower
to release ChatGPT and leave it out there. I mean, the real difference between ChatGPT and any other large language model we've seen come out as a chatbot
00:40:40 Dr Kate Bower
is just that it's still live. People have said: this is lying, this is not truthful information, this is
00:40:51 Dr Kate Bower
unprincipled. People said that, and OpenAI didn't put it back in the box. Yet only months before, we saw Facebook and Google come out with language models, and people were like, oh, this is not OK, and those companies have got that brand reputation that they need to manage. OpenAI doesn't have any current products in the marketplace; it doesn't have to worry about that
00:41:12 Dr Kate Bower
brand reputation.
00:41:14 Dr Kate Bower
So they've let it loose, and the race is on now. I think any kind of holding back because of ethics is probably gone,
00:41:22 Dr Kate Bower
and you've seen this in that some of the large companies have sacked entire ethics teams; they've sacked trust and safety teams. So unfortunately it's
00:41:34 Dr Kate Bower
not a good place that we're moving to. But I do want to give a good message to the data scientists working out there, which is that there are ethical ways to do this.
00:41:41 Dr Kate Bower
But I also know that that's not the current reality of where we're at now, and if you're working for a business that you're not the owner of, you might not be the one who gets to make these
00:41:50 Dr Genevieve Hayes
decisions. And you can end up in a situation, which is particularly challenging if you're early in your career, where you're basically told: behave in this way, or do this thing, or else look for another job.
00:42:03 Dr Kate Bower
That's right. And there are plenty of people lining up behind you to take your job, so it is certainly challenging.
00:42:10 Dr Kate Bower
So I certainly don't want those people who are early in their careers, especially, to feel the whole weight of the world on their shoulders.
00:42:17 Dr Kate Bower
You know, we saw Sam Altman appearing before Congress saying: yeah, we need regulation, and this is dangerous.
00:42:23 Dr Kate Bower
If anyone has the power to stop this, it's him. He's the one with his hand on the switch right now; he could
00:42:30 Dr Kate Bower
turn off ChatGPT tomorrow. But he's not talking about doing that. What he's talking about is that other people should clean up his mess, but not touch his business model or his future plans;
00:42:42 Dr Kate Bower
other people should be regulated to protect against future harms.
00:42:46 Dr Genevieve Hayes
Yeah, it's like... did you ever read that article? It was in the paper a few years back,
00:42:52 Dr Genevieve Hayes
about some guy who thought that the Mona Lisa should not be on display at the Louvre,
00:42:59 Dr Genevieve Hayes
because it was attracting the wrong type of person, you know, just the hoi polloi, and making visiting the Louvre horrible for people who actually wanted to see all the other paintings. So now that he'd seen it, no one else should be allowed to see it.
00:43:15 Dr Kate Bower
No one else should see it, yeah. It's exactly that logic. It's like: well, I've got what I want, so therefore everyone else can go hang.
00:43:23 Dr Kate Bower
But, you know, it's just incredible posturing, really, from Sam Altman to be there saying that in Congress. If anyone has the power to do anything about this situation right now, today, it's those companies.
00:43:35 Dr Kate Bower
It's OpenAI, it's Meta, it's Google, it's Amazon,
00:43:38 Dr Kate Bower
and it's Microsoft, let's not let them off the hook, currently funding OpenAI. It's the companies who are developing the tech.
00:43:46 Dr Kate Bower
If they wanted to stop it, they can.
00:43:47 Dr Genevieve Hayes
There was an article in the paper, I don't know, about a month ago, about a lawyer who used ChatGPT to write his brief, and ChatGPT
00:43:58 Dr Genevieve Hayes
made up all these fictional cases. The lawyer actually asked ChatGPT, how do I know that you're telling the truth? And ChatGPT came up with a fictional explanation to justify that it was right, and the guy went, yeah, OK, I'll believe this.
00:44:15 Dr Kate Bower
Yeah. And we're seeing this so much in this hype cycle around ChatGPT. It's just a fundamental
00:44:21 Dr Kate Bower
misunderstanding about what AI can do and how it works. And I think this is one of those really interesting moments where maybe it will actually lead to a big education piece around what AI is and what it isn't.
00:44:35 Dr Kate Bower
When I try to talk to people about it, and they say, well, what do you think about Chat-
00:44:39 Dr Kate Bower
GPT, it sounds like this, it's like that, I'm like: it doesn't make meaning. It's maths. It's literally a predictive model.
00:44:47 Dr Kate Bower
It's predicting the most likely next word in a sentence, and its data set is so huge, the neural pathways are so large, the deep learning is so big, that it sounds good. It is so good at mimicking, but the words are completely meaningless. They may as well be in another language.
00:45:06 Dr Kate Bower
It is not producing meaning in any way, shape or form. What it is producing is language, and there's a big separation between language and meaning. I think anyone who's a fan of poetry and has had to read the
00:45:20 Dr Kate Bower
tripe that comes out of ChatGPT that they call poetry understands that there's a massive difference between words on a page and actually creating meaning out of language.
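Kate's "it's maths" point in miniature: a language model assigns a score to every word in its vocabulary, a softmax turns the scores into probabilities, and generation just samples the next word over and over. The toy vocabulary and scores are invented; real models do exactly this at enormous scale.

```python
import math
import random

vocab  = ["poetry", "toaster", "meaning", "data"]
logits = [2.1, 0.3, 1.4, 0.9]  # the model's scores for the next word

# Softmax: exponentiate and normalise into a probability distribution.
exps  = [math.exp(score) for score in logits]
probs = [e / sum(exps) for e in exps]

# "Generation" is nothing more than weighted sampling from that distribution.
next_word = random.choices(vocab, weights=probs, k=1)[0]
print({w: round(p, 3) for w, p in zip(vocab, probs)}, "->", next_word)
```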
00:45:32 Dr Genevieve Hayes
Yes, exactly. It has its uses, but you don't want to rely on it for your legal briefs or your poetry writing.
00:45:38 Dr Kate Bower
And the things that it's good at are writing in the style of something. Say you need to write a business letter:
00:45:46 Dr Kate Bower
business letters are standardised documents. You need to add the meaning in, but you can say: write me a business letter
00:45:52 Dr Kate Bower
about XYZ, here are the parameters. And then you go through and make sure that it makes sense. A business letter is a pretty generic type of document,
00:46:01 Dr Kate Bower
but it's not creating the meaning. You still need to tell it the words it has to make sense of.
00:46:06 Dr Kate Bower
So this idea that it's using language, or writing language, I think comes from a fundamental misunderstanding of it. It is actually still mathematics that underpins generative AI, as it does all other types of AI.
00:46:22 Dr Genevieve Hayes
Is there anything on your radar in the AI, data and analytics space that you think is going to be important in the next three to five years?
00:46:31 Dr Kate Bower
Ohh, three to five years, what a time frame! I mean, you could have asked this question three to five years ago and I don't think anyone could have predicted
00:46:40 Dr Kate Bower
where we are
00:46:41 Dr Kate Bower
right now. You know, I think we're about to see,
00:46:46 Dr Kate Bower
as I said, some interesting things happen with regulation and with legislation. And with some of those big fines that have been handed out, legal options are going to run out.
00:46:56 Dr Kate Bower
So I think we're going to see a bit of a comeuppance and a shift, and we'll see if that works or not for some of the big tech companies. So that's something that's happening, I guess, in the market and in the political space. But the other thing I think we're about to see is potentially a convergence of many of the emerging technologies in development.
00:47:20 Dr Kate Bower
So, things that we haven't perhaps spoken about today. Yet another one of my hats is on the Brain-Computer Interface Standards Committee,
00:47:28 Dr Kate Bower
and the work that's happening in what they call neurotechnology is absolutely incredible. Similarly, quantum technology is equally impressive. And there's even some work happening, I guess, in the fundamentals of how we build the Internet,
00:47:44 Dr Kate Bower
Get around moving back to kind of its original original framing and away from a platforms model to a kind of on the on the blockchain or or or data. You know centres that are that are individually owned versus owned by platforms. So I think there's like lots of large technological change happening currently in.
00:48:04 Dr Kate Bower
Silos. So what I anticipate happening in the next three to five years is that we might see the convergence of some of those things and I am yet to kind of reckon with what the consumer challenges might be of some of those things.
00:48:18 Dr Kate Bower
I think the neurotech example presents a lot of really, really interesting consumer issues. So I can use a
00:48:25 Dr Kate Bower
real-life example from recently.
00:48:27 Dr Kate Bower
There was a trial that was conducted in Australia, actually, with a medical company that was building these brain-computer interfaces and had put one in a patient as a trial.
00:48:38 Dr Kate Bower
I believe she had epilepsy and had been experiencing basically daily seizures, and once she had this implant put into her brain,
00:48:48 Dr Kate Bower
she stopped having seizures, and her quality of life went up
00:48:52 Dr Kate Bower
just dramatically. I mean, I can't even imagine what it would be like to go from having daily seizures to suddenly not.
00:48:57 Dr Kate Bower
And this was because the chip acted as a warning system. It basically identified brain patterns that happened before a seizure, and she was able to take the right preventative medication and reduce the number of seizures she
00:49:10 Dr Kate Bower
was having. Amazing, life-
00:49:12 Dr Kate Bower
changing technology.
00:49:14 Dr Kate Bower
But the company went bust and they removed the chip from her brain, so she's back to the beginning. So this is life-changing technology,
00:49:24 Dr Kate Bower
but what happens when the company goes bust? What kind of warranty is there? What kind of consumer protection do you have if you bought a consumer-product-
00:49:34 Dr Kate Bower
based, you know, brain-computer interface that you had implanted inside your body? What happens when something goes wrong, you know?
00:49:42 Dr Kate Bower
So this is gonna present a whole range of consumer challenges that we haven't even begun to grapple with.
00:49:48 Dr Kate Bower
We haven't really got on top of the digital space, let alone what happens when we move into the new tech space.
00:49:54 Dr Genevieve Hayes
Did you ever see the TV show Years and Years?
00:49:57 Dr Kate Bower
No, I didn't, but I've heard it's very
00:49:59 Dr Genevieve Hayes
good. It's awesome. It's set in the very near future, and they've actually got a subplot about one of the characters who gets one of those neurotech implants put into them, except they get a black market one, and then it malfunctions, and
00:50:17 Dr Genevieve Hayes
that's exactly the sort of thing that you're
00:50:19 Dr Genevieve Hayes
talking about, yeah.
00:50:20 Dr Kate Bower
So I
00:50:21 Dr Kate Bower
suspect, and certainly if Elon Musk has anything to do with it, that we'll see consumer neurotech in the not-too-distant future, within that three-to-five-year time frame, and that's, you know, a whole new world of consumer problems.
00:50:32 Dr Kate Bower
We might not just need a consumer data advocate. We might also need a neurotech
00:50:35 Dr Kate Bower
advocate as well.
00:50:36 Dr Genevieve Hayes
Here at Choice. And the fact is, the first ones will look remarkable
00:50:41 Dr Genevieve Hayes
when we see them, but 10 years from now, we will realise they're complete and utter rubbish.
00:50:45 Dr Kate Bower
Indeed, indeed, as it is with anything. I mean, let's look back at the websites we were looking
00:50:49 Dr Kate Bower
at in 1996.
00:50:50 Dr Genevieve Hayes
Right. Oh, dear. Let's not. And what final advice would you give to data scientists looking to create business value from data?
00:50:59 Dr Kate Bower
Because I used to work as a data analyst, and I've met lots of lovely data analysts and data scientists, my advice is to try to stick to your guns.
00:51:07 Dr Kate Bower
I know it
00:51:08 Dr Kate Bower
is tough out there, but I do believe that most people working in the space are good, ethical people and want to do the right thing. So don't be afraid of speaking up. I know it's harder when you're earlier in your
00:51:19 Dr Kate Bower
career, but if you have any amount of power and influence, try and use it.
00:51:24 Dr Kate Bower
You know, try and use it to speak up. I mean, a lot of the time there is a profit motive driving some of these decisions, but also sometimes the people at the top just don't understand the consequences of their decisions. So I think data scientists, because they understand the storytelling
00:51:44 Dr Kate Bower
elements of data, and they understand the consequences of what it can and can't do,
00:51:49 Dr Kate Bower
are in a really unique position to be able to speak to senior leaders and explain to them what's at stake if they get it wrong,
00:51:57 Dr Kate Bower
what the risks are, and to suggest a better path forward. So my advice is, if you're in that kind of position and you feel like you can speak up, chat to the decision makers in the business, chat to
00:52:10 Dr Kate Bower
the senior leaders, and try and guide them onto the right path. That doesn't mean you can't make money.
00:52:15 Dr Kate Bower
You can, but there are good ways to go about it.
00:52:18 Dr Genevieve Hayes
There are. There are other ways. And the fact is, consumers want to buy ethical, environmentally friendly,
00:52:24 Dr Genevieve Hayes
good products when it comes to regular physical products, so I can imagine that there will be a market for ethical digital products at some point in the future.
00:52:36 Dr Kate Bower
Absolutely. You know, I think this is what I was kind of getting at at the beginning: let's get back to the basics of making a good product and a good service that people want to buy. Stop trying to manipulate people into buying your product.
00:52:48 Dr Kate Bower
Just make a good one. Focus your efforts less on your marketing and ad spend and more on actually building a product that people
00:52:57 Dr Kate Bower
want to buy.
00:52:58 Dr Kate Bower
You know, and I think the more that we can think about that, the sooner we'll get back to that mission of Choice's: the fair, safe and just
00:53:05 Dr Kate Bower
market, where consumers can exercise their choice.
00:53:07 Dr Genevieve Hayes
Hmm. Yeah, it's a good note to end
00:53:10 Dr Kate Bower
on. Yeah, thank you for having me. It was a great convo.
00:53:13 Dr Genevieve Hayes
And for any listeners who want to learn more about you or choice or get in contact with you, what can they do?
00:53:20 Dr Kate Bower
So they can come to the Choice website, which is choice.com.au, and then add /consumers-and-data. It's not the best URL in the world,
00:53:31 Dr Kate Bower
but even if you just put "consumers and data" into the search on the Choice website, you'll find all of our stories and investigations, and you'll also find some of the petitions that we've got open at the moment.
00:53:41 Dr Kate Bower
So we've got one, which is about privacy
00:53:44 Dr Kate Bower
reform. One of the things I mentioned that we're pushing for is changing the definition of personal information from "about" to "relates to", and how important that is to protect consumers. You can sign a petition for that. We've also got one about regulating facial recognition and how we really need to set some kind of clear guard rails around that. And if they want to find out more about me individually, you can find me on LinkedIn
00:54:04 Dr Kate Bower
and Twitter.
00:54:05 Dr Genevieve Hayes
And I'll put some links to the things you've just mentioned in the show
00:54:08 Dr Genevieve Hayes
notes. Great. Thank you.
00:54:10 Dr Genevieve Hayes
OK. Anyway, thank you very much for joining me today. This has really been very.
00:54:15 Dr Kate Bower
Yes, I hope I didn't veer too far off. We covered lots of ground there: Sam Altman appearing before Congress, personalised pricing, dating apps.
00:54:24 Dr Kate Bower
So yeah, it was a good conversation. Thank you for having
00:54:27 Dr Genevieve Hayes
me. And for those in the audience, thank you for listening. I'm Dr Genevieve Hayes, and this has been Value Driven Data Science, brought to you by Genevieve
00:54:36 Dr Genevieve Hayes
Hayes Consulting.
