Building and Maintaining Public Trust in Data Sharing

Automatic Summary

Building and Maintaining Trust in Data Sharing - Insights from the Women Tech Conference

Welcome to our discussion of the central theme of a recent talk presented by Alison Paprica at the Women Tech Conference: "Building and Maintaining Trust in Data Sharing". As Alison, an expert affiliated with the University of Toronto, ICES, and Health Data Research Network Canada, highlighted, trusted relationships with data subjects can pave the way for improved outcomes and reduced risks. Let's delve into her insights.

Understanding Social License in Data Sharing

According to Paprica, a key aspect of creating trust in data sharing is obtaining social license. Unlike a driver's license, a social license is an informal, unofficial agreement granted by communities to authorize certain activities; it is not written down or legally binding. It may also be silently retracted when communities disagree with the activities being carried out, which can result in substantial human and economic costs and enduring damage to relationships.

Two Steps to Building and Maintaining Trust in Data Sharing

Alison Paprica outlines two essential steps for promoting and sustaining trust in data sharing:

  1. Engage with and listen to the public: Actively engaging with the public helps in understanding the tacit terms of the social license for data sharing.
  2. Act on what you hear: Data-sharing practices should be responsive to public feedback and should adapt as needed to retain that critical social license.

Public Perceptions and Concerns about Health Data Sharing

During her talk, Paprica presented findings from a series of focus groups conducted to gauge public opinion on the use of health data for research. Participants were first given background information on the practicalities of health data sharing and were then presented with fictional but realistic scenarios of data use.

The researchers learned that while there is broad public support for using data to improve health and healthcare, context matters significantly. Concerns about privacy breaches, surveillance, profit motives, and lack of transparency fuel public skepticism.

Introducing Essential Requirements for Data Trusts

To address these concerns and bolster trust, Paprica and her team have developed a set of "essential requirements for data trusts". Published in an open access journal and accompanied by plain-language summaries, these requirements provide a clear roadmap for organizations that want to build and maintain trust in data sharing.
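To make the structure concrete, here is a minimal sketch in Python of the 12 requirements, organized into the five categories described later in the talk (legal, governance, management, data users, and engagement). The requirement texts are paraphrased from the talk; the Requirement class and the unmet helper are illustrative names, not from the published paper, so treat this as a starting point for self-assessment rather than a definitive encoding.

```python
# Illustrative self-assessment checklist; requirement texts are
# paraphrased from the talk, and all names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Requirement:
    category: str  # one of the five min-spec categories
    text: str      # paraphrased min spec
    met: bool = False

CHECKLIST = [
    Requirement("Legal", "Legal authority to hold and use the data"),
    Requirement("Governance", "A stated purpose"),
    Requirement("Governance", "Transparency about activities"),
    Requirement("Governance", "An accountable governing body"),
    Requirement("Governance", "Adaptive governance as risks and opportunities change"),
    Requirement("Management", "Well-defined policies and processes for all data activities"),
    Requirement("Management", "Data protection safeguards"),
    Requirement("Management", "An ongoing process to identify, assess, and manage risks"),
    Requirement("Data users", "Training that describes allowed and prohibited activities"),
    Requirement("Data users", "Agreement that usage is monitored, with consequences"),
    Requirement("Engagement", "Early and ongoing engagement including the public"),
    Requirement("Engagement", "Tailored engagement for highly affected groups"),
]

def unmet(checklist: list[Requirement]) -> list[Requirement]:
    """Return the requirements an organization has not yet satisfied."""
    return [r for r in checklist if not r.met]

if __name__ == "__main__":
    for r in unmet(CHECKLIST):
        print(f"[{r.category}] {r.text}")
```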

Engage, Act, and Progress in Data-Sharing

In conclusion, Alison Paprica's talk paints a clear picture for organizations that aim to succeed in the realm of data sharing. The journey begins with engaging the public, listening to their concerns, and acting on their feedback. By aligning data-sharing initiatives with community expectations, as sketched out in the essential requirements for data trusts, organizations can progress steadily while maintaining the crucial social license for their operations.

Get in touch if you're ready to join this endeavor. Your collaboration could be pivotal in advancing the plain-language deliberations or the data trust work. Say goodbye to the days of data sharing in the shadows, and hello to an era of transparency, engagement, and trust.


Video Transcription

So thanks for joining this talk at the Women Tech Conference. I'm Alison Paprica. I'm with the University of Toronto, ICES, and Health Data Research Network Canada, and I'm going to be talking today about building and maintaining trust related to data sharing. A key message I want to convey is this: no matter what you are doing with data right now, if the data subjects, the people providing the data, had increased trust in you and your organization, it's probably true that you could do more. Conversely, if you're doing things and you're not sure that the data subjects find you to be trustworthy, you're in a bit of a risky situation, because you may not have a stable foundation for a lot of your technology.

So what I'm going to talk about today is the fact that I think there are two main steps to building and maintaining trust related to data sharing. First, you really need to engage with and listen to members of the public. And second, you need to act on what you hear. I'll talk about both of those things, beginning with the importance of social license. Now, despite its name sounding like a driver's license or a fishing license, social license is different.

It's not something that is written on paper as a binding legal agreement. We talk about social license as an informal and unofficial agreement that is granted by communities or groups to some other organization or group of people to do something in a particular area.

And the interesting thing about social license is that when you're operating with it, you may not even know that you have one until it gets removed; in other words, until you hear that people really don't agree with the things you're doing. At that point, it can become extremely costly in terms of human and economic costs, and it can sometimes change relationships in ways that are irreparable, permanent, and damaging. I first heard the term in connection with a big multimillion-dollar program in England with the NHS, where a whole initiative around sharing health data ultimately had to be pulled back at the cost of many millions of dollars. It was in this publication that the author said it was really a poorly informed understanding of social license, members of the public who didn't agree with their health data being used in this way, and a failure to recognize that legal authority wasn't enough, that were ultimately behind the downfall of this major investment by the British government. And so when I first heard about social license, I thought, oh, it's like this: there are all sorts of things that are legally allowed.

And a subset of those, this inner pink circle, are the activities that are within social license. But as I learned more, I found it's actually more like this: there are things that members of the public think are happening with their data, or want to be happening with their data, that may not actually be legally allowed, yet people expect them to be happening. And beyond that, there's the idea that social license varies by jurisdiction.

So, for example, in Denmark it's not uncommon for a research team to contact a patient directly and say something like, "Would you like to be part of this diabetes study out of our university?" But in Scotland, that would never happen; people don't want to be contacted unless it's by someone who they reasonably think would have access to their health data, for example, someone who works in their doctor's office or a hospital. The whole point is that we can't predict what social license is going to be. It varies by jurisdiction and it can vary over time, so we need to do public engagement to really understand the boundaries of social license and make sure that we're operating within them.

So here are some findings from research that we've done on how the public feels about using data for health research in particular, though I do think we can generalize these findings beyond health research and health data. I'm going to talk about a number of focus groups, 14 in total, with over 100 participants collectively, from studies that took place in 2015, 2017, and 2019. We had a professional moderator for each of them. Each began with about 20 to 25 minutes of background information, and then we presented people with fictional but realistic scenarios in which health data are used in research and development. We intentionally included aspects that were likely to be controversial.

So, for example, we said we're going to share genetic information, and by the way, it can never be fully anonymized; or we're going to have companies involved, and they may make money based on your data. And we just saw how people reacted: things they liked, things they didn't like, and the suggestions they had to make the scenarios more acceptable. Finally, at the end of each session, we had some Q&A where people could ask questions. We used a qualitative research method called qualitative description analysis, which we describe as "data-near": we really are reporting on what people said, as opposed to a deeper interpretation that finds meaning behind the words. We report the words and stay close to what we heard.

So, one of the first things we found relates to ICES, a large organization which holds data on over 18 million people.

When we went to members of the public and said, "We hold all this data about you," the reaction wasn't an enthusiastic "Yay, that's great to hear." It was more of a "Hmm, what do you say?" People didn't know about it, and this is despite the fact that publications coming out of these data are covered in national and international media all the time; you can't go a week without something in a Canadian newspaper, for example. But even so, people said things like, "You're doing this already, like right now? I can't believe it." Some people thought that maybe we were trying to do this behind closed doors and wouldn't even have a sign on our building. It turns out we don't have a sign on our building, but that's because we're part of a research hospital site, not because we're trying to be shady. People even said things like, "Why are you coming out of the shadows now?" So there was this real concern, even though we felt we were being very transparent and open about everything we were doing, that maybe it was happening behind closed doors and we weren't very trustworthy. But it wasn't all negative. There were a lot of supportive comments; people said they thought it was fantastic that data were used to improve health and health care.

People actually liked collaboration with the private sector in some cases; they thought it brought new skills to the table, that new things could be learned, and that there could be new public benefits. People said things like they didn't have a problem with data being used; that's what data are for.

They should be used, and people saw huge value in them, right up to the potential to save lives. But it wasn't all positive, as you won't be surprised to hear. One of the things we learned is that context matters a lot. People hear about breaches of trust and breaches of privacy in other sectors, and the overall message was along the lines of: "Yeah, we get that you say you care about privacy and security, but we think others do too, and we've heard bad stories from other sectors,

so we're not sure we trust all the measures you have in place." They worried that they were actually being surveilled, right now. They would talk about experiences where they might be discussing something with a friend and something would pop up on their cell phone, and they interpreted this as AI listening to them on an ongoing basis, tracking what they say, and offering them ads. They worried about cases where there was ultimately a profit motive behind data sharing, and they said they were skeptical that people would actually benefit from that. They worried in particular where insurance companies were involved, seeing it as a way of monitoring what the public did and intruding on their lives. And they also worried about transparency: who's using what data, for what purpose? What exactly is going on with my data?

And they didn't like the fact that, in some cases, they felt they had been tricked into providing access to their data through long terms and conditions that they didn't have the time to fully read and process. So some of their concerns are here. I mentioned at the start that if there could be a guarantee about privacy and security, they'd be more supportive. They didn't want data use to branch out indefinitely; they wanted some kind of governance and control. They worried about profit. They worried that any efforts to protect their identities might not be complete or sufficient, especially when you bring some AI (artificial intelligence) capabilities into play, and they worried about malicious attackers too. So we learned a lot: where there's support, where there are concerns, and also some ideas about what people would like to see in order for data sharing to be trustworthy, things that they would actually support, activities that would be within social license.

So I'm going to talk here about a few initiatives we've started, really aimed at being more trustworthy in how we approach data sharing. The first I call essential requirements for data trusts, and if you want to read more about it, we've published it in an open access journal; there's no paywall, and you don't need to pay anything to see it. You can see that this is very much a team effort. In the first phase of the study, the contributors were all Canadians, from 15 different organizations, but we're now starting phase two, and I'm pleased to let you know that we have international contributors from the States, the Philippines, Ireland, and other places.

There's also a plain-language version of the findings of this study, where we talk about how they might be applied to data collected during the pandemic; you can read about that in The Conversation. Again, everything I'm going to show here is open access; everyone can see it, and it's not meant to be just for academic folks. For this min specs work, we held a facilitated meeting using what's called min specs facilitation, which works like this: you get a group of experts together, coming from slightly different places, and you begin by having them brainstorm a list of things that might be important, in our case, for data trusts. But rather than stopping at this long list, you ask a critical question: is it possible to have a complete and well-functioning data trust without any of these? And you pretty ruthlessly cross off anything that's not seen as absolutely essential.

We talked about our findings during the meeting, and then there was a lot of refinement as we brought in concepts from the literature and worked on our manuscript, which was published as I showed you earlier. Overall, we came up with 12 min specs, which I'm going to show on the next slide, in five requirement categories. I'll talk about these at a high level, because our hope is that they will be useful to many of the people participating in this presentation; in other words, they are things you can start using right away. The first category is legal: you have to have legal authority to hold and use data as you want to. We put this first because we realize that, in all the enthusiasm around data sharing, it's actually possible that organizations are doing this without the legal authority to do so. Under governance, which we define as the locus of accountability, we've got four min specs: you have to have a stated purpose, you have to be transparent in your activities, you have to have some kind of accountable governing body, and you have to have governance that's adaptive, because risks and opportunities are always changing when it comes to data.

So it can't be "just establish it and walk away" and think you have all the skills you need on your governance body. Under management, we say in the paper, and I'll say right here, we have only three min specs, but we really want to highlight that this is maybe 80% of the work of establishing a data trust or data infrastructure, because the first requirement, that you must have well-defined policies and processes for all the things you want to do with data, is a really big one. Often it's going to involve full-time staff working on this, more than one person, multiple auditable policies, and legal advice to make sure you're compliant; all of those things. So that's a lot. Practically speaking, if you're brand new to data sharing, you might be better off partnering with an organization that already has experience with all these policies and procedures rather than trying to create your own from scratch. Also under management, we have just two more min specs: among all those policies and procedures, you have to have something regarding data protection safeguards, and you must have some ongoing process to identify, assess, and manage risks, because, as noted under governance, risks are changing all the time.

And our last two requirement categories are ones that don't necessarily need to cost a lot of money, but we find that they're less obviously present in data infrastructure initiatives, because a lot of what's written really refers to the organizations that are sharing data.

But for min spec category four, the point is that no matter what you do as an organization, your front line of privacy and security is the data users. So our min specs are: you must have training that clearly describes allowed and prohibited activities. For example, you must be clear in saying that sharing your login credentials with someone else, or displaying a screen to someone in another country who doesn't have permission to see the data, is prohibited. And the second part is that there has to be some sort of agreement where people acknowledge that their usage will be monitored and that there are consequences for non-compliance, because if you just have training about allowed and prohibited activities but never do anything when people do prohibited things, it's not going to have any teeth or get you anywhere.
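As one way of picturing how these two min specs can be given teeth in practice, here is a minimal sketch, in Python, of gating data access on both completed training and a signed monitoring acknowledgment. The names and the one-year training expiry are hypothetical illustrations, not from the talk.

```python
# Illustrative sketch: access is denied unless the user has completed
# training on allowed/prohibited activities AND signed an agreement
# acknowledging monitoring and consequences for non-compliance.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DataUser:
    name: str
    training_completed: Optional[date] = None  # training min spec
    agreement_signed: Optional[date] = None    # monitoring/consequences min spec

def may_access_data(user: DataUser, training_valid_days: int = 365) -> bool:
    """Both min specs must hold before any data access is granted."""
    if user.training_completed is None or user.agreement_signed is None:
        return False
    # Hypothetical policy: training must be renewed annually.
    return (date.today() - user.training_completed).days <= training_valid_days
```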

And you'll appreciate our final category, public and stakeholder engagement, given the title of this talk. All of the data-holding and data-sharing organizations that I'm aware of naturally have good, strong relationships with, and pay a lot of attention to, the other organizations that provide data to them. But we need to recognize that members of the public are also important stakeholders who need to be engaged, and we've got two min specs here. The first is that there must be early and ongoing engagement that includes members of the public. So it can't be just one time, and it can't be after you've made some decision while hoping that members of the public will support it; it has to be before decisions are made. And second, if there's a group, subpopulation, or community that's going to be really affected by whatever it is you're thinking of doing, whether at the project or the policy level, then you can't count on your general engagement mechanisms to be sufficient.

You need tailored engagement for that group, on their terms, in ways that make sense for them, as opposed to counting on your general engagement mechanisms to work.

The last project I'm going to talk about is also related to acting on what we hear. You saw in the little quote bubbles that people really feel they're being tricked into providing access to their own data, and they don't like it. So we wrote about this, again in an open access, plain-language article, where we talk about "notches on the dial", a call to action to develop plain language so people can have a better sense of what's happening with their data. We also published a couple of articles, again in The Conversation, that everyone can see. One is called "The public needs to know why health data are used without consent", because health data are used without consent in some cases. The other is "Plain language about health data is essential for transparency and trust", and in that one we make the point that you can't have trust without transparency, because if people don't know what you're doing, you're not trustworthy; but you also can't have transparency without plain language.

And that's what this last project is focused on. Here's an example of some simple plain-language text that popped up on my very own screen: Amazon wants to access my Google account, because they want to read, compose, send, and permanently delete all my email from Gmail. And by the way, they want to do that with my contacts and my calendar appointments too. My point here is not that I agree with this. I actually don't; I declined to accept the terms. But I do appreciate the clarity with which they communicated their practices (a short sketch after this paragraph shows where that consent text comes from). Now contrast that with something that came to me from PayPal, which said, "Oh, by the way, we've made some changes; you can review them here," and this is what I was faced with: I have no idea what the policy said before or what has changed, and honestly, I don't have the time to sift through it all. So we've just engaged a large project team, another international one, and we've taken advice from the Public Advisory Council at Health Data Research Network Canada. Beginning probably in the late fall, once we get research ethics board approval, we're going to have a series of small-group dialogues where we talk to people about all the things they care about when it comes to how their data are used.
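Returning briefly to the Gmail consent screen mentioned above: that plain-language text is generated from an OAuth scope request. Here is a minimal sketch, assuming the google-auth-oauthlib library and a hypothetical credentials.json client file, of requesting the full Gmail scope, which is what triggers consent text along the lines of "Read, compose, send and permanently delete all your email from Gmail".

```python
# Sketch only: requires `pip install google-auth-oauthlib` and a
# hypothetical OAuth client file, credentials.json.
from google_auth_oauthlib.flow import InstalledAppFlow

# Google renders this scope to the user in plain language, roughly:
# "Read, compose, send and permanently delete all your email from Gmail".
SCOPES = ["https://mail.google.com/"]

flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
credentials = flow.run_local_server(port=0)  # opens the consent screen in a browser
```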

Then, in the second phase of the project, we're going to go through a process where many individuals construct what we call their own personal data notice. We'll then compare and contrast how the different data notices vary depending on the characteristics of the individuals involved. Our hope is that we end up with much better examples for companies and public-sector organizations to put out there when it comes to informing the public about how their data are used.

So, my closing thoughts on dos and don'ts. Don't be overwhelmed when it comes to thinking about public engagement; do just start where you are. Maybe look at the min specs that you're already fulfilling, put information on your website, and then check in with people to see if you're providing the information they care about. But don't expect a few individuals to provide all the answers you need; you're going to need targeted engagement with other groups with special interests, and you do need to plan for public engagement and involvement to be an ongoing activity.

Do act on what you hear, or at least explain why you didn't or couldn't. Do consider applying these min specs, the 12 essential requirements, to whatever data infrastructure you've established. And do, please, get in touch if you want to collaborate on the plain-language deliberations or the data trust work. With that, I think we're at time, so I will stop presenting the slides and see if there are any questions. Yes, I have one from Fernanda: what would be your top advice when we're doing tech architecture for the sensitive data we're collecting? You've got to engage with the communities, the intended beneficiaries, Fernanda. We in research tend to have blinders on; we see only the benefits of what we're seeking, but we have to realize that the people providing the data may have a different view on benefits and risks. And with that, folks, I'm going to stop sharing. Thank you all for your interest in this presentation, and I wish you a good rest of the conference. Bye bye.