Digital inclusion and equity changes what’s possible

While the pandemic has made clear the critical role of technology, it has also revealed major gaps in data accessibility, as well as underrepresentation within tech organizations and bias within technology.

Janice Zdankus

In association with Hewlett Packard Enterprise

Fueled by innovations in AI, IoT, and blockchain, digital transformation has been accelerating rapidly across industries. But as the world's data increasingly grows at the edge, stark differences in digital equity and inclusion have become clear. Uneven access to technology, underrepresentation within tech companies, and bias within technology itself all contribute to this digital divide, says Janice Zdankus, vice president of strategy and planning and innovation for social impact at HPE.

From health care to manufacturing to agriculture, many organizations don’t have a handle on the data they generate. While data is being created quickly, companies often lack a strategy to organize, share, and account for bias in their data. “I think we see today that there’s not an equitable exchange of data, and those producing data aren’t always seeing the value back to them for sharing their data,” says Zdankus.

Democratizing data access is key to bolstering data inclusion and equity but requires sophisticated data organization and sharing that doesn’t compromise privacy. Rights management governance and high levels of end-to-end security can help ensure that data is being shared without security risks, says Zdankus.

Ultimately, improving digital inclusion and equity comes down to company culture. “It can't just be a P&L [profit and loss] decision. It has to be around thought leadership and innovation and how you can engage your employees in a way that's meaningful in a way to build relevance for your company,” says Zdankus. Solutions need to be value-based to foster goodwill and trust among employees, other organizations, and consumers.

“If innovation for equity and inclusion were that easy, it would've been done already,” says Zdankus. The push for greater inclusion and equity is a long-term and full-fledged commitment. Companies need to prioritize inclusion within their workforce and offer greater visibility to marginalized voices, develop interest in technology among young people, and implement systems thinking that focuses on how to bring individual strengths together towards a common outcome.

This episode of Business Lab is produced in association with Hewlett Packard Enterprise.

Show notes and references

Full transcript:

Laurel Ruma: From MIT Technology Review, I'm Laurel Ruma. And this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace. Our topic today is digital inclusion and equity. The pandemic made clear that access to tech isn't the same for everyone, from broadband access to bias in data to who is hired. But innovation and digital transformation need to work for everyone, and that's a challenge for the entire tech community.

Two words for you. Unconditional inclusivity.

My guest is Janice Zdankus, who is the vice president of strategy and planning and innovation for social impact at HPE.

This episode of Business Lab is produced in association with Hewlett Packard Enterprise.

Welcome Janice.

Janice Zdankus: Hi there. Great to be here.

Laurel: So, you've been hosting HPE's Element podcast this season, and the episodes focus on inclusion. In your conversations with experts about digital equity—which includes balancing business and social agendas, bias in data, and how companies can use digital equity as a means of innovation—what sorts of innovative thinking and approaches stand out to you?

Janice: So, we've been talking a lot about ways that technology and innovative approaches can actually be useful for tackling equity and inclusion. And we've had a number of very interesting guests and topics, ranging from thinking about how bias in media can be detected all the way to thinking about trustworthy AI and how companies can actually build an innovation agenda with digital equity in mind.

So, one example would be, we recently spoke to Yves Bergquist, who's the director of the Entertainment Technology Center at the University of Southern California. And he leads a research center focusing on AI in neuroscience and media. And he shared with us an effort to use AI to scan images, to scan scripts, to watch movies and detect common uses of stereotypes, to also look at how bias can be associated with stereotypes, whether intentional or not, in the creation of a media piece, for example, and then to help provide that information on thousands of scripts and movies back to script writers and script reviewers and movie producers, so that they can start to increase their awareness and understanding of how the selection of certain actors, or a director's use of certain images and approaches, can lead to an impression of bias.

And so by being able to automate that using AI, it really makes the job easier for those in the profession to actually understand how, maybe in an unconscious way, they're creating bias or creating an illusion that maybe they didn't intend to. So that's an example of how technology is really assisting human-centered thinking about how we're using media to influence.
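To make the idea concrete: a production system like the one Bergquist describes would rely on trained language and vision models, but even a toy script scanner shows the shape of the workflow. The descriptor list, line format, and flag_descriptors helper below are hypothetical illustrations, not part of ETC's actual tooling; this is a minimal sketch of surfacing recurring descriptor patterns for a human reviewer.

```python
from collections import Counter
import re

# Hypothetical descriptor lexicon a reviewer might want surfaced for closer review.
FLAGGED_DESCRIPTORS = {"feisty", "exotic", "thug", "bossy", "sassy"}

def flag_descriptors(script_lines):
    """Count flagged descriptor words per labeled line, e.g. 'CHARACTER: dialogue'."""
    counts = {}
    for line in script_lines:
        match = re.match(r"^([A-Z][A-Z ]+):\s*(.*)$", line)
        if not match:
            continue
        speaker, text = match.group(1).strip(), match.group(2).lower()
        hits = [w for w in re.findall(r"[a-z']+", text) if w in FLAGGED_DESCRIPTORS]
        if hits:
            counts.setdefault(speaker, Counter()).update(hits)
    return counts

# Toy example: the output is a prompt for a human reviewer, not a verdict on the script.
sample = [
    "NARRATOR: She storms in, feisty as ever.",
    "CASTING NOTE: looking for someone exotic for the neighbor.",
]
print(flag_descriptors(sample))
```

The point of the sketch is the feedback loop Zdankus describes: automated scanning at scale, with the results handed back to writers and producers so they can see patterns they may not have intended.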

Laurel: That's amazing, because that's an industry where, I mean, obviously there's technology involved, but people may be a bit surprised that AI could actually be used in such a way.

Janice: Yeah. AI has a lot of ability to scan and learn way beyond the scale that the human brain can. But I think you also have to be careful when you're talking about AI and how AI models are trained, and the possibility for bias being introduced into those models. So, you really have to think about it end-to-end.

Laurel: So, if we dig a little deeper into the components of inclusion and digital equity issues, like starting with where we are now, what does the landscape look like at this point? And where are we falling short when it comes to digital equity?

Janice: There's three ways to think about this. One being: is there bias within the technology itself? The example I just mentioned, around AI potentially being built on biased models, is certainly one example of that. The second is who has access to the technology. We have quite a disproportionate set of accessibility to cellular, to broadband, to technology itself across the world. And the third is what is the representation of underrepresented groups, underserved groups in tech companies overall. All three of those factors contribute to where we could be falling short around digital equity.

Laurel: Yeah. That's not a small number of points to really think about and dig through. But when we're thinking about this through the tech lens, how has the enormous increase in the volume of data affected digital equity?

Janice: So, it's a great thing to point out. There is a ton of data growing at what we call the edge, at the source of where information gets created, whether it be on a manufacturing line or an agricultural field, or sensors detecting the creation of processes and information. In fact, I think more than 70% of companies say they don't have a full grasp on the data being created in their organizations that they may have access to. So, it's being created. The problem is: is that data useful? Is that data meaningful? How is that data organized? And how do you share that data in such a way that you can actually gain useful outcomes and insights from it? And is that data also potentially being created in a way that's biased from the get-go?

So, an example for that might be, I think a common example that we hear about a lot is, gosh, a lot of medical testing is done on white males. And so therefore, does that mean the outcomes from medical testing that's occurring, and all the data gathered on that, should only be used or applied to white males? Is there any problem around it not representing females or people of color? Could data points gathered from testing in a broader, more diverse range of demographics result in different outcomes? And that's really an important thing to consider.

The second thing is around access to the data. So yes, data is being generated in increasing volumes, far more than we predicted, but how is that data being shared, and are the people, machines, or organizations collecting that data willing to share it?

I think we see today that there's not an equitable exchange of data, and those producing data aren't always seeing the value back to them for sharing their data. So, an example of that would be smallholder farmers around the world, of whom 70% are women. They may be producing a lot of information about what they're growing and how they're growing it. And if they share that with various members along the food system or the food supply chain, is there a benefit back to them for sharing that data, for example? There are other examples of this in the medical or health field. There might be private information about your body, your images, your health results. How do you share that, in an aggregated way, for the benefit of society or for research without compromising privacy?

I mean, an example of addressing this is the introduction of swarm learning, where data can be shared but it can also be held private. So, I think this really highlights the need for rights management governance, high degrees of end-to-end security, and trust ensuring that the data being shared is being used the way it was intended to be used.

I think the third challenge around all this is that the volume of data is almost too unwieldy to work with unless you really have a sophisticated technology system. In many cases there's an increasing demand for high performance computing and GPUs. At HPE, for example, we have high performance computing as a service offered through GreenLake, and that's a way to help create greater access, or democratize access to data. But having systems and ways, or what I'll call data spaces, to share distributed and diverse data sets is going to be more and more important as we look at the possibilities of sharing not just within a company, but across companies and across governments and across NGOs to actually drive the benefit.
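As a rough illustration of the swarm learning idea Zdankus mentions, here is a minimal sketch in which each participant fits a model on its own private data and only the resulting parameters are pooled; the raw records never leave the site. HPE's actual Swarm Learning framework coordinates peers differently (including a blockchain-based layer), so treat the hospital scenario, the local_fit helper, and the simple parameter-averaging step as assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_fit(X, y):
    """Ordinary least-squares fit on one participant's private data."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Three hypothetical hospitals each hold private data they never share.
true_w = np.array([2.0, -1.0])
private_sets = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    private_sets.append((X, y))

# Each site computes its own parameters; only these small vectors are pooled.
local_weights = [local_fit(X, y) for X, y in private_sets]
global_weights = np.mean(local_weights, axis=0)

print("averaged model:", global_weights)  # close to [2.0, -1.0]
```

The design point is that the exchange is of model parameters rather than patient records, which is what makes aggregated insight possible without compromising privacy.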

Laurel: Yeah, and across research bodies and hospitals and schools, as the pandemic has shown us as well. That sort of sharing is really important, but so is keeping the privacy settings on.

Janice: That's right. And that's not widely available today. That's an area of innovation that really needs to be applied across all of the data sharing concepts.

Laurel: There's a lot to this, but is there a return on investment for enterprises that actually invest in digital equity?

Janice: So, I have a problem with the question, and that's because we shouldn't be thinking about digital equity only in terms of whether it improves the P&L [profit and loss]. I think there's been a lot of effort recently to try to make that argument, to bring the discussion back to the purpose. But ultimately, to me, this is about the culture and purpose of a company or an organization. It can't just be a P&L decision. It has to be around thought leadership and innovation and how you can engage your employees in a way that's meaningful in a way to build relevance for your company. I think one of the examples that NCWIT, the National Center for Women & Information Technology, used to describe the need for equity and inclusion is that inclusion changes what's possible.

So, when you start to think about innovation and addressing problems over the long term, you really need to stretch your thinking away from just the immediate product you're creating next quarter and selling for the rest of the year. It needs to be a values-based set of activities that oftentimes can bring goodwill, can bring trust. It leads to new partnerships, it grows new pipelines.

And the recent Trust Barometer published by Edelman had a couple of really interesting data points. One being that 86% of consumers expect brands to act beyond their product and business. And they believe that trust pays dividends: 61% of consumers will advocate for a brand that they trust, and 43% will remain loyal to that brand even through a crisis. And then it's true for investors too. They also found that 90% of investors believe that a strong ESG [Environmental, Social and Governance] performance makes for better long-term investments in a company. And then I think what we've seen really in spades here at Hewlett Packard Enterprise is that our employees really want to be a part of these projects, because it's rewarding, it's values aligned, and it gives them exposure to sometimes very difficult problems to solve. If innovation for equity and inclusion were that easy, it would've been done already.

So, some of the challenges in the world today that align to the United Nations SDGs [Sustainable Development Goals], for example, are very difficult problems, and they are stretching the boundaries of technology innovation today. I think the Edelman Barometer also found that 59% of people who are thinking about leaving their jobs are doing so for better alignment with their personal values. So having programs and activities like this in your company or in your organization really can impact all of these aspects, not just your P&L. And I think you have to think about it systematically like that.

Laurel: And ESG stands for environmental, social, and governance ideas or aspects, standards, et cetera. And SDG is the UN's initiative on Sustainable Development Goals. So, this is a lot, because we're not actually assigning a dollar amount to what is possible here. It's more like, if an enterprise wants to be socially conscious, or not even socially conscious, just a player that attracts the right talent and whose customers have trust in them, they really have to invest in other ways of making digital equity real for everyone, maybe not just for their customers, but for tomorrow's customers as well.

Janice: That's right. The thing, though, is it's not just a one-and-done activity. It's not like, ‘Oh, I want my company to do better at digital equity, so let's go do this project.’ It really has to be a full-fledged commitment around a culture change, or an enhancement to a comprehensive approach around this. And so one way to do this would be, don't expect to go too fast. This is long term; you're in it for the long haul. And you're really thinking, or needing to think, across industries with your customers, with your partners, and to really take into account that innovation around achieving digital equity needs to be inclusive in and of itself. So, you can't move too fast. You actually need to include those who provide a voice to ideas that maybe you don't have.

I think another great comment or slogan from NCWIT is that the idea you don't have is the voice you haven't heard. So how do you hear those voices you haven't heard? And how do you learn from the experts or from those you're trying to serve? Expect that you don't know what you don't know. Expect that you don't necessarily have the right awareness at the ready in your company, and that you need to really bring that in so that you have representation to help drive that innovation. And then that innovation will drive inclusivity.

Laurel: Yeah. And I think that's probably so crucial, especially with what we've learned over the last few years of the pandemic. If customers don't trust brands, and employees don't trust the company they work for, they'll find other opportunities. So, this is a real thing. This is affecting companies’ bottom lines. This is not a touchy-feely, pie-in-the-sky thing, but it is ongoing. As you mentioned, inclusivity changes what's possible. That's not a one-time thing; it's ongoing. But there are still obstacles. So maybe the first obstacle is just understanding this is a long process. It's ongoing. The company is changing. So digital transformation is important, as is digital equity transformation. So, what other things do companies have to think about when they're working toward digital equity?

Janice: So as I said, I think you have to include voices that you don't presently have. You have to have the voice of those you're trying to serve in your work on innovation to drive digital equity. You need to build the expectation that this is not a one-and-done thing. This is a culture shift. This is a long-term commitment that has to be in place. And you can't go too fast. You can't expect that just saying, ‘Oh, I'm going to adopt a new technology, let's say facial recognition, into my application so that I have more awareness,’ will get you there. Well, you know what, sometimes those technologies don't work. We know already that facial recognition technologies, which are rapidly being decommissioned, are inherently biased, and they're not working for all skin tones.

And so that's an example of, oh, okay, somebody had a good idea and maybe a good intention in mind, but it failed miserably in terms of addressing inclusivity and equity. So, expect to iterate, expect that there will be challenges, and know that you have to learn as you go to actually achieve it. But do you have an outcome in mind? Do you have a goal or an objective around equity? Are you measuring that in some way, shape, or form over the long haul, and who are you involving to actually create that? Those are all important considerations to address as you try to achieve digital equity.

Laurel: You mentioned the example of using AI to go through screenplays to point out bias. That must be applicable in a number of different industries. So where else do AI and machine learning have such a role, and such possibility, in digital equity?

Janice: Many, many places. Certainly a lot of use cases in health care, but one I'll add is in agriculture and food systems. That is a very urgent problem: with the population expected to grow to over 9 billion by 2050, we are not on track to be able to feed the world. And that's further complicated by the issues around climate change. So, we've been working with CGIAR, an academic research leader in the world around food systems, and also with a nonprofit called Digital Green in India, where they're working with 2 million farmers in Bihar, helping those farmers gain better market information about when to harvest their crops and understand what the market opportunity is for those crops at the different markets they may go to. And so it's a great AI problem around weather, transportation, crop type, market pricing, and how those figures all come together in the hands of a farmer who can actually decide to harvest or not.

That's one example. I think other examples with CGIAR really are around biodiversity and understanding what to plant given the changing nature of water and precipitation and soil health, and providing those insights and that information in a way that smallholder farmers in Africa can actually benefit from. When and where to fertilize, perhaps. Those are all techniques for improving profitability on the part of a smallholder farmer. And that's an example of where AI can do those complicated insights and models over time, in concert with weather and climate data, to actually make pretty good recommendations that can be useful to these farmers. So, I mean, that's an example.
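As a concrete, heavily simplified illustration of the Bihar harvest-timing example above: a real system would combine trained models with live weather, transport, and market feeds, but the decision it hands a farmer can be thought of as a score like the one sketched below. The MarketOutlook fields, thresholds, and recommend_harvest function are invented for illustration and are not part of the Digital Green or CGIAR systems.

```python
from dataclasses import dataclass

@dataclass
class MarketOutlook:
    expected_price_per_kg: float   # forecast price at the nearest market
    transport_cost_per_kg: float   # cost to move the crop to that market
    rain_probability: float        # chance of damaging rain before the next window

def recommend_harvest(outlook: MarketOutlook, breakeven_margin: float = 0.5) -> str:
    """Combine weather risk and market margin into a simple harvest recommendation."""
    margin = outlook.expected_price_per_kg - outlook.transport_cost_per_kg
    if outlook.rain_probability > 0.6:
        return "harvest now: high risk of rain damage"
    if margin > breakeven_margin:
        return "harvest now: margin above breakeven"
    return "wait: margin below breakeven and weather risk is low"

print(recommend_harvest(MarketOutlook(0.9, 0.2, 0.1)))  # attractive margin -> harvest
print(recommend_harvest(MarketOutlook(0.4, 0.2, 0.8)))  # rain risk -> harvest
```

The value of the AI work Zdankus describes is in estimating those inputs (prices, transport, weather risk) well enough that a recommendation like this is trustworthy in a farmer's hands.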

I mean, another example we've been working on is one around disease prediction. So really understanding, for certain diseases that are prominent in tropical areas, what are the factors that lead up to an outbreak of a mosquito-borne disease, and how can you predict it, or can you predict it well enough in advance to actually be able to take an action or move a therapeutic or an intervention to the area that could be susceptible to the outbreak. That's another complicated AI problem that hasn't been solved today. And those are great ways to address challenges that affect equity and access to treatment, for example.

Laurel: And certainly with the capabilities of compute power and AI, we're talking about almost real-time capabilities, versus trying to go back over the history of weather maps and much more analog ways of delivering and understanding information. So, what practical actions can companies take today to address digital equity challenges?

Janice: So, I think there are a few things. One is, first of all, building your company with an intention to have an equitable, inclusive employee population. So the actions you take around hiring, who you mentor, who you help grow and develop in your company, are important. And as part of that, companies need to showcase role models. It might be a little cliché at this point, but you can't be what you can't see. And we know in the world of technology that there haven't been a lot of great visible examples of women CIOs or African American CTOs or leaders and engineers doing really cool work that can inspire the next generation of talent to participate. So I think that's one thing: showcase those role models, and invest in describing your efforts in inclusivity and innovation around achieving digital equity.

So really trying to explain how a particular technology innovation is leading to a better outcome around equity and inclusion is just important. Many students decide by the time they are in fifth grade, for example, that technology is boring or that it's not for them, that it doesn't have the human impact they really desire. And that falls on us. So, we have worked with a program called Curated Pathways to Innovation, which is a free, online, personalized learning product for schools that is attempting to do exactly that: reach middle schoolers before they make the decision that a career in technology is not for them, by really helping them improve their awareness of and interest in careers in technology, and then helping them, in a stepwise, agency-driven approach, start to prepare for that content and that development around technology.

But you can think about children in the early elementary school days, where they're reading books and seeing examples of what does a nurse do? What does a firefighter do? What does a policeman do? Are those kinds of communications and examples available around what does a data scientist do? What does a computer engineer do? What does a cybersecurity professional do? And why is that important and why is that relevant? And I do think we have a lot of work to do as companies in technology to really showcase these examples. I mean, I would argue that technology companies have had more impact on our world globally in the last decade or two than probably any other industry. Yet we don't tell that story. So how do we help connect the dots for students? We need to be a voice, we need to be visible, in developing that interest in the field. And that's something that everybody can do right now. So that's my two cents on that.

Laurel: So, there's so much opportunity here, Janice, and certainly a lot of responsibility technologists really need to take on. So how do you envision the next two or three years going with digital equity and inclusion? Do you feel like this clarion bell is just ringing all over the tech industry?

Janice: I do. In fact, I see a few key points as really, really essential in the future evolution of equity and inclusion. First of all, I think we need to recognize that technology advancements are actually ways that inclusion can be improved and supported. So, it's a means to an end. Recognize that the improvements we make and the technology innovations we bring can drive inclusion more fully. Secondly, I think we need to think about the future of work, where the jobs will be, and how they'll be developing. We need to think about education as a means to participate in what is, and will continue to be, the fastest growing sector globally. And that's around technology, around cybersecurity, around data science, and those career fields. But yet right now some states really don't even have a high school computer science curriculum in place.

It's hard to believe that, but it's true. And some states that do, don't give college prep credit for it. And so, if we think the majority of jobs that are going to be created are going to be in the technology sector, in the fields I just described, then we need to ensure that our education system is supporting that in all avenues, in order to address the future of work. First and foremost, it has to start with literacy. We do still have issues around the world, and even in the United States, around literacy. So, we really have to tackle that from the get-go.

The third thing is systems thinking. These really tough problems around equity are more than just funding or writing a check to an NGO or doing a philanthropic lunch-packing exercise. Those are all great, and I'm not saying we should stop them, but I actually think we have a lot of expertise in the technology sector around how to partner, how to work together, how to think about a system, and how to allow for outcomes where you bring the individual strengths of all the partners together towards a common outcome.

And I think now more than ever, and going into the future, being able to build systems of change for inclusion and equity is going to be essential. And then finally, I think the innovation that is being created through the current programs around equity and social impact is really challenging us to think about bigger, better solutions. And I'm really, really optimistic that the new ideas that can be gained from those working on social innovation and technology innovation for social impact are just going to continue to impress us and to continue to drive solutions to these problems.

Laurel: I love that optimism, and bigger and better solutions to these problems are what we all really need to focus on today. Janice, thank you so much for joining us on the Business Lab.

Janice: Thanks so much for having me.

Laurel: That was Janice Zdankus, vice president of strategy and planning and innovation for social impact at HPE, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review, overlooking the Charles River. That's it for this episode of Business Lab. I'm your host, Laurel Ruma. I'm the director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology. And you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com.

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you'll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Collective Next. Thanks for listening.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.
