Taking Ownership of Ethical, Inclusive Analytics Development

Shazia Manus
SVP, Experience Capabilities, CUNA Mutual Group

Why did you choose this topic?

I subscribe to the Peter Parker principle. Knowledge is power, and with great power comes great responsibility.

Everyone leading strategy in the field of data analytics must take ownership of the harm rapidly advancing technology could cause. Humans and machines are gaining access to personal data at an astonishing rate, from an equally astonishing number of devices. When taken out of context, used for profit over people, or viewed through a filter of bias, that data can drive life-altering, industry-disrupting, world-changing decisions.

There will be unintended consequences if we don’t, as a field, wrap our arms around the ethical implications of having this much knowledge, not to mention the power to process it at superhuman speeds.

How does this topic hit close to home for you?

Since I was young, the threat of being typecast loomed large. That's because I grew up as a female in a part of the world that has long cast women as second-class citizens. Coming of age in Bangladesh can feel, at times, like your future is predetermined, like you have no control over your destiny. Because of that influence, I'm a strong advocate for empowerment.

Data analytics, in the right hands, can be incredibly empowering, especially when it comes to money. Just think of the financial wellness, economic inclusion, wealth building and financial literacy we can bring about with people-centered analytics. The number of empowering use cases truly boggles the mind.

Tell me more. What keeps you up at night?

Those very use cases keep me up… but, in a good way. There are nights I can’t stop imagining all the amazing things credit unions will be able to do for their members, their employees and their communities once they strengthen their analytics muscles.

Insights from data can do so many things, but one of the most exciting is helping people know their worth. Or, from the credit union’s perspective, demonstrating that the organization knows and values its members as individuals.

What’s happening right now that makes this topic important?

More people than ever are taking notice of the inequalities, tokenism and marginalization present in American society. A realization of the need for stronger diversity, equity and inclusion is running parallel to a greater consciousness of technology's influence in our lives. As a result, media, researchers, developers and strategists are calling attention to the importance of building security, privacy and anti-bias controls into our analytics tools.

How is this going to look a year down the road? What about 5 years?

A year from now, I expect more of the technology providers that have a data analytics component will be transparent about the algorithms and models that underpin their solutions. Similar to how we’ve seen privacy policies become more prevalent as the Internet of Things has exploded, we’ll soon see companies divulge more about their collection, use and sharing of user data.

Alongside an increase in transparency, there will be an expanded set of opt-in options. People want to be able to tell an app things like, “Yes, you can monitor my location, but no, you can’t access my contacts.” There’s not a lot of customization around data permissions in financial services today, although the open banking trend is exposing a lot of potential in that realm.

In five years, I'd like to see standards for the tech industry around developing artificial intelligence, machine learning and other big data analytics tools. I think larger society will demand it, actually. To be sure, best practices for bias mitigation in models and AI systems are emerging. Techniques like counterfactual fairness and adversarial training are having a meaningful impact. However, developers could really use some accepted guidance around what works and what doesn't, and ultimately what advances humanity as much as it advances industry.