As part of the ‘Tech Ethics Bristol’ event series, we caught up with J Cromack, Co-Founder of MyLife Digital, now part of DataGuard.
We talked about technology, ethics and the importance of understanding the implications technology can have for society, specifically around data privacy and using data to deliver better outcomes through more personalised experiences.
We are very excited to have J as a guest speaker at our second “Tech Ethics Bristol” event on Friday, May 14th, 12:20pm – 1:30pm (RSVP HERE).
MyLife Digital’s mission is to be the category-defining company for Consent and Preference Management and to make Privacy UX the norm for consumers across the world. Put simply, it aims to enable people to live in a world where their data powers positive outcomes for themselves and society.
MyLife Digital has developed a Consent and Preference Management SaaS platform that enables organisations to rebalance control of the personal data they hold about a consumer by surfacing, at any point in the customer journey, the personal data held, the purposes it is being used for and the value being generated from it (for both the consumer and the organisation). This is a core component in building trust and loyalty with consumers and in meeting new regulatory obligations.
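To make that idea concrete, here is a minimal, hypothetical sketch of what a consent-and-preference record might look like, written in Swift. All type and field names are illustrative assumptions, not MyLife Digital’s actual schema.

```swift
import Foundation

// Hypothetical sketch only – not MyLife Digital's actual data model.
// One record per consumer, surfacing what data is held, the purposes
// it is used for, and the consumer's current consent for each purpose.

struct ProcessingPurpose {
    let name: String             // e.g. "Personalised offers"
    let valueToConsumer: String  // the benefit surfaced back to the consumer
    var consented: Bool          // the consumer's current choice
    var lastUpdated: Date        // when that choice was recorded (audit trail)
}

struct ConsentRecord {
    let consumerID: UUID
    let dataCategories: [String] // e.g. ["email", "purchase history"]
    var purposes: [ProcessingPurpose]

    // Only the purposes the consumer has actively agreed to may be used.
    var activePurposes: [ProcessingPurpose] {
        purposes.filter { $0.consented }
    }
}
```

Surfacing a record like this at any point in the journey is what lets the consumer see, and change, how their data is being used.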
J Cromack is an advocate of the opportunities emerging from GDPR and has been in demand as a speaker on the subject, thanks to his pragmatic approach to data ethics, privacy and data protection, and to how organisations can embrace the new regulatory landscape to deliver greater value and build trust with their consumers. In 2020, J was awarded DataIQ’s Privacy and Trust Champion award, and he is recognised by DataIQ as one of the 100 most influential people in data-driven businesses and the innovators who support them.
J Cromack: My blog ‘Falling off the Data Wagon’ from January 2015 documents why I made the switch and co-founded MyLife Digital. https://bigdatahound.com/2015/01/03/falling-off-the-data-wagon/
Having been involved in data and analytics for many years, I firmly believe human-centric data is a ‘must have’ for positive outcomes. It is data that has been authenticated by trusted individuals. In today’s world of personal data misuse [using people’s data for purposes of which they were unaware], we are now challenging the conventions, provenance and insights being generated as a result of the explosion of data across the world. If trust in that data is lost, value is lost. As a result, consumers provide less data, databases shrink, data sharing becomes limited, experiences become generic and communications less targeted. Ultimately, the outcome is reduced income for organisations and curtailed innovation for society.
J Cromack: I wouldn’t call myself a teacher. More a visiting lecturer who, once a year, hopefully opens the eyes of future developers and tech entrepreneurs and helps them understand the challenges they’ll face when collecting data, tracking consumers and building AI. Students need to understand the importance of putting themselves in the shoes of the user [I hate that term – we’re not users, we’re humans!] and of applying privacy by design and ethical frameworks to build better experiences and outcomes.
J Cromack: 2021 will be a year of transition. As communities, consumers, and businesses leave the pandemic behind, they will embrace a new normal.
Three privacy-related trends will underpin this transition: 1) an ever-increasing appetite to collect, process, and share sensitive personal data from consumers and employees; 2) despite the recessionary economy, values-based consumers will increasingly prefer to engage with and entrust their data to ethical businesses; and 3) regulatory and compliance complexity in relation to data privacy on a global scale will increase further.
If organisations can’t balance these three trends, they will see their opportunities diminish. I expect organisations to adopt technologies that deliver greater transparency to consumers and empower them to have more control over the use of their data.
We are already seeing Apple adopt a privacy-first approach with the latest iOS 14.5 upgrade, whose App Tracking Transparency framework requires a positive opt-in (consent) to tracking.
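For developers, that opt-in is surfaced through the App Tracking Transparency prompt. A minimal sketch of requesting it looks like this (the fallback behaviour here is an illustrative assumption):

```swift
import AppTrackingTransparency

/// Ask the user for permission before any cross-app tracking (iOS 14.5+).
/// Until they explicitly tap "Allow", the app must behave as if consent
/// was refused.
func requestTrackingConsent() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Positive opt-in: the advertising identifier may be used.
            print("Tracking authorised")
        case .denied, .restricted, .notDetermined:
            // No consent: fall back to non-personalised experiences.
            print("Tracking not authorised")
        @unknown default:
            print("Unknown tracking status")
        }
    }
}
```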
As Gartner have highlighted, privacy is becoming a reason for consumers to purchase a product, in the same way that “organic,” “free trade” and “cruelty-free” labels have driven product sales in the past decade. And as you’d expect, Apple are trying to lead on this approach. With 60% of the global population expected to have their data rights governed by data protection laws by 2023, expect to see Privacy UX become a real thing!
J Cromack: AdTech – cookies are blocked, or about to be, so new tools are created that address the “privacy” concerns of the regulation. But when you scratch the surface, you have to question the ethics behind these solutions. Just because they meet the regulations doesn’t mean principled organisations should adopt them, because often they’re getting around the regulations to deliver the same objective.
A good example of this, which I have shamefully copied from the World Federation of Advertisers’ recent Data Ethics report…
When you download a pizza delivery or ride-hailing app and agree to its terms, the app learns a lot about you: the fact that you have the latest iPhone, for example, who you bank with, your network of friends and family, where you live and possibly where you work too. With every interaction, the algorithm gets to know you a little better; it won’t be long before it learns that you like to go out or eat when you get back. When you hail a cab, it knows it’s 2:00 am. With access to other data, it could work out that you’re in a dangerous part of the city, far from home and in the rain; it could even work out that you’ve had too much to drink (not from the regrettable text messages you sent, but by the change in your gait). Worst of all, it knows that you only have 5% left on your battery.
It is the company’s approach to data ethics that will determine how it chooses to use this information. On the one hand, if the company’s priority is safety, then it could use this data to protect you (making sure you are picked up first). On the other hand, it could raise its charges instead, since the algorithm knows that, in situations like these, you are statistically more likely to accept them.
Anyone with access to data will face choices like the example above, and therefore we, as technologists, need to be both ethical and responsible in the decisions we make. I have always believed data has massive power to do good and deliver positive outcomes, but if people don’t trust us with their data, then the huge advantage society can gain from a plentiful supply of data will be lost. It’s time for us to make that decision – more transparency and more control will lead to better outcomes for you and society.