
Privacy as infrastructure: designing for digital trust with data privacy strategist Ericka Watson

AI & emerging tech, Digital growth, Privacy-Led Marketing

Ericka Watson, Esq. sees privacy as a catalyst, not a constraint. As CEO of Data Strategy Advisors, she works with organizations to design trust into data and AI systems early so they can move faster and build better.

For organizations, that means moving from a “Can we collect it?” mentality to one that asks, “Should we collect it?”

Written by Brunni Corsato
Read time: 6 mins
Updated: Jan 30, 2026

Privacy is often framed as a gatekeeper — one that slows innovation and limits what’s possible for business. Ericka Watson, Esq., a data privacy and AI governance strategist, sees it differently.

Ericka is the CEO of Data Strategy Advisors, where privacy isn’t a barrier, but a catalyst. It’s also a design principle that sharpens thinking, builds trust, and unlocks collaboration between brands and their audiences.

Across Ericka’s work in data privacy, AI governance, and ethics-first research environments, she’s shown that the organizations willing to embed privacy early and throughout their workflows move faster and build better.

“Data literacy is understanding how information moves, what’s being collected, and how it’s used to influence experiences,” she explains. “Respecting data is a mindset shift that recognizes that every dataset represents real people, with real consequences attached to how that data is handled.”

In this conversation, Ericka shares how treating privacy as infrastructure can accelerate innovation, why data literacy is foundational to digital trust, and how organizations can embed consent, governance, and AI guardrails into systems that respect human agency.

Privacy as strategic infrastructure

Brunni: Your LinkedIn profile leads with: “It’s all about data: trust, consent, AI ethics, privacy.” Can you expand on that, and on the practical implications of personal data holding so much value?

Ericka: When I talk about trust, consent, AI ethics, and privacy, I’m really talking about data stewardship: how we harness the incredible power of data without losing sight of the humans behind it.

Data today functions as both currency and a reflection of who we are. Every interaction creates data that can influence outcomes and shape decisions about us, from the ads we see to the medical treatments we’re offered. 

That reality creates responsibility. Data isn’t just something organizations use; it represents individuals and carries real-world consequences.

The practical implication is that privacy must be treated as infrastructure, not only policy. Organizations shouldn’t build data-driven systems without embedded consent, governance, and accountability. 

When privacy is designed into the core of how data flows, decisions are clearer, risks are reduced, and trust becomes operational. Data is an asset that represents people. If businesses respect that relationship, they can create experiences that are not only compliant but trustworthy.

Every digital interaction creates a data point that shapes outcomes for real people. Treating privacy as infrastructure is how organizations harness data responsibly without losing sight of its human impact.

— Ericka Watson, data privacy and AI governance strategist, Data Strategy Advisors

Brunni: You’ve described privacy as being a value-add rather than a constraint. How do you see privacy creating opportunities for innovation?

Ericka: When organizations see privacy only as a compliance checkbox, they usually treat it as friction and miss its strategic potential.

When embedded early, with the right people and processes, privacy becomes a design principle that forces clarity, intentionality, and better decision-making. It pushes teams to think about why they’re collecting data, how they’re using it, and what value they’re providing in return.

That mindset shift leads to cleaner data, smarter products, and loyal customers.

I’ve seen this firsthand in highly regulated research environments where sensitive data, including genetic and health data, was central to collaboration. A great example is when my team and I were working on a life science research collaboration involving genetic data. We developed a process to verify that every piece of genetic data acquired had been appropriately consented for sharing before the deal was finalized and before it entered the research environment.

That proactive step created a verifiable chain of trust that regulators and partners could rely on. It eliminated delays, reduced compliance rework, and strengthened our credibility as a trusted data steward. 
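Ericka doesn’t detail the tooling behind that verification step, but the gate she describes can be pictured as a pre-ingestion check in code. The sketch below is a minimal Python illustration under assumed names: the consent registry and the “research_sharing” purpose are hypothetical stand-ins, not her team’s actual system.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    subject_id: str
    purposes: set = field(default_factory=set)  # e.g. {"research_sharing"}
    withdrawn: bool = False

def _cleared(consent):
    # Active consent that explicitly covers sharing for research.
    return (
        consent is not None
        and not consent.withdrawn
        and "research_sharing" in consent.purposes
    )

def gate_for_research(records, registry):
    """Pre-ingestion gate: every record must map to verified consent
    before the data enters the research environment. Blocking the whole
    transfer surfaces problems before a deal is finalized."""
    blocked = [r["subject_id"] for r in records if not _cleared(registry.get(r["subject_id"]))]
    if blocked:
        raise PermissionError(f"Consent not verified for subjects: {blocked}")
    return records

registry = {"s-001": ConsentRecord("s-001", {"research_sharing"})}
cleared = gate_for_research([{"subject_id": "s-001"}], registry)  # passes the gate
```

The design choice worth noting is that the gate fails the whole transfer rather than silently dropping unverified records, which is what makes the resulting chain of trust verifiable for regulators and partners.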

When privacy is embedded into deal structures and data pipelines, it removes friction instead of creating it. We not only streamlined research collaborations but also accelerated the pace of discovery, demonstrating trust as an enabler of innovation.

Empowering users through data literacy

Brunni: Digital trust is tied to whether users understand what happens to their data. Yet most users still find themselves confused. What skills or mindsets do people and organizations need to develop so they truly understand and stay in control of their data?

Ericka: It starts with data literacy and respect for data.

Data literacy is the ability to understand how information moves, what’s being collected, and how it’s used to influence experiences, whether that’s healthcare, shopping, or hiring. Data literacy is not just a user skill; it’s an organizational design obligation. 

Organizations can’t expect users to be data literate if systems are intentionally opaque. People need clarity and honesty from the systems and organizations they interact with. 

For organizations, that means moving from a “Can we collect it?” mentality to one that asks, “Should we collect it?”

For individuals, it means asking the simple questions: Why am I sharing this? What value am I getting in return?

Data literacy is understanding how your information moves, what’s being collected, and how it’s used to influence your experiences. Respecting data is a mindset shift that recognizes that every dataset represents real people, with real consequences attached to how that data is handled.

— Ericka Watson, data privacy and AI governance strategist, Data Strategy Advisors

The evolution of data agency

Brunni: How do you see the conversation around ownership and control of personal data evolving, and what role can businesses and regulators play in ensuring individuals have a voice in how their data is used?

Ericka: The data ownership conversation is maturing. People want more than transparency; they want control and agency. This puts businesses and regulators in a position to redefine what digital trust looks like.

The role of businesses is to figure out how to embed respect for data into business operations: offering meaningful choice, making data sharing clear, and making consent withdrawal easy.

If companies treat privacy like a product feature, it will differentiate them in the market. The most trusted organizations will be those that build control into their user experience and make ethical data use a competitive advantage.
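Ericka leaves “privacy like a product feature” at the level of principle. One way to read it, as a rough sketch with illustrative names rather than a reference design, is a consent service where withdrawal is a single first-class operation and every downstream check defaults to deny:

```python
class ConsentService:
    """Sketch of consent as a product feature: granting, withdrawing,
    and checking consent are ordinary, immediate operations."""

    def __init__(self):
        self._grants = {}  # (user_id, purpose) -> bool

    def grant(self, user_id, purpose):
        self._grants[(user_id, purpose)] = True

    def withdraw(self, user_id, purpose):
        # One call, immediate effect; no support ticket required.
        self._grants[(user_id, purpose)] = False

    def allowed(self, user_id, purpose):
        # Default-deny: no record means no consent.
        return self._grants.get((user_id, purpose), False)

svc = ConsentService()
svc.grant("u-123", "personalized_ads")
svc.withdraw("u-123", "personalized_ads")
assert not svc.allowed("u-123", "personalized_ads")
```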


For regulators, the role is to create frameworks that reward responsible innovation while holding organizations accountable. That includes consistent rules that make it possible for individuals to exercise control no matter where their data flows.

When regulators set clear guardrails and businesses commit to designing within them, individuals gain not just protection, but control and the ability to participate in the digital economy on their own terms.


AI’s promises and necessary guardrails

Brunni: Thinking of the future, what excites you most about the potential of AI and data? What guardrails do you believe are non‑negotiable to develop the technology ethically?

Ericka: What excites me most is AI’s potential to solve real human problems, especially in accelerating medical breakthroughs and closing equity gaps in education and access. 

But we must not skip governance. AI moves fast, and without governance, it risks moving in the wrong direction. 

Governance isn’t about slowing progress; it’s about supporting it. It provides transparency in how data is used, accountability for outcomes, and fairness in the systems we build.

For me, it’s clear that transparency, accountability, and equity must be built into AI from the start. Every AI system must have a named owner, a defined purpose, and auditable decision pathways. We need to know who’s responsible, how decisions are made, and that those decisions don’t reinforce bias or harm. That’s how AI can truly advance humanity in a way that’s ethical, inclusive, and worthy of trust.
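The interview stops at the principle, but “a named owner, a defined purpose, and auditable decision pathways” maps naturally onto a small governance record. The Python sketch below is one illustrative shape for that idea, with hypothetical field names and example values; it is not a standard or Ericka’s prescribed design.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AISystemRecord:
    """A system can't be registered without an accountable owner and a
    documented purpose; decisions accumulate in an append-only log so
    outcomes can be audited later."""
    system_name: str
    owner: str      # a named, reachable person, not a team alias
    purpose: str    # what the system is for, and implicitly what it isn't
    decision_log: list = field(default_factory=list)

    def log_decision(self, inputs_summary, outcome, rationale):
        self.decision_log.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "inputs": inputs_summary,
            "outcome": outcome,
            "rationale": rationale,
        })

triage = AISystemRecord(
    system_name="outreach-prioritizer",
    owner="jane.doe@example.org",
    purpose="Rank follow-up outreach; not approved for diagnosis.",
)
triage.log_decision("age band, symptom codes", "priority=high", "urgent-symptom rule matched")
```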

____________________________

Ericka Watson is a privacy, data, and AI governance leader and the Founder and CEO of Data Strategy Advisors. She works with organizations to design trust into data and AI systems from the start, helping them innovate responsibly while navigating evolving global regulations. With more than 20 years of experience, including serving as Chief Privacy Officer at Regeneron and leading privacy programs at Danaher, AbbVie, Abbott, and GE Healthcare, Ericka brings a practical, human-centered approach to digital governance. She also teaches Ethics, Law, and Social Issues at Northwestern University, is a former Chair of the Science and Technology Section of the American Bar Association, and advises on emerging issues at the intersection of data, technology, and law.
