We caught up with the brilliant and insightful Krishna Ganeriwal a few weeks ago and have shared our conversation below.
Krishna, appreciate you joining us today. Innovation comes in all shapes, sizes and across all industries, so we’d love to hear about something you’ve done that you feel was particularly innovative.
One of the most innovative initiatives I led at Meta was part of our broader effort to embed privacy deeply into the Ads Infrastructure stack—particularly through a system we developed to enforce purpose limitation at scale.
Ads systems, by nature, handle massive volumes of personal data. The traditional challenge isn’t just protecting this data but ensuring it is used only for the purposes users consented to. This becomes exponentially complex in an environment like Meta’s, where hundreds of services, pipelines, and ML models interact with sensitive data in real time.
We built a privacy-aware infrastructure that incorporated automated data flow discovery through static and dynamic lineage analysis (https://engineering.fb.com/2025/01/22/security/how-meta-discovers-data-flows-via-lineage-at-scale/). Leveraging Meta’s internal tools, we could trace how data moved across microservices, annotate it with ‘purpose tags,’ and enforce these tags through runtime checks. For example, if a signal was collected for performance measurement, it would be programmatically blocked from being used in personalization or targeting.
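Meta’s internal enforcement framework is not public, but the core idea of purpose tags with runtime checks can be sketched in a few lines. Everything here — the `Purpose` enum, `Signal`, and `use_signal` — is an illustrative assumption, not a Meta API:

```python
from enum import Enum, auto


class Purpose(Enum):
    MEASUREMENT = auto()
    PERSONALIZATION = auto()
    TARGETING = auto()


class Signal:
    """A piece of user data annotated with the purposes it was collected for."""

    def __init__(self, name, allowed_purposes):
        self.name = name
        self.allowed_purposes = set(allowed_purposes)


class PurposeViolation(Exception):
    """Raised when data is about to be used outside its consented purposes."""


def use_signal(signal, purpose):
    """Runtime check: block any use of a signal outside its tagged purposes."""
    if purpose not in signal.allowed_purposes:
        raise PurposeViolation(
            f"Signal '{signal.name}' may not be used for {purpose.name}"
        )
    return f"{signal.name} used for {purpose.name}"


# A signal collected only for performance measurement:
clicks = Signal("click_log", [Purpose.MEASUREMENT])
use_signal(clicks, Purpose.MEASUREMENT)   # allowed
# use_signal(clicks, Purpose.TARGETING)   # would raise PurposeViolation
```

In a real system the tags would be attached at collection time and checked at every service boundary, but the contract is the same: a use not covered by the tag fails loudly instead of silently leaking data into targeting.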
In parallel, we worked on scalable fairness interventions in ads delivery (https://about.fb.com/wp-content/uploads/2023/01/Toward_fairness_in_personalized_ads.pdf)—collaborating with teams building systems like VRS (Variance Reduction System). I contributed to integrating these fairness layers into our delivery engines in a way that respected both privacy boundaries and regulatory goals, such as non-discrimination in credit, housing, and employment ads.
What made this innovative wasn’t just the technical complexity, but the coordination across infrastructure, legal, and ML teams to operationalize abstract principles like ‘purpose’ and ‘fairness’ into enforceable, testable code.
This project became foundational to Meta’s $8B+ multi-year privacy investment (https://about.fb.com/news/2025/01/meta-8-billion-investment-privacy/), influenced our subscription-for-no-ads design in Europe, and helped us meet emerging regulatory requirements globally—all while maintaining system performance at hyperscale. For me, this work represented the intersection of infrastructure design, compliance automation, and ethical responsibility, and it’s one of the most rewarding and forward-looking efforts I’ve contributed to.
As always, we appreciate you sharing your insights and we’ve got a few more questions for you, but before we get to all of that can you take a minute to introduce yourself and give our readers some of your background and context?
I’m Krishna Ganeriwal, a Senior Software Engineer with a focus on privacy infrastructure, currently working at Meta. My journey into tech started in college when I chose Computer Science as my major, even though I had no prior experience with coding. I wrote my first line of code in my first semester, and that moment sparked my passion for technology and problem-solving.
After completing my degree in 2017, I joined Texas Instruments, where I helped develop tools to improve the design of semiconductor chips. Over the years, I led projects that automated parts of the chip design process, making it faster and more efficient. But my interest in working on large-scale systems led me to pursue a Master’s in Computer Science at the University of Wisconsin-Madison. There, I deepened my knowledge of building complex systems that can handle vast amounts of data and support millions of users. After graduating with a perfect GPA, I joined Meta in 2023 as a full-time engineer.
At Meta, I work on building privacy-focused systems for our Ads platform. Essentially, I help ensure that user data is used responsibly and in compliance with privacy laws, all while making sure that our ad systems run efficiently and deliver the right content to users. Some of the key areas I focus on include:
Ensuring data is used only for the right purposes: I build systems that ensure user data is handled in line with what users have consented to, and I help enforce privacy policies across the company.
Helping teams understand how data is used: I design tools that allow engineers to track and manage how data flows across the company’s systems, ensuring that sensitive data is used safely.
Making sure our ads are fair and unbiased: I work on ways to ensure that our ad delivery systems treat all users equally, especially when it comes to sensitive areas like housing, credit, and employment.
Helping Meta comply with privacy laws: I create systems that help Meta follow complex privacy regulations, while also allowing us to offer services like ad-free subscriptions to users.
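The second bullet above — tracking how data flows across systems — comes down to lineage: given a graph of which services feed which, find every downstream consumer of a sensitive source. Here is a toy sketch of that idea; the service names and graph shape are hypothetical, not Meta’s actual topology:

```python
from collections import deque

# Toy service-to-service data flow graph (all names hypothetical).
EDGES = {
    "signal_collector": ["measurement_pipeline", "feature_store"],
    "feature_store": ["ranking_model"],
    "measurement_pipeline": [],
    "ranking_model": [],
}


def downstream(graph, source):
    """BFS over the lineage graph: every service reachable from `source`."""
    seen = set()
    queue = deque(graph.get(source, []))
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(graph.get(node, []))
    return seen
```

With a graph like this, `downstream(EDGES, "signal_collector")` surfaces not just the direct consumers but also transitive ones like the ranking model — exactly the kind of non-obvious flow a lineage tool has to make visible before a purpose check can be enforced.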
What sets me apart is my ability to think not just about how systems work, but also about how they impact users and society as a whole. I believe privacy isn’t just about compliance — it’s about building trust. By designing systems that are transparent and responsible, I help ensure that Meta continues to be a platform people can trust with their data, even at massive scale.
I’m particularly proud of my work in ads fairness, where I’ve helped create systems that prevent discrimination in ads. This work is important not only from a compliance standpoint but because it aligns with Meta’s commitment to creating a fairer, more inclusive advertising ecosystem.
Another accomplishment I’m proud of is my work on data privacy tools that help the company understand and control how sensitive data flows through its systems. These tools are crucial in helping Meta comply with regulations like GDPR and CCPA while continuing to deliver innovative products.
For anyone following my work, I want them to know that privacy isn’t just a challenge — it’s an opportunity to build better, more trustworthy technology. When done right, privacy helps foster deeper connections with users and creates more sustainable, long-term platforms. I’m passionate about using my expertise to create systems that are not only scalable but also ethical and responsible, ensuring that technology works for everyone.
We’d love to hear a story of resilience from your journey.
One story that stands out is the decision to leave a stable job at Texas Instruments and pursue a Master’s in Computer Science at the University of Wisconsin–Madison.
At the time, I was working on high-impact software tools in the semiconductor space, had recently been promoted, and was in a strong position within the company. But I knew that if I wanted to build large-scale infrastructure systems — the kind that power platforms used by billions — I needed to step out of my comfort zone.
So, I made the difficult decision to hit pause on that momentum and start over in a new country, in an extremely competitive academic environment. I had no safety net. I was a first-generation postgraduate student, surrounded by peers from elite global backgrounds. On top of that, I gave myself just 16 months to finish the program — while aiming for academic excellence and a breakthrough into Big Tech.
It wasn’t easy. I pushed through intense coursework, 14-hour days, and job search pressure while competing for internships where only a small fraction of candidates were selected. I ended up landing a coveted internship at Meta — and turned that opportunity into a full-time E4 return offer (skipping the E3 level, a rare outcome for non-PhD candidates).
But the resilience didn’t stop there. After joining Meta full-time, I continued to push myself — navigating hypergrowth, complex infra problems, and multiple rounds of layoffs. I focused relentlessly on delivering impact in privacy infrastructure and was promoted in no time. Every step required me to adapt, learn fast, and keep showing up with intent.
That entire arc — leaving stability, proving myself in a new domain, and building back stronger at Meta — is a personal example of resilience. It taught me that resilience is not just about surviving hardship — it’s about embracing discomfort in pursuit of long-term growth.
What do you think helped you build your reputation within your market?
In highly specialized infrastructure domains like Ads Privacy, reputation isn’t just about being a great coder — it’s about building systems that are resilient, scalable, and aligned with ethical, legal, and product imperatives. At Meta, I earned trust by owning complex initiatives that required cross-functional alignment — like building purpose limitation enforcement frameworks, scaling privacy-aware data lineage tooling, and contributing to fairness interventions in ad delivery.
I also made it a point to go beyond individual deliverables — mentoring other engineers, writing clear RFCs, challenging assumptions when needed, and helping teams navigate regulatory impact on infra. I kept execution high even during turbulent phases like reorgs and layoffs, which helped me earn credibility not just within my org but across partner teams as well.
What helped most is treating privacy not as a compliance checkbox, but as a systems design problem — something that requires deep thinking, clear abstraction boundaries, and strong runtime guarantees. That mindset shaped the way I contributed, and it differentiated me in a market where responsible tech is now a core expectation.
What sets this work apart is its scale and its ethical impact. We’re not just optimizing latency or throughput — we’re operationalizing consent, fairness, and trust across billions of data points and thousands of models. It’s engineering with long-term responsibility baked in.
I’m most proud of the impact I’ve made on privacy infrastructure that touches billions of people. Being part of a core team helping Meta re-architect how consent, data flow, and fairness are enforced in ads — at a time when tech companies are under global scrutiny — has been incredibly meaningful.
What I want people to take away is this: it’s possible to build ethical, scalable systems that don’t compromise on performance or responsibility. My work — and my philosophy — is about designing infrastructure that can stand up to both engineering scrutiny and public trust.
Contact Info:
- Website: https://github.com/ganeriwalk11
- Instagram: https://www.instagram.com/krishna_ganeriwal?igsh=NTc4MTIwNjQ2YQ%3D%3D&utm_source=qr
- Facebook: https://www.facebook.com/krishna.ganeriwal
- Linkedin: https://www.linkedin.com/in/krishna-ganeriwal/