Striking a balance between the need for data and consumers’ privacy is critical to the success of any business, research organization or start-up.
That’s the focus of Jules Polonetsky’s think tank, Future of Privacy Forum, headquartered in D.C. The nonprofit brings together industry, academics, civil society, policymakers and other stakeholders to explore the challenges posed by emerging technologies and develop privacy protections, ethical norms and workable best practices in these areas.
“It’s been exciting the last few years as artificial intelligence, the metaverse, self-driving cars, all of these new technologies create both excitement as well as a lot of stress over what could go wrong,” said Polonetsky, 57, a Potomac resident.
A shofar-blowing member of Beth Sholom Congregation, Polonetsky said inspiration for his work comes from Jewish sources.
“There are just so many great insights from the Bible. You would think it is so distant from self-driving cars, but it’s actually incredibly relevant.”
Although self-driving cars have the potential to reduce accidents overall, some unavoidable collisions will undoubtedly occur. One of the famous challenges for self-driving cars is what’s known as the Trolley Problem, Polonetsky said. It asks whether, when five people are about to be killed by a speeding trolley, it is ethical to divert the trolley onto another track where it would hit and kill just one person.
“There were all these permutations that the rabbis had to evaluate to decide what was the moral thing to do. How did you prioritize a life or not? Did it matter whether you actively did something or whether it happened by default?
“Self-driving cars need a huge amount of information. The cameras need to know who and where and what is on the street. Many of the newest vehicles beep or alert and indicate that they know you are not looking at the road. They have your location if you have mapping services. So there’s a record of everywhere you’ve been and people are increasingly sharing that with insurance companies or maybe law enforcement might want to know where your vehicle is. So there is a whole range of hard and important issues that are becoming very practical for companies who are making decisions on how to use the technology in the car to keep people safer. But it also means having a lot of data that might be perceived as too intrusive. How you balance that is what it’s about. We (FPF) are the sort of centrists in this debate.”
The Future of Privacy Forum has 210 organizational members, from the Apples and the Googles to pharmaceutical researchers, startups and companies building the metaverse. Offices are worldwide, including in Israel, which has become a cybersecurity power.
“Our goal is to ensure that the basic person who’s busy and has other things to do can rely on norms that let most of us go about our business without constantly worrying that something bad may happen. Your phone is the most useful thing for many of us. It’s incredibly reliant on data. It has your context and location. There are apps you’ve agreed to give personal data. Many people don’t think twice about it, but if you look at the privacy settings on your phone, there are settings that show which apps are getting what data. People are shocked when I show them. You’re giving locations to all of them. They’re probably after your data, especially if the app was free.
“There’s a famous saying in my world that if you can’t figure out why the product is free, then you’re the product. You are being sold. It means that advertising is being tailored to you, based on the information you have shared.
“It used to be really complicated, but today you can just scroll through the privacy settings and uncheck any apps that don’t need data for a purpose.”
Privacy issues used to center on marketing and cookies, on tracking. “But today, we are at the center of many of those issues because people care about privacy as a human right, and we work with civil society. Activists are worried about how data is being used. But the researchers are also trying to figure out: how do I study what causes COVID, how it spreads, what long COVID does? We need data to learn. At the same time, it’s data that can get us in trouble, can embarrass us, can get us arrested.”
Polonetsky said he sees “incredible lessons from the Talmud that can be drawn on to change the shape of the decisions that researchers or companies should be making. Whether it’s ethical decisions that need to be made or prioritizing lives, which is relevant to self-driving cars. In fact, the Bible is actually the oldest, clear source for a right to privacy.”