EFF is happy to welcome our newest Staff Technologist Erica Portnoy. Erica is joining EFF's technology projects team, a group of technologists and computer scientists engineering responses to the problems of third-party tracking, inconsistent encryption, and other threats to users' privacy and security online. Erica earned her BSE in computer science at Princeton, and comes to EFF with experience in messaging privacy, searchable encryption, and tech policy and civil rights.
I asked Erica a few questions about her background and what she'll be working on at EFF.
What are you most excited about working on this year?
I'm excited to be working on Certbot, EFF's Let's Encrypt client. We're gradually working toward stability and covering the long tail of use cases. I'm hoping to get it to the point where it just works for as many people as possible, so they can get and install their certificates 100% painlessly.
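For readers who haven't tried it, the typical flow is already close to that goal. On a server running Apache with Certbot installed, for example, obtaining and installing a certificate can be a single command (a sketch of common usage; the exact flags depend on your web server and configuration):

    # Obtain a certificate and configure Apache to serve it
    sudo certbot --apache

    # Later, renew any certificates that are close to expiring
    sudo certbot renew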
What drew you to EFF?
EFF's tech projects team is doing the uncommon work of making direct, concrete, technical contributions to improving people's safety online. Plus, everyone who works here is the nicest person you'll ever meet, which I promise is not logically inconsistent.
What kind of research did you do before coming to EFF?
My previous work involved experimenting with cryptographically enforced privacy for cloud services. So I've worked with ORAM, encrypted search, and SGX, to drop some jargon.
What advice would you have for users trying to secure their communications?
If you are only going to do one thing, use a password manager and diceware. I use the one built into Chrome, with a sync passphrase set up. No one's going to bother exploiting a million-dollar bug if your password is the same as the one you used for a service that was recently breached.
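Diceware means building a passphrase out of words chosen at random, traditionally by rolling dice against a numbered wordlist. As a rough illustration of the idea, here is a minimal Python sketch; the filename wordlist.txt is a placeholder for a real diceware wordlist (such as EFF's), one word per line:

    import secrets

    # Load a diceware wordlist; "wordlist.txt" is a placeholder for a real
    # list (e.g., EFF's long wordlist, where each line is "<dice roll>\t<word>").
    with open("wordlist.txt") as f:
        words = [line.split()[-1] for line in f if line.strip()]

    # Draw six words using the OS's cryptographically secure RNG.
    # (random.choice would not be safe here; secrets.choice is.)
    passphrase = " ".join(secrets.choice(words) for _ in range(6))
    print(passphrase)

With a standard 7,776-word list, each word adds about 12.9 bits of entropy, so six words gives a passphrase with roughly 77 bits.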
But more broadly, this is a hard issue, and the best thing to do is different for every individual. Definitely look at our Surveillance Self-Defense guide for more in-depth recommendations.
On another side of that, what should tech companies be doing to protect their users? How can users hold them accountable?
Especially now, companies can't absolve themselves of responsibility for their users by claiming, "Well, high-risk users shouldn't be using our product." If a company makes a product that people in high-risk situations use, it has a duty to protect those users by offering security features they can turn on.
But that's the bare minimum. A system should neither compute nor retain information that could harm its users, and organizations that might have this data must also fight to protect people on a legal front.
As for users, making your voice heard will inform design decisions. Leave a one-star review on an app distribution platform, like the Play Store or App Store, and include specific details of how a design decision in the product harms your safety or the safety of those you care about. Do the same thing on Twitter. It's hard to prioritize features without knowing what people want to see.
How much are you loving EFF's dog-friendly offices?
90% of why I'm not a TGIF person is that Neko doesn't come in on Fridays. The other 10% is that Neko won't be there on the weekend, either.