When Security Involves a Company and an Autonomous Car

David Hechler (New York): She went to college thinking that she wanted to be a lawyer. She even had a work-study job in the law library at the University of Pittsburgh. But her experience there convinced her otherwise. Law students were still using print books for much of their work in the 1990s, and some of the students were “so cutthroat,” she said, “that they would check books out and tear pages out so that others wouldn’t find the information.” She resolved to search for a more “collaborative environment,” and that’s how Summer Fowler found her way to computer science.

There’s overwhelming evidence that it was a good move. After she earned a B.S. in computer science and a master’s in information science (also at Pitt), she went to work. She craved jobs and projects that brought her into contact with consumers. From early in her career, she tried to predict what would be hot—and navigated accordingly. She was looking for challenges and work that would have a national impact. She started at Northrop Grumman and later moved to the Johns Hopkins Applied Physics Lab. But after she and her husband had a child, they decided to return to Pittsburgh to be near their families. So she took a job at Carnegie Mellon University, which is how she got into cybersecurity.

She’s still in Pittsburgh (a bona fide tech hub that she loves), and she’s still affiliated with Carnegie Mellon, but now her trajectory has landed her in the world of artificial intelligence. She’s the chief security officer of a company creating software for autonomous vehicles, where security isn’t just about the safety of data; it can also be a matter of life and death. Which makes an effective partnership between IT and Legal all the more significant. She talked about the importance of communicating effectively across disciplines, and the need for cybersecurity terms of art.

CyberInsecurity News: When you were in college and you decided to major in computer science, where did you think that was going to lead?

Summer Fowler: I knew even when I was in college that I didn’t want to be in the heads-down programming role. I saw it as a means to an end. I have a trend in my career. From the position I’m in today, I want to think about what will be cutting-edge for the next 10 years. When I was in school from 1995 to 1999, I knew that computer science and computers themselves were game-changing. Getting into that field was exciting, because I knew I was going to have some interesting runway in a career.

After I got to Carnegie Mellon in 2007, I was on the software side of the Software Engineering Institute [SEI], and I really wanted to get into the cybersecurity side, because I thought that during the 10 years in front of me, there were going to be some really interesting activities that I wanted to be a part of. So I got into the cybersecurity side. SEI is very much like the Johns Hopkins Applied Physics Lab. It’s a federally funded research and development center [FFRDC] with about 700 full-time employees. It’s really close to the mission of helping to solve a national challenge. We were working very directly with government customers on challenges such as improving Department of Defense [DoD] programs by writing better requirements and upgrading the overall acquisition process. So even though it seems like it would be removed from consumers, it was direct access to people inside the government making acquisition decisions. SEI is sponsored by the DoD, but it works across multiple agencies and organizations. By the time I left there, my biggest customer was the Department of Homeland Security. We were also working with the National Security Agency, Department of Energy, Social Security Administration, Veterans Administration, and with industry. What was unique about SEI was that we were viewed as the bridge between academia, the government and industry, and our role was to bring together the best practices from all of those inside a research and development organization, and then make those solutions available to the greater community.

Landing At CERT

CIN: Carnegie Mellon has an interesting history. It’s the birthplace of CERT [Computer Emergency Response Team]. In fact, it’s the place where the name—or acronym—was coined.

SF: And that’s where I ended up working. Carnegie Mellon is the university. The FFRDC inside the university is the Software Engineering Institute, and CERT is a division of it.

CIN: Did you know all of this history when you decided to go to work at Carnegie Mellon—that it’s a pioneer in computer science and in cybersecurity?

SF: [laughs] No, I was thinking that I would be on the software acquisition side. And I was there from late 2007 to mid-2009 before I decided to go to the cyber side. And that’s when I switched to CERT.

CIN: There was actually a legal dispute over the use of the name CERT. Because it was coined at Carnegie Mellon, but it was a name that was adopted internationally. And still is used internationally. There was a time when Carnegie Mellon said, “Wait a minute. You can’t use that.” And then they decided to let it go.

SF: That’s right. And there are over 200 CERTs in the world right now.

CIN: What led you to opt for cybersecurity?

SF: Again, it was a point in my career where I wondered, “What’s going to have world-changing impact for the next 10 years?” And I looked at cyber and said, “Cybersecurity is something that will have great impact, and it’s something that I want to learn more about. I would like to become an expert in that field in some way.” I had friends who were working at CERT, and a job opened up to help manage its big Department of Homeland Security project. So I talked to them about switching over and running that program. It was more of a program manager role. I started doing that, and then moved into a technical director role over some of the bodies of work inside of CERT, with many more customers than just the Department of Homeland Security.

CIN: And a few years later you started teaching as well, correct?

SF: In 2013 or 2014, I started teaching at the Heinz School, which is one of the graduate programs at the university. I was teaching just one class a week, in the evening. And then I was a co-teacher of a second class. From there we started developing an executive education program, so my current affiliation with the university is twofold. I co-teach a cyber policy course to graduate-level students. And then I’m still very involved in executive education. There are now several programs. One is the certificate for chief information security officers [CISOs]. There’s also a chief risk officer [CRO] program. And we just graduated the first cohort of a program with the insurer Chubb. That’s a seven-month program that teaches insurance professionals about cybersecurity so that they are more educated as brokers. I also helped to design some of those programs. They are mostly synchronous distance classes [the teacher and students interact remotely, but in real time].

Her Next Big Thing Involved Automobiles

CIN: How did you land your current job?

SF: About 18 months ago, I was sitting in my job thinking, “What’s next? What is exciting, and where is the space taking us that’s going to give me 10 years of runway for a really great, challenging portion of my career?” And I knew it was autonomous vehicles. I knew I wanted to be in Pittsburgh. I have a family here. I don’t want to leave the area. I looked around Pittsburgh and thought, “Robotics is a hot field. Autonomous vehicles is a hot field.” I started exploring, and it took about 18 months before Argo posted a position.

Argo AI is an artificial intelligence company headquartered in Pittsburgh. It was founded in late 2016 and received $1 billion in funding from Ford in order to take Ford vehicles and make them autonomous. We don’t build cars; we take vehicles from Ford and put systems and software on them that make them run on their own. The company was founded by Bryan Salesky and Pete Rander, who both have a history in computer science, robotics and autonomous vehicles. They got together and realized that they really wanted to put a focus on the safety of autonomous vehicles. So we have a very focused mission here to build a Level 4 autonomous car—meaning it requires no human intervention—by the end of 2021. That’s the challenge.

At the end of 2016 the company had five people. We now have about 420 people. We are headquartered in Pittsburgh, with offices in Palo Alto and Dearborn, Michigan. We bought a lidar company near Princeton, New Jersey. A lidar—it stands for light detection and ranging—is a sensor on a car that performs remote sensing using light. And then we have depots where we drive vehicles in Miami and Washington, D.C.

CIN: As the chief security officer, you probably bring more to the table than most people in that role.

SF: Part of safety is that the vehicle itself also has to be secure. And our infrastructure as a company needs to be secure. And so my job spans the infrastructure of the company itself, our business operations, our IT, and the platform—the vehicle—and everything that comes in between. There’s a little bit of overlap. You have software developers and hardware engineers who are on the infrastructure side and directly connected to the vehicle. I am currently building a team that is ensuring that security is an embedded part of our culture. In everything that we do, we want to think about the safety and security angle.

CIN: The internet of things (IoT) is getting a lot of attention these days. And some of those “things” present extraordinary dangers, like medical devices and cars. There were instances of cars being hacked a few years ago. And the idea of cars being taken over or a pacemaker or some other medical device being taken over by hackers sounds like something right out of a sci-fi horror movie. But then you realize that it’s real. Talk about keeping you up at night! This has to be a major concern for everybody associated with this project at Ford.

SF: It is. But there are real benefits. We want to make sure that we’re taking care of our environment, the greater ecosystem of the earth, and we can make some improvements through the use of autonomous vehicles. And we can talk about electric vehicles. But then you also think about a societal impact. If you can provide accessibility to transportation to more people, it becomes an equalizer from a socio-economic standpoint. There are people who live in certain parts of the country who just flat-out don’t have access to services or education that would benefit them. To me, thinking about my children being in an autonomous vehicle—because, wow, would I not love to be able to drop my kid into a car that drives him to hockey practice or my daughter to tennis practice. That would be incredible. But it has to be safe. And I know that it has to be secure. So being a part of that solution—it’s something that I’m really passionate about.

Eating Her Own Dog Food—With Benefits For Students

CIN: You’ve worked for companies. Even your jobs in an academic setting have largely involved projects with direct connections to the real world. But not exactly like the one you’ve got now. Has Argo been a real shift for you?

SF: Yes. It has been a tremendous shift. Even when you think of the role I had at CERT, we were writing best practices and providing them to industry and the government. And now I’m not only applying best practices, but I’m responsible for the results of them. I’m doing a lot of eating my own dog food, and finding out that some of it is not as easy to swallow as I’d thought. It’s one thing to say, “Well, in order to have good cybersecurity, you have to start with some of the basics. You need to have a solid asset management program, and you need to put that into a good configuration management database.” It’s easy to say it, and it’s easy to recommend it. And in practice, it’s really difficult. This has been the exciting part of the challenge for me. Now I’m really learning what works and what doesn’t. And so, quite frankly, I think in my relationship that continues with the university, I will be able to provide some great feedback about the things that we’re teaching, and the things that we’re telling organizations to apply, by giving our students use cases of what works, what doesn’t, and how things can be made even better.

CIN: Let’s return to the subject of risk. I loved the movie that won the Academy Award this year for Best Documentary. It’s called “Free Solo.” It’s about a guy who climbed El Capitan in Yosemite National Park without a rope. As a friend of his said in the film, “It’s sort of like being in the Olympics, hoping for the gold medal, but if you don’t get the gold medal, you die.” The risk is such that you can’t have a bad day and live to climb the next. It’s great to be talking about dog food and what you eat, but if you guys make mistakes, someone could die. Now that’s true of other professions. You think of surgeons. But we’re talking about mass production, and the level of intensity and perfection—or something approaching it—has to be daunting.

SF: It is. Although at the same time, when you think about a doctor or any of these other fields, you really just want to start breaking it down to the smaller, bite-size problems. We’re writing software. And then we’re going to integrate that into a system. What are the ways that we can ensure that the software we write—even one line of code—is secure and functional? Then you think about design practices that go into writing good code. But then there’s the human that’s writing it, and humans have different motivations. And then you think about the different controls you put in place around that development process, and how to audit it so that I can say the person writing that code didn’t have any nefarious intent and didn’t just make a mistake. Then you have someone review that code. Millions of lines of code can’t be reviewed by the human eye, so let’s find the best code analysis tool. And then let’s also make sure that we have data loss prevention in place to get to a point where we know that that code is not leaked outside or maliciously made available outside.

CIN: How much contact do you have with the lawyers? And is it a challenge bridging the gap between the tech team and the legal team?

SF: I have daily contact with our internal lawyers. And we have a really great relationship. As I think back to when I started in computer science, a good cybersecurity professional will have good communication skills because, by nature, it’s an interdisciplinary activity. You have to work with attorneys, and you have to work with hardware engineers, and you have to work with end-users. I really enjoy working with our attorneys, because they have knowledge that’s really important for me to understand, and I have knowledge that’s important to them. For example, policy development. I will develop an acceptable use policy—how our IT assets can be used—or a clean desk policy. When people leave for the day, we want to make sure they haven’t left any sensitive Argo data visible on their desks. I’ll write that, and then I collaborate with the legal team to make sure that it meets employment standards. I want to make sure that I have not violated California employment law, for example, because we have an office in Palo Alto. And the attorneys provide that expertise. We learn from each other!

CIN: Is that relationship something you initiated, or had to initiate?

SF: We’re a small company, which makes relationship-building a little easier than in some places. We all wear multiple hats. We all have a lot of job responsibilities. Pete and Bryan have established a leadership team that includes all disciplines that work across the company, and they promote and encourage and demand collaboration across the team.

Lawyers and Tech Teams Have Much In Common

CIN: In some ways, it seems that the tech team and the legal team have things in common. They can both be seen by colleagues as being the Office of No. General counsel are frequently seen as slowing down, obstructing or killing business initiatives. And CISOs are frequently seen as being overly protective of company data—whether the people who seek access are business colleagues or outside vendors.

SF: [laughs] Yes. I do believe that if you don’t take the time to build the relationships with the leads in those areas, you could absolutely be seen as the Office of No. I had an example this morning. I was writing back to someone. They said, “We think that the policy you put in place is overly protective and that it may not be necessary.” My goal then was to get together with that person, explain why this is the position, and find a solution that works for the engineers. My job is not to say no to the engineers. My job is to find a solution that both enables them and keeps the company secure. It’s a lot of negotiation. It’s a lot of weighing the balance of this versus that. I think that negotiation is easier in a place like Argo because we have a singular goal in mind—an autonomous car by the end of 2021. And my job here is to make that software work. The software engineer’s job is to make that software work. The legal team’s job is to make that software work. We’re all just bringing our expertise to that problem.

CIN: For lawyers who work at companies where this is not the case—where there isn’t the level of communication that you just described—what should they do to bridge the gap? Do you call up the general counsel and say, “What are you doing tomorrow for lunch?”

SF: Starting with lunch is awesome. If you get to know people on a human level, everything is easier. You sit down and say, “What are the challenges we’re facing? What are the things we want to achieve together?” If you start from the position of what’s the problem we’re all trying to solve or what’s the goal that we’re trying to meet, then you can work backwards. Instead of starting from a position of “no.”

Argo is a very Silicon Valley-type of environment. We have lunch that’s provided to us every day. We all eat in a cafeteria-style area, and I think that is a tremendous help in terms of discussion. But I have other meetings with the attorneys to discuss a policy that I’ve put in place, or they’ll set it up with me, and we’ll review a policy together. We do that for software decisions, too. For example, the legal team will often send me the terms from a piece of software that we’re going to buy or a company that we’re going to work with, and then they ask me for my cybersecurity take on it. Are these the right terms of service? And then we work together on them.

CIN: Not all people on the tech side are as comfortable speaking in plain English without falling into jargon as you are. You’ve spent quite a bit of time doing podcasts, webinars, short videos. How did you get into that?

SF: It started at Carnegie Mellon. We wanted to have another outlet for the things we were producing and learning. And I wholeheartedly believe—and you’ll see lots of different studies that say—that to be successful in the cybersecurity field, you have to be able to communicate. And I am a firm believer in that. I wanted to make some of the information available to people who needed to hear it, or wanted to hear it, or wanted to learn more. And that was just being a part of the university—making information available.

CIN: Tell me about the role of resilience in this work.

SF: There’s a body of work that was developed by a large group of people at CERT. The concept is that we don’t just want our organizations and systems to be secure. We want them to survive and endure. Resilience means I can operate and achieve my objective before, during and after any sort of disruptive event. That event could be a cyber event. It could be a failure in process. It could be a human error. Once the disruptive event is over, the organization or system can return to full operating capability. You can stretch, you can bend, you can be challenged, but you’re never stretched so thin that you can’t achieve your objective. It’s sort of the mature, grown-up version of security.

A Need For Terms of Art

CIN: How widely has this concept been adopted as an essential element of true cybersecurity?

SF: Cybersecurity suffers from a terminology struggle. We don’t all say the same words and mean the same thing. And that’s really something that’s holding us back. When you think about the financial world, you think about the concept of materiality. When you think about the legal world, there are terms that mean things. “Pro bono” means something. “Precedent” means something.

CIN: Terms of art. Like the abbreviation “cert” [short for certiorari] means something different to lawyers.

SF: Exactly. Terms of art. The cybersecurity and computer science professions have really not settled on terms that everyone agrees on. So resilience as a concept can sometimes mean different things to different people. Every time I talk about it, I give a definition. In the military, resilience might mean that a piece of equipment has a hard outer shell, or that if it’s dropped, it won’t break. I do think that the community at large understands that, no matter what we do, we’ll never achieve 100 percent security. What we want to be able to do is handle those incidents that happen. The concept of resilience is there; it’s just not always called the same thing.

CIN: Should people be working on coming up with a common terminology?

SF: Absolutely. Because if you want to move from a job to a profession, and to something that endures, you have to have that.