Dr. Chris White, Principal Researcher. Photo courtesy of Maryatt Photography

Episode 27, June 6, 2018

When we think of medals, we usually picture them over the pocket of a military hero, not over the pocket protector of a computer scientist. That may be because not many academics end up working with the Department of Defense. But Dr. Chris White, now a Principal Researcher at Microsoft Research, has, and he's received several awards for his efforts in fighting terrorism and crime with big data, statistics and machine learning.

Today, Dr. White talks about his 'problem-first' approach to research, explains the vital importance of making data understandable for everyone, and shares the story of how a one-week detour from academia turned into an extended tour in Afghanistan, a stint at DARPA, and, eventually, a career at Microsoft Research.

Transcript

Chris White: I got approached to work on this very short-term project in Washington, D.C. And I said, 'No thanks.' I got asked a second time. I said, 'That's great, but no thanks.' And then the third time, as kind of a personal favor, and so I said, 'Yes, of course.' Turns out I went down for one week. One week turned into two weeks. Two weeks turned into three months. And instead of going back to Harvard, I went to Afghanistan.

(music plays)

Host: You're listening to the Microsoft Research Podcast, a show that brings you closer to the cutting edge of technology research and the scientists behind it. I'm your host, Gretchen Huizinga.

Host: When we think of medals, we usually picture them over the pocket of a military hero, not over the pocket protector of a computer scientist. That may be because not many academics end up working with the Department of Defense. But Dr. Chris White, now a Principal Researcher at Microsoft Research, has, and he's received several awards for his efforts in fighting terrorism and crime with big data, statistics and machine learning.

Today, Dr. White talks about his 'problem-first' approach to research, explains the vital importance of making data understandable for everyone, and shares the story of how a one-week detour from academia turned into an extended tour in Afghanistan, a stint at DARPA, and, eventually, a career at Microsoft Research. That and much more on this episode of the Microsoft Research Podcast.

(music plays)

Host: Chris White, welcome to the podcast.

Chris White: Hi, Gretchen, thanks.

Host: So, you're a principal researcher at MSR, and you work on special projects. We'll talk about that in a second, what that means. But for now, let's talk in general about what gets you up in the morning. What are the big problems you're working on and the big questions that you're asking?

Chris White: Well, there's a bunch of questions and problems worth looking at. And from my point of view, I looked at problems that were happening in society, problems that technology companies could build technology to address, technology that could also have a business purpose. And it sort of inspires me in lots of ways. One way to think about it is, you know, what does terrorist financing, human trafficking, propaganda, ransomware - what do those have in common? Well, those are all things that appear online. They appear in the vastness of big data and the darkness of usernames. And so, if one were trying to address those problems, technology would be a helpful aid, and a company like Microsoft, who has a huge technology platform and has the responsibility to maintain trust worldwide, would be a great place to work on it.

Host: So, the term 'research' covers a broad range of approaches to discovery, or, more colloquially, finding stuff out. And the special projects approach you bring to the mix here, by way of the Defense Advanced Research Projects Agency, or DARPA, is a little different from traditional academic research. Can you talk about that? How is this approach different? What advantages does it offer for specific kinds of research questions?

Chris White: Sure. One way to think about research is not knowing what you're doing. That's why it's called research, right? And given that, there are many ways to approach solving problems that you don't know how to solve. Sometimes the question depends on the scope of the problem. Sometimes it depends on the maturity of the approaches to solve the problem. Sometimes it depends on the state of society and its ability to adopt and use and afford solutions to problems. And so, at DARPA, the way it's approached is by identifying the problem first. And then understanding how to organize money, technology, talent, people, organizations… to best execute against solving that problem. And that usually means that you assemble lots of different kinds of people, as opposed to a classical view of research from the academic point of view where you study a problem, you write papers that are reviewed by your peers, and you advance the field by that kind of approach. In the projects kind of approach, you bring together people with different skills, and then you organize them to approach a problem with larger scope than you could address as a single person. And the hope is that you can do something impactful.

Host: Yeah. Do you find purchase with that method, as they say in the academic world?

Chris White: Well, for sure. In my world, we work on data analytics. We work on how to enable people to interact with computers to make decisions from information. We want them to have help from AI. And we focus on how to help them organize and structure information, and how to help them interact and visualize that to make decisions. Doing that requires people with different backgrounds. It requires user interface application developers. It requires big data distributed computing developers. It requires people familiar with machine learning and artificial intelligence techniques. Any one of those people can address part of the problem, but to address it end-to-end requires organizing them together. And so, that's the projects approach we take here.

Host: OK. Let's talk about data science, writ large, for a minute. The massive amounts and multiple sources of data that we have, today, have prompted a need for big data analytics and data visualization tools. So, what are the specific challenges of big data problems, and how are you tackling them?

Chris White: Well, 'big data,' much like 'artificial intelligence,' is a term that's vague. In fact, it doesn't mean anything. Neither big nor data. They're not qualified. Just like artificial intelligence. And so that's good and bad. You know, it's good because there's a movement. That movement has funding and interest from policymakers. It has the need for understanding implications. But that movement is still very large and very vague. And so, I think about big data, really, in terms of publicly accessible information, as a starting point, because that's something that people are familiar with. They've all gone to a search bar. They've all issued a bunch of queries. They've all had a bunch of browser tabs open and had that familiar feeling of, gah, there's just like a lot of information out there. How do I find what I need? How do I organize it? And when that problem is a business problem, it's even bigger. And so, I think of it like that. Sometimes there are images, like an iceberg, where what you see from a search bar, what you see if you did a Bing search for a product or a celebrity or an event, and you get a list of links and an answer card, people think of that as data interaction. And it's true, but behind that there's a lot more. There are databases. There are APIs with streams like Twitter and Facebook. There are public records from FOIA results. There are all kinds of things that you have to have a different kind of skill to access. And a lot of the big data approaches are about how to use technology to access that information, and how to present it to people so they can make use of it. One way I think about it, sometimes, is I go all the way back to the beginning of computers. 1830s. Charles Babbage is envisioning a computational machine of some kind. In the end, one gets built. He calls it the 'difference engine.' And really, that was an engine that let you compare numbers.
But what is happening now is the same fundamental operation of comparison, but it's not zeros and ones. People want to compare concept-like entities. They want to compare events. They want to understand the reaction to things, and those are the kinds of questions you can answer with big data. It's not a fact. It's more understanding the situation, understanding what's happening, who's involved, how to do the analysis. And those are the empowered abilities we want our users to have.
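The shift Dr. White describes, from comparing numbers to comparing concept-like entities, can be sketched in a few lines of Python. The event descriptions and the Jaccard token-overlap measure here are illustrative assumptions, not part of any system he mentions:

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two token sets: 1.0 = identical, 0.0 = disjoint."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# A difference engine compares numbers directly...
assert 3 < 5

# ...whereas entity comparison works on fuzzy, word-level descriptions.
event_a = {"protest", "downtown", "saturday", "march"}
event_b = {"march", "downtown", "rally", "saturday"}
event_c = {"earnings", "quarterly", "report"}

# The two protest-like events are far more similar than either is to the report.
assert jaccard(event_a, event_b) > jaccard(event_a, event_c)
```

Real systems use far richer representations than bags of words, but the fundamental operation is still comparison, just over concepts rather than digits.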

Host: What's your particular focus in the machine learning space?

Chris White: Right, so. I view the process as one where people are more or less doing the same thing they've always been doing in the world. But what's changed is now we have lots more partial and noisy observations about behavior. And with that, we can start to infer what was going on and what to do about it, what's changing. And so, we view that process through the lens of data analysis. How do we take in lots of partial, noisy observations from streams of sensors like social media, like news, like webpages and documents, internal reports, measurements of various kinds, to organize them in a way that lets people understand what might have been going on and to understand what might be related, and what they might be able to do about it? And so, we employ methods of graph analysis and graph statistics to posit a data-generating process that we can measure, and then to evaluate that as a good representation. And then to bridge to the user, we find that that's not enough. So, we need ways to visualize and explain it. Those require advances and inventions in that visual space of representation and in the space of interaction. And so, we have focused in that space as well. So, I'd say that, in a way, my home is in machine learning and information theory. And I'm a tourist in the HCI space. But really now, there's like a second home. When I went overseas, I was there to do a machine learning problem. It turned out that I needed to do visualization and HCI, because people wouldn't use the results if they didn't understand them. People wouldn't take advantage and take action on information if they couldn't interrogate it. And so, very quickly, that process of visualization, of interaction, of application development, became as important, if not more so, than the machine learning algorithms to transform data into a data structure for use.
And so, we continue focusing on advances in modeling that are more realistic, more efficient, and more expressive, as well as advances in HCI that are more representative, that offer more points of view for interaction, and that allow different kinds of users to understand things more quickly, with less training material and fewer tutorials. And those are the basis for our research.
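The graph-analysis step described above, turning partial, noisy observations into a structure with measurable statistics, can be sketched with toy data. The usernames and the co-occurrence heuristic are invented for illustration:

```python
from collections import defaultdict
from itertools import combinations

# Partial, noisy observations: which entities co-occur in which documents
# (e.g. usernames appearing together on the same webpage).
observations = [
    {"user_a", "user_b"},
    {"user_a", "user_b", "user_c"},
    {"user_c", "user_d"},
]

# Build an undirected co-occurrence graph; edge weight = number of co-occurrences.
weights = defaultdict(int)
for doc in observations:
    for u, v in combinations(sorted(doc), 2):
        weights[(u, v)] += 1

# A simple graph statistic: weighted degree, a rough proxy for centrality.
degree = defaultdict(int)
for (u, v), w in weights.items():
    degree[u] += w
    degree[v] += w

# user_a co-occurs most often, so it has the highest weighted degree.
assert degree["user_a"] == max(degree.values())
```

From a structure like this, one can start asking the analysis questions in the interview: who is most connected, which clusters exist, what changed over time, before handing the result to a visualization layer.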

(music plays)

Host: Let's go back a little bit. You have a fascinating background. Talk about that for a minute.

Chris White: Well, I grew up in the Midwest, in Oklahoma, and focused on electrical engineering, and over time became more and more academic and got a PhD and did a postdoctoral fellowship, was planning to be faculty. That's kind of what you get taught to do when you're in that kind of schooling. And I got approached to work on this very short-term project in Washington, D.C. And I said, 'No thanks.' I was really enjoying the summer in Cambridge. I got asked a second time, said it was good for my professional development. I said, 'That's great, but no thanks.' And then the third time as kind of a personal favor, and so I said, 'Yes, of course.' Turns out I went down for one week. One week turned into two weeks. Two weeks turned into three months. And instead of going back to Harvard, I went to Afghanistan. And that began a very odd detour into fieldwork, where the reality of people using technology to understand information and make decisions really dramatically affected me and the way that I think about problem solving and the way I think about research. That led to several years of understanding and fielding technology for use in the Middle East, understanding how people adopt technology, what they need, how they prioritize their problems. And then all of those lessons got to come back with me and be part of major investments for DARPA. That included XDATA, which is a program that was part of President Obama's Big Data initiative. It was the lead project for DARPA. And Memex, which is titled after Vannevar Bush's famous article, As We May Think, where he anticipated Wikipedia and hyperlinks, and he describes this machine called a Memex that lets you dig into information and make sense of it. And the Open Catalog, which lets us publish and share the results from research funded by taxpayers at DARPA.

Host: Nice.

Chris White: In all of these, my view is that if we're investing taxpayer dollars, unless there's a compelling security reason to keep something secret, we should make it free and easy to access. And those projects - those were opportunities to invest in small companies, large companies, universities, to bring both general purpose technology, but then to bring it together to solve important problems like human trafficking and terrorism financing.

Host: Let's talk about that for a minute. You went from doing military work in Afghanistan to digital crime fighting. Can you talk about that era of your life for a minute?

Chris White: Sure. Many of these technologies, data analysis technologies, machine learning technologies, human computer interfaces - these are very general. Again, they're almost like utilities from my point of view. That's why having them as open source projects makes a lot of sense to me, or having them as low-cost products makes a lot of sense to me. And the observation coming back from Afghanistan was that many parts of the government had similar problems. And there were many law enforcement and related organizations that had similar problems. And so, we decided to pick a few of those and focus on them as applications, knowing that if we could address those as well as build general purpose technology, then other people might help apply them to other problems. And coming from the government to Microsoft, there was a real opportunity. And therefore, we thought, this is a great place to work on that problem. We have a digital crimes unit. We have the ability to apply those techniques we've learned. And we have the technology platforms that Microsoft has. So, we took a shot at it.

Host: And?

Chris White: Well, the first problem we worked on with the digital crimes unit was a cousin problem to ransomware. There's a version of that problem called tech scams. And it works like this: you're on your computer, and either you click on something or you get an email or somehow your computer gets into a state where there's a popup that says, whoa, hey, you have a virus. You need to call tech support. And here's the number. Call it. Now, first of all, none of you should ever do that, ever call tech support if you're prompted to from a computer. It doesn't work that way, and you're going to get in trouble. We get tens of thousands of complaints, in writing, a month where people say, there was this thing that popped up on my computer. Popups, popups, popups, popups, wouldn't go away, wouldn't go away, wouldn't go away, couldn't delete it, couldn't get rid of it. And then what happens? Have to buy security software, have to give credentials, have to lose control of the computer, someone using it remotely. All these kinds of things. And it's just a difficult situation. And so, this problem, it's pervasive. It's something like, you know, one in twelve people that were interviewed had this problem happen to them, and of those, many of them were ensnared. And so, we took on this problem and the way we approached it was by building a web-scale crawling architecture to find where all of these scams are happening on webpages anywhere.

Host: And this is not easy?

Chris White: No, this is at the same scale as crawling the web to build a search index. It doesn't have the same difficulties in one sense because you're not enabling millions or billions of people to access that simultaneously, which is a very hard operational problem. You're talking about, instead, the analysts at the digital crimes unit or the Federal Trade Commission that we cooperate with, or other internal groups. So, it's a smaller number of users, but the size of the data is still very large…

Host: More needle in the haystack kind of problem?

Chris White: Yeah, it's… exactly. It's more targeted. And so, with this problem, it's a crime problem, but we need something that people can use. How can they understand the vastness of this tech scam problem? How big is the problem? Where is it occurring? How many scams? How organized are the scams? These are the kinds of analysis questions that one can answer with big data access and these kinds of tools. But we have to build them. And so, we built a web crawling, distributed, back-end infrastructure that would find where these scams were happening online. One of the challenges was to find them as they were happening and to capture that, because one detail around law enforcement and digital crime is you have to have evidence. By building the ability for people to organize this kind of content, organize it with provenance, with comparability, with the ability to query and reason, we were able to then start to build tools that analysts could use. The second half of that problem was, okay, now we maybe have found all of this content. Maybe we've started to use machine learning and artificial intelligence to structure it. How do we make that available to people? How do we make that artificial intelligence visible and usable? Well, we have to build a bridge. And that bridge is through user interfaces or through HCI approaches in general. And so, we had to build many of those and organize them into an application and make that available to these analysts. The outcome, though, was satisfying. The outcome was that we worked with the Federal Trade Commission on Operation Tech Trap last year. We were able to supply them with the appropriate and relevant information to contribute to indictments. And they levied several indictments and raids. One of them involved a group in Ohio that had been defrauding 25,000 people of $40 million.
So, the ability to go end-to-end, to identify the problem, to organize the technologies to find relevant information, to make it accessible to an analyst, and then to translate those results into action, that, to me, is the real challenge of the modern era of computing using data and evidence-based decision-making. And so that's why our research focuses on both that organizational aspect, but then also, how do you present it? How do you make it navigable? How do you make it understandable and cheap?
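The evidence requirement Dr. White mentions, capturing scam content with provenance so it can later support indictments, can be sketched in miniature. The scam-detection patterns and record fields below are simplified assumptions for illustration, not Microsoft's actual detector:

```python
import hashlib
import re
from datetime import datetime, timezone

# Heuristic markers of a tech-scam popup page (illustrative only).
SCAM_PATTERNS = [
    re.compile(r"your computer (has|is infected with) a virus", re.I),
    re.compile(r"call (tech|microsoft) support", re.I),
    re.compile(r"\b1-8\d{2}-\d{3}-\d{4}\b"),  # toll-free callback number
]

def capture(url: str, html: str):
    """If the page looks like a scam, record it with provenance for evidence."""
    hits = [p.pattern for p in SCAM_PATTERNS if p.search(html)]
    if not hits:
        return None
    return {
        "url": url,
        "sha256": hashlib.sha256(html.encode()).hexdigest(),  # tamper-evident hash
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "matched": hits,
    }

page = "WARNING: your computer has a virus! Call tech support at 1-800-555-0199."
record = capture("http://example.test/popup", page)
assert record is not None and len(record["matched"]) == 3
```

The point of the hash and timestamp is the provenance requirement: an analyst can later demonstrate exactly what content was observed, where, and when, even after the live page disappears.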

Host: It seems like it would be satisfying as well. I mean, you've got all of it, this huge problem, and then the outcome is good. People were able to catch the bigger fish using those kinds of tools.

Chris White: Absolutely. With those tools, those are the kinds of questions you can ask. What is the biggest fish? What is the most recent fish? Using that to stay ahead, and work at the speed and scale of those that are doing exploitation, that's the opportunity. And we're in a good position to do it.

Host: So, let me ask you this, because it sounds like you're matching wits. I mean, the people that are doing these scams, do they employ guys like you, to do the bad stuff, and then you've got this cat and mouse game of who's going to stay ahead of whom in the big tech picture?

Chris White: One of the lessons I learned in Afghanistan was that people are capable of a lot. People see movies, they read books, but very few people really are exposed to what humans are capable of doing to each other and for money. And so, the problem you mentioned, this problem of keeping up with the adversary, is also one of the limitations of the work that we've done with the digital crimes unit: it still has a little bit of a whack-a-mole kind of approach to trying to fight crime and catch bad people. And that is useful, for example, it's useful for deterrence. It's useful for characterizing the problem. But it has limitations in terms of the longevity of its impact. And so, from my point of view, the other reason we focus on the general-purpose-ness of the technology is because the real impact in those problems is likely to be won through economics, through understanding how people are making money doing this stuff, and how to then use that as a way to approach a more systematic way to deal with the problem. And so, given all of that, to me, it allows us to talk about going beyond the whack-a-mole approach, beyond the case-by-case approach, because if we think about how spheres of influence, like cyber and information, are being used systematically, then we can start to approach them with tactics and understanding. For example, if we now see that there are organizations that are trying to influence groups of people, we can ask questions like, how long does that take? What kind of measurement system would we need to understand that? And if we had such a system, a big data and technology measurement system, how might we use it to protect ourselves?

Host: It sounds like new fronts and new frontiers.

Chris White: Well, I think what we're seeing with the rise of the cloud, and with the pervasive increase in sensors collecting and storing information, is we're seeing how people are starting to use that. And in the end, a lot of this really is about people, and the way they're using it to make money and exploit each other is something that's really happening. Just like people are also using it for business purposes, for normal everyday life, for just getting their work done. And so, we do have to acknowledge that and make sure that we can protect our platforms as a company built on trust. As the 'designated driver' of the IT industry, as was recently reported. And then, at the same time, when we do see these things happening, how can we make sure to empower the people that are protecting us with the tools they need to make decisions using information?

(music plays)

Host: You've been called 'The Man Who Lit the Dark Web.' Despite the sensationalism of that headline, how did you do that? How did you light the dark web? I mean, what was the context for why they said that? That's maybe a better question.

Chris White: Well, as I mentioned, with big data in general, and the rise of publicly accessible information, and the way it's being used now for both exploitive purposes as well as constructive purposes, we were trying to understand the right place to start applying the technology we were investing in from DARPA's point of view. And we looked around, and we found, to our surprise, that there was a tremendous use of the internet, and communication networks on the internet, as a mechanism for connecting buyers to their products, where the products were people. And that's the way I talk about it, because in the end, it does seem to be a function of economics, that there is a demand for products, and they're willing to pay for them. And there is a supply, and those are people who are willing to take risk from law enforcement in order to make money meeting demand, and the way that the internet is used for advertising and connecting the buyer and the product is where there's both an opportunity, as well as a use that seems suspicious. And so, when people are going online, and they're doing these searches, some of them are not looking for anything in particular, and then those that are trying to look for something a little bit more risky, or a little bit more dangerous, start to find places online where they are sought out. And so, our view was that these were places where you could operate with relative impunity, because no one could see what you were doing. If we could start to make it available for people to see what you're doing, even without judging exactly whether what you're doing is good or not - because that's what our, you know, legal system is for, to help us arbitrate those distinctions - then at least people could have the evidence to know what you were doing and then decide whether it's worth prosecuting under our legal system or not.
And that was the big opportunity, it was that this darkness of different types of networks of web pages, of usernames and large databases of FOIA documents and leaks, this darkness was something, where if we could make the information within it visible to regular people, subject matter experts, then maybe they would do something about it. And so, we took on the worst of the worst: people who were abusing children and who were abusing women and men in labor and sex situations. And they were doing it a lot. And without really much consequence. And so, that was why we picked that problem, and I think that we had a good impact, although there's still a lot of work to do.

Host: I really like the framework of illuminating. Simply by putting a light on something, and then allowing people to discern and follow up if they can. Right?

Chris White: Oh, for sure. Well, and when things are dark and when things are vague or unknown, they can be scary because you don't know their qualities. And once you start to make something visible, then you can operate on it. You can ask questions like, okay, we're in, you know, New York and we have a special victims bureau, and there are 500,000 ads for sex in a year in this jurisdiction…

Host: Unreal.

Chris White: How do we prioritize who to go after? How do we complement that work with work in domestic violence? How do we understand what hospital staff are needed, what victim outreach services? These are analysis questions. Analysis questions require the granularity to answer them, and from our point of view, a lot of that was available online. And so, if we could enable people to have access to it in an understandable way, then they wouldn't have to make those decisions by gut instinct or by precedent or by highest-paid opinion. They could put it into a framework that let them evaluate it. And that way, the decision wasn't even permanent. They could make a decision. They would then have a measurement system to see the effect of their decision. And then they could decide to keep doing it or not. That, to me, was a much more maintainable, workable situation.

Host: Yeah, and this is a distinctly digital problem. I mean, the ways that people communicated about exploiting people or themselves for money in the past were much more, you know, seeable. And so now what you've done is transfer that into this digital realm and you're doing fingerprint analysis in cyberspace.

Chris White: Absolutely. If we look at parts of the world where physical security is still an issue, you'll see tall walls and razor wire. You'll see bars. You'll see people with machine guns. In the information space, in the cyber space, no such protections really exist, even here. I mean, very few. There's a large burden placed on platform companies to protect our customers and their data. And for now, if we want to proceed with business, if we want to have a market where people have value assigned to the services we offer, including the trust of our platform, then we have to protect it well. It's often the case, especially in the research community, that people think of users in terms of 'novice' and 'advanced.' I just think that's the wrong approach. There are not novice and advanced users. There are really technical experts and domain experts. And domain experts know a lot about what they're doing. They know their patrol. They know their patients. They know their company. They know the issues. They may not be comfortable with random variables, or different AI techniques, but they certainly have been doing something well for a while. And those are the kind of people that we have to enable, that we have to protect, we have to provide information to. And understanding that, then, also affects research, because you then design and build for those people, not just based on what the literature says is an innovation.

(music plays)

Host: Let's talk specifically about the background you brought: the same fundamental technological thinking, applied in different areas. It's been useful in law enforcement, useful in the military, useful in digital crime. How is it playing out now?

Chris White: Well, our approach to applied research is to take a look at organizing data on one side, creating data structures, using graph modeling, graph analytics, graph statistics, using natural language processing and computer vision… basically turning streaming access points of unstructured information into a data structure that one can work with, and at the same time, bridging the gap to the user through user interfaces, through HCI that has AI enabling it. That combination is possible partly because of our cloud, but also because of Power BI. Power BI is an Excel-like tool for business intelligence, ostensibly, but it's grown, and growing, into more of a platform for data analysis. The business intelligence community was one that had a 'dashboard approach' to looking at graphs of numbers and status updates. But it also was a marketplace where people were going to be doing more complicated kinds of analytics. And so, we used that as our approach to organizing our research. And so, we decided that we could take that business intelligence market and expand what that meant, so that it didn't just mean tables of numbers, but it meant metadata and graphs and streams of content. To do that, we took the two areas of our research, streaming graph analytics and visual analytics with user interfaces, and we built them into Power BI. We built them by enabling end-to-end processes that would transform data using AI techniques, and we built them using new visual representations for interacting with content. That was something that allowed us to bridge the gap between the abstract notion of AI, the API notion of AI, the algorithmic notion of AI, into more of a real application experience. And the outcomes were quite useful. It was a way that we found to take the abstract notion of AI and make it approachable and workable with something that you can download and see and start to work with.
But one part of the impact was, to me, very important: if the thing you're building in research can only be used by a Fortune 10, and costs millions of dollars, and requires a PhD in computer science, then there will be limited impact by definition, almost. If, instead, we want to enable a billion people, how do we get a billion people to understand how to use data? It's got to be cheap, and it can't require a PhD in computer science. And so, to me, that outcome, while it seems like an economic or a business issue, is actually to me a research issue because it helps confirm that, as a priority, we can still build useful things, but have the constraint of low cost and ease of use.

Host: I've heard Microsoft Research described as a values-forward organization, and I hear that over and over when I talk to the researchers in this booth. It's interesting where you can marry business interests with global good interests. How could we move toward making sure those two stay married?

Chris White: Well, my dad always tells me that our goal is to do well by doing good. And with Microsoft, there's the opportunity, because of its position in the marketplace, because of its size, where things like trust, things like responsibility, those things are core to our business interests. They're not just company values. But for me, you know, on the research side, I often ask researchers, you know, what's a high-risk project for you? Like, how do you think about risk? Because to me, research is where you should take risk. Which means that you should do things that product groups can't do, can't afford, that aren't on their roadmap. If we take research, what we can do is we can take risk and then we can, if successful, make sure that the rest of the company benefits. And so, this problem of, how do we do well in the world, how do we address impacts to society, how do we approach problems that may have a non-market value at first - well, research is a great place to take that risk, because, when successful, it helps the company in many ways. And there's room for all kinds of people. Sometimes I think about complicated research as like baseball pitching, where, with baseball, occasionally, one person can pitch nine innings and succeed, but often, you need three pitchers: a starter, a middle reliever, and a closer. And they often have to have different skills. And so, in research, the starters, these are people that have really original ideas that are groundbreaking. You know, people like Yoshua Bengio, who works with us on deep learning. They set the field for everyone else to work in. Middle relievers, they take work and then they advance it to a usable degree, maybe. And then closers, they have to have the patience to deal with the tediousness of deployment and fielding, and the reality of operations, right? All those skills are very necessary to stay innovative and so, we need those different people.
And so, to me, having a large research organization, almost an institution, comprised of these different kinds of people, is the best chance we have to stay innovative, to stay on the edge of what's relevant of society, and to make sure that Microsoft has businesses in the future.

Host: Proving once again that there's a baseball analogy for everything. Listen, I ask all the researchers that come in this booth, what keeps you up at night? I'm not even going to ask you that question because most of the stuff you said at the beginning keeps me up at night. But I do want to ask you, as we close, I wonder if you could give some advice to aspiring researchers who might look at you and say, 'What should I be thinking as I plunge into what I'm going to do after my PhD?'

Chris White: Right! My point of view is that being flexible, being relaxed, having an open mind - those are really important characteristics. Right before I went over to Afghanistan, a former three-star general gave me a piece of advice. He said that when I was in charge, if I was organized, if I was on top of the situation, then I could really see an opportunity walk in the door. I could see it, I could take advantage of it, and I could execute on it. But if I was too concerned with my position, with my career, with what I thought of myself, with my identity, then I would miss it. And I really took that to heart. And it's not that there's one way to approach any of these things. But I do think that, given the pace of technological change in computer science, and given the changing roles of companies, governments, and academic institutions, it's very important to have that kind of flexible attitude, because the thing you studied for a long time might be irrelevant in three years, and the locus of action might shift from one kind of institution to another. And so, my point of view is to roll with that, to take advantage of those opportunities, and to try to make it about the work. Because when I make it about the work, then it's not about me, and we can debate about the work in ways we can measure. And then other people can contribute things, and if no one cares where that comes from, then the work can proceed. And that's, to me, also why I left DARPA. I wanted to leave in time for the work to survive my own point of view, because if it were good enough, then it should. And if it requires a single person's personality or oversight, then it's fragile. And so, I would encourage any prospective researcher to take a broad view of research and to avoid getting stuck too much on how they think of themselves and their career.

(music plays)

Host: And to say yes on that third ask.

Chris White: Yeah, 'Yes and!'

Host: Yes-and. Chris White, it's been a delight. Thank you so much for coming in and talking to us today.

Chris White: Happy to. Thanks for having me.

Host: To learn more about Dr. Chris White, and how data science, AI and the cloud are solving big data problems at scale, visit Microsoft.com/research.
