John D. Vu

Director, Biotechnology Innovation and Computation

Distinguished Career Professor

Carnegie Mellon University

School of Computer Science

The next big thing: A revolution in healthcare, medicine and biotechnology.

by John Vu, posted on January 14th, 2011.

Several years ago, I participated in a panel discussion at a software conference where an audience member asked me: “What will be the next big thing?” Naturally, cloud computing, mobile devices and social networking, all popular, high-growth areas, came to mind, but after some reflection I realized that these had already emerged and were quickly maturing, and thus, in my opinion, were not the ‘next’ big thing. After some careful consideration I answered, “I’m not certain what the next big thing will be, but I believe it lies somewhere at the intersection of computer science, the biosciences and healthcare.” This answer surprised many of my friends and fellow panelists, as they had expected something more closely related to software engineering or globalization (which was my primary area of work and research at the time). Indeed, many of the panelists (all respected, accomplished experts in software) asked me afterwards to share the rationale behind my answer.

The evolution of computing

I have spent most of my career in software engineering and aerospace. Prior to the mid-1980s, it was not uncommon for the design and manufacture of a new airplane to span a decade or more. A great deal of time was spent creating and testing physical models (wind tunnel experiments) and prototypes (mechanical mockups) through trial and error. Seeing this, I advocated the use of computer software and simulation tools in the design of aircraft and, luckily, had the opportunity to work with a cadre of talented engineers to integrate these tools into the aircraft design and manufacturing process. Not only did software such as CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing) systems allow aerospace engineers to design components better, faster and cheaper, but many time-consuming engineering activities suddenly became trivial, freeing skilled engineers to focus on higher-order problems, which in turn did much to propel innovation. As a result, we were able to incorporate innovative new technologies like fly-by-wire control systems into our aircraft while cutting the time from aircraft design to manufacturing to about half of what it had previously been. These cascading innovations marked a major revolution for the aerospace industry and contributed to the creation of superior products delivered with huge cost savings. Following design, we began to bring advanced computer technologies into the factory, which to date has helped redefine how we build airplanes, manage our inventory and supply chains, and even how we relate to our customers. We became much more “lean” and efficient, gaining a significant advantage over our competitors.

As I look back, I realize that these revolutionary innovations did not occur because of some ingenious breakthrough within a particular field such as aerospace or electronics; rather, they are better attributed to how these fields intersected and used computing to fuel innovation and change. Today, all of the innovations I have mentioned have evolved from disruptive, competitive differentiators into requisite competencies that are now essential to the operations of many organizations.

Indeed, if we look at the evolution of computing over the past several decades, we can observe that computing technologies have led to significant changes across a wide variety of fields. Accounting and finance were among the earliest beneficiaries of computing in the 1960s and 1970s, followed by engineering and manufacturing in the 1970s and 1980s. In the 1990s, the integration of computation and communications technologies made information widely accessible and helped to redefine how businesses managed their inventories, supply chains and relationships with customers. More recently, in the 2000s, human interconnectedness and social life underwent a dramatic shift, due in large part to the ubiquity of mobile computing devices and social networking. Computing magnifies the speed of progress and reveals new opportunities. In pondering the question, “What will be the next big thing?” I began to consider where the next untapped opportunities for computation lie.

Figure: Drug R&D spending vs. drug approvals

The challenge of drug development

In the pharmaceutical industry, there exists a tremendous innovation and productivity gap between the funds spent on drug development and the number of new drugs approved. Adjusted for inflation, pharmaceutical research and development spending increased from approximately $8B in 1984 to $60B in 2005, while the number of drug approvals over the same period remained mostly flat (Congressional Budget Office, “Research and Development in the Pharmaceutical Industry,” October 2006). Though innovations and potential breakthroughs in the industry are incredibly numerous, I believe that immature processes and fragmented datasets, coupled with a long-standing inclination towards research over commercialization, have prevented the available potential from being fully realized.

Seeing this trend and market need, I began to wonder whether the related technologies needed to revolutionize the pharmaceutical and broader healthcare industries were mature enough to facilitate rapid innovation. Looking at recently available technologies, I see that the use of electronic health records has exploded over the last decade (generating a wealth of useful data), advances in genomics are making the cost of personal genome sequencing affordable (allowing for targeted treatment of certain cancers and genetic diseases), cloud computing has made analyzing extremely large datasets much more tractable, and the list goes on and on. Many disparate innovations exist; perhaps it’s time we look at all of them not as individual, segregated tools but as interrelated components of a broader system. Could patient treatment data lead to safer, more effective therapies? Could information collected from active and abandoned research areas be mined to uncover insights leading to faster drug development? Could we tweak and combine existing, dissimilar technologies to create more novel and compelling products? I believe the time for this is now, and that by applying better tools, technologies and methodologies, we can give birth to new innovations that realize many of these opportunities.

Being in the right place at the right time

Over the last few years, I have consulted with a number of companies to identify areas for software improvement and have seen many low-hanging opportunities in this field that are ripe for innovation. I also visited a number of universities to look for related curricula but found that the majority of related programs are focused primarily on furthering very domain-specific research and developing more Ph.D. research scientists, rather than on integration activities, large-scale data analysis, statistical learning, systems engineering and opportunity capitalization/commercialization. In short, very few people are looking at engineering the computational framework required to advance the next generation of discoveries in healthcare and biotechnology. If the evolution of the IT (Information Technology) industry is any reasonable analogy, I believe we can expect an explosive proliferation of software in the biosciences sometime this decade. Because I believe in this emerging trend, I have proposed creating a new Master’s in Biotechnology Innovation and Computation program at Carnegie Mellon University. My vision is to educate and create leaders who can recognize viable opportunities and act on them through the application of innovation and technology.

Forty years ago, Gordon Moore and Robert Noyce started a company in Santa Clara named Intel and helped usher in the modern age of computing with the advent of semiconductors. Thirty years ago, two entrepreneurs named Steve Jobs and Bill Gates built upon these innovations and heralded the age of software and personal computing with the founding of Apple and Microsoft, respectively. Fifteen years ago, Larry Page and Sergey Brin began the work that would forever change how we search for and retrieve information. What will be the next big thing? Something at the intersection of biotechnology, healthcare and computation is a good guess, and CMU is a good place to look for the next disruptive innovation.
