Distinguished Career Professor
Some of you have asked me about the relationship between biology and information technology, two seemingly very different fields, and about our approach to the Biotechnology Innovation and Computation program.
The field of biology has changed significantly, driven in large part by the Human Genome Project. We now view biology as, more or less, an informational science. DNA is essentially a digital four-letter language that stores information on chromosomes. Genes express themselves much like computers executing software programs, and these expressions integrate with one another to create informational pathways that signal the body to perform particular functions.
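To make the "digital four-letter language" analogy concrete, here is a toy sketch in Python of gene expression: DNA is transcribed to messenger RNA and then translated, three letters (a codon) at a time, into protein. The codon table below is a small real subset of the standard genetic code, included only for illustration.

```python
# DNA as a four-letter digital code: a gene "expresses" itself by
# being transcribed to RNA, then translated codon-by-codon into protein.

CODON_TABLE = {
    "AUG": "Met",   # start codon
    "UUU": "Phe",
    "GGC": "Gly",
    "UAA": "STOP",  # stop codon
}

def transcribe(dna: str) -> str:
    """DNA -> messenger RNA: the letter T is replaced by U."""
    return dna.replace("T", "U")

def translate(mrna: str) -> list:
    """Read the mRNA three letters at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino = CODON_TABLE[mrna[i:i + 3]]
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

mrna = transcribe("ATGTTTGGCTAA")
print(translate(mrna))  # ['Met', 'Phe', 'Gly']
```

The analogy only goes so far, of course; real gene expression is regulated by context in ways no lookup table captures, which is precisely why the informational view of biology is such a rich research area.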
Anything that goes wrong with these expressions or pathways can create problems or disease. By understanding how genes work, we can look at diseases from a different perspective. By understanding in detail which genes predispose a person to disease, we can predict aspects of that person's health and the associated probabilities, which could change the way we treat disease today.
Within our bodies, billions or trillions of these interconnected informational pathways form networks. To understand this view of biology, we must understand the information at different levels, from cells to organs, so we can understand how they work. If we understand these informational pathways well, we may be able to alter them and thus prevent a disease from occurring. This is what drug discovery is all about.
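The idea that altering a pathway could prevent disease can be sketched as a simple graph problem: nodes are genes or proteins, edges are signaling steps, and blocking one node (as a drug might) can cut the path from a trigger to a harmful response. The network below is entirely invented for illustration; real signaling networks involve thousands of interacting molecules.

```python
# A toy informational pathway as a directed graph. Blocking a node
# models a drug inhibiting one protein in the signaling chain.
from collections import deque

PATHWAY = {
    "growth_signal": ["receptor"],
    "receptor": ["kinase_A", "kinase_B"],
    "kinase_A": ["cell_division"],
    "kinase_B": ["cell_division"],
    "cell_division": [],
}

def reaches(graph, start, target, blocked=frozenset()):
    """Breadth-first search: can a signal travel from start to target?"""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for nxt in graph.get(node, []):
            if nxt not in seen and nxt not in blocked:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reaches(PATHWAY, "growth_signal", "cell_division"))   # True
print(reaches(PATHWAY, "growth_signal", "cell_division",
              blocked={"receptor"}))                        # False
```

Notice that blocking either kinase alone would not stop the signal here, since the pathway is redundant; this kind of reasoning about network structure is exactly where computation earns its keep.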
Based on this, I believe the convergence of biology and information technology is inevitable, and that we need to study the two fields together to create new opportunities. How we think about biology, and how we deal with, integrate, and model the explosion of biological information, are fundamentally information technology (IT) problems. I believe future bioscientists will need IT knowledge and skills for their biological research. If we can describe biological systems through mathematical models, we can predict emergent properties and make breakthroughs in the way we deal with disease.
Of course, the human body is a complex system, and we have only just begun to explore it from this new informational angle. Today, every research lab holds enormous amounts of information and raw data, some stored in databases and some on personal computers, and much of it has never been integrated, mined, organized, or categorized for further study. This is where I think information technology can add value.
Our brains are limited; we cannot process large and complex bodies of information, but a computer can. It can process billions of data points and sort through thousands of articles to find what we are looking for. With computers and sophisticated algorithms, we can save significant amounts of research time. Instead of spending weeks or months hunting for the information we need, we can use software to deliver the right data, at the right time, to the right people.
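As a minimal illustration of how software can sort through many articles to surface the relevant ones, here is a hypothetical keyword-overlap ranker. The article titles and scoring scheme are invented for this sketch; real literature-search tools use far more sophisticated ranking.

```python
# Rank documents by how many query terms they contain, and
# return the top matches, most relevant first.

def score(document: str, query_terms: set) -> int:
    """Count how many query terms appear in the document."""
    words = set(document.lower().split())
    return len(words & query_terms)

def top_matches(documents: list, query: str, k: int = 2) -> list:
    """Return the k documents sharing the most terms with the query."""
    terms = set(query.lower().split())
    ranked = sorted(documents, key=lambda d: score(d, terms), reverse=True)
    return ranked[:k]

articles = [
    "gene expression profiles in breast cancer cells",
    "supply chain optimization with linear programming",
    "signaling pathway analysis of cancer gene networks",
]
print(top_matches(articles, "cancer gene pathway"))
```

Even this crude approach filters an irrelevant article out immediately; scaled up with better models, the same principle lets a researcher skip weeks of manual searching.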
The goal of our Biotechnology Innovation and Computation program is to apply information technology to bioscience and related fields, creating valuable innovations that improve their work. Looking back, we can see that information technology has led to significant changes across a wide variety of disciplines. Accounting, finance and manufacturing benefited greatly from software automation in the 1970s and 1980s. In the 1990s, advances in communication technologies helped redefine how businesses managed their inventories and supply chains, and how they related to their customers. Many of these technologies and innovations have evolved from competitive differentiators into requisite competencies that are now essential to the operations of most organizations.
Though many industries have embraced information technology as an enabler of better information and improved efficiency, the adoption of information technology in the biosciences is still relatively immature. Since the early pioneering days of biotechnology in the 1970s, the number of promising discoveries and innovations has multiplied dramatically. Today there are over 200 therapies and vaccines that have been created through biotechnology, and the industry as a whole has grown approximately 860% since 1994 to a global market cap of approximately $344 billion in 2008. Though innovations and potential breakthroughs in the industry are incredibly numerous, immature processes coupled with a long-standing bias toward research over commercialization have prevented this potential from being fully realized. For example, drug development today is very expensive: it can cost hundreds of millions of dollars or more just to get a drug through clinical trials and approval. Of the thousands of drug candidates being worked on, most make it only partway through before being abandoned. By applying information technology early in the process, it is possible to obtain better information and data that help scientists make informed decisions and reduce waste.
Drug discovery and development is but one example where specific software innovations may be able to substantially streamline processes and improve productivity. By applying more advanced tools and methods, one can help close the innovation gap between research and commercialization activities in the biosciences. Successful development of such an innovation would create a disruptive inflection point in an industry that would soon come to regard it as a necessity. If the evolution of the IT industry is any reasonable analogy, one can expect an explosive proliferation of software in the biosciences as emphasis shifts from research to product development.
Several years ago, I participated in a panel discussion at a software conference where an audience member asked me: “What will be the next big thing?” Naturally, cloud computing, mobile devices and social networking, all popular and high-growth areas, entered my mind, but after some reflection I realized that these were things that had already emerged and were quickly maturing, and thus, in my opinion, were not the ‘next’ big thing. After some careful consideration I answered, “I’m not certain what the next big thing will be, but I believe it is located somewhere at the intersection of computer science, the biosciences and healthcare.” This answer surprised many of my friends and fellow panelists, as they had expected me to answer with something more closely related to software engineering or globalization (which was my primary area of work and research at the time). Indeed, many of the panelists (all respected, accomplished experts in software) asked me afterwards to share the rationale behind my answer.
I have spent most of my career in software engineering and aerospace. Prior to the mid-1980s, it was not uncommon for the design and manufacture of a new airplane to span a decade or more. A great deal of time was spent creating and testing physical models (wind tunnel experiments) and prototypes (mechanical mockups) through trial and error. Seeing this, I advocated the use of computer software and simulation tools in the design of aircraft and, luckily, had the opportunity to work with a cadre of talented engineers to integrate these tools into the aircraft design and manufacturing process. Not only did software such as CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) systems allow aerospace engineers to design components better, faster and cheaper, but many of the time-consuming activities engineers had been doing suddenly became trivial, freeing skilled engineers to focus on higher-order problems, which in turn contributed greatly to propelling innovation. As a result, we were able to incorporate innovative new technologies like fly-by-wire control systems into our aircraft while cutting the time from aircraft design to manufacturing to about half of what it had previously been. These cascading innovations marked a major revolution for the aerospace industry and contributed to the creation of superior products delivered at huge cost savings. Following design, we began to bring advanced computer technologies into the factory, which has since helped redefine how we build airplanes, manage our inventory and supply chains, and even how we relate to our customers. We became much more “lean” and efficient, gaining a significant advantage over our competitors.
As I look back, I realize that these revolutionary innovations did not occur due to some ingenious breakthrough within a particular field such as aerospace or electronics, but are better attributed to how these fields intersected and utilized computing to fuel innovation and change. Today, all of the innovations I have mentioned have evolved from disruptive, competitive differentiators into requisite competencies that are now essential to the operations of many organizations.
Indeed, if we look at the evolution of computing over the past several decades, we can observe that computing technologies have led to significant changes across a wide variety of fields. Accounting and finance were among the earliest beneficiaries of computing in the 1960s and 1970s, followed by engineering and manufacturing in the 1970s and 1980s. In the 1990s, the integration of computation and communications technologies made information widely accessible and helped redefine how businesses managed their inventories, supply chains and customer relationships. More recently, in the 2000s, human interconnectedness and social life underwent a dramatic shift, due in large part to the ubiquity of mobile computing devices and social networking. Computing magnifies the speed of progress and reveals new opportunities. In pondering the question, “What will be the next big thing?” I began to consider where the next untapped opportunities for computation lie.
In the pharmaceutical industry, there exists a tremendous innovation and productivity gap between the amount of money spent on drug development and the number of new drugs approved. Adjusted for inflation, spending by pharmaceutical companies on research and development increased from approximately $8B in 1984 to $60B in 2005, while the number of drug approvals over the same period has remained mostly flat (Congressional Budget Office, “Research and Development in the Pharmaceutical Industry,” October 2006). Though innovations and potential breakthroughs in the industry are incredibly numerous, I believe that immature processes and fragmented datasets, coupled with a long-standing inclination toward research over commercialization, have prevented the available potential from being fully realized.
Seeing this trend and market need, I began to wonder whether the technologies needed to revolutionize the pharmaceutical and broader healthcare industries were mature enough to facilitate rapid innovation. Looking at recently available technologies, I see that the use of electronic health records has exploded over the last decade (generating a wealth of useful data), advances in genomics are making the cost of personal genome sequencing affordable (allowing for targeted treatment of certain cancers and genetic diseases), cloud computing has made analyzing extremely large datasets much more tractable, and the list goes on. Many disparate innovations exist; perhaps it is time we look at them not as individual, segregated tools but as interrelated components of a broader system. Could patient treatment data lead to safer, more effective therapies? Could information collected from active and abandoned research areas be mined to uncover insights leading to faster drug development? Could we tweak and combine existing, dissimilar technologies to create novel and compelling products? I believe the time for this is now, and that by applying better tools, technologies and methodologies, we can give birth to new innovations that realize many of these opportunities.
Over the last few years, I have consulted with a number of companies to identify areas for software improvement, and I saw many low-hanging opportunities in this field ripe for innovation. I also visited a number of universities to look for related curricula, but found that the majority of related programs focus primarily on furthering highly domain-specific research and on producing more Ph.D. research scientists, rather than on integration activities, large-scale data analysis, statistical learning, systems engineering and opportunity capitalization/commercialization. In short, very few people are looking at engineering the computational framework required to advance the next generation of discoveries in healthcare and biotechnology. If the evolution of the IT (Information Technology) industry is any reasonable analogy, I believe we can expect an explosive proliferation of software in the biosciences sometime this decade. Because I believe in this emerging trend, I have proposed to create a new Master’s in Biotechnology Innovation and Computation program at Carnegie Mellon University. My vision is to educate and create leaders who can recognize viable opportunities and act on them through the application of innovation and technology.
Forty years ago, Gordon Moore and Robert Noyce started a company in Santa Clara named Intel and helped usher in the modern age of computing through their advances in semiconductors. Thirty years ago, two entrepreneurs named Steve Jobs and Bill Gates built upon these innovations and heralded the age of software and personal computing with the founding of Apple and Microsoft, respectively. Fifteen years ago, Larry Page and Sergey Brin forever changed how we search for and retrieve information. What will be the next big thing? Something at the intersection of biotechnology, healthcare and computation is a good guess, and CMU is a good place to watch for the next disruptive innovation.
January 2011