DKA (David Allison): Perhaps we could start with something you just mentioned, that you came from a family with a tradition of engineering and you, of course, went into science. Maybe you could talk a little bit about that and your interest in moving into studying sciences as a college student.
JAM (J. Andrew McCammon): Sure. It goes a long way back. There's probably more there than you really want to hear. I grew up as one of three children in the family of a civil engineer. My father was a professor of civil engineering and also worked in private industry and was very interested in bridge design and things like that. I was a troublesome child early in my life, like many scientists seem to have been, failing years of school and having trouble with mathematics, not learning multiplication tables and so forth. But always my parents were very interested in trying to expose me to scientific things, taking me to museums and exhibits of what modern technology and physics and so forth can do. I think my interest in science really took off, as with many people of my generation, in the post-Sputnik years, when there was a great wave of concern in American society about trying to bring children into science to compete with our archrivals in the Soviet Union. I was still at the stage where I was not much interested in what went on in the classroom. My real entertainment was reading comic books, and suddenly all the comic books were about people exploring space and space stations orbiting the earth. Suddenly I became interested in what one could do with scientific tools and went from being a failing student to a student who was doing very well in classes.
DKA: About what age was that?
JAM: That would have been when I was in the fourth and fifth grade, probably in the early 1960s. My parents tried to further encourage that and confused me in a way that lasts to this day, by giving me my first toy chemistry set and my first toy microscope at the same time. I was never sure whether I was a chemist or a biologist. I would look at leaves under the microscope, but I would also make reactions in test tubes, and like many kids I was more interested in what bizarre colors or explosions I could generate than in deep insights into nature. But that was really the beginning of my interest in science as I remember it. From the outset it was very interdisciplinary. Again my father tried to reinforce that, I think, by seizing on any glimmer of interest that I might show in something. For example, I was interested in solar mirrors and was trying to construct one to create high temperatures to burn pieces of paper or something like that, and he would get out the encyclopedia, look up the equation for the parabola and then show how the parabolic design really was founded on some sort of mathematical principles. Again, early on - elementary school, sixth, seventh, eighth grade - I began to have an interest in the quantitative side of things, the mathematical underpinnings of the experiments that I was doing. That really continued through high school and into college. I think it was really as a student at Pomona College, during my undergraduate work, that I became interested in computers as such. There was an organic chemist on the faculty of Pomona College who was one of the early pioneers of trying to use computers to help in the design of new pharmaceuticals and other useful compounds, Corwin Hansch.
He used in some sense an empirical approach, a statistical approach, in which one looked at a series of potential pharmaceutical compounds, tried to decide which ones had the optimum effect, and then related that through some sort of simple statistical correlation to simple chemical properties of the compounds, their size and other simple features.
DKA: How were you exposed to those as a student?
JAM: Yes, we were really exposed to that in two ways. One was through the classroom. We had the advantage that Pomona College had relatively small classes, where the students were able to work closely with faculty members, go through normal textbook-like material and then supplement that with some of the ongoing research. It's not a new point, but one worth emphasizing here: there is a strong tie between research and teaching. We who do research often try to emphasize that this is important in the educational process. This is a good example of it, as Dr. Hansch was able to bring some of his current research work into the classroom, include it in some of the lecture material, and show some of the things that were actually going on and proving fruitful in society at that given time. And of course that captured my imagination and other people's imaginations. It made clear that what we were learning in the classroom really did have some real potential for usefulness in society, in medicine, industry and other applications. In addition to that, Pomona was very fortunate to have a grant from the National Institutes of Health at that time that could sponsor summer research activities for the students. Again, one of the unfortunate trends in society at this time - these things are always cyclical - is that there has been a diminishing of federal support for education and research, and a lot of these programs have dried up. But in the mid-'60s there were programs from the National Institutes of Health and from the National Science Foundation to provide opportunities for undergraduate students to spend time actually in the laboratories participating in research activities, and so we got a more intensive exposure to some of the uses of computers and other tools through intensive summer research participation.
DKA: What type of computers did you use in that program?
JAM: Well, the earliest computers that we used at Pomona College - this starts out in 1965 - were pretty primitive, of course, by today's standards. They were in fact computers that you programmed by using electrical cables that you would plug into sockets, and you would rearrange these cables if you wanted to change the program of the computer. Soon thereafter, we got an early IBM machine, one that had vacuum tubes, and it had the wonderful feature that there were punch cards that you could use instead of changing the plugs in the sockets. So, through the course of my undergraduate work we saw one advance in computing, and then of course later on, as I went on and did more advanced undergraduate work, we moved on to being able to use some of the supercomputers of the day. In my graduate, and especially post-doctoral, work we were using a machine called an IBM 360 91E, which at the time was the supercomputer of the era. The machine was actually located at Columbia University and we would use it over the telephone lines from Harvard, where I was doing my graduate work. Occasionally, you'd have to visit the computer site to pick up magnetic tapes of data. It was an impressive machine to look at. It was fairly large in size and it generated a lot of heat, so it was in a room that had tremendous air conditioning capabilities, and the whole room kind of throbbed with this air conditioning. You had the sense of enormous power. Of course, the same computing power is something that you would have in a laptop machine now, and the calculations that we struggled to do in the mid-1970s are things that you can do on a laptop computer traveling across the country.
DKA: What made you decide to go to Harvard?
JAM: I was, as mentioned before, very interested in the interface between biology and chemistry - and physics for that matter. I had pretty much decided by the end of my undergraduate work that I wanted to try to use physical chemistry type methods to study biological behaviour at the molecular level. There were several outstanding schools in the country to consider for graduate work. By that time I was married and my wife was an English major, also from Pomona College, and I think the decision to go to Harvard was largely based on opportunities for her. There was a very good publishing procedures program that Radcliffe College offered and so she was able to participate in that and then look for a position in publishing. In fact she worked at MIT's Technology Review magazine for a number of years. It was determined partly, I would say, by extra-scientific factors. Clearly, there were a number of very good people on the faculty at Harvard working in this interface area between chemistry and biology.
DKA: Were they as interested in the computing aspects as you were or was that something you brought with you?
JAM: That really came somewhat later on. My graduate career was terribly complicated in a number of respects, and it's something I often try to present as an example to students who are faced with their own difficulties in graduate work. I started at Harvard in 1969, and during my first year I was drafted. It was during the Vietnam War. I was a conscientious objector, so I completed one year of graduate school and then I found the nearest hospital, which was Massachusetts General Hospital, and went and applied for an alternate service position. I simply assumed that I'd be doing some sort of degrading and loathsome work for two years, but when they found that I had just received my master's degree in physics at Harvard and had a background in chemistry and so forth, they thought perhaps they could use that to some advantage, and so they assigned me to work for two years in a biochemistry laboratory. This was far away from the things that I had planned to do. I was really working with test tubes and columns, and freezing in cold rooms playing with protein molecules. And yet it was good experience in retrospect, because it taught me to think about the actual physical behaviour of these protein molecules in a way that I might not have if I had just dealt with purely theoretical issues. In the course of the graduate and particularly this alternate service work, I became very interested in antibody molecules, the key component of the immune system. One of the things that came up in the work that I was doing at Massachusetts General Hospital was the fact that antibody molecules are in fact quite flexible. They have very distinct globular domains of protein that are linked by rather thin segments. These globular domains can move around with respect to one another.
That was something that, even at the time, I thought, "Perhaps one could use computers and theoretical methods to study how these parts of the molecule move with respect to one another and what that might have to do with how these molecules actually work." When I returned to graduate school, I wasn't able to pursue that idea right away. I wound up more or less by accident working with someone whom I hadn't really planned to work with as a thesis supervisor. Originally I had planned to do experimental work, applying physical chemistry to biological systems, and I found one potential Ph.D. advisor who was an assistant professor but who unfortunately did not get tenure. Then I picked as a second advisor another junior faculty member, who also did not get tenure. I'm a slow learner. At that time, I was fairly desperate about what to do. I had lost two years already and I was looking for my third advisor. We had a visiting faculty member, John Deutch from MIT, who is an eminent person in theoretical chemistry. Professor Deutch agreed to take me on as a student, and I wound up more or less accidentally learning statistical mechanics. It is a branch of theoretical chemistry that has to do with the behaviour of large ensembles, large collections of atoms. So, more or less by accident, I wound up learning material that put me in a position to go back and think about the behaviour of these protein molecules in a way that I could not have otherwise. After working with Deutch and learning statistical mechanics, I was able to do postdoctoral work with Martin Karplus, a Harvard faculty member and an eminent theoretical chemist. Martin Karplus at that time was just getting interested himself in the use of computers to study biological molecules. It was not a brand new field, but a relatively new one.
There were some people, like Harold Scheraga at Cornell University and one or two others, who had been trying to use computers to study how atoms might readjust their positions in proteins and other molecules. One of the things that I was able to bring to this, I think, because of my accidental work with John, was a perspective on how the dynamics of these molecules might be described in very quantitative terms, using equations from the theory of diffusion and equations from the theory of Newtonian dynamics, to really try to describe the atomic motion within protein molecules and the overall motion of protein molecules in a fairly rigorous fashion. In order to do that, however, we needed to rely on the power of what were then the supercomputers of the day in order to solve these equations and describe these motions in a quantitatively accurate fashion.
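The Newtonian-dynamics approach described here, treating each atom as a classical particle and numerically integrating Newton's equations of motion, is the core of what became molecular dynamics. As a hedged illustration (this is a minimal modern sketch with a toy one-particle harmonic force standing in for a real protein force field, not the actual programs of the era), a velocity-Verlet integration step can be written as:

```python
import numpy as np

def velocity_verlet_step(x, v, force, mass, dt):
    """Advance Newton's equations (F = m*a) by one time step dt."""
    f = force(x)
    v_half = v + 0.5 * dt * f / mass                   # half-kick
    x_new = x + dt * v_half                            # drift
    v_new = v_half + 0.5 * dt * force(x_new) / mass    # second half-kick
    return x_new, v_new

# Toy system: one particle on a harmonic spring (k = m = 1), a stand-in
# for the bonded forces in a protein force field.  In real MD the time
# step is on the order of a femtosecond, which is why reaching even
# picoseconds of simulated time was so costly on 1970s hardware.
harmonic = lambda x: -x
x, v = np.array([1.0]), np.array([0.0])
for _ in range(1000):
    x, v = velocity_verlet_step(x, v, harmonic, 1.0, 0.01)

# Velocity Verlet approximately conserves the total energy,
# 0.5*v**2 + 0.5*x**2, which starts at 0.5 for this initial condition.
energy = 0.5 * v[0]**2 + 0.5 * x[0]**2
```

The same half-kick/drift/half-kick structure, with far more elaborate force functions, is what large-scale protein simulations still use today.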
DKA: It was not entirely clear to me from the material that I read when you were able to do this. Was that your thesis problem, or did you not really get into this until after you finished the thesis?
JAM: The actual work on the simulation of the motions of atoms in proteins was something that really started after my thesis work. The thesis work was mostly concerned with how molecules like proteins move through a fluid environment. That has turned out to be a very useful thing to consider; it's much of what we have done recently. The simulation of the detailed motion of the atoms within protein molecules was something that I really pursued in the postdoctoral work.
DKA: My sense is that this passion that you had to understand the physical behaviour as well as the chemical behaviour, and the bridging between those two, is somewhat unusual. Did you feel that it was unusual, or were there other people interested in that? It seems like it was fairly pioneering.
JAM: We did have the sense of doing something new when we tried to take particularly basic equations of motion from physics - very fundamental equations, like Newton's equation of motion or the diffusion equations - and apply them in a detailed way to individual protein molecules, individual nucleic acid molecules. That was something that really had not been done before. It's something where, even at the time, we had a sense of bridging, in some rather philosophically interesting fashion, two really very different areas of science. I mean, here we were, and we had no idea at the time what this might be useful for, if anything. It was purely abstract research at the time. It is perhaps a good example of how something that starts out as pure research, not goal oriented in any sense whatsoever, turns out to be tremendously useful and practical for applications later on. We saw this work as an opportunity to use the power of computers, which of course is always a moving marker. We were working with the best machines of the day, and with the power of these machines we were able to bridge the basic concepts of physics on the one side with the basic elements of life on the other and to say something about how these molecules actually function. I remember quite vividly when we did the first simulation of the atomic motion in a protein. We did not have nice graphic displays and things like that, but we were able to output some of the structures to pen plotters and rather laboriously draw one snapshot of a protein, then a snapshot of what it looked like a little bit later, and then a snapshot of what it looked like a little time after that. And there was this sense, even at the time, of something truly historic going on, of getting these first glimpses of how an enzyme molecule might undergo internal motions that allow it to function as a biological catalyst.
DKA: Was the response in the scientific community to what you were doing enthusiastic? Was it skeptical? How were you received in the early phases?
JAM: When we first tried to report this work, it was met with a mixture of skepticism and some enthusiasm. The greatest skepticism, I think, came from people who had done the most to determine the structures of protein molecules using x-ray crystallography. There's a very important and rather venerable field in which people isolate protein molecules, things like hemoglobin or other proteins, grow crystals of them, and then, by shining x-rays at such crystals and interpreting the x-ray scattering from the crystals, determine the structure of the underlying molecules, the actual three-dimensional architecture of these proteins. Now, these protein structures were always depicted on the cover of Scientific American and journals such as Science and Nature as very static images, and so the great majority of the scientific community tended to think of proteins as fairly rigid structures. If one wasn't looking at a picture of such a structure, then what one might be looking at was a brass model made out of rods and bolts and nuts, again a very rigid structure. So most people who dealt with proteins, if they thought of their flexibility at all, tended to think of it as very limited, very rigid. Because of that, when our first studies of protein dynamics showed rather large motions within these molecules, there was a sense on the part of many people in the community that something was terribly wrong. I should say that some of the best crystallographers, the people who knew the most about the theory of x-ray crystallography and so forth, knew that what they were looking at in these static images was just a kind of time-averaged picture, and they were not surprised that there were fluctuations around that time-averaged structure. Although, I think that even they were a little surprised at how large some of these motions are.
DKA: You were beginning to understand this even in your years as a postdoctoral student at Harvard?
JAM: Yes, this part of my work really started as a postdoc at Harvard. Then it's something, of course, that I continued to develop very vigorously once I moved on to the University of Houston as a faculty member.
DKA: Did you have any concerns about wanting to go into university life? Did you think about working for a pharmaceutical company or did you have a specific career goal?
JAM: I don't know that I ever thought about it in a very focused way. I was always interested in doing research and having my hands on research problems and trying to choose directions for the work to go. I think that pretty much dictated trying to pursue a university career if possible, simply because I had the freedom to sort of set directions and go after personal goals, somewhat more than one might in an industrial setting or another setting. At the same time, I had some interest in trying to see if there were not practical applications for the work that we were doing, but I didn't want to be constrained to try to have some definite answer to some practical question in six months' time. I was more interested in taking the long view, what might we be doing in five years or six years or ten years? Unless you're very, very fortunate in an industrial or government laboratory, a university is the place where one is able to take that long view.
DKA: When you went to the University of Houston, what research environment did you find? Was it an environment that was favorable for the type of work that you were going to do? Or did you realize that you were going to have to come in and build a structure that would allow you to continue to work?
JAM: When I went to the University of Houston, I recognized that I was moving into an environment where I was going to have to build a lot of the infrastructure to support a lot of the research program. I was very fortunate when I looked for a university position to have more than one choice. It was one of those times in the job market cycle when things were very tight. Again, the decision to go to Houston was really based largely on our two-career family concerns. My wife had decided to leave the publishing business, went back to do her pre-medical courses and had gone to medical school. She was interested in finding a place to do her residency program in neurology. Among the two or three places I could think of to find a faculty position, the great strength in Houston was that there was a very large medical center and two medical schools where she could continue her training. I reluctantly declined offers from some schools that had, at that time, a stronger reputation and perhaps more resources. It was clear when I went to the University of Houston that it did have some advantages, but it was going to require a lot of work to really get a research program up and running. However, one of its advantages was that it was a very hungry school. It was a young school. It was hungry for reputation. It was hungry for achievements of scientists and so it fostered a kind of an entrepreneurial setting. People who were willing to roll up their sleeves and try and get things done would find some support on the part of the university administration. That turned out to be very important for me, in that when I first went to the University of Houston, they had rather poor computing resources, in particular. I won't name the brand of computer but it was a type of computer that one usually associates with the banking industry rather than a high performance scientific computing environment. 
One of the first things that I did was to start working on the administration to upgrade the computing resources of the university. I'm afraid I sometimes was a little bit devious in trying to influence the administration. After several pleas to the university administrators to try to bring the computing resources up to snuff, and not getting the reaction I wanted, I composed and sent a telegram to the president of the university saying, "Computing crisis. Have grant from NIH. Cannot do research. NIH may investigate. Situation desperate." Within 48 hours, there was an allocation of between $1,000,000 and $2,000,000 approved to buy a new computer for the university. I was given significant say in what type of hardware to acquire. Happily, within a couple of years of having been at the University of Houston, we were suddenly near the forefront in terms of computing resources and able to do calculations again that were pushing the envelope a little bit in terms of possible applications.
DKA: What did you buy with your $2,000,000 at that time?
JAM: We bought an NAS 9000, a machine by National Advanced Systems. It's an IBM-compatible type machine, which was fine with me because I had largely grown up with IBM-type technology. It was familiar. It was something we could move onto, and we were able to do quite a bit of work on that machine before we really needed to take the next step, which was to what we now think of as real supercomputers, the Cray machines and things like that, that have traditionally set the top end of these machines.
DKA: I would like you to talk a little bit about the method that you pursued, if you could put it in a fairly simple form that people could understand. What is the thermodynamic cycle perturbation method? That's a big mouthful, but it's critical to what you do.
JAM: Basically there's an interesting story behind this thermodynamic cycle perturbation method. As I mentioned, when I first went to the University of Houston, we were already simulating the dynamic motions of the atoms in protein molecules, looking at protein molecules basically as we would any kind of material system. There was no real driving societal need or interest or anything like that. It was very much pure research: "How big are the displacements of the atoms? What's the time scale?" and so forth. My interest in doing something beyond that did stem in large part from a personal situation, which was that when I was still a relatively young faculty member at the University of Houston, the wife of a colleague of mine, very good friends of ours, was stricken with a rare form of pancreatic cancer. Fortunately she has recovered and is in very good health. I remember feeling very intensely at the time how powerless I was to do anything to help and how absurd that seemed. Here we were studying these basic molecular components of living systems, and surely there must be some way we could use this knowledge to help people who were stricken with cancer or other diseases. I remember thinking quite intensely about it at the time - there ought to be something we could do in our research to steer it toward a useful end, toward the design of new pharmaceuticals - and this thermodynamic cycle perturbation method was really the outcome of a lot of thinking about such issues. Many, perhaps most, pharmaceuticals are small molecules that act as what are called "enzyme inhibitors". Enzymes are simply protein molecules, large biological molecules that act as catalysts. A typical enzyme will bind to some molecule, maybe a sugar molecule, and break it up into smaller pieces or maybe combine it with something else. Enzymes facilitate chemical reactions. Most pharmaceutical agents act as inhibitors of enzymes. It's like throwing a molecular monkey wrench into the works of an enzyme.
If you have an enzyme inhibitor, it will typically bind to the so-called active site of an enzyme and block it so that the enzyme can no longer function as a biological catalyst. Familiar drugs like aspirin, drugs for ulcers, cancer agents and other things are inhibitors of one enzyme or another. So an important question in the pharmaceutical industry is, "If you have one pharmaceutical molecule that works pretty well, are there ways that you can tinker with that pharmaceutical to make it bind even more strongly to its target?" That was a question that I immediately began thinking about in connection with this young woman's bout with pancreatic cancer. Was there some new method that we could come up with, using computers, using these simulations of enzymes and other molecules, to help guide the design of new pharmaceuticals? We took, as a simple model problem, a very common enzyme, one called 'trypsin', and a very small molecule called benzamidine that binds to trypsin very specifically. We wanted to see if we could calculate the strength of binding of benzamidine to trypsin and then compare it with other modified forms of benzamidine that might bind more or less strongly. It was a simple model system that we set up, and then we tried to use our computer simulation methods to calculate these strengths of binding. Well, it turns out, for a variety of reasons, that it's very hard to do this in a brute force or direct fashion. We tried it. We had the enzyme in a big box of water molecules, and we had the inhibitor molecule, and we tried to push the inhibitor molecule into the active site of trypsin. We were never able, for technical reasons, to get a very realistic description of this binding process. Basically we were trying to make it go too fast in the computer, and the enzyme was not able to relax in the way that it really would in a binding process.
Because of limitations of computer speed, our computer simulations were looking at picoseconds, or maybe tens or hundreds of picoseconds - very, very short times, much, much shorter than the time over which the actual binding process occurs in the laboratory. We were not getting right answers. We were not doing physically meaningful calculations, and I remember more than one sleepless night worrying about that. One day at about that time I went to a lecture in my other field of statistical mechanics, going back to the John Deutch type work, and I remember listening to Herman Berendsen, from the Netherlands, talk about ways in which you can make a small cavity in liquid water have different sizes and calculate the changes in certain thermodynamic properties. I was sort of halfway dozing through the lecture. It was quite technical, and it was right after lunch, so I was feeling a little sleepy. And suddenly I remember sitting bolt upright in my chair, because I realized that what this distinguished professor was talking about was exactly the missing link that we needed to do the calculations we really wanted to do. What we really wanted, in some sense, was not to answer the question, "How strongly does a potential drug molecule bind to a target?" but "What binds better? This drug or this other possible drug?" We wanted to put things on a scale of relative binding strengths. It occurred to me that using the technique that this eminent scientist was describing, we could recast the whole problem we were trying to solve in a way that was much more amenable to solution, and that is the following: we could start with our drug molecule bound to the active site of the enzyme and simply change it from one drug molecule into another one while it's located in the binding site. So, for example, we could take a hydrogen atom on a drug and slowly change it into a fluorine atom.
The name computational alchemy has become popular for these kinds of methods because in these calculations you really are changing one kind of element into another kind of element, typically. It's not something you can do in the laboratory, but because we're using computer models - because we're using mathematical representations of these systems - it's perfectly possible to take one of these computer simulations and just slowly change the size of atoms, change the lengths of chemical bonds, and actually transform, or I guess in modern parlance morph, one molecule into another molecule and calculate the thermodynamic changes associated with that. If you use a basic principle of thermodynamics, it's possible to show that the changes in thermodynamic properties that you get from these artificial or non-physical changes of systems can be related to the actual changes in free energies of binding, which are what we're really after. What we're really able to do, in some sense, is to finesse the calculation, to break it down in a different way that is more amenable to solution on computers of current power.
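The free-energy estimate behind this kind of alchemical transformation can be sketched compactly with the Zwanzig exponential-averaging formula, dG = -kT * ln⟨exp(-(U_B - U_A)/kT)⟩_A, which evaluates the free energy change for mutating state A into state B using only configurations sampled from A. This is an illustrative toy, not the original programs: the "potentials" here are assumed one-dimensional harmonic wells standing in for real molecular energy functions.

```python
import numpy as np

rng = np.random.default_rng(0)
kT = 1.0  # work in units where Boltzmann's constant times temperature is 1

# Toy potentials: harmonic wells standing in for the energy surfaces of
# molecule A and the alchemically mutated molecule B (e.g., H -> F).
U_A = lambda x: 0.5 * x**2
U_B = lambda x: 0.5 * (x - 0.2)**2 + 0.1

# Sample configurations from state A.  For a harmonic well the Boltzmann
# distribution is exactly Gaussian; in a real calculation these samples
# would come from a long molecular dynamics run of state A.
x = rng.normal(0.0, np.sqrt(kT), size=200_000)

# Zwanzig free energy perturbation estimate for the A -> B change.
dU = U_B(x) - U_A(x)
dG = -kT * np.log(np.mean(np.exp(-dU / kT)))
# For these two wells the exact answer is the constant offset, 0.1.
```

In practice the mutation is broken into many small steps in a coupling parameter, since the exponential average converges poorly when the two states differ too much.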
DKA: That was the trick? To learn what that mapping was between the binding energies and the thermodynamic properties? Did that just occur to you? Was it known in the literature?
JAM: This really was to some degree already known. In some sense we were using a set of ideas which were very, very old, but combining them in a completely new fashion. We were using an idea that goes back to high school chemistry classes, that of so-called 'thermodynamic cycles': if you have a set of states of a chemical or physical system, there are certain functions that must have no net change when you run around one of these thermodynamic cycles. What we were interested in was two parts of the cycle, the binding of Molecule A and the binding of Molecule B. We were able to relate those to the other two sides of the cycle, where we changed Molecule A into Molecule B. The idea that there exist such thermodynamic cycles was very familiar, but it was completely new to put non-physical transformations on one side of the cycle and to equate the binding processes to those non-physical processes. It was just one of those momentary insights.
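The bookkeeping of that cycle can be made concrete. Because free energy is a state function, it accumulates no net change around a closed cycle, so the two hard-to-simulate physical legs (binding of A and binding of B) can be traded for the two tractable non-physical legs (mutating A into B, once free in solution and once bound in the active site). The numbers below are assumed purely for illustration, not results from the trypsin work:

```python
# Thermodynamic cycle for relative binding free energies:
#
#   E + A  --dG_bind_A-->  E:A
#     |                      |
#  dG_mut_free          dG_mut_bound     (non-physical A -> B mutations)
#     v                      v
#   E + B  --dG_bind_B-->  E:B
#
# Around the closed cycle the free energy changes sum to zero:
#   dG_bind_A + dG_mut_bound - dG_bind_B - dG_mut_free = 0
#
# Illustrative values in kcal/mol (hypothetical, for this sketch only):
dG_mut_free = -1.3    # mutate inhibitor A into B, free in solution
dG_mut_bound = -3.0   # mutate A into B inside the enzyme's active site

# Rearranging the cycle gives the quantity of pharmaceutical interest,
# the *relative* binding free energy, without simulating binding at all:
ddG_bind = dG_mut_bound - dG_mut_free   # equals dG_bind_B - dG_bind_A

# ddG_bind < 0 means candidate B binds more strongly than candidate A.
```

The two mutation legs are exactly the quantities the free energy perturbation simulations can deliver, which is why recasting the problem this way made it tractable on the machines of the day.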
DKA: Well that, plus the way you spoke about using the simulation, and seeing and doing almost as if it were the equation, must be something that became familiar to you but again was somewhat unusual in that time period?
JAM: The idea of combining the underlying thermodynamic theory and these computer simulations with experimental data on the structures of these molecules was something that we were doing very much in isolation at that time. I think, really, our group and the people in our group who were doing this were the only people in the world trying to do this particular type of calculation at that time. It's something that has now become a very, very widespread method. Essentially every large pharmaceutical company in the world carries out calculations of this kind. It certainly is not a guaranteed way of coming up with new pharmaceuticals, but it is a way to help people make more educated guesses as to what compounds to try to synthesize and test. It's very widespread now in the pharmaceutical industry and in many universities and medical schools. There must be hundreds of laboratories that carry out these types of calculations. Happily, they are beginning to produce new pharmaceuticals. As you probably know, there's a very long development time in the pharmaceutical business. If you have a compound that's eventually going to prove useful in the clinic, there's about a ten-year period in which that compound has to be tested - cell cultures, laboratory animals and so forth - and then a long time in the clinic before it's actually blessed with the term 'pharmaceutical' or 'drug' and put on the pharmacy shelf. But there are a number of compounds now in advanced clinical trials for treating diseases like influenza, HIV and emphysema that were really discovered with the help of these computer simulations. The hope is that with time, some of these will actually prove beneficial.
DKA: Maybe you should talk about how the change in computing power and resources allowed you to mature this method and what it first looked like and how that grew and changed in your years at Houston.
JAM: Well, certainly much of what we have done has been trying to take advantage of the very newest generations of computers. There are people who argue that there's no longer any need for more powerful machines. Small computers, workstations, are already as powerful as Crays used to be, so why go any farther? They're partly right. Certainly problems that took a Cray computer to do five years ago are things that can be done on a powerful workstation or maybe a cluster of a few workstations today. But there are always going to be problems that require that single most powerful machine to begin to think about. It's easy to think about problems like that, and certainly one of the things that we've done in our research group, as we think about what areas of research to pursue, is to ask what the most powerful computer is going to be like in five years' time or ten years' time. Let's start working on that problem now, even though we know we won't be able to solve it right away, so that when that machine appears we'll be in a position to really take advantage of that resource and do something new and interesting. Certainly when I mentioned this work on trypsin binding to benzamidine, this initial thermodynamic cycle type work, the only reason we were able to undertake that project was because we had access through the National Science Foundation to one of the great supercomputers of the day. That, in fact, was one of the Cyber 205 vector supercomputers at Princeton University, an NSF Supercomputer Center that unfortunately no longer exists.
When that machine was initially set up, we were granted a very generous allocation of time on it, and only because we had this extensive block of time on the most powerful machine of its day were we able to think about trying to set up this extremely challenging calculation, where we had a whole enzyme molecule in a box of thousands of water molecules and then thought about drugs binding to it. It's not a calculation that you could have done on any other machine. There will always be calculations that are waiting for that next machine to do.
DKA: So you were moving into an environment of increased calculations and in labs you probably saw these problems faster. What was the visualization side of that as you came in and how did that develop along with the increased computing power?
JAM: The visualization side of things is very, very important in this kind of work. We're dealing with, as I've mentioned, enzyme molecules and their inhibitors. Such molecules are very large. They typically have thousands of atoms in them. They're very irregular in shape, and in order to visualize what's going on, we need state-of-the-art computer graphics capabilities, not only to visualize the molecules themselves and the motions of these molecules but also to display data relating to our calculations. If you're interested in, for example, the electric field around one of these molecules, which may be very important to explain such properties, then we can use graphical techniques to study such things. At the outset, our graphics capabilities were extremely limited. Going back to the first molecular dynamics simulations of proteins, we simply used pen plotters to draw just a few snapshots to give us a hint of what might be going on in the internal motions of a protein. By the time we were doing the simulations on the machines at Princeton, we had slightly better computer graphics capability. We were able to make single colorful images of the protein molecule and this inhibitor bound to it, but we were not really in a position where we could easily make fully dynamic images of these molecules. We've depended really on the development of computer graphics technology alongside the basic computational technology to try to help us get more out of the simulations as we conduct them. One of the advantages of a place like the University of California at San Diego, with the San Diego Supercomputer Center, is that you have not only state-of-the-art advanced computing capability here but also people who are experts in the visualization of the data.
You have resources to look at these images in three-dimensional stereovision and even to gather with a group of colleagues, point to these structures in a three-dimensional representation, discuss and argue about the results of calculations, and plan the next steps. Computer graphics is very helpful in many, many ways, but among them are understanding the data and discussing it with your colleagues in an interactive fashion. Of course, there is also presenting the results to students, as a teaching aid, and presenting the results to the public to help them understand what's going on in these computer simulations.
DKA: Could we look at some of this data?
JAM: Sure. That sounds fine.
DKA: Why don't you just describe what we're looking at?
JAM: Well, one of the enzyme molecules that we've been working on most recently is shown here on the screen. This is the enzyme acetylcholinesterase. It's an enzyme that's really critical to the operation of the nerve and muscle systems of humans and all other organisms. When one nerve talks to another nerve cell in the brain, for example, there typically is a small space between the nerves called a synapse. When a nerve is excited and brings its electrical impulse down to the synapse, there are neurotransmitter molecules that are released into the synapse. Those diffuse across and excite the next nerve on the line. That's what causes a thought to develop. Similar synapses operate where nerves come into contact with muscles and are responsible for all neuromuscular activity. Obviously these synapses have to operate very, very quickly. If you touch a hot stove, you want to respond instantly. You don't want to wait, and wait and wait and then respond. There's been tremendous evolutionary pressure to have the nervous and neuromuscular systems operate with lightning-fast speed. Well, this molecule plays one key role in the speed of activity of the neuromuscular and nervous systems. I've mentioned that in the synapses there are neurotransmitter molecules that are released. But you need to get rid of those neurotransmitter molecules very, very quickly. Essentially it's like having an on switch and an off switch. When you release the neurotransmitter molecules, that's an on switch. It turns on the following nerve or muscle, but also you need to be able to turn it off very quickly so you're poised for the next idea, the next response. In order to turn the switch off, you need this molecule, which basically destroys the neurotransmitter molecules in the synapse, cleans it up and restores it to its original state, so it's ready for another nerve or neuromuscular excitation.
This enzyme, acetylcholinesterase, has one very simple but very important job, and that is to break down the neurotransmitter acetylcholine. Acetylcholine is a small molecule which binds to this enzyme and then is broken up into two parts and rendered inactive. The acetylcholinesterase enzyme shown here is actually not a human form. It's actually from an electric ray, one of these bat-like fish that swim off the coast of southern California, Torpedo californica. Electric fish and electric rays use the same synaptic mechanisms to build up the large voltages associated with their electrical activity. They're good sources for proteins such as this. This acetylcholinesterase molecule is a dimer. It has two identical subunits, one on the right-hand side and one on the left-hand side, but they're facing opposite directions. The one on the left-hand side is facing with the entrance to the active site pointing toward us, and the one on the right-hand side has its active site pointing back away from the screen. The thing that I want to call your attention to is that the active site is not really on the surface of this enzyme. It's actually buried very deep in the middle of this large globular structure. It has a small cave-like entrance to the active site that you can just barely make out. The acetylcholine molecule has to enter this active site and move down a long tunnel in order to get to the catalytic center itself and then be degraded. Well, if you actually make a model of the acetylcholine neurotransmitter and if you were to display it on the same screen, it would be too large to fit into this hole. It would look like the neurotransmitter could not possibly get from the outside of the enzyme deep into this catalytic center, so what's the trick? How does the enzyme manage to handle acetylcholine, and handle it so quickly? Part of the answer is these thermal motions of the enzyme.
As I mentioned, in our early molecular dynamics simulation we were able to see that the motion of the atoms inside proteins is of surprisingly large amplitude. If you do a molecular dynamics simulation in which all of the atoms of this protein are allowed to move due to their thermal energy, you find that this opening breathes. It opens. It closes. And occasionally it opens wide enough that neurotransmitter molecules can get into the enzyme and move down to the active site.
JAM: What I'll show you next is a set of snapshots that just illustrate this breathing motion. What you see in this series of three snapshots are actually frozen images taken from one of these molecular dynamics simulations. What we've done is we've taken the enzyme molecule, this acetylcholinesterase molecule, and we've put it inside a large box with thousands and thousands of water molecules. Then we've allowed these atoms to move around according to Newton's equations of motion, just with the kinds of thermal energy that they would have at room temperature. Every now and then we'll take a snapshot of the system and display the molecular surface near this entrance to the active site. What you can see, as you go from one moment in time to another, is that the active site entrance is somewhat opened up in this image. A few moments later it's much smaller in width, and a little time later on it's beginning to open up again. Protein molecules are not rigid structures, as people thought for many years. They are, in some sense, living, breathing entities of their own, and these motions are absolutely essential for the functions of enzymes, nucleic acids and really all biological molecules.
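The procedure described here - atoms moving under Newton's equations with room-temperature thermal energy, sampled as snapshots - can be sketched in highly simplified form with a velocity Verlet integrator. This is an illustrative toy only: the harmonic 'tether' force stands in for a real molecular force field, and the system size, mass, and time step are arbitrary choices, not parameters from any actual protein simulation.

```python
import numpy as np

def velocity_verlet(pos, vel, force_fn, mass, dt, n_steps):
    """Integrate Newton's equations of motion, F = m*a, with the
    standard velocity Verlet scheme, saving a snapshot each step."""
    f = force_fn(pos)
    traj = [pos.copy()]
    for _ in range(n_steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt**2
        f_new = force_fn(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
        traj.append(pos.copy())   # one 'frozen image' of the motion
    return np.array(traj)

# Toy system: 10 'atoms' tethered by harmonic springs to reference
# positions, a crude stand-in for a protein force field.
ref = np.zeros((10, 3))
k_spring = 100.0
force = lambda x: -k_spring * (x - ref)

rng = np.random.default_rng(0)
pos0 = rng.normal(scale=0.05, size=(10, 3))  # thermal displacements
vel0 = rng.normal(scale=0.1, size=(10, 3))   # thermal velocities
traj = velocity_verlet(pos0, vel0, force, mass=12.0, dt=0.01, n_steps=1000)
print(traj.shape)  # prints: (1001, 10, 3) - snapshots of the 'breathing'
```

Inspecting successive snapshots in `traj` is the toy analogue of watching the active-site entrance open and close from one frame to the next.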
DKA: It is fascinating. I'm curious whether you're finding the same electrical properties in other enzymes or if this is unusual in this particular one?
JAM: That's probably a good point to cut to another molecule, because we have another molecule. What I'm showing here is yet another enzyme molecule that operates with tremendous speed. This is an enzyme called superoxide dismutase. Its job is basically to clear your body of a very toxic compound called 'superoxide'. Superoxide is just like the normal molecular oxygen that we breathe and depend on, but it has an extra electron attached to it and becomes a very toxic, highly reactive molecule. Superoxide develops in the normal course of metabolism, and nature has developed this enzyme to get rid of this toxic material, superoxide. Since it has an extra electron on it, the superoxide molecule has a negative charge to it, and this enzyme takes advantage of simple static electricity to speed its rate of clearance of superoxide from the body. Here you can see the molecular surface of the enzyme. Again, we have an enzyme with two active sites on opposite sides of the molecule, and again the electric fields operate to steer the substrate molecule into the active site to speed its rate of action. What's nice about this story is that we had made these predictions a number of years ago, that the electric fields around the enzyme might speed the diffusion of the substrates into the active site, and this stood as just a theoretical statement for two or three years, until the early 1990's. Then a team of investigators headed by Elizabeth Getzoff, here in La Jolla at the Scripps Research Institute, essentially took this up as a challenge. They said, "Well, if this enzyme operates with tremendous speed because of the electrostatic steering, and if McCammon and his co-workers are right that the electrostatic steering is really responsible for the speed of the enzyme, we ought to be able to make an even faster version of the enzyme by what's called 'site specific mutagenesis'."
Site specific mutagenesis is simply a way of taking the gene that contains the blueprints for this enzyme and making certain changes in the gene so that chemical groups in the enzyme are replaced by other groups. They were able to use these mutagenesis methods to create variations on this enzyme that had increased or decreased electrical charge around the active site, and the calculations predicted that the rates would be increased in some cases by a factor of two, and for other changes by a factor of four. Libby and her group did what we as computationalists cannot do. She and her co-workers went into the laboratory and actually made these alternate structures of the enzyme, measured their kinetic properties and showed that in fact they do work twice as fast or four times as fast. I think this is interesting because it's really the first example of engineering at the molecular level of biology. People have talked for years about protein engineering or genetic engineering. As I mentioned earlier, I come from an engineer's family, and so I take this engineering stuff very seriously. One of the people on my dad's Ph.D. committee was a person who designed the main span of the Golden Gate Bridge in San Francisco, and one of my dad's most prized possessions is a book on the design of suspension bridges. It's an old book. In the middle there are about thirty pages that are worn and yellowed with repeated use, and if you open up those pages, what you see are equations. Pages and pages of equations, mathematical principles that connect physics to some structure that is not yet built, but that when built will behave in a certain way. That's engineering to my mind: to have a deep enough physical understanding and the supporting mathematical structure to design something that will behave in a predicted way. This is the first time that that's been done in biology.
Libby Getzoff and her group, I think, deserve tremendous credit for carrying out this first real feat of biomolecular engineering. Taking physical theory and mathematical, computational principles and using them to actually build a new enzyme that works as specified.
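The electrostatic steering effect described above has a classic closed-form special case: Debye's correction to the Smoluchowski diffusion-limited rate constant for two spheres interacting through a Coulomb potential. The sketch below is a textbook illustration of that idea, not the Brownian dynamics machinery actually used in the superoxide dismutase studies; the example energies are arbitrary.

```python
import math

def debye_rate_factor(u_contact):
    """Debye's electrostatic correction to the Smoluchowski
    diffusion-limited rate constant k0 = 4*pi*D*R.

    u_contact: Coulomb interaction energy at contact, in units of kT
    (negative for attraction).  Returns k/k0, the rate enhancement
    relative to uncharged reactants."""
    if u_contact == 0.0:
        return 1.0  # no charge, plain Smoluchowski result
    return u_contact / (math.exp(u_contact) - 1.0)

# A negatively charged substrate (like superoxide) steered toward a
# favorably charged active site encounters it faster; reversing the
# charge slows the encounter down:
print(debye_rate_factor(-2.0))  # attraction:  ~2.31, over twice as fast
print(debye_rate_factor(+2.0))  # repulsion:   ~0.31, about 3x slower
```

This captures, in miniature, why adding or removing charge around the active site was predicted to change the enzyme's rate by factors of two to four.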
DKA: It's a great story. Terrific. Are you ready to talk through another one?
JAM: This is another enzyme molecule that we've been studying in our laboratory. There are perhaps two things to say. One is that this is a little bit different representation of the enzyme than you may see in some other images. Some of the other images that you'll see today have kind of a white sheet wrapped around them, which are sometimes called 'Christo renditions', after the modern artist who wrapped islands and the German parliament building, and things like that. Here we've pulled the wrapping off, and so you're looking underneath the molecular surface to see the actual bonds of the enzyme molecule, if you will. You can see the polypeptide chain, the alpha helices that were predicted by Linus Pauling before they were actually observed in experimental structures, and the other elements of the protein's structure. The enzyme that you're looking at here is one called 'Adenosine deaminase'. It's again an enzyme of very great medical importance. There are unfortunately certain children born with defects in this enzyme, and they suffer from something called 'severe combined immunodeficiency syndrome'; those children will normally die in infancy unless dramatic measures are taken to try and restore the presence of this enzyme or reactivate it somehow. One of the first targets of genetic engineering in medicine is to try to restore the gene for this enzyme in children that have this severe immunodeficiency syndrome. This enzyme is also involved in a number of diseases, among them certain leukemias and lymphomas. What you see here in addition to the enzyme, shown in green, is a small pharmaceutical molecule, deoxycoformycin. It's one of the few compounds that's in clinical use for the treatment of what's called 'hairy cell leukemia'.
Now that the experimental structures for this enzyme and this drug molecule are known, we're engaged in calculations in which we're trying to change the structure of this drug in certain ways to see whether we can increase the binding of the drug to the enzyme. This is work that's being done by Tami Marrone, a young woman who is doing postdoctoral work in our group, and it certainly shows some promise, I think, for the engineering of better anti-cancer agents.
DKA: You're not only a researcher in the laboratory but also a teacher, and I know part of your work has been to reform what teaching chemistry means, building these bridges across. How have you worked in things like curriculum reform in that part of your career?
JAM: One of the fun things, in some sense, about being a university professor is that there is this opportunity to have an interplay between work in the laboratory, the research side of one's work, and teaching, the educational side of our work. We do try to continually reexamine how we teach science in light of what we're learning at the research frontiers of our field. One good example of that, perhaps, is that in a few months I will be teaching an undergraduate physical chemistry course that I know will contain a lot of biologists and future doctors and other people who are interested in biological molecules. And one way we will try to liven this course up - it can be awfully dry just learning equations and pages of dreary text - will be to bring some of what we learn in our research work into the classroom. We plan to actually set up examples where students will be able to get their hands on the computer and carry out at least simple versions of some of these simulations as part of their undergraduate curriculum. For example, we will try to make it possible for students to calculate the strength of binding of different drug molecules to an enzyme, at least in a simplified way, to underscore the importance of thermodynamics and hopefully to turn this rather dry material into something that's clearly and compellingly important for their future work. We'll try to study the diffusional encounter of substrates with enzymes to underline the importance, again, of what can be rather dry kinetic theory. One of the attractions of the University of California at San Diego, one of the reasons that I moved here recently, is that there are tremendous opportunities not only on the research side but also on the teaching side.
The San Diego Supercomputer Center plays a critical role not only in supporting the nation's research efforts in computer simulations - not just in biology but in environmental modelling and the studies of global warming and so forth - but it also serves as a tremendous resource for teaching: for the development of video tapes showing examples such as this, which can be distributed to classrooms throughout the country, and for generating model simulation exercises so that students, certainly here on campus, but eventually students everywhere throughout the country and even around the world, can plug into the Internet and come to the San Diego Supercomputer Center through kind of a virtual experiment and be able to do some of the kinds of manipulations that I've described here. This will have, hopefully, two very valuable results. One, it will help students really learn the basic theory in a more solid fashion, but it will also keep them excited about science. It's that sense of excitement that's so critical to a person being successful in their career.
DKA: You talked at a number of points about the role that teachers had for you in guiding you along in your career. I know that you've spent a lot of your own personal effort in repaying that favor to your students. Tell us about the students who have not just studied with you but have gone on to have you as their mentor, and how they've begun to change the face of this science.
JAM: Well, certainly one of the great satisfactions that any university teacher will tell you about is seeing promising students really flourish in their careers - to get a good solid grounding in certain fundamentals, to learn whatever you were able as a teacher to convey to them, and then to go on and do things that really can change the whole shape of the field. Ideally, some students will go beyond what I might have been able to do myself, and I'm very, very fortunate to have a number of students that have gone on to positions as university faculty members at the University of North Carolina, at the University of Washington and many other institutions, also some smaller institutions that really concentrate on undergraduate teaching. These former students of mine are themselves advancing the frontiers of this field, developing new applications and teaching yet new ranks of scientists who'll be entering the field. Other former students have gone into federal laboratories, and then to pharmaceutical companies where they're putting these methods to work to do the actual discovery of new pharmaceuticals, to do the discovery of new enzymes and really put these tools to work.
DKA: What is the secret of teaching them and leading them well?
JAM: The secret to good teaching is something that's really quite nebulous. I don't really have a magic formula. I have tried to develop some outlines and some suggestions for students. One thing that I always encourage them to do, as far as learning about research is concerned, is to try to find a problem to think about: a problem that is an important problem to work on and, as much as possible, something that they themselves are very interested in. If a student is not deeply interested in what he or she is working on, there really is no hope of progress. I encourage students to try to find a problem that they're really obsessed by, a problem that will keep them up at night worrying and tossing and turning. And then beyond that, to try to learn tools that have a very wide utility, things like molecular dynamics simulations, things like theories of how molecules diffuse. These are very general concepts that are fluid enough that a student will not run out of gas just on finishing one particular research project; the student will be equipped with tools that can be used again and again in different settings. Those are at least two key ideas.
DKA: I sense in you also a very personal quality of leadership and caring about your students. It seems that you're more than just someone who gives them information. Do you have that kind of relationship with them?
JAM: I do care about each and every one of my students in a very personal fashion. I very much look at my group as a family in some sense. We try to get together on a very regular basis, and we talk largely about scientific things but also about more personal things too, like people's concerns about the job market or people's concerns about teaching. We oftentimes have undergraduate students who will come and spend time working in the group. We talk about ways that we can make that experience as rewarding as possible for students during the time that they visit. We do try to create a strong family atmosphere in the group. I do encourage the students in our group and the students in my classes to interact with one another as much as they can, not to feel that they should advance by climbing over other people; I do try to encourage a sense of cooperation among students. It is true that there's inevitably a degree of competitiveness in science, and that's an essential ingredient, in fact, for people to feel challenged, to do the best that they can. But it's possible to be competitive in science without stabbing other people in the back, and that's certainly something that I try to encourage my group to do: to work together on things, but for each person to try to put his or her best talents to use in moving the whole enterprise forward.
DKA: You've described a research climate, a research environment, first at Houston and maybe now more fully at San Diego, that's dependent on a number of factors. It's dependent on a growing and developing world of supercomputing and a research center supported by a university climate. Do you see that continuing in its present form? Is that structure changing? What does the future look like for the kind of research group that you are leading?
JAM: The universities are always evolving in the setting of society at large, and clearly we're in a situation right now where we're undergoing a large change in our general climate. With the end of the cold war there's been a tremendous decrease in the willingness of the federal government to support basic research and science in general. There's a loss of the concern that we had in the post-Sputnik era. And so we're facing a difficult time, certainly in terms of federal support, and that's compounded to some extent by the current changes in the industrial sector. Obviously much of the work that we do involves the pharmaceutical industry, and there are now tremendous pressures for cost control in the pharmaceutical industry that have led to a reduction of support and a reduction of opportunities, to some extent, for people who are interested in drug discovery. We're facing difficult times. One can only try to make the best of the situation by educating society as to the ultimate benefit of continuing this enterprise, to point out the value of advanced computing tools as something more than just a way to beat an international competitor in the defense arena, to point out the potential of advanced computers as tools for conquering diseases - for controlling diseases if you can't conquer them - and for dealing with a myriad of problems that society faces, environmental problems, and many, many others. One of the changing features, perhaps, in our time right now is that university faculty need to be more active advocates of the university enterprise. We need to remind people of the enduring value of what we're doing and point out that a dollar invested in a university may benefit society to the tune of hundreds of millions of dollars later on, in terms of reduced impact of disease, in terms of reduced impact of pollution, in terms of the ability to improve society in many, many ways.
DKA: In a more technical sense, as you look forward to the tools that you've developed for now, what does the future look like? What lies ahead for the scientific tools that you've spent your career developing?
JAM: The future of the kind of field that I've been so involved in in recent years looks extremely bright. In fact I feel a little bit as has often been described of Newton and other great scientists: of just having one little grain of sand in your hand while you realize that there's this whole beach, this whole ocean out there. We obviously have started to look at a few molecules, a few enzymes, and learned a few things about how they function. There's much more to do in that arena. There's much more to be done in extending this kind of work in several directions. One is looking at the fine details of the reactions that go on in enzymes: what's the nature of those, and how can those be influenced in other ways? My personal interest, I think, for the next few years is going to be going up the scale a little bit. I've mentioned how acetylcholinesterase is a critical enzyme in synapses and how one nerve talks to another nerve. There's much now that is known about the detailed structure of entire synapses, and so what I would like to do in the next few years with my research coworkers is to try to move up to the next level and to go from computational molecular biology, if you will, to computational cellular biology. I think especially if we are able to continue to obtain more and more powerful computers, if the envelope of supercomputing continues to move the way that in principle it can, we ought to be able to think about integrating these molecular components into models of entire synapses and integrate the molecular picture into a computational cell biology. I think computational cell biology founded on truly molecular understanding could be one of the tremendously exciting areas of science in the early 21st century. It will depend on federal support. It will depend on the development of advanced computing capability.
We do need high performance machines to really move in this direction but given those resources, I think the next decade or two looks very exciting indeed.
DKA: That's a great conclusion.