2012 Doctoral Hooding Ceremony
Levoy tells graduates 'Being afraid is useful,' Clemens receives mentoring award
May 15, 2012
The UNC Graduate School held the 2012 Doctoral Hooding Ceremony at 10 a.m. Saturday, May 12, in the Dean E. Smith Center. Nearly 250 students participated in the ceremony. Each graduate came to the stage to receive the academic hood, conferred by his or her advisor or dissertation committee chair and the Provost.
Dr. Marc Levoy gave the keynote address at the ceremony, offering insights including the following: “My favorite poster says, 'What would you do if you weren't afraid?' Being afraid is useful. On the savannah it kept us from being eaten by lions. Nevertheless, this is the message I'd like to leave you with, the same message my mentor Don Greenberg gave to me. Know your fears, know also what you really want, weigh the odds, and occasionally, make a run for it.”
Dr. J. Christopher Clemens, professor of physics and astronomy, received the Faculty Award for Excellence in Doctoral Mentoring at the ceremony.
Hooding Ceremony Speaker
Marc Levoy received his doctoral degree in computer science from Carolina in 1989 and is the VMware Founders Professor of Computer Science at Stanford University, with a joint appointment in Stanford's electrical engineering department. The following is his full address to the University's doctoral graduates:
Last month there was an earthquake in Silicon Valley. Nobody died; it wasn't even recorded by the U.S. Geological Survey, but the foundations shook, people looked at each other in horror, and the landscape changed shape. Instagram, a free cell phone photography app written by two former Stanford students, was bought for $1 billion by Facebook. $1 billion.
I teach a course on digital photography at Stanford. The morning after the sale, I started my lecture by popping up on the screen a news clipping about it. I waited for the students to read the clipping, then turned around and asked them, “Why are you still sitting there? You should be in your dorm room writing photography apps!”
What is Instagram? It lets you take a picture, apply colorful, funky or retro filters to it, and upload it to your favorite social site. Instagram won't end poverty, cure diseases or bring world peace. Well, it might help with that—by making it easier for us to share experiences. But the Instagram story is powerful, because the idea is disruptive. Nikon and Canon have spent 80 years refining their cameras to produce good pictures. Instagram gives you a hundred ways to turn a good picture into a bad picture, and it's wildly popular. The art world is built on a tradition of curation—by schools, museums, galleries and critics. Instagram lets you upload every picture you take, to the public, within five seconds of taking it. Curation is done by “liking” a picture someone else has posted. And Instagram is free. Facebook presumably has some ideas for monetizing it, but the usual economic model—I make a widget, you buy it from me for money—does not apply.
Americans are fond of new ideas. A disruptive idea is one that also upsets current ways of thinking, current values or current markets. Many business schools offer a course on disruptive technologies. Is this a good thing? As children we are taught not to be disruptive. As faculty (or Ph.D. students), being disruptive is expected of us, and we are rewarded for it. Of course, the meaning of “disruptive” changes as we grow up. Even so, it has not always been considered a virtue, and it is not a virtue in many cultures.
So where did this idea come from—that the role of a university is to produce disruptive ideas?
The robes you and I are wearing originated in the universities of medieval Europe, particularly Bologna and Padua. The highest degree obtainable at these universities was a doctorate, so you became a Doctor of Theology, or Medicine, or Laws or Arts—which meant everything not covered by the other three. This is where our modern doctorate comes from, with one important difference: doctorates in the Middle Ages were granted for advanced scholarship, not original research. You didn't need to do anything new.
Our modern doctorate grew out of the German university system of the 19th century, especially Humboldt University in Berlin. It offered the same four fields, plus one more—science, urged by the founder's brother, Alexander von Humboldt, the explorer of South America. The doctorate in arts was renamed Doctor of Philosophy, and the doctorate in science was named Doctor of Natural Philosophy. More important than names was the notion that to get such a degree you had to make an addition to human knowledge—something you couldn't find written in a book. This notion fit well with the Age of Enlightenment, and it spread quickly across Europe. In the United States it was the model for Johns Hopkins University, founded in 1876.
Although based on the German model, a visitor to a U.S. university at the beginning of the 20th century would have noticed some differences, for example that American faculty and students sat around a table together and called each other by first names. Though apparently minor, these differences are pivotal, because they encourage students to challenge the authority of their professors. When a graduate student from abroad starts at Stanford, calling me Marc instead of Professor Levoy is one of the hardest things they've ever done.
In fact, the entire American educational system, though frequently maligned, does a good job of cultivating independent thinking. In many countries, the walls of elementary school classrooms are bare. In the United States, the walls are festooned with students' creations. A common practice in U.S. primary schools is Student of the Week. It's not a competition; over the course of the school year every student gets to be Student of the Week. Its purpose is to showcase the individual: find each student's talents and interests, and celebrate them. I know, because my wife, Laurie, does this in her kindergarten classroom. I have to believe this practice contributes to America's high rate of innovation.
You guys are Ph.D. students. I once stood where you are. How does one foster independent thinking, in yourself and others, and what are its pitfalls? Let me tell you three stories.
My first story is about mentoring.
I studied architecture as an undergraduate. The first week of freshman year we were asked to draw a cube in perspective, but a cube with holes cut through it and extrusions sticking out from it. It's a nasty exercise. I had learned how to program a computer in high school, and this task seemed boring and methodical; maybe a computer could do it. So I took out a Fortran coding sheet and tried writing a program to generate a perspective drawing.
The instructor wandered by and said, “Levoy, what are you doing? Put that away, get out your T-square and triangle, and learn how to draw. Then, go downstairs and see Professor Donald Greenberg; he works with computers, maybe he can 'help' you.” So I did what I was told, but as I approached Professor Greenberg's office, I got cold feet. I had no proof a computer could create a perspective drawing; what if he laughed at me? In the end I gritted my teeth, knocked on his door, introduced myself and showed him my Fortran coding sheet.
Well, he burst out laughing—but only for a moment. Then he said, “You're right; a computer can make a perspective drawing. Let me show you.” And he took out an article just published in Scientific American by Ivan Sutherland, the father of computer graphics, describing how perspective views could be generated using matrix algebra, and how surfaces occluded from your view could be removed using a depth counting technique.
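(An editorial aside for the technically curious: the matrix-algebra approach Sutherland described remains the basis of computer graphics today. The short sketch below is written for this article, not taken from the speech; the focal length and cube coordinates are illustrative values. It projects the corners of a cube onto a 2-D screen with a single perspective matrix.)

```python
import numpy as np

# Eight corners of a cube sitting in front of the camera (homogeneous coords).
cube = np.array([[x, y, z, 1.0]
                 for x in (-1, 1) for y in (-1, 1) for z in (4, 6)])

f = 2.0  # illustrative focal length

# A minimal perspective matrix: copying z into the w component means the
# later divide by w shrinks distant points, which is all "perspective" is.
P = np.array([[f, 0, 0, 0],
              [0, f, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 1, 0]], dtype=float)

projected = cube @ P.T                          # the matrix algebra step
screen = projected[:, :2] / projected[:, 3:4]   # perspective divide
# Corners of the near face (z=4) land farther from the center than those
# of the far face (z=6), which is what gives the drawing its depth.
```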
I barely understood what he was talking about, but that meeting changed my life. First, it showed me that computers might be as interesting to me as architecture. More importantly, this famous professor had spent 30 minutes of his time not showing off his own work (I got plenty of that from other professors) but talking to me about my idea. When he first showed me Sutherland's article I was crestfallen, because it meant my idea wasn't new.
Don saw this in my face, but would have none of it. This idea wasn't new, but maybe the next one would be. He invited me to join his research group, which was exploring this new field of computer graphics, and he has served as a mentor and role model for 40 years.
Many of you will become professors. Others may work in industrial laboratories. Most of you will be parents. You will all certainly become teachers. But teaching is not about lecturing; the next generation may download their lectures from the Internet, if you believe Stanford's recent online experiments. Teaching is really about mentoring, which to me means guiding your charges to think—independently, critically, and fruitfully.
Remember, it's not about you; it's about them; good mentors can draw satisfaction from the successes of their students.
My second story is about work.
In 1996 my Stanford colleague Pat Hanrahan and I developed a theory about collections of images captured from slightly different viewpoints, for example by an array of cameras. We called this collection a light field. The bullet-time flyaround effect in the movie “The Matrix” is based on this idea. But as a commercializable product, light fields were a non-starter. Who could afford all those cameras?
Eight years later, in 2004, one of our Ph.D. students, Ren Ng, starting from this idea, realized that if you insert an array of tiny lenses into an ordinary camera, you could take a photograph that could be refocused after it was captured. Ren knew his idea had commercial value, so after he graduated he rented a loft in Mountain View, California—I guess all the garages were taken—and spent five years trying to talk the traditional camera vendors into building a light field camera. But Ren's design didn't fit the paradigm; it sacrificed megapixels to obtain refocusability, so the vendors weren't interested. Eventually he gave up and decided to build it himself. He renamed his company Lytro, and designed a camera that's as revolutionary in its shape as it is in its capabilities. Here is a Lytro camera. You can take pictures with it, and you can change what's in focus after you take the picture.
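(Another editorial aside: post-capture refocusing can sound like magic, but a standard textbook formulation, shift-and-add synthetic-aperture refocusing, is only a few lines. This is one way to think about what a light field camera computes; Lytro's actual pipeline is not described in the speech. The sketch assumes the sub-aperture views and their (u, v) aperture positions are already given.)

```python
import numpy as np

def refocus(views, positions, alpha):
    """Shift-and-add refocusing of a light field.

    views:     list of equally sized 2-D sub-aperture images
    positions: (u, v) aperture coordinate of each view
    alpha:     focus parameter; 0 keeps the original focal plane,
               other values move the synthetic focal plane
    """
    out = np.zeros_like(views[0], dtype=float)
    for img, (u, v) in zip(views, positions):
        # Views from different aperture positions see an out-of-focus
        # point at shifted locations; undoing that shift before averaging
        # brings the chosen plane into focus and blurs everything else.
        dy = int(round(alpha * v))
        dx = int(round(alpha * u))
        out += np.roll(img, shift=(dy, dx), axis=(0, 1))
    return out / len(views)
```

With `alpha = 0` the function simply averages the views; sweeping `alpha` sweeps the focal plane through the scene, which is exactly the "change what's in focus afterward" effect.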
But think how long this took. From the day Ren showed me his idea on the whiteboard, he struggled for seven years bringing his idea to market. I've been waiting for a light field camera even longer, 17 years since my original paper. I shouldn't be surprised; these long intervals between academic research and commercial deployment have been well documented in a study co-authored by our own Fred Brooks.
You just finished a Ph.D. On average, it took you five years, 5.5 if you graduated from UNC's Computer Science department. You're thinking to yourself—what a slog; I'm glad it's over. But if you continue to pursue a life in research, your Ph.D. will probably be your shortest project, a warm-up designed by your advisor to bring satisfaction and publication in two to three years.
The next part you have to do yourself. Make sure you choose a worthwhile problem. In the words of Richard Hamming: “If you don't work on important problems, it's not likely that you'll do important work.” Then, if you're creative you'll have lots of ideas. If you're wise, you'll discard most of them. If you're persuasive, you'll convince someone to pay you to work on the remaining ideas. Then, be patient and persistent. For scientists, Nature does not unlock her door easily, but the wonders inside are worth the struggle. For engineers, there are indeed new things under the sun. For scholars, the human condition is always changing, and the best, most useful words about it have not been written.
My last story is about fear.
When I applied for faculty positions, I was invited to interview at Stanford. Naturally, I was terrified. You see, I have an imposter syndrome. Computer scientists are supposed to be good at math. I'm bad at math. I can't write theorems. If I wrote a theorem I couldn't prove it. It would take Stanford 10 minutes to discover that their invitation was a mistake.
On the morning of the interview, I arrived on campus early and wandered into Stanford's Memorial Church—not to pray for success, but because it was beautiful, dark and quiet. After sitting for 15 minutes to calm my nerves, I started across the quadrangle towards the Computer Science department. Along the way I crossed paths with two Ph.D. students. One turned to the second and said in disgust, “This place is so elitist!” I was a nervous wreck again.
My talk went OK, and at lunchtime they sent me to the faculty club with a Nobel Prize winner and two Turing Award winners (the Computer Science equivalent of the Nobel Prize). These worthies sat around the table debating alternative cosmologies of the universe. I stared into my chocolate mousse and wished I were on a beach somewhere. By late afternoon, after meeting with a different professor every 45 minutes, I was fried, so I wandered into the bookstore—another place of refuge and calm. Bad move. I immediately ran into a rack of books by Stanford authors, including a shelf by the professors who had just interviewed me. These books were full of math, and theorems and proofs of theorems.
Two weeks later, I got an offer from Stanford. What were they thinking? I can't get tenure there! I also had offers from more plausible schools for my modest skill set; I should take one of those. (By the way, I turned down an offer from UNC, because Ph.D. students should move to another school after their degree.) To help me decide I called Don Greenberg, my mentor from Cornell. He told me I was acting like a coward, rather bluntly as I recall. He told me that if I didn't take the job at Stanford, I would spend the rest of my life wondering, “What if?”
So I swallowed my fears and accepted Stanford's offer. That was 20-odd years ago. I still have an imposter syndrome, and I still can't do math, but it turns out that I know more than the students, at least when they first join. And every once in a while I stumble across a decent idea, or I stumble across 10 ideas and my students help me realize that nine of them are nonsense. And I like teaching, and telling stories. These things turn out to be more important than I thought.
Let me end by returning to the theme of disruptive technologies.
Facebook, the company that bought Instagram for $1 billion, is itself no stranger to “disruptive.” For three years I lived next door to Facebook. Literally next door. I could throw a rock from my bedroom window and break a window in Mark Zuckerberg's office. They have a particular culture at Facebook: young, edgy, in-your-face. Their walls are covered with graffiti and posters, spray-painted in Wild West Wanted-dead-or-alive font. One of them says: “Move fast, break things.” Their offices have concrete floors, no interior walls and lots of skateboards, so I imagine they do break things.
My favorite poster says, “What would you do if you weren't afraid?” Being afraid is useful. On the savannah it kept us from being eaten by lions. Nevertheless, this is the message I'd like to leave you with, the same message my mentor Don Greenberg gave to me. Know your fears, know also what you really want, weigh the odds, and occasionally, make a run for it.
“What would you do if you weren't afraid?”
Thank you, and good luck.
Dr. Levoy is known for helping to create the field of computational photography.
Dr. Levoy received bachelor's and master's degrees in architecture from Cornell University. His master's thesis focused on computer-assisted cartoon animation—research he applied later as senior scientist and director of Hanna-Barbera Productions' computer animation department.
He began his UNC doctoral studies in computer science, in the College of Arts and Sciences, in 1984, focusing on a computer graphics technique called volume rendering. The technique provides three-dimensional depth when displaying computed tomography and magnetic resonance imaging data. After receiving his doctorate, Dr. Levoy spent a year as a research assistant professor in UNC's computer science department before joining the Stanford faculty in 1990.
His more recent research achievements have included co-designing the Google book scanner and launching Google's Street View project. Dr. Levoy's current interests include light fields, optical microscopy and the emerging field of computational photography.
Faculty Award for Excellence in Doctoral Mentoring
Dr. J. Christopher Clemens, professor of physics and astronomy, received the 2012 Faculty Award for Excellence in Doctoral Mentoring at the University's recent Doctoral Hooding Ceremony. The Graduate School presents the annual award to a faculty member who has encouraged graduate students to establish their own records of scholarly activity, provided a supportive environment that brings forth the very best from students, and achieved a successful record of graduate degree completion among students he or she has advised.
A nomination letter said the following of Clemens:
Of the more than 100 K-12 teachers, undergraduate lecturers and graduate professors who have taught me in the past 23 years, Chris is, without hesitation, the most effective, compassionate, intelligent and entertaining educator I have had the privilege of learning from.