Radia Perlman at LCA 2013
A little before 9am on Tuesday 29th January, I filed into ANU’s Llewellyn Hall along with approximately 700 other Linux.conf.au delegates to listen to the daily keynote speech. I’m now a little embarrassed to admit that I had never heard of Radia Perlman. A little over an hour later, I was a fangirl.
Radia delivered an engaging, funny, and highly-technical keynote address at LCA2013, and the audience of IT professionals and enthusiasts present lapped it up. In it, she placed the technical details of the network protocols she and her colleagues developed in an historical context. She half-jokingly explained that this was the only way in which anyone could hope to understand why the protocols we work with today include ‘features’ in their design that would otherwise seem crazy to an outside observer.
In delivering her keynote, Radia gave us not just the technical detail behind the development of networking protocols, but also wove in details from her creative side, as well as tidbits about her children’s involvement in her technical life. The crowd was delighted as Radia shared with us the poem that she created (an ‘Algorhyme’) shortly after devising the Spanning Tree Protocol in 1985 while at DEC. For as she says, “Every algorithm deserves an algorhyme…” (You can hear Radia reciting her Algorhyme in Dan’s video interview with Radia below, and read the text here.) There is also a recording of Radia’s daughter, Dawn Perlner, singing the Algorhyme set to music by Radia’s son, Ray Perlner. Radia also mentioned her son Ray’s involvement in the creation of an Algorhyme v2 to mark the creation of her most recent network protocol, TRILL.
Later in the week, I had the pleasure of heading off to lunch with Radia and a small group of fellow delegates during a break in technical sessions. She is engaging and thoughtful, and concerned as much with solutions for societal issues as solutions for thorny networking challenges. Radia is eternally self-effacing, and repeatedly claimed that she had “never done anything difficult” in her work. To the contrary, it seems to me that she applies her sharp brain to examining problems from all sides, building up a mental picture and thinking upon them until she distills her deep understanding of the situation into an elegant solution. The end result may appear ‘simple’, however the “thinky work” involved is anything but. Many delegates that I spoke with about Radia were in awe of the apparent ease with which she has managed to solve incredibly complex problems.
I’m a particular fan of Radia’s “auto-configure everything” approach to engineering solutions. She articulated her philosophy that all networking protocols should be designed to ‘simply’ auto-configure themselves to the optimal setup for any given network environment. She described the way this happens in the Spanning Tree Protocol as all of the nodes simply “gossiping amongst themselves” to figure out the topology and automatically devise optimal routes. When she tried to implement her “auto-configure everything” approach however, she found that “customers want knobs to fiddle with.” So, if it makes people happier to have some parameters to tweak, she’ll happily “give them some knobs”. It’s clear from her tone that she sees this as superfluous to an elegant solution, but shrugs her shoulders and moves on with her work.
I got the feeling that Radia Perlman is quite content to confine herself to solving the ‘simple’ problems, creating engineering solutions that enable our computers to talk to each other efficiently. She allows others to focus on the inelegant commercial aspects of things.
The full video of Radia’s LCA2013 keynote is now available online for you to view.
—
Interview Transcript
How are you finding the conference?
I’m actually amazed at how good a time I’ve had here. I was initially a little bit nervous because I’m not really a Linux person, and I was trying to weave open source and Linux into the keynote somehow, but then decided I’d talk about what I know best. The audience was just great, and one of the highlights is just how nice the people are. There’s a whole bunch of geeks together and nobody calls each other stupid. When somebody loses something or needs help, immediately a bunch of people jump in to answer. It was amazingly well organized. All of the special social events and everything were beautiful; they kind of wove in sightseeing. All the talks were very well organized. It’s an incredibly good conference.
Tell us about your work at Intel.
So I’m at Intel. My title is Fellow, whatever that means. Basically I do research on things. The kind of thing I like to do is, if at all possible, to make things just work automatically so that people don’t have to understand the underlying technology. Also, to make things evolutionary if possible, so as not to tell people, “Throw everything away that you have and invent this new thing”. I also really like, when there are two competing things, or many competing things, trying to figure out what’s actually intrinsic to the differences between them. I prefer not to look at two 600-page specs and compare them, but instead to break it into conceptual pieces where they might differ and talk about the pros and cons of those things.
Starting with your contribution to the web. Did it turn out how you expected it?
So, I’m starting with what I’ve done. I tend to do the very low-level plumbing that people hopefully don’t even have to be aware of. As a matter of fact, I was doing DECnet, the parts of it that move the packets around and stuff, and I suppose one of the greatest compliments I ever got was after we deployed it, when the marketing department of Digital asked some of the major customers “What did you think of DECnet?” and most of them answered “What’s DECnet?” even though they were using it. That’s how I really want technology to be. The thing that I’ve mainly done is how to make a network where you just plug it together and the little components gossip amongst themselves automatically, figure out how to move data around, and figure out how to make any configurations compatible. So I’ve done that at layer three. If one knows what the network stack is, layer three is the thing that traditionally moves packets around.
The actual routing protocol that I designed at DECnet in the 1980s is actually still in use in most ISPs. It’s called IS-IS. It’s not as well known as OSPF, which was kind of derived from it, but it’s still in use, and a lot of the ideas also wound up in OSPF. Also, what’s kind of ironic is that Ethernet has nothing to do today with what was originally invented. It was invented as a single link where, when anyone talked, everyone else could hear what was said. You still needed a layer three protocol in order to carry packets between Ethernets, but the industry just got confused and left that out. I was responsible for designing something that allowed you to hook many Ethernets together in a way that didn’t depend on the computers attached to it having implemented the layer three protocol. That is known as spanning tree, the spanning tree algorithm. I thought that was only going to last for a year or two until people put layer three back, but it was so easy to use and so self-configuring compared with IP, which was the layer three protocol that people chose, that sort of, to my horror, they were still using it. It’s not an optimal way of hooking things together. It was just the best that could be designed given the constraints we had to work with: not changing the Ethernet packet in any way, and not changing how the computers worked with it. So, that is sub-optimal. You don’t get best paths and things like that, so about six or seven years ago I designed another thing that allows you to gradually migrate from spanning tree to something that allows optimal paths, multi-pathing, anything that a layer three protocol would do, without having to change the nodes or to have to throw away all of your spanning tree bridges. That new thing is known as TRILL, t-r-i-l-l, which stands for Transparent Interconnection of Lots of Links.
Let’s talk about poetry.
I think I know what you’re asking me. Part of the spanning tree algorithm: because I had an entire week to work on it, I spent one night realizing exactly how to do it, two days writing the spec, which was sufficiently complete that people implemented it without asking a single question, and the remainder of the week I wound up writing a poem that is the abstract of the paper. The title of the poem is “Algorhyme”, because every algorithm should have an algorhyme. The poem is: “I think that I shall never see a graph more lovely than a tree. A tree whose crucial property is loop-free connectivity. A tree which must be sure to span so packets can reach every LAN. First the root must be selected; by ID it is elected. Least-cost paths from root are traced. In the tree these paths are placed. A mesh is made by folks like me, then bridges find a spanning tree.”
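The poem’s steps can be sketched as ordinary graph code. Below is a toy Python sketch, not the real protocol (which works by bridges exchanging configuration messages rather than any node seeing the whole graph): the lowest ID is elected root, least-cost paths from the root are traced with Dijkstra’s algorithm, and those links form the tree. The bridge IDs and link costs are invented for illustration.

```python
import heapq

def spanning_tree(bridges, links):
    """bridges: list of bridge IDs; links: {(a, b): cost}, undirected."""
    # "First the root must be selected; by ID it is elected."
    root = min(bridges)
    # Build an adjacency list from the undirected links.
    adj = {b: [] for b in bridges}
    for (a, b), cost in links.items():
        adj[a].append((b, cost))
        adj[b].append((a, cost))
    # "Least-cost paths from root are traced" -- Dijkstra from the root.
    dist = {root: 0}
    parent = {}
    heap = [(0, root)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, cost in adj[node]:
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                parent[nbr] = node  # "In the tree these paths are placed."
                heapq.heappush(heap, (nd, nbr))
    # Each non-root bridge keeps the one link toward the root; that is the tree.
    return root, {(parent[b], b) for b in parent}

bridges = [3, 1, 7, 5]
links = {(1, 3): 1, (1, 5): 1, (3, 7): 1, (5, 7): 1, (3, 5): 1}
root, tree = spanning_tree(bridges, links)
print(root)          # 1 -- the lowest ID becomes root
print(sorted(tree))  # [(1, 3), (1, 5), (3, 7)] -- a loop-free subset of the mesh
```

Note how the five-link mesh is pruned to three tree links: loop-free connectivity, at the cost of leaving some links unused, which is exactly the sub-optimality TRILL later removed.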
I happened to also write a book on layers two and three. It was such a fuzzy field, full of marketing hype and so confusing. I believe that the book was a significant contribution in terms of organizing people’s thoughts. I did a second book with co-authors on cryptography and network security. That book is called “Network Security: Private Communication in a Public World”.
The first book is called “Interconnections: Bridges, Routers, Switches, and Internetworking Protocols”. I seem to like to put everything into the title. I’ve also done some interesting stuff in security, in terms of authentication, authorization, and the ability to create data with an expiration date. Once you make a lot of backups you’ll never be able to guarantee to get rid of all of them, but once you’ve given a file an expiration date and it expires, the day after that it’s impossible to recover, even if the backups still exist. This is, like everything I’ve done, very simple, very practical, easy to manage, with zero performance overhead over a traditional encrypted file system.
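A common way to get this property is to encrypt the data and tie the key to the expiration date, so that destroying the one small key makes every backup of the ciphertext unreadable. The toy Python below is only a sketch of that shape, not her actual design: the XOR keystream stands in for a real cipher, and the key store and date are invented for illustration.

```python
import hashlib
import itertools
import os

def keystream(key: bytes):
    """Toy keystream: SHA-256 of key + counter. NOT real cryptography."""
    for counter in itertools.count():
        yield from hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR data with the keystream; applying it twice recovers the data."""
    return bytes(a ^ b for a, b in zip(data, keystream(key)))

# One key per expiration date; in a real system a separate service would
# hold these keys and discard each one when its date passes.
keys = {"2025-12-31": os.urandom(32)}

plaintext = b"shred me after New Year's Eve"
ciphertext = xor_cipher(keys["2025-12-31"], plaintext)  # what the backups contain

# Before expiry the key still exists, so the data is recoverable:
assert xor_cipher(keys["2025-12-31"], ciphertext) == plaintext

# "Expiring" the file just means deleting the key...
del keys["2025-12-31"]
# ...after which no copy of the ciphertext, on however many backup tapes,
# can ever be decrypted again.
```

The appeal is that you only have to reliably destroy a 32-byte key, not hunt down every backup of the data itself.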
Views on gadgets and technology.
Part of it is because I really dislike technology. I am not a gadget person. I literally went to college without knowing how to change a light bulb. It was something I would just tell my father and a few days later it was fixed. I never even wondered what he did. I assumed it was dangerous or difficult somehow. I think we need more people in the field that dislike technology, that can design things that work as transparently as possible. People should not need to have to understand what’s going on inside in order to use it.
Did the web bring any surprises?
In the beginning it was like, well, isn’t email amazing? You can send email between things and you can do file transfer. I don’t think people imagined much beyond that. There are good things that work that I am absolutely astonished by; they shouldn’t be able to work. For one thing, the idea that you can search for something on the internet and get back an answer instantly, way faster than if you’re searching through your email for a particular message on your one machine. How could that possibly work? I even sort of know how it works, but I still don’t actually believe it. Another thing that’s amazing is that you can search for something like shoes, find some merchant that you never heard of before in some country that you never heard of before, send them your credit card, and the shoes appear and your credit card isn’t stolen as a result. It’s just astonishing. Another is the whole page-rank business. It would be so easy for people to post bad information, some really scary things like wrong information about what to do if your child ingests some poison, and yet, even though I’m sure there are things like that, because of the page-rank stuff you see the correct information first. Another thing is Wikipedia. By any logic it should just be filled with the graffiti scrawlings of ranting people, but it’s one of the best sources of information you can get. That’s an amazing thing too. On the other hand, there were negative things that nobody foresaw. One is spam. We’re just getting flooded in spam. Most of it is caught by spam filters and yet you still see all of this. What’s really kind of tragic is that, after that many years of these scams about “oh, I have $20 billion I need to get out of this country, and your email address seemed like a reputable person, so you can have 10% of it”, people quite tragically still get caught up in them.
It only takes a very, very tiny percentage of people to get fooled by that to make it worthwhile for these horrible criminals to do it because it costs them almost nothing.
Another thing that nobody saw coming is distributed denial of service. They were thinking, “One bad guy, how many messages could they possibly send?”
This notion of being able to break into tens of thousands of nodes changed that. In the early security papers people said “oh, you don’t have to worry about denial of service” because it wouldn’t benefit anybody. Now there are entire black-market economies, in terms of “I have 10,000 bots, 10,000 machines I have compromised, and if you want to attack somebody by sending them lots of traffic, this is how much I will rent my army for per hour” or whatever. I’ve always wondered a little bit about exactly how you pay these criminals when you rent their bot army, because you’re a criminal and they’re a criminal; what kind of currency do you actually use? All of these things are astonishing to me.
About Education.
I was always good at science and math. I loved logic problems. If people had asked me what I wanted to do when I grew up, at the time I would have said, “I’m interested in pretty much anything. I just want a job with people I enjoy working with, that is something I’m reasonably good at, that pays me enough to live on, and I do not have expensive tastes.” If I’d known a little bit more about computers when I was younger, I would have modified that statement by saying, “Oh, I’d be happy with pretty much anything as long as it didn’t involve computers”. I’m very, very happy in what I do. There’s this traditional stereotype of an engineer: someone that built a ham radio when they were seven, or built a computer out of spare parts, or whatever. There are two forces at work: one, there are people who believe they wouldn’t be good in this field because they weren’t like that, and two, there are people doing the hiring who think you’re not in the true spirit of engineering if you weren’t like that.
The thing is you need people who think at different levels. Although I’m not a dive-right-in-and-start-writing-code kind of person, I’m actually someone who thinks about a problem from different angles first, and gets to the very simple, obvious solution, finds the right problem to solve and exactly how to do it. I can dive in with an implementer and discuss what the data structures should be and what the algorithms should be, but it’s this thoughtful conceptualizing first rather than the great enthusiasm of just understanding a few special cases and writing code for those. That’s what we really need. We need to have a mixture of people: those that have actually met actual human beings and know what they would be likely to do, rather than people that make pop-up boxes that say “do you want to display only the secure items”. I happen to know what that means and it still doesn’t mean anything. You need people who can think about a problem thoughtfully and conceptually first before worrying about the actual details, someone that can examine two things without worrying about the details of syntax and try to get to the heart of what the pros and cons are, as well as the people who just incredibly enthusiastically like to write code. We have to make sure that young people understand that it’s a very wide field, and that being different from everybody else in the field is likely to mean you will find an ecological niche in which you’ll be very valuable, as well as making sure that people who are creating a team understand that it’s very important to mix people with different skills.
One thing that’s incredibly exciting about the web is that we can really democratize education, instead of it just being something that’s only available to people who can drop everything else in their life, live someplace on campus, and spend, who knows what it is these days, $6,000,000 per year on tuition. It’s been a long time since I was in school so I don’t know what tuition is right now. Instead, there really ought to be a lot of well-organized resources so that you can learn things yourself. One of the problems is that there is likely to be a lot of real junk out there too that would waste your time, so it would help if there were some way of making sure that you can find the right kind of courses and teach yourself. You may not be able to get accredited; maybe you could spend extra money to be officially accredited after you’ve done that. Even in the traditional model, where only certain people get accepted into a particular university to take courses for credit, there are still a lot of ways in which the web can make things easier, by videoing the lectures so you can review things that you haven’t seen, and by having an online community where people can ask each other questions and tutor each other and so forth.
People say, “Isn’t it great? The internet is so powerful that you can’t prevent knowledge from getting out”. It’s actually kind of the opposite. In the old days, when the only way you could get news, for instance, was on the three major networks, the information had to be fairly reasonably balanced or else they would lose their license. These days, no matter what bizarre belief you have, there are so many TV channels that you can find one that only reinforces and magnifies what you believe, and they are motivated to do that because that’s what their listeners really want to hear. They don’t want to hear anything on the opposite side. Websites too: if you believe that earthworms are telepathically controlling our thoughts or something, you can find 50 people across the earth that all believe the same thing, and inside that community it will seem like it must be true. It is very difficult to separate good information from bad. My thesis was on how to make a network work even though some of the switches are malicious.
A distributed algorithm is one where everybody’s doing their own piece of the puzzle, and it all forms this magnificent symphony. But imagine if some of them could give wrong information, or, even while doing the protocol correctly, forward packets incorrectly, or flood the network with garbage, or work perfectly for everybody except throwing away your packets. How could you make a network work? When I proposed that as a thesis topic everyone said, “yeah, yeah, that’s really hard”, but I’ve never done anything hard. Once I thought about it I said, “Oh my gosh, it’s simple and even reasonably practical”, and then I was nervous I wouldn’t get a PhD for it. I asked somebody, “Is there any minimum length to a thesis?” and the professor I asked said, “Well, it either has to be long or good”. That’s another interest of mine, which is that people assume all information is correct. There’s this web of trust with certificates, for instance, where anybody can sign anybody else’s certificates and you have this public repository: you start with the key of somebody that you know, then you just search through the database to find a path to the name that you want. This works fine in a small community where everybody is only putting in trusted information, but it only takes one guy putting in false identities and false certificates before the whole model falls down. Another thing is somebody was doing some research on trying to make inferences from information on the web, just looking at stuff that happens to be on the web. The example was: Flipper is a dolphin, and a dolphin is a mammal, therefore you know that Flipper is a mammal. I said
“well, that’s great if everything on the web is true, but if you actually turned it loose on the web you would conclude that the cause of global warming is women working outside the home or something like that”.
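The web-of-trust failure described above is easy to see in code: trust paths are found by graph search over signatures, so a single forged certificate is enough to make a “valid” path appear to a fake identity. Below is a minimal Python sketch of that search; all the names and certificates are invented for illustration.

```python
from collections import deque

def find_trust_path(certs, start, target):
    """certs: set of (signer, subject) pairs. BFS for a chain of signatures
    from a key you trust (start) to the name you want (target)."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for signer, subject in certs:
            if signer == path[-1] and subject not in seen:
                seen.add(subject)
                queue.append(path + [subject])
    return None  # no chain of signatures reaches the target

# An honest little community: Alice signed Bob's key, Bob signed Carol's.
honest = {("alice", "bob"), ("bob", "carol")}
print(find_trust_path(honest, "alice", "carol"))  # ['alice', 'bob', 'carol']

# One compromised or careless participant signing a fake identity is enough:
poisoned = honest | {("carol", "fake-bank")}
print(find_trust_path(poisoned, "alice", "fake-bank"))
# ['alice', 'bob', 'carol', 'fake-bank'] -- a "trusted" path to an impostor
```

The search has no way to distinguish the forged edge from the honest ones, which is exactly why the model falls down once anyone injects false certificates.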
Web and security.
It’s hard to imagine that this whole wonderful thing won’t fall apart. We’re already drowning in spam and denial of service and malware, where you get infected. In the old days it was just don’t be stupid enough to put a possibly infected disk into your machine to boot it, but now it’s a matter of don’t open any email message, don’t connect to any website, don’t even turn your machine on, in which case you know that it won’t get infected. That’s not really acceptable. Society is so dependent on it today that I’m hoping these things have to be solved. I’m also nervous about running out of wireless bandwidth, with everybody just saying, “Well, we should be able to watch seven HD movies simultaneously, one on our elbow, one on our wrist”, being so casual about assuming that everything will be connected at all times with high bandwidth. That just seems like it is going to fall apart, so I think people need to rethink it. They also want everything to just seamlessly talk to each other, and you have to be careful with that too, because you don’t want your neighbor controlling your thermostat, either accidentally or maliciously.
Possible solution?
So one of the things that I think could possibly help is if people didn’t make their software implementations so extensible. If really all you want to do is read email and look at websites, is it possible to do it in a way where you’re not required to download code in order to view a website? It seems like that ought to be possible. If you had a machine with no way to download code onto it, one that just does the very simple things, and if for most people using the web that’s all they really need to do, you’d have a much more resilient ecosystem.
My co-author on this security book once gave a talk many years ago, 20 years ago, saying “Is it too much to ask for, that the web be as secure as a telephone?” With a telephone you dial a number, you know who you’re reaching. You can dial any number without it destroying your phone. Then he gave the same talk many years later and he said “Well, the good news is the security of the web is approaching the security of the phone, but the bad news is it’s because phones are also getting to be general purpose machines that you download software on and they’re just as easily infected as regular machines”.
Smith J (2013-02-06 21:48:19). Radia Perlman at LCA 2013. Australian Science. Retrieved: Nov 21, 2024, from http://ozscience.com/interviews/radia-perlman-at-lca-2013/