Monday, September 24, 2007

Happy Birthday, Sputnik! (Thanks for the Internet)

Fifty years ago, a small Soviet satellite was launched, stunning the U.S. and sparking a massive technology research effort. Could we be in for another "October surprise"?

Quick, what's the most influential piece of hardware from the early days of computing? The IBM 360 mainframe? The DEC PDP-1 minicomputer? Maybe earlier computers such as Binac, ENIAC or Univac? Or, going way back to the 1800s, is it the Babbage Difference Engine?
More likely, it was a 183-pound aluminum sphere called Sputnik, Russian for "traveling companion." Fifty years ago, on Oct. 4, 1957, radio-transmitted beeps from the first man-made object to orbit the Earth stunned and frightened the U.S., and the country's reaction to the "October surprise" changed computing forever.
Although Sputnik fell from orbit just three months after launch, it marked the beginning of the Space Age, and in the U.S., it produced angst bordering on hysteria. Soon, there was talk of a U.S.-Soviet "missile gap." Then on Dec. 6, 1957, a Vanguard rocket that was to have carried aloft the first U.S. satellite exploded on the launch pad. The press dubbed the Vanguard "Kaputnik," and the public demanded that something be done.
The most immediate "something" was the creation of the Advanced Research Projects Agency (ARPA), a freewheeling Pentagon office created by President Eisenhower on Feb. 7, 1958. Its mission was to "prevent technological surprises," and in those first days, it was heavily weighted toward space programs.
Speaking of surprises, it might surprise some to learn that on the list of people who have most influenced the course of IT -- people with names like von Neumann, Watson, Hopper, Amdahl, Cerf, Gates and Berners-Lee -- appears the name J.C.R. Licklider, the first director of IT research at ARPA.
Armed with a big budget, carte blanche from his bosses and an unerring ability to attract bright people, Licklider catalyzed the invention of an astonishing array of IT, from time sharing to computer graphics to microprocessors to the Internet.
But now, the special culture that enabled Licklider and his successors to work their magic has largely disappeared from government, many say, setting up the U.S. once again for a technological drubbing. Could there be another Sputnik? "Oh, yes," says Leonard Kleinrock, the Internet pioneer who developed the principles behind packet-switching, the basis for the Internet, while Licklider was at ARPA. "But it's not going to be a surprise this time. We all see it coming."
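The packet-switching idea credited to Kleinrock here is simple to sketch: a message is cut into small packets that travel the network independently, possibly arriving out of order, and the receiver reassembles them by sequence number. Below is a minimal illustration in Python; the function names and packet size are invented for the demo and correspond to no real protocol stack.

    # A toy model of packet switching: split, scramble in transit, reassemble.
    import random

    PACKET_SIZE = 8  # payload bytes per packet; arbitrary for this sketch

    def packetize(message: bytes):
        """Split a message into (sequence number, payload) packets."""
        return [(i, message[i:i + PACKET_SIZE])
                for i in range(0, len(message), PACKET_SIZE)]

    def network(packets):
        """Model the network: packets take different paths, so order is lost."""
        shuffled = list(packets)
        random.shuffle(shuffled)
        return shuffled

    def reassemble(packets):
        """The receiver restores order by sequence number and rejoins payloads."""
        return b"".join(payload for _, payload in sorted(packets))

    message = b"Fifty years ago, a small Soviet satellite was launched."
    assert reassemble(network(packetize(message))) == message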
The ARPA Way
Licklider had studied psychology as an undergraduate, and in 1962, he brought to ARPA a passionate belief that computers could be far more user-friendly than the unconnected, batch-processing behemoths of the day. Two years earlier, he had published an influential paper, "Man-Computer Symbiosis," in which he laid out his vision for computers that could interact with users in real time. It was a radical idea, one utterly rejected by most academic and industrial researchers at the time. (See sidebar, Advanced Computing Visions from 1960.)
Driven by the idea that computers might not only converse with their users, but also with one another, Licklider set out on behalf of ARPA to find the best available research talent. He found it at companies like the RAND Corp., but mostly he found it at universities, starting first at MIT and then adding to his list Carnegie Mellon University; Stanford University; University of California, Berkeley; the University of Utah; and others.


Advanced Computing Visions from 1960
Nearly a half-century ago, a former MIT professor of psychology and electrical engineering wrote a paper -- largely forgotten today -- that anticipated by decades the emergence of computer time sharing, networks and some features that even today are at the leading edge of IT.
Licklider wrote "Man-Computer Symbiosis" in 1960, at a time when computing was done by a handful of big, stand-alone batch-processing machines. In addition to predicting "networks of thinking centers," he said man-computer symbiosis would require the following advances:
Indexed databases. "Implicit in the idea of man-computer symbiosis are the requirements that information be retrievable both by name and by pattern and that it be accessible through procedures much faster than serial search." (See the sketch after this list.)
Machine learning in the form of "self-organizing" programs. "Computers will in due course be able to devise and simplify their own procedures for achieving stated goals."
Dynamic linking of programs and applications, or "real-time concatenation of preprogrammed segments and closed subroutines which the human operator can designate and call into action simply by name."
More and better methods for input and output. "In generally available computers, there is almost no provision for any more effective, immediate man-machine communication than can be achieved with an electric typewriter."
Tablet input and handwriting recognition. "It will be necessary for the man and the computer to draw graphs and pictures and to write notes and equations to each other on the same display surface."
Speech recognition. "The interest stems from realization that one can hardly take a ... corporation president away from his work to teach him to type."
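Two of the requirements above -- retrieval by name or pattern without serial search, and calling "preprogrammed segments" into action simply by name -- map neatly onto constructs any modern language provides. A toy sketch in Python, with all data and routine names invented for the illustration:

    # Retrieval by name: a hash-based index gives direct lookup,
    # with no serial scan through every record.
    documents = {
        "sputnik": "183-pound aluminum sphere, launched Oct. 4, 1957",
        "vanguard": "U.S. rocket that exploded on the pad, Dec. 6, 1957",
        "arpa": "Pentagon research agency created Feb. 7, 1958",
    }
    print(documents["arpa"])

    # Retrieval by pattern: a simple substring match stands in for the
    # pattern-directed access Licklider described.
    matches = {k: v for k, v in documents.items() if "1957" in v}
    print(sorted(matches))

    # "Real-time concatenation of preprogrammed segments ... simply by name":
    # routines registered in a table and invoked by string name at run time.
    subroutines = {
        "shout": str.upper,
        "reverse": lambda s: s[::-1],
    }
    print(subroutines["shout"](documents["sputnik"]))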
Licklider sought out researchers like himself: bright, farsighted and impatient with bureaucratic impediments. He established a culture and modus operandi -- and passed it on to his successors Ivan Sutherland, Robert Taylor, Larry Roberts and Bob Kahn -- that would make the agency, over the next 30 years, the most powerful engine for IT innovation in the world.
Recalls Kleinrock, "Licklider set the tone for ARPA's funding model: long-term, high-risk, high-payoff and visionary, with program managers who let principal investigators run with research as they saw fit." (Although Kleinrock never worked at ARPA, he played a key role in the development of the ARPAnet, and in 1969, he directed the installation of the first ARPAnet node at UCLA.)
From the early 1960s, ARPA built close relationships with universities and a few companies, each doing what it did best while drawing on the accomplishments of the others. What began as a simple attempt to link the computers used by a handful of U.S. Department of Defense researchers ultimately led to the global Internet of today.
Along the way, ARPA spawned an incredible array of supporting technologies, including time sharing, workstations, computer graphics, graphical user interfaces, very large-scale integration (VLSI) design, RISC processors and parallel computing (see DARPA's Role in IT Innovations). There were four ingredients in this recipe for success: generous funding, brilliant people, freedom from red tape and the occasional ascent to the bully pulpit by ARPA managers.
These individual technologies had a way of cross-fertilizing and combining over time in ways probably not foreseen even by ARPA managers. What would become the Sun Microsystems Inc. workstation, for example, owes its origins rather directly to a half-dozen major technologies developed at multiple universities and companies, all funded by ARPA. (See Timeline: Three Decades of DARPA Hegemony.)
Ed Lazowska, a computer science professor at the University of Washington in Seattle, offers this story from the 1970s and early 1980s, when Kahn was a DARPA program manager, then director of its Information Processing Techniques Office:
"What Kahn did was absolutely remarkable. He supported the DARPA VLSI program, which funded the [Carver] Mead-[Lynn] Conway integrated circuit design methodology. Then he funded the SUN workstation at Stanford because Forest Baskett needed a high-resolution, bitmapped workstation for doing VLSI design, and his grad student, Andy Bechtolsheim, had an idea for a new frame buffer.
"Meanwhile, [Kahn] funded Berkeley to do Berkeley Unix. He wanted to turn Unix into a common platform for all his researchers so they could share results more easily, and he also saw it as a Trojan horse to drive the adoption of TCP/IP. That was at a time when every company had its own networking protocol -- IBM with SNA, DEC with DECnet, the Europeans with X.25 -- all brain-dead protocols."
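What that common platform bought is visible in how little code it now takes for two machines to exchange bytes over TCP/IP. A minimal loopback sketch using Python's standard socket module; the host, port choice and messages are invented for the demo:

    import socket, threading

    def server(sock):
        conn, _ = sock.accept()
        with conn:
            data = conn.recv(1024)         # receive the request bytes
            conn.sendall(b"ack: " + data)  # echo back; TCP preserves order

    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
    listener.listen(1)
    threading.Thread(target=server, args=(listener,), daemon=True).start()

    with socket.create_connection(listener.getsockname()) as client:
        client.sendall(b"hello from SNA-free land")
        print(client.recv(1024))           # b'ack: hello from SNA-free land'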
Surprise?
The launch of the Soviet satellite Sputnik shocked the world and became known as the "October surprise." But was it really?
Paul Green was working at MIT's Lincoln Laboratory in 1957 as a communications researcher. He had learned Russian and was invited to give talks to the Popov Society, a group of Soviet technology professionals. "So I knew Russian scientists," Green recalls. "In particular, I knew this big-shot academician named [Vladimir] Kotelnikov."
In the summer of 1957, Green told Computerworld, a coterie of Soviet scientists, including Kotelnikov, attended a meeting of the International Scientific Radio Union in Boulder, Colo. Says Green, "At the meeting, Kotelnikov -- who, it turned out later, was involved with Sputnik -- just mentioned casually, 'Yeah, we are about to launch a satellite.'"
"It didn't register much because the Russians were given to braggadocio. And we didn't realize what that might mean -- that if you could launch a satellite in those days, you must have a giant missile and all kinds of capabilities that were scary. It sort of went in one ear and out the other."
And did he tell anyone in Washington? "None of us even mentioned it in our trip reports," he says.
DARPA Today
But around 2000, Kleinrock and other top-shelf technology researchers say, the agency, now called the Defense Advanced Research Projects Agency (DARPA), began to focus more on pragmatic, military objectives. A new administration was in power in Washington, and then 9/11 changed priorities everywhere. Observers say DARPA shifted much of its funding from long-range to shorter-term research, from universities to military contractors, and from unclassified work to secret programs.
Of government funding for IT, Kleinrock says, "our researchers are now being channeled into small science, small and incremental goals, short-term focus and small funding levels." The result, critics say, is that DARPA is much less likely today to spawn the kinds of revolutionary advances in IT that came from Licklider and his successors.
DARPA officials declined to be interviewed for this story. But Jan Walker, a spokesperson for DARPA Director Anthony Tether, said, "Dr. Tether ... does not agree. DARPA has not pulled back from long-term, high-risk, high-payoff research in IT or turned more to short-term projects." (See sidebar, DARPA's Response.)
A Shot in the Rear
David Farber, now a professor of computer science and public policy at Carnegie Mellon, was a young researcher at AT&T Bell Laboratories when Sputnik went up.
"We people in technology had a firm belief that we were leaders in science, and suddenly we got trumped," he recalls. "That was deeply disturbing. The Russians were considerably better than we thought they were, so what other fields were they good in?"
Farber says U.S. university science programs back then were weak and out of date, but higher education soon got a "shot in the rear end" via Eisenhower's ARPA. "It provided a jolt of funding," he says. "There's nothing to move academics like funding."
Farber says U.S. universities are no longer weak in science, but they are again suffering from lack of funds for long-range research.
"In the early years, ARPA was willing to fund things like artificial intelligence -- take five years and see what happens," he says. "Nobody cared whether you delivered something in six months. It was, 'Go and put forth your best effort and see if you can budge the field.' Now that's changed. It's more driven by, 'What did you do for us this year?'"
DARPA's budget calls for it to spend $414 million this year on information, communications and computing technologies, plus $483 million more on electronics, including things such as semiconductors. From 2001 to 2004, the percentage going to universities shrank from 39% to 21%, according to the Senate Armed Services Committee. The beneficiaries have been defense contractors.
Meanwhile, funding from the National Science Foundation (NSF) for computer science and engineering -- most of it for universities -- has increased from $478 million in 2001 to $709 million this year, up 48%. But the NSF tends to fund smaller, more-focused efforts. And because contract awards are based on peer review, bidders on NSF jobs are inhibited from taking the kinds of chances that Licklider would have favored.
"At NSF, people look at your proposal and assign a grade, and if you are an outlier, chances are you won't get funded," says Victor Zue, who directs MIT's 900-person Computer Science and Artificial Intelligence Laboratory, the direct descendent of MIT's Project MAC, which was started with a $2 million ARPA grant in 1963.
"At DARPA, at least in the old days, they tended to fund people, and the program managers had tremendous latitude to say, 'I'm just going to bet on this.' At NSF, you don't bet on something."
Farber sits on a computer science advisory board at the NSF, and he says he has been urging the agency to "take a much more aggressive role in high-risk research." He explains, "Right now, the mechanisms guarantee that low-risk research gets funded. It's always, 'How do you know you can do that when you haven't done it?' A program manager is going to tell you, 'Look, a year from now, I have to write a report that says what this contributed to the country. I can't take a chance that it's not going to contribute to the country.'"
A report by the President's Council of Advisors on Science and Technology, released Sept. 10, indicates that at least some in the White House agree. In "Leadership Under Challenge: Information Technology R&D in a Competitive World," John H. Marburger, science advisor to the president, said, "The report highlights in particular the need to ... rebalance the federal networking and IT research and development portfolio to emphasize more large-scale, long-term, multidisciplinary activities and visionary, high-payoff goals."
No Help From Industry
The U.S. has become the world's leader in IT because of the country's unique combination of government funding, university research, and industrial research and development, says the University of Washington's Lazowska. But just as the government has turned away from long-range research, so has industry, he says.
According to the Committee on Science, Engineering and Public Policy at the National Academy of Sciences, U.S. industry spent more on tort litigation than on research and development in 2001, the last year for which figures are available. And more than 95% of that R&D is engineering or development, not long-range research, Lazowska says. "It's not looking out more than one product cycle; it's building the next release of the product," he says. "The question is, where do the ideas come from that allow you to do that five years from now? A lot of it has come from federally funded university research."
A great deal of fundamental research in IT used to take place at IBM, AT&T Inc. and Xerox Corp., but that has been cut way back, Lazowska says. "And of the new companies -- those created over the past 30 years -- only Microsoft is making significant investments that look out more than one product cycle."
Lazowska isn't expecting another event like Sputnik. "But I do think we are likely to wake up one day and find that China and India are producing far more highly qualified engineers than we are. Their educational systems are improving unbelievably quickly."
Farber also worries about those countries. His "Sputnik" vision is to "wake up and find that all our critical resources are now supplied by people who may not always be friendly." He recalls the book, The Japan That Can Say No (Simon & Schuster), which sent a Sputnik-like chill through the U.S. when it was published in 1991 by suggesting that Japan would one day outstrip the U.S. in technological prowess and thus exert economic hegemony over it.
"Japan could never pull that off because their internal markets aren't big enough, but a China that could say no or an India that could say no could be real," Farber says.
The U.S. has already fallen behind in communications, Farber says. "In computer science, we are right at the tender edge, although I do think we still have leadership there."
Some of the cutbacks in DARPA funding at universities are welcome, says MIT's Zue. "Our reliance on government funding is nowhere near what it was in 1963. In a way, that's healthy, because when a discipline matures, the people who benefit from it ought to begin paying the freight."
"But," Zue adds, "it's sad to see DARPA changing its priorities so that we can no longer rely on it to do the big things."
