Saturday, September 29, 2007

Opinion: Why Apple's 'new Newton' will rule

They can send a man to the moon (or at least they could 40 years ago). Why can't they make a tiny computer people want to buy?

Cell phone, laptop and desktop PC markets are all well established, with dominant players in each category raking in billions in sales. But in the world of mobile computers, the field for devices bigger than cell phones but smaller than regular laptops is still wide open. A shockingly large number of companies have invested millions of dollars developing products in this category. They've shipped dozens of gadgets hyped as the Next Big Thing. But the buying public has responded with indifference.

Many observers blame this indifference on problems with the category itself. What's the appeal of a mobile computer too big for your pocket and too small for a full screen and keyboard?

But I disagree. There are many scenarios -- airplanes, restaurants, meetings, around the house -- where tiny mobile computers are ideal. The problem is price, performance and user experience. To date, products have been way too expensive, slow, clunky and awkward to use.

Eventually, somebody is going to get it right. And when they do, the tiny computer market will get huge.

Since Microsoft announced the "Origami" project way back in March of last year, the category has been going nowhere. But, suddenly, everything has changed.

Events in the past 30 days lead me to conclude something that was unthinkable just one month ago: Apple -- yeah, I said it -- Apple! will ship the first-ever successful small computer. Call it the Newton on Crack (or, more accurately, on Mac).

Here's what happened in September.

Palm Foleo

Everyone seems to think that Palm's Foleo project has been canceled. But this isn't true.

The original Foleo concept was a Linux-based, low-power clamshell device that worked exclusively with Palm's Treo line of smart phones.

What is true is that Palm CEO Ed Colligan announced earlier this month that the company plans to discontinue the use of Linux as an operating system. This companywide strategic change will delay the Foleo, which will come out eventually on a new OS platform the company is now working on. The new operating system will be finished next year.

So just to be clear: The Palm Foleo project has not been canceled. It has been given a new operating system and delayed.

The Foleo is still a dark horse candidate. If the company's new platform is great, if the company can survive long enough without real innovation on the phone side, if it can get the price down far enough -- a lot of "ifs" here -- then Palm has a shot at selling a few of these to existing Treo owners.

The Foleo has zero chance of dominating the coming boom in tiny mobile computers.


Nokia

The Federal Communications Commission recently approved a new minitablet, nonphone device from Nokia that supports Bluetooth, WLAN and GPS. The filing was marked "confidential," so only the sketchiest of details are available on the product, which will almost certainly ship this year.

I'm not sure Nokia has the "right stuff" to compete in the nonphone market. For starters, the company has trouble focusing on individual products and tends to scatter its energy and resources across its massive line of devices. The future king of tiny mobile computers is going to need vision and focus.

Go ahead and take Nokia off the list of contenders.


UMPC

The ultramobile PC (or UMPC) platform, originally developed by Microsoft, Intel and Samsung, is designed for small, low-voltage computers with pen-based touch screens and, optionally, QWERTY keyboards. UMPCs can run Windows XP Tablet PC Edition 2005, Windows Vista Home Premium Edition or Linux.

Intel announced last week that it would slash the power draw of its UMPC chip sets in an upcoming chip set code-named Moorestown, and add hot features such as WiMax and 3G.

The Intel announcement is the best news to ever hit the UMPC space. The future of UMPCs has potential, but so far nobody in the space has achieved the right combination of price, performance and overall user experience. The manufacturers are trying, however, and just this month have announced wide-ranging updates.

  • Asus yesterday announced major updates to its R2E UMPC. The new version uses Intel's 800-MHz A110 processor instead of a Celeron, which should improve battery life. The device sports a few impressive specs, including 1GB of RAM, 802.11g wireless, integrated GPS and a webcam. The R2E, however, is simply too expensive to succeed at over $1,500, and it doesn't have a keyboard.

  • Fujitsu recently announced its appealing LifeBook U1010 in Asia, which is sold as the U810 in the U.S. The device is for business professionals who also want to watch movies and play games. It even has a fingerprint scanner for security. Of all the UMPCs that are shipping, the Fujitsu has the most promise. It's both a tablet and a clamshell. It has a nice big keyboard. And it has a relatively low price: $1,000. Unfortunately, the UMPC runs Windows Vista, and some users report serious performance issues. If Fujitsu could make the U810 a lot faster and a little cheaper (say, under $700), they'd have a category buster. But they can't, so they don't.
  • Sony recently updated the hardware on its VAIO UX-Series UMPC. The computer has a screen that slides up to uncover an unusable keyboard. The company will need to completely overhaul the design for better usability if it wants leadership in the coming minicomputer space. I would think Sony could do better than this.
  • OQO's recently updated 02 UMPC is optimized for media, and has a small, awkward keyboard. The device is both too small -- very close in size to a large smart phone -- and too expensive -- at $1,300, it costs as much as a laptop.
  • HTC recently announced that it plans to jump on the Vista bandwagon with the company's Shift UMPC -- and also use Windows Mobile. The device uses Microsoft's cell phone operating system to collect e-mail while the computer is in sleep mode. The Shift has a nice, big keyboard and screen, but it's too expensive ($1,500), suffers from poor battery life (three hours!) and is a little on the fat side.
These are just the UMPCs updated during September. There are more than a dozen other devices out there on the Origami platform. Every single UMPC device that has been shipped or announced suffers from lousy usability, high prices, poor performance, ill-conceived user interfaces, or any combination of the above. And far too many of these companies are jumping on the Vista bandwagon. If Vista can't deliver good performance on a brand-new desktop PC, how can it function well enough on a low-powered handheld device with a touch screen?

Can anyone create the right combination of usability, performance and price? Yes: Someone can.


Apple

Two things happened in the Applesphere in September that changed everything. The first, of course, is that Apple CEO Steve Jobs announced the iPod Touch on Sept. 5.

The second is that AppleInsider reported this week that Apple is working on an updated Newton MessagePad -- basically a big iPod Touch with additional PDA functionality. The Mac OS X Leopard-based mobile minitablet will be 1.5 times the size of an iPhone, with an approximately 720-by-480 high-resolution display. The site estimates that the new device will ship in the first half of 2008.

If true (and some believe it isn't), this rumor is very good news. If Apple ships an iPod Touch, but with good PIM (personal information manager) functionality, an optional wireless keyboard and good battery life for under $1,000, they win.

But even if this particular rumor is false, I still believe Apple will dominate this category with another project. As I've said before in this space, Apple's iPhone user interface is a glimpse of the future, not only of future Apple mobile computers, but desktops and the future of all PCs as well. It's inevitable that Apple will ship a tablet Mac that works like the iPhone. And, just as in the iPod space, the company will likely round out the category with a "mini" version.

Of course, everything could change again in October. But right now, the only company with a prayer of succeeding in the small computer space is also the only company that hasn't even shown a prototype -- Apple.

iPhone's Bluetooth bug under the hacker microscope

Almost lost in the hubbub over Thursday's iPhone firmware update and whether it would "brick" unlocked phones was the fact that Apple Inc. patched 10 vulnerabilities -- twice the number of fixes issued since the phone's June debut.

The iPhone 1.1.1 update, which like previous upgrades is delivered through Apple's iTunes software, fixes seven flaws in the built-in Safari browser, two in the smart phone's Mail application and one in its use of Bluetooth, the short-range wireless technology.

The seven Safari vulnerabilities include several cross-site scripting (XSS) flaws, one that can disclose the URL of other viewed pages -- an online banking site, say -- and another that lets attackers execute malicious JavaScript in pages delivered by the SSL-encrypted HTTPS protocol. One of the Safari flaws, and an associated vulnerability in Mail, involve "tel:" links, which can be exploited by hackers to dial a number without the user confirming the call.

But it was the Bluetooth bug that got the attention of security researchers. Symantec's DeepSight threat network team pointed out the vulnerability in an advisory to customers today. "Reportedly, the Bluetooth flaw occurs when malicious Service Discovery Protocol (SDP) packets are handled; any attacker that is within Bluetooth range can exploit it remotely," wrote DeepSight analyst Anthony Roe in the alert. "Successful exploits are reported to allow the attacker to execute arbitrary code."

According to Apple's security advisory, the Bluetooth bug was discovered and reported by Kevin Mahaffey and John Hering of Flexillis Inc., a Los Angeles-based company that specializes in mobile security development and consulting. Flexillis may be best known for its reverse engineering of the exploit used to hack into several celebrities' T-Mobile cell phone accounts in 2005, including Paris Hilton and Vin Diesel.

The Bluetooth bug may prove to be dangerous to iPhones, Roe speculated, since the potential range of the technology is much greater than most people think. While Bluetooth's potential range -- and thus the maximum distance between attacker and victim -- is about 400 feet, "Several proof-of-concept Bluetooth antennas have intercepted Bluetooth signals at almost a mile," he said.

Roe also pointed out that HD Moore, the driving force behind the Metasploit penetration testing framework, had recently demonstrated that shellcode could be run on an iPhone. Moore, said Roe, proved that "exploiting security vulnerabilities affecting the iPhone is by no means out of reach." In a post to his blog -- and to the Metasploit site -- on Wednesday, Moore said that because every process on the iPhone runs as root, and so has full privileges to the operating system, any exploit of an iPhone application vulnerability, such as one in Safari, Mail or Bluetooth, would result in a complete hijack of the device. Moore also announced that he would add iPhone support to Metasploit, which would make it much easier for hackers to access a vulnerable phone.

Moore acknowledged that he's looking at the Bluetooth vulnerability. "The Bluetooth SDP vulnerability is the only issue I am focusing on," he said in an e-mail Friday.

He also hinted that locating vulnerable iPhones wouldn't be a problem. "The Bluetooth MAC [media access control] address is always one less than the Wi-Fi interface's MAC address," he said. "Since the iPhone is always probing for or connected to its list of known access points, the presence of the iPhone and its Bluetooth MAC address can be determined by using a standard Wi-Fi sniffer.

"Once the Bluetooth MAC address is obtained, the SDP issue can be exploited by anyone within range of the Bluetooth chip, or within range of the attacker's antenna, which can be up to a mile away in some cases," he said.
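Moore's observation reduces target discovery to simple arithmetic: treat the sniffed 48-bit Wi-Fi MAC as an integer and subtract one. A minimal Python sketch of that step (the sample address below is made up purely for illustration):

```python
def mac_to_int(mac: str) -> int:
    """Parse a colon-separated MAC address into a 48-bit integer."""
    return int(mac.replace(":", ""), 16)

def int_to_mac(value: int) -> str:
    """Format a 48-bit integer back into aa:bb:cc:dd:ee:ff form."""
    return ":".join(f"{(value >> shift) & 0xFF:02x}" for shift in range(40, -1, -8))

def bluetooth_mac_from_wifi(wifi_mac: str) -> str:
    # Per Moore's claim, the iPhone's Bluetooth MAC is one less
    # than its Wi-Fi MAC; mask keeps the result within 48 bits.
    return int_to_mac((mac_to_int(wifi_mac) - 1) & 0xFFFFFFFFFFFF)

print(bluetooth_mac_from_wifi("00:1b:63:00:5a:10"))  # → 00:1b:63:00:5a:0f
```

This is only the address derivation Moore describes; actually probing the derived address over Bluetooth is a separate matter entirely.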

If Moore manages to craft an exploit and add it to Metasploit, it's probable that criminal hackers will quickly follow. "Once we see something in Metasploit, we know it's likely we'll see it used in attacks," Alfred Huger, vice president of engineering with Symantec's security response group, said in a July interview.

Jarno Neimela, a senior researcher with F-Secure Corp., a Helsinki-based security vendor, also hit the alarm button, but for a different reason. In a posting to his company's blog today, Neimela pointed out that there's no security software available for the iPhone, thanks to Apple's decision to keep the device's inner workings a secret.

"The amount of technical information [available about the iPhone] makes it likely that sooner or later someone will create a worm or some other malware," Neimela said. "This will create an interesting problem for the security field as the iPhone is currently a closed system and it's not feasible to provide anti-virus or other third-party security solutions for it.

"So if someone were able to create a rapidly spreading worm on the iPhone, protecting users against it would be problematic."

Although iPhone owners will be automatically notified in the next week that the new patches are ready to download and install, a large number of those who have modified or unlocked their phones will probably forgo the fixes, since the 1.1.1 update apparently also disables unlocked phones and wipes unauthorized third-party applications that have been added with various hacks.

Wednesday, September 26, 2007


The problem may not rival the movies Poltergeist or The Amityville Horror for sheer terror, but CIOs and data center managers are still well advised to deal decisively with so-called ghost servers. Like celluloid zombies, these forgotten pieces of equipment are dead when it comes to improving the bottom line, but they are very much alive when it comes to eating up IT budgets. The unproductive -- and usually undocumented -- servers take up valuable real estate, consume increasingly expensive electricity and, in some cases, absorb ongoing maintenance and lease payments.

"You can find ghost servers in a lot of enterprises," says John Phelps, an analyst at Gartner Inc. "And the larger and more diverse the company, the harder it can be to have a single group or technology platform that provides control over all corporate assets."

Sun Microsystems Inc., through case studies of two large corporate data center operations and anecdotal analysis of efforts with many customers, believes that 8% to 10% of all servers in large corporations have no identifiable function. In the two data center studies, 150 ghost servers were found in an installation of 1,800 servers, and 354 ghost servers were found in an installation of 3,500 servers. One of the companies studied was Sun itself.

Sun used system performance tools to monitor CPU utilization and I/O and network traffic, collecting the data over the course of a month, and sorted out machines with zero utilization. Sun removed the questionable servers from operation for 90 days to determine any impact. At the end of the period, it found that 60% of the servers could be permanently decommissioned, says Mark Monroe, director of sustainable computing at Sun. The company now conducts quarterly reviews of utilization rates.

"It's hard to get people to admit they have unused infrastructure," Monroe says. "It's expensive, wasteful, and having a CIO admit he's got millions of dollars of idle assets lying around could get a guy fired. I think we can remove some of the stigma by talking about the facts, and having people realize it's worse just to leave them lying around." Corporations need to admit "they are like everyone else" and try to reduce the number of idle machines to 3% or less, "which is three times better than the industry average," he says. The cost of running a server for three years exceeds its original acquisition cost, so eliminating ghost servers has an easily measured payoff in energy savings. Identifying and eliminating wasted resources are key components of green or "eco-computing," Monroe says.

The good news: Advances in asset management and configuration software can help businesses find and shut down these useless machines. And more companies are being proactive when it comes to adopting policies in this area. A survey of more than 300 businesses in the U.S. and Europe conducted by Gartner in 2006 found that 74% of respondents said they have a formal software and hardware asset management program.

More advanced autodiscovery capabilities and more efficient, effective and accurate search engines mean "there has recently been more focus on discovering the lost and unidentified equipment, especially in industries where security concerns are growing," says Richard Ptak, analyst at Ptak, Noel & Associates. "Every major company I've ever dealt with has had the problem to some level."
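Sun's screening step -- collect utilization samples for a month, then sort out the machines that never register any load -- is straightforward to sketch. The server names and readings below are invented for illustration; a real pass would pull CPU, I/O and network samples from monitoring tools:

```python
# Hypothetical month of utilization samples, keyed by server name.
# Each list holds periodic readings (percent CPU busy, for example).
samples = {
    "web-01":   [12.0, 30.5, 8.2],
    "ghost-17": [0.0, 0.0, 0.0],
    "db-02":    [55.0, 61.3, 47.8],
    "ghost-09": [0.0, 0.0, 0.0],
}

def ghost_candidates(samples, threshold=0.0):
    """Flag servers whose every reading is at or below the threshold,
    mirroring Sun's 'sort out machines with zero utilization' step."""
    return sorted(
        name for name, readings in samples.items()
        if all(r <= threshold for r in readings)
    )

print(ghost_candidates(samples))  # → ['ghost-09', 'ghost-17']
```

The flagged machines are only candidates: as in Sun's process, they would still be pulled from service for a trial period (Sun used 90 days) before being permanently decommissioned.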
When BlueCross BlueShield of Florida (BCBSF), which provides insurance services for nearly 9 million people, decided to build a new data center in 2005, the company also felt it was time to get a comprehensive look at its asset inventory, says Paul Stallings, senior manager for provisioning services. In addition to the new and existing data centers, BCBSF also has about 10 smaller server rooms in various locations around the state. The server room equipment was being managed by different departments within the insurance provider, and each had a different way of tracking and documenting assets, Stallings says.

Using Aperture Technologies Inc.'s VISTA tool beginning in late 2005, BCBSF began documenting its IT resources under a single umbrella. "There were definitely undocumented assets," Stallings says. "When you have seven or eight teams managing the hardware, each using a different process, there was really no way to get a systemwide understanding of what we had in our various data centers." By getting a complete view of its asset inventory, BCBSF has been able to create a formal decommissioning process. When a system needs to be retired, a workflow ticket is automatically generated. The IT department can then start to reclaim floor space, remove cabling and place old servers in a pool of equipment that can be either redeployed or completely removed from the environment.

As one of the largest IT services providers in the world, Fujitsu Services operates seven Fujitsu-owned data centers in the U.K. with more than 200,000 sq. ft. of floor space, and manages 11 customer-owned facilities. As a result of its outsourcing business, the company has acquired data center facilities and IT equipment from a variety of sources, and much of the equipment came with no asset or configuration management process in place, says Mark Scott, global data center delivery manager at Fujitsu Services. The company had previously used a configuration cataloging system that recorded only the location of equipment on the floor; it did not provide any insight into the specific use of the equipment.

In an attempt to get a comprehensive inventory and create an asset management strategy that would allow it to maximize existing data center space and continue to grow, Fujitsu Services began working with Aperture to deploy its VISTA data center resource management system in 2005, and the company began to see a variety of issues that were wasting resources. "We found we had IT equipment on the floor that people within the company had thought we had gotten rid of years ago," he says. "We looked deeper and found out we were also still paying lease and maintenance on the equipment. We even found that we were paying lease and maintenance on equipment that had been removed from our data centers."

By extending the asset management systems across its entire U.K.-based data center operation, Fujitsu Services has reduced its operational costs and then passed the savings on to customers, Scott says. Since customers are generally charged for the floor space they use within the data centers, the removal of unproductive equipment has allowed Fujitsu Services to reduce specific hosting charges for those customers. So, for example, if Fujitsu can reduce the footprint by 10%, Scott says that 10% savings can be passed along to the customer. The company is also developing the ability to invoice for the actual power required by individual customer installations, versus the current practice of invoicing for power based on the floor space used by the servers.

"We also found lots of badly installed equipment," Scott says. The poor installation processes had led to what he characterized as vulnerabilities that made it difficult for his company to meet customer uptime requirements. "The result was really that we had actually built vulnerabilities into the operation. It was a real wake-up call. We now have a complete vulnerability check and can control installations from beginning to end."

Alticor Inc., the parent company for such businesses as Amway Corp., Quixtar Inc. and Access Business Group LLC, handles IT demands for affiliates all over the world, says Randy Gast, supervisor of server technology at Alticor. The company uses a combination of management tools to keep track of its software and hardware assets, including BMC Software Inc.'s Remedy service management and Hewlett-Packard Co.'s Systems Insight Manager. Using the software, Alticor early last year found that more than 200 servers, or about a third of its 650 x86 processor-based servers, were running at utilization rates of 10% or less, Gast says. Even scarier, these underused servers had accumulated, without IT's knowledge, over the previous three years as new equipment was bought to handle individual applications or affiliate requirements.

Working with virtualization specialist VMware Inc., Alticor has embarked on an effort to consolidate the unproductive systems and increase overall utilization rates to between 60% and 70%, Gast says. To date, Alticor has consolidated 150 of the unproductive servers onto seven servers using virtualization software. Alticor has donated the majority of the servers it has taken out of operation during the consolidation effort to charitable organizations. Servers that were too old to be used by groups such as local schools and churches have been sold for scrap. "Using Remedy, we can track the lease agreements for our hardware, when they start and expire, and use it to have automatic triggers that let us know we need to consider replacing certain equipment," Gast says.

That said, there is generally a great deal of inefficiency regarding how customers get rid of old gear. "A lot of businesses don't have a disposition strategy, primarily because it's easier to buy" new servers than dispose of the old ones, says Daniel Ransdell, general manager of IBM's Global Asset Recovery Solutions business unit. "Even if you've unplugged the equipment and stuck it in some closet or corner, the business is losing the opportunity to recoup value. Servers aren't like fine wine. They don't get better with age."

Energy cost is the major issue that is changing attitudes inside corporations. In August, the Environmental Protection Agency published a report that documented data center power usage. According to the report, data centers in the U.S. consumed about 60 billion kilowatt-hours in 2006, or about 1.5% of total electricity consumption in the country. The EPA points out that data center energy use has doubled in the past five years, and it is expected to double again in the next five years to an annual cost of about $7.4 billion. The EPA says existing technologies and strategies could reduce typical server energy use by 25%, and even greater savings are possible using more advanced technologies.

Rockwell Bonecutter, head of data center technology and operations at Accenture Ltd., believes that a large percentage of the ghost server problem was alleviated when most businesses engaged in extensive Y2K efforts within their infrastructure. In the intervening years, however, there has been significant growth in systems that operate at 5% utilization or less, often because of poor communication and asset management within the company. "When it comes to servers that nobody knows about that are sitting for years and nobody has touched, there are probably examples in every IT environment, but it's obviously impossible to measure what you don't know exists," Bonecutter says. "What we have found is that it is not unusual to find that 40% of all servers on a floor could be consolidated and virtualized out of the environment."
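A quick back-of-envelope pass over the EPA figures shows the scale involved. Assuming consumption roughly doubles from 60 billion to 120 billion kWh, the projected $7.4 billion annual bill implies a little over 6 cents per kWh, and the cited 25% reduction would be worth nearly $2 billion a year (derived values are illustrative, not from the report):

```python
# Inputs are the EPA numbers quoted in the article.
kwh_2006 = 60e9               # U.S. data center consumption, 2006 (kWh)
kwh_projected = kwh_2006 * 2  # expected to double over the next five years
annual_cost = 7.4e9           # projected annual electricity cost (dollars)

implied_price = annual_cost / kwh_projected  # implied dollars per kWh
savings_25pct = annual_cost * 0.25           # value of the EPA's 25% reduction

print(f"implied electricity price: ${implied_price:.3f}/kWh")   # → $0.062/kWh
print(f"25% savings on projected bill: ${savings_25pct/1e9:.2f}B")  # → $1.85B
```

Even at these rough rates, a single decommissioned ghost server avoids hundreds of dollars a year in electricity alone, before counting floor space, maintenance and lease payments.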
Consolidation through virtualization has also led to the new phenomenon of virtual ghost servers. The ease and speed with which virtual servers can be created can leave hosts cluttered with numerous poorly documented virtual machines built for short-term or abandoned projects. With tools allowing businesses to get a more holistic view of their assets and policies in place to guide a formal decommissioning process, businesses can now reduce the risk and associated costs of ghost servers, without the need to call on the aid of another great Hollywood institution, Ghostbusters.

Monday, September 24, 2007

Happy Birthday, Sputnik! (Thanks for the Internet)

Fifty years ago, a small Soviet satellite was launched, stunning the U.S. and sparking a massive technology research effort. Could we be in for another "October surprise"?

Quick, what's the most influential piece of hardware from the early days of computing? The IBM 360 mainframe? The DEC PDP-1 minicomputer? Maybe earlier computers such as Binac, ENIAC or Univac? Or, going way back to the 1800s, is it the Babbage Difference Engine?
More likely, it was a 183-pound aluminum sphere called Sputnik, Russian for "traveling companion." Fifty years ago, on Oct. 4, 1957, radio-transmitted beeps from the first man-made object to orbit the Earth stunned and frightened the U.S., and the country's reaction to the "October surprise" changed computing forever.
Although Sputnik fell from orbit just three months after launch, it marked the beginning of the Space Age, and in the U.S., it produced angst bordering on hysteria. Soon, there was talk of a U.S.-Soviet "missile gap." Then on Dec. 6, 1957, a Vanguard rocket that was to have carried aloft the first U.S. satellite exploded on the launch pad. The press dubbed the Vanguard "Kaputnik," and the public demanded that something be done.
The most immediate "something" was the creation of the Advanced Research Projects Agency (ARPA), a freewheeling Pentagon office created by President Eisenhower on Feb. 7, 1958. Its mission was to "prevent technological surprises," and in those first days, it was heavily weighted toward space programs.
Speaking of surprises, it might surprise some to learn that on the list of people who have most influenced the course of IT -- people with names like von Neumann, Watson, Hopper, Amdahl, Cerf, Gates and Berners-Lee -- appears the name J.C.R. Licklider, the first director of IT research at ARPA.
Armed with a big budget, carte blanche from his bosses and an unerring ability to attract bright people, Licklider catalyzed the invention of an astonishing array of IT, from time sharing to computer graphics to microprocessors to the Internet.
But now, the special culture that enabled Licklider and his successors to work their magic has largely disappeared from government, many say, setting up the U.S. once again for a technological drubbing. Could there be another Sputnik? "Oh, yes," says Leonard Kleinrock, the Internet pioneer who developed the principles behind packet-switching, the basis for the Internet, while Licklider was at ARPA. "But it's not going to be a surprise this time. We all see it coming."
The ARPA Way

Licklider had studied psychology as an undergraduate, and in 1962, he brought to ARPA a passionate belief that computers could be far more user-friendly than the unconnected, batch-processing behemoths of the day. Two years earlier, he had published an influential paper, "Man-Computer Symbiosis," in which he laid out his vision for computers that could interact with users in real time. It was a radical idea, one utterly rejected by most academic and industrial researchers at the time. (See sidebar, Advanced Computing Visions from 1960.)
Driven by the idea that computers might not only converse with their users, but also with one another, Licklider set out on behalf of ARPA to find the best available research talent. He found it at companies like the RAND Corp., but mostly he found it at universities, starting first at MIT and then adding to his list Carnegie Mellon University; Stanford University; University of California, Berkeley; the University of Utah; and others.

Advanced Computing Visions from 1960

Nearly a half-century ago, a former MIT professor of psychology and electrical engineering wrote a paper -- largely forgotten today -- that anticipated by decades the emergence of computer time sharing, networks and some features that even today are at the leading edge of IT.
Licklider wrote "Man-Computer Symbiosis" in 1960, at a time when computing was done by a handful of big, stand-alone batch-processing machines. In addition to predicting "networks of thinking centers," he said man-computer symbiosis would require the following advances:
  • Indexed databases. "Implicit in the idea of man-computer symbiosis are the requirements that information be retrievable both by name and by pattern and that it be accessible through procedures much faster than serial search."
  • Machine learning in the form of "self-organizing" programs. "Computers will in due course be able to devise and simplify their own procedures for achieving stated goals."
  • Dynamic linking of programs and applications, or "real-time concatenation of preprogrammed segments and closed subroutines which the human operator can designate and call into action simply by name."
  • More and better methods for input and output. "In generally available computers, there is almost no provision for any more effective, immediate man-machine communication than can be achieved with an electric typewriter."
  • Tablet input and handwriting recognition. "It will be necessary for the man and the computer to draw graphs and pictures and to write notes and equations to each other on the same display surface."
  • Speech recognition. "The interest stems from realization that one can hardly take a ... corporation president away from his work to teach him to type."
Licklider sought out researchers like himself: bright, farsighted and impatient with bureaucratic impediments. He established a culture and modus operandi -- and passed it on to his successors Ivan Sutherland, Robert Taylor, Larry Roberts and Bob Kahn -- that would make the agency, over the next 30 years, the most powerful engine for IT innovation in the world.
Recalls Kleinrock, "Licklider set the tone for ARPA's funding model: long-term, high-risk, high-payoff and visionary, and with program managers, that let principal investigators run with research as they saw fit." (Although Kleinrock never worked at ARPA, he played a key role in the development of the ARPAnet, and in 1969, he directed the installation of the first ARPAnet node at UCLA.)
From the early 1960s, ARPA built close relationships with universities and a few companies, each doing what it did best while drawing on the accomplishments of the others. What began as a simple attempt to link the computers used by a handful of U.S. Department of Defense researchers ultimately led to the global Internet of today.
Along the way, ARPA spawned an incredible array of supporting technologies, including time sharing, workstations, computer graphics, graphical user interfaces, very large-scale integration (VLSI) design, RISC processors and parallel computing (see DARPA's Role in IT Innovations). There were four ingredients in this recipe for success: generous funding, brilliant people, freedom from red tape and the occasional ascent to the bully pulpit by ARPA managers.
These individual technologies had a way of cross-fertilizing and combining over time in ways probably not foreseen even by ARPA managers. What would become the Sun Microsystems Inc. workstation, for example, owes its origins rather directly to a half-dozen major technologies developed at multiple universities and companies, all funded by ARPA. (See Timeline: Three Decades of DARPA Hegemony.)
Ed Lazowska, a computer science professor at the University of Washington in Seattle, offers this story from the 1970s and early 1980s, when Kahn was a DARPA program manager, then director of its Information Processing Techniques Office:
What Kahn did was absolutely remarkable. He supported the DARPA VLSI program, which funded the [Carver] Mead-[Lynn] Conway integrated circuit design methodology. Then he funded the SUN workstation at Stanford because Forest Baskett needed a high-resolution, bitmapped workstation for doing VLSI design, and his grad student, Andy Bechtolsheim, had an idea for a new frame buffer.
Meanwhile, [Kahn] funded Berkeley to do Berkeley Unix. He wanted to turn Unix into a common platform for all his researchers so they could share results more easily, and he also saw it as a Trojan horse to drive the adoption of TCP/IP. That was at a time when every company had its own networking protocol -- IBM with SNA, DEC with DECnet, the Europeans with X.25 -- all brain-dead protocols.
Surprise?

The launch of the Soviet satellite Sputnik shocked the world and became known as the "October surprise." But was it really?
Paul Green was working at MIT's Lincoln Laboratory in 1957 as a communications researcher. He had learned Russian and was invited to give talks to the Popov Society, a group of Soviet technology professionals. "So I knew Russian scientists," Green recalls. "In particular, I knew this big-shot academician named [Vladimir] Kotelnikov."
In the summer of 1957, Green told Computerworld, a coterie of Soviet scientists, including Kotelnikov, attended a meeting of the International Scientific Radio Union in Boulder, Colo. Says Green, "At the meeting, Kotelnikov -- who, it turned out later, was involved with Sputnik -- just mentioned casually, 'Yeah, we are about to launch a satellite.'"
"It didn't register much because the Russians were given to braggadocio. And we didn't realize what that might mean -- that if you could launch a satellite in those days, you must have a giant missile and all kinds of capabilities that were scary. It sort of went in one ear and out the other."
And did he tell anyone in Washington? "None of us even mentioned it in our trip reports," he says.
DARPA Today

But around 2000, Kleinrock and other top-shelf technology researchers say, the agency, now called the Defense Advanced Research Projects Agency (DARPA), began to focus more on pragmatic, military objectives. A new administration was in power in Washington, and then 9/11 changed priorities everywhere. Observers say DARPA shifted much of its funding from long-range to shorter-term research, from universities to military contractors, and from unclassified work to secret programs.
Of government funding for IT, Kleinrock says, "our researchers are now being channeled into small science, small and incremental goals, short-term focus and small funding levels." The result, critics say, is that DARPA is much less likely today to spawn the kinds of revolutionary advances in IT that came from Licklider and his successors.
DARPA officials declined to be interviewed for this story. But Jan Walker, a spokesperson for DARPA Director Anthony Tether, said, "Dr. Tether ... does not agree. DARPA has not pulled back from long-term, high-risk, high-payoff research in IT or turned more to short-term projects." (See sidebar, DARPA's Response.)
A Shot in the Rear

David Farber, now a professor of computer science and public policy at Carnegie Mellon, was a young researcher at AT&T Bell Laboratories when Sputnik went up.
"We people in technology had a firm belief that we were leaders in science, and suddenly we got trumped," he recalls. "That was deeply disturbing. The Russians were considerably better than we thought they were, so what other fields were they good in?"
Farber says U.S. university science programs back then were weak and out of date, but higher education soon got a "shot in the rear end" via Eisenhower's ARPA. "It provided a jolt of funding," he says. "There's nothing to move academics like funding."
Farber says U.S. universities are no longer weak in science, but they are again suffering from lack of funds for long-range research.
"In the early years, ARPA was willing to fund things like artificial intelligence -- take five years and see what happens," he says. "Nobody cared whether you delivered something in six months. It was, 'Go and put forth your best effort and see if you can budge the field.' Now that's changed. It's more driven by, 'What did you do for us this year?'"
DARPA's budget calls for it to spend $414 million this year on information, communications and computing technologies, plus $483 million more on electronics, including things such as semiconductors. From 2001 to 2004, the percentage going to universities shrank from 39% to 21%, according to the Senate Armed Services Committee. The beneficiaries have been defense contractors.
Meanwhile, funding from the National Science Foundation (NSF) for computer science and engineering -- most of it for universities -- has increased from $478 million in 2001 to $709 million this year, up 48%. But the NSF tends to fund smaller, more-focused efforts. And because contract awards are based on peer review, bidders on NSF jobs are inhibited from taking the kinds of chances that Licklider would have favored.
"At NSF, people look at your proposal and assign a grade, and if you are an outlier, chances are you won't get funded," says Victor Zue, who directs MIT's 900-person Computer Science and Artificial Intelligence Laboratory, the direct descendant of MIT's Project MAC, which was started with a $2 million ARPA grant in 1963.
"At DARPA, at least in the old days, they tended to fund people, and the program managers had tremendous latitude to say, 'I'm just going to bet on this.' At NSF, you don't bet on something."
Farber sits on a computer science advisory board at the NSF, and he says he has been urging the agency to "take a much more aggressive role in high-risk research." He explains, "Right now, the mechanisms guarantee that low-risk research gets funded. It's always, 'How do you know you can do that when you haven't done it?' A program manager is going to tell you, 'Look, a year from now, I have to write a report that says what this contributed to the country. I can't take a chance that it's not going to contribute to the country.' "
A report by the President's Council of Advisors on Science and Technology, released Sept. 10, indicates that at least some in the White House agree. In "Leadership Under Challenge: Information Technology R&D in a Competitive World," John H. Marburger, science advisor to the president, said, "The report highlights in particular the need to ... rebalance the federal networking and IT research and development portfolio to emphasize more large-scale, long-term, multidisciplinary activities and visionary, high-payoff goals."
No Help From Industry

The U.S. has become the world's leader in IT because of the country's unique combination of government funding, university research, and industrial research and development, says the University of Washington's Lazowska. But just as the government has turned away from long-range research, so has industry, he says.
According to the Committee on Science, Engineering and Public Policy at the National Academy of Sciences, U.S. industry spent more on tort litigation than on research and development in 2001, the last year for which figures are available. And more than 95% of that R&D is engineering or development, not long-range research, Lazowska says. "It's not looking out more than one product cycle; it's building the next release of the product," he says. "The question is, where do the ideas come from that allow you to do that five years from now? A lot of it has come from federally funded university research."
A great deal of fundamental research in IT used to take place at IBM, AT&T Inc. and Xerox Corp., but that has been cut way back, Lazowska says. "And of the new companies -- those created over the past 30 years -- only Microsoft is making significant investments that look out more than one product cycle."
Lazowska isn't expecting another event like Sputnik. "But I do think we are likely to wake up one day and find that China and India are producing far more highly qualified engineers than we are. Their educational systems are improving unbelievably quickly."
Farber also worries about those countries. His "Sputnik" vision is to "wake up and find that all our critical resources are now supplied by people who may not always be friendly." He recalls the book The Japan That Can Say No (Simon & Schuster), which sent a Sputnik-like chill through the U.S. when it was published in 1991 by suggesting that Japan would one day outstrip the U.S. in technological prowess and thus exert economic hegemony over it.
"Japan could never pull that off because their internal markets aren't big enough, but a China that could say no or an India that could say no could be real," Farber says.
The U.S. has already fallen behind in communications, Farber says. "In computer science, we are right at the tender edge, although I do think we still have leadership there."
Some of the cutbacks in DARPA funding at universities are welcome, says MIT's Zue. "Our reliance on government funding is nowhere near what it was in 1963. In a way, that's healthy, because when a discipline matures, the people who benefit from it ought to begin paying the freight."
"But," Zue adds, "it's sad to see DARPA changing its priorities so that we can no longer rely on it to do the big things."

joi, 20 septembrie 2007

Can IBM save OpenOffice.org from itself?

OpenOffice.org's biggest foe may be Microsoft Office, but critics say the open-source organization has, from its inception, also been one of its application suite's own worst enemies -- a victim of a development culture that differs radically from the open-source norm. Observers now wonder if IBM's entry into OpenOffice.org can make the necessary changes.
Though spun out by Sun Microsystems Inc. in 2000, OpenOffice.org remains almost totally under the control of Sun employees working full-time on the project.
There is also a free, native Aqua version of OpenOffice called NeoOffice that was created by open-source developers unaffiliated with OpenOffice.org. NeoOffice has received positive reviews, and the latest version, 2.2.1, includes support for the Mac OS X spellchecker and Address Book, as well as experimental support for opening Open XML files created by Excel 2007 and PowerPoint 2007. It was released late last month and is available for download.
That includes virtually all of OpenOffice.org's lead programmers and software testers, most of whom are based in Sun's Hamburg, Germany, office, as well as OpenOffice.org's overall boss, Louis Suarez-Potts, who is the community's equivalent of Linux's Linus Torvalds.
"I think Sun developers have worked hard to open up the process at OpenOffice.org, and to my mind it has shown positive results," said Bruce D'Arcus, a lead developer at OpenOffice.org who has blogged about his dissatisfaction. "But there's a fundamental contradiction between having a vibrant open community and having the process controlled by a single party."
That tight control, combined with a bureaucratic culture, has hurt OpenOffice.org's ability to roll out new features quickly and otherwise keep pace technically with Microsoft Office, say insiders. For instance, OpenOffice's current (non-Aqua) Mac version lacks rich graphics, there is no e-mail module, and the software cannot yet open or read files in the Office Open XML document format used by Office 2007.
As a result, OpenOffice and its commercial cousin, StarOffice, still own just a small slice of the office software market, though the former has been downloaded more than 96 million times. The most recent version, OpenOffice 2.3, was released Monday as the organization prepared for its worldwide developer conference in Barcelona this week.
Is Sun missing the cultural point?
There are "enough developers frustrated by both the technical and the organizational infrastructure at OpenOffice.org" that it is "a real problem that is weighing on the project," said D'Arcus, a university geography professor who participates in the project.
Or as another OpenOffice developer, Michael Meeks of Novell Inc., blogged last week: "Question for Sun mgmt: at what fraction of the community will Sun reconsider its demand for ownership of the entirety of OpenOffice.org?"
That has long hurt OpenOffice.org's attempts to recruit and, moreover, keep contributors who are not paid by Sun or other leading backers such as Novell or Google Inc. to work on OpenOffice.org.
"OpenOffice.org has a very central business process of controlling what comes into the source base and by that very system misses the point of open-source development," said Ken Foskey, an Australian open-source developer who volunteered for OpenOffice.org for three years. He left in 2005 after becoming "increasingly frustrated" with the organization's bureaucracy.

Scott Carr, a community member of OpenOffice.org, acknowledges he has lost two key members of his already-small documentation team.
"I understand where some of [the criticism] is coming from," he said.
Enter IBM, accompanied by Symphony
So does IBM Corp., which is joining OpenOffice.org and creating its own free version called Lotus Symphony, aimed at its enterprise and government customers.
"We think that there's a broad-based consensus that some governance and structural changes are in order that would make the OpenOffice project more attractive to others," Doug Heintzman, director of strategy for IBM's Lotus Software, said in an interview last week. "It's no secret that this has been an issue for us for some time, and we haven't viewed OpenOffice.org as being as healthy as it might be in this respect."
Besides committing 35 China-based developers to OpenOffice.org, IBM plans to make its voice heard -- immediately and loudly. IBM will "work within the leadership structure that exists," said Sean Poulley, vice president of business and strategy in IBM's Lotus Software division. "But we will take our rightful leadership position in the community along with Sun and others."
In e-mailed comments, Heintzman said his criticisms about the situation have been made openly.
"We think that Open Office has quite a bit of potential and would love to see it move to the independent foundation that was promised in the press release back when Sun originally announced OpenOffice," he said. "We think that there are plenty of existing models of communities, [such as] Apache and Eclipse, that we can look to as models of open governance, copyright aggregation and licensing regimes that would make the code much more relevant to a much larger set of potential contributors and implementers of the technology....
"Obviously, by joining OpenOffice.org we do believe that the organization is important and has potential," he wrote. "I think that new voices at the table, including IBM's, will help the organization become more efficient and relevant to a greater audience.... Our primary reason for joining was to contribute to the community and leverage the work that the community produces.... I think it is true there are many areas worthy of improvement and I sincerely hope we can work on those.... I hope the story coming out of Barcelona isn't a dysfunctional community story, but rather a [story about a] potentially significant and meaningful community with considerable potential that has lots of room for improvement...."
Suarez-Potts did not return repeated requests for comment. But Erwin Tenhumberg, community development and marketing manager for OpenOffice.org and a Sun employee in its Hamburg, Germany, office, where OpenOffice/StarOffice development is centered, acknowledged the criticism.
"There's a long tradition at Sun of not paying attention to outside contributors because there weren't many for a long time," said Tenhumberg, who estimated that 90 percent of the programming in OpenOffice 2.0, the last major release from two years ago, was done by Sun employees.
Alexandro Colorado, who helps run a project to create a Spanish-language version of OpenOffice, said that while many of the criticisms leveled at OpenOffice.org's management are valid, "there are other sides of the story than [just] pure bashing."
He blamed "exponential" growth in OpenOffice's code base, a situation that has been partly corrected after the group began to limit development in the core OpenOffice code and ask developers to build new features in the form of "extensions," a model successfully used by the Firefox web browser.
"So far we have exciting extensions like Google Docs integration with OpenOffice.org," Colorado wrote in an e-mail. "This would have taken ages to integrate into the code base and now it's available in a matter of weeks."
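The model Colorado describes -- freeze the core, ship new features as separately installed extensions -- is a common architecture, and can be sketched generically. This is an illustrative toy only: the `Core` class and `register`/`run` names are invented for the example and have nothing to do with OpenOffice.org's actual UNO extension API.

```python
# Toy sketch of the "stable core + pluggable extensions" pattern.
# All names here are invented for illustration; this is NOT the
# OpenOffice.org (UNO) extension mechanism.

class Core:
    """Minimal application core: stable, rarely changed."""

    def __init__(self):
        self._extensions = {}

    def register(self, name, feature):
        # New functionality arrives by registration, not by editing
        # the core code base -- which is the point of the model.
        self._extensions[name] = feature

    def run(self, name, *args):
        if name not in self._extensions:
            raise KeyError(f"no extension named {name!r}")
        return self._extensions[name](*args)


# An "extension" is just a callable shipped separately from the core.
def word_count(text):
    return len(text.split())


app = Core()
app.register("word-count", word_count)
print(app.run("word-count", "OpenOffice extensions ship separately"))  # prints 4
```

The design choice this illustrates is exactly the trade Colorado mentions: a feature added this way never touches the core, so it can ship in weeks instead of waiting for a core release.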

Another community developer, Charles H. Schulz, says that much of the criticism is simply misplaced.
"Unfortunately, some Novell engineers' behavior and vision of what the project should be and should [not be] leave me and others appalled by their misunderstanding of what a community really is," he said. "I think the real issue with Novell now has more to do with individual egos and agendas than anything else."
Convincing the mouse to roar again
Another problem with Sun is that it has taken an increasingly passive position in the past several years against OpenOffice.org's chief rival, Microsoft. Out is ex-CEO Scott McNealy, who was famous for his scripted put-downs of Microsoft, and in is Jonathan Schwartz, whose tenure has been marked by an increasing cooperation with what once was Sun's symbolic bogeyman.
For instance, Sun abstained from voting for or against ratifying the Office Open XML document format as a standard in the ISO vote earlier this month. And the one time it did weigh in, it was to express its conditional support for Open XML.
One observer close to OpenOffice.org links the change in tone to the terms of a $1.6 billion settlement paid by Microsoft to Sun in 2004 that has also resulted in technical and marketing cooperation.
IBM led the opposition against Open XML's approval. The observer expects IBM, which plans to inject 35 China-based developers into the process, to take over the role of being OpenOffice.org's public champion.
And he thinks that will be a good thing. "They'll be able to say some of the things that Sun can't," he said.
Moreover, says IBM's Poulley, "we bring our credibility and prowess in enterprise software, which has less been the forte of Sun."
But will Sun be willing to relinquish some or most of its control over OpenOffice.org? Poulley thinks the transition has already begun. Simply "by virtue of our joining, OpenOffice.org becomes a lot less Sun-dominated," he said.
And that process can't happen fast enough if the software hopes to make any dent in Office's dominance, says another expert.
"As much as people like open formats, they won't buy an inferior product," said Andy Updegrove, a Boston lawyer who represents open-source organizations and blogs about the same topics. With IBM "betting big on OpenOffice, in two and a half years we could be looking at another Mozilla situation, where Firefox has 15 percent of the market. That could lead to Microsoft modifying Office or changing its licensing or prices, which benefits the entire market."

miercuri, 19 septembrie 2007

Get a life: 10 tips for achieving a better work/life balance

As president of Encompass, a 16,000-member user group for business customers of Hewlett-Packard, Buik comes in contact with a wide variety of technology professionals who all seem to log a lot more than the traditional 40-hour workweek. "I rarely talk to anyone putting less than 60 hours a week into their jobs," says Buik, who is also senior vice president of MindIQ Corp., a Norcross, Ga., designer of technology-based training materials.

Buik herself has managed to forge a different path. She has let her staff know that once she's done for the day, that's it. They shouldn't contact her for routine issues and should text-message her only for true emergencies.

"If our Web site goes down, I need to be contacted, because our entire organization functions from our Web site," Buik explains. "If there's a simple power failure, we have backup for that, so I don't need to know immediately."

As companies increasingly look to technology to help them do more while spending less, technologists like Buik, and the IT workers she manages, are clearly feeling the squeeze.

It's pressure that hits at all levels. Some IT positions, such as help desk jobs, still tend to follow a traditional eight-hour shift, but such employees are often scheduled for evening and weekend work as well as the usual 9 to 5. Meanwhile, higher-level managers are racking up the hours at work as they try to meet tight deadlines and respond to those they serve.

Now, at all levels, IT professionals are beginning to give voice to their desire to have some time for personal pursuits. In other words, they want at least some semblance of what's known as work/life balance.

Pie in the sky?

Given the nature of IT work and the economic realities of the marketplace, achieving that kind of balance can be a tall order.

"IT workers do seem to work longer hours. Fifty hours is an average," says Lily Mok, who analyzes work/life balance in IT for Gartner Inc.

"IT work often requires [employees] to work different shifts and to be on call 24/7. And especially in recent years, as IT organizations became leaner after downsizing and outsourcing, people are required to do more work and take on more responsibilities."

According to the U.S. Department of Labor's Bureau of Labor Statistics, the average full-time worker in the U.S. puts in 9.3 hours a day. IT staffers work considerably more than that, other statistics show.

One study found, for instance, that the average workweek for software programmers, engineers and technicians ranges from 43 to 62 hours.

While those numbers might seem dire, recent developments at the corporate level have improved IT professionals' lives as companies add work/life benefits in order to attract top talent. Flexible schedules, job sharing, condensed workweeks and telecommuting are some of the options now available to technologists, says Mok.

IT people have a better work/life balance today than they did when Leo Collins started in the field about 20 years ago. Collins, CIO at Lions Gate Entertainment Corp., points out, for instance, that employees don't always have to physically be in the office to do their work nowadays.

In fact, in a study released in July by Robert Half Technology, a Menlo Park, Calif.-based IT staffing company, 44% of CIOs surveyed said their company's IT workforce is telecommuting at a rate that's the same as or higher than it was five years ago. They cited improved retention, better morale and increased productivity as the greatest benefits of telecommuting.

While that's a step in the right direction, the industry still has a long way to go. Reluctant managers and a domineering corporate culture can influence how effectively work/life benefits are implemented in an organization and how willing employees are to seek them out, Mok points out.

And Collins acknowledges that for all the progress, IT workers at his company still tend to log some serious overtime. "The norm is a lot closer to 50 to 60 hours," he admits, "but you don't always physically need to be there." Either way, "we try not to have a crisis attitude," he says. "So if you need to take care of [personal] things, we can adjust priorities and move tasks around."

How can you find ways to better balance your professional and personal time -- even if you're at a company that's less progressive on the issue? Work/life coaches, IT executives and experienced tech professionals share their strategies for finding the right balance, with these 10 tips:

1. Establish and enforce your own priorities.

Many people who want to make a change in their lives fail to first reflect on exactly what it is they want to do differently, says Kathie Lingle, director of the Alliance for Work-Life Progress in Scottsdale, Ariz.

Step 1 should always be to set your priorities, she says. "Get those straight in your mind, and [then] act on them," Lingle suggests.

Whether your goal is to be active in your community or nurture personal relationships, it's likely you'll need to make time for those priorities by limiting your hours at work -- even if that means saying no to overtime or extra projects, or to a promotion.

Brian Schultz, information assurance practice lead at the Battelle Memorial Institute in Columbus, Ohio, undertook this exercise when working as a manager with the computer risk management practice at the former Arthur Andersen LLP. He didn't want to follow the same track as the executives he knew who sacrificed fulfilling personal lives to work 60-hour weeks.

"Early on, I established a priority list: God, family, country, community and company," Schultz explains. "The company is last. If you take that strictly, of course, you'd be living on the street, so there's a definite balance between those commitments. But to be fulfilled, you need that balance."

It wasn't an empty exercise. Schultz left Arthur Andersen in 2000 because he wasn't willing to put in the 14-hour days and weekend time needed to reach the next level. Instead, he found a position with another company that offered challenging work yet still respected the work/life balance he sought.

Now at Battelle, where he works an average of 45 hours a week, Schultz says he doesn't have to sacrifice career aspirations for personal time. Unless there's a looming deadline or an after-hours client meeting, Schultz doesn't work on Sundays, and weekdays he's usually home for dinner with his family.

2. Communicate.

You've set your priorities. Now let your co-workers know about them.

"Boundaries are often invisible. No one knows they're there but you. If you don't articulate [your boundaries], then how will other people know they're crossing them?" asks Lisa Martin, founder and president of coaching and training company Briefcase Moms in Vancouver, British Columbia.

It's crucial to be clear about what you want, what you can do and what you can't do, she says. It's equally crucial, of course, to take a business approach to this step, Martin emphasizes. Find opportune times to discuss such matters, and use a neutral voice to address missteps.

If, for instance, you've negotiated the ability to leave nightly by a certain time but your boss still keeps you late, state the problem neutrally ("This is the seventh time in two months I've worked late on a Friday") and remind her of your initial boundary ("We'd agreed to a firm leaving time.")

For some employees, this step might not come naturally, especially when speaking to a supervisor, but "you've got to take your 'boundaries vitamins,'" Martin maintains. "You have to keep fortifying [your position]. It gets easier with practice," she promises.

Lingle suggests sharing not only your established priorities but also select details of your personal life with your co-workers.

It's an approach that Bob Keefe, senior vice president and CIO of Mueller Co.'s Mueller Water Products division and president-elect of the Society for Information Management, has seen put to good use firsthand.

While working at another company, his team encountered a serious error during an electronic data interchange. The team had to contact a colleague for information, although they knew he was out because his wife was heading into surgery.

"He was the kind of person who, if we made that phone call, he'd be back in the office, so we told him the program just 'ab ended,' " knowing that an abnormal end to a program wasn't serious enough to make him feel he needed to return to work, Keefe explains. Because they knew about his personal situation beforehand, the team took the trouble to glean the necessary information from their colleague without calling him back into the office.

3. Build a business case for your better life.

Savvy professionals are increasingly willing to ask for flexible schedules as part of their compensation packages when offered new jobs, Mok says.

"People with hot skills have more leverage in getting this kind of special treatment," she says, but that doesn't preclude others from negotiating additional vacation time, limited overtime hours or flexible start and end times before signing on with a new company. You can use the same strategy to negotiate benefits from your existing employer, too, Mok suggests.

Just approach the situation as you would any other business proposal: by building a business case for what you need. "You need to demonstrate, based on your previous performance, that you will be able to deliver the same results," Mok explains.

If you want to telecommute, for example, you should explain how you already successfully work without direct supervision -- making sure to include specific examples -- and how you can accomplish more without frequent office interruptions. Moreover, you should point out that a home setup is in line with your company's disaster recovery plans because it allows you to work even if the main office is empty due to, say, bad weather.

4. Take advantage of corporate policies and programs.

A survey conducted last year by OfficeTeam, a Menlo Park, Calif., staffing service, found that 53% of workers and 45% of executives said their employers were "very supportive" of efforts to achieve work/life balance. Another 37% of workers and 50% of executives said their employers were "somewhat supportive."

But work/life benefits, whether they're on-site child care, flextime or job sharing, can't help you if you don't take advantage of them. Learn what programs your company offers and consider when and how they can benefit you, Mok says.

Look at Schultz's case. His round-trip commute takes at least two hours, so he takes advantage of Battelle company policy and telecommutes when he has a daytime appointment close to home. "It's a huge relief of pressure, and it saves a great deal of time," he says.

5. Seek out a mentor.

"Look to people who you feel have a good work/life balance and ask them, 'How did you accomplish this?'" advises Katherine Spencer Lee, executive director of Robert Half Technology.

Brian Abeyta, vice president of IT at insurance provider Aflac Inc. in Columbus, Ga., remembers admiring a supervisor who was gifted at managing both her executive-level job and her life as a mom.

"It forced me to appreciate very disciplined time management," he says, noting that his superior was very good about dedicating her time and focus to the task at hand. "She set a schedule and committed to that. Wherever she was, she was at that place and wasn't thinking about where she had to be next," he observes.

That kind of focus and discipline, both from the Aflac executive and from professionals Abeyta had known at previous companies, helped him and his co-workers learn how to honor their own personal priorities while still fulfilling their job requirements. "It showed that we could respect each other's time, and that we need to respect each other's lives," Abeyta says.

6. Work more efficiently.

Seasoned tech workers know when they need to rush back to the office and when they can dial in and troubleshoot remotely, says Natalie Gahrmann, a work/life expert at N-R-G Coaching Associates in Hillsborough, N.J.

She points to her husband, an IT director, and his own work habits as an example: He recently drove to his company's New Jersey backup site rather than to his Manhattan office to handle an off-hours problem, saving precious personal time in the process.

Another way to work more efficiently: Tap the expertise of a professional group, Buik suggests. "You become more efficient with your time at work when you can share issues with others," she says. "That's less time dealing with certain problems, which means all of a sudden you have more time at home."

One of Buik's colleagues, for example, recently researched details on HP C-Class blade servers through the Encompass user group and was able to make a purchasing decision based on that interaction, which saved him from conducting hours of research on his own.

7. Share your knowledge.

It's often fulfilling to be an expert on a specialized program, but Keefe warns against being the only one in the know.

"In the cases when you're a de facto expert, you want to pull teammates in and train them," even if you have to take the initiative to make that training happen, he explains. "You want to share that knowledge, because if you have an on-call structure, then you won't have to always be the only one on call."

8. Use your gadgets.

There's no doubt that Keefe is a fan of gadgets: like many others, he uses mobile devices to get work done whenever, wherever.

Moreover, he says, mobile devices can be tied into the office network, allowing employees to not only receive automated messages about potential problems but also to troubleshoot from wherever they are.

"IT professionals have invented everything that lets people work from wherever, [so] no one in IT should be enslaved to a particular place," Lingle adds.

9. Use your gadgets wisely.

Consider this statement: "Devices like BlackBerry chain you to work more than they liberate you." In the Digital Life America survey released in February 2007 by Solutions Research Group, one-third of respondents agreed with that statement -- and they were themselves all users of BlackBerries, Palm Treos and other smart phones.

It doesn't have to be that way -- if you're willing to put your foot down about how much of your attention such devices can demand.

When Steve Davidek, a systems administrator for the city of Sparks, Nev., got a BlackBerry about a year ago, he quickly found himself dealing with e-mails at all sorts of times and places. He reassessed his situation and decided to stop checking e-mails during off-hours. Instead, colleagues know to reach him via phone to relay news of problems that truly need his immediate attention. "I need a cell phone; I don't need a leash," he explains.

10. Maintain perspective.

It's easy to feel your life is out of whack when a looming deadline or major systems failure has everyone in overdrive. Before you panic or throw in the towel and quit, take a deep breath, the experts advise.

"You're going to have blips; that's just life," Briefcase Moms' Martin says.

Martin suggests that instead of focusing on how tough you have it at any particular moment -- or, worse yet, making decisions based on short-term problems -- you should take a long-term perspective and consider how you're working to achieve your work and life goals.

She remembers coaching one working mother who had worked hard to develop highly specialized skills that were in high demand and yet "felt like she was chasing her tail all the time and felt her only solution was to find a different job."

When Martin asked her client to consider what she liked about her career and what she wanted from her job and personal life, the woman realized she liked her work; she just didn't like the hours. In the end, rather than walking away from a job she liked, the woman negotiated a four-day workweek that allowed her the extra time at home that she wanted.

"Sometimes this is difficult," Martin says, "but work/life balance is about being clear about what your boundaries are and then communicating them."

Saturday, September 15, 2007

Google calls for global online privacy standard

Search giant Google Inc. will propose on Friday that governments and technology companies create a transnational privacy policy to address growing concerns over how personal data is handled across the Internet.

Google's global privacy counsel, Peter Fleischer, will make the proposal at a United Nations Educational, Scientific and Cultural Organization meeting in Strasbourg, France, dealing with the intersection of technology with human rights and ethics.

Fleischer's 30-minute presentation will advocate that regulators, international organizations and private companies increase dialogue on privacy issues, with the goal of creating a unified standard.

Google envisions the policy as a product of self-regulation by companies, improved existing laws and possibly new ones, according to a Google spokesman based in London.

"We don’t want to be prescriptive about who does that and what those standards are because it should be a collaborative effort," the spokesman said.

Other organizations have already made progress on privacy standards, he said. For example, Asia-Pacific Economic Cooperation (APEC) created a nine-point Privacy Framework designed to aid countries without existing policies.

However, the framework has been criticized for vagueness and has been only partially implemented by APEC members, said David Bradshaw, principal analyst at Ovum PLC.

E.U. privacy regulations are already more stringent than APEC's recommendations, which highlights the difficulty of creating a global standard that meets existing regulatory requirements in different geographic areas, he said.

"From Google's viewpoint, they can't expect the E.U. and those nations that have higher privacy standards to level down to the APEC standards," Bradshaw said.

Google's increasing power in search, Internet commerce and software services has placed its privacy policies under scrutiny.

In June, Google said it would anonymize the end-user data it stores in its server logs after 18 months, part of an effort to defuse concerns about privacy raised by a European Union (E.U.) working group composed of data protection officials from 27 countries.

Google took a further battering after it acquired DoubleClick Inc., an online advertising company that uses technology to track user trends in order to serve them targeted ads. The technology, also used by many other Internet advertising companies, has raised privacy concerns.

A European consumer group, the Bureau Européen des Unions de Consommateurs, asked the European Commission and other authorities in July to investigate how the DoubleClick deal would affect consumers.

The focus on privacy by governments and individual Internet users has resulted in localized legislation, causing a fragmentation in privacy regulations, Google's spokesman said.

That can make it difficult for e-commerce businesses, as an increasing amount of data is routinely crossing international borders through credit card transactions, he said.

"We really hope that this sparks a sustained, thoughtful, creative debate," he said.

Update: SCO files for Chapter 11 bankruptcy protection

The SCO Group Inc. today filed for Chapter 11 bankruptcy protection, just a month after losing several key court rulings in its legal fight against Novell Inc., IBM and others over Unix intellectual property that SCO asserts it owns.

In an announcement today, Lindon, Utah-based SCO said it filed a voluntary petition for reorganization for itself as well as for its subsidiary, SCO Operations Inc.

"The board of directors of The SCO Group have unanimously determined that Chapter 11 reorganization is in the best long-term interest of SCO and its subsidiaries, as well as its customers, shareholders and employees," the company said in a terse press release late today.

The company said its normal business operations will continue throughout the bankruptcy proceedings.

"We want to assure our customers and partners that they can continue to rely on SCO products, support and services for their business critical operations," Darl McBride, the company's CEO and president, said in a statement. "Chapter 11 reorganization provides the company with an opportunity to protect its assets during this time while focusing on building our future plans."

On Monday, SCO is expected to be in court for a trial that will determine how much money it might have to pay Novell for Unix licensing revenue collected by SCO over the last several years.

Dan Kusnetzky, an analyst at the Kusnetzky Group in Osprey, Fla., said the bankruptcy filing by the company was not unexpected.

"It seemed to be a very odd strategy to go -- and before they even had established a court precedent -- to attack IBM with litigation and go after Novell and go after customers," he said. "I don't believe that at any point that they made it clear what they thought the [Unix code] infringement was."

Kusnetzky called the bankruptcy filing "an interesting move when they are facing a court battle where almost every single one of their [legal] pillars has been pulled out."

Kusnetzky said SCO "believed that IBM would pay them a lot of money to shut up" after SCO sued IBM in March 2003 in what became a $5 billion lawsuit alleging that IBM had illegally contributed some of SCO's Unix code to the Linux open-source project. "Instead, they got an IBM that wanted to fight them.

"I didn't think they could win and I think this is evidence that that's true," he said.

Al Gillen, an analyst at IDC in Framingham, Mass., said the move today was not a shock because of the steady decline in SCO's financial performance since it began its legal battles four years ago.

"SCO had a tough problem on their hands -- even if you date back six years ago -- about what they were going to do with Linux," which was invading parts of its Unix customer base, Gillen said. By filing lawsuits against IBM and Novell, SCO "risked alienating customers and partners. I'm not sure if that's why they're in Chapter 11 now, but it couldn't have helped."

Rival Sun Microsystems Inc. has faced similar market challenges in recent years but took a different approach to adjusting, Gillen said, including moving key products like Sun Solaris to open source. "What's clear is that going after this from a litigation approach [as SCO did] wasn't going to work for Sun."

SCO will now have to see how the bankruptcy filing affects its recent mobile software initiatives, Gillen said. "From a bigger picture, some of their mobile applications are looking interesting, if they can stay alive to market it."

SCO and Novell have been fighting over who legally owns the rights to Unix and UnixWare since 2003, when SCO filed its suit against IBM. SCO sued Novell directly in 2004; Novell filed counterclaims disputing SCO's case.

Last month, U.S. District Court Judge Dale A. Kimball in Salt Lake City undercut much of SCO's case in a ruling that declared Novell the owner of the Unix and UnixWare copyrights. As a result, a bench trial that begins Monday will determine how much money SCO might now have to pay Novell for Unix licensing revenue it received from Sun Microsystems and Microsoft Corp.

Earlier this month, in an interview with Computerworld, SCO CEO Darl McBride said his company will continue to fight its legal case, despite recent setbacks.

The trial in SCO's case against IBM is not expected to begin until next year and will likely be affected by the outcome of the SCO-Novell fight.

Friday, September 14, 2007

Microsoft downplays stealth update concerns

Microsoft Corp. today essentially called the concerns over undercover updates to Windows XP and Windows Vista a tempest in a teapot, saying that silent modifications to the Windows Update (WU) software have been a longtime practice and are needed to keep users patched.

"Windows Update is a service that primarily delivers updates to Windows," said Nate Clinton, a program manager in the WU group, on the team's blog today. "To ensure ongoing service reliability and operation, we must also update and enhance the Windows Update service itself, including its client-side software."

Microsoft was moved to respond after the popular "Windows Secrets" newsletter looked into complaints that WU had modified numerous files in both XP and Vista, even though users had set the operating system to not install updates without their permission. In many cases, users who dug into Windows' event logs found that the updates had been done in the middle of the night.

Windows gives users some flexibility in how their PCs retrieve and install updates and patches from the company's servers. In Vista, for example, users can turn off Automatic Updates entirely; allow the operating system to check for, but neither download nor install, any fixes; or allow it to download files but not install them.
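For reference, these notification levels map onto documented Automatic Updates policy values in the Windows registry. The following is a sketch based on Microsoft's WSUS deployment documentation, not an exhaustive list (a fifth value lets local administrators choose the setting):

```
; Key: HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU
;
;   NoAutoUpdate = 1   -> Automatic Updates turned off entirely
;   AUOptions    = 2   -> notify before downloading or installing updates
;   AUOptions    = 3   -> download automatically, notify before installing
;   AUOptions    = 4   -> download and install automatically on a schedule
```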

Clinton tackled the stealth install issue in some detail. "One question we have been asked is why do we update the client code for Windows Update automatically if the customer did not opt into automatically installing updates without further notice? The answer is simple: Any user who chooses to use Windows Update either expected updates to be installed or to at least be notified that updates were available."

Failing to do so, he argued, would have ultimately run counter to what a user wants and needs. "Had we failed to update the service automatically, users would not have been able to successfully check for updates and, in turn, users would not have had updates installed automatically or received expected notifications." The result, he said, would be to leave users at risk to attack via vulnerabilities Microsoft has patched. "That would lead users to believe that they were secure, even though there was no installation and/or notification of upgrades."

In fact, the practice has been going on for some time, Clinton claimed. "The Windows Update client is configured to automatically check for updates anytime a system uses the WU service, independent of the selected settings for handling updates. This has been the case since we introduced the Automatic Update feature in Windows XP. In fact, WU has autoupdated itself many times in the past," he said.

That would be news to the majority of people who filled several threads on Microsoft's own support newsgroups starting in late August. "I found this information by myself, checking the Windows directories," griped someone identified as Frank. "But the point is that I didn't allow the update (Automatic Update properties on 'notify') and there is no information about this update on Microsoft [Web pages]. Why [didn't] Microsoft publish any information about this update?"

Clinton also disputed user accounts of stealth updates to WU even when they had completely disabled the automatic update feature in the operating system. "WU does not automatically update itself when Automatic Updates is turned off; this only happens when the customer is using WU to automatically install upgrades or to be notified of updates," said Clinton.

He did issue a mea culpa -- of sorts. Although he stopped short of apologizing for the lack of information, he said Microsoft is considering changes. "[This] is not to suggest that we were as transparent as we could have been," he admitted. "To the contrary, people have told us that we should have been clearer on how Windows Update behaves when it updates itself. We are now looking at the best way to clarify WU's behavior to customers so that they can more clearly understand how WU works."

That's crucial for both end users and companies, said Andrew Storms, director of security operations at nCircle Network Security Inc., a security and compliance vendor. "The question is, why haven't users been more clearly educated that this is the way [WU] updates work?" Storms asked. "I'm glad to see software updated, but the better tack would have been to fully explain this.

"Frankly, this surprises me a bit. Microsoft's making an effort to provide us with more information, especially in the last year."

Microsoft didn't completely address one question Storms had, however: In corporations, where system integrity is not only demanded, but often crucial, how is Microsoft handling these kinds of updates to the WU client files on machines patched through Windows Server Update Services (WSUS), the server-side update manager?

Microsoft's Clinton mentioned WSUS in passing. "[For] enterprise customers who use WSUS or Systems Management Server, [the now-obsolete predecessor to WSUS], all updating, including the WU client, is controlled by the network administrator, who has authority over the download and install experience."

Microsoft's own technical documentation is unclear as to exactly what control administrators have over Automatic Updates. In a page headlined "Automatic Updates client self-update feature," WSUS administrators are told much the same as consumers, in some of the same language Clinton used in his blog. "Each time Automatic Updates checks the public Web site or internal server for updates, it also checks for updates to itself. This means that most versions of Automatic Updates can be pointed to the public Windows Update site, and they will automatically self-update," the document reads.

That's exactly how the process works for users not connecting to a WSUS-equipped server.

Even the alternative -- "You can also use the WSUS server to self-update the client software," the document said -- doesn't spell out what oversight administrators have over the modifications. In fact, this approach relies on the Internet Information Services (IIS) component of Windows Server to ping the same public servers Microsoft uses to push WU updates to anyone not using WSUS.

IIS, according to another support document, feeds the updates to a virtual directory named "Selfupdate" under the Web site running on port 80 of the WSUS server. Dubbed the SelfUpdate Tree, this folder contains the WSUS-compatible Automatic Updates software, said Microsoft.

The company did not provide more information on how, or whether, silent updates are processed by WSUS.

"This could be a very big deal to enterprises," said Storms, depending on exactly what happens in a WSUS environment. "You just don't want unknown files installed or changed."

And it's the not-knowing that bothers him. "What's really interesting here is that we don't know, do we?" said Storms. "We're looking for a more holistic view of what WU does. And Microsoft hasn't given it to us."

Help wanted: IT workers with server virtualization skills

SAN FRANCISCO – As more organizations adopt server virtualization software, they're also looking to hire people who have worked with the technology in live applications. But that experience is hard to find, as Joel Sweatte, director of IT at East Carolina University's College of Technology and Computer Science, recently discovered when he advertised for an IT systems engineer.

Sweatte received about 40 applications for the job at the Greenville, N.C.-based college, but few of the applicants had any virtualization experience, and he ended up hiring someone with none. "I'm fishing in an empty ocean," Sweatte said.

To give the new employee a crash course in virtualization, Sweatte brought him to market leader VMware Inc.'s annual user conference here this week. "That's a major expenditure for a university," Sweatte said of the conference and travel costs. "[But] I wanted him to take a drink from the fire hose."

Sweatte isn't the only one who has had trouble finding IT workers with virtualization skills. VMware said VMworld 2007, which ends today, drew more than 10,000 attendees -- up from about 7,000 at last year's event. But in interviews at the conference, it was common to find attendees who were new to virtualization and largely self-taught on the technology.

For instance, Jeff Perry, IT manager at HealthBridge, a not-for-profit organization in Cincinnati that electronically connects area hospitals and other medical facilities so doctors can exchange patient data, began deploying virtualization software six months ago. He came to VMworld to pick up some more technical skills and said he plans to spend a lot of time teaching himself about virtual systems.

The conference was a good starting point for learning about the technology, Perry said, "but there is so much research that you have to do after this."

And there's no question in Perry's mind that virtualization has become a critical IT component. "Hardware right now is so underutilized," he said. "To carve out spaces for virtual machines is the wave of the future."

IT professionals can certainly train themselves to work with virtualization software, VMworld attendees said. But, they added, it helps to have a broad base of data center skills beforehand.

"In the old days, you really just needed to understand the server -- now you have to understand not just the server, but the command lines of the Linux operating system, networking, how switches work, storage and fiber connections," said Kirk Marty, a senior systems engineer at Minneapolis-based Jostens Inc., which makes class rings, yearbooks and other products.

Michael Youngers, a lead systems administrator for the storage and storage-area networking groups at Carter & Burgess Inc. in Fort Worth, Texas, said that when the engineering and consulting firm decided to adopt virtualization about six months ago to improve its disaster recovery capabilities, he taught himself how to use the software. "I stumbled into it," he said.

But after seeing how virtualization has led to server consolidation, the removal of old hardware and lower power and cooling costs at Carter & Burgess, Youngers is convinced that it's a need-to-know technology for IT workers. "You are going to have to get on board," he said.

Peter Marx, chief IT architect at Knorr-Bremse GmbH, a Munich-based manufacturer of truck and railroad components, has been involved in x86 server virtualization for several years, making his company a relatively longtime user. When Knorr-Bremse started out with the technology, Marx couldn't hire anyone with virtualization skills. Such people "simply weren't available then," he said.

Workers at the company attended some training programs, Marx said. But mostly, "they simply did it," he added. "It's more of a German-type approach."