HP equips desktop PCs with SSDs for faster Windows 7 boots

Addressing complaints about slow Windows startup times, Hewlett-Packard Co. plans to introduce a new business desktop PC that comes with a solid-state disk drive (SSD) to speed up Windows 7 and other applications. Available on Oct. 22 - the same day Windows 7 is officially released - the HP Compaq 6005 Pro will also come with a larger conventional hard disk drive connected via a SATA interface for storing data and documents, said Martha Rost, worldwide product manager for business PCs at HP. HP calls the combo drive configuration its RapidDrive technology. The 64GB SSD in the 6005 Pro will be used to store and run Windows and commonly accessed applications. The AMD-based 6005 Pro with RapidDrive costs $774, about $200 more than the $550 starting price of the 6000 Pro, which lacks an SSD.

As for just how rapid the 6005 Pro is, "We didn't run explicit tests. But it boots up pretty quickly," Rost said. "You'll definitely see a difference." She said that RapidDrive uses the Samsung PM800 SSD, which is based on multi-level cell (MLC) technology. The SATA-connected PM800 can read data sequentially at a maximum rate of 225.4 MB per second and write it sequentially at 160 MB per second, HP said. That is about twice as fast as the zippiest consumer (7,200 RPM) SATA hard disk drives. Sequential data rates apply to large files such as movies or songs.

However, the PM800's performance - especially when writing small chunks of data - is much less impressive. When doing random reads and writes, the Samsung drive is rated at a maximum of 27.4 MB per second and 4.2 MB per second, respectively, which is far slower than conventional hard drives. And they inevitably get even slower over time: SSDs are slower than conventional hard drives when recycling old blocks of data and doing other "garbage collection" tasks.
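To put those figures in rough perspective, here is a minimal back-of-the-envelope sketch that uses only the rates quoted above; the 1GB workload size and the roughly 100 MB/sec. sequential figure assumed for a 7,200-rpm hard drive are illustrative guesses, not HP or Samsung numbers.

```python
# Rough transfer-time comparison using the rates quoted for the Samsung PM800.
# The workload size and the 7,200-rpm HDD figure are illustrative assumptions only.

RATES_MB_PER_SEC = {
    "PM800 sequential read": 225.4,          # quoted by HP
    "PM800 sequential write": 160.0,         # quoted by HP
    "PM800 random read": 27.4,               # quoted maximum
    "PM800 random write": 4.2,               # quoted maximum
    "7,200-rpm HDD sequential read": 100.0,  # assumed ballpark figure
}

def transfer_seconds(size_mb: float, rate_mb_per_sec: float) -> float:
    """Seconds needed to move size_mb of data at a sustained rate."""
    return size_mb / rate_mb_per_sec

if __name__ == "__main__":
    workload_mb = 1024  # e.g., reading ~1GB of OS and application files
    for label, rate in RATES_MB_PER_SEC.items():
        print(f"{label:32s} {transfer_seconds(workload_mb, rate):7.1f} s per 1GB")
```

In this simplified model, reading 1GB sequentially from the PM800 takes under five seconds, while writing the same amount in small random chunks takes roughly four minutes, which is why the random-write figure matters so much for everyday responsiveness.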

Roger Kay, an independent analyst with Endpoint Technologies Associates Inc., said this isn't the first time hardware makers have tried technology aimed at speeding up Windows. Intel introduced a technology called Robson that placed a flash memory cache on notebook motherboards, and Samsung and Seagate both introduced hybrid drives for notebooks that combined a conventional spinning disk with a smaller flash SSD used to run Windows and popular apps. Such technologies haven't taken off, in large part because of a lack of need: most laptops can quickly go into sleep or hibernate mode, and waking from a suspended state is much faster than booting a PC. Microsoft has also tried to speed up Windows. A feature introduced in Vista called ReadyBoost let users plug in a flash drive and use it as a disk cache to speed up Windows, but the performance gains were small, users said. Microsoft has promised that Windows 7 will boot and run faster than Vista. Kay doesn't think RapidDrive, at least in its current form, will spread to laptops, since it requires two drives.

FAA glitch shines spotlight on troubled telco project

The outage of a computer system used by airline pilots to file flight plans in the U.S. will likely prompt a closer look at a $2.4 billion telecommunications system that has grappled with numerous problems in the past. The U.S. Federal Aviation Administration (FAA) offered few details Thursday about the exact nature of the glitch, which caused major delays and flight cancellations at airports across the country. But in a statement, the agency blamed a "software configuration problem" within the FAA Telecommunications Infrastructure (FTI) in Salt Lake City.

That problem brought down a system used mainly for traffic flow and flight planning services for about four hours this morning. The flight management system - called the National Airspace Data Interchange Network (NADIN) - was affected because it relies on FTI services to operate, the FAA said. There was no indication that the disruption was the result of a cyberattack, the agency said. FAA experts were investigating the outage and meeting with Harris Corp., the company that manages FTI, to "discuss system corrections to prevent similar outages," the agency said.

In an e-mailed statement, a Harris spokesman said the company is working to "evaluate the interruption" to prevent future outages. "FTI has proven to be one of the most reliable and secure communications networks operating within the civilian government. Safety and security is our highest priority," the company said. The $2.4 billion FTI program was introduced by the FAA in 2002 to replace seven FAA-owned and leased telecommunications networks. It provides a range of voice, data and video communication services for operations and mission support at more than 4,000 FAA and Defense Department facilities, according to Harris. A spokeswoman for the Professional Airways Systems Specialists (PASS) union, which represents more than 11,000 FAA employees, told Computerworld the problem arose when scheduled maintenance on FTI in Los Angeles corrupted a router in Salt Lake City. A back-up router that should have kicked in when the primary router went down failed to do so, resulting in the widespread outage, she said.
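The failure the union spokeswoman describes - a primary device going down and a standby that never takes over - is the textbook case that failover logic is meant to handle. Purely as an illustration of the concept (none of this reflects how FTI or Harris's equipment is actually built, and the addresses, port and thresholds below are invented), a minimal health-check watchdog might look like this:

```python
# Illustrative failover watchdog: probe a primary device's management port and
# promote the standby after several consecutive failed health checks.
# All hosts, ports and thresholds are hypothetical; this does not describe FTI.
import socket
import time

PRIMARY = ("192.0.2.1", 22)    # TEST-NET address, stand-in for a primary router
STANDBY = ("192.0.2.2", 22)    # stand-in for the backup router
FAILURE_THRESHOLD = 3          # consecutive failed checks before failing over
CHECK_INTERVAL_SEC = 10

def is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def promote_standby() -> None:
    """Placeholder for the real switchover action (routing change, alarm, etc.)."""
    print(f"FAILOVER: promoting standby {STANDBY[0]}")

def watchdog() -> None:
    failures = 0
    while True:
        if is_reachable(*PRIMARY):
            failures = 0
        else:
            failures += 1
            print(f"health check failed ({failures}/{FAILURE_THRESHOLD})")
            if failures >= FAILURE_THRESHOLD:
                promote_standby()
                return  # hand off to the standby and stop watching the old primary
        time.sleep(CHECK_INTERVAL_SEC)

if __name__ == "__main__":
    watchdog()
```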

The FTI network provides switching and routing services, as well as centralized infrastructure security monitoring services, for the FAA. An audit of the program released by the FAA's inspector general last September cited concerns over delays in the project's implementation and doubts about the promised cost benefits the network was supposed to yield. The report also noted several technical problems that had caused unscheduled outages to air traffic control operations. "In some cases, these outages have involved simultaneous loss of both primary and back-up FTI services, which not only disrupts air travel but also creates potential safety risks," the inspector general's report warned, pointing to several incidents in recent years. On Sept. 25, 2007, for instance, all FTI services were lost at the Memphis Air Route Traffic Control Center (ARTCC), disrupting air traffic control for several hours and causing 566 flight delays, the report said. The problem stemmed from a "catastrophic failure" of an optical network ring that was supposed to offer built-in fault tolerance, and the FAA was vulnerable to the same issues in Atlanta and Jacksonville. Another incident occurred on Nov. 9, 2007, when all primary and alternate FTI services were lost at Jacksonville, resulting in 85 flight delays. "We also found that when FTI outages occur, the services are not always restored within contractual timeframes," the inspector general's report said. In some cases, where services were supposed to be restored within three hours, Harris took twice as long to fix the problem. "Several areas remain critical watch items for decision makers as FAA moves forward with FTI," the report said.

Others have been critical of the program as well. PASS, for instance, has in the past voiced concern over safety and efficiency issues related to FTI. PASS spokeswoman Kori Blalock Keller said the FAA needs to hold Harris accountable for the problems. "If they are going to provide service, we need to make sure they are reliable and they are quick" to respond to outages, she said. Although several FAA technicians were on hand in Salt Lake City today, they couldn't do much to help because the FTI system is managed by Harris, she said. According to Keller, the incident will likely prompt Congress to ask the FAA inspector general for another review of the system.

The National Air Traffic Controllers Association (NATCA) has also expressed frustration over FTI. After the failure in Memphis, the organization blasted the network as "unreliable [and] lacking suitable backup" and called it a source of "great frustration and deep concern" for FAA technicians and air traffic controllers. Bill Curtis, chief scientist at CAST Software and co-author of the Capability Maturity Model used in software development today, said the outage highlights the havoc that can be created when something goes wrong in large, highly interconnected systems such as the FAA air traffic control system. "It's not just one system, but a system of systems," he said. "If one of them starts behaving in a funny way, it starts propagating out and causes problems in other systems," said Curtis.

VMware's Fusion 3.0 release leads to confusion

VMware Inc. is having trouble getting VMware Fusion 3.0 out its download door and is fielding customer complaints about timeouts and licensing problems. Shortly after the software was released Tuesday for download by customers, VMware issued a support alert about its upgrade portal, blaming "overwhelming demand" for the upgrade problems. The alert remained on the Web site early this morning.

Fusion 3.0 is virtualization software that allows Windows and other guest operating systems to run on Intel-based Mac OS X. VMware Workstation 7, also released Tuesday, is a virtual machine platform that supports multiple operating systems on a PC. Most of the VMware portal problems appear to be with Fusion. One problem was difficulty in getting activation codes for the new products. In a blog post, Pat Lee, director of VMware's personal desktop products, posted the 30-day free trial key as a workaround. "Because we've seen even more demand than anticipated, the VMware Fusion upgrade portal is having significant problems keeping up with the demand," Lee wrote in a post Tuesday afternoon. "While we have already transacted thousands of upgrades today and many people are able to get the product, I apologize immensely to those of you who are anxious to get the product immediately and are running into issues." Responded one user, Miku, in a comment field: "I'm very happy that you posted a temporary serial for us to try it out, the license server problems were driving me insane, I was really thinking I was insane." Rob Enderle, an independent IT analyst in San Jose, said the demand for the product would imply that a lot of people suddenly want to run Windows on a Mac, "so many that it is crashing VMware's servers." "VMware is largely a server company and not really used to the kinds of numbers that can be generated by a popular desktop offering. You jump from 100s for a server application to millions for a popular desktop application under load, and this looks like VMware wasn't ready for this jump," Enderle said. A VMware spokesman said the company wouldn't provide details beyond what was in the blog post.
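The symptom users reported - activation requests timing out against an overloaded licensing server - is the kind of load problem client software typically mitigates with retries and exponential backoff. The sketch below is generic and hypothetical; the endpoint URL and payload have nothing to do with VMware's actual activation protocol.

```python
# Generic retry-with-exponential-backoff sketch for calling an overloaded
# activation endpoint. The URL and payload are placeholders, not VMware's API.
import time
import urllib.error
import urllib.request

ACTIVATION_URL = "https://activation.example.com/activate"  # hypothetical endpoint

def activate(serial: str, max_attempts: int = 5, base_delay: float = 2.0) -> bytes:
    """POST a serial number, retrying with exponential backoff on failure."""
    data = f"serial={serial}".encode()
    for attempt in range(1, max_attempts + 1):
        try:
            with urllib.request.urlopen(ACTIVATION_URL, data=data, timeout=10) as resp:
                return resp.read()  # e.g., an activation token
        except (urllib.error.URLError, TimeoutError) as exc:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            delay = base_delay * (2 ** (attempt - 1))
            print(f"attempt {attempt} failed ({exc}); retrying in {delay:.0f}s")
            time.sleep(delay)

if __name__ == "__main__":
    print(activate("XXXX-XXXX-XXXX"))
```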

The site issues may be an indication that Paul Maritz, VMware's CEO, who was appointed last year and is a longtime Microsoft veteran, assumed that the demand had been anticipated by his staff, Enderle said. "This should be one hell of a wake-up call for him, not unusual for a new CEO, and it will remind him that he needs to test his assumptions, because what he assumes, and what turns out not to be true, can be very damaging," he said. The company was also addressing upgrade issues via a Twitter account, vmwarefusion.

Novell grabs for big role in virtualization security

Novell this week will lay out an ambitious plan to secure applications across heterogeneous virtualization platforms at customer sites and off-premises, an effort designed to play off Novell's strengths in network and identity management. Under the plan, workloads will maintain security and compliance policies, along with real-time reporting and monitoring capabilities, wherever they go. Novell's Intelligent Workload Management initiative will be designed for the creation of application workloads, described by the company as portable, self-contained units of work built through the integration of the operating system, middleware and application, to run on server virtualization products from VMware, Microsoft and Citrix, among others. The company says it will roll out eight products over the next year to support the plan. "It's somewhat revolutionary," said CEO Ron Hovsepian during an interview with Network World. "[In the core trends around virtualization and the cloud] what's bogging down the CIO? Security." While Novell is taking an aggressive approach, other management vendors such as HP and IBM are also in the mix.
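Novell has not published a format for these workloads, but the idea it describes - an operating system, middleware and application bundled with policies that travel with them from one hypervisor or cloud to another - can be sketched in a few lines of purely hypothetical code; none of the field names or values below come from Novell.

```python
# Illustrative-only sketch of a "workload" as Novell describes it: a portable,
# self-contained unit bundling OS, middleware and application together with the
# security and compliance policies that follow it between hypervisors and clouds.
# Field names and values are hypothetical, not a Novell format.
from dataclasses import dataclass, field

@dataclass
class Policy:
    name: str            # e.g., "identity binding" or "PCI logging"
    enforcement: str     # e.g., "real-time monitoring", "deny-on-violation"

@dataclass
class Workload:
    os: str                          # e.g., "SUSE Linux Enterprise 11"
    middleware: list[str]            # e.g., ["Apache 2.2", "PostgreSQL 8.4"]
    application: str                 # e.g., "expense-reporting app"
    policies: list[Policy] = field(default_factory=list)
    target: str = "VMware ESX"       # could also be Hyper-V, XenServer, a cloud...

    def move_to(self, new_target: str) -> None:
        """Re-target the workload; its policies travel with it unchanged."""
        self.target = new_target

if __name__ == "__main__":
    wl = Workload(
        os="SUSE Linux Enterprise 11",
        middleware=["Apache 2.2"],
        application="expense-reporting app",
        policies=[Policy("identity binding", "real-time monitoring")],
    )
    wl.move_to("a public cloud")  # policies and monitoring hooks stay attached
    print(wl)
```

The point of the sketch is only that the policy objects remain attached to the workload as it is re-targeted, which is the behavior Novell says its initiative is meant to guarantee.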

The foundation components for security in Novell's Intelligent Workload Management initiative include capabilities available in Identity Manager 4 for real-time provisioning, reporting and management, as well as the already announced Cloud Security Service, also expected to debut in 2010. Novell plans during the first quarter of next year to release a tool simply called Workshop that customers can use to build their workloads on Linux and Windows. Offerings following that will include the SUSE Appliance Toolkit for deploying and maintaining Linux-based appliances in physical and virtual environments, covering update, access and configuration. Novell today offers management products under the brands PlateSpin Workload Management and Business Management, but the company will introduce new products that integrate and extend management capabilities to the cloud. These will include PlateSpin "Atlantic," a self-service provisioning portal; PlateSpin "Bluestar," for physical server change and configuration management and monitoring; and ZENworks "Workbench," a master repository and change/control system for on-demand deployment of workloads.

Analysts say Novell's initiative is likely to first win adoption among the company's existing customers that are virtualizing their servers and already using products such as Novell's Identity Manager. But Novell's approach should catch the eye of non-Novell customers, too, industry watchers say. "Today, the workload is moving around. You might shift it to New York, for instance, if your main usage is there, and traditional firewalling and identity management aren't enough anymore," says James Staten, principal analyst at Forrester. "You want something very lightweight that sets policy and identity in the application." Novell needs its Intelligent Workload Management effort to pay off in light of falling revenue and growing losses, even as the company's Linux-based products business is on the rise (the company last week posted a 12% revenue dip for its fiscal fourth quarter and a loss of $256 million, which swelled in large part due to acquisition and other costs).

Another new product is expected to be Compliance Automation, which integrates Sentinel security information and event monitoring with Business Service Manager for monitoring events. Other Novell products, including Business Service Manager, Business Experience Manager, myCMDB, Sentinel and Sentinel Log Manager, will also be tailored for service-level reporting of workloads across physical, virtual and cloud environments. While there are a lot of unknowns about how exactly the effort will play out, Forrester's Staten says Novell is bringing a strong argument to the table about managing workloads in a virtualized environment. He notes this has been a weighty topic for other vendors, including HP with its Orchestrator, though that product is not oriented toward heterogeneous virtualization. Mary Johnston Turner, research director at IDC, says Novell is addressing a new set of requirements. "As we move into dynamic virtualization, you need to integrate and have a more policy-based approach," she says.

However, Turner notes that this approach presents challenges in that organizations have an installed base of management tools and are not set up to run the way that Novell envisions. There is also the practical matter of seeing how well Novell delivers on its promises, though she gives the vendor "credit for being early to the game." Laura DiDio, principal analyst at Information Technology Intelligence Corp., points out that there's a real need for a heterogeneous approach to container virtualization, because companies often do use more than one type of server virtualization. In a recent survey of about 1,000 organizations, DiDio said, almost 40% used multiple types of server virtualization. "They get VMware, Microsoft, Citrix, and a real surprise, Parallels for the Mac," she says. DiDio says Novell's core identity and security technologies are widely regarded as "top-notch" and that it makes complete sense to include them in the container approach outlined in the Intelligent Workload Management strategy. This will potentially enable Novell customers to "quickly and safely" go into public and private cloud computing "because it will minimize the risk" at deployment. She also notes it will probably represent a huge opportunity for Novell to get its foot in the door to woo new customers.

Google, Verizon issue joint statement on network neutrality

The following is a joint statement from Lowell McAdam, CEO of Verizon Wireless, and Eric Schmidt, CEO of Google, regarding network neutrality. (Cross-posted on the Verizon Policy Blog and the Google Public Policy Blog.)

Verizon and Google might seem unlikely bedfellows in the current debate around network neutrality, or an open Internet. And while it's true we do disagree quite strongly about certain aspects of government policy in this area - such as whether mobile networks should even be part of the discussion - there are many issues on which we agree. For starters, we both think it's essential that the Internet remains an unrestricted and open platform, where people can access any content (so long as it's legal), as well as the services and applications of their choice.

There are two key factors driving innovation on the web today. First is the programming language of the Internet, which was designed over forty years ago by engineers who wanted the freedom to communicate from any computer, anywhere in the world. It enables Macs to talk to PCs, BlackBerry Storms to iPhones, the newest computers to the oldest hardware on the planet, across any kind of network - cable, DSL, fiber, mobile, WiFi or even dial-up. There is no central authority that can step in and prevent you from talking to someone else, or that imposes rules prescribing what services should be available. Second, private investment is dramatically increasing broadband capacity and the intelligence of networks, creating the infrastructure to support ever more sophisticated applications.

As a result, however or wherever you access the Internet, the people you want to connect with can receive your message. Consumers of all stripes can decide which services they want to use and which companies they trust to provide them. In addition, if you're an entrepreneur with a big idea, you can launch your service online and instantly connect to an audience of billions. Transformative is an over-used word, especially in the tech sector. But the Internet has genuinely changed the world.

You don't need advance permission to use the network. At the same time, network providers are free to develop new applications, either on their own or in collaboration with others. This kind of "innovation without permission" has changed the way we do business forever, fueling unprecedented collaboration, creativity and opportunity. And because America has been at the forefront of most of these changes, we have disproportionately benefited in terms of economic growth and job creation. So, in conjunction with the Federal Communications Commission's national plan to bring broadband to all Americans, we understand its decision to start a debate about how best to protect and promote the openness of the Internet.

FCC Chairman Julius Genachowski has promised a thoughtful, transparent decision-making process, and we look forward to taking part in the analysis and discussion that is to follow. We believe this kind of process can work, because as the two of us have debated these issues we have found a number of basic concepts to agree on. First, it's obvious that users should continue to have the final say about their web experience, from the networks and software they use, to the hardware they plug in to the Internet and the services they access online. The Internet revolution has been people-powered from the very beginning, and it should remain so. Second, advanced and open networks are essential to the future development of the Web.

The minute that anyone, whether from government or the private sector, starts to control how people use the Internet, it is the beginning of the end of the Net as we know it. Policies that continue to provide incentives for investment and innovation are a vital part of the debate we are now beginning. Third, the FCC's existing wireline broadband principles make clear that users are in charge of all aspects of their Internet experience - from access to apps and content. So we think it makes sense for the Commission to establish that these existing principles are enforceable, and to implement them on a case-by-case basis. Fourth, we're in wild agreement that in this rapidly changing Internet ecosystem, flexibility in government policy is key.

Policymakers sometimes fall prey to the temptation to write overly detailed rules, attempting to predict every possible scenario and address every possible concern. This can have unintended consequences. Fifth, broadband network providers should have the flexibility to manage their networks to deal with issues like traffic congestion, spam, "malware" and denial-of-service attacks, as well as other threats that may emerge in the future - so long as they do it reasonably, consistent with their customers' preferences, and don't unreasonably discriminate in ways that either harm users or are anti-competitive. They should also be free to offer managed network services, such as IP television. Finally, transparency is a must.

All providers of broadband access, services and applications should provide their customers with clear information about their offerings. Chairman Genachowski has proposed adding this principle to the FCC's guidelines, and we both support this step. Doubtless, there will be disagreements along the way. While Verizon supports openness across its networks, it believes that there is no evidence of a problem today - especially for wireless - and no basis for new rules, and that regulation in the U.S. could have a detrimental effect globally. While Google supports light-touch regulation, it believes that safeguards are needed to combat the incentives for carriers to pick winners and losers online.

MS won't punish users for switching to hosted software

Microsoft's licensing of internal versions of software vs. their online counterparts won't penalize users for buying on-premises licenses and then switching to online hosted software, according to CEO Steve Ballmer. Ballmer, in a meeting with Network World at the annual SharePoint Conference, said moving from enterprise applications like SharePoint and Exchange deployed internally to versions of that software operated in the cloud by Microsoft will be "seamless." Users have been questioning whether they can move licenses online without having to take a credit and renegotiate with Microsoft on licensing terms. "Customers are saying give me some credit here, this is more like an upgrade than it is like a new buy, give us a little credit," he said. "I know it will take them time to get it straight; it is really complicated," said Guy Creese, an analyst with the Burton Group. "They claim software plus services as a mantra and if that is true they need to make it so these two environments [cloud and on-premises] are seamless [from a licensing perspective]."

Ballmer said users need to break it down by separating Internet and intranet deployments from cloud and on-premises. "Internet stuff we do is all priced basically per application or per server and it will be priced that way whether it is offered in the cloud, as a service or on-premises," he said. "I think that is pretty clean and I think that is the way that people would like to see things licensed." He said intranet applications are essentially priced by the number of users, and that is true whether the software runs in the cloud or on-premises. "So one is user-based and one is application-based." But Ballmer said Microsoft will be flexible in the way the company prices cloud versus on-premises.

For example, if a user has a client access license for SharePoint running internally but decides he wants Microsoft to run SharePoint in the cloud, the customer only pays to have Microsoft operate the SharePoint service. "You don't need to convert [the license], you can use your on-premise license and just buy the service capability; that you can do. If you want to transition you can do that too, but most of our customers say just let me use the license that I already bought and have you operate this thing for me." He said users that want to come to the cloud can buy the service and use the license they own, or they can start in the cloud and buy an integrated license that pays for both the service Microsoft operates and the license. "We designed it to be seamless; in a sense it looks more complicated now because you have two choices. We have a big enough install base of people that bought licenses that say, 'Hey, when we buy your service we don't want to be re-buying what we have already paid you for in terms of software.' We have to recognize that our customers expect a transition step where we give them credit for the software that they already own," he said.
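Ballmer's two paths - keeping an existing on-premises license and buying only the hosted service, or starting in the cloud with an integrated subscription that covers both - boil down to a simple cost structure. The sketch below uses invented per-user prices solely to illustrate that structure; none of the figures are Microsoft's.

```python
# Illustrative comparison of the two licensing paths Ballmer describes.
# All prices are invented placeholders, not Microsoft list prices.

USERS = 500

# Path 1: the customer already owns per-user client access licenses (CALs),
# treated as a sunk cost, and buys only the hosted service on top of them.
HOSTED_SERVICE_PER_USER = 5.0   # hypothetical monthly fee to have Microsoft run it

# Path 2: the customer starts in the cloud with an integrated subscription
# that bundles the service and the software license.
INTEGRATED_PER_USER = 8.0       # hypothetical monthly fee covering both

def monthly_cost(per_user: float, users: int = USERS) -> float:
    """Total monthly cost under a simple per-user pricing model."""
    return per_user * users

if __name__ == "__main__":
    print(f"Use existing CALs + hosted service: ${monthly_cost(HOSTED_SERVICE_PER_USER):,.0f}/month")
    print(f"Integrated cloud subscription:      ${monthly_cost(INTEGRATED_PER_USER):,.0f}/month")
```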