Clips March 21, 2003

ARTICLES

Web Users Clog U.S., British Military Sites
Their Mission: Intercepting Deadly Cargo
OMB weighs changes to job competition process 
Dell to Recycle Old Electronics 
War Worms Inch Across Internet  
Whatever Happened to Internet2 - And Why You Can't Touch It
Seeking Additional Security After a Big Theft
Schism hits key open-source group 
IGs: Watch those Social Security numbers 
Report: Federal IT spending has doubled in five years 


*******************************
Web Users Clog U.S., British Military Sites
Wed Mar 19, 8:48 PM ET

PALO ALTO, Calif. (Reuters) - Several government Web sites in the United States and United Kingdom have been caught off-guard by traffic spikes spurred by the looming attack on Iraq and worries about a rise in terrorist attacks. 


"Some of the major government sites are having a very difficult time ... the government hasn't necessarily experienced this before," said Eric Siegel, principal Internet Consultant at Silicon Valley's Keynote Systems Inc. (NasdaqNM:KEYN - news), a Web performance management and testing company that tracks top sites. 


"The U.S. Army home page is continuing to have severe problems," Siegel said. 


That site, at www.army.mil, is taking about 80 seconds to load, and only 7 of 10 people who attempt to connect to it succeed. The problems started on Monday and have steadily worsened, Siegel said. 
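
As a rough illustration of the kind of measurement behind those numbers, the sketch below times repeated fetches of a page and counts failures. Keynote's actual methodology is proprietary; the URL is the one named in the article, and the attempt count and timeout are arbitrary choices for this sketch.

    import time
    from urllib.request import urlopen
    from urllib.error import URLError

    URL = "http://www.army.mil"  # site named in the article
    ATTEMPTS = 10                # arbitrary sample size for this sketch

    successes, timings = 0, []
    for _ in range(ATTEMPTS):
        start = time.monotonic()
        try:
            urlopen(URL, timeout=90).read()  # 90-second cutoff, an assumption
            successes += 1
            timings.append(time.monotonic() - start)
        except (URLError, OSError):
            pass  # count a timeout or connection error as a failed attempt

    print(f"availability: {successes}/{ATTEMPTS}")
    if timings:
        print(f"mean load time: {sum(timings) / len(timings):.1f}s")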


Web users have also had a harder time getting to the U.S. Marine Corps' official Web site at www.usmc.mil. 


Half of the U.K. residents trying to connect with the Home Office's site for terrorism information and advice -- www.homeoffice.gov.uk/terrorism/ -- during peak day-time hours failed to get through. 


Many operators of news and information sites in the United States and Britain have already made preparations for traffic spikes -- having suffered similar woes during big news days or the busy holiday shopping season. 


In Europe, Internet service providers are bracing for traffic surges the likes of which they have never experienced. Expecting much of the news to break while Europeans are asleep, ISPs anticipate a spike in traffic at the start of the workday. 


French ISP Wanadoo (NAD.PA) said it had invested in extra computer servers to keep e-mail, Web-surfing, and chatting functions performing at normal levels, customary protocol for ISPs expecting audience levels to shoot up by the tens of thousands in a short span. (Additional reporting by Bernhard Warner in London) 
*******************************
New York Times
March 20, 2003
Their Mission: Intercepting Deadly Cargo
By SETH SCHIESEL

ON a sparkling Tuesday morning last week, just across the harbor from the gap in Manhattan's skyline, Kevin McCabe's team was trying to prevent another Sept. 11. 

Mr. McCabe, the chief customs inspector for the New York and New Jersey seaport, gestured across the wind-swept water. "The towers were right there," he said. "It was like you could reach out and touch them. That's what we're doing here." 

What Mr. McCabe and his team were doing at the Red Hook cargo terminal in Brooklyn was employing some of the latest high-tech tools in what has become his agency's top priority: fending off terrorism.

Just yards from the water, a boom extended from an International Harvester truck chassis. At the end of the boom was a suitcase-size box that emitted beams from a pellet of radioactive cesium 137. As 20-foot-long cargo containers inched past the box, two inspectors, like gun-toting radiologists, peered at a computer screen in the truck's cab, trying to decipher the ghostly images of the containers' interiors. 

"You see, this one is supposed to be household goods," one inspector said, scanning a computer printout representing the container's contents. "But look here."

Most of the image was a haze, but at the bottom loomed a solid cylindrical shape. "That's obviously something really dense, and it doesn't look like household goods," the inspector said. "We're going to open that one up. Look, it could be a propane tank. It could be a statue." Left unsaid was the possibility that it could be something much more dangerous.

Before the attacks of Sept. 11, 2001, the main purpose of what was then known as the Customs Service was to slow the flow of illegal drugs into the United States. Now, the renamed Bureau of Customs and Border Protection is expanding its use of advanced technologies in the service of its new No. 1 mission: stopping potential terrorist weapons.

It is a big job. According to the customs bureau, 7.2 million shipping containers entered the country in the year ended last September, in addition to 11.1 million trucks, 2.4 million railroad cars, 768,000 commercial airline flights and 128,000 private flights.

Complementing the government's array of fiber-optic cameras (for looking into containers without opening them), vapor tracers (for detecting conventional explosives) and other technologies, most of the new devices in the customs arsenal seem designed to detect components of a nuclear weapon or a radiological "dirty bomb." 

At ports, hand-held radiation detectors are used by teams like Mr. McCabe's, along with the imaging systems; at border crossings, trucks are routed through radiation-detection portals. The bureau is also working on tools to detect chemical and biological materials. 

It can be difficult to measure the effectiveness of systems meant to detect or deter the unthinkable, and while government officials are reluctant to discuss what their high-tech tools have detected, there is no public indication they have uncovered terrorist activity. And to be sure, the new systems do not cover the entire range of potential threats, including air piracy like that used on Sept. 11 or conventional explosives of domestic origin. 

Still, with security agencies at a heightened level of alert, federal officials say the new technologies are critical. "Technology is our greatest ally in preventing terrorists from getting weapons of mass destruction across our borders," said Robert C. Bonner, commissioner of the customs bureau, part of the new Homeland Security Department. "It is technology that is allowing us to facilitate the movement of goods and people while simultaneously giving us the capacity to detect weapons of mass destruction."

Some of the new anti-terrorism systems were being used before Sept. 11, mostly to look for narcotics, and have been adapted; others are newly deployed. The systems are getting a look from agencies beyond the customs bureau, like the Port Authority of New York and New Jersey, which operates the region's airports and many of its bridges and tunnels.

The challenge of securing the nation's gateways alone suggests how daunting a task it will be, even with the latest technological advances, to extend such efforts inside the country. The price tag is substantial; a unit like the one used by Mr. McCabe's crew costs more than $1 million. But the attraction is evident: all of the new technologies enable the government to screen more vehicles and cargo containers.

"These systems have given us much more flexibility and have made us more efficient," Todd A. Hoffman, the acting director for interdiction and security at the customs bureau's office of field operations, said in a telephone interview. Previously, inspecting a container shipment usually meant unloading it. "You could maybe do one or two a day because it's so labor-intensive," he said. "Now, you can easily do 8 or 10 an hour through a mobile system, and we're getting better."

The digital nerve center of the customs bureau lies far from the ports and border crossings that are the domain of field inspectors like Mr. McCabe. Just down Pennsylvania Avenue from the White House, the bureau's National Targeting Center sifts through information about all of the shipments entering the nation by sea. Under a federal rule that took effect late last year, information about cargo bound for the United States must be provided to the federal government at least 24 hours before ships leave their foreign ports. 

The National Targeting Center then uses advanced computerized risk-assessment techniques to sort the information according to more than 100 variables. Citing security concerns, federal officials refused to list those variables, but some officials said that the port of origin, the nature of the cargo and the track records of the exporter and importer were among the criteria.

"I can tell you that if you say you're importing bananas from Iceland, you're going to score higher," a federal official said.

Inspectors also scan manifests and information on the shipper and work with the National Targeting Center to identify shipments that pose the highest risk.

The containers being inspected by Mr. McCabe's crew in Brooklyn last week had been unloaded the previous day from a boat that had started its voyage in Karachi, Pakistan, and had picked up more cargo in Port Qasim, Pakistan, before stopping in the United Arab Emirates in the Persian Gulf and at least two Mediterranean ports in Europe on its way to New York. Of the 498 containers that were to be unloaded in New York, 89 held shipments that had been given a high-risk designation by the National Targeting Center or by agents in New York.

Each of the 89 containers was scanned by the truck-mounted cesium-detection system called Vacis, for vehicle and cargo inspection system. 

Vacis, made by the Science Applications International Corporation, uses the gamma rays emitted by radioactive material rather than the X-rays commonly used in medical procedures. (The newer systems use cobalt rather than cesium because cobalt can penetrate thicker steel walls than cesium does. The trade-off is that while cesium has a half-life of 30 years, cobalt's half-life is barely five years, so cobalt systems must have their radioactive material replaced more often.) 

The radiation is picked up by hundreds of advanced sensors that convert the information into a picture that a customs inspector examines to detect anomalies in density. Drivers and others exposed to the system's radiation absorb a dose comparable to what humans normally receive in an hour from natural sources, and far less than that of a medical X-ray, the company said. 
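To illustrate the idea of flagging anomalies in density, here is a minimal sketch that finds unusually opaque regions in a toy attenuation map. The array shape and the cutoff are assumptions for the sketch, not Vacis parameters.

    import numpy as np

    # Toy attenuation map standing in for a gamma-ray image (rows x columns).
    rng = np.random.default_rng(0)
    image = rng.uniform(0.0, 0.3, size=(64, 128))  # mostly low-density cargo
    image[40:60, 90:100] = 0.95                    # a dense cylindrical object

    DENSE = 0.8  # invented cutoff: attenuation worth a physical inspection

    rows, cols = np.where(image > DENSE)
    if rows.size:
        print(f"dense region at rows {rows.min()}-{rows.max()}, "
              f"cols {cols.min()}-{cols.max()}: open the container")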

The customs bureau deployed Vacis in the late 1990's to help detect narcotics shipments. Since the Sept. 11 attacks, however, the bureau has stepped up its purchasing, placed the devices all around the country and used them primarily to detect potential terrorist weapons. "Instead of just focusing on the southwestern border, now we've got these out to other locations to address other threats," Mr. Hoffman said.

The 112 systems now used by the customs bureau are mounted on rail cars, on tracks, in stationary car-wash-type enclosures and on trucks. The truck-mounted systems cost about $1.3 million each.

In addition to installing the system in the United States, the government is beginning to deploy it overseas to scan cargo containers before they leave for American ports. 

The gamma-ray imaging is only part of the effort. In the late 1990's the bureau also began equipping its field agents with radiation pagers, hand-held devices that are sensitive enough to detect an office colleague who recently had radiation therapy. 

Before Sept. 11, the bureau had perhaps 3,500 such pagers, each costing about $1,400. Now it has 6,000, and it is preparing to buy as many as 15,000 more, Mr. Hoffman said. Inspectors wear the pagers as they work but also wield them to scan the outside of high-risk containers.

If radiation is detected, inspectors use a hand-held isotope identifier to gauge what sort of radiation it is. 

The isotope identifiers can be connected to a computer to send the information to Lawrence Livermore National Laboratory in California for analysis, Mr. Hoffman said.

Of the containers scanned by the Vacis system, about 5 percent are opened and physically inspected by agents, Mr. McCabe said. Even when the inspectors open a container, however, they rarely find contraband. That ominous cylindrical shape in Red Hook, for instance, turned out to be an industrial stove much like a pizza oven.

The radiation pagers generally give an alert far less often. One inspector said that in the two years he has carried a pager, it has shown abnormal levels of radiation only about 10 times, and never as a result of potential weapons components.

Each of those systems was in use before Sept. 11, though not nearly as widely as today. Then, last October, a new tool was added to the effort: radiation-detection portals.

Used mostly for truck traffic, the portals are like metal detectors for radiation. They tell the operator if a vehicle is emitting gamma radiation, which can be associated with legitimate materials like cat-box litter, or neutron radiation, which is more often associated with what are known euphemistically as "special nuclear materials."

Bob Thompson, the portal project manager at Pacific Northwest National Laboratory, which is advising the customs bureau on the project, said that such portal systems had long been used at government installations to help prevent the theft of sensitive materials, and in the scrap metal industry to detect radioactive materials before smelters or other equipment could be contaminated.

Jayson Ahern, the bureau's assistant commissioner for field operations, said that portals were already screening all trucks crossing the border at Buffalo and Detroit, and at least some passenger vehicles in Detroit. By next month, the bureau intends to institute 100 percent screening for trucks at the border crossings at Champlain, N.Y., Port Huron, Mich., and Blaine, Wash. 

"We want to get the commercial centers on the northern border because these are the locations where there might be something hidden on a truck that might be a radiological dispersion device," Mr. Ahern said. 

Mr. Hoffman said that trucks and other vehicles generally roll through the radiation portals at 4 to 7 miles per hour but do not generally add to delays at the border. 

Roughly one vehicle in 1,000 sets off an alarm, prompting a scan with the isotope identifier, he said. 

The radiation portals, which cost about $80,000 each, have not yet been deployed widely to screen shipping traffic, but the customs bureau has entered into a joint venture with the Port Authority to begin installing the systems to screen seaborne containers coming into the New York area.

"The Vacis equipment and radiation portal devices are both very effective, and we support Customs' deployment of them," said Allen Morrison, a spokesman for the Port Authority. "We are also in the process of evaluating this equipment and other devices for possible use at nonport facilities." A transportation official in the area said that less intrusive devices like hand-held radiation detectors were already in use by police officers at bridges and tunnels.

While drug smuggling is no longer the customs bureau's top priority, customs officials said that Vacis continued to detect such shipments. The systems are also used to monitor outbound shipments. A multiagency task force in the Miami area, for instance, uses the system to look for illegal shipments of stolen cars.

Mr. Ahern of the customs bureau said that one of his department's top goals for the future was to push for cargo containers that could electronically detect when they have been opened, and perhaps even by whom. 

Because there are millions of cargo containers in the world, however, and the cost of such an enhancement would largely be borne by private industry, the day of the "smart container" appears to be years away, customs officials said.

For now, the array of high-tech devices used by the customs service continues to be backstopped by a low-tech alternative: the dog. 

While dogs cannot detect radiation directly, the customs bureau is training dogs to sniff out chemicals that could be used in terrorist weapons. 

Systems like Vacis operate by allowing inspectors to detect anomalies within a cargo container, "but what if the container is already full of scrap metal or trash and car parts?" Mr. Hoffman said. "Think about what that would look like in an image. In that case, a canine could help us."

"Besides," he added, "they're cheap." 
*******************************
Government Executive
March 19, 2003 
OMB weighs changes to job competition process 
By Jason Peckenpaugh
jpeckenpaugh@xxxxxxxxxxx 

The Bush administration is contemplating wide-ranging changes to rules that allow federal jobs to be directly outsourced to private companies, Office of Federal Procurement Policy Administrator Angela Styles told a Senate panel on Wednesday.


Amid concerns that agencies may be misusing the rules, which govern streamlined job competitions and direct conversions, Styles said the Office of Management and Budget is considering changing how the government competes the jobs of small groups of federal employees. 


"Our concern has been that over the past two years, agencies have made decisions to directly convert that have not been in the best interest of the taxpayer," Styles told the Readiness and Management Support Subcommittee of the Senate Armed Services Committee. "We do not want that to continue."


Under current rules, agencies may directly outsource work involving 10 or fewer federal employees to the private sector. Agencies may also hold streamlined competitions on functions performed by 65 or fewer federal employees. In streamlined competitions, in-house employees do not form a "most effective organization" as they do in larger job competitions. The competition simply pits the current unit, which has not restructured itself to be more competitive, against private firms. Some observers believe this puts federal employees at a disadvantage.


OMB is also acting to counter a perception that its rewrite of OMB Circular A-76, the rulebook of federal outsourcing, is intended to encourage direct conversions. In response to questions from Sen. Carl Levin, D-Mich., Styles said direct conversions would not be mentioned in the new circular itself, a change from the draft circular issued by OMB in November. "We are even in the process of discussing ... eliminating direct conversions altogether," she told Levin. 


After the hearing, Styles told reporters that OMB is studying whether to eliminate direct conversions "as they exist right now." She said OMB is looking at a variety of alternatives, including a process designed by the Interior Department that allows agencies to hold competitions on work involving 10 or fewer employees. Under the Interior process, federal employees can keep their jobs if they can perform the work at a lower cost than private firms. OMB already has allowed the Treasury, Agriculture, and Health and Human Services departments to use Interior's method for credit toward their competitive sourcing goals.


Revamping the direct conversion process is just one issue that OMB is wrestling with as it finalizes the new A-76 process, Styles said after the hearing. "We still have direct conversions, we still have appeal issues that we are considering," she said, adding that any changes to the direct conversion process would have to be approved by OMB Director Mitch Daniels. 


When asked if the new circular would be released by late April, she said, "We're going to try."


During the hearing, Styles said the new circular would contain "very aggressive deadlines" for finishing public-private competitions, although she did not endorse the 12-month time limit contained in the draft circular. At Wednesday's hearing, Comptroller General David Walker repeated his view that most competitions cannot be finished within 12 months.


Styles also said OMB would work to provide agencies with training and resources when they start using the new A-76 process. 
*******************************
CNET News.com
Hacker says he leaked info on Unix flaw 
By Robert Lemos 
Staff Writer, CNET News.com
March 19, 2003, 4:20 PM PT

A self-proclaimed hacker claims to have stolen three unreleased security advisories from a corporate computer and posted them to a public mailing list.

The online vandal, who uses the moniker "Hack4Life," said Wednesday that he stole advisories detailing flaws in a common set of Unix code, the Kerberos authentication system and some implementations of encryption for Web sites. He claims to have stolen them from a firm that had been working with the Computer Emergency Response Team (CERT) Coordination Center, a clearinghouse for security information. 

"I am not in any way connected with CERT or any of the vendors involved," he wrote in an e-mail to CNET News.com. He added that he wouldn't give further details of the break-in and that he primarily stole the information for amusement and to show off. 

 

The outing of the advisories this weekend caused some consternation in the security world, because the companies involved didn't have time to create patches for the problems before the information became publicly known. When a security problem is found in their products, software makers prefer to release the information after a patch is available. 

One advisory outlines a problem with a library originally created by Sun Microsystems that is included in many Unix- and Linux-based operating systems. A second advisory highlights an issue in the Kerberos authentication system that could allow an attacker to impersonate other users. The third advisory discusses a specialized attack that could target servers using Secure Sockets Layer and break the software's encryption. 

The CERT Coordination Center had been prepping the advisories for publication. In an interview earlier this week, the organization identified 50 different companies that had access to all three advisories, and Sean Hernan, team leader for vulnerability handling at the CERT Coordination Center, believed one of the firms or one of the firms' employees may have leaked the information. 

"Ultimately, if someone chooses to take some information and post it anonymously to some mailing list, there is not a lot we can do about that," Hernan said, stressing that the incident wouldn't change how the group operated. "I think it is an unfortunate event, but I don't think it changes the plan to share information with vendors." 

Hernan had suggested that the information could have been stolen by a hacker. Hack4Life's statement Wednesday apparently confirms that. 

This is the third episode of early disclosure--or a lack of proper disclosure--of a vulnerability in the past two weeks. 

Last Friday, the Samba Team rushed an advisory out to the open-source community after learning that an online vandal may have reverse-engineered a patch under development to identify the vulnerability that the patch was intended to fix. The incident came to light after a server apparently had been compromised by exploiting the vulnerable Samba program, a widely used application for hosting Windows files on a Linux or Unix computer.

On Monday, Microsoft announced that a customer had been compromised the week before by an attacker using a previously unknown vulnerability. The U.S. Army acknowledged that a publicly accessible military server had its security breached by an online vandal using the flaw in Microsoft's Web server software. 

The three latest advisories were posted to the Full Disclosure security mailing list. The list, which is only lightly moderated, had been asked to pull down the documents earlier this week but refused, arguing that it would be unethical to do so given that the issues were already public. 

"I have a philosophy about security problems that everyone should be informed at all costs," said Len Rose, moderator of the list. "If we end up with a group of people...that rely on the Internet but they are the last people to be informed, then that leads to bad security." 

Rose understands the mind-set of the underground hacker community. The computer consultant, who had used the moniker "Terminus," pleaded guilty in 1991 to a charge that he sent proprietary AT&T Unix source code to other hackers. 

Now saddled with the task of preventing similar breaches, Rose stresses that knowledge is power. 

"I feel that full disclosure is always the best policy," he said. "I believe it was the best way, because it gives those of us who are responsible for the security of companies the information we need to implement defenses."
*******************************
BBC Online
Broadband access leaps ahead
Permanent internet connections in the UK have grown by over 200% in the last year, according to a government report. 
Permanent connections include any service that is always on, both home broadband services and fast-net access in the workplace. 

Such services now account for nearly 11% of the internet market in the UK, having leapt by 255.7% over the 12 months of 2002, according to the report from the Office for National Statistics. 

The massive increase illustrated the power of broadband publicity campaigns and falling prices, the report said. 

UK overtaking France 

According to analyst firm Datamonitor, the leap in broadband numbers is set to carry on for the foreseeable future as the technology wavers on the brink of becoming mass-market. 

Datamonitor predicts that over 41 million European households will be accessing the internet via high-speed connections by 2006. 

By that time, it predicts the UK will have overtaken France to become Europe's second biggest broadband-connected country behind Germany. 

The problem of getting broadband out to remote towns and villages is also being addressed in the UK. 

Boost for rural broadband 

BT is set to launch its so-called Midband service - which offers almost-broadband speeds to remote areas - in the summer. 

It has also added 200 telephone exchanges across the country to its list for a broadband upgrade. 

A rash of grass-roots schemes to boost broadband access in rural counties has also helped consumers make the switch to fast net services. 

The latest is a £2.5m grant from One NorthEast, the regional development agency for the area, to improve broadband links across County Durham. 

And AOL, in conjunction with the Citizens Online charity, is offering a series of Innovation in the Community Awards to local groups who are doing the most to bridge the digital divide across the UK. 

On offer is £2,000 each and a year's free broadband from AOL.
*******************************
Australian IT
Broadband growth slows
Kate Mackenzie
MARCH 20, 2003  
 
THE competition watchdog has set its sights on ADSL transfer delays as its latest report shows the rate of ADSL take-up growth has slowed dramatically.

Although overall connections have risen to 363,500, the latest report on broadband connections from the Australian Competition and Consumer Commission (ACCC) found the rate of growth in ADSL connections fell sharply from 51.4 per cent in the three months to June 2002. 
The following quarter saw growth of 24.1 per cent, while growth in the latest period, the December 2002 quarter, was just 16 per cent. 

While ADSL has gained ground much faster than cable internet in the past year, it still lags with 139,900 lines, compared with 173,200 cable connections. However, the combined number of ADSL and other forms of DSL, such as symmetrical SHDSL and VDSL, slightly exceeds cable, with 177,900 connections. 

Growth in other DSL connections also slowed, from 179.5 per cent in the June 2002 quarter to 71.2 per cent in December. 

Although the growth figures are still impressive (growth in all broadband connections was 16.4 per cent), the boom of early 2002 has slowed. 


Telstra's ownership of the "last mile" network, essential for ADSL connections to the home, makes it central to the broadband competition debate. 

The 2002 growth was largely due to an explosion in the number of DSL providers from late 2001, after the ACCC forced Telstra to reduce its wholesale ADSL pricing. 

ACCC chairman Professor Alan Fels said the commission would continue to promote competition in the broadband industry. 

"For example, Telstra is currently implementing new ADSL transfer processes following an investigation by the ACCC," he said. 

As ADSL becomes increasingly common, the issue of transferring from one ADSL provider to another has become more contentious, and the ACCC said it had received several complaints from Telstra's wholesale customers, who are themselves ISPs, and Telstra BigPond retail customers. The complaints related to the length of time a customer must spend without ADSL when they transfer between providers. 

The period of disconnection usually lasts at least several days, while the ACCC maintains it should "be a matter of minutes". 

Telstra has long insisted it is not solely responsible for speeding up the transfer process, and has been involved in trials with two of its wholesale customers to introduce faster transfers. 

"The ACCC is encouraged by Telstra's recent initiatives aimed at addressing this problem and will be closely monitoring the implementation of the new ADSL transfer processes to ensure that it delivers a high quality service to Telstra's retail customers", Professor Fels said.
*******************************
Washington Post
Dell to Recycle Old Electronics 
Service to Cost $15 Per Item 
By Mike Musgrove
Thursday, March 20, 2003; Page E03 

Dell Computer Corp. announced plans yesterday to offer, for a small fee, a new service to pick up old computers at consumers' doorsteps to better ensure that unused and obsolete equipment gets properly recycled.

The service, which begins next week, is intended to help address environmental concerns over the toxins that can emanate from improperly discarded computers and other electronics.

"If we can make it easier and more convenient, more people will recycle," John Hamlin, Dell's senior vice president of U.S. consumer business, said in a conference call with reporters yesterday.

Starting Tuesday, consumers will be able to go to Dell's Web site and pay $15 per item for Airborne Express to whisk away old printers, desktops, laptops or monitors (consumers will still have to box up the old equipment themselves).

Donated PCs that are still usable will go to the National Cristina Foundation, a nonprofit organization that distributes computer equipment to schools and organizations for the disabled.

"Most computers that are too old for consumers have years of life left in them for use by organizations serving those in need who can't afford new technology," said Yvette Marrin, president of the foundation.

This is not Dell's first effort to get consumers and business customers to recycle their old computers, but it is a somewhat easier option than what the company offered previously. Before this program, consumers had to bring used equipment to shippers and pay $20 to $50 for delivery.

Consumers recycled 1,000 systems through Dell last year under a previous recycling program. The company has disposed of 2 million business systems since it began a recycling program in 1991. 

Dell isn't the only computer company trying to get consumers to recycle. Hewlett-Packard Co. charges consumers $13 to $34, depending on the item, and also passes on the used equipment to nonprofits or breaks it down to key commodities, depending on its condition.

Gateway Inc., meanwhile, started a program last year in which it offers consumers a rebate when they purchase PCs or other products and trade in a PC or a related piece of equipment.
*******************************
Washington Post
Texan to Lead House Cybersecurity Panel 


By Brian Krebs
washingtonpost.com Staff Writer
Thursday, March 20, 2003; 6:02 PM 


Congressional leaders have picked Rep. Mac Thornberry (R-Texas) to lead a new congressional subcommittee on cybersecurity, a House spokeswoman said today.

Thornberry will head the subcommittee on Cybersecurity, Science, Research and Development. The panel is part of the House Select Committee on Homeland Security, which was created last month to oversee the new Department of Homeland Security.

The cybersecurity subcommittee's top Democrat will be Zoe Lofgren (D-Calif.), a staunch advocate for the high-tech industry who represents San Jose and other parts of Silicon Valley. As a member of the House Judiciary Committee, Lofgren has taken an interest in online intellectual property issues. She also is a member of the New Democrat Coalition, a group of technology-oriented lawmakers.

The subcommittee will examine key areas of computer security policy, including the protection of government and private information networks from domestic and foreign attacks. It also will tackle the security of the telecommunications and electric infrastructures, including associated scientific research and development projects.

Cooperation between the government and the private sector on cybersecurity will be the subcommittee's priority, Thornberry said in an interview.

"A big part of what we'll need to do is get our arms around where the government is in terms of cybersecurity," he said. "We'll also need to look at the [Homeland Security Act of 2002] to see if we've set it up for maximum government and private sector cooperation."

Thornberry is a relative newcomer to the technology security field. In April 2001, he introduced a homeland security department proposal that included a centralized cybersecurity office, but has not made a name for himself on Capitol Hill in other technology areas.

Mario Correa, director of Internet and network security policy for the Business Software Alliance, said Thornberry probably was selected because many of the Homeland Security Committee's members already have many other assignments.

The 50-member committee includes some of the most powerful lawmakers in the House, including nearly all of the heads of the major committees. It was founded after numerous House committees claimed jurisdiction over homeland security issues.

Much of the panel's work will focus on the Homeland Security Department's Information Analysis and Infrastructure Protection division (IAIP), which is in charge of cybersecurity and intelligence gathering and analysis.

Committee Chairman Christopher Cox (R-Calif.) said he wants to know whether the administration has given enough authority to the IAIP to gather the intelligence it needs to do its job.

The White House recently decided to place responsibility for intelligence analysis and collection within a new division of the Central Intelligence Agency. That decision has caused confusion about the extent of the IAIP's mission and authority and is handicapping the White House search for people to lead the division, according to sources familiar with the process.

Another panel that will work closely with the IAIP division -- the Subcommittee on Intelligence and Counterterrorism -- will be led by Rep. Jim Gibbons (R-Nev.), who chairs the House Intelligence subcommittee on Human Intelligence, Analysis and Counterintelligence. Gibbons is also vice chairman of the Terrorism and Homeland Security subcommittee, and a member of the Armed Services Committee.

Rep. James Turner (D-Texas), the lead Democrat on the Homeland Security Committee, said he expects the panel to examine possible privacy violations raised by the department's programs and activities.

"We're going to need to strike a balance between the methods we employ to protect our security and our privacy," Turner said. "Anything the department will want to do to try to enhance security will almost inevitably raise these kinds of issues."

Former CIA Deputy Director John Gannon will serve as the committee's staff director, according to a statement from Cox. Gannon also is the former chairman of the National Intelligence Council. The Democratic staff director will be Steven Cash, a former attorney and terrorism expert for the Senate Select Committee on Intelligence. He also is a seven-year veteran of the CIA.
*******************************
News.com
Israel warns Web sites on war coverage 


By Declan McCullagh 
Staff Writer, CNET News.com
March 20, 2003, 12:45 PM PT


WASHINGTON--Israel's top government censor has warned Web sites in her country not to publish sensitive information about the war with Iraq.

Chief Censor Rachel Dolev sent a letter on Wednesday to "scoop" news sites, instructing editors to seek government permission before publishing information about "materials that could pose a threat to the security of the State of Israel and its residents." 

Dolev's letter warned the sites, including Rotter.net and Fresh.co.il, not to publish the locations of any missile strikes, information about Israeli Cabinet deliberations or information about Israeli wartime cooperation with other governments such as that of the United States. 

Dolev said editors must contact official censors in Tel Aviv and Jerusalem before posting information online. "In addition, censors will be working 24 hours a day in the two media centers--in the Foreign Ministry in Jerusalem, and in the David Intercontinental in Tel Aviv--and you may also turn to them," the letter said. 

Boaz Guttman, an Israeli attorney, said that the Web sites targeted are widely read. 

"Israelis around the globe are connected to this forum," Guttman said. "In such a crazy country, the news appears quicker (in such forums) than in the official media. So you can get videos, pictures, a long time before they will be published on regular TV or radio." 

Guttman said this was the first time that the government has warned Web sites. "The (Israel Defense Forces) censor can forget about cooperation," Guttman said. "Everybody knows the law well. If someone needs to publish something, he also knows how to do it without a trace." 

Israel has ranked poorly in a press-freedoms index created by advocacy group Reporters Without Borders, ranking below the Palestinian National Authority, Zambia and Cambodia. Last year, the Committee to Protect Journalists condemned what it called "attempts by the (Israel Defense Forces) to intimidate the press from covering the IDF's widening military campaign in the territories." 
*******************************
Wired News
War Worms Inch Across Internet  
02:00 AM Mar. 21, 2003 PT

The U.S. military action in Iraq has stirred up computer virus writers and malicious hackers, who have apparently decided to vent by defacing websites and releasing e-mail worms that prey on people's fears and curiosity. 

Antagonists and activists based in the United States, Europe and the Middle East are engaged in their own form of war games. Some are vandalizing websites, particularly government sites, scrawling scornful cybergraffiti or urging people to "make love not war."

And at least three e-mail viruses that their authors claim were released in response to the war have started making rounds on the Net. 

Website defacements don't normally cause problems for anyone but those in charge of the altered sites. And for most people the war-related e-mail viruses, which don't yet seem to be in wide circulation, are nothing more than an annoyance. 

But relatives of military personnel involved in the war in Iraq said malicious code lurking in their inboxes is the last thing they need to cope with right now. 

"My husband had been e-mailing me every day, (but) the messages stopped on Tuesday," said Marilyn Montero, a military wife who lives in Southern California. "Very late yesterday I got an e-mail that said it had pictures attached of what happened in Iraq last night. I'm stupid with worry now and I opened it. 

"It screwed up my computer, and now I can't get e-mail or check news sites until my brother comes over to fix it. E-mail from my husband was the one thing that was keeping me going. I hope the idiot who made this virus has a really horrible day." 

Regina Scalone from New York, whose cousin is stationed on an aircraft carrier in the Persian Gulf, also got whacked with a war worm. 

"I'm not an idiot, but everything I know about computers went out the window when the war started," Scalone said. "I'm so frantic to find out what's happening over there. So I opened an attachment that was labeled 'Go USA,' and my computer locked up. 

"I've been told tricks like this, playing off people's worries, are called social engineering by hackers," Scalone added. "Well, this seems pretty goddamn antisocial to me." 

Social engineering refers to the use of psychological tricks to encourage people to do things they might not normally do. Virus writers often include messages tied to current concerns like war, or eternal human urges like lust, to get people to open infected e-mail attachments. 

So far, the war-related worm in widest circulation is Ganda, the one that both Scalone and Montero opened. Ganda is currently rated a low threat by most security firms. 

Ganda arrives with one of several different subjects and messages, all with references to the current military action and political situation. 

One variation claims the attachment contains pictures of Iraq taken by U.S. spy satellites; another purports to contain a pro-America screensaver and urges people to display it to show support in the war against terrorism. Others come with messages claiming the attachment is either an anti-George W. Bush or pro-peace screensaver. 

Once the attachments are opened on PCs running Windows, Ganda behaves like many other e-mail worms, e-mailing itself to all the addresses in the infected machine's Outlook contact list. It also scans the machine for security software -- such as McAfee, Norton or Sophos antivirus products -- and shuts them down.

Ganda's code claims it was written by "Uncle Roger" from Härnösand, Sweden. Oddly, the virus' code also contains this message: "I am being discriminated by the Swedish school system. This is a response to eight long years of discrimination." 

"We don't know what Uncle Roger's problem is with the school system in Sweden," said Graham Cluley, senior technology consultant for Sophos. "But clearly the author of this virus is exploiting an interest in current affairs by deliberately presenting his virus in this way. At a time of international crisis, it is understandable that computer users will be interested in finding out the latest news from the Middle East.

"Whatever his problem is, a worm is not an appropriate way to complain about it."

Worms aren't the only war-related Internet irritation. Network security firm F-Secure reported that as of Thursday morning, several hundred U.S. and Middle East websites had been defaced with pro- and anti-war messages. 

Some of the messages are in English, others are in Arabic, said Mikko Hypponen from F-Secure. 

Hypponen said he expects to see more war-related defacements and worms in the next week. He also advised computer users to treat all e-mails containing attachments purporting to contain war-related information with extra skepticism. 

As always, security experts said the best way Internet users can protect themselves is to update antivirus programs regularly, and to stay current with security patches for all software. 

But some who have attempted to keep their programs patched ran into problems earlier this week after applying a critical patch for Windows 2000. The patch made it impossible to reboot their computers. 

Microsoft later announced the patch was incompatible with 12 previous software fixes for Windows 2000 issued between December 2001 and February 2002. Users running any of those fixes won't be able to reboot their Windows 2000 systems after applying the new patch, according to Microsoft. 

"We get slammed when we don't quickly apply the patches, and slammed when we do," said Jeff Kinsel, a systems administrator for a Manhattan publishing firm. "Some days I just hate my job." 
*******************************
03/20/03 

Software bugs, schedule delays could slow production of Raptor 

By Dawn S. Onley 
GCN Staff

The General Accounting Office is recommending that the Air Force slow production of F/A-22 Raptor fighter planes, due in part to persistent software problems that require computers in the cockpit to be rebooted regularly. 

The GAO also found fault with the aircraft's performance and development schedule, which now includes a potential $1.3 billion in cost overruns in the engineering, manufacturing and development phase of the program, according to a GAO report. 

"The uncertainties regarding performance capabilities of the F/A-22 aircraft and its development schedule will persist until technical problems have been addressed," the GAO said. "In light of those uncertainties, steadily increasing annual production rates could result in the Air Force having to modify a larger quantity of aircraft after they are built." 

The major problems in the stealth plane program include software instability, overheating in portions of the plane and excessive movement in the aircraft's twin vertical tails. So far, 12 planes built by Lockheed Martin Corp. and Boeing Co. are being used by several commands for training, officials said. 

Air Force officials said they didn't know when the software glitches would be fixed and that they "do not yet understand the problems associated with the instability of the avionics software well enough" to give a timeline for when the problems can be resolved, according to Friday's GAO report. 

Overall, Air Force officials and representatives at Lockheed Martin disagreed with the GAO's findings. 

During a news conference in January, Ronald Sega, director of Defense Department research and engineering, said the software glitches appear to be caused by the custom integration of several applications, including radar and electronic warfare programs.
*******************************
Associated Press
Whatever Happened to Internet2 - And Why You Can't Touch It
Thu Mar 20, 1:43 PM ET

Vincent Ryan, www.NewsFactor.com 

Internet2, the next-generation network that supposedly leaves the mainstream Internet in the dust, is still an ivory-tower project cloistered in universities and research labs. If you are a graduate student at a U.S. university with a major computing center, you may get your hands on it, but if you are sitting at home waiting to reap direct benefits from this mammoth project, you face a long wait. 


Why is it, then, that everyone who talks about Internet2 says it is crucial to development of next-generation technologies that will benefit all users? 


In the Beginning 


Internet2 germinated as the commercial Internet emerged into the mainstream in 1994 and 1995. Discussions began among researchers in academia as they realized that the goals of this commercial Internet were much different from the goals of the academic, scientific and government communities that had birthed the Internet's predecessor, NSF.net. "Experimentation wasn't possible on the commercial Internet," Greg Wood, a spokesperson for Internet2, told NewsFactor. 


The Internet2 consortium began as an effort among 34 universities but has grown to include 202 universities and numerous corporate research labs. Internet2-connected universities have committed more than US$80 million per year in new investments on campuses, and corporate members have committed upward of $30 million over the life of the project. Internet2 institutions also receive funding via grants from the National Science Foundation and other federal agencies. 


Experimental Internet 


The promise of Internet2 is that its leading-edge networking techniques will prove to be valuable and will be built into new commercial products and services, Wood said. 


"It's a testbed for next-generation applications that won't operate on the commodity Internet," said Greg Moore, a spokesperson at Indiana University. Ideally, data and information gathered from tests will be used to construct the hardware, software and services for the next mainstream Internet, he added. 


Some applications supposedly enabled by Internet2 include uncompressed high-definition television (HDTV)-quality video, digital libraries, virtual laboratories, distance-independent learning, scalable multicasting and tele-immersion. 


Fast Ride 


There is no question about the value of Internet2, according to Wood. NSF.net proved beyond a doubt that in a world where companies spit out products on a three- to six-month time horizon, it is vital to have an environment that can take a long-term view of technology development, he said. The current Internet was essentially a 20-year R&D program begun in the 1970s. "There's a need to have a place where the long horizon can foster new technologies to be used in high-performance networks," he noted. 


Internet2 is an ideal testbed because it comprises a collection of high-performance networks that span 300 universities, corporate research labs, state governments and commercial networks. Abilene, the largest research and education network in the United States, is one of the backbone networks of Internet2. I-Light, a regional network connecting Indiana University and Purdue University, is also part of the whole. The slowest link in the Internet2 networks is 100 Mbps (megabits per second), Wood said. 


As such, Internet2 suits "research applications that use enormous amounts of bandwidth and that won't work over a regular Internet connection," Moore said. 


This year, Internet2's Abilene backbone is being upgraded to 10 Gbps (gigabits per second), quadruple its previous capacity. Qwest Communications (NYSE: Q) is providing the 10 Gbps wavelengths, and Juniper Networks (Nasdaq: JNPR) is providing T640 routers for the project. "There should always be enough headroom in the network," Wood said. 


Performance a Given 


Such high performance is crucial for Internet2 because its major selling point is a lack of network constraints -- freedom from the performance bottlenecks that can cripple applications on the commercial Internet. "In the Internet2 environment, the network is no longer in the way," Wood said. "There are issues about optimizing end-to-end performance, but it's almost never the case that the backbone network is the problem." 


For researchers and technologists, that means they can focus on aspects of technology besides performance, Wood said. For example, scientists developing videoconferencing applications can take for granted that the network will support an MPEG-2 video stream. Therefore, they can concentrate on scalability issues, such as how to enable videoconferencing with 10 people simultaneously. 

Trickling Downstream 

But how much time will elapse before Internet2 begins to benefit mainstream Internet users? The answer may be surprising: Technologies tested on this new network already are making their way into the commercial world, according to Wood. "Cisco Systems, Microsoft and IBM are all participating in Internet2, trying out new technologies," he explained. 

In 1998, for example, multicasting, in which one sender transmits data to several different endpoints at once, was first widely deployed by universities in the Internet2 working environment. Cisco's current implementation of multicasting was heavily influenced by that large-scale test, Wood said. 
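
For readers unfamiliar with the mechanism, the sketch below shows one-to-many delivery with standard IP multicast sockets in Python; the group address and port are arbitrary test values, not anything from the Internet2 deployment.

    import socket
    import struct

    GROUP, PORT = "239.1.2.3", 5007  # arbitrary administratively scoped group

    # Receiver: join the group, then read datagrams like any UDP socket.
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    rx.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    rx.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    rx.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    # Sender: a single datagram addressed to the group reaches every
    # receiver that has joined -- one transmission, many endpoints.
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # local net only
    tx.sendto(b"hello, group", (GROUP, PORT))

    print(rx.recv(1024))  # b'hello, group'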

In the same vein, an application that eventually will make it to the mainstream -- TV-quality videoconferencing, in particular 1.5 Gbps uncompressed HDTV -- is becoming ever more common among the Internet2 community. "The hard problem is figuring out how to make videoconferencing as easy as e-mail is today," Wood said. "The Internet2 community is working on scaling issues right now." 

Toward Shibboleth 

Another application currently in testing is expanded network performance monitoring. Computer scientists are gathering data from network operations to better understand how the network itself works, Wood said. 

There are also proof-of-concept projects, such as Shibboleth, a Web authorization architecture and software system. This standards-based, federated authorization system enables intercampus sharing of Web resources subject to access controls, Wood said. On test campuses, off-campus students with broadband connections can access licensed digital-library content transparently. In addition, scientists and faculty can share research Web sites securely with remote colleagues. 

Ideally, the Shibboleth project will eventually help any organization or company that wants to simplify the administrative burden of granting Web access control to a limited set of users in a way that maintains privacy and security. 

Slowly But Surely 

Internet2 is also a testbed for IPv6, the next-generation Internet protocol that has been slow to infiltrate the commercial Internet. "You're seeing industry segments adopting IPv6, but you're going to see it evolve into the infrastructure over the next five to seven years," Jerald Murphy, a senior vice president at Meta Group, told NewsFactor. "There won't be a dramatic wholesale revamping of company or service provider infrastructures." 

In fact, that seems to be the story with most of the technologies being experimented with in Internet2. 

The direct benefit to the commercial Internet is that in doing research on the next-generation network, vendors may uncover technology that ultimately filters down into commercially produced networking equipment, Murphy said. "But [Internet2] is a crucible for developing next-generation networking technologies, more than a blueprint of the technology that's going to be used in tomorrow's Internet," he explained. 

So, while some Internet2 technologies may whet the appetites of mainstream users, the technology they ultimately use could be vastly different. Even so, the Internet2 environment remains vital for spurring innovation -- and without innovation, we would still be using pen and paper.
*******************************
Chronicle of Higher Education
Seeking Additional Security After a Big Theft, JSTOR Tests Internet2's Shibboleth
By FLORENCE OLSEN

Last fall, someone taking advantage of a common method for gaining access to online databases attempted to download the vast collection of scholarly journals known as JSTOR. Now the owners of JSTOR are experimenting with new software developed by Internet2 researchers to prevent such incidents by improving online authentication. 

The online thief or thieves had exploited a weakness in the method that JSTOR and other publishers use to control who can gain access to their subscription databases. The method, known as IP authentication, recognizes a block of IP network addresses as belonging to a particular institution that has paid for access to the database. 

If a request to download an article comes from an IP address that the database recognizes, the database automatically responds to the request, even if the request is coming from an online intruder who is illegally using a college's computer network to gain access to the database. 
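
A minimal sketch of that IP-range check, using invented institution names and documentation-reserved address blocks, shows why the method is both convenient and weak: it trusts the network location of a request, not the person behind it.

    from ipaddress import ip_address, ip_network

    # Hypothetical subscriber table: institution -> licensed address blocks.
    SUBSCRIBERS = {
        "Example University": [ip_network("192.0.2.0/24"),
                               ip_network("198.51.100.0/25")],
    }

    def institution_for(source: str):
        """Map a request's source IP to a subscribing institution, if any."""
        addr = ip_address(source)
        for name, blocks in SUBSCRIBERS.items():
            if any(addr in block for block in blocks):
                return name
        return None

    # Any request from inside a licensed block is served automatically --
    # including one relayed through a compromised campus machine.
    print(institution_for("192.0.2.41"))   # Example University
    print(institution_for("203.0.113.9"))  # None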

JSTOR officials think that's what happened last fall, when someone downloaded about 50,000 articles from the journal database before being detected and prevented from downloading the entire collection (The Chronicle, January 10). JSTOR is a nonprofit group that licenses digital copies of scholarly journals. 

College-network managers and publishers have known for a long time that IP authentication has inherent weaknesses because computer networks themselves are insecure. But IP authentication is easy to set up and therefore is widely used throughout the publishing industry. 

Colleges also rely on user ID's and passwords to authenticate identities on the Internet, but keeping track of those is cumbersome and costly both for colleges and for publishers of databases and other Internet resources that colleges license. When students and faculty members are expected to use many different user ID's and passwords to gain access to off-campus resources, they frequently forget which ID's and passwords to use where. 

But researchers are developing more-sophisticated methods for verifying a person's identity online, and one of those methods is the result of work done by some members of the Internet2 consortium, a group of colleges that develops advanced Internet applications. The Internet2 researchers call their method Shibboleth, after an ancient Hebrew word that members of one tribe couldn't pronounce the same way members of other tribes did, and that came to be used as a way of distinguishing members of that tribe. 

The researchers, who have spent two years working on Shibboleth, say the software is now ready for colleges to use. The physics department at Pennsylvania State University's University Park campus, an early adopter, has set up Shibboleth to authenticate its students before they can use an external grading service that North Carolina State University provides over the Internet. 

Besides being more secure than IP authentication, Shibboleth goes a step further, its creators say. It not only authenticates, or verifies, a person's identity online, but also checks to see whether a person is authorized -- by virtue of being a librarian, for example -- for a higher-than-usual level of access to an online database to which a college might subscribe. 

This extra step makes it possible for publishers and subscribers to enforce complicated license agreements that may restrict access to special collections to small groups of faculty researchers, for example. Shibboleth does all this while protecting users' privacy, says Kenneth J. Klingenstein, who is the chief technologist for the University of Colorado at Boulder and who heads the Internet2 group that created Shibboleth. 
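
A hypothetical sketch of that extra authorization step follows; the role names and collection identifiers are invented for illustration and are not Shibboleth's actual attribute vocabulary:

    # License terms: which roles a hypothetical agreement admits per collection.
    LICENSE_TERMS = {
        "special-collection": {"librarian", "faculty-researcher"},
        "general-collection": {"librarian", "faculty-researcher", "student"},
    }

    def may_access(user_roles, collection):
        """Authenticated identity alone is not enough: the user must also
        hold a role the license agreement names for this collection."""
        return bool(user_roles & LICENSE_TERMS.get(collection, set()))

    print(may_access({"student"}, "special-collection"))    # False
    print(may_access({"librarian"}, "special-collection"))  # True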

Shibboleth can be used within a college, but it was designed to be a secure way for one institution to authorize users from another institution to use online databases or other online materials, he says. 

Companies like Sun Microsystems and Microsoft also are trying to devise better methods of authorizing the use of Web databases and other Internet resources, but some publishers expect that colleges will adopt Shibboleth as their primary means of Internet authorization. 

JSTOR has installed Shibboleth on its Web servers so that it will be ready if and when more colleges start using it. "It appears to have the right characteristics," says David Yakimischak, JSTOR's chief technology officer. 

Shibboleth works in tandem with a college's directory server to generate a kind of digital token. The token is stored in the user's browser. When the user goes out on the Internet to gain access to a licensed database, for example, the token tells the database who the user is and what the user is allowed to do in the database. 
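
As a loose analogy -- not Shibboleth's actual wire format, which is built on SAML, and with an invented signing key -- a signed token of this kind can be sketched in a few lines of Python:

    import base64, hashlib, hmac, json

    SECRET = b"campus-directory-signing-key"   # hypothetical shared secret

    def issue_token(user_id, entitlements):
        # The directory server signs the user's attributes.
        payload = json.dumps({"sub": user_id, "ent": entitlements}).encode()
        sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
        return base64.urlsafe_b64encode(payload).decode() + "." + sig

    def verify_token(token):
        # The database checks the signature before honoring the request.
        body, sig = token.rsplit(".", 1)
        payload = base64.urlsafe_b64decode(body)
        expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, sig):
            raise ValueError("bad signature")
        return json.loads(payload)

    token = issue_token("jsmith", ["member", "librarian"])
    print(verify_token(token))   # the resource learns roles, not browsing habits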

Shibboleth's approach is more sophisticated than that taken by "cookies" -- bits of data that many Web sites rely on to identify users. Unlike Shibboleth, cookies are not sophisticated enough to enforce different levels of access to online resources based on who the user is. 

Most of Internet2's effort in the months ahead will go toward getting key software companies and online publishers to install Shibboleth "handlers" on their Web servers. A handler is the software, also known as middleware, that handles requests for access to online resources. 
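
A hypothetical sketch of such a handler, written as Python WSGI middleware; the header name and the placeholder token check are invented, and a real handler would validate a Shibboleth assertion instead:

    def check_token(token):
        # Placeholder standing in for real signature validation.
        return {"sub": "demo-user"} if token == "valid-demo-token" else None

    def shibboleth_handler(app):
        """Wrap a publisher's application so every request is vetted first."""
        def middleware(environ, start_response):
            claims = check_token(environ.get("HTTP_X_CAMPUS_TOKEN", ""))
            if claims is None:
                start_response("401 Unauthorized",
                               [("Content-Type", "text/plain")])
                return [b"authentication required\n"]
            environ["campus.claims"] = claims   # downstream app sees the claims
            return app(environ, start_response)
        return middleware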

"In general, the industry hasn't enforced authorization," primarily because there have been few effective means of doing so, says Kimberly Voltero, a senior manager at WebCT. 

WebCT, which makes software for managing courses, has announced that its WebCT Campus Edition is able to handle authentication and authorization requests that originate from Shibboleth. The company's next release of WebCT Vista will also work with Shibboleth, Ms. Voltero says. 

The most colorful endorsement of Shibboleth comes from Mr. Klingenstein himself. Shibboleth is like plywood subflooring, he says. "People will admire the wonderful tiles and carpets of applications on top of this, but [Shibboleth] will be utterly invisible."
*******************************
Los Angeles Times
Computer Artists Draw Interest From Attorneys
High-tech animation is becoming a popular tool in court cases. Critics say it 'Disney-ups' evidence.
By Monte Morin
Times Staff Writer

March 21, 2003

At a Halloween party in 2000, an actor dressed as the devil and playing with a rubber handgun is shot and killed by an LAPD officer called to the Benedict Canyon house because of noise complaints.

To the police, the death of Anthony Dwain Lee was a tragic case of an officer stumbling onto what appeared to be an armed man. Fearing for his life, the officer fired in self-defense.

To Lee's relatives, the 39-year-old actor was an innocent, unarmed man who was a victim of senseless police aggression.

Though they differed on what happened, both sides agreed on one thing: The best way to illustrate the opposing scenarios to a jury was by using computer-animated videos.

Although the civil lawsuit was eventually settled before trial, the dueling reenactments, which employ computer-animation techniques pioneered by Hollywood movie studios, demonstrate just how popular computer graphics have become as courtroom exhibits.

The use of the technology remains controversial -- some attorneys say it's used to "Disney up" evidence -- but the animations are revolutionizing the way trials are conducted. Lawyers who once relied on dry, expert testimony and their own courtroom eloquence to lay out a case are now turning on television screens to get their point across.

Animations are being used extensively in civil litigation but also are increasingly common in criminal cases, by prosecutors and defense lawyers alike. But they aren't cheap. Even a brief animation can cost $7,500 to $15,000.

The fear is that jurors will mistakenly believe they have witnessed the actual event, or that they will be swayed by glitzy theatrics instead of by the evidence.

"There's this sense you get from watching an animated video that you're somehow staring through a window into the past," said David Sklansky, a UCLA criminal law professor. "But in reality, you're not."

In a recent, high-profile criminal case, a Los Angeles Superior Court judge barred a computer animation and other testimony from the penalty phase of a murder trial. The tape, according to the defendant's lawyer, was critical in proving that his client could not have committed the crime he was convicted of: executing an LAPD motorcycle officer during a routine traffic stop. The lawyer is fighting in federal court to have the computer reenactment shown to a new jury.

The defendant, Kenneth Gay, was convicted of the 1983 Lake View Terrace murder of LAPD Officer Paul Verna. In 1998, the California Supreme Court overturned his death sentence on the grounds of incompetent counsel but left intact the guilty verdict.

Gay's new lawyer, Kenneth Lezin, a deputy public defender, hired a computer-animation firm to reconstruct the shooting for the second sentencing trial.

The animation, which is roughly equivalent to the type found in a video game, shows Gay's co-defendant -- Raynard Cummings -- stepping from the stopped vehicle with a revolver. Bright red lines extend from the barrel of the gun, showing the presumed path of six bullets fired into the police officer as he staggers back to his motorcycle.

Prosecutors and Cummings dispute Gay's version. They say Cummings shot the officer once from the back seat, then handed the revolver to Gay, who fired five more shots into the officer.

Lezin said this is impossible, based on the trajectory of the bullets. The video, he said, demonstrates the only way in which the officer could have died: He was pursued and shot repeatedly by Cummings.

"It's like a jigsaw puzzle," Lezin said. "That's why we need an animation to illustrate it. It's very rare that you can use this type of technology in a defense case, because very rarely are we proving something. In this case, we were essentially trying to prosecute the other guy."

Computer animation was first used in a California criminal case in 1992. Jim Mitchell, a San Francisco pornography mogul, was accused of killing his brother and business partner, Artie.

Though Mitchell claimed the killing was accidental, California prosecutors used the animation to help explain to a jury how he could have committed the crime. The simple, cartoon-like animation showed a sleeping figure that resembled a crash-test dummy lying in bed. The dummy suddenly awakens as the first of eight gunshots is fired in his direction. Walking down a corridor, the gray figure is hit by shots in the arm, the chest and finally the head. He crumples and falls to the floor. 

Mitchell was convicted. He appealed his conviction on the basis that the animation was "speculative" and "prejudicial." The appeals court disagreed, and Mitchell served three years of a six-year manslaughter sentence.

Since then, computer animations have grown increasingly popular in court, particularly in civil litigation. Auto collisions, plane accidents, workplace injuries and the workings of machinery and medical equipment are the most common subjects. 

Production companies say that more often than not, their reenactments don't make it before a jury. Once presented with a polished animation, the opposing side often sees it as a sign that their opponent is armed to the teeth and considers settling.

"Very rarely does this stuff go to court," said Stuart Gold of Berkeley's Shadow and Light Productions. "It's basically used as a club.... It's like the trump card."

Still, experts say that even though computer animation has its place in the courtroom, it is no substitute for a strong case and a persuasive lawyer.
*******************************
CNET News.com
Schism hits key open-source group 
By Stephen Shankland 
Staff Writer, CNET News.com
March 20, 2003, 9:55 PM PT

A schism has struck the XFree86 movement, an open-source graphics project key to Linux and several other operating systems, with the core group in charge of the project expelling one of its members. 

The six-member Core Team of the XFree86 project announced Thursday that it had ejected member Keith Packard for trying to create a parallel XFree86 project and refusing to discuss his reasons for the radical move with the rest of the core team. The core team disclosed the ouster in conjunction with an announcement of a new mailing list to discuss XFree86's future. 

In an interview, Packard didn't comment on specific actions but indicated that he was trying to make it easier for interested and qualified programmers to contribute to the XFree86 project. "XFree86 is not currently a friendly place to play," he said. 


XFree86, which sends graphics commands for tasks such as drawing windows to video cards, is a crucial part of Linux and several other operating systems. Packard has been leading improvements to font technology. 

The stakes in the issue are high. Open-source advocates often boast of the advantages of the collaborative programming model, in which the source code underlying a particular program may be freely shared and changed. One drawback of this freedom, however, is that people with different ideas can "fork" the software into different projects that are incompatible, that dilute programmers' energy by forcing them to duplicate the same features or that require outside programmers to work with more than one group. 

Forking isn't all bad, though, advocates say. It can act as a safety valve to prevent a misguided group from gaining too much control, and it can be used to provide competition to assess which of two approaches is best. Indeed, forks of the heart of Linux itself, the kernel, frequently branch off as programmers work on pet projects they hope eventually will be accepted by leader Linus Torvalds into the mainstream. 

The XFree86 Core Team raised the forking issue as evidence of the seriousness of Packard's actions. 

"It has been brought to the attention of the XFree86 Core Team that one of its members, Keith Packard, has been actively (but privately) seeking out support for a fork of XFree86 that would be led by himself," a message from the Core Team and the XFree86 board of directors said. 

Packard also is forming a group of people with "vested interests" to discuss his concerns about XFree86 but has refused to disclose those concerns, the group said. "As a consequence, Keith Packard is no longer a member of the XFree86 Core Team," the group said. 

Packard defended his actions, saying he was gathering information before making his case. "I was trying to get some help in framing the discussion from members of the community before making incoherent statements to the XFree86 management," he told CNET News.com. "I'm still trying to do that." 

But Packard abused his privileges as one of the handful of people authorized to "commit" changes to the XFree86 code base, said XFree86 project leader David Dawes in remarks to the new forum mailing list. Specifically, Packard added a new "Xfixes" section to the software "without any prior discussion, let alone public review," Dawes said. 

David Wexelblat, one of the XFree86 board members who signed the original notice of Packard's expulsion, directed reporter queries about the group's actions to his own posting. "While still a member of the XFree86 Core Team, (Packard) has explicitly attempted to subvert XFree86 by soliciting individuals and corporations to create an alternative to XFree86," Wexelblat said. 

But some changes are afoot, Wexelblat added. For example, the project needs more people who are authorized to lock in software changes, "mostly to reduce load on the people who are currently responsible. I believe this is already in progress," he said. 

One source familiar with the debate said Packard's actions are aligned with the views of top Linux seller Red Hat and server maker Hewlett-Packard, which employs Packard. Red Hat and HP didn't immediately respond to requests for comment. 

The split left some distressed. 

"Bad mistake," said Alan Cox, a top deputy to Torvalds and an employee of Red Hat, in a response to the XFree86 Core Team's move to oust Packard. It would have been better to let him try an experimental project within the XFree86 community then evaluate that project on its merits, he said. 

"XFree86 is hard to get involved with usefully, resistant to cool ideas," Cox said. The project suffers from "plodding progress," a reluctance by leaders to delegate decisions to lower-ranking programmers and preference for infrequent releases rather than the frequent updates used for the Linux kernel itself. 

And in January, another Red Hat programmer, Mike Harris, said XFree86's actions damage relations with outside companies such as graphics card maker ATI Technologies, which has had to wait "months and months" for software updates to be accepted. 

"How long is ATI going to continue submitting patches to XFree86.org that take nine months to get (accepted), and then perhaps another four to six months to be available in an operating system," Harris asked. "Quite frankly, if I were ATI, and submitting patches as frequently as they do, and the patches just sat there, I might start thinking twice about bothering in the future." 

Others urged an amicable resolution. 

"I think I'm going to cry," lamented one commenter at the Slashdot "news for nerds" discussion site. "Please try to work this out with Keith...Don't end up hating each other and refusing to share your code with each other. Either side of such a fork would have a much weaker team."
*******************************
Government Computer News
03/21/03 
IGs: Watch those Social Security numbers 
By Susan M. Menke 

Fifteen inspectors general this month told the President's Council on Integrity and Efficiency that federal agencies are lax in overseeing the use and disclosure of Social Security numbers stored in agency databases. 

Some agencies, they said, are allowing contractors free access to such private information before their background checks are completed or after they stop working within the government. 

The Social Security Administration's inspector general undertook the survey at the request of the House Ways and Means Subcommittee on Social Security. 

"An individual's Social Security number is often the last line of defense against identity theft," Governmental Affairs chairwoman Sen. Susan M. Collins (R-Maine) said in a statement about the report. 

The 15 IGs interviewed personnel within their agencies and observed contractor activities. They noted such practices as unlocked file cabinets and sensitive records lying on desks or shelves after working hours. 

Of the 15 agencies surveyed, only the Environmental Protection Agency had adequate controls, the IGs concluded. They found one or more areas of weakness at the departments of Agriculture, Defense, Education, Health and Human Services, Housing and Urban Development, Labor and Treasury; Federal Deposit Insurance Corp.; IRS; Nuclear Regulatory Commission; Office of Personnel Management; Railroad Retirement Board; Small Business Administration; and SSA. 

The General Accounting Office's May 2002 assessment of Social Security number safety had said all levels of government use and exchange individuals' numbers frequently for many purposes such as benefits, debt collection, statistics and taxation. 

GAO said the 1974 Privacy Act requires government bodies that collect the numbers to: 
Tell individuals whether disclosure is voluntary or, if mandatory, under which law or authority. 
Tell individuals what use will be made of their numbers. 

In the 2002 report, GAO found 32 percent of federal agencies were not following the first requirement, and 21 percent were not following the second.
*******************************
Government Executive
March 19, 2003 
Report: Federal IT spending has doubled in five years 
 From National Journal's Technology Daily 

Federal spending on information technology doubled between 1997 and 2001, with the bulk of those funds going to larger businesses, according to a study released Tuesday by the General Accounting Office (GAO-03-384R).

Federal agencies in 2001 purchased 62 percent of their IT services and supplies from large companies. Twenty-one percent of IT services were purchased from medium-sized businesses and 14 percent from smaller entities, the GAO report said.


GAO defined small businesses as those with less than $21 million in annual receipts. Medium-sized businesses were defined as those with $21 million to $500 million in revenues, and all companies with more than $500 million revenues were defined as large.
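
Restated as a simple rule (a Python sketch of the report's definitions, with annual receipts in millions of dollars):

    def size_category(annual_receipts_millions):
        # Thresholds as defined in the GAO report.
        if annual_receipts_millions < 21:
            return "small"
        if annual_receipts_millions <= 500:
            return "medium"
        return "large"

    print(size_category(15))    # small
    print(size_category(250))   # medium
    print(size_category(900))   # large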


While the Defense Department remains the largest buyer of IT products and services, GAO found that agencies have been increasing spending on IT through the General Services Administration's Federal Technology Service. IT purchases made through GSA supply schedule contracts grew from $405 million in 1997 to $4.3 billion in 2001, the report said.


Under the supply schedule program, GSA negotiates contracts with a wide variety of vendors and allows agencies to place orders under the contracts directly with the firms. In fiscal 2001, medium-sized businesses got almost 40 percent of their contract dollars through schedule sales. Small businesses got 35 percent of their contract dollars through the schedules, and large firms 18 percent.


According to GAO's analysis, American Management Systems obtained the largest percentage of federal IT dollars in 2001, followed by Lockheed Martin and SAIC.
*******************************
Government Executive
March 17, 2003 
Defense, NSA move on 'open source' software development 

By Drew Clark, National Journal's Technology Daily 


The National Security Agency and Defense Department are continuing to promote government use of software like Linux whose source code is freely available to the public, representatives of the departments said at a Monday conference on such "open source" software.

Peter Loscocco, a senior research scientist at NSA, said that in spite of complaints from proprietary software vendors, the agency is continuing to improve its Security Enhanced Linux (SE Linux), a variant of the popular Linux operating system software that deploys an advanced security architecture. 

"We are preparing to submit SE Linux [for inclusion in] the Linux kernel," or the core element of the evolving operating system that is available for free, Loscocco said at an event sponsored by the Center of Open Source and Government at George Washington University. "That will make it possible for [SE Linux] to run with an out-of-the-box Linux system." 

NSA began the project because of its frustrations with computer-security weaknesses in the late 1990s. Rebuffed by proprietary software vendors skeptical of investing large amounts in extra-secure systems without a proven marketplace demand, agency computer scientists studied Linux, modified it and released it to the public in January 2001.

That flexibility and reusability are possible because Linux is offered under the General Public License (GPL), a software license that permits unlimited copying and modification. Some observers believe that the license has lent a competitive edge to Linux, which has emerged as the foremost competitor to the dominant Microsoft Windows operating system. 

Red Hat Linux, another version of the software, received a Defense Department certification Feb. 10, said Fritz Schultz, an official with the Defense Information Systems Agency. The system was one of at least six certified for widespread Defense use, he said. 

Schultz said Defense's policy toward open-source software is best summarized as, "Let's compete, as long as [open source and proprietary systems] are on an equal footing." Rather than articulating a preference for or against open source, he said the department would "use the best software for the job." 

Microsoft has been instrumental in organizing companies to form the Initiative for Software Choice, which seeks to forestall government use of software offered under the GPL. Microsoft objected to the NSA's SE Linux on the grounds that the GPL undermines commercial use of government investment and that it represents competition from the government.

In his remarks, Loscocco said he was glad that NSA had rebuffed such criticism. "We spent a lot of time educating our managers, who accepted a lot of the flak that has come back to NSA about SE Linux," he said. Loscocco was critical of "some of the problems that other people want to impose on us, to make sure we are not working on a product that is competing." 

Speaking about the possible inclusion of SE Linux in conventional Linux packages, Loscocco said it "is very good news for us. Linux will then have a leg up on security that other operating systems just don't have." 
*******************************