
Clips June 18, 2002




ARTICLES

Ridge to Unveil Homeland Dept. Plan
White House cyber czar maps out intelligence and security strategy
Panel to assess FBI reorganization plan
You may owe use taxes on your big, out-of-state purchases
Music Industry, Song-Swapping Site Reach Deal
Man Released From Jail in Web Case
Former Administrator Sues U. of Tennessee Over Released E-Mail Messages
TSA will hand out $92.3 million for port security
UN Conference Hears Digital Divide Still Growing
Microsoft, HP Join UN Tech Effort
Anti-Censorship Advocate Draws Heat
FedCIRC will work with university's CERT
Survey: Cyberterror threat ignored
Security Flaw in Web Software
Digital imaging standards unveiled
New Internet Domains A Year Or More Away
Supreme Court Says Satellite Companies Must Carry All Local Stations
Funny money prevention may add color to dollar
'Snoop' climbdown by Blunkett
What your computer says about you
Devices That Move Digital Media Complicate Piracy Clampdown
Computerized Legal Assistance Is Getting Its Day in Court
Workforce bill on tap
A-11 revisions stress planning
Air Force studies PC costs
Two security alerts point to Apache Web Server flaws
IT integration key to U.S. security
Coast Guard schedules Deepwater contract award for June 25
Agencies to unveil e-signature prototype
Agencies' security spending may rise 12 percent a year
U.S. to implement wireless emergency telecom network



******************
Associated Press
Ridge to Unveil Homeland Dept. Plan
Mon Jun 17, 9:39 PM ET
By JENNIFER LOVEN, Associated Press Writer

WASHINGTON (AP) - After a hurry-up drafting job over the weekend, President Bush was ready Monday to send Congress the legislative blueprint for creating his proposed domestic security agency.

Bush's homeland security adviser, Tom Ridge, was presenting the bill Tuesday morning at a ceremony in the Capitol, House Speaker Dennis Hastert, R-Ill., said Monday. Along with Hastert, the other Republican and Democratic leaders of the two chambers were scheduled to receive the White House measure.

About a quarter-inch thick, the legislation was completed in a rush over the weekend by White House aides eager to meet a stepped-up timetable for creating the Department of Homeland Security, a senior Bush adviser said.

Instead of Bush's usual practice of providing only outlines of his principles when he wants something from Congress, the White House determined this proposal was so big and complex that many details needed to be fleshed out before it was delivered to lawmakers, the adviser said.

The legislation contains no surprises from Bush's announcement, the adviser said.

With so many details outlining how 100 federal entities and almost 170,000 employees should be gathered into a single Cabinet-level agency with a $37.4 billion budget, however, it was sure to be fodder for critics and others hungry for information.

Skeptics have questioned the new agency's lack of authority over intelligence gathering.

Among the entities the new agency would absorb are the Secret Service, the Coast Guard and the embattled immigration and customs services, but neither the FBI nor the CIA. The White House says the reshuffle would be the biggest overhaul in a half-century, yet would not cost significantly more than already announced.

The White House promised it would have draft legislation to Congress within two weeks of Bush's June 6 announcement.

Since then, the president and congressional leaders concurred that the bill should be completed by Sept. 11, the first anniversary of the terror attacks. To achieve that goal, congressional leaders have said they hoped to have initial versions passed before lawmakers leave for their August recess.

Action had begun on Capitol Hill even before the president's proposal arrived.

On Monday, former leaders of the Customs Service, the Coast Guard and other agencies that probably will be affected told a House Government Reform subcommittee those agencies should be kept intact even if they were moved.

On Thursday, Ridge was to testify before House and Senate committees.

Sen. Joseph Lieberman, D-Conn., chairman of the Senate Governmental Affairs Committee, which will handle the legislation, has a homeland security bill similar to Bush's but differing sharply on the sharing and analysis of intelligence.
**************************
Government Executive
White House cyber czar maps out intelligence and security strategy
By Shane Harris
sharris@xxxxxxxxxxx


In announcing the president's proposal for a new Department of Homeland Security last week, Bush administration officials said protecting information networks from electronic attacks and conducting more thorough analyses of intelligence were among its top priorities. Under the new department, both of these functions would be housed in the same division, covering "information analysis and infrastructure protection." In an interview with Government Executive last week, Richard Clarke, the President's cybersecurity adviser, explained how the new approach would work.


The White House wants to merge cybersecurity and infrastructure protection agencies at the FBI, the Commerce Department, the General Services Administration and the Defense Department into the Homeland Security Department, Clarke said. However, the information and infrastructure operations would be kept separate and the White House would continue to set overall cybersecurity policy.



"The structural thing to bear in mind is that the job?of the cyberspace security adviser [Clarke's] is a White House job that is going to continue," Clarke said. The Critical Infrastructure Protection Board, which advises the president on matters of cybersecurity policy and is chaired by Clarke, will also remain intact, he said. "What we're doing over here won't change."



There is some confusion over exactly what "infrastructure protection" means under the president's proposal. Clarke said that "99 percent" of the work done by the cybersecurity agencies that would be merged into the new department involves protecting information networks, rather than physical structures. Many electric power plants, nuclear reactors and water treatment facilities operate on networks that are vulnerable to attack, and the agencies see guarding those networks as key to protecting the facilities. But Clarke said he wasn't sure what protection would be provided for the physical assets themselves under the new division.



The information analysis component of the Homeland Security Department would be made up of analysts who would review intelligence compiled and gathered by other agencies, including the CIA, the National Security Agency, the National Reconnaissance Office and a host of others. Those analysts would produce intelligence reports for the Homeland Security secretary.



Clarke compared the division with the State Department's Bureau of Intelligence and Research, established in 1946 to provide the secretary of State with objective analysis of global political developments. Clarke said the State bureau produces some of the "highest-quality intelligence" in the government. The assistant secretary who heads the bureau reports directly to the secretary of State and serves as his principal intelligence adviser.



The head of the new intelligence analysis division would presumably report to the Homeland Security secretary, as well. Clarke said no one would be named to head the office until the president names a secretary for the department.



Clarke emphasized that the intelligence division will not actually collect information from sources on its own. "Its first job is to do analysis" of intelligence provided by other agencies, he said. Asked how those agencies would be compelled to cooperate with one another and with the new department when distrust and operational differences have kept them apart for decades, Clarke said, "It's really easy, because they all work for the president," adding that word has gone out across the government that the new analysis shop has Bush's full backing.



The Bush administration has spent the past several weeks defending itself against charges that senior officials at the FBI and the CIA failed to communicate with each other about the presence of known Islamic militants in the United States in the weeks before the Sept. 11 attacks. FBI Director Robert Mueller has said that the bureau's antiquated technology systems and historic reticence to cooperate with other law enforcement and intelligence agencies have hampered its ability to stop attacks.



Last month, the FBI established a new Office of Intelligence to share information with the CIA. That office is headed by Mark Miller, a career CIA Soviet intelligence analyst. The CIA has also lent the FBI 25 intelligence analysts.



Clarke acknowledged that adding another layer of bureaucracy to the already scattered intelligence community might strike some as counterproductive. But he said history has shown that when different sets of analysts review the same information, their understanding of it sharpens.



Clarke said that while the work of the intelligence analysis and cybersecurity divisions has little in common, elements of the two groups "will [work] cheek to jowl." He pointed out that the FBI's National Infrastructure Protection Center, which would move into the new department, already employs a cadre of intelligence analysts. However, an NIPC spokeswoman said no formal plans have been made about which of its employees will be transferred to the new department.



Clarke wouldn't say whether he was involved in crafting the Homeland Security Department proposal. Asked whether he would accept the position of Homeland Security secretary if asked by the president, Clarke replied, "I don't think that's likely."
*******************
Government Executive
Panel to assess FBI reorganization plan
By Brian Friel
bfriel@xxxxxxxxxxx


Former Attorney General Richard Thornburgh will head a congressionally appointed panel that will assess the FBI's reorganization plans, the National Academy of Public Administration (NAPA) announced Friday.


Thornburgh, who was FBI Director Robert Mueller's direct supervisor during the first Bush administration, will study whether Mueller's plans to restructure the bureau will align resources with the administration's new counterterrorism mission and improve coordination with other law enforcement agencies. Thornburgh will also assess the reorganization's impact on the bureau's personnel, performance management and technology systems.



The panel will present its findings June 21 at a House Appropriations Committee hearing.



Mueller announced plans on May 29 to devote more of the bureau's agents to investigating terrorism, spend less time investigating drug trafficking and create a new Office of Intelligence to analyze information about potential terrorist attacks.



After the announcement, the appropriations committee asked NAPA to convene a panel to assess Mueller's plans. Thornburgh, who is a former Republican governor of Pennsylvania, will be joined on the panel by Robert Alloway, a former House staff member; Kristine Marcy, a former Small Business Administration and Immigration and Naturalization Service official; NAPA President Robert O'Neill Jr.; and Harold Saunders, a former State Department and National Security Council official.



Thornburgh previously headed a commission to assess effective measures against child pornography on the Internet and headed up an investigation into mismanagement at the United Nations.



Mueller served as Thornburgh's assistant at the Justice Department from 1989 to 1990.
**********************
USA Today
You may owe use taxes on your big, out-of-state purchases
By Sandra Block


Shortly after Robert Doyle of St. Petersburg, Fla., got married, he and his wife embarked on a pilgrimage undertaken by thousands of new homeowners every year. They drove to High Point, N.C., in search of furniture bargains.

They purchased several items, and as expected, a truck delivered their furniture three months later. But six weeks after the furniture arrived, they received something they didn't expect: a bill from the Florida Department of Revenue informing them they owed use taxes on their purchases.

That was about eight years ago. But Doyle, a personal financial specialist at Spoor Doyle & Associates, believes more shoppers will start receiving tax bills for large out-of-state purchases as states look for ways to cope with falling revenue and rising deficits.

Use taxes are the little-known companion of sales taxes, which are imposed by most states. Use taxes allow states to collect taxes on items purchased out-of-state but used within their borders.
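
The arithmetic behind a use-tax bill is simple: apply the home state's sales tax rate to the purchase price, then subtract any sales tax legitimately paid to the other state, since most states allow that credit. Here is a minimal sketch of the calculation in Python; the rates and the credit rule are illustrative assumptions, not any particular state's law.

    def use_tax_owed(purchase_price, home_rate, tax_paid_elsewhere=0.0):
        """Estimate use tax on an out-of-state purchase.

        Assumes the common pattern: tax at the home state's rate,
        with a credit for sales tax already paid to another state.
        Rules and rates vary by state; this is not tax advice.
        """
        gross_tax = purchase_price * home_rate
        # The credit cannot drive the bill below zero.
        return max(gross_tax - tax_paid_elsewhere, 0.0)

    # Example: $8,000 of furniture shipped home to a state with a
    # hypothetical 6 percent rate, no tax collected at the sale.
    print(use_tax_owed(8000, 0.06))          # 480.0
    # Same purchase, but $160 of sales tax was paid at the register.
    print(use_tax_owed(8000, 0.06, 160.0))   # 320.0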

For practical reasons, use taxes are rarely enforced, says Harley Duncan, executive director of the Federation of Tax Administrators, a trade group for state tax agencies. Most states don't have the resources or inclination to track your weekend cross-border trips to the mall.

But big spenders could run into problems, particularly if state tax officials believe they're deliberately dodging taxes. The most recent example: the tax-evasion indictment of former Tyco International chief executive Dennis Kozlowski for allegedly evading more than $1 million in New York state and local taxes on valuable artwork. Kozlowski, who has pleaded not guilty, is charged with arranging to have his artwork shipped from New York to New Hampshire, which has no sales tax, then back to New York.

Even before the Kozlowski case, states were making an effort to inform taxpayers about use taxes, says John Logan, senior tax analyst for tax publisher CCH. Several states have included lines on their state tax returns where taxpayers can declare out-of-state purchases and the amount of use tax they owe, he says. Some include a separate form for use taxes, he says.

If you fail to provide the information and the state later unearths some out-of-state purchases, you could be charged with providing false information on your tax return, he says. "That does up the ante a bit," Logan says.

Use-tax targets

Failure to report a blender you bought in New Jersey probably won't bring the tax collector to your door. But purchases that could catch the attention of cash-strapped state tax agencies include:

* Items that must be registered. When you buy a car, you're required to register it. That makes it easier for state tax authorities to spot out-of-state purchases and tax you accordingly.

* Items purchased overseas. U.S. Customs officials often pass on declarations of purchases to state tax agencies, Logan says. So if you buy an expensive watch in Italy and bring it home, you may receive a use-tax bill from your state.

* Items purchased from a state that has an information-sharing agreement with yours. About a decade ago, several East Coast states arranged to share information about out-of-state purchases, Duncan says. Some of those pacts languished in the late 1990s, but with many states facing a budget crunch, "they may be invigorated," he says.

* Big-ticket items purchased out of state and shipped to your home. Most states charge a sales tax. But many retailers won't charge sales taxes on purchases shipped to an out-of-state address. Some retailers eager to make a sale even suggest shipping an item to your home address to avoid sales taxes.

However, you still owe use taxes on those items. For that reason, some state tax authorities are on the lookout for deliveries from popular shopping destinations, such as the North Carolina furniture outlets. Doyle believes Florida documented his furniture purchases when his delivery truck stopped at a state weigh station.

Given the uneven enforcement of use taxes, many may decide to take their chances and wait for the state to bill them. That strategy probably won't land you in jail, but you could face late-payment penalties if the state decides to get tough.

"You're really taking a risk now if you buy from an out-of-state furniture or appliance retailer," Logan says. "There's a much higher likelihood you'll be charged" use taxes.

If you've made some major out-of-state purchases and want to come clean, several states offer tax amnesty programs that allow you to pay up without penalty.

You can find a link to your state's tax agency at the Federation of Tax Administrators' Internet site, www.taxadmin.org.
***********************
Reuters Internet Reports
Music Industry, Song-Swapping Site Reach Deal
Mon Jun 17, 8:25 PM ET


LOS ANGELES (Reuters) - Music industry groups, including record labels, songwriters and music publishers on Monday said they reached a legal settlement with file-sharing Internet service Audiogalaxy.com for an unspecified payment and an agreement to block unauthorized material from the service.


The Recording Industry Association of America, a powerful trade group for the record labels, and the National Music Publishers Association had filed suit against Audiogalaxy in a New York federal court in late May.


The suit had charged Audiogalaxy with encouraging and facilitating online trade by "millions of individual, anonymous users" in copyrighted music.

The full terms of the quickly reached settlement were not available but as part of the deal Austin, Texas-based Audiogalaxy agreed to filter the songs available on its network and ensure that the copyright holders had consented to their use, the music groups said.

Audiogalaxy also agreed to pay the music publishers and record companies a "substantial sum" to resolve the case quickly, the groups said in a statement.

A representative for Audiogalaxy, which had been one of the more heavily trafficked file-sharing Web sites to spring up after the court-ordered shutdown of Napster, could not be immediately reached for comment.

"This should serve as a wake-up call to the other networks that facilitate unauthorized copying," said Hilary Rosen, chairman of the RIAA. "The responsibility for implementing systems that allow for the authorized use of copyrighted works rests squarely on the shoulders of the peer-to-peer network."

The settlement does not require court approval, a spokeswoman for the RIAA said.

The record industry has waged and won a series of legal battles against online file-swapping services, including the high-profile case against Napster.

The music industry blames such services for encouraging consumers to conduct a massive, black-market trade in recorded music, a phenomenon it says has eaten into legitimate sales.

Napster is now controlled by Bertelsmann AG, which acquired the assets of the original peer-to-peer network as part of a deal under which Napster will emerge from bankruptcy as a unit of Europe's second-largest media group.

The settlement in the Audiogalaxy case would allow the privately held service to operate what the recording industry called a "filter-in" system, essentially requiring that all music available was authorized for online sharing by the copyright holders, the music industry groups said.
***************************
Associated Press
Man Released From Jail in Web Case
Mon Jun 17, 3:25 PM ET
By GENE JOHNSON, Associated Press Writer


SEATTLE (AP) - A 68-year-old man jailed for more than three months for refusing to remove personal information about people he disliked from his Web site was released Monday.


Superior Court Judge James A. Doerty gave Paul Trummel until Friday to remove the addresses and phone numbers of directors, staff and neighbors at the retirement home where he used to live. Otherwise, he will be sent back to jail.


The case has drawn attention from First Amendment groups, but the judge said the case was not about freedom of speech. Trummel, he said, is simply "a mean old man who becomes angry and vicious when he doesn't get his own way."

About three years ago, Trummel began publishing a Web site and newsletter detailing his complaints about Council House, a 163-unit, federally subsidized apartment building in Seattle's Capitol Hill neighborhood. He accused staff and residents of bigotry, housing-law violations, a conspiracy to keep him awake at night, even "sexual dysfunction."

Council House managers claimed harassment, and in October, the judge ordered Trummel to remove the personal data. Trummel initially complied but soon listed the information again. Doerty found him in contempt Feb. 27 and sent him to King County Jail.

"All he's alleged to have done is publish names and phone numbers on the Internet. It's not against the law to do so," said Sandra Baron, executive director of the Libel Defense Resource Center in New York.

Many Council House residents wanted Trummel stopped.

"It's been horribly scary," said Nathaniel Stahl, 59. "He's spent all these years trying to really hurt people here."

Trummel's lawyers said they do not know whether he plans to comply with the judge's order.
***********************
The Chronicle of Higher Education
Former Administrator Sues U. of Tennessee at Knoxville Over Released E-Mail Messages
By ANDREA L. FOSTER


Pamela S. Reed, a former administrator at the University of Tennessee at Knoxville, has filed a $14-million lawsuit against the university and three of its administrators. The suit accuses them of violating her civil rights and her privacy by publicly disseminating her e-mail messages and other personal information.

The suit, filed earlier this month in the U.S. District Court for the Eastern District of Tennessee in Knoxville, says the university and some of its administrators sought to undermine her professional status after she revealed her intimate relationship with the former president of the university, J. Wade Gilley.

In addition to the university, the lawsuit names as defendants the university's general counsel, Catherine S. Mizell; T. Dwayne McCay, the vice president for research and information technology; and Katherine N. High, the vice chancellor for student affairs at the university. All three declined to comment on the case.

The university announced Ms. Reed's resignation on June 13, 2001, the same day it handed over to news reporters 900 pages of e-mail messages that a reporter for The Knoxville News-Sentinel had requested under provisions of the state's open-records law. Those messages, and others released to news organizations later, suggested an extramarital romance between Ms. Reed and Mr. Gilley, who had announced his resignation as president on June 1, 2001. (See an article in the current issue of The Chronicle.)

According to the lawsuit, the administrators conspired to discredit and find a reason to fire Ms. Reed after she revealed to Mr. McCay, her supervisor, that she had had an affair with Mr. Gilley, and after she complained to Mr. McCay "of a hostile work environment caused by discriminatory conduct of university officials."

Ms. Reed said in an interview that she told Mr. McCay that university officials were punishing her for the relationship, rather than punishing Mr. Gilley, and that she believed this was because she was a woman.

In their efforts to discredit her, the lawsuit alleges, administrators read her stored e-mail messages. She also says they used government property to gain illegal access to her personal computer files.

Ms. Reed also accuses the three administrators of hiring a private investigator to gain access to her credit records, of searching her office and desk for private communications, and of releasing to the news media her medical records, Social Security number, and letter mail.

Ms. Reed, who lives in Knoxville, says she is now working as a journalist for a weekly newspaper and weighing an offer from a pharmaceutical company. "I've given up on higher ed," she said. "I feel so betrayed."

She seeks $2-million in compensatory damages and $5-million in punitive damages from the university. She seeks the same amount from the three administrators together.
**********************
Government Computer News
TSA will hand out $92.3 million for port security
By Preeti Vasishtha


The Transportation Department is giving $92.3 million in grants to 51 ports nationwide to boost their security. About $78 million of the grants will go to secure facilities and $5 million to evaluate vulnerabilities, department officials said.

The remaining $9.3 million will fund proof-of-concept projects in new technologies, such as electronic seals for shipping containers, vessel tracking and electronic notification about vessel arrivals.

Transportation Secretary Norman Y. Mineta said terrorism has "renewed focus on the security of our transportation systems." He and Coast Guard and local officials spoke today near New York's Staten Island Ferry landing.

Congress gave the funds to the Transportation Security Administration, which will manage the grants along with the Maritime Administration and Coast Guard. TSA is responsible for securing all U.S. modes of transportation.
***********************
Reuters
UN Conference Hears Digital Divide Still Growing
Mon Jun 17, 8:51 PM ET


UNITED NATIONS (Reuters) - The digital divide between rich and poor countries is growing despite the many efforts to help developing nations break into the global economy via computers, U.N. Secretary-General Kofi Annan said on Monday.


"The digital divide still yaws as widely as ever, with billions of people still unconnected to a global society which, on its side, is more and more wired," Annan said.


"Despite commendable efforts and various initiatives, we are still very far from ensuring that the benefits of information and communications technology are available to all," he said at the start of a two-day session of the U.N. General Assembly devoted to computers and development.

He called on industry to work with governments, civic groups and the United Nations to find better ways to integrate developing nations into globalization, and to be prepared to commit resources to the problem over the long term.

Participants said there was broad consensus that information and communications technologies could play a major role in promoting economic growth and development, and fighting poverty and disease.

But progress has been slow in many parts of the world.

"Some countries have prospered while others have fallen behind," said Yoshio Utsumi, secretary-general of the Geneva-based International Telecommunications Union. "If we do not take any action, the gap between the information 'haves' and 'have nots' will continue to grow."

Utsumi said "information poverty" remained a reality for much of the world. More than 80 countries had fewer than 10 telephone lines for every 100 inhabitants. And in three out of five countries, fewer than one out of 100 people used the Internet, he said.

"Information has become the key to competitive advantage for both business and modern states," he said. "Anyone can work and provide a product to the global market, even from a remote corner of the world, if the means of communication are readily and cheaply available."
************************
Associated Press
Microsoft, HP Join UN Tech Effort
Tue Jun 18, 3:26 AM ET
By JIM KRANE, AP Technology Writer


NEW YORK (AP) - Microsoft Corp. and Hewlett-Packard Co. are two of a handful of companies agreeing to divert 20 percent of their charity donations to the cause of providing developing countries with Internet and telephone service.


At a meeting of the United Nations General Assembly on Monday, both companies announced they had joined the CEO Charter for Digital Development, an initiative by the World Economic Forum being administered by the United Nations.


In the case of Microsoft, which donated $36 million in cash and $179 million in software last year, funds given to alleviate the so-called digital divide should surpass $43 million, or 20 percent of that $215 million total.

Hewlett-Packard is expected to divert around $10 million of its philanthropy budget, which was about $50 million last year.

Microsoft, which has contributed to technology programs in Poland, South Africa, Kosovo and elsewhere, decided to join the initiative to take advantage of the "synergies" that come with collaborating on aid projects, said Bruce Brooks, Microsoft's director of community affairs.

H-P's philanthropy has set up Internet and telephone center projects in Senegal, Ghana and South Africa, and soon, in Brazil, said Debra Dunn, H-P's senior vice president of corporate affairs.

"These projects are not purely philanthropic," Dunn said. "We're very much lining up business opportunities in these countries. Unless we grow our business, what we can do is very constrained."

The thought that donor companies might use charity to increase their markets in poor countries is seen by the UN as the key to bringing all the world's inhabitants into the global economy, said Jose Maria Figueres-Olsen, special representative to Secretary-General Kofi Annan on technology and development issues.

"If they look at this to include business down the road, I think that's perfectly valid," said Figueres-Olsen, a former president of Costa Rica. "We should be trying to create a globe with six billion consumers."

Dunn said H-P's just-completed merger with computer maker Compaq will boost its charity budget but not by much.

"We will add Compaq's philanthropy budget to our own, but their giving was, frankly, very modest," she said.

Microsoft and H-P are the initiative's only American donor companies. Also taking part are France's Vivendi Universal, South Africa's MIH Group, Egypt's Masreya and Equitable Cardnetwork of the Philippines.

Figueres-Olsen said the initiative aims to recruit 150 member corporations by year's end and collect hundreds of millions in donations.

The CEO Charter initiative is one of a few programs that aim to bring communications technology, and the economic gains believed to follow, to the world's poor.

The schemes often bring Internet-connected computers to schools or cafes, sometimes in remote villages connected via satellite, allowing villagers to check market prices for crops, sell handicrafts on the world market or send e-mail to relatives abroad.

Other projects have distributed cell phones or computer terminals to villagers, who form small businesses by renting them to neighbors.

Several previous initiatives failed because they lacked a long-term commitment, Annan said, addressing a session on the topic on Monday.

"The digital divide still yawns as widely as ever, with billions of people still unconnected to a global society which, on its side, is more and more 'wired,'" Annan said, noting that he hoped current programs would be "provided with adequate resources over the long term."

Other programs aimed at spreading technology to the world's poor and raising their living standards include the UN Development Program's Digital Opportunity Initiative and the Digital Opportunities Task Force, sponsored by the Group of Eight countries.

The $10 million Digital Opportunity Initiative is sending teams of high-tech consultants to a dozen developing countries, urging governments to adopt business-friendly laws, build communications infrastructure and train workers to use the Internet.

The project supports the use of wireless technology in developing countries where government-run telecom monopolies have been unable or unwilling to provide inexpensive long-distance calls or connections to the Internet.
**********************
Associated Press
Anti-Censorship Advocate Draws Heat
Mon Jun 17, 3:48 PM ET
By LUIS CABRERA, Associated Press Writer


BELLEVUE, Wash. (AP) - Internet activist Bennett Haselton has made a name for himself by helping minors disable filtering programs designed to block Web sites that their parents deem offensive or pornographic.

His Peacefire.org site offers free downloads and details methods for circumventing filtering software that critics say also inevitably blocks out a range of useful, even beneficial, Internet content.

Yet while Haselton's crusade, launched six years ago while he was a college student, has made him a hero among some Web-savvy minors, he's something of a supervillain to filtering advocates.

"He's being totally irresponsible," said Marc Kanter, marketing director for Santa Barbara, Calif.-based Solid Oak Software, which makes the CYBERsitter program.

"When he started Peacefire, he was a kid himself," Kanter said. "Basically he was enticing minors into his beliefs and activities, which was to undermine parents' rights. As an adult now, he should know better than that."

Haselton, a 23-year-old who simultaneously earned a bachelor's and master's degree in mathematics from Vanderbilt University in Nashville, Tenn., says his objection to Net censorship is not born so much of passion as logic.

The criteria used by filter program designers are too arbitrary, he says.

Besides, children should be able to view whatever Web page they like, Haselton asserts: "I think intellectual development is one of the fundamental human rights and it's also a right that people under 18 have."

Haselton was heartened by a federal appeals court decision last month that struck down the Children's Internet Protection Act, ruling that public libraries cannot be forced to install filtering software in order to receive federal funding.

But many who share Haselton's opposition to filtering consider his position extreme.

"I'm not of the opinion that parents don't have any say where children should go" on the Internet, said Chris Hunter, a University of Pennsylvania researcher who testified on behalf of librarians at the trial.

Hunter worries that Haselton's line of thinking "that parents shouldn't have a right to monitor their children's access lends fuel to the other side saying that we're somehow uncaring about the issue."

Haselton, who works from a cramped one-bedroom apartment in Seattle's eastern suburbs, was raised as a U.S. citizen in Copenhagen, Denmark, where his mother taught music to diplomats' children, among others.

After graduating from Vanderbilt at age 20, he went west to work for Microsoft. But he left in January 2000, frustrated that he was writing code rather than tracking bugs for the software giant.

In addition to running Peacefire, Haselton now does battle with purveyors of Internet spam and works to ferret out security flaws on the Internet.

He made about $15,000 in bounty from Netscape last year for discovering flaws in the company's browser software. And last month he gained notoriety for finding flaws with Anonymizer.com, a popular Internet privacy service that lets Web surfers visit sites anonymously.

"That was pretty sophisticated," Anonymizer President Lance Cottrell said. "The fact that he was able to find it is testimony to what a clever fellow he is."

Haselton also has won 10 of 14 small-claims cases and thousands of dollars in judgments against senders of e-mail spam, though he has yet to collect a cent. Washington is one of about two dozen states with anti-spam laws.

On a recent weekday, virtually every square foot of floor space in Haselton's apartment was covered by stacks of programming books, floppy disks, empty boxes, dirty clothes and an upended office chair. Four computers dominated a corner table, where Haselton probes for vulnerabilities in filtering programs.

Haselton says that while he intends to keep sniffing out bugs for bounty, he hopes to focus more of his energy on Peacefire's crusade.

"This is something that practically nobody else is working on, and only a couple of people in the world actually know as much about the blocking software issue," he said.
**********************
Government Computer News
FedCIRC will work with university's CERT
By Jason Miller


The Federal Computer Incident Response Center is putting together a pilot to stop hacker attacks on agency Web sites. FedCIRC, a General Services Administration unit that is to be part of the proposed Homeland Security Department, is joining with Carnegie Mellon University's CERT Coordination Center to collect and analyze data from sensors in agency firewalls and intrusion detection systems. "We want to give agencies some analysis so they know more about incidents that occur on their sites," said Sallie McDonald, assistant commissioner in the Federal Technology Service's Office of Information Assurance and Critical Infrastructure Protection. "We also will feed the data to FedCIRC so we can do the same kind of analysis governmentwide."

McDonald said agency managers could go to the FedCIRC portal to watch what is happening agencywide and governmentwide. "We will be able to see trends and warn agencies if we see an attack occurring in government," she said.
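
The trend-spotting McDonald describes boils down to correlating the same attack signature across many agencies' sensor feeds. The sketch below shows one way such a correlation could work; the data layout, signature names and threshold are invented for illustration and say nothing about FedCIRC's actual design.

    from collections import defaultdict

    def governmentwide_trends(alerts, threshold=3):
        """Flag attack signatures reported by several agencies at once.

        `alerts` is an iterable of (agency, signature) pairs standing
        in for sensor data from firewalls and intrusion detection
        systems; both the structure and the threshold are assumptions.
        """
        agencies_hit = defaultdict(set)
        for agency, signature in alerts:
            agencies_hit[signature].add(agency)
        # A signature seen at many agencies suggests a spreading or
        # coordinated attack worth a governmentwide warning.
        return [sig for sig, hit in agencies_hit.items()
                if len(hit) >= threshold]

    sample = [("DOT", "apache-chunked"), ("GSA", "apache-chunked"),
              ("NASA", "apache-chunked"), ("GSA", "port-scan")]
    print(governmentwide_trends(sample))  # ['apache-chunked']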

Four or five agencies will take part in the pilot this fall, and full implementation would occur about a year later, McDonald said.

Meanwhile, FedCIRC will issue two requests for proposals this summer. One will be for a secure knowledge management portal for federal employees involved in network security, McDonald said. Systems administrators, CIOs and other officials would have access there to FedCIRC tools and services and could communicate with each other in a secure environment. The portal also would provide access to FedCIRC's security patch management site that Science Applications International Corp. is developing.

The second RFP will be for packaging a security tool kit of federally developed programs from the National Institute of Standards and Technology and the National Security Agency.

"These are all tools that have been tested by government that agencies could use at no cost," McDonald said. "We want to cut down on expense and standardize the types of security tools used across government so assessments are done similarly."
*******************
MSNBC
Survey: Cyberterror threat ignored
Silicon Valley firms not taking adequate precautions
By Robert Mullins and Andrew F. Hamm
SAN JOSE BUSINESS JOURNAL


SAN JOSE, Calif., June 17 - Headlines about the arrest of a man suspected of plotting to attack the United States with a "dirty" radioactive bomb reinforce the point that terrorist attacks can take many forms. Attacks on computer networks are one such form. But businesses, while aware of the risk, are slow to pay money or attention to cybersecurity, a new report shows.
IN MANY CASES, the down economy has cut into their information technology budgets to address the problem. Even if security gets more favorable treatment than other IT areas in budgeting, companies may still be underestimating the threat at their peril.
"The number of cyberattacks on businesses has doubled since Sept. 11," says Bill Rohde, president of the global technology group for St. Paul Cos., a property and liability insurance company.
Cyberattacks could include a hacker stealing customers' credit card information, defacing a corporate Web site, or spreading a virus that could disrupt a company's business. In theory, a cyberattack could go so far as to cripple a nuclear power plant, interrupt transportation systems or steal information critical to national security.
A St. Paul survey of more than 500 company risk managers and IT heads questions Internet-related and software-related companies' commitment to cybersecurity. The survey released June 12 in San Francisco says companies don't plan well enough to assess cybersecurity risks or work to protect against those risks.
Cyberattacks are just as big a threat as the dirty bomb or other physical attacks, says William Martel, a professor of national security at the Naval War College in Newport, R.I.
"When we went through Y2K, it became only too apparent how dependent our society has become on technology," Mr. Martel says. "We've seen quotes about al-Qaida wanting to hit us where we hurt. No one in or out of our government has ruled out al-Qaida using technology."
In fact, groups of all stripes have launched cyberattacks to send political messages, says Eric Friedberg, managing director of Stroz Associates LLC, a New York City cybersecurity consulting firm.
Groups including animal rights activists, anti-abortion advocates and Chinese Communists have defaced Web sites or clogged the Internet to block traffic, Mr. Friedberg says. In the Middle East, groups representing Israel, the Palestine Liberation Organization, or Arabs frequently deface one another's Web sites or attack one another's networks.
Al-Qaida is not the only source of cyberthreats we should worry about, Mr. Friedberg says.
"It's a politically motivated attack meant to attack or destroy a target system and it's carried out through electronic means," he says.


MORE SPENDING NEEDED
The St. Paul survey shows many company CEOs and risk managers falsely assume that the problem has been taken care of at their companies, Mr. Rohde says. And whether or not they understand the risk, some businesses fail to fully fund cybersecurity programs.
"Businesses really need to pick it up in terms of their spending on security," says Arthur Wong, chief executive officer of SecurityFocus Inc., a network security company in San Mateo. SecurityFocus monitors clients' computer networks and warns them about potential hacker or virus threats.
Are the clients spending enough on the problem?
"If you're talking about technology, and if you ask 'Are we ready for the next attack?' in general, the answer is definitely no," says Mr. Wong.
This seeming indifference to cybersecurity in the private sector contrasts with increased attention in the public sector.
President George W. Bush's proposal for a new Cabinet-level Department of Homeland Security, announced June 6, includes a cyberterrorism component. The National Infrastructure Protection Center, a department within the FBI that monitors the Internet for potential attacks, would be brought into this new department.
A reorganization of the FBI, proposed by Director Robert Mueller on May 30, would create a cybercrime division.
In California, security has been increased at nuclear power plants, water treatment facilities and transportation links such as the Golden Gate Bridge. But increased government vigilance is no replacement for business vigilance, says Stroz Associates' Mr. Friedberg. Stroz Associates plans to participate in a cyberterrorism forum in the Bay Area this fall.
"The greater responsibility lies with the private sector," he says. "The critical infrastructure that makes business function, which if disrupted would cause business interruption, lies in the private sector's hands."
THREAT WILL ONLY GROW
The threat of cyberterrorism increases as the sophistication of the attackers increase, St. Paul's Mr. Rohde says.
"There is a new, emerging professional out there the professional hacker who makes his living attacking computer systems," he says. Along with this new professional is a higher degree of cyber-weaponry.
The risk of viruses or worms (computer intruders that can take over and disable whole networks of machines) grows as computer networks expand and software becomes more complex and therefore more vulnerable, says SecurityFocus's Mr. Wong. So far this year, an average of 50 new computer software vulnerabilities have been discovered each week, up from 35 a week last year and 20 a week in 2000.
But even businesses aware of the risks have to balance cyberattack possibilities with budget realities.
O'Shea's Computer Consultants has recently seen clients consider investing in strong, expensive security technology to protect their networks, but then pull back and go with a less costly, less secure option, says Timothy O'Shea, chief executive officer of the San Mateo consulting firm.
"They have not actually felt the wrath of a cyberattack, so they haven't felt the need," Mr. O'Shea says. "Security is a state of mind."


       Copyright 2002 American City Business Journals Inc.
***********************
Associated Press
Security Flaw in Web Software
Tue Jun 18, 5:14 AM ET
By D. IAN HOPPER, AP Technology Writer

WASHINGTON (AP) - A security bug was found in software used by millions of Web sites. Private experts alerted users and the FBI's computer security division.


Problem is, they didn't tell the maker of the software. Then they issued the wrong prescription for fixing the problem.


The incident Monday involving Apache's Web software shows that the system to insulate the Internet from attack - a joint effort of the government and private companies - is still a long way from perfect.

"It would be good if people would agree on some standards," said Chris Wysopal of Boston security firm AtStake. "People can't be put at risk like this again and again."

Internet Security Systems of Atlanta published a warning early Monday about vulnerabilities in Apache on some computer operating systems. Apache is used on about 60 percent of Web servers, the computers that deliver Web pages to the Internet. Many companies, including IBM and Oracle, create products that rely in part on Apache.

Now ISS is under fire for breaking informal industry agreements by rushing out the warning and a partial fix before coordinating with Apache developers.

The issue reveals infighting and hasty decisions that have become common in the computer security industry. Experts say the effect is to confuse users and possibly cause even more security problems.

Several third-party groups are designed to coordinate computer security information. But there may be too many: ISS and the Apache developers chose different ones, and never coordinated with each other.

ISS researcher Chris Rouland said the company talked to the National Infrastructure Protection Center, part of the FBI. Apache developer Mark Cox said his group spoke with researchers at the CERT Coordination Center, based at Carnegie Mellon University in Pittsburgh and partially funded by the Defense Department.

Spokesman Bill Pollak said CERT does share information with NIPC, but would give no specific details on the Apache hole. A spokeswoman for NIPC had no comment.

The Bush administration has called for the consolidation of government computer security groups under the proposed Homeland Security Department, and Bush advisers have admonished the technology community to share more information with government to protect consumers.

Rouland said ISS was rushing to beat hackers to the punch.

"We didn't set out to burn Apache," Rouland said. "We want to make sure we notify our customers appropriately."

Rouland said he didn't notify the developers of Apache because they aren't a formal company. Apache is open-source, meaning that the software and its blueprints are free and managed by programmers who coordinate its evolution.

Complicating the matter, Rouland said he didn't trust Cox, who along with his Apache duties is the senior director of engineering at Red Hat Software, which distributes the Linux operating system. Rouland accused Red Hat of taking credit for earlier ISS research.

Cox said he already knew about the hole from a different researcher, and that the ISS fix doesn't repair the entire problem.

"If ISS had told us before going public, we could have told them their patch was insufficient," Cox said. "The fact that they didn't has caused some problems."
*********************
MSNBC
Digital imaging standards unveiled
Industry coalition tries to simplify the ordering of photo prints
By David Becker



June 17 - A coalition supported by some of the biggest companies in digital imaging announced Monday an open standard and network intended to simplify ordering photo prints. The International Imaging Industry Association (I3A), a nonprofit trade group supported by Eastman Kodak, Hewlett-Packard, Fujifilm and others, is developing the Common Picture Exchange Environment (CPXe), a new standard for distributing photos over the Internet.
THE I3A WILL MAINTAIN a directory of retail photofinishers that support the standard and will supervise the network that will allow online photo services, retail photofinishers and other services supporting the CPXe standard to exchange images with each other.
In one of the most common scenarios, consumers would send images from their PC to an online storage service, order prints and pick them up at a neighborhood photofinishing shop. This process is more similar to the film-based system that consumers are familiar with than to current digital alternatives such as ordering photos online and waiting for them to arrive in the mail, or using a print kiosk at a photo store, said Lisa Walker, executive director of the I3A.
"What we're trying to do is enable any Web site, any user, any camera to communicate with each other seamlessly," Walker said.
"We've got this really massive photofinishing infrastructure sitting at retail that really isn't being leveraged by digital imaging," she said. "One of the keys to changing that is to realize that consumers really don't like to change their behavior. They have a comfort level with the way photofinishing works now. We really have to meet or exceed the convenience the film model offers today."
Walker said the network to support exchange of CPXe data would be ready by the end of this year, with the first companies to support the service likely offering services in early 2003. I3A will maintain the infrastructure using fees collected from participating companies.
Mark Cook, director of product management for Kodak's digital imaging division, said that while the initial focus will be on connecting consumer PCs with photofinishers, new applications are likely to emerge as support for CPXe proliferates.
"Once you standardize these interfaces, there are other scenarios you can imagine happening," he said.
Cook envisions photo kiosks at tourist attractions such as Walt Disney World, where consumers could order prints from a hometown camera shop as soon as they filled up their camera's memory card.
Chris Chute, digital imaging analyst for research firm IDC, said that while the I3A's goals are on target, implementation may not be easy.
"This is the kind of effort that needs to happen across the industry to make the consumer digital imaging experience as seamless as film is now," he said. "But there are a number of challenges ahead as far as who manages this infrastructure and how much is that infrastructure going to cost.
"If not everyone signs on, it's going to be hard to get the critical mass to make it work," he said.
Sufficient broadband penetration, both for consumers and the photofinishing retailers, also will be an obstacle when dealing with large image files, Chute said. "From an infrastructure point of view, there's a real issue as far as what kind of pipeline you're going to have to support this," he said.
Ramon Garrido, director of digital imaging programs for Hewlett-Packard, said he expects CPXe to generate widespread support in the imaging industry.
"The thing were focused on is propelling the growth of digital imaging by reaching a mainstream consumer," he said. "Once the market grows, we can all get in there and compete with our individual products, but we need to build that market."
CPXe will be based on established Web services standards such as XML and Simple Object Access Protocol (SOAP), making it one of the first major efforts of the much-hyped Web services push aimed at consumers.
"The technology part isn't really all that complicatedwere using a lot of existing standards," Walker said. "The challenge of putting the business model together has been more of a concern for us."
Copyright © 1995-2002 CNET Networks, Inc. All rights reserved
*********************
Washington Post
New Internet Domains A Year Or More Away
By David McGuire
Monday, June 17, 2002; 5:30 PM


Internet users looking for alternatives to "dot-com" and a handful of other existing address suffixes will have to wait at least another year, the president of the Internet Corporation for Assigned Names and Numbers (ICANN) said today.

"ICANN has a lot on its plate right now," ICANN President Stuart Lynn said, adding that the body must determine "how much can be done with the resources available" before it creates new Internet addressing codes.

If Internet addressing authorities decide to introduce more domains to the system, ICANN would probably need one to two years to find prospective operators and ink a new round of agreements, Lynn said.

ICANN, which manages the Internet's worldwide addressing system, is responsible for deciding when and if new Internet domains are added to the international Domain Name System (DNS).

In November 2000, ICANN approved the creation of seven new suffixes designed to ease crowding in dot-com, dot-net and dot-org. Those domains, which included "dot-info" and "dot-biz," were the first generic domains created since the advent of dot-com, dot-net and dot-org more than a decade before.

Over the weekend, an internal ICANN task force - chaired by Lynn - issued a report outlining criteria for evaluating the effectiveness of those domains and possible steps that the ICANN board could take toward approving another series of suffixes.

Specifically, the report weighs the risks and rewards of moving to add another set of domains before ICANN completes its internal review of the domains that it approved in 2000.

Lynn said that while proceeding on parallel tracks could address some of the concerns of critics who feel that ICANN is moving too slowly toward adding new domains, that approach could also make it harder for ICANN to learn from whatever mistakes it may have made in 2000.

"I would hope we'd be able to have a much more precise framework of criteria (for choosing new domains) than we had last time around," Lynn said.

ICANN critics have accused the organization of basing its November 2000 choices at least partially on arbitrary criteria not outlined in its request for proposals.

If ICANN decides to move in the quickest way possible, it could approve new domains within a year, whereas two years is a better estimate if the organization decides to take a more deliberative approach, Lynn said.

All of those timelines are contingent on the ICANN board deciding to move forward with creating new Internet domains in the first place.

The ICANN board is set to meet in Bucharest, Romania, next month, where it will take up an internal reform proposal submitted by Lynn earlier this year.

A draft of the domain evaluation report is on the ICANN Web site, www.icann.org.
***********************
Washington Post
Supreme Court Says Satellite Companies Must Carry All Local Stations
By Gina Holland
Associated Press Writer
Monday, June 17, 2002; 12:48 PM


Television station lineups won't be changing much for Americans who get their broadcasts by satellite, the Supreme Court said Monday.

The court refused to consider letting satellite companies choose which local stations to air. Justices rejected arguments that the companies had a free speech right to broadcast what they want.

That means satellite companies have to follow the same "must-carry" rules imposed on cable systems. The high court ruled 5-4 in 1997 that cable TV systems could be forced to carry local stations.

The rule protects small, independent broadcasters and keeps companies from dropping the less popular ones and adding new channels.

Under the Federal Communications Commission rules, satellite companies must run all local stations if they choose to carry one. The companies still can opt not to use any.

The Satellite Broadcasting and Communications Association argued that satellite companies are different.

"There is no evidence whatever supporting the proposition that carry-one, carry-all will preserve even a single broadcast station that otherwise would go dark," attorney Charles J. Cooper, representing the satellite companies, told the court.

Most Americans have either cable or satellite television service, and the Bush administration said the rules should be comparable for both.

If satellite companies were allowed to pick the most popular among local stations, their customers would need a separate antenna to be able to pick up some local stations, the administration told the court.

The Richmond-based 4th U.S. Circuit Court of Appeals had sided with the government. The appeals court said the rule was a "reasonable, content-neutral restriction on satellite carriers' speech."

The case is Satellite Broadcasting and Communications Association v. Federal Communications Commission, 01-1332.
**********************
USA Today
Funny money prevention may add color to dollar
WASHINGTON (AP) - When you look at your greenbacks in the future, you might see red. Or blue, or any number of colors as the nation's money makers mull another makeover to thwart high-tech counterfeiters.


Perhaps a spot on the paper bills might even look 3-D.

Those are some of the ideas being floated as the government works on designing new bills that will be harder to knock off. It is a continuing challenge in a world where large quantities of counterfeit notes can be produced easily and quickly using increasingly sophisticated computer technology.

New bills are expected to debut in mid- to late 2003. A final design, which Treasury Secretary Paul O'Neill must approve, is not expected to be publicly released until next year.

The last currency makeover started in 1996 and was the biggest change in the dollar's design in 67 years, with a number of high-tech features added.

The most noticeable change, however, was that portraits were made bigger and moved slightly off center. As a result, a number of nicknames cropped up for the notes, including Monopoly Money.

One change being considered now is the addition of "subtle color" to the bills, says the Bureau of Engraving and Printing, which makes the nation's paper money. The goal would be to use color in such a way that would make it harder to make bogus bills.

Green and black ink is now used on neutral-colored paper. Experts say color could be added in the neutral areas, in other specific spots or be used to tint the entire note. Colors could vary by denomination.

The government is not offering details. But the bureau says that whatever changes are made, "the public can rest assured that notes will maintain their distinct American look and feel."

The size of the notes will not change and the same faces will appear on the same bills.

The United States has had colorful money before, but that was years ago.

"Some of the bills of the late 1860s are so colorful because of the combination of the inks and special hued paper that collectors refer to them as Rainbow Notes," said Lyn Knight, whose business auctions rare U.S. and foreign bills to collectors.

From 1929 until 1963, five different colors of ink appeared on the Treasury Department seals printed on the fronts of circulating bills, Knight said.

"I think our currency right now is the most boring in the world," he said. "Adding color would make it more interesting."

Besides color, another change may include using more distinct color-shifting ink. In the last redesign, color-shifting ink that looks green when viewed straight on but black at an angle was used in a spot on some notes.

Another idea: adding what can look like a sophisticated 3-D hologram to new notes, a notion promoted by Leonhard Kurz and Co., a major supplier of technology for optically variable devices. They are state-of-the-art foils, which incorporate metallic reflection and diffraction.

Some of the anti-counterfeiting features included in the last redesign are likely to be retained, the bureau says. They include watermarks that are visible when held up to a light; embedded security threads that glow a color when exposed to an ultraviolet light; and very tiny images, visible with a magnifying glass, known as microprinting.

Benjamin Franklin, whose face is on the $100 bill, got a makeover in 1996. He was followed by Ulysses S. Grant on the $50 bill in 1997 and Andrew Jackson on the $20 bill the following year. New $5s and $10s came out in 2000.

As with the last makeover, there are no plans to give George Washington, whose visage is on the most common bill - the dollar - a high-tech facelift, because the note is not attractive to counterfeiters, experts say. The same goes for the obscure $2 bill.

When new bills are issued, the old bills continue to be accepted and recirculated until they wear out. The government also plans to work closely again with industry to make sure new bills can be read by ATMs and vending machines, including those used for public transportation.

Over the years, counterfeiters have graduated from offset printing to sophisticated color copiers, computer scanners, color ink jet printers and readily available publishing-grade software.

In the 2001 fiscal year, $47.5 million in counterfeit bills got into circulation in the United States, according to the Secret Service, which was created in 1865 to stem the rampant counterfeiting taking place at the time. Of that amount, $18.4 million, or 39%, was in phony computer-generated notes.

While that is down from the 47% share of computer-generated counterfeit bills in 2000, it remains a large concern.

"Counterfeiting is a problem. There's not only a loss to the victim, but it is also used to underwrite violent crimes," said Secret Service spokesman Jim Mackin. Between 40% and 45% of counterfeited bills in the United States were from Colombia

The $20 bill is the most counterfeited note in the United States, while the $100 bill is the most knocked-off bill outside the country.

The last money makeover has been proving effective in combating counterfeiters, Mackin said.

"With those notes we are catching more counterfeiters at the first pass the first bank teller or retail clerk," he said. "You want to get to the counterfeit notes before they change too many hands."
*********************
BBC
'Snoop' climbdown by Blunkett


Home Secretary David Blunkett has admitted he blundered over plans dubbed a "snooper's charter" to give a raft of public bodies access to private e-mail and mobile phone records.

The proposals are to be put on hold indefinitely in the face of huge opposition, which the home secretary conceded his department totally failed to predict.

The move - officially said to allow time for a rethink - has been welcomed by opposition parties.

But Lib Dem MP Norman Baker, who has led criticism of the plans, branded it a "humiliating climbdown for the home secretary".

The extension of the powers to seven Whitehall departments, as well as local authorities and other public bodies, will now not be discussed in the Commons before the next session of Parliament, which starts in November.

Blunkett 'values privacy'

Mr Blunkett said there needed to be "calmer and lengthy" public discussion of the issues before new proposals were drawn up.

"We believe we got it wrong and we need to address fears people have.

"If we get this right we can get protection and privacy while tackling organised crime."

He added: "I have no intention that we should be Big Brother.

"These are issues that are too important for us to use our majority - that is why we are seeking agreement before bringing them through."

He said that despite being in public life, he valued his own privacy and understood the sensitivities surrounding this legislation.

"The time has come for a much broader public debate about how we effectively regulate modern communications and strike the balance between the privacy of the individual and the need to ensure our laws and society are upheld," he added.

Mr Blunkett's son Hugh, who works in computers, is understood to have briefed his father on privacy fears associated with the original proposals.

'Illiberal' proposal

The change of heart was welcomed by the Conservative leader in the House of Lords, Lord Strathclyde, who had threatened to use his party's voting strength in the Lords to block the proposals.

"I very much hope the government will now rethink their whole proposal.

"And that they will work with the industry and have uppermost in their minds the right to individual privacy for people using the internet, looking at web sites and using their mobile phones."

Lib Dem Norman Baker said Mr Blunkett deserved credit for admitting his mistake.

But he said serious questions should be asked about how such an "illiberal proposal got so far through the home office".

'Binned in its entirety'

He added: "This is a humiliating climbdown for the home secretary but I suppose he took the view that that was better than a humiliating defeat in the House of Lords, which is what would have happened had he pressed on."

Mr Baker said the proposal should be "binned in its entirety".

"It's difficult to understand how intrusive and privacy-intruding powers should be given to local councils, the Food Standards Agency and bodies like that," he added.

Taken aback

At the moment, the power to examine private phone records is only available to the police, Inland Revenue and Customs and Excise.

The powers - contained in the Regulation of Investigatory Powers Act (RIPA) - were introduced to combat serious crime and terrorism.

But the government wanted to extend access to a wide range of organisations including local councils and bodies such as the Food Standards Agency.

Ministers are reported to have been taken aback by the scale of opposition to the proposals.

Plug pulled

They had been due to be debated for just 90 minutes on Tuesday by a committee of MPs dealing with secondary legislation.

But the plug was pulled on the debate at the last minute following an outcry from MPs.

The government had cited the investigation of benefit fraud rings and pirate radio stations as two examples where the new powers would be used.
*********************
BBC
What your computer says about you


The computer in front of you could provide an insight into the kind of person you are.
Psychologists say that a computer's virtual desktop can tell you as much about a person's personality as their real desktop.


And it is not a matter of analysing the screensaver you use or the picture you have chosen as the computer's wallpaper.

"How people prioritise the order in which certain things come onto their screen tells you about their priorities in their lives," explained Ben Williams, a corporate psychologist.

"You can tell whether they are proactive people who are going to make things happen or reactive people who wait for things to happen and then respond to them," he told the BBC programme Go Digital.

Personal space

Often, one of the first things people do with the computer at work is to try to make it their own.

Psychologists say we are staking out the computer as our personal space, creating a sense of control of our surroundings.

"If you have cute pictures or toys on your computer, that says you spend a lot of time on the computer," explained Mr Williams.

"It says this is my territory, look how exciting or dramatic it is."

Some people tend to stick things onto the equipment to make it theirs, or use mouse mats that make political statements.

"Mouse mats can display a lot about not only your interests but also your value system," said Mr Williams. "People like to display their attitudes and beliefs."

Virtual clues

Your computer provides more than just physical clues to your personality.
Analysing the computer screen, the images you use or simply the way you organise your icons can reveal much about your inner desires and ambitions.


Whereas some people may just have company screensavers, others may use something that reflects their personal interests, so a diving enthusiast may have a fish screensaver.

The appearance of the desktop may also provide powerful insights into how comfortable someone is with technology.

"It could be that the person with not much on their desktop is naive about technology," said Mr Williams.

"The person with a lot of whizz bang stuff is very technology aware. They know how to download these things, install them and store them."

In some cases, a person may just be trying to show off.

"It says look at me, I can afford it, I have the most expensive, the biggest, the longest, the hardest, the sharpest," explained Mr Williams. "It's that sort of macho stuff
**********************
Los Angeles Times
Devices That Move Digital Media Complicate Piracy Clampdown
By JON HEALEY
TIMES STAFF WRITER


June 18 2002

While Hollywood studios try to rein in what consumers can do with digital files, some consumer-electronics companies are speeding ahead with products that make it even easier for people to move movies and music around the home and the Web.

The latest example is a gadget Toshiba Corp. is unveiling today that's an electronic library for photos, songs and movies. The device moves digital media wirelessly across the home and lets people share their audio-video collections over the Internet.

Toshiba's biggest competitors also are working on electronic libraries that can deliver entertainment throughout the home and, in some cases, via the Net. By the end of the year, at least half a dozen companies are expected to have these kinds of products on the market or in final planning stages, said Jeremy Toeman of Mediabolic Inc., a software firm that specializes in home entertainment networks.

These devices illustrate the challenge facing the studios and other copyright holders as they try to clamp down on Internet piracy. The more consumers become accustomed to moving media around their home and the Net, the harder it may be to put an electronic leash on digital movies and music.

"Every June a digitally savvy generation of young people graduates," said analyst Richard Doherty of the Envisioneering Group consulting firm, "and they graduate into higher income and more toys. And those toys are built to share" digital files.

The new breed of device is designed to be the centerpiece of a digital home network that transmits entertainment and information to any room with a computer or TV screen. Relatively few homes have these networks today, but the demand is expected to grow as consumers amass digital collections of music, photos and videos on their computers.

Scott Dinsdale, a top digital strategist for the Motion Picture Assn. of America, said the group's goal is to protect the studios' copyrighted works without restricting the flow of noncommercial material or stopping independent producers from distributing files under their own rules. "We strongly encourage consumer-electronics companies to talk to us about their ideas and their plans."

Many do. For example, Pioneer Electronics USA Inc. modified its as-yet-unreleased electronic library to block access to files through the Internet. The move deters pirates from copying video clips and music, but it also stops owners of the device from accessing their digital photos, home movies and other personal items through the Web.

Toshiba's Wireless Media Center has two hard drives that store and back up digital music, video and photo files from its owner's computers. It uses common wireless or wired techniques to connect to Internet-enabled digital devices in the home, including computers and home automation systems.

The media center also links to the Internet through a high-speed phone or cable modem, making its electronic library available through a private, secure Web site. With the right electronic keys, anyone can connect to a media center through the Web and play the files stored there, effectively turning the device into an online jukebox.

The Web connection also can be used to watch live video feeds from digital cameras connected to the media center, or to control home appliances remotely, said Oscar Koenders, vice president of worldwide product planning for Toshiba Computer Systems Group. Koenders said the digital cameras connected to his media center let relatives in Holland pay virtual visits to his home.

Toeman of Mediabolic said every consumer-electronics company has a different approach to the security issue, with some far more sensitive to Hollywood's concerns than others. The biggest issue, he said, is keeping consumers' personal collections from being pirated through the Internet--a problem exacerbated by the inclusion of Internet-connected computers in home networks.

Mediabolic and other software providers have added several layers of security to their networking technology to help personal networks stay private. "We think those are the kind of mechanisms that are going to have to be put in place to get the copyright holders to start approving these types of systems," Toeman said.

"The problem I have with what they're doing now, they're so scared of the technology, they're not trying things."
*******************
Los Angeles Times
Computerized Legal Assistance Is Getting Its Day in Court
Law: Information kiosks are helping people who can't afford an attorney represent themselves.
By MONTE MORIN


When single father Thurman Williams needed help filling out papers in a custody suit recently, he didn't look to his lawyer for help. He walked to a computerized kiosk at the Orange County Family Court and started tapping the keys.

As part of a legal experiment, lawyer-less litigants across California are using computerized video kiosks to prepare common court filings and seek basic legal advice.

The kiosk used by Williams is part of a statewide effort to cope with a flood of litigants who cannot afford or refuse to hire their own lawyers. Court officials statewide fear the number of self-represented litigants has reached crisis levels and threatens to clog court calendars. Like Williams, more than 6,000 Orange County litigants have initiated court actions on I-CAN! kiosks or accessed the programs on the Internet, using home computers. Similar programs are operating in Sacramento, San Diego and Ventura.

A recent study of the kiosks' first 18 months of operation concluded it is too soon to tell whether the system will relieve pressure on court calendars. But the report, by UC Irvine's School of Social Ecology, said users were overwhelmingly positive about the free legal assistance.

"It's made life a lot easier for me," Williams said. "It's helped keep me from going to the poorhouse."

The 29-year-old Orange resident was directed to a kiosk in the Betty Lou Lamoreaux Justice Center by court staff. After putting on headphones and following the directions of a videotaped instructor, Williams filled out a quarter-inch stack of paternity and custody documents. The exercise took 20 minutes; it would have cost him about $800 if he had relied on a lawyer, he said. "It was a lot easier than I thought."

Whether they simply can't afford a lawyer or just want to save money, more Californians are choosing to go to court without a lawyer.

"I'm just amazed at the numbers," said Commissioner Salvador Sarmiento, who hears between 40 and 90 child-support cases a day in Orange County Family Court. "Eighty percent of the cases I hear involve people representing themselves. These cases can take 50% longer to process than others."

Sarmiento said the kiosks have put many cases on the fast track.

"Most people who appear before me without attorneys are real nervous," Sarmiento said. "They don't know what to expect and they want to tell me everything. It's an opportunity for them to vent. A lot of what they say is irrelevant. When they go to the kiosks though, I get the information that I need so I can rule."

Of the 4.3 million state residents who find themselves in court each year, more than half are pro per, or self-represented litigants. The phenomenon is particularly evident in family courts, where fewer than 16% of all child-support cases involve parents who are both represented by lawyers. Also, 80% of all domestic-violence cases are handled without lawyers.

The State Bar of California has characterized the trend as "the pro per crisis in family law," and the State Judicial Council has established a task force on the matter.

The Legislature has attempted to address the problem by establishing family-law facilitator offices throughout the state to help litigants in child-support matters. In courts such as Orange County's, the offices have offered workshops for litigants. Classes often have two-month waiting lists.

In Van Nuys, officials last year established the Self-Help Legal Access Center, in which people can seek legal help from computers and volunteers.

I-CAN! kiosks are in eight locations, including the Orange County district attorney's office, Irvine City Hall and the Fullerton and San Juan Capistrano libraries. However, the busiest location is in the Family Law Information Center at Family Court in Orange.

At the information center, where open boxes of tissues are displayed as prominently as forms for initiating divorce, custody and child-support proceedings, office assistant Beatrice Contreras said there is often a line of people waiting to use the two machines.

Employing interactive video and touch-screen technology, the kiosks walk users through the bureaucracy of obtaining domestic-violence restraining orders, establishing child custody, responding to child support and eviction orders, initiating small-claims suits and requesting waivers for legal filing fees. The kiosks give instructions in English, Spanish and Vietnamese and offer users video tours of court complexes and primers on what to expect during hearings.

"People like it because they say it's fast and easy and especially because they don't have to pay for an attorney. They really like that," Contreras said.
******************
Federal Computer Week
Workforce bill on tap
BY Christopher J. Dorobek
In what would be the first reform of government personnel policies in decades, Sen. George Voinovich (R-Ohio) plans to introduce legislation to create agency chief human capital officers.


The Federal Workforce Improvement Act of 2002 is a comprehensive reform of civil service laws, Voinovich said during a speech at the National Academy of Public Administration's (NAPA) Performance Conference in College Park, Md., this month.

Workforce issues have affected the government's ability to carry out its duties, Voinovich said. After Sept. 11, "it has become critical to have well-trained federal employees," he said.

Myra Howze Shiplett, director of NAPA's Center for Human Resource Management, said NAPA has been working with the senator and his staff. "We think the legislation is absolutely moving in the right direction," she said.

The proposed legislation would amend the 1993 Government Performance and Results Act to require agencies to make workforce plans part of their overall performance plans, Voinovich said. It would also reform the government's slow hiring processes.

Specific provisions of the bill are still being negotiated. However, one version requires the Office of Personnel Management to create a system to assess how agencies manage workforce issues.

The bill has much better prospects as a result of the Sept. 11 terrorist attacks, because now lawmakers can make a real connection between workforce issues and homeland security, Voinovich said. He introduced similar legislation last fall.

Shiplett said the bill addresses issues that NAPA research has identified as problems. "Due in large part to the influence of technology, the way people work has changed significantly," she said. The government needs to be flexible and people need to be able to shift from one project to another. "The current structure doesn't allow that."
*********************
Federal Computer Week
A-11 revisions stress planning
Management crucial to IT investments
BY Diane Frank


An increased emphasis on enterprise architecture within agencies and across government is a key feature of the revised policy for information technology investment and capital planning that the Office of Management and Budget will release later this month.

The revisions to Circular A-11, the document that defines investment policies and requirements for the federal government, will eventually lead agencies to work out the initial plan for IT and e-government investment governmentwide, officials said.

For the fiscal 2004 budget, OMB did much of the work to match agencies' requests against the federal enterprise architecture (EA) to determine, and potentially eliminate, redundant investments. But for the fiscal 2005 budget cycle, OMB will ask agencies to make those matches and work with other agencies before submitting their requests, said William McVay, a senior policy analyst in OMB's Office of Information and Regulatory Affairs (OIRA).

OMB developed and will maintain the federal EA business reference model as part of the Bush administration's e-government strategy. The A-11 revisions also include further definitions of e-government and e-business systems, which will help agencies get a better grasp on developing mission-based business cases for information systems, McVay said.

"This does not mean that the entire federal government has been doing [investment] all wrong over the last few years. Just the questions are different," he said June 4 at the National Academy of Public Administration's Performance Conference in College Park, Md.

Experts and even OMB officials agree that the most immediate challenge will be at the agency level.

Agencies don't know how to write business cases and work within EAs very well at this point, McVay said. OMB will continue to work with agencies on the details, including endorsing the use of tools such as the Enterprise Architecture Management System being tested by many agencies, he said.

Because of the disparity in capabilities among agencies, not all agencies will be able to work on their own for the fiscal 2005 budget process, said John Spotila, former administrator of OIRA and now president and chief operating officer of GTSI Corp.

"There's not necessarily going to be a transformation on a political timetable," he said.

The changes do not reflect any radically new ideas, however, he added.

"This really continues a very positive approach that extends back at least to the Clinger-Cohen Act...to get agencies to adopt a more systematic, integrated and thoughtful approach to IT capital planning," he said.

The revisions show that OMB is serious about using the budget to put management processes in place in agencies and across government, said Bruce McConnell, a former chief of information policy and technology at OMB and now president of McConnell International LLC, a marketing and consulting firm.

"To the extent you can use the budget to shape programs, this is the way to go," he said.

But the lack of experience at agencies will need to be addressed, and the issue of enforcement will definitely come up when OMB attempts to penalize agencies for not following the EA plan, McConnell said. "Some of these programs, you can't just cut the funding," he said.

Congress definitely will have something to say about any cuts or holds on funding, and OMB officials must enlist the full cooperation of the authorization and appropriations committees on Capitol Hill, Spotila said. Even then, because of the differences in agencies' capital planning capabilities, change will happen in small steps, with some agencies progressing faster than others, he said.

"You have to acknowledge the political reality and try to work the process and accept incremental gains," he said.

***

Why planning matters

The Office of Management and Budget will issue revisions to Circular A-11, which states OMB's policy for information technology and capital planning, later this month. OMB officials say better management of agencies' investments is necessary because:

* Federal IT investment for fiscal 2003 totals $52 billion. Out of that total investment, 1,000 "major" projects worth $18 billion require business cases.

* Some $9 billion in those "major" projects still need their business cases approved for fiscal 2003.

* The remaining $34 billion worth of IT projects do not require a business case and, at this point, have little or no visibility in the investment planning process.
*********************
Federal Computer Week
Air Force studies PC costs
Pilot project compares traditional systems with a centralized, managed architecture
BY Dan Caterinicchia


Information technology managers often ask their workers to solve problems by thinking outside the box, but if an ongoing study at Hill Air Force Base, Utah, is an indication, the better advice might be to remove the box from the desktop.

In an attempt to minimize the interruption of service to computer users at Hill, the base's chief information officer is conducting a total cost of ownership study comparing the traditional distributed desktop environment with a centralized, managed PC architecture from ClearCube Technology Inc.

The base normally gets one-third of its PCs refreshed annually through the Air Force's central purchasing authority, said Capt. Tim Ohrenberger, the base CIO who also serves as the information services flight commander at Hill's Tanner Memorial Medical Clinic. The ClearCube PC environment "just made sense" as an alternative to traditional distributed desktops by requiring fewer man-hours for similar tasks, he said.

The Blade, ClearCube's CPU box, is long and thin and sits in a rack removed from the actual computing environment. That provides physical security for the hard drives and helps reduce downtime because the Blades can be kept near support personnel, said Michael Frost, president and chief executive officer of the Austin, Texas-based company.

ClearCube commissioned IDC to conduct a total cost of ownership study last year comparing the ClearCube technology to Hill's current desktop environment. IDC's report estimated savings of about $1.5 million over a three-year period using Blades, including annual savings of more than $100,000 on trips to the desktop and $330,000 in "technician utilization."

But Ohrenberger said he knew the Air Force central purchasing personnel would need more proof, especially since Blades cost about twice as much upfront as a standard Dell Computer Corp. desktop PC.

"I decided to do a pilot [project] to attempt to validate those numbers, which could show the spectrum of complete failure to [totally] true," he said, adding that funding was then made available and the pilot launched May 24.

Since the IDC study, which concluded in January, Ohrenberger and his staff have been collecting data on upgrades performed in the traditional environment, including cost, man-hours and other variables.

Through the pilot project, those same numbers are being compared to ClearCube's centrally managed system and will continue to be collected through the end of this month (see box).

Despite the fact that the Blades initially cost more than a standard desktop, there are benefits that make them a viable alternative to the traditional environment, including lower IT support costs and faster and easier support for local users, Ohrenberger said.

"The bottom line is a minimal interruption of service to my internal customers, which in turn, relates to them providing a higher level of care" to the patients and visitors seeking medical attention, he said.

Frost acknowledged that Blades initially cost more than a $900 Dell desktop, but he said the same concept that causes people to buy wireless telephones applies: Consumers are willing to pay more as long as the plan includes more minutes at a better price.

The cost of supporting a Dell is three to six times more than the cost of buying it, Frost said. "We're a self-funding product."
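The arithmetic behind that claim can be sketched quickly. The numbers below mix figures from the article (a $900 desktop, Blades at roughly twice the upfront price, support costs at the low end of Frost's three-to-six-times range) with our own assumption that central management halves support costs:

    # Rough total-cost-of-ownership comparison; the 50 percent support
    # saving is an assumption for illustration, not an IDC figure.
    desktop_price = 900
    desktop_support = 3 * desktop_price      # low end of the 3x-6x range
    blade_price = 2 * desktop_price
    blade_support = desktop_support / 2      # assumed central-management saving

    print("Desktop TCO:", desktop_price + desktop_support)  # 3600
    print("Blade TCO:  ", blade_price + blade_support)      # 3150.0

Under those assumptions the Blade comes out ahead over the machine's life despite the higher sticker price, which is the sense in which Frost calls the product "self-funding."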

"You, as the user, can't tell the difference," he said, because the user is still using Microsoft Corp. Windows.

Leslie Fiering, vice president of mobile computing at Gartner Inc., said, "PCs in a managed environment always have a lower total cost of ownership."

"It's a way of forcing consistency in the system image and software load," said Fiering, adding that she was not surprised by the estimated savings and commended the ClearCube technology for its security, easier maintenance and backup.

But there are potential pitfalls associated with the centralized architecture. For notebook users who travel frequently or require flexible software application loads, ClearCube is not the answer, she said.

"The style of computing must match the organizational structure," Fiering said. "In the right environment, the savings can be dramatic."

And that's what Ohrenberger is hoping this total cost of ownership study will prove.

"By the end of June, we should have enough data," Ohrenberger said.

By mid-July, he said he would like to forward his conclusions to the central purchasing authority.

"The end product that I'd want is that you're at your facility and need to purchase 125 PCs and you can decide if you want all desktop, all ClearCube or a mixture," he said.

***

Study basics

The chief information officer of Hill Air Force Base, Utah, is conducting a total cost of ownership study comparing the traditional distributed desktop environment with a centralized, managed PC architecture. The base has about 350 users in five buildings and a staff of about six in the information technology department.

The study is being conducted in the medical group, which is housed in a building separate from the main facility and runs patient care for three clinics: optometry, pediatrics and the flight surgeon's office. There are 44 users in the medical facility, and everyday obstacles include space, security and reliability, said Capt. Tim Ohrenberger, Hill's CIO.

"The environment is totally realistic," he said. "It's not in a classroom or a test bed that's not indicative of the actual environment."

Ohrenberger used $75,000 in funding for equipment purchases and said he was using "my own people and all other resources." Final results of the study are expected in mid-July.
********************
Computerworld
Two security alerts point to Apache Web Server flaws
By TODD R. WEISS


Two security alerts about new vulnerabilities affecting the popular open-source Apache Web Server have been posted by two groups today.
The nonprofit Apache HTTP Server Project group has issued a bulletin about a vulnerability that can allow distributed denial-of-service attacks in Apache 1.3 versions up to and including 1.3.24, and in Apache 2 versions up to and including 2.0.36.


The Apache Project said in the announcement that an Internet Security Systems Inc. (ISS) patch posted earlier in the day for an Apache vulnerability does not fix the denial-of-service problem. A patch for that problem is expected to be ready by tonight on the group's Web site.

In a separate posting, Atlanta-based security vendor ISS reported the discovery of an Apache vulnerability involving a flawed mechanism for calculating the size of "chunked" encoding, affecting Windows 32-bit users. Chunked encoding is part of the HTTP protocol specification used for accepting data from Web users, according to ISS.

When data is sent from the user, the Web server needs to allocate a memory buffer of a certain size to hold the submitted data. When the size of the data being submitted is unknown, the client or Web browser will communicate with the server by creating "chunks" of data of a negotiated size.

But the flaw, affecting Apache Versions 1.x, misinterprets the size of incoming data chunks, which could lead to a signal race or heap overflow and allow the execution of malicious code, according to ISS.
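For readers unfamiliar with the mechanism, here is a minimal sketch in Python of how a chunked message body is reassembled; it illustrates the encoding itself, not Apache's actual C code. Each chunk is preceded by its size as a hexadecimal line, and the reported danger arises when a parser squeezes that declared size into a fixed-width signed integer, as C code can, and so allocates the wrong amount of memory:

    # Minimal illustration of HTTP/1.1 chunked transfer encoding.
    # Each chunk looks like: <size in hex>\r\n<payload>\r\n
    # A chunk of size zero marks the end of the body.
    def parse_chunked(body: bytes) -> bytes:
        data, pos = b"", 0
        while True:
            end = body.index(b"\r\n", pos)
            size = int(body[pos:end], 16)     # hex chunk-size line
            if size == 0:
                return data                   # terminating zero-length chunk
            start = end + 2
            data += body[start:start + size]  # copy the chunk payload
            pos = start + size + 2            # skip payload and trailing \r\n

    example = b"4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n"
    assert parse_chunked(example) == b"Wikipedia"

Python's arbitrary-precision integers sidestep the overflow; the trouble comes when the declared size is stored in a machine word whose sign bit can flip, the misinterpretation ISS describes.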

ISS said it had posted a fix for the problem on its Web site.

Chris Rouland, director of ISS's X-Force research and development group, said the two vulnerabilities are different. The ISS patch will protect Windows servers running Apache from remote compromise attacks, he said, while the denial-of-service problem reported by the Apache Project appears to be a separate issue.

"This is open-source in action," he said. referring to the wide discourse on the exact vulnerabilities being addressed by both groups. "The value is you get a lot of different minds thinking about something and the challenge is that you have to decide what to do."
*********************
Computerworld
IT integration key to U.S. security
By DAN VERTON


The success of the proposed Department of Homeland Security - the biggest reorganization of the federal government in six decades - hinges on IT systems integration, security experts said last week.

The cabinet-level department would combine tens of thousands of federal officials from up to 100 agencies under one command structure. It would focus exclusively on all aspects of homeland security, including terrorism defense, cybersecurity and protection of critical infrastructure.

The mammoth project to integrate a multitude of stovepipe systems - while maintaining the security of compartmentalized information - would be overseen by the U.S. Commerce Department's Critical Infrastructure Assurance Office (CIAO).

According to CIAO Director John Tritak, plans were under way to move his organization under the auspices of the existing White House Office of Homeland Security before the Bush administration unveiled its intention to create a new department. The new plan calls for the CIAO to become part of the Department of Homeland Security's information analysis and infrastructure protection division, Tritak said.

"The most important function of this office will be to design and help implement an interagency information architecture that will support efforts to find, track and respond to terrorist threats within the United States in a way that improves both the time of response and the quality of decisions," Tritak said.

A primary responsibility of the CIAO will be to ascertain information-sharing requirements and "determine the personnel, software, hardware and technical resources needed to implement the architecture," said Tritak.

"We already have most of the technology pieces. The question is, How do you connect those pieces?" said Alan Harbitter, chief technology officer at PEC Solutions Inc., a Fairfax, Va.-based Web services company.

It's unclear whether any systems integration effort would entail creation of the previously proposed GovNet, said Olga Grkavac, an executive vice president at the Information Technology Association of America. GovNet would be a secure government intranet with no gateway to the Internet.

What's clear is that there are massive technology challenges still to be overcome. The consolidation will mean that disparate databases with different data fields running on different operating systems will have to be integrated in a way that provides senior decision-makers in the new department with a common picture of the intelligence flowing in from all parts of the government. The integration also poses significant security challenges, say experts. Sensitive intelligence will have to be compartmentalized and tightly controlled in the new organization of 169,000 workers, while still being viewed as a whole by those responsible for advising the president on threats.

IT a Strategic Weapon

Donald Zimmerman, CEO of Washington-based IT consulting firm Synergy Inc. and a former IT consultant to the U.S. Air Force, said it will be critical for the new agency to emulate the private sector's ability to "leverage IT as a strategic weapon." Zimmerman cited as examples Delta Air Lines Inc.'s use of IT to re-engineer its business processes and the massive scalability and warehousing capabilities of companies such as Yahoo Inc., Nasdaq Stock Market Inc. and Wal-Mart Stores Inc.

Zimmerman recommended the creation of a homeland security portal based on commercial Web standards that would enable all government agencies and private companies that have a stake in critical infrastructure protection to tap into databases and share information. Such a portal should be "role-based," meaning that users would have access only to information required for them to do their jobs, he said.
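A role-based scheme of the kind Zimmerman describes can be sketched in a few lines of Python. The role and resource names below are invented for illustration and do not come from any actual homeland security system:

    # Toy role-based access check: each role maps to the resources it may see.
    ROLE_PERMISSIONS = {
        "port_security": {"vessel_manifests", "watch_lists"},
        "public_health": {"outbreak_reports"},
    }

    def can_access(role: str, resource: str) -> bool:
        # A user sees a resource only if the role is mapped to it.
        return resource in ROLE_PERMISSIONS.get(role, set())

    assert can_access("port_security", "watch_lists")
    assert not can_access("public_health", "watch_lists")

The appeal of the design is that access decisions live in one table rather than being scattered across applications, which is what makes a single portal serving many agencies tractable.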
**********************
Government Computer News
Coast Guard schedules Deepwater contract award for June 25
By Preeti Vasishtha


The Coast Guard next week expects to award a multibillion-dollar contract to one of three vendor teams to overhaul its vessels and aircraft, replacing obsolete computer and sensor technologies.

Through the 30-year Integrated Deepwater System contract, the guard will overhaul or replace 100 aging cutters and 200 aircraft.

The three teams that bid to design, develop and roll out systems for Deepwater are:


* Boeing Co.
* Integrated Coast Guard Systems, a team that includes Lockheed Martin Corp., Northrop Grumman Corp. and Ingalls Shipbuilding of Pascagoula, Miss.
* Science Applications International Corp.


The acquisition, the largest in Coast Guard history, has drawn General Accounting Office criticism for promoting the use of unproven technology.

Congress also has raised concerns and has committed to funding the project for only three years. The Guard estimates that it will need to spend $500 million annually on it for several years beyond that.
*************************
Government Executive
Agencies to unveil e-signature prototype
By Maureen Sirhal, National Journal's Technology Daily


The federal government is embarking on an initiative that could be the linchpin for revolutionizing government services and boosting the use of advanced Internet commerce.

The Office of Management and Budget, the General Services Administration and other agencies this week will be discussing a prototype of an electronic gateway for delivering federal services to businesses, consumers and other government entities utilizing electronic signatures and identification.

Some experts predict that the e-authentication project will serve as a model of how technology can boost the reliance on e-signatures. E-authentication is one of the 24 e-government initiatives approved by the President's Management Council, and experts said that increased government use of the technology also could propel several industries into a new realm of global commerce and boost the deployment of high-speed Internet services.


The Keys To E-Government Security



Officials at OMB and GSA have invited technology companies to join them Tuesday for e-Authentication Industry Day, which is designed to update businesses about the gateway and explore opportunities for government contracts. The portal seeks to enable citizens, businesses and state and local government agencies to obtain the keys for conducting government business over the Internet.


In the offline world, individuals or organizations can flash state-issued driver's licenses or some other government-approved document to verify their identities. But in the online world, a name and an Internet password are not sufficient evidence that the parties in government and business transactions are transmitting authentic information.

That explains the drive toward e-authentication and the new portal, begun in the mid-1990s by several agencies, including GSA, the National Institute of Standards and Technology and OMB.

"It came about as a reaction to trends that we at GSA saw happening in government," said Judith Spencer of GSA and the chairwoman of the Public Key Infrastructure Steering Committee. "Years ago, when we were pursuing the idea of e-government, and realizing ... before Congress mandated a commitment to electronic government services, [we decided] that as people became more comfortable with online [commercial] services, they would expect government to offer online government service."

The Defense Department began developing a system that provided the models for the move to public-key infrastructure (PKI), a third-party verification system. Under PKI, individuals obtain digital certificates, encrypted "keys" that let them send information sealed with the verification of a third party (usually the software companies that issue the keys).

That process ensures the recipient in a transaction of the sender's identity and the authenticity and security of the documents. The Clinton administration was seeking a technical solution to facilitate e-government transactions, Spencer said, and "the best in breed was PKI technology."
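The round trip at the heart of PKI - sign with a private key, verify with the matching public key - can be sketched with the third-party Python package "cryptography" (our illustrative choice; the article names no implementation). In a real deployment the public key would arrive inside a digital certificate vouched for by an issuing authority:

    # Minimal sign/verify sketch. The "cryptography" package and the
    # sample document are illustrative assumptions, not part of any
    # system named in the article.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    document = b"Request to renew a federal permit"   # hypothetical payload
    signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

    # Raises InvalidSignature if the document or signature was altered
    # in transit; returning silently means the check passed.
    public_key.verify(signature, document, padding.PKCS1v15(), hashes.SHA256())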

In 1996, nearly two-dozen agencies formed the PKI panel that Spencer heads, choosing that method over an agency-by-agency piecemeal approach to specific problems. The steering committee helped create the Federal Bridge Certificate Authority, which enables departments and agencies to issue digital certificates, Spencer said. She said the authority has a model policy based on four levels of trust in digital transactions -- everything from rudimentary protection, such as a simple password, all the way to "substantial assurance" that participants are who they say they are.

Under contract with the government, Digital Signature Trust and AT&T also worked with industry partners, such as VeriSign, to develop Access Certificates for Electronic Services, the foundation for current government-wide PKI solutions.


'The Enablers'


"We think of ourselves as the enablers," said Steven Timchak, GSA's e-authentication program manager. "E-authentication is not only about technology. ... What's equally important is that agencies are cooperating and working together to get it done."

Timchak's team, which includes participants from industry and government, is creating risk profiles and assessments for e-government transactions. Transactions that require high security likely will use digital certificates. The Defense Department has gone even further, combining PKI with biometrics-based "smart cards" for its employees.

A subsidiary of the defense contractor Mitre currently is building the prototype for the e-authentication project, which Timchak said is slated to be unveiled by September. GSA will put the contract to develop the final product out for bid within the next year. Part of the "industry day" event is designed to create awareness among technology makers in order to promote the bidding process for the gateway's final implementation.

The prototype will connect to FirstGov, the federal government's Web portal, while it integrates with several of the 24 e-government initiatives.

Congressional mandates to reduce paperwork in federal agencies and laws such as the Health Insurance Portability and Accountability Act that set rules for transferring information electronically are proving to be major catalysts for developing the e-authentication infrastructure. A key deadline for the Government Paperwork Elimination Act, for instance, is October 2003.

"You just can't get there from here unless you are starting now," said Keren Cummins, vice president of government services at Digital Signature Trust.


Bumps In The E-Authentication Road


The implementation of online authentication faces some challenges. One of the major problems with the deployment of the e-gateway, for example, is the consolidation necessary to create one authority for issuing digital certificates to consumers and businesses. "We are asking some agencies to give up their rice bowls, and that is difficult to do," Timchak said.

Another hurdle is that the use of digital authentication in the private sector still lags. The process remains expensive, and it can only work if both parties to a transaction have the technology. Many observers hope a government push will spur private-sector adoption of the technology.

Yet in order to boost the overall acceptance of digital authentication and PKI, the system that GSA and other agencies are creating must be wholly interoperable.

The Federated Electronic Government Coalition, a trade group for security-technology firms, issued a report in May criticizing some federal PKI efforts. The report warned that without interoperability among the e-authentication approaches of federal, state and local governments and the private sector, a "disruptive technological approach" might be the end result.

But industry experts remain hopeful that authentication, and the subsequent services it will enable, will promote the adoption of higher-speed services.

"Governments today have the opportunity to build the rhythms of national life into high-speed networks," the Information Technology Association of America states in its 2001 broadband report. "E-government presents government agencies with unprecedented options for round-the-clock, citizen-centric service delivery, and the American people with the chance to interact with the democratic process and institutions in new and compelling ways."
***********************
Government Executive
Agencies' security spending may rise 12 percent a year
By William New, National Journal's Technology Daily


Presidential initiatives on homeland security are forecast to increase spending in that area from $32 billion to $50 billion, or about 12 percent a year, over the next five years, a private-sector analysis of budget figures shows, and the spending represents "significant" opportunities for the information technology industry.
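As a sanity check on that arithmetic (ours, not GEIA's): $32 billion growing at about 12 percent a year reaches roughly $50 billion after four annual increases, that is, by the fifth year:

    # Compound-growth check of the 12-percent-a-year figure.
    spending = 32.0              # billions of dollars, starting point
    for year in range(4):        # four annual increases across five years
        spending *= 1.12
    print(round(spending, 1))    # prints 50.4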

Opportunities in various agencies are forecast for such tasks as hiring and training security personnel, developing explosion-detection equipment, modernizing distress systems, improving cybersecurity, creating real-time identification systems, and fostering the sharing and interoperability of data, according to the preliminary forecast presented to industry on Friday by the Government Electronics and Information Technology Industry Association.

The analysis examined the funding available for homeland security based on five presidential initiatives. It excludes some homeland defense spending and is subject to change based on efforts in Congress and other factors.

The report analyzed three budget laws or series of laws--the fiscal 2002 appropriations, the first fiscal 2002 emergency spending measure and the second fiscal 2002 emergency spending law enacted after the Sept. 11 terrorist attacks--and President Bush's fiscal 2003 budget request.

The five initiatives it covered were: supporting "first responders" to emergencies, using 21st-century technology to secure the homeland, aviation security, border security and bioterrorism defense. The biggest gainers were border security and first responders, GEIA found.

Six departments or agencies would receive 85 percent of the funding proposed since Sept. 11: the Defense, Health and Human Services, Justice, Transportation and Treasury departments, and the Federal Emergency Management Agency.

GEIA had nearly completed the months-long study when Bush announced a proposal to create a Homeland Security Department. The department would consume the majority of homeland security funds, the group's study found.

"We are pleased to have begun the process of forecasting homeland-security market opportunities for our industry," GEIA President Dan Heinemeier said Monday.

GEIA, which conducts numerous public and private-sector interviews in the development of its report, found that homeland security IT needs include new testing and research centers and new anti-terrorism equipment and technologies. The technologies might include real-time identification of terrorists, integration of collaborating systems, new technologies for inspection, identification and tracking of shipments and materials, and infrastructure protection.

Needed technologies include those for providing digital surveillance, data mining, advanced encryption, "smart cards," sensors, and early-warning and profiling tools.

"By forecasting budgets and business opportunities," Heinemeier said, "our goal is to help industry better prepare to meet emerging government needs and enhance our national ability to response to the homeland security challenge."
************************
Government Executive
U.S. to implement wireless emergency telecom network
By William New, National Journal's Technology Daily


The U.S. government will establish an emergency wireless communications system for the nation's top decision makers by the end of the year, a Bush administration official said last week.

Implementation of the Wireless Priority Services program, an effort of the 22-agency National Communications System (NCS), is being sped up since the Sept. 11 terrorist attacks, according to Brenton Greene, the NCS deputy manager. It is already a pilot program in Washington and New York City and is "fully on track" for full implementation in December, Greene told a conference of the Air Force Communications and Electronics Association on Thursday.

The system gives priority to key decision makers to connect their calls "anytime, anywhere," he said. It will use the Global System for Mobile Communications technology standard and involves VoiceStream Wireless, Cingular Wireless, AT&T and Nextel Communications.

The system would have a minimal impact on consumers because it would never use more than 25 percent of bandwidth, Greene said, adding that the use of handheld computers and the Internet also is being examined.
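The capacity rule Greene describes can be modeled in a few lines. The sketch below is our own toy version of the idea, with channel counts standing in for bandwidth; it is not the actual Wireless Priority Services design:

    # Toy admission rule: a priority call gets the next free channel,
    # but priority traffic may never occupy more than 25 percent of
    # total capacity, so ordinary callers are not crowded out.
    TOTAL_CHANNELS = 100
    PRIORITY_CAP = 0.25 * TOTAL_CHANNELS

    def admit_priority_call(priority_in_use: int, channels_free: int) -> bool:
        return channels_free > 0 and priority_in_use < PRIORITY_CAP

    assert admit_priority_call(priority_in_use=10, channels_free=5)
    assert not admit_priority_call(priority_in_use=25, channels_free=5)  # cap hit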

The wireless system was commissioned and approved in 2000 but received no funding, he said. It builds on the existing landline-based Government Emergency Telecommunications Services, under which a special calling card is used for access to priority switches.

Before Sept. 11, there were 40,000 users of the system, 1,500 of whom used it that day. About 10,000 priority calls were made on and immediately following Sept. 11, with a 95-percent completion rate, Greene said.

Lt. Gen. Harry Raduege, head of the Defense Information Systems Agency, leads NCS, which was created by presidential executive order after communications problems between heads of state during the 1962 Cuban missile crisis heightened the threat of conflict. The system was created to provide a single, unified communications system for the president and other decision makers during emergencies.

In 1984, the emergency system was broadened to include 22 federal departments and agencies. They are represented on the NCS through the Committee for National Security and Emergency Preparedness Communications, which meets at least twice yearly. The preparedness committee provides recommendations through the president's Critical Infrastructure Protection Board, which is headed by Richard Clarke, director of the Office of Cyberspace Security.

Also under the NCS is the Federal Communications Commission's Telecommunications Service Priority Program, which gives priority to companies with highest need for circuits during a disaster. The program was vital to the restoration of Wall Street financial operations in six days after Sept. 11, Greene said.

The Shared Resources High Frequency Radio Program also is under NCS. Known as SHARES, it provides a single, interagency emergency message-handling system by bringing together existing high-frequency radio resources of federal, state and industry organizations when normal communications are unavailable for transmitting national security information.

Other groups that coordinate with the NCS include the president's National Security Telecommunications Advisory Committee and the National Coordinating Center for Telecommunications, a public-private body that works to coordinate emergency preparedness efforts nationally and internationally.
***********************
MSNBC
Why software is so bad ...
... and what's being done to fix it
By Charles C. Mann


June 17 - It's one of the oldest jokes on the Internet, endlessly forwarded from e-mailbox to e-mailbox. A software mogul - usually Bill Gates, but sometimes another - makes a speech. "If the automobile industry had developed like the software industry," the mogul proclaims, "we would all be driving $25 cars that get 1,000 miles to the gallon." To which an automobile executive retorts, "Yeah, and if cars were like software, they would crash twice a day for no reason, and when you called for service, they'd tell you to reinstall the engine."
THE JOKE ENCAPSULATES one of the great puzzles of contemporary technology. In an amazingly short time, software has become critical to almost every aspect of modern life. From bank vaults to city stoplights, from telephone networks to DVD players, from automobile air bags to air traffic control systems, the world around us is regulated by code. Yet much software simply doesn't work reliably: ask anyone who has watched a computer screen flash blue, wiping out hours of effort. All too often, software engineers say, code is bloated, ugly, inefficient and poorly designed; even when programs do function correctly, users find them too hard to understand. Groaning beneath the weight of bricklike manuals, bookstore shelves across the nation testify to the perduring dysfunctionality of software.
"Software's simply terrible today," says Watts S. Humphrey, a fellow of Carnegie Mellon University's Software Engineering Institute who has written several well-known books on software quality. "And it's getting worse all the time." Good software, in Humphrey's view, "is usable, reliable, defect free, cost effective and maintainable. And software now is none of those things. You can't take something out of the box and know it's going to work." Over the years, in the view of Edsger W. Dijkstra, an emeritus computer scientist at the University of Texas at Austin, the average computer user "has been served so poorly that he expects his system to crash all the time, and we witness a massive worldwide distribution of bug-ridden software for which we should be deeply ashamed."
Jim McCarthy is more generous. The founder, with his wife Michele, of a software quality training company in Woodinville, WA, McCarthy believes that "most software products have the necessary features to be worth buying and using and adopting." But, he allows, "only the extreme usefulness of software lets us tolerate its huge deficiencies." McCarthy sometimes begins talks at his school with a PowerPoint presentation. The first slide reads, "Most Software Sucks."


GETTING WORSE, NOT BETTER
It is difficult to overemphasize the uniqueness of software's problems. When automotive engineers discuss the cars on the market, they don't say that vehicles today are no better than they were ten or fifteen years ago. The same is true for aeronautical engineers: nobody claims that Boeing or Airbus makes lousy planes. Nor do electrical engineers complain that chips and circuitry aren't improving. As the engineering historian Henry Petroski suggested in his 1992 book The Evolution of Useful Things, continual refinement is the usual rule in technology. Engineers constantly notice shortcomings in their designs and fix them little by little, a process Petroski wryly described as "form follows failure." As a result, products incrementally improve.
Software, alas, seems different. One would expect a 45-million-line program like Windows XP, Microsoft's newest operating system, to have a few bugs. And software engineering is a newer discipline than mechanical or electrical engineering; the first real programs were created only 50 years ago. But what's surprising (astonishing, in fact) is that many software engineers believe that software quality is not improving. If anything, they say, it's getting worse. It's as if the cars Detroit produced in 2002 were less reliable than those built in 1982.
As software becomes increasingly important, the potential impact of bad code will increase to match, in the view of Peter G. Neumann, a computer scientist at SRI International, a private R&D center in Menlo Park, CA. In the last 15 years alone, software defects have wrecked a European satellite launch, delayed the opening of the hugely expensive Denver airport for a year, destroyed a NASA Mars mission, killed four marines in a helicopter crash, induced a U.S. Navy ship to destroy a civilian airliner, and shut down ambulance systems in London, leading to as many as 30 deaths. And because of our growing dependence on the Net, Neumann says, "We're much worse off than we were five years ago. The risks are worse and the defenses are not as good. We're going backwards, and that's a scary thing."
Some software companies are responding to these criticisms by revamping their procedures; Microsoft, stung by charges that its products are buggy, is publicly leading the way. Yet problems with software quality have endured so long, and seem so intractably embedded in software culture, that some coders are beginning to think the unthinkable. To their own amazement, these people have found themselves wondering if the real problem with software is that not enough lawyers are involved.


'IT'S TOTAL CHAOS'
Microsoft released Windows XP on Oct. 25, 2001. That same day, in what may be a record, the company posted 18 megabytes of patches on its Web site: bug fixes, compatibility updates, and enhancements. Two patches fixed important security holes. Or rather, one of them did; the other patch didn't work. Microsoft advised (and still advises) users to back up critical files before installing the patches. Buyers of the home version of Windows XP, however, discovered that the system provided no way to restore these backup files if things went awry. As Microsoft's online Knowledge Base blandly explained, the special backup floppy disks created by Windows XP Home "do not work with Windows XP Home."
Such slip-ups, critics say, are merely surface lapses: signs that the software's developers were too rushed or too careless to fix obvious defects. The real problems lie in software's basic design, according to R. A. Downes of Radsoft, a software consulting firm. Or rather, its lack of design. Microsoft's popular Visual Studio programming software is an example, to Downes's way of thinking. Simply placing the cursor over the Visual Studio window, Downes has found, invisibly barrages the central processing unit with thousands of unnecessary messages, even though the program is not doing anything. "It's cataclysmic. ... It's total chaos," he complains.
The issue, in the view of Dan Wallach, a computer scientist at Rice University, is not the pointless churning of the processor; after all, he notes, "processing power is cheap." Nor is Microsoft software especially flawed; critics often employ the company's products as examples more because they are familiar than because they are unusually bad. Instead, in Wallach's view, the blooming, buzzing confusion in Visual Studio and so many other programs betrays how the techniques for writing software have failed to keep up with the explosive increase in its complexity.
Programmers write code in languages such as Java, C and C++, which can be read by human beings. Specialized programs known as "compilers" transform this code into the strings of ones and zeroes used by computers. Importantly, compilers refuse to compile code with obvious problems; they spit out error messages instead. Until the 1970s, compilers sat on large mainframes that were often booked days or weeks in advance. Not wanting errors to cause delay, coders, who in the early days tended to be trained as mathematicians or physicists, stayed late in their offices exhaustively checking their work. Writing software was much like writing scientific papers. Rigor, documentation and peer-review vetting were the custom.
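To make the compiler's gatekeeping concrete, here is a minimal sketch in C; the snippet and its names are invented for illustration and come from no product discussed in this article. A compiler that encounters an undeclared name refuses to produce a program at all, which is exactly the safety net the "code and fix" style described below leans on:

    #include <stdio.h>

    int main(void)
    {
        int count = 42;
        /* Uncommenting the next line halts compilation: the compiler
           reports an "undeclared identifier" error and emits nothing. */
        /* printf("%d\n", total); */
        printf("%d\n", count);
        return 0;
    }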


OVERWHELMED BY COMPLEXITY
But as computers became widespread, attitudes changed. Instead of meticulously planning code, programmers stayed up in caffeinated all-night hacking sessions, constantly bouncing results off the compiler. Again and again, the compiler would spit back error messages; the programmers would fix the mistakes one by one until the software compiled properly. "The attitude today is that you can write any sloppy piece of code and the compiler will run diagnostics," says SRI's Neumann. "If it doesn't spit out an error message, it must be done correctly, right?"
As programs grew in size and complexity, however, the limits of this "code and fix" approach became evident. On average, professional coders make 100 to 150 errors in every thousand lines of code they write, according to a multiyear study of 13,000 programs by Humphrey of Carnegie Mellon. Using Humphrey's figures, the business operating system Windows NT 4, with its 16 million lines of code, would thus have been written with about two million mistakes. Most would have been too small to have any effect, but some (many thousands of them) would have caused serious problems.
Naturally, Microsoft exhaustively tested NT 4 before release, but "in almost any phase of tests you'll find less than half the defects," Humphrey says. If Microsoft had gone through four rounds of testing, an expensive and time-consuming procedure, the company would have found at most 15 out of 16 bugs. "That's going to leave you with something like five defects per thousand lines of code," Humphrey says. "Which is very low," but the software would still have as many as 80,000 errors.
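Humphrey's arithmetic is easy to verify. A throwaway C program, using only the figures quoted above (the 125 defects-per-thousand-lines rate is our midpoint of his 100-to-150 range), reproduces both the roughly two million injected defects and the 80,000 survivors:

    #include <stdio.h>

    int main(void)
    {
        double kloc = 16000.0;       /* Windows NT 4: 16 million lines */
        double inject_rate = 125.0;  /* midpoint of 100-150 defects/KLOC */
        double residual_rate = 5.0;  /* Humphrey's post-testing estimate */

        printf("defects written in:    %.0f\n", kloc * inject_rate);
        printf("defects after testing: %.0f\n", kloc * residual_rate);
        return 0;  /* prints 2000000 and 80000 */
    }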
Software engineers know that their code is often riddled with lacunae, and they have long been searching for new technologies to prevent them. To manage increasingly distended projects like Windows, for example, they have developed a variety of techniques, of which perhaps the best known is component-based design. Just as houses are built with standardized two-by-fours and electrical fittings, component-based programs are built out of modular, interchangeable elements: an example is the nearly identical menu bar atop every Windows or Macintosh program. Such standardized components, according to Wallach, are not only good engineering practice, they are "the only way you can make something the size of Microsoft Office work at all." Microsoft, he says, was an early, aggressive promoter of this approach; "it's the single best engineering decision they ever made."
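A minimal sketch of the component idea in C may help; the interface and names below are invented for illustration and are not drawn from Windows or Office. Callers depend only on a fixed interface, so any conforming implementation can be slotted in like a standard two-by-four:

    #include <stdio.h>

    /* The fixed interface every menu-bar component must satisfy. */
    typedef struct {
        const char *name;
        void (*draw)(void);
    } MenuBar;

    /* One interchangeable implementation among many possible ones. */
    static void draw_classic(void) { puts("File  Edit  View  Help"); }

    int main(void)
    {
        MenuBar bar = { "classic", draw_classic };
        bar.draw();  /* the caller never names a specific implementation */
        return 0;
    }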


INADEQUATE PLANNING CITED
Unfortunately, critics say, the components are often glued together with no real central plan, as if contractors tried to erect large structures with no blueprints. Incredibly, Humphrey says, the design for large software projects is sometimes "nothing but a couple bubbles on the back of an envelope." Worse, for marketing reasons companies wire as many features as possible into new software, counteracting the benefits of modular construction. The most widespread example is Windows itself, which, Bill Gates testified in an April session of the Microsoft antitrust trial, simply would not function if customers removed individual components such as browsers, file managers or e-mail programs. "That's an incredible claim," says Neumann. "It means there's no structure or architecture or rhyme or reason in the way they've built those systems, other than to make them as bundled as possible, so that if you remove any part it will all fail."
The inadequate design in the final products, critics argue, reflects inadequate planning in the process of creating them. According to a study by the Standish Group, a consulting firm in West Yarmouth, MA, U.S. commercial software projects are so poorly planned and managed that in 2000 almost a quarter were canceled outright, creating no final product. The canceled projects cost firms $67 billion; overruns on other projects racked up another $21 billion. But because "code and fix" leads to such extensive, costly rounds of testing, even successful projects can be wildly inefficient. Incredibly, software projects often devote 80 percent of their budgets to repairing flaws they themselves produced, a figure that does not include the even more costly process of furnishing product support and developing patches for problems found after release.
"System testing goes on for almost half the process," Humphrey says. And even when "they finally get it to work, there's still no design." In consequence, the software can't be updated or improved with any assurance that the updates or improvements won't introduce major faults. "That's the way software is designed and built everywhere it's that way in spaceships, for God's sake."


IS SOFTWARE A SPECIAL CASE?
The potential risks of bad software were grimly illustrated between 1985 and 1987, when a computer-controlled radiation therapy machine manufactured by the government-backed Atomic Energy of Canada massively overdosed patients in the United States and Canada, killing at least three. In an exhaustive examination, Nancy Leveson, now an MIT computer scientist, assigned much of the blame to the manufacturer's inadequate software-engineering practices. Because the program used to set radiation intensity was not designed or tested carefully, simple typing errors triggered lethal blasts.
Despite this tragic experience, similar machines running software made by Multidata Systems International, of St. Louis, massively overdosed patients in Panama in 2000 and 2001, leading to eight more deaths. A team from the International Atomic Energy Agency attributed the deaths to "the entering of data" in a way programmers had not anticipated. As Leveson notes, simple data-entry errors should not have lethal consequences. So this failure, too, may be due to inadequate software.
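Leveson's observation, that a slipped keystroke should never become a lethal dose, corresponds to a defensive habit any programmer can adopt: validate data where it enters the system. The C sketch below is ours, with an invented dose limit, and is no substitute for the rigorous engineering such machines demand; it simply shows the shape of the check:

    #include <stdio.h>

    #define MAX_DOSE_CGY 500  /* hypothetical per-session ceiling */

    /* Refuse any dose outside the allowed range rather than arming
       the beam with whatever value the operator happened to type. */
    int set_dose(int centigray)
    {
        if (centigray < 1 || centigray > MAX_DOSE_CGY) {
            fprintf(stderr, "dose %d cGy rejected\n", centigray);
            return -1;
        }
        printf("dose set to %d cGy\n", centigray);
        return 0;
    }

    int main(void)
    {
        set_dose(200);    /* accepted */
        set_dose(20000);  /* a typing error is caught, not delivered */
        return 0;
    }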
Programming experts tend to agree that such disasters are distressingly common. Consider the Mars Climate Orbiter and the Polar Lander, both destroyed in 1999 by familiar, readily prevented coding errors. But some argue that software simply cannot be judged, measured and improved in the same way as other engineering products. "It's just a fact that there are things that other engineers can do that we can't do," says Shari Pfleeger, a senior researcher at the Rand think tank in Washington, DC, and author of the 1998 volume Software Engineering: Theory and Practice. If a bridge survives a 500-kilogram weight and a 50,000-kilogram weight, Pfleeger notes, engineers can assume that it will bear all the values between. With software, she says, "I can't make that assumption; I can't interpolate."
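Why interpolation fails for code is easy to demonstrate with a contrived C function, ours and purely illustrative. Both endpoint tests an engineer might run pass, yet a value between them fails, a discontinuity no load-bearing bridge could exhibit:

    #include <stdio.h>

    /* Passes at 500 and at 50000 but fails at one value in between,
       thanks to a hidden special case; exactly the discontinuity that
       makes software test results impossible to interpolate. */
    int load_ok(int kg)
    {
        if (kg == 1024) return 0;  /* lurking defect */
        return kg <= 50000;
    }

    int main(void)
    {
        printf("%d %d %d\n", load_ok(500), load_ok(50000), load_ok(1024));
        return 0;  /* prints "1 1 0" */
    }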
Moreover, software makers labor under extraordinary demands. Ford and General Motors have been manufacturing the same product, a four-wheeled box with an internal-combustion engine, for decades. In consequence, says Charles H. Connell, former principal engineer of Lotus Development (now part of IBM), they have been able to improve their products incrementally. But software companies are constantly asked to create products (Web browsers in the early 1990s, new cell phone interfaces today) unlike anything seen before. "It's like a car manufacturer saying, 'This year we're going to make a rocket ship instead of a car,'" Connell says. "Of course they'll have problems."
"The classic dilemma in software is that people continually want more and more and more stuff," says Nathan Myhrvold, former chief technology officer of Microsoft. Unfortunately, he notes, the constant demand for novelty means that software is always "in the bleeding-edge phase," when products are inherently less reliable. In 1983, he says, Microsoft Word had only 27,000 lines of code. "Trouble is, it didn't do very much" which customers today wouldn't accept. If Microsoft had not kept pumping up Word with new features, the product would no longer exist.
"Users are tremendously non-self-aware," Myhrvold adds. At Microsoft, he says, corporate customers often demanded that the company simultaneously add new features and stop adding new features. "Literally, I've heard it in a single breath, a single sentence. 'We're not sure why we should upgrade to this new release it has all this stuff we don't want and when are you going to put in these three things?' And you say, 'Whaaat?'" Myhrvold's sardonic summary: "Software sucks because users demand it to."


HIGHER STANDARDS
In January, Bill Gates issued a call to Microsoft employees to make "reliable and secure" computing their "highest priority." In what the company billed as one of its most important initiatives in years, Gates demanded that Microsoft "dramatically reduce" the number of defects in its products. A month later, the company took the unprecedented step of suspending all new code writing for almost two months. Instead, it gathered together programmers, a thousand at a time, for mass training sessions on reliability and security. Using huge screens in a giant auditorium, company executives displayed embarrassing snippets of flawed code produced by those in the audience.
Gates's initiative was apparently inspired by the blast of criticism that engulfed Microsoft in July 2001, when a buffer overflow, a long-familiar type of error, in its Internet Information Services Web-server software let the Code Red worm victimize thousands of its corporate clients. (In a buffer overflow, a program receives more data than expected, as if one filled in the space for a zip code with a 50-digit number. In a computer, the extra information will spill into adjacent parts of memory, corrupting or overwriting the data there, unless it is carefully blocked.) Two months later, the Nimda worm exploited other flaws in the software to attack thousands more machines.
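The zip-code analogy maps directly onto a few lines of C. The fragment below is a generic textbook illustration, not code from Internet Information Services; the unsafe call is left commented out, since executing it is undefined behavior:

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char zip[6];  /* room for a 5-digit zip code plus '\0' */
        const char *overlong =
            "12345678901234567890123456789012345678901234567890";

        /* Unsafe: strcpy would copy all 50 digits, spilling past the
           end of zip[] and overwriting whatever memory lies beyond. */
        /* strcpy(zip, overlong); */

        /* Safe: bound the copy to the buffer and terminate it. */
        strncpy(zip, overlong, sizeof zip - 1);
        zip[sizeof zip - 1] = '\0';
        printf("%s\n", zip);  /* prints "12345" */
        return 0;
    }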
Battered by such experiences, software developers are becoming more attentive to quality. Even as Gates was rallying his troops, think tanks like the Kestrel Institute, of Palo Alto, CA, were developing "correct-by-construction" programming tool kits that almost force coders to write reliable programs (see "First Aid for Faulty Code"). At Microsoft itself, according to Amitabh Srivastava, head of the firm's Programmer Productivity Research Center, coders are working with new, "higher-level" languages like C# that don't permit certain errors. And in May, Microsoft cofounded, with NASA and 16 other firms, the $30 million Sustainable Computing Consortium, based at Carnegie Mellon, to promote standardized ways to measure and improve software dependability. Quality control efforts can pay off handsomely: in helping Lockheed Martin revamp the software in its C-130J aircraft, Praxis Critical Systems, of Bath, England, used such methods to cut development costs by 80 percent while producing software that passed stringent Federal Aviation Administration exams with "very few errors."
Critics welcome calls for excellence like those from Kestrel and Microsoft but argue that they will come to naught unless commercial software developers abandon many of their ingrained practices. "The mindset of the industry is to treat quality as secondary," says Cem Kaner, a computer scientist and lawyer at the Florida Institute of Technology. Before releasing products, companies routinely hold "bug deferral meetings" to decide which defects to fix immediately, which to fix later by forcing customers to download patches or buy upgrades, and which to forget about entirely. "Other industries get sued when they ignore known defects," Kaner says. "In software, it's standard practice. That's why you don't buy version 1.0 of a program." Exasperatingly, software vendors deliver buggy, badly designed products with incomprehensible help files and then charge high fees for the inevitable customer service calls. In this way, amazingly, firms profit from poor engineering practices.
When engineers inside a software company choose to ignore a serious flaw, there are usually plenty of reviewers, pundits, hackers and other outsiders who will point it out. This is a good thing; as Petroski wrote in The Evolution of Useful Things, "a technologically savvy and understanding public is the best check on errant design." Unfortunately, companies increasingly try to discourage such public discussion. The fine print in many software licenses forbids publishing benchmark tests. When PC Magazine tried in 1999 to run a head-to-head comparison of Oracle and Microsoft databases, Oracle used the license terms to block it, even though the magazine had gone out of its way to ensure a fair test by asking both firms to help it set up their software. To purchase Network Associates' popular McAfee VirusScan software, customers must promise not to publish reviews without prior consent from Network Associates, a condition so onerous that the State of New York sued the firm in February for creating an "illegal ... restrictive covenant" that "chills free speech." (At press time, no trial date had been set.)


IS LITIGATION THE ANSWER?
Even a few members of the software-is-different school believe that some programming practices must be reformed. "We don't learn from our mistakes," says Rand's Pfleeger.
In 1996, for example, the French Ariane 5 rocket catastrophically failed, exploding just 40 seconds after liftoff on its maiden voyage. Its $500 million satellite payload was a total loss. According to the subsequent committee of inquiry, the accident was due to "systematic software design error": more precisely, a buffer overflow. In most engineering fields, Pfleeger says, such disasters trigger industrywide reforms, as the collapse of the World Trade Center seems likely to do for fireproofing in construction. But in software, "there is no well-defined mechanism for investigating failures and no mechanism for ensuring that people read about them." If the French coders had been drilled, like other engineers, in the history of their own discipline, the Ariane fiasco might have been avoided.
One way or another, some computer scientists predict, software culture will change. To the surprise of many observers, the industry is relatively free of product liability lawsuits. The "I Love You" virus, for instance, spread largely because Microsoft, against the vehement warnings of security experts, designed Outlook to run programs in e-mail attachments easily. According to Computer Economics, a consulting group in Carlsbad, CA, the total cost of this decision was $8.75 billion. "It's amazing that there wasn't a blizzard of lawsuits," Wallach says.
Software firms have been able to avoid product liability litigation partly because software licenses force customers into arbitration, often on unfavorable terms, and partly because such lawsuits would be highly technical, which means that plaintiffs would need to hire costly experts to build their cases. Nonetheless, critics predict, the lawsuits will eventually come. And when the costs of litigation go up enough, companies will be motivated to bulletproof their code. The downside of quality enforcement through class action lawsuits, of course, is that groundless litigation can extort undeserved settlements. But as Wallach says, "it just might be a bad idea whose time has come."
In fact, a growing number of software engineers believe that computers have become so essential to daily life that society will eventually be unwilling to keep giving software firms a free legal pass. "It's either going to be a big product liability suit, or the government will come in and regulate the industry," says Jeffrey Voas, chief scientist of Cigital Labs, a software-testing firm in Dulles, VA. "Something's going to give. It won't be pretty, but once companies have a gun to their head, they'll figure out a way to improve their code."
********************


Lillie Coney
Public Policy Coordinator
U.S. Association for Computing Machinery
Suite 510
2120 L Street, NW
Washington, D.C. 20037
202-478-6124
lillie.coney@xxxxxxx