Clips October 24, 2002

ARTICLES

Ruling doesn't affect Section 508 Web requirements, board says
Congressman under fire for attack on free software
Google excluding controversial sites
Listen.com Cuts Deals With Two Big Labels
ISPs revolt against data retention law
No Easy Money Suing Spammers
EMC wins injunction in copyright case
Major Net backbone attack could be first of many
Vendors lock horns on Web services standards
Objective Force draft reviewed
Agencies' efforts to design 'usable' Web sites
Bush signs $355 billion Defense bill
More Than One Internet Attack Occurred Monday
Interpol police organization finally going digital
At Senators' Web Sites, a 60-Day Vow of Silence
FBI Seeks to Trace Internet Attack
Net attacks: Internet pioneer predicted outages in 2000


***************************
Government Computer News
Ruling doesn't affect Section 508 Web requirements, board says
By Dipka Bhambhani

Government Web sites must still be accessible to the disabled, despite a recent court ruling and the ongoing debate over accessibility for commercial sites, according to the Access Board.

Earlier this month, a U.S. district judge in South Florida said the Americans with Disabilities Act, signed into law in 1990, applied only to physical spaces such as ticket counters, not to cyberspace. The judge ruled against a blind plaintiff who had sued Southwest Airlines, claiming the company's Web site was inaccessible.

"I don't think it [the ruling] has any impact at all on Section 508," said David Capozzi, director of technical and information services for the Access Board.

Section 508, included in the 1998 amendments to the Rehabilitation Act of 1973, makes it clear that federal Web sites are mandated to be accessible, he said. "That shouldn't be open to interpretation."
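
As an aside for Web managers, the most basic mechanical check behind such requirements, flagging images that lack the alternative text screen readers rely on, can be sketched in a few lines of Python. This is only an illustration of the kind of test automated accessibility audits run, not anything the Access Board prescribes, and the sample markup is invented:

# Minimal sketch of one mechanical accessibility check: <img> tags
# that lack the alt text Section 508-style audits look for.
# The sample HTML below is invented for illustration.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            print("img missing alt text:", attr_map.get("src", "?"))

sample = '<img src="seal.gif" alt="agency seal"><img src="chart.gif">'
MissingAltChecker().feed(sample)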

The Justice Department has fielded its own share of complaints against private enterprises that have not made their Web sites accessible, Capozzi said.

Nearly two years ago, a mentally disabled plaintiff filed a complaint with Justice, which eventually filed a brief arguing that public sites, including those on the Internet, are required to adhere to ADA regulations.
**************************
Seattle Post-Intelligencer
Congressman under fire for attack on free software
By D. IAN HOPPER
THE ASSOCIATED PRESS


WASHINGTON -- Rep. Adam Smith, D-Wash., was criticized Wednesday by the chairman of a House technology committee for an attack on the free software movement.

A bipartisan group of lawmakers had written a letter urging White House computer security adviser Richard Clarke to find sales opportunities for government-funded software projects. The letter made no mention of free software, also known as open-source or General Public License (GPL) software.

But when Smith, whose biggest political contributor is Microsoft, began circulating the letter to his fellow Democrats asking for their signatures, he added his own correspondence saying the free software philosophy is "problematic and threaten(s) to undermine innovation and security."

The open-source movement advocates that software, such as the Linux operating system, should be distributed free and open to modification by others rather than treated as copyright-protected, for-profit property.

Smith's attack on open-source drew an angry response from one of the original authors of the letter, Rep. Tom Davis, R-Va., chairman of the Government Reform Subcommittee on Technology and Procurement Policy.

"We had no knowledge about that letter that twisted this position into a debate over the open source GPL issues," said Melissa Wojciak, staff director of the subcommittee. Wojciak added that Davis supports government funding of open-source projects.

Smith spokeswoman Katharine Lister said he has "definitely spoken with (Microsoft) about this issue," but that there wasn't a direct relationship between those discussions and his decision to write his letter to fellow Democrats.

Sixty-seven representatives signed the letter to Clarke; almost two-thirds were Democrats. "I'm going to hope that the people who signed on to the letter did their homework," Lister said.

Microsoft, whose Windows operating system competes with Linux, says open-source hurts a company's right to protect its intellectual property.

Microsoft is Smith's top source of donations. According to the Center for Responsive Politics, Microsoft employees and its political action committee have given $22,900 to Smith's re-election campaign.

The original letter was fashioned by Davis and Jim Turner, D-Texas. They wanted the White House's national cybersecurity plan, which is set to be finished next month, to ensure that companies that develop software using federal funds are free to use the resulting products for commercial gain.

Clarke and his top spokeswoman were traveling Wednesday, and did not return a message seeking comment.
**************************
CNET News.com
Google excluding controversial sites
By Declan McCullagh
October 23, 2002, 8:55 PM PT



Google, the world's most popular search engine, has quietly deleted more than 100 controversial sites from some search result listings.



Absent from Google's French and German listings are Web sites that are anti-Semitic, pro-Nazi or related to white supremacy, according to a new report from Harvard University's Berkman Center. Also banned is Jesus-is-lord.com, a fundamentalist Christian site that is adamantly opposed to abortion.


Google confirmed on Wednesday that the sites had been removed from listings available at Google.fr and Google.de. The removed sites continue to appear in listings on the main Google.com site.

The Harvard report, prepared by law student Ben Edelman and assistant professor Jonathan Zittrain, and scheduled to be released Thursday, is the result of automated testing of Google's massive 2.5 billion-page index and a comparison of the results returned by different foreign-language versions. The duo found 113 excluded sites, most with racial overtones.
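
Edelman's actual test harness is not described here, but the underlying idea, send the same query to two country editions and diff the domains each returns, can be sketched roughly in Python. The query, the regular expression and the use of the third-party requests library are all assumptions for illustration, and Google may refuse scripted queries:

# Loose sketch of the comparison idea only: fetch results pages for
# the same query from two Google country editions and note domains
# listed in one but missing from the other. This is NOT Edelman's
# actual program; query, regex and endpoints are assumptions.
import re
import requests  # third party; pip install requests

def result_domains(base_url, query):
    html = requests.get(base_url, params={"q": query}, timeout=10).text
    return set(re.findall(r"https?://([\w.-]+)/", html))

query = "example topic"  # hypothetical test query
com = result_domains("https://www.google.com/search", query)
de = result_domains("https://www.google.de/search", query)
print("listed on .com but missing from .de:", sorted(com - de))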

"To avoid legal liability, we remove sites from Google.de search results pages that may conflict with German law," said Google spokesman Nate Tyler. He indicated that each delisting came after a specific complaint from a foreign government.

German law considers the publication of Holocaust denials and similar material as an incitement of racial and ethnic hatred, and therefore illegal. In the past, Germany has ordered Internet providers to block access to U.S. Web sites that post revisionist literature.

France has similar laws that allowed a students' antiracism group to successfully sue Yahoo in a Paris court for allowing Third Reich memorabilia and Adolf Hitler's "Mein Kampf" to be sold on the company's auction sites. In November 2001, a U.S. judge ruled that the First Amendment's guarantee of free speech protects Yahoo from liability.

Google's battles
The Harvard report comes as Google is becoming increasingly embroiled in international political disputes over copyright and censorship. China blocked access to Google last month.


Google was criticized in March for bowing to a demand from the Church of Scientology to delete critical sites from its index. In a move that won praise, Google pledged to report future legal threats to the ChillingEffects.org site run by law school clinics.

As Google has become the way more and more people find information on the Internet, it has also become an increasingly visible target for copyright complaints about cached information and allegedly infringing links. ChillingEffects.org's Google section lists 16 requests or legal threats the company has received in the past three months. One Google competitor and critic even suggested that the wildly popular search engine be transformed into a government-controlled "public utility."

Edelman, who created the program that tested URLs against Google's index, said he was investigating a tip about Google's German-language version.

"One concern that I've had for some time vis-a-vis filtering is that filtering is almost always secretive," Edelman said. "In the (library filtering) case, that meant you can't look at the list of blocked sites. In the Chinese government case, you can't see what sites are being blocked."

Edelman, who is a first-year law student, testified as an expert witness for the American Civil Liberties Union (ACLU) in a court challenge to a law requiring libraries to install filtering software if they accept federal funds. He is also a plaintiff in a second lawsuit filed in June to eviscerate key portions of the Digital Millennium Copyright Act.

Google's response
Google refused to reply to a list of questions that CNET News.com sent via e-mail, including which sites have been delisted, how many sites have been delisted, what standards are used, and what other Google-operated sites have less-than-complete listings.


In an e-mail response, Google's Tyler said: "As a matter of company policy we do not provide specific details about why or when we removed any one particular site from our index. We occasionally receive notices from partners, users, government agencies and the like about sites in our index. We carefully consider any credible complaint on a case-by-case basis and take necessary action when needed. This is not pre-emptive--we only react to requests that come to us...to avoid legal liability, we remove sites from Google search results pages that may conflict with local laws."

Tyler said an internal team involving lawyers, management and engineers makes the final decision on what to remove. "At Google we take these types of decisions very seriously," he said. "The objective is to limit legal exposure while continuing to deliver high quality search results that enable our users to find the information they need quickly and easily."

Tyler pointed to Google's terms of service agreement, which says Google will "consider on a case-by-case basis requests" to remove links from its index.

A moving target
Because Google has to keep track of a constantly moving target--new sites arguably illegal under French or German law appear every day--the search engine is encountering the same problems of overinclusiveness that traditional filtering software has experienced.


According to the Harvard report, some sites that Google does not list include 1488.com, a "Chinese legal consultation network"; 14words.com, a discount Web-hosting service; and some conservative, anti-abortion religious sites. Those sites do not appear to violate either German or French laws.

Banned from Google.de and Google.fr listings is Stormfront.org, one of the Internet's most popular "white pride" sites. Stormfront features discussion areas, a library of white nationalist articles and essays by David Duke, a former Ku Klux Klan leader.

"We've been dealing with this for quite a few years," said Don Black, who runs the site. "The German police agencies seem obsessed with Stormfront even though we're not focused on any German language material."

Black, who learned a few months ago that Google.de delisted Stormfront, says he doesn't hold it against the Mountain View, Calif.-based company. "Google is trying to conform to their outrageous laws," Black said. "So there's really nothing we can do about it. It's really a French and German issue rather than a Google issue."

The First Amendment
Because Google is a company and not a government agency, it has the right in general to delete listings from its service or alter the way they appear. (On Tuesday, however, CNET News.com reported that an Oklahoma advertising company has sued Google over its position in search results.)


"Google may not only have the legal right to (delete listings), they may have the legal obligation to do it," said Barry Steinhardt, director of the ACLU's technology and liberty program, and a co-founder of the Global Internet Liberty Campaign.

"Over the long term, this will become a significant issue on the Net," Steinhardt said. "There's a wide variety of laws around the world prohibiting different forms of speech. You can imagine what the Chinese government prohibits versus what the French government prohibits versus what the U.S. government prohibits."

Edelman, of Harvard's Berkman Center, suggests that Google find a way to alert users that information is missing from their search results.

"If Google is prohibited from linking to Stormfront, they could include a listing but no link," Edelman said. "And if they can't even include a listing for Stormfront, they could at least report the fact that they've hidden results from the user. The core idea here is that there's no need to be secretive."
*****************************
Los Angeles Times
Listen.com Cuts Deals With Two Big Labels
Universal and Warner will allow users of the firm's online jukebox to make permanent copies of thousands of songs.
By Jon Healey
October 24 2002


Internet music distributor Listen.com of San Francisco said it reached deals with two major record companies that would let subscribers make permanent copies of some of the songs on Listen's online jukebox.

The agreements with Vivendi Universal's Universal Music Group and AOL Time Warner's Warner Music Group add an important new dimension to Listen's service: For an additional fee, subscribers will be able to listen to some major-label songs when they are not connected to the Internet. Starting today, subscribers can record tens of thousands of the labels' songs onto CDs at 99 cents apiece.

"It's hugely important," said analyst P.J. McNealy of GartnerG2, a technology research and consulting firm. "Portability and depth and breadth of the catalog are going to be the two most important things for music services. And getting [CD recording] into the service gets them partly down that path."

Listen's Rhapsody, an online jukebox and radio service that lets subscribers hear songs on demand, is the only authorized subscription service that includes songs from all five major labels. That distinction is expected to vanish soon, however, as two ventures owned by the record labels -- Pressplay and MusicNet -- fill the holes in their catalogs.

The labels have been slow to grant licenses that permit CD burning, in part because they would prefer downloaded music files to remain in copy-protected formats. Standard CDs have no such protection.

But in a recent survey of more than 2,000 adults and teenagers who use the Internet, McNealy said, GartnerG2 found that "if consumers perceive that things are being restricted, then they have a problem with it."
***************************
ZDNet [UK]
ISPs revolt against data retention law
18:08 Wednesday 23rd October 2002
Matt Loney


The government wants ISPs to intercept and retain all Internet traffic, but refuses to answer industry concerns over the process. ISPs say they may not be able to comply.

UK ISPs are poised to ignore a Home Office voluntary code of practice addressing retention of Internet data unless big changes are made to the wording.


The code of practice lays out the obligations of ISPs under the Anti-Terrorism, Crime and Security Act, which was rushed through parliament in the wake of the 11 September terrorist attacks. It obliges ISPs to retain communications data for law enforcement purposes, but, a year after the first draft was released, the Home Office has failed to explain how ISPs will be reimbursed for retaining the data, or how they can comply with the code without breaking numerous other laws.

The proposals have already been knocked back by European data protection commissioners.

In a letter to the ISP Association's members, ISPA general secretary Nicholas Lansman said he could not "recommend to members that they voluntarily comply with the proposed code of practice." According to the letter, which was seen and first reported by The Guardian, the industry had not been convinced that extending the length of time companies hold on to customer logs was necessary for the fight against terrorism and serious crime.

An ISPA spokesman confirmed the contents of the letter to ZDNet UK, but played down the significance, saying that it merely restates a position that ISPA has held for "many months". Furthermore, he said, if the ISP community refuses to abide by the voluntary code of practice then it will be forced upon them. "The government has said that if after a year the voluntary code of practice is seen, in the eyes of the home secretary, not to have worked then they will make it mandatory and so communications service providers will not have any option. It will be law," said the spokesman.

Nevertheless, ISPs say they still have many concerns about the code of practice, not least of which is the worry that by complying with it, ISPs may be forced to break other laws. "The [Anti-Terrorism, Crime and Security] Act and code of practice have to be reconciled with the Regulation of Investigatory Powers Act (RIPA), the Data Protection Act, the Human Rights Act, and the Police and Criminal Intelligence Act. ISPs need to know their legal position," said the spokesman. It's a concern that the ISP community and others have been voicing to the Home Office for more than a year now -- with no response.

ISPA is not alone in voicing such concerns. Shortly after the first draft was published last year, a joint parliamentary committee warned it was likely to break European human rights legislation. The House of Lords and House of Commons Joint Committee on Human Rights said the code appeared to be incompatible with the European Convention on Human Rights (ECHR), and said safeguards are needed to prevent the government from compiling a stockpile of communications data on innocent citizens.

"We consider that measures should be put in place to ensure that the Code of Practice and any directions are compatible with the right to respect for private and family life, home and correspondence under Article 8 of the ECHR, and that those measures should be specified, so far as practicable, on the face of the legislation," the Committee concluded.

Even if the legal issues are sorted out, the costs of implementing the measures are likely to be high. "There is the initial set up," said the ISPA spokesman. "Then there are costs associated with management processes, storage and storage management, human resources, and then ISPs will also have to deal with requests from the data subjects themselves. There are so many problems that need to be resolved."

The spokesman said that ISPs are keen to work with law enforcement, but the Home Office must reply to the concerns of industry. "We need to know what the terms of the cost recovery process will be. We want to help, but law enforcement is not our job so we shouldn't have to pay for it."
******************************
Wired News
No Easy Money Suing Spammers


When Ken Pugh sued the Elizabeth Dole for Senate campaign last month for sending him spam, it wasn't money that motivated him.

Even if he wins, according to the North Carolina statute he's suing under, Pugh stands to net a whopping $80. That's $10 for each of eight e-mails he received.

No, it's the principle of the thing, says Pugh, a computer consultant from Durham, North Carolina, who is claiming that the unwanted e-mails constituted an illegal computer trespass.

"I would be happy if it was $1, because even at $80 I'm not exactly making up for my time," said Pugh, who filed the suit in small claims court in Rowan County, North Carolina. "What I'd like to do is send a message to someone who may be [the] next senator."

He hopes the suit will alert federal legislators, particularly in his home state, to the need for more effective laws against spam. Pugh himself favors a federal anti-spam law that would include provisions for a national "do-not-send list" of e-mail addresses that are off limits to bulk senders.

Pugh is not alone in calling for tighter regulation of spam. With the scourge of unwanted e-mail on the rise, some unlikely candidates are joining the crusade.

This week, even the Direct Marketing Association (DMA), a group that represents e-mail marketers, came out in support of federal anti-spam legislation. The DMA says that scurrilous spammers are ruining the e-mail marketing business for legitimate operators.

For the time being, however, most anti-spam activists have contented themselves with fighting their battles in state courts. Although a few federal legislators have proposed bills to rein in spam, there is no federal statute on the books specifically addressing unsolicited e-mail. Meanwhile, 26 states have anti-spam laws in effect.

A number of spam haters are finding limited success fighting unscrupulous e-mail marketers in small claims court. Their focus is more on creating havoc for spammers than on winning large judgments.

"I wouldn't recommend this as a way to make money, but if you're interested in learning about the small claims court system, you could try it," said Bennett Haselton, a Washington resident who has filed at least 30 cases against spammers under a 4-year-old state anti-spam law.

Haselton said he's won 10 cases and collected about $2,000 from defendants. It's not a lot of money considering the effort required to pursue the cases, he said.

Nonetheless, Washington is the most popular venue for anti-spam filings in small claims court, says Bruce Miller, another resident of the state who has filed multiple cases against bulk e-mailers.

Miller attributes this popularity in part to the fact that Washington was one of the first states to have an anti-spam law, and along with California, it provides for some of the strictest penalties. The Washington law allows spam victims to recoup up to $500 for messages that are sent without permission or that contain false or misleading information.
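
The statutory arithmetic, run against the figures reported in this story, shows why such suits are mostly symbolic; a toy Python comparison:

# Statutory-damages arithmetic using the figures reported above:
# North Carolina's $10 per message versus Washington's $500 maximum.
PER_MESSAGE = {"North Carolina": 10, "Washington (max)": 500}
messages = 8                       # the e-mails at issue in Pugh's suit

for state, rate in PER_MESSAGE.items():
    print(f"{state}: {messages} messages -> ${messages * rate:,}")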

But not all judges have been receptive to spam cases. Both Miller and Haselton have had some claims against out-of-state spammers tossed out by judges who said that small claims court was not the proper venue.

In other cases, rewards have been sizeable. This month, a Seattle court ordered an Oregon spammer, Jason Heckel, to pay $98,000 in penalties and legal costs for deluging Washington residents with unwanted messages.

Ben Livingston, another Seattle-area anti-spam activist and author of a website on "Zen and the art of small claims," is pursuing a $150,000 claim against a spammer who sells fake marijuana and smoking products.

Such outsized claims may inspire more activists to haul spammers into court, though Miller believes it's sheer outrage among spam recipients that will truly spur change.

"People need to understand that spam is theft," he said. "When enough people get outraged enough about this kind of theft, then we'll see more movement."
*****************************
New York Times
The Web on the Campaign Trail
By JEFFREY SELINGO


In his closing statement at the end of the first presidential debate of the 1996 campaign, Senator Bob Dole included a plug for his Web site. Hundreds of thousands of Internet users flocked to the site in the hour after the debate ended, overloading the server to such an extent that the site virtually shut down.

For many political analysts, that moment signaled the beginning of the Internet age in politics, much as the Kennedy-Nixon debates forever merged politics and television. In the 2000 presidential campaign, Senator John McCain used the Web to organize his insurgent campaign for the Republican presidential nomination, raising some $2.6 million online in the week after the New Hampshire primary. Since then, the Internet has become almost a required tool for candidates for federal and statewide offices around the country.

This year, even candidates much farther down the ballot, like those running for the school board, city council or county sheriff, are turning to the Web to post their campaign platforms, solicit donations and find all-important volunteers for their grass-roots campaigns. The local candidates say the sites attract a surprising number of visitors, generate e-mail exchanges with voters and even collect a few dollars in donations.

"There's no easier or cheaper way to get your name out there," said Reneé Valletutti (www.reneeforschools.org), who is running for the school board in Brevard County, Fla.

Candidates say that the Web enhances old-fashioned door-to-door campaigns by reaching potential voters when it is convenient for them and in general reaching them more often, since many people surf the Web at work. Most local candidates post biographies, press releases, policy statements and photographs, while a few add bells and whistles like audio clips of speeches.

Perhaps most important for the candidates, the Internet allows them to disseminate information that is unfiltered by the media or other sources.

"Everything I release to the media is subject to editing or is condensed," said Dirk Anderson (www.dirk2002.com), a Republican candidate for sheriff in Lewis and Clark County in Montana. "I can't get anything in the paper the way I want unless I buy an ad. But I can put anything I want on the Web."

But that doesn't necessarily mean many people will see it. In an effort to help voters find their sites, most local candidates plaster their Web address on yard signs, brochures and everything they send out in the mail. Ms. Valletutti puts hers on bumper stickers. Ron Pritchard, a Republican running for county commissioner in Brevard County, features www.ronpritchard.com more prominently than his phone number on most of his campaign handouts.

Not everyone is convinced that using a computer is the best way to reach voters just around the block. With Election Day less than two weeks away, many local candidates have ignored the Web altogether, while others have given up their sites for lack of voter interest.

Political consultants say that the sometimes primitive Web sites of local campaigns, often designed by friends or the candidates themselves, may be little more than online brochures that are difficult to find.

"At the local level, people expect to see you face to face," said Michael Cornfield, research director at the Institute for Politics, Democracy and the Internet at George Washington University. "Constructing and maintaining a good campaign Web site takes time, and many local candidates think that the time is better spent on other activities."

Still, low cost draws local candidates to the Internet, since mounting a political campaign, even for a small-time office, is expensive. Ms. Valletutti said she had spent about $4,500 so far on yard signs and $500 on an advertisement in a local newspaper aimed at parents. Her Web site, meanwhile, costs her only $20 a month and provides information about her background and her opinions on issues like teachers' salaries, labor negotiations and class size.

"It's much better than direct mail," Ms. Valletutti said. "With direct mail, you get a 6-by-9 postcard and a few bullet points that most people just toss aside. On the Web, you can go much more in-depth."

The low cost of putting up a Web site is particularly appealing to third-party candidates, who often receive little media attention and operate on shoestring budgets. The national Green Party, for instance, will host a site for any of its candidates for a $5 monthly fee. Unlike printing a brochure, a Web site "doesn't cost you more money every time you want to add something to it," said Dean Myerson, the Green Party's national political coordinator.

Whether campaign information on the Web is reaching voters remains unclear. For one thing, only 61 percent of American adults use the Internet, according to the Pew Internet and American Life Project. What's more, Internet users tend to be younger, and young people are less likely to vote than older Americans.

Voters who use the Web for political information typically visit sites that favor their views, said Lee Rainie, director of the Pew project. "The Web is more about accessing information than it's an opportunity to change people's minds," he said.

Even so, some local candidates say their campaign sites are attracting more visitors than they expected, although few keep official statistics. Ms. Valletutti said that the site for her school-board campaign had logged more than 2,600 visitors since it went up in July. (There are 58,000 registered voters in her district.)

Most candidates judge their site's popularity by e-mail messages they receive at an address posted on the Web. Don Larson (www.shopseaside.com/donlarson), who is running for mayor in Seaside, Ore., said he received several e-mail messages a week from voters seeking his views on issues facing the town, which has 5,000 residents. "I get better questions in e-mails than I do in person," said Mr. Larson, who carries pages printed from his Web site for voters without Internet access. "The questions are coherent, thoughtful, and clearly show people are concerned."

Few local candidates, it seems, have mastered what political consultants say are the two biggest advantages of the Internet: the ability to collect personal information on visitors and to take in contributions. Many local candidates resist saving e-mail addresses for fear that messages sent en masse to voters could be perceived as spam and turn the voters off to the campaign. And while some campaign sites have the capability to accept donations online, the money is barely trickling in so far.

Mr. Pritchard, the county commissioner candidate in Florida, said that of the $55,000 he had raised for his campaign, only a couple of hundred dollars was collected online. "I don't get contributions out of the blue from the site," Mr. Pritchard said.

One reason could be that unlike yard signs or mailed brochures, campaign Web sites must be actively sought out by voters. The sites can be difficult to find if voters are unsure exactly who is running.

The Internet is also riddled with old campaign sites. For instance, a Google search for "Brevard County commissioner candidate" will turn up the site of John J. Anderson, whom Mr. Pritchard defeated in a primary last month. When a resident of Waltham, Mass., plugs the words "Waltham mayor candidate" into Google, it returns, among other things, a Web site for Russ Malone, who lost the election in 1999.

"There are people in the political business who say that the lowliest yard sign is better than a fancy Web site because at least a yard sign is going to be seen inadvertently by people passing by," said Mr. Cornfield, the researcher at the Institute for Politics, Democracy and the Internet at George Washington University. "It's very hard to see a message online inadvertently."

Even if voters locate a campaign site, they may be disappointed with what they find there, political consultants say. Many sites for local candidates are antiquated by today's Web standards. Often they are static pages that lack search functions, have nonworking links and are infrequently updated. "They are basically brochures in the sky," said Phil Nash, the chief executive of campaignadvantage.com, a political consulting company based in Bethesda, Md.

Voters are not the only ones disappointed by Web sites. Some candidates are frustrated by the response to their sites. Mitch Goldstone (www.goldstoneforcouncil.org), a candidate for city council in Irvine, Calif., said he was excited about showcasing his talent for Web design when he put up his site in June. "But I quickly realized that voters don't really care about that," said Mr. Goldstone, who has given up work on the site, although it is still active.

"Voters want to meet the candidate in person, they want to know what the candidate is about," he said. "There is a big difference between going on the Internet and opening up your front door to people."
****************************
CNET News.com
EMC wins injunction in copyright case
By Dawn Kawamoto
October 23, 2002, 3:49 PM PT


Storage maker EMC announced Wednesday it received a court injunction against a support services company that allegedly used its copyrighted software without authorization.
A U.S. District Court in North Carolina granted EMC the injunction against Triangle Technology Services, which provides EMC-related support services and is a seller of used EMC equipment.


Under the permanent injunction, Triangle Technology is prohibited from using certain EMC copyrighted software and trade secrets relating to EMC's service business. The storage maker alleged Triangle had been using its maintenance software, training materials, engineering documents and other parts of its intellectual property without authorization.


"The violations that occurred represent only a small portion of our business, but the court's decision sends a clear message to others that the $3 billion we spent on research and development is protected," EMC spokesman Greg Eden said.


EMC became aware of the alleged violations through conversations with customers who had begun to use Triangle's service as they came off their EMC service agreements, he said.

Triangle Technology declined to immediately comment.

The court ruling offers some good news for EMC, which has seen its finances battered by slow sales in a weak IT spending environment. Earlier this month, EMC reported a 9 percent sequential drop in third-quarter revenue to $1.26 billion.
******************************
IDG Reports
Major Net backbone attack could be first of many


The distributed denial of service (DDOS) attack launched Monday against all 13 of the Internet domain name system (DNS) root servers failed to bring down the Internet. But experts say that more attacks could follow and succeed where this week's attempt failed, and some believe the federal government needs to step in to secure the Net's infrastructure.

Monday's attack was targeted at 13 key servers that translate easy-to-remember domain names into the numeric IP (Internet Protocol) addresses used by computers to communicate. [See "Net backbone withstands major attack," Oct. 22.]
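
That translation is the ordinary name lookup every networked program performs. A minimal Python sketch of the step, with a placeholder host name:

# Minimal sketch of the name-to-address translation the root servers
# ultimately anchor: resolve a host name to an IP address.
# Standard library only; the host name is a placeholder.
import socket

hostname = "www.example.com"
print(hostname, "->", socket.gethostbyname(hostname))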

Attackers flooded the DNS servers with ICMP (Internet Control Message Protocol) traffic at more than 10 times the normal rate, according to Brian O'Shaughnessy, a spokesman at VeriSign, which manages the "A" and "J" root servers.
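
A crude version of the monitoring behind that figure is a simple threshold against a measured baseline. The sketch below flags any interval more than 10 times an assumed normal rate; all numbers are invented for illustration:

# Toy rate check: flag intervals where ICMP packet counts exceed ten
# times an assumed baseline. All numbers here are invented; a real
# monitor would take its counts from live packet capture.
BASELINE_PPS = 200                 # assumed normal ICMP packets/second
THRESHOLD = 10 * BASELINE_PPS

observed = [180, 220, 2500, 4100, 190]   # hypothetical measurements
for second, count in enumerate(observed):
    if count > THRESHOLD:
        print(f"second {second}: {count} pps -- possible ICMP flood")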

Such events are nothing new, with high-profile attacks in past years against Internet service providers and companies such as Microsoft Corp. and eBay Inc. But experts say that Monday's incident opens a new chapter in the history of Internet-based attacks.

"Monday's attack was an example of people not targeting enterprises, but going against the Internet itself by attacking the architecture and protocols on which the Internet was built," said Ted Julian, chief strategist at Arbor Networks Inc. of Lexington, Massachusetts.

Factors contributing to such attacks are well known, according to experts. Worms such as Code Red, Nimda and Slapper have left hundreds -- if not thousands -- of compromised computers on the Internet, Julian said. Such systems can be used as "zombies" in a DDOS attack. Zombies are machines controlled remotely and used to launch an attack.

Reports from Matrix NetSystems Inc. Tuesday traced the attacks to Internet hosting service providers in the U.S. and Europe.

Gerry Brady, chief technology officer for Guardent Inc., said that sophisticated software programs make leveraging those compromised machines a simple matter, even for novice attackers.

"With automated attack tools, even inexperienced people can get control of a large number of hosts. The IP addresses and access passwords for those systems are traded on the Internet like you or I used to trade baseball cards," Brady said.

While the Federal Bureau of Investigation's National Infrastructure Protection Center (NIPC) is investigating the attacks, Brady pointed out that some of the most frequent sources of such attacks are teenagers, not terrorists.

"The big drivers we're seeing (in DDOS attacks) are juvenile rivalries -- revenge for incidents that might have happened during online gaming. These attacks are not professional or financial in nature. They're random and non-directed," Brady said.

Fortunately, Monday's attacks were not sophisticated, relying on a simple "packet flood" approach in which information packets are sent in high volumes to a server, and using a protocol -- ICMP -- that is typically not seen in very high volumes, Brady and Julian said.

Future attacks could be much more sophisticated, they said.

Instead of sending a flood of packets all using the same protocol, attackers might disguise a DDOS attack as normal traffic -- what Julian referred to as a "bandwidth anomaly." In such an attack, nothing about the protocols used or packets sent would appear unusual, but the volume of traffic would be enough to overwhelm the targeted server.

Even more pernicious, Brady and Julian agreed, would be attacks that target the routing infrastructure, as opposed to the DNS infrastructure of the Internet. That infrastructure of roadways over which Internet traffic passes is more "brittle" than the flexible architecture of DNS, Brady said.

"When one backbone goes down, the traffic has to go somewhere," said Brady, recalling that the recent outage on the UUNet Internet backbone operated by WorldCom Inc. was felt instantly worldwide. [See "UUNet backbone problems slow down the Net" Oct. 3.]

More federal management of key components of the Internet infrastructure is needed, Julian and Brady agreed. That could include tax incentives or direct federal funding for private companies and public organizations managing key DNS servers to secure their systems, all of which are currently operated as a free service by companies, government entities and non-profit organizations.

"This showcases a specific vulnerability that requires the government to get involved," Julian said. "If you run a DNS server what is your monetary incentive to secure it? There is none. This is the number one area of focus that the government should have."

As for the backbone providers, Brady said that because of the dire financial condition of most companies that manage the Internet backbone, there is little private money available to ensure the extra capacity should one or more parts of the backbone be attacked. Federal investment could help create and secure a more robust infrastructure.

"If this were voice communications (that were attacked) can you imagine (U.S. Secretary of Defense Donald) Rumsfeld's reaction?" Brady said. "That would be a national security issue. We must acknowledge that this is critical infrastructure and we have to find remediation."

"This is rich territory for Mr. Clarke and his people," said Julian, referring to Richard Clarke, President Bush's special adviser for cyberspace security.

In the meantime, Brady said that the pattern of past DDOS attacks makes more of them likely in the near future.

"I would be worried that we're in a short-term countdown to more infrastructure attacks because they're just so easy to do," Brady said.
**************************
Computerworld
Vendors lock horns on Web services standards
By THOMAS HOFFMAN
OCTOBER 23, 2002


NEW YORK -- The good news for IT managers considering Web services deployments is that vendors such as Microsoft Corp., IBM and Oracle Corp. appear to be committed to creating standards aimed at facilitating interoperability between disparate technology platforms.
But as with most, if not all, standards-building efforts, politics could get in the way and prevent other needed vendors, such as Sun Microsystems Inc., from helping to draft the open-systems specifications Web services require to succeed.


One of the more significant standards groups to emerge is the Web Services Interoperability Organization (WS-I), which counts Oracle, IBM and Microsoft among its 100-plus members. Absent from the group, however, is Sun, which would consider taking part in drafting security, transaction and other Web services specifications "if we were to be offered board-level seats with equal standing" among other big-name vendors, said John Bobowicz, chief technical strategist for Sun ONE.

Bobowicz spoke about the standards impasse at a panel discussion for a Web services conference sponsored by the Software & Information Industry Association here yesterday.

One of the panelists, Wim Geurden, a senior technology strategist for Microsoft's banking unit, suggested that one way to ensure industrywide compliance with standards is for customers "to demand [interoperability between vendor platforms] and put it into their contracts."

Conference attendee Bob Zurek, a vice president of advanced technology at Ascential Software Corp. in Westboro, Mass., complained in an open forum that another standards effort in progress, under the World Wide Web Consortium (W3C), is being crafted by big vendors "that can afford to pay $150,000 for a three-year commitment and populate" W3C subcommittees with personnel to push specifications that benefit their respective product sales.

Panelist Ted Shelton, a senior vice president of business development and chief strategy officer at Borland Software Corp. in Scotts Valley, Calif., acknowledged that his company "tries to participate in standards committees to help the industry and to shape standards that will be beneficial to Borland."

Other panelists, including Bobowicz, argued that Internet protocols such as Simple Object Access Protocol, WS Security, and Universal Description, Discovery and Integration should remain royalty-free, regardless of what course the standards efforts take. "You shouldn't have to pay royalties on any of these standards or protocols -- they've always been free," said Bobowicz.
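
For readers who have not seen the wire format at issue, a SOAP message is just namespaced XML, usually carried over HTTP. The Python sketch below builds a minimal SOAP 1.1 envelope; the service namespace and the GetQuote operation are hypothetical, and only the envelope structure is standard:

# Minimal SOAP 1.1 envelope builder -- the wire format the
# interoperability debate centers on. The namespace and operation
# are hypothetical; only the envelope structure is standard.
from xml.sax.saxutils import escape

def soap_envelope(operation, namespace, **params):
    body = "".join(f"<{k}>{escape(str(v))}</{k}>" for k, v in params.items())
    return (
        '<?xml version="1.0"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        f'<soap:Body><m:{operation} xmlns:m="{namespace}">{body}</m:{operation}>'
        '</soap:Body></soap:Envelope>'
    )

print(soap_envelope("GetQuote", "urn:example:quotes", symbol="EMC"))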

Despite some of the rancor among competing vendors in this space, there are hopeful signs for certain standards. WS-I conducted tests in August, for example, that demonstrated Web services interoperability between IBM's WebSphere and Microsoft's .Net platforms, using an emerging standard it developed called WS 1, according to Mark Colan, lead e-business technology evangelist at IBM.

WS-I is also planning to test the interoperability of different vendors' software tools by the end of the year, said John Magee, vice president for Oracle9i at Oracle.

"Integration is one of the biggest challenges facing IT organizations," especially as companies struggle to pull together data housed in different formats from "all over the place," Magee said.

Borland's Shelton cited research data from Gartner Inc. that claims that "more than 90% of companies are going to have problems integrating .Net and Java. You're going to have to make these two technologies work together, and Web services is one way of doing that.

"That's going to be one of the big breaking points for us as an industry," Shelton added. "I'm optimistic, but it's an enormous challenge."
***************************
Federal Computer Week
Objective Force draft reviewed
BY Dan Caterinicchia
Oct. 23, 2002


The Army's staff components and major commands are reviewing the first draft of a white paper released last week by the Objective Force Task Force that attempts to define what the service will look like in 2015.

The white paper, titled "The Objective Force in 2015," looks at all aspects of the service's ongoing transformation and how it will affect personnel; intelligence; command, control, communications, computers, intelligence, surveillance and reconnaissance (C4ISR); space; sustainment; training; and other staff areas, said Lt. Gen. John Riggs, director of the Objective Force Task Force.

The Objective Force is a strategy to develop advanced IT tools, vehicles and weaponry to make the Army's armored forces more agile and lethal. The service's plan is to have the first Objective Force unit employed in 2008, with initial operational capability in 2010. Army officials admit that it is an aggressive timetable, but one they remain confident they can achieve.

Army staff contingents and major commands have until Nov. 4 to review the white paper and submit their own plans, Riggs said. Once those reviews are in, a final copy of the white paper is expected to be complete by Nov. 15.

"This [transformation] is in a constant state of review and update," Riggs told Federal Computer Week after an Oct. 22 panel discussion at the Association of the U.S. Army's annual meeting in Washington, D.C. "By no stretch of the imagination should it be considered static."

Once the white paper is complete, Army leaders will work to synchronize the timelines of the transformational staff components and major commands, working toward an approved Army Transformation Campaign Plan by mid-December, a Transformation Roadmap by March 1, 2003, and a State of the Army review scheduled for March 15.

"We need to be synchronized to be successful...and we need a major review annually or semi-annually," said Riggs, adding that the current review was pre-planned and is not a reaction to concerns from Defense Department and Capitol Hill officials that the Army's timelines are unattainable. "The questioning of aggressive timelines is healthy."

Maj. Gen. David Melcher, director of the Army's Program Analysis and Evaluation office in the Pentagon, agreed and said the service remains confident in its Objective Force plans, including the fielding of Future Combat Systems.

FCS will equip Army vehicles with information and communications systems to enable soldiers to conduct missions, including command and control, surveillance and reconnaissance, direct and indirect fire, and personnel transport. FCS is scheduled to reach Milestone B next March.

"The timelines are eminently attainable," Melcher told FCW, adding that the Army will conduct at least three FCS reviews with the DOD between now and next March.

Riggs said that the Army might never reach its "objective designs" for FCS because the threshold capabilities for 2010 are still being defined and technology evolves so quickly.

"All of the architectural work on C4ISR is ongoing at this time," and the service will continually assess technologies to support rapid acquisition and fielding in the future, he said.
***************************
Federal Computer Week
Making do
Agencies' efforts to design 'usable' Web sites slowed by lack of resources, training
BY BRIAN ROBINSON
Oct. 14, 2002


Web usability, the notion that Web pages should be designed to make information as easy to find as possible, might be described as an unfunded mandate the government has imposed on its own agencies.

Both Congress and the Bush administration have made more accessible Web sites a core mandate of e-government. The law known as Section 508 requires agencies to make information technology, including Web sites, accessible to people with disabilities. It forced many Webmasters to think seriously about Web design and usability for the first time.

But talking about usability and making sure it happens are two different things. Usability means more than coming up with a good site design. It requires follow-through, and that's where many agencies, short-staffed and with little time or money for training, often come up short.

"Only a handful of agencies have so far dealt seriously with the resources, training, staff commitment and other things it takes to make usable Web sites," said Charles McClure, who has been an outspoken critic of the government's Web presence.

McClure, director of Florida State University's Information Use Management and Policy Institute, believes the talk about Web usability is far ahead of the reality.

Officials have talked a lot about e-government, he said, but agencies have been so focused on concerns such as online security that only a handful have spent the energy necessary to make sure their Web sites are accessible and useful. And so far, they haven't had a lot of help.

"The Office of Management and Budget tells the agencies to go out and make good Web sites, but it doesn't give them any resources to do that with," McClure said. OMB "puts out a lot of rhetoric, but not a lot of guidance."

Part of the problem is that usability is not clearly defined. You don't build applications with specific functions, according to Eric Schaffer, chief executive officer and founder of Human Factors International Inc., but instead create "systems that must work in the context of a given range of users doing a given set of tasks in a given environment."

To ensure success, organizations must involve a mix of participants, according to Schaffer:

* An executive who will champion the usability strategy and advocate usability engineering.

* A usability manager to implement the strategy and oversee activities.

* A usability coordinator to handle meetings.

* A primary usability expert to give technical direction and validate the work of others on the team.

* Usability staff members to work on designated projects.

* Graphic artists to create images for the Web site in accordance with specifications that are developed during the usability testing process.

This is a tall order for government agencies, most of which are struggling with shrinking budgets.

However, some come close to following that model. The Government Printing Office (GPO), for example, has a number of staff members dedicated to usability issues and a statistician who analyzes usage trends, said T.C. Evans, director of GPO's Office of Electronic Information Dissemination Services. GPO had an online presence even before the Web became the main way of getting information to the public, so it's had time to build a knowledgeable Web team.

Even so, Evans said, the number of people on the team has grown only slightly over time. Instead, the skill set of the team members has expanded to include an understanding of usability through training and on-the-job exposure.

"I'm a firm believer in doing and in learning from doing," he said.

The General Services Administration also encourages widespread awareness of the importance of usability, though perhaps more from necessity than choice. Its Web site must cater to the variety of services and offices that make up GSA, which have very different constituencies.

Having individuals on the Web team who can encapsulate all of the knowledge needed to serve those different players is probably not possible, said Tom Skribunt, acting director of marketing and strategic planning at GSA.

"I'm not sure there is such a thing as a usability expert" for GSA, he said. "We prefer that the people who are involved in such things as the management of Web content be sensitized to the usability concerns of their customers, which can be very different."

That requires eternal vigilance and a constant flow of communication about usability issues among content providers, Skribunt said.

Agencies can either train their Web staffs in usability skills or hire outside experts for activities such as conducting usability tests, said Gina Pearson, Web manager for the Agriculture Department's Economic Research Service. But agencies probably cannot do without someone whose sole focus is usability.

"Ideally, you want one person at least who is the usability advocate for the Web site," she said. "Someone who goes to the training sessions, who keeps tabs on what industry is doing about usability, who can help set up the testing and also help train other people on the Web team on aspects of usability."

That kind of advocacy helps bring another perspective to the table when Web issues are being considered, beyond the specific concerns of graphic designers, content managers or other specialists. Those people could also be up-to-date on usability issues, Pearson said, but an advocate would make sure such concerns were always part of the discussion.

It's also important that Web usability has a champion as high up the agency's executive ladder as possible, according to Sanjay Koyani, a senior staffer in the National Cancer Institute's Communication Technologies Branch. Cultural issues could force usability to the sidelines without that high-level support, he said.

"A lot of people we work with believe that usability is something that can be thrown in at the end of the Web design process," he said. Instead, usability "needs someone who can apply the philosophy of designing from a user perspective and who can make it central" to the process.

The institute has a Web site (www.cancer.gov) that has won several usability awards in the past year. It runs the resource site www.usability.gov and is one of the principal advocates of Web usability in the federal government.

However, many government officials have not embraced Web usability, at least not as a formal concept that must be applied to what they do on a daily basis.

The Army's Web site, for example, is one of the busiest and biggest in the federal government. By the Army's reckoning, it is about 64 times larger in file size than the next largest military site.

And Lt. Col. Mark Wiggins, Web manager for the Army's home page (www.army.mil), handles the site with the help of just two other people.

"Usability is a term we refrain from using," he said. "It doesn't really mean much, because it depends on who you speak to and it can mean very different things to different people."

For Wiggins, the goals are establishing an identifiable "brand" for the Army site with appropriate messaging features and content, "and then all the other stuff, the look and feel, navigation, etc., has to support that," he said.

His team conducted a series of focus group meetings early in 2001 before redesigning the site's home page, and the groups are reconvened regularly to see what participants think of the site "at least on an informal basis," he said. The Army also surveys Web site users.

Apart from that, he said, usability issues are covered by getting as much feedback as possible from users and "staying on top of things day by day."

That's not to say that he wouldn't like to have more resources, of course. He recently requested an increase in his Web staff to a total of eight people, including an assistant Web manager, content coordinators, software programmers, and graphics and multimedia specialists, but not a usability expert.

Robinson is a freelance journalist based in Portland, Ore. He can be reached at hullite@xxxxxxxxxxxxxxx
*****************************
Federal Computer Week
Bush signs $355 billion Defense bill
By Dawn Onley


President Bush today signed into law a $355.1 billion defense appropriations bill, marking the largest across-the-board increase in defense spending since the Reagan administration 20 years ago.

A $37 billion increase over last year's budget, the spending plan includes a 4.1 percent pay raise for service members. It also doles out $58 billion for research and development of new weapons and technologies.

The bill falls $1.6 billion short of what was requested, but funds a range of transformational IT initiatives, including:

* $251 million for the Army's Future Combat Systems program.

* $338 million for the Air Force's Multi-sensor command and control constellation development program.

* A 17 percent increase for the Defense Advanced Research Projects Agency over last fiscal year.

The Navy received the $1.4 billion it requested to fund the Navy-Marine Corps Intranet, but the bill includes language that calls for "rigorous operational assessment" before the program enters its next phase.

The budget also increases procurement spending to $71.6 billion, a $10.7 billion increase over fiscal 2002.
*************************
Washington Post
More Than One Internet Attack Occurred Monday
By Brian Krebs and David McGuire
Wednesday, October 23, 2002; 7:06 PM


Monday's attack on the 13 computer servers that manage the world's Internet traffic was the first of two assaults, according to officials at the companies that were affected.

Just after 5 p.m. EDT on Monday, a "distributed denial of service" (DDOS) attack struck the 13 "root servers" that provide the primary roadmap for the Internet. The second attack started several hours later and targeted a different kind of Internet server.

DDOS attacks are intended to overwhelm networks with data until they fail.

The first attack, which lasted an hour, targeted all 13 of the root servers that form the core of the worldwide Domain Name System, which converts the words and names that form e-mail and Web addresses into the numeric codes computers use to route traffic. Some of the servers failed intermittently, but Internet users were largely unaffected due to the redundant nature of the root-server system, experts said.

The second attack, say sources familiar with the incident, targeted "name" servers that direct Internet users to more specific online locations. Those servers house Internet domains such as dot-com, dot-biz and dot-info, and country code domains such as Great Britain's dot-uk and Canada's dot-ca.

"At around 11 (p.m. EDT), the whole thing started over again, this time switching to the global (name) servers," said Chris Morrow, a network security engineer for UUNET, in an interview Tuesday. A unit of WorldCom Inc., UUNET handles roughly half of the global Internet traffic and is the service provider for two of the 13 root servers.

VeriSign, which manages the servers for the dot-com, dot-net and dot-org domains, tracked attacks against all of its name servers beginning around 10 p.m. Monday, company spokesman Brian O'Shaughnessy said.

VeriSign also operates two of the 13 root servers that were targeted in the first attack. Neither VeriSign's root servers nor its name servers were taken down in the attacks, O'Shaughnessy said.

"We experienced an attack on our name-server constellation and we dealt with it the same way we dealt with the previous attack on our root servers," he said.

Dublin-based Afilias Ltd. also reported having its "dot-info" name servers struck late Monday, but Afilias spokeswoman Heather Carle said the company was able to easily repel the attack. "We're able to internally balance the load from any hits our DNS server takes," she said.

Afilias operates dot-info, one of seven newer domains created to ease crowding in the popular dot-com, dot-net and dot-org domains.

If all of the name servers for any domain were crippled long enough, users would start having difficulty reaching addresses within those domains. Most name and root servers are designed with enough back-up capacity that such an attack would be very difficult to execute.
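
That back-up capacity works because clients simply move on when a server fails to answer. A toy sketch of the fallback logic, again with dnspython (the server list here is an illustrative subset; real resolvers are configured with all 13 root addresses):

    import dns.exception
    import dns.message
    import dns.query

    # Illustrative subset of root-server addresses.
    SERVERS = ["198.41.0.4", "192.33.4.12", "199.7.83.42"]

    def query_with_fallback(name):
        query = dns.message.make_query(name, "A")
        for server in SERVERS:
            try:
                # A saturated server simply times out, and the client
                # quietly tries the next one -- which is why losing a
                # few servers barely registers with end users.
                return dns.query.udp(query, server, timeout=2)
            except dns.exception.Timeout:
                continue
        raise RuntimeError("no server answered")

    response = query_with_fallback("example.com")
    print("got a response with", len(response.authority), "authority record sets")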

The White House's Office of Homeland Security and the FBI are investigating Monday's cyber attacks, but have declined to speculate on who might have been responsible. It is also not clear whether the same source was to blame for the separate attacks on root and name servers.

At a press conference today, White House Press Secretary Ari Fleischer sought to downplay speculation that the strikes were carried out by terrorists.

"I'm not aware there's anything that would lead anybody to that direction," Fleischer said. "History has shown that many of these attacks actually come from the hacker community."

It is difficult to discover the identities of DDOS hackers because the computers they use to mount the assaults usually are commandeered -- either manually or remotely -- and programmed to carry out the attacks. These computers often belong to unsuspecting home users.

Experts say the only way to trace the attacks to their true source is to deconstruct the data packets used in the assault as it is happening. According to Gordon Johndroe, spokesman for the Office of Homeland Security, the FBI was able to "monitor the attack while in progress."
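
In practice, monitoring an attack in progress means watching the flood packet by packet. A toy illustration of the idea using the third-party scapy library, which tallies claimed source addresses on the DNS port (attack traffic routinely forges these addresses, so this is a sketch of the technique rather than a forensic tool):

    from collections import Counter
    from scapy.all import IP, UDP, sniff  # sniffing requires root privileges

    sources = Counter()

    def tally(pkt):
        # Count traffic on the DNS port by claimed source address. In a
        # real DDOS these addresses are often spoofed, which is why
        # investigators must trace packets while they are still flowing.
        if IP in pkt and UDP in pkt:
            sources[pkt[IP].src] += 1

    # Watch 30 seconds of DNS traffic, then report the top talkers.
    sniff(filter="udp port 53", prn=tally, store=False, timeout=30)
    for addr, count in sources.most_common(5):
        print(addr, count)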

UUNET's Morrow said a successful investigation ultimately could hinge on boasts made in the hacker community.

"I don't think anyone knows who's responsible for this yet," Morrow said. "Somebody might blather about it in a couple of months, and that's probably the only chance we have of finding out who did it."
*****************************
USA Today
Interpol police organization finally going digital


YAOUNDE, Cameroon (AP) - The Interpol international police organization is finally going digital, dispensing with clackety telex machines and snail mail for expediting most-wanted notices.

By the end of next year, Interpol officials hope, all 181 member countries will be linked to an Internet-based clearinghouse on criminals that will flash digital fingerprints, pictures and even DNA profiles to anyone with a personal computer, the right software and proper authorization.

"In most countries, 12-year-olds with computers could move data across borders faster than the international policing community," said Stanley Morris, an American law-enforcement veteran tasked with dragging Interpol into the Internet age.

The roughly $9.8 million Internet project, approved this week at Interpol's annual meeting, took form in February and is being tested in 40 countries, Morris said.

For decades, Interpol sent notices for wanted suspects, missing persons and stolen vehicles around the world across text-only private lines. Pictures were often sent by mail on printed leaflets.

A most-wanted notice can sometimes take five months to reach all member nations, said Morris, 60, who led the U.S. Marshals Service in the 1980s. Under the new system, he said, an urgent message can be circulated in one day, bringing the group even closer to its stated purpose: to help the world's police share information.
***************************
New York Times
October 24, 2002
At Senators' Web Sites, a 60-Day Vow of Silence


Visitors to the Web site of Representative John Thune, Republican of South Dakota, can easily point and click their way to recent press releases on drought relief and other issues affecting the state's voters.

But those looking for similar information at the site of Senator Tim Johnson, a Democratic incumbent whom Mr. Thune is challenging in this year's election, find this prominent disclaimer: "Pursuant to Senate policy, this home page may not be updated for the 60-day period immediately before the date of a primary or general election."

The latest press releases are dated Sept. 5, and there is a photograph of Senator Johnson at an agriculture forum on Aug. 15.

Under Senate rules intended to prevent members from campaigning at government sites, the 30 senators seeking re-election next month cannot update their sites during the home stretch of the campaign. There is no such rule for House members.

Some senators and political groups say the rule should be abolished, or at least altered, to reflect the way the Internet has penetrated society. In addition to the Johnson-Thune race in South Dakota, current races in Louisiana, Iowa and Georgia pit a House member against an incumbent senator.

"This is a quirk in the rules that is really unfair to Senate incumbents," said Bob Martin, the communications director for Senator Johnson. "Web sites have become such an effective communication tool for constituents that it is like being thrown back a decade."

Members of Congress are barred from sending out mass mailings in the 60 days before an election. When Congress enacted its Internet rules in 1996, the House decided that information provided on the Web was different from unsolicited mail sent to a voter's home and would be permitted.

The Senate, however, determined that government Web sites could provide an incumbent with a "taxpayer-subsidized advantage" during election years.

A spokesman for Senator Christopher J. Dodd, Democrat of Connecticut and chairman of the Rules and Administration Committee, said that the committee would probably review the Senate rule next year. "It's important to ensure that members are able to interact with their constituents while also protecting against any unfair political advantage," said the spokesman, Marvin Fast.
*************************
Associated Press
FBI Seeks to Trace Internet Attack
Wed Oct 23, 5:00 PM ET
By TED BRIDIS, Associated Press Writer


WASHINGTON (AP) - The White House sought Wednesday to allay concerns about an unusual attack this week against the 13 computer servers that manage global Internet traffic, stressing that disruption was minimal and the FBI is working to trace the attackers.

Most Internet users didn't notice any effects from Monday's attack because it lasted only one hour and because the Internet's architecture was designed to tolerate such short-term disruptions, experts said.

The White House said it was unclear where the attack originated, who might be responsible or whether the attack could be considered cyber-terrorism.

"We don't know. We'll take a look to see if there are any signs of who it may or may not be," spokesman Ari Fleischer said. "I'm not aware there's anything that would lead anybody in that direction. History has shown that many of these attacks actually come from the hacker community. But that's why an investigation is under way."

The FBI's National Infrastructure Protection Center and agents from its cyber-crime division were investigating, FBI spokesman Steven Berry said.

Civilian technical experts assisting with the investigation, speaking on condition of anonymity, said the FBI was reviewing electronic logs of computers used in the attack to determine the origin of those responsible.

"It's the nature of these things that they're never easy to untangle and yet sometimes there are clues left behind," said Steve Crocker, chairman of an advisory committee on the security and stability of these servers for the Internet Corporation for Assigned Names and Numbers.

Another expert, Paul Mockapetris, the chief scientist at Nominum Inc., said those responsible appeared to use generic "ping flood" attack software that had been installed on computers across the globe using many different Internet providers. His company provides consulting advice to some of the organizations operating the servers.

"It was a fairly large attack, but it doesn't look to be an attack designed to do maximum damage," said Richard Probst, a vice president at Nominum. "Either it was a wake-up call or a publicity stunt or a probe to understand how the system works."

In so-called "denial of service" attacks, hackers traditionally seize control of third-party computers owned by universities, corporations and even home users and direct them to send floods of data at pre-selected targets.

The attack on Monday was notable because it crippled nine of the 13 servers around the globe that manage Internet traffic. Seven failed to respond to legitimate network traffic and two others failed intermittently during the attack, officials confirmed.

Service was restored after experts enacted defensive measures and the attack suddenly stopped.

"There was some degradation of service; however, nothing failed and providers were able to mitigate the attacks pretty quickly," Fleischer said.

A spokesman for the Office of Homeland Security, Gordon Johndroe, disputed experts who characterized the attack as the most sophisticated and large-scale assault against these crucial computers in the history of the Internet. He said the attack did not use any special techniques and was not particularly sophisticated.

"There were minor degradations, but no failures," Johndroe said.

Computer experts who manage some of the affected computers, speaking on condition of anonymity, said the attack effectively shut down seven of the 13 computers by saturating their network connections and partially saturating the connections for two others. Although the servers continued operating, they were unable to respond to legitimate Internet requests.

The 13 computers are spread geographically across the globe as a precaution against physical disasters and are operated by U.S. government agencies, universities, corporations and private organizations.

"The public harm in this attack was low," agreed Marc Zwillinger, a former Justice Department lawyer who investigated similar attacks against e-commerce Web sites in 2000. "What it demonstrates is the potential for further harm."

Monday's attack wasn't more disruptive because many Internet providers, corporations and other large organizations temporarily store, or "cache," popular Web directory information for better performance.
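
Caching is the mechanism behind that resilience: a provider remembers an answer for as long as its time-to-live (TTL) allows and only returns to the authoritative servers once the entry goes stale. A minimal sketch of the idea in Python (the name, address and TTL below are illustrative):

    import time

    class TTLCache:
        """Toy DNS-style cache: entries expire after their time-to-live."""

        def __init__(self):
            self._store = {}  # name -> (answer, expiry timestamp)

        def put(self, name, answer, ttl_seconds):
            self._store[name] = (answer, time.time() + ttl_seconds)

        def get(self, name):
            entry = self._store.get(name)
            if entry is None:
                return None            # never seen: must ask upstream
            answer, expiry = entry
            if time.time() > expiry:
                del self._store[name]  # stale: must ask upstream again
                return None
            return answer              # fresh: no root server involved

    cache = TTLCache()
    cache.put("www.example.com", "192.0.2.10", ttl_seconds=86400)
    print(cache.get("www.example.com"))  # served locally for a full day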

Although the Internet theoretically can operate with only a single root server, its performance would slow if more than four root servers failed for any appreciable length of time.


***************************
Sydney Morning Herald
Net attacks: Internet pioneer predicted outages in 2000
October 24, 2002

Monday's distributed denial of service attacks on the Internet's root servers were the real thing.

But such attacks were foreshadowed more than two years ago by an Internet pioneer in conversation with a technology journalist from The Age and The Sydney Morning Herald.

In the article, published on 22 August 2000 and headlined "How crackers could crash the Internet", Nathan Cochrane spoke to Milo Medin, chief technologist of the now defunct US broadband ISP Excite@Home, who warned that crashing the root servers would bring down the Net.

Medin said that if would-be cyber-terrorists understood the way the Net was structured, they would concentrate their energies on the root servers and not bother with vulnerabilities in individual Web sites.

Somewhat eerily, Cochrane's article said: "According to Excite@Home's Medin, a simultaneous, sustained distributed denial of service (DDoS) attack, such as that executed recently against high-profile e-commerce sites Yahoo! and Amazon.com, would knock out all communications."

It continued: "The effects of a DDoS attack would not be immediately obvious, Medin warns. The popular sites would stay online slightly longer, as these are often cached. The damage would initially be revealed as more obscure sites started to go off the air. Ultimately, entire nations, those without their own root servers - such as Australia - would become invisible to the rest of the online community."

Excite@Home filed for Chapter 11 bankruptcy protection on September 28 last year, but AT&T subsequently withdrew its offer to buy its assets for $307 million. The US provider was affiliated with the Australian broadband ISP formerly known as Optus@Home, now called Optusnet Cable and owned by Singapore's SingTel, with which it shared technical and business resources.

Most of Excite@Home's 4 million US customers were transferred this year to other providers including AT&T's broadband service, sparking a court feud between the former partners. Excite@Home formally shut its doors on February 28; its assets were sold at auction three months later.
*****************************


Lillie Coney
Public Policy Coordinator
U.S. Association for Computing Machinery
Suite 510
2120 L Street, NW
Washington, D.C. 20037
202-478-6124
lillie.coney@xxxxxxx