
Clips April 22, 2002

ARTICLES

EU to Clamp Down on Hacking, Internet Attacks 
Europeans Eye E-Vote Eventuality
Md. Opens Registry Of Sex Offenders 
How Teens Still Hack Million-Dollar Security Systems
Klez virus passes confidential info 
Wireless program gets new life
Google Runs Into Copyright Dispute
Japanese Computer Is World's Fastest, as U.S. Falls Back
U.S. Exporting Personal Internet Privacy Technologies 
If Bertelsmann Wed Napster, It Could Sue Itself, and More
Army proxy server closes Web back door
Grown-ups like county kids' site 
Virtual IT job fair draws attention
Picking up the pace at the border
NMCI decision set for May 3
Seems Computers Baffle 10-Year-Olds, Too
Why Is This Room So Popular? Shh. You're About to Find Out
Mainframe Skills Shortage Five Years Off
License Bill Could Create IT Headaches
Egg backs digital payments
Britons 'do not want e-government'
Fancy an electronic helper through life?
Talking tech makes life easier
Battle for Brains 
Navy first to use e-signatures on smart cards for travel 
This phone knows where you are ... and how to help
Exhibit lets visitors change ethnicity
Custom Services: Putting Web Services to Work
Internet-enabled fax technologies find following

********************
Reuters Internet Reports
EU to Clamp Down on Hacking, Internet Attacks 
Fri Apr 19,10:38 AM ET 
By Marie-Louise Moller 

BRUSSELS (Reuters) - Internet hackers and spreaders of computer viruses could
face four years in jail under a draft "cybercrime" law adopted by the European
Commission on Friday. 

  
The European Union has pledged to clamp down on so-called
cybercrime, aimed at destroying computer networks, which has caused billions of
dollars in damage worldwide. 

"Organized hacking groups specialized in hacking and defacement of Web sites
are more and more active at worldwide level," the Commission said in the
proposal obtained by Reuters. 

"More serious attacks could lead not only to serious financial damage, but in
some cases even to the loss of life," it added, referring to attacks on
hospital and air-traffic control systems. 

The draft law, to be officially unveiled by the Commission next Monday, seeks
to harmonize existing national legislation in the 15-nation EU and would
require backing from EU governments before coming into force. 

It is also seen as an important part of the Union's fight against terrorism,
because the proposal requires member states to set up an exchange of
information on attacks against information systems. 

"There have already been several recent occasions where tensions in
international relations have led to a spate of attacks against information
systems," the proposal said. 

Hackers have sabotaged Israeli and Palestinian Web sites, for example, as well
as U.S. administration Internet sites. 

The proposal defines hacking as gaining unauthorized access to an information
system with the intent to cause damage or for economic gain. 

Among organized criminal hacking groups it named "the Brazilian Silver Lords
and the Pakistan G-force, which try to extort money from
their victims by offering them specialized assistance after hacking into their
information systems." 

The proposal also targets anyone who sends viruses such as the infamous "I love
you" virus, which caused major information system breakdowns across the world
in 2000. It also criminalizes sending other types of destructive software such
as "logic bombs," "worms" and "Trojan horses." 

If they approve the proposal, the 15 EU governments will have to introduce
maximum prison sentences of at least one year for acts of cybercrime, and four
years in cases that caused physical harm, large economic losses or gains, or
that were committed by a member of an organized crime network. 
******************
Wired News
Europeans Eye E-Vote Eventuality

PARIS -- In the first flush of Internet fever, electronic voting was hailed as
the miracle cure-all for democracy's ills. E-vangelists argued it would engage
young people in the political process, invigorate democracy and bring voting
methods up to speed with current technology. 

These days, online voting invariably comes with a health warning attached: Use
only in carefully controlled circumstances. All experts are now more or less of
the opinion that it is too soon to contemplate remote Internet voting -- in
which people vote from home or other unofficial locations -- on a large scale. 

Yet perennial concerns about the security and integrity of the e-voting process
have failed to dampen the enthusiasm of many European countries for pressing
ahead with Internet voting experiments and pilot projects. 

The British government, for example, is providing £3.5 million (about $5
million) to fund trials of Internet, digital TV and SMS voting at 30 councils
in May's local elections. 

Selected wards will allow citizens to vote from their PCs at home, in local
libraries and at council-run information kiosks. Liverpool and Sheffield will
test text-messaging and digital TV voting, while other cities will use
electronic counting technology. 

"We are particularly keen to engage younger voters and feel that these
innovations will help," said Nick Raynsford, Great Britain's local government
minister. "Our aim is to learn from these pilots so that we can confidently
modernize our voting arrangements. We propose an ever-more extensive program of
pilots at future local elections to open up the possibility of an e-enabled
general election some time after 2006." 

In Germany, led by cities such as Bremen and Cologne, authorities have
announced that citizens will be able to vote online by 2006. If all goes
according to plan, people will be able to vote from electronic polling stations
anywhere in the country, instead of their own constituency, and some may be
able to vote via the Internet. 

The Estonian government has also announced plans to introduce online voting for
the 2003 general elections. Similarly, the canton of Geneva in Switzerland
plans to allow remote voting by Internet for its local elections in 2003. 

Robert Hensler, Chancellor of the Canton of Geneva, considers that a vote is a
public service like any other and must evolve with the times. "There is no
reason to insist on voting methods that are not in keeping with the habits of a
population. Indeed, such an attitude is contrary to the spirit of democracy,"
he said. 

Other towns in France, Italy and Spain are also planning experiments with
e-voting systems in forthcoming elections and referenda. 

Caution remains the watchword, however, and advocates of online voting stress
it is more important for new systems to be reliable and secure than simply to
have them implemented speedily. 

"Confidence should not be sacrificed for convenience," said Jim Adler,
president and CEO of VoteHere, a Washington-based supplier of election software
and services. "We have the technology to deal with the worst horror scenarios
that anyone can think of in terms of virus attacks or denial-of-service
attacks, but we have to go further in educating and reassuring people about the
integrity of e-voting." 

A major step in the right direction, believes Adler, would be tackling the
current alphabet soup of e-voting security standards. 

"There are no uniform standards right now for electronic voting," Adler said.
"The next step is having standards that can prove to the public and the
election community that we can meet the requirements of secure and transparent
e-voting. Claiming we can do it is one thing, but we have to be able to prove
it. That can only be done against a benchmarked standard." 

Alex Folkes, of Britain's Electoral Reform Society, said his group welcomed
studies into alternative voting methods aimed at making voting more convenient
without compromising the safety and security of the ballot. "We believe that
starting off with small-scale pilots and building up in subsequent elections is
the right way to go about it," he said. 

Jo Dungey, a policy advisor at the Local Government Information Unit in the
U.K., agreed with Folkes' assessment. While broadly welcoming the initiatives,
she cited concerns about fraud and said it was important not to introduce a
system that disadvantaged those without access to new technology, mainly older
and lower-income people. 

"As long as these can be tackled, I think we have to have change, and new
approaches are likely to be particularly appealing for young voters, among whom
turnout has fallen very badly," she said. 

Dungey, however, harbors no illusions that online voting will provide a panacea
for widespread voter apathy and alienation. 

"Technical changes alone will not solve the whole problem of low turnout," she
said. In her view, other more complex reasons had to be examined, including a
lack of information about the process of voting, public alienation from the
conventions of the political process and the way the media portrays politics. 

André Santini, the tech-savvy mayor of Issy-les-Moulineaux and a long-time
advocate of changing the current French law prohibiting online voting, bemoaned
the fact that France was lagging well behind other European countries in
embracing new technologies for e-democracy. 

"We have to debate the impact of information technology on our democratic
processes rather than simply dismiss it on the grounds that it's too
complicated or that it creates too many problems," he said. 

Folkes of the Electoral Reform Society stressed it was important not to focus
only on the technical performance of new voting systems. "We will be looking
not just at whether the systems worked, but also how voters took to them. There
is no point in spending lots of time and money on systems that people do not
want," he said. 

He also agreed that there was no quick-fix solution to voter apathy. 

"Making voting more convenient is certainly one part of addressing the problem,
but so is the politicians conducting the election campaign in such a way that
they enthuse people to go out and vote," he said. 
**********************
Washington Post
Md. Opens Registry Of Sex Offenders 
Lawsuits Expected Over Online List 
By Manuel Roig-Franzia
Monday, April 22, 2002; Page B03 


Maryland's first Internet sex-offender registry officially debuts today,
allowing neighbors to search an online database that will tell them whether a
child molester or a rapist lives down the street.

The registry, which is maintained by the Department of Public Safety and
Correctional Services, contains the names, addresses, photographs and offenses
of more than 2,200 people convicted of a variety of sex crimes since October
1995. A law passed by the General Assembly this spring will add offenders
convicted since 1975, according to Leonard A. Sipes Jr., a corrections
department spokesman.

"There's no doubt the list will grow by leaps and bounds," Sipes said.

The registry can be found at www.dpscs.state.md.us. The offenders required to
register are kidnappers, child pornographers, rapists and people convicted of
sex offenses against children.

Most of the people on the list were convicted in Maryland, Sipes said, though
the registry also contains the names of some people who were convicted in other
states but now live in Maryland.

The District of Columbia and about 30 states, including Virginia and now
Maryland, have Internet sex-offender registries, according to the Department of
Justice. Maryland's site debuts in the face of growing court battles over
whether the registries violate offenders' constitutional privacy guarantees and
impinge on their civil rights.

Sipes said Maryland officials are almost certain that their site will be
challenged in court. It has been two years since the state legislature approved
the creation of an Internet registry in Maryland, but officials chose to move
cautiously, studying legal sparring in other states before starting their own
site, Sipes said.

Four men convicted of sex crimes in Fairfax County filed a lawsuit anonymously
last month against Virginia, saying its registry violates their privacy and
civil rights. Similar suits have been filed in every state that has an Internet
sex registry, Sipes said.

Plaintiffs and corrections officials are watching a case accepted by the U.S.
Supreme Court in February that challenges Alaska's registry and could affect
registries in other states. The justices will decide whether the Alaska
registry violates a constitutional ban on retroactive punishment because it
publishes the names of offenders convicted before the registration law was
enacted. Virginia and Maryland also post the names and addresses of offenders
convicted before their registration laws went into effect.

Sipes warned the public not to view the Maryland registry as a panacea, saying
the names of most sex offenders are not known to law enforcement because so
many sex crimes are not reported.

"We do not want people to gain a false sense of security," Sipes said.

The corrections department spent $414,000 to develop the site and will spend
$368,000 a year to maintain it, Sipes said. The site has been up and going on
the Internet for several days in response to requests by various news
organizations to preview it before today's debut.

The Maryland site will be updated once a week using addresses and other
information that offenders are required to provide under the state's "Megan's
law," named for a New Jersey girl killed in her neighborhood by a repeat sex
offender.

Sex registry opponents fear that vigilantes might use inaccurate addresses to
harass innocent neighbors.

"So, now your mother lives in the house where the sex offender used to live . .
. so it is your mother who suffers the broken windows," said Susan Paisner, a
criminology consultant from Adelphi.

Sipes acknowledged that inaccurate addresses are a concern but noted that
several police departments -- including those in Montgomery County, Baltimore
and Baltimore County -- are conducting spot checks to make sure offenders give
accurate information to the registry.
********************
Government Computer News
Microsoft .Net, J2EE could build e-gov structure 
By Jason Miller

The Office of Management and Budget is recommending Microsoft's .Net and Java 2
Enterprise Edition (J2EE) as possible architectures for its 24 e-government projects. 

Debra Stouffer, OMB's federal enterprise architecture program manager,
yesterday discussed progress on a governmentwide enterprise architecture at a
Washington luncheon sponsored by the Association for Federal Information
Resources Management. It was the first time an OMB official has outlined a
component architecture for the initiatives. 

"These two are the only prominent ones that are relatively proven in industry,"
she said. "They offer the capabilities that we outlined in our requirements.
They are open technologies with proven industry standards." 

Stouffer said systems based on them are reusable, stable, interoperable,
portable and secure, all of which are requirements for the 24 projects. 

OMB rated both technologies on a 24-point scale for Web services, File Transfer
Protocol and 
e-mail. J2EE rated higher, especially in Web services where it earned 22 of 24
points. 

"We are pointing out the disadvantages and advantages of each, not recommending
one or the other," she said. "Agencies must consider other things besides the
technology, such as how the new system relates to your legacy system or what
other systems must be integrated." 

Stouffer also discussed OMB's just-completed business reference model, which
precedes an enterprise architecture. The model found that federal agencies
engage in 32 major lines of business, each of which has 123 or more
subfunctions. 

"The business reference model is a starting point and framework that we build
the enterprise architecture on," she said. "We look for common business
processes, then we drill down to find out how they are or are not being
supported, where the gaps are and where the opportunities are to reduce
redundancies. This is the guidance on how to build applications in the future."
******************
News Factor
How Teens Still Hack Million-Dollar Security Systems

By Lisa Gill
NewsFactor Network
April 22, 2002

As awareness of information security and the threat of cyber terrorists
increases, U.S. government agencies and businesses have beefed up security in
order to thwart system outages and intrusions in mission-critical operations.
For the complete article see http://www.newsfactor.com/perl/story/17371.html

*************************
News.com
Klez virus passes confidential info 
By Robert Lemos 
Staff Writer, CNET News.com
April 19, 2002, 1:10 PM PT

The latest variant of the Klez worm sometimes chooses to hitch a ride on
sensitive documents, resulting in victims' confidential information spreading
with the malicious program, Russian antivirus firm Kaspersky Labs said Friday. 
Known as Klez.g, Klez.h and Klez.k, depending on the security advisory, the
newest incarnation has spread worldwide, sending itself in e-mail messages with
infected documents attached. 

Occasionally the documents contain sensitive material, said an advisory from
Kaspersky Labs.


"Klez.h poses a special threat: The worm scans the disks of an infected
computer and, depending on a set of conditions, attaches a file to each
infected e-mail it distributes," stated the advisory. 

Text, HTML (Hypertext Markup Language), Adobe Acrobat and Excel files are
included in the types of documents that the virus can forward, but other files
that the worm could attach--such as JPEG and MPEG files--are less likely to
contain important information. 

Representatives of Kaspersky Labs were not available for comment. 

This is not the first time a virus has leaked information, however. The SirCam
worm, which is still spreading among computers on the Internet, also attached
itself to documents and forwarded on the infected files to potential victims. 

Security-software maker Symantec upgraded on Wednesday the latest variant,
which it labeled W32.Klez.h, to a threat level of three from a previous rating
of two. The company categorizes threats on a scale of one, the lowest threat,
to five. 

However, Vincent Weafer, senior director of Symantec's Security Response team,
on Friday said they haven't been able to reproduce the information-leaking
function of the worm that Kaspersky Labs is claiming. 

"It is nothing that we have seen in our lab," he said. "It definitely data
mines files for e-mail addresses, but we haven't seen it attach files. We will
keep doing some additional testing in this area." 

E-mail security firm MessageLabs said the Klez.h worm had proliferated
"dramatically" during the day Friday. 

MessageLabs, based in the United Kingdom, first detected the new variant on
Monday from an Internet address in China. Most antivirus vendors, such as
Symantec, McAfee and Sophos, have offered Klez.h patches since Wednesday. 

MessageLabs said it stopped two copies of Klez variants on Monday. Since
Wednesday afternoon the number of copies rose sharply, gathering pace on
Friday. The firm said it stopped several thousand copies on Friday, for a total
of more than 46,000 copies by Friday afternoon--or nearly one in every 77
e-mails. The United Kingdom topped its list with more than 5,000 copies
stopped, followed by Hong Kong and the United States. 

The worm arrives in an e-mail message with one of 120 possible subject lines. 

In many circumstances, the worm doesn't need the victim to open it in order to
run. Instead, it takes advantage of a 12-month-old vulnerability in Microsoft
Outlook, known as the Automatic Execution of Embedded MIME Type bug, to open
itself automatically on un-patched versions of Outlook. 

The program will also cull e-mail addresses by searching a host of different
file types on the infected PC. Using its own mail program, the worm will send
itself off to those e-mail addresses. In addition, it will use the addresses to
create a fake "From:" field in the e-mail message, disguising the actual source
of the e-mail. 
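
Because the worm forges the visible sender this way, the "From:" line alone
says little about where an infected message really came from. The short sketch
below is a hypothetical mail-filter check, not anything described in the
article: it uses Python's standard email module to flag messages whose "From:"
header disagrees with the envelope sender recorded in Return-Path. A mismatch
is only a hint that the address may be forged, since mailing lists and
forwarders also produce such differences.

    # Hypothetical helper: flag a possibly forged "From:" header (Klez-style).
    from email import message_from_string
    from email.utils import parseaddr

    def looks_spoofed(raw_message: str) -> bool:
        msg = message_from_string(raw_message)
        _, header_from = parseaddr(msg.get("From", ""))
        _, envelope_from = parseaddr(msg.get("Return-Path", ""))
        # Only a heuristic: a mismatch is a hint, not proof, of forgery.
        return bool(header_from and envelope_from and header_from != envelope_from)

    sample = (
        "Return-Path: <infected-host@example.net>\n"
        "From: friend@example.org\n"
        "Subject: a very funny game\n"
        "\n"
        "body\n"
    )
    print(looks_spoofed(sample))  # True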

The worm also attempts to disable antivirus software by deleting registry keys,
stopping running processes and removing virus-definition files. 

Finally, the worm drops a second virus on the computer and spreads to other
disk drives connected to the PC over an internal network.
*******************
Federal Computer Week
Wireless program gets new life  

The Bush administration's focus on homeland security has revitalized an
intergovernmental wireless program designed to enhance communications among
first responders, and now the General Services Administration is stepping in to
help.

The National Wireless Communications Infrastructure Program (NWCIP) aims to
provide interoperability among the many land mobile radio systems used by the
Defense Department and federal, state and local law enforcement agencies. 

NWCIP, conceived several years ago, was placed on a fast track after the Office
of Homeland Security took over the program from the Commerce Department
following the Sept. 11 terrorist attacks. 

Now GSA's Federal Technology Service is helping vendors on its FTS 2001
long-distance telecommunications contract modify their offerings to meet the
needs of NWCIP, FTS Commissioner Sandy Bates said April 16 at the FTS Network
Services Conference in Orlando, Fla.

Sprint and WorldCom Inc., the two primary vendors on FTS 2001, are in the final
stages of picking integrators to help fit the necessary features and offerings
into the contract, Bates said.

NWCIP is intended to extend the emergency communications capabilities past the
existing landline priority system, the Government Emergency Telecommunications
Service (GETS).

Existing mobile radio systems are generally single-channel analog FM voice
systems, owned and operated by single agencies to perform a single,
well-defined mission.

NWCIP has been in development for more than a year, but it received little
attention or funding, said an industry official involved in the program.
However, the importance of wireless communications interoperability was made
evident by the events of Sept. 11, and the program is now moving forward
quickly, the official said.

NWCIP began as part of the Defense Department's Pacific Mobile Emergency Radio
System, which is being deployed in Hawaii and Alaska to improve coordination
between DOD and civilian federal, state and local law enforcement agencies,
according to a May 2001 presentation at the Federal Wireless Users Forum by
Charles Cape, director of special projects and programs in Commerce's Office of
the Chief Information Officer.

At the time, the system was expected to save $94 million over eight years by
eliminating resource and equipment duplication. Because communication between
military and law enforcement is necessary now more than ever, DOD is
accelerating the fielding of the system, Adm. Dennis Blair, commander in chief
of the U.S. Pacific Command, testified in February before the House
International Relations Committee.

NWCIP is one of several wireless interoperability programs under way in
government. Others include the Public Safety Wireless Network and the Project
Safecom e-government initiative led by the Treasury Department.
*************************
New York Times
Agreement on Computer Recycling
By JENNIFER 8. LEE

Responding to a growing problem of waste computer equipment, manufacturers and
local governments have agreed in principle to set up a nationwide recycling
program.

Under the proposal, a fee, perhaps $25 or $30, would be added to computer
systems at the time of purchase. The collected money would finance a recycling
program for computers and television sets. Most likely, the recycling would be
handled by private rather than government organizations.

The National Electronics Product Stewardship Initiative, the group that is
coordinating the agreement among governments, manufacturers and
environmentalists, hopes to have a detailed framework worked out by September.
The program would be rolled out slowly over the next few years.

If carried out, the proposal would be one of only a few recycling plans with
national scope. For example, only 10 states have laws requiring deposits on
cans and bottles.

It would also be a concrete accomplishment for a specific environmental
movement that has so far been largely theoretical in the United States. The
movement, known as product stewardship, places shared responsibility for
recycling products on manufacturers, government, retailers and consumers. 

"The message to the consumer when they are buying the product is that
responsibility of it is not only in the use, but also in the after-use," said
Scott Cassel, the director of the Product Stewardship Institute, which is
taking part in the recycling discussions.

Disposal of obsolete computers has become an increasing financial and
logistical headache for local governments over the last several years. The
toxic materials and the intricate designs make environmentally sound disposal
expensive. For example, cathode-ray tube devices like monitors and television
sets have four to eight pounds of lead each. Massachusetts has already banned
cathode-ray tubes in local dumps.

"Our local governments are in a real bind because they have such demand for
recycling," said Maureen Hickman, a policy analyst with the Minnesota Office of
Environment Assistance. "But some won't even start collecting because they are
afraid if they open that door they won't be able to afford it." 

Hennepin County, which includes Minneapolis, has one of the most advanced
electronics recycling programs in the country. The volume of electronics
recycling in Hennepin County has increased about 30 percent a year for the last
10 years. The county, which has a population of 1.2 million, spent $1.1 million
on electronics recycling last year.

As a result of pressure from local governments, more than 20 state legislatures
have introduced bills on computer recycling. Many of these would place
responsibility for disposal on the manufacturers. Environmental agencies in
California, Massachusetts and Minnesota in particular have been aggressive in
pressing for recycling legislation. 

The electronics companies are also facing legislation in Europe and Japan that
places responsibility on the manufacturers. 

Computer disposal has attracted public attention because of a recent report by
environmental groups that 50 to 80 percent of American high-technology trash
was exported to developing countries. The report described the hazards
experienced by residents of China, India and Pakistan who are exposed to the
hazards of electronic recycling.

Manufacturers want to pre-empt a patchwork of state laws. "The reason we are
looking at a national solution is because it's the only way it can work," said
Kerry Fennelly, spokeswoman for the Electronics Industry Alliance, a group that
represents manufacturers. "We have to develop a system where everyone plays." 
*******************
Mercury News
Ex-AMD executive's suit alleges ethnic bias
By Howard Mintz
Mercury News

 
A former top executive at Advanced Micro Devices sued the chip giant Thursday,
claiming company founder Jerry Sanders and other leaders humiliated him and
forced him out after Sept. 11 because he is an Arab-American.

Walid Maghribi claims Sanders and AMD President Hector Ruiz repeatedly directed
ethnic slurs and jokes at him, according to a complaint filed in U.S. District
Court in San Jose.

Maghribi, a president of AMD's memory group and a member of the company's
executive staff, contends that both Sanders and Ruiz were directly responsible
for ruining his career in the months after the attacks. Next week, Ruiz is set
to succeed Sanders as the Sunnyvale company's chief executive officer.

AMD officials vigorously denied any allegations of discrimination against
Maghribi, who resigned in December from one of the highest-paying positions in
the company, earning more than $6 million last year. Calls to Sanders and Ruiz
were referred to AMD spokesman John Greenagel.

``The company's position is that it's utterly without merit,'' Greenagel said
of Maghribi's claims. ``We'll contest it.''

Among other things, the suit contends that Maghribi's troubles started at a
meeting in October with Sanders and a number of other top executives and
directors in which the subject of Sept. 11 came up. The suit maintains that
Sanders was startled to discover that Maghribi, who is Lebanese-born, was a
Muslim, at one point allegedly saying to him, ``You are not an Arab, right?''

Maghribi replied that, in fact, he was Arab and Muslim, and that Lebanon is an
Arab country. Sanders allegedly replied: ``No, it is not. You are not an
Arab!''

The suit contends that Sanders immediately began to treat Maghribi poorly after
the meeting. Within months, the suit contends, Maghribi went from one of AMD's
leaders to an outcast and he quit because of the hostile environment.

``I just don't get it,'' Maghribi, who lives in Los Gatos, said in an
interview. ``I was with AMD for 16 years. They promoted me five times. I was
making a tremendous amount of money. Nothing was ever done to me until that
date when we had that meeting.''

Several days after the meeting with Sanders, the suit contends that Sanders
withdrew his support for what the complaint describes as the biggest business
development deal AMD had going at the time. Maghribi was overseeing the
undisclosed deal, which was then effectively scratched, according to the suit.

From there, the suit contends that Maghribi, despite his status as one of AMD's
leaders, was given ``absurd'' and ``insulting'' tasks that forced him to quit.
The suit also contends that through this time period, Ruiz and others directed
``demeaning'' jokes about Arab nationals to both Maghribi and his wife.

The suit seeks unspecified damages. Maghribi, who has been in the United States
since 1969, when he arrived to attend college, said the experience was his
first encounter with discrimination at AMD.

``This was the first time that I saw an officer of the company could be treated
like a nobody,'' he said.
*********************
New York Times
Google Runs Into Copyright Dispute
By DAVID F. GALLAGHER

Google, the company behind the popular Web search engine, has been playing a
complicated game recently that involves the Church of Scientology and a
controversial copyright law. 

Legal experts say the episode highlights problems with the law that can make
companies or individuals liable for linking to sites they do not control. And
it has turned Google, whose business is built around a database of two billion
Web pages, into a quiet campaigner for the freedom to link.

The church sent a complaint to Google last month, saying that its search
results for "Scientology" included links to copyrighted church material that
appears on a Web site critical of the church. Under the Digital Millennium
Copyright Act of 1998, which was intended to make it easier for copyright
holders to fight piracy, the complaint meant that Google was required to remove
those links quickly or risk being sued for contributing to copyright
infringement. 

The site in question, Operation Clambake (www.xenu.net), is based in Norway,
beyond the reach of the United States copyright act. The site portrays the
church as a greedy cult that exploits its members and harasses critics. Andreas
Heldal-Lund, the site's owner, says the posting of church materials, including
some internal documents and pictures of church leaders, is allowable under the
"fair use" provisions of internationally recognized copyright law. 

When Google responded to the church's complaint by removing the links to the
Scientology material, techies and free-speech advocates accused Google of
censoring its search results. Google also briefly removed the link to Operation
Clambake's home page but soon restored it, saying the removal had been a
mistake. 

At that point, according to Matthew Cutts, a software engineer at Google, it
started developing a better way to handle such complaints. "We respond very
quickly to challenges, and not just technical challenges but also these sort of
interesting, delicate situations, as well," Mr. Cutts said. 

Under Google's new policy, when it receives a complaint that causes it to
remove links from its index, it will give a copy of the complaint to the
Chilling Effects Clearinghouse (chillingeffects.org). Chilling Effects is a
project of a civil liberties advocacy group called the Electronic Frontier
Foundation and several law schools. It offers information about Internet
rights issues. 

In the new procedure, Google informs its users when a link has been removed
from a set of search results and directs them to the Chilling Effects site. For
example, a search for the word "helatrobus," which appears in some Scientology
texts, brings up a page of results with this notice at the bottom: "In response
to a complaint we received under the Digital Millennium Copyright Act, we have
removed one result(s) from this page. If you wish, you may read the D.M.C.A.
complaint for these removed results."

The notice includes a link to Scientology's complaint on chillingeffects.org,
which lists the Web addresses of the material to which Google no longer links.
The result is that a complaint could end up drawing more attention to the very
pages it is trying to block. 

Mr. Cutts said Google started linking to chillingeffects.org early this month
but made no announcement, so it took a while for word to go around online.
Meanwhile, Scientology sent Google two more complaints, citing pages within
copies of the Operation Clambake site on other servers. All three complaints
are now on the Chilling Effects site. 

Don Marti, the technical editor of Linux Journal, first wrote about Google's
move on the magazine's site. He said he had been so upset about the company's
initial response to the Scientologists that he organized a small group of
protesters who visited Google's headquarters in Mountain View, Calif., where he
also lives. Mr. Marti says he now applauds Google's efforts to make the process
more transparent. If a letter of complaint simply makes a site more popular,
"only a fool would send one," he said. 

Helena Kobrin, a lawyer representing Scientology at the law firm of Moxon &
Kobrin in Los Angeles, said that Google's use of the letters of complaint would
not discourage the church from pursuing further complaints if necessary and
that there was nothing in the letters that needed to be hidden. "I think they
show very graphically to people that the only thing we're trying to do is
protect copyrights," she said. 

As part of its new process for handling complaints, Mr. Cutts said, Google
added more information on its site explaining how site owners could have their
links restored by filing a countercomplaint with Google. (The required forms
can be downloaded from chillingeffects.org.) If site owners take this step, he
said, they accept responsibility for the contents of their pages. 

Mr. Heldal-Lund, a Norwegian citizen, said he would not file a countercomplaint
because it would put him under the jurisdiction of United States law. He said
that he regretted making so much trouble for Google but was glad that the
incident had highlighted the church's pursuit of its critics. 

The church, which has beliefs based on the idea that people need to release
themselves from trauma suffered in past lives, has taken a keen interest in the
Internet since 1994, when someone posted secret church teachings on an online
discussion group. Critics say the church guards its teachings closely because
it wants its followers to pay for access to higher levels of instruction. The
church says that these payments are donations and that it is simply seeking to
protect its rights online.

With its Chilling Effects partnership, Google is subtly making the point that
the right to link is important to its business and to the health of the Web,
said David G. Post, a law professor at Temple University who specializes in
Internet issues. 

"This is an example where copyright law is being used in conflict with free
connectivity and free expression on the Net," he said. Dr. Post said Google's
situation highlighted the need for more awareness of copyright issues,
including pending legislation that is more restrictive than the 1998 law. The
measure is backed by entertainment giants like Walt Disney, but technology
companies like Intel have come out against it, saying it would hurt consumers
and slow innovation. 

Mr. Cutts said that the links to the complaints were not a political statement,
just a way to "make sure our users get all of the information that they need."
He said that Google had no official position on the copyright act and that so
far it had not been involved in political activity or lobbying. But he said it
"might take an interest in more of those issues." 

The copyright controversy has had an interesting side effect for Operation
Clambake. The Google software judges the importance of a page in part by
looking at how many other pages link to it. Scientology's complaint set off a
flurry of linking to the critics' site, pushing it up two spots to No. 2 in the
search results for "Scientology" -- just below the church's official site.
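
The link-counting idea is easy to sketch. The toy example below is only an
illustration, with made-up page names and a conventional damping factor, and is
not Google's actual algorithm; it shows how a PageRank-style score accumulates
for a page that suddenly attracts many inbound links.

    # Toy PageRank-style iteration over a tiny, made-up link graph.
    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = set(links) | {p for targets in links.values() for p in targets}
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, targets in links.items():
                for target in targets:
                    new_rank[target] += damping * rank[page] / len(targets)
            rank = new_rank
        return rank

    # Hypothetical graph: several pages begin linking to the critics' site.
    graph = {
        "news_story": ["xenu.net", "scientology.org"],
        "weblog_a": ["xenu.net"],
        "weblog_b": ["xenu.net"],
        "scientology.org": [],
        "xenu.net": [],
    }
    for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))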

********************
New York Times
Japanese Computer Is World's Fastest, as U.S. Falls Back
By JOHN MARKOFF

SAN FRANCISCO, April 19 -- A Japanese laboratory has built the world's fastest
computer, a machine so powerful that it matches the raw processing power of the
20 fastest American computers combined and far outstrips the previous leader,
an I.B.M.-built machine.

The achievement, which was reported today by an American scientist who tracks
the performance of the world's most powerful computers, is evidence that a
technology race that most American engineers thought they were winning handily
is far from over. American companies have built the fastest computers for most
of the last decade.

The accomplishment is also a vivid statement of contrasting scientific and
technology priorities in the United States and Japan. The Japanese machine was
built to analyze climate change, including global warming, as well as weather
and earthquake patterns. By contrast, the United States has predominantly
focused its efforts on building powerful computers for simulating weapons,
while its efforts have lagged in scientific areas like climate modeling.

For some American computer scientists, the arrival of the Japanese
supercomputer evokes the type of alarm raised by the Soviet Union's Sputnik
satellite in 1957.

"In some sense we have a Computenik on our hands," said Jack Dongarra, a
University of Tennessee computer scientist who reported the achievement today.
For many years he has maintained an authoritative list of the world's 500
fastest computers.

Several United States computer scientists said the Japanese machine reflected
differences in style and commitment that suggest that United States research
and spending efforts have grown complacent in recent years. For now, the new
computer will be used only for climate research, and American scientists have
already begun preparing to move some of their climate simulation research to
run on the Japanese machine.

"The Japanese clearly have a level of will that we haven't achieved," said
Thomas Sterling, a supercomputer designer at the California Institute of
Technology. "These guys are blowing us out of the water, and we need to sit up
and take notice."

The new Japanese supercomputer will have both scientific and practical
applications. It will be used for advanced modeling of theories about global
warming and climate change, and it will be able to predict short-term weather
patterns.

Advances in computer speed today routinely extend computer simulation into all
areas of science and engineering as complex calculations take an increasingly
shorter time. Because increases in computing power tend to have exponential
results, a problem that could take years for even the fastest computers today
might be finished in hours on the new Japanese computer.

The ability to track the path of a typhoon, for example, is of immediate
relevance to the island nation of Japan. Improved prediction made possible by a
more powerful computer might save lives and property.

Computer simulation has become a standard tool in both science and modern
design of products ranging from drugs to bicycles. Computers that are more
powerful make possible simulations that are more accurate and can reduce cost
and increase efficiency. At one time, for example, computers were capable of
computing the flow of air over a single airplane wing but can now cover the
entire aircraft.

The new Japanese supercomputer was financed by the Japanese government and has
been installed at the Earth Simulator Research and Development Center in
Yokohama, west of Tokyo. The Japanese government spent $350 million to $400
million developing the system over the last five years, according to Dr. Akira
Sekino, president and chief executive of HNSX Supercomputers, a unit of the NEC
Corporation based in Littleton, Colo.

The new computer was formally dedicated last month, and the Japan Marine
Science and Technology Center said yesterday that the machine had reached more
than 87 percent of its theoretical peak speed.

"This is a huge achievement for the Japanese," Dr. Sekino said.

NEC sells a scaled-down version of the new supercomputer. Several United States
universities and government agencies have tried to buy the machines over the
last decade for purposes like aircraft simulation, seismic studies and
molecular modeling. But sales have been thwarted by resistance from the
Commerce Department and members of Congress, who complained that NEC was
"dumping" the machines, or selling them below cost. Last year Cray Inc., a
United States maker of supercomputers, entered into a marketing agreement to
sell the machines in the United States, but no sales have been announced.

The NEC supercomputers are based on vector processing, a way of using
specialized hardware to solve complex calculations that was pioneered by the
American supercomputer designer Seymour Cray. The concept has generally fallen
out of favor in the United States in recent years.

Assembled from 640 specialized nodes that are in turn composed of 5,104
processors made by NEC, the new Japanese supercomputer occupies the space of
four tennis courts and has achieved a computing speed of 35.6 trillion
mathematical operations a second. The processors are linked in a way that
allows extremely efficient operation compared with the previously fastest
"massively parallel" computers, which are based on standard parts rather than
custom-made chips.

The earth simulator project is intended to create a "virtual earth" on an NEC
supercomputer to show what the world will look like under various climate
conditions by means of advanced numerical simulation. The system is intended to
serve as a research platform for international teams of researchers, and United
States scientists are planning to participate in new projects made possible by
the more powerful computer.

By comparison, the fastest American supercomputer, which until now held the
world computing speed record, is the ASCI White Pacific computer at the
Lawrence Livermore National Laboratory in California. Based on I.B.M.
processors, it has achieved a top speed of 7 trillion math operations a second.

Faster machines are being designed at government-financed labs in Livermore,
Pittsburgh and Los Alamos, N.M., but they are far from operational.

The Japanese supercomputer underscores a continuing debate within the computer
design community. One camp has argued for building massively parallel
supercomputers by chaining together thousands of off-the-shelf microprocessors.
That philosophy has come to dominate designs in the United States in recent
years. A second camp has pushed for computers made from specialized processors
dedicated to solving a particular class of problem. 

The vector processors used in the Japanese machine are an example of the second
approach, and they have long been used with great success for scientific
problems ranging from weather prediction to bomb design. 
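
The practical difference between the two camps can be illustrated loosely in
software. The snippet below is only an analogy, assumed for this clip rather
than taken from either class of machine: it contrasts an element-by-element
loop with the whole-array ("vectorized") form of the same arithmetic, which is
the style of work a vector processor is built to perform in hardware.

    # Illustrative only: scalar loop versus whole-array ("vector") arithmetic.
    import numpy as np

    n = 100_000
    a = np.random.rand(n)
    b = np.random.rand(n)

    # Scalar style: one multiply-add per loop iteration.
    c_scalar = np.empty(n)
    for i in range(n):
        c_scalar[i] = 2.0 * a[i] + b[i]

    # Vector style: the same operation expressed over entire arrays at once.
    c_vector = 2.0 * a + b

    assert np.allclose(c_scalar, c_vector)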

Scientists from the National Center for Atmospheric Research in Boulder, Colo.,
said they were planning to work with the Japanese earth simulation center to
convert United States weather modeling codes to work with the new computer.

"It's potentially quite significant for climate studies," said Dr. Tim Kalleen,
a space scientist who is director of the American climate research center. He
said his researchers were discussing with their Japanese counterparts the
technical details needed to make sure the advanced American programs will run
on the Japanese machine.
*********************
Reuters
U.S. Exporting Personal Internet Privacy Technologies 
Fri Apr 19,10:22 PM ET 
By Elinor Mills Abreu 

SAN FRANCISCO (Reuters) - While technologies to protect personal online privacy
have stalled in the world's richest nations, they're still
in grave demand from human rights workers in other countries, experts said at
the Computers, Freedom and Privacy conference that ended here on Friday. 

  
Five years ago there was a burgeoning consumer personal privacy market in North
America, with a growing list of software and services that allowed people to
maintain their anonymity on the Internet, said Ian Goldberg, chief scientist at
Zero-Knowledge Systems Inc., a Montreal-based Internet privacy provider. 

There were anonymous e-mail and Web surfing systems and a company called
DigiCash was preparing to roll out a service that would allow for anonymous Web
shopping. "The future for privacy enhancing technologies seemed promising," he
said. 

But the complexity of the technologies kept them from being widely adopted and
the free services were costing companies too much, according to Goldberg. For
example, Zero-Knowledge's Freedom Network, which allowed anonymous Web surfing,
was shut down because it was too expensive to run, he added. 

Meanwhile, a group of technologists in the United States is working on
exporting its privacy technologies to countries like Sri Lanka, Cambodia,
Russia and Guatemala, where keeping information secret is a matter of survival
for many people being helped by human rights organizations. 

San Francisco-based Benetech is a nonprofit technology company that is working
on a project dubbed "Martus," after the Greek word for "witness," said Marc
Levine, a senior product manager at Benetech. 

PGP LEGACY 

The group is following in the footsteps of Phil Zimmermann, creator of the PGP
(Pretty Good Privacy) encryption software used to scramble e-mail and other
electronic communications so they are read only by the intended recipient.
Zimmermann has said he invented the technology specifically to help human
rights organizations in eastern Europe and elsewhere. 
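
For readers unfamiliar with the mechanics, the sketch below shows roughly how a
field report could be encrypted to a recipient's public key with the
third-party python-gnupg wrapper around GnuPG. The tool choice, file names,
paths and address are assumptions made for illustration; nothing here is drawn
from Benetech's or Zimmermann's actual software.

    # Assumed illustration of public-key encryption with python-gnupg.
    import gnupg

    gpg = gnupg.GPG(gnupghome="/path/to/keyring")  # hypothetical keyring location

    # Import the recipient's public key (file name is hypothetical).
    with open("recipient_public_key.asc") as f:
        gpg.import_keys(f.read())

    report = "Names and testimony collected in the field."
    encrypted = gpg.encrypt(report, "rights-worker@example.org")

    if encrypted.ok:
        with open("report.asc", "w") as f:
            f.write(str(encrypted))  # ASCII-armored ciphertext, safe to e-mail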

Martus, which is expected to be launched later this year, is a simple-to-use
program that will allow nongovernmental organizations (NGOs) to keep their
sensitive information safe from governments and others accused of human rights
abuses, according to Levine. 

"I don't think that American consumers feel the dire consequences that human
rights NGOs feel by having their information read by the wrong eyes," he said. 

"For Martus it's a matter of life and death," said session attendee David
Singer, an engineer in Internet technology at International Business Machines
Corp. 

Personal privacy services failed to take off in the United States because
people didn't think the risk was high enough priority to hassle with the often
complicated systems, he said. 

"If you don't have a problem that people think needs to be solved, it doesn't
matter how good the technology is," people won't use it, Singer added. 

Since the market failed to take off, most privacy firms have begun reinventing
themselves to focus on offering software and services to help corporations
manage their online privacy policies, like Zero-Knowledge has done. 

One exception is Anonymizer.com, a service that allows anonymous Web surfing
for $5 a month. 

After simplifying its interface and features, the Web site "started making
money overnight," said President Lance Cottrell. The site now has "tens of
thousands" of paying customers, he added. 
*******************
Boston Globe
Life sciences boost slumping IT firms 

By John Dodge, Special to the Globe, 4/22/2002 

With 800 interconnected Compaq AlphaServers making up one of the world's
biggest supercomputers, Celera Genomics is capable of performing more than 250
billion genomic sequence comparisons per hour. 
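
As a toy illustration of what one such comparison can involve, the fragment
below, an assumed example unrelated to Celera's actual pipeline, scores two
equal-length DNA fragments by the fraction of positions at which they carry the
same base; production systems use far more sophisticated alignment algorithms.

    # Toy measure of similarity between two equal-length DNA fragments.
    def percent_identity(seq_a: str, seq_b: str) -> float:
        if len(seq_a) != len(seq_b):
            raise ValueError("fragments must be the same length for this measure")
        matches = sum(1 for a, b in zip(seq_a, seq_b) if a == b)
        return 100.0 * matches / len(seq_a)

    print(round(percent_identity("GATTACAGATTACA", "GATTGCAGATCACA"), 1))  # 85.7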

Such eye-popping computational power forms the foundation of today's genomic
and proteomic research - the identification of gene and protein functions -
which promises to radically transform medicine and drugs. Without cheap and
plentiful computational power, this potential revolution in drug discovery and
the industry growing up around it wouldn't exist. 

Explosive demand for computing in the life sciences couldn't come soon enough
for the likes of IBM Corp., Compaq Computer Corp., Oracle Corp.,
Hewlett-Packard Inc., and EMC Corp., which have endured 18 months of sluggish
information technology spending after years of double-digit growth. 

Research firm IDC calculated growth in information technology spending overall
fell from 11 percent in 2000 to 1.5 percent last year, with dollars for
hardware plunging 7.8 percent. Even stalwart IBM, which once seemed immune to
the malaise, last week posted a 32 percent drop in quarterly profit. 

The life sciences are a beacon amid the information technology spending gloom.
The life sciences include research and development, drug trials that can last
two years or longer, and subsequent sales and marketing spending in the far
from certain event a drug makes it to market. 

Picture the videos of baby sea turtles gliding across the beach toward open
ocean. Like the baby turtles, only a tiny fraction of drug compounds - one in
5,000, actually - ever reach maturity or, in the case of drugs, make it to the
marketplace. Raising the odds of survival even further is the drug-development
life cycle, which takes an average 10 to 15 years from concept to market,
according to the Tufts Center for the Study of Drug Development. 

Despite seemingly insurmountable economic odds, money spent on drug research
and development has risen steadily for years at a 15 percent clip, topping $30
billion in 2001, the Pharmaceutical Research and Manufacturers of America
reports. A large slice is spent on computers and developing software. 

IDC forecasts a 24 percent compound annual growth rate in information
technology spending for the life sciences, resulting in a $38 billion market by
2006. Frost & Sullivan, a researcher commissioned by IBM to size up the life
sciences market, is even more optimistic, putting the figure at $40 billion
within two years. 

''Life sciences are at the same point that we were in 1947 in electronics at
the invention of the transistor,'' says Dorman Followwill, the Frost & Sullivan
vice president for health care and life sciences. ''On the downstream side, you
see an aging baby boomer population that is going to be demanding a gigantic
volume of new drugs starting in five to 10 years. We're looking at this space
for the next 30 to 40 years.'' 

The new crop of genomic and proteomic drug-discovery companies, which are in
great demand by the big pharmaceutical firms whose track record in finding new
drugs is poor, are spending heavily on information technology. But that doesn't
mean all IT products - such as storage units, server arrays, and software - are
sucking up the lion's share of those dollars. 

''Certainly, we're big users of IT tools,'' says Mark Murcko, chief technology
officer at Vertex Pharmaceuticals Inc. in Cambridge. ''Year over year, we're
spending 50 percent more on IT.'' 

Growth figures like those may give computer company executives goose bumps, but
the majority of Vertex's technology spending - and like most biotech and
pharmaceutical companies, it won't give specific dollar amounts - goes for
people and databases containing genomic sequencing data from Incyte Genomics
Inc. 

''Our hardware and storage costs, while not trivial, are much smaller than our
people costs,'' Murcko says. ''I'm more likely to write a big check to someone
of high quality than I am to Compaq. We don't know enough to be able to say to
confidently make projections like we'll need 10 terabytes of storage [and that]
IBM, Oracle, and Sun [Microsystems Inc.] are the big winners.'' 

At Vertex, computers seem like a detail, not quite as mundane and utilitarian
as phones, but close. ''We have hardware from Silicon Graphics [and] Sun along
with PCs and Macs. The software we write will run anywhere,'' says Murcko. 

Software is the key. Vertex, a genomics-based drug-discovery company, writes
almost all its own software for the simple reason it isn't available anywhere
else. 

''We develop software for 50 different kinds of scientists, from people doing
cell assays, people who work with robots, people working the FDA to understand
safety requirements,'' Murcko explains. Hardware is to Vertex scientists what a
nail gun is to a house framer. The end game is finding lucrative new drugs. 

Those in biotech IT point out that having scientists run information technology
is like putting the inmates in charge of the asylum. Horror stories of 85
percent server ''uptime,'' which would get any IT director in the commercial
world fired, abound in the lab. 

''People who run biotech are highly scientific and tend not to be IT savvy,''
says Caroline Kovac, general manager of IBM's Life Sciences division. ''But
they can't run their business without very powerful IT infrastructures. They're
using genomic data as a competitive weapon. You can't do that without a lot of
fast computers.'' 

Indeed, someone has to make sense of it all. 

''We've got scientists where the science is, and that is in the research area.
And we've got business people working on the applications. We need things to be
up at all times,'' says John Atkinson, senior manager of information technology
infrastructure at Abgenix Inc., a Fremont, Calif., developer of human
antibodies that treat cancers and inflammatory diseases. 

The good news for technology companies is that Atkinson spent heavily in 2001,
shelling out $2.5 million to EMC alone for storage, double the previous year. 

''It will be bigger than that this year,'' he says. ''I can't even speculate.
When I came on board in June 2001, there were five people in IT. Now there are
30, and IT will double in 2002 from where it is now.'' 

At the same time, biotechnology companies are focused on squeezing jobs out of
the drug-discovery process. New Haven-based CuraGen Corp., with a quarter of
its 500 employees in information technology, is one such company. 

''We'll spend 20 to 25 percent more each year [on IT] as we expand into areas
we have not been in before,'' says John Murphy, the CuraGen chief information
officer. Many biotechs like CuraGen are moving into clinical trials and
developing drugs in addition to their traditional role of supporting large
pharmaceutical companies in identifying drug targets. 

Of course, CuraGen, like most biotechs, is expanding its work force. However, a
primary goal is to replace the lone scientist working on a bench with teams
using automation to complete big chunks of the drug-discovery process. 

''We're heavily invested in bioinformatics so we need fewer people,'' Murphy
says. ''We're fully computerized and [big pharmaceutical companies] are not.
They have legacy systems and would love to be able to do what we do. We can
[find new drugs] faster, cheaper, and with greater probability of success.'' 

But Murphy has the same complaint as Vertex's Murcko about software. ''The
industry is very good at building hardware, but not very good at building
software. We have to build it ourselves,'' he says. And that still means lots
of computer sales. 

The beneficiaries of all this spending are tripping over themselves in support
of this market. IBM, for one, has spent $200 million since August 2000 when it
created a high-profile Life Sciences division. 

''We've had biotech initiatives going back to the 1960s, but the decoding of
the human genome was when we saw this as substantial business opportunity and
we saw that it was going to explode,'' says IBM's Kovac. 

IBM life sciences sales, she says, are growing at a ''triple-digit rate''
annually. 

Driving the business as much as discoveries is the urgent need pharmaceutical
companies have to replenish the drug pipeline as patents expire. At the same
time, pharmaceutical companies face a crisis in drug-development productivity
and are throwing tens of billions in research and development dollars against
the problem. 

For instance, the Food and Drug Administration took an average of 16.4 months
to approve each of the 24 drugs approved in 2001, longer than the average
period in 1999 when 35 new drugs were added ''to the nation's medicine chest,''
according to PHRMA. 

''Large mergers have not solved the problem. Today, everyone is looking at the
new science and the human genome project and saying maybe this is the solution
to the problem,'' Kovac says. 

Incyte Genomics Inc., for instance, boasts that its databases can shave up to a
year off drug-development cycles. And the biotech companies claim their
integrated genomic approach to drug development, while yet largely unproven,
will give them a big advantage over the traditional methods employed by
pharmaceutical companies. 

It's little wonder that companies like Vertex, CuraGen, and others each have
dozens of lucrative and long-term partnerships with deep-pocketed
pharmaceutical companies to find new genomic and proteomic-based drugs. 

Meanwhile, other big players, such as Compaq in genomic research and HP with
the large pharmaceutical companies, market their life sciences efforts as
high-performance computing. Compared with IBM's, the pair's marketing efforts
look paltry, but the two companies, which will likely merge soon, both enjoy a
strong presence in life sciences. So does Oracle Corp. (Strangely, Microsoft
has yet to enter the life sciences arena.) 

But $30 billion in life-sciences research spending for 2001 probably won't
reignite the double-digit information technology spending growth suppliers saw
for the past three decades. After all, information technology products overall
have become a trillion-dollar industry. 

''This is not going to be the silver bullet that brings back [the 1990s],''
says Suresh Gunasekaran, a life-sciences analyst with Gartner Inc. ''Those were
unsustainable booms. But biotech is one of those handful of critical new
technologies that are going to spawn a whole new generation in IT. The real
headline is not dramatic growth for the next three years. It's going to be over
the next decade.'' 

John Dodge is executive editor of Bio-ITWorld, a monthly magazine about IT and
the life sciences. He can be reached at jdodge@xxxxxxxxxxxxxxxx 
*******************
Wired News
A Bad Year for Privacy 
By Declan McCullagh

SAN FRANCISCO -- Long before planes slammed into the World Trade Center and
anthraxed mail snarled Capitol Hill, privacy mavens had worried that a
terrorist attack would spur Congress to approve invasive new laws. 

Then came Sept. 11's deadly attacks, followed by President Bush signing the USA
PATRIOT Act the following month. 

Others, predicting that music and video could be locked up in ways that prevent
legitimate backups as well as illicit copying, had fretted that Congress might
make such protections mandatory. 

Then an influential senator proposed doing just that last month. 

These are trying times for technology activists, lawyers and other random
savants who gather each year for the ritual of the Computers, Freedom and
Privacy conference, which pits them against their ideological foes in
government and the entertainment industry. 

Last week's summit, which ended Friday, comes at a time when Hollywood is eager
to restrict technology in hopes of restricting piracy, and governments are
becoming positively entrepreneurial in testing new technologies of surveillance
and eavesdropping. 

Take face-recognition cameras, an awkward phrase describing closed-circuit
cameras tied to computers that attempt to match your face against existing
images in a database. 

The upside: Miscreants could be arrested, assuming the technology works as
described. The downside, depending on how the system is programmed: A record of
what you do in public for the rest of your life could be compiled, sorted, and
kept on file for police perusal. 
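In outline, such a system reduces each camera frame to a numeric description of
a face and compares it against the descriptions already enrolled in the
database, declaring a match when the two are close enough. The short Python
sketch below illustrates only that comparison step, using made-up feature
vectors and an arbitrary threshold; it is a conceptual illustration, not any
vendor's algorithm.

    import numpy as np

    # Hypothetical watchlist: name -> feature vector. Real systems derive these
    # vectors from face images (the hard part); the numbers here are invented.
    watchlist = {
        "subject_a": np.array([0.12, 0.80, 0.35, 0.44]),
        "subject_b": np.array([0.90, 0.10, 0.55, 0.23]),
    }

    MATCH_THRESHOLD = 0.25  # assumed tolerance; tighter means fewer false matches

    def best_match(captured):
        """Compare a captured face vector against every enrolled vector."""
        best_name, best_dist = None, float("inf")
        for name, enrolled in watchlist.items():
            dist = float(np.linalg.norm(captured - enrolled))
            if dist < best_dist:
                best_name, best_dist = name, dist
        if best_dist <= MATCH_THRESHOLD:
            return best_name, best_dist
        return None, best_dist  # distance too large: no match reported

    print(best_match(np.array([0.14, 0.78, 0.33, 0.47])))  # close to subject_a
    print(best_match(np.array([0.50, 0.50, 0.50, 0.50])))  # no enrolled match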

Public outcry over such facecams prompted the Winter Olympics and the National
Basketball Association to pledge not to install them. But after last September,
a nation newly-conscious of security began to install them in airports and some
city streets. 

A representative of the American Civil Liberties Union, who spoke at the
conference, said facecams simply don't work. 

"It has utterly failed, sometimes in astounding ways," said ACLU associate
director Barry Steinhardt. "When a technology demonstrably does not work, we
ought not to use it. We don't even have to debate the privacy issues." 

In January, the ACLU released a report saying that in Tampa, Florida -- the
first city to adopt the surveillance system -- the facecam network never ID'd a
single person present in the database. 

Another panel pitted a Justice Department attorney, Chris Painter, against
activists and a representative of UUNet, the network provider. Painter said
that criticisms of the USA PATRIOT Act, which handed police unprecedented
surveillance power, were misguided and/or based on a misreading of the
complicated law. 

During an "award ceremony," Privacy International announced that Attorney
General John Ashcroft would receive one of this year's not-very-coveted awards:
a golden statue of a jackboot crushing a human head. 

Ashcroft, who once likened criticism of the Bush administration to treason and
lobbied for the USA PATRIOT Act's wiretapping expansion to have no expiration
date, received "Worst Government Official." 

Ashcroft partially won. Only a small portion of the USA PATRIOT Act will expire
in December 2005 unless Congress votes otherwise. Permanent sections grant
police the ability to conduct Internet surveillance, without a court order in
some circumstances, secretly search homes and offices without notifying the
owner at the time, and share confidential grand jury information with the CIA. 

Also exempt from the expiration date are investigations underway by December
2005, and any future investigations of crimes that took place before that date.


The only other topic that drew as much debate among conference attendees was
the future of intellectual property: Call it the post-Napster conversation on
copyright. 

On Thursday afternoon, panelists role-played what might happen if a researcher
was arrested under the Digital Millennium Copyright Act (DMCA) for presenting
his research at a conference -- a thinly disguised reference to Russian hacker
Dmitri Sklyarov, whom the FBI nabbed last August at the Defcon convention. 

The consensus, however, seemed to be that the DMCA does not prohibit merely
presenting a paper. The law restricts distributing a technology, device, or
component that could be used to circumvent copyright -- and if you sell such a
product, it's a federal crime. 

Sklyarov's company, Elcomsoft, sold a product to remove copy protection from
Adobe e-Books, which is why the Justice Department says it is prosecuting the
firm. Sklyarov is no longer in danger of prison time if he testifies against
his employer, according to a deal announced last December. 

Even though the recording industry threatened Princeton University professor Ed
Felten with DMCA charges over a paper last year, the threats were likely
spurious. Jessica Litman, a professor of law at Wayne State University, said
she thought it was "very unlikely" a judge would apply the DMCA to scientific
research. 

A bill introduced last month by Senate Commerce chairman Fritz Hollings
(D-South Carolina) would go further than the DMCA, prohibiting the sale or
distribution of nearly any technology -- unless it features copy-protection
standards to be set by the federal government. 
*****************
New York Times
If Bertelsmann Wed Napster, It Could Sue Itself, and More
By MATT RICHTEL

By pursuing a possible deal to buy the music-trading service Napster, the
German media conglomerate Bertelsmann is opening the unusual possibility that
it may be financing an antitrust investigation against itself.

The deal to buy Napster is being promoted by Bertelsmann's chief executive,
Thomas Middelhoff, who told a German newspaper this month that he was willing
to pay $15 million to $30 million, in addition to the $85 million that his
company has already lent Napster over the last few years. He said he believed a
for-pay version of Napster could still become the Internet's most successful
service.

At the same time, BMG Records, a division of Bertelsmann, has joined four other
record labels in pursuing a lawsuit against Napster for copyright infringement.
That suit succeeded in knocking the service off the Internet last July.

Napster, in turn, has said the case against it should be thrown out because the
record companies had engaged in antitrust activities, joining to thwart
competition to their own Internet music services. A judge has given Napster
permission to gather evidence from the record companies for its claims.

So if Bertelsmann buys Napster, it will have two of its divisions on opposite
sides of a serious legal divide, with billions of dollars in damages at stake.
As Bertelsmann is only one of a number of plaintiffs seeking damages from
Napster, it could not on its own simply drop the suit.

Bertelsmann officials refused to comment.

With negotiations continuing, the digital music market continues to evolve.
Even as the new Napster and other services seek to create for-pay exchanges, a
new generation of services has emerged that continues to give consumers the
option of tapping into free music on the Internet. Some of these services are
also being sued by the major record companies.

Eric Scheirer, an independent music industry analyst, said it was unlikely that
Bertelsmann would buy Napster for the express purpose of closing the antitrust
investigation by Napster's defense team. But Mr. Scheirer said that it was
also unlikely that the company would continue to finance Napster's legal
defense. "It makes no sense to be investigating itself," he said.

Rather, Mr. Scheirer said, the probable outcome is that the sides will settle
in "rapid order" because neither Bertelsmann nor its fellow record labels want
to see Napster's investigation proceed.

Adding another wrinkle, the Justice Department said in October that it had
started an antitrust investigation of its own into whether the record companies
have misused their copyrights to dominate the digital market.

Legal experts said that government investigators could have additional
ammunition if Bertelsmann buys Napster and then shuts down the service's
antitrust investigation.

The future of Napster, however, depends on its ability to transform itself into
a for-pay service. The company, which in its heyday was used by millions of
people each day to exchange free music, has been running low on finances.

Under its chief executive, Konrad Hilbers, himself a former Bertelsmann
executive, Napster has been seeking to license music from the same record
labels that succeeded in ending its free service. 

Mr. Hilbers has said he has made progress, but he has not signed any deals yet,
and his plans to start the service by the end of the first quarter have failed.

That appears to account in part for Mr. Middelhoff's interest in taking control
of Napster. "Middelhoff thinks he can work out the deal that Napster hasn't
been able to do," said a person close to the negotiations, who then added,
referring to the record labels, "He thinks he's going to be able to convince
people to come along."

Still, parties close to Napster and Bertelsmann say talks are grinding slowly,
partly because of an internal dispute on Napster's own board. John Fanning, the
uncle of Shawn Fanning, Napster's founder, has sued to wrest control of the
board from Hummer Winblad, the Silicon Valley venture capital company that
invested around $15 million in Napster.

Napster's very existence could depend on the company being acquired. People
close to the company say that without receiving a new capital infusion -- from
Bertelsmann or elsewhere -- it could exhaust its money in the next two months.
Last week, the company laid off 30 people to cut expenses.

For Bertelsmann's part, even a $30 million investment would be a relatively
minor capital outlay.

That said, Mr. Scheirer, the music industry analyst, said he was not convinced
that Bertelsmann would be making a sound investment. He noted that since
Napster has gone offline, many new services have emerged, with names like
LimeWire, KaZaA and Grokster, that have replaced Napster.

"I suppose somebody could honestly believe that the Napster brand name is worth
something, but I don't agree," Mr. Scheirer said. "I don't know what's left
that you're buying."
********************
MSNBC
Catholic scandal debate rages online 
   
Net proves popular outlet for anger, angst

April 19 -- As the sexual abuse scandal battering the Catholic Church has
intensified, rank-and-file churchgoers have flocked to Web sites and online
forums devoted to Catholicism to vent anger and sadness, express support for
the church and debate what the actions of a few priests might mean for the
future of their faith.

       POSTINGS TO DISCUSSION boards on church-related sites reveal that
Catholics are experiencing a wide range of emotions -- anger, remorse, despair
and defiance chief among them -- over disclosures that church leaders apparently
shielded pedophile priests from prosecution.
       "In a sense, (the priests) have betrayed Christ much like Judas," read a
post by Robert on a Catholic.org discussion group that typifies the soul
searching taking place online. "At this point, I can't imagine anything worse."
       "I think the Catholic Church will be all the better if wrongs are
admitted and corrected, including the dismissal and/or resignation of people in
authority who have committed grievous wrongs," wrote another poster, Karl. 
       Others see the crisis as a test of their beliefs.
       
"IS ANY FAITH LEFT?"
       "Maybe the Lord is asking if there is any faith left in the United
States?" a poster who identified herself as Gayle asked rhetorically on
Catholicity.com.
       Timothy Harrison, director of interactive services for Catholicity.com,
said that most of the postings to the site's discussion groups have been
supportive of the church.
       "There are certainly people that are concerned, but the overwhelming
opinion isn't a condemnation of the church, it's an attitude that we need to
beg our Lord for mercy and really pray for our church," he said, adding that
traffic at the site is up more than 20 percent in recent weeks.
       In many ways the online discussion reflects the public debate taking
place among the approximately 64 million American Catholics on such topics as
the necessity for a shake-up of the U.S. church?s leadership and the wisdom of
continuing the requirement that priests take a vow of celibacy. 
       But the Internet version frequently is more robust and free-wheeling. At
times, the tone of the exchanges can be decidedly un-Christian, especially on
Internet news groups like alt.religion.christian.roman-catholic.
       Catholics and other interested observers also have been tapping into
opinion from a wide range of online columnists, both church critics like Andrew
Sullivan, the openly gay former editor of the New Republic magazine who writes
regularly about the church, and staunch defenders like Deal Hudson, editor of
Crisis Magazine, a Catholic monthly, and a spokesman for the Catholic
Leadership Conference. 
       
RELIGION SITES' TRAFFIC RISES
       The outpouring of opinion on the scandal and its impact is apparently at
least partly responsible for an overall surge in interest in religious-oriented
Web sites. According to Internet-usage statistics released this week by Jupiter
Media Metrix, the religion sector saw its "audience reach" increase by 54
percent in March, to 6.4 percent of the total online population. That capped a
run-up that has seen traffic to religion sites double in the past six months,
the report said.
      The report said that "intense media attention on religion" was one
element fueling the growth, but said that the major religious holidays Easter
and Passover in March also were contributors.
       The media attention lavished on the sexual abuse scandal has itself
become a hot topic in online forums on sites like Catholic.org and
Catholicity.com, a privately run site devoted to Catholicism.
       "There has been a lot of discussion about media bias on our discussion
boards," said Catholicity.com's Harrison. 
       He said that many Catholics believe the media has exaggerated the scope
of the problem because of the church's high profile.
       "There is a lot of statistical evidence that the percentage of priests
(who are sexual predators) is by no means above the norm," he said. "Also,
incidents involving other Christian and non-Christian denominations that don't
draw the attention that the church does as an institution receive little
coverage."
       Tony, a defender of the church on numerous online forums, sees a
conspiracy behind the coverage of the story.
       
EFFORT TO UNDERMINE CHURCH SEEN
       "This is a horrible thing, but what is almost as horrible is the way
that the enemies of the Catholic Church and their willing puppets in the media
have latched onto this as a lever to undermine the authority of the Church and
the Holy Father," he wrote on the Yahoo discussion group devoted to the
Catholic Church.
       Other hot topics on the forums are the future of Cardinal Bernard Law of
the Boston Archdiocese, who has come under fire for his handling of a priest
accused of being a serial child molester, and what will come out of next week's
Vatican meeting between Pope John Paul II and the U.S. cardinals.
       Reflecting a recent poll showing that a majority of the faithful in the
archdiocese want him to resign, many engaging in the online discourse say that
Law must go.
       "I don't trust his judgment and I don't think he is fit to lead the
Catholic Church in this area," a poster who gave her name as Karen wrote on
the alt.religion.christianity.roman-catholic news group.
       But the cardinal also has numerous defenders in the discussion forums.
       "As sadly tormented as the victims of sexual abuse must be, they too
must surely realize that without Cardinal Law's intervention and disclosure,
how much worse their plight may have been," wrote William on Catholic.org.
       As for the meeting in Rome, some contributors to the online forums are
hopeful that the gathering itself signals that church leaders are ready to do
something.
       "Does anyone else think this meeting may be a much needed first step
toward re-examining the requirements of celibacy?" posed Robert, the
Catholic.org poster. "Or perhaps re-examining other areas of human sexuality?"
       But other writers, like the one who identified himself as Paul on
Catholicity.com, say they expect nothing to come out of the emergency session.
       "I know we are supposed to have hope, but I have none," he said.
********************
USA Today
Google: Reigning champ of the online search
By Michael Liedtke, Associated Press

SAN FRANCISCO -- In the rarefied world of online searches, it looks like Google
remains the engine of choice.

At least that's what we found in an unscientific test that pitted Google's
powers against the tools of Teoma, an industry upstart claiming that it has
developed a better way to explore unfamiliar turf on the Web.

The duel consisted of seven widely divergent questions provided by Michael
Bass, the director of the Associated Press' News and Information Research
Center.

The questions have either recently come up in AP stories or in an investigative
reporting class that Bass teaches at New York University.

I posed them as well to two other highly touted search engines, alltheweb.com
and wisenut.com, as well as AltaVista, a pioneer that lost its way a few years
ago during the dot-com boom.

In all cases, I used the most elementary of search techniques, entering the
same keywords from each question into each engine.

While all the engines fared reasonably well on most questions, none approached
Google's processing speed or ability to provide relevant links to the answers.
What's more, Google was the only engine to guide us in each case to the
requested information on its first page of results.

These two questions stymied all the other engines:

"What Pulitzer Prizes did the New Orleans Times-Picayune win and in what year?"
and "What is the name of the song featured in the Mitsubishi commercials with
the lyrics, 'I wish that I knew what I know now, when I was younger?'"

It took just 0.24 seconds for Google to provide me with a link to a page on the
Times-Picayune's Web site, where I learned the New Orleans paper had won two
Pulitzers, both in 1997 -- one for public service and another for editorial
cartooning.

Google took even less time -- 0.19 seconds on Google's clock -- to answer the
question about the Mitsubishi song, even though I initially misspelled
Mitsubishi in the search term.

In a nice demonstration of Google's intuitive powers, the search engine still
figured out what I really meant and provided a link to an online discussion
board, where I learned that the Mitsubishi ad used a 1973 song called Ooh La
La, written by Ron Wood and Ronnie Lane and sung by Rod Stewart. On this
question at least, more authoritative Web sources seemed harder to come by.

Google's performance seemed even more impressive after seeing how the question
about the Times-Picayune fooled the other engines.

Both Teoma and AltaVista provided a high ranking to an MSN Money page that
informed me its managing editor used to work at the New Orleans paper and
several of its staffers had won Pulitzer Prizes during the 1980s and 1990s.

Alltheweb pointed me to a Web page featuring a schedule for last month's
Tennessee Williams/New Orleans Library Festival. The page listed scheduled
speeches by two former Pulitzer Prize winners and a former Times-Picayune
cartoonist.

Google's database, the largest of those tested, appears to give it a major
advantage over its rivals. The Mountain View-based company says it draws upon
an index of 3 billion documents.

Teoma's owner, Ask Jeeves, insists Google's index is littered with junk links.
Teoma believes it does a better job of filtering useless links, one of the
reasons its index consists of just 200 million pages. Teoma says it will be
expanding its database.

This is not to suggest the other engines are clueless. They all provide useful
road maps for getting around online.

Teoma looks especially promising as it continues to develop a new format it
unveiled along with souped-up search tools this month. An easy-to-use "refine"
button helps focus search requests, which helped with some of our queries but
not with others.

The refine tool appears especially useful if you are entering a broad search
term such as "lincoln" that could be interpreted in various ways. Enter that
word into Teoma's engine and the "refine" feature will provide several
subcategories, including Abraham Lincoln, Lincoln Benefit Life and Lincoln,
Neb.

Teoma's "resources" category also is a handy way to find more experts on
topics. Despite the intrigue of Teoma's extra bells and whistles, Google
remains my first stop for online directions.
******************
Federal Computer Week
Army proxy server closes Web back door
BY Dan Caterinicchia 
April 22, 2002

As part of a larger effort to scour the Internet for sensitive information, the
Army has set up a "proxy server" on which it can host its Web sites for public
viewing without opening a back door for hackers.

Lt. Col. John Quigg, branch chief for the network security improvement program
under the Army's director for information assurance, likened it to a museum
setting up a monitor that would allow visitors to look through a historical
document, while the document remains in safekeeping.

Quigg acknowledged that no system is hack-proof but said that so far no site
protected by the proxy server has been breached.

John Pescatore, research director for Internet security at Gartner Inc., said
the proxy server is basically an "application-level firewall" that has been
used for some time in the private sector, especially in the banking and
financial industries.

About 67 percent of Web servers are susceptible to "content-changing" hacks,
but good application-level firewalls "get that down to less than five percent,"
Pescatore said.
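In rough terms, the arrangement both men describe works like a reverse proxy: a
hardened public machine answers all Web requests itself and relays only vetted
ones to the protected server, so outside visitors never connect to the origin
machine directly. The Python sketch below illustrates that general idea; the
internal address, port and blocked paths are invented for the example and are
not details of the Army's or any vendor's product.

    # Minimal reverse-proxy sketch: refuse disallowed paths, forward the rest.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    INTERNAL_ORIGIN = "http://10.0.0.5:8080"   # protected Web server (assumed)
    BLOCKED_PREFIXES = ("/admin", "/cgi-bin")  # crude application-level filter

    class ProxyHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The policy check happens here, before anything reaches the origin.
            if self.path.startswith(BLOCKED_PREFIXES):
                self.send_error(403, "Request refused by proxy policy")
                return
            try:
                with urlopen(INTERNAL_ORIGIN + self.path, timeout=5) as resp:
                    body = resp.read()
            except OSError:
                self.send_error(502, "Origin server unreachable")
                return
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Visitors talk only to this process; the origin stays out of reach.
        HTTPServer(("0.0.0.0", 8000), ProxyHandler).serve_forever()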

The protection of its public Web sites is part of a larger effort the Army
began in October when it established a Web Risk Assessment Cell of about 30
people to identify sensitive content on the service's public Web sites.

Quigg said the team, which includes a contractor as well as Army personnel, is
working through and adding to a candidate list of sites, using keyword searches
to locate Army content on non-Army IP addresses, or sensitive data that the
service might need to remove or protect.

For example, the History Channel's Web site might be flagged because it
contains the words "top secret," but because it was referring to World War II
information, no action is necessary, he said.

However, sites with potentially useful information for U.S. enemies, including
anything from maps containing data on ammunition dumps to personal information
about an Army commander or other top officers, are taken down, cleansed or made
secret.
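At its simplest, that part of the cell's job is a keyword sweep: fetch each
candidate page, look for sensitive terms and queue anything that matches for a
human reviewer, who may decide, as in the History Channel example, that no
action is needed. The Python sketch below shows the general idea; the URLs and
keyword list are invented, and this is not the cell's actual tooling.

    from urllib.request import urlopen

    CANDIDATE_URLS = [                      # hypothetical candidate list
        "http://www.example.org/unit-history.html",
        "http://www.example.org/base-map.html",
    ]
    SENSITIVE_KEYWORDS = ["top secret", "ammunition dump", "deployment schedule"]

    def flag_pages(urls, keywords):
        """Return (url, matched keywords) pairs for pages needing human review."""
        flagged = []
        for url in urls:
            try:
                text = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except OSError:
                continue  # unreachable pages are skipped, not flagged
            hits = [kw for kw in keywords if kw in text.lower()]
            if hits:
                flagged.append((url, hits))
        return flagged

    if __name__ == "__main__":
        for url, hits in flag_pages(CANDIDATE_URLS, SENSITIVE_KEYWORDS):
            print("REVIEW:", url, "->", ", ".join(hits))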

The cell started with Army sites and is in the process of including Reserve and
National Guard sites and personnel, Quigg said. The Pentagon began a similar
process when former Defense Secretary William Cohen approved the Joint Web Risk
Assessment Cell in 1999. 

"The fact that the Army is doing this is better late than never," Pescatore
said. He added that he remembered issuing Web site security warnings for
defense and civilian agencies as far back as 1996, "and we've only seen it get
worse." 

"This is something everybody with sites on the Internet, and certainly the
[Defense Department], should have been doing five years ago," he said.

***

What is a Web Risk Assessment Cell?

The Defense Department opened a Joint Web Risk Assessment Cell in 1999 to
monitor DOD Web sites for sensitive data that could compromise military
operations or personnel. Last October, the Army established its own Web Risk
Assessment Cell, consisting of about 30 people responsible for identifying
sensitive content still available on public Web sites.
********************
Federal Computer Week
Grown-ups like county kids' site 
BY Brian Robinson 
April 22, 2002

When Miami-Dade County in Florida started its Web site for children
(kids.miamidade.gov) in October, its goal was to educate kids about county
government and encourage their participation in civic life. Now it seems that
the adults want to get in on the fun.

A recent addition to the site, a Cool Careers section, aimed to let kids know
that there are lots of interesting jobs in government other than police and
firefighting. Children can learn about government careers and what they need to
do if they want to work in government when they grow up.

"Then we started to find there were a lot of adults who were finding things out
about the county that they didn't know before from the site," said Judi Zito,
director of e-government for Miami-Dade.

The upshot was a secondary goal for Cool Careers: serving as a place where
county workers can find out what their fellow employees do in the large
Miami-Dade government.

Now, other parts of the government want to get in on the action with a spot on
the aquatic-themed site. The county libraries will soon be included, the Water
and Sewer Department wants to add a section on water conservation, and the
Public Safety Department link will soon appear as a jellyfish with a rotating
light on its head.

Zito believes adults are drawn to the site because of its playful approach to
the issues and its "child-like" simplicity.

The site received much more attention than was first expected, and Zito hopes
that all the activity will generate some money.

Created on less than a shoestring budget with donated labor, the site has
become a creative magnet for many content creators in the county government.

And now that the county school district is trying to find ways to introduce the
site and its approach in classrooms, Zito said she is thinking about asking the
county for funding.
*********************
Federal Computer Week
Virtual IT job fair draws attention

A virtual information technology job fair, jointly sponsored by the CIO Council
and the Office of Personnel Management, starts today, and it has already
attracted the attention of thousands of job seekers nationwide.

Monster.com, the largest employment Web site in the country, began running a
banner ad for the job fair on its home page April 15, and within two days, the
ad had received almost 20,000 hits, according to Pat Popovich, deputy chief
information officer at the State Department, who is organizing the online event
for the CIO Council. 

"We are rolling," Popovich exulted about public response to the banner ad,
which will also run at various times on HotJobs.com, Dice.com and
WashingtonJobs.com.

Twenty-six federal agencies -- including the Defense Department and the National
Gallery of Art -- are participating in the job fair, which runs through April 26
on OPM's USAJobs Web site (www.usajobs.opm.gov).

The job fair also will include a new IT competency-based job profile for use on
a pilot basis, Popovich said. The profile, developed by OPM last year, tests
applicants on such characteristics as interpersonal skills, problem-solving,
decision-making and project management, in addition to IT skills.
******************
Federal Computer Week
Picking up the pace at the border

The U.S. Customs Service has announced a plan to enlist the help of private
companies in tightening border security in exchange for processing their
imports more quickly.

Under the plan, 60 companies -- including General Motors Corp., Target Corp.
and Sara Lee Corp. -- will equip their trucks with transponders that will
electronically transmit information about the cargo to border points at
Detroit; Port Huron, Mich.; and Laredo, Texas. The program, announced April 16,
eventually will be expanded to every land entry in the United States.

Importers also agreed to work with the government to boost security in their
supply chains, including conducting more stringent employee background checks
and closer scrutiny of goods being shipped across the border. Participating
companies must conduct security assessments and enforce standards every step of
the way, and the cargo must be sealed to prevent tampering.

In exchange, those companies' imports will pass through border crossings in
seconds instead of waiting up to 12 hours for inspections as some trucks did in
the wake of the Sept. 11 terrorist attacks. 

Although waiting times have returned to pre-Sept. 11 levels of 30 minutes or
less, importers are still concerned about the potential for delays. Even
routine questioning of drivers can increase the time it takes to cross the
borders.

Tom Wickham, a spokesman for General Motors, said the program is "a step in the
right direction in terms of improving security at the border and allowing the
smoother flow of trade between countries." 

Using a transponder, a truck could clear customs in 17 seconds, he said. Under
the current system used by the majority of haulers, it can take anywhere from
several minutes to two hours.

The new system would speed the process and make it easier for agents to focus
on trouble spots, said Angela Ryan, Customs' port director for Detroit, which
handles 25 percent of the imports from Canada.

"We're confident that between what importers are doing and what we're doing,
we've got security covered," said Ryan, whose port handles 6,000 trucks a day
crossing the Ambassador Bridge.

If a truck does not have a transponder, the driver can present a bar code to
customs officials for scanning. The electronic data will appear on a screen for
an agent to inspect before allowing the truck into the United States. The
program also has dedicated commercial lanes for trucks participating in the
program, Ryan said.

Customs officials have been working since Sept. 11 to reduce wait times by
increasing the number of agents at the borders and paying them for overtime.
But with the need to tighten security and conduct more thorough inspections,
the backups at border points sometimes last for several hours.

"What they are trying to do is build on an industry partnership, and some big
chunks of industry are stepping up to the plate," said Sam Banks, director of
customs work for Sandler & Travis Trade Advisory Services Inc. and a former
deputy commissioner at Customs.

The plan also means that customs officers will be able to spend more time on
questionable deliveries, according to Olga Grkavac, an executive vice president
with the Information Technology Association of America. 

"It is using 21st-century technology and allowing customs officials to focus
their time on national security threats and illegal contraband rather than
legitimate deliveries that go through the borders every day," she said.

An additional 100 companies have submitted applications to join the program,
and they will have to meet rigorous requirements before getting approval.

Office of Homeland Security Director Tom Ridge praised the program when it was
unveiled at the U.S.-Canadian border in Detroit.

"We will enhance security," Ridge said. "We will facilitate commerce. And in
the end, we'll be a safer...country."

The program is part of the $1.3 billion Customs modernization program, which is
expected to replace a paper-based system with a Web-based program, according to
Charles Armstrong, executive director of the Customs modernization office.

Armstrong said that officials from the United States and Canada hope to enter
into an agreement soon that would expedite cargo entering Canada from the
United States.
******************
Federal Computer Week
NMCI decision set for May 3


Decision day for the Navy's initiative to create a single network for its
shore-based facilities will be May 3, Defense Department officials have told
the Navy.

The decision is a significant milestone for the Navy Marine Corps Intranet. The
law authorizing NMCI stipulated that the Navy would first roll out 60,000 seats
to prove the feasibility of the concept.

Under a September 2001 agreement, John Stenbit, DOD's chief information
officer, and Michael Wynne, deputy undersecretary of Defense for acquisition
and technology, must approve the pilot sites' progress before the project can
proceed. That would allow the Navy to lease an additional 100,000 seats of the
411,000 total seats.

During an April 12 meeting, Pentagon officials told NMCI director Rear Adm.
Charles Munns that they would make the decision May 3.

NMCI is a $6.9 billion initiative to eliminate hundreds of separate networks
and create a single one for all of the Navy's shore-based sites.

Munns, who was named NMCI director in February, has been intensely focused on
the "milestone one" decision and testing the network, which will largely
provide the data for DOD's decision.

"We are happy with the progress we have made," Munns said. "We believe we are
moving in the right direction in the right way, and we have a firm date for the
decision."

Meanwhile, the Navy is already investigating the possibility of extending the
current contract with lead vendor EDS, Munns said.

Navy officials are considering modifying the contract so the start date would
coincide with the approval of milestone one.

The Navy would like to have some time to use NMCI before officials are faced
with entering the contract's three-year option period, Munns said April 11 in a
speech at AFCEA's Naval Information Technology Day in Vienna, Va.

NMCI is a five-year contract valued at $4.1 billion, with an additional
three-year option that brings the total value of the contract to $6.9 billion.

NMCI officials said that the idea behind the extension is for the Navy to start
the five-year contract period when NMCI passes its first milestone. Otherwise,
the Navy would have to make a decision about the three-year option about two
years after it gets all of NMCI's seats rolled out. Munns said the target date
for that is December 2003.

***

Milestones for NMCI

Under the Navy Marine Corps Intranet contract, the Navy is authorized to
initially roll out 60,000 seats. The first sites must pass certain tests before
the initiative can proceed to other sites. Pentagon and Navy officials have
established three milestones for the contract:

Milestone 1 -- Adding 100,000 seats, bringing the total to 160,000 seats.
Decision expected May 3.

Milestone 2 -- Adding 150,000 seats, bringing the total to 310,000 seats.
Deadline has not been determined.

Milestone 3 -- Adding 101,000 seats, completing NMCI's rollout across
shore-based facilities by December 2003.
**********************
New York Times
Seems Computers Baffle 10-Year-Olds, Too

The next time your Web browser freezes, you may not want to assume that the
nearest 10-year-old can fix the problem. A new study indicates that, contrary
to prevailing wisdom, children are often as baffled by technology as adults. 

One difference, though, is that they are not as intimidated by digital gadgets
and their inevitable meltdowns. "A lot of adults, if a scary dialog box comes
up, they feel that they have blown up their computer -- that reaction you don't
see among the kids," said Jakob Nielsen, principal of the Nielsen Norman Group,
the consulting company that conducted the study. 

Dr. Nielsen focuses on an area of consulting known as usability, examining how
easy or difficult it is for Internet users -- in this case, children in grades
one through five -- to navigate various Web sites. By observing 55 children using
sites for children, like SesameStreet.com and ABC News for Kids, as well as
sites like Yahoo and Amazon.com, Dr. Nielsen and his co-author, Shuli Gilutz,
found that children were just as likely to become frustrated by poorly designed
Web sites as adults were.

In general, the study found that if an activity on a Web site was not
immediately satisfying, young visitors moved on. Although children tend to like
animation and sound effects more than adults do, they have been warned by
parents not to download anything and so tend to skip a feature if it requires a
special plug-in. Site designers also tend to overestimate children's language
skills, the study found, often using complicated or vague words that prevent
younger visitors from understanding the choices that are available. 

Another difference between young Internet users and adults is that children
click on ads more often -- apparently unable to distinguish them from content
even when they are clearly marked. 

Dr. Nielsen speculated that parents and teachers had not spent much time
teaching children how to recognize advertisements on the Web because adults
have tuned them out. But the study found that children had been taught about
privacy and been warned not to give out personal information online. "Which I
think is the good news," Dr. Nielsen said, "because it shows that it's possible
to educate children about the Internet." 
********************
New York Times
Japan Slow to Accept New Phones
By KEN BELSON

TOKYO, April 21 -- Mikio Fukai would seem to be a wireless phone company's dream.
At 26, he is young, upwardly mobile and technology-savvy, just the type
marketers expect to drop hundreds of dollars on fees every month. Yet in the
latest pitch for his pocket money -- so-called third-generation cellular phone
service -- Mr. Fukai is noticeably cool.

Instead of running out to buy the latest handset packed with high-speed video
and audio functions, Mr. Fukai, a salesman for Compaq Computer in Tokyo, is
staying with his second-generation i-mode phone from NTT DoCoMo. Although
intrigued by a handset that is part personal digital assistant, part portable
entertainment center, he is satisfied doing what most cellphone users do: talk
and send occasional e-mail.

"I am extremely interested in the functions on the 3G phones," said Mr. Fukai,
who will spend his extra cash on a new broadband connection for his apartment.
"But the critical problem is that the screens are not spacious enough to read
letters, and the buttons are too small."

Across town, Yukiko Asaoka is just as fussy. The small technology company where
she works bought two DoCoMo videophones. But as an office worker, she finds
little that has persuaded her to buy her own handset. The videoconferencing is
fun, she says, but the novelty wears off. Besides, she is unwilling to spend
65,000 yen ($490) for a new handset when older phones provide many of the same
functions. "It's out of the question," Ms. Asaoka said.

Japan may be the birthplace of the new generation of mobile phones, but the
carriers that offer the service are having teething pains. For years, Japan's
three largest cellular phone carriers -- DoCoMo, KDDI and J-Phone -- had
double-digit sales growth. Their biggest problems stemmed from their success,
like keeping popular handsets in stock and repairing data networks downed by
heavy traffic.

Now, they face a thornier challenge. 

Nearly 60 percent of the Japanese own cellphones, and persuading them to trade
in their trusty year-old models for newfangled ones is becoming tougher. The
economy is at a standstill, and the number of new mobile-phone subscribers has
fallen for seven consecutive months. Carriers are signing up fewer than half a
million new customers a month, one-third as many as a year ago. Worse, the 3G
handsets, packed with cameras and stereo sound, are twice as expensive as
the older handsets with similar functions. And though the new handsets, with
data links of up to 384,000 bits a second, allow users to download video and
audio clips and hold videoconferences, the more complex functions also lead to
higher connection fees.

DoCoMo, which started its service in October; KDDI, which introduced a rival
plan on April 1; and J-Phone, which plans to roll out its 3G service in June,
must also convince consumers that the new technology is better than the
gadget-packed phones already in use. The Japanese consume technology as few
others do, but are videophones and 30-second movie clips crucial to everyday
life?

Apparently not. DoCoMo, the market leader, fell short of its target of 150,000
users by March, netting 89,400. (By comparison, its flagship i-mode phones,
which include Internet messaging with voice services, have 32.2 million
subscribers.)

The fuzzy reception from consumers may hamper plans to introduce what is
perhaps the best feature of the phones: the ability to use them around the
globe. To do that, Japanese carriers are forging alliances with counterparts
overseas that must build similar networks in their countries.

Carriers in Europe and the United States have spent more than $150 billion to
obtain 3G licenses for the needed network capacity. Some, like KPN Mobile in
the Netherlands and E-Plus Mobilfunk in Germany, plan to use DoCoMo's global 3G
standard eventually but are only just adopting DoCoMo's older i-mode service.
In the United States, AT&T Wireless, in which DoCoMo holds a 16 percent stake,
introduced a service similar to i-mode last week, called mMode. Other
companies' large debts, as well as consumer skepticism, have slowed investment
plans for 3G networks. 

DoCoMo is not eager to lend a financial hand. For the fiscal year ended March
31, the company said, it will report its first loss since going public in 1998,
a result of a 550 billion yen ($4.2 billion) decline in the value of its
investments overseas.

Back home, DoCoMo needs the new generation of phones to catch on so it can
recoup the $7.6 billion it is spending to introduce the service. The company's
i-mode service still dominates the market, but sales growth is slowing, leading
carriers to cut prices. Like its competitors, DoCoMo is betting that its 3G
phones, with their snazzy functions, will lead consumers to run up bigger
monthly bills.

Yet with consumers in wait-and-see mode, DoCoMo's president, Keiji Tachikawa,
is looking to the recent entrance of a rival, KDDI, to stimulate the 3G market.
"Competition is good for the sake of customers," Mr. Tachikawa said. 

KDDI, long the underdog, is happy to accommodate. Its new service includes many
of the functions of the DoCoMo 3G service, like better sound quality, video and
audio downloading. But KDDI's data links are only half as fast as DoCoMo's, one
reason the company is charging half the price for its service.

By selling more-affordable 3G phones, KDDI hopes to set off the so-called
network effect, in which a flurry of new customers persuades others to
subscribe. More than most network devices, Internet-enabled phones need a lot
of users to reach their potential. DoCoMo's videophones, for example, work best
if customers' friends have them, too, something that is unlikely because the
phones cost twice as much as the top-shelf i-mode handsets. Shops are
discounting them, but prices have to fall by half before young buyers, the
heaviest users, can afford them. "That's when it becomes a hit," said Kirk
Boodry, an analyst at Dresdner Kleinwort Wasserstein in Tokyo.

Once the phones catch on, Mr. Boodry said, the software on the phones will
improve, too. Developers are paid based on the amount of traffic on their Web
sites, which are designed especially for cellular phones.

Here, KDDI has an advantage because its 3G service is similar to earlier
versions, so programmers do not have to write entirely new software. "There's
complete compatibility, so content providers do not have to change anything,"
said Ikuyoshi Inoue, a senior general manager at KDDI.

There are signs, though, that consumers are still adapting to technology in the
second-generation phones. J-Phone handsets, including its latest hit, the
"movie sha-mail" phone, already allow users to snap pictures and send them as
e-mail attachments, make five-second movie clips and download a compact disc of
music onto a memory chip. That could eliminate the need for a personal audio
player, though it takes 45 minutes to download a 45-minute CD.

"Japan has some of the most sophisticated users in the world, but they are just
catching up to the 2.5 generation services now in use," said Darryl Green,
president of J-Phone, which is 70 percent owned by Vodafone.

Businesses, too, are still learning to use the 3G phones. With data links up to
40 times faster than current services, 3G videophones allow sales
representatives in the field to check inventory or relay images from a
construction site.

But most executives in search of speed and mobility are just as likely to use
laptop computers connected to wireless base stations in hotels, train stations
and other so-called hot spots. Connection speeds are nearly 30 times faster
than 3G phones, and at about $15 a month, only a fifth the cost.

Acknowledging the wealth of options available and the hurdles facing carriers,
Mr. Green said J-Phone's 3G offering in June would be a very soft rollout. And,
with a modesty heard increasingly in the industry, he said there was "a massive
amount of fine-tuning going on."
**********************
New York Times
Why Is This Room So Popular? Shh. You're About to Find Out.

Who says there is nothing new on the Net? 

A Web puzzle, "What's Wrong With This Picture," has attracted millions of
visitors, without the benefit of advertising, since it was created in February.


Describing what is "wrong" would spoil it, so just go to
www.jaybill.com/whatswrong -- that is, unless you have a weak heart. Go ahead.
We'll wait. Be sure the computer's sound is turned on.

Back now? 

Good. (If you didn't go but intend to do so, skip the rest of this article to
avoid ruining the puzzle's effect.)

The site's creator, J. William McCarthy, who goes by Jaybill, is a Web site
developer in Manhattan. He created the puzzle on a whim, he said, "exclusively
for the purpose of scaring my girlfriend." He sent her the Internet address,
and also sent it to a friend. 

On the Internet, of course, no good deed goes unforwarded. Mr. McCarthy's
personal Web site went from receiving a mere 300 visits a month ("mostly me,"
he said) to 90,000 hits a day. He decided to give visitors more than the
"What's Wrong" puzzle, so he asked friends to contribute articles. There are
other novelties, like a contest to see which visitor can make the wittiest use
of Photoshop software to alter a picture of a New York City taxi. The result of
all this is "the 'Seinfeld' of Web sites," he said, adding, "It's a site about
nothing, that is really popular."

Mr. McCarthy has sought no advertising, except for a banner ad from an Internet
service provider in return for discounts on playing host to the suddenly
high-traffic site. He has no plans to try to turn it into a profit-making
enterprise, he said, because "money has a way of destroying the originality of
things." Still, "if somebody comes along and says, `Hey, I'll give you X number
of thousand dollars a month to put up a banner ad,' I'll say, `O.K.' "

Mystified by hundreds of e-mail messages from the apparently clueless, asking
him what, actually, is wrong with the picture, Mr. McCarthy concluded that some
folks have a very short attention span or need to have things spelled out for
them. So he created a companion page, www.jaybill.com/whatswrong2, that gives
clownish explication and pop-up warnings.

Many of the e-mail messages he receives thank him for providing a hearty scare
and a good laugh, he said. But "10 percent of the mail I get regarding it is
people who react emotionally and want to rip my head off." The site includes
comments from visitors. "This is a prime example of the demise of the
Internet," writes one. Another says, "You are one sick individual"  and "I like
that!"
******************
Computerworld
Mainframe Skills Shortage Five Years Off
Skills shortage 5 years out; firms unprepared 
By JULIA KING 
(April 22, 2002) 

Attention unemployed mainframe programmers and operators: Prospects are dim for
landing a mainframe IT job in the immediate future.

But experts predicted last week that by about 2007, when aging IT staffers are
expected to begin retiring in large numbers, companies will be clamoring for
your skills, especially if they include mainframe systems support experience
and knowledge. 


In other words, the much-touted mainframe skills shortage is still about five
years off. 


"It's really more of a futuristic thing and not a current crisis," said Kent
Howell, manager of computer operations in the 400-person IT group at Illinois
Power Co. in Decatur. 


"What we're looking at is an aging staff of mainframe personnel" coupled with
the prospect of maintaining legacy systems "for quite a number of years,"
Howell said during a roundtable discussion on mainframe skills at last week's
American Federation of Computer Operations Management conference for enterprise
data center managers in Las Vegas. "And with that kind of problem, it's never
too early to start planning." 


Rather than hiring new employees, Illinois Power plans to refresh its mainframe
expertise by teaching mainframe skills to younger employees already on staff
and working in open systems. 


"We're also looking at salary issues to try and entice these people to keep an
interest in the mainframe side," Howell said. "Two years ago, we paid premiums
for open-systems development skills. Now we're starting to look at premiums for
mainframes." 


But experts say that companies such as Illinois Power are definitely in the
minority. 


More than 90% of 300 companies that have mainframe staffs said in a recent Meta
Group Inc. survey that they have "zero strategy" for dealing with the
diminishing pool of skilled mainframe workers. Stamford, Conn.-based Meta
estimates that 55% of IT workers with mainframe experience are over 50 years
old. 


"Most companies do a very poor job of managing and planning their skills base,
especially their aging skills base," agreed Diane Tunick Morello, an analyst at
Gartner Inc., also in Stamford. "Companies don't realize they're putting
themselves at risk because they have a heavy part of their day-to-day business
relying on skills not even being taught in schools any longer." 


Meanwhile, vendors see a huge market opportunity in such risk. For example, Sun
Microsystems Inc. by no means expects users to abandon their
multibillion-dollar investments in Cobol and other legacy applications,
according to Don Whitehead, director of business and market development for
Sun's enterprise systems products group. 


So Sun has adopted technology designed to port such applications to servers
running its Solaris operating system. 


"We're preserving the ability to write Cobol code and use it in a Unix
environment," said Whitehead.
*********************
Computerworld
License Bill Could Create IT Headaches

Compatibility issues could cost businesses millions if driver's licenses are
standardized

Congress this week is expected to consider legislation requiring national
uniform standards for driver's licenses. The intent is to improve security, an
issue of importance to businesses that swipe or scan driver's licenses to
authenticate customers. 

But creating a driver's license standard would require law enforcement agencies
and businesses such as airlines and convenience stores to adopt common scanning
and identification systems to proof the licenses, and the trickle-down effect
could be costly to both the public and private sectors. 

For instance, the Food Marketing Institute in Washington, which represents some
26,000 retail food stores, estimated that changes in the magnetic stripe
standard alone could cost its supermarket retailers $175 million in upgrade
costs. 

Interoperability Key 

In the meantime, "if you go straight to [adopting a] smart card, you are going
to leave millions of terminals at cash registers and airline counters unable to
read the card," said Richard Varn, CIO for the state of Iowa and one of the
leaders in the state's effort to improve driver's license security. That would
be a mistake, he said. 

"You don't want to end up with a counter full of readers for different state
cards, so you have to work on standards and interoperability," said Varn. 

Key to a successful authentication system will be back-end databases with
reliable information maintained by state and federal authorities that
businesses can access, Varn said. "A dumb card with smart networks and good
databases can do just as well as a smart card," he said. 
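Varn's "dumb card with smart networks" model boils down to a lookup: the card
need carry little more than a unique identifier, and the authoritative record
(status, expiration, revocation) lives in a back-end database that the terminal
queries. The Python sketch below illustrates that lookup with an invented
record layout and an in-memory table standing in for a state system; it is a
conceptual illustration, not a description of any proposed standard.

    from datetime import date

    # Stand-in for a state's driver-record database; fields are invented.
    STATE_DB = {
        "IA-1234567": {"name": "Jane Roe", "expires": date(2005, 3, 2),
                       "revoked": False},
    }

    def verify_license(license_id, today=None):
        """Answer the only question the terminal needs: is this license good?"""
        today = today or date.today()
        record = STATE_DB.get(license_id)
        if record is None:
            return "unknown identifier"
        if record["revoked"]:
            return "revoked"
        if record["expires"] < today:
            return "expired"
        return "valid"

    print(verify_license("IA-1234567", today=date(2002, 4, 22)))  # valid
    print(verify_license("IA-9999999"))                           # unknown identifier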

Legislation being finalized by U.S. Sen. Richard "Dick" Durbin (D-Ill.),
chairman of the Senate Subcommittee on Oversight of Government Management,
would require states to develop a minimum verification and identification
standard. It also seeks unique identifiers encoded in each license. 

If Congress passes the bill, law enforcement officials and businesses would
then have three years to adopt the necessary technologies to scan and identify
the licenses. 

While there's no standard for driver's licenses, the magnetic strip is the most
frequently used technology. But some states are using 2-D or 3-D bar codes, and
others are eyeing smart cards. 

John Hervey, chief technical officer at the Alexandria, Va.-based National
Association of Convenience Stores (NACS), said his biggest concern is backward
compatibility with existing systems. 

Many retailers, including the 130,000 stores represented by the NACS, use
magnetic strip readers for everything from credit cards to driver's licenses.
Changing those systems would cost millions of dollars, Hervey said. 

"I don't think you can do away with mag stripe technology immediately," said
Hervey, who believes that a driver's license could incorporate multiple
technologies that could satisfy magnetic strips as well as bar code readers. 

Privacy issues will also be significant. By swiping driver's licenses, a
business can capture that data and use it for customer relationship management,
which scares some privacy advocates. Putting limits on what businesses can do
with that data will likely be part of the debate. 
*****************
Computerworld
Early Adopters: .Net Tools Ready

They say key challenge is learning framework

Getting their arms around Microsoft Corp.'s new .Net development framework,
with hopes of salvaging their existing Visual Basic and C++ code, was a central
focus for many corporate users attending last week's TechEd 2002 conference
here. 

The few early .Net adopters that Microsoft spotlighted not only dismissed any
notion that the new tools, languages and framework aren't ready to use; they
also said they don't want to go back to their old programming environments. But
they cautioned developers that their greatest learning hurdle may lie in the
roughly 6,500 classes in the .Net Framework's class libraries. 

"It's big," warned Brad Sewell, an assistant vice president in IT at Pacific
Life Insurance Co. "But if you were developing on a Microsoft platform before,
the learning curve is not that steep." 

Some developers expressed concerns about adjusting to .Net's greater
object-oriented thrust, a factor that will vary based on experience and the
complexity of the applications being developed. But Patrick O'Toole, a senior
developer at Zagat Survey LLC in New York, said he and four colleagues picked
up the tool and started using it without any formal training. 

"There's some ramp-up time, but it was nothing extraordinary," said O'Toole, a
former C programmer who added he's happy to have his "toys" back. 

Joel Zinn, a senior IT architect at Columbus, Ohio-based American Electric
Power Co., said he had only a moderate understanding of object-oriented
programming when he started using .Net to work on a Web-based knowledge
management system. He said that without taking a course, he got up to speed on
.Net within six weeks and that, with .Net's framework, he tends to need no more
than 20% of its classes. 

One concern cited by some conference attendees was whether their existing
Microsoft-based application code will run in the new .Net environment.
Microsoft lead product manager Christopher Flores said a user shouldn't expect
to be able to take 500,000 lines of Visual Basic 6 code and flip a switch to
run it on the .Net platform. But some code will run unchanged, and an Upgrade
Wizard can help flag the parts that users will need to tweak manually. 

Mark Driver, an analyst at Stamford, Conn.-based Gartner Inc., estimated that
40% of Visual Basic code can be migrated to .Net without considerable recoding
and redesign. 

"That means 60% of all VB code that exists today will have to be rewritten or
thrown out," he said. "If you're a Microsoft shop and you haven't started a
fairly aggressive .Net adoption by the middle of next year, you're in real
trouble." 

Sewell said Pacific Life had no problem leveraging its existing COM+ business
logic in the middle of its three-tier Web applications through the use of
.Net's COM interoperability feature. The Newport Beach, Calif.-based firm's
life insurance division converted its Web presentation layer from Active Server
Pages (ASP) to ASP.Net and built a Web service interface to the middle tier.
Sewell said any small performance penalty in using XML-based messages to
communicate between the presentation layer and the middle tier was more than
offset by ASP.Net's speedier page delivery. 
********************
BBC
Internet unites Kosovo foes

Albanians and Serbians are putting ethnic enmities behind them and coming
together in cyberspace to protect the environment in Kosovo. 
Environmental groups in the region have taken the first tentative steps towards
setting up an electronic network to share resources and information. 

Activists say that years of conflict in the region have taken their toll on the
environment in Kosovo, with polluted rivers, areas stripped of their forests
and the capital, Pristina, blanketed in thick smog. 

They hope to use the internet to highlight the problems and enable Albanian and
Serb activists to work together. 

"It's sort of a success for a multi-ethnic Kosovo," said Blerim Vela of the
Regional Environmental Center for Central and Eastern Europe in Pristina, which
is co-ordinating the scheme. 

"Most of the people don't know there is an environmental problem. 

"So we decided there was a need for an electronic network so that they can
disseminate information about the environmental problems." 

The network, called Sharri.Net, was set up in February and the organisers aim
to have the website up and running by June. 

'Serbian courage' 

Getting Albanians and Serbs to join forces was not easy. The two sides had to
overcome years of fear and mistrust. 

The first step was persuading Serbian environmentalists to take part in a
workshop in Pristina last January. 

It was the first time in three years that a non-governmental group from the
northern Serb enclave went to the Kosovan capital. 

"We were a bit concerned about how people would react," admitted Mr Vela. 

"We were afraid that some groups might not want to work with them. But they
thanked the Serbian non-governmental organisation which showed the courage to
come here." 

Overcoming ethnic differences is only the first hurdle in setting up an
electronic network in Kosovo. 

The region has been devastated by years of violence, ethnic cleansing and
guerrilla warfare. Since the end of the fighting, the international community
has been investing time and money in rebuilding Kosovo. 

"The reconstruction process is causing the problems," said Blerim Vela. "A lot
of money is put on reconstruction, but not on environmental issues. 

"But now things are starting to change and major international donors are
giving more money to environmental issues." 

Getting connected 

Funding for the electronic network is coming from the Norwegian Ministry of
Environment and Dutch Ministry of Foreign Affairs. 

At the heart of the proposed electronic network will be a dedicated office in
Pristina, available to all environmental groups. The office will have four
computers connected to the internet. 

Once a week, the facility will also offer free advice on setting up computer
networks and publishing on the web. 

The aim is to help groups in Kosovo get online by providing them with
second-hand computers, donated by a private Dutch company. 

In the past, there was only one internet service provider (ISP) in Yugoslavia,
and few Albanians in Kosovo were online. 

"Now we have four ISPs, which provide very high quality services," said Mr
Vela. "People are now using those services and developing websites."
*******************
BBC
Egg backs digital payments

Egg Pay allows Egg customers to digitally transfer money via e-mail to anybody
in the UK with a bank account. 

Research from Egg and MORI has found that a third of all British adults are
interested in digital payment services but security remains a big issue for
many. 

To use Egg Pay, customers send an e-mail stating which account they wish the
money to come from, providing the recipient's e-mail address and a choice of
two security questions. 

No cheque book 

Any amount between £1 and £200 can be sent. Recipients receive an e-mail with a
web link to access the Egg site where they answer the security question and
tell Egg which account to credit. 
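
Taken together, the two steps the article describes amount to a simple
claim-by-challenge flow. The sketch below is a minimal, hypothetical Python
model of that flow - the class name, fields and example data are invented for
illustration and are not Egg's actual system - showing a sender creating a
payment with the £1-£200 limit check and a security question, and a recipient
claiming it by answering the question and naming an account to credit.

    import uuid

    class PaymentError(Exception):
        pass

    class EggPayModel:
        """Toy model of an e-mail-based payment flow (illustrative only)."""
        MIN, MAX = 1, 200  # pounds, per the article

        def __init__(self):
            self.pending = {}  # payment id -> details

        def send(self, from_account, recipient_email, amount, question, answer):
            if not (self.MIN <= amount <= self.MAX):
                raise PaymentError("amount must be between £1 and £200")
            pid = str(uuid.uuid4())
            self.pending[pid] = {
                "from_account": from_account,
                "recipient": recipient_email,
                "amount": amount,
                "question": question,
                "answer": answer,
            }
            # A real service would now e-mail the recipient a link containing pid.
            return pid

        def claim(self, pid, answer, credit_account):
            p = self.pending.get(pid)
            if p is None or answer != p["answer"]:
                raise PaymentError("unknown payment or wrong answer")
            del self.pending[pid]
            return f"credit £{p['amount']} to {credit_account}, debit {p['from_account']}"

    # Example: repaying a small loan from a friend (all names hypothetical).
    bank = EggPayModel()
    pid = bank.send("egg-savings", "friend@example.com", 25, "Where did we meet?", "Leeds")
    print(bank.claim(pid, "Leeds", "friend-current-account"))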

Egg is hopeful the service will catch on for small transactions such as
repaying loans from friends. 

"It offers a quick and easy alternative to traditional payment methods which
could see the cheque book banished forever," said Marketing Director of Egg
Patrick Muir. 

People are losing interest in paper cheques, according to the payments industry
body APACS (Association for Payment Clearing Services), which predicts a 41%
decline in cheque use by 2009. 

Three-quarters of the cheques sent in the UK are for £100 or less. 

Brand importance 

In the US web-based banking is proving popular and new companies are
threatening traditional banks by acting as brokers between individuals. 

In Europe, people are more suspicious of companies they do not know although
firms like Nochex are running email payment systems. 

"In the US it is simpler to create a relationship with customers but in Europe
people prefer to go with well-known brands," explained IDC analyst Daniele
Bonfanti. 

Some European banks are jumping on the bandwagon and offering such services but
Egg is the first in the UK. 

According to Mr Bonfanti others may be slow to follow in Egg's footsteps. 

"Banks are profiting from traditional payment systems and they may not want to
change it," he said.
********************
BBC
Britons 'do not want e-government'

A former civil servant has added his voice to recent criticisms of the UK's
e-government policy, questioning whether people actually want online services. 
Ken D'Rosario, business development officer for NextiraOne, was scathing about
the government's determination to be a world leader in the delivery of
electronic services. 

"It is rhetoric rather than reality. I think the government has focused on an
objective which isn't in line with what citizens want," he said at the Public
Sector Expo in London's Docklands. 

Instead of spending time and money on putting services online, the government
should be concentrating on traditional contact methods such as the telephone
and face-to-face, he argued. 

"People like to interact by phone or face-to-face. Money should be invested in
making these services better and providing more staff for call centres," he
said. 

"At the moment the government is giving citizens what it thinks they want
rather than what they actually want." 

Offering choice 

Government officials say that online services are not intended to replace other
methods of communication. 

"It is about offering a variety of service deliveries," said a spokesperson for
the Office of the e-Envoy. 

Mr D'Rosario spent 20 years as a civil servant, first in the Treasury and for
the last 10 years at the Telecommunications Agency on major government
projects. 

Fed up with pushing his point internally, Mr D'Rosario decided to move into
private industry. 

NextiraOne, formerly Alcatel's e-business arm, works closely with government on
its communication projects. 

Money wasted? 

The government has come in for scathing criticism about its plans to put all
services online by 2005. 

Head of the National Audit Office, Sir John Bourn, said earlier this month that
there was "much to do" to realise the full potential of using internet
technology. 

He was concerned that taxpayers' money was being wasted on delivering
electronic services. 

Another of the government's private sector partners, Novell, has cast doubt on
the targets set for e-government and analyst firm Forrester published a
critical report on the progress of projects. 

Currently, around 11% of the population connect with the government online. 

Mr D'Rosario is adamant that the government will not achieve its target. 

"It won't and it is already paving the way to admitting that. It is now hoping
to achieve about 70% but I think even that is optimistic," he said. 

The government says it is still very much on target. 

"Our projection is to have 99.9% of services online by the end of 2005," said a
spokesperson for the Office of the e-Envoy. 

Test schemes 

The government has piloted all sorts of schemes to allow better interaction
with the citizen. 

An e-voting scheme in Liverpool is planned that will let residents register
their vote via text messages from a mobile phone. 

A self-assessment website which allows self-employed citizens to submit their
tax forms online has been hailed as an excellent example of using the web. 

It has, however, only been used by a tiny percentage of people. 

"What people really need is someone to talk them through how to fill in the
form," commented Mr D'Rosario. 

Even the government's flagship portal, UK Online, was failing to connect
citizens to the information they want, said Mr D'Rosario. 

"If you are looking to find out how much your council tax will cost you it will
take an IT-literate person 10 to 15 minutes to get there," he said. 

Instead of creating flashy portals, the government should go back to basics and
make sure the information citizens wanted was easily accessible, he argued. 

The government says it has recently redesigned the UK Online site following
feedback from citizens.
**********************
BBC
Fancy an electronic helper through life?

Inside a nondescript squat brick building that is home to Sprint's Advanced
Technology Lab, a team of engineers, scientists and technologists is busy
devising what it hopes might become the virtual future.

And at the centre of operations is something called an "e-assistant". 

The company bills the invention as "an intelligent agent that acts as a virtual
personal assistant to help you sort through the junk mail of life". 

For the busy exec getting ready for work, the lab's director Frank De Nap says
the e-assistant is all about making things that bit easier. 

Forewarned is forearmed 

"In the morning you'd like to have something that as an entity will fetch your
e-mail, tell you about your appointments and remind you of the files to bring
to work, recognise what the weather is going to be like and say, 'Hey! - it's
going to rain today. Bring the umbrella.' 

"It will tell you about accidents on your route to work and suggest
alternatives. In many ways some of us have the basis for an e-assistant. I have
one today who's called Martha and who is my secretary. But how many of us have
secretaries in our daily life?" 

In reality, the e-assistant is an amalgam of various existing technologies
ranging from voice recognition to face recognition. To make it all seem more
human, the team has given the e-assistant a personality, a face and a name -
e-Sandy. 

There's even a dash of humour as she explains: "I am a fully animated
linguistic entity. I was created at Sprint Labs in California. I am made of
many things. Some in Java and some in Visual Basic, but I have never been a
dot.commer." 

The prototype that pops up on the computer screen in the cluttered lab in
Burlingame is named after Sprint employee Sandy Cuskaden, a grandmother who
defies her years. 

Sandy says whilst she thought seeing herself on screen was pretty weird at
first, she realised there were a lot of great ways the technology could be
used. 

"I thought about those people who died in 9/11, and thought if they had
something like this all those little children and wives could have their loved
ones still talk to them or be there." 

Wake up to 007 

In theory e-Sandy could be anyone you liked, from Shakespeare reading to your
children to Pierce Brosnan waking you up in the morning and reading the news
headlines as you get dressed. 

The scientists at the lab say the appeal of the whole project lies in the fact
you don't have to be a techno whiz to operate it. 

Mike O'Brien, manager of systems and services at the lab, says: "We
are eliminating the requirement that you understand the technology and how to
use it. 

"All you need to know is how to phrase a request in a humanly meaningful way
and the system will translate that into the technical language for the various
kinds of networks that the e-assistant controls." 

So that means when you want to burn a CD, you don't actually need to know how
to do it because e-Sandy will. 

And, of course, the whole thing can be personalised so it recognises you on
sight and understands the way you talk and the way you phrase things. 


Science fact not fiction 

This all may seem like shades of fantasy, but De Nap reckons products like
e-Sandy will be on the market perhaps in five years. 

"The problem we have to solve is what's going to make my life easier. Today an
e-assistant might sound like a wacky idea but only a few years ago people
thought PDAs and kids carrying cell phones were wacky and amazing." 

While e-Sandy is a virtual entity, the question being asked is how far are we
from the world of fully functioning robots like Hal from the classic movie
2001: A Space Odyssey? 

Dr Michio Kaku - one of the US's most revered physicists, who holds the Henry
Semat Professorship in Theoretical Physics at the City University of New York -
claims we are decades away. 

He says: "Our most advanced robot has the collective intelligence of a
cockroach. A lobotomised cockroach. Our most advanced robot built at Carnegie
Mellon takes six hours to walk across the room." 

Frank De Nap is not so pessimistic. 

"Technology is already moving towards robots. One of last year's hottest toys
was the robotic dog. And once you have the robotic community put more
functionality and intelligence on computers there will be more experimentation
in that space." 
*********************
BBC
Talking tech makes life easier

Speech will increasingly play an important part in people's relationship with
technology, and ultimately we may even talk to the web. 
This was the view of delegates who gathered in London for the annual speech
technology conference Voice World. 

Already speech-activated software is taking over from automated systems in the
customer service industry and offering an alternative to the void of
call-waiting. 

A phone enquiry to a large organisation these days is as likely to connect to a
machine offering a series of options as it is to a human voice. 

More and more companies are recognising that such systems are frustrating for
callers and bad for customer relations. But few can afford a human to be on the
end of every enquiry. 

"A happy medium between automation and a real person is speech recognition,"
said Stuart Patterson, CEO of SpeechWorks, a company specialising in such
software. 

More intelligent 

A human sounding voice takes your call and can respond to your spoken enquiry. 

Speech technology known as Natural Language ASR (automatic speech recognition)
means that computers respond to the meaning of sentences rather than just
specific words. 

This gives it more of a "brain" and makes it able to anticipate callers'
questions, which in turn saves time and is less frustrating for callers. 

"It is down to the power of speech," Mr Patterson told BBC News Online. "You
can say what you want, rather than to listen to what you might want." 

When the Boston Medical Centre replaced its automated service with a
voice-activated one, 90% of its customers said they preferred it. 

One man was so keen on the almost-human voice he wanted to take it out to
dinner. 

Jobs threatened? 

In some cases voice-activated software can entirely replace a human operator. 

US car rental firm Thrifty offers customers the chance to compare prices, while
United Airlines and US train firm Amtrak use speech technology to provide
timetables and information. 

The benefits of such systems are obvious as one PC can handle up to 100 calls
at a time. 

But this did not necessarily mean that human jobs in call centres would be
threatened, said Mr Patterson. 

"Even in manned call centres there are frequently people on hold and speech
allows you to get them off hold," he explained. 

It could also make the job more interesting, he argues, with speech systems
dealing with simple enquiries and leaving the humans to answer more complex
questions. 

According to analyst firm Datamonitor, the market for speech recognition
software will be worth $1 billion by 2006. 

"Voice-activated software will be universally accepted and a range of
applications such as banking will be commonplace," said Datamonitor analyst
Benjamin Farmer. 

The future of speech software is not limited to call centres. 

Couch potatoes 

For those who remember talking cars that nagged you to put your seatbelt on,
the idea of speech-enabled vehicles might not sound that alluring. 

But having the ability to open e-mail and have it read to you while driving
might prove more popular. 

"Speech in cars at the moment is a luxury but it will eventually become a
safety feature and will be the norm," said Mr Patterson. 

In entertainment there will also be applications. Microsoft is considering
voice-enabling its games console, Xbox, and the latest Harry Potter DVD comes
with a feature allowing children to talk their way around Hogwarts. 

For real couch potatoes who find reaching for the remote control wears them
out, speech could prove the ultimate laziness. 

Interactive television could in the future be operated via voice commands, said
Mr Farmer. 

Talk to the web 

Ultimately surfers will not have to be sitting at the computer in order to
access web information. 

"The next real step forward in this market is to talk to the computer through
the telephone," said Mr Farmer. 

Two languages, VoiceXML and a Microsoft-backed technology, Salt, are being
developed to achieve voice-activated web browsing. 

But those hoping to minimise contact with the keyboard and mouse might have to
wait a little longer. 

Microphones on PCs are not yet sophisticated enough to allow voice browsing,
and are unlikely to be for the foreseeable future.
********************
San Francisco Chronicle
Battle for Brains 
To attract top students, biotech firms turn academic

Kate Rubins has been known to show up to work wearing her favorite pair of
flannel pajama bottoms. 

The 23-year-old said she would shun any future employer whose dress code
required nylons or heels. 

"I wouldn't give them a second look," Rubins said. 

When Rubins goes job hunting in a few years, she could be one of the hottest
hiring prospects in an industry where success absolutely depends on attracting
as many workers like her as possible. 

Rubins is a graduate student working on gene chip hybridization at a Stanford
University Medical School lab, where she's not the only one working well into
the evening in PJs or sweats -- comfy clothes that ease the long hours of
research work. 

Biotechnology companies that can lure such graduates out of their cozy academic
world have the best shot at turning out groundbreaking work that can yield big
profits. 

But the very candidates a biotech firm would most like to hire -- brilliant
innovators training at top schools -- may be the ones least likely to take such
a job. The first choice for most of the elite students in those fields would be
a post as a professor that allows them to push the envelope of scientific
discovery, their academic advisers say. While biotech firms treasure that
creativity, they need to harness it in the drive to develop products. 

"It used to be unheard of for those people to go to biotech," said Dr. David
Drubin, a genetics professor at the University of California at Berkeley. 

"There used to be a lot of stigma attached to those moves." 

Attitudes are changing, but not because of the bigger salaries, hefty benefits
and stock options biotech firms can dangle before the most talented candidates.


"You're not attracted by money or profit," Rubins said. "A lot of the business
types don't understand that." 

Overall, university programs are keeping up with the growing demand for Ph.D.
and postdoctoral researchers in fields like genetics and bioengineering. But to
hire the best of them, biotech recruiters need to know what these graduates
fear most -- and what they value most highly, says Bill Lindstaedt, director of
the career center at the University of California at San Francisco. 

"The first thing is the science," Lindstaedt said. 

What keeps the lights on at night at UCSF's science towers on Parnassus Avenue,
and at Mount Zion, where Rich Price works, is the chance to delve into the
secret processes of life and follow the most intriguing leads. 

"There's a lot of freedom you have in academia to pursue experiments that
interest you," said Price, a UCSF postdoctoral researcher studying hormonal
effects in breast cancer. Price would consider a biotech job, but he fears he
may lose the fascination of his current work. 

"When it's about a bottom line, your goal is developing a drug that will sell,"
he said. 

Young scientists like Price and Rubins are wary of getting stuck doing rote
work, isolated from the stimulating exchange of ideas in the academic
community. Even if they did pioneering research, they wonder whether it would
be recognized, because publication in scientific journals could compromise the
company's intellectual property. Without the opportunity of being credited for
their work, they fear they could never return to the academic world. 

Such concerns have been a longtime hurdle to hiring at traditional
pharmaceutical companies. 

In a poll early this month of the Science Advisory Board, surveying a panel of
9,200 life sciences professionals who weigh in online about technology issues,
two-thirds said the chief drawbacks of working in big drug companies are the
loss of freedom to set research priorities and the inflexibility of some
corporate cultures. 

Biotech firms may have an advantage over traditional drug companies, because
their research is seen as cutting edge. Still, many biotech companies could be
just the kind of dead end that young scientists fear, said Dr. George
Schreiner, chief scientific officer for the biotech firm Scios in Sunnyvale. 

"Many are based on a single technology or product," Schreiner said. "Not many
have the resources to focus on anything except the furrow in front of them." 

Such companies will wither in the long run, Schreiner said, because they won't
attract creative researchers. 

To draw the best candidates, he said, biotech firms have to try to replicate
academic life within company walls. To that end, Scios throws together teams of
geneticists, chemists, computer specialists and other experts to brainstorm new
problems. That multidisciplinary approach will also yield better products,
Schreiner said. 

Bay Area companies like Scios, Exelixis, Gilead and Genentech duplicate other
academic practices. Their scientists give in-house seminars, present their work
at academic conferences and submit papers to the journals that publish the work
of university researchers. 

"We place a premium on the publication of information in peer-reviewed
journals," said John Milligan, chief financial officer at Gilead. To do that,
they rush the legal work to protect intellectual property. "We drive our patent
attorneys nuts." 

New biotech recruits may have to give up their sweatpants for shorts or jeans,
but dress codes are often similar to those in universities, said Lisa
Stemmerich, Exelixis' human resources director. "We don't have people who work
in pajamas, though," she said. 

The model held up by Schreiner and others as the pioneer for converging the
academic and biotech worlds is Genentech. Founded in 1976, Genentech is about
to double the lab space at its campuslike waterfront complex in South San
Francisco. 

"The founders, (Robert) Swanson and (Herbert) Boyer, modeled Genentech after
the best academic practices," research director Richard Schiller said. "We've
tried to scale up that culture to a larger company and maintain that feeling of
creativity." 

Biotech companies are also blending into academic life by offering internships,
post-docs, guest speakers, collaborations and donations. More and more,
scientists are moving back and forth between the two worlds throughout their
careers. 

Given all that, top students still hold a slight bias toward academia, Drubin
said. But stiff competition for the limited number of academic posts opening up
each year works to biotech's advantage. Students like Price are also fully
aware of the downside of a professor's life. 

"The main thing that deters people in my position from pursuing that track is
seeing their mentors struggle to get funding -- the constant busywork it
requires to get enough to be competitive," Price said. 

Schiller and Schreiner said they are not looking for people who reluctantly
surrender their passion for pure science. They are offering the chance to
discover another, perhaps deeper, devotion -- the chance to turn lab
discoveries into lifesaving therapies. 

"Just out of a post-doc, they can be working on strategies that could provide
medical salvation to tens of millions of people," Schreiner said. "That can be
a very heady feeling." 
**********************
Government Computer News
Navy first to use e-signatures on smart cards for travel 
By Dipka Bhambhani 

The Navy is the first of the military departments in the Defense Department to
use Common Access smart cards for the Defense Travel System. 

"I put my claim in, and it's all done electronically in a few days," said David
Wennergren, deputy Navy CIO and chairman of DOD's Smart Card Senior
Coordinating Group. 

The travel system, nearly a year old, connects to 40 accounting and disbursing
systems, the Defense table of official distances, per-diem rates, a repository
for records management and DOD's public-key infrastructure. The smart card has
an embedded private key that works with the PKI component, "so I can digitally
sign a travel order request," Wennergren said. "But you need security,
particularly digital signatures." 
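
The mechanics Wennergren describes - a private key on the card signing a travel
order, with the PKI used to verify it - follow the standard public-key
signature pattern. The fragment below is a minimal sketch of that pattern using
Python's third-party "cryptography" package; the travel-order text and key
handling are invented for illustration, and a Common Access Card would keep the
private key on the card's chip rather than in software as shown here.

    # Sketch of sign/verify with an RSA key pair (illustrative only).
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # Stand-in for the private key embedded on a smart card.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    travel_order = b"Travel order request: SEA -> DCA, 2002-05-06"  # hypothetical payload

    # The cardholder signs the order with the private key...
    signature = private_key.sign(travel_order, padding.PKCS1v15(), hashes.SHA256())

    # ...and the travel system verifies it against the matching public key,
    # which in a real PKI would be delivered inside a certificate.
    public_key.verify(signature, travel_order, padding.PKCS1v15(), hashes.SHA256())
    print("signature verified")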

Although Defense Travel System users have been making online travel
reservations since the fall, verification and approval until now have been
mostly on paper, Wennergren said.
*****************
Government Computer News
Modern architecture: midrange mainframes 

By Patricia Daukantas 
GCN Staff


Three midrange mainframe computers just announced by Unisys Corp. can run
applications under both mainframe and Microsoft Windows operating systems. 

The ClearPath Plus models reflect the company's move to put its cellular
multiprocessing architecture in all its mainframes, marketing director Rod Sapp
said. The architecture groups processors into four-CPU pods that can be swapped
out quickly. It also replaces internal buses with a high-bandwidth crossbar
interconnect. 

The ClearPath Plus CS7201 accommodates up to 32 Intel Corp. processors, either
32-bit 900-MHz Pentium III Xeon or 64-bit Itanium chips. It runs either Win
2000 Advanced Server or Datacenter Server, but it also can run the Unisys MCP
mainframe OS and MCP applications under Win 2000. 

A capacity-on-demand feature lets users start with 150G of the system's 1T of
internal storage switched on and later order a software key to activate more
storage. 

The ClearPath Plus CS7402 and the CS7802 models operate heterogeneously, Sapp
said. The four- to 32-processor CS7402 can be configured with up to four
partitions for separate Win 2000 or Unisys OS2200 applications. It requires a
minimum of four Unisys CMOS processors, but the remaining chips can be either
Unisys or Intel. 

The CS7802 can handle up to eight partitions for OS2200 and Win 2000 apps, Sapp
said. 

The ClearPath Plus mainframes come with 2G to 64G of main memory and 96 PCI
slots. A four-processor CS7402 that can execute 20 million instructions per
second starts at $587,000.
*******************
MSNBC
This phone knows where you are ... and how to help 
 
Pinpoints your location even if you can't

Apr. 19 - Carrying a mobile phone with you for emergencies is a smart idea. If
you run into trouble you can call for help. But what happens if you run into
trouble - and for whatever reason you don't know where you are, or where to
call for help? There's a new cellular phone being marketed that might be able
to help.

       CO-WORKERS HERE AT MSNBC are always interested to know what I'm testing,
but at no time in recent memory did any one particular item generate so much
interest. I'm talking about the Magnavox MobilePal + GPS phone marketed by
Remote MDX, Inc. Everyone marveled at the phone, saying they knew someone for
whom this would be the perfect gift.
       The idea behind the phone is simple. It's a cellular phone that can
pinpoint your location if you need assistance - and can do so at the push of one
red button. As a matter of fact there is no dial - just the aforementioned red
button emblazoned with the word CALL.
       When you push the red button the phone automatically calls the Secure
Alert help desk, where a live personal assistant is there to help you anytime
you need, 24 hours a day, seven days a week.
      There are two levels of service you can subscribe to. The less expensive
Personal Security plan, at $9.95 per month, provides a service that will
dispatch police, fire or an ambulance anywhere in the U.S. if you request. It
can also get you roadside assistance, help in locating the nearest hospital or
veterinary service, and the ability to give your exact location to any
emergency service personnel. 
       The Personal Security Advantage plan, which costs $16.95 a month,
includes all of the above features, plus a switchboard to connect your calls
elsewhere. The more expensive service also provides step-by-step driving
instructions, auto accident assistance with connections directly to your
insurance company, 411 directory assistance and location of the nearest
bank/ATM/gas station.
       The Magnavox flip-phone itself is pretty nifty too. First of all, it's
been made to run on four AAA batteries. No rechargeable cells or cords to lug
around or worry about. In my testing, the batteries lasted for months and
months in the standby state. The company says the batteries will last up to a
year in that mode. The phone operates on the U.S. cellular phone network. That
means you're likely to find a cell nearly everywhere you go.
      Don't forget, lots of new-fangled, modern digital phones also have analog
abilities - for times when you find yourself in remote locations.
       Operating the white handset is as simple as can be. There are only two
buttons: the large, red CALL button located where a keypad usually sits and a
small, gray test/siren button on the side. When pressed, the test mode does a
self-check of all phone operations and seeks out clear paths to GPS location
satellites to determine your exact location. It should go without saying that
this phone is made to be used outdoors - preferably where trees and tall
buildings aren't in your way. When you hold down the test/siren button, a
95-decibel alarm goes off - good for attracting attention during emergencies
and, unfortunately, good for attracting attention when you're testing the
device in MSNBC's parking lot.
       Set-up is easy. I pressed the test button and something like 3 minutes
later I received a beep telling me the phone was able to get a fix on my
location. Then it was time to try calling the service.
      I pressed the red button; about 30 seconds later I heard ringing, then a
pleasant voice answered and asked me if I needed assistance. The phone
automatically turns into a speakerphone so you don't have to hold it to your
head. I told her I was testing the phone and asked if she could tell me where I
was standing. She asked me to hold, and about 90 seconds later came back on the
line and told me my address and what she would have done for me if this had
been a real emergency.
       The audio quality of the call was average for analog cellular service
(not as good as a digital call can be) but more than sufficient for getting
help in a dangerous situation. As for the GPS positioning function, it's better
than what the FCC mandates for location finding devices. All in all, this is
one clever little phone.
      The MobilePal + GPS handset is currently selling for $199 - not an
outrageous price for a specialized item such as this. I think the monthly
service fees are actually quite reasonable for what is offered. Remember, this
phone is meant to be used in emergencies, not as a substitute for a regular
cellular phone. I wouldn't purchase this for someone who wants to place regular
conversational mobile calls. But I would purchase it for someone who hates the
idea of cellular phones - and might consider keeping one of these around for an
emergency.
*****************
MSNBC
New tool helps hackers hide 
By Robert Lemos
 
April 19 - A new tool for manipulating packets of data that travel over the
Internet could allow attackers to camouflage malicious programs just enough to
bypass many intrusion-detection systems and firewalls.

       THE TOOL, called Fragroute, performs several techniques to fool the
signature-based recognition systems used by many intrusion-detection systems
and firewalls. Many of these duping techniques were outlined in a research
paper published four years ago. 
       Arbor Networks security researcher Dug Song posted the tool to his Web
site this week. 
       "(Some) firewalls and intrusion prevention or other application-layer
content-filtering devices have similar vulnerabilities that may be tested with
Fragroute," Song wrote in a posting to security mailing list BugTraq on
Thursday. 
       The new tool tips the arms race between those who look to break in to
networks and those who defend them towards the attackers, at least for the
moment. Any firewall or intrusion-detection system that fails the Fragroute
test is vulnerable to attack by vandals using the program. 
       Song was traveling and could not be reached for comment, an Arbor
spokesperson said, and his company would not comment on the issue.
      The Fragroute program is a dual-use program: It illuminates weaknesses in
a network's security - information that can aid a system administrator in
protecting the network or help a hacker attack it. The program exploits several
ways of inserting specific data into a sequence of information in order to fool
detection programs. The methods were highlighted in a January 1998 paper
written by Thomas Ptacek and Timothy Newsham of Secure Networks, a company
later bought by Network Associates. 
       The program essentially exploits problems caused because
intrusion-detection systems typically check incoming data for correctness less
stringently than the server software that usually is targeted by hackers. In
one version of such "insertion" attacks, a command sent to a server could be
disguised by adding extraneous, illegitimate data. The targeted server software
will throw away any bad data, leaving itself with a valid, but malicious,
command. However, many intrusion-detection systems don't always remove the
corrupted data, and so the hostile command remains disguised from the system's
recognition functions.
      For example, an intrusion-detection system that watches out for a recent
buffer overflow might recognize the attack by the text "http:///" appearing in
the incoming data. However, if an attacker sends "http://somegarbagehere/" and
knows that the "somegarbagehere" portion will be thrown out by the target
computer, then the attack still works. Moreover, if the intrusion detection
system doesn't remove the same text portion as the server, it won't recognize
the threat. 
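       To make the mismatch concrete, here is a small, self-contained Python
sketch of the general insertion idea. The signature string and the "chaff"
segment are invented for illustration, and real attacks rely on TCP/IP tricks
(bad checksums, short TTLs, overlapping fragments) rather than a simple flag,
but it shows how a detector that keeps data the server discards can miss a
signature the server still receives.

    # Hypothetical, simplified illustration of an "insertion" attack,
    # in the spirit of the Ptacek/Newsham paper (all names invented).
    SIGNATURE = b"GET /vulnerable.cgi"   # what the detector looks for

    # Each segment: (payload, reaches_server). The attacker crafts one chaff
    # segment that the detector accepts but the real server discards.
    segments = [
        (b"GET /vuln", True),
        (b"CHAFFCHAFF", False),              # inserted data, dropped by the server
        (b"erable.cgi HTTP/1.0\r\n", True),
    ]

    ids_view = b"".join(p for p, _ in segments)            # naive detector keeps everything
    server_view = b"".join(p for p, keep in segments if keep)  # server drops the chaff

    print(SIGNATURE in ids_view)     # False - signature broken up, attack not flagged
    print(SIGNATURE in server_view)  # True  - server still sees the malicious request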
       Marti Roesch, president of security appliance seller SourceFire and the
creator of the popular open-source intrusion detection system, Snort, said that
the majority of the problems exploited by Fragroute have been fixed, and he
plans to fix the rest by next week. 
       "Dug contacted me about this stuff several months ago, and I fixed it,"
Roesch said. 
       While he hasn't programmed a defense to every stealth attack that
Fragroute has in its repertoire, doing so won't be hard, he added. 
       "Many of these take 10 minutes of coding, max, to fix," he said. "It
just wasn't an issue before." 
       While many of the attacks won't work against a properly configured
Snort, Roesch added that the default configuration doesn't detect the
camouflaged data, because settings strict enough to catch it also produce a far
greater number of false alarms. 
       Some security aficionados posting to the BugTraq list concentrated on
Snort as a program vulnerable to the Fragroute program, but Song waved off the
implied criticism of the open-source program in his posting. 
       "Snort, I'd wager, does much better than most," he wrote, adding that
many other proprietary programs are also vulnerable. 
       One commercial software seller, network protection firm Internet
Security Systems, claimed that its product, RealSecure, wasn't affected. 
       "We initially fixed the fragmentation issues when we saw the paper quite
some time ago," said Dan Ingevaldson, team lead for the company's security
research and development group. 
       His group tested Song's tool earlier this week, and they were still able
to detect attacks, Ingevaldson said.
*********************
MSNBC
Exhibit lets visitors change ethnicity

Machine morphs visitors' faces with those of other ethnic groups

NEW YORK, April 19 - Ever want to see what you'd look like as Asian, Middle
Eastern, Hispanic, black, white or Indian? Anything you aren't? Step into the
Human Race Machine and find out. The machine is part of "Seeing and Believing:
The Art of Nancy Burson," a traveling retrospective that attempts to make an
argument for human unity.

      THE SHOW of 100 photos and multimedia works is on view at the Grey Art
Gallery in Greenwich Village through Saturday. It then travels to the Blaffer
Gallery in Houston, the Weatherspoon Art Gallery in Greensboro, N.C., and
beyond.
       For those who miss the show, a Human Race Machine (there are several)
will be on permanent view at the New York Hall of Science in Queens as of June.
       "It's a weird feeling," said Kathy Zajchenko, a museum visitor in her
50s from Prattsburg, Penn. As soon as she sat down in the machine, she glimpsed
herself as a woman in her 70s (the machine also has an aging function and
allows people to see how they might look with a facial deformity). She then
tried out a spectrum of ethnic groups.
       "The Middle Eastern image worked pretty well for me," she said with a
grin as she stepped out of the machine for the next person in line.
       "The machine is really a prayer for unity. ... It's about seeing through
our differences to sameness, it's like stepping into someone else's skin," said
Burson, who added the database of Middle Eastern faces, both Arab and Jewish,
after the terrorist attacks on Sept. 11.
       When you sit inside the box, the machine creates a digital image of your
face. You push some buttons, and, using a composite of various photos of people
of a certain ethnic group mixed with your own facial features, the machine
comes up with an image.
       The resulting photo, while not always recognizable, is eerie.
       "I've always wanted to allow people to see differently. I'm a
documentary photographer. I'm documenting the unseen, because what we can't see
is so much more fascinating than what we can see," Burson said.
       Known for her computer-generated photographs of composite faces, she
first gained national attention for her Aging Machine.
       The technology, later purchased by the FBI, makes images of what a face
might look like as it ages. Used to come up with the images of missing children
as they age, it proved so effective that it helped authorities find four
missing children in the first year of its use.
THOUGHT-PROVOKING IMAGES
       The exhibit also includes thought-provoking composite images.
       Burson's 1982 "Warhead I" features a haunting and strangely familiar
face. Weighted to the number of warheads in each nation's arsenal at the time,
it is a composite of Ronald Reagan, Leonid Brezhnev, Margaret Thatcher,
Francois Mitterand and Deng Xiaoping.
       "Mankind," completed in 1985, features a distinctly Asian-looking face.
It is a composite of Asian, white and black faces used in proportion to each
ethnic group's population in the world. Burson is making an updated version
using the database of images from the Human Race Machine.
       There is also a serene-looking 1990 composite of images of Jesus, Buddha
and Muhammad, an attempt to imagine the face of universal holiness.
       Also featured are photos of alternative healers and what Burson says are
their auras, androgynous faces, children and adults with facial anomalies, and
a wall of photos of "Guys Who Look Like Jesus," including people of all ethnic
groups who responded to an ad Burson placed in the Village Voice requesting
Jesus look-alikes.
       "There are a lot of guys around who look like Jesus, and I wondered why
they look like Jesus," she said. "Some of them really carry that peaceful
energy with them, and some of them just had a bad haircut in high school."
       The show is accompanied by a full-color illustrated catalog. Produced by
Twin Palms Press, the hefty book provides the first comprehensive overview of
Burson?s work.
********************
Government Executive
Commerce chief of staff balances dual roles 

By Bara Vaida, National Journal 


Since Donald L. Evans was confirmed as Commerce secretary early last year, he
has made no secret of his desire to have his department take the lead within
the Bush administration on high-tech policy. Evans raised that profile earlier
this year when he named Phil Bond, his undersecretary for the Technology
Administration, as chief of staff. Bond, who will also continue as
undersecretary, is a former high-tech lobbyist--most recently for
Hewlett-Packard. Before holding that position, he spent three years as the
chief lobbyist for the Information Technology Industry Council, which
represents about 30 of the nation's largest high-tech companies, including
Microsoft, IBM, and Intel. During his years as a high-tech lobbyist, Bond, a
former chief of staff to Rep. Jennifer Dunn, R-Wash., played a key role in
helping high-tech companies become more politically savvy.

Bond also was a principal deputy assistant secretary of Defense for legislative
affairs for then-Secretary Dick Cheney, and he maintained close ties to the
campaign of George W. Bush. When a high-tech policy position opened up in the
new administration's Commerce Department, he decided to leave the private
sector for government. Bond is the first person in the department to hold two
high-level jobs at the same time. In an interview with National Journal, Bond,
45, talked about how he juggles his roles and about Secretary Evans's high-tech
policy priorities. Following are edited excerpts from the interview.

NJ: How are you juggling two jobs?

Bond: It's not as difficult as you would think, for two reasons. Technology is
in every bureau of Commerce and every sector of the economy. So almost
regardless of what issue the secretary is working on, nine times out of ten
there is a technology component to it. I found the chief of staff role to be
mutually reinforcing with my role at the Technology Administration.

NJ: How did it come about that you became chief of staff?

Bond: I don't know the whole story there, but I think the reason my name came
up was two- or threefold. One, I was a Jennifer Dunn chief of staff and she was
close to the campaign and to Secretary Evans; two, I was popping up here [in
Evans's office] quite a bit, with technology touching so many different things;
and three, and most important, since I joined Commerce, I was directing regular
technology-coordinating council meetings, and the secretary decided he wanted
to make it much more coordinated.

All the technology offices and all the bureaus get together on a regular basis
to compare notes. I made sure, as the adviser on technology, that I knew all
the latest and that all the bureaus knew what everyone else was doing so they
could look for opportunities to work together. The secretary decided he wanted
this technology discussion to be much more integrated than has traditionally
been the Commerce [process].

NJ: How much time do you spend as undersecretary of technology and as chief of
staff?

Bond: I don't know how to do that breakdown because technology touches
everything in the department. It's not like I have a 40-hour week delineated
between two jobs. I do know that the secretary has done more technology events
since January. We've just integrated technology much more into his activities
and his appearances. For example, Evans did a homeland security technology demo
with [Homeland Security Director Tom] Ridge and spoke at a group that Michael
Dell [CEO of Dell Computer] was chairing. Those are all areas that, if I was
just undersecretary, I would have been coordinating or involved in to some
degree anyway. So now I do it with both hats.

NJ: One of the concerns that was raised by the high-tech and science
communities about your role as undersecretary is that you don't have a science
degree, unlike previous undersecretaries of technology. What is your response
to that concern?

Bond: First of all, you can judge an administration by its budget: [This
administration] has carved out enormous amounts for research and development
and science in a wartime environment. Point two is that I did go get an
experienced Hill veteran in science [Benjamin Wu] to be the deputy
undersecretary because I understood the need for that. Third, we got a
world-class scientist to head the National Institute of Standards and
Technology, Arden Bement. And fourth, we have a bit of an expert in the
Commerce deputy secretary, Sam Bodman, who is an MIT professor and a chemical
engineer.

And what I have said to science groups is that I acknowledge my background and
that I'm not the guy with a science degree. But when you listen to what the
science community wants, it is the appropriate attention paid--in the policy
councils in the administration and on the Hill--to their needs, and connecting
those needs to the overall benefit of the economy and society. I can do that
part.

I've also turned around and urged scientists and high-tech people to deploy
their forces and intellectual power on the Hill, because in my view, the
science community in the country has been hesitant to engage [politically]--to
their detriment.

NJ: How have high-tech policy priorities shifted in terms of what the sector
wants from the government since the September 11 terrorist attacks?

Bond: Increasingly, when I've talked to folks in the technology sector, they
want to talk about homeland security and ways that security can be not just a
cost but also a productivity enhancer. Given the downturn in that sector, they
want the government to step up as both a policy leader and a purchaser [of
security technology]. They want the government to focus on e-government--so you
have a record e-government proposal coming forth [in the Administration's
fiscal year 2003 budget request]. They want to make sure that we keep the
pipeline of ideas and innovation going with R&D budgets. We are responding to
that.

NJ: How much has the tech sector's lobbying shifted since September 11?

Bond: There has been a really significant shift, for a number of reasons. One
is that when it comes to homeland security and technology's key role in
that--whether it is cyber-security, or intelligence and knowledge
management--technology is critical. The real expertise resides in the private
sector, which is a big shift from 20 years ago. They also all came rushing
forward like real patriots after September 11 and said we've got what you need
or experts that can come and help. Secondly, because of the downturn, they all
realized this is a growing market opportunity because the government is going
to have to do a lot in security--and spend a lot more.

NJ: Is the recession over in the information-technology sector yet?

Bond: Not yet. There are some positive signs, but certainly the
telecommunications sector has some real problems. In the IT sector, there are
some growth components, like cyber-security, and overall spending continues to
grow, but not at the pace we saw. IT managers throughout the economy have been
holding back. My personal suspicion is that there is some pent-up demand, but
because of how severe the downturn was, managers are going to have to be very
convinced that things are on the upswing before they buy a new round of IT
equipment and upgrade their systems.

NJ: A number of technology executives were in town in March and said they are
seeing some turnaround in Silicon Valley.

Bond: There is still a divide between economists, academics, CEOs, and IT
managers. They want to see the numbers on their bottom line before they can say
there is an upswing, and many of them haven't seen their numbers show up on the
bottom line. One thing we have picked up on is that there is a reawakening in
the venture-capital [sector], and the initial-public-offering market is
starting to crack open again. Those are all positive signs for the
entrepreneurial interests.

NJ: What did you bring from your years in the high-tech sector? What have you
learned?

Bond: One of the biggest things is understanding of the high-tech culture and
where they are politically in terms of their engagement in public affairs. And
that goes back to my days in Jennifer Dunn's office, as Microsoft is a big Dunn
constituent. So that saves me some of the frustration that some of my
colleagues may have in terms of the industry's lack of [political] development.

At the same time, I understand that the high-tech sector has had an incredible
run of legislative and policy success, with very, very few people deployed in
Washington to achieve that purpose. So if I am a CEO in Silicon Valley and
someone like me is telling them that they need to increase their participation
in public affairs, my first question would be, "Why? I have two people in
Washington, and we've gotten everything we want."

The answer is that there is this grand convergence that is occurring on many
layers in Washington and the business world. And that means that things will be
decided as public affairs rather than private affairs--as a policy process
rather than in the boardroom. Decisions will be made in Washington that will
affect their companies and stakeholders. If they want a positive outcome, then
they need to be involved.
*******************
Government Executive
Justice program looks for ways to share crime data with states 

By Liza Porteus, National Journal's Technology Daily 


As the federal government works with state governments to find ways to use
technology in anti-terrorism efforts, one organization is working to educate
law enforcement and policymakers on how to use technology to bring the
criminal-justice system into the 21st century. 


SEARCH is a program under the Justice Department's National Criminal History
Improvement Program (NCHIP). Members of SEARCH are primarily state-level
justice officials responsible for operational decisions and policymaking on the
management of criminal-justice information. 


Since the Sept. 11 terrorist attacks, the Bush administration has called for
increased information sharing among governments, more background checks on
certain employees, and more thorough checking of visa applications, among other
things. And federal measures such as those in the airport security and
anti-terrorism laws enacted last fall all affect privately managed databases. 

The mandates could have a great impact on how states oversee their
criminal-information systems, SEARCH Executive Director Gary Cooper said.
"There's great pressure because of those bills to implement programs as quickly
and inexpensively as possible, and to be able to complete the checks as quickly
as possible," he said. 


Cooper said the temptation is to check only federal databases held by agencies
such as the FBI. "The fact is, there are many more criminal-history records at
the state level collectively than reside at the FBI, and they're more complete
records," Cooper said. 

SEARCH has advocated that agencies conduct checks based on fingerprints and
scan state records. 


Cooper said one of the most exciting SEARCH projects is creating a common
methodology for the information exchanged among justice agencies, since every
state has its own laws and practices. With increasing calls for more
information sharing among governments, the effort may prove vital, he said. The
software tool is being tested in states such as New Mexico, Colorado, Wisconsin
and Minnesota. 

"What you're ultimately promoting is the ability to share this information
nationally," Cooper said. 

Missouri Chief Information Officer (CIO) Gerry Wethington serves as the SEARCH
board chairman. He said state CIOs and law enforcement officials increasingly
are joining to create a cross-pollination of ideas for using technology in the
criminal justice system. He said SEARCH also has worked with the National
Association of State Chief Information Officers to develop software to
facilitate information sharing and interoperability. 


Wethington stressed that players must be familiar with representatives from the
National Law Enforcement Telecommunication System, which is a message-switching
network linking local, state and federal agencies to allow information
exchanges on criminal justice and public safety. 


"It's important all the major players are aware of each other and what
directions are being taken," Wethington said. "I think there is a much more
cohesive group [working on these issues], and that was certainly accelerated by
Sept. 11." 
******************
CIO Insight
Custom Services: Putting Web Services to Work
By Stephen Lawton 

Everyone agrees that the term "Web services" refers to a new set of flexible
application building blocks that let companies exchange data easily between
dissimilar systems. But that's where the agreement ends; the rest appears to be
a mass of confusion. Sometimes the technology is said to be as simple as basic
browser-based services - a single request by a PC in New York City for a currency
conversion from a server in Geneva, for instance. Sometimes it's described as
sophisticated communications links between Web servers and host computers that
let incompatible systems talk to one another as if they were virtual twins.
Meanwhile, vendors, looking to jump on what they perceive as a bandwagon
rapidly gaining speed, claim that anything they do that uses XML is Web
services.

And the confusion extends beyond the complications of the technology itself.
How best should CIOs determine the business value of Web services when they are
so hard-pressed to figure out how exactly to go about using the technology?

Says John Hagel, a former McKinsey & Co. principal who coauthored a forthcoming
book on the business value of Web services: "I think there's confusion at three
levels. First, a lot of people, when you say, 'Web services,' just think Web
sites." Then there's the changing nature of the technologies themselves.
Confusion at "the second level is understandable, given that standards are
still in the process of definition," he notes. As for the third level, "users
say, 'Okay, I think I understand the technology. Now tell me as a business
person why I should be interested in this stuff. What are we talking about
today, versus a year from now, versus five years from now?'"

Says Bob Sutor, IBM Corp.'s director for e-business standards strategy and the
man most responsible for the technology giant's multivendor efforts with Web
services: "It's confusing because there's been a little bit of marketing hype,
as opposed to speaking to Web services' business value," he said. "We have a
tremendous amount of education to do."

To that end, rather than offering up yet another explanation of what Web
services is, we went at the problem by talking to several corporate CIOs who
have already taken the plunge.

Gift Horse: Nordstrom Inc.

Nordstrom Inc., an upscale department store based in Seattle that targets
female shoppers, had a problem. All its major competitors offered some
high-value services on their Web sites that Nordstrom's Web site,
Nordstrom.com, couldn't match. Competitors' online customers could purchase a wide
range of cosmetics and order gift cards by simply clicking on a link.

Lacking such services put Nordstrom at a "significant competitive
disadvantage," says Paul Onnen, Nordstrom.com's former chief technology officer
and now an independent consultant. But putting together the technology to offer
and accept gift cards on the Web is a complex process. Using a gift card on the
Web site would require a link to Nordstrom's bank - yes, Nordstrom has its own
bank - so that the card number could be validated and the amount of the purchase
deducted from the card's dollar value. The transaction would also have to link
to Nordstrom's inventory control system on its corporate mainframes, which
would deduct the items purchased from the company's inventory list. And if the
purchase were executed through the store's catalog rather than its Web site,
mail-order software on yet another system would also have to be linked to the
transaction in order to make sure the product got to the customer.
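
To make that integration concrete, here is a minimal, hypothetical sketch - in
Python, which the article does not say Nordstrom used - of how a single
gift-card purchase could be expressed as one XML message and posted over plain
HTTP to each back-end system. Every element name, URL and value below is
invented for illustration; none comes from Nordstrom's actual systems.

import xml.etree.ElementTree as ET
import urllib.request

def build_gift_card_message(card_number, amount, sku, quantity):
    """Build one XML document describing a gift-card purchase."""
    root = ET.Element("GiftCardTransaction")
    ET.SubElement(root, "CardNumber").text = card_number
    ET.SubElement(root, "Amount").text = f"{amount:.2f}"
    item = ET.SubElement(root, "Item")
    ET.SubElement(item, "SKU").text = sku
    ET.SubElement(item, "Quantity").text = str(quantity)
    return ET.tostring(root, encoding="utf-8")

def post_to_system(url, payload):
    """POST the XML payload to one back-end system (bank, inventory, mail order)."""
    request = urllib.request.Request(url, data=payload,
                                     headers={"Content-Type": "text/xml"})
    with urllib.request.urlopen(request) as response:
        return response.read()

if __name__ == "__main__":
    message = build_gift_card_message("6006491234567890", 75.00, "SHOE-1042", 1)
    print(message.decode("utf-8"))
    # In production, the same message would be relayed to each system, e.g.:
    # post_to_system("https://bank.example.com/giftcard/validate", message)
    # post_to_system("https://inventory.example.com/reserve", message)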

But the company's Windows 2000-based Web servers were incompatible with the IBM
mainframes in the corporate offices, retail stores and the banking subsidiary
that handles credit cards, and even with the Hewlett-Packard Co. server running
the mail-order applications. Onnen's task: Design a system that could
unobtrusively reach through all three business operations' firewalls in real
time. And the software to do this had to be written, tested, debugged, tested
again, and up and running in time for the 2001 holiday shopping season - less than
six months away. Add the ability to sell cosmetics - no small task in itself - and
you have a recipe for a missed deadline.

Nordstrom.com considered, but quickly rejected, the prospect of developing
custom software. It could have been done, Onnen said, but it would have been
far too expensive, and it would have taken close to a year to test and deploy.
Instead, Nordstrom opted for Web services development software from IONA
Technologies PLC. Since Nordstrom already was using IONA's XML-based Netfish
software on its corporate mainframe, going to the same vendor eliminated the
need for extensive compatibility testing.

The entire project was completed in less than three months - well ahead of
schedule, says Onnen. The Web-based application, which translates the gift card
transaction data so that each system could process it through its respective
firewall, worked correctly from the outset, said Onnen, allowing Nordstrom.com
to provide not only gift cards, but cosmetics order fulfillment as well - and all
in time for the holiday rush.

The value to Nordstrom? Since the company doesn't break out revenue by product
category, it's difficult to estimate the kind of revenue opportunity the new
services on its site represent. But cosmetics alone is a high-revenue product
category for department store retailers, said David Unter, a retail analyst
with Deloitte & Touche LLP in Costa Mesa, Calif. Unter estimates that cosmetics
represented 10 percent to 15 percent of Nordstrom's total revenue last year,
and that sales from Nordstrom.com represented 7 percent to 10 percent of the
company's overall sales. Had the cosmetics product category been available for
the entire year, given Nordstrom's $6 billion in revenue for the fiscal year
ended January 2002, cosmetics revenue from Nordstrom.com could have reached $90
million.

But Nordstrom isn't stopping with cosmetics and gift certificates. Nordstrom is
currently testing what Onnen calls a "perpetual inventory system" in the shoe
department of some of its stores in the Pacific Northwest. Salespeople can go
to a special version of the Nordstrom.com Web site to order out-of-stock shoes.
The customer pays no shipping charges, the store is credited with a sale, and
the salesperson gets the commission - a transaction that would not have been
possible without Web services, says Onnen.

Banking on It: Putnam Lovell Securities Inc.

Putnam Lovell Securities Inc., a San Francisco-based boutique investment
banking firm, found itself unable to deliver the kind of value-added research
reports its customers wanted. Traditionally, research documents were created
in-house every morning. Then the company had to manually download information
from two databases, one for customer contact and research preference data
that's managed by Web-based software from application service provider
Salesforce.com Inc., and the other for distribution data such as clients' names
and e-mail addresses from a database hosted by research services vendor
BlueMatrix Inc. The data was then loaded into an Excel file, where it was
cross-referenced. Only then could the research documents be e-mailed to
customers. The process could take as long as 30 minutes for a simple piece of
research, said Putnam Lovell Chief Technology Officer Rodric O'Connor. And
since the research needed to be in customers' hands when the financial markets
opened each day, only a relatively small number of reports could be sent out
based on the size of O'Connor's staff. The result: Putnam simply couldn't
create the more complex, personalized reports it believed would give it an
advantage in a highly competitive institutional market.

Last summer, the firm decided to change the way it distributed its reports. The
company's lofty goal: Send customized research to each client while cutting in
half the $300,000 per quarter the company was spending to distribute the
reports. O'Connor considered two alternatives: buying enterprise application
integration (EAI) software or integrating the existing data using some type of
application over the Internet. But EAI, he decided, would be prohibitively
expensive: The license alone would cost $250,000. Meanwhile, the cost of owning
a Web service was at most 20 percent of the cost of using EAI, and EAI
integration software was estimated to take more than six months to develop
before testing could even begin, versus the few weeks O'Connor estimated it
would take to build an XML-based Web service.

Rather than solving the problem internally, the company decided to look outside
for help. So O'Connor chose a hosted Web service, which would serve as the hub
to the "spokes" of the databases maintained by Salesforce.com and BlueMatrix,
as well as the research reports created by Putnam Lovell. The software he
chose, from San Francisco-based Grand Central Communications Inc., matches each
customer's requirements to a customized version of the investment firm's
research, then automatically e-mails it directly to the customer. Rather than
sending entire reports, the Web service transmits only the specific components
of the research that the customer requests. And because the software hub is
located outside Putnam Lovell's firewall, security, encryption, authentication,
policy management and any data transformation become Grand Central's
responsibility, says O'Connor. The result: Putnam beat its $150,000-a-quarter
cost-cutting goal by $50,000.

The new system integrated the data from both databases and the documents
created in-house in just one month, says O'Connor. That first step alone was
able to get the company halfway to its goal, he said.
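
The matching step O'Connor describes can be pictured with a small, hypothetical
sketch: given a report split into named sections and each client's stated
preferences, assemble an e-mail containing only the requested components. The
section titles, addresses and the use of Python's standard email module are
assumptions for illustration, not details of Putnam Lovell's or Grand Central's
actual service.

from email.message import EmailMessage

# Invented report sections and client preferences, standing in for the
# Salesforce.com and BlueMatrix data the article describes.
report_sections = {
    "summary": "Asset managers: first-quarter flows recovered modestly...",
    "valuation": "Comparable-company multiples for the coverage group...",
    "ratings": "This week's upgrades and downgrades...",
}

client_preferences = {
    "client_a@example.com": ["summary", "ratings"],
    "client_b@example.com": ["summary", "valuation", "ratings"],
}

def personalize(address, wanted):
    """Assemble an e-mail containing only the sections this client requested."""
    msg = EmailMessage()
    msg["To"] = address
    msg["Subject"] = "Morning research (personalized)"
    body = "\n\n".join(report_sections[s] for s in wanted if s in report_sections)
    msg.set_content(body)
    return msg

for address, wanted in client_preferences.items():
    message = personalize(address, wanted)
    print(message)   # in production, each message would be handed to an SMTP relay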

Catalog Shopper: Eastman Chemical Co.

Web services won't necessarily roll out without a little pain. As Eastman
Chemical Co. found, it's essential to test thoroughly before putting a system
into production. "This is definitely technology that we're rolling out
carefully, and measuring," says Carroll Pleasant, an analyst in Eastman
Chemical's emerging digital technologies group, a research and development arm
of the Kingsport, Tenn.-based chemical giant.

Eastman maintains a Web-based catalog containing thousands of detailed entries
about the chemical products it sells, and its distributors use the catalog
regularly. "Through one mechanism or another, [distributors] have managed to
take our product catalog data and put it into their systems" on the Web, says
Pleasant. Distributors gathered this information in a variety of ways,
including file transfers, rekeying data and even using software to
automatically pull catalog listings from Eastman's Web pages. But these largely
manual processes meant that the information on distributors' sites was usually
out of date.

"Fairly early on, I recognized the value of what was going on here, that we
were talking about being able to effectively extend these applications out to
our trading partners," says Pleasant. "When we were looking for something to
pilot Web services with, [the catalog] immediately occurred to us as a great
place to get started. If we could make it electronically available with a
single source, everybody can share the same data and technical information
about our products."

Like Putnam Lovell, Eastman is a fan of outsourcing. Rather than building its
own Web services internally, Eastman used XML to link the catalog data on a
Windows 2000 server to an outsourced "hub-and-spoke" service. Requests from
customers accessing Eastman's custom pages on outsourcer Grand Central's server
would be automatically forwarded to Eastman. Then Eastman's Web service would
send back the requested data, and the customer's Web site would be
automatically populated with Eastman's up-to-date catalog information.
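
What Eastman's side of that exchange might look like can be sketched, very
loosely, as a small HTTP handler that answers a forwarded product query with
current catalog data as XML. The product IDs, field names and use of Python's
standard http.server module are illustrative assumptions; they are not
Eastman's or Grand Central's actual interfaces.

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
import xml.etree.ElementTree as ET

# Invented catalog entries standing in for Eastman's product data.
CATALOG = {
    "EAS-100": {"name": "Example resin", "purity": "99.5%", "pack": "25 kg bag"},
    "EAS-200": {"name": "Example solvent", "purity": "99.9%", "pack": "200 L drum"},
}

class CatalogHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A forwarded request such as GET /catalog?id=EAS-100
        query = parse_qs(urlparse(self.path).query)
        product_id = query.get("id", [""])[0]
        product = CATALOG.get(product_id)
        root = ET.Element("Product", id=product_id)
        if product:
            for field, value in product.items():
                ET.SubElement(root, field).text = value
        body = ET.tostring(root, encoding="utf-8")
        self.send_response(200 if product else 404)
        self.send_header("Content-Type", "text/xml")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CatalogHandler).serve_forever()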

At least, that was the plan. But Pleasant says the company's testing turned up
several glitches, including interoperability problems with different
implementations of SOAP, the Simple Object Access Protocol for invoking Web
services applications, and the difficulties of using the Web Services
Description Language that defines how Web services are used. Had Eastman not
tested the process by which a wide range of customers would access the Web
service, it would have fielded an offering that could have failed some
customers the minute the service went live.

Eastman is facing internal issues as well. Pleasant believes Eastman's business
units will require extensive help to understand that the much quicker
turnaround of Web services demands a radically different approach from
traditional software initiatives: the business side must identify focused,
near-term opportunities for Web services that can be put in place rapidly. "You
really have to work hard to educate people about this," he says,
"because it's a real change in what's possible with systems."

Pleasant is philosophical about the challenges they've encountered. "I mean,
this is an emerging technology," says Pleasant. "If it worked out of the box,
then it wouldn't be emerging, would it?"

Tangled Web

Web services represents an opportunity to derive business value quickly in
situations where interoperability and time to market are critical.
Michael Hoch, senior analyst for Internet infrastructure at Boston-based
Aberdeen Group Inc., estimates that companies can create new applications or
integrate old ones in 30 percent of the time it would take to build their own
integrated applications using traditional software development methods. That
means CIOs can often tell their staffs simply to jump in and pilot Web services
projects without having to change their existing infrastructure significantly.
"With Web services, you don't have to rip out the software investment you've
already made," says consultant Hagel "You can get more business value from
existing platforms."

Yet with standards still developing and many efforts still in the pilot stage,
Web services still has far to go before it fulfills its potential. "But that's
always true with these emerging technologies," says IBM's Sutor. "It takes some
time to really understand the business value of these things."


--------------------------------------------------------------------------------

STEPHEN LAWTON, a freelance business and technology writer in San Bruno,
Calif., has written about technology issues for more than 20 years.

Service With a Smile

With all the hype about Web services, you'd think it would solve every ill
known to IT kind. To the World Wide Web Consortium, or W3C, the defining body
for all things Web, the term "Web services" refers to any of the various
schemes for allowing applications to communicate with one another over the
Internet to share information and support business processes. But it's the very
broadness of that definition that contributes to the confusion.

Web services are really modular components, basic building blocks that can be
assembled into larger applications. They can be as small as a single function,
such as asking a legacy system about the exchange rate between two currencies,
or as large and complex as initiating a series of currency exchanges using
multiple banking systems. And these services can be run by everything from a
single user with a Web browser to a massive enterprise application.

But that's nothing more than middleware, you say. Actually, middleware is
responsible for finding and supplying the necessary data wherever it's located;
Web services are responsible for processing the data. Think of Web services as
a layer of standards slathered over middleware and Web-based applications that
ensure information will be transmitted and processed in consistent ways.

To build a basic Web service, you need HTTP, the HyperText Transfer Protocol;
XML, the Extensible Markup Language; and SOAP, the Simple Object Access
Protocol. XML defines the standard format for the data, and can even define the
business processes that should be used with the data. HTTP lets you send and
receive data freely through the corporate firewall, without needing to punch a
new security hole in the firewall. SOAP lets applications, either on your own
system inside the firewall or on another company's, "invoke," or execute, the
Web service.
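
As a rough illustration of those three pieces working together, here is a
hypothetical sketch, in Python, of the currency-conversion request mentioned
earlier: an XML payload wrapped in a SOAP envelope and delivered with an
ordinary HTTP POST. The endpoint, namespace and operation name are invented;
only the SOAP 1.1 envelope namespace is a real, standard value.

import urllib.request

SOAP_ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetExchangeRate xmlns="urn:example:currency">
      <FromCurrency>USD</FromCurrency>
      <ToCurrency>CHF</ToCurrency>
    </GetExchangeRate>
  </soap:Body>
</soap:Envelope>"""

def call_rate_service(endpoint):
    """POST the SOAP request over plain HTTP and return the raw XML reply."""
    request = urllib.request.Request(
        endpoint,
        data=SOAP_ENVELOPE.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": "urn:example:currency/GetExchangeRate"},
    )
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8")

if __name__ == "__main__":
    print(SOAP_ENVELOPE)
    # print(call_rate_service("http://geneva.example.com/currency"))  # hypothetical server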

But Web services can start to get complicated when you want to do more than
transmit basic information. If you want to make it easy for your applications
to find unfamiliar Web services somewhere on the Net so the applications can
run those Web services, you'll need some kind of white or yellow pages: That's
the Universal Description, Discovery and Integration service, UDDI. And if you
want your applications to be able to find information about a Web service, such
as how it should be used, you need WSDL, the Web Services Description Language.
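
To show what WSDL buys a developer in practice, here is a brief, hedged sketch
that assumes the third-party Python SOAP library zeep and an invented WSDL URL:
the library reads the service description and exposes each operation it finds
as an ordinary method call, so the client never builds the envelope by hand.
The operation and parameter names are assumptions, not a specific vendor's API.

from zeep import Client   # third-party library: pip install zeep

def convert(wsdl_url="http://geneva.example.com/currency?wsdl"):
    """Fetch the (hypothetical) WSDL, then call the operation it describes."""
    client = Client(wsdl_url)   # parses the WSDL and builds the client
    # Operation and parameter names would come from the WSDL itself; these are invented.
    return client.service.GetExchangeRate(FromCurrency="USD", ToCurrency="CHF")

# convert()   # would contact the hypothetical service described above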

Here's another way of looking at it. Think of how you might go about finding a
plumber when you're living in a foreign country. If you already know the
plumber and the language he speaks, you'd simply dial him up (HTTP). But if you
didn't speak the same language, you'd need a translator who could speak both
your language and the plumber's, and translate between them (XML). Now suppose
you needed the plumber on a regular basis; you'd want to be able to put his
number on speed dial, and possibly even a recorded message that automatically
tells him what you want done (SOAP). But what if you didn't know how to reach
the plumber? You'd need a good phone book (UDDI). Now suppose you needed
different kinds of laborers on a regular basis, but had to know ahead of time
what their qualifications were, when they were available, and so on; you'd need
a standard way of describing, in a manner both you and the workers understood,
what they could do (WSDL).

Most users today are focusing their Web services efforts inside the firewall,
on relatively straightforward problems, such as linking two legacy systems to a
new business process that lets visitors to a corporate portal check
their benefit balances. The best way to begin with Web services is to start off
with small pilot programs using the basic XML and SOAP protocols, then
continually adapt those basic applications until they're performing as needed.
********************
Nando Times
Internet-enabled fax technologies find following
Copyright © 2002 AP Online 
By MAY WONG, AP Technology Writer

SAN JOSE, Calif. (April 21, 2002 6:56 p.m. EDT) - At Boeing Co., workers no
longer hover over fax machines. The aerospace giant now sends 18,000 faxes and
receives 15,000 each month via e-mail.

A new crop of sophisticated, networked fax products is gaining ground as
corporate technology managers begin to appreciate the convenience and savings.

Internet faxing can involve a fax machine hooked to the Internet, a PC with a
fax modem or an Internet connection, or - in Boeing's case - a fax service that
uses computer servers to do all the heavy lifting and give users remote access
to faxing via the Web.

The most compelling feature of the new breed of Internet-ready fax machines is
their ability to send or receive faxes as e-mail attachments, catering to the
world's deepening shift to digital documents created and stored on computers.

Using the Internet rather than traditional voice telecommunications lines, a
single fax can reach any number of people anywhere in the world faster and more
cheaply. A fax sent from an Internet-enabled machine can completely eliminate
the need for - and cost of - a long-distance phone call.
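
A minimal sketch of the "fax as an e-mail attachment" idea, using Python's
standard email and smtplib modules: a scanned page is attached to an ordinary
message addressed to a fax gateway. The gateway address format, relay host and
placeholder page bytes are all assumptions for illustration, not any vendor's
actual service.

import smtplib
from email.message import EmailMessage

def build_fax_email(page_bytes, page_name, to_address):
    """Wrap one scanned page (TIFF bytes) in an e-mail bound for a fax gateway."""
    msg = EmailMessage()
    msg["From"] = "sender@example.com"
    msg["To"] = to_address    # e.g. 14155551234@faxgateway.example.com (hypothetical)
    msg["Subject"] = "Outbound fax"
    msg.set_content("See attached page.")
    msg.add_attachment(page_bytes, maintype="image", subtype="tiff",
                       filename=page_name)
    return msg

if __name__ == "__main__":
    page = b"II*\x00" + b"\x00" * 8   # placeholder bytes standing in for a scanned TIFF
    message = build_fax_email(page, "contract_page1.tif",
                              "14155551234@faxgateway.example.com")
    print(len(message.as_bytes()), "bytes ready to hand to an SMTP relay")
    # smtplib.SMTP("mail.example.com").send_message(message)  # hypothetical relay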

"We are a global company with offices in 60 countries. Our work force is
increasingly mobile, and we need abilities to support that global, mobile work
force," Boeing spokeswoman Bev Clark said.

Boeing first tried Internet faxing last year and liked it.

About 4,000 of its 150,000 employees - primarily those receiving numerous faxes
and dealing with legal documents - subscribe to Canadian-based Interstar
Technologies' faxserver.com, which, along with j2 Global Communications'
eFax.com, is among a growing number of Web-based fax services.

The change was part of Boeing's larger goal of being able to access and manage,
within five years, all business communications, including e-mail, voice mail
and faxes, from a unified messaging system.

The venerable facsimile machine - first used by the government in the 1930s -
scans the image of a document and sends it using an analog telephone connection
from point A to point B. With faxing over the Internet, the image is
transformed into a digital file that can be processed by a computer or an
Internet-enabled fax machine.

Fax traffic peaked in 2000 and is now on the decline as e-mail becomes more
prevalent, analysts say.

Fax machine makers are responding: Market consulting firm CAP Ventures Inc.
predicts that in 2005, more than 60 percent of all 200-page-plus-capacity
office fax machines sold will be Internet-enabled, up from less than 5 percent
in 2001.

"E-mail will continue to increase, and telephone fax transmissions will
continue to decrease, and this is the great bridging product that combines the
phone and e-mail lines," said Paul Wharton, a Panasonic marketing manager.

Internet fax machines, such as Panasonic's newest Panafax DX-800, can receive
image files via a phone line while also sending them over a network connection.
Scanned paper documents can be sent to e-mail addresses. Users can program
e-mail addresses into the "autodial" list of machines.

In most cases, if the sender of a fax - whether it's a hard copy from a fax
machine or a digital file from a computer desktop - is transmitting to an
old-fashioned analog fax machine, they must still address it correctly to a fax
phone number.

Some higher-end machines never even print out a fax, converting all documents
to digital files instead. Time-consuming manual deliveries of faxes are thus
replaced by electronic delivery over corporate networks.

All Internet fax machines also handle traditional faxing.

Panasonic, the leading provider of Internet fax machines, plans to expand its
I-fax line from two to nine models - or half of its fax product line - this
year.

Still, analysts don't see paper facsimiles becoming extinct.

Many people still rely heavily on paper records - from warehouse purchase
orders to local police reports - and many companies remain concerned about the
legality of digital documents.

A contract that is signed and faxed is considered legally binding. But if the
document is e-mailed, it requires an authentic digital signature, a technology
that analysts say has yet to pass muster in the courts.

Though the 750 attorneys at San Francisco-based Brobeck, Phleger & Harrison LLP
are big e-mail users, critical documents still wind through the law firm's
central fax processing center at the rate of up to 1,000 faxes a day.

Jonathan Wong, Brobeck's chief information officer, nevertheless predicts that
his firm and other big law firms will try Internet faxing within a year.

Internet fax providers are adding security measures such as the encrypted
passwords enabled by Panasonic's newest machine for e-mail faxes.

Fax technology consultant Maury Kauffman has this take on the fax evolution:

"In the long run, regardless of how much outbound faxing we do, all of our
faxes will come in our e-mail box."
*********************


Lillie Coney
Public Policy Coordinator
U.S. Association for Computing Machinery
Suite 507
1100 Seventeenth Street, NW
Washington, D.C. 20036-4632
202-659-9711