Clips June 17, 2002
- To: "Lillie Coney":;, Gene Spafford <spaf@xxxxxxxxxxxxxxxxx>;, Jeff Grove <jeff_grove@xxxxxxx>;, goodman@xxxxxxxxxxxxx;, David Farber <dave@xxxxxxxxxx>;, CSSP <cssp@xxxxxxx>;, glee@xxxxxxxxxxxxx;, Charlie Oriez <coriez@xxxxxxxxx>;, John White <white@xxxxxxxxxx>;, Andrew Grosso<Agrosso@xxxxxxxxxxxxxxxx>;, computer_security_day@xxxxxxx;, ver@xxxxxxxxx;, lillie.coney@xxxxxxx;, v_gold@xxxxxxx;, harsha@xxxxxxx;, KathrynKL@xxxxxxx;
- Subject: Clips June 17, 2002
- From: Lillie Coney <lillie.coney@xxxxxxx>
- Date: Mon, 17 Jun 2002 11:02:32 -0400
- Cc: lillie@xxxxxxx
June 17, 2002
ARTICLES
Competition Is Heating Up for Control of .org Domain
Battle Over Access to Online Books
Pay-TV piracy
Beijing Mayor Cracks Down on Internet Cafes After Fatal Fire
Number of U.S. Telecommuters Rising
Microsoft at Heart of Calls for Software Liability
Web surfing gives up suspect
Security concerns raised
GSA seeks input on gateway project
Army given funding for backup portal
Security on the go
CryptCard lessons learned
FAA to simulate GPS outages
The lowdown on Web services
Change agents needed
State issues ID card plan
Trucking app secured
Official site to advise on state snooping (United Kingdom)
Web gives a voice to Iranian women
Australian teleport breakthrough
'Snoop' vote postponed
Connecting the villages
Copyright rows ring down the centuries
New immigration computer to detect phony visas
Congress juggles homeland security bills
Major changes afoot at ICANN
ICANN HISTORY
U.S. uses Japan supercomputer
Citibank to block Net gambling transactions
Wiring the New Docs
Aviation security agency to award $1 billion in technology work
Open Source Goes Mainstream
EU privacy group investigates music players
IM'ers Get a Secure Chat Room
What Supercomputers Can and Cannot Do - Yet
One system for all handhelds?
Software to keep your money safe
********************
New York Times
Competition Is Heating Up for Control of .org Domain
SAN FRANCISCO, June 14 -- An intense, largely behind-the-scenes competition
is under way for the right to manage the global database that keeps track
of Internet addresses of noncommercial organizations.
Although the business of registering Internet names has begun to shrink
this year, as many as eight or nine bids are expected at a meeting this
month in Bucharest, Romania, when the group that oversees Internet
addresses will decide who should manage the list of names that end in .org.
The decision will shift the .org domain from VeriSign Inc., which currently
manages the list of 2.7 million organizations. The company struck a deal
with the oversight organization, the Internet Corporation for Assigned
Names and Numbers, or Icann, last year that extended VeriSign's control
over .com and .net addresses in exchange for giving up the .org
designation. VeriSign, based in Mountain View, Calif., also promised to
contribute $5 million to assist in the transition.
Although the management of .org was once intended to go to a nonprofit
organization, the competition has more recently attracted some
profit-minded businesses.
In addition, the competition is likely to become much more visible with the
entry on Monday of two iconoclastic Internet pioneers who say that many of
the entrants have served as shields for large businesses that are hoping to
help themselves to what some analysts estimate will be a $10-million-a-year
business.
One of those pioneers, Carl Malamud, has previously forced the government
to make Securities and Exchange Commission financial data available freely
over the Internet. His partner, Paul Vixie, has been a longtime Internet
software developer and a determined opponent of unsolicited commercial
e-mail, known as spam. The two said they intended to run the .org registry
on a nonprofit basis.
Mr. Malamud and Mr. Vixie say their plan differs from those of other
competitors because they intend to place the database software needed to
operate the .org name system in the public domain.
"Is this a public trust or a public trough?" Mr. Malamud asked.
James Love, director of the Consumer Project on Technology, a Washington
lobbying group, says the competition has drawn commercial bidders that have
associated themselves with a nonprofit organization to improve their
standing before the Icann review committee.
But Icann's supporters respond that the organization has created a process
that will select the group that will best manage the database.
"Icann is trying hard to make sure this isn't a gold rush," said Esther
Dyson, chairwoman of EDventure Holdings and a former chairwoman of Icann.
One of the first partnerships to announce a planned bid pairs Poptel, the
British manager of the new .coop domain, with AusRegistry, the operator of
the Australian .au country domain. The two companies are calling their
partnership Unity Registry.
Another bid is being planned by Global Name Registry, a British company
that was recently awarded the .name domain, in conjunction with the
International Red Cross, according to several people close to the company's
plans.
In a similar fashion, Afilias Global Registry Services, which was recently
awarded the .info domain, is planning to submit a bid in conjunction with
the Internet Society, the nonprofit organization that oversees the Internet
standards group, the Internet Engineering Task Force.
There has also been speculation among a number of people involved in the
bidding process that even though VeriSign originally struck a deal to
release the .org domain, it is planning a bid of its own.
Icann's request for proposals has emphasized both diversifying control over
the approved domains and a complex proposal process to qualify the
bidders.
"The board wants a stable well-functioning .org registry," said Miriam
Sapiro, founder of Summit Strategies International, a Washington company
specializing in Internet policy and international issues. "It doesn't want
to take a risk and jeopardize the domain names of 2.7 million organizations."
Mr. Malamud, who heads the Internet Multicasting Society, an organization
in Stewarts Point, Calif., that develops open source Internet software, and
Mr. Vixie, who founded the Internet Software Consortium, a group in Redwood
City, Calif., that develops open source versions of crucial Internet
infrastructure software, said they planned to place the complex software
used to manage domain names in the public domain as open source, freely
available to any organization.
They say that would have the twin effect of making it simpler for Icann to
diversify control of the domains and of making it easier to create new
ones. The issue is a hotly debated one that the organization, which was
created under a contract with the United States Commerce Department, is
struggling with.
"This shouldn't be a dot-com opportunity," Mr. Malamud said. "There has
been a lot of smoke and mirrors, but what we need is actually a public
utility that is well managed in the public interest."
************************
New York Times
Battle Over Access to Online Books
By DAVID D. KIRKPATRICK
When Internet song-sharing services created digital jukeboxes of free music,
book publishers raced to bolt the door to their own archives of copyrighted
works.
Many librarians, on the other hand, thought the idea was pretty exciting.
Now, new technologies are igniting a similar battle closer to home.
Librarians have seized on the potential of digital technology and offered
users free online access to the contents of books from their homes, and
they are squaring off with publishers who fear that free remote access
costs them book sales.
And as they haggle over contracts to obtain the rights to books,
librarians' trade groups have begun lobbying against publishers on Capitol
Hill, as well, fighting to block legislation that would strengthen copy
protection and make it illegal to duplicate protected digital files. They
have already been fighting publishers in a suit asking the Supreme Court to
shorten the duration of copyright.
Libraries are one of the few places where there is real demand for
electronic books, which so far have been a dud with consumers. The Web
sites of more than 7,300 libraries, including the New York Public Library,
provide patrons 24-hour remote access to the texts of a few hundred to
several thousand electronic books, occasionally even in languages like
Chinese and Russian.
"What we are really excited about is the potential of the technology to
allow greater dissemination of information because getting information into
the hands of everybody we can is what we are all about," said Miriam
Nisbet, legislative counsel for the American Library Association. "What we
are concerned about is the dark side, which is trying to lock everything up."
But locking everything up is exactly the response from the largest
publishers. Although hundreds of smaller publishers with fewer popular
titles have allowed libraries to "lend" their books electronically, the
major trade publishers are refusing to cooperate.
"Lending over their Web sites -- I think that is a problem," said Laurence
Kirshbaum, chairman of the books division of AOL Time Warner. "There is an
inherent danger that would worry me -- you are opening yourself up to being
copied wildly without any control."
The tensions between libraries and publishers are coming to the fore as
several companies that originally aimed to help publishers sell digital
books to consumers have instead set their sights on serving libraries,
which are the only paying customers. Some of their efforts are ruffling
publishers' feathers.
One controversial case involves the renegade publisher RosettaBooks. At the
American Library Association meeting over the weekend, the company began
selling libraries a chance to provide patrons unlimited, simultaneous
access to a collection of 100 20th-century classics by authors like Kurt
Vonnegut and William Styron for an annual fee of $200 to $1,000.
RosettaBooks acquired the rights to publish digital editions of the books
directly from the authors, arguing that the original contracts covered only
printed books. But the original publishers of the books argue that
RosettaBooks is violating their exclusive licenses to publish them for the
duration of the copyright. Random House is suing RosettaBooks on those
grounds, with support from the publishers' trade group.
The offer by RosettaBooks is forcing librarians to take sides, but
Frederick W. Weingarten, director for information technology policy at the
American Library Association, said, "I think libraries will pick that up
and run with it." He added, "They are not going to be looking at whether or
not they are offending publishers." A Random House spokesman declined to
comment.
Others are taking smaller steps. NetLibrary, the biggest middleman selling
electronic editions to libraries, has tried to mollify publishers by
restricting access to each digital book to one patron at a time -- just like
printed volumes -- even though the technology would allow everyone in town to
read an electronic book from home at once. But in January a consortium of
libraries acquired netLibrary, and this year the company has begun
experimenting with allowing libraries the right to offer more than one user
access to the same books at once. In the experiments, a library pays an
annual fee to circulate a collection of digital books for a year and
netLibrary passes a portion of the fees to publishers.
"Libraries very much want to move away from the one book, one user model,"
said Marge Gammon, senior director for marketing and publisher relations at
netLibrary. "We get asked a lot, 'When is that going to change?'"
Others originally aiming to sell electronic books for publishers have
switched their focus to helping libraries. One company, Ebrary, started out
seeking to provide a digital showcase to let consumers browse and buy
books. This year, it shifted its emphasis to selling libraries a database
of digital books they can offer their patrons. Fifty libraries have signed
up so far. The Web sites of many libraries already direct users to sites
like Project Gutenberg and the University of Virginia's Electronic Text
Center, which provide free access to digital books no longer under
copyright. More than 30,000 people a day visit the Electronic Text Center,
where the texts of 10,000 books are available to the public.
Others see digital books as a way to provide immigrants access to
hard-to-obtain foreign language texts. The Queens Public Library in New
York, for example, provides unlimited simultaneous access to 3,500 Chinese
texts. More than 400 Chinese speakers used the service last month, and many
apparently read entire texts, said Gary E. Strong, the library's director.
So far, RosettaBooks is one of the only English-language publishers to
offer similar access to their books, even under an annual lease. "That is
one of the big complaints," said Kay Cassell, associate director for
programs and services at the New York Public Library. "It would be great
for libraries if we could have more than one person access an electronic
book at a time. But I think the publishers are a little scared."
****************************
San Francisco Chronicle
Pay-TV piracy
Piracy is costing the cable TV industry an estimated $6.5 billion a year, and satellite providers hundreds of millions more
Patrick doesn't consider himself a pirate, nor does he feel guilty about
pulling down free satellite TV signals for the past two years.
The Bay Area resident, who did not want his full name published, uses a
computer to descramble the TV signals beamed down from outer space,
bringing him an unlimited selection of movies, sports and news.
"It's more of a hobby, it's more of a game," said Patrick, referring to the
satellite TV industry's attempts to crack down on pirates, including
transmitting signals that knock out the reception. "As soon as they turn
you off, you go to a couple of (Web) sites, find a file, get it fixed.
Sometimes you get it fixed within a half hour."
But officials of both the satellite and cable TV industries view these
pirates as a multimillion-dollar headache -- not a game. In the past year,
both industries have become more aggressive in their efforts to sink pay-TV
pirates, including a new effort to use the 1998 Digital Millennium
Copyright Act to hunt down people who have illegal satellite hookups.
DirecTV, for example, has sent about 10,000 legal notices to people around
the country in the past six months demanding that they stop using devices
that decrypt satellite signals or face civil or criminal prosecution.
"It's always important to go after the end users to send a message that
it's not right to steal," said Larry Rissler, a former FBI agent who now
heads the Office of Signal Integrity for DirecTV Inc., the El Segundo (Los
Angeles County) company that is the largest U.S. provider of satellite TV
service.
Last week, AT&T Broadband officials in Southern California used a 5-ton
steamroller to crush 3,000 "black boxes" -- illegal set-top devices
designed to unscramble cable TV signals -- that were seized by law
enforcement officials during the last two years in the Los Angeles area.
"We thought we'd bring attention to this as an issue by doing more of a
public destruction of the boxes," said Patti Rockenwagner, executive
director of corporate communications for AT&T Broadband, which is the
dominant cable TV service for the Bay Area.
The black boxes came from completed criminal cases that brought in about $2
million in fines to AT&T Broadband. Rockenwagner said the company still had
10,000 illegal boxes being kept in storage as evidence in pending cases.
"A lot of boxes we recovered from illegal users who were then prosecuted,"
Rockenwagner said. "Many paid in excess of $2,500."
The National Cable and Telecommunications Association has estimated cable
theft costs the industry $6.5 billion per year in lost revenue. The costs
to the satellite TV industry are estimated to be hundreds of millions per
year.
Consumer electronics groups, however, fear that this battle to control TV
piracy could be used to bolster arguments from entertainment companies that
all future digital devices should have built-in anti-piracy technology.
Pay-TV piracy has been a problem ever since cable TV operators started
laying cable. Pirates found low-tech ways to splice into the line to hook
up their sets.
But in the digital age, pay-TV piracy has also gone high tech, particularly
for satellite TV providers.
The services use set-top receivers that are able to decode digital signals
beamed down from space with a device called a conditional access module,
also known as a smart card. But the receivers will get signals for free
with a hacked or counterfeit card.
Satellite companies are able to disable the counterfeit cards by
transmitting electronic counter measures, known as ECMs.
But the Internet abounds with sites that dispense and share information
about reprogramming smart cards that have been scrambled by an ECM. The
sites also have forums for members who call themselves testers or
hobbyists. Some sites advertise and sell the necessary equipment like
counterfeit cards or card readers.
Patrick said he pieced together a system that works off an $80 satellite
receiver, which came with a smart card, and a dish he bought at Wal-Mart.
He feeds the signal through a computer, which has a program that decodes
the transmission and sends it cleanly to the TV screen.
The Web master of the Digital Satellite Underground
(www.dssunderground.com), a 6-year-old Canadian site, said in an e-mail to The Chronicle that
more than 100,000 people have registered for its forums.
"Lots of people try to say it's a hobby to try to justify it," said the
site's Web master, who goes by the name Deejay. "Yes, a small amount of
people actually do learn things about electronics, and smart card tech, and
do find it an interesting hobby, but from what I have seen, the majority of
the public just wants free TV."
However, Deejay also said part of the blame rests on satellite TV companies
that are charging subscription prices he called "way too high and out of
control."
Rissler, however, scoffs at that argument and the contention that people
are entitled to any signal that falls into their backyards. "There seems to
be an almost alarming lack of appreciation on the part of some people on
the concept of intellectual property," he said.
The most famous electronic countermeasure came on Jan. 21, 2001, when
DirecTV beamed what became known as the "Black Sunday ECM." The ECM recoded
all access cards, making illegal ones inoperative.
That created an explosion of hardware devices that circumvented the Black
Sunday ECM, Rissler said.
DirecTV responded by creating a team of its own investigators and lawyers,
who go directly after the hardware distributors, using for the first time a
provision in the 1998 Digital Millennium Copyright Act that makes it
illegal to circumvent technology designed to protect copyrights.
The company has filed about 30 civil suits, and in a series of raids in
California, Texas and Florida, seized equipment with an estimated street
value of $4.5 million. In May 2002, the company won a $1 million judgment
against an Indiana man who was swept up in one of those raids.
DirecTV even sued a radio personality who described on the air how he used
a hacked access card to receive free programming.
In the last six months, DirecTV's lawyers have started contacting the
customers listed in records seized during the raids.
"They say, 'We've identified you, we want you to stop, we want you to send
us the device and, by the way, you owe us money,'" Rissler said. "We're
getting a lot of settlements just on the demand letter."
Deejay, the Digital Satellite Underground Web site master, said he got rid
of his equipment after the Canadian Supreme Court in April ruled that it
was illegal for Canadian residents to decode U.S. satellite signals.
Now, Deejay said his site does not sell circumvention equipment, but is a
forum on satellite encryption and smart card technology.
"We have been threatened by DTV and their lawyers, but nothing has ever
come of it," Deejay said. "Do they think they can stop people from talking
about this stuff? One site gets shut down, and five more pop up to fill its
place."
Cable operators haven't faced the same problem as the satellite providers,
since cable boxes don't rely on smart cards. But cable TV pirates use a
variety of methods, from illegally tapping into a cable line to using a
black box.
Keith Relph, security director for AT&T Broadband's Bay Area cable
operations, said his 40-member team finds about one black box per day.
AT&T Broadband dominates the Bay Area's cable TV market, with 1.8 million
customers out of 2.6 million homes that could have cable. But Relph also
estimates that approximately 10 percent, about 260,000 homes, have an
unauthorized hookup.
Relph's security team monitors the area for electronic signal leakage. But
Relph said that instead of initiating raids, sending a formal legal notice
or beaming in electronic counter measures, the team leaves a door hanger
that "lets them know they are in violation" and invites them to become
paying customers.
"We're trying to take a gentler approach," Relph said.
An official from a group representing consumer electronics makers wonders
if the problem of controlling pay-TV piracy could get dragged into a larger
debate about using technology to protect against copyright infringement of music,
movies and other digital entertainment.
Jeff Joseph, vice president of communication for the Consumer Electronics
Association, said there is already a proposal to install "selectable output
control" technology in future cable TV set-top boxes that could give
entertainment companies "remote control of America's living rooms."
Pay-TV piracy "is wrong, there are laws against that and there are
penalties against that," Joseph said. "But let's not confuse piracy with
what most consumers want to do. It can't come at the expense of home
recording rights or technological innovation."
**********************
New York Times
Beijing Mayor Cracks Down on Internet Cafes After Fatal Fire
By ERIK ECKHOLM
BEIJING, June 16 -- The mayor of Beijing ordered all Internet cafes in the
city today to shut down temporarily after at least 24 people died in an
early-morning fire in an unlicensed cafe in the university district.
The fire, said to be the deadliest here in more than 50 years, started at
2:40 a.m. in the Lanjisu Internet cafe in northwestern Beijing. Like many
similar establishments, the Internet cafe was crowded with students who
cannot afford personal computers and visit the cafes at night to take
advantage of lower rates as they surf the Web, play Internet games, dawdle
in chat rooms and exchange e-mail.
And like many similar cafes in Beijing and around China, it was a firetrap.
Dozens of computers were crowded into a second-story room with barred
windows and just one door that witnesses said was bolted shut, requiring
customers to knock to be let in or out.
Neighbors described hearing the screams of students pressed against the
windows, unable to escape. Firefighters broke into a back window and
rescued several patrons, according to a report by the state-run New China
News Agency, and had the blaze under control within about two hours.
The authorities released no information today about the cause of the fire,
and the police were reportedly trying to find and arrest the owner.
Past efforts by Beijing and other cities to regulate the proliferating
Internet bars have involved more threats than effective action. In Beijing
alone, there are some 2,400 computer cafes, only 200 of them licensed, the
news agency said.
But the new tragedy evoked unusually strong vows from Beijing leaders, who
know that the country's widespread problems with fire and building safety
have become politically sensitive. In December 2000, when a disco fire
killed 309 in the central city of Luoyang, official careers were derailed
and 23 people deemed directly responsible for the unsafe conditions in the
building were sent to prison.
Today, after an emergency meeting, Mayor Liu Qi said all Internet cafes,
legal or not, must close down immediately, the news agency reported.
No new licenses will be granted to Internet cafes, the mayor reportedly
decreed, and those previously licensed will have to reapply and certify
that they "meet relevant requirements." Any illegal cafes, the mayor said,
"will be severely punished."
The mayor also announced a special three-month inspection of fire safety in
all the city's enterprises and commercial buildings.
Officials have been frustrated not only by safety lapses in the cafes, but
also by their alleged use to spread pornography, provocative rumors and
unapproved political ideas or information.
School officials have also fretted that computer rooms are luring teenagers
away from their studies and are a bad social influence.
But the demand for computer services is huge and growing, and officials did
not say how they will stamp out the thousands of establishments, which
range from storefronts with a few machines to large-scale operations like
the one that burned today.
By this evening, a computer bulletin board run by People's Daily, the
newspaper of the Communist Party, carried several messages criticizing the
Beijing authorities.
The underground Internet bars are so popular, one writer said, because the
local police and officials make it so difficult to obtain a license.
Another person, noting the city government's urgent new safety measures,
wrote, "Everyone is asking what they were doing before this happened."
***************************
Associated Press
Number of U.S. Telecommuters Rising
Sun Jun 16, 3:30 PM ET
By MAY WONG, AP Technology Writer
MILL VALLEY, Calif. (AP) - With its quaint shops and leafy residential
roads, it's easy to mistake Mill Valley for simply a quiet, upscale bedroom
community across the Golden Gate Bridge from San Francisco.
Truth is, there's as much wheeling and dealing in this town as in a
big-city skyscraper.
From their Mill Valley homes, Joe Caldwell handles the investment
portfolios of millionaire clients, Robin Thompson works with large
corporations like Wells Fargo or Oracle, promoting Canada as a meeting
destination, and Marilyn Jackson's computer consultancy clocks in at three
clients a day.
The three are part of a growing contingent of Americans whose commutes
consist of a walk down the hall or a jaunt to the converted garage.
The number of Americans working at home three or more days a week grew
nearly 23 percent, from 3.4 million in 1990 to 4.2 million in 2000,
according to U.S. Census figures. Mill Valley topped California's list,
with 15.4 percent of its 14,000 residents working at home.
The census category includes farmers, so South Dakota, at 6.5 percent,
leads other rural states atop the nation's work-at-home list. And the
census only partly reflects the growing scope of telecommuting, since
millions of others work from home one or two days a week as corporate
America has grown to accept more flexible schedules.
"The biggest constraint was managers letting people telecommute, and that's
diminishing," said Patricia Mokhtarian, a civil and environmental
engineering professor at the University of California, Davis.
The estimated number of Americans who telecommute at least some portion of
the week jumped more than 42 percent in two years, from 19.6 million in
1999 to 28 million in 2001, according to the International Telework
Association and Council. Most live in New England and on the East and West
coasts in areas with dense populations and notorious traffic congestion,
said Tim Kane, the organization's president.
More than two-thirds of telecommuters surveyed by the group said they're
more satisfied or much more satisfied since they began working at home,
Kane said. "They're saying, 'This is three hours I don't need to be in the
car, and I could be with my kids, pick up the dry cleaning, or whatever.'"
The changes are evident in Mill Valley, where people armed with laptops,
cell phones and personal digital assistants set up shop among the
latte-drinkers at the Depot Bookstore and Cafe, its outdoor patio
overlooking the town square.
"I see all kinds of people now -- they're figuring out retail or real estate
issues or calling suppliers," said Peter Graumann, a clerk at the store.
"It's not just the writers and artists anymore."
Now that computer firewalls allow secure connections to corporate networks,
work-related communication can happen anytime, anyplace.
"People are amazed they get e-mails from me at 5 a.m. or 10 p.m.," said
Thompson, a manager for the Canadian Tourism Commission, her first
full-time telecommuting job in a 20-year hotel industry marketing career.
Instead of leaving by 6 a.m. to beat the traffic over the Golden Gate,
Thompson can be hard at work by dawn. If her 13-year-old daughter needs her
during the day, she can complete a chunk of work later in the evening.
Productivity doesn't suffer, many telecommuters say.
"There's no office chitchat, no 'how was your weekend?'" Thompson said. "I
get a lot accomplished without all the interoffice distractions, and no
commute."
A computer, phone line, dial-up modem and Internet access are all many
telecommuters need. For many home-based small businesses, their storefronts
are on the Web and delivery services come to the door.
As Jackson, owner of mjacksoncomputers.com, puts it: "I'm really not based
anywhere. I'm really based in the Internet, which is the tech universe."
The computer consultant walks through her rose garden to her workspace in a
converted garage. Between e-mails and house calls, she squeezes in a daily
5-mile hike from Mill Valley to the Pacific Ocean, and avoids peak road
congestion periods.
It's all about regaining a more balanced quality of life.
"People don't want to put in the 16-hour days to drive an hour-and-a-half
or two from home and then come back," said Charlie Grantham, chief
scientist at the Institute for the Study of Distributed Work, a think tank
based in Windsor, Calif., in northern Sonoma wine country, and himself a
telecommuter. "And corporate America is beginning to examine how to use
technology to connect the workers they need with the work that needs to be
done, regardless of where the workers are located."
In fact, of the 8 million business broadband subscriptions expected this
year, more than 60 percent will be for residences, according to
In-Stat/MDR, a Scottsdale, Ariz.-based market research firm.
At Cigna Corp., where about 9,000 of the 43,000 employees have arranged
with their managers to telecommute, a formal E-Worker program was started
two years ago. Already, 2,100 workers have signed up, getting additional
training, home-office equipment and technical support.
Productivity increased by as much as 15 percent, and job turnover rates
have been cut nearly in half in some divisions of the Philadelphia-based
insurance company.
Now, Cigna is adding "touchdown spaces" in more of its 250 offices for
workers who occasionally need shared offices for meetings or social
contact. It also has a call-forwarding system allowing untethered employees
to answer their direct lines wherever they are.
Support services also are adapting.
At Mill Valley Services, a printing business, owner Dave Semling needs the
latest technologies to serve his clients. Last week, a client e-mailed a
document, which he printed on a four-color press and then sent to another
at-home worker on the East Coast.
Many companies have come to recognize telework as a recruiting tool.
Scimagix Inc. of San Mateo, which does imaging software for drug companies,
offers two-day-a-week telecommuting as it competes for engineers.
"You have to do this to run a business," said the startup's co-founder,
Bryan Van Vliet, a married father of two young children who now works at
home three days a week himself. "You're looking for a good pool of talent,
and you can't always find someone who lives 10 minutes away."
Jackson, the Mill Valley computer consultant, said telecommuting helped her
raise and support two children after her husband died of cancer.
"It made a huge difference to me that when I woke up or that when I came
home from school, she was here," said her oldest child, Noah, a 20-year-old
UC Berkeley student. "And being able to eat as a family every night made a
big difference, too."
*****************************
Reuters
Microsoft at Heart of Calls for Software Liability
Sun Jun 16, 4:50 PM ET
By Elinor Mills Abreu
SAN FRANCISCO (Reuters) - Microsoft Corp. , a company known for its popular
software and its very deep pockets -- but also glitches in some products --
is a liability lawyer's dream: the big game target that always gets away.
For decades, software makers have been protected from lawsuits as U.S.
courts have struggled with the task of defining something as abstract and
fast-changing as computer code.
But now a growing number of voices within the industry and government are
arguing for software to be held to the same standards as other products, a
potential reform that puts the world's largest software vendor squarely in
its sights.
Although it's hard to put a dollar figure on the potential risk to
Microsoft, with almost $39 billion in cash and short-term investments, the
company would be the obvious deep-pocketed target, said Mark Rasch, a
computer and Internet policy lawyer in Bethesda, Maryland, and former head
of the U.S. Department of Justice's computer crime unit.
"They've got such a huge market penetration. They're a huge, deep pocket.
Their software has a lot of vulnerabilities and defects in it and people
tend to use a whole suite of their software," Rasch said.
"It is the homogeneity of the environment that means that a particular
vulnerability in one piece of software can expose a company to a lot of
damages," he said. "So Microsoft is target No. 1 for this potential
litigation."
Although calls for reform are increasing, it's not likely to happen anytime
soon, given that laws typically lag technology, Rasch and other experts
concede. Debate over the issue is as old as the software industry itself.
If and when it comes, Microsoft, no stranger to titanic legal battles, will
be prepared, having weathered long-running antitrust litigation in both the
United States and a parallel probe in Europe.
But this time, against a backdrop of increasing complaints, the company has
also taken pre-emptive steps to improve the security of its software.
In general, Microsoft says it is unfairly targeted because of the
popularity of its software, which runs everything from PCs and handheld
devices to servers and game consoles.
The products are even less buggy than others when measured against how
widely they are used, Microsoft Chief Executive Steve Ballmer has said.
And with software increasingly becoming more interconnected with other
systems, it's often hard to tell exactly where problems are coming from,
said Craig Mundie, Microsoft's chief technical officer.
"Society has benefited from high-volume, low-cost software and a rapidly
evolving ecosystem" where disparate computer systems, software and hardware
link up, Mundie said. "Microsoft can't control that process. If the printer
driver tanks the system, who do you hold liable?"
AIR FORCE COMPLAINS
But some clients have reached the limits of their tolerance. Air Force
Chief Information Officer John Gilligan has complained to Microsoft and
other companies, for example.
"I'm spending more money patching and fixing than we did to buy" the
software, he said in a recent interview. "I can't afford to do this anymore."
The complaints run far and wide. "It's a confusing point to me that
Microsoft can release a product which has fundamental flaws and they're in
no way held accountable for that," said Tim Wright, chief technology
officer and chief information officer of Terra Lycos S.A.
"It's like Boeing making planes that crash and saying it's (waiver of
responsibility is) in the disclaimer," Wright said from his Waltham,
Massachusetts-based office of the Spanish-American Internet media company.
Even market researchers and insurance companies -- themselves harbingers
for the legal community -- have weighed in.
Problems with Microsoft's Internet Information Server (IIS) were so
dramatic that last summer John Pescatore, an analyst at leading technology
research firm Gartner Inc., recommended that users of this critical piece
of Web site software switch to alternate Web server software.
British-based J.S. Wurzler, an insurance underwriter for Lloyd's of London,
last year raised its rates for IIS users, citing the large number of
security holes affecting it.
"Today, Firestone can produce a tire with a systemic flaw and they're
liable," says Bruce Schneier, chief technology officer of network
monitoring firm Counterpane Internet Security, who has been calling for
software liability reform for years.
"But Microsoft can produce an operating system with multiple systemic flaws
per week and not be liable." (With additional reporting by Eric Auchard in
New York.)
********************
Chicago Sun-Times
Web surfing gives up suspect
BY JIM SUHR
ST. LOUIS--Perhaps it was cockiness that got the best of Maury Troy Travis.
Maybe he was just naive about cyberspace.
Investigators say Travis didn't emerge as a suspect in the killings of
women around St. Louis until he sent a computer-drafted letter to the St.
Louis Post-Dispatch. More important, he enclosed a map downloaded from the
Internet.
''Before that, his name to my knowledge never came up,'' said police Capt.
Harry Hegger, who has led the investigation into the deaths of 10 women
whose bodies have turned up on both sides of the Mississippi River since
April 2001. ''That letter was the critical piece of evidence.''
Though the mailing was unsigned, Travis' electronic fingerprints were all
over it. He was arrested June 7.
Three days later, the case took another strange twist: While jailed on
federal charges in the abduction, torture and strangling of two
prostitutes, Travis, 36, was found hanging in his cell. Officials ruled it
a suicide.
Police credit computer forensics for cracking the case even as others
question whether the tool foreshadows an Orwellian ''Big Brother'' threat.
''It's almost science fiction,'' said Charles Marske, a Saint Louis
University sociology and criminology professor. ''I suspect most of us are
pretty oblivious to this. As long as it doesn't affect us directly, we may
yawn and go about our business.
''But I think people who get on the Internet really fail to realize that
their comings and goings through cyberspace can be tracked.''
The FBI declined to discuss the case. But the criminal complaint against
Travis offers some insight:
On May 24, five days after the Post-Dispatch profiled one of the slain
prostitutes, the newspaper received a letter with the return address of ''I
Thralldom'' in New York City. The stamp was of an American flag, placed on
the envelope upside down--an international distress signal.
In neat, red computer-generated cursive, the letter complimented a
Post-Dispatch reporter on a ''nice sob story.'' Enclosed was a
computer-generated map indicating where a missing person's body was,
showing an intersection in St. Charles County's West Alton, along with a
handwritten X.
Investigators, using the letter, found human skeletal remains within 50
yards of the location shown by the map's X, which was about 300 yards from
where two other prostitutes' bodies had been found.
Four days later, investigators determined that the letter and map were
mailed locally, and learned ''I Thralldom'' was a Web site for bondage and
sexual torture. By definition, thralldom is slavery, or a state of moral or
mental servitude.
An Illinois State Police search of online mapping companies led to an exact
match between features on the newspaper-received map and one found on
Expedia.com.
On June 3, Microsoft Corp., which tracks access to that Web site, gave the
FBI a spreadsheet showing that someone accessed the Expedia.com site just
days before the mailing of the letter to the Post Dispatch.
That user had an Internet Protocol address of 65.227.106.78, and that
address had been used to search the West Alton area on the Expedia.com Web
site. The records even showed the person zoomed in 10 times to better
define the location in the area.
The user name of that Internet address: ''MSN/maurytravis.''
Investigators followed Travis for 24 hours, determining he lived alone in
the town of Ferguson in north St. Louis County.
On June 7, he was arrested at his home.
Most of the victims, authorities say, suffered blunt trauma on the back of
the skull and had ligature marks on their necks. Their wrists were bound.
In Travis' house, searchers allegedly found a file cabinet with ligatures
and belts splattered with what appeared to be blood.
The investigation is continuing.
In recent years, computer forensics have surfaced in high-profile hunts for
suspected spy transmissions, missing Clinton White House e-mails and
missing Enron Corp. accounting documents.
''A lot of people are so naive on this stuff,'' said Michael Anderson, a
retired longtime IRS criminal investigator who is now president and CEO of
New Technologies Inc., a company that has trained thousands of police
officers and military workers on computer-tracking techniques.
In terms of computers, ''the public has got this false sense of security,
and they're running naked,'' Anderson said.
Marske said that while some might applaud investigators' computer skills in
nabbing Travis as a suspected serial killer, ''it is that Big Brother kind
of syndrome that certainly worries.''
*************************
Federal Computer Week
Security concerns raised
BY Brian Robinson
Security is important for Web services because, unlike static Web pages
that simply display information, Web services are used to exchange data
with remote systems, which opens a door on those systems to the outside world.
Simple Object Access Protocol (SOAP) and other Web services standards
effectively bypass firewalls by traveling over Hypertext Transfer Protocol
port 80, which developers use as an integration point of entry for business
partners that rely on distributed applications, said Adam Kolawa, founder
and chief executive officer of Parasoft Corp., a provider of
error-prevention tools.
"However, those same [open] entry points could be used by hackers and
viruses," Kolawa said.
One way to handle this would be to design around it. For example, you might
use different machines for your application server and Web server. Even if
people managed to penetrate the firewall, said Robert Wegener, director of
solutions for RCG Information Technology, "if they can't run anything, they
can't do anything."
Others feel security is not an issue. David Brown, .Net architect for
Microsoft Corp., believes Web services can access the same kind of security
that other HTTP-based traffic uses, "and no one seems to be afraid to put
up a Web site because of security concerns."
Although some people look to Web services as an integration process, Brown
said, others such as officials at Microsoft and IBM Corp. are looking to
use the technology to provide distributed computing in the future.
"For that, you do need more complex security because that will entail such
things as stitching services together, and the question there is how to
provide security that can simultaneously cover, say, five different
services," Brown said.
Security concerns are the major reason that most early Web services
development is happening within organizations and government agencies,
where a secure environment is provided by the enterprise firewall.
Early adopters might begin unveiling those services to the outside world
early next year.
IBM, Microsoft and online security vendor VeriSign Inc. are working on a
new specification called Web Services Security (WS-Security), a set of SOAP
extensions that would bring the kinds of security technologies used in the
broader World Wide Web into the Web services arena.
IBM and Microsoft officials say they plan to submit WS-Security to relevant
standards organizations and expect to develop other specifications that
would address other aspects of security, such as policy, trust and
authentication.
***********************
Federal Computer Week
GSA seeks input on gateway project
BY Diane Frank
The government team leading the development of a security gateway designed
to authenticate users accessing e-government services has turned to
industry for help.
The Technical Exchange Day June 7 at Mitretek Systems Inc., a nonprofit
organization assisting the General Services Administration-led team on the
e-Authentication project, was the first chance for the government to
outline its plans to a large industry gathering.
The lack of a defined technology architecture surprised many of the vendors
attending the briefing, but Tice DeYoung, the project's technical lead and
a research scientist at NASA, said the government did not want to restrict
itself to any specific architecture or products.
But the approach makes sense and is catching on in government, said Robert
Guerra, president of Robert J. Guerra and Associates. "This is tantamount
to performance-based contracting, and I think that's appropriately growing
in our industry," he said. In performance-based contracting, an agency
presents a business problem to a vendor and asks for help in solving it,
including appropriate best practices and past experiences.
Reaching out to industry should happen more, he said. "It's a culture
change we need to foster in our community."
"It's easier to recognize a good idea than to invent one," said Chip
Mather, senior vice president of Acquisition Solutions Inc. He tells his
agency clients to rely on industry for solutions. "We tell them, 'Stop
buying compliance and start buying results.'"
Vendors do have some idea of what to expect from the project. The concept
outlined at the briefing reiterated that the gateway would validate any
type of credential, such as a password or digital certificate, issued to
access any government application.
The agency program offices will determine what level of credential is
necessary for each of the applications connected to the gateway, while the
gateway will likely serve as a central point to validate credentials issued
by multiple organizations, DeYoung said.
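The division of labor DeYoung describes, where each program office sets a required credential level and the gateway validates credentials from many issuers, might be sketched like this (the levels, issuers and application names are invented for illustration; this is not the project's design):

```python
# Hypothetical assurance levels, lowest to highest.
LEVELS = {"password": 1, "pin_token": 2, "digital_certificate": 3}

# Each application registers the minimum credential level it requires.
APP_REQUIREMENTS = {"file_taxes": 3, "check_benefits": 1}

def gateway_authenticate(app, credential_type, issuer, trusted_issuers):
    """Central validation point: confirm the issuer is trusted, then check
    the credential meets the application's required assurance level."""
    if issuer not in trusted_issuers:
        return False
    return LEVELS.get(credential_type, 0) >= APP_REQUIREMENTS[app]

trusted = {"agency_ca", "commercial_ca"}
print(gateway_authenticate("check_benefits", "password", "agency_ca", trusted))  # True
print(gateway_authenticate("file_taxes", "password", "agency_ca", trusted))      # False
```

The key property is that applications never validate credentials themselves; they only declare a required level, which matches the article's description of the gateway as a central validation point.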
The gateway prototype is expected to be operational in September, working
with two to four of the other e-government initiatives overseen by the
Office of Management and Budget as part of the Bush administration's
e-government strategy.
Last week, GSA began providing the other e-government initiative leaders
with an online tool to start evaluating their authentication needs, and
soon they will be using a modified version of the Operationally Critical
Threat, Asset and Vulnerability Evaluation tool developed by the CERT
Coordination Center at Carnegie Mellon University in Pittsburgh, Pa., said
Steve Timchak, program manager for the e-Authentication initiative.
The team will discuss policy considerations for the gateway at its Industry
Day Conference, scheduled for June 18 at the Commerce Department and led by
Mark Forman, the associate director for information technology and
e-government at OMB.
***********************
Federal Computer Week
Army given funding for backup portal
BY Dan Caterinicchia
Army officials are developing an action plan for a backup of the Army
Knowledge Online portal after the service was promised funding late last
month for a mirror site to keep the portal online if the primary server fails.
Lt. Gen. Peter Cuviello, the service's chief information officer, was
promised about half of the requested $100 million for the backup site after
a May 29 meeting at the Pentagon, according to Col. Robert Coxe, the Army's
chief technology officer.
The AKO portal provides Army news, distance-learning opportunities, e-mail
accounts, a search engine and a chat room. By July, officials plan to use
it for most of the service's internal business.
Coxe said in April that it would cost more than $100 million to establish a
mirror site, largely because of storage and infrastructure costs. But he
said Army officials are pleased to be getting any funding and will now
amend their original plan, which called for doing "everything at once," and
instead will "do the basics and scale it later to make it more robust."
"We are re-looking at all of our options to ensure that we maximize every
dollar and then some," Coxe said. "We are in the process of putting
together the requirements/action plan for a cold failover site using the
dollars provided. We are moving out smartly."
Before learning of the funding, Coxe said he had secured 54 terabytes of
storage for the site and was just "waiting for the servers."
He added that he has spoken with the Army secretary, who said he would fund
it. "The question is when," Coxe said late last month at the Army IT Day in
McLean, Va.
********************
Federal Computer Week
Security on the go
CryptCard and GlobalAdmin provide comprehensive security for mobile products
BY Michelle Speir
It's no surprise that security products for mobile computing are hotter
than ever. Agencies today are well versed in the problems that come with
lost personal digital assistants and notebook computers, especially when
stolen data is involved.
The good news is that because of the cutthroat competition, vendors are
seeking ways to distinguish themselves in the market by introducing new and
improved tools that offer more sophisticated protection for mobile
products, along with enhanced user convenience.
One such product is the latest version of Global Technologies Group Inc.'s
CryptCard, a PC Card that protects information on notebooks using
high-speed encryption and password access control.
When we reviewed this product in 1999, it sold as a stand-alone card with
an optional database utility called CCAdmin ("CryptCard: Tight security for
the mobile workforce," Oct. 4, 1999).
The latest version, however, is a package deal consisting of one or more
CryptCards and a preconfigured, sealed desktop system that runs
GlobalAdmin, a centralized administration and management package for the
cards. The convenience of this setup is hard to beat: We simply plugged in
the desktop system and GlobalAdmin was ready for immediate use.
The GlobalAdmin system is a Microsoft Corp. Windows NT workstation equipped
with a PC Card reader. Administrators use this system to program, test and
manage the CryptCards as well as create user accounts. Other functions
include resetting passwords, generating reports and keeping inventory of
the CryptCards by serial number. Also, the cards can be reset and reused,
which is both convenient and economical. GlobalAdmin can also be used with
smart cards, a function we did not test.
Down to Business
Before a CryptCard is ready to use, an administrator must set up a user
account and program the card on the GlobalAdmin system. Then the user must
install the card on a notebook.
The first step in readying a card for use is to create a database within
GlobalAdmin that will store the information about each programmed card. The
process is quick and easy: the program can automatically generate the
database backup key and the initial key for the CryptCards, which must be
28 and 14 characters long, respectively, or you can set them manually.
Next, the administrator chooses password options. The system offers smart
choices, such as the ability to block user-specified words from being used
as passwords, including the agency's name, the user's own name or something
obvious such as "password."
There is also an "erase password." When someone types the erase password at
log-in, the entire CryptCard is erased and the keys discarded. To regain
access to the notebook, the card must be reset and reprogrammed.
In some cases, the system will flag an overly obvious password on its own. When
we tried to enter a password that differed from the user name by only one
letter, we received a message telling us to choose another password.
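The screening behavior described above, blocking listed words and rejecting passwords that barely differ from the user name, could be implemented along these lines (a speculative sketch; Global Technologies Group has not published its actual rules):

```python
def password_ok(username, password, blocklist):
    """Reject blocklisted words and any password that differs from the
    user name by at most one character (when lengths match)."""
    if password.lower() in {w.lower() for w in blocklist}:
        return False
    u, p = username.lower(), password.lower()
    if len(u) == len(p):
        # Count the character positions where the two strings differ.
        if sum(a != b for a, b in zip(u, p)) <= 1:
            return False
    return True

blocklist = ["password", "agency", "jsmith"]
print(password_ok("jsmith", "jsmitt", blocklist))       # False: one letter off
print(password_ok("jsmith", "Tr4il-head!", blocklist))  # True
```

A real product would likely add length, dictionary and character-class rules, but this captures the one-letter-difference check we tripped over in testing.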
After determining password options, administrators set up keys that enable
users to exchange information via encrypted floppy disks. Creating and
editing keys is simple, and you can create as many as you like.
The next step is to create key groups, which are sets of keys that can be
assigned to users. Curiously, you must create them whether or not you have
created keys. There is no limit to the number of key groups you can have,
and each group can consist of any combination of keys. In other words, the
same key can be part of multiple key groups. Users who share the same keys
or key groups may exchange encrypted floppy disks.
Administrators must then create user groups, which is simply a matter of
naming them. Examples could be "training group" or a department name. This
way, users can be organized within the system logically.
The rights that can be assigned to a CryptCard user encompass many options,
such as access to the notebook's serial and parallel ports, the ability to
read from or write to a floppy disk, and the ability to boot from a floppy.
Encryption options include a 56-bit Data Encryption Standard; 112-bit DES;
Single Connector Attachment algorithm for low security and high
performance; and 128-bit triple DES Cipher Block Chaining (CBC) mode, the
most secure of all. Advanced Encryption Standard will be available in the
next release.
We liked GlobalAdmin's ability to create CryptCard profiles. A profile
contains a set of rights, encryption options, key groups and user groups.
When administrators program multiple cards, they can simply assign a
profile to each card and all information is automatically added.
Once a user account is created, a CryptCard can be programmed. This is a
simple process that only takes about 30 seconds. Simply insert a CryptCard
into the GlobalAdmin system, choose a user and click the program button.
You can also enter the card's serial number here for tracking.
If you forget how a card is programmed, you can insert it into the
GlobalAdmin system and look up the information, but we think this process
could be more intuitive.
Installing the card on a notebook is easy and straightforward. Our first
test card was programmed for partial encryption using 112-bit DES CBC. It
took only about five minutes to complete this process on a 40GB hard drive,
but the company does not recommend partial encryption because of security
concerns. Full encryption takes much longer, but it's worth the wait, and
it's only a one-time process.
The security performance is indeed impressive. The system requires the
password even before Windows starts, and the user name cannot be changed.
If the card is removed at any time, the notebook screen goes black and the
machine becomes useless. During normal use, though, there is no difference
in notebook operation.
Different users with different CryptCards can access the same notebook as
long as they are programmed as group users within the same group and are
assigned to the same key group.
We hit a couple of rough spots. The documentation, for one, is a bit uneven
(see box). We also found many of the on-screen labels in the interface to
be confusing. What's more, at least one bug-ridden field repeatedly
returned an error message.
The Bottom Line
The CryptCard provides extremely tight security for notebook computers,
provided that users remember to take the card with them when leaving the
notebook unattended. Of course, if the card is left in the notebook and the
computer is shut down, the password will still protect the system.
We liked the out-of-the-box functionality of the preconfigured system,
although we were less than enthusiastic about the system's interface and
documentation. Still, we think the core product is excellent. With some
polishing of the software interface and the user manual, this diamond in
the rough could truly shine.
**************************
Federal Computer Week
CryptCard lessons learned
BY Michelle Speir
One concern we have with Global Technologies Group Inc.'s CryptCard is the
uneven documentation.
Administrators, for example, can assign an "uninstall" capability to a
CryptCard so users can take it off a system. Of course, most users should
not have that right, so we did not assign it to our first CryptCard,
resulting in much confusion. It was only when we were ready to uninstall
the CryptCard that we discovered we had to program an additional card with
uninstall rights.
If the instructions had been complete, this would not have been a problem.
Unfortunately, the manual omits several key pieces of information about
this process. The first is that the new card must be programmed exactly the
same way as the original card, with the only difference being the inclusion
of the uninstall capability. Our first additional card was not programmed
the same way, so it did not work.
Another lesson we learned the hard way is that once the first CryptCard,
called the installation card, has been programmed for a user, any
additional cards for that user must be programmed as spare cards instead of
installation cards. For our second attempt, we reprogrammed our original
installation card, which meant that the keys were lost and the card no
longer functioned.
*******************
Federal Computer Week
FAA to simulate GPS outages
BY Megan Lisagor
The Federal Aviation Administration plans to run a simulation in September
to assess the impact of a Global Positioning System (GPS) outage on air
traffic control.
The GPS Outage En Route Simulation (GOERS) will test how the loss of
satellite-based navigation aids affects controller workload under
conditions that include environments in which a mix of GPS and ground-based
navigational aids are available.
Jacksonville Air Route Traffic Control Center in Florida is the leading
candidate for GOERS, pending coordination with the National Air Traffic
Controllers Association (NATCA). The FAA will conduct the simulation across
five weeks, then recommend whether measures should be taken to lessen the
effects of an outage.
"While the FAA contemplates a reduction of ground-based navigation aids and
encourages more reliance on space-based navigation, the impact of a GPS
outage must be considered," said Jeff Williams, manager of the air traffic
area navigation implementation staff, during a presentation June 4 at the
FAA Satellite Operational Implementation Team public forum.
The FAA also will use findings from GOERS to inform future policy
decisions. "If the simulation proves the air traffic control system is
overly burdened by a large-scale GPS interference event, the FAA's
decommissioning plan could be amended," said Terry Mahaffey, a NATCA
representative. "The current plan is [contingent upon] the results of GOERS
and also" the later GPS Outage Terminal Simulation.
Developed and operated by the Defense Department, GPS allows airborne, land
and sea customers to determine their positions anywhere in the world with
information from satellites. It is a key technology in battlefield and
navigation systems, as well as other critical applications across the
public sector.
The FAA, for instance, will use it to help guide airplane landings as part
of the Wide Area Augmentation System, a network of ground stations that
correct GPS signals and broadcast them to receivers on aircraft.
The latest Federal Radionavigation Plan, released March 26 by the
Transportation and Defense departments, maintains the government's
commitment to move to satellite-based systems.
Yet a review last year by DOT's Volpe National Transportation Systems
Center found that GPS is susceptible to tampering, disruptions from the
atmosphere, and blockage by buildings and communications equipment.
Following the Volpe report and the Sept. 11 terrorist attacks,
Transportation Secretary Norman Mineta announced that DOT will maintain
ground-based navigation systems as a backup for as long as necessary.
The GOERS results will help the FAA determine the extent to which it needs
to keep those systems. The study isn't intended to promote a reduction of
ground-based capabilities.
The FAA began developing the plan for GOERS in 1998 at a time when it was
"basically going to start pulling plugs" or decommissioning all
ground-based systems, Williams said.
In 1999, the agency scaled back its approach. "We were right on the verge
of doing" GOERS, he said. "When they changed that scenario, it completely
altered our simulation variables."
When the Volpe report came out, GOERS was still undergoing revisions. "We
were basically on hold until the policy matured," he said. "The current
radionavigation plan states a less aggressive decommissioning effort."
The plan calls for reducing the ground-based navigation aids aircraft use
to fly across the country by about 50 percent beginning in 2007 and
finishing in 2012, according to Mahaffey.
The FAA has amended its GOERS plan to test the impact of reducing those
navigation aids to as low as 50 percent, and scenario development is
scheduled to begin this month, Williams said.
The GOERS team now is working to secure funding for the project, which will
cost about $400,000, he said.
"I think it will be very relevant, especially as we start to decommission
[ground-based] navigation aids," Williams said.
***
Stimulating simulation
The Federal Aviation Administration has several goals in simulating an
outage of the Global Positioning System:
* Gauging the effects of an outage on controller workload.
* Identifying operational issues that could arise during an outage.
* Incorporating lessons learned into a related study scheduled for fiscal 2003.
* Providing a basis for further simulations.
************************
Federal Computer Week
The lowdown on Web services
Getting past the buzz generated by emerging Internet standards
June 17, 2002
Just about anybody working on e-government applications is familiar with
Extensible Markup Language, which is fast becoming a popular solution for
exchanging information across the Internet. XML, however, is just one
component of an emerging concept known as Web services.
Still something of a mystery to many information technology professionals,
Web services nonetheless are expected to be pervasive in the
not-too-distant future, an essential part of every agency's toolkit for
quickly fielding useful and cost-efficient e-government applications.
Web services, software written to link systems over the Internet, are
intended to simplify the development of Web-based applications by
automating the underlying processes needed for systems to interact online.
Whereas XML tags information so that systems can recognize it, the three
other standards that make up Web services work more behind the scenes,
automating processes so that applications know how to handle that information.
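The tagging the article refers to is XML's labeled, nested markup, which lets a receiving system pick out fields by name rather than by position in a fixed-format record. A small example using Python's standard library (the element names are invented):

```python
import xml.etree.ElementTree as ET

# A fragment a citizen-facing application might exchange with an agency.
document = """<permitRequest>
  <applicant>Jane Doe</applicant>
  <permitType>building</permitType>
  <county>St. Charles</county>
</permitRequest>"""

root = ET.fromstring(document)
# Because each value is tagged, any system can extract fields by name
# without knowing the sender's internal data formats.
print(root.find("applicant").text)   # Jane Doe
print(root.find("permitType").text)  # building
```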
Industry is certainly leading the government toward wider use of Web
services standards. Officials at the Office of Management and Budget also
are encouraging agencies to incorporate Web services into the 24
cross-agency e-government initiatives.
That's because the Web services standards, taken together, provide a
relatively easy way to tie legacy systems together. They also help slash
application development costs and prompt innovative ways to deliver on the
citizen-centric mandate issued by OMB.
But achieving this vision might take a while. Concerns about the
interoperability of Web services across platforms and the security of Web
services-based applications could delay or derail the use of the standards.
Perhaps, observers say, as early adopters such as the Agriculture
Department, the Army and the Environmental Protection Agency deploy systems
using Web services, the strengths and weaknesses of the standards will
become better understood.
A New Computing Model
Web services are based on four open, nonproprietary standards: XML, Simple
Object Access Protocol (SOAP), Web Services Description Language (WSDL) and
Universal Description, Discovery and Integration (UDDI) (see box, this page).
These standards provide a way for systems to communicate with one another
via the Internet and enable applications to make calls to others to run
certain routines.
The approach is analogous to computer operating systems. Microsoft Corp.'s
Windows, Apple Computer Inc.'s Mac OS and Unix each provide an underlying
set of processes on which programmers can develop applications. Without
that platform, they would have to write much more extensive code.
Web services standards could lead to a new model of computing, said Steve
Ekblad, project manager at the Information Technology Center at the USDA's
Natural Resources Conservation Service in Fort Collins, Colo., which has
developed geospatial Web services to help people access mapping data from
various databases.
Now users can do what's known as asynchronous computing, he said. In the
past, users retrieving information from a variety of sources had to process
that data before forwarding it to a server.
By managing the flow of information, Web services-based tools make it
possible to break up that sequential process. Now a user can make requests
to a number of computers and rely on those systems to process the data and
return the results as soon as they are complete.
This model cuts down on the time it takes to finish a job because it
spreads data processing across multiple systems and the end user's desktop
no longer creates a bottleneck.
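The fan-out model Ekblad describes can be sketched in a few lines of Python. The service names and the stand-in function below are hypothetical placeholders, not part of the USDA system; the point is simply that requests go out at once and results are handled as each one completes.

```python
# Sketch of the asynchronous, fan-out model described above.
# The service names below are hypothetical placeholders.
from concurrent.futures import ThreadPoolExecutor, as_completed

def query_service(name):
    # Stand-in for a real Web services call; each remote system
    # does its own processing and returns a finished result.
    return f"result from {name}"

services = ["mapping-service", "soil-data-service", "weather-service"]

# Issue all requests at once instead of one after another...
with ThreadPoolExecutor() as pool:
    futures = {pool.submit(query_service, s): s for s in services}
    # ...and handle each result as soon as it is ready, so the
    # user's desktop is no longer the bottleneck.
    for future in as_completed(futures):
        print(future.result())
```

Because results arrive as they finish rather than in request order, the total time is governed by the slowest service, not the sum of all of them.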
The ability to move away from traditional single-process, single-data-access
computing means "we can finally move to [a form of] distributed
parallel computing," Ekblad said. "That's something we only talked about 10
years ago, and now it's here."
Web services applications also can be reused by others in government
without any need to rewrite or redesign them, said Avi Hoffer, chairman and
chief executive officer of Metastorm Inc.
The ability to fit together parts of different Web services to create new
ones "could cut application development time by 75 percent or more, because
people won't have to reinvent the wheel each time," Hoffer said. "I would
think people in IT groups [in government] would also be able to make the
case that Web services could be a great way to spend money."
But Web services do not have to be complex. Brand Niemann, a computer
scientist at the Environmental Protection Agency and a self-described XML
evangelist, envisions government agencies taking a simpler approach to Web
services: using XML to tag data, then delivering it from one application to
another on different Web servers.
This approach does not require the more advanced functions in the standard
set of Web services. All agencies need to do is deliver XML data from one
place on the Web to another using the popular Hypertext Transport Protocol.
SOAP, which is built on XML and carried over HTTP, is more sophisticated
and, therefore, offers more capabilities, he said, but you don't need it
just to move XML data.
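Niemann's lighter-weight approach amounts to an ordinary HTTP POST of an XML document. A minimal Python sketch, in which the element names and target URL are made up for illustration:

```python
# Sketch of the lighter-weight approach: tag data with XML, then
# move it between Web servers with an ordinary HTTP POST, no SOAP.
# The element names and target URL below are hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

# Tag the data so the receiving application can recognize it.
record = ET.Element("facilityReport")
ET.SubElement(record, "facilityId").text = "EPA-001"
ET.SubElement(record, "emissions", units="tons").text = "12.5"
payload = ET.tostring(record, encoding="unicode")

# Plain HTTP is enough to carry it; urllib.request.urlopen(req)
# would actually send it (omitted here).
req = urllib.request.Request(
    "http://example.gov/receive-xml",  # hypothetical endpoint
    data=payload.encode("utf-8"),
    headers={"Content-Type": "text/xml"},
)
print(payload)
```

The receiving server only needs to parse the XML back out of the request body; none of the SOAP, WSDL or UDDI machinery is involved.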
"Actually linking applications is the frosting on the cake," Niemann said.
"Integration is important, but people are confused [about Web services]
because it is about something more basic than that."
The EPA has several XML Web services projects in place, with more planned
for the near future. Other agencies also have working examples, such as the
Army's Electronic Bid Solicitation system, which uses XML Web services to
update potential bidders on Army procurements.
At this stage, few agencies are actively developing applications that use
the full Web services suite of standards, but most agencies likely will.
Energy Department officials, for example, said they do not use Web services
now but are looking to integrate them into future "corporate" applications.
"With our participation in numerous 'Quicksilver' e-government initiatives
and our own internal e-government project, we expect to see many
opportunities arise to put these Internet-based standards to use," said
Karen Evans, DOE's chief information officer.
An OMB-led interagency E-Government Task Force recommended a set of 24
initiatives last year. The goal is to substantially reduce IT costs by
integrating agency operations and investments.
Toward that end, Mark Forman, associate director for information technology
and e-government at OMB, is requiring agencies to include XML Web services
in the budget plans they submit to OMB as one of the ways of complying with
the e-government initiatives.
Even Congress is putting on the pressure. Sen. Joe Lieberman (D-Conn.),
chairman of the Senate Governmental Affairs Committee, is considering
requiring agencies to use XML in his recrafted E-Government Act of 2002.
Much Work Ahead
Clearly, the federal government is just beginning to make use of Web
services, and several major issues must be addressed before the standards
begin to permeate agency projects.
One chief concern is whether Web services can handle what users expect. In
their current form, they provide basic system-to-system communications,
which is useful, but observers say they fall short of industrial-strength
application integration.
"There's probably enough there as Web services are currently defined for
them to allow for basic integration" of applications, said Christine Axton,
a London-based analyst with technology consultant Ovum. "But that's not the
same as doing enterprise application integration, because Web services
don't do such things as complex data transforms."
True application integration is not just a matter of connections between
applications, she said, which is what she sees Web services currently
providing. Developers also must allow for transactional security, quality
of service, complicated billing and contracting mechanisms, and so on.
Axton believes the Web services concept is important and deserves strong
promotion, and over the next few years, it could change the way
applications are developed, delivered and used. But early overselling
created a mix of expectations and confusion, she said, and that could lead
to a backlash by potential users.
Experts say compatibility among Web services tools also is a concern. Given
the history of standards making, and the propensity of vendors to develop
their own proprietary and incompatible "flavors" of open standards
products, members of government organizations such as the CIO Council's XML
Working Group are reluctant to endorse the use of Web services until they
are convinced that vendors are intent on a true set of open Web services
standards.
The main industry body dealing with the issue is the Web Services
Interoperability Organization (WS-I), formed earlier this year by IBM Corp.
and Microsoft, along with about 50 other companies including
Hewlett-Packard Co., Oracle Corp., BEA Systems Inc. and Intel Corp.
Bob Sutor, IBM's director of e-business standards strategy, said the WS-I
focuses on ways to ensure the interoperability of basic Web services
standards by developing guidelines in the areas of basic connectivity,
secure and reliable message delivery, and coordination of transactions and
workflow. The organization will also devise tests that will give the WS-I
stamp of approval to products that pass them.
"I think we'll be done with the work on the core [set of guidelines] in 12
to 24 months and with the implementation phase another year or two after
that," Sutor said. "Then Web services should be broadly applicable for use
both inside and outside the firewall."
However, that doesn't cover every issue that users of Web services will
have to consider. For one thing, Web services are not a complete solution
to every IT problem. They are not suitable for transmitting large data
files, for example, so IT managers will still have to consider other ways
of shifting big blocks of data around the network.
Also, although Web services run over the existing Internet infrastructure,
that doesn't mean architectural considerations specific to Web services can
be forgotten.
"Deploying Web services will be a fairly complex thing," said Scott Spehar,
vice president of Cisco Federal. Besides requirements for security,
efficiency and quality of service, "there will be a significant increase in
the traffic carried over the network, as well as an uptick in bandwidth
demands because of the increased need for back-office data warehousing."
And not everyone will rush to rewrite their applications to incorporate Web
services technologies, said Sue Aldrich, senior vice president of the
Patricia Seybold Group Inc. and author of a recent report on Web services.
Established companies with large customer bases, such as SAP AG, will
provide Web services interfaces, but they won't make changes to the
applications themselves.
Nevertheless, she predicts that Web services will show "a steeper adoption
curve and will infiltrate more of the applications portfolio than previous
application technologies."
Web services will be in the majority of business applications by 2005, she
predicts, and even the cautious will deploy them in most of their
applications by 2006.
Robinson is a freelance journalist based in Portland, Ore. He can be
reached at hullite@xxxxxxxxxxxxxxx
***
At a glance
Standards behind Web services
* Extensible Markup Language (XML) is a streamlined version of Standard
Generalized Markup Language, the International Organization for
Standardization's framework for defining the structures of different types
of electronic documents. XML can be used to store any kind of structured
information and encapsulate data so it can be shared between otherwise
incompatible computer systems.
* Simple Object Access Protocol (SOAP) is based on XML and Hypertext
Transport Protocol. It provides a way for applications, including those
running on different operating systems, to communicate and work together
through remote procedure calls implemented via HTTP.
* Universal Description, Discovery and Integration (UDDI) describes how to
publish and discover information about Web services applications. It is a
Web-based directory where someone can search for particular Web services
and what they do.
* Web Services Description Language (WSDL), based on XML, describes the
kinds of software applications, or services, available on a particular
network. Once someone develops a Web service, they can publish its
description and a link to it in a UDDI repository. When someone wants to
use the service, they request the WSDL file to determine the service's
location and function calls, then use that information to construct a SOAP
request to the server.
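How the pieces fit together can be made concrete with the SOAP message itself. The operation name and its namespace in this Python sketch are hypothetical; in practice a client would derive them from the service's WSDL file before building an envelope like this and POSTing it over HTTP:

```python
# A minimal SOAP-style envelope, built by hand for illustration.
# "GetSolicitation" and the example.gov namespace are hypothetical;
# a real client derives the operation name and parameter types from
# the service's WSDL description.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")

# The body carries the remote procedure call and its parameters.
call = ET.SubElement(body, "GetSolicitation",
                     {"xmlns": "http://example.gov/bids"})
ET.SubElement(call, "solicitationId").text = "A-1234"

message = ET.tostring(envelope, encoding="unicode")
print(message)  # this XML document is what gets POSTed over HTTP
```

The server unpacks the body, runs the named routine and returns its result in a SOAP envelope of the same shape, which is what lets systems on different platforms call one another's code.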
*************************
Federal Computer Week
Change agents needed
Editorial
Almost everyone agrees that the Homeland Security Department is needed to
encourage agencies to work together to protect the United States from
terrorism. But like so many other ambitious government initiatives, the
success of the department in rooting out terrorists, spoiling plots and
reacting quickly and effectively to attacks that do occur will require a
dramatic cultural shift.
A significant part of that shift will rely on technology, specifically in
creating an enterprise architecture in which information is shared
efficiently among the agencies the Bush administration has tagged to be
part of the new department. The details of the architecture are sketchy,
but past experience shows that just trying to get the chief information
officer and program managers in the same agency to agree on a single
architecture is a struggle.
The key to creating enterprise architectures is finding information
technology managers who believe that a standard system can revolutionize
how government works and therefore governs.
Cultural change isn't easy. It takes years, and unfortunately, the nation
does not have years to wait. It also means a change in the way government
governs. Other problems exist, some IT experts point out. For instance,
much of the information gathering and analysis will be conducted by
agencies outside the new department, namely the CIA and the FBI. Congress
and the Bush administration must find solutions for these problems.
What it will take is finding so-called change agents in agencies: IT
managers who can envision the solutions and have the managerial skills to
make them happen. Such people exist in government and must be tapped.
Creating an effective Homeland Security Department will test the governing
skills of the most talented political appointees and CIOs. These change
agents could help agencies reach what thus far has been unattainable: a
truly integrated government.
**************************
Federal Computer Week
State issues ID card plan
BY Megan Lisagor
As they forge ahead with a smart card program, State Department officials
have learned that changing identification cards means more than printing
new badges.
"This is a departmentwide project and not bureau-centric," said Lolie Kull,
access control smart card implementation program manager for the
department's Bureau of Diplomatic Security. "It's very hard to get that
message out."
The department's program is in the early stages, with about 250 smart cards
released, Kull said June 5 at the Smart Card Alliance Inc.'s symposium in
Washington, D.C. Officials hope to issue another 20,000 cards by August and
eventually give them to 35,000 workers.
The move to smart cards will not happen overnight. During a one-year
transition period, employees will wear both their old and new ID cards.
Also, staff members working abroad will receive cards as they return to the
United States.
"The biggest challenge [for] any agency such as theirs is to identify
short-term requirements and long-term goals," said Randy Vanderhoof,
president and chief executive officer of the nonprofit Smart Card Alliance.
"I think the State Department has taken a very informed approach to doing
this."
The cards will provide physical and logical access and will include a
public-key infrastructure identity certificate. They will comply with the
General Services Administration's smart card interoperability specifications.
"Everybody is resistant to change," Kull said. "Once we do succeed, life
will be easier for everyone."
"When you're trying to do anything systemwide, it often impacts individual
responsibilities and the performance of individual departments," Vanderhoof
said. "We find this happens any time there's a systemwide change."
**********************
Federal Computer Week
Trucking app secured
BY Dan Caterinicchia
A stolen tractor-trailer carrying hazardous materials is tearing through
Northern Virginia and headed toward the U.S. Capitol building, where the
concrete barriers and police might not be strong enough to stop it.
But before the vehicle and its deadly payload get anywhere near the target,
the truck's engine seizes, its brakes lock up, an alarm blares and
potential disaster is averted.
This scenario may have seemed farfetched just a year ago, but the Sept. 11
terrorist attacks and the subsequent focus on homeland security have
forced government and industry officials to explore technology that can
help head off a potentially deadly situation.
With that in mind, Qualcomm Inc. last month announced a trio of security
enhancements to its OmniTRACS mobile communications system. OmniTRACS was
developed to support data communications between a vehicle and dispatch
officers at its base of operations and to provide vehicle tracking.
The enhanced security, combined with the system's satellite communications
base, enables the dispatch office to remotely disable the vehicle if, for
example, a truck is stolen, veers off course or is entering a dangerous
area, said Marc Sands, vice president and division counsel for Qualcomm
Wireless Business Solutions.
The features are:
* Driver authentication through a unique identification and password, in
which a driver's log-in name is validated through over-the-air
transmissions that interface with a secure database at Qualcomm's network
management center.
* A wireless panic button that augments the current in-dash capability so
that drivers can send a signal for help if they're threatened while outside
the vehicle.
* A tamper-detection feature that alerts fleet management or the driver if
an attempt is made to disable the OmniTRACS unit.
Should a problem arise, the satellite link enables a dispatcher to remotely
disable the truck or lock trailer doors.
San Diego-based Qualcomm also plans to train users who purchase the new
security tools, said Jeff Nacu, manager of field engineering at Qualcomm.
Transportation experts see an important role for technology in their field.
"For the last few years, there's been an accelerating rollout of technology
into the world of transportation," said Rep. Thomas Petri (R-Wis.),
chairman of the House Transportation and Infrastructure Committee's
Highways and Transit Subcommittee. Those technological advances have not
only increased security and safety, but have also led to increased capacity
without adding lanes to highways, he said.
Rep. Mike Rogers (R-Mich.), co-founder of the Intelligent Transportation
Systems Caucus, said that because the United States can't afford to hire as
many law enforcement, intelligence and border patrol officers as it needs,
"we have got to do it smarter."
Rogers said that about 60 percent of international terrorist incidents
target transportation, and more than 90 percent of those are ground-based
targets.
A pair of trucking companies that haul hazardous materials for the Defense
Department, Superior Carriers Inc. and Baggett Transportation, use
Qualcomm's OmniTRACS system to monitor the hauling of cargo through the
Defense Transportation Tracking System, Sands said.
The tracking system combines satellite positioning and communications
technology with digitized mapping and 24-hour operations to ensure
in-transit ordnance safety and security.
The basic OmniTRACS system costs about $2,000 per truck, but with the added
security, the price would be closer to $5,000, Sands said.
Ellen Engleman, administrator of the Transportation Department's Research
and Special Programs Administration, said that her office is looking at all
possible transportation security solutions. She added that both the public
and private sectors must tackle the issue.
"Behavioral awareness, training, education and technology" all must be
built into solutions, Engleman said.
*************************
BBC
Official site to advise on state snooping
The UK Government has set up a website to advise other organisations on the
best way to snoop on citizens.
Later this month a raft of government departments and organisations will be
added to the list of people that can compile records of what British people
get up to with their mobile and fixed phones, fax machines, web browser and
e-mail accounts.
The website has been set up to help the new organisations stay within the
law while they carry out covert surveillance.
But critics say the department charged with overseeing the use of
surveillance will be unable to cope with the deluge of snooping likely to
be unleashed by the new laws.
'Understaffed'
The Office of Surveillance Commissioners was set up in 1999 by the Home
Office to watch over the activities of organisations that can carry out
covert surveillance.
The OSC gained a new role when the controversial Regulation of
Investigatory Powers Act became law in 2000.
The Act made it much easier for Customs and Excise, police forces and
intelligence services to get permission to spy on criminals and citizens.
Before the RIP Act was passed, anyone wanting to carry out surveillance had
to justify their need to spy in front of a judge.
The RIP Act removed this requirement and instead put approval and oversight
into the hands of the Office of Surveillance Commissioners.
'Completely unfeasible'
The website unveiled at the end of last week will act as a central
information point for any organisations that have to carry out surveillance
and give advice on best practice.
One of the three surveillance commissioners appointed to the OSC has the
job of ensuring that covert surveillance of what people do with e-mail, the
web, phones and faxes is carried out in line with the RIP Act.
Ian Brown, director of the Foundation for Information Policy Research,
expressed doubts that the understaffed and underfunded OSC will be able to
police the use of surveillance, given that many more organisations are
about to get snooping powers.
"The number of staff that it has is such that it is completely unfeasible
that it would be able to provide any oversight of how these powers are
used," he said.
Mr Brown said that police forces have already declared that they will not
maintain a central register of how much surveillance they are carrying out.
As a result, he said, staff working for the OSC will have to travel the
country checking the records of police forces and every other organisation
that is carrying out surveillance.
************************
BBC
Web gives a voice to Iranian women
The web is providing a way for women in Iran to talk freely about taboo
subjects such as sex and boyfriends.
Over the past few months there has been a big jump in the number of Persian
weblogs which are providing an insight into a closed society.
Weblogs, or blogs, are online journals where cyber-diarists let the world
in on the latest twists and turns of their love, work and internal lives.
"I could talk very freely and very frankly about things I could never talk
about in any other place, about subjects that are banned" said one of the
first women to start a blog in Iran.
Underground lives
The rise of the blog in Iran has been made possible by the huge growth of
the internet in the Middle Eastern country.
There were 400,000 people on the internet in Iran in 2001, according to
government figures. But officials expect this to grow to 15 million over
the next three or four years.
Contrary to expectation, the internet in Iran is not censored. This has
allowed Iranians to use blogs to air issues that they cannot talk about in
public.
Perhaps surprisingly, few of the blogs focus on politics.
"It is social issues mostly," said blogger Hossein Derakhshan, an Iranian
journalist living in Canada, "the underground lives that Iranian youth have
these days. Things like girlfriends, boyfriends, the music they listen to,
the films they see."
This may seem surprising to people in the West, but Iran is a conservative
society with an Islamic government.
Free expression
Hossein created one of the first blogs in Persian last year.
"It was a good tool to get to know what is happening in Iran," he told the
BBC programme Go Digital, "what the youth are talking about, what are their
problems."
He had so much interest from Iran that he decided to write a simple guide
in Persian, to help others set up their own blogs.
Seven months on, there are more than 1,200 Persian blogs, many of them
written by women.
"For the first time in the contemporary history of Iran, women can express
themselves freely, even if it is not in their real name," said Mr Derakhshan.
"They have found the courage to speak about themselves and how they see the
world."
Women's voices
For one female blogger, who wished to remain anonymous, her online diary
has provided a forum to share her fears and aspirations.
"Women in Iran cannot speak out frankly because of our Eastern culture and
there are some taboos just for women, such as talking about sex or the
right to choose your partner," she said.
"I have the opportunity to talk about these things and share my experiences
with others."
For the most part, the response to her blog has been positive.
"I've had e-mails from men who have told me that I changed their attitude
towards women in Iran," she said.
"I had some negative responses, people saying I am disrespecting the image
of an Iranian woman. Some people even insulted me.
"But negative responses are few compared to positive ones."
**************************
BBC
Australian teleport breakthrough
It is a long way from Star Trek, but teleportation - the disembodiment of
an object in one location and its reconstruction in another - has been
successfully carried out in a physics lab in Australia.
Scientists at the Australian National University (ANU) made a beam of light
disappear in one place and reappear in another a short distance away.
The achievement confirms that in theory teleportation is possible, at least
for sub-atomic particles; whether it can be done for larger systems, such
as atoms, remains to be seen.
The more likely applications will come in telecommunications, enabling much
faster transfer of data and the use of encryption that can never be broken.
Teleportation has been one of the hottest topics among physicists working
in quantum mechanics - the study of the fundamental structure of matter.
Some 40 labs around the world are currently trying to teleport a laser beam
after pioneering work in 1998 at the California Institute of Technology
showed it should be possible.
'Spooky interaction'
The Australian researchers have exploited a phenomenon called "quantum
entanglement", which links the properties of two photons of light created
at the same time. Einstein called it a "spooky interaction".
What it means is that two photons can be created and sent to different
places. It is possible to force one photon into a specific quantum
mechanical state and, because the two photons are connected in some way,
the other photon will instantaneously take up a complementary state.
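In the notation of quantum mechanics, such an entangled pair is often written as a single joint state. A standard textbook example (not necessarily the specific state used in the ANU experiment) is the Bell state:

```latex
% Neither photon has a definite state of its own; measuring photon A
% instantly fixes the complementary outcome for photon B.
\[
  \lvert \Psi \rangle
  = \frac{1}{\sqrt{2}}
    \bigl( \lvert 0 \rangle_A \lvert 1 \rangle_B
         + \lvert 1 \rangle_A \lvert 0 \rangle_B \bigr)
\]
```

Finding photon A in state 0 means photon B must be found in state 1, and vice versa, however far apart the two photons are.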
At first sight, entanglement offers the prospect of sending a signal faster
than the speed of light. But a closer look at what is actually possible
shows that this will not work because of the limits of what can be known
about quantum mechanical systems and how such information is relayed.
But it may offer the prospect of a Star Trek-style transporter.
'Exciting applications'
Using quantum entanglement, ANU physicist Ping Koy Lam has disassembled
laser light at one end of an optical communications system and recreated a
replica just a metre away.
An encoded signal is embedded in an input stream of photons, which is
entangled with another beam.
Elsewhere in the lab, the beam of photons and the associated signal is
reconstituted.
"What we have demonstrated here is that we can take billions of photons,
destroy them simultaneously, and then recreate them in another place," Dr
Lam says.
"The applications of teleportation for computers and communications over
the next decade are very exciting," he adds.
Body movement
Quantum teleportation could make encrypted or coded information 100%
secure, Dr Lam said, because even if intercepted the message would be
unintelligible unless it was intended for a specific recipient.
"It should be possible to construct a perfect cryptography system. When two
parties want to communicate with one another, we can enable the secrecy of
the communication to be absolutely perfect."
But for a human to be teleported, a machine would have to be built that
could pinpoint and analyse the trillions and trillions of atoms that make
up the human body.
"I think teleporting of that kind is very, very far away," Dr Lam says. "We
don't know how to do that with a single atom yet."
Quantum teleporting is problematic for humans because the original is
destroyed in the process of creating the replica.
************************
BBC
'Snoop' vote postponed
Plans to extend surveillance of e-mail and telephone records have been
postponed at the last minute amid growing concern from MPs about the
invasion of privacy.
A committee of MPs had been due to vote on the controversial proposals on
Tuesday, but the government has put the debate back to 1 July.
The move comes as Conservative peers threatened to use their voting
strength to block the plans in the House of Lords.
Critics claim the plan - which would allow local councils and other
organisations to check private telephone records - amounts to a "snooper's
charter".
That claim was denied by Home Office Minister Bob Ainsworth as he argued
misunderstandings had to be dispelled.
New safeguards
Officials in many of the agencies that could get the powers could already
ask for the same information on a voluntary basis, he said.
"It is in no way a snooper's charter," Mr Ainsworth told BBC Radio 4's
World At One programme.
"Exactly the reasons we are putting this in is to provide safeguards and
guidance as to when people can get information and when they can't."
Parliament will only have the briefest of opportunities to debate the plans
because they only extend laws passed two years ago.
The Home Office says the debate has been postponed because of "timetabling
difficulties".
Privately, department officials say new safeguards are being put into the
legislation in an attempt to satisfy critics.
The Conservative leader in the Lords, Tom Strathclyde, said if MPs failed
in their duty to protect people's rights and freedoms, the House of Lords
would.
Lord Strathclyde told the World At One: "It is very important that there is
that parliamentary hurdle."
If officials in councils and other agencies believed a crime was being
committed, they should ask the police to investigate, he argued.
'Intrusive powers'
In a separate letter to the Daily Telegraph, John Wadham, director of civil
rights campaigners Liberty, questioned the government's justification for
the changes to the Regulation of Investigatory Powers Act (RIPA).
He questioned whether it was necessary to extend electronic surveillance
powers to organisations such as district councils, the Post Office or the
Food Standards Agency.
"Of course the vast majority of officials will seek to use these powers
honestly and proportionately.
"But if you give such intrusive powers to so many people with so broad and
vague a list of justifications, the evidence from history is that these
powers will be abused."
Room for debate
Under the plan, seven government departments and a wide range of
organisations will be able to look at private e-mail and telephone records.
At the moment, the powers only apply to the police, inland revenue and
customs and excise.
The Home Office refused to confirm a story in the Sunday Times that Home
Secretary David Blunkett was planning two modifications to the legislation.
The newspaper said Mr Blunkett wanted to ensure that only chief executives
of local councils, regulatory bodies and government agencies would be
allowed to apply for access to confidential telephone records. This would
prevent minor officials and civil servants from gaining access
to private information.
Also, according to The Sunday Times, organisations would only be given
access to information that was directly relevant to their jurisdiction.
The government has cited the investigation of benefit fraud rings and
pirate radio stations as two examples where the new powers would be used.
The Food Standards Agency, which investigates unhygienic slaughterhouses,
would also use the powers in its investigations.
But a Home Office spokesman told BBC News Online there was room for debate
on the legislation and it was not a "fait accompli".
*********************
BBC
Connecting the villages
Mandayan is a farmer in Padinettankudi, a poor rural village in the south
Indian state of Tamil Nadu. For months he has been suffering from watery
eyes and blurred vision.
Today he has come into the thatched hut behind the village tea stall to
seek help, not at a doctor's surgery, but at what has become the village
internet kiosk.
The young local woman who runs the kiosk switches on a webcam. The computer
whirrs on the desk beside her as she carefully takes pictures of Mandayan's
eyes and records his symptoms, using an online patient questionnaire.
She then sends the pictures and voice recording to the world-famous Aravind
Eye Hospital in the city of Madurai, 40 km away. What would have taken days
or weeks by post is achieved instantly, by e-mail.
Online doctor
An eye specialist calls back for an online chat with Mandayan. They discuss
his symptoms and the doctor gives him an appointment at the hospital the
following week, where he can get free treatment.
Mandayan is thrilled. Most of the villagers in Padinettankudi have never
been to a hospital or even spoken to a doctor before.
The Padinettankudi kiosk is one of 30 launched in the district over the
past year by n-Logue, a commercial offshoot of the prestigious Madras
Indian Institute of Technology, and the company has ambitious plans to wire
up the rest of rural India within 10 years.
Each kiosk is run by a local entrepreneur and provides a wide range of
services such as farming advice, applications for government loans and
e-mail, all at an affordable cost of a few rupees.
Global reach
For villages like Padinettankudi, which have no public telephones and where
many people are illiterate, the internet kiosk has become their means of
communicating with the outside world, and the link to the Aravind Eye
Hospital is one of the most exciting recent developments.
Founded over 25 years ago by Dr G Venkatasamy, the hospital runs the
biggest community eye programme in the world, treating over a million
patients each year. It does cataract operations, provides glasses and any
other necessary treatment free of charge to anyone who needs it.
But the problem lies in reaching all the people that need eye care, and
that is where the new technology is proving its worth.
The Aravind Eye Hospital is the first hospital in Madurai to introduce
telemedicine to the villages and its medical staff are already benefiting
from the global reach of the internet.
Exciting potential
They discuss new medical information with other eye experts and can even
watch operations in Boston or London, so enabling them to bring the latest
knowledge to their patients.
Dr Venkatasamy is fully aware of the huge potential of the internet for
reaching the millions of people in rural India who are needlessly blind,
and his vision goes beyond simply restoring sight.
"This is very exciting, because poor people need not be poor. If a man
recovers his sight and earns even a dollar a day, that means six million
people recovering their sight each year could be earning 6 million dollars
a day," he says.
"Imagine what a huge difference that could make to the economy of the
country."
*************************
BBC
Copyright rows ring down the centuries
Record companies warn that internet piracy could spell the end of the music
business as we know it. In fact, says Mark Ward, we've heard these
arguments before, from the likes of Dickens and Conan Doyle.
Just as every generation thinks it is the first to discover sex, so every
generation thinks it is the first to suffer problems with copyright and
piracy.
The ubiquity of the net has lent greater urgency to the copyright debate
but the basic problem of controlling what happens to creative works has not
changed in more than 200 years.
The debates today are about recorded music and movies but a similar debate
was being had before the first voice was recorded or first movie premiered.
In 1842 Charles Dickens was on a tour of America reading excerpts from his
works to audiences eager to hear the young author.
But Dickens did not limit himself to repeating his own words; he also used
his time behind a lectern to berate audiences for pirating his books.
During one lecture he even accused American publishers of helping to drive
Sir Walter Scott to bankruptcy and an early grave because their rampant
bootlegging had vastly reduced the royalties he received from US editions
of his works.
Despite his attempts to lighten his criticism by saying that American
literature would never develop if publishers were free to bootleg the best
works from overseas, Dickens was savaged in the US press for his attack.
Piracy was rampant in the US because the country had refused to sign
international treaties protecting copyright.
This meant that while British publishers were reluctant to produce books
for the US market, Americans seized the "initiative" by pirating best
sellers for their home market.
Holmes and the pirates
Dickens wasn't the only author to suffer. Another victim, Wilkie Collins,
produced a small booklet entitled "Considerations on the Copyright
Question" that condemned "the habitual perpetration, by American citizens,
of the act of theft".
It also drew attention to "the stain which book-stealing has set on the
American name".
Sir Arthur Conan Doyle's hugely popular Sherlock Holmes stories were also
heavily pirated.
"At times the situation became almost frantic, as competing publishers
would endeavour to produce editions before others could saturate the
market," said Don Redmond, author of a book about the pirating of the
Holmes stories.
In 1891 an American copyright law did come into force although it provided
little compensation for existing authors as it didn't apply to works
published before that date.
Late signing
But at the insistence of the American print unions the act demanded that,
to gain copyright protection, the first editions of a book had to be
printed in the US. This is why we have American editions of books now.
In 1988, the US finally signed the Berne Convention, first drawn up in the
closing years of the 19th Century, which guarantees the same rights to the
creators of works across national boundaries.
This, despite the fact it had already agreed to recognise a less binding
treaty which left the printers' clause intact.
Rampant piracy for eager audiences, the lamentations of authors, lost
royalties and the desperate efforts of an industry to protect itself from
such practices, have an all too modern ring.
It could describe the music industry now.
Who will benefit?
Dr Catherine Seville, an expert on copyright history from the University of
Cambridge, says the most modern aspect of the Victorian wrangles over
copyright was that it was driven by technology.
"When they started producing The Times newspaper by steam it was said that
printers were going to go bust and die of starvation."
"New technologies do mean you can make copies more easily," she says, "but
it's never quite as bad as the doomsters say."
But, she points out, what has changed is who reaps the benefits of better
protection for an artist's works and who is pushing for changes.
This time it is not just the artists struggling for due credit. Changes to
copyright laws or the use of technology to stop copying are not going to
benefit musicians or struggling movie makers.
Instead it's the big guns - the music publishers and record companies - who
most fear the menace of the CD burner, says Dr Seville. And it is they who
stand to reap the biggest rewards from new copyright legislation.
*******************
San Francisco Gate
New immigration computer to detect phony visas
A terrorist using a stolen visa with a doctored photo can now be swiftly
unmasked, undone by a new immigration computer system that summons the true
visa holder's digital snapshot in seconds.
"We've got a very clear and simple message for people who try to enter the
United States illegally by using a phony photo or altered visa: We're going
to catch you," Immigration and Naturalization Service Commissioner James
Ziglar said Friday as he announced a major expansion in the multiagency
DataShare initiative at San Francisco International Airport.
Stung by repeated revelations of federal agencies failing to share
intelligence on terrorists involved in the Sept. 11 hijackings, Ziglar was
quick to point out the joint INS-State Department data-sharing program was
launched in 1995.
In a race to thwart further terrorist infiltrations, officials quickly made
the computer system 17 times bigger than it was before Sept. 11. The
database, which previously could authenticate the more than 400,000
immigrant visas issued annually, now includes the more than 7 million
nonimmigrant visas issued each year to foreign students, tourists and
temporary workers.
Ziglar ordered a fast-track, 90-day expansion to the more than 300 U.S.
ports of entry, where it began operating in late January.
Friday's announcement came after authorities were pleased with quiet
testing of the expanded system at San Francisco, Newark and Miami
international airports last fall.
It's already bearing results.
"At Miami International Airport in the last five months, we have detected
over 60 individuals who had altered visas," Ziglar said. "In one case a 41-
year-old Brazilian had actually taken a visa issued to a 3-year-old boy and
tried to substitute the photo on it. We caught him."
At SFO, about 70 foreign visitors a month are rejected for faulty or
incomplete papers, including about three caught by DataShare.
The database was designed so that when a U.S. consular official files a
visa applicant's biographic information and photograph into a computer
overseas, it's at the fingertips of INS inspectors thousands of miles away
within five minutes.
"Within 10 to 30 seconds we get the photo of the person who actually got
the visa," INS Inspector Jim Mollison said. "If it doesn't match (the
traveler's face), you've got a problem."
E-mail Alan Gathright at agathright@xxxxxxxxxxxxxxxx
**************************
Government Computer News
Congress juggles homeland security bills
By Wilson P. Dizard III
House and Senate leaders have devised different schemes to move Homeland
Security Department legislation through their chambers. Although the
administration's proposal for the new department has garnered wide support,
lawmakers said legislation might not be passed by Sept. 11. Homeland
security adviser Tom Ridge also said he is unlikely to head the new
department.
The prospect of additional funding for homeland security systems won
support from key senators, some of whom want the question of federal
employees' collective bargaining rights addressed by the legislation.
Meanwhile, the House Government Reform Committee and others are reviewing
the administration's proposal, sources said, after which a specially
appointed select committee will consolidate the other committees' work.
The Senate will use legislation already approved by its Governmental
Affairs Committee as a vehicle for creating the Homeland Security
Department. Governmental Affairs chairman Sen. Joseph I. Lieberman
(D-Conn.) and others have been pushing for a department for months, and
Lieberman is likely to lead the crafting of the Senate's version of the bill.
Homeland security adviser Ridge briefed senators yesterday on details of
the administration's plans but didn't meet with unanimous approval. Sen.
Charles E. Schumer (D-N.Y.) called the briefing "a rehash of what has been
in the newspapers -- nothing new."
Sen. John Ensign (R-Nev.) said that senators "support the concept, but as
always the devil is in the details." He discounted the importance of a
deadline for passing the legislation but added, "People are supportive on
both sides of the aisle."
Appropriations Committee chairman Sen. Robert C. Byrd (D-W.Va.) noted that
the administration has implied a veto of a recently passed supplemental
appropriation bill that includes funds for homeland security and computer
upgrades.
After meeting with Ridge, Byrd compared the Bush administration to a
vaudeville magician diverting attention to its "shiny new toy" and away
from the real action -- funding for homeland security. He said he supports the
proposal to create a new department but urged colleagues "not to be in too
big a rush." Byrd said Ridge would be "ideal" for the job of secretary of
Homeland Security.
Lieberman said the Sept. 11 deadline is important "because it will get us
to move faster than we otherwise would." He said lawmakers had discussed
funding homeland defense systems with Ridge. "America is not using its
technology edge," Lieberman said. "Our computer systems are outdated. One
of the most important ways to spend homeland security money is to get our
computers up to where they should be." He said his committee would hold
hearings on the administration proposal starting next week.
The legislation, Lieberman said, should protect the collective bargaining
rights of federal employees assigned to the new department. He noted that
existing law allows the president to limit collective bargaining rights of
some employees involved in national security. "No one's collective
bargaining rights should be diminished," Lieberman said.
Senate minority leader Trent Lott (R-Miss.) said he expects the
administration to present its bill to Congress in the next week or 10 days.
After briefing senators, Ridge said the president needs him as an adviser
within the White House rather than as secretary of Homeland Security. He
said administration officials would meet with federal union representatives
to discuss collective bargaining issues in the new department.
**********************
Mercury News
Major changes afoot at ICANN
DECISION-MAKING PROCESS DEEMED NOT FAST ENOUGH
By Anick Jesdanun
Associated Press
NEW YORK - The Internet's key oversight body is facing its most critical
test ever, with decisions expected later this month likely to shape the
global network for years to come.
Though relatively few Internet users are even aware of the group, the
Internet Corporation for Assigned Names and Numbers has broad influence
over the Net's addressing system -- and thus over how people find Web sites
and send e-mail.
Frustrated with endless debates, Chief Executive Stuart Lynn has proposed a
major overhaul of ICANN to streamline how its board makes decisions.
But critics complain that such efficiency would come at the expense of
fairness to individuals and non-commercial interests.
Under the proposal, at-large board members would no longer be elected by
the general Internet community but by an internal committee.
Sound undemocratic?
``It's the Mussolini model,'' said Hans Klein, chairman of Computer
Professionals for Social Responsibility. ``If you just hand over the global
information infrastructure to 12 guys who meet once in a while, whatever
they decide will be implemented very quickly.''
Fourteen civic groups, led by the Media Access Project, are even calling
for the U.S. government to reconsider its 1998 selection of ICANN as a
private, non-profit organization to take over responsibilities for domain
names.
They want the Commerce Department to reopen the bidding process to
determine if a competing organization might do a better job. Others have
called for splitting up ICANN.
Meanwhile, Congress is paying closer attention. A Senate committee held
hearings Wednesday. And Congress' investigative arm has questioned ICANN's
legitimacy and effectiveness.
What the board does at its meeting in Bucharest, Romania, June 24-28 could
determine whether procedural squabbles end once and for all.
Longtime Internet users have accused ICANN of being beholden to corporate
interests, while administrators of domain names around the world have
balked at paying dues to the U.S.-based organization.
Critics accuse ICANN of being secretive and untrustworthy because it
sometimes abruptly reverses course -- for example, the board rejected
elections in March after promising years earlier to fill seats that way.
Complaints
ICANN also drew complaints for extending the role of VeriSign as
master-keeper of lucrative ``.com'' names. The deal was negotiated behind
the scenes, without notice that talks were even going on.
And one ICANN board member, Karl Auerbach, is suing over access to records.
The Commerce Department must decide by September whether to renew or amend
an agreement that gives ICANN its authority.
Commerce is not likely to abandon ICANN completely, but Lynn himself says
the original goal of leaving the Net in private hands has proved unworkable.
Lynn has thus called for scrapping direct election of board members and
giving the world's governments a greater role in selecting them. The body
could then tackle quickly such pressing matters as improving security of
key Internet infrastructure.
``Our decision process takes longer than the Internet time speeds tend to
allow,'' Lynn said.
Lynn also believes that getting governments involved could help ICANN gain
the respect of skeptical stakeholders -- and obtain more funding sources
and a bigger staff to better respond to crises.
Under the current proposal, the ICANN board would consist of eight members
chosen by various government, technical and other constituencies. A
nominating committee of undetermined composition would select five to 11
additional members.
Currently, five of the board's 19 members are directly elected by the
public, by geographical region.
ICANN's detractors complain that the organization has been too slow and
meddlesome on such matters as creating new domain names.
Some believe ICANN shouldn't decide at all what domains to allow: Anyone
who wants to start a domain should be able to do so, with ICANN's role
limited to adding names to a database.
Critics also fear ICANN's mission will gradually expand further -- perhaps
one day regulating content, despite the current board's insistence otherwise.
The only safeguard, they say, is to make sure the board is representative
of broad interests, even if it adds complexity.
Threat to innovation
A streamlined but closed ICANN could inhibit innovation by giving
constituencies that now benefit from the Internet the incentive to preserve
the status quo, said Auerbach, an at-large board member elected on a
platform critical of ICANN.
But supporters of a streamlined ICANN believe the alternative is a
lumbering bureaucracy dependent on international treaties -- potentially a
decentralized mess.
``It is essential to have that central authority in place or there is a
fracture in the Internet and you perhaps may not be able to communicate
with me,'' said Michael Heltzer of the International Trademark Association,
whose members benefit from procedures ICANN has created for challenging
domain name speculators without costly litigation.
Esther Dyson, ICANN's former chairwoman, is disappointed by the absence of
elections but calls the proposed overhaul a step forward.
``If ICANN isn't effective, the Internet does not stop working, but it
stops evolving,'' Dyson said. ``It's going to look exactly like it does now.''
**********************
Mercury News
ICANN HISTORY
Key events and milestones for the Internet Corporation for Assigned Names
and Numbers:
- September 1998: U.S. government delegates authority over domain name
system to newly formed ICANN.
- October 1999: ICANN approves arbitration-like system for resolving domain
name disputes.
- March 2000: ICANN drops plans to assign nine board seats through a
council elected by the public. Instead, ICANN agrees to hold direct
elections for five board members.
- October 2000: ICANN holds elections.
- November 2000: ICANN creates seven new Internet suffixes, the first major
additions since the 1980s. Vint Cerf, an Internet pioneer, replaces Esther
Dyson as chairman. Elected board members seated.
- January 2001: Stuart Lynn, former chief information officer for the
University of California, named chief executive, replacing Michael Roberts.
- March 2001: Staff of ICANN and VeriSign announce deal to extend the
company's role as master-keeper of lucrative ``.com'' names.
- May 2001: Commerce Department approves amended VeriSign deal.
- September 2001: Activation of the first new domain name. ICANN continues
to discuss second round of elections. Commerce Department extends ICANN
authority another year.
- November 2001: ICANN devotes regular meeting to infrastructure security.
- February 2002: Lynn proposes major overhaul, calling for end to elections
and advocating a greater role for world's governments.
- March 2002: Board member Karl Auerbach sues over access to records.
- June 2002: Meeting in Romania set to consider overhaul.
***********************
Mercury News
U.S. uses Japan supercomputer
MACHINE INSTALLED AT GOVERNMENT-FINANCED CENTER
New York Times
After more than a decade of debate in Washington and despite years of heavy
lobbying by American companies, a powerful Japanese supercomputer has
finally been installed at a U.S.-government-financed research center.
The machine, designed and manufactured by the Japanese electronics firm
NEC, arrived at its new home in Fairbanks, Alaska, recently, with little
fanfare. It was neither bought nor leased by the Arctic Region
Supercomputing Center, which is scheduled to begin using the computer for
scientific applications this evening.
It was the latest episode in a competition between the United States and
Japan over supercomputer strength. In April, a Japanese weather
supercomputer took the title of fastest computer, formerly held by U.S.
military machines. Late last month, the United States countered by
announcing it planned to build an even faster weather computer of its own.
In this case, an American company is bringing a Japanese supercomputer --
the same type as the Japanese weather computer -- to America. Moreover, the
company that is handling the transaction is Cray, the American
supercomputer maker that in the 1980s and 1990s was involved in a bitter
government feud over the U.S. purchase of Japanese-made supercomputers.
The dispute culminated in 1997 with an anti-dumping trade ruling against
the Japanese company by the Commerce Department.
But Cray and NEC settled their differences in February of last year and
entered a joint marketing relationship under which Cray gained the
exclusive right to sell the supercomputers in the United States. At the
same time NEC agreed to invest $25 million in Cray.
Despite the opposition, the machine has now arrived at a research center
that receives significant financing from the Pentagon. The Arctic Region
Supercomputing Center, on the campus of the University of Alaska, supports
research focused on high latitudes and the Arctic, including climate and
ocean simulation.
**********************
USA Today
Citibank to block Net gambling transactions
ALBANY, N.Y. (AP) -- Citibank, the nation's largest credit card issuer, has
agreed to block all online gambling transactions that use its credit cards,
the state attorney general said Friday.
The agreement announced by the bank and Attorney General Eliot Spitzer is
expected to significantly reduce illegal, underage and potentially
addictive Internet gambling, Spitzer said. It applies to all Internet
gambling transactions, not just those in New York.
"Americans now waste $4 billion a year on this pernicious form of
gambling," Spitzer said. "With this agreement, we will cut off an enormous
line of credit that was a jackpot for illegal offshore casinos."
Other companies, including Bank of America, MBNA and Chase Manhattan Bank,
also have begun blocking the gambling transactions, Spitzer said. Citibank
controls about 12% of the nation's credit card market.
"Citibank agreed to take these steps to help alleviate concerns raised by
the attorney general about the impact that gambling on credit may have on
New York residents," said Citibank spokeswoman Maria Mendler. She added
that Internet gambling is associated with higher rates of credit card fraud
and delinquency.
Citibank also agreed to pay $400,000 to nonprofit groups that counsel and
help families hurt by gambling addictions, the company said.
Lawmakers in Washington have been trying to ban Internet gambling since
1996. The task gets more difficult each year as the industry grows.
*********************
MSNBC
Wiring the New Docs
Today's medical students use an unprecedented array of sophisticated
teaching aids. But how does that translate into action when a patient's
heart rate soars and his blood pressure plunges?
By David Noonan
NEWSWEEK
June 24 issue -- At UCLA Medical Center last month, three medical
students stopped by to say hello to a patient about to undergo routine
gallbladder surgery. They were making small talk when the 55-year-old man,
who was connected to a heart monitor and had IV lines in place, suddenly
stopped breathing. The students were the only medical personnel present,
and their collective stress level soared as they scrambled to figure out
what was wrong. They administered a sedative and ran a tube down the man's
throat to aid his breathing.
AS THE SITUATION stabilized, the testy surgeon, unaware of the
emergency, called the room looking for her patient. "I have clinic in the
afternoon," she barked at the students over the intercom. "I can't be
futzing around here all day." Just then, the crisis deepened as the
patient's heartbeat raced out of control and his blood pressure plunged.
Later, the students would discover that he had suffered an allergic
reaction to the sedative they had given him and gone into anaphylactic
shock. But they didn't know that at the time. All they knew was that the
patient was dying and they had only minutes to save him. They gave him
fluids and epinephrine to increase his blood pressure and shocked him twice
with a defibrillator to restore his normal heart rhythm. When the surgeon
called again, fourth-year student Janet Huamani took an extraordinary step.
"I'm canceling your case," she said, and told the doctor the patient was
going to the ICU, not the OR.
In a small room next door, associate clinical professor Dr. Rima
Matevosian, who had watched on a video monitor, nodded in approval. "That's
the appropriate response," she said. Then she turned to the technician
seated beside her. "OK," she said, "complete recovery." The technician
tapped away at her keyboard, the same one she used to generate the near
catastrophe, and in a matter of seconds the traumatized patient's vital
signs were all back to normal. Stan (short for "standard man"), the
life-size, computer-controlled Human Patient Simulator, had survived yet
another close encounter with simulated death. And the students, whose
stress was clearly not simulated, had sweated through yet another
nerve-racking lesson about the inherently unpredictable nature of medicine
and patient care. As she prepared to join the students for a videotape
review of their performance, Matevosian summed up her opinion of this
dramatic approach to medical education: "Simulation saves lives."
Just as technology is transforming the practice of medicine and the
experiences of patients, it is also changing the way tomorrow's doctors are
being trained. Today's medical students have an unprecedented arsenal at
their disposal -- from simulators that breathe and respond to treatment like
real patients (and sometimes even die), to pocket-size personal digital
assistants (PDAs) that can hold entire medical texts, to CD-ROMs that
enable students to listen to the sound of nearly every known heart
condition, and more. Medical schools around the country are turning to
technology to help their students learn even as they recognize the need to
emphasize the human touch. At Tufts University, the entire curriculum for
the first two years has been transformed into the Tufts Health Sciences
Database, a massive, integrated online system. The University of Louisville
School of Medicine has created a state-of-the-art patient-simulation center.
At UCLA's David Geffen School of Medicine, where PDAs are required
for students but microscopes are optional, the students, faculty and
administrators are surfing this technowave with gusto. Since 1996, when the
school first required medical students to have computers, UCLA has spent
millions turning itself into the very model of a 21st-century medical
school. "Every patient is a little different," says Zane Amenhotep, who
just completed his third year, "so there's always going to be a limitation
to technology. You can't use it as the only tool, but it's an excellent
foundation."
In practice, the technology serves two basic purposes. The
simulators enable students to gain hands-on clinical experience sooner and
without any risk to patients, while the PDAs, the CD-ROMs and the Web-based
curriculum (each course at UCLA has its own Web site) help them manage and
absorb an enormous and ever-expanding amount of information. At UCLA and
other leading schools, the rise of technology has accompanied fundamental
curriculum reforms that emphasize small-group, case-based learning right
from the start. As a result of these dual trends, today's medical students
learn much earlier to act like doctors, to think like doctors and to behave
like doctors. The old didactic approach to medical training -- in which
students spent their first two years attending lectures and memorizing
facts before being dumped clueless on the wards in their third year -- is as
dead as the cadavers they cut up in gross anatomy.
When Randolph Steadman was a medical student at the University of
Florida in the 1970s, the residents got to have all the fun. "If somebody's
heart stopped, I'd get elbowed to the back of the room," recalls Steadman,
associate professor of clinical anesthesiology and director of UCLA's
Simulation Center. Now, with Stan, even first-year students get to
experience the unique horror of having a patient go south with no warning.
"It's a controllable bedside," he says. "We can incorporate more
significant events much earlier on in the training." Stan solves another
old problem in medical education. When third-year medical students are
seeing patients with him, Steadman says, they tend not to speak up unless
they know exactly what to say. "They're only going to chime in when they
can shine," he explains. "So it's very hard for me as the teacher to know
where the gaps are in their knowledge. The simulator uncovers all of that.
I can see their gaps."
The simulations, based on actual cases, teach the students how to
think their way through medical emergencies. The sessions with Stan also
teach reverence for detail, as well as the need for teamwork and clear
communication. Perhaps most important, the simulations get students
accustomed to actually doing stuff to patients. The first time Noah
Rodriguez had to run a tube down the throat of an actual human being, he
was grateful for his sometimes bumpy experiences with Stan. "You've done it
before, so you go in knowing that it's not going to be as easy as those
guys on TV might make it look," says Rodriguez, who is starting his fourth
year.
While the simulations provide an opportunity to practice acute
care, PDAs and online databases give instant access to critical
information. Everybody's jacked in all the time, looking up some key detail
or downloading some important new paper. "Medicine is information
overload," says Rodriguez, and PDAs are one way to deal with that. The
typical student's PDA contains a medical dictionary, a pharmaceutical
guide, a medical calculator with equations for things like blood-gas
analysis, and the very popular "Griffith's 5-Minute Clinical Consult,"
which is packed with information about symptoms, diagnoses and treatment
options. And there's always room for more. When Rodriguez rotated through
psychiatry, he downloaded the entire fourth edition of the Diagnostic and
Statistical Manual of Mental Disorders into his PDA. "I truly do believe that
the computer is the physician's black bag of the future, along with the
stethoscope, if it's used the right way," says senior associate dean Dr.
Neil Parker. "You can't possibly know and remember everything that you need
to know. You can't keep up on the literature, you can't keep up with the
clinical guidelines."
And how does this reliance on microprocessors affect patients? "We
should be able to eliminate mistakes due to lack of knowledge," says Luann
Wilkerson, senior associate dean for medical education. That's not the only
source of mistakes, obviously. But, says Wilkerson, no one will be able to
say he made a mistake because he didn't have access to the right information.
At UCLA, technology also creates a thriving online community where
students and faculty engage in a never-ending discussion of medical topics.
There are formal settings, like the Web-based course called Clinical
Application of Basic Sciences, in which small teams of students spend weeks
on a single case, posting their research online as they work up a diagnosis
and treatment plan. "You don't have to get bored," says Sammy Eghbalieh,
who just completed his first year. "You're not sitting in a classroom for
three or four hours." And there are informal settings, since each class has
a Web site, as do many of the students and faculty members.
It's clear that medical education is only going to get more
gizmo-centric. UCLA has just opened a robotic-surgery suite as part of a
new program called the Center for Advanced Surgical and Interventional
Technology. And its $800 million, 525-bed paperless "all digital" hospital
is scheduled to open in 2005. Is it too much of what might not be such a
good thing? "It would be easy to get lost in the vast amounts of
information," says Amenhotep. "You have to be careful not to let that
happen. You still have to be a good clinician."
Dr. Mike McCoy, chief information officer of the UCLA Medical
Center, who is directing the development of the new digital hospital,
couldn't be more of a tech guy. But McCoy, who flew fighter jets during the
war in Vietnam, knows the real secret to getting the most out of
technology. "Good pilots, even though there's wonderful navigation systems
now and all kinds of satellite aids and so forth, they always know where
they are," he says. When McCoy was heading back from North Vietnam to South
Vietnam after a mission, he could never quite bring himself to trust his
plane's computer. "I always looked at my humble magnetic compass to make
sure that it also said south," he recalls, "because the consequences of
going north from Hanoi were not good. And we teach that to the students.
They have to look at the patient. If the oxygenation number looks good but
the patient is blue, something's wrong." Cutting-edge technology can never
replace a doctor's best judgment.
***********************
Government Computer News
Aviation security agency to award $1 billion in technology work
By Shane Harris
sharris@xxxxxxxxxxx
The Transportation Security Administration will award up to $1 billion in
information technology and telecommunications services work to a single
firm by the end of July, according to a Transportation Department official.
TSA is issuing the order under an existing Transportation technology
contract that is open for all agencies to use. However, only companies that
currently hold a seat on that contract, called the Information Technology
Omnibus Procurement (ITOP 2), may compete for the TSA work. The TSA,
created in the wake of the Sept. 11 attacks to guard the nation's
transportation systems, has no significant technology or telecom
infrastructure in place.
The contract will last seven years, said Megan Russell, the Transportation
contracting officer in charge of the order. It covers computer management
services and the creation of a technology architecture, or overall
technology design. "Those are the core things we need to develop as an
agency," Russell said.
Unlike traditional procurements, in which agencies spell out in great
detail what a contractor is required to do, vendors bidding on the TSA
order will provide their own proposals on how to meet TSA's needs. The
order is also performance-based, meaning that the vendor will be
compensated based on its ability to meet specific goals and milestones that
TSA and the company will define before any work begins.
TSA will evaluate bidders and their proposals using a strict set of
criteria that includes past performance on federal contracts and ratings
developed by independent technology industry analysts, said Chip Mather,
co-founder of Acquisition Solutions Inc., the Chantilly, Va.-based
consulting firm that crafted TSA's acquisition strategy.
The agency wants only the strongest technology firms to compete.
"Contractors are asked to carefully review [the evaluation criteria] and
make a realistic self-assessment as to their potential viability," TSA's
written submission guidelines advised. Mather said he'd be satisfied if
only two or three companies vie for the work. "My dream is to have very
highly qualified contractors fighting it out tooth and nail for [TSA's]
business," he said.
Russell said TSA will pick a "tier one contractor," or a firm with a solid
reputation in government and brand name recognition. Such prominent
technology companies now on the ITOP 2 contract include DynCorp, Science
Applications International Corp. (SAIC), Unisys, Booz Allen Hamilton and
Electronic Data Systems Corp. (EDS). Boeing Service Co., which on June 7
won TSA's highly prized contract to oversee the deployment of
explosive-detection equipment to screen luggage at every U.S. airport, is
not a vendor on the contract.
President Bush wants to place TSA under his proposed Department of Homeland
Security. Russell said that any rollover of TSA into the new department
wouldn't affect the new technology acquisition. Rather, TSA officials
believe this purchasing strategy could be a model for the new Homeland
Security Department.
Richard Clarke, the White House cybersecurity czar, said in an interview
Friday that since TSA is building its technology infrastructure from
scratch, officials can implement the highest information security
standards. Clarke said TSA officials have asked his office to advise them
as they implement security policies.
Clarke noted that the TSA order would represent one of the most significant
homeland security technology acquisitions. He estimated the proposed
Homeland Security Department would directly spend about $1.5 billion on
technology, based on the budget requests of agencies that would be included
in the new department. That figure doesn't include the TSA contract, Clarke
said.
Mather said TSA's approach is a significant step towards conducting more
performance-based procurements that rely on vendors to shape the scope of
their work. The Bush administration wants agencies to use performance
incentives more often when awarding services contracts.
*************************
CIO Insight
Open Source Goes Mainstream
Roundtable: Once-radical open source software is moving out of the shadows
and into the mainstream. Nine experts debate its future.
You need one more decision on your desk, right? One more of those CIO
issues that go beyond technology into the messy realms of people and
politics and philosophy. Well, it will be hard to escape this one: open
source software, in particular the operating system known as Linux. Created
by Helsinki University student Linus Torvalds in 1991, and developed and
still maintained over the Internet by thousands of hackers worldwide, Linux
is now said to have the fastest market growth rate of any operating system
in the world. Moreover, chances are good that even if you've never
officially signed off on Linux, someone in your shop is experimenting with
it right under your nose.
To get a better idea of what this means for CIOs, CIO Insight Deputy Editor
Terry A. Kirkpatrick recently convened a roundtable of nine experts to chat
about it: corporate IT execs who have deployed Linux, analysts who have
studied it, and vendors of both open and proprietary software.
If there was consensus in this diverse group, it was this: Open source
software should be treated no differently than any other software: the
migration, testing and support issues will look familiar. These are the
"hidden" costs that could wipe out any cost savings on price. At the same
time, however, there may be unexpected savings on hardware.
Beyond the technical issues, though, there are other considerations. Linux
is both an operating system and a development model, "a dessert topping and
a floor wax," as one roundtable participant put it. Open source comes
"shrinkwrapped" in a philosophy of software creation and ownership that is
at sharp odds with the prevailing proprietary model, and this continues to
stoke a shrill debate between the two camps. Should these "soft" issues
matter to a CIO faced with hard business decisions? The discussion began
with the issue of cost: the price of open source software being its most
notable feature and something that tends to catch the eye of CIOs in these
lean times.
CIO Insight: Do CIOs really care that software is free, as long as they
have a tool that solves some business problem?
QUANDT: The fact that Linux is available at a low cost is only one factor,
obviously one that can help the company have a return on investment, but
the capabilities and reliability and performance of Linux are really much
more important.
CAREY: The cost of software plays a small role at Merrill Lynch. We look
for other value propositions, whether it's open source or not. Linux is
just a product to us with a cost that we look at over its total life cycle.
George, you've written for Gartner that in more complex deployments, the
price advantage of open source can disappear. Can you elaborate?
WEISS: As we move up the curve of more and more complex applications, and
face the issues of integrating legacy environments, larger performance and
database scaling requirements, these will require more systems integration,
performance tuning and validation of the software. And that's going to
become much of the cost of the deployment of these systems. So the initial
cost is really minor.
YATKO: Actually, it's the crown jewel of our firm, a mission-critical,
global order management architecture. This system deals with roughly 35
million transactions per day around the world. We were looking at a
significant increase in volume, driven by the way the business was shifting
its trading strategies. Our resources were giving us very little additional
bandwidth to grow, so we almost had to go with Linux. But it had always
been not so much Linux but the operating environment that it would be
installed on. We needed stability and availability as much as we did
performance. So it was our choice of hardware, actually. We had
investigated Egenera Inc.'s solution and felt that it could stand up to any
other. After running for a couple of months on the hardware platform, we
recognized that this was an incredible performance increase, in some ways
20 times the performance over traditional RISC. So we went from a four-way
RISC box to a two-way Intel-based system for the hardware package.
On the other hand, Mike, you looked at Linux for Royal Caribbean at one
point, and you were hesitant.
SUTTEN: Yes, Royal Caribbean has a lot of older, peculiar systems, because
it's very hard to go out and retrofit a ship. So the guideline we've been
using on all open source, and not just Linux, is that if it's higher up in
the architecture, at the logic layer or the presentation layer, and the
life of the product is somewhat limited to three to five years, we were
pretty happy using open source. We think it's sustainable; it'll last for
that time. But as we move further down, to the data layer, the operating
system layers, we've been pretty hesitant, because we expect 10 to 20 years
out of that level in the application. And so we've been hesitant going in
that direction.
**********************
Euromedia.net
EU privacy group investigates music players
17/06/2002 Editor: Tamsin McMahon
European authorities plan to beef up security for online music players that
they maintain violate listeners' privacy by collecting information on their
musical tastes, the Wall Street Journal reported.
The US paper wrote that European privacy enforcement authorities have
adopted a working paper examining music players, such as Microsoft's
Windows Media Player and RealNetworks's RealJukebox and RealOne Player.
European Union regulators considered some data collection features of the
players to be "invisible and not legitimate," having been "secretly
installed" on users' computers to "send back personal information" or
"spyware," the paper reported.
The authorities are expected to crack down on music players that use cookies
to collect data on users' musical preferences no matter where the software
or web hosting firms are located. Violations of the EU's tough privacy laws
could merit expensive fines.
Waltraut Kotschy, a member of the EU's working group and managing director
of the Austrian Privacy Commission, said she has received "several
complaints" about music players. Although Kotschy said authorities haven't
taken any action, she added that concerns about the software would be back
on the agenda for the EU privacy police in June.
RealNetworks admitted to building databases of the listening habits of its
RealJukebox users, but the firm's vice-president for government affairs,
Alex Alben, said it was information most users agreed to give and is
essentially "anonymous data."
"You'd be amazed by how we do almost nothing with data, because of privacy
concerns," he told the Wall Street Journal. Microsoft lawyer Peter
Fleischer told the paper that he felt the privacy group's plans were
"testing the boundaries" of EU powers.
***********************
Wired News
IM'ers Get a Secure Chat Room
By Farhad Manjoo
It's probably a good guess that a lot of what's said on instant messaging
software is pretty trivial, neither vital to national security nor tightly
held business secrets -- mostly office gossip, diet tips, celebrity news,
and emotion-addled sweet nothings whispered to your sweetie.
But IM is "maturing," according to Chris Matteo, the president of IMpasse
Systems, and many people are now using commercial IM software to do serious
business. This trend worries some companies, as nothing said over IM is
very private. Not only do instant messages travel freely over the Internet,
like e-mail, but they're also explicitly routed through the servers of the
company that provides the service -- and who knows what can happen there?
This situation prompted Matteo to create an application that encrypts
conversations between chatters, making the chat unintelligible to those who
might be listening in. The software, called IMPasse, sits on a machine
alongside AOL Instant Messenger, MSN Messenger and Yahoo Messenger -- the
three biggest chatting apps. With IMPasse, any conversation or portion of a
conversation can be quickly scrambled. Both parties to a chat need the
software; IMPasse charges $20 for two licenses.
Matteo said that the software works rather transparently, without causing
any noticeable slowdown in the chat. Messages are encrypted and decrypted
with strong cryptography at each computer, and not even IMPasse keeps a
copy of the password used to encrypt the messages.
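The end-to-end arrangement Matteo describes, in which only the two chatters hold the key and the provider's servers see nothing but ciphertext, can be sketched roughly as follows. This is a toy Python illustration of the general idea, not IMPasse's actual code (its design is not public): the passphrase-derived key and counter-mode keystream are standard constructions, but a real product would use a vetted cipher such as AES.

```python
import hashlib
import hmac
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Both chat partners derive the same key from a shared passphrase;
    # the IM provider never sees the passphrase, so it cannot decrypt.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # A simple counter-mode keystream built from HMAC-SHA256 blocks.
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(passphrase: str, plaintext: str):
    salt, nonce = os.urandom(16), os.urandom(16)
    key = derive_key(passphrase, salt)
    data = plaintext.encode()
    ct = bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))
    # Only this opaque material crosses the IM provider's servers.
    return salt, nonce, ct

def decrypt(passphrase: str, salt: bytes, nonce: bytes, ct: bytes) -> str:
    key = derive_key(passphrase, salt)
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct)))).decode()
```

Because the key exists only on the two endpoints, a subpoena to the middleman yields ciphertext alone, which is exactly the property Matteo cites below.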
Michael Sampson, an analyst who follows the IM world for Ferris Research,
said that add-on software like IMPasse's is a "fairly new development" in
messaging. He said that another company, Akonix, released a similar
security app earlier this month.
"Traditionally, one of the biggest problems with IM in the enterprise,"
Sampson said, "is this question of security. Since these discussions happen
in 'real time,' it's more likely that people will be less formal, so what
goes over the wire will be close to what they're really thinking."
That close approximation of actual thought is, of course, what attracts
people to IM. Chatting online is an "immersive" experience that allows for
clarification and nuance. This can be well and good when you're talking
about plans for the weekend, but these days the mind turns to less innocent
conversations: What about plans to plant a dirty bomb?
Though law enforcement has suggested it's possible and, indeed, inevitable,
nobody has shown any proof that terrorists are fans of IM. Still,
encryption has always been a touchy subject for the authorities, and Matteo
said he understands that the combination of messaging plus encryption might
raise some eyebrows.
"As you know, we are at a very sensitive point in time in regards to
encryption technology," Matteo wrote in an e-mail. "IMpasse Systems is
located not too far north of Ground Zero, and we are very affected by this
tragedy (my father is a retired NYC fireman, who lost friends and put the
uniform back on in the wake of the event), though we maintain our
objectivity when it comes to cryptography.
"At the risk of sounding political, one cannot lose sight of or freely hand
over the civil liberties that hundreds of thousands of Americans have lost
their lives to protect, including freedom of speech."
Since Sept. 11, the government has been ratcheting up efforts to monitor
Internet communications. However, Matteo said that IMpasse has not yet been
approached by any law enforcement officials regarding a secure chat.
If he is approached, he said, there's nothing his company can do to decrypt
messages; he doesn't have the key. The most that can be done is to shut
down a user's account through AIM, MSN or Yahoo.
It's unclear how AOL, Microsoft and Yahoo will react to third-party
security applications like IMPasse. AOL has a history of making changes to
its protocol to lock out programs that try to interact with its system, and
Matteo said that AOL could lock out IMPasse.
Last month, AOL said that it is working on an "enterprise" version of AIM
that will feature encryption.
AOL, Microsoft, and Yahoo did not return calls for comment.
*************************
NewsFactor
What Supercomputers Can and Cannot Do - Yet
As scientists have realized the potential benefit of high-performance
computing in their research, demand for greater supercomputing power has
grown to exceed both the available supply and the current technology.
"Biology is a good example of that," Ty Rabe, director of high-performance
technical computing solutions at Hewlett-Packard, told NewsFactor.
"Biologists couldn't imagine having enough computational power to do what
they needed until recently." Full Story, see:
http://www.newsfactor.com/perl/story/18243.html
**********************
News.com
One system for all handhelds?
By CNET News.com Staff
Two Singapore programmers claim to have created an operating system that
can run programs written for Windows as well as Linux.
Called MXI (Motion Experience Interface), the operating system will allow
handheld devices to run any desktop program, said R. Chandrasekar and Sam
Hon Kong Lum, the 22-year-old co-inventors.
At a media conference last week, the duo showed off a Compaq Computer iPaq
PDA running desktop versions of Microsoft's Word, PowerPoint and Internet
Explorer applications. The same iPaq also ran a Pac-Man game for the Atari
OS and a version of Sun Microsystems' open source-based StarOffice software
suite.
The secret? The heavy lifting is done on an MXI-based server that runs the
actual applications and sends a stream of data back to the MXI client
software residing on the handheld.
According to its developers, when a program such as a word processor makes
a call to a specific part of the Windows operating system (to save a file,
for example), MXI intercepts the call and acts on it. It then lets the
program know if the operation was carried out, just as Windows would.
The two inventors, who run an 11-person company in Singapore called
Intramedia, "stumbled on the code" that lets MXI perform this feat of
translation and have spent the last four years perfecting it, said
Chandrasekar. MXI is influenced by Unix, and borrows aspects of the kernel
at the heart of the software, he said.
Because MXI saves interim data on the PDA, people can edit a document
without being online. But when they hit "save," the handheld synchronizes
with the server, and the changes are saved on the server copy of the document.
This method keeps MXI's data stream low in bandwidth use, so a 28.8kbps
data connection would be sufficient, Chandrasekar said. It means that a
handheld with a GPRS (General Packet Radio Service) or other 2.5G
connection can run MXI; handhelds on faster 3G, Wi-Fi (802.11b) or
Bluetooth networks will enjoy even better responsiveness, he added.
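The interception-and-sync flow described above can be sketched as a pair of cooperating objects. This is a hypothetical Python illustration (MXI's internals are not public): the client stands in for the handheld software that traps an application's save call and buffers edits offline, and the server stands in for the MXI host that actually executes the call and holds the authoritative copy.

```python
class MXIServer:
    """Stands in for the remote host that runs the real application."""
    def __init__(self):
        self.files = {}  # authoritative document copies live server-side

    def save(self, name: str, data: str) -> bool:
        self.files[name] = data
        return True  # acknowledge the call, just as the local OS would

class MXIClient:
    """Stands in for the thin client on the handheld."""
    def __init__(self, server: MXIServer):
        self.server = server
        self.pending = {}  # interim edits kept on the PDA while offline

    def edit(self, name: str, data: str) -> None:
        # Editing needs no connection; changes accumulate locally.
        self.pending[name] = data

    def save(self, name: str) -> bool:
        # Hitting "save" intercepts what would be a local OS call,
        # forwards it to the server, and reports the result back to
        # the application as if the operation had happened locally.
        return self.server.save(name, self.pending.pop(name))
```

Only the small call-and-acknowledge messages cross the wireless link, which is consistent with the claim that a low-bandwidth connection suffices.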
However, Chandrasekar said that the PDA that will realize the full
potential of MXI has yet to be invented.
"The ideal MXI-based handheld is one that has every 'flavor' of wireless
connectivity: GPRS, Wi-Fi and Bluetooth," he said.
MXI is expected to be ready for commercial release at the end of the year,
said Gane Ramachandra, Intramedia's vice president of strategy and operations.
Ramachandra said the company is in discussions with handheld makers and
telecom companies in Asia, but he declined to reveal their identities.
**************************
News.com
Software to keep your money safe
By Troy Wolverton
Staff Writer, CNET News.com
Wells Fargo will announce Monday that it plans to launch new software to
combat money laundering.
The software, from enterprise software company Searchspace, uses artificial
intelligence to weed out any activity deemed suspicious. Wells Fargo plans
to have the software up and running by the end of the year.
The financial institution's current systems are based on fairly static
rules. The company wanted a new system that was more adaptable to
real-world transactions as well as one that would learn and improve as it
went along, said Bob Chlebowski, the company's executive vice president of
distribution strategies.
"That is really the heart of it, moving from rules to a more dynamic
environment," he said.
After the Sept. 11 terrorist attacks, President Bush signed the USA Patriot
Act, which put into effect more stringent requirements for banks to monitor
and report potential money-laundering activities. The act has spurred a
number of financial services companies to upgrade their detection systems.
Earlier this year, banking and financial services giant UBS announced that
it would use Searchspace's anti-money-laundering system. Charles Schwab and
Citigroup have announced that they will be using a detection system offered
by Searchspace rival Mantas.
Searchspace's system is able to examine nearly every customer transaction,
including those through its brokerage, consumer lending, private banking
and international businesses, Chlebowski said. The system will not comb
through the transactions in real time; instead, it will examine them in
daily batches.
Searchspace's software compares a transaction conducted within one account
with other transactions in that account as well as with transactions in
similar accounts to look for anomalies, said Konrad Feldman, chief
executive officer of Searchspace. The software learns as it goes, becoming
more familiar with "normal" transactions and better able to spot suspicious
activity, he said.
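The comparison Feldman describes, scoring a transaction against both the account's own history and the behavior of similar accounts, can be sketched with a simple statistical rule. This Python sketch illustrates the general technique only, not Searchspace's proprietary system; a production system would score many features beyond transaction size and adapt its thresholds as it learns.

```python
import statistics

def is_suspicious(amount: float, account_history: list,
                  peer_amounts: list, threshold: float = 3.0) -> bool:
    """Flag a transaction far outside both the account's own history
    and the pattern of similar accounts, using a z-score rule."""
    def zscore(x, sample):
        mu = statistics.mean(sample)
        sd = statistics.stdev(sample) or 1.0  # guard against zero spread
        return abs(x - mu) / sd

    # Anomalous relative to this account AND relative to its peer group.
    return (zscore(amount, account_history) > threshold
            and zscore(amount, peer_amounts) > threshold)
```

Requiring both comparisons to trip keeps a large-but-routine transfer in a high-volume account from being flagged, while a deposit out of character for both the account and its peers is surfaced for human review.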
The system will alert human operators at Wells Fargo to any suspicious
transactions. By law, Wells Fargo will report those to the federal government.
Wells Fargo will run Searchspace's Intelligent Enterprise framework and
Anti-Money Laundering Sentinel software on top of a single server from IBM,
a partner of Searchspace. The software will run on top of AIX, IBM's flavor
of Unix, on an IBM pSeries eServer. The Searchspace software will work in
tandem with IBM's DB2 database.
Searchspace will tie into Wells Fargo's existing and legacy computer
systems using XML, Feldman said.
************************
Lillie Coney
Public Policy Coordinator
U.S. Association for Computing Machinery
Suite 510
2120 L Street, NW
Washington, D.C. 20037
202-478-6124
lillie.coney@xxxxxxx