Clips May 14, 2002
- To: "Lillie Coney":;, Gene Spafford <spaf@xxxxxxxxxxxxxxxxx>;, Jeff Grove <jeff_grove@xxxxxxx>;, goodman@xxxxxxxxxxxxx;, David Farber <dave@xxxxxxxxxx>;, CSSP <cssp@xxxxxxx>;, glee@xxxxxxxxxxxxx;, Charlie Oriez <coriez@xxxxxxxxx>;, John White <white@xxxxxxxxxx>;, Andrew Grosso<Agrosso@xxxxxxxxxxxxxxxx>;, computer_security_day@xxxxxxx;, ver@xxxxxxxxx;, lillie.coney@xxxxxxx;, v_gold@xxxxxxx;, harsha@xxxxxxx;;
- Subject: Clips May 14, 2002
- From: Lillie Coney <lillie.coney@xxxxxxx>
- Date: Tue, 14 May 2002 14:12:01 -0400
Clips May 14, 2002
ARTICLES
SONICblue Seeks Reversal of Data Collection Order
Group Targets Digital TV Piracy
ISPs Seek to Void Ruling on Police Searches
Turkey Mulls Strict Net Bill
Former corrections officer sentenced for misusing FBI system
VeriSign Used False Ads, Suit by Rival Asserts
Plagiarism-Detection Tool Creates Legal Quandary
Pentagon Commitment Helps Advance E-Learning Standard
Out of Silicon Valley, and Looking Homeward
Kazaa, Verizon propose to pay artists directly
Two Virginia Universities To Join Forces Against Cybercrime
Questions Raised by E-Mail on Energy
High Court Takes a Step Toward Net Porn Rules
Defining 'best value'
OMB checks card use
Museum's Cyberpeeping Artwork Has Its Plug Pulled
RealNames online 'keyword' system shuts down
Intellectual property rights issues hamper contracting process
Defense research agency seeks return to 'swashbuckling' days
Homeland security effort boosts e-gov initiatives
Digital divide between agencies remains wide
Computer-based artificial societies may create real policy
Jews shopping online to support Israel
There's no place like home
***************************
Reuters
SONICblue Seeks Reversal of Data Collection Order
SANTA CLARA, Calif. (Reuters) - SONICblue Inc. on Monday moved to overturn
a court order for it to spy on users of its digital recording devices and
share detailed viewing data with major studios and television networks,
saying the order would violate privacy rights.
Santa Clara-based SONICblue called the May 2 order from Central District
Court Magistrate Charles Eick "breathtaking and unprecedented" and said the
directive to track what television viewers watch "violates consumers'
privacy rights, including those guaranteed by the First and Fourth
Amendments."
The plaintiffs in the case, including film studios Paramount, Universal,
The Walt Disney Co. and Metro-Goldwyn-Mayer Inc., as well as TV networks
CBS, ABC and NBC, have argued they need the data, including details on what
commercials viewers skip and what files they transfer across the Internet,
to build their copyright infringement case against SONICblue.
"The information that has been ordered to be collected and disclosed to
plaintiffs is at the core of consumers' expectations of privacy," said
SONICblue, which in the past has acknowledged it has the right to collect
such data under its sales contracts, but chooses not to do so.
At the heart of the matter is the company's ReplayTV
4000 product, which allows viewers to digitally record programs and, if
they wish, to skip over commercials. The device also has a high-speed
Internet port that allows users to download film files.
The studios and networks claim those features threaten to deprive them of
the means of paying for their programs since they allow ads to be cut out
and premium programs on subscription services, such as HBO, to be forwarded
to non-subscribers.
SONICblue also protested against the order for effectively forcing it to
redesign its product for the express purpose of collecting data to be used
against it.
Barring an outright reversal, the company asked for three modifications to
the order, including:
-- allowing consumers to opt in or out of the collection;
-- allowing data to be collected only in aggregate, and not in
person-to-person form; and
-- making any surveillance narrow in scope and limited in duration.
If its appeals are denied, SONICblue will have 60 days from May 2 to design
new software that will allow it to track what its customers are watching.
The company said the modifications will require about $400,000 in
development costs and will take four months to complete without error.
The lawsuit is part of a broader campaign by the studios, and TV networks
to a lesser extent, to combat what they say is video piracy that costs them
billions of dollars each year in lost sales in advertising and
subscription-based programming.
**********************
Los Angeles Times
Group Targets Digital TV Piracy
Technology: Studios and companies seek to put electronic locks on
broadcasts, limiting what viewers can do with recorded media.
By JON HEALEY
TIMES STAFF WRITER
May 14 2002
In the name of fighting piracy, a group of Hollywood studios, technology
companies and consumer-electronics manufacturers wants to slap electronic
locks on free, over-the-air television programs that viewers record digitally.
The proposal is just one of a bundle of restrictions on digital TV signals
that the group is considering for digital TV sets, computers and other
devices. The restrictions, which are being fought by a few companies and
consumer advocates, could spell trouble for viewers as they upgrade from
analog TVs and VCRs to their digital successors, such as DVD recorders.
For instance, a viewer who digitally records "The West Wing" on his or her
living room DVD recorder may be unable to play the disc on one of today's
DVD players. Stopping digital TV piracy is part of a broad effort by the
studios, record companies and other copyright holders to limit what viewers
can do with media in the digital age. These companies argue that piracy
poses an extraordinary threat, but critics say the studios, labels and
publishers are trying to stifle innovation and roll back consumers' rights.
The main goal of the studios, TV manufacturers and computer companies in
the Broadcast Protection Discussion Group is to prevent programs aired on
digital TV stations from being transmitted over the Internet. Online
file-trading networks already make a substantial number of TV shows
available for free, but the studios fear that digital TV will make it even
easier for pirates to record, copy and share programs on the Net.
That's why the studios say they're reluctant to let broadcasters air their
most valuable movies in high-definition TV, the richest form of digital
signal. Set manufacturers, in turn, blame the shortage of compelling HDTV
programming for the sluggish sales of digital TV receivers, which are in
fewer than 1 million homes today.
The Broadcast Protection Discussion Group's latest draft proposal, released
Saturday, would require devices that receive digital TV broadcasts to
protect them with an approved anti-piracy technology before sending them
over a digital connector to be displayed, recorded or stored. The proposal
is still being debated within the group, with a final version expected this
month.
After that work is finished, a separate group will try to come up with a
way to make sure that manufacturers equip their new sets, recorders, cable
TV boxes, satellite receivers, computers and related devices with the
technology.
That effort is likely to involve lobbying Congress for a mandate affecting
all devices capable of receiving, displaying or recording digital TV signals.
The only anti-piracy technologies on the initial approved list are ones
that scramble the digital TV signal electronically. And the only digital TV
programs that would not have to be scrambled after they're received are
those that aren't marked by the broadcaster for protection or that aren't
moved digitally from device to device.
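The conditions described above amount to a simple decision rule: scrambling is required when a program is marked for protection by the broadcaster and is moved digitally between devices. The following is a hypothetical sketch of that logic, with invented flag names (`marked_for_protection`, `digital_output`); it illustrates the behavior described in the proposal and is not actual BPDG specification code:

```python
def must_scramble(marked_for_protection: bool, digital_output: bool) -> bool:
    """Scrambling is required unless the broadcaster left the program
    unmarked, or the content never moves over a digital connector."""
    return marked_for_protection and digital_output

print(must_scramble(True, True))    # protected program over a digital link
print(must_scramble(True, False))   # analog connection: no scrambling
print(must_scramble(False, True))   # unmarked program: no scrambling
```

Under this rule, only the first case triggers scrambling, which is why critics expect DVD recorders attached to digital outputs to scramble local broadcasts automatically.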
A likely result, critics say, is that DVD recorders will automatically
scramble the programs they record from local digital channels. And a
scrambled disc won't play on any of the tens of millions of DVD players
that consumers already have purchased, including the growing number in cars.
Instead, those discs will play only on a new generation of DVD players and
recorders that include one of the approved protection technologies.
Michael Epstein, a senior researcher at Philips Research Labs, acknowledged
that consumers could avoid the scrambling problem by using an analog
connection between their recorder and their digital set. But most consumers
would connect their recorders to the digital output on a cable TV receiver,
unwittingly triggering the scrambling function.
"More than likely, people wouldn't understand what they would have to do"
to rewire their entertainment centers and solve the problem, Epstein said.
Joe Kraus, founder of advocacy group DigitalConsumer.org and of Excite.com,
said a more fundamental problem is the assumption that all retransmissions
online have to be stopped, rather than just the ones that violate copyright
law. His privately funded group wants to exempt fair uses, such as
including excerpts from a digital TV broadcast in a homework assignment
submitted by e-mail.
The problem is that it's next to impossible to design a protection
technology that can tell the difference between fair use and piracy. But,
Kraus countered, "we don't prevent anybody from driving because some people
drive drunk."
He added: "Technical measures like this rarely prevent piracy, but burden
consumers, stifle innovation and generally are bad ideas."
Chairmen of the discussion group and the Motion Picture Assn.'s
representative on the panel could not be reached Monday for comment.
********************
Reuters
ISPs Seek to Void Ruling on Police Searches
Mon May 13,11:18 PM ET
SAN FRANCISCO (Reuters) - Web giant Yahoo! Inc. and several Internet trade
associations filed papers Monday seeking to overturn a court ruling which
they said could fill the offices of Internet companies with police officers
overseeing the execution of search warrants.
In an amici curiae brief filed with the 8th Circuit Court of Appeals in St.
Louis, the Internet group said a Minnesota court ruling requiring police
officers to be physically present for search warrants would threaten client
privacy, slow the searches and disrupt business.
"A large Internet service provider can receive literally thousands of
search warrants and other requests for information during the course of a
year," the brief said.
If the Minnesota ruling is allowed to stand, "it is entirely possible that
at any given time a dozen or more law enforcement officers would be on the
premises of a given service provider," it said.
The Minnesota case involved a search warrant served on Yahoo! in
connection with a child pornography investigation. The warrant was faxed
from Minnesota to Yahoo's headquarters in Santa Clara, California, where
employees pulled up the requested information and sent it back to local
prosecutors.
The defendant in the case subsequently sought to have that evidence
suppressed, arguing that his Fourth Amendment right against unreasonable
search and seizure was violated because it was conducted by civilians.
The judge in the case agreed, saying that a law enforcement officer
should have been present at Yahoo! while the search was being conducted.
His ruling directed all future such searches to be supervised by law
enforcement personnel.
UNREASONABLE BURDEN
The U.S. government has already challenged the ruling, saying it puts an
unreasonable burden on law enforcement in an era when Internet companies
span the globe.
The Internet group, which includes the Computer and Communications Industry
Association, NetCoalition and the United States Internet Service Providers
Association, further argued in their brief that the ruling would do nothing
to extend Fourth Amendment protections.
"The police officer waiting in the lobby while the technician works away on
the computer does not in any way safeguard anyone's Fourth Amendment
rights," the brief said.
The group's lawyer, Jonathan Band, said the Minnesota ruling would also
disrupt normal business operations at Yahoo! and other companies while
having a "chilling effect" on their subscribers, who could be concerned
that a constant police presence would impinge on their privacy rights.
"A lot of people in the industry have been concerned about this decision
ever since it came down," Band said Monday.
"We're saying that this ruling is bad public policy. It's wasteful, it is
going to be a waste of government resources, and of law enforcement
resources that should be out there catching real criminals."
He added that the ISPs were concerned that the burden could increase as the
number of search warrants -- already up sharply in the wake of the Sept. 11
attacks -- grows even faster under the new Council of Europe Cybercrime
Convention, an international treaty which requires the U.S. government to
obtain information from Internet service providers at the request of
foreign governments.
"You could have countries halfway around the globe requiring these
searches, and we would have to comply," Band said. "All the work is going
to be done by the service providers, and their technicians and engineers.
Having police present will add no value."
********************
Wired News
Turkey Mulls Strict Net Bill
ANKARA, Turkey -- A media bill to go before the Turkish parliament Tuesday
could cripple the Internet industry, harm the nation's struggling economy
and hobble free speech on the Web, observers say.
The bill would expand already stringent regulations on all forms of media
and would require websites to submit two hard copies of pages to be posted
on the Internet to a government agency for prior approval.
It also would require those who wish to open a website to obtain
permission from local authorities, and to inform them every time the site
is changed.
Those who don't comply or are found in violation could face fines ranging
from about $95,000 to $195,000, enough to financially devastate most Web
content and service providers in a country where per capita income is less
than $2,200.
The legislation comes at a time when European Union-hopeful Turkey is
struggling to meet Brussels' political criteria for membership, including
overhauling its often dismal human rights record and expanding civil
liberties.
Dozens of journalists, politicians and intellectuals have been jailed under
already draconian laws curbing free speech, and some fear vagaries in the
proposed legislation could be used as tools for political prosecution.
The burdens proposed in the bill would cause an exodus of Turkish websites
abroad and deal a major blow to content provision and Internet expansion,
said Savas Unsal, CEO of Superonline, Turkey's biggest ISP with 950,000
dial-up and 2,500 corporate subscribers.
Unsal said a lack of Turkish language content would further widen the
nation's digital divide in favor of the rich, who are educated enough to
take advantage of the Internet in English.
"We don't want to provide access to just the elite," Unsal said. "We want
the government to let us do our jobs and open the gates to technology and
the future."
The bill focuses mostly on print and broadcast media, Unsal said, and the
Internet provision was tacked on as a result of e-mails and Internet news
critical of the leaders of the nation's three-party coalition government.
"It was a quick and dirty job done by people who don't understand the
Internet or what they're asking," he said.
Unsal also said the bill is incompatible with Turkey's European Union
membership bid, which requires more press freedom than exists under current
law.
European Union representatives have said the bill would harm democracy, and
that it runs counter to the government's publicly stated goal of loosening
the country's constitutional restrictions on speech and expression.
The bill was killed last year by President Ahmet Necdet Sezer, who blasted
it in his veto statement, saying parts of the proposed law were not
"compatible with democratic traditions, basic rights and freedoms or
constitutional principles."
If parliament members pass the bill again without change, though, Sezer's
hands will be tied. He'll have to accept the law, and his only recourse
will be to refer it to Turkey's constitutional court.
The court's decision could take from a few months to a few years, hampering
Internet growth in a country where Unsal says it already has been slowed by
economic crisis.
*********************
Government Computer News
Former corrections officer sentenced for misusing FBI system
By Wilson Dizard III
Gary Piedmont of Reynoldsburg, Ohio, was sentenced to 30 days of community
confinement, a $5,000 fine and a year of probation for using the FBI's
National Crime Information Center system to check whether a warrant had
been issued for a friend.
Piedmont formerly was a supervisor at the Franklin County Corrections
Center. While working there he met Melodie Lynn Calomeris, who was housed
there pending her transfer to a federal prison, according to a summary of
the matter agreed to by prosecutors and Piedmont before sentencing.
Piedmont employed Calomeris, also known as Melodie Lynn Stillwell, as a
housekeeper after her release.
"The Probation Office was in the process of issuing an arrest warrant for
Calomeris for violating her supervised release," according to a statement
by Gregory Lockhart, U.S. Attorney for the Southern District of Ohio. "The
Marshals Service and the Franklin County Sheriff's Office investigated and
found that Piedmont had used the system to check on the warrant."
Piedmont checked the NCIC nine times in May 2000 to see if the warrant had
been issued, authorities said.
Piedmont's sentencing by Magistrate Judge Terence P. Kemp of the U.S.
District Court for the Southern District of Ohio was based on criminal
information filed by the U.S. Attorney's office and Piedmont's guilty plea.
******************
Reuters Internet Reports
VeriSign Used False Ads, Suit by Rival Asserts
Mon May 13, 5:12 PM ET
By Andy Sullivan
WASHINGTON (Reuters) - Internet domain-name seller VeriSign Inc. was hit
with a lawsuit filed on Monday by a rival that charged VeriSign with using
false advertising to steal customers.
BulkRegister, a domain-name seller based in Baltimore, said it had sued
VeriSign in federal court to stop a direct-marketing campaign that sought
to trick owners of domain names into switching their accounts to VeriSign.
VeriSign's campaign has also drawn protests from other domain-name sellers,
which allow Internet users to reserve names like www.example.com on a
yearly basis.
According to BulkRegister and other name sellers like GoDaddy Software,
VeriSign has since April been mailing out thousands of "domain name
expiration" notices that imply that domain-name owners could lose control
of their name if they do not return the form along with $29 by May 15.
As a result, domain-name owners could end up renewing their names
prematurely, paying higher annual fees and suffering with inferior customer
service, said Tom D'Alleva, vice president of marketing for BulkRegister.
The practice could also render the customer's Web site useless because it
could disrupt other services like Web hosting, D'Alleva said.
"I don't understand why a major company would indulge in this," D'Alleva said.
The notices violate a U.S. law that requires mail solicitations to be
clearly marked, said Patrick O'Brien, a lawyer who filed the suit for
BulkRegister.
A VeriSign spokesman declined to comment on the litigation but said he
disagreed with BulkRegister's assessment of its marketing campaign.
The once-hot domain name market has cooled since the height of the dot-com
bubble, when names such as www.business.com fetched prices as high as $7.5
million.
Despite the introduction of new "top level" domains like .biz and .info,
the total number of domain names has shrunk by 6 percent from its 2001 peak
to 29.5 million, according to "State of the Domain," an industry report.
VeriSign, the No. 1 domain-name seller, recently announced layoffs of 10
percent of its work force after posting a quarterly loss of 9 cents per
share. Its stock closed at $10.01 per share Monday, off 85 percent from its
52-week high of $67.94.
A Canadian company last month agreed to pay $375,000 to settle Federal
Trade Commission charges that it duped Internet domain-name holders into
needlessly buying similar-sounding names.
******************
Chronicle of Higher Education
Plagiarism-Detection Tool Creates Legal Quandary
When professors send students' papers to a database, are copyrights violated?
By ANDREA L. FOSTER
When electronic tools to ferret out student plagiarism hit the market a few
years ago, colleges saw them as easy-to-use and affordable. But now some
college lawyers and professors are warning that one of the most widely used
plagiarism-detection services may be trampling on students' copyrights and
privacy. And many campus officials are starting to wonder whether some of
the high-tech tools they are using to detect dishonesty clash with
students' legal rights.
The service creating the stir is Turnitin.com (http://www.turnitin.com),
which says it has about 400 colleges in the United States on its client
list. A paper submitted to the service is checked against a database of
manuscripts -- estimated at more than one million -- and a database of
books and journals, as well as more than a billion Web sites. Phrases that
seem to be unoriginal are flagged for professors to check.
A campuswide subscription to Turnitin.com costs $1,000 to $10,000 a year,
depending on the size of the college. The service also is sold to
individual professors and academic departments.
What makes it effective -- but also controversial -- is that it keeps the
papers that colleges submit for inspection, in order to enlarge its
database. Most other plagiarism-detection services, like Copycatch and
Eve2, allow professors to run student papers through a computer program
that checks for material copied off the Internet or for collusion among
students.
Since those services don't retain the submissions, the pool of manuscripts
that papers are compared with is likely to be smaller than Turnitin.com's,
says Louis A. Bloomfield, a physics professor at the University of
Virginia, who passionately defends both types of plagiarism-detection
services. He created a computer program that finds shared passages among
submitted papers but does not save them in a database.
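The approach described above, finding passages shared verbatim between papers, can be sketched with word n-gram overlap. This is a hypothetical illustration of the general technique, not Bloomfield's actual program or Turnitin.com's algorithm, and the 6-word window size is an arbitrary choice:

```python
import re

def ngrams(text, n=6):
    """Return the set of lowercase word n-grams in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_passages(paper_a, paper_b, n=6):
    """Return the n-word phrases that appear verbatim in both papers."""
    return ngrams(paper_a, n) & ngrams(paper_b, n)

a = "The quick brown fox jumps over the lazy dog near the river bank."
b = "He wrote that the quick brown fox jumps over the lazy dog daily."
matches = shared_passages(a, b)
print(len(matches))  # → 4 six-word phrases shared verbatim
```

Because only hashes or phrase sets need to be compared, a service can check a new paper against a large database quickly; the controversy arises when the full text, rather than such derived fingerprints, is retained.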
Lawyers say the problem with Turnitin.com is that student papers are copied
in their entirety to the service's database, which is a potential
infringement of students' copyrights. (An author doesn't need to file for a
copyright; the law automatically bestows on authors the rights to their
written works.) And the copying is sometimes done without students'
knowledge or consent, which is a potential invasion of their privacy.
Those concerns contributed to the decision by officials at the University
of California at Berkeley not to subscribe to Turnitin.com, says Mike R.
Smith, assistant chancellor for legal affairs: "We take student
intellectual-property rights seriously, and that became one of the trouble
spots for us in moving ahead with this proposal."
Rapid Growth
Berkeley's decision is noteworthy because Turnitin.com had its genesis
there. John Barrie founded the service, in 1998, on the basis of software
he had begun developing four years earlier while he was a graduate student
at Berkeley.
Turnitin.com has grown rapidly, and Mr. Barrie says his company recently
won a contract with Britain's Joint Information Systems Committee to serve
more than 700 higher-education institutions in Britain, starting in September.
The committee, which declined to confirm his assertion, promotes the use of
technology in higher-education institutions. Mr. Barrie won't reveal how
much money Turnitin.com will earn from the British contract.
At Indiana University-Purdue University at Indianapolis, officials
considering a deal with Turnitin.com are mindful of students' privacy and
copyrights, says Kenneth D. Crews, a professor at the Indiana University
School of Law, who is director of the IUPUI Copyright Management Center.
The university ought to go ahead with a contract only after impressing upon
faculty members the importance of notifying students at the start of the
course that their work may be submitted to Turnitin.com, which would retain
it, he says.
"Let them know what you're doing and give them a chance to opt out," the
professor says. In fact, he adds, some professors may feel more secure
using the service only after obtaining a definitive go-ahead from students.
College lawyers say they are unaware of any lawsuits filed by students
claiming copyright infringement or invasion of privacy because professors
submitted their papers to Turnitin.com. But Rebecca Moore Howard, an
associate professor of writing at Syracuse University who is an outspoken
critic of all plagiarism-detection services, says it's only a matter of
time before a student accused of plagiarism takes such action. "Student
work is being contributed to the site for others' use without students'
permission, and that's pretty shaky ground," says Ms. Howard.
"I have encouraged our faculty not to use [Turnitin.com] for that reason."
Ms. Howard says students whose papers are submitted to the service could
argue that their rights are being violated under the Family Educational
Rights and Privacy Act, which bars colleges from releasing personal
information about students without their consent.
Indeed, LeRoy S. Rooker, director of the U.S. Department of Education's
Family Policy Compliance Office, says there is no exception to the act that
would permit colleges to turn over student papers to an outside vendor
without students' written permission. "You can hire a vendor to check for
plagiarism," he says. "But once they do that, they can't then keep that
personally identifiable document and use it for any other purpose."
Warning Students
Turnitin.com's founder, Mr. Barrie, is aware of colleges' skittishness over
violating students' legal rights. As a result, the company encourages
professors to warn students that copies of their papers will be checked and
kept by the plagiarism-detection service, and to request that students
themselves upload their work to the company's database. In that way,
students cannot later argue that their papers were submitted to
Turnitin.com without their knowledge. About 70 percent of the papers
received by the service each day are uploaded by students, he adds.
He calls the privacy allegation "petty criticism" and contends that
colleges are not violating FERPA by submitting papers to Turnitin.com
because the work is not distributed elsewhere.
The procedure, though, raises questions about whether students feel coerced
into submitting their papers to the service, and what would happen if they
told their professors that they objected to handing over their work because
doing so would undermine their legal rights.
Mr. Barrie responds that professors can explain to students why that
assertion is wrong -- as he argues -- or just tell them, "Write as much
creative stuff as you want -- just don't do it at this institution."
"It's analogous to football players saying, 'I'm not going to abide by the
referee,'" he says.
He also denies that the company is infringing on student copyrights -- even
if the students aren't forewarned that their papers will be handed over to
Turnitin.com -- arguing that the service is simply making "fair use" of
student works.
It's an unusual rationale for commercial activity. Traditionally, the "fair
use" exception to copyright law is cited by scholars who copy passages from
books for their research, or by instructors who copy magazine articles for
classroom use.
"In no way do we diminish students' ability to market their work," says Mr.
Barrie.
"Since we vet for originality, it increases the marketability of the work
and increases the confidence a publisher might have in publishing that work."
Under copyright law, the fair-use exception is easier to justify if freely
distributed copies of a document are not expected to threaten its
commercial value.
Dan L. Burk, who is a professor at the University of Minnesota Law School
who specializes in intellectual property, says of Mr. Barrie's fair-use
defense: "That's baloney."
As many as three factors undermine the argument, the professor says: The
students' papers are completely copied. They are often creative works, as
opposed to compilations of scientific facts. And they are being submitted
to a commercial enterprise, not an educational institution. "To run a
database, you've got to make a copy, and if the student hasn't authorized
that, then that's potentially an infringing copy," says Mr. Burk.
That's one reason Turnitin.com has not been popular among faculty members
at the University of Minnesota-Twin Cities, which has a one-year trial
subscription to the service that is set to expire in August. Only 40
professors have signed up for the service, says the assistant vice provost,
Linda K. Ellinger.
Success Stories
Still, some professors and college administrators staunchly defend
Turnitin.com and say they aren't worried that subscribing to the service
might subject them to lawsuits.
"I view Turnitin.com as an agent for the university," says Nicholas S.
Aguilar, director of student policies and judicial affairs at the
University of California at San Diego. "It's no different than having a
teaching assistant review students' work and confirm whether it's authentic."
San Diego requires professors to inform students that they will be required
to submit their papers to Turnitin.com as part of the grading process. And
if a student refuses to comply? "We leave it up to the instructor to
determine how to treat that," says Mr. Aguilar.
The university is in the first year of a two-year contract with Turnitin.com
and is pleased with the results of the service.
"Anecdotally, I'm getting feedback from instructors that they are seeing
significantly fewer instances of papers that contain plagiarized text,"
says Mr. Aguilar. He adds that the university has assurances from
Turnitin.com that it will not use students' papers for any purpose other
than to validate their originality.
Duke University's College of Arts and Sciences, too, has had its legal
concerns satisfied by Turnitin.com. "They sent us a lengthy document that
said they weren't infringing, since nothing goes out of the database
without students' express permission," says Michele Rasmussen, assistant to
the dean.
But, since August, when the college began subscribing to the service, fewer
than 15 professors have used it, she notes. "We're not sure if faculty are
not willing to use it, or faculty have not caught on to it."
Duke advises professors to submit only those papers that they suspect might
be plagiarized. And it's up to professors to decide whether to warn
students about the potential use of the service. The college "expected a
vocal reaction" from students and was surprised when that didn't happen,
Ms. Rasmussen says. "Maybe if the usage was higher, we'd be getting more of
a reaction."
In fact, Duke's student newspaper, The Chronicle, endorsed Turnitin.com in
an editorial, calling the service "unobtrusive" and arguing that it "comes
the closest to maintaining academic honesty without damaging the trusting
environment that administrators have attempted to foster."
Sheldon E. Steinbach, vice president and general counsel of the American
Council on Education, scoffs at the notion that students' privacy might be
violated when their papers are uploaded.
"If a system is being employed to try to promote academic integrity, and
sustain the value of a degree from being diminished by fraud, I don't quite
see where there's a FERPA claim," says Mr. Steinbach.
Similarly, Virginia's Mr. Bloomfield says it would be an injustice if
Turnitin.com was forced to stop providing its service because of a legal
complication.
He suggests that students who object to having their papers become a part
of a commercial database be offered alternative, in-class assignments. Or
perhaps copyright law should be amended to accommodate a
plagiarism-detection service like Turnitin.com, Mr. Bloomfield suggests.
After his own computer program flagged 157 papers at Virginia for suspected
plagiarism, the professor turned the cases over to the university's honor
committee in April 2001. Forty-three of the students were found guilty
after a trial or admitted plagiarism, and 88 were cleared; trials are
pending in most of the remaining cases.
He says Turnitin.com and other plagiarism-detection services do more than
just ferret out plagiarists: They improve the higher-education system by
helping to attach more meaning to students' grades, and they make dishonest
students realize that it doesn't pay to use any means necessary to get ahead.
"If copyright problems make it difficult to ensure the integrity of the
classroom, how does this benefit society? How does this benefit the
students?" he asks. "What important right of students is being preserved by
barring a service from retaining a copy of their paper?"
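The core mechanism behind a program like Mr. Bloomfield's, which flagged papers by finding shared passages, can be sketched as n-gram (word-shingle) overlap. This is a generic illustration of the technique, not the actual algorithm used by Turnitin.com or by the Virginia professor; the threshold and shingle size are arbitrary assumptions.

```python
# Minimal n-gram overlap sketch of plagiarism detection. Illustrative only:
# real services use far larger corpora, fingerprinting, and tuned thresholds.

def ngrams(text, n=5):
    """Return the set of n-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, source, n=5):
    """Fraction of the submission's n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)

if __name__ == "__main__":
    paper = "the quick brown fox jumps over the lazy dog near the river bank"
    source = "the quick brown fox jumps over the lazy dog in the meadow"
    print(round(overlap_score(paper, source), 2))  # prints 0.56
```

A submission scoring above some cutoff would be referred for human review, as at Virginia, rather than judged guilty automatically.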
********************
Washington Post
Pentagon Commitment Helps Advance E-Learning Standard
By Ellen McCarthy
Michael Parmentier almost gushes about the potential benefits of online
learning. He envisions a world where pieces of information -- no matter how
detailed or obscure -- are instantaneously available, packaged and
delivered for easy absorption.
Next week, Parmentier's vision will come a step closer to reality. An
initiative spearheaded by the Department of Defense to make various online
training technologies work together has quickly produced an unofficial set
of standards for the industry.
"When we first tried to use distance learning, every time you changed a
chip or part of the system, we'd have to recreate all of the content," said
Parmentier, director of readiness training policy and programs for the
Defense Department. "We knew that if we could create an agreed-upon
platform, we wouldn't have to keep changing the content that had been created."
In November 1997, Parmentier's office quietly put out a notice that it
would hold a meeting in Rosslyn to discuss the need for standards among
electronic-learning technologies. More than 400 people showed up and "they
just wouldn't go away," he said.
That meeting spawned a collaborative effort by government agencies, private
companies and academic institutions to develop standards for not only the
military but all e-learning systems. Rather than independently creating its
own set of standards and requiring that vendors adjust their products
accordingly -- as was often done with other technologies -- the Pentagon
let industry experts guide the project.
It set up shop in Alexandria and secured $6.5 million in federal money to
fund the testing and development of e-learning standards, now called the
Advanced Distributed Learning initiative, or ADL. The initiative's goal was
to create a set of basic principles for content and software providers so
that information created for one unit or organization would be accessible
on any other e-learning system.
While other sets of standards emerged before the Defense Department's
initiative was launched, no one organization had the influence to set an
industry-wide standard. For many software and content providers, the
military represents a large, coveted client with too much clout to ignore.
"Once there is a first big buyer in any industry, a standard emerges. The
DOD is the single largest trainer in the world -- they were that huge
buyer," said Elliott Masie, president of the Masie Center, a think tank in
Saratoga Springs, N.Y. "They were seen as being honest, independent. And
everybody wanted the same objective -- content that was reusable and could
be personalized."
Masie, who served as a consultant on the project, believes that without the
Pentagon's involvement, comprehensive industry standards would still be 10
years away. Rather than starting from scratch, the ADL standard took its
cues from existing sets of standards and developed a more thorough set.
Because of the group's rapid and inclusive efforts, he said, even other
standards that are still in use have been largely marginalized.
"They've created an environment, where if you're in the business, you would
be a fool not to address the standards they've come up with," Masie said.
"Not that they were meant to become a bully customer, but it was just
rational -- no other single player had the clarity to do this."
Robby Robson, chair of the Institute of Electrical and Electronics
Engineers' Learning Technology Standards Committee, argued that while the
Defense Department's e-learning initiative has been a success, it is more a
compilation of existing specifications than the creation of new standards.
"Their role has been making these things that all the other organizations
developed practical. . . . They created a working model," Robson said.
"They certainly aren't making accredited or legal standards. What they've
done is in essence is say, 'Here is all the work that all these people did
and we have real business cases and real needs, so let's go out and get it
done.' And more power to them, because they did it."
The ADL project counts industry leaders such as Electronic Data Systems
Corp., Click2learn.com, IBM Global Services, Microsoft Corp. and SkillSoft
Corp. among its partners.
The ADL work, and the span of its influence on e-learning, has not been
without criticism from some players in the sector. While vendors have
little choice but to make their products fit the ADL standard (named the
Sharable Content Object Reference Model), some say the standards are too
focused on tracking content and cannot yet guarantee that all systems deemed
compliant will actually work together.
Amar Dhaliwal, vice president of engineering at ThinQ Learning Solutions
Inc., said his company moved quickly to make its content management and
delivery products compliant when ADL's standards were first released in
2000. Next week, when the new version is released, it will do so again. To
do otherwise, he said, the company would risk losing the confidence of both
government and corporate clients.
"The standards are still relatively young, and as they go through different
versions, there are significant gray areas. Someone can be creating content
that is compliant but it still might not fully work on all learning
management systems," Dhaliwal said.
Regardless of its kinks, few in the e-learning world deny that the
initiative will continue to have a profound effect on the industry.
"The eventual advantage is that we're going to provide a digital knowledge
environment where chunks of knowledge are going to be shareable and
reusable. If they exist somewhere, you will be able to find them," said the
Pentagon's Parmentier. "Eventually we'll have a world where knowledge flows
like water or electricity."
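The interoperability Parmentier describes rests on describing course content in a neutral manifest that any compliant system can load. The toy schema below is invented for illustration and is much simpler than a real SCORM manifest (imsmanifest.xml); it only shows the idea that a "sharable content object" is content plus portable metadata.

```python
# Toy illustration of SCORM-style content packaging: a neutral XML manifest
# lists content objects so any compliant player can discover and load them.
# This simplified schema is invented for illustration.
import xml.etree.ElementTree as ET

MANIFEST = """
<manifest>
  <item id="sco-1" title="Intro to Readiness Training" href="intro.html"/>
  <item id="sco-2" title="Logistics Module" href="logistics.html"/>
</manifest>
"""

def load_items(manifest_xml):
    """Return (id, title, href) tuples a hypothetical compliant player could use."""
    root = ET.fromstring(manifest_xml)
    return [(i.get("id"), i.get("title"), i.get("href"))
            for i in root.findall("item")]

if __name__ == "__main__":
    for sco in load_items(MANIFEST):
        print(sco)
```

Because the player reads only the manifest, the same content package could, in principle, move between systems without being recreated each time the underlying platform changes.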
***********************
Washington Post
Out of Silicon Valley, and Looking Homeward
By Richard Morin and Claudia Deane
Engineers and entrepreneurs from India and China who work in Silicon Valley
are quietly fueling a high-tech revolution in their native countries in
ways that challenge traditional notions of a "brain drain," according to a
new study by the Public Policy Institute of California.
AnnaLee Saxenian, a professor of city and regional planning at the
University of California at Berkeley, calls these highly trained
foreign-born professionals "agents of global economic change." She found
that many of these immigrants regularly return to their native countries to
talk tech, advise local businesses or consult with government officials
about business or technology.
Last year, Saxenian surveyed 2,273 members of the 17 leading technology and
business professional associations in the San Francisco Bay Area. Overall,
about 9 in 10 were born outside the United States. Of these tech workers,
43 percent were born in India, 30 percent in China and 12 percent in
Taiwan. The rest were from other countries.
Saxenian found evidence of a reverse brain drain. Nearly three-fourths of
the Indian respondents and two-thirds of the Chinese said they knew between
1 and 10 immigrant professionals who had returned home. About 3 in 4 survey
participants said they would consider starting businesses in their native
countries.
Half of the foreign-born professionals surveyed said they traveled to
their native countries for business at least once a year. Some make the
trans-Pacific hop so often that they're called "astronauts," Saxenian
wrote, "because they appear to spend their lives in airplanes."
According to her study, 8 in 10 reported that they shared information about
technology with colleagues in their native countries. Four in 10 said they
helped to arrange business contracts back home. One-third said they met
with government officials, and more than 1 in 4 served as an adviser or
consultant for a company in their country of birth, she found.
"The 'brain drain' from developing countries such as India and China has
been transformed into a more complex, two-way process of 'brain
circulation' linking Silicon Valley to select urban centers in India and
China," Saxenian concluded.
BIBLIOPHILES: The precocious New America Foundation has just agreed to a
deal with Basic Books to publish jointly up to 10 books a year covering
public policy and current affairs. Until now, fellows at the three-year-old
think tank primarily have relied on op-eds and magazine articles to express
their deep thoughts.
The joint book imprint is "in contrast to the way Brookings and the
American Enterprise Institute do it, which is through in-house imprints,"
said Ted Halstead, New America's president. "Those books typically don't
get into Barnes & Noble, typically are not reviewed in major papers and are
published at a net cost to the institution."
Basic Books and New America each will have veto power over book proposals,
and will jointly negotiate each author's advance. Basic will provide New
America support for its editorial work.
"New America offers great resources, like an office and time and a place to
be, and then we bring our publishing resources to bear," said Liz Maguire,
associate publisher and editorial director at Basic Books. "We're hoping to
sign a good handful of books by the end of the year."
THE OVERSEAS MARKET: Speaking of books, the Hudson Institute has a
bestseller on its hands -- in Japan.
A new collection of essays called "The Re-Emerging Japanese Superstate in
the 21st Century" is at No. 2 on the nonfiction list of the Nihon Keizai
Shimbun, a leading financial daily, according to Hudson Vice President
Kenneth Weinstein.
The essays center on "the future of Japan, and some argue that Japan's real
outlook is not as gloomy as most observers believe," Weinstein said.
The new book, which takes its title from Hudson founder Herman Kahn's "The
Emerging Japanese Superstate," published in 1970, is in its fourth printing
since late March. There are 28,000 copies in print.
DEAN SEARCH: Well, one thing remains clear: The dean of Washington think
tank presidents is named Ed.
But it's neither Edwin J. Feulner Jr. of the Heritage Foundation nor Edward
H. Crane of the Cato Institute, both at roughly 25 years of service, as we
reported last week.
"I'm sorry to disappoint 'the Eds' at Cato and Heritage, but neither is the
dean of Washington think tank presidents. The clear leader is Eddie N.
Williams, president of the Joint Center for Political and Economic Studies,
who is serving his 30th year," reports Liselle Yorke of the Joint Center.
Williams "came to the Joint Center in 1972, two years after its founding in
Washington, D.C. . . . I hope this puts the debate to rest."
Perhaps. Any other contenders?
CANDID QUOTE OF THE WEEK: "I have been asked to talk about my book, a
subject about which I have become somewhat weary," said Bernard Lewis last
week at the Ethics and Public Policy Center. A preeminent Middle East
scholar and longtime Princeton professor, Lewis recently published "What
Went Wrong: Western Impact and Middle Eastern Response," a well-timed book
on Islam and the West. "I shall endeavor not to communicate my boredom to
the audience."
******************
USA Today
Kazaa, Verizon propose to pay artists directly
Jefferson Graham
Jim Guerinot, a board member of Don Henley's and Sheryl Crow's Recording
Artists Coalition and the manager of No Doubt, Beck and The Offspring, is
such a fan of digital music that he has ripped his CD collection into MP3s
and listens to them on his portable Apple iPod on his daily bicycle commute
to work.
But you won't find any of his artists' work posted on Pressplay, the Net
subscription service backed by their record labels. He removed them, he
says, because the acts weren't getting paid.
The record industry has responded to the immense popularity of
file-sharing and trading of copyrighted material by suing to close the
operations down. But as one swap site shuts, others take its place, and
more people are downloading now than ever.
An unlikely alliance of swap-service Kazaa and telephone and Internet giant
Verizon is floating a proposal to break the logjam of lawsuits: Computer
manufacturers, blank CD makers, ISPs and software firms such as Kazaa would
pool funds and pay artists directly.
"Historically, there's been a clash between the content community and new
technology, back to the player piano," says Verizon vice president Sarah
Deutsch. "We're proposing the idea of a copyright compulsory license for
the Internet, so peer-to-peer distribution would be legitimate and the
copyright community would get compensation. It's hard to get the genie back
in the bottle."
Kazaa lobbyist Phil Corwin says a $1-a-month fee per user on Internet
providers alone (it's unclear whether costs would be passed along to
subscribers) would generate $2 billion yearly: "We're talking about a
modest fee on all the parties who benefit from the availability of this
content."
Recording Industry Association of America president Hilary Rosen calls the
proposal "the most disingenuous thing I've ever heard. It's ridiculous."
But Guerinot isn't ready to dismiss it out of hand: "Any model that starts
to accommodate monetizing the artists is worth looking into."
Guerinot is upset that the labels have tried to combat technology with
alternatives that have been widely rejected by the public. MusicNet and
Pressplay offer limited downloads, but not in the preferred MP3 format, and
they usually can't be transferred to portables or burned to CDs.
"It would be like me opening a video store, charging 10 times what others
were charging and only offering videos in the Beta format," Guerinot says.
"In any business, when you have billions of downloads occurring, you don't
say we're going to ignore that market and try to create something else. You
serve your customers."
**********************
Washington Post
Two Virginia Universities To Join Forces Against Cybercrime
By Brian Krebs
Two Virginia schools on Tuesday will launch a $6.5 million project to help
sort out the myriad legal, technical and policy challenges involved in
steeling the nation's most vital computer systems against cyber-attack.
The Critical Infrastructure Protection Project - to be housed at the George
Mason School of Law in Arlington - is a collaborative effort between GMU's
National Center for Technology and Law and researchers and academicians at
James Madison University.
The project will be led by John A. McCarthy, a former member of a Clinton
administration team that facilitated government and private-sector
collaboration in preparing key computer systems for the Y2K conversion.
Among the more pressing problems the new center will tackle are legal
issues that have stymied plans to establish more fluid and open
information-sharing networks between the public and private sector.
Tech companies have indicated they would be more willing to share
information with the government if they could be assured that data would
not be leaked to the public through the Freedom of Information Act.
Lawmakers in both the House and Senate are pushing legislation that would
guarantee such protections.
But consumer and privacy watchdog groups say FOIA case law adequately
protects any of the information concerning cyber-security issues that
should legitimately be withheld from the public. Rather, they argue, the
legislation could end up exempting companies from legal liability for
security lapses.
"The information-sharing plan has been on the table for six years and we
still haven't come up with a workable solution because of legal
obstacles," McCarthy said. "We hope that by putting our third-party hat on
we'll be able to bring together the right constituencies to broker lasting
and useful solutions to long-term problems."
The center also plans to offer congressional testimony and become the
central clearinghouse for data and research on cybersecurity and critical
infrastructure protection.
"We want to become the center that researchers and government leaders can
come to that centralizes a lot of data and findings on cybersecurity,"
McCarthy said. "Right now, that data is all over the map, and we're
planning to bring that together in one place."
In addition, the group plans to work with other schools to coordinate
research and development on cyberterrorism issues.
The program is being paid for through the National Institute for Standards
and Technology (NIST), an arm of the U.S. Department of Commerce.
The $6.5 million was allocated under the FY2002 Commerce-State-Justice
appropriations bill, which funds the center for the next two years.
Rep. Frank Wolf (R-Va.), chairman of the U.S. House Subcommittee on
Commerce, Justice, State and Judiciary, and author of the original funding
measure, is looking to give the center more money through the
appropriations process, an aide said.
*************************
Los Angeles Times
Questions Raised by E-Mail on Energy
Policy: Message gleaned from administration energy task force staff member
leads Democrats to voice suspicion over its meaning.
By RICHARD SIMON
TIMES STAFF WRITER
May 14 2002
WASHINGTON -- An e-mail from a high-ranking staff member of the Bush
administration's energy task force said officials were "desperately trying
to avoid California" in a report dealing with the energy crisis last year,
Rep. Henry A. Waxman (D-Los Angeles) said Monday.
The e-mail, obtained under the Freedom of Information Act by watchdog and
environmental groups and also provided to Waxman, was largely redacted and
neither administration officials nor the congressman could say specifically
what the task force staffer was discussing.
But Waxman seized on the missive to turn up the heat on the White House to
release all documents relating to the task force's discussions about
California's energy crisis. Waxman said the e-mail, along with the recent
release of documents showing Enron sought to manipulate the California
energy market, "underscores the need for the administration to provide a
complete accounting of its understanding of and approach to the energy crisis."
"It is important to know what, if anything, the administration knew about
Enron's efforts to manipulate the California energy market," Waxman said in
a letter to Vice President Dick Cheney, who chaired the task force.
The e-mail was sent May 4, 2001, by Karen Knutson, deputy director of the
energy task force, to Environmental Protection Agency official Jacob Moss.
It raised concerns about putting into the report a paragraph "emphasizing
environmentally difficult issues about expediting permitting to deal with a
crisis."
"We are desperately trying to avoid California in this report as much as
possible," it continues. "Can you make the same point without going into
California?"
Knutson could not be reached for comment, and administration officials
could not say what it was about California that she sought to keep out of
the report. But they defended their actions on California's energy crisis.
"Congressman Waxman has confused the Bush administration with the lack of
action taken by the Clinton administration," said Energy Department
spokeswoman Jeanne Lopatto. She said the Federal Energy Regulatory
Commission, on Bush's watch, imposed electricity price controls and
launched an investigation into whether Enron and other power sellers sought
to use their market power to drive up prices.
The task force report released last year features California prominently,
using the state's problems to bolster its arguments for building more power
plants.
Waxman said the e-mail appeared to have come to light through an
"inadvertent failure on the part of EPA to redact information it had
intended to redact."
He said it raised questions about whether the widespread deletion of
information from thousands of pages of documents turned over by federal
agencies was the result of the administration's claims that disclosure
would chill frank and open discussion or was intended to avoid embarrassment.
Cheney spokeswoman Jennifer Millerwise called Waxman's letter "another in a
long list of politically motivated letters that Congressman Waxman
regularly sends out."
She said the purpose of the administration's national energy plan was "to
develop a strategic, long-term energy policy for the country ... not to
focus on short-term problems specific to a state."
Waxman's letter was the latest salvo in a yearlong effort by congressional
Democrats to force the administration to make public details of private
meetings held with lobbyists during drafting of the energy plan.
The battle was fired up last week with the release of internal Enron memos
that detailed an array of Enron price-manipulation strategies. Two Senate
committees plan hearings on the memos Wednesday.
Sen. Byron L. Dorgan (D-N.D.), chairman of the Senate subcommittee on
consumer affairs, said he plans to grill the authors of the Enron memos to
"try to figure out who was involved in creating the strategies" and how far
up in Enron's management knowledge of the strategies reached.
Dorgan said he also expects to question an official from FERC, probably its
chairman, Patrick H. Wood III, to find out "what it knew and what it didn't
know and why."
"It was a horrible failure on the part of regulators," Dorgan said in an
interview Monday. "FERC was sitting on its hands."
Sen. Barbara Boxer (D-Calif.), a member of the subcommittee on consumer
affairs, cited California's budget problems Monday in urging Wood to act
swiftly to complete the commission's price-manipulation investigation and provide
refunds and renegotiated energy contracts, "so other vital services, such
as health care and education, do not have to be cut to continue to pay for
manipulated electricity prices."
***********************
Los Angeles Times
High Court Takes a Step Toward Net Porn Rules
Justices: The cautious and complex ruling says a strict standard can be
applied to online sites to shield children. However, free-speech challenges
to the 1998 law remain.
By DAVID G. SAVAGE
TIMES STAFF WRITER
May 14 2002
WASHINGTON -- The Supreme Court breathed new life Monday into a law
designed to shield children from pornography on the Internet, ruling that
the "most puritan" community may decide what is harmful to minors, even if
a Web site based elsewhere is prosecuted based on this strict standard.
On an 8-1 vote, the court reversed a lower court's ruling that had struck
down the Child Online Protection Act because it relied on "community
standards" for deciding what is "patently offensive" for minors under age 17.
But the cautious and complex ruling stops well short of upholding the 1998
law, which has never been enforced and will not be, at least for now. The
court sent the case back to the lower court to address other free-speech
challenges, which could take years. The case illustrates the difficulty
that the federal government has encountered in trying to corral portions of
the unruly Internet without trampling the 1st Amendment.
"The scope of our decision today is quite narrow," Justice Clarence Thomas
wrote. "We hold only that [the Child Online Protection Act's] reliance on
community standards to identify 'material that is harmful to minors' does
not by itself render the statute" unconstitutional under the 1st Amendment.
Magazine and book publishers must live with this "community standards"
rule, Thomas said, and the 1st Amendment does not create a special
exemption for commercial Web sites.
"If a publisher chooses to send its material into a particular community,"
he wrote, "it is the publisher's responsibility to abide by that
community's standards."
But the justices, while concurring in a lopsided vote, were unusually
divided in their opinions. Paradoxically, several in the majority did not
appear to support the notion of allowing local standards to govern the
Internet.
In the end, the court neither upheld the law nor allowed it to be enforced.
Instead, it sent the matter back to the lower court in Philadelphia to
consider other free-speech challenges to the law.
Nonetheless, Monday's decision takes a step, albeit a tentative one, toward
allowing the first federal restrictions on what can be transmitted via the
Internet.
In its short history, the Internet has been championed as a truly free flow
of communication and information. But with that freedom has come a large
volume of hard-core pornography, much of which can be accessed by children
and teenagers. In 1998, there were about 28,000 adult sites promoting
pornography on the Web.
Shielding Children From Online Porn
Congress has been determined to block children from clicking onto sexually
explicit Web sites.
In 1996, the Communications Decency Act made it a federal crime to send
sexually explicit messages or communications to anyone under age 18.
A year later, however, a unanimous Supreme Court struck down that law,
saying it was too broad and vague. It raised the specter that someone could
be prosecuted for sending an e-mail or posting a provocative photo that was
subsequently brought up on a minor's computer screen thousands of miles away.
Undaunted, Congress in 1998 adopted a narrower, more specific ban on
computer transmissions that are "harmful to minors."
The Child Online Protection Act applied only to material posted on the Web
"for commercial purposes." This excluded e-mails as well as Web sites that
are entirely free. It is not clear, though, whether the law meant to
include sites that are free to users but make money from advertisements.
The new law lowered the age of protected minors to those under 17.
It also adopted a new, three-part definition of what is banned. It covered
"sexual acts" and "lewd exhibitions" that the "average person, applying
contemporary community standards, would find, taking the material as a
whole and with respect to minors, is designed ... to pander to the prurient
interest and ... lacks serious literary, artistic, political or scientific
value for minors."
The law also sought to shield Web sites that screened out minors by, for
example, requiring them to supply a credit card number.
Nonetheless, before it could go into effect, the American Civil Liberties
Union went back to court in Philadelphia to challenge it as unconstitutional.
The ACLU's clients included Salon magazine, which feared it could be
prosecuted because of its sexual advice column, and the owners of a gay and
lesbian bookstore, which posted material on its Web site. While neither
site sought to appeal to minors, their sponsors say they could not easily
prevent them from tapping into the sites.
When freedom of speech is at issue, the Supreme Court has been willing to
block new laws based on how they might be used, even when no one has been
prosecuted.
Last month, the justices struck down a portion of the Child Pornography
Prevention Act of 1996 that made it a crime to own or sell "computer
generated" images of children engaged in sexual acts, saying it could apply to
animated figures or purely imaginary scenes.
Similarly, the ACLU said the Child Online Protection Act should be struck
down before it goes into effect because it could chill communication on the
Internet.
Its lawyers argued the 1998 law had the same flaw as the 1996 version.
Since Web sites cannot know who will access their material, they are held
responsible if minors click on to it, they said.
'Most Puritan' Community Is Cited
A federal judge in Philadelphia agreed and blocked the law from taking
effect. The U.S. appeals court there ruled the law unconstitutional because
it would allow the "most puritan" community in the nation to regulate the
Internet.
For example, prosecutors could bring charges in Provo, Utah, against a San
Francisco-based Web site, the appeals court judges said.
The Justice Department appealed in the case of Ashcroft vs. ACLU, 00-1293,
and won a reversal.
Those who fear prosecution for distributing pornography "need only take the
simple step of utilizing a [different] medium that enables it to target the
release of its material," Thomas wrote.
Despite the near-unanimous vote and Thomas' strong words, most of the
justices did not appear to agree on what was said. Only Chief Justice
William H. Rehnquist and Justice Antonin Scalia joined Thomas' opinion in full.
Justice Sandra Day O'Connor signed it but said she believed there must be a
national standard for what is harmful to minors. Justice Stephen G. Breyer
wrote a separate opinion making the same point.
Three others--Justices Anthony M. Kennedy, David H. Souter and Ruth Bader
Ginsburg--voted with the majority but joined a separate opinion casting
doubt on the constitutionality of the law.
Justice John Paul Stevens dissented and said the law should be struck down.
"It is quite wrong to allow the standards of a minority consisting of the
least tolerant communities to regulate access to relatively harmless
messages in this burgeoning market," he said.
Both Sides in Debate Claim a Victory
Reflecting the split outcome, both sides said they were pleased.
Barbara Comstock, a Justice Department spokeswoman, said officials there
were "pleased the Supreme Court has vacated the decision" of the lower
court. It takes a step toward "keeping our nation's children safe from
viewing the pornography for sale on the Internet."
Meanwhile, the ACLU's lawyer, Ann Beeson, said she was pleased the court
did not allow the law to go into effect.
"The court clearly had enough doubts about this broad censorship law to
leave in place the ban, which is an enormous relief to our clients," she said.
When the case goes back to Philadelphia, the ACLU lawyers will renew the
main argument that the effort to restrict content on the Internet is
hopelessly flawed.
Since sponsors of Web sites do not know who is downloading their material,
they should not be punished if a minor accesses it, they say.
Further, if this restriction becomes law, it will force Web sites to post
only material that is suitable for minors, and the 1st Amendment does not
allow the government to impose such a restriction on American adults, they
argue.
********************
Federal Computer Week
Defining 'best value'
Federal employee unions and other groups that oppose increased outsourcing
of government services are losing ground fast. Although they may not like
the direction things are going, they had better pay attention.
Bush administration officials say the decision to change the rules for
competing work between the public and private sectors as recommended
earlier this month by the Commercial Activities Panel is not intended to
increase outsourcing, but to define a fairer process for measuring the
costs of commercial-like services.
Yet most people agree that addressing problems in the current process,
defined by Circular A-76, will likely make it easier for vendors to win
bids for government work. Industry has long complained that A-76 does not
accurately account for agency costs and that it favors low-cost bids, which
flies in the face of the government's interest in so-called best-value
proposals.
Clearly, the White House intends to take a more business-like approach to
delivering government services. In cases where commercial firms can provide
comparable services at a lower cost, industry will likely win hands down.
It's trickier when it comes to evaluating bids for best value, rather than
lowest cost. In these cases, commercial firms try to prove that they can
provide better service at a comparable cost, or higher, if warranted.
The concept of best value, though, raises another issue that should not be
forgotten: Government is not a business.
*********************
Federal Computer Week
OMB checks card use
Regardless of whether they have been cited for abuses, all federal agencies
have until June 1 to submit a review and remedial plans for government
credit card use to the Office of Management and Budget.
OMB sent a memorandum April 18 requesting the plans and will use the
information to determine the next course of action. The call comes after
months of publicity surrounding the SmartPay cards because of credit card
abuses at certain agencies, said Joseph Kull, deputy controller of OMB's
Office of Federal Financial Management.
Speaking May 7 at the 2002 Visa Government Forum in Washington, D.C., Kull
said the now notorious example of a government employee at the Space and
Naval Warfare Systems Command using a government-issued card to buy breast
implants for his girlfriend "brings tremendous pressure on the program" and
paints a picture of an "irresponsible government," although the cards
mostly have benefited agencies since the program's launch in 1989.
The federal government spent almost $14 billion in fiscal 2001 with the cards.
OMB will be reviewing the agencies' plans, but it's too soon to tell if
quarterly reports on the purchase cards will be required in the future,
Kull said, adding that he'd like to see agencies with good programs
rewarded. "Not everyone should bear the same level of burden," he said.
Excusing agencies with well-run programs from submitting reports would be
an "incentive for good management."
Sue McIver, director of the General Services Administration's Services
Acquisition Center, which oversees the SmartPay program, said she agreed
with agencies, such as the Defense Department, that publicize egregious
card abuses and prosecute offenders. This "lets others know those actions
won't be tolerated."
*************************
New York Times
Museum's Cyberpeeping Artwork Has Its Plug Pulled
By MATTHEW MIRAPAUL
An Internet-based artwork in an exhibition at the New Museum of
Contemporary Art was taken offline on Friday because the work was
conducting surveillance of outside computers. It is not yet clear who is
responsible for the blacking out (the artists, the museum or its Internet
service provider), but the action illuminates the work's central theme: the
tension between public and private control of the Internet. The shutdown
also shows how cyberspace's gray areas can enshroud museums as they embrace
the evolving medium.
The work in question is "Minds of Concern: Breaking News," created by
Knowbotic Research, a group of digital artists in Switzerland. The piece is
part of "Open Source Art Hack," an exhibition at the New Museum that runs
through June 30. The work can be viewed as an installation in the museum's
SoHo galleries or online at newmuseum.org. Although the installation is
still in place, and the work's Web site remains live, the port-scanning
software that is its central feature was disabled Friday evening and was
inactive yesterday afternoon.
Port scanning sounds like a cruise-ship captain's task. The term actually
refers to a technique for surveying how other computers are connected to
the Internet. The software essentially strolls through the neighborhood in
search of windows that have been left open. Merely noticing where they are
is no crime. Things get dicier, though, if what is seen is conveyed to a
ne'er-do-well relative, who then breaks in somewhere, rearranges the
furniture and makes off with a gem-encrusted putter.
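The "strolling through the neighborhood" the article describes can be sketched in a few lines. This is a minimal, hypothetical illustration of a TCP connect scan (not the software used in the artwork): it simply asks each port whether it will accept a connection, which is the "noticing open windows" step and nothing more.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Try a TCP connection to each port on host; return those that accept."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds,
            # i.e. the "window" on that port was left open.
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Survey a few well-known ports on the local machine only.
print(scan_ports("127.0.0.1", [22, 80, 443]))
```

As the article notes, running this against machines you do not control may violate your Internet service provider's terms of use even where no law is broken.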
One court has ruled that port scanning is legal so long as it does not
intrude upon or damage the computers that are being scanned. Internet
service providers, however, generally prohibit the practice, which can
cause online traffic jams. That prohibition appears to be what led to the
shutdown.
After the Knowbotic work started its peeping, the Internet service provider
for one of the targets of the scan complained to the museum's Internet
service provider, Logicworks. In turn, Logicworks notified the museum that
port scanning violated its policies. On Friday, Lauren Tehan, a museum
spokeswoman, said the museum was seeking a creative technical solution to
keep the work online.
That effort did not succeed. Ms. Tehan said the museum, at Logicworks'
request, shut down the work after the museum closed on Friday evening. On
Saturday morning, Christian Hübler of Knowbotic Research said the group
realized the port-scanning software had been disabled and decided to move
the work's Web site to an Internet service provider in Germany. Ms. Tehan
said that the museum suggested a way to put the work back online but that
Knowbotic rejected the proposal.
The dispute calls attention to one of the very points the piece is intended
to make. Because the lines between public and private control of the
Internet are not yet clearly defined, what artists want to do may be
perfectly legal, but that does not mean they will be allowed to do it.
Before the New Museum exhibition opened on May 3, Knowbotic Research had
already decided to remove the most troublesome features of the
port-scanning software. Mr. Hübler said the group changed the work after
consulting with a lawyer who specializes in Internet law. "I wanted to know
the situation I'm in," Mr. Hübler said, "because when I work with the
border as an artist, I want to know at least what the border might be."
When it is functioning, "Minds of Concern" resembles a slot machine.
Viewers are prompted to scan the computer ports of organizations that
protested in February against the World Economic Forum. While colored
lights flash, a list of the vulnerable ports and the methods that might be
employed to "crack," or penetrate, them to gain access to private
information scrolls across the bottom of the screen. No internal
information is exposed, but the threat is suggested.
European digital artists are more politicized than their American
counterparts, and "Minds" is designed to advance a social agenda. By
choosing to explore the computers of anti-globalization groups instead of
Nike or Coca-Cola, Knowbotic is warning those groups that they are at risk
of losing sensitive data.
But to present the work at the New Museum, Knowbotic had to defang it. At
first, the group reviewed the 800 tools in the port-scanning program and
removed 200 it deemed intrusive or malicious. After consulting with a
lawyer, the group then encrypted the name of the organization being scanned
because it was unsure if publishing the information was illegal. In place
of the name on the screen, one saw the phrase "artistic self-censorship."
The group's disappointment in having to scale back the work was obvious in
a message to an electronic mailing list: "Due to the ubiquitous paranoia
and threat of getting sued, the museum and the curators made it very clear
to us that we as artists are 100 percent alone and private in any legal
dispute."
There is a sense of a missed opportunity here. The dozen works in "Open
Source Art Hack" are intended to prompt discussion about the public versus
the private in cyberspace while demonstrating how artists "hack," or misuse
technology, to creative effect. Port-scanning software, for instance, is
meant to be used for reconnaissance, yet Knowbotic has made it a political
tool.
But "Minds of Concern" is also the only online work in the exhibition to
operate in a legal gray area. In its fully functional state, it had the
potential to cause a ruckus that might have yielded some black-and-white
rulings. But instead, the exhibition commits no real transgressions.
Steve Dietz, the new-media curator at the Walker Art Center in Minneapolis,
was one of the exhibition's curators. Its goal, he said, "was more nuanced
than bringing cracking to the dull havens of a museum."
"Being bad and doing something illegal hold very little interest for me,"
he said, "but being tactical and creative hold a great deal."
Artists like to be bad, and although museums are sometimes their targets,
they can also serve as shields when artists become controversial. A recent
example was the exhibition "Mirroring Evil: Nazi Imagery/Recent Art," for
which the Jewish Museum, not the participating artists, took most of the heat.
As museums embrace cyberspace, its fuzzy rules are posing unfamiliar
problems, and "Minds of Concern: Breaking News" is a case in point. As for
how well those issues can be raised within a museum's walls, Lisa Phillips,
director of the New Museum, said: "That really is the dilemma. We can only
go so far."
******************
USA Today
RealNames online 'keyword' system shuts down
NEW YORK (AP) RealNames is shutting down its alternative naming system for
the Internet after Microsoft decided to stop incorporating the system in
its Internet Explorer browsers.
The decision means that users who had reached certain Web sites through
shortcuts from RealNames will need to type in the full address or use a
search engine. Users who had typed in Chinese or Japanese addresses may
find their sites unreachable.
The shutdown takes effect June 30, when the current partnership between
RealNames and Microsoft ends. RealNames laid off its 83 employees Friday,
though some will serve as consultants during the transition.
Although the RealNames system was designed independent of any specific
browser, it needed a major platform like the Microsoft browser to make it
possible for users to recognize keywords.
Normally, to reach the Web site for Eastman Kodak, users would type in
"www.kodak.com" in the address field of their browsers. RealNames allowed
users of Microsoft's Internet Explorer to reach the site simply by typing
"Kodak."
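The mechanism behind the Kodak example is essentially a lookup table consulted before normal address handling, with a search-engine fallback (which is what Internet Explorer reverted to after the shutdown). A minimal sketch, assuming hypothetical names (`KEYWORD_REGISTRY`, `search.example.com` are illustrations, not RealNames' actual service):

```python
# Hypothetical keyword registry: plain keyword -> full URL.
KEYWORD_REGISTRY = {
    "kodak": "http://www.kodak.com",
}

def resolve(address_bar_input):
    """Resolve address-bar input to a URL via keyword lookup or search fallback."""
    key = address_bar_input.strip().lower()
    if key in KEYWORD_REGISTRY:
        return KEYWORD_REGISTRY[key]
    # Unregistered keywords fall back to a search-engine query.
    return "http://search.example.com/?q=" + key.replace(" ", "+")

print(resolve("Kodak"))  # http://www.kodak.com
print(resolve("San Francisco realtor"))  # falls through to search
```

The registry lived on RealNames' servers rather than in the browser, which is why the keywords stopped resolving once the company shut the service down.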
Critics have questioned the proprietary nature of the RealNames system,
which essentially runs on top of the Internet's existing domain name system
built on open standards.
Plus, many browsers now incorporate search functions, so that typing
"Kodak" into other browsers would also get Kodak's site. After RealNames
shuts down, Microsoft will simply have those keywords go to a search engine
as well.
The biggest impact may be on non-English users who had relied on RealNames
to link foreign language keywords with Web addresses that use English
characters understood by the Internet domain name system.
VeriSign, which has been offering foreign names ending in ".com," ".net"
and ".org," now must find another company to fill the void. VeriSign
spokeswoman Cheryl Regan said the company was exploring its options.
RealNames and Microsoft disputed the reasons for the shutdown.
RealNames founder Keith Teare blamed it on Microsoft wanting more control
over the searching process.
Microsoft spokesman Matt Pilla said the system created some user
confusion: someone typing "San Francisco realtor" got a site on loans, not
a broker.
*******************
Government Executive
Intellectual property rights issues hamper contracting process
By Maureen Sirhal, CongressDaily
Industry and Bush administration officials on Friday told a House panel
that federal procurement officers require more training on the intellectual
property (IP) issues involved in government contracts if they want to
attract more commercial firms for information technology work.
The increasing role of technology in homeland security means that
government agencies will need to outsource more projects to private firms.
But many major IT firms fear that they will lose their intellectual
property rights while doing business with the government.
"The pervasive view in my experience," said Richard Carroll, CEO of
Virginia-based Digital Systems Resources, "is one of 'We paid for it, we
own it.'" In other words, he told members of the House Government Reform
Subcommittee on Technology and Procurement Policy, "the government owns the
intellectual property rights to any research and development funded with
government dollars."
If that mindset makes commercial firms fear government work, experts said
efforts like homeland security might suffer because agencies will not be
able to procure the best technologies.
The law that sets the framework for the transfer of government-developed
technologies to the commercial sector instructs agencies to gather only the
minimum IP rights necessary for a given project. But some firms are
reluctant to relinquish rights to work on projects from which they could
not benefit in the marketplace. Meanwhile, procurement officers are
inclined to negotiate contracts that give agencies all the IP rights for
research that they fund.
Jack Brock, managing director of acquisition and sourcing management at the
General Accounting Office, said that a GAO study released Friday found that
companies generally are reluctant to contract with the government because
of IP concerns that stem in part from "poor definitions" of the
government's needs. Firms also perceive government contracting officials as
inflexible.
Ben Wu, the Commerce Department's deputy undersecretary for technology, and
industry panelists said the biggest problem is that procurement officers
are unaware of the flexibility they do have in negotiating IP licenses with
contractors. Inadequate training and expertise on IP issues also leave them
uncertain of how much authority they have to negotiate deals, panelists said.
Subcommittee members questioned whether current procurement law should be
changed.
Tony Tether, director of the Defense Advanced Research Projects Agency,
said that while the law has worked "reasonably well," it does result in IP
licensing agreements that are "largely determined by regulations." He noted
that the law also fails to address a large part of intellectual property:
trade secrets.
Wu, however, said that "current policy is sufficient and allows for trade
secret protection," and that modifications would be a "major policy shift."
He said some of the problems with the law might be the result of poor
implementation, which is tied to employee training.
Panelists supported a suggestion made by Subcommittee Chairman Tom Davis,
R-Va., to designate training for procurement officers dealing in
specialized IP areas.
********************
Government Executive
Defense research agency seeks return to 'swashbuckling' days
By William New, National Journal's Technology Daily
The Defense Advanced Research Projects Agency (DARPA, originally ARPA) was
created in 1958 as the U.S. response to the Soviet launching of the Sputnik
satellite. The agency, which among other things developed the technologies
leading to the commercialization of the Internet, works to create
cutting-edge military technologies.
DARPA Director Anthony Tether was appointed June 18, 2001, and has worked
in government and the private sector, serving previously as director of
DARPA's Strategic Technology Office from 1982-86. Tether recently spoke
with William New of National Journal's Technology Daily.
Q: How does DARPA relate to the Department of Defense?
A: We're a defense agency. We report to the secretary of Defense, as does
the Army or the Air Force. ... What this secretary of Defense [Donald
Rumsfeld] has done is, from a policy viewpoint, he has decreed that 3
percent of the Defense Department [budget] would go into what is known as
the science and technology budget. He also has decreed that 0.8 percent, or
roughly 25 to 30 percent of the S&T budget will go to DARPA.
He wants ... an organization that constantly brings forth new things that
other people aren't doing. And that's generally DARPA.
Q: You've been here about a year. What were your goals coming in, and are
those evolving?
A: Well, the goals I had coming in ... were two things. One was that [the
top defense officials] all looked upon DARPA as an organization that was
always doing the craziest things. But they felt that DARPA, over the last
administration, had really lost a lot of its forward-looking luster.
What they wanted was to not transform DARPA to anything new, but to
transform it back to something old. But they wanted it to become like it
was in the early '70s and '80s, where it was a swashbuckling place, where
program managers were constantly getting the director in trouble with ideas
and never taking "no" for an answer.
The other thing was space. DARPA was born in the space age. The other goal
was: Go do innovative things in space.
DARPA really does not have an organization in a classical sense. It's 140
program managers all bound together by a common travel agent. Our program
managers are only here for four to five years. [Program managers] come up
with an idea, they create the program plan, they compete internally for the
money. We don't do anything within DARPA; we contract all our money out.
They go out and get contractors who execute that plan.
Q: Are you trying to create the next Internet?
A: We're even doing better than that. ... We still have the original
ARPA/DARPA thrust of creating a cognitive computer system.
Now we cycle forward 30-some-odd years. In 1965, there were like 50
transistors on the chip. We are approaching a billion transistors on a
chip, a chip being something like a third of an inch by a third of an inch.
This approaches the density of cells in your brain.
Q: How will you make your mark on DARPA?
A: It's the easiest place in the world to change. The reason is we turn
over people at a rate of 25 percent a year, which means after I've been here
a year, 25 percent of the agency will be people that I personally had a
hand in hiring. ...
Q: How has DARPA's mission changed since Sept. 11?
A: DARPA's only real stated charter is to prevent technological surprise.
That has not changed, except we have modified the vision so that now we
feel our job is not to just prevent technological surprise, but to create
technological surprise.
DARPA was [already] working the terrorism and counterterrorism problem. We
were developing the capabilities to be able to take inputs, financial
transactions, and come up with networks: who's talking to whom.
We also had behavioral-science types of projects where we were trying to
predict what a group would do. We also had efforts in foreign-language
translation. We had people who were already interested in the subject; we
just put them together in the same location and found an office director
[former Reagan White House National Security Adviser John Poindexter].
We are going to develop the capability to detect, track, figure out intent
and pre-empt terrorists.
Q: What do you need Congress to do?
A: We need Congress ... to understand that patience does pay off. Having an
organization like DARPA, it's constantly developing new capabilities that
may not turn into real capabilities for five years, 10 years or more.
********************
Government Executive
Homeland security effort boosts e-gov initiatives
By Teri Rucker, National Journal's Technology Daily
"E-gov"--shorthand for the information technology-based initiative intended
to make it easier for individuals to access government services, while also
cutting costs--has been expanding throughout the past decade, as the
Internet has reached into the homes of rank-and-file citizens. But it is
receiving heightened attention since the Sept. 11 attacks.
OMB is overseeing electronic government efforts at the federal level, and
Mark Forman, associate director for information technology at the Office of
Management and Budget, recently told a House subcommittee that e-government
plans are integral to homeland security.
"Today, the federal government has only scratched the surface of the
e-government potential," Forman told the subcommittee. "Basic management
principles tell us that government operating costs will go down and
effectiveness will go up if we make it simpler for citizens to get service."
The Bush administration has budgeted $50 million in fiscal 2003 for 24
e-government initiatives designed to eliminate redundant, non-integrated
business operations and to make agencies more responsive to U.S. citizens.
But, to implement these changes, officials first must overcome several
barriers, including conflicting agency cultures, as well as a lack of both
resources and a trust in existing electronic systems, according to a recent
GAO study.
For e-government to be effective, Forman said, the government must ensure
that citizens feel safe using the Internet. He added that agencies would
provide that sense of safety by incorporating privacy and security
protections, providing public training and offering e-authentication.
The administration's proposed fiscal 2003 budget also outlines many
problems with e-government implementation, grading each agency on its
e-government performance. None of the agencies received a passing score,
nine earned a yellow mark--which means that some of the criteria have been
met--and 17 agencies rated a red score, indicating at least one serious
flaw in their practices.
For example, the budget document contended that management of information
technology investments is the Energy Department's "weakest link," and
because the agency is consolidating its IT portfolio under a chief
information officer, it was "impossible to evaluate compliance with
e-government standards."
The Sept. 11 terrorist attacks also put the spotlight on protecting the
nation's critical infrastructure, including its information technology
systems. But a spokeswoman for the Commerce Department's Critical
Infrastructure Assurance Office said that budget was not divided into
specific components.
However, agencies charged with overseeing the nation's energy and food
supplies are given funds to combat terrorism under the Bush budget. The
budget allocates $451 million to the Health and Human Services Department
for this purpose; the Agriculture Department would receive $195 million,
and the Energy Department, $194 million.
***********************
Government Executive
Federal buildings told to improve security of air systems
By Shane Harris
sharris@xxxxxxxxxxx
The Health and Human Services Department has recommended that federal
agencies tighten physical security around building air intake and
ventilation systems to minimize the threat of chemical, biological and
radiological attacks.
The guidelines on protecting buildings from airborne agents are part of the
department's continuing efforts to protect public health since the Sept. 11
terrorist attacks and the ensuing anthrax scare, according to HHS Secretary
Tommy Thompson. "These guidelines offer practical advice to building
owners, managers and maintenance staffs on the steps they can take to
protect their ventilation systems," he said in a statement.
The new recommendations suggest that building managers ensure that access
to internal operations systems and building design information is
restricted and that they assess the emergency capabilities of those
systems. Building managers should also inspect air filters to ensure
they're performing correctly and adopt more preventive maintenance
procedures, the guidelines state. Still, building managers should not be
overly cautious, the guidelines warn. For example, maintenance personnel
should avoid possibly detrimental measures like permanently sealing outdoor
air intakes.
"This guidance offers reasonable and practical measures to reduce the
likelihood of a contaminant attack and to minimize the impact if one
occurs," Office of Homeland Security Director Tom Ridge said in a
statement. Ridge's office offered input on the recommendations, which were
prepared by the Centers for Disease Control and Prevention's National
Institute for Occupational Safety and Health, with the cooperation of more
than 30 federal agencies and state and local professional associations. The
guidelines were also distributed to private companies.
The guidelines attempt to offer practical advice for buffering existing
security systems or adding ones where none exist. On some level, all
buildings have similar needs, they say. "Preventing terrorist access to a
targeted facility requires physical security of entry, storage, roof and
mechanical areas, as well as securing access to the outdoor air intakes of
the building," the guidelines say.
Yet each building's security needs must be addressed individually. Most
buildings could use low-cost security measures, such as locks on doors to
mechanical rooms. Other, more expensive steps, such as installing X-ray
equipment to examine packages, should be considered only if further steps
are warranted, the guidelines say.
*********************
Government Executive
May 10, 2002
Digital divide between agencies remains wide
By Carl M. Cannon, National Journal
World-changing marvels to us are only wallpaper to our children.
--Bruce Sterling, science fiction novelist and technology writer.
The federal government often doesn't change the wallpaper until it's so old
it's peeling off the walls. Waiting until then may seem to be less
expensive, but it's really not.
It's May now, the beginning of fire season out West. At the National
Interagency Fire Center in Boise, the pilots are readying their fleet of
spotting planes. Actually, "fleet" is probably not the right word: Three
planes--a King Air B-90, a Super King B-200, and a new Cessna Citation
Bravo, the Forest Service's first jet--patrol the skies. The area they
cover is truly vast. Not just the huge U.S. forests of Alaska and the
Western states, but all of North America. "We've been to fires in Ontario,
Canada, and as far south as Guatemala," Woody Smith, an electronics
technician in Boise said on May 8 while one of the planes was spotting
fires in New Mexico. "We could use a little more help, sure."
Those planes carry infrared cameras that can spot an 8-inch "hotspot" from
14,000 feet. They typically detect 10 to 15 fires while flying their
grids--although Smith has spotted as many as 30 in one night. That's the
good news. The bad news is that the planes have no satellite uplink and no
computer network into which they can download this information. The flight
crews relay the information to the fire bosses the same way they have since
the Vietnam War: They land the planes near the fires and hand the film to
an infrared interpreter. It can take from 30 minutes up to four hours to
get the information where it needs to be.
Here's how the U.S. Air Force does a similar task: It launches General
Atomics Aeronautical Systems-built Predators or Northrop Grumman-built
Global Hawks over the skies of Afghanistan and Pakistan at altitudes of up
to 60,000 feet, keeps them up for days on end, and transmits the pictures
they take via satellite to ships in the Persian Gulf. Both types of planes
are drones, meaning they're unmanned. The high-flying Global Hawk jets are
equipped with on-board computers, which control--and even land--the planes.
The Predators are steered by controllers on the ground, who can direct them
to fire weapons at military targets.
On Capitol Hill, the current buzzword for such state-of-the-art hardware
and the accompanying computer systems is "transformational" technologies.
They surely are. But what should Americans think, then, about that retro
fire-fighting gear in Idaho? Why can't Predators fly over the national
parks and zap small fires with chemical retardants before they grow into
dangerous wildfires? Science fiction writer William Gibson once explained
it this way: "The future is here. It's just not equally distributed."
Gibson was referring to the "digital divide" between well-off Americans and
the underclass, but his point has a broader context, which was brought into
clearer focus on September 11, the day the gulf between military technology
and, say, the Immigration and Naturalization Service's passé record-keeping
system suddenly seemed dangerously wide.
'Sideways' Scenarios
When Sterling uttered his lyrical phrase about wallpaper, he was addressing
the National Academy of Sciences' convocation on technology and education.
It was the first year of the Clinton administration. The Internet was still
a cool thing to invoke. Aware that Washington policy makers were pretty
chary of applying high-tech solutions to traditional social ills, Sterling
was positively evangelistic about the possibilities of a technological
future. The author of Zeitgeist and other books told the assembled wonks
that although novelists and futurists tended to weave best-case or
worst-case scenarios, in real life there are mainly "sideways-case
scenarios." He noted that the Internet began as a Cold War military project
but flourished as a tool for scholarly research, commerce, and play.
"It was designed for purposes of military communication in a United States
devastated by a Soviet nuclear strike--originally, the Internet was a
post-apocalypse command grid," Sterling said. "And look at it now! It's as
if some grim fallout shelter had burst open and a full-scale Mardi Gras
parade had come out. Ladies and gentlemen, I take such enormous pleasure in
this that it's hard to remain properly skeptical. I hope that in some small
way I can help you to share my deep joy and pleasure in the potential of
networks, my joy and pleasure in the fact that the future is unwritten."
That's one vision. But following Sterling to the dais that day--it was May
10, 1993--was a tall, thin figure with a somewhat darker worldview: William
Gibson himself. The leader of a generation of "cyber-punk" writers, Gibson
is the originator of the term "cyberspace," which he coined in his
acclaimed 1984 novel, Neuromancer. By then, Gibson had given considerable
thought to the technology Al Gore once dubbed the "information
superhighway." And he knew enough to be concerned.
"Realistically speaking, I look at the proposals being made here and I
marvel," Gibson said wryly in response to the computer-in-every-classroom
talk that dominated the seminar. "A system that in some cases isn't able to
teach basic evolution--a system bedeviled by the religious agendas of
textbook censors--now proceeds to throw itself open to a barrage of
ultra-high-bandwidth information from a world of Serbian race-hatred,
Moslem fundamentalism, and Chinese Mao Zedong thought."
Thus did the world's foremost science fiction writer reveal his skepticism
about human nature and his prescience: The sectarian fanatics who attacked
the United States eight years later, killing some 3,000 people,
communicated with each other through cyberspace, researched potential
targets in cyberspace--and continue to spew their hatred through
cyberspace. The murderers of Wall Street Journal reporter Daniel Pearl had
a free Hotmail account. The point is that technology itself is not a force
for evil or a force for good. "It's just a force," says Gregory Fossedal,
chairman of the Alexis de Tocqueville Institution, "that is determined by
the good or evil of human beings who use it."
There is also a parallel conundrum associated with technology: In a
free-market democracy with a tradition of individualism (the United
States), the very factors that help produce a stunning gusher of
innovation--unfettered intellectual freedom, a profit motive, the fact that
no one is really in charge--also serve to impede government's ability to
deploy such technologies to their maximum efficiency.
After America was attacked, President Bush and the top officials of his
government reordered their departments, their priorities, and their very
lives for the purpose of stopping terrorism. They reflexively turned to
cutting-edge technology to help them. "We think there's a market for these
products that are either on the research board or in the back of your
mind--or down the road," Office of Homeland Security Director Tom Ridge
told the Electronic Industries Alliance on April 23. "Biotech, infotech,
you name it. We're going to look to the technology sector."
Ridge is certainly correct to turn to Silicon Valley for help in solving
the nation's problems. But as it happens, incorporating technology wisely
was already Washington's challenge before September 11. There are numerous
technologies that, if deployed by government, would improve the quality of
life, preserve natural resources, save money, address seemingly intractable
social and economic problems and, in the process, fundamentally alter the
nature of the debate on age-old Washington political questions.
Consider the forest fires that those pilots in Boise are supposed to battle
with their 1960s technology. In 1988, after a devastating wildfire ravaged
one-third of Yellowstone National Park, the nation debated the wisdom of
the Park Service's "let it burn" policy. In truth, that policy--whatever
its ecological merits--has long been the de facto policy for the Forest
Service and the Bureau of Land Management in dealing with most major fires.
These agencies simply lack the wherewithal to put out large fires in
roadless areas. But the technology to fight them may already exist.
Today, during a big fire, Forest Service crews utilize pictures from
National Oceanic and Atmospheric Administration satellites to help set
their perimeters. But much greater technology is on hand. The oceanic
agency's future NPOESS sensors and NASA's existing Hyperion sensors are
capable of using hyperspectral imagery--colors not discernible by the human
eye. This means these satellites could determine the water content, and
thus the flammability level, of the nearby flora. They could measure wind
direction and thereby predict where a fire would be in an hour.
In addition, the Defense Department has recently signed contracts with
Lockheed Martin and TRW to launch a fleet of satellites that could change
everything: the SBIRS High and SBIRS Low programs. (SBIRS, pronounced
"sibbers," stands for Space-Based Infrared System.) SBIRS
Low will consist of some two dozen TRW satellites with amazing
capabilities. Their primary mission will be to detect missile launches, but
these fast-moving birds could do much more. They could not only identify a
fire as soon as it broke out, they could conceivably identify the human who
started it.
But they could do all this only if the Air Force agrees to share satellite
time, if the Forest Service buys the computers to process all this
information, and if it trains its people to use them--and so forth. "That's
where you'd need an advocate in the agencies," says Richard Dal Bello,
executive director of the Satellite Industry Association. "Is there
somebody there who knows enough to get it done? Is there anybody in
Congress who cares? That's the human dimension, where the magic we call
politics comes in."
Would this be expensive? Yes, but how much does a forest cost? Or a human
life? In 1994, nine smoke jumpers and five members of a Colorado "hotshot"
crew were killed when a small 50-acre blaze blew up into a 2,400-acre
wildfire on a July afternoon on Storm King Mountain. Those firefighters had
basically the same equipment (axes, saws, shovels, and parachutes) as the
13 smoke jumpers who died in Montana's infamous Mann Gulch fire--in 1949.
Add to the cost of human lives and wildlife and timber this number: $1
billion. That's how much the federal government spent fighting forest fires
in 1994 without computers.
Getting Smart
So-called "smart cards," which contain chips or microprocessors, have
offered a technologically feasible way to keep track of visitors to the
United States for more than five years. By including biometric information,
smart cards could provide much more security, serving as fake-proof identity
cards for everything from driver's licenses to passports and visas. Until
recently, civil libertarians chafed at the idea of national identity
cards, and state legislatures balked at the cost. That may be about to
change. On May 1, two Virginia congressmen, Republican Tom Davis and
Democrat James P. Moran, introduced a bill directing states to turn their
driver's licenses into smart cards. "We think what happened September 11
makes a compelling case to do it now," Moran said.
There is more where this came from. Existing technology could give
Americans smart cars, smart roads, smart energy meters--and much smarter
consumer medical devices. David J. Farber, a University of Pennsylvania
professor who is an expert in engineering and telecommunications, said one
example of a medical advance that is already feasible would be a chip
embedded in the arm of a person with Type 1 diabetes. Such a chip could
modulate the flow of insulin into the body far more precisely than today's
insulin pumps. It could also record and transmit minute-to-minute data
about blood-sugar levels, and dial 911 if a patient fell into insulin shock.
In an interview, Farber suggested that if Congress had more diabetics,
federal money for such systems would be plentiful. That's probably not
precisely the problem, but he is onto something about technology and
Capitol Hill. The most recent edition of Vital Statistics on Congress shows
that only one member of either the House or the Senate has an aeronautics
background, nine have engineering backgrounds, and 17 have backgrounds in
medicine. (Only one senator, Democrat Maria Cantwell of Washington state,
worked for an Internet company.) By contrast, 218 of the 535 members are
lawyers.
It stands to reason, therefore, that policy makers shy away from
technologies that give off even a whiff of Big Brother--and few of the new
potential technologies are as benign as Farber's diabetes chip. "Satellites
and computers have given us the ability to look at things and deduce what's
going on in a way that was never possible before," Farber said. "If you
looked at all the information available about people--where they spend
their money, who they talk to, who they meet, what they're reading, where
they are at any given time--you could prevent a lot of crime, and probably
terrorism.... But it's not an inexpensive proposition, and most Americans
wouldn't enjoy being looked at all the time. A lot of things you could do,
we aren't doing, because of the Constitution."
Yet, paradoxically, some of Americans' most cherished freedoms are at a
crossroads--and technology may be their salvation. The enduring image of
the 2000 presidential election is of beleaguered voting officials peering
in confusion at mangled voting cards, the detritus of ancient, low-tech
voting machinery that has no more place in modern America than spats and
spittoons.
"I mean, that's the right symbol!" said former House Speaker Newt Gingrich,
recalling those infamous hanging chads. "The average [error] rate in voting
in America is 1.6 percent. The average [accuracy] rate for an automatic
teller machine is better than 6-sigma, which is 99.9999. It's better than
that. Now, is there a hint here? We're not talking about the future. We're
talking about bringing government into the 21st century. That's all we're
talking about. Just catch up with all the things that occur in the consumer
world and occur in the business world today."
Gingrich's current passion is how transformational technologies in general,
and nanotechnology in particular, are the key to maintaining American
pre-eminence. In speeches and interviews emphasizing the need for
Washington to embrace a high-tech future, he's been known to casually
suggest a tripling of the education budget. "We are on the verge of
creating an extraordinary explosion of new solutions that will dramatically
improve our lives, our communities, and the delivery of societal and
governmental goods and services," he said at a recent American Enterprise
Institute seminar. Yet official Washington often resists technological
solutions. This is most problematic when government is the only entity that
can viably fund them.
Public education is an example of a program so big only government can
really pay for it. But bigness itself is part of the problem. Although
reading and comprehension test scores, especially among minority students,
have stagnated for a generation, the education establishment is too
unwieldy to quickly embrace high-tech solutions. One of the most promising
technologies emerged from studies into the workings of the brain by Michael
Merzenich of the University of California (San Francisco) and Paula Tallal
of the Rutgers University neuroscience center in Newark, N.J. After
discovering what portions of the brain are responsible for learning,
recognition, and memory, Merzenich coupled this research with modern
computer technology and launched Scientific Learning. The company, based in
Oakland, Calif., has produced a sophisticated learning program that
routinely raises student reading and comprehension skills by a full grade
level in six weeks. A 6-year-old sits at a computer, puts on headphones,
and does five separate exercises (disguised as computer games) for 20
minutes each. There are up to 900 levels for each exercise, which the
computer automatically calibrates to the individual student as it runs
through the progressions. The repetition would wear out a teacher, but the
computer doesn't mind.
The program is beginning to catch on as a teaching tool, but only on a
painstakingly slow, district-by-district basis. Will the Education
Department bless it? Company CEO Sheryle J. Bolton, citing the
accountability provisions passed in this year's so-called No Child Left
Behind education bill, hopes so. "We'll see," she said. "But 'No Child Left
Behind' certainly fits our mission statement."
Paul Saffo, a director at the Institute for the Future in Menlo Park,
Calif., says that for technology to slowly work its way up to the national
level in this way is now the norm. "There used to be a trickle-down effect
in technology," he said. "The government got the best stuff first--the
astronauts got what they needed, and then the rest of us got Tang for
breakfast. It doesn't work that way anymore. Now a 15-year-old buys a
supercomputer before his father--even if his father is a colonel in the
Pentagon--gets his hands on it. I'm not kidding. That supercomputer is
called Sony PlayStation 2. It's so powerful, if it was made in this country
you probably couldn't export it."
John Markoff, who covers Silicon Valley for The New York Times, once dubbed
this phenomenon the "inversion" of the computer business. Saffo calls it
the "bubble-up effect" and says he noticed it when he and some other
futurists toured the Navy aircraft carrier USS Enterprise. It may have the
same name as the famous ship of Star Trek, but the carrier is hardly
space-age. "We realized," Saffo said, "that with our PalmPilots and laptops
we had more computer power in our backpacks than they had on their whole ship."
The reasons for this include cumbersome procurement rules, the amount of
money the government is willing to pay--and, mostly, the scale of the
worldwide consumer market. The Pentagon might buy a couple hundred thousand
supercomputers, but Sony is selling millions. Also, the government's real
need is not handheld hardware; it's the development of massive,
self-organizing systems that constitute a new science in itself. This
discipline is known variously as "complexity science," or "complex adaptive
systems," or sometimes just plain old "chaos" theory.
One Santa Fe, N.M., company, BiosGroup, specializes in developing
"self-organizing systems" that solve the problem of informational
bottlenecks. Even before September 11, this firm had landed several
government contracts--DARPA, the Defense Advanced Research Projects Agency,
is particularly interested in applying complexity science--but now, even
laypeople in Congress understand how crucial it is. "We had enough
information [about the September 11 terrorists]. It was staring us in the
face," says James W. Herriot, vice president for science at BiosGroup. "But
somehow, nobody connected the dots together. We have billions of parts of
information, but not enough human brainpower to sort through it all. There
are lots of intelligent people in government, and they know that the system
we have is a disaster. It won't be fixed by cutting the INS in half,
although that might be politically satisfying. It will be solved by
engineering systems according to this new science."
Still another factor contributes to government's failure to embrace
state-of-the-art technology: the tricky nature of making decisions in the
fishbowl of an open democracy. "It's not that government is uniquely
unwise, it's that they're more accountable for failure," says Esther Dyson,
chairman of EDventure Holdings. "If a business tries something and it
doesn't work, they go on. If government does it, they get exposed and
ridiculed--voted out of office. The stakes are different in government."
Dyson found this out the hard way after she served as the first chairman of
ICANN, the Internet Corporation for Assigned Names and Numbers. It is a
nonprofit set up to bring order to the granting of Web addresses, but it
received continual criticism from every direction. Asked how it would be
possible to get government to be less risk-averse, Dyson turned the
question back onto her questioner--onto the media. "For one thing, you guys
have to stop jumping on them," she said. "Politics should be less vicious,
and there needs to be more understanding that we need courage and
experiment and innovation in government as well as in the private sector."
Gary Chapman, coordinator of the 21st Century Project at the Lyndon B.
Johnson School of Public Affairs at the University of Texas, echoes this
point. He also maintains that government has problems that private-sector
executives can't begin to imagine.
"The chief problem that government has is the same problem the phone
companies have: They can't deploy systems that break, or which at least
have a high degree of risk of breaking," Chapman said. "Not only do
government systems typically have to work tolerably well, they have to
serve everyone equitably and they have to be semitransparent in terms of
their development, budgets, and accountability."
Overcoming Skepticism
These two points underscore the reality that there is no exact
private-sector equivalent to midterm elections and the Electoral College.
Take the corporate average fuel economy standards, known as CAFE.
Automakers' light trucks are now required to average 20.7 miles per gallon,
while their passenger cars must average 27.5 mpg. Those standards haven't
been raised in a decade, because of the clout of the Big Three automakers
and their unions based in the battleground state of Michigan. In the wake
of September 11, there were calls, even from conservatives, for stepped-up
energy conservation, which the administration skirted by embracing
fuel-cell research, a promising but far-off antidote. Refusing to raise
CAFE standards was a purely political decision. It was not a science-based
decision--for the simple reason that the technology to vastly increase
conservation is already here. It was on display three months ago at the
Detroit auto show. There, Honda unveiled its Civic Hybrid, a car that
averages 50 mpg.
Sometimes the problem isn't politics. It's a lack of imagination. DNA
forensics is a glaring example, and a classic case of Bruce Sterling's
sideways-case scenarios. In 1975, an Oxford-trained English biochemist
named Alec Jeffreys attempted to clone a mammalian single-copy gene. He
failed, but he did manage to devise a method of detecting single-copy
genes. By September 1984, still pursuing biomedical research, Jeffreys,
then at the University of Leicester, successfully tested a system of
probing for genetic sequences in human DNA. The implications were
immediately obvious. "It was clear that these hyper-variable DNA patterns
offered the promise of a truly individual-specific identification system,"
he wrote.
Cops on both sides of the Atlantic began using DNA evidence in the
prosecution of rape and murder suspects. But with little federal guidance,
law enforcement authorities in the United States had no standardized rules
for collecting such evidence. Requirements varied from state to state, as
did the willingness to use DNA evidence as a tool for proving the innocence
of some jailed defendants. Some courts and prosecutors have been so hostile
to using DNA to help defendants that private do-gooders have filled the
breach. In Kentucky, a retired Presbyterian minister raised $5,000 for a
DNA test of a convicted rapist named William Thomas Gregory, who'd always
proclaimed his innocence. Gregory was telling the truth. Two years ago, the
test confirmed his innocence and he was released--the first Kentucky inmate
to be freed by DNA evidence. The Innocence Project, started in 1992 by
defense lawyers Barry C. Scheck and Peter J. Neufeld, has been instrumental
in freeing more than 100 inmates. In Virginia, authorities are proud of
their state-of-the-art Biotech Two forensics laboratory in Richmond, but it
took a large donation from crime novelist Patricia Cornwell to build it.
Virginia has the best DNA database in the country because a 1990 law
required DNA samples from every felon admitted into the state's penal
system. In most places, backlogs are the rule. New York City has more DNA
evidence sitting, untested, in "rape kits" in police stations' evidence
rooms than in its lab--meaning that untold numbers of rapists and killers
are roaming the streets for want of a single lab test. National
Organization for Women President Kim Gandy terms this situation a "national
disgrace," and she's not alone. In its fiscal 2003 budget, the Bush
administration asked Congress for a 100 percent increase in federal funds
to help local police departments reduce this backlog. Several lawmakers,
led by Rep. Jerrold Nadler, D-N.Y., want more. On March 13, Nadler proposed
increasing the current $50 million a year earmarked for this purpose over
the next two years to $250 million.
This would be a good start, as is the FBI's national database, CODIS. But
what a technologically savvy federal government would have done is
construct a new lab with a dedicated supercomputer that would clear up old
cases, reconcile claims of innocence with the evidence, and match up new
DNA evidence from current rape cases in real time.
If Congress had a sense of irony, it could name such a facility the
Buckland-Pitchfork DNA Laboratory, after the two Britons whose stories
rival any plot a science fiction writer could weave. Richard Buckland was
the feeble-minded 17-year-old cleared of rape and murder after Alec
Jeffreys's lab established that DNA from semen on two 15-year-old girls was
not his. Colin Pitchfork, a local flasher, was the man nabbed after
English detectives required 5,000 men in three villages to submit to DNA tests.
There are all kinds of reasons, not the least of which is the Fourth
Amendment prohibition against unreasonable search and seizure, why cops in
the United States couldn't round up 5,000 men for DNA testing. There are
also legitimate reasons, aside from technophobia, why juries don't
automatically believe the government's forensic experts. It has been left
to the nation's front-line prosecutors to tackle this skepticism head-on.
One of them, San Diego County Deputy District Attorney George "Woody"
Clarke, has come up with an admonition that could be posted in procurement
offices and appropriating committees across Washington.
"Anything done by humans is subject to error," he says. "The technology
itself is never wrong."
*********************
Nando Times
Computer-based artificial societies may create real policy
By CHRISTIAN BOURGE, United Press International
WASHINGTON (May 12, 2002 3:35 p.m. EDT) - Agent-based modeling is an
emerging computer technology that holds promise as a powerful tool for
analyzing policy problems - and experts in the field believe it has the
potential to fundamentally change the way social scientists and economists
test theories, examine data, and create new policies.
But proponents of these software-based models - which simulate how people
and events interact, and can anticipate the outcomes of systems or policies
that may be put into place - say that despite the
wealth of potential applications, and the immense growth of this new
science in recent years, think tanks and policy researchers have been less
than enthusiastic in their embrace of this new technology as a research
instrument.
"It is fair to say that at this point this is a new way to do business and
the full ramifications for policy are not fully developed yet," Robert
Axtell told United Press International. Axtell is a fellow in economic
studies at the Brookings Institution's Center on Social and Economic
Dynamics, and a prominent researcher in the area of agent-based modeling.
"There are, however, some models that seem to be extremely salient for
policy makers, but it is unclear how agent modeling can outperform a
standard statistical model of analysis."
At its most basic, agent-based modeling uses a software program that
autonomously decides what courses of action the individual players in the
program will take in any situation being simulated. Researchers are using
such programs to develop independently functioning artificial societies, to
test theories about problems as diverse as the disappearance of
long-vanished civilizations and why genocidal behaviors develop within
existing societies, and to test theories about how best to police civil
conflicts.
In such an artificial society program, individual artificial persons - or
agents - are given set roles, and are programmed to exhibit a number of
possible changes in their behavior. As the program runs, the software is
designed to randomly decide what changes will occur and which agents will
make them in a given situation.
The overall idea is for the artificial society to run within the computer
as a real society would, with each agent (person) acting independently and
reacting to any number of stimuli or situations that are designed to mimic
real influences on human decisions.
By programming any number of possibilities into the software based upon the
data gathered about a specific policy problem, and examining the different
potential outcomes that emerge, proponents of the technology believe that
policy analysts can use the tool to actually create and test potential
policy solutions to real-world problems.
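As a concrete (and entirely hypothetical) sketch of the mechanism just described, the toy Python program below gives each agent one binary behavior, lets it react each step to a random sample of the society, and reads off the aggregate outcome at the end. It is not based on any of the researchers' actual software; every name and parameter here is invented for illustration.

```python
import random

class Agent:
    """One artificial person: a fixed susceptibility and a mutable behavior."""
    def __init__(self, rng, adopt_prob):
        self.rng = rng
        self.adopted = False          # the behavior being simulated
        self.adopt_prob = adopt_prob  # how strongly neighbors influence it

    def step(self, neighbors):
        # React to stimuli: the more neighbors have adopted, the likelier
        # this agent is to adopt too. The software decides randomly.
        if not self.adopted:
            share = sum(n.adopted for n in neighbors) / len(neighbors)
            if self.rng.random() < share * self.adopt_prob:
                self.adopted = True

def run_society(n_agents=100, n_steps=50, seed=42):
    rng = random.Random(seed)
    agents = [Agent(rng, adopt_prob=0.5) for _ in range(n_agents)]
    for early in rng.sample(agents, 5):   # seed a few early adopters
        early.adopted = True
    for _ in range(n_steps):
        for agent in agents:
            # Each step, an agent sees a small random slice of the society.
            agent.step(rng.sample(agents, 10))
    return sum(a.adopted for a in agents)  # aggregate outcome of the run

print(run_society())
```

Because the run is driven by a seeded random generator, repeating it reproduces the same outcome, while changing the seed or the parameters yields a different trajectory to examine.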
"Agents are good because your agents are designed to represent the behavior
of observed social actors, individual firms or departments," says Scott
Moss, director of the Centre for Policy Modeling at Manchester Metropolitan
University in the United Kingdom. "You can design an agent to impart their
behaviors in a way in which they interact with other social entities. You
cannot do that with conventional economics or social science, and it can
inform the range of options or range of possibilities on a given initiative
that policy makers might take."
For example, if there are several competing potential changes that
policymakers can institute in the U.S. social security program, researchers
can take all existing information about the individuals and institutions
affected by the government's program - retirees, the federal treasury,
those who care for the aging, portions of the society that rely upon the
spending of the retired, and others - and design a program where all the
observable reactions that each individual agent can have are programmed
into the software-based society.
The program is then run, with each agent reacting to whatever changes are
made. The potential outcomes - market, governmental or otherwise - can then
be analyzed to decide which policy change would be the most effective.
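A toy version of that comparison might look like the sketch below. Every number in it (wages, savings rates, benefit levels, the poverty line) is invented for illustration; the point is only the workflow the article describes: build one artificial society, run it under competing policy settings, and compare the aggregate outcomes.

```python
import random

def simulate(policy_benefit, seed=1, n_agents=200, n_years=30):
    """Toy retirement society: each agent works, saves, then retires on
    savings plus a government benefit. Returns the fraction of retirees
    below a made-up poverty line. All figures are illustrative."""
    rng = random.Random(seed)
    poor = 0
    for _ in range(n_agents):
        wage = rng.uniform(20, 60)            # thousands of dollars per year
        save_rate = rng.uniform(0.02, 0.10)   # agents differ in thrift
        savings = 0.0
        for _ in range(n_years):
            savings = (savings + wage * save_rate) * 1.03  # modest return
        income = policy_benefit + savings / 20  # draw down over 20 years
        if income < 15:                         # poverty line, in thousands
            poor += 1
    return poor / n_agents

# Run the same artificial society under two competing benefit levels.
low_benefit = simulate(policy_benefit=8)
high_benefit = simulate(policy_benefit=14)
print(low_benefit, high_benefit)
```

Because both runs use the same seed, they contain an identical population of agents, so any difference in outcome is attributable to the policy change alone.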
Axtell believes that agent-based modeling is a tool that has unlimited
potential for assessing policy decisions.
"Policymakers seem to be very amenable to this because they can see it and
ask 'What if you change X, Y or Z?' and we can say, we don't know but we
can try it," said Axtell.
One simulation designed by Joshua Epstein, a colleague of Axtell's at
Brookings, used existing anthropological and meteorological data to come up
with plausible reasons why the Anasazi Indian tribe abandoned the Long
House Valley in Arizona around AD 1300.
Other researchers have created artificial societies that mimic securities
markets and real trader reactions. When run they produce the significant
ebbs and flows in market dynamics that traditional economic theory models
have been unable to accurately reflect.
"From a pure scientific point of view there are a handful of models where
you can show agents will replicate well known history or representative
data," said Axtell.
Though proponents of the technology are the first to point out that
agent-based modeling is not meant to function as a predictive tool, they
believe that the technology can be used to anticipate events that might
arise from given situations, and to test theories about what policy
decisions will work best to fix a societal problem.
But traditional social scientists, especially policy analysts in the think
tank world, have been less than receptive to the technique, for a number of
reasons.
Ben Goodrich, a trade analyst at the Institute for International Economics,
and not a current user of this technology, told UPI that though he finds
the potential for agent-based modeling quite interesting, he views it only
as a simulation tool to be included in the arsenal of techniques used by
economists. He said it is not a replacement for the type of empirical
modeling that is the standard technique in social science research.
"I don't really consider it to be a complete substitute for the more
empirical models that we use in economics and political science, that are
completely driven by actual data," said Goodrich. "I think this is a
complement for existing tools more than a substitute for them. Those
(existing) methods are (actually) improving dramatically as a result of
technology."
Moss says that part of the reason agent-based modeling remains little used
by policy analysts is that they do not understand its full potential, and
are skeptical because the modeling community on the whole is not properly
testing its findings to demonstrate their validity. He added that the
research of Axtell, Epstein and others who are prominent in the application
of the technology, is the exception to this rule.
Moss is involved in an agent-based modeling project sponsored by the
European Commission, which is examining the behaviors of stakeholders in
water resource management in various regions across Europe.
He says the software model being used and the outcomes being simulated are
being validated by the individual stakeholders who are being modeled. The
stakeholders give feedback on the legitimacy of the circumstances modeled,
and on whether the behavior that emerges in the program accurately reflects
their own.
Moss says he believes that agent-based models can have their greatest
impact, and can best reflect reality, if the models are checked against
existing data and against the judgments of experts in the field.
Disagreements within the agent-based modeling community, about the
direction and future of the technology, are also holding the policy world
back from adopting the technique.
Stephen Bankes, a senior computer scientist with the Rand Corporation, says
that a significant problem hindering the wide application in policy work of
agent-based modeling and simulated artificial societies is that there is no
dominant methodology for creating these programs. Instead, there are many
competing techniques. This prevents the critical comparison of different
approaches, and the debate over the value of different studies and outcomes
that would follow.
"You have to have similar methodologies," said Bankes. "You have to be able
to create mechanisms that will create many different alternative versions
of the model. The policy people have been very slow to adopt it, but I
think it is for legitimate reasons."
Axtell agrees. "If you have really good, empirically accurate models, they
are much harder for those not disposed to the methodology to reject," he said.
Bankes says he believes that if this problem can be overcome, the potential
for the technology as a policy tool is immense.
"It could be revolutionary simply because so much social science has been
limited by the available models up to now," said Bankes.
He believes that the social sciences have been waiting for a change that
will allow them to go in directions they have been unable to follow in the
past and that "there is a reasonable chance that agent-based modeling is
it." He cautioned, however, that this potential revolution could be decades
in the making.
Currently Bankes, Axtell, and a few colleagues make up the handful of
people in the think tank world who are using this technology to examine
policy issues. There are many reasons for this but both men say the major
ones are the steep learning curve for mastering the technique, and the
differences between agent-based modeling and the established research
methods in various fields.
They also believe that American social scientists have sparse training in
the computational methods needed to design these systems.
Axtell says, however, that there are pockets where the techniques have been
accepted, especially among people who do applied social science work.
"Most will view this as a challenge to how they were trained in graduate
school," said Axtell. "But in general, people who do applied work, those
who work with data and have a lot of computing skills, seem to be more
amenable to this modeling approach than those who are pure theorists."
The field is growing despite these stumbling blocks. Just a few years ago
there were only a few dozen researchers doing policy-centered agent-based
modeling, but now the community has reportedly grown to several hundred,
with journals publishing studies and graduate students
doing theses on the topic. Those students will also be able to find jobs,
said Axtell, but those jobs will probably be at universities rather than
think tanks.
Axtell believes that one of the reasons that think tank policy analysts are
not yet interested in agent-based modeling is that few tanks actually do
the type of long-term research for which the software is best suited.
Goodrich says that the work done by Axtell and other researchers is
stepping beyond existing empirical modeling because it attempts to mimic
reality in a way that more established techniques are unable to. At the
same time he said that it is important to note that "you can't, strictly
speaking, prove much with a simulation."
"In order for these agent-based models to be a tool for public policy, they
need to help think tank scholars or academics or government workers
understand why and when things occur, so they can successfully enact
government policy to change things," said Goodrich. "I think that this is a
tough hurdle to overcome."
He added, however, that these types of simulations are useful because they
could enable researchers to brainstorm policy fixes and then test their
validity with empirical modeling. In addition, he says that in many cases
where empirical models are not possible - such as accurately analyzing
events in the securities market - "having a simulation that supports a
particular theory is better than nothing."
According to Bankes, Axtell and Moss, it is the ability to create different
societal models that gives agent-based modeling its strength. They believe
that it is the power to develop and compare competing artificial societies
- with different potential outcomes that can be examined to find the best
possible solution - that can turn the think tank policy analysts into
ardent supporters of such research.
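The idea of running competing artificial societies side by side can be made concrete with a small sketch. The toy model below is purely illustrative - it is not any of the models built by the researchers quoted above. It simulates agents exchanging wealth at random, with a hypothetical "redistribution rate" policy parameter that skims a fraction of each trade and shares it equally, and then compares inequality under two policy settings:

```python
import random

def run_society(redistribution_rate, n_agents=200, steps=2000, seed=0):
    """Toy artificial society: agents pass one unit of wealth at random;
    the policy parameter taxes each transfer and shares the proceeds
    equally. Illustrative only, not a model from the article."""
    rng = random.Random(seed)
    wealth = [100.0] * n_agents
    for _ in range(steps):
        giver = rng.randrange(n_agents)
        taker = rng.randrange(n_agents)
        if wealth[giver] >= 1.0:
            tax = redistribution_rate * 1.0
            wealth[giver] -= 1.0
            wealth[taker] += 1.0 - tax
            share = tax / n_agents          # equal redistribution
            for i in range(n_agents):
                wealth[i] += share
    return wealth

def gini(xs):
    """Gini coefficient: 0 = perfect equality, 1 = maximal inequality."""
    xs = sorted(xs)
    n = len(xs)
    weighted = sum(i * x for i, x in enumerate(xs, 1))
    return (2.0 * weighted) / (n * sum(xs)) - (n + 1.0) / n

# Two competing "policies" run on otherwise identical societies,
# whose outcomes can then be compared directly.
laissez_faire = gini(run_society(redistribution_rate=0.0))
redistributive = gini(run_society(redistribution_rate=0.5))
```

Because both runs start from the same seed and population, any difference in the resulting Gini coefficients can be attributed to the policy parameter - which is the comparative use of artificial societies that Bankes, Axtell and Moss describe.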
*************************
Nando Times
Jews shopping online to support Israel
By CATHERINE LUCEY, Associated Press
PHILADELPHIA (May 11, 2002 9:04 p.m. EDT) - When Jane Scher told other
guests at a bat mitzvah that she had ordered a gift from an Israeli shop,
they became interested.
So Scher, who lives in San Diego, created a Web site with over 100 links to
stores in Israel.
American Jews looking for ways to help Israel have started using sites such
as Scher's to buy books, jewelry, crafts or flowers, giving a boost to
Israel's economy, which has lost major tourist revenues during escalating
violence in the Middle East.
"I think all of us feel like if only there's something we can do," said
Scher, 45. "If a million Jews spend $100 in Israel that's $100 million for
Israel. We don't need to be helpless. We actually can participate in a
constructive way."
Baruch Hadaya, 52, owner of Hadaya, One of a Kind Jewelry, in Jerusalem,
said his Web site was getting about 1,200 hits a month before Scher started
www.shopinisrael.com. Last month, after he was listed on Scher's site, he
got 9,500 hits.
"I'm not sure I'd still be in business without them," he said. "I was
dying, I was going out of business."
After the Israeli-Palestinian clashes began in September 2000, the number
of tourists visiting Israel in 2001 dropped by more than half from 2000
levels. The United States has warned against travel to Israel, and
once-busy stores and hotels are doing a fraction of their previous business.
"They have the right to support their people, and so should the
Palestinians have the right to support their people," said Faiz Rehman,
spokesman for the American Muslim Council. Palestinians "want to, but they
can't. Their efforts here are discredited. The government accuses their
charities of having ties to suicide bombers."
Rehman said the U.S. government has created a fearful atmosphere and that
Palestinians are afraid to support their people. "People from other
communities are hesitant to come forward and help, there's an atmosphere of
fear," he said.
Scher said her not-for-profit site, which went up 11 weeks ago, is getting
thousands of hits a day.
The Philadelphia Jewish Federation has had links to Israeli stores on its
Web site for about six months and now has a link to Scher's site.
"There's a bakery in Jerusalem that provides rolls to hotels. At one time
they employed 15 or 18 people and produced thousands of rolls," said Irv
Geffen, the federation's vice president for marketing. "They were
employing, in November, two people and were producing 50 dozen a day."
Geffen says he has bought jewelry for his wife online and plans to purchase
more as Mother's Day gifts.
"The jewelry is lovely," he said. "Every time I physically go there, I buy
jewelry."
The United Jewish Committees' Web site also has links to Israeli shopping
sites.
"I think you see people being apprehensive" about traveling to Israel, said
Richard Pearlstone, chairman of the group's marketing committee. "There are
people who believe this is one of the ways to show support for the Israelis."
During the Six Day War in 1967 and the Yom Kippur War in 1973, thousands of
Americans went to Israel to work and take over civilian jobs.
"It's not the same as when hundreds of thousands of soldiers are at
Israel's borders and, therefore, Israel needed bodies," said Philip Rosen,
chairman of American Friends of Likud. "At this stage, Israel doesn't need
bodies to help out. What they need is other kinds of support."
Avra Kassar said she recently ordered flowers for her daughters-in-law
through Scher's site.
"I have family and friends with children in the army," said Kassar, who
lives in La Jolla, Calif. "This is an opportunity for me to do something so
small in comparison with what they are doing."
***********************
Sydney Morning Herald
There's no place like home
Australian software houses are finding the United States a land of
opportunity too tempting to ignore, writes Glenn Mulcaster
When the founders of Tower Technology moved their headquarters from Sydney
to Boston, they hired lots of people with American accents. Sydneysiders
Pat Hume and Frank Johnson have been living in Boston for more than six
years and have picked up the local jargon when they talk about the
opportunities for Australian software houses that move to North America.
They talk turkey with big customers at head offices in the US, and they
have regular dialogue with market analysts such as Giga, Gartner and
Delta. They work closely with North American financial advisers, who
helped plan an IPO that was postponed because of the persistent hangover
of the tech-stock crash in 2000.
But more importantly, they are selling more software in the world's largest
market where Hume feels like a child in a lolly shop. "The US is an
exciting place to be," she says. "We encourage every Australian software
company to come here."
The paradox is that the larger, highly segmented US software market is less
competitive, they say.
Their software helps companies manage electronic documents in their
corporate databases. "When we win a contract in the US, we find that we are
up against three or four competitors," Hume says. But in Australia there
can be up to 20 contenders for the same business; every multinational is
playing there, and the market is highly competitive.
"In comparison, the US is such a big market," Hume says. "We are like kids
in a candy store."
But they insist that Tower Technology would never abandon its roots. "We
would never do the major development work anywhere but Australia," Hume
says. "Australia has a very good intellect and a very good work ethic."
Their model has been tried before and is encouraged by the Australian Trade
Commission, especially for software companies starting their export drive.
"I totally agree on the need to appear American or at least an
international company," says Peter Lewis, a senior trade commissioner at
Austrade's office in San Francisco.
He believes this can overcome any perception of geographical issues, real
or imaginary. "In general terms, unless an Australian company is exporting
koala pelts or Steve Irwin dolls, we suggest they need to appear
international or American," he says.
"Most of the successful Israeli and Indian companies in technology here,
you wouldn't even know they were not American . . . until you go to their
website and see R&D is based in Haifa or Bangalore."
He says Austrade recommends setting up a foreign head office with senior
management based in the US but R&D in Australia. He also highlights some
Australian technology companies that have effectively used the foreign-base
method to start out:
Internet search company Looksmart; Atmosphere Networks, from Perth, which
sold to Ditech for $90 million in 2000; Perth's Shasta Networks, sold to
Nortel for $360 million in 1999; and Norwood Systems, also from Perth, now
with a London headquarters after securing $US60 million in venture capital.
Hume and Johnson set up Tower Technology in the late 1980s, naming it after
the Australia Square Tower in George Street, Sydney. They are married and
have two children, aged 11 and 9, now at school in the US.
They began the US operation in 1995, initially hiring a local to work in
Boston, but realised it would only work if they too were there on the
ground. Hume and Johnson agree that to run a successful business, "it is
essential to be at the coalface overseas".
"It makes sense to be where the biggest market is," Johnson says, "and we
have contact with the head offices of the companies that buy our software."
The couple say it helps them make more informed guesses about product
direction.
The company holds a quarterly product development committee meeting, which
usually lasts three to five days, wherever the sun is shining at that time
of year. Representatives of each region - Europe, Asia-Pacific, the
Americas - vote on the aims for product development.
"In terms of technology leadership, a lot of the early development work
goes on in the US," Hume says. The committee meetings often show that
software buyers in the three regions are fairly closely aligned but
sometimes there can be a 12-week lag in adoption of a technology.
"But with the Internet, the lag is becoming less and less," Johnson says.
"The Internet is making everybody much faster and better informed about
what is going on."
Tower does a lot of trade shows, focusing on "vertical" markets such as
banking, insurance and government sectors, even though its product is
applicable across the whole range of business - it is essentially a
horizontal piece of software not restricted to one segment.
An average installation of Tower Technology software involves a $US1.5
million up-front purchase price, an annual maintenance contract of about
20 per cent of that price, plus other services.
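The quoted pricing implies a substantial multi-year cost beyond the licence fee. A quick hedged sketch, using the article's figures ($US1.5 million up front, maintenance at roughly 20 per cent per year) - the five-year term and the zero services figure are illustrative assumptions, not from the article:

```python
def contract_value(upfront, maintenance_rate, years, services=0.0):
    """Up-front licence price, plus annual maintenance charged as a
    fraction of that price, plus any one-off services fees."""
    return upfront + upfront * maintenance_rate * years + services

# $1.5m licence + $300k/year maintenance over five years = $3.0m
five_year_total = contract_value(upfront=1_500_000,
                                 maintenance_rate=0.20,
                                 years=5)
```

Over five years, maintenance alone matches the original licence price, doubling the total contract value.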
Another low-profile Australian software house selected Chicago as its US
base, after an acquisition. Pete Draney, managing director of LANSA,
reckons his company is perceived as a multinational software house with a
widely scattered workforce. The headquarters near Chicago, at Oakbrook,
Illinois, boasts about 125 staff.
Its first North American office was set up there in 1989 with an
independent business partner that helped to sell LANSA software in the US
on behalf of the Sydney-based software arm of Aspect Computing.
LANSA's flagship product is used to develop systems on the AS/400, a
mini-computer from IBM widely used in manufacturing companies.
Draney says Aspect bought out the US partner in 1996 and renamed it LANSA Inc.
"In hindsight, Chicago was best for us - a travel hub, access to reasonably
priced skilled resources, a large local customer and business partner
community," Draney says.
LANSA's R&D is still performed in Sydney, where there are 60 staff. It has
50 sales and technical support personnel in Europe and eight people in
Asia-Pacific in sales and support.
LANSA is privately owned by Draney and Lyndsey Cattermole. They sold their
company, Aspect Computing, to KAZ Computer Services this year.
Draney maintains his day-to-day responsibility as managing director of
LANSA, while Cattermole retains her role with Aspect.
KAZ was founded by Peter Kazacos, a software engineer who worked on the
foundation of the LANSA product as an Aspect employee in the late 1980s.
A name change, a comprehensive product revamp and management turmoil have
not hurt Managesoft, a Melbourne software company with a patchy history.
It changed its name from Open Software Associates last year, after
managing director John Cromie moved from Australia to New Hampshire.
However, Cromie quit at the end of last year and returned to Australia for
personal reasons.
Managesoft appointed an American software executive, Walter Elliott, to
take his place only a few weeks ago.
Managesoft helps network managers control software in large corporate
environments.
Managesoft's New Hampshire office was set up in 1993 to complement its
office in California.
Andrew Hewitt, managing director of Managesoft's Australian operations,
says it is important to have a manager on the ground in the US.
"It has always been our view that if we wish to succeed as an international
software supplier, we have to succeed in the US market," Hewitt says, "for the
obvious reasons of the dominant size of that market within high-tech and
computing."
He says Managesoft has won significant US accounts such as Merrill Lynch,
Texas Instruments and Philip Morris. Big users in Australia include ANZ
Bank and Mayne. "We treasure our Australian roots and expect to always
maintain our software 'factory' here. The move was simply a matter of
putting the leader where the most strategic action is taking place."
************************
Lillie Coney
Public Policy Coordinator
U.S. Association for Computing Machinery
Suite 507
1100 Seventeenth Street, NW
Washington, D.C. 20036-4632
202-659-9711