Clips October 1, 2002
- To: "Lillie Coney":;, Gene Spafford <spaf@xxxxxxxxxxxxxxxxx>;, John White <white@xxxxxxxxxx>;, Jeff Grove <jeff_grove@xxxxxxx>;, goodman@xxxxxxxxxxxxx;, David Farber <dave@xxxxxxxxxx>;, glee@xxxxxxxxxxxxx;, Andrew Grosso<Agrosso@xxxxxxxxxxxxxxxx>;, ver@xxxxxxxxx;, lillie.coney@xxxxxxx;, v_gold@xxxxxxx;, harsha@xxxxxxx;, KathrynKL@xxxxxxx;, akuadc@xxxxxxxxxxx;, computer_security_day@xxxxxxx;, waspray@xxxxxxxxxxx;
- Subject: Clips October 1, 2002
- From: Lillie Coney <lillie.coney@xxxxxxx>
- Date: Tue, 01 Oct 2002 11:48:40 -0400
Clips October 1, 2002
ARTICLES
Firm May Face Fines on Exports [Violations]
Davis vetoes bill requiring PC recycling fee [Environmental]
EPA will order 10 tech firms to test for toxin [Environmental]
Lawmakers still hope to finish an e-gov bill this year
New Guidelines Open U.S. Data To Challenge
NSA to Upgrade Monitoring Abilities
Defense U. targets e-gov [Education]
Studios' Web 'Plants' Lead to an Ethical Thicket
On Behalf of Film Studios, Company Searches for Students Downloading Movies
[Piracy]
Models of mayhem [Cybersecurity]
Forensics meets time travel
Project eyes global early warning system for the Internet
DARPA explores self-healing system
Microsoft wants software to be 'public utility'
State Department asks firms to create intelligence database
Defense tracking system proves crucial to port security
GSA to unveil top 20 security flaws, focus on fixes
Web site defacements hit all-time high in September
Diplomas faked on Net worry universities
Virus poses as Microsoft security patch
*************************
Los Angeles Times
Firm May Face Fines on Exports
By JOSEPH MENN
October 1 2002
SAN FRANCISCO -- U.S. export-control officials have accused Sun
Microsystems Inc. of violating federal regulations in the sale of computer
equipment to Egypt and, through a reseller in Hong Kong, to China.
The Commerce Department's Office of Export Enforcement warned the company
that it could be subject to fines or even denial of export privileges if
Sun is found to have broken the law.
Santa Clara, Calif.-based Sun disclosed the February accusation Monday in
its annual filing with the Securities and Exchange Commission. The firm
said it was negotiating with the Commerce Department and that it was
"reasonably likely" it would reach a settlement that would not have a
material effect on its operations. But the export officials warned that Sun
was likely to face additional charges.
"We do not believe the evidence would support the extreme sanction of a
denial of export privileges," Sun wrote in its filing. Slightly more than
half of Sun's revenue comes from overseas.
Sun spokesman Andy Lark said he didn't know how the hardware in question
was used. He said China was barred from receiving the goods at the time of
the 1997 sale, and that the issue in the 1998 sale to Egypt involved improper
notification to authorities.
This is the second time in recent years that Sun's China dealings have
triggered federal action.
Also in filings Monday, Sun named former SEC Chief Accountant Lynn Turner
and former Sun Chief Financial Officer Mike Lehman to its board. It said
Packet Design Chief Executive Judith Estrin had resigned from the audit
committee of its board. Sun does business with Packet Design, and Sun CEO
Scott McNealy, Chief Scientist Bill Joy, and two nonexecutive directors
have interests in that firm.
**************************
Mercury News
Davis vetoes bill requiring PC recycling fee
By Ann E. Marimow
SACRAMENTO - Legislation that would have put California at the forefront of
recycling scrapped electronics was rejected by Gov. Gray Davis late Monday,
underscoring the political clout of Silicon Valley's high-tech companies.
California would have become the first state to assess a recycling fee --
$10 per electronic product -- on new computers and televisions sold to
residents.
Davis' veto of the bill by Sen. Byron Sher, a Palo Alto Democrat, dealt a
setback to environmentalists and local government officials who have
struggled to safely dispose of the hazardous waste from an estimated 6
million old computers and televisions gathering dust in California homes.
``California's high-tech companies are failing to support a meaningful
solution to the toxic e-waste problem,'' said Mark Murray of Californians
Against Waste, which led the campaign for Sher's bill.
In his veto letter Monday, Davis said he could not support the bill, SB
1523, because it expands the government bureaucracy while the state is
making major cuts.
But the governor added that he was ``troubled'' by the waste buildup, which
could cost $500 million to clean up. ``There is no time to waste,'' he
wrote. ``I believe California should have a new law next year.''
Sher and the bill's supporters commended Davis for acknowledging the
problem and challenging the high-tech industry to be part of the solution.
The issue was so important to such high-tech giants as Hewlett-Packard and
IBM that HP Chief Executive Carly Fiorina took the unusual step of making a
personal appeal to Davis last month, saying the bill ``would do more harm
than good.''
Tech-industry officials feared the provision allowing the state to levy the
fee on out-of-state companies would be ruled unconstitutional in an
all-but-certain court challenge. With Silicon Valley being hammered by a
sinking economy, that would have put California companies at an unfair
disadvantage, they said.
The Silicon Valley Manufacturers Group also vigorously opposed the bill.
Fiorina, whose company has donated $30,000 to Davis' re-election campaign
since December, said HP supports computer recycling but not to the
detriment of California's high-tech industry.
``We were looking for a solution to the problem that was fair and applied
equally to all companies,'' said Kristine Berman, HP's government affairs
manager for California. ``And we look forward to working with Sen. Sher
next year.''
But Sher said the high-tech industry's fears were unfounded. The recycling
fee would have been suspended for both in-state and out-of-state companies,
according to supporters, if a court found it illegal. That argument
persuaded all Silicon Valley legislators to support the bill.
Computer monitors and television screens contain several pounds of lead
each. When the monitors are thrown out and crushed or burned, the lead and
other hazardous materials such as mercury and cadmium can seep into the
soil and groundwater.
The Environmental Protection Agency says about 220 million pounds of old
computers and other hardware are trashed in the United States each year.
The bill would have set up a statewide recycling program, requiring
manufacturers and retailers to begin collecting the $10 fee on new
purchases by January 2004.
****************************
Mercury News
EPA will order 10 tech firms to test for toxin
By Joshua L. Kwan
Mercury News
The U.S. Environmental Protection Agency will require 10 Silicon Valley
high-tech companies that once operated manufacturing plants in Mountain
View to conduct -- for the first time -- air-quality testing for a toxic
substance inside several offices that were later built on the land.
The same companies are suspected of having leaked into the ground a
substance called trichloroethylene, known as TCE, a widely used solvent
that cleans machine parts.
Now, the EPA believes TCE might be 60 to 70 times more dangerous to humans
than previously thought, and it is concerned that contamination in
groundwater is seeping into the air inside office buildings constructed in
areas vacated by those companies.
In a letter to be sent this week, the Mercury News has learned, the EPA
will notify such storied Silicon Valley semiconductor manufacturers as
Intel and Fairchild Semiconductor, among others, that they must submit a
work plan for testing the soil, indoor air and outside air for TCE.
The companies are suspected of creating a plume of TCE-contaminated
groundwater years ago that lies beneath new office buildings now occupied
by companies including AOL Netscape, Nokia and Veritas Software. The TCE
may be making its way up through the soil, between cracks in the buildings'
foundations and into indoor air.
Intel spokesman Chuck Mulloy said the company has not seen the EPA letter,
but the company ``would be happy to cooperate with the EPA once we
understand what the request is.''
Although the EPA has been monitoring the groundwater for years, recent
research suggests that, at lower levels, TCE can cause cancer, said EPA
toxicologist Stan Smucker.
The human body breaks down TCE into two smaller components called
dichloroacetic acid and trichloroacetic acid. Both substances also are
found in chlorinated tap water. EPA officials believe the total could add
up to an unhealthy intake of the two smaller substances. Previous risk
assessments for TCE did not take this complication into account.
In June, the EPA's scientific advisory board endorsed a tougher standard
for what is considered ``acceptable'' risk levels of TCE. The EPA can
require the companies to take measures to remove the substance or safeguard
buildings at risk.
Alana Lee, the EPA's project manager for the Middlefield-Ellis-Whisman
site, said the EPA does not expect to find high levels of TCE in the air
inside the buildings, based on the quantities found in the groundwater.
``We don't expect levels to be such that you'd have to evacuate any
buildings,'' Lee said, adding that the data collected will be analyzed to
learn more about how TCE might escape from groundwater into the air.
The new standards are the first update of the EPA's guidelines since they
were issued in the late 1980s. A risk range for the EPA typically runs from one
person in 1 million to one person in 10,000 who develops cancer.
Previously, the range of TCE that corresponded to those risk levels was 0.2
parts per billion to 20 parts per billion; those have been lowered to 0.003
parts per billion to 0.3 parts per billion.
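The tightening of the thresholds can be checked with simple arithmetic: both ends of the range were lowered by roughly the same factor, consistent with the agency's "60 to 70 times more dangerous" estimate. A minimal sketch using the figures from the article:

```python
# Old and new TCE concentration ranges (parts per billion) corresponding to
# the EPA's 1-in-1,000,000 to 1-in-10,000 cancer risk range.
old_low, old_high = 0.2, 20.0    # ppb, guidelines from the late 1980s
new_low, new_high = 0.003, 0.3   # ppb, revised guidelines

# Factor by which each end of the range was tightened.
factor_low = old_low / new_low
factor_high = old_high / new_high

print(round(factor_low, 1), round(factor_high, 1))  # → 66.7 66.7
```

Both ends shrink by a factor of about 67, squarely within the "60 to 70 times" figure cited earlier in the article.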
The forthcoming letter from Lee, the project manager, signals the EPA's
formal request for the companies to take action. They will have until
roughly Dec. 1 to return a work plan for TCE testing for the EPA's review.
**************************
Government Computer News
Lawmakers still hope to finish an e-gov bill this year
By Jason Miller
House lawmakers are making a late push to give agencies more than $200
million in e-government funds and to establish some type of governmentwide
IT manager.
The Government Reform Subcommittee on Technology and Procurement Policy
tomorrow will mark up the E-Government Act of 2002 by replacing it with a
manager's amendment, a revised version of the Senate-passed S 803.
Melissa Wojciak, staff director for Rep. Tom Davis (R-Va.), the
subcommittee's chairman, said Davis added five provisions to the manager's
amendment to go along with several technical corrections. A full committee
hearing on the bill is scheduled for Oct. 9.
Davis inserted his Federal Information Security Management Act to replace
the provision that would have permanently reauthorized the Government
Information Security Reform Act. FISMA would require all agencies to
implement specific, baseline security standards established by the National
Institute of Standards and Technology, and permanently reauthorize the
agencywide risk management security approach first imposed this year under
GISRA.
The changes also would contain Title 5 of Davis' Service Acquisition Reform
Act of 2002, which would let state and local governments use the General
Services Administration's schedule contracts and exempt IT procurements
from the strictures of the Buy American Act.
Davis also included a directive for a share-in-savings program that would
not be limited to IT. He also would require the Office of Management and
Budget to establish a Technical Innovations Office.
"We have bipartisan and administration support on almost all of the
amendments," Wojciak said.
The Senate bill, sponsored by Sen. Joseph Lieberman (D-Conn.), includes a
provision to establish an E-Government Office within OMB led by a
Senate-confirmed official. Meanwhile, the House version calls for a
governmentwide CIO and includes figures for OMB's e-government fund that
vary from the Senate's authorizations [see story at
www.gcn.com/vol1_no1/daily-updates/20061-1.html].
****************************
Washington Post
New Guidelines Open U.S. Data To Challenge
Tuesday, October 1, 2002; Page E01
Knowledge may be power in most places, but cold, hard data rule in Washington.
In the arduous process of creating a federal regulation, the bricks and
mortar are the data underlying the rule -- the scientific studies, the
surveys, the risk assessments, morbidity estimates, the economic analysis.
Regulators at federal agencies are the final arbiters in what goes into the
mix and whether it's reliable. Challenges to their judgment usually end up
in court years after the rule is conceived.
That all changes today. From now on, virtually every piece of information
that the federal government makes public -- through a rulemaking, a
publication or a Web site -- becomes open to challenge for its accuracy and
veracity.
Thanks to an obscure provision in an appropriations bill two years ago,
business groups and others who have sometimes criticized government
information as being biased or of poor quality no longer have to wait until
a rule is issued to seek corrective action. They can lodge a complaint
about the agency's data, along with the basis of their challenge, and the
federal agency has to respond in a timely way -- probably 60 days.
In February, the Office of Management and Budget devised final guidelines
to "provide policy and procedural guidance to federal agencies for ensuring
and maximizing the quality, objectivity, utility and integrity of
information disseminated by federal agencies." Today, some 93 agencies will
release guidelines of their own, tailored to their individual missions, but
reviewed by OMB.
OMB officials caution that it will not be easy to get a rulemaking
reopened. "The impact on the regulatory process is speculative. Getting
agencies to reopen a rule is a much bigger step than the [law] envisioned
and it requires a much larger burden on the complainer," said John D.
Graham, administrator of OMB's Office of Information and Regulatory Affairs
(OIRA).
What may be more important than challenges to information backing proposed
rules are those to the studies and information the government uses outside
the regulatory structure, such as those that address global warming or
health risks.
"It allows us to go in and have the government justify the information it's
using. It allows us to challenge the information," said William Kovacs,
vice president of environment, technology and regulatory affairs for the
U.S. Chamber of Commerce. "A bad rule based on bad information means we
spend money and don't achieve the health and safety benefits the rule set
out to achieve."
The Center for Regulatory Effectiveness, which characterizes itself as a
regulatory watchdog group that is supported by business and trade
associations, was the impetus behind the new law, which was the subject of
no hearings, debate or fanfare when added to a 2001 spending bill.
James J. Tozzi, the group's founder, was at OMB's office of regulatory
review from 1972 to 1983. He said the center probably will be among the
first to challenge a study the Environmental Protection Agency did as part
of a rulemaking.
Business groups support the new program as a way to improve the quality of
rulemaking.
"With this law, the quality of data underlying agency rules is likely to
become a key issue in litigation challenging federal rules. Simply put, the
data-quality law promises to have an immeasurable impact on every business
that uses federal data, and on every business that is regulated by a
federal agency," the U.S. Chamber of Commerce told its members in an
informational paper.
The Chamber's Kovacs said the business group does not have a petition ready
to go, but it has hired a special science adviser to review studies and
assessments and has set its sights on everything from studies documenting
the effect of salt in diets to forest-management practices.
The guidelines agencies make public today will reflect months of drafts,
public comments and oversight from OMB. Some of the agency proposals tried
to exempt certain types of information and minimize the effect and scope of
the guidelines.
EPA, for example, said in its draft that the guidelines might not apply in
certain situations and, overall, are not legally enforceable. In an
interview, an EPA official, who didn't want to be identified, said the
objectives of the guidelines make sense, in that the agency should be able
to defend its work. They don't carry an obligation to act, the official
added, "but we do take them seriously."
In a June 10 memo, OIRA's Graham told agencies that it's not good enough to
simply restate their current policies:
"At a minimum, each agency must embrace the OMB quality definitions as
information quality standards they are seeking to attain." He reminded
agencies that they are not free to disregard their own guidelines, even if
they include disclaimers.
Besides the review process, information federal agencies make public after
today will have a hierarchy, with the most "influential" data being
subjected to more disclosure and transparency. In some cases, studies and
results have to be "reproducible," meaning an outside party should be able
to get approximately the same results when it checks the agency's work.
Information supplied to the government by third parties, either as part of
a rulemaking, or a challenge, also must meet the new standards.
The prospect of continual challenges has alarmed public-interest groups.
They predict that rulemaking will suffer as agencies siphon off resources
to defend their data.
"Our concern is that it will discourage agencies from disseminating
information and slow or halt the issuance of protective regulations," said
Wendy Keegan, a regulatory affairs fellow with Public Citizen's Congress
Watch. "The possibilities are frightening."
Gary Bass, director of OMB Watch, a public interest group, said the spirit
of the guidelines is laudable. But the application is worrisome. "This
could cover anything from flight arrivals, toxics, and worker health and
safety. It covers virtually every piece of information in the government.
If you can't use a certain study, it may mean you can't do the rulemaking."
Graham believes even the doubters will end up using the guidelines. "A lot
of groups will use this constructively to pursue their agendas. Information
is a two-edged sword and it won't always favor the business community or
the public interest community."
*************************
Associated Press
NSA to Upgrade Monitoring Abilities
Tue Oct 1, 8:07 AM ET
By SETH HETTENA
SAN DIEGO (AP) - The largest U.S. intelligence agency will spend millions
to upgrade the technology it uses to sift through the huge volume of
telephone conversations, e-mail and other worldwide communications chatter
it monitors, under a new contract.
The National Security Agency has signed a $282 million contract with
Science Applications International Corp. of San Diego to help develop a
more refined system for culling useful intelligence from a flood of data it
collects daily. Officials disclosed the 26-month contract on Monday.
Most details about it are classified, as is most information about the
security agency. But analysts said the deal reflects the growing challenge
of electronic eavesdropping.
"There's a ton more communications out there and how to sift through that
is an increasing problem for the NSA," said Richard A. Best Jr. of the
Congressional Research Service.
The advent of e-mail, pagers, cellular phones, fax machines and the growth
of international telephone service has left the NSA with "profound
'needle-in-a-haystack' challenges," Best said.
The Sept. 11 attacks underscored the need for such monitoring. Among the
millions of communications intercepts the NSA collected on Sept. 10, 2001,
were two Arabic-language messages warning of a major event the next day.
The Arabic messages were not translated until Sept. 12.
Two years ago, the Fort Meade, Md.-based security agency launched what it
calls the "Trailblazer" program to use commercial technology to help it
keep pace with the growth in global communications.
SAIC could be in line for additional lucrative NSA contracts if the agency
decides to buy its solution. The company referred all questions on the
contract to the NSA.
The NSA, part of the Defense Department, has in recent years been taking
advantage of advances in the commercial sector.
***************************
Federal Computer Week
Defense U. targets e-gov
BY Colleen O'Hara
Sept. 30, 2002
The National Defense University last week began a master's-level
certification program designed to mold government managers into
e-government leaders.
The eGovernment Leadership Certificate Program is a "broad leadership
program" aimed at helping senior executives manage programs that cut across
organizational lines, said Linda Massaro, a senior fellow at the
university's Information Resources Management College. "It takes a
different set of skills" to do that effectively, she said.
The school received 39 applications for the new program, Massaro said. All
are expected to be accepted.
The certification program focuses on 13 competencies that senior managers
say are necessary to produce successful e-government programs, Massaro
said. These include change management, managing financial resources and
homeland security, an already popular new course offered at the school.
Ira Hobbs, co-chairman of the CIO Council's Workforce and Human Capital for
Information Technology Committee, said the certification program format
serves a specific audience and provides targeted coursework. Managers "need
to get grounding in functional areas," he said. "I think that's important
for the future."
If students choose to, they can take the program via distance learning,
Massaro said. Students must finish their class within 12 weeks, will
receive a grade on their participation in class and will be monitored by a
faculty member. This ensures that the agency is "getting its money's
worth," she said.
The program is free for Defense Department employees, but other federal
workers must pay $900 per course.
For more information on the program, go to www.ndu.edu/irmc and click on
Academic Programs.
***************************
Los Angeles Times
Studios' Web 'Plants' Lead to an Ethical Thicket
By PATRICK GOLDSTEIN
October 1 2002
Ever since Harry Knowles burst to prominence with Aintitcoolnews.com, the
Internet has blossomed with hundreds of movie geek Web sites, each one
crammed with its own oddball assortment of news, reviews and message boards
devoted to "Star Wars," Quentin Tarantino and other pressing matters. For
movie fans, the sites represent authentic participatory
democracy--everyone's opinion or obsession carries equal weight.
But earlier this year, Chris Parry, a 32-year-old writer and ex-production
manager who runs the site efilmcritic.com, began noticing a lot of very
inauthentic postings. They read like outright publicity plugs, or what Net
denizens call "plants," most of them touting films released by Universal
Pictures.
On May 30, filmfreak234 wrote: "Lemme just say that I really can't wait to
see undercover brother ... am I alone here? For one it looks hella funny,
and two its got denise richards. You just can't get better than that
combo!!!! Apparently harry knowles thinks so too. You know, from
aintitcool.com. He said it was the bomb. If you wanna see what he wrote
check out www.aintitcool.com and look for it on your own ... I'm definitely
stoked for this one."
On July 9, fangoria17 wrote, enthusing about "The Silence of the Lambs": "I
can't wait until the prequel Red Dragon comes out this fall. I watched the
trailer for it at http://www.apple.com/trailers and it got me really
excited. Check it out and tell me what you think."
When Parry got a series of messages plugging "Blue Crush," another
Universal summer release, he became suspicious, because all the messages,
as he put it, "were obviously scripted and always had a link to the trailer
for the film." When he checked the IP, or Internet Protocol, address of the
messages, he discovered that they originated from the same place, Universal
Pictures' registered corporate site, MCA.com.
Parry's movie site wasn't the only one being "seeded" with fake fan
messages. Brian Renner, a 17-year-old high school student who runs the site
themovieinsider.com from his home in the Detroit suburbs, received
identical postings for the films "Undercover Brother" and "Red Dragon."
When he ran a check on their IP addresses, they were the same as for the
messages at Parry's site: Universal Pictures.
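The check Parry and Renner describe amounts to grouping suspect postings by their originating IP address and flagging any cluster that traces back to a single registered corporate owner. A minimal sketch of that idea (the IP addresses and ownership table below are illustrative stand-ins, not data from the actual incidents; in practice the owner would come from a whois or reverse-DNS lookup):

```python
from collections import defaultdict

# Hypothetical postings: (username, originating IP address).
# 198.51.100.0/24 is a documentation range, used here as a stand-in.
postings = [
    ("filmfreak234", "198.51.100.17"),
    ("fangoria17",   "198.51.100.17"),
    ("realfan99",    "203.0.113.5"),
]

# Hypothetical whois-style ownership table for known corporate blocks.
ip_owner = {"198.51.100.17": "example-studio.com"}

# Group usernames by IP so repeat senders from one address stand out.
by_ip = defaultdict(list)
for user, ip in postings:
    by_ip[ip].append(user)

# Flag any IP that sent multiple postings and maps to a known corporate owner.
suspects = {ip: users for ip, users in by_ip.items()
            if len(users) > 1 and ip in ip_owner}
print(suspects)  # → {'198.51.100.17': ['filmfreak234', 'fangoria17']}
```

This is the whole trick: distinct usernames collapse into one source once the messages are keyed by address rather than by name.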
Renner also got several suspicious postings promoting Paramount Pictures
films, including "The Sum of All Fears" and "K-19: The Widowmaker." On Aug.
13, he received a series of messages from aptreke and aresolic, both hyping
the "Star Trek 2: The Wrath of Khan" DVD and a promotion that offered a
free subscription to a "Trek"-themed dial-up Internet service. To Renner, the postings
sounded like ad copy, not genuine fan messages. In one message, aptreke
wrote: "Now I have a 30-day free subscription to a Trek-themed dial-up.
There's all this content on my homepage now that is really unique.... Did
anyone else get this too and try it?"
Renner discovered that the messages came from the same IP address, one
registered to Paramount Pictures. Both Parry and Renner say they tried
contacting the studios and e-mailed queries to the people sending the
suspicious postings, but never received a response. They suspect other
studios of planting messages as well, but most postings were disguised by
the use of a separate Internet service provider.
"This is dirty tricks, not legitimate marketing," says Parry, who adds that
his site gets anywhere from 50,000 to 150,000 hits a week, depending on the
time of year. "It's also a slap in the face because the studios are using
our site to hype movies without paying for advertising. After all, what's
the difference between paying people to pretend to be film fans on Web sites
across the country and paying them to pretend to be happy customers in a
testimonial TV commercial?"
These questions bring up issues about studio marketing ethics and how they
apply to the Wild West environment of the Internet.
This isn't the first time movie studios have been caught using questionable
marketing practices. Last year, Newsweek revealed that Sony Pictures had
invented a fake film critic named Dave Manning, whom the studio quoted in
ads offering favorable blurbs. Shortly afterward, Sony admitted that two
employees had posed as moviegoers in man-on-the-street testimonial TV ads
to promote an earlier release.
At the time, rival studio marketers loudly decried Sony's activities,
saying they never used staffers or actors in TV testimonial ads (though
questions about their veracity in other movie ads led the studios to stop
using that type of advertising). But different rules seem to apply for the
Internet. In recent years, the Web has been inundated by viral marketing,
in which a variety of companies have used teenage "street teams" or their
own employees to tout CDs, sci-fi DVDs, skateboards, sneakers, video games
and teen apparel.
Universal Vice Chairman Marc Shmuger says his studio has regularly used
street teams to go online and talk up Universal films. He insists they are
not employees, but unpaid volunteers recruited by the Universal Music and
Video Distribution Group.
"It's aggressive marketing, but it is not deceptive marketing," he says.
"This is a technique used everywhere in corporate America--it's no
different from the girls who go into bars to tout cell phones and vodka.
However inept these postings were, they came from unpaid volunteers expressing
their unscripted enthusiasm. They were not posing as fans; they were fans.
We never knew the Web sites had attempted to contact them. If anyone asked who
they were, there's no question that they should have identified themselves."
Paramount publicity chief Nancy Kirkpatrick said her studio had no
knowledge of any employees planting plugs for its films. "People who go
online say they're working for Paramount or an agency hired by Paramount.
We don't do anything without full disclosure."
Other studios, which were not involved in these incidents, say their
staffers or hired teams don't hide their identities. "I won't pretend that
we've never put any seeding information into a Web site, but we never do it
covertly," says New Line interactive marketing chief Gordon Paddison, who
engineered the studio's groundbreaking Web campaigns for the "Lord of the
Rings" film series. "No one here is allowed to pose as a fan. When I'm
online, if someone asks who I am, I say, 'I'm Gordon from New Line.' "
Steve Peters, who runs argn.com, a site devoted to alternative reality
gaming, says his message boards are frequently invaded by street team fans
touting products. His site now has a discussion area devoted to whether viral
marketing is a "valid approach" or merely "insulting" lies.
"To see someone posting as someone they're not really ticks people off,"
Peters says. "I mean, if someone did that on TV, they'd get in trouble,
wouldn't they?"
In fact, the producers of the new TV show "Push, Nevada" were recently
busted for hiring actors to pose as members of a supposedly real-life Push,
Nev., high school hockey team on a segment of "Good Morning America."
Because "Good Morning America" is on ABC, the same network airing "Push,
Nevada," ABC was pilloried by skeptical TV columnists who found it hard to
believe that the network or "Good Morning America" wasn't in on the stunt.
It seems that entertainment conglomerates practice a double standard when
it comes to the Internet.
They have repeatedly blasted the Web as a haven for lawless pirates who
engage in unauthorized file-sharing of music and movies. In a recent
speech, News Corp. President Peter Chernin called the Internet "a
moral-free zone" whose future is threatened by rampant piracy.
Yet these same media giants have no qualms about using deceptive marketing
to publicize their products. It is widely believed, for example, that
studio staffers regularly go on Harry Knowles' site to plant positive
reviews of their films--or negative reviews of their competitors' movies.
These plantings have become commonplace ever since the runaway success of
"The Blair Witch Project," which was based in part on a clever Internet
viral marketing campaign.
But subsequent attempts to create Web buzz have been less fruitful, largely
because the planted street-team messages are so clumsily executed. "You can
spot these guys right away, because no real people write the way they do,"
Parry says. "No real person ever says, 'Hey, check out the trailer here,'
with a link to it."
The people who run movie Web sites worry that these planted messages are
alienating rank and file fans, pointing to aintitcoolnews.com, which now
seems to carry more weight with industry insiders than true fans. "I wish
these studios realized how unprofessional it is to plant things," Renner
says. "If you read these dumb messages, you'd think, 'Who was ever going to
believe 'K-19' was a good movie anyway?' "
"The Big Picture" runs every Tuesday in Calendar. If you have questions,
ideas or criticism, e-mail them to patrick.goldstein@latimes.com.
***************************
The Chronicle of Higher Education
On Behalf of Film Studios, Company Searches for Students Downloading Movies
By SCOTT CARLSON
Chief information officers and others who attempt to control file sharing
on college campuses have a new headache to deal with. MediaForce, a company
that tries to stop file sharing on behalf of movie companies, has been
patrolling the Internet and flooding some colleges and universities with
cease-and-desist requests -- some of them apparently justified.
The letters have led colleges to many students who readily admit having
downloaded movies that have recently been in theaters, such as Triple X,
Spider-Man, The Lord of the Rings, and Goldmember, the newest Austin Powers
film. Cindy Kester, the assistant director of academic computing at the
State University of New York at Binghamton, says that she has gotten about
30 requests from MediaForce since the beginning of the semester.
The company is similar to NetPD, which barraged colleges last year with
similar notices about songs by Michael Jackson and the rock band Incubus.
Officials at institutions complained because NetPD's requests were vague
and did not follow the requirements of the Digital Millennium Copyright Act.
MediaForce's requests seem to be in better order than NetPD's, however. As
the copyright act requires, they include a statement about the rights of
the copyright holder, the name of the offending file, and details about
both the time the file was found and its location. They do not have a
digital signature -- also required in a provision of the act -- but they do
provide a telephone number for Mark Weaver, MediaForce's "director of
enforcement."
Mr. Weaver did not return calls from The Chronicle.
"They're not too good at corresponding," says Jane DelFavero, the network
security manager at New York University. She has tried to contact
MediaForce several times to complain about the lack of a digital signature
on letters she has been sent, but has never received a reply. Once,
however, when she complained about factual inaccuracies in the body of the
letter, she got a new letter in reply with the errors corrected.
She has also never received a request to confirm that files have been taken
off line. "In the old days, when infringement was a picture on a Web site,
we would follow up more closely," she says. "But now it's more of a one-way
thing. They alert us, and we alert the user."
Most institutions seem to be following up on the requests. "I think if we
were really sticky, we could ask for a digital signature," says David
Bantz, the director of technology planning and development at the
University of Alaska at Fairbanks. "But they are sending them to the right
place, and we're finding evidence of what they say is going on."
Processing MediaForce's complaints, however, takes a lot of time. "I would
say that with the combined resources of me and other people working on it,
it takes about half a day" per complaint, Ms. Kester says. Staff members
hunt down the student, confront him or her, and set up a hearing with the
University Judicial Board.
Occasionally, MediaForce's letters do make one request that seems to be
outside the law. In a letter to Cornell University, MediaForce asked the
university to "terminate any and all accounts that [the student] has
through you."
After inspecting the letter, Georgia K. Harper, an expert in
intellectual-property law and a lawyer for the University of Texas System,
says that such a request is not clearly authorized in the digital copyright
act.
Curiously, several technology officials at universities also note a pattern
in a few cases on each campus: The computers of some of the students picked
out by MediaForce had been compromised, and a hacker had put the movies on
the hard drives.
Ms. Kester says she sent a technician to confront one student who had been
identified by MediaForce. The student adamantly denied that he had shared
files. "Our technical person took a look and sure enough, someone had taken
over his machine, installed KaZaA, and was serving a movie off the
machine," she says.
******************************
Federal Computer Week
Models of mayhem
The government wants to simulate the ripple effects of critical
infrastructure attacks
BY Jennifer Jones
Sept. 30, 2002
From major power outages and crippled telecommunications nodes to the
dramatic spread of pneumonic plague, government agencies have increasingly
used sophisticated modeling and simulation tools to play out mock disasters
since last September's terrorist attacks.
Yet few of those models take into account the set of "interdependencies,"
or specific repercussions, that affect the outcome when a disaster in one
industry wreaks havoc on the nearby, dependent infrastructures of other
sectors.
The electronic simulation of those interdependencies and relationships has
emerged as a field begging for more federal research and development.
"We are looking for new types of capabilities to prove the robustness of
infrastructures and to better equip decision-makers and policy analysts,"
said Steve Rinaldi, joint program director of the National Infrastructure
Simulation and Analysis Center (NISAC), led by Sandia and Los Alamos
national laboratories.
In some cases, existing solutions are being modified into new applications.
For instance, government officials are tweaking airport-modeling programs
to simulate worst-case scenarios at the nation's seaports. Another push has
vendors scrambling to adapt simulation tools usually used for planning and
analysis beforehand to double as command and control centers that can help
manage the infrastructure during a crisis should an anticipated event
actually unfold.
All the while, emphasis is being placed on interdependencies. In the
future, more simulation efforts will be designed to enable officials to
answer often confounding questions, such as what happens to the water,
telecommunications and financial infrastructures after massive failure in a
power grid, or what steps responders should take when catastrophic events
ripple across infrastructures.
However, the federal government's disaster-modeling capabilities currently
present a split picture: optimistic on one side, more cautious on the other.
"Each sector has the phenomenal ability to model their own individual
infrastructure," said Brenton Greene, deputy manager of the National
Communications System (NCS). "What is less robust and newer science and
frankly more challenging science is the ability to accurately and
predictively model the interdependencies among infrastructures."
NCS is co-managed by the White House and the Defense Information Systems
Agency and assists the president, the National Security Council and federal
agencies with their telecommunications functions. But it also serves an
important homeland security function by coordinating communications for the
government's national security and emergency preparedness initiatives.
Formidable challenges are tied to the fact that modeling interdependencies
requires the use of complex algorithms capable of processing large volumes
of data. On top of that, officials must overcome the technology and
equipment differences among established efforts in separate industries,
such as the work NCS has done in the telecommunications sector.
NCS' infrastructure modeling partnership with the telecom sector is well
advanced (see "Project eyes global early warning system for the Internet,"
below). And its work is an example of the established single-sector
modeling efforts that the push for interdependency modeling, led by NISAC,
must not only build on, but also overcome.
Addressing Interdependence
To foster more comprehensive infrastructure modeling, President Bush in his
National Strategy for Homeland Security pledged additional research in
areas such as analysis, technical planning and decision support environments.
This need was also cited extensively in a June National Academies report
that suggested ways to harness science and technology to fight terrorism.
"New and improved simulation design tools should be developed," the report
recommended. "These efforts would include simulations of interdependencies
between the energy sector and key infrastructures such as communication,
transportation and water supply systems."
To make sure that happens, Bush tapped NISAC to lead this redoubled effort,
which will pull in the private sector using R&D incentives.
"By funding modeling and simulation across critical infrastructures, we are
trying to get at the complexities brought about by interdependence,"
Rinaldi said.
Currently endowed with a $20 million budget that flows from the Pentagon's
Defense Threat Reduction Agency, NISAC has ramped up R&D efforts
significantly since its formal inception in 2000. At that point, the center
had only $500,000 in funding and was a mere joint effort between Sandia and
Los Alamos officials.
NISAC officials are now on a campaign to wrap in many of the modeling and
simulation efforts scattered across government.
"The plan is to take tools that have been developed and incorporate them
into a common platform within NISAC," said Steve Fernandez, manager of the
Infrastructure Protection Systems division at the Idaho National
Engineering and Environmental Laboratory (INEEL). "That way, when there is
a problem or threat, officials will have a virtual menu of different
modeling capabilities."
As a sign that this consolidation has begun, Fernandez referenced increased
NISAC involvement in national labs' efforts to model potential weaknesses
in supervisory control and data acquisition (SCADA) systems.
Such systems are a crucial part of the control mechanism used to manage
critical energy infrastructures because they direct the flow of energy and
molecules, he said.
The architecture and interfaces of the scattered SCADA systems have
become more open through advances in the technology industry, particularly
public networks, and this has proved to be a mixed blessing. "The evolving
SCADA systems are becoming more efficient and cost-effective, but arguably
less secure," an April Sandia report concluded.
The National Academies report also suggested that SCADA systems "pose a
special problem" and recommended encryption techniques, improved firewalls
and intrusion-detection systems to reduce the risk of disruption.
Indeed, increasing the security of SCADA systems is now a top R&D priority
for the labs, and simulations play a key role in that effort, Fernandez said.
"There are many different models as to how you should tie the SCADA systems
together," he said. The INEEL staff has put together SGI 64-bit processors
in a 32-node cluster to complete much of the modeling.
Means and Methods
To better simulate the interaction among industries, NISAC will pursue a
series of modeling approaches.
The center is working to advance screening tools to stitch together
existing simulators to form an early warning system. The system will rely
on the development of algorithms and technologies that offer a composite
view of all the nation's critical infrastructures.
Another approach is what NISAC's Rinaldi termed a "stocks and flows
approach" that will show goods and services flowing through and across
infrastructures.
"We built some pretty sophisticated models around the California energy
crisis," he said.
Using the California models, NISAC was able to show the compounding effect
of power outages. The models displayed the drain that the energy crisis put
on other industries, such as the agricultural community, which is heavily
reliant on hydroelectric power.
NISAC is also testing agent-based simulation. "An agent is an encapsulated
piece of software that acts as a decision-making piece of a physical
infrastructure," Rinaldi explained.
For instance, in a simulation of a stock market, agents could be individual
traders, acting separately but working to create the total functioning of
the infrastructure. In an electric power plant, the agents could be
generators or any objects working separately but impacting the whole.
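The agent-based idea can be illustrated with a toy Python sketch (the names, capacities and trip rule are invented for this example; it is not one of NISAC's models): each generator agent independently decides whether it can carry its share of the load, and one failure can cascade through the rest.

```python
class Generator:
    """A hypothetical agent: one generator making its own trip decision."""
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity   # megawatts this agent can carry
        self.online = True

    def step(self, load_share):
        # Each agent decides independently: trip offline if overloaded.
        if self.online and load_share > self.capacity:
            self.online = False
        return self.online

def simulate(agents, total_demand):
    """Iterate until the grid stabilizes or collapses; return megawatts served."""
    while True:
        online = [a for a in agents if a.online]
        if not online:
            return 0.0   # the cascade took down the whole grid
        share = total_demand / len(online)
        tripped = [a for a in online if not a.step(share)]
        if not tripped:
            # No agent tripped this round: a stable operating point.
            return min(total_demand, sum(a.capacity for a in online))

agents = [Generator(f"g{i}", capacity=100.0) for i in range(5)]
agents[0].online = False                     # lose one plant
print(simulate(agents, total_demand=350.0))  # → 350.0: the others absorb it
```

Raising `total_demand` to 450.0 pushes every remaining agent's share past its capacity, so they all trip and the function returns 0.0, the whole-system failure that interdependency models try to anticipate.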
A fourth area involves physics-based models, which will simulate operations
occurring within infrastructures. For instance, within an oil or gas
system, the models could capture the detailed operation of a pipeline.
In both agent-based and physics-based modeling, the impact of disasters on
infrastructures is more accurately simulated because of the level of
detail. In the former case, the different behavior of agents is factored
in. In the latter, the behavior of elements such as electricity or gas is
built into the models.
Finally, Rinaldi described population mobility modeling, which the center
is also exploring. "We are looking at how entities, namely people, move
through a geographic area," he said.
As individuals move through an area, they affect, for example, the
financial infrastructure by using an automated teller machine, or energy
systems by fueling and driving a car. "This is a very microscopic
view of how an individual moves through and interacts with
infrastructures," Rinaldi said.
The common thread in the five areas is infrastructure assessment, he
concluded. "We are looking for vulnerabilities, areas of mitigation and
methods of response," he said.
Leaning on Industry
Along with the centralization of modeling efforts among the labs, NISAC is
also getting more aggressive in its efforts to include private industry.
"One of the things that will be absolutely essential is for NISAC to work
closely with the owners and operators of the infrastructures," Rinaldi
said. "We are also working with the vendor and academic communities, which
have expertise in the operational characteristics and the network
topologies in place."
Vendors such as computer simulation specialist SGI have long worked with
government to develop high-performance computing systems, visualization
tools and advanced algorithms. That corporate history is now playing into
homeland security opportunities, said David Shorrock, SGI's marketing
development manager for government industries.
A key solution will be immersive visualization tools, which allow large
amounts of data to be processed and simulated, Shorrock said. "We have
honed [our] tool so that it is available to officials not only to practice
response but to use operationally as a command and control center," he said.
As the proposed Homeland Security Department continues to take shape, R&D
dollars are still flowing from various agencies. For instance, the Defense
Advanced Research Projects Agency recently embarked on a "conceptual
studies" initiative to address holes in simulation capabilities.
"Current trends in commercial high-performance computing are creating
technology gaps that threaten continued U.S. superiority in important
national security applications," DARPA officials reported during a June
unveiling of the effort.
To explore the gaps, Cray Inc., IBM Corp., SGI and Sun Microsystems Inc.
will each get $3 million to develop ways to analyze areas such as the
dispersion of chemical or biological agents and to work on advanced
intelligence and surveillance systems.
Shorrock also predicted that data fusion will be an area of intensive
government R&D focus. "Nobody has conceived the breadth of this problem,"
he said. "Research like this has not been done to the depth now needed to
satisfy the government's efforts to model disaster responses."
***************************
Federal Computer Week
Forensics meets time travel
Backup efforts aim to detect when damage occurred and restore systems to a
clean slate
BY John Moore
Sept. 30, 2002
In backup they trust. Many organizations have depended on a fail-safe layer
of storage technology to recover information in the event of data loss or
corruption, and commercial solutions abound, with offerings that stretch
from enterprise servers to individual systems. Since last September's
terrorist attacks, government agencies and corporations have invested more
in this area because they realize how much is at stake.
But in this new era of homeland security, some data loss is likely to occur
from malicious attacks, and some researchers believe current backup
technology cannot provide the necessary level of assurance.
A June report from the National Academies identifies data backup and
reconstitution as an area ripe for research.
Most "normal" backup methods, usually involving storage on tape or
disk, were developed under the assumption of benign and uncorrelated failure.
However, in the wake of a malicious attack, so-called reconstitution
requires a decontamination step that distinguishes between the damaged and
"clean" portions of a system, according to the National Academies report.
The key issue is determining when data was corrupted and restoring the most
recent backup files created before that point. It's a delicate task that
calls for a mix of forensics and time travel. Today's backup technology
offers help in this regard, but further refinements are under development.
Other related research topics include system survivability and the backup
needs of telecommuters.
Several initiatives aim to improve the security of key national
infrastructures, such as electrical utilities. The Idaho National
Engineering and Environmental Laboratory, for example, recently unveiled
its plans for the Critical Infrastructure Protection Test Station.
The station will explore the recovery and reconstitution of attacked
infrastructures, said Steve Fernandez, manager of the Infrastructure
Protection Systems division at the Idaho lab. The lab operates a sizable
power grid that researchers will use to locate vulnerabilities and test
countermeasures.
Work on the test station is scheduled to begin in fiscal 2003, Fernandez said.
Other efforts hit closer to home. Larry Rogers, a senior member of the
technical staff at Carnegie Mellon University's CERT Coordination Center,
said backup and data integrity issues "extend to people at home and can be
compounded by more people working at home."
A telecommuter, he said, could introduce corrupt files into the main
office's work environment. Rogers added that he has just started looking
into telecommuters' effects on security systems as a research subject.
The National Academies' report, "Making the Nation Safer: The Role of
Science and Technology in Countering Terrorism," puts this challenge in a
five- to nine-year research time frame. But many efforts are under way to
get ever closer to the target solutions.
Where We Are Now
The task of recovering data from a corrupted system requires two elements.
First, an organization must determine when the attack occurred.
"One of the biggest challenges you find is that [data corruption] can go
undetected for a period of time" that could span months, said Marco
Coulter, vice president of storage solutions at Computer Associates
International Inc. (CA), which sells backup software.
"Attackers who really want to do a lot of damage, and be creative in how
they do that, may try to slip in corruption in a way that it is not
detected for a long period of time," said Kevin Walter, vice president of
product management for information protection at Legato Systems Inc.,
another backup vendor.
Storage vendors offer several tools to help isolate the problem. CA, for
example, offers a file change recorder with its BrightStor High
Availability products. The utility constantly tracks changes to files,
creating a log of sorts that can help organizations determine when an event
occurred.
Antivirus software and intrusion-detection systems may also come into play.
"From a different perspective, security, you need distinct audit logs so
that you can go back and ask, 'At what time did this all start?'" Coulter said.
Once the time frame is established, the second element of recovery kicks
in: a series of backups conducted over time.
"You need to have what we call a line of images," Coulter said. An
organization that knows when an event occurred and maintains such a
collection of images can turn back the clock to a clean slate. Storage
executives refer to this approach as "point-in-time recovery."
An agency may achieve point-in-time recovery through a series of tape backups.
Complete backups may occur weekly or monthly, with incremental daily
backups to provide a base level of protection, Walter said. This approach,
however, could result in hours or even days of data loss.
Organizations that "can live with a 24-hour data loss may stick with
traditional backup," Walter said. But for those that cannot deal with much
data loss, a technique known as snapshotting could be the answer.
A snapshot is an efficient way to keep previous versions of files on hand
by tracking and recording only how the data changed over time. Snapshots of
data can be taken more frequently and less invasively than full backups.
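A minimal sketch of the idea (illustrative Python, not any vendor's snapshot engine; file deletions are ignored for brevity): each snapshot records only the files that changed since the previous one, yet replaying the deltas rebuilds the state at any recorded point in time.

```python
class SnapshotStore:
    """Delta-based snapshots over a {path: content} view of a file system."""
    def __init__(self):
        self.snapshots = []    # (timestamp, delta) pairs, oldest first
        self._last_seen = {}

    def take(self, timestamp, files):
        # Record only what changed since the previous snapshot.
        delta = {p: c for p, c in files.items() if self._last_seen.get(p) != c}
        self.snapshots.append((timestamp, delta))
        self._last_seen.update(delta)

    def restore(self, as_of):
        """Replay deltas up to `as_of` to rebuild the state at that time."""
        state = {}
        for ts, delta in self.snapshots:
            if ts > as_of:
                break
            state.update(delta)
        return state

store = SnapshotStore()
store.take(1, {"report.doc": "v1", "notes.txt": "v1"})
store.take(2, {"report.doc": "v1", "notes.txt": "v2"})
store.take(3, {"report.doc": "CORRUPTED", "notes.txt": "v2"})
print(store.restore(2))   # the last clean state, before the corruption at t=3
```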
Legato's NetWorker PowerSnap for Network Appliance Inc. (NetApp) products
enables administrators to establish a policy on how frequently snapshots
are taken, as often as once every 30 minutes.
If something happens, "you can do a roll-back recovery to that last good
snapshot," Walter said. The product works on Oracle Corp. databases saved
on NetApp filer storage appliances.
A number of vendors offer snapshot products, including EMC Corp. at the
enterprise level, and Symantec Corp., which offers Ghost 7.0 for
incremental PC backups.
Snapshots, although useful, have their limitations. Michelle Butler,
technical program manager for the storage-enabling technologies group at
the National Center for Supercomputing Applications, said the technology
she has encountered performs snapshots for individual storage volumes. But
large file systems may need to span multiple volumes, and "file systems are
growing astronomically," she said.
What's Next?
Members of industry, government and academia are researching the next
developments in backup technology.
The Defense Advanced Research Projects Agency is working on an innovative
approach using the concept of self-healing systems (see box, Page S20).
CA's vision is to tighten the links between backup, storage resource
management and security components. This will provide more efficient backup
and make it more likely that organizations will be able to "cleanse" their
data, Coulter said.
According to Coulter, integrated backup and storage resource management
distributes data to the most appropriate and cost-effective storage medium:
Data that requires fast recovery is routed to disk backup, while less
important data goes to tape.
Better backup-to-security integration, meanwhile, will help organizations
more readily determine the time of data corruption.
Legato's research aims to integrate its Automated Availability Manager for
Microsoft Corp. Exchange with virus-detection software, Walter said. This
integration will enable Legato's product to "automatically detect certain
scenarios," such as viruses, "and respond to them in a programmatic way."
Legato offers virus-detection integration on a custom basis through its
professional services arm. "In the future, we would look to provide some
turnkey integration," Walter said.
Beyond product integration, technologists hope to fine-tune the specific
task of locating damaged data in time and space.
In cases of intentional damage, "recovering to a clean state has been a
'completely repaint the room' process instead of just painting over the
scratch," Coulter said. Industry isn't currently equipped to "discover what
a person touched and redo the files. That would take too long."
Today, point-in-time recovery means rebuilding an entire file system. But
CA's research and development aims to pinpoint and recover a specific block
or file that has been altered, Coulter says.
Granularity, a measure of system flexibility, is also central to the
National Academies' take on backup. The group's report envisions a process
of "distinguishing clean system state (unaffected by the intruder) from the
portions of infected system state, and eliminating the causes of those
differences."
Meanwhile, the research continues, as does the growth of file systems
requiring backup.
**************************
Federal Computer Week
Project eyes global early warning system for the Internet
BY Jennifer Jones
Sept. 30, 2002
The National Communications System (NCS) and its private-sector
telecommunications partners have developed infrastructure simulation models
that make use of proprietary data from industry to identify single points
of failure in the nation's telecom infrastructures.
"We've gotten to the point where we could turn that into an exercise
designed to build more robustness into the network and increase our ability
to reconstitute a network that has gone down," said Brenton Greene, deputy
manager of NCS.
NCS is also working to replicate its success in modeling telecom outages in
its attempts to safeguard the Internet.
"We've got a project going on now where we have taken a number of
sophisticated 'health-of-the-Internet' tools, and we're working to
integrate those into a global early warning capability," Greene said. "It
is very fascinating work. We are trying to see how effectively we could see
events like a denial-of-service attack emerging out of southwest Asia or
somewhere in Africa. We want to be able to see it coming and mitigate
damages and alert others." A denial-of-service attack occurs when a hacker
interrupts a service by overloading a system.
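The early-warning idea can be illustrated with a toy sliding-window detector in Python (the window, threshold and addresses are invented for the example; the tools Greene describes are far more sophisticated): flag any source whose request rate in the recent window crosses a threshold.

```python
from collections import Counter, deque

class FloodDetector:
    """Flag sources whose request rate in a sliding time window is too high."""
    def __init__(self, window=60.0, threshold=100):
        self.window = window         # seconds of history to keep
        self.threshold = threshold   # requests per window that trigger an alert
        self.events = deque()        # (timestamp, source), oldest first

    def observe(self, timestamp, source):
        """Record one request; return the sources currently over threshold."""
        self.events.append((timestamp, source))
        while self.events and self.events[0][0] < timestamp - self.window:
            self.events.popleft()    # expire events outside the window
        counts = Counter(src for _, src in self.events)
        return [src for src, n in counts.items() if n >= self.threshold]

det = FloodDetector(window=10.0, threshold=3)
for t in (1.0, 2.0, 3.0):
    flagged = det.observe(t, "203.0.113.9")
print(flagged)   # → ['203.0.113.9']: three hits inside one window
```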
Some of the tools are very promising, but the Internet modeling project is
in its earliest stages of development, Greene said.
NCS started the project about nine months ago with various Internet
infrastructure and security companies, such as VeriSign Inc., Lumeta Corp.
and Akamai Technologies Inc.
Developing what Greene calls more synoptic views of networks is not just a
technical challenge. NCS officials have to walk a fine line when asking
carriers and others to give them sensitive data about their networks,
including details on security breaches.
Safeguarding the nation's backbone networks is a major undertaking that
requires advanced modeling of the connections among multiple carriers. But
simulating events among industries is far more difficult, Greene said.
"It's astoundingly complex," he said. "You can't take a simple model and
apply it to massively complex relationships. While it is challenging, this
modeling is advancing in a very promising way. However, I would not call it
a mature capability by any means."
**************************
Federal Computer Week
DARPA explores self-healing system
BY John Moore
Sept. 30, 2002
In research under way at the Defense Department, backup and recovery
efforts take the form of self-healing systems.
The Defense Advanced Research Projects Agency is focusing on the concept in
its Hierarchical Adaptive Control for Quality of Service Intrusion
Tolerance (HACQIT) initiative.
The HACQIT architecture calls for critical applications to run on separate
local-area networks, isolated by an out-of-band computer, one that is
outside the primary system, with monitoring, control and fault diagnosis
software.
The architecture, coupled with HACQIT's intrusion-detection approach, sets
the stage for what the project's contractor, Teknowledge Corp., calls
"continual recovery."
Here's how it works: HACQIT maintains a database of previous intrusions,
enabling the system to stop known attacks and viruses.
But HACQIT also houses a list of allowable system requests, based on an
organization's policies. This feature denies requests outside the scope of
permissible actions, thus ferreting out previously unknown viruses or attacks.
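The two checks described can be sketched as a request filter in Python (the request shapes, paths and signatures are invented for the example, not HACQIT's actual interfaces): known attacks are blocked first, then anything outside the policy allowlist is denied.

```python
# Hypothetical policy: the only (method, path) pairs the organization permits.
ALLOWED = {("GET", "/index.html"), ("GET", "/status"), ("POST", "/login")}

def screen(request, known_bad_signatures, allowed=ALLOWED):
    """Deny known attacks first, then anything outside the allowlist."""
    method, path, payload = request
    if any(sig in payload for sig in known_bad_signatures):
        return "blocked: known attack"             # the intrusion-database check
    if (method, path) not in allowed:
        return "blocked: not permitted by policy"  # catches novel attacks too
    return "allowed"

bad = ["cmd.exe", "../.."]
print(screen(("GET", "/index.html", ""), bad))   # → allowed
print(screen(("GET", "/etc/passwd", ""), bad))   # → blocked: not permitted by policy
```

Note the order of the checks: the allowlist is what denies attacks no signature database has seen yet.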
"Because we can do extremely rapid detection of problems and constantly
monitor system health, we can continuously repair the malicious effects of
intrusion," said Neil Jacobstein, Teknowledge's president and chief
executive officer.
Unauthorized system requests are terminated. But if a rogue process creates
or deletes files, HACQIT begins bringing in backed-up files. Clean files
are dispatched via the out-of-band computer to replace modified or deleted
files, while created files are deleted.
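That repair step amounts to reconciling the running system against a clean baseline. A hypothetical Python sketch (illustrative only, not Teknowledge's implementation), with file systems modeled as {path: content} dicts:

```python
def repair(current, baseline):
    """Return the restored state plus the repair actions taken.

    Files the rogue process created are deleted; files it modified or
    deleted are restored from the clean out-of-band copy (`baseline`).
    """
    actions = []
    for path in sorted(set(current) | set(baseline)):
        if path not in baseline:
            actions.append(("delete", path))    # intruder-created file
        elif current.get(path) != baseline[path]:
            actions.append(("restore", path))   # modified or deleted file
    return dict(baseline), actions

baseline = {"app.cfg": "clean", "index.html": "clean"}
current = {"app.cfg": "clean", "index.html": "defaced", "dropper.exe": "x"}
state, actions = repair(current, baseline)
print(actions)   # → [('delete', 'dropper.exe'), ('restore', 'index.html')]
```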
Jacobstein believes HACQIT can be readily commercialized. "We would like to
license it to one of the big providers of software security," he said.
**************************
Government Computer News
Microsoft wants software to be 'public utility'
By Susan M. Menke
Acknowledging "too many vulnerabilities in the product," Microsoft Corp.
vice president Mike Nash said software must achieve "the same level of
trust as a public utility" that supplies 120 volts reliably from every
electrical outlet. Nash heads the security business unit that early this
year enforced a 10-week stand-down of all development at Microsoft while
11,000 coders learned about threat modeling and peer-reviewed each other's
work.
Nash said Visual Studio .Net is the first product to emerge from the
company's "security push process." Windows .Net Server 2003, he said, will
come out somewhat later than planned because of the security push, and it
will arrive with Web server features turned off, because unused features
could otherwise present a security vulnerability. He
said the goal is to make software "secure by design, by default and by
deployment."
The Microsoft.com Web site has become the test bed for all the company's
enterprise-level products, Nash said, because "no uniform resource locator
or domain has more hack attempts."
Nash said Microsoft chairman Bill Gates was prompted by the importance of
software in daily life and commerce to consider security as "an industry
problem." The company's security emphasis will not only be a change in
philosophy but will change the behavior of its engineers and managers, he
said.
Microsoft aims to "reduce the number of vulnerabilities that customers find
in the products," Nash said, by such means as rigorous reviews before
release. "It is clear we have a lot of work to do," he said. "This will
never be over. It has to be ingrained."
************************
Government Executive
State Department asks firms to create intelligence database
By Bara Vaida, National Journal's Technology Daily
Secretary of State Colin Powell on Monday asked the private firms that make
up the President's Council of Advisors on Science and Technology (PCAST)
for help in creating an integrated intelligence database that would ensure
that the more than 300 U.S. embassies do not grant visas to individuals who
mean harm to the United States.
Powell said the State Department needs a system where its overseas officers
can enter applicant data and cross-reference it against a network of
compatible national security databases to confidently grant visas to the
estimated 7 million people a year who apply to enter the country.
"The State Department needs a system that supports them in their assessment
of ... applications ... to make sure it is checked against every
intelligence database and that those databases are integrated so that they
only have to check once," Powell told a PCAST gathering at State. "Maybe
you all can help us with ideas for that."
Powell said he recognizes that database integration is a matter of
technology and policy, adding that the head of the proposed Homeland
Security Department would be responsible for ensuring that the policies
smooth the integration of intelligence information.
Powell also said that his department is spending $200 million a year to
improve its information technology systems and that he has two computers on
his desk: one for external e-mail and the other to access the agency's intranet.
"I keep no reference material, no dictionaries, no encyclopedias in my
office anymore ... because all you need is a search engine" on the
Internet, he said.
Before Powell spoke, PCAST members discussed a report on federal science
and technology research and recommended that the Bush administration
consider establishing multiyear engineering fellowships to stem the
declining number of students seeking doctorates in that field.
****************************
Government Executive
Defense tracking system proves crucial to port security
By Molly M. Peterson, National Journal's Technology Daily
A real-time tracking system developed years ago for the Defense Department
is emerging as a crucial component of an industry-driven cargo security
network that aims to prevent terrorists from smuggling weapons of mass
destruction into major ports.
"The big concern is that terrorists will put a bomb or a chemical, or even
themselves, into one of these containers coming into the United States," said
Mark Nelson, a spokesman for Savi Technology, which helped build the
Defense Department's Total Asset Visibility (TAV) network, and is now
helping to spearhead a public-private effort to achieve an "end-to-end"
tracking system for commercial cargo.
The Smart and Secure Tradelanes (SST) port-security initiative aims to
enable manufacturers, shippers and port officials to monitor the contents
and location of the thousands of shipping containers that enter U.S. ports
each day. "The key is to go back to the point of origin and make sure
everything's secured and certified from the moment of ... putting the
shipments together, all the way to the point of destination," Nelson said.
Shipping companies participating in the program would equip cargo
containers with radio-frequency identification devices that would
communicate with satellite systems and Web-based software. Those systems
would work together to notify port officials of any unauthorized tampering
with shipping containers before they reach U.S. ports.
"Wherever you're sitting, you're going to know exactly when that container
was violated, where it is, what's inside of it, whether it's on schedule or
delayed, and [whether] an authorized person opened it," Nelson said.
Nelson said the Defense Department has extended its TAV network to many key
commercial seaports, airports and trucking terminals around the world. "In
some places where that network already exists, it would make sense to
leverage it for commercial purposes," he said. "A lot of the infrastructure
is already in place. It's been paid for by government dollars, just like
the Internet was many years ago and now is being used also for commercial
purposes."
But even with that existing network in place, Nelson said it would probably
take years to fully deploy the SST network.
"When will it be completed? Probably never," Nelson said. "But the main
part of it is to ensure the security of imports, and that will probably be
within a couple of years."
Government and private-sector officials launched a key segment of the SST
on Friday. Sen. Charles Schumer, D-N.Y., and Rep. Robert Menendez, D-N.J.,
announced that the Port of New York and New Jersey would be the first SST
deployment site on the Eastern seaboard. The initiative was launched at
major ports in Belgium, the Netherlands and the United Kingdom that same day.
"What we have here is an unprecedented partnership between private
companies and federal agencies," Schumer said, noting that 60 percent of
North American trade enters the Port of New York and New Jersey, which
handled more than 2 million cargo containers last year. "SST will help
protect this supply chain and [prevent] a nuclear, biological, chemical or
conventional weapon from reaching our shores."
***************************
Government Executive
Cybersecurity regulations imminent, industry and government warn
By Neil Munro, National Journal
In the debate over national cybersecurity strategy, most of the
participants insist they don't want new regulations. Instead, they say,
they want the marketplace to create cyberdefenses against hackers, viruses,
and other Information Age threats.
But regulations are coming anyway, some industry and government officials
warn, in part because the high-tech sector is reluctant to take on new
burdens during an economic slowdown. And some factions in the debate
actually want regulations that would boost information-sharing within
industry, increase federal spending for industry's priorities, and
encourage lawsuits against companies that have sloppy computer defenses.
Congress and public concern will pressure tech companies to strengthen
cybersecurity with a blend of threats, broad regulations, and publicity,
according to James Lewis, director of the technology program at the Center
for Strategic and International Studies. A similar mix of pressures in the
early 1900s led to improved safety in the food, mining, and railroad
industries, Lewis said.
The White House released its draft plan on September 18, "so that everyone
in the country can tell us what the strategy should be," said Richard
Clarke, the administration's cybersecurity chief. The report does not call
for legislation or regulations, but instead offers "17 priorities and 80
recommendations." The plan largely limits government's role to boosting
public awareness, funding extra research, fostering information-sharing,
and operating its own cyberdefenses, officials said. "The government cannot
dictate. The government cannot meddle. The government cannot alone secure
cyberspace," Clarke declared.
This language is reassuring to the business community, which fears
regulation as much as it fears cyberattacks such as "distributed denial of
service" incidents that can stop online purchases.
Clarke's language reflected the White House's decision to strip many
detailed recommendations for new laws and regulations from the draft plan
before it was released. For example, earlier drafts had called for board
members to assume liability for corporate security policies; the
preliminary language also would have required Internet service providers to
supply their customers with new types of anti-hacker software. Industry
officials are still wary that such regulations may reappear in the final
version that is to be signed by the president.
Industry executives also fear that stringent regulations will turn off
consumers, and that tech companies will lose money as a result. And they
worry about liability risks, said Stewart Baker, a partner at the law firm
Steptoe & Johnson who has clients in the high-tech industry. Customers may
reject security measures they find intrusive, he said, and sales of
security services may not be high enough to cover companies' investment costs.
For the computer industry, which has a hard time predicting security
problems or the cost of compensating victims, liability is an increasingly
significant issue. The White House is continuing to prod company auditors,
insurance agents, and citizens to pay more attention to information
security, industry experts say. So far, a few entrepreneurial lawyers have
sought economic damages for computer-security problems, but the suits have
largely failed, in part because the claims are still so novel.
Occasional comments from government officials tend to heighten industry's
concerns about liability. For example, on September 18, Howard Schmidt,
vice chairman of the White House Critical Infrastructure Protection Board,
compared computer security to seat belts, which were at first treated as an
inconvenience but are now an accepted part of driving. Because many
lawsuits grow out of complaints about automobile safety, Schmidt's comment
"is a little too close to the surface for industry's tastes," Baker said.
"There is a real worry that, sooner or later, [liability] will be seen as
an attractive way for the government to get people to do what they want:
Sic the lawyers on them."
On the other hand, the White House's efforts to boost public awareness
of cyberdefense issues can create demand for new products, some executives
say. "The debate is going to get the public engaged in a constructive way,"
said Bill Sweeney, head of global public policy for Electronic Data Systems
in Plano, Texas. It "will also highlight opportunities for the market and
technology to address some of these real problems," he said.
Industry officials also hope for some largesse from Congress. For example,
many executives backed a measure drafted by Rep. Fred Upton, R-Mich., that
would have granted corporate tax breaks for investment in
information-security programs. "As American business recognizes the
increased cost of security, that bill will come back up," Sweeney
predicted. "At some point in time, you're going to get into a cost
discussion," he said, which might include some kind of surcharge on
information technology that would be used to pay for the security add-ons.
So far, marketplace conditions are not helping to boost security, said Ira
Parker, general counsel for Genuity, an Internet firm. Many
telecommunications companies are already in bankruptcy, and others are
trying to cut inefficiencies in ways that increase cybersecurity
vulnerabilities, according to Parker.
The White House security plan is "essentially an appeal to the private
sector to do something," said Warren Axelrod, a senior computer-security
executive at the financial services firm Donaldson, Lufkin & Jenrette. "If
the private sector does not respond, they will only have themselves to
blame if along comes a slew of burdensome laws and regulations."
****************************
Computerworld
GSA to unveil top 20 security flaws, focus on fixes
By Paul Roberts, IDG News Service
SEPTEMBER 30, 2002
The focus will be on fixes this week when the U.S. General Services
Administration (GSA) unveils its list of the top 20 Internet security
vulnerabilities to a gathering of about 350 government CIOs and IT
professionals. The meeting takes place Wednesday at the offices of the GSA
in Washington.
This is the third year the list has been released to the public. Compiled
by the nonprofit SANS Institute Inc. and the FBI's National Infrastructure
Protection Center, the list is intended to raise awareness of serious
computer vulnerabilities and offer IT administrators a way to prioritize
vulnerabilities, encouraging them to patch the most dangerous holes in
their computer infrastructures.
Past lists have been divided into three categories: general
vulnerabilities, Windows vulnerabilities and Unix vulnerabilities, with
previous issues ranging from broad concerns such as the failure to maintain
complete system backups to specific platform and product flaws such as
programming vulnerabilities in the Remote Data Services component of
Microsoft Corp.'s Internet Information Server.
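The list's purpose, as described above, is to let administrators patch the most dangerous holes first. A minimal sketch of that triage, using invented entries and severity scores (not the actual SANS/FBI Top 20):

```python
# Hypothetical entries for illustration; the real list and its rankings
# are published by the SANS Institute and the FBI's NIPC.
vulns = [
    {"name": "Unpatched IIS RDS component", "platform": "Windows", "severity": 9},
    {"name": "No complete system backups", "platform": "General", "severity": 6},
    {"name": "Default SNMP community strings", "platform": "Unix", "severity": 8},
]

# Sort so the most dangerous holes are addressed first.
for v in sorted(vulns, key=lambda v: v["severity"], reverse=True):
    print(f'{v["severity"]:>2}  [{v["platform"]}] {v["name"]}')
```

Even this simple ordering captures the list's value: with thousands of published flaws, a ranked subset tells a short-staffed IT shop where to start.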
Unlike past years, however, this year's conference will do more than just
raise red flags. Underscoring the Bush administration's efforts to enlist
the private sector in securing the nation's IT infrastructure, officials
from network vulnerability assessment companies such as Qualys Inc.,
Foundstone Inc. and Internet Security Systems Inc. will be on hand. The
companies plan to unveil a list of tools and services that can detect and
correct many of the leading common vulnerabilities and exposures -- or CVEs
-- on this year's list, according to a source involved in planning the event.
Those and other companies have worked closely with the SANS Institute and
government agencies during the past four months to compile the list,
according to the source.
The conference will also highlight NASA's program to thwart Internet
attacks on its network of more than 120,000 machines, according to the
source. That initiative relies on sharing information about vulnerabilities
and attacks among different IT groups within an organization, creating a
transparent and competitive environment in which IT managers are judged by
the security of their systems.
The GSA is expected to hold up the NASA program as a model other government
agencies and private companies can use to reduce the number of attacks on
their own systems.
Also at the conference, the GSA will announce an effort to expand the
government's Safeguard program to help audit the government's own systems
for common vulnerabilities, according to a statement from the GSA.
The Safeguard program, run by the Center for Information Security Services,
provides professional services and products to federal agencies to help
protect them against potential threats.
Although targeted at IT professionals within the federal government, the
annual announcement of the 20 top Internet vulnerabilities is recognized by
many as a list of vulnerabilities that must be addressed for a Web site or
corporate network to be considered secure.
***************************
Computerworld
Web site defacements hit all-time high in September
By David Legard, IDG News Service
SEPTEMBER 30, 2002
The number of Web site defacements has reached an all-time high, with more
than 9,000 attacks this month, according to London security consultancy
mi2g Ltd.
The figure is 54% higher than August's figure of 5,830 defacements, which
was itself a record high, mi2g said in a statement [found at
http://www.mi2g.com/cgi/mi2g/press/250902.php].
In particular, there has been rising antagonism across the digital world
against the U.S. This month has seen defacements of Web sites belonging to
the House of Representatives, the Department of Agriculture, the Department
of Education, the National Park Service, the Goddard and Marshall Space
Flight Centers and the U.S. Geological Survey, mi2g said.
According to mi2g, U.S.-registered Web sites have been successfully defaced
4,157 times so far this month, followed by 835 for Brazilian sites, 376 for
the U.K., 356 for Germany and 285 for India.
Hackers are finding an increasing number of vulnerabilities in operating
systems, server software and applications and libraries deployed on
mission-critical systems, making it impossible for systems administrators
to patch vulnerabilities without suffering severe downtime, the consultancy
said.
Although the majority of defacements were made on systems running Microsoft
Corp.'s Windows operating system, a significant number of them were made on
Linux, BSD Unix and Solaris-based machines.
The total number of defacements this year has already surpassed the total
number for last year, mi2g said. Web site defacements have steadily risen
in number from 1998, when there were 269, through 1999 (4,197), 2000
(7,821) and 2001 (31,322). Already this year, 40,116 defacements had been
recorded by Sept. 25, leading to projections of 55,000 Web site defacements
for the whole of 2002, mi2g said.
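The figures mi2g quotes are internally consistent, and the 55,000 projection is a straight-line extrapolation. A quick check of the arithmetic as reported:

```python
# Sanity check of mi2g's figures as quoted in the article.
defacements_so_far = 40_116  # recorded by Sept. 25
day_of_year = 31 + 28 + 31 + 30 + 31 + 30 + 31 + 31 + 25  # Sept. 25 = day 268
projected = defacements_so_far / day_of_year * 365
print(round(projected))  # about 54,600, consistent with the ~55,000 projection

august, september = 5_830, 9_000  # September figure given as "more than 9,000"
rise = (september / august - 1) * 100
print(round(rise))  # 54(%), matching the quoted increase over August
```

So both the 54% month-over-month rise and the full-year projection follow directly from the raw counts in the article.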
****************************
USA Today
Diplomas faked on Net worry universities
By Marcella Fleming, Gannett News Service
The Internet began as an electronic link between universities. Now, the
World Wide Web is coming back to bite the hands that made it by peddling
phony college credentials.
For a fraction of a year's tuition, you can obtain diplomas, transcripts
and letters of reference, all without cracking a book.
Some Web-based companies brag about how authentic their certificates look,
all the while carrying the disclaimer "For entertainment purposes only."
But officials at several universities aren't amused.
They've told Web-based merchants such as fakedegrees.com to stop producing
phony diplomas, transcripts of coursework and grades, and other such
products that mention their schools.
"What's the harm?" asked Tom Bilger, who is the registrar at Ball State
University in Muncie, Ind. "Somebody makes up a degree and now works in a
nursing home." What's more, he said, "they're taking a job from a qualified
person."
Depending on the Web site, you can get fake diplomas from real colleges,
made-up colleges or colleges that exist only as post office boxes. Some
advertise their wares as novelty gift items. Others push them as résumé
enhancers.
John Bear, a nationally known author and expert who researches and writes
about diploma mills, estimates there are 500 different company names but
says most are run by roughly the same 100 or so people. He conservatively
estimates that diploma mills are a $250 million-a-year industry. "It's this
incredible, insidious thing," he said.
Although an Internet search will turn up dozens of Web sites for anyone
seeking a synthetic sheepskin, some send out unsolicited e-mails en masse.
"They'll say, 'You have qualified for a prestigious diploma,' " Bear said.
"I have a Harvard M.D. on my wall; cost me $50."
Phony diplomas aren't new; experts trace them to at least the 19th century.
A traditional diploma mill sells a false certificate for a few thousand
dollars and, perhaps, a book report or two, but the newer cyber-breed
diplomas are sold for far less.
Fakedegrees.com, for example, requires a $75 "membership" to construct
phony diplomas.
Finding exactly where to aim complaints at the dozens of Web-based diploma
mills is difficult, say college officials.
Indeed, Fakedegrees.com is reportedly based in Spain. Officials could not
be reached for comment. Calls to degrees-r-us.com are routed to voice mail.
In some states, including Illinois, higher education officials are pushing
state lawmakers to criminalize trafficking in bogus degrees.
Colleges, however, often pursue counterfeiters themselves, as did Purdue
University.
Last school year, Purdue told fakedegrees.com to pull all references to the
West Lafayette, Ind.-based school from its menu, said Joseph L. Bennett,
vice president for university relations.
But about a month ago, the registrar's office told Bennett there were still
Purdue references on the site.
So Bennett e-mailed the site.
"I quickly got another response that said, 'Sorry, we meant to take
everything down.' I'll be following up on this every month."
**************************
Info World
Virus poses as Microsoft security patch
By Matt Berger
September 30, 2002 2:52 pm PT
SAN FRANCISCO - A virus posing as a security patch from Microsoft is
circulating on the Internet, Microsoft confirmed Monday.
The virus is being distributed in a hoax e-mail that advertises a patch for
a series of vulnerabilities in Microsoft's Internet Explorer Web browser
and Outlook software. The authentic patch for those flaws was actually
released in February. Microsoft said that it has not updated the patch and
that the e-mail is in fact fraudulent. [Full story:
http://www.infoworld.com/articles/hn/xml/02/09/30/020930hnmspatch.xml?s=IDGNS]
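One simple defense follows from the article: Microsoft distributes patches through its website, not as e-mail attachments, so mail claiming to carry a Microsoft "security patch" is inherently suspect. A heuristic sketch of that rule (the function name and logic are illustrative, not a real mail-filter API):

```python
import re

def looks_like_patch_hoax(subject: str, has_attachment: bool) -> bool:
    """Flag mail that claims to carry a security patch as an attachment.

    Vendors such as Microsoft publish patches on their websites rather
    than mailing them out, so this combination warrants suspicion.
    """
    claims_patch = bool(re.search(r"security\s+(patch|update)", subject, re.I))
    return claims_patch and has_attachment

print(looks_like_patch_hoax("Microsoft Security Patch for IE", True))   # True
print(looks_like_patch_hoax("Meeting notes", True))                     # False
```

A real filter would inspect headers, sender domains and attachment types as well; the point here is only that "patch by e-mail" is itself the red flag.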
************************
Lillie Coney
Public Policy Coordinator
U.S. Association for Computing Machinery
Suite 510
2120 L Street, NW
Washington, D.C. 20037
202-478-6124
lillie.coney@xxxxxxx