Clips May 2, 2002

ARTICLES

Burns Predicts Swift Movement On Spam Legislation
Lawmakers Take Another Shot at 'Morphed' Kiddie Porn
Education Key To Keeping Kids Away From Net Porn - Report
Scientists trot out remote-control rats
Call made for homeland strategy
Davis: Streamline homeland tech
Virus Maker Sentenced
OpenOffice.org releases Version 1.0 of its free office suite
Grid helps science go sky-high
Lessons Learned at Dot-Com U.
Eminem CD could be 'pirate proof'
The Heavy Lifters of Computation
Virtually Rebuilt, a Ruin Yields Secrets
Hacker pleads guilty to accessing NASA system
Army lab will double its computing capacity
How some readers fight spam
Army will tweak its network and software practices
Philips finds way to 'paint' LCD screens
House lawmakers advocate biometrics-based licenses
IRS sets the standard for protecting privacy
Tech firms ally to push homeland security solutions
Technology: Senator blasts broadcasters for missing digital deadline
Technology: HDTV: a 'revolution' in search of couch potatoes
Zentropa says it has designed a revolution
Canada and broadband Internet access
Broadband the decade's boomer, says Budde
Canada vulnerable to cyber attacks
Like hackers to a honeypot
New Zealand slips in e-government stakes
Wireless TV technology may benefit rural areas
The Art of Misusing Technology
HP Unveils New Digital Photo Storage Scheme

*****************
Washington Post
Burns Predicts Swift Movement On Spam Legislation
By David McGuire
Washtech.com Staff Writer

Sen. Conrad Burns' (R-Mont.) bill designed to stem the rising tide of unsolicited commercial e-mail - or "spam" - should make its way to the full Senate before the end of May, Burns told reporters today.

"It looks like we're finally going to get some action on the spam bill," Burns said. "I really think we will pass this in the Senate."

The Senate Commerce Committee, of which Burns is a member, is set to vote on the spam bill May 16. Burns said he had polled many of his colleagues on the committee and is confident that he has the votes he needs to report the bill out of committee and send it to the Senate floor.

Burns said that he and cosponsor Sen. Ron Wyden (D-Ore.) should be able to make a convincing case to bring the legislation to a vote in the Senate before the end of the year.

His bill requires that commercial e-mail messages contain valid return addresses that recipients can use to opt out of receiving further unsolicited e-mail. Spammers who fake their e-mail or physical addresses would face stiff fines and criminal penalties under the bill.

The legislation also would allow state attorneys general to bring private action on behalf of residents who are continually besieged by unwanted spam. Spammers who willfully violate the provisions of the law could be fined as much as $1.5 million.

"If you click 'I don't want any more of this stuff' and they don't honor it, I think there ought to be penalties," Burns said today.

Although federal lawmakers already have some tools they can use to pursue deceptive spammers, Burns said that his bill would give those rules teeth.

Burns said Commerce Committee Chairman Ernest "Fritz" Hollings (D-S.C.) supports the spam legislation.

On the same day that it considers the spam legislation, the committee also is slated to vote on Hollings' sweeping Internet privacy legislation, which would set federal ground rules for how companies can collect and use consumers' personal data online.
*****************
Washington Post
Lawmakers Take Another Shot at 'Morphed' Kiddie Porn


Just weeks after the U.S. Supreme Court voided elements of a controversial child pornography law, a cadre of lawmakers has introduced new legislation aimed at banning pornographic images that have been digitally "morphed" to appear to depict children.

Led by Rep. Lamar Smith (R-Texas), more than a dozen lawmakers on Monday night introduced the legislation, which bans many of the same images that were unlawful under the now-defunct 1996 prohibition.

On April 16, the Supreme Court struck down portions of the 1996 Child Pornography Prevention Act, which outlawed the possession or distribution of pornography containing computer-generated images showing children apparently engaged in sex acts.

In a 6-3 ruling written by Justice Anthony Kennedy, the high court overturned the prohibition, calling it unconstitutional and overly broad.

Rep. Mark Foley (R-Fla.) today called the Smith bill a necessary response to the Supreme Court ruling.

"The high court, in siding with pedophiles over children, forces us into action. Today, united, we begin to reverse the damage," Foley said. Foley, who cosponsored the Smith legislation, joined Smith and U.S. Attorney General John Ashcroft at a press conference today to unveil the proposal.

"The Supreme Court decision did grave injury to our ability to protect children from exploitation," Ashcroft said.

Smith said the new proposal represents a "substantially narrowed" version of the 1996 prohibition. He said the updated bill should pass constitutional muster.

Smith and other lawmakers worked with the Justice Department to develop the bill.

Under Smith's proposal, digitally generated pictures of prepubescent children would be banned outright.

While images depicting older children would also be banned, the bill creates a new legal safe harbor for pornographers who can prove that they did not use real children to create their images.

The Smith bill also would prohibit the sale or purchase of child porn images, even in cases where no such images exist. The bill also would make it illegal to show pornography to minors.

In addition, the legislation would compel the FBI to create a database of known images of child pornography that prosecutors could use in criminal cases against suspected child pornography traffickers.

Law enforcement officials say the high court ruling has caused investigators to drop or forgo prosecuting cases in which they cannot positively identify children in the images in question.

The Supreme Court decision upheld a similar 1999 ruling by the 9th U.S. Circuit Court of Appeals. Michael Heimbach, chief of the FBI's Crimes Against Children Unit, said since that decision, no prosecution has been brought in the 9th Circuit "except in the most clear-cut cases in which the government can specifically identify the child in the image."

Addressing a House Judiciary Subcommittee today, Heimbach said that trend would continue to worsen in the aftermath of the Supreme Court decision.

"I fear that in many cases, this speculative technological debate will indeed result in a bitter end," he said.

Legal experts say the legislation could become entangled in the constitutional issues that led to the high court ruling on the original statute.

"One has to wonder whether this bill will cure the constitutional problems that came up in the Supreme Court decision," said Alan Davidson, associate director of the Center for Democracy and Technology.

Davidson said the measure treads on questionable legal grounds because it would put the burden on individuals to prove that their "speech" is not illegal. Such a burden smacks of "prior restraint," Davidson said, a restriction prohibited by the Constitution.

UCLA law professor Eugene Volokh also was skeptical about the bill's legal footing.

"The FBI says it has a hard time verifying that an actual child was involved in the making of child pornography, but this may make it even harder for the defendant to disprove that a real child is involved," he said. "In the end, I'm not sure this is going to avoid the constitutional problems that the court identified."
********************
Washington Post
Education Key To Keeping Kids Away From Net Porn - Report
David McGuire
Washtech.com Staff Writer
Thursday, May 2, 2002; 7:00 AM


Child welfare advocates attempting to protect children from harmful material online may be ignoring one of the most important tools in making the Internet safer for kids, according to the findings of a congressionally mandated study.

"Developing in children and youth an ethic of responsible choice and skills for appropriate behavior is foundational for all efforts to protect them ... on the Internet and in the physical world," a panel of experts wrote in their final report on protecting kids online.

Published today by the nonprofit National Research Council, the report was ordered by Congress under a 1998 law.

While the report acknowledges a role for legislation and technology in protecting children from harmful material online, it says that educational efforts to instill good Internet habits represent one of the most promising approaches to shielding children from inappropriate material.

"Much of the debate about 'pornography on the Internet' focuses on the advantages and disadvantages of technical and public policy solutions," the report says. "In the committee's view, this focus is misguided: neither technology nor public policy alone can provide a complete - or even a nearly complete - solution."

The report says that over-relying on Internet filters and other technological solutions can instill parents with a false sense of security, tempting adults to believe that "the use of technology can drastically reduce or even eliminate the need for human supervision."

And while public policy approaches "promise to eliminate the sources of the problem," they can present thorny constitutional problems, the report says.

"For example, the committee believes that spam containing material that is obscene for minors should not be sent to children. But laws banning such e-mail to minors are potentially problematic in an online environment in which it is very difficult to differentiate between adults and minors," the report says.

Alan Davidson, the associate director of the Center for Democracy and Technology, applauded the report's overarching findings.

"We teach children to look both ways when they cross the street, we don't outlaw cars," Davidson said.

Chaired by former Pennsylvania Governor and U.S. Attorney General Dick Thornburgh, the panel that drafted the report included representatives from industry, academia and the child welfare community.
*****************
Chicago Sun-Times
Scientists trot out remote-control rats
BY RICK CALLAHAN


By implanting electrodes in rats' brains, scientists have created remote-controlled rodents they can command to turn left or right, climb trees and navigate piles of rubble. Someday, scientists said, rats carrying tiny video cameras might search for disaster survivors.

''If you have a collapsed building and there are people under the rubble, there's no robot that exists now that would be capable of going down into such a difficult terrain and finding those people, but a rat would be able to do that,'' said John Chapin, a professor of physiology and pharmacology at the State University of New York in Brooklyn.

The lab animals aren't exactly robot rats. They had to be trained to carry out the commands.


Chapin's team fitted five rats with electrodes and power-pack backpacks. When signaled by a laptop computer, the electrodes stimulated the rodents' brains and cued them to scurry in the desired direction, then rewarded them by stimulating a pleasure center in the brain.


The rats' movements could be controlled up to 1,640 feet away.

The findings appear in today's issue of the journal Nature.

The experiments used three implanted electrodes--one in the brain region that senses reward or pleasure, and one each in areas that process signals from the rat's left and right whisker bundles.

Chapin's team trained the rats in a maze by signaling the left and right whisker-sensing regions. When a rat turned in the correct direction, its reward-sensing region was stimulated. Activating only the reward region caused the rodents to move forward, the team found.
*****************
Federal Computer Week
Call made for homeland strategy


The Bush administration is headed in the right direction with its homeland security budget, but in the absence of a national strategy, the ways information technology can truly help have not been realized, according to a report released April 29 by the Brookings Institution.

The Bush administration plans to release its homeland security national strategy in July, and without that strategy there are too many opportunities for gaps in applying resources, Rep. Jane Harman (D-Calif.), ranking member of the House Permanent Select Committee on Intelligence's Terrorism and Homeland Security Subcommittee, said at the release of the report.

The report, "Protecting the American Homeland: A Preliminary Analysis," is the Brookings Institution's proposal for creating that comprehensive national strategy as soon as possible.

"We need a strategy, and we need it now," Harman said.

The report outlines four broad areas: border security, domestic prevention, site defense and contingency planning.

Information technology can and should be used in each area to take advantage of knowledge and information across the many organizations and agencies involved, said Michael O'Hanlon, a senior fellow in the Brookings Foreign Policy Studies group who worked on the report.

IT can be particularly helpful when it comes to information sharing efforts, but it also is a critical tool for streamlining every program and making each more efficient, O'Hanlon said. This includes efforts such as inspecting cargo ships, where IT can be used to provide a central system for dispersed ports and personnel, he said.

"We have to be more assertive in sharing information in real time," he said.

Technology can also play an important role when it comes to ensuring identity, especially smart cards and biometrics, but only if there is sufficient security in the application process to guarantee that the wrong people are not being authorized, Harman said.

And when considering the privacy and civil liberties concerns raised by the use of those technologies, officials should consider that the technology often can provide better protection through better control than the equivalent paper-based processes, said James Steinberg, vice president and director of the Foreign Policy Studies group.
******************
Federal Computer Week
Davis: Streamline homeland tech


The war on terrorism has generated a flood of suggestions from companies about ways the government could use technology to improve homeland security. But government agencies lack the technical expertise to evaluate the solutions quickly and the buying authority to purchase them promptly, said Rep. Tom Davis (R-Va.).

As a result, technology proposals for homeland defense "have been sitting unevaluated," Davis said April 30. To solve the problem, he wants to create an interagency team of experts that would "screen and evaluate innovative proposals from industry."

He also wants to prod agency procurement officials to make greater use of existing "streamlined acquisition procedures" so agencies can buy off-the-shelf homeland security-related technology more quickly.


Davis said he plans to introduce legislation this week to create the interagency team and launch an acquisition pilot program that encourages greater use of faster buying procedures.


In addition, Davis said he will propose monetary awards of $20,000 to encourage companies to propose innovative "terror-fighting solutions" to the federal government. A total of $500,000 would be available annually for awards.

Davis said his legislation was prompted by complaints he has received from companies that have been frustrated in their attempts to sell technology for homeland security to the federal government.

Government agencies in general and the Office of Homeland Security in particular "have been overwhelmed by a flood of industry proposals offering various solutions to our homeland security challenges," said Davis, whose Northern Virginia district has a heavy concentration of technology companies.

"A lot of the technology firms with the expertise to address our security needs have contacted my office to let me know that they are having a hard time getting a real audience for their products," Davis said during a press conference announcing his legislation.

During a February hearing before the Government Reform Committee's Technology and Procurement Policy Subcommittee, of which Davis is chairman, technology company officials complained about the lack of an organized process within the federal government to evaluate technology solutions for homeland security.

Witnesses -- including Tom Siebel, chairman and chief executive officer of Siebel Systems Inc. -- were especially frustrated that the newly created Office of Homeland Security apparently lacks "the authority to make anything happen." Siebel said the government should be able to evaluate and act on homeland security proposals from industry in "weeks, not months or years."

Davis said that in addition to helping agencies identify technology that would be useful in the war against terrorism, his technology team also is intended to prevent agencies from wasting money.

Spending on homeland security is increasing dramatically. "We are spending more money than at any time since the mid-1980s," Davis said, referring to the era when former President Reagan poured hundreds of billions of dollars into a military buildup.

It was a time when defense contractors like Lockheed Corp. billed the Air Force $640 apiece for aircraft toilet seats and $435 for hammers. "We don't want to have those same kind of problems" as spending is increased for homeland security, Davis said.
***************
New York Times
Virus Maker Sentenced


NEWARK, N.J. (AP) -- The creator of the ``Melissa'' computer virus was sentenced Wednesday to 20 months in federal prison for causing millions of dollars of damage by disrupting e-mail systems worldwide in 1999.

David L. Smith, 34, pleaded guilty in December 1999 to a state charge of computer theft and to a federal charge of sending a damaging computer program. In the federal plea, both sides agreed the damage was greater than $80 million.

In court Wednesday, Smith called the act a ``colossal mistake.''

The Melissa virus, which struck in March 1999, was disguised as an e-mail marked ``important message'' from a friend or colleague. It automatically spread like a chain letter by causing computers to send 50 additional infected messages. The volume of messages generated slowed some systems to a crawl.

Melissa was relatively benign compared with viruses that delete computer files or do other damage. Yet by clogging e-mail systems and causing some to crash, Melissa was the worst computer virus outbreak ever at the time.

Smith could have faced up to five years in prison, but prosecutors suggested a term of about two years, saying he had given authorities extensive assistance in thwarting other virus creators. He was also fined $5,000 by U.S. District Judge Joseph A. Greenaway Jr.

He is expected to get a similar prison term Friday when sentenced on a guilty plea to a state charge of computer theft.

Smith is among the first people ever prosecuted for creating a computer virus.

A federal judge in Syracuse, N.Y., in 1990 sentenced Robert T. Morris to three years probation for a worm virus that crashed 6,000 computers in 1988. Morris was the first person convicted under the 1986 Computer Fraud and Abuse Act.

A British judge in 1995 sentenced Christopher Pile to 18 months in jail for creating computer viruses that destroyed files around that nation. Pile, who called himself the Black Baron, was the first person convicted under Britain's Computer Misuse Act of 1990.

But successful prosecutions are quite rare. Prosecutors in the Philippines had to dismiss all charges against a man accused of releasing the ``Love Bug'' virus in 2000 because of a lack of applicable laws. In other instances, the virus writer couldn't be found at all.

Smith has said he created the virus on computers in his Aberdeen apartment and used a stolen screen name and password to get into America Online.

Smith was arrested April 1, 1999, after investigators for America Online provided authorities with evidence that helped trace the virus to Smith's phone number.

``My curiosity got the better of me, as I soon began to write the `Melissa Virus,''' he wrote in a letter to Greenaway. ``It didn't take me too long as I borrowed a lot of the code from other viruses that I found on the Internet.''

While admitting his actions were ``immoral,'' Smith wrote that the virus did not delete or corrupt anyone's files or data, or harm any machines.

``It did result in clogged mail servers, which admittedly caused headaches and aggravation to many,'' he acknowledged.
*****************
Computerworld
OpenOffice.org releases Version 1.0 of its free office suite
By TODD R. WEISS


After 18 months of development, Version 1.0 of the open-source OpenOffice.org productivity suite was released today for free download.

In an announcement today, OpenOffice.org, the developer community that has been building the applications based on open-source code provided by Sun, said the new version is aimed at small businesses and other users who want full-featured software without high costs.

OpenOffice.org Version 1.0 shares much of its source code with Sun Microsystems Inc.'s StarOffice 6.0, which is due to be released later this month for retail sale. StarOffice is Sun's entry into the highly prized office-suite market, which is dominated by Microsoft Corp.'s Office products. The OpenOffice.org project is Sun's separate open-source effort aimed at maintaining a free version of the suite. Sun announced in March that it would begin charging for StarOffice, while still making OpenOffice available free of charge.

OpenOffice is available for Windows, Linux and Solaris machines and is being ported to other operating systems, including Apple Computer Inc.'s Macintosh operating system, according to Zaheda Bhorat, senior product manager for OpenOffice.org at Sun. The suite will also be available in more than 21 languages, including Spanish, French and Chinese.

For businesses and other users, a key feature of OpenOffice is its ability to recognize and remain compatible with Microsoft Office file formats, said Sam Hiser, an OpenOffice developer and co-leader of the product's marketing efforts. "We're going to try to make it easier for everyone without it mattering what the file formats must be," he said.

Analyst Bill Claybrook at Aberdeen Group Inc. in Boston said the availability of the OpenOffice suite is good news for users seeking a free, full-featured office suite. But it still faces tough competition from Microsoft Office.

"I think more people will try it," Claybrook said. But he's not sure how many will use it instead of Microsoft's suites. Most business users want applications that come complete with printed documentation and easy-to-access technical support when needed, even if they have to pay for it, he said.

Michael Silver, an analyst at Gartner Inc. in Stamford, Conn., said that while the OpenOffice.org effort claims high compatibility with existing Microsoft file formats, there will likely be problems in reading files that include macros and complicated fonts or formatting. "In terms of file formats, nothing except for Microsoft Office will let you open all Microsoft files properly," Silver said.

On the other hand, "there's a ton of backlash against Microsoft because of their licensing changes," and many users may be more likely to give OpenOffice and other alternatives a try, he said. "I think some people will do it just for spite."

The OpenOffice.org developer community includes Sun employees, volunteer developers, marketers and end users, according to the group.

OpenOffice users can go online for free help from other users and developers, or if they are interested in more full-featured support and documentation, they can buy the StarOffice suite from Sun, which is expected to be priced at under $100.

OpenOffice.org 1.0 includes applications for word processing, spreadsheets, presentations and drawing. StarOffice adds a database application and several proprietary multimedia programs.
******************
Computerworld
Netscape, Mozilla hole allows remote text viewing
By SAM COSTELLO, IDG NEWS SERVICE


A security hole in the Netscape Communications Corp. Navigator and Mozilla Web browsers could allow an attacker to view documents on a user's PC, according to an advisory released yesterday by an Israeli security group.

Netscape acknowledged the vulnerability today and said its engineers are working to fix the problem, according to company spokesman Andrew Weinstein.

"We expect to have a resolution in the near future," he said.

The vulnerability affects the XMLHttpRequest component of both Navigator and Mozilla, which is used primarily to retrieve XML documents from Web servers, according to the security group, GreyMagic Software. An attacker could exploit the vulnerability by sending the Web browsers to a Web site that included hostile code, which would then allow the attacker to view documents on a user's hard drive, the group said.
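
(For readers unfamiliar with the component: XMLHttpRequest is the scripting object a Web page uses to fetch documents in the background. The TypeScript sketch below shows only its ordinary, legitimate use, with an invented URL; the flaw GreyMagic describes lets a hostile page turn this same mechanism against files on the visitor's own machine.)

    // Hypothetical page script: fetch an XML document from the same
    // server that served the page and read a value out of it.
    const req = new XMLHttpRequest();
    req.open("GET", "/catalog.xml", true); // invented same-origin URL
    req.onreadystatechange = () => {
      if (req.readyState === 4 && req.status === 200) {
        // responseXML holds the parsed XML document
        console.log(req.responseXML?.documentElement.nodeName);
      }
    };
    req.send(null);
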
Microsoft Corp.'s Internet Explorer browser was vulnerable to a less serious version of the same attack, which Microsoft patched in February.

The Navigator and Mozilla vulnerability affects Version 6.1 and higher of Navigator and versions 0.9.7 to 0.9.9 of Mozilla.

The scope of the vulnerability will likely be limited by the number of users who run either Netscape or Mozilla. Netscape holds about 7% of the worldwide market for Web browsers, according to San Diego-based research firm WebSideStory Inc. Mozilla, an open-source Web browser whose first final version has yet to be released, commands a smaller market share.

Mozilla was created after Netscape made its source code available to developers in 1998. New York-based AOL Time Warner Inc. uses much of the same code that powers Mozilla in Navigator.

GreyMagic's advisory also came peppered with harsh words for Netscape, which GreyMagic said reneged on a pledge to give $1,000 for every serious bug discovered by researchers. GreyMagic said Netscape ignored e-mail sent by the group detailing this vulnerability, adding that in the future, GreyMagic would release any bugs it finds in Netscape without contacting the company and would recommend against the use of its browser.

Weinstein said Netscape acknowledged GreyMagic's e-mail, but the group submitted its report last Wednesday and waited only until Monday to release the report publicly.

"Our bug bounty program remains robust, and we encourage anyone who discovers [a bug] to bring it to our attention," he said.
*******************
BBC
Grid helps science go sky-high


Astronomers could be among the first to reap the rewards of plans to turn the internet into a vast pool of computer processing power.
The three-year Astrogrid project is attempting to give astronomers a common way of accessing and manipulating diverse data archives.


The project will also help scientists cope with the wave of data that novel telescopes and instruments are expected to generate.

The researchers behind the initiative believe astronomers will only be able to do meaningful science with this wealth of new data by tapping the net's huge information processing potential.

Day-to-day data delay

Astrogrid is a £5m project that attempts to put a single, friendly interface on the huge number of astronomical archives and data sets currently held online.

It is one of many projects inspired by research on ways of using the computers attached to the net as a coupled supercomputer or a storage system with almost limitless capacity.

Nicholas Walton, project scientist for Astrogrid, said many astronomers currently spent a lot of their time hunting for data or turning what they needed into a form they could use.

"Data sets are archived but not always in the same format or in a way that's accessible to all," said Dr Walton.

Astrogrid will create a standard way of querying almost any astronomical database and remove any need to understand the technical quirks of an instrument to get the most out of the information.

To make this happen, the Astrogrid project is defining a "metadata" format that can be used to interpret sets of data so they can be queried through one interface similar to a web browser.
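
(To make the idea concrete, here is an entirely hypothetical TypeScript sketch; none of the names below come from Astrogrid itself. Each archive publishes a small metadata record mapping its local column names to standard meanings, and a single query layer uses those records to answer one request against every archive.)

    // Hypothetical metadata record: local column name -> standard meaning.
    interface ArchiveMetadata {
      name: string;
      waveband: "x-ray" | "radio" | "infrared" | "optical";
      columns: Record<string, string>;
    }

    const archives: ArchiveMetadata[] = [
      { name: "SurveyA", waveband: "optical",
        columns: { RA_J2000: "ra", DEC_J2000: "dec" } },
      { name: "SurveyB", waveband: "x-ray",
        columns: { alpha: "ra", delta: "dec" } },
    ];

    // One query, phrased in standard terms, finds the matching local
    // column in each archive without the user knowing its quirks.
    function columnFor(meta: ArchiveMetadata, standard: string): string | undefined {
      return Object.keys(meta.columns).find((k) => meta.columns[k] === standard);
    }

    for (const a of archives) {
      console.log(`${a.name}: right ascension is stored as '${columnFor(a, "ra")}'`);
    }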

"We want to make it easier for everyone to have access to the same data but ensure they're presented in the same uniform way," said Dr Walton.

"We'll have been successful when they are using it but don't know they are using it," he said.

The ability to combine datasets from different sources was becoming much more important to astronomers, said Dr Walton.

Only by combining X-ray, radio, magnetic, infra-red and optical information about astronomical objects such as supernova remnants would scientists get a thorough understanding of the Universe, he said.

Instrument explosion

Astrogrid will also help astronomers cope with the enormous amounts of data that new instruments, such as the Visible & Infrared Survey Telescope for Astronomy (Vista), are expected to produce.

Dr Walton said that before now total surveys of the sky were typically done once every few decades.

By contrast, Vista will survey the sky every few days and produce 100GB of data per night.

Other planned instruments such as Europe's Very Large Telescope Array and the American space agency's Living With a Star project will produce similar reams of data.

The only way that astronomers were going to be able to archive and analyse such enormous amounts of data was by using the net as a storage system and a vast supercomputer, said Dr Walton.

Without the help of the internet, scientists would have no chance of finding the tens of objects that interest them out of the billions that instruments are recording information about.

"We want to enable astronomers to do more effective and economic science," he said.

"We want them to do the things they do now faster and to do things they cannot contemplate now."
******************
New York Times
Lessons Learned at Dot-Com U.
By KATIE HAFNER


GO to Fathom.com and you will encounter a veritable trove of online courses about Shakespeare. You can enroll in "Modern Film Adaptations of Shakespeare," offered by the American Film Institute, or "Shakespeare and Management," taught by a member of the Columbia Business School faculty.

The site is by no means confined to courses on Shakespeare. You can also treat yourself to a seminar called "Bioacoustics: Cetaceans and Seeing Sounds," taught by a scientist from the Woods Hole Oceanographic Institution.

Or if yours is a more public-policy-minded intellect, you can sign up for "Capital Punishment in the United States," a seminar with experts from Cambridge University Press, Columbia University and the University of Chicago.

What's more, all are free.

That part was not always the plan. Fathom, a start-up financed by Columbia, was founded two years ago with the goal of making a profit by offering online courses over the Internet. But after spending more than $25 million on the venture, Columbia has found decidedly little interest among prospective students in paying for the semester-length courses.

Now Fathom is taking a new approach, one that its chief executive likens to giving away free samples to entice customers.

Call it the Morning After phenomenon. In the last few years, prestigious universities rushed to start profit-seeking spinoffs, independent divisions that were going to develop online courses. The idea, fueled by the belief that students need not be physically present to receive a high-quality education, went beyond the mere introduction of online tools into traditional classes.

The notion was that there were prospective students out there, far beyond the university's walls, for whom distance education was the answer. Whether they were 18-year-olds seeking college degrees or 50-year-olds longing to sound smart at cocktail parties, students would flock to the Web by the tens of thousands, paying tuitions comparable to those charged in the bricks-and-mortarboard world -- or so the thinking went.

"University presidents got dollars in their eyes and figured the way the university was going to ride the dot-com wave was through distance learning," said Lev S. Gonick, vice president for information services and chief information officer at Case Western Reserve University in Cleveland. "They got swept up."

American universities have spent at least $100 million on Web-based course offerings, according to Eduventures, an education research firm in Boston.

Now the groves of academe are littered with the detritus of failed e-learning start-ups as those same universities struggle with the question of how to embrace online education but not hemorrhage money in the process.


New York University recently closed its Internet-based learning venture, NYUonline. The University of Maryland University College closed its profit-based online arm last October and folded it into the college. Temple University's company, Virtual Temple, closed last summer. Others have reinvented themselves.

In the process, the universities have come to understand that there is more to online learning than simply transferring courses to the Web.

"The truth is that e-learning technology itself, and those of us who represent the institutional and corporate agents of change in the e-learning environment, have thus far failed," Dr. Gonick said. "Across U.S. campuses today, e-learning technology investments are at risk, and many technology champions are in retreat." Since the mid-1990's, most of the purely virtual universities that sprang up ? from Hungry Minds to California Virtual University ? have been sold or scaled back or have disappeared altogether. The same is true for the lavish venture-capital financing for start-ups that designed online courses for colleges or put the technology for such courses in place, for a high fee.

In 2000, some $482 million in venture capital was spent on companies building online tools aimed at the higher education market. So far this year, that amount has dropped to $17 million, according to Eduventures.

Kenneth Green, founder of the Campus Computing Project, a research effort based in Los Angeles that studies information technology in American higher education, pointed to a combination of reasons that universities have stumbled in distance education.

Mr. Green said that college campuses and dot-coms had looked at the numbers and anticipated a rising tide of enrollment based on baby boomers and their children as both traditional students and those seeking continuing education. In short, the colleges essentially assumed that if they built it, students would come.

One conspicuous example is Fathom. Last week, Columbia's senate issued a report saying that the university should cut spending on the venture because it had little return to show for its investment. The report urges better coordination among the university's digital efforts, notably Columbia Interactive, a site aimed at scholars seeking academic content.

Fathom started in 2000 by offering elaborate online courses replicating the Ivy League experience. The company has an impressive roster of a dozen partners, including the London School of Economics and Political Science, Cambridge University Press, the British Library, the University of Chicago and the New York Public Library. Many of Fathom's courses are provided by its member institutions, and many offer credit toward a degree.

Although Fathom's courses were impressive from the start -- for example, "Introduction to Macroeconomics," taught by a University of Washington professor for $670 -- the idea that many students would pay $500 or more for them proved a miscalculation.

"If you listened to some of the conversations going around on campuses, it was, `Gee, this looks to be easy, inexpensive to develop and highly profitable ? throw it up on the Web and people will pay us,' " Mr. Green said.

But business models were flawed, he said, and university administrators did not fully understand the cost of entering the market. "It's really, really expensive to do this stuff," he said. "It costs hundreds of thousands of dollars to build a course well."

There are substantial costs in designing the course, Mr. Green said, like creating video, securing content rights and paying the faculty member who teaches it. "This doesn't migrate painlessly from classroom onto the Web," he said. "It's more like making a Hollywood movie."

Michael Crow, executive vice provost at Columbia, said that Fathom had yet to generate significant revenue, let alone turn a profit.

"Right now we're trying to figure out how to make it work intellectually," he said, "and we have to figure out later how to make it work financially. If anyone had asked us how anyone was going to make the university work financially as the first question asked, it would never have been built."

Now Fathom has decided that instead of students seeking degrees, it will focus on those looking for courses in professional development and continuing education. Fathom is also offering courses that are less expensive to produce and cost less for students, as well as several dozen others that are free.

"We've broadened the kinds of courses in recognition of the fact that most people aren't familiar yet with online learning, so they need different price points," said Ann Kirschner, Fathom's president and chief executive. "We need to introduce learners to the concept before they will commit money."

Part of the problem, distance learning experts say, has been an emphasis on technology -- streaming video, for example -- at the expense of more careful thinking about how remote learning might best be conducted. Some critics say that university administrators confused tools with education.

"We figured a quick wave of the magic wand and we'd reinvent how people learned after 900 years of a traditional university mode of instruction," Dr. Gonick said.

New York University has had a conspicuously difficult experience with distance education. The university started NYUonline in late 1998 to focus on corporate education and training, not degree programs. The company sold the courses as packages to corporate customers.

By the time it opened its virtual doors to students in 2000, NYUonline employed 45 people and offered online courses with titles like "Train the Trainer," for human resource managers, and "Management Techniques," aimed at young managers.

The NYUonline courses were not cheap. "Train the Trainer," which included live Webcasts as well as self-paced online courseware, cost around $1,600. The tuition for the management techniques course was close to $600.

In two and a half years of operation, NYUonline received nearly $25 million from the university, but enrollment remained anemic at best: just 500 students at its peak.

After closing NYUonline, the university moved some of the company's activities into its School of Continuing and Professional Studies, which was probably where it belonged in the first place, said Gordon Macomber, the former chief executive of NYUonline, who is now a consultant on electronic learning in New Canaan, Conn.

"Along the way it became apparent that a major university that wants to get into online education can do it without forming a separate for-profit online company," Mr. Macomber said. "If you're going to invest money in anything, the university might as well invest it within the university instead of supporting a for-profit company."

There are a few success stories. The technically oriented University of Phoenix has an online enrollment of more than 37,000, with four-year and post-graduate degree programs aimed at older students. The university's success, Mr. Green suggested, comes from its expertise in branding, marketing and infrastructure.

That could be the combination for success in distance education.

The trick now is finding a way for universities like Columbia, steeped in academic tradition, to make it work.


"In a way, that is the crux of the matter," said Ms. Kirschner of Fathom. "Are universities going to grow smaller and marginalized in a world teeming with sources of information, or are they more important than ever, as people seek to separate fact from fiction, knowledge from data?" Ms. Kirschner said she hoped the answer would be the latter.

In the meantime, you can brush up your Shakespeare. And they'll all kowtow.
*******************
BBC
Eminem CD could be 'pirate proof'

The controversial rapper and his Universal label are in talks to release the forthcoming The Eminem Show with technology that would prevent widespread copying.

The record label is also being extremely cautious about where promotional copies of the album's first single, Without Me, are going, fearing leaks on to the internet will affect sales.

If Universal go ahead with the plan, it would mark the company's biggest attempt to protect its artists from piracy.

There has been a strong push from the major labels towards copy-proof material but until now it has tended to be on a limited basis.

Sales slump

Albums from Celine Dion, *N Sync and Natalie Imbruglia have all had technology added to prevent them being copied.

The widespread leak on to the internet of the Oasis comeback album Heathen Chemistry will further add fuel to the industry's concerns about piracy.

Eminem's third album is due for release on 3 June, so any decision on copyright protection will need to be taken swiftly.

Piracy has recently been blamed for a 5% drop in the world's music sales.

The music industry took quick action to try to halt the illegal downloading of music from the internet, since downloaded tracks can easily be copied on to CDs.

Consumer complaints


Services such as Napster were shut down by legal wrangles because unlicensed music was being swapped on them.


But some consumers have complained that albums with anti-piracy equipment attached have hindered their ability to play them on their computers or in-car stereos.

Because copy-protection has not been used on such an eagerly awaited album before, it has been difficult to gauge reaction.

"Clearly, we will have a better sense of how the market feels about copy-protection when a release of Eminem's stature, should we decide to do it, comes with copy-protection on it," said Universal Music spokesman Adam White.

"Mass-copying has gotten to such a level that we had better take a stand to protect the artists," he added.
**************************
New York Times
The Heavy Lifters of Computation


By J.D. BIERSDORFER

Q. Is a mainframe computer the same thing as a supercomputer?

A. Unlike personal desktop computers designed for use by one person at a time, a mainframe computer can handle thousands of users at a time and is often used in a corporate or financial environment.

Although mainframes can usually run several programs at the same time, they are not as powerful as supercomputers in terms of raw processing speed. Because supercomputers can run programs much faster, they are often used at academic and research institutions to handle complex calculations for engineering, chemistry and other science projects, for example, climate models.

Mainframes have been part of the computing world for decades, and early models took up entire rooms that had to be kept cool for the computer to work properly. Although mainframes have shrunk in size, they are still very much in use and can now run modern operating systems like Linux. Often they are referred to as enterprise servers.

Q. Why can't you simply copy files onto a CD the way you can onto a floppy disk? With Windows XP, I can copy files to the CD, but then must go through a process to "Write these files to CD."

A. When you drag files to the CD burner icon in Windows XP or use the Send To command from the File menu, you are sending those files to a staging area on the hard drive. The actual recording process starts when you select "Write these files to CD" from the task pane. The gathered files are then copied into a single file that is then recorded onto the CD.

CD recording in general is often confusing for new users because of the abundance of options within the recording software and because of conflicts between software and hardware that can result in a ruined disc. (There are many brands of CD recorders and several software programs.) To simplify the process, both Windows XP and Mac OS X include basic CD-recording functions.

A movement is under way, however, to make the process of copying files onto a CD as easy, standardized and straightforward as copying files onto floppy disks. One solution is being developed by a group called the Mount Rainier project, with support from Philips Electronics, Compaq Computer, Sony Electronics and Microsoft; the goal is to integrate the technology into future operating systems.

Users of computers with Mount Rainier technology would be able simply to drag and drop their files onto a CD to record them. Disc formatting and other setup tasks would be automatic.

The technology is still probably about a year away; you can read more about it at www.mt-rainier.org.

Q. Every time I boot up my computer, I get a message informing me that "the shortcut to PREMEND.LNK is not available." What is this shortcut, and how can I get rid of this error message?

A. Premend is a small program used by computer manufacturers to help automatically set up new machines running Windows Millennium Edition, or Me. After you use Windows the first time, Premend is supposed to remove itself from both the hard drive and from the list of programs that start up automatically whenever the PC does.

According to a note in Gateway's technical database, that does not always happen as planned. Sometimes the program removes itself from the hard drive but leaves the broken shortcut to the program behind in the StartUp folder, which is what Windows is referring to when you boot up your PC.

To get rid of the broken shortcut and the error message, go to the Start menu, click on Settings and select the Taskbar & Start Menu. Click on the Advanced tab and then on Remove.

In the Remove Shortcuts/Folders box, double-click on Startup and then select Premend. Click on Remove.

Circuits invites questions about computer-based technology, by e-mail to QandA@xxxxxxxxxxxx This column will answer questions of general interest, but letters cannot be answered individually.
******************
New York Times
Virtually Rebuilt, a Ruin Yields Secrets
By SAM LUBELL


EVERYONE knows that the Roman Colosseum is an architectural marvel. Built so that thousands of people could be ushered in and out in minutes, it is a testament to the genius of Roman engineering. Or is it?

By reconstructing the building with three-dimensional computer modeling and then virtually "walking through" it, researchers have discovered that in some sections the building may have had all the efficiency of a railroad-style apartment on the Bowery. The model reveals dark, narrow upper hallways that probably hemmed in spectators, slowing their movement to a crawl.

Such three-dimensional modeling is turning some of archaeology's once-established truths on their heads. Because 3-D software can take into account the building materials and the laws of physics, it enables scholars to address construction techniques in ways sometimes overlooked when they are working with two-dimensional drawings.

"Now we have a tool that will really test assumptions," said Dean Abernathy, a doctoral student who helped reconstruct the Colosseum at the Cultural Virtual Reality Lab at the University of California at Los Angeles. "It creates a lot of excitement in the field."

The U.C.L.A. lab (www.cvrlab.org) creates models of architectural sites around the world. Since 1996 it has been working on a project called Rome Reborn, which seeks to rebuild much of the ancient metropolis.

Researchers at the lab recreated the Colosseum using a program called MultiGen Creator, which allows users to pan, zoom, walk or even fly through a simulation of a site. Graphics software like 3D Studio MAX and Lightscape, the same kind of programs used by digital movie studios, helps make the replicas particularly lifelike, with sharp colors and intricate stonework.

The Colosseum, a vast four-story oval arena, was built from around A.D. 70 to 80 under the rule of the Emperor Vespasian and then Titus. It once held as many as 50,000 spectators. Earthquakes and the ravages of time have destroyed much of the building, but an impressive amount, including most of its facade, still stands.

Mr. Abernathy confronted the issue of the third-level hallways when he was working on the reconstruction. His model drew on the findings of a team of experts on Roman architecture assembled by U.C.L.A. who had studied similar amphitheaters, drawings of the Colosseum and records of the building's construction and expansion. The team also examined what was left of the upper hallways, an area that had previously been all but closed to researchers.

Bernard Frischer, a classics professor at U.C.L.A. and director of the Cultural Virtual Reality Lab, said that researchers have generally held that the entire Colosseum was a masterpiece of circulation, with people able to enter and leave in as little as 10 minutes. After touring the virtual Colosseum, he is not so sure.

"Most scholars just never focused on the problem of circulation throughout the building," he said. "They assumed that each of the floors was going to look like the bottom," which is spacious and well lighted. "Only once we had to reconstruct the building did an idea like that pop into our heads."

Such reconstructions have challenged traditional thinking about other sites as well.

Analysis of U.C.L.A. models suggests that the Roman Senate may have been poorly ventilated and lighted and had inferior acoustics. The models also raised some new questions about the Temple of Saturn, whose design may have been altered centuries after its construction.

Colleges have developed sophisticated devices to enhance the experience of touring the models. At U.C.L.A., researchers view models at the university's visualization portal, a screen 24 feet wide by 9 feet high that offers 160-degree views, with special goggles producing a three-dimensional effect.

At Brown University, archaeologists can view the results of their digs in a room called the CAVE, for Computer-Automated Virtual Environment. Surrounded by three-dimensional images on screens on the walls and the floor, scholars navigate by wearing shuttered goggles and sensors that exchange data with a computer system. A "wand" with an internal track ball like that of a mouse moves them wherever they direct it.

Other virtual-reality projects allow users to move around the room physically, with their movements tracked by overhead 3-D magnetic or infrared devices.

Samuel Paley, a classics professor at the State University at Buffalo, and members of the virtual reality lab there have worked with Learning Sites (www.learningsites.com), a design company based in Williamstown, Mass., that specializes in archaeological visualizations, to produce virtual models of several Assyrian palaces. The simulations can be viewed on a supercomputer at the university's center for computational research.

Moving through a simulation of the northwest palace of Ashur-Nasir-Pal II of Assyria, an ancient site in modern-day Iraq, Professor Paley caught a glimpse of three leaf bas-relief sculptures in a row. The sculptures, which depicted a ritual involving the king, courtiers and protective gods, could be viewed as a single, isolated tableau only from his position on the threshold of the throne room, as was evidently the intention of the palace's designers.

When Professor Paley described his finding at a lecture, "the room went absolutely silent," he said. "I think people realized right then that this is a useful technology that helps them see things in a different way."

Donald Sanders, president of Learning Sites, said that more than 70 university programs across the country were now using computer-generated virtual reality models, compared with only a handful five years ago.

"This is the future," Professor Paley said.

But the future is not cheap. More than $25,000 has gone into the Colosseum project so far, researchers at U.C.L.A. said. Microsoft, which incorporated the Colosseum graphics into its Encarta 2002 interactive learning software, helped cover some of the project's costs through a licensing agreement with the university. The Andrew Mellon Foundation supplied $127,000 to cover the Roman Forum project.

Some experts hesitate to rely on such modeling, saying that it can gloss over the realities of the past.

Kenneth Kolson, deputy director of the division of research programs for the National Endowment for the Humanities, said that virtual images conveyed a "false sense of integrity and purity."

"Those images, especially the stunningly seductive ones," he added, "convey as much or more about our own values and cultural aspirations as about the ancients."

Even Professor Frischer and other scholars who have embraced interactive 3-D modeling caution that their reconstructions can never be accepted as fact, partly because new information is always surfacing.

"We're working the stuff out," said Mark Wilson Jones, a member of the U.C.L.A. committee of Roman architecture experts and a lecturer in architecture at the University of Bath in England. "Nothing's ever final." One advantage of using digital models, scholars say, is that they can easily be updated with new findings.

Fikret Yegul, a professor of architectural history at the University of California at Santa Barbara, acknowledges that computer modeling can shed new light on the past. "It just brings greater depth to our understanding," he said.

Still, he questions some of the theories of the team of experts assembled by U.C.L.A. "V.R. models can never be seen as the last word," he said. "They are only another perspective."

Some researchers reject the technique because they are wary of changing the way they work or of ceding control to computer programmers. And some are unconvinced that the technique accomplishes anything beyond creating pretty computer models. "There are a lot of archaeologists who look at this as glorified coloring book stuff," said Dr. Sanders of Learning Sites.

"There are always people hesitant to move from their own set ways of doing things," he said. He offered a historical example.

"It wasn't so long ago that there was a technology coming into popular use," Dr. Sanders said. "The equipment used to create it was very expensive, yet the images you got were something that you could never get without it. Within a generation it became indispensable to archaeology.

"That's exactly how photography got started."
*************************
Government Computer News
Hacker pleads guilty to accessing NASA system
By Wilson P. Dizard III

A hacker charged last year with breaking into a NASA server has pleaded guilty in the U.S. District Court in San Antonio to one count of intentionally accessing a federal computer without authorization, NASA said Monday. He faces a possible one-year jail term and a $100,000 fine.

Ruben Candelario, who entered his plea April 18, is scheduled to be sentenced June 20.

He was indicted a year ago on charges of hacking into the Web and e-mail server of NASA's Virginia Consortium of Engineering and Science at Langley Research Center in Hampton, Va. He also was charged with possessing and trafficking in computer passwords.


Candelario, who used the nickname "skrilla," was the subject of an investigation by the NASA Inspector General's Computer and Technology Crimes Office and investigators of the Guadalupe, Texas, Sheriff's Department.
***************
Government Computer News
Army lab will double its computing capacity
By Dawn S. Onley


High-performance computing capacity at the Army Research Laboratory in Aberdeen, Md., will double as a result of a $14 million contract awarded yesterday to Raytheon Co. by the Defense Department's High Performance Computing Modernization Office.

Under the contract, Raytheon will install and test what will be the largest supercomputer at ARL, the company said. The supercomputer will consist of 800 1.3-GHz IBM Power4 processors and will operate three to four times faster than the two IBM Power3 systems it will supersede.

The new system will double the overall computational capability of the laboratory's high-performance supercomputers to more than 7.1 trillion floating-point operations per second.
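
As a back-of-the-envelope check on those figures, the new machine's peak rate can be estimated in a few lines of Python. The calculation assumes four floating-point operations per clock per Power4 processor (two fused multiply-add units), which the article does not state; sustained rates are always lower than peak, and the 7.1 teraflops quoted is the laboratory's combined capability, not the new machine's alone.

    # Back-of-the-envelope peak estimate for the new ARL system.
    # Assumption (not in the article): 4 floating-point ops per cycle
    # per Power4 processor, i.e. two fused multiply-add units.
    processors = 800
    clock_hz = 1.3e9          # 1.3 GHz
    flops_per_cycle = 4
    peak_flops = processors * clock_hz * flops_per_cycle
    print("%.2f teraflops peak" % (peak_flops / 1e12))  # about 4.16 teraflops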
*****************
Mercury News
How some readers fight spam


Last week we published tips on dealing with spam and asked readers for their ideas. Here are some:

1. Avoid Hotmail.com, MSN.com and Passport.com. All the Microsoft-hosted free Web-based e-mail providers save a cookie with the user's e-mail address all spelled out. Any later site visited can read the cookie and be guaranteed that the e-mail address is valid. There is no way to prevent this, since the e-mail service will not allow you to log in without enabling cookies.

The best alternative is to use Yahoo or Juno Web e-mail or POP e-mail. Although Yahoo requires cookies be enabled and it saves a cookie, the cookie does NOT contain the user's e-mail address, so when other sites read the cookie they gain nothing. I use Yahoo Web e-mail, and in two years, I have received only two or three spam messages, whereas my little-used Hotmail account gets around 15 spam messages per day.

2. Use the utility CookiePal (www.kburra.com/order.html), available for $15. This program is indispensable for controlling which cookies are saved and which are not.

Ted Hadley
Sunnyvale

Read my series

There's a lot you can do about spam, and the subject is a big part of my ``Overcome Email Overload'' book series.

While you were correct that blocking e-mail addresses doesn't really work, you can use your e-mail program's built-in filters to do much more. Just my top three anti-spam filters block about three-fourths of my spam automatically, while catching very few legitimate messages.

Note that to use the anti-spam filters, it's essential to keep a ``white list'' -- a list of people whose messages you're sure you want, so that everything they send is explicitly let through.

Here's how to make my top three anti-spam filters (a rough code sketch follows the list):

1. Make a filter to move messages whose body contains the characters ``IMG.'' IMG is the HTML code for an embedded image -- something spam frequently contains and legitimate messages rarely do.

2. Make a filter to move messages with seven spaces in the Subject. Spam frequently has a tracking ID, such as ``2s829487q,'' way off to the right of the Subject line, separated from the main subject by a lot of spaces.

3. Make a filter to move messages with no space character in the ``From:'' line. Almost all legitimate messages have a ``real name'' (e.g., ``Mabel Garcia''), while a lot of spam has ``From'' lines with only an e-mail address (e.g., s92834@xxxxxxxxxxxxxxxxxx). Note, however, that messages from America Online users never have a ``real name,'' so your filter must make an exception for AOL users.
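
For readers who filter with a script instead of a mail program's rule editor, here is one way the three rules plus the white list might look in Python. It is only a sketch: the function name and the sample white-list address are illustrative, not taken from the book or site mentioned here.

    import email

    # Hypothetical white list; in practice this would come from your
    # address book.
    WHITELIST = {"mabel.garcia@example.com"}

    def is_probable_spam(raw_message):
        msg = email.message_from_string(raw_message)
        sender = msg.get("From", "")

        # White-listed senders always get through, as advised above.
        if any(addr in sender for addr in WHITELIST):
            return False

        # Rule 1: body contains "IMG", the HTML code for an embedded image.
        body = msg.get_payload()
        if isinstance(body, str) and "IMG" in body.upper():
            return True

        # Rule 2: seven consecutive spaces in the Subject line -- usually
        # padding before a tracking ID pushed far to the right.
        if " " * 7 in msg.get("Subject", ""):
            return True

        # Rule 3: no space in the From line (a bare address with no real
        # name), with the AOL exception noted above.
        if " " not in sender.strip() and "aol.com" not in sender.lower():
            return True

        return False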

I've posted more information on my Web site (www.OvercomeEmailOverload.com/top3.html).

Kaitlin Duck Sherwood
Palo Alto

Unsubscribing works

You said never to click unsubscribe in the spam messages. I've become a fan of clicking it every time.

I have four different e-mail accounts forwarding to a Yahoo inbox, which sounds like a spam disaster waiting to happen.

But my strategy is this: whenever I get spam, I click the unsubscribe link at the bottom and fill it out or, if the message doesn't offer that, I reply with the ``REMOVE'' and sometimes ``UNSUBSCRIBE'' tags in the e-mails. Then I block the e-mail address and delete the offending e-mail. If it comes from an address that ends with yahoo.com, I forward it to abuse@xxxxxxxxx, which gets offending accounts deleted.

It's worked quite well. I probably average about one spam message every two weeks. Plus, it certainly isn't a private address: it's all over my Web site, and it's the only one I use to sign up for stuff on forms offline and online.

Davy Fields
Mountain View

. . . it really does

My experience is contrary to the advice never to unsubscribe. I had been merely deleting spam until several months ago, when the volume reached a point where I decided to use every unsubscribe option given. Within a week, the volume sank to almost zero. True, it has slowly climbed back, but it has not yet reached its former level.

Ron Gutman
San Jose

Two inboxes

In Outlook, I set up a separate inbox called Inbox-Personal.

I turned off sound notification when e-mail arrives.

I set up a rule that filters e-mail arriving in Inbox, moving any message sent by someone in my address book into Inbox-Personal and sounding a notification.

That way, all of the spam ends up in Inbox, and almost all of my personal mail ends up in Inbox-Personal, and I don't hear the ``e-mail has arrived'' sound unless something ends up in Inbox-Personal. I do have to check Inbox regularly, but most of the time I end up deleting everything that lands there. And if I get an e-mail from someone or from a list that I want to end up in Inbox-Personal, I simply add a new address book entry.
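
Outside of Outlook, the same routing logic is only a few lines. Here is an illustrative Python sketch; the folder names match the setup described above, while the address-book contents are placeholders.

    from email.utils import parseaddr

    # Placeholder address book; Outlook consults its real one.
    ADDRESS_BOOK = {"friend@example.com", "family@example.com"}

    def route(msg):
        """Return the destination folder for a parsed email message."""
        _, sender = parseaddr(msg.get("From", ""))
        if sender.lower() in ADDRESS_BOOK:
            # Known sender: this is the folder that rings the bell.
            return "Inbox-Personal"
        # Everyone else -- including most spam -- stays in plain Inbox.
        return "Inbox"

The default-to-Inbox fall-through is what makes the quiet-notification trick work: anything unrecognized lands there silently.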

Gary Hillerson
Aptos

Grass-roots movement

I own my own domain, and all mail that comes to it is ultimately routed to me. When I am asked to provide an e-mail address, I always use the name of the company at mydomain.com, so all the e-mail sent to me clearly states where it came from. If I ever receive unsolicited spam from that company, I let them know they had better cease immediately or I will report them to the Better Business Bureau, and that I am ceasing to do business with them. (A small code sketch of this trick follows below.)


All in all, it really doesn't help cut down on spam, but it sure tells me who to do business with and who I shouldn't trust.

I think the solution is a grass-roots movement of people who will NOT buy products from spammers and after some time this marketing vehicle will be rendered ineffective and thus will no longer be an option, at least in my utopian mindset.
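
For anyone who owns a catch-all domain, the bookkeeping behind this trick can be automated. A sketch, assuming the hypothetical domain mydomain.com and an illustrative function name:

    from email.utils import parseaddr

    MY_DOMAIN = "mydomain.com"  # illustrative catch-all domain

    def spam_source(msg):
        """Return the company a leaked address was handed to, if any.

        If mail arrives for acmewidgets@mydomain.com, then Acme Widgets
        (or someone it shared the address with) is the likely source.
        """
        _, to_addr = parseaddr(msg.get("To", ""))
        local, _, domain = to_addr.partition("@")
        if domain.lower() == MY_DOMAIN:
            return local
        return None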

K.C. Teis
Livermore

MailWasher

I discovered an anti-spam tool you didn't mention: it combines offline and online filtering and is shareware. It's called MailWasher (www.mailwasher.net), and it works very well, although it won't work with Hotmail or other Web mail services until later this year.

In a nutshell, MailWasher downloads message headers from your mail accounts -- multiple accounts are supported -- and checks the headers against DNS blacklist servers from a number of different sources (currently ORDB, Visi and SpamCop, and you can add more) to alert you to potential spam, as well as potential virus carriers, chain letters and other nuisances.

You can create your own filters, as well as blacklist addresses ``on the fly.'' You can even read the text of messages without downloading the entire message. MailWasher also allows you to delete messages at the server before you download them. This is very useful for large messages, potential viruses, and when you're traveling and must use slow dial-up connections.

For $20 with lifetime support and upgrades and no spyware, MailWasher is a major bargain.
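
The DNS-blacklist lookup described above is a simple, standard protocol and easy to illustrate. In this Python sketch the zone names are examples of blacklists in use at the time; how MailWasher itself performs the query is not documented here, so treat the details as assumptions.

    import socket

    # Example DNSBL zones circa 2002; add or swap zones as they change.
    DNSBL_ZONES = ["relays.ordb.org", "bl.spamcop.net"]

    def is_blacklisted(ip):
        """Check a sending server's IP against DNS blacklists.

        DNSBLs are queried by reversing the IP's octets and appending
        the zone: 1.2.3.4 -> 4.3.2.1.relays.ordb.org. Any A record in
        the answer means the address is listed.
        """
        reversed_ip = ".".join(reversed(ip.split(".")))
        for zone in DNSBL_ZONES:
            try:
                socket.gethostbyname("%s.%s" % (reversed_ip, zone))
                return True
            except socket.gaierror:
                continue  # not listed in this zone
        return False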

Leonard Feldman
Redwood City

Delete efficiently

Tip to save time: when deleting spam in Microsoft Outlook, press Shift+Delete, then Enter at the prompt, to bypass Outlook's Deleted Items folder and delete the message outright.

This works for any deletions you are sure of and saves the added step of clearing the Deleted Items box. You also avoid seeing the dreaded spam twice.

Just don't be sloppy in highlighting items to delete and trash something you want to have a second chance to save.

Marsha Hayes
San Francisco

Kudos to Cruzio

I run a programming/consulting service and Web site in Santa Cruz, which after five years has attracted plenty of spam. Thanks to the efforts of Cruzio (www.cruzio.com), our local Internet service provider, I am now effectively free of junk e-mail.

Their first-generation anti-spam option has been available for a few years, catching about 90 percent of my spam. It examines e-mail headers, diverting wrong-address and ``blind carbon copy'' e-mail to a separate box. I can examine it, make exceptions for desired ``To'' and ``From'' addresses, or just ignore it. They send me a list each week, then delete the oldest spam after another week.
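
A guess at how that first-generation check might work in code: divert any message whose visible To: or Cc: headers don't mention your own address, which catches both wrong-address and blind-carbon-copy bulk mail. The addresses below are placeholders.

    from email.utils import getaddresses

    MY_ADDRESSES = {"me@example.com"}  # placeholder for your real addresses

    def is_bcc_or_misaddressed(msg):
        """True if none of my addresses appear in the To: or Cc: headers."""
        pairs = getaddresses(msg.get_all("To", []) + msg.get_all("Cc", []))
        visible = {addr.lower() for _, addr in pairs}
        return not (visible & MY_ADDRESSES)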

Cruzio's second-generation filter just became available. Of the 60 unsolicited spams sent to me last week, only one lonely mortgage ad got through. The new filter checks for known spammers and bulk e-mailing software, and looks for spam-profile content, text and viruses. Again, I can make exceptions to the rules.

If such strong filters were an option for everyone, spam would die out naturally.

Robert Patton
Santa Cruz

Postini worth fee

South Valley Internet in San Martin (www.garlic.com) uses a very good filter system from Postini (www.postini.com). It costs me $1.25 a month and it's worth every penny.

The suspect messages are held by Postini, and at a certain volume level -- about 60 messages -- I am notified. I then log in and look at the subject matter. I choose ``delete'' and all the messages are flagged with a check mark. Then I uncheck any that I want to save, usually only one or two. When a user starts out with Postini, he or she plugs in legitimate e-mail addresses, and these are not filtered by the service.

Before Postini, it was a daily nightmare with volume going up and up.

Walt Keeshen Jr.
Morgan Hill
*******************
Government Computer News
Army will tweak its network and software practices
By Thomas R. Temin

SALT LAKE CITY - The Army will launch two initiatives Oct. 1 to help standardize its software and networks.

To bring enterprise management to its many networks, the service's Signal Command at Fort Huachuca, Ariz., will become the Network Enterprise and Technology Command, said Lt. Gen. Peter Cuviello, the Army's CIO, at the Software Technology Conference. The new command, known as Netcom, will have technical control over the information management directors at all Army installations.

The current Signal Command director, Maj. Gen. James Hylton, will head Netcom, Cuviello said.

Also, to make sure all Army programs run the same versions of software, the service will set what it calls a software-blocking policy. The policy will force program managers to get approval for upgrades. Software specifications will be locked down for 18 months, and the Army will deploy only one version of a package to all installations within that 18-month period, Cuviello said.

"So when a program manager says 'I've got Version 5.0,' fielding will take place within a set time, and we'll have the entire Army with it," he said.
*****************
USA Today
Philips finds way to 'paint' LCD screens


NEW YORK (AP) - Today, liquid crystal displays, or LCDs, are found in laptops, watches and cell phones, sandwiched between two rigid pieces of glass.

But soon, LCD screens may be daubed onto walls or clothing, say researchers at Royal Philips Electronics, who devised a way of painting a computer screen onto a surface.

The new technique, discovered by five Philips researchers working at the company's research laboratory in The Netherlands, could allow flexible, lightweight LCDs to be mounted on plastic sheets that can be rolled up or folded.

The innovation could sustain LCD technology's viability versus competing display techniques such as organic light-emitting diode systems, said Bob O'Donnell, a display technology analyst with IDC.

If the technology catches on, cheap, paintable LCDs could wind up in unheard-of places, O'Donnell said.

"Count how many screens you have in your home and multiply that by a big factor," he said.

The Philips technique, called "photo-enforced stratification," involves painting a liquid crystal-polymer mix onto a surface such as a sheet of plastic film, then exposing it to two doses of ultraviolet radiation.

The radiation forces the mixture to separate into a honeycomb of tiny individual cells covered by a flexible, see-through polymer. When connected to a computer, the crystal-filled cells change color to create a picture, like any LCD.

Some kinks need to be worked out. So far, the company's scientists have painted only glass with their liquid crystal-and-polymer mixture, because glass is less susceptible to the distorting contamination that plagues plastic.

An article on Philips' research into the technique appears in Thursday's issue of the journal Nature.

The article suggests that the LCD industry ought to quickly overcome remaining hurdles to LCD-enhanced clothing.

"We can look forward to the day when we will be able to put displays on almost anything," the article says.

The emergence of flexible displays is seen as one of the last impediments to more pervasive portable computing.

Wireless transmission, fast processors and small memory components have allowed computers to shrink to fit inside portable devices like cell phones and personal data assistants.

But screens remain rigid, either attached to a clunky laptop or shrunk to fit onto a cell phone or PDA too small to surf the Web in a rewarding way.

Several companies have marshaled developers to work toward lightweight flexible screens that can be rolled up or folded and carried in a pocket. A wireless Internet-connected display would allow someone to read, say, an instantly updated electronic newspaper while riding on the bus.

Philips is exploring multiple paths toward this goal, developing flexible plastic transistors as well as a type of "electronic paper" display in concert with Cambridge, Mass.-based E Ink, which is developing paper-like screens.

IBM has also done research on flexible displays.

"Trying to surf the Net over your cell phone is like watching a move through a keyhole," said Russ Wilcox, E Ink's co-founder. "What people want is a big display that is very portable, so they can get access to all the information on the Internet."

Philips is an investor and development partner for ultra-thin, lightweight but non-flexible displays E Ink plans to release next year, Wilcox said.
****************
Government Executive
House lawmakers advocate biometrics-based licenses
By Maureen Sirhal, National Journal's Technology Daily


Reacting to the security lapses revealed by the Sept. 11 terrorist attacks and the growing problem of identity theft, Republican Tom Davis and Democrat James Moran, both House members from Virginia, on Wednesday introduced a measure that would enable states to store biometric information on drivers' licenses.

The measure seeks to standardize security features for all state-issued licenses. Unique biometric features of applicants would be stored in computer chips contained on the licenses. The notion behind the bill is to stop people from fraudulently using state-issued identification.

"This is something that we have had the technology to do for some time, but we think what happened on September 11 makes a compelling case to do now," Moran said at a press briefing. Several terrorists responsible for the Sept. 11 attacks on New York and Washington had obtained fake driver's licenses from states including Virginia.

The measure endorses an idea championed by the American Association of Motor Vehicle Administrators as a way to close security gaps in the process of granting licenses, which in essence have become an ID system.

The Moran-Davis bill would allow state departments of motor vehicles to link their databases, but the agencies could only collect biometrics information with the express consent of consumers. Moran said the measure would not create a "national ID" but would merely give states that need to verify the authenticity of license applications access to other states' databases.

"States will be able to check with other states and find out if [a] person should be issued a driver's license" Moran said, but the bill would place very "strict controls to protect people's privacy. We have looked at all of these potential problems with this, and I really think we have created the legislation in a way to address any concerns that people might have."

Experts from the Progressive Policy Institute (PPI) argued that the measure would achieve a careful balance between security and privacy concerns. Non-governmental entities could not collect the biometric information, PPI Vice President Robert Atkinson said, but consumers eventually could choose to use the system to authenticate their identities in online transactions.

Under the bill, the Transportation Security Administration, in concert with states and other federal agencies such as the National Institute of Standards and Technology, would set guidelines for implementing the system, including the creation of standards for connecting databases and encoding chips. The federal government also could provide $300 million in grants.

Privacy advocates charge that the measure could compromise privacy. In a February report, the Electronic Privacy Information Center (EPIC) said that the proposal could lead to profiling and that the use of biometrics is not necessarily fraud proof.

Moran countered on Wednesday that biometric data would be virtually impossible to tamper with, and even if the information were compromised, he said, individuals could prove their identities in other ways, using birth certificates or medical records.
****************
Government Executive
IRS sets the standard for protecting privacy
By Joshua Dean
With a special tool designed to ensure that information is protected when new information systems are built, the IRS is setting the standard for federal agencies and other governments in protecting privacy in the age of electronic information.



In an interview with Government Executive, Peggy Irving, the IRS' privacy advocate, said the era of electronic records has people more concerned about privacy than ever before. "The public is more concerned about electronic records than paper records," she said, "especially because they can be sent globally in an instant."



Irving said it is important to ask who has access to sensitive information and to identify whether controls are in place to uphold privacy policies, especially when numerous databases are connected with one another across agency boundaries as a result of new initiatives to share information.



Since taking over the position in 1999, Irving has created a privacy impact assessment (PIA), which the IRS uses to help design new information systems under its massive Business Systems Modernization program. "The IRS uses the PIA to ultimately review what information should be collected and why it should be collected," Irving said. "It also asks if the information is relevant and from the most timely and accurate source."



The PIA asks a series of questions intended to ensure that privacy protection is designed into new information systems and that the least possible amount of personal information is collected. "Identity theft has become an issue," Irving said. "We analyze every IRS form and scrub them to make sure the agency is only asking for the information we absolutely need."



Irving's work at the IRS has not gone unnoticed. The federal Chief Information Officer's Council has called the PIA a best practice. Other federal agencies have come to the IRS for advice on privacy standards and assessments, as have businesses and foreign governments.



Irving is not hesitant to share the PIA. She has met with representatives from the FBI, the Coast Guard and the Navy to discuss the best practices embodied by the tool. "The FBI immediately saw the rightness of the PIA -- [the agency] really does want to assure the public and encourage cooperation," she said.



In 1993, the IRS became the first federal agency to have a privacy advocate. Irving took over the position in 1999 after working on privacy and disclosure issues at the Justice Department for more than 20 years. She said the Department of Health and Human Services was the next agency to create the position, since Americans are as concerned about the privacy of their medical information as they are about their financial information.



The positions of privacy advocate and chief privacy officer have since become more prevalent in both the public and private sectors. To date, such companies as Hewlett-Packard Co., IBM Corp. and Procter & Gamble Co. have created privacy advocate positions. Agencies including the Justice Department and the Postal Service have also appointed privacy advocates. Irving's office has grown from a staff of three to a staff of 12, reflecting the premium the IRS puts on privacy, she said.
******************
Government Executive
Tech firms ally to push homeland security solutions
By Liza Porteus, National Journal's Technology Daily


Technology companies and government need to enhance collaboration as they seek the most effective way to secure the nation's airports and other transportation methods and facilities, industry leaders said Monday.

EDS, Oracle, PwC Consulting and Sun Microsystems recently formed an alliance to help the Transportation Security Administration and other federal agencies identify technologies for boosting transportation security. Last month, the group began offering a package that combines background on individuals with biometrics technology to enable a frequent-traveler program. The package also included a "secure employee" registration and authentication program designed to identify and assess security risks via existing employee information.

But "we can't even begin to define the limits of what has to be done here," EDS CEO Dick Brown said during a Council for Excellence in Government luncheon in Washington. "We're as vulnerable today to an electronic Pearl Harbor as we were on Sept. 11."

Brown suggested that White House Homeland Security Director Tom Ridge "take a good, hard look" at the nation's "most obvious" vulnerabilities and determine the most detrimental impact that could occur by exploiting those vulnerabilities. Although the tech community has been looking to aid government in installing new technologies to help detect potential terrorists, systems are still faulty, panelists said.

"This is a combination of process and technology," said Scott Hartz, global managing partner for PwC, adding that no one company or agency can succeed on its own.

Oracle executive Steve Perkins said the technological solutions for many of the nation's security challenges exist, but the TSA and other agencies need to be educated on how those solutions can be most effective. Despite some privacy groups' concerns with ideas such as a centralized database, or biometric ID cards, "we're going to have to push that line back and forth" between security and privacy, Perkins said.

"It's really a problem of political will and a will of citizens. What do we want to trade to be secure?" Perkins asked. "The question is, to secure the nation, where do we draw that line?"

Sun Microsystems CEO Scott McNealy echoed those sentiments, saying that as more technology is used in homeland security, a "huge set of tradeoffs" is created. People want to be more secure, yet as more ideas are floated on how technology can affirm identities and track people's whereabouts, many say their privacy needs more protection and should not be sacrificed in the name of security.

But "you're not losing privacy that you haven't lost anyhow" by using new technologies, McNealy said. "Anonymity breeds irresponsibility," he said, adding that anonymity is a "very dangerous weapon."

McNealy cited Sun's work with the Liberty Alliance, the consortium of technology companies spearheaded by Sun that was created to develop e-commerce standards. He said the group's work on data-sharing standards is an example of how organizations and governments can do more to use technology for homeland security.
***************
Nando Times
Technology: Senator blasts broadcasters for missing digital deadline
Copyright © 2002 AP Online


WASHINGTON (May 1, 2002 9:26 p.m. EDT) - A U.S. senator Wednesday criticized television broadcasters for failing to meet a deadline in the transition to digital and said Congress may intervene if significant progress doesn't emerge.

While crediting some broadcasters for contributing to the shift to digital, Sen. John McCain, R-Ariz., called the transition a "grave disappointment for American consumers."

On Wednesday, all commercial TV stations in the U.S. were supposed to be broadcasting a digital signal, but, as McCain pointed out, most missed it.

"Today it is clear that three-quarters of those broadcasters have not met their commitments, and their failure to do so is slowing the transition to digital television," said the ranking Republican on the Commerce Committee.

The target date for a complete transition to digital signals, which offer crisp images and sound, is 2006. The rollout, however, has run into obstacles, including concerns by content providers that their works will be pirated.

McCain said a slow transition affects Americans not only as consumers, but also as taxpayers.

"Broadcasters were given $70 billion in spectrum to facilitate the transition on the condition that they return it when the transition is complete," the senator said. "By failing to meet today's deadline, broadcasters continue to squat on the taxpayers' valuable resource."

The wireless industry, eager for spectrum, also criticizes broadcasters for not making timely digital advances.

"The wireless industry has proven that consumers want and use digital technology," the Cellular Telecommunications & Internet Association said. "It's also the most efficient and responsible use of valuable spectrum."

Michael Powell, the chairman of the Federal Communications Commission, a month ago proposed voluntary steps for the industry to accelerate the shift to digital. Certain broadcast networks and the cable industry have expressed willingness to go along with the plan.

McCain said he hopes those commitments lead to results. He said the FCC plan is appropriate for now, but alternative measures might be necessary in case of further delays.

Edward Fritts, president of the National Association of Broadcasters, said his group is pleased that the cable industry is moving toward carrying digital broadcast signals.

"We look forward to the day when cable operators carry all digital broadcast signals in their entirety," he said.

The NAB has said that 324 digital television stations are on the air in 108 markets.
******************
Nando Times
Technology: HDTV: a 'revolution' in search of couch potatoes
By NOEL C. PAUL, The Christian Science Monitor


WARWICK, R.I. (May 1, 2002 3:00 p.m. EDT) - Mary Dean watches television all day, every day. The Army officer from Cranston, R.I. - now in the middle of a pregnancy - lies horizontal on her couch, morning to night, flipping channels.

Looking for a new set recently in a nearby electronics store, however, Dean shows no interest in models that promise crystal-clear pictures and sound.

"I wouldn't pay more for what someone else considers more clear," says Dean. "Look, I can see this fine," she adds, pointing to a standard model.

The average American watches four hours of TV a day. And yet Americans show so little interest in picture quality, one might think they considered TV sets mere accompaniments to evenings spent composing poetry in iambic pentameter.

Three-quarters of the nation's TV stations required to broadcast programs in high-definition (HDTV) format will miss their federally mandated deadline today, largely because consumers are content with plain TVs.

A study by the General Accounting Office, a watchdog congressional group, found that more than 800 stations - 74 percent of those required to meet the new standards - aren't ready to send at least some "high definition" programming.

Congress set the deadline six years ago, calling the high-definition leap the most important consumer-electronic advancement since TVs moved to color.

But less than 1 percent of Americans have bought in. The reasons, experts say, pertain as much to U.S. culture as to consumer economics.

In general, Americans ask little of their televisions. For many, the first and only requirement is that they be on. "It's electronic wallpaper," says Jim Beniger, professor of communication at the University of Southern California in Los Angeles.

Those who turn on their TVs for ambience are not likely to be moved by a salesman's advocacy of the virtues of better pixelation.

"As long as it doesn't go like this, I'm happy" says TV shopper Ray Sampson, drawing swirly lines in the air.

Televisions labeled "high definition" are equipped with a decoder that reads special digital transmissions from TV stations. The picture features six times the number of pixels on a standard TV. The result is crisp resolution rivaling that of cinema screens.

"You can see the patches of sweat forming underneath Tiger Woods' arms," says Cecil Houston, director of communication, culture, and information technology at the University of Toronto.

Engineers and entertainment gurus have touted the "high-definition revolution" for more than 20 years. In the early days, advocates promoted HDTV with almost messianic fervor, and posed dire warnings that Japanese suppliers would dominate the industry.

In 1988, Ronald Reagan became the first president to be recorded on HDTV, stating that his remarks represented a "historic moment for both the presidency and American broadcasting." Ten years later, the TV sets were not even available in stores.

HDTV's stay on the shelves

Through last fall, Americans had bought only about 1.7 million HDTV sets - fewer than 900,000 of them in 2001. Consumers bought more than 21 million standard color TVs last year alone.

Even so, high-end HDTV retailers report growing sales. And more than 100 stations in major markets have begun broadcasting shows in high-definition format.

Overall, however, programs are sparse. The major networks only transmit a few marquee shows - like NBC's Tonight Show - in high definition. The same is true of nearly all cable operators. Broadcasts sent in the normal format look the same on both HDTV sets and regular models.

Gregory Sanders, a salesman at Ultimate Electronics in Albuquerque, says his sales pitch for HDTVs is hindered by the lack of programs broadcast digitally.

"Customers don't understand that stations aren't broadcasting yet," says Mr. Sanders. "They think they can just plug in the TV and instantly get high definition."

The biggest consumer turnoff: HDTV sets cost about twice as much as standard models, and Americans are not likely to pay a premium price for visual perks. While it's true that U.S. consumers bought more than 14 million DVD players last year - which offer better pictures than VCRs - they were prompted by low prices: quality DVD players cost only $150, slightly more than VCRs.

A revolution stalled?

Now regulators are worried. Federal Communications Commission Chairman Michael Powell recently encouraged broadcasters, TV-set makers and cable operators to mount a voluntary effort to spark the HDTV revolution. Enthusiasm is not boiling over: according to the GAO report, a third of the stations that have adopted HDTV would not have done so without a government mandate.

They cite a number of disincentives. Among them: the high cost of upgrading transmitters and the scant number of televisions on the market that can accommodate the signals. Both stem from a lack of consumer interest, experts say.

Americans may also be less wowed by innovation than consumers in Western Europe and Japan. While baby boomers experienced the first TVs and then the onset of color, new technology seems to have lost its luster, according to Paul Witt, a professor of communication at the University of Texas at Arlington. "The innovations are coming faster and at greater volume; they're just not as novel anymore," says Witt.
**********************
Euromedia.net Denmark
Zentropa says it has designed a revolution


Film company Zentropa, partly owned by prize-winning Danish director Lars von Trier, is behind Tvropa.com.

Tvropa has been a playground for new talent and an experiment in exploring the Internet as a business model for the film company.

It has been producing and broadcasting a wide range of internet TV on its website: from weird animated shorts to political talk shows.

Surprisingly, its deficit last year was only E140,000 -- perhaps because, in the film business, many people work for nothing and finance low-budget productions themselves to get a shot.

The creative output has been of high quality, and that is a recipe for success Zentropa found previously with the Dogme95 films, which were received enthusiastically by audiences all over the world.

The co-owner and president of Zentropa, Peter Aalbaek Jensen, has a nose for business and for marketing awkward projects.

The Danish media will forever remember the evening in Cannes when Aalbaek Jensen promised an exclusive interview with Nicole Kidman, who is starring in Lars von Trier's latest film, to the journalist who won a naked swimming contest in his swimming pool.

His obscure ideas are certain to capture headlines on the front pages.

Tvropa.com's other co-owner is Aalbaek Jensen's brother, Niels Aalbaek Jensen, which should guarantee that there are some surprises in store.

Recently, the company announced nothing less than a revolution. Together with Siemens and Motorola, it has developed a set-top box that can distribute digital TV (DTV) signals to several TVs.

It has also explored the MPEG4 standard and concluded that the DTV network has room for four or five times as many channels as currently estimated. The number rises further for TV at less-than-DVD quality, Tvropa.com claims.

Furthermore, the company says it has come up with a model that dramatically lowers the cost of establishing a DTV network in Denmark. How exactly it has done this is still a secret.

The network has become a hot political topic in the Danish parliament, where the recently-elected Liberal government has plans to pass new media legislation within a few months.

Tvropa.com has announced that the set-top box, the MPEG4 standard, and the secret network plan are key elements in a proposal that it plans to present to the media spokesmen of the Danish political establishment in early May.

The new chairman of Tvropa.com's board is an experienced and respected businessman: Anders Knutsen, formerly the president of Bang & Olufsen. Two well-known companies have bought a 30 per cent share and accounting firm Deloitte & Touche has announced an E100m investment plan.

All this to impress the politicians and take the lead in the political negotiations for new legislation.

The Aalbaek Jensen brothers have set a new standard for their own audacity by trying to influence a political process in this manner.

The exact plans will be studied carefully by the trade when they are revealed in May. Though Peter Aalbaek Jensen is known for his big cigars and loud mouth, he is also known for his surprising successes.
******************
CNET
Canada and broadband Internet access


Smaller population plays major role in online shopping

Though Canadians have ever-increasing access to the Internet and all the services it offers, our American counterparts have more choice when it comes to shopping online.

A new study released April 30, 2002, by Toronto-based marketing research firm Ipsos-Reid shows that nearly half (48 per cent) of Canadian adults with a home Internet connection have a broadband connection -- further proof that Canadians continue to be at or near the top in the world when it comes to Internet adoption and usage.

The findings from the company's Canadian Inter@ctive Reid report - a quarterly review of Canadian Internet trends - also show that 75 per cent of all Canadians have access to the Internet, one of the highest adoption rates in the world. Additionally, 63 per cent of Canadian adults access the Internet from home.

Broadband Internet access has doubled in the past two years in Canada. In March 2000, Ipsos-Reid found that only 24 per cent of Canadians with a home Internet connection had a broadband connection (DSL or cable modem), versus 48 per cent in March of this year. Put another way, 30 per cent of all Canadians access the Internet from home with a high-speed connection, which can be 25 times as fast as dial-up services.

"Broadband has gone mainstream in Canada, and its growth has been nothing short of phenomenal," said Chris Ferneyhough, Vice President of Technology Research at Ipsos-Reid in Toronto. "Aggressive promotions from the cable and telephone companies have obviously succeeded in attracting thousands of new broadband users, young and old. But there's also no denying that once you've tried broadband, you're hooked."

This increase in Internet use may be due in part to the government of Canada's National Broadband Task Force, which was set up to ensure all communities have broadband access by 2004.

In the FAQ section of the Task Force Web site, the need for broadband is explained this way: "New advanced applications like tele-health, distance learning, the delivery of government services and e-business require broadband access. These applications have the potential to greatly enhance the lives of Canadians, whether through more learning opportunities, better access to health care or improved business opportunities. It's important that all Canadians have the opportunity to access these services, no matter their location, their education or their income."

Twenty-two per cent of Canadians live in small communities. In a 2001 report to the National Broadband Task Force, the Rural Secretariat of Agriculture and Agri-Food Canada said people living in rural areas should not be penalized and should be able to get high-speed Internet connections at a reasonable price, to ensure that everyone in the country has the same access to information and technology.

In an April 2001 presentation, the Task Force showed that access falls the farther a community is from urban areas: the share without access rises from 8 per cent to 32 per cent.

The Broadband Task Force suggested it would take a $1.3 billion to $1.9 billion investment by the Canadian government in fibre and wireless technology for many communities to get the Internet. This does not include help for businesses and community access groups. Each province and territory has also set up provisions for helping its inhabitants get access to the Internet.

The incidence of broadband in Canada dwarfs that of the US, where only an estimated 21 per cent (see Note 1) of households with a home Internet connection are using broadband. Of that 21 per cent, 10 per cent have DSL only, while six per cent have cable only and five per cent have access to both. Meanwhile, the percentage of European households with broadband connections has been estimated at about 5 per cent.

"The acceptance of broadband in Canada and the US couldn't be more different," said Ferneyhough. "The difference is due to a myriad of factors, including lower access prices in Canada, a less fragmented industry relative to the USA, our regulatory framework, better and more reliable access, and extremely positive response from consumers to marketing campaigns."DSL providers in Canada have steadily increased their broadband market share from 29 per cent of broadband users in March 2000 to 42 per cent currently, the study found.

Not only has broadband penetration reached a new high, but Ipsos-Reid also found that the incidence of overall Internet access has reached 75 per cent in Canada, compared with 68 per cent at the same time last year. This incidence is also higher than that found in the US, where approximately 69 per cent of all adults have Internet access. Canadians are at or near the top in the world in overall Internet adoption, online banking and music downloading, earlier studies have confirmed.

As for home Internet access, 63 per cent of Canadian adults access the Internet from home. While the incidence of household Internet access is lower in the US (55 per cent), the larger US population means that 114 million American adults access the Internet from home (see Note 2), compared with only 15.1 million Canadian adults.

"Ultimately that is why Americans are generally going to have a lot more choice, selection, and opportunity to purchase goods and services online," said Ferneyhough. "We may be ahead of the game in access and broadband on a per capita basis, but overall there are 137 million (see Note 3) American adults online compared to only 18 million Canadian adults."

Note 1: Rachel Konrad, "Survey: Broadband goes mainstream," CNET News.com, March 5, 2002.

Note 2: Humphrey Taylor, "Internet Penetration Increases to 66 per cent of Adults (137 million)," Harris Interactive, April 17, 2002.
****************
Sydney Morning Herald
Broadband the decade's boomer, says Budde
By Alan Wood
May 1 2002


Independent IT and telco analyst Paul Budde says broadband will be the next cash cow for the telecommunications industry, and telcos that focus too much on mobile or narrow band networks will suffer.

Mr Budde, speaking on industry trends in Sydney today, said the Australian telecommunications market would be worth $90 billion in 2010, compared with $38 billion in 2001.

Broadband - the transmission of large amounts of electronic information including telephone calls, television and the internet, often over phone lines, cables or satellite - would account for $81 billion of that, he said.

Mr Budde said the telecommunications market was currently dominated by commodities-based products, and telcos would need to develop niche products to help grow revenues.

"The nature of commodities is prices are going down, margins are going down," he said.

"On the other side there is this value added element - value added services need to be bought to the network in order to have these extra margins."

He said the broadband network offered telcos more opportunities to build such value-added applications than mobile or narrowband networks did.

He said mobile would remain voice driven, peaking at $7 billion annual revenue in 2002/2003 and dropping to $5 billion by 2010.

"There are far more opportunities on the broadband network to make lots of money out of it for companies, organisations as well as the telcos, than there are on the mobile network," Mr Budde said.

"I think 3G (mobile) is dead - I think we will never see the introduction of 3G. If anything happens it will be 4G towards the end of this decade".

Carriers including Telstra, Optus and Hutchison Australia have said they're committed to the development of a 3G mobile network, with Hutchison planning to roll out a network in late 2002/early 2003.

Mr Budde said "killer applications" on the broadband network would include family-based content, for example, relatives sending personal videos from one side of the world to the other.

He said only 150,000 Australian homes currently had broadband services, whereas more than 50 per cent of all 7.2 million households could be addressed by broadband.

Mr Budde said broadband development was being held back by a lack of foresight and network development by the larger telcos, including Telstra.

He said the Telecommunications Industry Ombudsman received around 70,000 complaints per year compared to around 7,000 for the banking industry.

"Unfortunately the vision is not there and therefore you will start seeing reactive situations, rather than the planned organisation of how to implement broadband in the market," he said.

He said that, with the Federal Government bringing new emphasis to broadband, companies like Canberra-based TransACT, Victoria-based Neighbourhood Cable and other regional service providers would emerge with greater prominence.
*******************
CNET
Canada vulnerable to cyber attacks
By DAVID GAMBLE -- Sun Media
Read this before you e-mail your income tax.


The federal government's computer systems -- and all the personal data they contain -- are extremely vulnerable to cyber crooks, warns Auditor General Sheila Fraser.

Federal computer security is years "out of date," meaning it could fall prey not only to computer viruses but also to hackers who break into computers to steal and manipulate personal information, Fraser warns in her latest report.

"Before Canadians go online to do business with the government, they want assurance that government systems are secure and that their personal information will be properly protected," Fraser said.

Her staff did some cyber detective work that revealed some alarming security gaps. Attempts were made to hack into 260 federal government systems; 85 -- nearly one-third -- were found to be vulnerable to unauthorized access, and most of those could be "readily compromised by a targeted cyber attack."

In another test, 10,000 government phone numbers were dialed by computer, and 97 were found vulnerable to hackers, many of them connected to "unauthorized modems" that open a door to government information.

None of the departments found wanting were named in the report, but they have been informed of the problems.

While the federal government recently adopted a new computer security plan, Fraser warns that technical standards date back to 1995 and 1997.
*****************
Sydney Morning Herald
Like hackers to a honeypot
By Patrick Gray
April 30 2002


A honeypot is a computer connected to the Internet in the hope that it will be hacked, enabling computer security researchers to capture tools and techniques used by hackers.

It is not, as many believe, a bitchy and vindictive attempt to snare and then prosecute hackers.


A honeypot is the real-world equivalent of parking an expensive car down a dimly lit back street and waiting for someone to break into it.


The car's owner doesn't need to position a sign on the windscreen that says "steal me", just as the administrator of a honeypot does not need to make their system more vulnerable to attack than the next.

Honeypot administrators do nothing to make their systems less secure. The only difference between a honeypot and a normal computer connected to the Internet is that every single byte that hits a honeypot is recorded and analysed. This makes it possible for analysts to determine when and how the system has been breached.

It is the "how" that is particularly interesting. For a long time, black-hat hackers have done everything in their power to conceal the methods and entry points that they use to hack into computers.

If the methods and entry points are unknown, a defence against attacks that use them is impossible to formulate. When a honeypot is hacked, all its administrator needs to do is look at the data recorded immediately before the breach, which shows precisely how the honeypot was compromised.
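
The record-everything idea is simple to illustrate. The toy Python listener below accepts connections on an arbitrary port and logs every byte sent to it, timestamped for later analysis. Real honeypots capture at the packet level and disguise themselves far more convincingly, so treat this purely as a sketch of the principle.

    import socket
    import time

    def run_toy_honeypot(port=2323, logfile="honeypot.log"):
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("0.0.0.0", port))
        srv.listen(5)
        log = open(logfile, "ab")
        while True:
            conn, peer = srv.accept()
            stamp = "\n--- %s from %s ---\n" % (time.ctime(), peer[0])
            log.write(stamp.encode())
            data = conn.recv(4096)
            while data:          # record every byte the intruder sends
                log.write(data)
                data = conn.recv(4096)
            conn.close()
            log.flush()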

The honeynet project is the next step in the evolution of honeypot methodologies.

A honeynet is a group of honeypots scattered over the Internet to maximise the chance of capturing hacker tools and techniques.

The project was established in 1999 by a group of security professionals.

The first phase of the project ran for two years and focused on first-generation honeypots. It was essentially a group of honeypot administrators who shared information on attack profiles through a mailing list.

The project has just entered its second phase. According to its website, the objective is to "develop a superior honeynet, one that is easier to deploy, harder to detect, and more efficient in collecting data".

"These (second-generation) honeynets will be deployed targeting the more sophisticated attackers; improving our intelligence-gathering capabilities, we are also developing the infrastructure for distributed deployments of honeynets, working with various educational, government, commercial and military organisations," it says.

The objectives for phase three, due to begin in 2004, are being developed by the project board of advisers and active members.

Among these members are the chief executive of Foundstone, the chief technology officer of Counterpane and SecurityFocus.com, as well as miscellaneous security personalities such as Fyodor, the creator of the widely used nmap network scanning utility.

Phase two has already tasted success.

On January 8 this year, the project captured a previously unknown and undisclosed exploit.

By January 14, a CERT advisory was released detailing the technique used by the hacker to compromise the honeypot. This enabled security analysts to build a sophisticated defence against the captured method.

Until this method was known, it had been extremely difficult to detect this type of attack.

The honeynet project also enables researchers to capture hacker "tool kits". The modus operandi of the average hacker is to load their tools on to systems that they have compromised.

They can use their fresh conquest to launch attacks on other systems.

But honeypot builders know how to prevent their systems being used as launching pads for these types of attacks.

Hackers are generally less careful when they don't think they're being watched, and, by making honeypots appear to be lifeless and banal back-end systems, it is likely that they will upload their juiciest tool kits and back doors immediately after compromising them.


If the distribution of the honeypots is eventually both dense and consistent enough to provide comprehensive Internet-wide coverage, then it is likely that the honeynet project will provide the security community with invaluable security vulnerability information.


The honeynet project website can be found at:
******************
New Zealand News
NZ slips in e-government stakes
29.04.2002
By ADAM GIFFORD

New Zealand has slipped five places to 14th in an international survey of progress towards e-government by global consulting firm Accenture.

But the introduction of a new whole-of-government web portal could push us back up the rankings.

The top three countries were Canada, Singapore and the United States. They were singled out for putting the citizen at the centre of their efforts and for the way they had integrated the online offerings of different Government agencies.

Australia jumped to fourth in overall rankings because of a significant increase in the number of Government services offered online.

Accenture's government relations head for New Zealand and Australia, Jack Percy, says New Zealand is moving a little slower than other countries, but the survey is probably not a fair reflection of the amount of attention e-government is getting here.

The e-government unit in the State Services Commission has done a lot of behind-the-scenes enabling work in terms of data standards, connectivity standards and security, and it is due to roll out the new Government portal from July.

Percy says the real test will be whether all that activity leads to rapid introduction of a lot of new e-government services.

Trevor Mallard, minister responsible for e-government, says he is not concerned about the slump in the Accenture ranking.

"Developments in our e-government programme are significant this year and I am sure that will be reflected in future surveys," Mallard said.

Accenture surveyed 23 countries, with researchers using Government websites to see which of the services normally available from a government could be obtained online.

It looked for service maturity and the level of completeness the service offered.
*******************
Taipei Times
Wireless TV technology may benefit rural areas
NY TIMES NEWS SERVICE, NEW YORK



A new television technology is coming after eight long years of debate.


It's digital. It's wireless. It's local. It offers high-speed Internet access. It's akin to digital cable -- but without any cables.

The service doesn't yet have a catchy one-word name like "satellite" or "cable." When the Federal Communications Commission approved the technological service last week, it referred to it as a "multichannel video distribution and data service."

Whether this new wireless television technology can gain a footing in a field crowded with cable and satellite behemoths remains to be seen. One advantage is that it could bring broadband access to rural areas not served by cable. In addition, it could bring local channels to the 170 or so television markets whose satellite services do not carry local channels. And it potentially has a key advantage that many consumers care about: lower price. Wireless cable, which uses a network of land-based antennas to carry signals to and from a small dish at a user's home, is supposed to be cheaper than wired cable or wireless satellite service.

"This will be the Southwest Airlines of subscription television," said Sophia Collier, the president of Northpoint Technology, the small company that originally envisioned the technology.

Last week the US FCC ruled that the spectrum would be auctioned off in hundreds of geographic pieces. That decision opens up competition to a host of bidders. Satellite companies could potentially buy up licenses to create a combination satellite-terrestrial service, with the broadband and local channels being served by wireless cable.
*******************
Wired News
The Art of Misusing Technology


NEW YORK -- Hacking has been described as a crime, a compulsion, an often troublesome end result of insatiable curiosity run amok.

Rarely has anyone who is not a hacker attempted to portray the creation, exploration and subversion of technology as a valid and elegantly creative art form.

But Open_Source_Art_Hack, a new show opening Friday at Manhattan's New Museum of Contemporary Art, attempts to show how the act of hacking and the ethics of open-source development - direct participation, deep investigation and access to information - can be art.

Each piece and performance features technology altered by an artist-geek with an activist attitude, something that the curators of the show refer to as "hacking as an extreme art practice."

"Originally the word 'hacker,' as coined at MIT in the 1960s, simply connoted a computer virtuoso," said Jenny Marketou, a new media artist and co-curator of Art_Hack. "Now hacking means reappropriating, reforming and regenerating not only systems and processes but also culture."

Art created with open-source ethics in mind allows artists to become providers of more than pretty pictures. They can produce functional tools that they and others can then use to create new art forms, said museum director Anne Barlow.

"And given the nature of open source, the process can be as important as the end product," Barlow said.

Process -- how the art was created and how it can evolve -- is one of the key focal points of the show. Activism -- using art and hacking to tweak a system or totally sabotage it -- is the other primary focus.

In this show, it doesn't matter much how the art looks. What matters is what the artist and others can do with it and learn from it.

"I have come to think of hacking as a process involving a combination of information dissemination, direct action, skills and creative solutions," Marketou said. "Hacking is an important phenomenon and a metaphor for how we digitally manipulate and think our way through the networked culture that engulfs us."

Art_Hack will have a number of interactive exhibits that will involve museum visitors in altering or undermining the code used every day in software and society.

One installation will allow viewers to clone their own "data bodies" and set them loose on the Net. The clones then serve as a sort of digital double-identity, allowing people -- at least in theory -- to deflect any data-gathering invasions of privacy.

Another piece explores the same disinformation idea by using automated tools to create fake homepages. The bogus pages are then propagated through various search engines so that it becomes impossible for anyone to verify anyone else's personal data. The idea spins off the common practice of providing false information on mandatory website registration forms.

In Anti-wargame, Future Farmers artist Josh On challenges the ideas behind most computer games. On's games reward players who demonstrate even a shred of social conscience.

Cue P. Doll/rtmark's CueJack project turns the infamous "CueCat," an electronic device intended to provide marketing information to corporations, into a tool that provides consumers with information. With one swipe of the CueJack, consumers can access a database with "alternative" information about a scanned product's manufacturer.

"I am interested in ways that artists misuse technology, use it for other than its intended or sanctioned purposes," said Steve Dietz, curator of new media at the Walker Art Center, and co-curator of Art_Hack. "This sort of transformation appears to be a common if not fundamental aspect of any artistic use of technology, including coding and hacking."

Also on display will be art created from the data collected by electronic wiretaps known as "packet sniffers." This project allows museum visitors to check the security status of networks belonging to various activist groups. When the sniffer finds a security hole, it will launch a sound and light show in the museum.

"Due to the nature of the show, I had the opportunity to challenge and to be challenged by the museum on several legal issues which were addressed in some works," Marketou said. "It has been a surprise to me how most of the cultural institutions in this country are not ready yet to host this sort of exhibition because of both the technical and controversial issues raised by the politics associated with some of these works."

Art_Hack opens with a "Digital Culture Evening" hosted by Marketou and Dietz, who will discuss hacking as art.

Other planned programs include German hacker Rena Tangens discussing European versus American concepts of privacy, and guided walking tours to spot the hidden surveillance equipment installed on and above the streets of Manhattan.

Art_Hack runs through June 30.
*****************
News Factor
HP Unveils New Digital Photo Storage Scheme

HP announced Thursday that it will begin storing digital images on its own Web sites for users of its "Instant Share" program for digital cameras. Full Story http://www.newsfactor.com/perl/story/17558.html#story-start

*********************


Lillie Coney
Public Policy Coordinator
U.S. Association for Computing Machinery
Suite 507, 1100 Seventeenth Street, NW
Washington, D.C. 20036-4632
202-659-9711