August 31, 2003

Usability 101: the What, Why, and How of User-Centered Design

(Jakob Nielsen's Alertbox)

A useful summary of what makes web sites tick: it comes down to usability and usefulness.

Usability 101
Summary:
What is usability? How, when, and where can you improve it? Why should you care? This overview answers these basic questions.
This is the article to give to your boss or anyone else who doesn't have much time, but needs to know the basic usability facts.

What
Usability is a quality attribute that assesses how easy user interfaces are to use. The word 'usability' also refers to methods for improving ease-of-use during the design process.

Usability has five quality components:

Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?
Efficiency: Once users have learned the design, how quickly can they perform tasks?
Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?
Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors?
Satisfaction: How pleasant is it to use the design?

There are many other important quality attributes. A key one is utility, which refers to the design's functionality: Does it do what users need? Usability and utility are equally important: It matters little that something is easy if it's not what you want. It's also no good if the system can hypothetically do what you want, but you can't make it happen because the user interface is too difficult. To study a design's utility, you can use the same user research methods that improve usability.

Why
On the Web, usability is a necessary condition for survival. If a website is difficult to use, people leave. If the homepage fails to clearly state what a company offers and what users can do on the site, people leave. If users get lost on a website, they leave. If a website's information is hard to read or doesn't answer users' key questions, they leave. Note a pattern here? There's no such thing as a user reading a website manual or otherwise spending much time trying to figure out an interface. There are plenty of other websites available; leaving is the first line of defense when users encounter a difficulty.
The first law of e-commerce is that if users cannot find the product, they cannot buy it either.

For intranets, usability is a matter of employee productivity. Time users waste being lost on your intranet or pondering difficult instructions is money you waste by paying them to be at work without getting work done.

Current best practices call for spending about 10% of a design project's budget on usability. On average, this will more than double a website's desired quality metrics and slightly less than double an intranet's quality metrics. For software and physical products, the improvements are typically smaller -- but still substantial -- when you emphasize usability in the design process.

For internal design projects, think of doubling usability as cutting training budgets in half and doubling the number of transactions employees perform per hour. For external designs, think of doubling sales, doubling the number of registered users or customer leads, or doubling whatever other desired goal motivated your design project.

How
There are many methods for studying usability, but the most basic and useful is user testing, which has three components:
Get hold of some representative users, such as customers for an e-commerce site or employees for an intranet (in the latter case, they should work outside your department).
Ask the users to perform representative tasks with the design.
Observe what the users do, where they succeed, and where they have difficulties with the user interface. Shut up and let the users do the talking.
It's important to test users individually and let them solve any problems on their own. If you help them or direct their attention to any particular part of the screen, you have contaminated the test results.
To identify a design's most important usability problems, testing five users is typically enough. Rather than run a big, expensive study, it's a better use of resources to run many small tests and revise the design between each one so you can fix the usability flaws as you identify them. Iterative design is the best way to increase the quality of user experience. The more versions and interface ideas you test with users, the better.

User testing is different from focus groups, which are a poor way of evaluating design usability. Focus groups have a place in market research, but to evaluate interaction designs you must closely observe individual users as they perform tasks with the user interface. Listening to what people say is misleading: you have to watch what they actually do.

When
Usability plays a role in each stage of the design process. The resulting need for multiple studies is one reason I recommend making individual studies fast and cheap. Here are the main steps:
Before starting the new design, test the old design to identify the good parts that you should keep or emphasize, and the bad parts that give users trouble.
Unless you're working on an intranet, test your competitors' designs to get cheap data on a range of alternative interfaces that have similar features to your own. (If you work on an intranet, read the intranet design annuals to learn from other designs.)
Conduct a field study to see how users behave in their natural habitat.
Make paper prototypes of one or more new design ideas and test them. The less time you invest in these design ideas the better, because you'll need to change them all based on the test results.
Refine the design ideas that test best through multiple iterations, gradually moving from low-fidelity prototyping to high-fidelity representations that run on the computer. Test each iteration.
Inspect the design relative to established usability guidelines, whether from your own earlier studies or published research.
Once you decide on and implement the final design, test it again. Subtle usability problems always creep in during implementation.
Don't defer user testing until you have a fully implemented design. If you do, it will be impossible to fix the vast majority of the critical usability problems that the test uncovers. Many of these problems are likely to be structural, and fixing them would require major rearchitecting.
The only way to a high-quality user experience is to start user testing early in the design process and to keep testing every step of the way.

Where
If you run at least one user study per week, it's worth building a dedicated usability laboratory. For most companies, however, it's fine to conduct tests in a conference room or an office -- as long as you can close the door to keep out distractions. What matters is that you get hold of real users and sit with them while they use the design. A notepad is the only equipment you need.
Learn More
We'll be running full-day tutorials on user testing and the usability lifecycle at the User Experience 2003 conference in Chicago and London.
My next column will address the main usability misconceptions. Check back in two weeks. (Subscribe to my email newsletter to be notified of new Alertboxes.)

August 31, 2003 at 11:54 AM in Web lifestyle

Yahoo! News - Web Search Content Ads Seen Falling Short

Further signs that online advertising will not work using traditional, or even semi-traditional, media approaches.

Yahoo! News - Web Search Content Ads Seen Falling Short:
"Fri Aug 29, 3:56 PM ET
By Lisa Baertlein

PALO ALTO, Calif. (Reuters) - Web search companies have hyped their new contextual services as the next big thing in Internet advertising, but early results by online marketers show those new ads may be underperforming expectations."

Some consultants overseeing Web ad campaigns are telling clients to proceed cautiously when considering contextual ads, which are served to news and information Web sites when certain words appear in content.

They say the ads can perform more like banner ads -- the humbled, old next big thing of the bygone dot-com boom -- than the lucrative search ads that inspired them.

That could be bad news for companies like Google Inc. and Overture Services Inc. (Nasdaq:OVER), which hope contextual products will boost revenues and their importance to advertisers.

"The results we saw from our contextual ad campaigns looked less like search marketing and more like banners," said Brad Byrd, business development director at NewGate Internet, a Sausalito, California, search marketing agency.

Reprise Media's Joshua Stylman said contextual campaigns can be very effective if managed well, and added they can outperform the targeted search ads that have vaulted Google's and Overture's sales into the billion-dollar range.

But Stylman and Byrd both said the search companies need to match contextual ad pricing to effectiveness. The marketers also said the companies need to make contextual ads easier to track and manage separately from search ads.

Stylman said Reprise currently is "opting out of Google and Overture's contextual products for most of our campaigns."

Representatives from both search companies said they're still gathering information about their contextual products, which were launched as recently as six months ago.

"We're learning a lot about this business and how it works for advertisers," Overture vice president Paul Volen told Reuters. He said the company is refining and enhancing its product, but he declined comment on pricing.

Pasadena, California-based Overture is currently being acquired by Yahoo Inc. (Nasdaq:YHOO).


WEB AD RENAISSANCE

Overture and Google get much of the credit for reviving the Internet advertising market after demand for banner ads imploded in early 2000 due to big cutbacks in dot-com spending and a perception that consumers were generally ignoring them.

Their search services deliver ads when Internet users search via key words. The ads look like search results with links to Web sites, but are arranged under special headings.

Contextual ads use the key word technology from search to dish up ads on content sites. Despite the fact that fewer Internet users are clicking on such ads, advertisers often pay the same for key words used in both search and contextual ad campaigns.

A study from NewGate compared the performance of Google search and contextual campaigns for two clients. Results showed that click-through rates were 14 times to 150 times higher on the search ads than the contextual ads. The cost per lead, or order, was two to eight times higher for contextual campaigns.

"The picture painted (by the study) is one perspective, but it's far from the only perspective out there. Ultimately, we'll do what's right for our customers and advertisers," said Kurt Abrahamson, Google's business manager for contextual ads.

But marketers, too, are finding it difficult to predict where contextual ads will land, because the ads are often linked to words in rapidly changing news stories.

In one recently publicized example, Mountain View, California-based Google served a luggage ad with a New York Post story about a murder victim whose body parts were found in a suitcase.

Still, some users are sticking with Google and Overture.

Mark Aistrope, president of Ohio-based Meeting Tomorrow, which sends LCD projectors to business travelers, said he would like to pay less for his contextual key words, but added that so far he's happy with the results.

"It's good to have them out there. There is a branding element to what we're doing," he said.

August 31, 2003 at 10:36 AM in Online Marketing

Internet Convergence 2.0 - Ten trends

The mass media has given a lot of airtime to Internet failures, the overpromise of the Internet as a transformative force in business, and to the real-world pains that businesses face in a slow- to no-growth economy. Despite this gloomy backdrop, a range of innovative trends have continued to blaze forward. Each trend is powerful on its own, but when combined with the others, we see a dynamic, mutually reinforcing convergence.

A Short History of Technology and the Web
Many people attribute the rapid adoption and growth of the Internet to the introduction of the World Wide Web and Mosaic 10 years ago, noting that until then the Internet wasn’t a useful or usable technology. But the web was merely fuel for the fire of a collection of trends that were preconditions to the commercial Internet’s birth. Some of these trends included:

The widespread availability of affordable personal computers. Rapid consumer adoption of PCs began in 1994 and 1995 as price points and functionality made these viable household goods.
Low-cost connectivity. Until 1994, dial-up connectivity was largely a niche phenomenon for computer enthusiasts and business professionals. After 1994, large-scale connectivity infrastructure investments by AOL, Microsoft, and thousands of smaller Internet service providers (ISPs) provided an affordable means to get online.
Ubiquity of LAN technology. In the early to mid-1990s the local area network (LAN) became mainstream technology for organizations. Adopting LAN technologies was necessary so that corporations could connect to wide area networks. This was the building block for the Internet.
Mass-market server software. In 1994, commodity server platforms were coming into place with the introduction of Pentium-based servers, Windows NT, and a wide range of free UNIX-based software (including FreeBSD and Linux). These new hardware and software platforms made it possible for a range of companies and developers to build server software and applications affordably. Up until this time, deploying a web server (necessary for Internet adoption) was an expensive, highly complex proposition.
Digital media creation tools. By the mid-1990s a wide range of PC software applications had emerged for creating digital media such as graphics, text, and audio. Digital media was another necessary component of the first phase of content for the Internet.
Individually, each of these trends is significant—each a force driving the growth and development of the PC and computer industry. But when combined, these trends create a growth dynamic with incredible power and thrust.

An Emerging Convergence
To an outside observer, the technology industry might look like a graveyard or minefield. It also seems to be a marketplace where everyone is on the defensive—they’re risk-averse and conservative, and they’re trying to keep costs down.

As an insider in the industry, however, I see the marketplace quite differently. Quietly, over the past several years, significant innovation has been occurring and it’s resulting in a sort of resurgence of passion and energy that feels and looks very much like the world in 1994 and 1995.

While there is no doubt that the growth dynamics will initially be more muted than they were during this past growth phase, I’m optimistic that the current set of emerging trends will usher in a new Internet. And, like the first-generation Internet, it is the mutually reinforcing dynamics of these trends that will drive new growth and opportunity.


10 Trends for Internet 2.0
There appear to be at least 10 significant trends that constitute this next-generation Internet opportunity.

Broadband
If you polled the industry a year ago, the consensus would have been that broadband had flopped and that it would take much longer to get to a high-speed consumer Internet than originally expected. In reality, the opposite is true.

2002 was a record year for broadband adoption. Industry-wide growth has been nearly 80% year over year, and broadband access has become the fastest-growing component in the telecommunications sector. This adoption reflects the fact that the Internet has become central to our daily lives and that the value of an always-on, reliable connection is compelling in and of itself.

Broadband adoption is currently the number one focus for cable, DSL, wireless, and portal companies. And this adoption creates a massive new opportunity for media, software, and services delivered over the Internet. In the next year, over 50% of online US households will have broadband.

Wireless
Although many have declared it dead and buried multiple times, wireless interest and adoption are alive and well. There are reasonably priced data services in every geographical region, and mobile handset technology (hardware and software platforms) is now rich enough to deliver real consumer value. Unlike the low-quality, low-speed experience of WAP, this new world of smart phones includes multimedia messaging, integrated cameras and other digital media support, and software runtimes like BREW, J2ME and Macromedia Flash. It’s creating a wave of innovation and growth. The mobile landscape is now attractive and real.

There’s also the revolutionary growth of 802.11b devices and networks known as Wi-Fi. The level of investment, innovation, and growth in broadband wireless delivered using Wi-Fi is nearly identical to the growth in ISPs, TCP/IP, and Internet access in 1994 and 1995. Although the ultimate impact of Wi-Fi is unknown, it is creating a new set of capabilities and applications for connected software, communications, and media.

Devices
In addition to the innovation and advances in mobile devices, we’re continuing to see robust growth and adoption of other new device categories. The largest and most significant is the adoption of digital lifestyle devices such as digital cameras, camcorders, and digital music players. This creates the potential for new sources of content and new distribution channels for media.

As broadband and Wi-Fi penetrate the home market, they are instigating a new wave of innovation around Wi-Fi devices such as wireless security cameras, Internet stereos, video phones, and even Wi-Fi–enabled music devices for automobiles.

This massive proliferation of new Internet-connected devices is driving new opportunities for application providers and media and communications companies and challenging our age-old views and approaches to consumer electronics.

Rich Clients
To many people, the level of innovation in client technology on the Internet has appeared to stall; HTML 4.0 and Internet Explorer seem to provide the platform for web experiences. In reality, innovation has moved steadily along, primarily led by the now ubiquitous adoption of rich client technology such as Macromedia Flash Player.

Rich client technology can transform the quality and boost the usefulness of Internet applications, media, and communications because it combines desktop-like experiences with the deployment and content-rich aspects of the web. And, in the coming year, Macromedia Central will extend this model further by providing a new client platform for the distribution and use of Internet software and media. Also this year, Microsoft will describe and promote its .NET client technology as a post-browser approach to Internet applications and content.

Web Services
Of all the trends discussed here, web services have certainly garnered the highest level of hype and interest in the industry. This is warranted, as the birth and proliferation of web services technology into the mainstream promises to radically change the usefulness of software in the world.

Within the next year, nearly 100% of new runtimes (client and server) will be SOAP-capable deployment platforms. This means that nearly any piece of code running anywhere in the world can invoke any other code on the network. This new model of application interoperability is affecting dozens of software markets. It provides the potential for new levels of productivity, integration inside of enterprises, and most importantly, it lays the foundation for interenterprise applications at a level we’ve never seen before.
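To make that concrete, here is a minimal sketch of what such an invocation looks like on the wire: a hand-rolled SOAP 1.1 call over plain HTTP, written in Python. The endpoint URL, namespace, and GetQuote method are hypothetical; real clients would typically be generated from the service's WSDL rather than written by hand.

```python
# A hand-rolled SOAP 1.1 request over HTTP -- a sketch only.
# Endpoint, namespace, and method (GetQuote) are hypothetical.
import urllib.request

ENDPOINT = "http://example.com/services/quotes"

envelope = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://example.com/quotes">
      <symbol>MACR</symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "http://example.com/quotes/GetQuote"},
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))  # the raw SOAP response XML
```

The point is that nothing here is proprietary: any runtime that can speak HTTP and XML can both expose and consume such a service.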

Progressing side by side with the web services trend is the rapid adoption and popularity of microcontent formats such as RSS. Primarily used in the context of weblog or blog software, RSS and sister standards like RDF are driving the Internet towards well-structured, easily searchable and sharable data.
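For readers who haven't looked inside a feed, a minimal, hypothetical RSS 2.0 document is shown below; the channel describes the site, and each item carries one headline, link, and blurb (all names and URLs are placeholders):

```xml
<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <link>http://example.com/</link>
    <description>A placeholder feed illustrating the format</description>
    <item>
      <title>First post</title>
      <link>http://example.com/archives/first_post.html</link>
      <description>A short blurb summarizing the post.</description>
      <pubDate>Sun, 31 Aug 2003 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

It is precisely this well-understood, machine-readable structure that makes the data easy to search, aggregate, and share.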

Unlike the 1.0 Internet, hacked together with logic and data isolated in stovepipes, web services and microcontent unlock the value of software and data and foster new economic models of cross-company interchange.

Real-Time Communications
The Internet is rapidly evolving from a one-way and text-based medium to a rich, multi-directional and real-time communications environment. Over the last several years, there has been mainstream adoption of real-time communications technology such as instant messaging in consumer and corporate settings. And while instant messaging may be a major driver for change, there’s a lot of focus on new platforms that enable real-time communication and collaboration within custom applications.

At the forefront of this innovation are Macromedia Flash Player and Macromedia Flash Communication Server. Flash Communication Server provides the first broadly available platform for building real-time collaborative user interfaces that incorporate multiway text, audio, and video as communications forms. Additionally, other software and online service companies such as Microsoft, Yahoo, IBM and AOL are providing their users with real-time communications applications.

Hosted Applications
The model of delivering software as a hosted service continues to gain traction. Although many dismissed this approach as a failure only a couple of years ago, rich clients’ popularity and web services’ ability to integrate hosted applications into an enterprise have promoted their adoption. This growth promises to transform the use of software in corporations around the world.

Big Data
Over the past several years, the price-to-value ratio of storage and bandwidth has improved dramatically, expanding what you can deliver to consumer PCs over broadband. PCs now have plentiful storage, and, on the delivery side of the equation, it’s also economical to deliver and manage large quantities of rich media, including high-quality video. The trends on this front are only accelerating.


Paid Content
This new Internet environment is seeing the adoption of paid content as a business model for the Internet. Rich clients and broadband now make it possible to build high quality digital assets, and consumers seem willing to pay.

Unlike Internet 1.0’s “information wants to be free” mantra, people now willingly pay to download music and to subscribe for access to quality content or games and entertainment. This shift from free to paid content is dramatic and symbolizes the maturity of the Internet as a media and commerce platform.

For example, RealOne has over 1 million paid subscribers for their video on demand Internet service. And dozens of other media brands are experimenting and seeing success. AOL has made it clear that their future is in broadband-enabled, high-quality paid services. Yahoo is experiencing robust growth in premium services, and the broadband ISPs are betting that paid content will form the next wave of profit growth, expanding the current access fee-only model. To make it easy to get into the game, Macromedia Central includes a model for creating paid content services and applications.

The Software Manufacturing Economy
Nearly every new Internet opportunity is based on shifts in how software is manufactured and sold. The people, places, frameworks, and materials used to create and distribute software are changing dramatically:

Component-based software. The rapid adoption of Java and .NET runtime and development platforms makes it possible to easily design, compose, and integrate software assets.
Open source. Open source continues its forward march, giving developers access to low-cost software manufacturing and greater control over the code that applications are built upon. Because of this, application ISVs are opting for open source materials such as Apache Axis, Tomcat, Linux, and MySQL.
Global outsourcing. Economic pressures are driving software companies to rely heavily on global outsourcing for software manufacturing. When combined with open source as a material and component architectures as a design model, it is becoming easier than ever to construct complex software projects overseas.
Web services and hosted applications. Web services and hosted applications now deliver software products online, eliminating the need for packaged products. This shift redefines product sales and distribution channels for software and provides radical new economies of scale.
An Optimistic Outlook
As someone who experienced the birth and rapid growth of the first-generation Internet (“Internet 1.0”), I can’t help but see the parallels to today. A collection of disruptive trends is combining to create a new platform of opportunity surrounding the new Internet—one that dramatically surpasses the capabilities of the 1.0 Internet, but which at the same time fulfills many of its original promises.

This article provides a snapshot of how current trends work together as a foundation for a much more exciting and dynamic Internet than anything we’ve seen so far. Over the next few months, I’ll explore these trends, their mutual interdependence, and how Macromedia technologies will play a role in their growth and adoption.

August 31, 2003 at 05:24 AM in Web/Tech

August 30, 2003

With E-mail Dying, RSS Offers Alternative - Pirillo Quote

With E-mail Dying, RSS Offers Alternative: "Pirillo points out that with e-mail, it's a channel that any and everyone is using -- ergo, all that spam from anyone who can figure out your e-mail address. With RSS, the consumer is getting only what he asks to see; it's a closed channel. No spam gets in the way of your subscriber getting what he wants. 'It's "me" Internet,' Pirillo says."

August 30, 2003 at 08:47 PM in Blogging & feeds

With E-mail Dying, RSS Offers Alternative

Email no longer works, and RSS/blogging is one potential alternative. But hang on ... email no longer works? That is a strong statement. Clearly there is movement towards fixing email's spam problem with filters, and companies are starting to realise that and stop contributing to spam. This will take a while, though, and the damage may already be done even if they are successful in stopping spam. It's unlikely they will ever stop that pervert in South America, or those stupid guys in Nigeria with their fake business English, whom we have all seen selling, respectively, sex or money.

Email does suffer from one fundamental problem .... it's like a phone call .... it comes to you whether you want it or not. And from the sender's perspective it's worse than a phone call, because there is no way to tell if it was received, and certainly no way to tell if it was desired. Lockergnome, one of the largest email newsletters in the world, is telling customers to unsubscribe and start using RSS (its publisher, Chris Pirillo, is awesome by the way).

So what's so special about RSS? First of all, it is truly opt-in. If you point your "newsreader" at an RSS feed, it's your choice. But this creates a new problem, one I already have: my reader is full ... it's too time-consuming to sort it out. I have one reader with a newspaper format, but it's still not personalised enough, so there is some work to do there.

So while the answer isn't totally clear, we can see light on the horizon.

With E-mail Dying, RSS Offers Alternative
STOP THE PRESSES!
By Steve Outing

"Publishers Must Find New Delivery Methods"

Who'd have thought that things could get this bad? E-mail -- long touted as the 'killer app' of the Internet and the best online channel for publishers -- is rapidly being decimated by spammers and virus writers. Yes, 'decimated' is an accurate word. The evidence is quickly mounting that e-mail is no longer an efficient means for ethical publishers to reach subscribers.

Indeed, some e-mail publishers are starting to think the unthinkable -- giving up on e-mail and moving to other means of reaching subscribers. Yeah, that sounds pretty radical. But with e-mail under siege and no relief in sight, radical measures are required.

It's time to move on to something that's (we hope) spam-proof. You've probably heard about RSS, which stands for Rich Site Summary, and is sometimes called Really Simple Syndication. With e-mail on a rapid decline, RSS is the heir apparent. Now all publishers need to do is figure out how to make a business of RSS content distribution.


Is It Really That Bad?


I first wrote about the problems of ethical e-mail publishers more than a year ago, in an E&P Online column: 'I'm Sick and Tired Of Spam (Filters): They're Blocking Legitimate E-mail From Publishers.' (Available to E&P subscribers only.) Since then, things have gone from bad to worse.

Recent studies show that opt-in messages (that is, e-mail that people have asked to receive) are now erroneously blocked as spam by ISPs and e-mail services at rates of 17% (according to a Return Path study) to 38% (Mail.com study). Let me repeat: 17% to 38% of the e-mail you send out to customers who ask for it -- or even pay for it -- does not reach them. Sometimes it gets shuttled into a "junk" folder where it probably won't be seen by the subscriber; sometimes it's just unceremoniously deleted without the subscriber's knowledge, or the publisher's (since the filters often don't send bounce messages that would let you know what's happening).

Of the subscribers (62% to 83%) who do successfully receive e-mail from ethical publishers, there's another big chunk who don't open it. The typical opt-in commercial/marketing message is opened only about 40% of the time, according to the most recent Doubleclick E-mail Trend Report. E-mail newsletters typically fare better, but nevertheless a lot of them sit unopened. As users' in-boxes fill up with more and more junk, it's common for people to simply miss asked-for mail and inadvertently delete it -- or because of information overload, simply not have time to read it.
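Putting those two figures together shows just how little opt-in mail is actually read. A back-of-the-envelope sketch, using the rates quoted above and assuming delivery and opening are independent:

```python
# Rough effective-readership math from the figures above.
# Assumes delivery and open rates are independent -- a simplification.
delivery_rates = (0.62, 0.83)  # after 17%-38% is wrongly blocked as spam
open_rate = 0.40               # typical opt-in open rate (DoubleClick)

for delivered in delivery_rates:
    print(f"{delivered:.0%} delivered -> ~{delivered * open_rate:.0%} read")
# Roughly a quarter to a third of the mail sent ends up being read at all.
```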

In the last couple of weeks, e-mail has also been impacted by the worst e-mail virus yet -- Sobig. Influential San Jose (Calif.) Mercury News technology columnist and blogger Dan Gillmor earlier this week wondered out loud in his Weblog whether the virus crisis, added to spam volume that exceeds legitimate e-mail, spells "the end of e-mail?"

Gillmor is not being paranoid. Information design consultant Michael Fraase in a column last week wrote, "The spammers won. E-mail, for anything other than communicating with individuals you know already, is useless. ... Online publishers are struggling with the loss of the spam war, because e-mail was one of the best publishing tools the non-corporate media has ever seen."


An E-mail Champion Recants


Even Chris Pirillo, the author who wrote "the book" on e-mail publishing in 1999, Poor Richard's E-mail Publishing, has all but given up on it as a publishing tool. Pirillo, also the proprietor of some of the largest e-mail newsletters on the Internet, Lockergnome's technology publications, is now discouraging people from signing up for his e-mail deliveries, instead pointing them to RSS as an alternative. He's even going further than that -- actively encouraging e-mail subscribers to drop their accounts and teaching them how to get the same content via RSS.

Why such an extreme approach? Pirillo is passionate about the failures of e-mail and the benefits of RSS. A primary motivator is the massive support costs endured by his Lockergnome enterprise -- which has several hundred thousand e-mail subscribers. That means fighting with "blacklist" operators who wrongfully block his opt-in newsletters as spam; working with ISPs and mail-server administrators to get them to "whitelist" (allow through) Lockergnome's mailings; doing individual customer support with subscribers who find they are no longer getting their newsletters; etc.

As of now, if an e-mail subscriber has a problem receiving Lockergnome newsletters, "They're on their own," Pirillo says. "We can't fight these battles any more."

When a subscriber writes in with a problem these days, what they'll hear back from Pirillo and company is advice on using RSS to get the same content. That means, most often, educating the e-mail user about how to find an RSS "aggregator" and sign up for a Lockergnome RSS "feed."


The Next Wave


Pirillo says the way around the e-mail mess is for ethical online publishers to go around it. "E-mail is a polluted medium," he says.

Evidence of the "pollution" is easy to find, of course -- not just in all the junk that's in everyone's in-boxes, but also in what opt-in e-mail publishers are having to do to circumvent the problems. A good example is the practice by an increasing number of e-mail publishers of disguising some words that spam filters might catch and block. I regularly receive newsletters from PaidContent.org and ContentBiz.com, for example, that purposefully misspell words to avoid spam filters. (For example, "b*east"; "s-pam"; etc.) When you have to go through such ridiculous gyrations in order to reach your customers, it's time to find a new way.

How does it work? Simply, RSS allows potential readers of a Web site to view part of its content -- typically headlines and short blurbs -- without having to visit the content directly (unless they want to click through to it). Viewing is done with a piece of software separate from the Web browser, the RSS aggregator, which the consumer uses to subscribe to "feeds" produced by favorite Internet publishers. The feeds are constantly updated as the publishers add new content.
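In code, an aggregator's job is essentially that loop: fetch every subscribed feed, then show the headlines and blurbs. A minimal sketch in Python using the third-party feedparser library (the subscription URLs are placeholders):

```python
# A toy RSS aggregator: poll subscribed feeds, print headline and link.
# Requires the third-party feedparser library; the URLs are placeholders.
import feedparser

subscriptions = [
    "http://example.com/index.xml",
    "http://example.org/rss.xml",
]

for url in subscriptions:
    feed = feedparser.parse(url)
    print(feed.feed.get("title", url))
    for entry in feed.entries[:5]:  # first five items per feed
        print("  -", entry.get("title", "(untitled)"))
        print("   ", entry.get("link", ""))
```

A real aggregator adds scheduled polling and remembers which items the user has already seen, but the subscribe-fetch-display cycle above is the whole model.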

The big advantage of RSS to a Web publisher is that it can significantly increase a site's visibility and reach. In the context of a news site, EEVL's "RSS Primer for Publishers & Content Providers" explains that "because there are so many sources of news on the Internet, most of your viewers won't come to your site every day. By providing an RSS feed, you are in front of them constantly, improving the chances that they'll click through to an article that catches their eye."

And by using RSS, a publisher enables others on the Internet to syndicate its headlines, so they show up on other Web sites as those publishers incorporate third-party headlines into their own sites -- viewer traffic that gets funneled back to the Web site of the original publisher.

RSS is still in the "infant" stage, even though it's been around for several years, and even though a modest number of major publishers have discovered it (including The Christian Science Monitor of Boston, The Telegraph of Nashua, N.H., and BBC News). The biggest problem -- which will certainly be solved soon -- is that Web browsers and e-mail clients do not currently accommodate RSS content. Some RSS readers are stand-alone applications (I've been using FeedDemon, currently in beta, and highly recommend it); some are add-ons that work inside a browser, and some work in concert with the Microsoft Outlook e-mail client. Others are Web-based aggregators that allow you to read RSS feeds as though you were viewing a Web site.

The big advantage of RSS is that it avoids the whole spam issue. RSS users don't divulge their e-mail addresses (no spam), and thus there's no spam filter involved that might block a publisher's content.

Pirillo points out that with e-mail, it's a channel that any and everyone is using -- ergo, all that spam from anyone who can figure out your e-mail address. With RSS, the consumer is getting only what he asks to see; it's a closed channel. No spam gets in the way of your subscriber getting what he wants. "It's 'me' Internet," Pirillo says.

For many consumers, moving from e-mail newsletters to RSS feeds might seem daunting. It is up to publishers, then, to sell the RSS concept, and explain that it's a solution to the spam muddle.

RSS really is a better way, especially for those who regularly read a whole passel of Web sites, blogs, and/or e-newsletters. It replaces manually viewing a bunch of bookmarked sites with a single aggregation pane of fresh content, quickly consuming headlines and blurbs, and clicking through to the stuff that looks really interesting. It's a big time saver. (I have a long list of sites and blogs that I monitor daily. Before switching to RSS reading, I visited them each individually. Now, instead I read most of them -- not all offer RSS feeds yet -- using the FeedDemon aggregator. I estimate that it takes me one-quarter the time or less than the old way of surfing Web sites and reading e-mail newsletters.)

The immaturity of RSS does present problems. There are lots of readers out there, but they typically have shortcomings, such as assigning everything equal weight -- so it's not possible to have the RSS-news consumer see a "front-page" type of presentation that's similar to the experience of reading a newspaper page or news-site home page. But even at today's stage of RSS development, it's a good reading experience for the consumer. Expect to see RSS software improve over the next year or two, and for RSS to become a more robust publishing platform.


The RSS Business Model


Many e-mail publishers today remain afraid of RSS, suggests Pirillo, but there's little to fear. He points out that the business model of e-mail publishing doesn't really change using RSS. Readers still see the same ads, and the same content and design/layout that they would in receiving an HTML newsletter -- assuming that they find your site's headlines and blurbs worthy of clicking on to see full content.

Some publishers are even embedding text advertising within the headline/blurb sets that RSS users see initially in the RSS aggregator software (prior to clicking to see the full content).

And just as there are paid e-mail newsletters, so too can there be paid RSS news feeds. The only caveat there is that paying subscribers to an RSS news feed must use an RSS aggregator that supports authentication (that is, a log-in name and password to gain access to the content).
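Mechanically, a paid feed is nothing exotic: it is just HTTP authentication on the feed URL. Here is a sketch of what an authentication-capable aggregator does under the hood (the URL and credentials are hypothetical):

```python
# Fetching a subscriber-only RSS feed with HTTP Basic authentication.
# The feed URL and the credentials below are hypothetical.
import urllib.request

url = "http://example.com/premium/index.xml"
passwords = urllib.request.HTTPPasswordMgrWithDefaultRealm()
passwords.add_password(None, url, "subscriber", "secret")
opener = urllib.request.build_opener(
    urllib.request.HTTPBasicAuthHandler(passwords))

with opener.open(url) as response:
    feed_xml = response.read().decode("utf-8")
print(feed_xml[:200])  # first few hundred characters of the feed
```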

Pirillo emphasizes that a big advantage of having RSS subscribers is that they are less costly to maintain. Unsubscribing from an RSS feed is simple. Unsubscribing from an e-mail newsletter in theory is simple, but often is not because a subscriber may have changed e-mail addresses and not remember -- thus having to call for support in order to unsubscribe. From a business perspective, it's far cheaper to have 100,000 RSS subscribers than the same number of e-mail ones.


What Should You Do?


Any e-mail publisher with a survival instinct should be publishing RSS feeds of the content that it currently e-mails. "It's only a matter of time before e-mail newsletters go the way of the dinosaur," warns Pirillo.

Not everyone agrees. Randy Cassingham, a Ridgway, Colo.-based humorist who publishes a 118,000-circulation e-mail newsletter called This Is True, says he's sticking primarily with e-mail, but he's started offering an RSS feed as an alternative. He does not publish his content on the Web -- he also sells book compilations of his columns, so that could undermine sales -- and so he views RSS as a supplement to his e-mail list.

"I think e-mail is too useful to declare it dead," Cassingham says. He is holding out for better technical answers to the spam problem, combined with some federal legislation to curb abusers. "I think we'll see e-mail going back to being a useful medium again, instead of a pain."

Pirillo sees that reasoning as inadequate. "People are making excuses [for not converting to RSS]. They're afraid."

Whatever you believe, at least recognize that as a publishing platform, e-mail is now seriously impaired. Whether you convert to RSS whole-hog or just offer it as an alternative, now's a good time to start thinking about a transition in your business plan.

August 30, 2003 at 08:21 PM in Blogging & feeds

August 29, 2003

Future of Longhorn

The careers page is always an interesting way to predict the future. This from Microsoft careers:

Software developer - included in the job description - "You will work with a team that will build services that power scenarios like personal and shared spaces, blogging and roaming storage" & "What you build will also make Longhorn come alive by enabling it to seamlessly sync and share their data".

Job Details - Microsoft Careers

Software Development Engineer - Future of Longhorn
"Job Title: Software Development Engineer
Job Category: Software Development
Product: Not Product Specific
Job Code: 100460
Location: WA - Redmond
Travel Required:

Do you wonder what the next generation of communications on the Internet will look like? Are you intrigued and excited about what it takes to build the systems to power it? If you answered 'yes' to the above then come join the team that has built some of the largest scale systems in MSN and help build the next generation of sharing and communication. You will work with a team that will build services that power scenarios like personal and shared spaces, blogging and roaming storage. What you build will also make Longhorn come alive by enabling it to seamlessly sync and share their data. We are just starting to think about and design these services from the ground up and you will get a chance to join us on the ground floor and take it from drawing board to the datacenter. You will be responsible for the design and implementation of SOAP services in C# and building components in C at megascale; working closely with teams in MSN, Longhorn and others in defining and implementing the APIs that clients use and ensuring that the systems and APIs are secure, scalable and maintainable. Candidates must have at least 3 years of software development experience and demonstrated proficiency in C (or C#) and good object oriented design. The ability to develop and maintain highly complex software compo"

August 29, 2003 at 10:04 PM in Microsoft

Silicon Valley - Dan Gillmor's eJournal - Microsoft's Blogging Future

Note - have to follow up on this.

Silicon Valley - Dan Gillmor's eJournal - Microsoft's Blogging Future

August 29, 2003 at 10:02 PM in Microsoft

Will Microsoft tweak IE? | CNET News.com

Microsoft is making changes to their browser. Even though they are the biggest, with 96% of the market, they realise they have no choice. It will be interesting to see the nature of the changes, and how surfing might change, for better or worse.

Will Microsoft tweak IE?
"By Matt Hines
Staff Writer, CNET News.com
August 29, 2003, 1:40 PM PT


Microsoft told the Web's leading standards body that it's considering making changes to its Internet Explorer browser in light of a recent ruling against the company in a patent infringement lawsuit.
The World Wide Web Consortium (W3C) issued a statement Thursday indicating that Microsoft is mulling its options after a federal court earlier this month found that plug-ins and applets in Internet Explorer (IE) infringed on patents held by Eolas Technologies and the University of California. The software giant was ordered to pay $521 million to the Web technology company and the university.
'In the near term, Microsoft has indicated to the W3C that they will very soon be making changes to its Internet Explorer browser software in response to this ruling,' Steven R. Bratt, chief operating officer of the W3C, said in a statement. 'These changes may affect a large number of existing Web pages.'

This week the standards body held an ad hoc meeting for its members, including Microsoft, during which people were asked to offer their opinions regarding any changes the software maker should make to IE. The objective of the meeting was to evaluate potential near-term changes that could be implemented in browsers, authoring tools and Web sites as a result of the court case. Roughly 50 individuals showed up at the meeting in San Francisco, with many others participating via a teleconference c"

August 29, 2003 at 09:52 PM in Microsoft

Fearing Misuse Label, Advertisers Wary of E-Mail

Advertisers are getting smart, and this without legislation.

Fearing Misuse Label, Advertisers Wary of E-Mail
Yahoo News
Thu Aug 28, 7:18 PM ET
By Michele Gershberg

NEW YORK (Reuters) - Even as they boost their budgets for online advertising, major U.S. companies will be wary of e-mail marketing campaigns until the menace of unsolicited "spam" e-mail has been tamed, top industry officials said on Thursday.

"Our marketers are basically saying spam is killing (e-mail marketing)," Bob Liodice, president of the Association of National Advertisers (ANA), told Reuters.

E-mail marketing "clearly will be muted until they have a greater degree of confidence that their messages will go through in the way that they want them to," he added.

But Liodice was quick to point out that he believes using e-mail as a legitimate marketing tool "will skyrocket" once spam is under control.

In the past, leaders in the advertising industry have trailed other trade and consumer groups who are pressing for tougher legislation and enforcement against spam.

That is changing, however, as companies invest more on online advertising. Concern has arisen about spam because e-mail is easily deleted by consumers angered by the explosive growth of the often deceptive or vulgar messages.

The ANA, which represents more than 300 leading companies, and the American Association of Advertising Agencies, known as the "4As," are hammering out guidelines for using e-mail to market products and services credibly.

A nine-point proposal they have drawn up defines spam as "unsolicited, bulk, untargeted commercial e-mail," and tries to distinguish it from more legitimate, direct online marketing.

The proposal calls for commercial e-mail to be sent from working Web addresses, preferably ones which include a company or brand name to clearly identify the sender.

E-mail ads should have an easily located option which consumers can mark if they do not want to receive further mail, but the proposal does allow e-mail marketers to send targeted, unsolicited e-mail if a consumer has not "opted out."

JUST SAYING NO TO "DO NOT SPAM"

O. Burtch Drake, president of the "4As," said most leading advertisers already abide by such rules.

He said the industry prefers formal recommendations for business practices rather than advocating a "do-not-spam" list similar to the "do-not-call" list for consumers who do not want to hear from telephone marketers.

"In the case of spam, if you make up a list of all the addresses, that becomes a pretty valuable list (for spammers to send more e-mail)," Drake said.

While advertisers still prefer other online marketing tools like paid search listings, direct marketers are keen on preserving e-mail for ads, especially if a "do-not-call" list dries up other avenues for reaching customers.

"I don't think we all recognized how big the problem was going to be until six to nine months ago," said Greg Stuart, president of the Interactive Advertising Bureau. The IAB is also planning to develop e-mail ad guidelines, he said.

Stuart said some advertisers waited on the sidelines hoping that technology, in the form of spam filters, or legislation would be put in place to stem the tide of spam.

But a problem with spam filters is that they have blocked e-mail from companies with which a consumer wants to communicate.

One other problem leading to the proposals has been that new laws and stricter enforcement have been slow off the mark.

August 29, 2003 at 09:40 PM in Spam

Hutton Inquiry

Lord Hutton is conducting an investigation into the circumstances of the death of Dr David Kelly. The future of the Labour Government, and certainly of PM Tony Blair, is at stake. The inquiry has been ongoing for two weeks ... today there were 1,000 pages of evidence submitted on the website. The best thing an investigative reporter could do is stay at home and read all of it.

The Internet has not just the capability of disseminating large amounts of information; it has provided the will to disseminate and the desire to satisfy the public. I am not even sure the public expects this amount of information freedom, but once provided it can never be taken back.

It's all about transparency, and the Internet is the tool.

August 29, 2003 at 12:24 AM in Web lifestyle

August 28, 2003

BBC NEWS | Technology | Japan leads mobile game craze

BBC NEWS | Technology | Japan leads mobile game craze

Article talks about "going Japanese" in terms of using wireless phones for advanced games and downloads. Judging by the advertising from phone companies, this seems a reasonable assumption.

August 28, 2003 at 03:31 PM in Japan

E-Mail Marketers Feel the Heat from Spam

I've noticed the press, consumers, government, everyone is complaining publicly about spam, and the internet industry is listening. ISPs (MSN, Earthlink, AOL) are all starting to take action, both legal and technical. Of course they have no choice, since this article indicates that users are not clicking through on ads nearly as much as before. With response rates at 2.65 per 1,000 (0.265%), this is a minuscule response. However, given how cheap it is to send out hundreds of thousands of emails, even 1,000,000 emails would still yield 2,650 responses, so at $10 per response the spammer would still make some money.
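The arithmetic behind that claim, as a quick sketch (the $10-per-response payment is my own assumption from above; the roughly one-cent sending cost comes from the DoubleClick comments quoted below):

```python
# Back-of-the-envelope spam economics using the figures above.
# $10 per response is an assumption; ~1 cent per email is from the article.
emails_sent = 1_000_000
response_rate = 2.65 / 1000    # 2.65 responses per 1,000 emails
pay_per_response = 10.00       # assumed
cost_per_email = 0.01          # "still in the penny range"

responses = emails_sent * response_rate     # 2,650
revenue = responses * pay_per_response      # $26,500
cost = emails_sent * cost_per_email         # $10,000
print(f"responses={responses:,.0f}  profit=${revenue - cost:,.0f}")
```

Even at these tiny response rates, the sender clears a profit.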

I suspect that payments to spammers will be less than $10, and dropping significantly too, given the bad perception they are creating, so over time this momentum against them should work -- but it really has to start with the people who pay spammers.

E-Mail Marketers Feel the Heat from Spam
Mon Aug 25, 5:46 PM ET
By Michele Gershberg

NEW YORK (Reuters) - Retailers who hawk their wares via e-mail are finding it harder to make a buck from customers as e-mail in-boxes overflow with the random ads known as spam.

Although many retailers establish ties with consumers that distinguish them from spam-senders, a study released Monday showed they earned slightly lower revenue from each e-mail advertisement sent out in the second quarter of 2003.

U.S. marketers, consumer groups and trade associations are pushing authorities to fight the onslaught of deceptive or vulgar spam messages. For retailers, spam snarls the potential for reaching clients who may want to see their ads.

"Spam has poisoned the well for legitimate merchants," said Jason Catlett, president of consumer privacy group Junkbusters Corp. "It's very difficult for e-mail marketers to stand out from the crowd of sleaze that assaults the average American."

Internet marketing company DoubleClick Inc. (Nasdaq:DCLK) said in a report that the average revenue generated per retail or catalog e-mail fell to 28 cents in the second quarter of 2003 from 29 cents a year earlier.

But on a brighter note for retailers, customers were opening more e-mail ads from companies they view as legitimate than before, the study said.

According to DoubleClick, the average order in response to an e-mail ad dropped to $98 in the second quarter from $102 a year earlier. In all, retail e-mail saw an average of 2.65 purchases for every 1,000 e-mails sent out.

Eric Kirby, vice president of strategic services at DoubleClick, said advertisers would not shy away from e-mail marketing due to the slightly lower revenue.

"The cost basis for delivering that message ... is still in the penny range, maybe two pennies, to send that e-mail," he said. "It's still hugely effective compared to other channels."

The data was part of a wider DoubleClick study that was based on two billion e-mails from several hundred clients.

DoubleClick said in its report that more e-mail users were opening messages from companies they perceived as bona fide brands, with 38.8 percent of recipients opening such mail in the second quarter, compared with 37.6 percent a year earlier.

By industry, users were most likely to open e-mails offering financial services at 48 percent. Travel e-mails rose slightly, while business products saw a small decline, as did consumer products.

August 28, 2003 at 09:26 AM in Spam

August 26, 2003

Web makes sense of news sites

"SEATTLE (Reuters) - In a year marked by war, Arnold Schwarzenegger's campaign to become California's next governor and the largest blackout in U.S. history, news junkies are facing a surging flood of news. "


Yahoo! News - News Sites Make Sense of Web's Flood of Info: That's why many are turning to Web sites that can sift through stories published around the clock on the Internet such as Google News (http://news.google.com) and Columbia Newsblaster (http://www1.cs.columbia.edu/nlp/newsblaster/).


News sites, or "aggregators," are not new to the Web, but these sites and a few others are gaining in popularity because they constantly take a wide sample of news and distill them into digestible headlines, without human intervention.

Web makes sense of news sites
Yahoo! News - News Sites Make Sense of Web's Flood of Info:
Sun Aug 24, 9:23 AM ET

By Reed Stevenson

SEATTLE (Reuters) - In a year marked by war, Arnold Schwarzenegger's campaign to become California's next governor and the largest blackout in U.S. history, news junkies are facing a surging flood of news.

"That's why many are turning to Web sites that can sift through stories published around the clock on the Internet such as Google News (http://news.google.com) and Columbia Newsblaster (http://www1.cs.columbia.edu/nlp/newsblaster/).
News sites, or 'aggregators,' are not new to the Web, but these sites and a few others are gaining in popularity because they constantly take a wide sample of news and distill them into digestible headlines, without human intervention.

"In general computers are better at dealing with large volumes of information," Marissa Mayer, Google's director of consumer Web products, said. "It would be virtually impossible for human beings to cluster (organize) stories based on topic."

Google, the popular Internet search engine, launched its news service nearly a year ago to collect news from more than 4,500 global sources. Google News displays top stories and has news categories for global, United States, business, technology, sports, entertainment and health.

Google News is similar in design to Google's minimalist search pages, and displays relevant news pictures alongside the first few sentences of every top story.

Google appears to have bigger plans for its news service, which has no advertising. It is still in beta mode, or a test mode that precedes a full-blown Web service, Mayer said, because it still has a few kinks that need to be fixed.

Mayer did not say whether Google would eventually charge users for access to the Web site.

Newsblaster, a research project at New York's Columbia University, does something different with top news stories: it creates news summaries of three to five sentences automatically, without a human editor.

"The idea was basically to handle very large quantities of news sites and look for similarities and differences among them." said Kathy McKeown, professor and chair of computer science at Columbia.

"But we can't do it without the original reporting by the reporters," McKeown said.


TIPS & TRICKS


News junkies have come up with various shortcuts to keep up to date.


One method is to search for news about a specific topic, country, person or company in Google News, then to save the search as a bookmark. Calling up the bookmark brings up a Web page with the latest news collected by Google on that particular subject.
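The reason the bookmark trick works: a Google News search is just a URL with the query embedded in it, so calling up the bookmark simply re-runs the search. The sketch below builds such a URL; the news.google.com/news?q= pattern is an assumption based on how the site's search URLs looked at the time, not a documented interface.

    # Build a bookmarkable Google News search URL.
    # The URL pattern is assumed, not a documented Google interface.
    from urllib.parse import quote_plus

    def news_bookmark(topic):
        return "http://news.google.com/news?q=" + quote_plus(topic)

    print(news_bookmark("California recall"))
    # -> http://news.google.com/news?q=California+recall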


Yahoo News has long had the ability to send customized news alerts via e-mail, and the New York Times Co. (NYSE: NYT) offers news tracking for a yearly fee.


Google also recently added a Google News Alerts (http://www.google.com/newsalerts) feature that allows users to have Google News search results sent to them automatically via e-mail whenever a matching news article is detected by Google.


McKeown said that her team plans to eventually give Newsblaster the ability to track news as well.

Google also recently added an Advanced Search feature to Google News to allow users to search by date, location and other specific parameters.

Other Web sites are also stepping up to challenge Google News and Newsblaster's lead in collecting news.

NewsInEssence (http://www.newsinessence.com/nie.cgi), a Web site operated by a research group at the University of Michigan, also ranks and summarizes news from around the Web.

Although Google News and Newsblaster currently only track news in English, both said that they are considering offering news in various languages. Google already has news sites in French and German as well as various English language versions for Australia, India, Britain, Canada and New Zealand.

Since Google News and Newsblaster don't have reporters producing stories, they are not included in Nielsen NetRatings' monthly ranking of top news sites, which in June ranked the Web site of MSNBC, a joint venture between Microsoft Corp. (Nasdaq: MSFT) and NBC, at the top of its list. That was followed by CNN, Yahoo News and AOL News.

NBC is owned by General Electric Co. (NYSE: GE), while CNN and America Online are units of AOL Time Warner Inc. (NYSE: AOL).

In fact, news aggregators such as Google News and Newsblaster drive much of the traffic toward those sites, said Google's Mayer. Google declined to say how much Web traffic passes through Google News.

Yahoo News (http://news.yahoo.com), operated by Internet media company Yahoo Inc. (Nasdaq: YHOO), also collects news from various sources, including Reuters Group Plc and the Associated Press, but it pays for content, which is in turn often offered to Web visitors for free with advertising.

News aggregation, however, is not a new concept. During the dot-com bubble, PointCast received millions of dollars in funding in the hope that its "push" model of sending customized feeds of news stories over the Internet would revolutionize news distribution.

(The Livewire column appears weekly. Comments or questions on this one can be e-mailed to reed.stevenson(at)reuters.com.)

August 26, 2003 at 02:49 PM in Web lifestyle | Permalink | Top of page | Blog Home

August 24, 2003

Content Management - what will the future look like?

Today at work we launched our site in Vignette. The site works very well and looks good. The complexity involved, however, compared with the ease with which I am making this post, leaves a lot to be desired. Of course it will be argued that blogs only allow small posts and are limited in terms of function. Well, similar things are said about Vignette in many cases, and its interface is just plain ugly!

The answer lies somewhere between the two, and it will be closer to blogs than to the Vignette model, especially for intranet Content Management.

August 24, 2003 at 03:42 PM in Blogging & feeds | Permalink | Top of page | Blog Home

August 23, 2003

Yahoo! News - PluggedIn: Gadget Lovers Seek Reliable Power After Blackout

This is exactly the issue now after the blackout, as per my earlier post. The phone worked fine, but no Internet for nearly 24 hours is not an option. There has got to be a way.

Yahoo! News - PluggedIn: Gadget Lovers Seek Reliable Power After Blackout
Sat Aug 23, 7:32 AM ET


By Eric Auchard

NEW YORK (Reuters) - High-tech New Yorkers who sport the latest electronic devices like haute couture fashion are reconsidering the value of lower-tech emergency gear after the recent meltdown of a major chunk of the U.S. electrical grid.

Gadget addicts who were literally power-less for 24 hours or more are revising their checklists of must-have features so they will never again be left in the dark, cut off from friends, family, colleagues and reliable information or news.

"Maybe we consumers will need to have communications options because we don't have a clue what will work in an emergency," Tom Wolzein, a Wall Street media analyst at Sanford C. Bernstein, mused in a note to clients after the blackout.

Cellphone networks, flooded by frantic callers eager to communicate their difficulties, initially failed from the sheer volume of calls rather than from any specific electrical problems. Rechargeable computers gave out within hours. Network equipment petered out as back-up battery supplies wore out.

One by one, feature-packed gadgets faded to black as power died. The high-speed high-tech world ground to a halt from Detroit to Toronto to New York, as the modern conceit of limitless, cheap and pervasive electricity found its Achilles' heel.

"As consumers, our focus may well shift from the deal in a bundle to affordable redundancy," Wolzein said of the unexpected failure of the most sophisticated phones, wireless (news - web sites) devices and computers.

As power returned, a steady stream of consumers flocked to electronics retailers, hardware stores and supermarkets to stock up on batteries, flashlights and transistor radios.

"Everyone wants to know where the batteries and flashlights are," electronics salesman Carlos Zabala, 22, said as he directed shoppers at the entrance of an office supply story in Manhattan on Monday. "They can think of nothing else."

A Mini Maglite flashlight that runs on two "AAA" batteries offers an adjustable light beam and can be converted quickly into a free-standing electric candle. It carries a back-up lamp inside the tailcap and retails for around $10.

SharperImage.com offers light-emitting diodes for $15. The size of a piece of candy, they can fit on a keychain and throw off a beam up to 30 feet. In contrast to old-fashioned light-bulbs, the Mini Torch is meant to last 100,000 hours.

There's also the "five-in-one" rechargeable radio and spotlight hybrid for $40. It features an AM/FM radio, a spotlight, and a siren. It recharges by solar cell in the sun or hand crank in the dark.

MAKE YOUR OWN POWER

The most industrious geeks sought to manufacture their own power during the crisis. When one Manhattan technologist's cordless phone failed, he taped together six "AAA" batteries to reproduce the 12-volt power supply necessary to run the phone. By fiddling with the wires, he was able to make calls.
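The arithmetic behind the hack is just cells in series adding their voltages. Assuming standard 1.5-volt alkaline AAAs, six cells give 9 volts rather than 12 -- reaching 12 volts would take eight cells -- so either the phone tolerated the lower voltage or the pack differed from the description.

    # Series-connected cells add their voltages: V_total = n * V_cell.
    V_CELL = 1.5  # nominal volts for an alkaline AAA (assumption)
    for n in (6, 8):
        print(n, "cells ->", n * V_CELL, "V")   # 6 -> 9.0 V, 8 -> 12.0 V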

Power inverters and battery chargers are typically sold in camping stores. Inverters convert the direct current of batteries into the alternating current that rechargeable devices require.

Chicago-based Tripp Lite has a portable model for around $30 that allows high-tech gadget users to plug into a car cigarette lighter like any AC office outlet.

An industrial-strength version of the Tripp Lite inverter, selling for $300, can run whole appliances off a car battery as long as the motor is running and gasoline is available.

With the lights out, automobile batteries emerged as distributed power generators for city blocks. Neighbors turned to car owners for help recharging appliances. Car stereos became the center of impromptu street parties around the city.

A few were far better prepared than others.

"I've got everything I need," Julio Carmona, a New York Stock Exchange (news - web sites) trading clerk, boasted last Thursday as he stood on 42nd Street after commuter trains stopped running to his home in Poughkeepsie, 80 miles north of New York City.

Carmona carries an emergency kit with flashlight, face mask, whistle and first aid items with him as he commutes to Wall Street and back each day. It is a precaution he said he has taken since the September 11 attacks nearly two years ago.

Also part of his kit is a handheld Casio television with a 2.3-inch screen that he used to keep tuned in to local news updates. For a time, Carmona may have been the best-informed pedestrian among the many thousands left stranded along 42nd Street after power failed.

For a detailed discussion of which devices worked and which didn't during the blackout, see Gizmodo (http://www.gizmodo.com), a New York-based Web site devoted to the latest high-tech gadgets, which featured a lively discussion on Monday.

August 23, 2003 at 07:10 PM in Web lifestyle | Permalink | Top of page | Blog Home

PCWorld.com - Computer User Challenges RIAA

This is interesting. "Jane Doe" is fighting the RIAA saying the industry subpoena of personal info is unconstitutional.

PCWorld.com - Computer User Challenges RIAA

August 23, 2003 at 06:34 PM in Web lifestyle | Permalink | Top of page | Blog Home

Police arrest 19 in terror probe

TheStar.com - Police arrest 19 Pakistanis in terror probe

This isn't exactly part of this blog's context, but the implications of this arrest are rather frightening, and it surprises me to see the Star stuff the story in Saturday's paper, back on page 25. So these guys are in year 3 of a 1-year course, and practising flying over a local nuclear power plant. And in the Saturday story, it mentions they have been taking an "unknown" passenger with them.

August 23, 2003 at 09:28 AM in World Affairs | Permalink | Top of page | Blog Home

TheStar.com - Internet shopping gaining ground

The promise of Internet shopping has settled down now. Rather than assuming all shopping would be online, this is a more reasonable view, one with credence.

TheStar.com - Internet shopping gaining ground
Share 2nd-highest since 1999 survey

Total comes in at $12.5 billion U.S.

WASHINGTON—Internet sales rose as a proportion of all U.S. retail sales in the second quarter compared with the same three months last year, government figures show.

Online commerce totalled $12.5 billion (U.S.), or 1.5 per cent of all sales, during April through June, the commerce department said yesterday. The total was worth about $17.6 billion (Canadian) at yesterday's exchange rate. The percentage of all sales compares with a 1.2 per cent portion in the second quarter of last year and represents the second-largest share of all sales since the survey began in 1999. In the fourth quarter of 2002, Internet commerce was 1.6 per cent of all sales.

"We're feeling good about the third quarter and, of course, feeling good about the full-year results," Margaret Whitman, chief executive of eBay Inc., said in an interview.

EBay, the world's Number 1 Internet auctioneer, said last month that second-quarter earnings more than doubled to $109.7 million (U.S.) as more people bought and sold goods on the company's Web sites and used PayPal, an e-mail payment service acquired last year.

The government surveyed 11,000 businesses, including mail-order and online retailers, furniture stores, building-materials dealers, new-car dealers, grocery, department and clothing stores. Online-travel services, financial brokers and ticket-sales agencies were not included.

The government's statistics aren't adjusted for seasonal variations or broken down by industry. Because of that, the government said, the e-commerce figures shouldn't be compared with private-industry estimates.

Online sales will account for 10 per cent of retail sales by 2008, according to a forecast this month from Forrester Research Inc. About 5 million households will make their first purchase on the Internet each year for the next five years, the Massachusetts-based researcher said. At that time, 63 million households will use computers to shop, the company said.

Bloomberg News

August 23, 2003 at 09:14 AM in eCommerce | Permalink | Top of page | Blog Home

Race to stop SoBig virus next move

Google News U.K.: "Race to stop SoBig virus next move
Friday, August 22, 2003 Posted: 1705 GMT
LONDON, England (CNN) -- Computer security experts have been trying to locate about 20 computers that could have been targeted by the SoBig.F virus to wreak further havoc."

CNN.com - Race to stop SoBig virus next move - Aug. 22, 2003
CNN.com - Race to stop SoBig virus next move - Aug. 22, 2003: "LONDON, England (CNN) -- Computer security experts have been trying to locate about 20 computers that could have been targeted by the SoBig.F virus to wreak further havoc.
As companies worldwide ramped up their protection systems Friday and home users downloaded anti-virus software, the hunt was on for a small number of infected machines that could have been chosen by the virus to bombard the Internet with more data.
The identities of the 20 are not known and it is not clear why they have been targeted. "
SoBig.F -- the sixth strain of the same virus -- is the fastest-spreading virus ever, hitting hundreds of thousands of computers.

It arrives in e-mail attachments with subject headers, such as: Your details, Thank you!, Re: Thank you!, Re: Details, Re: Re: My details, Re: Approved, Re: Your application, Re: Wicked screensaver or Re: That movie.

The body of the message is short and usually contains either "See the attached file for details" or "Please see the attached file for details."
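Those fixed subject lines and body strings are exactly what a naive mail filter can key on. Here is a minimal sketch, assuming the indicators reported above; real anti-virus software inspects the attachment itself, so treat this as illustration only.

    # Naive SoBig.F indicator check based on the published subject/body
    # strings -- illustrative, not a substitute for real virus scanning.
    SUBJECTS = {
        "Your details", "Thank you!", "Re: Thank you!", "Re: Details",
        "Re: Re: My details", "Re: Approved", "Re: Your application",
        "Re: Wicked screensaver", "Re: That movie",
    }
    BODIES = {
        "See the attached file for details",
        "Please see the attached file for details",
    }

    def looks_like_sobig(subject, body, has_attachment):
        return (has_attachment and subject in SUBJECTS
                and body.strip() in BODIES)

    print(looks_like_sobig("Re: That movie",
                           "See the attached file for details", True))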

Once the attachment is opened, the virus creates a security hole in the computer, allowing someone else to use it to send on many more e-mails.

By Thursday, one in 17 e-mails worldwide contained SoBig.F. As systems have become clogged with data, corporate victims include Air Canada and defense giant Lockheed Martin.

On Friday, anti-virus experts were trying to predict its next move. Some feared it could unleash a mystery program across infected machines.

These computers would act as "master servers," receiving instructions from the author of the virus unless they were switched off.

"We don't know what that program is. It could mean a smiley faces dances across your screen or it could be something massive," Carole Theriault of Sophos Anti-Virus told Reuters.

But Paul Wood of Internet security company MessageLabs said the author of the virus might choose to hold off causing further damage, fearing that the massive spread of the virus increases the chance of being caught.

"On this occasion the writer of the virus has probably been too successful for his own good," Wood told CNN.

He added the culprit could be using the virus to spread spam -- mass marketing e-mail.

Home computer users are advised to regularly scan their machines with anti-virus software.

If you have been infected, you may see unfamiliar pop-up prompts or your machine might slow down. If in doubt, contact an anti-virus company or your Internet service provider.

August 23, 2003 at 01:01 AM in Virus | Permalink | Top of page | Blog Home

August 22, 2003

Blogs as Disruptive Tech - How weblogs are flying under the radar of the Content Management Giants - Our Stories - Articles about our company, WebCrimson

Blogs as Disruptive Tech
How weblogs are flying under the radar of the Content Management Giants
by John Hiler
Thursday, June 20, 2002
http://www.webcrimson.com/ourstories/blogsdisruptivetech.htm

Going to the New York Internet World conference last December felt like visiting a morgue.

As a New York based software CEO, I felt obliged to show up. My company WebCrimson makes blogging software, so I stopped by to see if the competition was doing anything interesting. There weren't any other blogging vendors at the conference but there were a good number of Content Management Software (CMS) vendors there, selling software for anywhere between $10k - $500k a pop!

The convention center was practically empty - after the go-go years of the Internet bubble, it seemed crazy to see so much convention floor space go unfilled. I stopped by one of the CMS booths to pick up some brochures and see their demo. The sales guys instantly descended on me, anxious to make a pitch. I was at the conference with a client of mine, so I pointed them in his direction: after all, if he could get a better deal with someone else's software, I was more than happy to give him my blessing.

The head Sales Guy started grilling my client: how many pages did the site have (in the thousands!), how many users updated it (almost ten!). You could hear the Sales Guy's mental cash register ringing up dollar signs as he went straight for the close: "And what are your editors using to update all those pages: Dreamweaver or Frontpage? Or maybe you built your own homegrown CMS?"

My faithful client didn't miss a beat. "Actually, have you heard of weblogs?" he asked the Sales Guy. You shoulda seen this guy's face fall - it was like he'd been hit by a truck. "Yeah," he admitted, "So you use blogging software?"

"Yeah pretty much," came the answer. "It pretty does most of what I need. There are a couple things you described that I could use, but I can't justify that sort of outlay when blogware hits most of my specs."

That was really my eureka moment: my first realization that content management was screwed.

In more technical terms, I realized that Content Management was starting to wrestle with what Clayton Christensen calls The Innovator's Dilemma: the inability of successful companies to adapt to a new, disruptive technology.

WEBLOG SOFTWARE: THE LATEST DISRUPTIVE TECHNOLOGY

Watching the Content Management Sales Guy on the convention floor, it was pretty clear that he'd faced this situation before: a potential client who was unwilling to pay full price for a mid-range CMS solution. So he did what most traditional companies do: he walked away from the revenue opportunity.

He was doing exactly what he'd been trained to do... which is exactly what convinced me that Weblog Software was a disruptive technology that would eventually end up putting his company out of business.

Coming from a provider of weblog software, that's a self-serving prediction... but after reading Clayton Christensen's book, I'm more convinced than ever that we're onto something here. I'll map out my proof - but first, let's take a look at an industry where Disruptive Technologies have already run Big Companies out of business, and see if we can't draw some parallels to Weblogs and Content Management.

THE INNOVATOR'S DILEMMA: PC'S VERSUS MAINFRAMES

Ok, imagine you're the VP of Sales for IBM in the mid 70's. Your sales force is raking it in selling Mainframes to the Fortune 500. Then you start hearing rumors about a new type of computer that runs on an Intel microprocessor. These computers sell for a few hundred bucks each, in the form of kits that users have to assemble themselves at home.

Are you worried?

With perfect hindsight, it's easy to answer that question. But at the time, of course, IBM wasn't hugely concerned about these personal computers - they didn't end up launching a PC until six years later, in 1981. Clayton Christensen points out why:

Rational managers can rarely build a cogent case for entering small, poorly defined low-end markets that offer only lower profitability.

It really hits home if you think of yourself as working in Sales. If you can choose between a 10% commission on a million-dollar Mainframe or the same percentage on a $10,000 order for PC's... which product do you think you'll push to your customers?

Or suppose you run IBM's R&D department. You hear about this new computer technology that's different from mainframes. You have a rough intuition that it's something interesting, but when your boss asks you why, you can't articulate it. Clayton Christensen is all too familiar with that issue as well: "Markets that do not exist cannot be analyzed: Suppliers and customers must discover them together."

DISRUPTIVE TECH: NOT USEFUL TO YOUR CURRENT CUSTOMERS

A lower sales commission isn't the only reason you won't be pushing PC's to your customers. Your biggest customers are banks and corporations: not the sort of hobbyists who were clamoring for PCs in the mid 70's. If IBM had listened to the needs of its bank customers, for example, it'd never have come up with the idea for the PC. Clayton maps that out in the first chapter of his book:

When the best firms succeeded, they did so because they listened responsively to their customers and invested aggressively in the technology, products, and manufacturing capabilities that satisfied their customers' next-generation needs.

But, paradoxically, when the best firms subsequently failed, it was for the same reasons--they listened responsively to their customers and invested aggressively in the technology, products, and manufacturing capabilities that satisfied their customers' next-generation needs.

That's actually where the title Innovator's Dilemma comes from:

This is one of the innovator's dilemmas: Blindly following the maxim that good managers should keep close to their customers can sometimes be a fatal mistake.

I'm pretty convinced by Clayton's logic, but there's one more reason why IBM might be reluctant to enter the PC market.

WHY IT'S HARD TO CUT PRICES TO MATCH DISRUPTIVE TECH

If you're IBM, selling to the Fortune 500 means certain things. First, you need to schmooze with your clients while they decide which computers to buy. Given how expensive computers are, that probably means you'll be wining and dining your customers for months and months: that's what sales guys mean when they refer to a "long sales cycle". And even though IBM picks up the bill for dinner, the client ends up paying for the dinner and the sales team's salary down the road.

There's another problem with selling mainframes: they require an army of consultants and engineers to customize and configure. That means even more overhead and salary to cover.

Any time you have a long sales cycle and extensive consulting requirements, you pretty much have to charge a lot. Pretty soon, you've built a huge business infrastructure that demands high prices just to pay the bills. Once you've got that infrastructure as overhead, it becomes difficult to lower your price.

Most Mainframe and Minicomputer companies walked away from PC's... with the obvious results. IBM actually managed both business models for a while, before competition came in and crushed their PC business. But the destructive power of the disruptive technology was undeniable: in less than a decade, PCs went from being cheap toys for hobbyists to being powerful tools that started to cannibalize mainframe sales.

Content Management may be in for a similar journey. Let's take a look at how Content Management works.

THE TRADITIONAL INDUSTRY: CONTENT MANAGEMENT

First, a quick-and-dirty definition of web-based Content Management: software that lets you create, edit, and update a website. Actually, Lighthouse on the Web has a more detailed definition that I find pretty useful:

You might need systems for creating the content (authoring), describing it (metadata tagging), changing and updating it (editing), letting several people edit it together (collaboration), letting the right people do the right things to it (workflow), stopping the wrong people from manipulating it (security), keeping track of how it has changed (versioning), deciding when to display it (scheduling), displaying it in the right standard format (templating), allowing it to be displayed by others (syndication), allowing it be displayed differently to different visitors (personalisation) and more.
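That checklist maps naturally onto the record a CMS keeps for each piece of content. Here is a minimal sketch of such a record, with field names invented for illustration -- no particular product stores exactly this schema:

    # A generic per-item record covering the functions listed above:
    # authoring, metadata tagging, versioning, workflow, scheduling,
    # templating. Field names are illustrative, not any vendor's schema.
    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class ContentItem:
        title: str
        body: str
        author: str                               # authoring
        tags: list = field(default_factory=list)  # metadata tagging
        version: int = 1                          # versioning
        status: str = "draft"                     # workflow state
        publish_at: datetime = None               # scheduling
        template: str = "default"                 # templating

    item = ContentItem("Launch notes", "We shipped.", "editor1",
                       tags=["news"], publish_at=datetime(2003, 8, 25))
    print(item.status, item.version)              # draft 1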

Every website has different needs - especially big, complicated websites. As a result, most CMS software vendors fall into the same business models as the Mainframe vendors selling to corporations: long sales cycles and extensive consulting requirements.

CONSULTING: BREAKING IT DOWN

I find it helpful to break apart Consulting into two separate pieces: customization and configuration.

Customization requires programmers: the software doesn't do what the client wants, so they have to pay you to add a new software module. If you're lucky, you can get your client to eat most of these development costs and incorporate the new code into your software. Lots of software companies fund their development this way.

Configuration is much simpler: the software does exactly what the client wants, and you just have to set some toggles and flip some switches so the software knows what to do. Configuration is much cheaper than Customization, although it's usually done by the same high-priced programmers that do the Customization.

You can see the split in consulting types in the various business models of CMS vendors. Content Management software can usually be sorted into three major tranches:

High-End Solutions: expensive Software + extensive consulting for Customization and Configuration


Mid-Range Solutions: semi-expensive Software + extensive consulting, mostly for Configuration


Low-End Solutions: cheap solutions + limited consulting, mostly for limited Configuration of cookie-cutter sites

If you need a CMS to jump through hoops in a very particular way, then you'll probably have to spend the $200-500k it costs for a software license to a high-end solution. The typical rule of thumb when pricing high-end solutions: you'll pay at least as much in consulting as you did for the software license.

The mid-range solutions are in the $20-100k range, with consulting in that range as well. If the software doesn't do what you want, it's difficult to get a mid-range CMS vendor to Customize your solution.

The low-end solutions tend to be cheap, but generally limited to cookie-cutter sites. After all, the only way to keep the total cost of a website down is to minimize both the length of the sales cycle and the amount of consulting required.

As a result, the growth of the CMS sector has been constrained. Just as mainframes are limited in how far down-market they can go, so too are most Content Management Solutions limited in their market reach.

DISRUPTIVE TECH: WEBLOGS

That's all changing with the new disruptive technology: the humble weblog.

Since most blogging tools are both free and addictive, it's no surprise that the sales cycle has been eliminated. Better yet, point-and-click blog designs mean that there's minimal consulting - either customization or configuration - required to set up your blog.

The result? Weblogs are spreading like wildfire - by some accounts, the market is growing by as much as 25% a month. Weblogs are infecting the low end of the content management space with their incredible viral growth.

By itself, this doesn't make for a disruptive technology. But two developments are turning blogging software (aka blogware) into a contender for the CMS crown:

A Growing Army of Consultants
The Increasing Power of Blogware

A GROWING ARMY OF CONSULTANTS

Blogging is creating a growing army of consultants who can configure blogware for websites. Up until now, it's been very difficult for the average HTML designer to configure a Content Management system. But now, most designers can start a weblog with just a few clicks - and configure it to detailed specifications inside of half an hour.

This is a Big Deal, as there are many more designers than programmers. Consultant configuration that used to be done by expensive (and rare) programmers can now be done by cheap (and much more common) designers. I can attest to this firsthand: my programming skills are non-existent, while my design skills are passable... giving me just enough knowledge of HTML to configure blogware to my specifications. I've used that knowledge to configure countless websites over the past several years for my consulting clients.

But there's a second reason that Weblogs are a disruptive technology to Content Management systems:

THE INCREASING POWER OF BLOGWARE

Blogware has grown from its simple origins to an increasingly powerful content management solution.

At first, weblogs just supported basic features: time stamps for each weblog post, automatic archiving of old posts, automated header dates for the posts on a given day, and permalinks that automatically gave each entry its own unique URL.

But in the past two years, there have been incredible advances in blogware functionality. Now many blogware packages support advanced features like:

Multiple databases
Multiple templates
Multiple users
Draft status and future posting
Category support
Data syndication
and much, much more.

Increasingly, there's only a thin layer of functionality separating blogware from low-end Content Management solutions. Features like:

Basic Workflow, so administrators can approve content and templates
Permission Levels, so you can easily separate content editors from template designers
Update Histories, so you can track who's updating what (and when)
Multiple Types of Data, so you can do more than just post blogs (e.g. post Press Releases or Job Listings)

A blogging software company that adds those functionalities to basic blogware could start to eat away at Content Management market share on the low end. It's already starting to happen with corporate weblogs: knowledge management blogs, corporate communications blogs, and marketing blogs are all making a splash in the marketplace without much participation from the low- to mid-range content management systems.
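To make that thin layer concrete, here is a minimal sketch of permission levels plus an approval gate of the kind listed above. The role names and rules are assumptions for illustration, not any particular blogware's model.

    # Minimal permission levels and approval workflow of the kind
    # blogware would need to add. Roles and rules are illustrative.
    PERMISSIONS = {
        "writer":   {"edit_content"},
        "designer": {"edit_templates"},
        "admin":    {"edit_content", "edit_templates", "approve"},
    }

    def can(role, action):
        return action in PERMISSIONS.get(role, set())

    def publish(post, role):
        if not can(role, "approve"):
            raise PermissionError("only an approver can publish")
        post["status"] = "published"
        post.setdefault("history", []).append(("published", role))

    post = {"title": "Press release", "status": "draft"}
    publish(post, "admin")
    print(post["status"], post["history"])  # update history: who did what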

WEBLOGS V. CONTENT MANAGEMENT

Much like the computing world, there will continue to be a role for the truly big Content Management systems: after all, IBM is doing well selling consulting services for its existing Mainframes and other Big Iron hardware.

The Weblogs versus CMS dilemma will probably play out much like the PC versus Mainframes dilemma: at first, it seemed like PC's didn't have significant market or revenue potential. By the time the Mainframe vendors caught on, PCs were a full-blown revolution and were beginning to match the price/performance of the powerful Mainframes.

But much like Personal Computers, weblogs are riding a whole new price/performance curve that threatens to move upscale into higher end solutions.

In other words, weblogs are a violently disruptive force in the content management sector!

WEBCRIMSON: TAPPING THE DISRUPTIVE TECH OF WEBLOGS

It's that disruptive force we hope to tap here at WebCrimson.

We've been consulting to clients for over three years, building browser-based software to enable our clients to update their sites. Unlike most blogware solutions, we haven't had the luxury of building a bare-bones blogging solution and adding on to it: from the very beginning, we've had to bake in both basic blogware and CMS functionality.

WebCrimson supports all the features listed above: Basic Workflow, Permissions Levels, Update Histories, and Multiple Types of Data. Up until recently, the configuration consulting has required someone who knows how to work WebCrimson... but with the launch of CrimsonBlog.com, anyone can set up a blog in under a minute.

I'm not kidding myself: our Crimsonblog.com product is not new to the marketplace - there are other sites that let you create a free static blog and even host it for you. But we're really excited about another site we're launching along with CrimsonBlog: CrimsonZine.com, a free hosting site that lets you easily post essays to the web.

Crimsonzine is really exciting for us, because it does represent something new: it's our first website builder that draws on the underlying WebCrimson engine to extend blogware beyond the blogging sector.

We've got many more similar website builders in the works:

Book Reviews so you can easily review your favorite books
Music Reviews will do the same for CDs and MP3s
Press Centers will let you easily post Press Releases and links to Media Mentions
Frequently Asked Questions will let you manage lists of FAQs
Calendars will let users easily create calendars of past and future events

That's just the beginning! Blogging is the first and truest killer app of Personal Publishing: these other Crimson sites will extend that power into more flexible and powerful personal and business websites. And because we'll be offering them through a browser interface, they'll tap the viral nature of weblogs - letting users set up sites in seconds rather than minutes (or hours!).

By tapping the violently disruptive force of weblogs, we hope to help bring the power of blogware to the masses and push blogware to the next level. To the Content Management Giants, a word of warning: watch out for weblogs!

John Hiler is the CEO of WebCrimson, personal publishing software which lets you easily build blogs or webzines. He is also the editor of Microcontent News, an online magazine about weblogs, webzines, and personal publishing.

August 22, 2003 at 07:45 PM in Blogging & feeds | Permalink | Top of page | Blog Home

Fast Company | 5 Technologies That Will Change the World

Fast Company | 5 Technologies That Will Change the World: "5 Technologies That Will Change the World
It's hard to believe in advances that are poised to change the world when everyone's just trying to survive. But these tireless innovators are developing technologies that are making the future worth looking forward to again.
From: Issue 74 | September 2003, Page 93 By: Scott Kirsner Illustrations by: Terry Allen
After the Internet bubble burst, people stopped thinking about the transforming powers of technology. And technology companies were forced to stop crowing about how they were set to change the world. Instead, they ate crow -- and concentrated on staying alive."

Fast Company | 5 Technologies That Will Change the World
Fast Company | 5 Technologies That Will Change the World: "5 Technologies That Will Change the World
It's hard to believe in advances that are poised to change the world when everyone's just trying to survive. But these tireless innovators are developing technologies that are making the future worth looking forward to again.
From: Issue 74 | September 2003, Page 93 By: Scott Kirsner Illustrations by: Terry Allen
After the Internet bubble burst, people stopped thinking about the transforming powers of technology. And technology companies were forced to stop crowing about how they were set to change the world. Instead, they ate crow -- and concentrated on staying alive."

But technology didn't stop evolving and maturing, no matter what the Nasdaq did. Imaginative researchers and engineers, by their nature, aren't very good at throttling back to a conservative idle.

So while shareholders nursed their battered portfolios and big companies chiseled away at their cost structures and employment rolls, these innovators kept working. They kept trying to develop technologies that would represent giant leaps forward, not just incremental baby steps.

We set off in search of those people who were bold enough to think that the world might at some point be ready to take a giant leap again and to believe that innovative technology can still put serious distance between a leader and the rest of the pack.

In such places as Mooresville, North Carolina; La Jolla, California; Hawthorne, New York; rural Connecticut; and Manhattan's SoHo district, we found companies that are developing or deploying technologies that could change the world. Each will have a different impact -- from smart tags that will allow products to be tracked through the distribution network to bio-simulation software that is speeding the path of safer, more effective new drugs to pharmacy shelves. We sent back these five postcards from the edge.

The ThermoJet printer outside of Scott Campbell's office looks like a big Xerox machine, although at $49,000, it's a bit pricier. But instead of cranking out color prints, the ThermoJet produces 3-D wax models of car parts and body designs for the Penske Racing NASCAR team, headquartered in Mooresville, North Carolina.

Penske is obsessed with technology that will help it leave competitors in the dust. (The team has notched more than 45 wins in the NASCAR Winston Cup Series.) And Campbell says that 3-D printing, which allows the team to turbo-charge its design process, is just such a technology.

"It used to be a long process to sculpt things by hand," says Campbell, a senior engineer for the Penske team. "Now we design things on the fly and make lots of incremental changes, because we can just print them out and see how they look."

Three-D printing is changing the world of product design. These printers typically shape objects by laying down materials, such as wax or plaster, one layer upon the other. A small model can take as little as an hour to create, and some printers can create objects in full color. Three-D printing is being used to design everything from children's strollers at Graco to running shoes at New Balance and Reebok, allowing designers and engineers to show their work earlier in the process, make changes with less fuss, and get new products to market faster.

The Penske team's printer, made by 3D Systems, a publicly traded California company, churns out models of such things as suspension components and brake caliper mounts, as well as complete car bodies. Once a part has been printed and approved, the model can be sent to a foundry to be cast in steel or titanium and eventually installed on one of Penske's two race cars. (Most NASCAR vehicles are entirely custom-built.) Models from the 3-D printer can even be tested in a wind tunnel -- something that Toyota has done with parts such as side-view mirrors for its production vehicles.

As 3-D printers drop in cost -- Z Corp., an MIT spin-off, offers a low-end printer for $29,900 -- they could even start showing up in places like Kinko's, allowing customers to do not just desktop publishing of documents, but desktop publishing of objects.

Already, the Penske team appreciates the advantage of being able to turn out prototypes -- and make changes -- quickly and cheaply. (A small model of a car costs about $160.) "We're looking for every performance edge we can find with our design and manufacturing techniques," Campbell says. "We don't show up at a race to lose."

Rather than fall victim to simple viruses, the Linux servers in Dr. Richard Ho's labs are supposed to contract more serious diseases.

Several of the computers at the Johnson & Johnson Pharmaceutical R&D facility in La Jolla, California, suffer from Type II diabetes. Ho, the head of medical informatics at the facility, expects that other servers will eventually come down with debilitating diseases of their own.

"When you're trying to develop a new drug, there's a lot of guesswork involved," Ho says. "You'd work on a new drug candidate in the lab, and eventually test it on animals, and then test it on humans, but you might not have a good idea of what the drug would do at any of those stages."

That's where Ho's sick servers come in. By creating mathematical models of diseases such as diabetes, obesity, asthma, or arthritis in a computer, researchers can run virtual tests of their new drug candidates -- much in the way that an aeronautical engineer uses a computer simulation to imagine how an airplane design will perform once it's built. Often called "biosimulation," the approach compiles everything that is known about a given disease -- even down to the activity that takes place inside a single cell. And the computer models can be updated as scientists learn more about how the diseases work.

Researchers can anticipate bad reactions before they give a drug to animals or humans, and they can run many more tests on a computer than they could run in the real world. Ideally, biosimulation will help Johnson & Johnson and other pharmaceutical companies focus their efforts on the drug prospects that are most likely to succeed.
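Under the hood, a disease model of this kind boils down to differential equations integrated over time, with a candidate drug changing a parameter. The toy model below -- blood glucose relaxing toward a baseline that a hypothetical drug lowers -- is a cartoon to show the mechanics, nothing like Entelos' actual platform; every number in it is invented.

    # Cartoon biosimulation: Euler-integrate d(glucose)/dt = -k*(g - baseline),
    # with a drug effect lowering the baseline. All numbers are invented.
    def simulate(hours, drug_effect=0.0, dt=0.1):
        glucose = 180.0                  # starting level, mg/dL (assumed)
        baseline = 140.0 - drug_effect   # where the system settles
        k = 0.05                         # relaxation rate per hour (assumed)
        t = 0.0
        while t < hours:
            glucose += -k * (glucose - baseline) * dt
            t += dt
        return glucose

    print(round(simulate(24), 1))                  # untreated
    print(round(simulate(24, drug_effect=40), 1))  # with hypothetical drug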

"Even by the time a drug candidate gets to Phase III clinical trials -- the last stage before it reaches the market -- the failure rate still approaches 50%," says James Karis, the CEO of Entelos, the Foster City, California, company that supplies biosimulation technology to Johnson & Johnson. "That's after eight or so years of research. Those failures are very expensive." (Today, the standard figure for bringing a drug to market is between $800 million and $1 billion.) Simulation software will make those failures less painful and help pharmaceutical companies find useful drugs sooner.

Ho's team used biosimulation to reduce the amount of time and the number of patients required for the first phase of clinical trials of a new, as-yet-unannounced drug for Type II diabetes. (Ho estimates that the software, in its first outing, saved between six and eight weeks in trials.) Using sick computers as a stand-in for sick humans is still a new idea that will have to prove its value by contributing to the development of important new drugs. But eventually, Ho predicts, "this will become commonplace. It's a tool we never had before."

Nagui Halim believes that there are few constants in the world of computing. Chips get faster and more powerful, storage gets cheaper, and communications bandwidth keeps increasing. But one thing doesn't change: Our computers are still horrible at coping with problems.
"People are excellent at handling changes in the environment, or in their own body," says Halim, the director of distributed computing at IBM's Watson Research Center in Hawthorne, New York. "If you have too much work, you know how to prioritize and do the work that matters most. If you're feeling sick, you might lie down for a while." But even the most sophisticated computers aren't self-aware enough to know how to handle stress, or react to their own health problems.

Halim is part of a group at IBM that's working on what the company has termed "autonomic computing": developing computers that are smart enough to configure themselves, balance intense workloads, and know how to predict and address problems before they happen. At IBM, the leader in the field, the annual research budget for autonomic computing approaches $500 million. And the quest to develop systems that take care of themselves isn't just an abstract research initiative: Its fruits have begun creeping into Big Blue's product line.

"We've already got storage management software that can tell you when a storage device will fail before that failure happens," says Alan Ganek, vice president of IBM's autonomic-computing initiative. "And we're selling database software that can recommend a configuration based on the hardware environment you're running it in. Most database administrators had previously done that by trial and error."

The long-term promise of self-aware computers and software is greater reliability with fewer human baby-sitters. Right now, Ganek says, IT staffs at large companies are swamped with the tasks involved in "managing, maintaining, upgrading, and the care and feeding of their systems. That work squeezes out any innovative projects that they'd like to be doing to establish a competitive advantage."

Imagine, Halim says, a system that is smart enough both to see that online orders are spiking as the holiday shopping season approaches and to temporarily commandeer a bit of extra processing power from the human-resources server so that it can handle the influx of orders. Personal computers might know when a software upgrade became available and install it themselves. But as Adam and Eve discovered, self-awareness and sin often go hand in hand. A key challenge for IBM will be imposing restraints on this smarter generation of computers, so that your PC doesn't go out and spend $100 upgrading to Windows 2005 without your permission.
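Halim's holiday-orders example is, at bottom, a monitoring loop with a reallocation rule. A minimal sketch, with the thresholds, pool names and units invented for illustration:

    # Toy autonomic loop: shift capacity to the order system when its
    # load spikes. Thresholds and pool names are illustrative assumptions.
    pools = {"orders": 4, "human_resources": 4}   # processing units

    def rebalance(order_load):
        # If orders run hot and HR has slack, borrow a unit of capacity.
        if order_load > 0.8 and pools["human_resources"] > 1:
            pools["human_resources"] -= 1
            pools["orders"] += 1

    for load in (0.5, 0.85, 0.95):   # simulated load samples
        rebalance(load)
    print(pools)   # {'orders': 6, 'human_resources': 2}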

Guests don't go to the glitzy Mohegan Sun casino and resort, in central Connecticut, to see the fuel-cell center that's housed in an old fire station on an access road. And they don't ooh and aah over the dozen hydrogen storage tanks on the fire station's roof.

But the fuel-cell center, which is designed to provide the casino with reliable, clean backup power, may be one of the most glamorous things going at Mohegan Sun. Eventually, on-site power generation and storage facilities like Mohegan Sun's could change the structure of the country's power grid. The concept is called "distributed generation" (or sometimes, "decentralized generation").

Today, the way that power is generated in the United States looks a lot like the old world of mainframe computers, says Chip Schroeder, CEO of Proton Energy, the Connecticut company installing the hydrogen system at Mohegan Sun. A few big, clunky plants are connected together in what's known as "the grid." In some ways, that system is efficient -- it's the cheapest way that we know to produce and distribute electricity -- but in other ways, it's terrible. Electricity is lost as it's transmitted over long distances. No one likes living next to a massive power plant. And the huge capital investments mean that old, expensive plants keep running long after cleaner, more efficient technology becomes available.

Schroeder says that the new power network will look a lot more like the Internet than the outmoded mainframe model. Smaller generating facilities -- some using solar, wind, and other renewable energy technologies and others using scaled-down gas-fired turbines -- will be widely distributed and placed closer to where the power is actually being used. They will be more easily upgradeable. The power will be more reliable, because most outages are caused by distribution problems, like a downed line.

The installation at Mohegan Sun is one of only a few tentative steps toward this Internet-like power network. "But you need to prove that this can work before more people will adopt it," says Dan Reicher, a vice president at Northern Power, recently acquired by Proton. And other projects are popping up. Later this year, Northern Power will be starting a demonstration project in Vermont that will be the world's first "microgrid." This web of generating technologies will serve an industrial park and a few nearby residences, and even feed surplus power back to the main power grid. A similar microgrid is being built in downtown Detroit by DTE Energy, a subsidiary of Detroit Edison.

"It may take awhile, and we're probably biased," says Schroeder, "but we think this is the future."

The glass door of the dressing room at Prada's Epicenter store in SoHo slides shut.

I hang a $450 gray patterned shirt on a rack inside, and suddenly, a color flat-screen display on the wall lights up. The dressing room has "recognized" the item I've brought in, then suggests other sizes and materials that it comes in and even shows a picture of a much-better-looking-than-me model wearing the shirt in a Prada fashion show.

Attached to the shirt, along with the stratospheric price tag, is a piece of clear plastic the size of a business card. Embedded in the plastic is a coil of bronze microchip circuitry, which contains information about the shirt and conveys it to a reader built into the dressing room. This is a smart tag (or RFID tag, for radio-frequency identification), made by Texas Instruments and sold for about $3. It can be made much smaller -- about the size of a fleck in a snow globe -- and for as little as 10 cents.

The promise of smart tags is that they could serve as an advanced version of the omnipresent UPC bar code, providing information about not just what a product is, but also where it is, where it has been, and how it has been handled. A smart-tag reader in a warehouse, truck, or store can "query" all of the smart tags in its vicinity, taking inventory without human help. Smart tags are also being affixed to refrigerated containers to make sure that food is stored at the right temperature.
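An inventory "query" amounts to collecting the IDs of every tag in range and tallying them by product. The sketch below fakes the radio layer with a list of tuples; real readers talk to vendor hardware, so everything here is an illustrative assumption.

    # Toy RFID inventory sweep: tally the tags a reader "hears".
    # The tag tuples stand in for the radio layer -- illustration only.
    from collections import Counter

    def take_inventory(tags_in_range):
        # tags_in_range: iterable of (tag_id, product) pairs
        return Counter(product for _tag_id, product in tags_in_range)

    tags = [("t001", "shaving cream"), ("t002", "shaving cream"),
            ("t003", "razor"), ("t004", "razor"), ("t005", "razor")]
    print(take_inventory(tags))  # Counter({'razor': 3, 'shaving cream': 2})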

Gillette uses the tags to track cartons of Venus women's razors through a packaging and distribution center in Massachusetts, and may buy as many as a half-billion tags over the next two or three years. The tags could also tell retailers how many cans of its shaving cream sit on their shelves at any given moment. Seven million tags are already attached to the keychains of drivers who pay for their gas with ExxonMobil's SpeedPass system. The tipping point for smart tags will likely arrive by 2005, when Wal-Mart will require its top 100 suppliers to attach them to each forklift pallet of products they deliver to the retailer. (Privacy concerns could slow things down. The fear: You could be traced through your clothing or possessions.)

"You'll see a lot of diverse uses," says Bill Allen, Texas Instruments' e-marketing manager for RFID products, "because not only can you store information on the tags, you can also rewrite it." In Iraq, the tags were used on a Navy hospital ship to track the location and triage status of injured soldiers. "And then," Allen says, "in peacetime, you've got a company like Prada, using [smart tags] to improve the customer's shopping experience."

PS: I bought the shirt. It was on sale for half off. But the tag wasn't smart enough to get my editors to pay for it.

August 22, 2003 at 10:44 AM in Web/Tech | Permalink | Top of page | Blog Home

Yahoo! News - Officials Look to Unearth Internet Worm Writers

By Elinor Mills Abreu and Bernhard Warner
Thu Aug 21, 7:51 PM ET

SAN FRANCISCO/LONDON (Reuters) - They write menacing software with names like "Blaster," "Welchia" and "Sobig" that worm around the Internet leaving destruction in their path, and on Thursday detectives and computer security firms were hot on their trail.


Computer virus writers have unleashed an unprecedented outbreak of computer worms this past week, and while finding them will not be easy, experts generally believe they are ego-filled computing geeks out to impress others.


"Every major law enforcement agency is looking into this. At the end of the day, we want to prosecute," said a cyber crime investigator at the UK's National Hi-Tech Crime Unit, who asked to remain anonymous.


In the past two weeks, major computer infestations by Blaster, also called "LovSan," and Welchia, also dubbed "Nachi," have crawled through holes in computers using Microsoft Corp.'s Windows operating system. A third worm, Sobig.F, has spread via Microsoft e-mail programs.


The result is that hundreds of thousands of PCs worldwide have crashed and many computer networks have slowed to a crawl.


The full economic impact of this recent infestation may never be known, but the growing list of victims includes the U.S. Navy and Air Canada. Experts are calling this recent computer infestation the most damaging worm outbreak yet.


To catch the suspects, investigators are piecing together suspect profiles from strings of computer code to try to trace their destination through a maze of Internet addresses.


This new group of worms is believed to be the work of different parties. The most perplexing may be the author of Welchia, a worm that tries to stop the Blaster worm.


EGO-DRIVEN GEEKS


Welchia is the brainchild of either a misguided digital do-gooder or an ego-driven programmer - the latter being the typical virus writer, computer security experts said on Thursday.


"Any kind of worm that intrudes upon your PC is not good," said America Online spokesman Nicholas Graham.


The Welchia worm arguably does more damage than Blaster, which merely crashes systems. In its zeal to find computers that are infected with Blaster, Welchia is conducting a lot of Internet scanning that paralyzes and slows many computing networks.


Welchia's creator is believed to be from China because the code contains Chinese words and names. The author also includes a phrase saying it was created for a good cause, said Jimmy Kuo of anti-virus vendor Network Associates Inc.


Blaster is thought to have begun in an English-speaking country because of the impeccable English in the software code, said Mikko Hypponen of anti-virus company F-Secure of Finland.


The reference to "San," (in Blaster's other name, LovSan) possibly short for "Sandy," could be the handiwork of a male virus writer looking to impress a girl, he said.


Virus writing "gives underworld cachet to what is otherwise a pretty geeky existence," said David Perry, global director of education for Tokyo-based anti-virus provider Trend Micro. "To impress a girl ... you go out and write a computer virus."


Last year, police tracked down convicted Welsh virus writer Simon Valler after he named his friends and included comments about Wales in the text of his computer virus, dubbed GoKar, investigators said.

August 22, 2003 at 08:36 AM in Virus | Permalink | Top of page | Blog Home

Google Deal Ties Company to Weblogs

Google Deal Ties Company to Weblogs: "Google Deal Ties Company to Weblogs
By AMY HARMON

Google, the operator of the Web's leading search engine, has bought Pyra Labs, the creator of software for publishing Weblogs, a form of hyperlinked online journal that has become an increasingly popular way to distribute and collect information on the Web.
Terms of the deal were not disclosed, but the move was hailed by users of Weblogs, commonly called blogs, as a watershed moment for the fledgling communications medium, sometimes dismissed as too narrowband and self-involved to have a significant cultural impact."

August 22, 2003 at 12:37 AM in Blogging & feeds | Permalink | Top of page | Blog Home

Future of Blogging

This article from Jim Ray nicely summarises some of the issues concerning the Google purchase of Blogger (Pyra).

Future of blogging
February 18, 2003 11:46 AM

Google's acquisition of Pyra, makers of Blogger, was announced this weekend, on a blog (of course), and has since spread through the blogosphere and even mainstream media.

Coverage in:
Guardian Unlimited
NY Times
MSNBC
Washington Post

As a blogger, using the very excellent Movable Type, I'm still not sure what I think about the whole affair. When the most popular, and most effective, search engine in the world buys a blogware corporation, it's certainly going to raise the profile of blogging. This could very well be what takes blogging even further into the mainstream.

The deal leaves me with a few questions, though. I think it's pretty obvious that Pyra is the big winner here. While they weathered the dotcom storm fairly admirably, weather was about all they did. The service was hit with multiple outages and malicious hacker intrusions. From all outside appearances, the team of six that will be moving to Google were having a hard enough time keeping their service on par with the status quo, let alone developments from upstarts like Movable Type. The Google server farm and bandwidth alone, not to mention the Google braintrust, will be an enormous benefit to Pyra and Blogger users.

The question I have is, what does Google get out of all of this? I don't think it's too much of a stretch to say that purchasing Blogger was as much an altruistic move as it was a savvy business one. Blogger may not have been in the same boat as, say, Salon, but its technical and financial woes are now no longer a concern. There's little doubt that the Blogger community breathed something of a collective sigh of relief when they learned they'd soon be transferred to one of the most reliable networks on the Net. But does Google really benefit from having the largest blogging community hosted on its server farm?

Several have hypothesized that the real benefit to Google will be better analysis of the blogging zeitgeist. Bloggers are notorious for the rapidity with which they update and respond to cultural memes, much faster than the mainstream media. The whole flap over Trent Lott was completely ignored by the mainstream media until the pundits of the blogosphere launched their attacks. By actually hosting Blogger, the theory goes, Google will be able to analyze blog traffic better and more quickly.

I'm not convinced that this is really all that compelling. There's nothing stopping the geniuses at Google from modifying their crawlers just for weblogs and building a comprehensive blog search, akin to Google News.

The most compelling reason for me lies in a shift in how you have to think about Google. For most people, Google is synonymous with search. It's how they started; it's what they do best. But the "do one thing, do it best" mantra is shifting from "search best" to "manage information best". Google is, in effect, positioning itself to be the best library imaginable, and blogging is one more collection to index. In this way, their original mission of "search best" becomes one part of information management. Furthermore, with its Blogger acquisition, Google will be able to influence the blogging community and blogware in ways that make indexing all blogs easier. Direct access to and influence over the Blogger API may be just as important as, if not more important than, the Blogger database.

There are some other practical considerations, as well. Hopefully, blogvertising will become more viable when Google drops the banner ads from Blogger sites and incorporates their much nicer ad system. Blogging, for better or for worse, will be somewhat legitimized (which may turn off as many people as it turns on). And blogging will be easy. Rest assured, the fine folks at Google will work very hard to make it easy enough for my mom to start blogging if she so desires.

The notion of what it means to blog may fundamentally change in the coming months, as well. If micro-publishing suddenly becomes easy enough for everyone, with the backing of the Google juggernaut, we could very well see a surge of online content. Most blogging these days is reserved for political punditry, technical discussion or personal observation. Very, very few blogs are a forum for non-techie writers looking to publish their short stories, for instance. With Google blogging, writers may suddenly be empowered to make a real go at publishing on the web, casting off the shackles of a largely staid publishing industry. Can independent music and film be far behind?

Given Google's past with acquisitions like the Deja Usenet catalog, I'm optimistic that they'll do the right thing when it comes to blogging. They'll have to tread lightly, though, as the blogosphere is quick to anger and eager to vent its flaring temper. Of course, it's that volatile temperament that Google is buying into.

August 22, 2003 at 12:36 AM in Blogging & feeds | Permalink | Top of page | Blog Home

Marketers Say They Intend to Join Effort to Fight Spam

The New York Times: Technology: "Marketers Say They Intend to Join Effort to Fight Spam
By JOHN SCHWARTZ and JOHN MARKOFF
The trade group for direct-mail marketers is helping develop a high-technology group dedicated to shutting down the most egregious users of bulk e-mail, or spam."

August 22, 2003 at 12:19 AM in Spam | Permalink | Top of page | Blog Home

Spread of Virus Is Fastest Ever

The New York Times: Technology: "A computer virus that circulated across the Internet this week, hard on the heels of another nasty online infection, has been declared the fastest e-mail outbreak ever" - 'Sobig.F'

August 22, 2003 at 12:15 AM in Virus | Permalink | Top of page | Blog Home

August 21, 2003

Most potent virus

Yahoo! News - Technology - Reuters Internet Report: "A new computer virus feared to be the most potent ever spread like wildfire Thursday, sending e-mail networks crashing and frazzling technicians already overstretched by a plague of computer bugs."

Sobig Worm Aims to Turn PCs Into Spam Machines
Reuters Internet Report
By Elinor Mills Abreu

SAN FRANCISCO (Reuters) - Several Internet worms that have besieged computers for over a week played havoc again on Wednesday, including one called Sobig.F whose aim was to turn PCs into spam machines and was believed to be the fastest growing virus ever, experts said.

"Sobig.F drops software onto infected Windows computers that open them to be used later for distributing Internet spam -- unwanted e-mails and product promotions, experts said. It also represents a new trend in converging e-mail spamming and virus software writing, they said.
'We believe (Sobig.F) has been written by a spammer or spammers' looking for ways to get past spam filters, said Mikko Hypponen, manager of anti-virus research for Finnish security firm F-Secure. 'For once, we have a clear motive for a virus -- money.'
Security experts said it was difficult to ascertain how many computers had been infected by the Sobig.F worm. Worms are viruses that spread through networks.
Internet service America Online, however, said it blocked about 11.5 million copies while security firm MessageLabs stopped more than 1 million copies within the first 24 hours and dubbed Sobig.F the fastest growing e-mail virus ever.
Sobig.F hit the computing world as corporations were still recovering from several worms that spread through holes in Microsoft Corp.'s Windows operating systems, including the 'Blaster' worm. Also called 'LovSan,' it has infected and crashed hundreds of thousands of computers since last week.
The 'Welchia' or 'Nachi' worm, which surfaced on Monday, infected 72,000 computers used by the U.S. Navy and Marine Corps and crippled Air Canada's reservation counters and call centers.
CSX Transportation said on Wednesday that a virus infection had slowed its dispatching and signal systems, forcing it to halt passenger and freight train traffic, including the morning commuter train service in Washington, D.C.


NEW TREND, SPAM-VIRUS CONVERGENCE


Sobig.F hit home users particularly hard, experts said. It arrives in an e-mail with an attachment that, when opened, infects the computer and sends itself on to other victims using a random e-mail address from the address book, making it difficult to trace the worm back to its source.


The Sobig family of worms represents a new trend in the convergence of worm and spam techniques for more widespread and faster deployment, experts said.


Virus writers are utilizing software that spammers employ to send bulk spam messages. Conversely, spammers are starting to use methods incorporated by virus writers to spread their messages and avoid detection, said Brian Czarny, marketing director at e-mail security company MessageLabs.


Previous Sobig versions loaded a program onto infected PCs that broadcast spam to other computers, thus turning the PCs into so-called "spam relays."


Sobig.F downloads a Trojan onto infected computers, which could later be remotely activated to send spam, experts said.


"There are computers scanning the Internet for open relays so spammers can jump from one machine to the next and be able to send millions of spam messages and have them not be traced back to them or be blocked," said Jimmy Kuo, research fellow at anti-virus vendor Network Associates Inc.


Sobig.F, which expires on Sept. 10, is spreading quickly because it sends multiple e-mails simultaneously and spreads to other computers on a shared network, said experts, who predict there will be another version in the near future. (Additional reporting by Bernhard Warner in London and Charles Grandmont in Montreal.)

August 21, 2003 at 12:16 PM in Virus | Permalink | Top of page | Blog Home

August 20, 2003

High cost of failure in a networked world

High cost of failure in a networked world
DAVID OLIVE
Toronto Star

It's hard to say which domino, or "cascading," effect was more pronounced during the power crisis: the sequential failure of electric utilities across a region serving 50 million people, or the spiralling blame game among politicians and special-interest pleaders.

"Have you ever seen the United States take the blame for anything?" said Mel Lastman last Friday. With that constructive riposte, the Toronto mayor claimed his share of the embarrassment as Ottawa, Albany and Gracie Mansion speculated about phantom lightning strikes in the Niagara Frontier and an imagined fire at a nuclear power plant in Pennsylvania.

Not to be outdone, George W. Bush said last Thursday it's time to determine whether North America's creaky system of electric-power distribution needs a massive upgrade. "I happen to think it does," Bush declared, "and have said so all along."

All along, in fact, Bush has held up energy bills in Congress that would provide upgrades to the North American power grid for want of a quid pro quo on oil drilling in the ecologically sensitive Arctic National Wildlife Refuge (ANWR), a Bush obsession that would satisfy U.S. oil demand for 20 months at best. In one of those horse-trading sessions, Bush two years ago leaned on fellow Republicans in Congress to kill a proposed $350 million (U.S.) loan package for power companies to overhaul the grid.

The ideologues have swung into action, too.

The free-market crowd says the stalled progress of deregulation, particularly in Ontario, has discouraged urgently needed private investment in power plants and the transmission grid that failed last Thursday.

For their part, the anti-deregulation zealots, notably Ontario NDP leader Howard Hampton, fingered the profit motive unleashed by deregulation, which they say shifted investment dollars away from the low-return transmission business and increased vulnerability to episodes like the blackout.

Environmentalists got into the act, touting alternative energy sources. They touched a Luddite nerve by criticizing a system built on energy megaprojects rather than small-scale neighbourhood generators.

And the anti-terrorism industry got its unavoidable star turn.

Conclusive studies cited in Slate and elsewhere report that sophisticated hackers have been attacking the grid on a daily basis for years with little to show for their efforts. And it's that command-and-control system -- "the NATO of the power industry," as one expert calls it -- and not the physical network of power plants that is the system's real soft spot.

But never mind. The alarmists say that with our addiction to Palm Pilots, electric cheese-shredders and self-flushing toilets, we have made ourselves vulnerable to every crackpot Osama bin Hydro able to navigate the black market in rocket parts.

"If you want to do a good job, you go to a Texas gun show and buy a grenade launcher," Charles Perrow, a former Yale University sociology professor and author of Normal Accidents: Living With High-Risk Technologies, warned in the latest Sunday New York Times, the telephone-book-sized publication that managed to reach my 7-Eleven in west-end Toronto at the usual time despite its origins in one of the epicentres of the blackout. "You drive by an electric power station and you point in the right direction and bang."

Yet in contrast to the 3 million Quebecers without power for three weeks after the 1998 ice storm, our "third-world electrical grid," as a former U.S. energy secretary described it after the latest crisis, has performed well, at least in the aftermath of the crisis.

The system managed to restore power to parts of mid-town Toronto six hours after the blackout began. There will be intermittent power outages in the days ahead.

But the real, uplifting surprise is the relative speed with which intelligent and hard-working employees in the system have already turned the communal shortages of last week into a fading memory.

The thing to fear is complacency. We didn't expect this jolt. The next one could be bigger and longer.

And deprived of electric-powered water-pumping stations for even a few days, we might next time be showering with Perrier, as the Globe and Mail's Margaret Wente suggested.

Yet by Monday, CNN's morning-show hosts were already doubting the resolve of Congress and the White House to follow through with promised measures for overhauling the grid as public attention began to shift elsewhere. "I can tell you," one of the hosts said, "I personally am already sick of this story. And my house was dark last week."

Is deregulation to blame for the blackout? The significant blackouts in the U.S. northeast of 1959, 1961, 1965 and 1977, some of them affecting Canada, pre-date the deregulation era.

What about North America's remaining old-fashioned, vertically integrated, regulated monopolies, which have come in for renewed criticism of their inefficiency from Energy Probe and other deregulation advocates since the blackout?

It's tough to make that case, too. Quebec and the U.S. south, conspicuous holdouts against power deregulation, did not lose power last week. As it happens, Hydro-Québec has come to the rescue of Ontario and New York state, which both began to deregulate in 1996. Soon after the crisis began, Quebec's state-owned utility began to send surplus power to Ontario and New York, and to New England states as well, until all their power plants are back online.

Indeed, Hydro-Québec would seem to make a fine case study on the merits of traditional vertically integrated monopolies. Blessed with serendipity, Hydro-Québec chose last week to press its demand for a 6 per cent rate hike after a five-year rate freeze. Consumer groups are outraged.

But then, Hydro-Québec has done what its deregulated cousins are now urged to do. Over the past decade, Hydro-Québec has spent $2.2 billion (Canadian) upgrading its transmission network. About half of that money was spent since 1998, a lesson from the ice storm.

Even Hydro-Québec's proposed price hike is instructive to conservation advocates. They say that it is artificially low prices — like the 4.3 cents a kilowatt-hour imposed by Eves last year — that give us a false sense of energy security. "It would not be honest," says Hydro-Québec CEO André Caille, to mislead consumers about the true cost of electricity.

The dismaying lesson of last week is that, with or without deregulation, the energy markets are now so interconnected that "cascading failures" are probably inevitable. After all, such failures are the norm in our networked world. The collapse of the Thai baht in the late 1990s triggered a worldwide currency crisis. A backup of planes at Pearson Airport strands travelers across the country. And "denial of service" breakdowns are a routine occurrence when Internet traffic diverted from a malfunctioning router overwhelms other routers unable to bear the increased load.

That's the logic to which Albert-Laszlo Barabasi asks us to resign ourselves in a New York Times essay last weekend. Either we make a concerted effort to fix every link, or the network will crash on us, repeatedly and, on occasion, catastrophically.

"While celebrating that everybody on earth is only six handshakes from us, we need to accept that so are their problems and vulnerabilities," Barabasi says. "Unless we are willing to cut the connections, the only way to change the world is to improve all nodes and links."

August 20, 2003 at 04:17 PM in World Affairs | Permalink | TrackBack (79) | Top of page | Blog Home

August 18, 2003

The problem with mainstreaming RSS

This explanation from Lockergnome is very useful and clear. RSS is going to be a foundational technology but it has a long way to go before the tools are as pervasive as email clients.

The problem with mainstreaming RSS
Opinion from Lockergnome

"Ok, so this is an opinion piece, and I know opinions are like elbows, everybody's got 'em, but if our community is to be successful in mainstreaming RSS, we have to ask ourselves one very important, and simple, question: 'Can our Mothers use RSS?'
My own Mom has been online for several years now, and she's become quite proficient with the basic 'net tools of email, Google, and ICQ. She can attach documents and use Google to find things, but her knowledge of these tools is limited by what she finds useful daily. She is a very intelligent woman, a Dr. of Educational Psychology in fact, but her world does not, and is not likely to, include RSS unless using it gets much, much simpler.
We toss about terms like XML, RSS, Aggregator, Blog, and MovableType with ease, because they are the tools of our trade. We embrace them, we understand them. But for the AOL-minded masses, these terms are too vague, too complicated, too boring. For these people, instant messages and email are their primary tools. Google is useful to them because it's simple. Email is useful for them because it allows them to forward amusing things to their friends and family, and because it is nearly omnipresent. Everyone has an email address.
For RSS to catch on and be embraced outside of technology-focused content, using it will have to become much more user friendly. Your Mom will need to understand it. To be honest, it will probably take someone like Microsoft ..."
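
Under the jargon, RSS is just an XML file of headlines and links, and an aggregator is any program that fetches it and shows the new items. As a rough sketch of what an aggregator does -- assuming Python with the third-party feedparser library, and a placeholder feed URL -- it comes down to a few lines:

# Minimal aggregator sketch: fetch one feed and print its headlines.
# Requires the third-party "feedparser" library; the URL is a placeholder.
import feedparser

feed = feedparser.parse("http://example.com/index.rdf")

print(feed.feed.title)                     # the site's name
for entry in feed.entries:                 # one entry per post
    print(entry.title, "->", entry.link)   # headline and its permalink

Until all of this -- subscribing, fetching, displaying -- is hidden behind a front end as friendly as an email client, Mom is right to ignore it.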

August 18, 2003 at 01:35 AM in Blogging & feeds | Permalink | Top of page | Blog Home

Why we need permanent links added to a blog

This is an explanation of why we need permanent links. These can only be provided by blogging tools.

"Question: How do I add permanent links to a blog?

Answer:
<img alt="permalink.jpg" src="http://ice.typepad.com/ice/images/permalink.jpg" width="465" height="254" border="0" />

Permanent links allow other people with websites or blogs to link directly to certain posts you've made on your own blog, and without fear that the post will slide off your front page and no longer be accessible. It is done by linking to a post at its archived location, which won't change. You'll need to have archives enabled on your blog so that each post will have that permanent home. All you'll need to do is to make sure you have a few specific tags in your template.
You'll be adding one new tag to your template, which is the <$BlogItemArchiveFileName$> tag. When you publish your blog, this tag will be replaced with the name of the archive file where the linked post will be permanently located. Used in conjunction with the <$BlogItemNumber$> tag, it's an easy way to link to individual posts. These tags must be located within the <Blogger> container tags, but cannot be inside the date-header or post-title tags.
Sign in to Blogger with your username and password. Select the blog you'd like to work with and select Settings. Make sure you have archiving enabled and that you have an archive name specified. When this is done, click on Template. Locate the section of your blog template where the <$BlogItem$> information is located, then add in the two tags, similar to the format below:"
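
The example markup itself did not survive the cut-and-paste, but it would look something like the classic-Blogger snippet below. The two permalink tags are the ones named in the quote; the surrounding item tags (<$BlogItemBody$>, <$BlogItemAuthor$>, <$BlogItemDateTime$>) are standard classic-Blogger tags, shown here as a plausible sketch rather than the exact original example:

<Blogger>
<a name="<$BlogItemNumber$>">&nbsp;</a>
<$BlogItemBody$>
<br>
posted by <$BlogItemAuthor$> on <$BlogItemDateTime$> |
<a href="<$BlogItemArchiveFileName$>#<$BlogItemNumber$>">link</a>
</Blogger>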

Note the addition of the <a name> tag at the top of each post. This will enable you to point to the exact post in the archive. If you laid out the formatting exactly like the example above, you'd get the following output, with 'link' pointing to its permanent home:

Saturday, August 21, 1999

What's up with all these pigeons?
posted by Bert on 8/21/82 7:20:04 PM | link

Rubber ducky, I love you.
posted by Ernie on 8/21/82 6:25:12 PM | link

August 18, 2003 at 12:51 AM in Blogging & feeds | Permalink | Top of page | Blog Home

August 17, 2003

Microsoft thwarts attack

Further to the earlier post on viruses, Microsoft had to spend considerable time and money managing the Blaster worm and the underground anti-Microsoft hacker community.

Yahoo! News - Microsoft Thwarts Expected Blaster Worm Web Site Attack: "SAN FRANCISCO (Reuters) - As expected, Microsoft Corp. thwarted an attack set for midnight on Saturday on its Web site by the Blaster worm, which has infected hundreds of thousands of Windows computers.

The Blaster worm, also called MSBlaster or LoveSan, crashes Windows XP and Windows 2000 computers and instructs them to attack a Microsoft Web site hosting an anti-Blaster patch just after midnight, local time, Saturday.
A Microsoft spokesman said the company's site had no problems as a result of the worm, which has infected 386,000 or so computers, according to an estimate from anti-virus vendor Symantec Corp.
A day earlier, Microsoft said it had protected its network by eliminating the Web page with the URL used by Blaster.
Although the attack on Microsoft failed, the worm will continue to spread until computers with the Windows hole get patched, security experts have said. In addition, there are two new versions circulating, including one that installs a back-door Trojan application that provides an attacker remote access to the computer.
The patch for the security hole, which affects Windows XP and Windows 2000, as well as Windows NT and Windows Server 2003, can be downloaded at http://www.microsoft.com. "

August 17, 2003 at 06:11 PM in Microsoft | Permalink | Top of page | Blog Home

August 16, 2003

What is RSS?

What is RSS?

Chris's site is pretty good and this article indicates he will have more on RSS soon.

August 16, 2003 at 10:45 PM in Blogging & feeds | Permalink | Top of page | Blog Home

Post vs Page

The key difference and benefit of blogs lies in a paradigm shift from the regular web. The web is based on a "page" paradigm, whereas blogs are based on a "post" paradigm. Each post has a permanent address or link, its "permalink". How those permalinks are displayed on any given page is secondary; the important unit of definition is the post.
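
To make the shift concrete, here is a minimal sketch of the post paradigm in Python; the class, fields and URL scheme are hypothetical illustrations, not any particular blogging tool's API:

# The "post" paradigm: every post carries its own permanent address,
# independent of whichever pages it happens to be displayed on.
from dataclasses import dataclass
from datetime import date

@dataclass
class Post:
    post_id: int
    title: str
    published: date

    def permalink(self, base_url: str) -> str:
        # The archive location never changes, so the link never breaks,
        # even after the post scrolls off the front page.
        return f"{base_url}/{self.published:%Y/%m}/{self.post_id}.html"

p = Post(42, "Post vs Page", date(2003, 8, 16))
print(p.permalink("http://example.com/archive"))
# -> http://example.com/archive/2003/08/42.html

The front page, category pages and archives are then just different views assembled from the same posts; the permalink, not the page, is the stable address.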

August 16, 2003 at 07:15 PM in Blogging & feeds | Permalink | Top of page | Blog Home

Power failure status Aug 16th, 2003

The power failure seems to be over in the US today, with New York State claiming 100% of power restored. Canada, on the other hand, is running at about 60%, so we have some way to go. It is still expected that we will get some blackouts.

August 16, 2003 at 03:57 PM in Web lifestyle, World Affairs | Permalink | Top of page | Blog Home

Viruses, hackers hit 1/3 of Net users

Aug. 12: Nearly 32 percent of Internet users surveyed in mid-July said they had been affected by a hacker or computer virus in the past two years. About 43 percent of them said they felt vulnerable on their home computers, while 17 percent felt they were vulnerable from viruses and hackers at work.

Full Story

It seems the frequency and depth of virus problems are overwhelming. Yet no one uses the Internet more than I do -- I am online 24 hours a day -- and I never catch anything, including the latest MS Blaster worm. Yet I never actually do anything to protect myself each day; there's no need, because it's automatic. It turns out my saviour is auto-update for Windows. I also have auto-update for McAfee.

In my organisation there have been 20,000 cases of the Blaster worm, because no one has auto-update turned on ... of course they can't, because they don't have Admin access. So the very thing designed to address user-created problems hinders one source of a fix. But that is not enough in and of itself ... most wouldn't know how to turn it on anyway, so I blame my organisation for not making auto-update part of the "locked down" computer image.

The bottom line is that viruses are here to stay, so organisations need to address them, and Microsoft does too. Home users assume their computers are safe and shouldn't have to fix problems themselves.

August 16, 2003 at 12:31 AM in Virus | Permalink | Top of page | Blog Home

August 15, 2003

August 14th, 2003 - 50 Million suffer power outage but telephone works!

This was interesting. It was a real crisis that really brought home how much we count on electricity. Let's go beyond the obvious -- air conditioning, power garage doors and traffic lights all being out. This BBC story (Net survives power outage) about the Internet surviving is irrelevant to me as a user. It's great for the high priests who monitor Internet infrastructure, but meaningless to the user.

Telephone ... I couldn't get onto the 'net and my BlackBerry was sporadic, but the darned phone worked, except for one outage of a few minutes (which I think was specific to my condo, not general).

Electricity is needed for:
1) Wireless router
2) High-speed modem
3) Desktop computer
4) Laptop (works fine on battery, but without 1 and 2 it's useless)

We underestimate the telephone ... it's a hardware device that sits there quietly, but it works without power in my home. Why can't the Internet do that? Why can't the Internet get into my house the way the telephone does, without power? That's a new problem, and one the telephone company should solve.

August 15, 2003 at 10:51 PM in Web lifestyle, World Affairs | Permalink | Top of page | Blog Home

August 05, 2003

AOL Launches Advanced E-mail, Messaging Product

Hidden in this article is a note that AOL is under federal investigation for inflating its user numbers. It also mentions that their dial-up base is experiencing a "sharp decline". This confirms my earlier comments re "AOL is dead".

AOL Launches Advanced E-mail, Messaging Product
Tue Aug 5, 12:21 AM ET - Reuters Internet Report


NEW YORK (Reuters) - America Online on Tuesday launched stand-alone, advanced e-mail and instant messaging software as the AOL Time Warner Inc. (NYSE: AOL) unit tries to offer more choice in hopes of stemming subscriber defections.

The product launch comes as America Online also rolls out the latest version of its Internet service, AOL 9.0 Optimized, and as it tries to curtail the sharp decline in its dial-up subscriber base while trying to woo high-speed users.


The company said the stand-alone product, AOL Communicator, can consolidate e-mail from multiple accounts into a single application and offers more flexible e-mail management and better spam filters. It will be free to AOL subscribers.


The release comes at a time America Online faces federal probes into its accounting of advertising deals, questions about discounted bulk sales of Internet subscriptions to its marketing partners in 2001 and concerns about its shrinking dial-up subscriber base.


The SEC recently requested documents related to the bulk deals, a person close to the company said last week.


Analysts have been encouraged by early peeks at AOL 9.0, calling it one of the biggest changes in the service in years and a step in the right direction, but they said it is unlikely to be enough to offset the company's near-term woes.

August 5, 2003 at 11:37 AM in Web lifestyle | Permalink | Top of page | Blog Home

August 04, 2003

MP3 music goes hi-fi

The Slimp3 from the UK is the latest device that lets MP3s be played on a normal stereo. Basically, it picks up the MP3s from the computer and streams them to the stereo. This requires special server software on the computer, which has to stream the music seamlessly.
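
The software side of such a device amounts to making the computer's music folder reachable over the network. As a rough illustration only -- not the Slimp3's actual server software or protocol -- a few lines of Python can serve a folder of MP3s over HTTP so a networked player can stream them; the folder path and port here are hypothetical:

# Sketch: expose a music folder over HTTP for a network audio player.
from http.server import HTTPServer, SimpleHTTPRequestHandler
import functools
import os

MUSIC_DIR = os.path.expanduser("~/music")  # hypothetical music folder

class Mp3Handler(SimpleHTTPRequestHandler):
    def guess_type(self, path):
        # Report the right content type so players treat files as audio.
        if str(path).endswith(".mp3"):
            return "audio/mpeg"
        return super().guess_type(path)

# Serve files out of MUSIC_DIR (the directory= argument needs Python 3.7+).
handler = functools.partial(Mp3Handler, directory=MUSIC_DIR)

# A player on the stereo would then fetch http://<computer-ip>:8000/song.mp3
HTTPServer(("", 8000), handler).serve_forever()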

August 4, 2003 at 09:50 AM in Web lifestyle | Permalink | Top of page | Blog Home