Brad Templeton's
USENET-Format Pages

Why do we still have USENET?

Currently, USENET is designed so that people must post articles via an "injector" site, which then sends the article out to its neighbours in a flood. The injector is usually a machine at the user's institution or ISP, and it doesn't let outsiders inject articles.
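
As a rough sketch of what injection looks like on the wire (the server name, credentials and newsgroup below are placeholders, and Python's nntplib module has been removed from recent Python releases), posting is just an authenticated NNTP session that ends with a POST:

    import nntplib

    # The injector: typically a news server run by your ISP or institution,
    # which refuses posts from outsiders. All names here are placeholders.
    server = nntplib.NNTP('news.example-isp.com')
    server.login('alice', 'secret')   # many injectors require authentication

    # A minimal article: headers, a blank separator line, then the body.
    lines = [
        b'From: alice@example.com (Alice)',
        b'Newsgroups: misc.test',
        b'Subject: A test post through an injector',
        b'',
        b'Hello, world. The injector will relay (flood) this article',
        b'to its neighbouring servers.',
    ]

    # POST hands the article to the injector; propagation is the server's job.
    server.post(lines)
    server.quit()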

Articles propagate through servers, and people then tend to read from those servers via NNTP. When USENET started, people read news directly on the server from shell accounts, but that's rare today. While the norm is that the NNTP server should be very close to the user (at the ISP, or right on the LAN of a school or corporation), it's also common for sites to outsource their USENET reading to a company with central servers.
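
For comparison, a reading session is equally simple. A minimal sketch, again with a placeholder server and group, using the same nntplib module:

    import nntplib

    # Connect to the nearby news server (placeholder hostname).
    server = nntplib.NNTP('news.example.com')

    # Select a group; the server reports the range of article numbers it holds.
    resp, count, first, last, name = server.group('comp.lang.python')

    # Fetch overview data (subject, author, references) for the last few articles;
    # this is what a newsreader uses to build its thread display.
    resp, overviews = server.over((max(first, last - 9), last))
    for number, fields in overviews:
        print(number, nntplib.decode_header(fields['subject']))

    # Fetch one full article by number, then close the session.
    resp, info = server.article(str(last))
    print(b'\n'.join(info.lines).decode('utf-8', errors='replace'))

    server.quit()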

There were open injectors in the past, but abuse by spammers pretty much shut them all down. Spammer abuse of local injectors has also caused them to set all sorts of policies and rules on their use, and almost all insert identifying information in an effort to stop spam, including the IP address, which in the case of people with static IP addresses (or very long-term DHCP leases) often effectively identifies the posting user.
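
For illustration only (the hostname and address are made up, and the exact headers vary by server software), an injector typically stamps an outgoing article with trace headers along these lines:

    Path: news.example-isp.com!not-for-mail
    NNTP-Posting-Host: 203.0.113.45
    X-Trace: news.example-isp.com 1068412800 2842 203.0.113.45

A static address in such a header ties every post back to the same subscriber, which is exactly the identifying effect described above.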

With this mix of actions, it's worth asking what USENET is, how it's different from other conferencing systems, and why we still use its fairly ancient, effectively pre-internet design.

I think those reasons are:

Better readers
Hands down, USENET readers, designed for reading large-volume discussions, are vastly superior to just about anything else people use. They keep track of what you have and haven't read, and all your preferences, and they do it privately on your own machine. They have high-speed thread tracking, complex thread navigation, and many have complex filtering and killfiling (a sketch of killfile filtering appears after this list). They have all the advantages of a local client (and all the disadvantages), but those advantages are many. Mailing lists have decent readers, but none as good as a USENET reader.
Local access
While, as noted, some people are moving away from this, nothing beats local high-speed access if you have it. Response times are near instantaneous compared to remote web fetches, and once you have it you can't go back. The NNTP session is a persistent session, with state, allowing better response time even when the server is more remote. But if it's on your own LAN, or nearby on a DSL or cable line, it can't be beat. This comes, of course, at the cost of pre-feeding all articles to the local spool, even if nobody will ever read them. As soon as one person reads an article, feeding it in advance was efficient, and once multiple people read it, it was very efficient. One can fetch on demand and cache, but that takes away the speed of local access for the first reader. You access most mailing lists truly locally, but in a more limited fashion.
Offline reading
For a small and shrinking percentage of users, USENET's older design allows offline participation, or participation through a link to a site with intermittent connectivity. Even for those who don't view offline reading as a goal, it's a plus that the USENET experience is largely unaffected if their local site temporarily loses internet connectivity or has a slow connection.
Decentralized efficiency
USENET rarely gets slow because a newsgroup gets hot. While many web sites have been brought to their knees by traffic around big events like the Olympics or a terrorist attack, USENET performance is normally completely unaffected.
Decentralized control
We like that, on the whole, nobody is in charge of USENET. Those who do express authority do so at the pleasure of the whole community. USENET remains a cooperative owned and run by the operators of servers, and if you have the servers, it's not hard to join that co-op. This decentralization provides more freedom of speech and more privacy, and spreads the cost. It also allows more abuse and spam, and more chaos, and in areas where innovation means getting everybody to agree on a new feature, it counter-intuitively stifles innovation.
Legacy & Community
Sometimes we read USENET because it is the venerated king. It's where certain communities gather and have gathered for a long time.
Simple experience
USENET remains largely plain, monospaced, 80-column text. This keeps discussions focused on the text, not on formatting and other features.
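
As a rough sketch of the killfile filtering mentioned under "Better readers" above (the rules, headers and addresses are invented for illustration; real newsreaders such as trn, slrn or Gnus have far richer scoring languages):

    import re

    # A toy killfile: each rule is a (header, pattern) pair. An article whose
    # header matches any rule is hidden before the user ever sees it.
    KILLFILE = [
        ('from',    re.compile(r'spammer@example\.com', re.I)),
        ('subject', re.compile(r'make money fast', re.I)),
    ]

    def killed(headers):
        """Return True if an article's headers match any killfile rule."""
        return any(pattern.search(headers.get(field, ''))
                   for field, pattern in KILLFILE)

    # Decide whether to display a fetched article (headers as a simple dict).
    article = {'from': 'alice@example.com', 'subject': 'Re: why NNTP still matters'}
    if not killed(article):
        print('show article:', article['subject'])

The point, as above, is that all of this state lives privately on the user's own machine rather than on a central server.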

Of course, many of those advantages are also disadvantages and why people have moved away from USENET to things like web forums, online services and mailing lists.

Better readers
USENET's readers are better, but effectively the same as they were 15 years ago. Other readers have innovated in ways that USENET never embraced, for a variety of reasons: richer text formats, integration with the web, coordinated help systems, etc. Also, web-based tools offer users a tool they already know -- the browser. And while USENET is much older than the browser, today vastly more people know the browser.
Lots of decentralized pockets of centralized control
Online services and web boards are fiefdoms under the complete control of their owners. That means they care about them, take care of them and innovate in them. They stop spam and abuse, and create and experiment with new rules. They can also be tyrants if they wish to be. At the same time, there are thousands of web-based online forums, all with different owners. That means a lot of competition and desire to please the user to grow the community. USENET has none of that. Creating a new web forum or mailing list is something anybody can do. They don't have to ask permission of the community or anybody else. They can try any new technology, message format or regulatory regime in their area. USENET, on the other hand, is extremely resistant to change. The ability to innovate has allowed web boards to support things like user moderation and scoring, partial moderation (where the owners define the topics but let the discussion run wild), and combining discussion with resources, help, polls, file downloads and the like. Some web boards protect privacy, others reduce it. There are many, many choices.
In web search
Due to the design of search engines, web message boards show up in most web searches. USENET wasn't even indexed until 1995, and it has always required a special search to find USENET postings, though Google has made trying both fairly easy. Nonetheless, if people go looking for an online community with a search engine, they will not find USENET ones first.
Community
In the end, if you are looking for a specific community, you'll go where it is regardless of the software. Web forums now own many topic spaces, even when there are also competing USENET areas. So do some mailing lists.
Richer text
Web forums allow a richer experience and better user interface, while of course allowing the richness to be overdone. However, USENET doesn't even offer something as simple as foldable text with defined paragraph breaks so the reading window can be resized to taste, or a formal specification of included text, hypertext links and signatures. Nobody would be against those things, but a path to providing them was never clear inside USENET, even though the MIME spec defined an inefficient one. Web boards have much more control of what the user sees and can show faces, structure and many other things.
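
For context, one MIME mechanism along these lines (offered purely as an illustration, not necessarily the one meant above) is the format=flowed parameter of RFC 3676, which marks plain text as reflowable: a line ending in a space may be joined with the line that follows it, so a reader can re-fold paragraphs to any window width.

    Content-Type: text/plain; charset=us-ascii; format=flowed

    Each wrapped line of this paragraph ends with a trailing space, 
    so a conforming reader may re-wrap it to the window's width. 
    A line with no trailing space ends the paragraph.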

Is there a way to keep what's valuable about USENET, fix what's wrong and integrate what's been learned from other systems and their strengths and weaknesses?

(Mostly) End to End USENET

Answer to come later