A really neat OLED keyboard
This is a really neat keyboard.
There was a time when SpamCop’s bl.spamcop.net was a useful blocklist. I say “was” because bl.spamcop.net is getting so many false positives now (several a day, from a variety of unrelated and non-spamming servers) that it has become more of a liability than a service, and I have removed it from my list of blocklists. I now strongly recommend others do the same.
I reported this problem to deputies@spamcop.net, including a sample of a dozen or so of the false positives, a week or so ago. I have received no reply. For reference, here are just a few of the erroneously blocked servers:
SpamCop as a service is still useful, because it allows me to selectively filter using a variety of blocklists. So I will definitely be keeping my accounts. But I will no longer be using the unreliable bl.spamcop.net blocklist, and therefore there is no longer any reason to take the trouble to “report” spam to SpamCop.
Wow. It looks like the Feds are slowly — oh, soooo slowly — creeping into the late 20th century:
Agencies setting up sensitive virtual private networks now have an open-source alternative.
The National Institute of Standards and Technology has certified OpenSSL, an open-source library of encryption algorithms, as meeting Federal Information Processing Standard 140-2 Level 1 standards, according to the Open Source Software Institute of Hattiesburg, Miss.
“This validation will save us hundreds of thousands of dollars,” said Debora Bonner, operations director for the Defense Department’s Defense Medical Logistics Standard Support program, in a statement. “Multiple commercial and government entities, including [the Defense Department’s] Medical Health System, have been counting on this validation to avoid massive software licensing expenditures.”
Federal agencies must use FIPS-compliant products to secure networks carrying unclassified sensitive data. The FIPS certification of OpenSSL opens the possibility of using an SSL-based VPN to carry sensitive data, according to Peter Sargent, who heads the Severna Park, Md.-based PreVal Specialist Inc., one of the companies that supported the validation process.
Traditionally, agencies wishing to set up a VPN for sensitive data would use an approach that involved a secret key implementation of a cryptographic module, which is more expensive to implement and has limited the number of smaller companies that can provide such a product, Sargent said.
(from Government Computer News, OpenSSL gets NIST certifications)
This is great news for Federal agencies. And you would think that switching to OpenSSL would be a no-brainer, right? After all, out here in the real world, we’ve been relying on it for years. However, there is nothing so simple and easy that the Feds can’t find a way to screw it up:
Sargent added that few agencies would directly deploy OpenSSL FIPS. Rather, they would purchase OpenSSL-based VPN products from vendors.
“Yes, yes: I know we could get sunlight for free. But we’d rather pay for it. This tax money isn’t going to spend itself, you know.”
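As an aside: out here in the real world, OpenSSL already sits underneath everyday tooling. Python’s ssl module, for instance, is a thin wrapper over the OpenSSL library. A minimal sketch of setting up a modern client-side TLS context (illustrative only, not an agency-grade FIPS deployment):

```python
import ssl

# Python's ssl module links against OpenSSL, so every TLS context
# created here is OpenSSL doing the actual cryptographic work.
print("Linked against:", ssl.OPENSSL_VERSION)

# A client context with sane defaults: certificate verification
# and hostname checking are both enabled out of the box.
ctx = ssl.create_default_context()
print("Verify mode:", ctx.verify_mode)        # CERT_REQUIRED
print("Check hostname:", ctx.check_hostname)  # True
```

Note that the secure behavior is the default; nobody had to pay a vendor for it.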
The first discussion draft of the GNU General Public License was finally released on Monday, and addresses the issues of patents and patent-related retaliation, as well as its compatibility with other licenses.
Richard Stallman, the founder of the Free Software Foundation and author of the original license, was the first to take the floor here at the First International Conference on GPLv3 at MIT (the Massachusetts Institute of Technology), to express his vision for the new license. […]
The biggest changes to the license were in the area of license compatibility, removing the obstacles that prevented it from being combined with code from other free software packages. […]
The other biggest changes to the license were regarding the issue of DRM (Digital Rights Management), which was seen as denying users the freedom to control the software they had.
“DRM is a malicious feature and can never be tolerated, as DRM is fundamentally based on activities that cannot be done with free software. That is its goal and it is in direct opposition to ours. But, with the new GPL, we can now prevent our software from being perverted or corrupted,” he said.
A patent license grant has now also been included, as well as a narrow kind of patent retaliation clause. “If person A makes a modified version of a GPL-covered program and then gets a patent on that and says if anyone else makes such a modified version, they will be sued, he then loses the right to make any modifications, meaning he can’t commercially use his software,” Stallman said.
While the license does not require that the modified version be released, it does ensure that others would not be prevented from writing similar modifications under the license, he said.
(from eWeek, GPL 3.0 Draft Tackles Patents, Compatibility)
These sound like good changes to me. If you want more detail (and you should), you can read the full text of the GPL 3.0. You may also want to check out the Free Software Foundation’s rationale for the changes in GPL 3.0.
Taiwan’s parliament has voted to end its dependence on Microsoft software, demanding that the government reduce purchases from the software giant by 25 percent this year.
The resolution, passed on Friday, is an attempt by the island’s law-making body to end the near monopoly Microsoft has with local government offices, a legislative aide said.
Mozilla released on Thursday its updated e-mail application, Thunderbird 1.5, which is designed to deliver improved security and functionality.
Thunderbird 1.5, which can be downloaded for free, has been retooled to offer improvements in four main areas: updates, security, RSS and podcasting.
“Thunderbird enhances the overall e-mail experience, adding antiphishing capabilities to help keep people safer, while also integrating and simplifying access to new technologies, such as RSS,” Christopher Beard, Mozilla’s vice president of products, said in a statement.
The e-mail client also features automated updates, designed to download security and product upgrades to people’s systems and then to prompt them when ready for installation.
In another effort to bolster security, Thunderbird 1.5 is designed to push e-mail through a finer spam filter. Last fall, Mozilla released an update for its Thunderbird 1.0.7 that plugged several security holes.
Thunderbird 1.5 also aims to bolster its RSS support by letting people receive feed updates as e-mail messages. People can now access podcasts through a dialog box, which is tied to an application such as a Web browser or audio player.
Mozilla’s new release also includes productivity enhancements, such as spellcheck as e-mail is being written and an ability to delete attachments from e-mail.
Thunderbird has been downloaded 18 million times since it debuted in December 2004, Mozilla said.
Unfortunately, Thunderbird is missing three essential features that prevent me from switching away from MS Outlook (and I would love to be able to switch away from Outlook, and remove the last vestiges of MS Office from my computers):
On Tuesday, Novell announced the creation of the AppArmor project, a new GPL open-source project dedicated to advancing Linux application security.
Novell Inc.’s AppArmor is an intrusion-prevention system that protects Linux and its applications from the effects of attacks, viruses and malicious applications.
AppArmor is based on technology that Novell acquired from Immunix, a leading provider of host-based application security solutions for Linux, when it purchased the company in May 2005.
AppArmor works by “application containment.” In this approach, the interactions between applications and users are monitored for possible security violations. This “has emerged as a favored way to protect applications from compromise and to protect applications from one another,” observed Al Gillen, research director of system software at IDC when Novell acquired Immunix.
How these interactions are monitored is set by policies. The commercial version comes with predefined security policies for Web server applications such as the Apache Web server, the Postfix and Sendmail email servers, the MySQL DBMS (database management system), and the Samba file and print server.
Novell has donated the core components of its AppArmor framework to provide a foundation for the project. The GPLed code will be available on OpenSUSE.org.
(from LinuxWatch, Novell open sources major Linux security program)
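To make the policy approach described above concrete, an AppArmor profile is just a plain-text list of the paths and permissions a given application is allowed. This sketch is illustrative (the binary name and paths are hypothetical, not a profile that ships with the product):

```
# Hypothetical profile confining a web server binary
/usr/sbin/httpd-example {
  #include <abstractions/base>

  /etc/httpd-example/** r,         # read its own configuration
  /var/www/** r,                   # serve content read-only
  /var/log/httpd-example/*.log w,  # append to its own logs
  network inet tcp,                # TCP networking only
}
```

If the confined server is compromised, it still can’t touch anything outside the paths and capabilities listed, which is the whole point of application containment.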
When I get a message from a mailing list, and I reply to that message, the message I send should automatically be addressed to that list. That is how it ought to work. Some people, for reasons which do not make sense and which have never made sense, think that you should need to manually address replies to mailing lists (or hit a special “reply to list” button available in some email clients). That’s idiotic.
Rather than go through all of the arguments (not that arguments should be needed for such a self-evident issue), I present the following brief article:
Reply-To Munging Considered Useful
Let me be clear about this: I have been using email since the 1980s. I know what the relevant RFCs and issues are. A reply to an email from a mailing list should go to that list. Period.
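Concretely, here is what a sensibly munged message looks like as delivered by list software (addresses hypothetical):

```
From: alice@example.com
To: discuss@lists.example.org
Subject: [discuss] Widget question
Reply-To: discuss@lists.example.org
List-Post: <mailto:discuss@lists.example.org>
```

With the list software setting Reply-To, a plain “Reply” in any email client on earth goes back to discuss@lists.example.org, exactly where the conversation is happening.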
Eight weeks after 2.0, our first update remedies minor bugs and brings new features. For example, it is now possible to disable and hide particular application settings, which comes in handy for central administration in networks. Plus, a new keyboard shortcut permits the user to return to a saved cursor position. The bullets and numbering feature has been expanded, and a new mail merge feature is available.
(from OpenOffice.org)
OpenOffice 2.0.1 also offers improved compatibility with obsolete office suites, such as Microsoft Office. Check it out.
It took two centuries to fill the U.S. Library of Congress in Washington, D.C., with more than 29 million books and periodicals, 2.7 million recordings, 12 million photographs, 4.8 million maps, and 57 million manuscripts. Today it takes about 15 minutes for the world to churn out an equivalent amount of new digital information. It does so about 100 times every day, for a grand total of five exabytes annually. That’s an amount equal to all the words ever spoken by humans, according to Roy Williams, who heads the Center for Advanced Computing Research at the California Institute of Technology, in Pasadena.
While this stunning proliferation of information underscores the ease with which we can create digital data, our capacity to make all these bits accessible in 200 or even 20 years remains a work in progress.
[ . . . ]
Like most difficult challenges, data preservation is really a mix of the simple and the complex. At one end of the preservation continuum is a simple item, like an ASCII text document. Preserve the data by keeping the file on current media, provide some way to view it, and you’re pretty much done. We’ll call this the “save the bits” approach.
At the other end lie the harder cases, like these:
A compiled software program written in a custom-built programming language for which neither the language documentation nor the compiler has survived.
A complex geospatial data set developed for the U.S. Geological Survey in a proprietary system made by a company that went out of business 20 years ago.
A Hollywood movie created with state-of-the-art encryption to prevent piracy, for which the decryption keys were lost.
For these three items, we don’t hold out much hope of being able to preserve the content forever. For the software program and the geospatial data set, the digital archeologists of the future probably won’t have enough information about how the software and data set were created or the language they were created in—no Rosetta Stone, as it were, to translate the bits from lost languages to modern ones.
As for that encrypted movie, our archeologists might have read old reviews that raved about the special effects in Sin City, but this cinematic achievement will remain locked away until someone pays a lot of money to a master of ancient cryptology to crack the key.
Fortunately, many content types fall between these difficult cases and ASCII text. Usually, saving the bits using standard, well-documented data, video, and image formats, such as XML, MPEG, and TIFF, gets you halfway to an enduring digital archive. Put another way, the goal is to avoid formats that require proprietary software, such as AutoCAD or QuarkXPress, to play or render the data.
(from Spectrum Online, Eternal Bits: How can we preserve digital files and save our collective memory?)
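The article’s triage rule (open, well-documented formats are safe; proprietary ones are a preservation risk) can be sketched in a few lines. The extension lists here are illustrative, not exhaustive:

```python
# "Save the bits" triage: flag files whose formats depend on
# proprietary software. AutoCAD (.dwg) and QuarkXPress (.qxd) are
# the two examples named in the article; the rest are assumptions.
OPEN_FORMATS = {".txt", ".xml", ".tif", ".tiff", ".mpg", ".mpeg"}
PROPRIETARY = {".dwg", ".qxd"}

def preservation_risk(filename: str) -> str:
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext in OPEN_FORMATS:
        return "low: standard, well-documented format"
    if ext in PROPRIETARY:
        return "high: requires proprietary software to render"
    return "unknown: investigate before archiving"

print(preservation_risk("census.xml"))     # low
print(preservation_risk("blueprint.dwg"))  # high
```

A real archive would inspect file contents rather than trust extensions, but the policy itself really is this simple.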
Go read the whole article. It’s interesting stuff.
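The article’s headline numbers check out, by the way. A quick sanity check (all inputs are figures from the quoted text; the per-batch size is a derived estimate):

```python
# One Library-of-Congress-equivalent every 15 minutes...
minutes_per_day = 24 * 60
batches_per_day = minutes_per_day / 15
print(batches_per_day)            # 96.0 -- the article's "about 100 times every day"

# ...and five exabytes per year implies each LoC-equivalent batch
# is on the order of 140 terabytes.
exabyte = 10**18
annual_bytes = 5 * exabyte
per_batch = annual_bytes / (batches_per_day * 365)
print(per_batch / 10**12)         # roughly 142.7 TB per batch
```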