Blackmoor Vituperative

Monday, 2014-03-03

Ruminations on web design and system administration

Filed under: Programming,Work — bblackmoor @ 10:18

Now that the Kickstarter is over, I can go back to talking about other things. For example, how happy I am that I am no longer working in web design. The work I would like to do, in decreasing order of preference, is:

  • system administration
  • database administration
  • back-end programming (i.e., not JavaScript)
  • project management
  • front-end programming (i.e., JavaScript)
  • web design

There are reasons why web design is at the bottom of the list. The biggest one is that the people who pay to have that done are too often operating under the false assumption that they know how to do it, and that they just need someone else to do the grunt work of actually using the software. Oatmeal has a pretty funny cartoon on what that’s like for a web designer.

That’s an exaggeration, of course. I am lucky that back when I did web design as my primary profession, I very rarely had clients quite that clueless. A more frequent occurrence was the “we need to Do Something” problem. Smashing Magazine has a pretty decent article on that, but if you have been a user of Yahoo Groups or Facebook for any length of time, you have seen that phenomenon in action.

System administration is at the top of the list for even better reasons. For one thing, I simply enjoy it. I like making things work. It’s like working on a car and getting it to run smoothly, but you don’t bang your knuckles or get your hands dirty. Also, success is generally objective: if the system works, that’s success. None of the “that color is too aggressive” type feedback you get when doing web design (I actually had a client say that phrase to me). Of course, there are some subjective measurements of success, even in system administration. For example, you can continue throwing time and money at a database server to increase performance, and the point at which the performance is good enough is a subjective call. Even so, generally speaking, the line between “working” and “not working” is pretty clear. I like that.

Wednesday, 2013-11-20

Peer Review Lessons from Open Source

Filed under: Programming — bblackmoor @ 09:43

I recently found an article in the Nov/Dec 2012 issue of IEEE Software that sounded interesting, “Contemporary Peer Review in Action: Lessons from Open Source Development” (Rigby, P., Cleary, B., Painchaud, F., Storey, M., & German, D. (2012). Contemporary Peer Review in Action: Lessons from Open Source Development. IEEE Software, 29(6), 56–61).

The authors examined approximately 100,000 peer reviews from open source projects, including the Apache httpd server, Subversion, Linux, FreeBSD, KDE, and Gnome. They compared these to the more formal methods of software inspection and quality control traditionally used in complex, proprietary (non-open-source) projects.

The open source reviews are minimal, and reviewers self-select the sections they will review. As a result, people review the sections of code they are most competent to review (or at least, most interested in reviewing). The formal code inspections used in proprietary projects are cumbersome, and the reviewers are assigned their sections, which means they are often unfamiliar with the code they are reviewing. The open source peer reviews are completed more efficiently and are more likely to catch non-obvious errors, but they lack traceability.

As a result of their research and analysis, the authors draw five lessons from open source projects that can benefit proprietary projects.

  1. Asynchronous reviews: Asynchronous reviews support team discussions of defect solutions and find the same number of defects as co-located meetings in less time. They also enable developers and passive listeners to learn from the discussion.
  2. Frequent reviews: The earlier a defect is found, the better. OSS developers conduct all-but-continuous, asynchronous reviews that function as a form of asynchronous pair programming.
  3. Incremental reviews: Reviews should be of changes that are small, independent, and complete.
  4. Invested, experienced reviewers: Invested experts and codevelopers should conduct reviews because they already understand the context in which a change is being made.
  5. Empower expert reviewers: Let expert developers self-select changes they’re interested in and competent to review. Assign reviews that nobody selects.

The authors go on to make three specific recommendations:

  1. Light-weight review tools: Tools can increase traceability for managers and help integrate reviews with the existing development environment.
  2. Nonintrusive metrics: Mine the information trail left by asynchronous reviews to extract light-weight metrics that don’t disrupt developer workflow.
  3. Implementing a review process: Large, formal organizations might benefit from more frequent reviews and more overlap in developers’ work to produce invested reviewers. However, this style of review will likely be more amenable to agile organizations that are looking for a way to run large, distributed software projects.
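The “nonintrusive metrics” recommendation above is the most concrete of the three, so here is a minimal sketch of what mining an asynchronous review trail might look like. The event format and function name are my own illustration, not anything from the paper: the idea is simply that timestamps the review process already produces can yield a metric (time to first reviewer response) without asking developers to do anything extra.

```python
from datetime import datetime

def hours_to_first_response(events):
    """Given review events as (timestamp, kind) pairs sorted oldest-first,
    where kind is 'patch' or 'comment', return the hours from the patch
    being posted to the first reviewer comment, or None if none followed."""
    posted = None
    for ts, kind in events:
        if kind == "patch" and posted is None:
            posted = ts
        elif kind == "comment" and posted is not None:
            return (ts - posted).total_seconds() / 3600.0
    return None

# A hypothetical mailing-list thread: patch posted at 9:00, first review at 13:30.
events = [
    (datetime(2013, 11, 1, 9, 0), "patch"),
    (datetime(2013, 11, 1, 13, 30), "comment"),
    (datetime(2013, 11, 1, 15, 0), "comment"),
]
print(hours_to_first_response(events))  # 4.5
```

A real implementation would pull these timestamps from a mailing-list archive or review tool database, but the point stands: the metric falls out of data the workflow already generates.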

To be honest, I don’t have enough experience to have an informed opinion on these recommendations as they pertain to complex, proprietary projects. Virtually all of the projects I have worked on have been distributed, open-source projects, and nearly all of those had less peer review than I think they should have. That being said, the authors’ recommendations and the “lessons” on which they’ve based them seem reasonable to me, and do not contradict my own experience.

Wednesday, 2013-11-13

Remembering Xanadu

Filed under: Programming,The Internet,Work — bblackmoor @ 23:05

I was reminded recently of an interesting article from the June 1995 issue of Wired magazine. I subscribed to Wired back then: this was during the early days of the internet, while the 1990s tech bubble was inflating like gangbusters. The article is “The Curse of Xanadu”, by Gary Wolf.

It was the most radical computer dream of the hacker era. Ted Nelson’s Xanadu project was supposed to be the universal, democratic hypertext library that would help human life evolve into an entirely new form. Instead, it sucked Nelson and his intrepid band of true believers into what became the longest-running vaporware project in the history of computing – a 30-year saga of rabid prototyping and heart-slashing despair. The amazing epic tragedy.

The article begins with a brief description of the mind behind Xanadu, Ted Nelson. He is described as a very smart man with many ideas, but who has difficulty finishing his projects. Later in the article, we learn that Nelson has an extreme case of Attention Deficit Disorder.

The article then goes on to describe the goals of the Xanadu project, which Nelson began working on in 1965:

Xanadu was meant to be a universal library, a worldwide hypertext publishing tool, a system to resolve copyright disputes, and a meritocratic forum for discussion and debate. By putting all information within reach of all people, Xanadu was meant to eliminate scientific ignorance and cure political misunderstandings. And, on the very hackerish assumption that global catastrophes are caused by ignorance, stupidity, and communication failures, Xanadu was supposed to save the world.

Yet Nelson, who invented the concept of hypertext, is not a programmer. He is a visionary. He is also apparently immensely persuasive. He convinced people to spend millions of dollars on Xanadu (long before the tech bubble made that irrational behavior seem normal), and years working on it. And it does seem that Nelson was a true visionary. In 1969, he already foresaw that technology would “overthrow” conventional publishing, and that paper would be replaced by the screen (in his mind, it already had). But he was limited by the technology of his day. “Even [in 1995], the technology to implement a worldwide Xanadu network does not exist.” In the 1970s, “[the] notion of a worldwide network of billions of quickly accessible and interlinked documents was absurd, and only Nelson’s ignorance of advanced software permitted him to pursue this fantasy.”

In the early 1970s, Nelson worked with a group of young hackers called the RESISTORS, in addition to a couple of programmers he had hired. During this period, the first real work on Xanadu was accomplished: a file access invention called the “enfilade”. What the enfilade is or exactly how it is implemented remains a mystery: unlike another famous iconoclast, Richard Stallman, Ted Nelson did not believe that “information wants to be free”, and everyone who has worked on the project has been sworn to secrecy.

In 1974, Nelson met programmer and hacker Roger Gregory. According to the article, if Nelson is the father of Xanadu, Roger Gregory is its mother. “Gregory had exactly the skills Nelson lacked: an intimate knowledge of hardware, a good amount of programming talent, and an obsessive interest in making machines work. […] through all the project’s painful deaths and rebirths, Gregory’s commitment to Nelson’s dream of a universal hypertext library never waned.” Gregory’s tale is a sad one: it’s difficult to see his involvement in Xanadu as anything other than a tragic waste of his life.

As the years went by and the 1970s became the 1980s, Nelson continued to work on Xanadu, and Xanadu continued not to be completed. By the late 1980s, the project team had dwindled and support for it was difficult to find. Nelson and Gregory would not admit failure, although Gregory struggled with thoughts of suicide. However, in 1988 Xanadu was rescued by John Walker, the founder of Autodesk. It seemed that Xanadu would at last have the benefit of serious commercial development. “In 1964,” Walker said in a 1988 press release, “Xanadu was a dream in a single mind. In 1980, it was the shared goal of a small group of brilliant technologists. By 1989, it will be a product. And by 1995, it will begin to change the world.”

It turned out that was easier said than done.

I find it interesting that one of the technical obstacles to Xanadu’s development was due to its profoundly non-free approach to the information it would make available.

The key to the Xanadu copyright and royalty scheme was that literal copying was forbidden in the Xanadu system. When a user wanted to quote a portion of a document, that portion was transcluded, with a fee for every reading.

Transclusion was extremely challenging to the programmers, for it meant that there could be no redundancy in the grand Xanadu library. Every text could exist only as an original.

In my opinion, this philosophy of restricting information is a key reason that Xanadu failed.
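To make “transclusion” concrete, here is a toy sketch of the idea as the article describes it: a quotation is never a copy, only a reference into the single original, resolved at read time. Everything here (the names, the span representation) is my own illustration, not Xanadu’s actual design, and the real system would have had to do this across a worldwide network, which is exactly what made it so hard.

```python
# Each text exists exactly once; quoting stores a reference, not a copy.
originals = {}  # doc_id -> full text

def publish(doc_id, text):
    originals[doc_id] = text

def transclude(doc_id, start, end):
    # A stored reference into the one original (where a royalty meter
    # could hang, since every reading resolves back to the source).
    return ("ref", doc_id, start, end)

def render(parts):
    out = []
    for part in parts:
        if isinstance(part, str):
            out.append(part)
        else:  # resolve the reference against the original at read time
            _, doc_id, start, end = part
            out.append(originals[doc_id][start:end])
    return "".join(out)

publish("xanadu", "Everything is deeply intertwingled.")
quote = ["Nelson wrote: ", transclude("xanadu", 0, 10), "..."]
print(render(quote))  # Nelson wrote: Everything...
```

Even in this toy form you can see the engineering bind the article describes: with no redundancy permitted, every reader of every quotation depends on the original being permanently available.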

By the early 1990s, control of the project shifted away from Gregory and the original development team, and all of the existing code was discarded. This also made Walker’s 18-month timeline explicitly unattainable.

John Walker, in retrospect, blames the failure of Xanadu on the unrealistic goals of the (new) development team.

John Walker, Xanadu’s most powerful protector, later wrote that during the Autodesk years, the Xanadu team had “hyper-warped into the techno-hubris zone.” Walker marveled at the programmers’ apparent belief that they could create “in its entirety, a system that can store all the information in every form, present and future, for quadrillions of individuals over billions of years.” Rather than push their product into the marketplace quickly, where it could compete, adapt, or die, the Xanadu programmers intended to produce their revolution ab initio.

“When this process fails,” wrote Walker in his collection of documents from and about Autodesk, “and it always does, that doesn’t seem to weaken the belief in a design process which, in reality, is as bogus as astrology. It’s always a bad manager, problems with tools, etc. – precisely the unpredictable factors which make a priori design impossible in the first place.”

In 1992, just before the release of Mosaic and the popularization of the World Wide Web, Autodesk crashed and burned, and the pipeline of funding that kept the Xanadu project going came to an end. Ownership of Xanadu reverted to Ted Nelson, Roger Gregory, and a few other long-time supporters.

A glint of hope appeared. Kinko’s (remember Kinko’s?) was interested in funding the project for their own use. But Nelson chose this time to attempt to seize control of the project. The programmers, who had been subjected to Nelson’s attention-deficit management style, resisted. Again, Nelson’s desire for control was destructive to the accomplishment of his dream. “By the time the battle was over, Kinko’s senior management had stopped returning phone calls, most of Autodesk’s transitional funding had been spent on lawyers’ fees, and the Xanadu team had managed to acquire ownership of a company that had no value.”

There was a brief respite from an insurance company, but that too soon ended in failure. After not being paid for six months, the last few developers took the hardware and quit. “With the computers gone, Xanadu was more than dead. It was dead and dismembered.”

As of 1995 (the date of the article), Nelson was in Japan, still pushing his idea of “transclusion”, still hostile to the very freedom and chaos that have made the World Wide Web the enormous success it is. I think he’s a perfect example of how someone can be both brilliant and utterly clueless.

In 2007, Project Xanadu released XanaduSpace 1.0. There is a video on YouTube of Ted Nelson demonstrating XanaduSpace. As far as I know, that was the end of the project.

Some other links that you might also find of interest:

  • “Xanadu Products Due Next Year” — http://web.archive.org/web/20090413174805/http://calliq.googlepages.com/

Friday, 2013-08-09

I dislike Git

Filed under: Programming,Work — bblackmoor @ 10:18

Once upon a time, I did not use a source code repository at all. Unless I made a backup copy, when I changed a line of code, there was no way for me to know what it had changed from. Collaboration meant sharing files and manually figuring out how to merge our changes together. It was a mess.

Then I discovered CVS (Concurrent Versions System). It kept track of every change, and made collaboration much easier. I had no desire whatsoever to go back to the old way. Still, CVS wasn’t perfect. There were some things it did poorly, or did not do at all.


Along came SVN (Subversion). Subversion was written with the goal of replacing CVS by addressing CVS’ deficiencies, while remaining as easy to use as CVS. I loved it, and promoted it among all of my colleagues (some were quicker to migrate from CVS than others). Again, I had no desire whatsoever to go back to the old way. While not perfect, my complaints about SVN were few, infrequent, and very mild.

Along came Git. Lots of people said it was great. I didn’t see anything it did better than SVN (nothing I needed to do, anyway). I used it for a couple of projects. It was really, really complicated. Ordinary day-to-day use was about twice as complicated as SVN, to do the exact same thing. More complicated tasks, such as maintaining a vendor branch (one of the tasks I perform periodically), were four times as complicated in Git.

What is the appeal of Git? Why is there such a bandwagon of people promoting it? Supposedly Git handles merges better than SVN does, but I haven’t seen it. Supposedly Git is better because it’s “distributed”. Frankly, I think that makes it worse.

Left to my own devices, I would like to go back to Subversion, but anyone who writes code for a living does not do so in a vacuum. If the team uses Git, you use Git. Life goes on.

Thursday, 2012-03-22

It’s called Basecamp

Filed under: Programming — bblackmoor @ 22:44

What do you call a project management tool that doesn’t have any way to set task status, doesn’t have any way to set task prerequisites or dependencies, doesn’t have any form of time tracking, doesn’t have a way to set task priority, doesn’t have a way to move a task from one category to another, doesn’t have Gantt charts, and quite simply doesn’t have any of the fundamentally essential features of a project management system?

It’s called Basecamp. And they charge money for this garbage, believe it or not. Even more astonishing, people actually pay for it.

Monday, 2011-06-20

Security cheat sheets from Veracode

Filed under: Programming,Security — bblackmoor @ 09:26

I ran across a set of tutorials and cheat sheets for a few of the more common security vulnerabilities this morning. I thought other people might find them useful. They’re from a company called Veracode. The guides are free, and they point to other free resources if you want to learn more, so they seem to be a pretty good starting point if you are interested in this sort of thing.

Thursday, 2011-04-21

Microsoft gets Novell’s Patents rights but must share them with Open-Source Software

Filed under: Intellectual Property,Linux,Programming — bblackmoor @ 09:17

In response to pressure from the U.S. Department of Justice and Germany’s Federal Cartel Office (Das Bundeskartellamt), Microsoft and its CPTN Holding Partners — Apple, EMC, and Oracle — have revised their agreements so that the Novell patents will be under both GPLv2 and Open Innovation Network protection.

So what does it all mean? Andrew “Andy” Updegrove, founding partner of Gesmer Updegrove, a top technology law firm, said, “This is a rather breath-taking announcement from a number of perspectives. Among others, the granularity of the restrictions imposed demonstrates a level of understanding of open source software in general, and Linux in particular, that has not been demonstrated by regulators in the past. It also demonstrates a very different attitude on the part of both the U.S. and German regulators, on the one hand, and Microsoft, on the other, from what we saw the last time that Microsoft was under the microscope. In the past, Microsoft was more disposed to fight than negotiate, and the U.S. and the European Commission were far apart in their attitudes. This announcement conclusively places open-source software on the U.S. regulatory map.”

(from Microsoft gets Novell’s Patents rights but must share them with Open-Source Software, ZDNet)

I think this is a really interesting development. Interesting in the sense that it’s not antagonistic to consumers and developers, and that it’s not what I predicted, or even guessed might happen.

Friday, 2010-06-11

A tale of two deadbeats

Filed under: Programming,The Internet,Work — bblackmoor @ 20:42

I currently am owed about $4000 from two clients that haven’t paid. One paid half up front for a web site, and I have been trying for a month to turn the web site over to them and get the other half of my payment, but they keep putting me off. The other client, for whom I did some programming work, bounced a $2000 check three weeks ago; they have promised to pay that and the rest of what they owe, but haven’t paid yet, and never answer their phone or email.

I am pretty close to shutting down the first client’s web site, and turning over the second client to a collection agency. I think I will wait until Monday and try to get somewhere with each of them one more time before I do that.

Why won’t people honor their agreements?

Monday, 2010-05-24

DriveThruRPG affiliate links

Filed under: Gaming,Programming — bblackmoor @ 16:57

I had a little bit of free time today, so I whipped up a couple of dynamic affiliate links for DriveThruRPG, a very cool source of gaming PDFs.

The script can be called one of three ways. One way creates an affiliate link to one of the five newest items added to DriveThruRPG, the second creates an affiliate link to one of the five best-selling items, and the third creates a random link to an item on either list. I considered animating the affiliate links, so that a different item would appear every few seconds, but to be frank, animated advertisements annoy me. Actually, I do not care for advertisements at all — I hide them, as a matter of fact.
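The selection logic described above can be sketched roughly as follows. To be clear, this is my own minimal illustration, not the actual script: the function names, the data shapes, and especially the affiliate URL format are assumptions, and DriveThruRPG’s real affiliate parameters may well differ.

```python
import random

def pick_item(newest, bestselling, mode):
    """Return one (title, url) pair. mode is 'newest', 'bestselling', or
    'random' (which draws from either list). Only the top five of each
    list are considered, matching the behavior described above."""
    newest, bestselling = newest[:5], bestselling[:5]
    if mode == "newest":
        pool = newest
    elif mode == "bestselling":
        pool = bestselling
    else:
        pool = newest + bestselling
    return random.choice(pool)

def affiliate_url(item_url, affiliate_id):
    # Hypothetical URL shape; the real parameter name is an assumption.
    return f"{item_url}?affiliate_id={affiliate_id}"

newest = [("New Game A", "http://example.com/a"), ("New Game B", "http://example.com/b")]
best = [("Best Seller C", "http://example.com/c")]
title, url = pick_item(newest, best, "bestselling")
print(affiliate_url(url, "12345"))  # http://example.com/c?affiliate_id=12345
```

Exposing the affiliate ID as a parameter, as in `affiliate_url` here, is what lets other people drop in their own ID, which is mentioned further down.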

Does this make me a hypocrite? Maybe. However, these are not just advertisements — they are also news. For that reason, I think they are useful, even to people like me who routinely hide ads.

It’s my intention to add these to this blog and to RPG Library, the gaming community site I maintain, but I do not really expect to see much revenue from these. I mainly created them as a service to the gaming community. For that reason, I added a variable so that other people can replace my affiliate ID with their own, if they would like to use these on their own web site.

So check it out. If you have any questions, let me know, and I will try to answer them.

Update 2010-05-25: I added some error-checking in case the description field in DriveThruRPG’s RSS feed contains some bad tags. It doesn’t actually do anything with the errors, but it keeps the script from failing.
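That kind of defensive parsing can be sketched like this, in Python rather than whatever language the original script uses, and with invented names: the point is just to swallow parse errors from bad markup in the feed’s description field and fall back to something harmless rather than let the whole script fail.

```python
import xml.etree.ElementTree as ET

def safe_description(item_xml):
    """Parse an RSS <item> fragment; if the description contains bad
    tags, swallow the parse error and fall back to an empty string."""
    try:
        item = ET.fromstring(item_xml)
        return item.findtext("description", default="")
    except ET.ParseError:
        return ""  # nothing is done with the error; the script just survives

print(safe_description("<item><description>Good</description></item>"))  # Good
print(safe_description("<item><description>Bad <b></description></item>"))  # "" (error swallowed)
```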

Update 2010-05-25, part 2: I expanded the script to be able to handle any of OneBookShelf’s sites.

Friday, 2010-04-16

Do not go into software development

Filed under: Programming,Work — bblackmoor @ 10:40

TechRepublic has a question-and-answer thing they do. One of this week’s questions is from a young person in high school in New York, asking if going into software development would be a good idea.

I was getting ready to write a response explaining why I would not recommend any young person go into IT, particularly software development. I was going to talk about how things were back in the mid-1990s, when I started, and how they have changed.

And here is Jake Leone, who has written it for me.

Well done, Jake. Well done.
