Wednesday, November 21, 2007

Mark Pilgrim's System Administration for Dummies

(Note: Originally written on 2007/11/11. It's been languishing on my computer since then.)

To begin with, I have to say that, like many people on the "Internets", I enjoy reading dive into mark for its sharp wit and no-holds-barred writing style, no matter who employs him.
His latest post is no exception. A few things bothered me about it, though.

For one, it (implicitly) assumes that you're going to install MySQL™ on your Ubuntu™ desktop machine and start using it to develop some application that needs a relational database for X, Y, or Z. However, ignorant CEOs and CIOs also read things written on the Internets, and when they happen to find this post via their Google™ search du jour for new technology to integrate into the fold, they'll turn to their system administrators and ask, "Why do I pay you so much when all I have to do to install a relational database is click a button a few times?" Thank you, Mr. Pilgrim, for devaluing system administrators in one fell swoop.

Another part that bothered me was the end. Here's how I imagine your average Ubuntu™ user's thought processes: "OK, I've installed this MySQL™ thing. Now what? How do I access this server thing? I have to use some sort of client, right? What kind of client do I get? Can I just search for 'mysql client' and do the same thing? Oh crap, that's for the terminal!" There goes that whole anti-"sudo make me a sandwich" argument. I guess it would be different if this HOWTO were in serial form.

On the other hand, it seems to be a much better experience than installing DB2™ on Ubuntu™.

Sunday, November 11, 2007

Common Sense and Websites

Just recently, I ran across the third WordPress weblog in my feed list that had been hit with spam via what I assume to be the vulnerability fixed in version 2.3.1. The spam only shows up in feed readers, because it uses CSS to hide itself on the regular pages; that CSS is stripped by most feed readers' sanitization step, which removes any markup that may be malicious.
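As a sketch of why the spam surfaces in feed readers, consider a toy sanitizer in the spirit of what most aggregators do (the sample HTML and function here are entirely made up): the injected markup survives, but the inline CSS that hid it does not.

```python
import re

# Hypothetical example of an infected post: the spam hides itself with
# inline CSS, so a normal browser render never shows it.
post_html = '<p>Real content</p><div style="display:none">cheap pills spam</div>'

def naive_sanitize(html: str) -> str:
    """Strip inline style attributes, roughly what feed sanitizers do."""
    return re.sub(r'\s+style="[^"]*"', '', html)

# The spam <div> is still present afterwards, but no longer hidden.
print(naive_sanitize(post_html))
```

This is why the first place you notice the infection is your feed reader rather than the site itself.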

The striking thing about it is that all of the weblogs were related to web development: one was the personal website of a browser developer, one was a prominent web development news site, and the most recent one was the official weblog of a web browser. Now, I'm not necessarily putting the lone browser developer at fault, since web applications aren't necessarily his area of interest; his webhost should make sure that classic security holes (like PHP's register_globals option) are turned off or disabled. On the other hand, the other two sites should know better. The web development news site has a significant number of posts on web application security, and the browser vendor deals with the security of its product every day, so surely they should be monitoring (or at least set up an automated process to monitor) feeds such as the ones at the National Vulnerability Database, in case exploits are discovered for any web applications that they may have installed.

To everyone else: if you can, please make sure that your webhosting environment is properly secured. Also, definitely subscribe to the news feeds of all the web applications that you run; more often than not, security vulnerabilities will be discovered, and you should upgrade as soon as possible when they are.

Tuesday, November 06, 2007

Attention Gmail Developers: Please Address This IMAP Issue

I figure this is worth a shot, given that this blog is hosted on a sister application.

To the developers working on Gmail: I would like to know your position on comment #3 in flameeyes's post from a Claws-Mail developer. Are you or are you not following the IMAP specification in this respect? If not, why not? Additionally, can it be fixed?

Monday, November 05, 2007

OiNK: The Best Kept (Open) Secret on the Internet

Everywhere I turn, there's a new post lamenting the fall of the mighty OiNK music torrent tracker. Yes, it's a pity, but what's surprising to me was the number of people who actually used it (and blogged about losing it after the takedown). It's like these people want to say, "Yes, I was a part of the secret organization before it disbanded!" It's an odd sort of sensation reading those posts; those of us who were also contributors to "the cause" say to ourselves, "Right on, brother! Fight the power! I, too, miss what has been wrongfully taken away from us!"

As I think about it, it gets more and more surreal. Why are we sad about something that is plainly an illegal means of retrieving goods? Is it because of the slightly better feeling in our conscience that says, "it's OK, I'm helping others who can't necessarily find this music any other way through seeding", rationalizing it as a sense of community and giving back? I am boggled.

Saturday, November 03, 2007

Avant Window Navigator 0.2.1 Released

After about a month of bug reports, segmentation faults, patches, and prodding of the core developers, we finally have a new release of everyone's favorite composited dock, AWN.

After some prodding on IRC, I created a branch of the 0.2 release branch, called 0.2-stable-testing. Here's how I described my workflow on this branch in the forums:

  • I get patches from IRC/forums/launchpad. If they aren't on launchpad, I ask that they go there so I can put a reference to them in the commit message.
  • When the patch is on launchpad, I add the stable branch to the bug with the status message "fix in progress".
  • I install/run awn with the patch applied, and check the console. If there are no extra assert/CRITICAL/WARNING messages, and I don't crash within 10 minutes, I commit the patch (usually with whitespace fixes, etc.) to the branch, and push to launchpad.
  • I change the status message on the bug to "fix available" and note the revision that the patch was applied on.

In all, there were a total of 13 recorded bugs fixed in my branch. About halfway through, I posted a call for a bugfix release:

So apparently, AWN is now on the front page of launchpad. This inevitably means more users, and more of the same questions about crashes, etc. occurring with the 0.2 release. Many of these crashes have been addressed in my 0.2-stable-testing branch, and I am pleased to report that several people are actively testing this branch and indeed finding it stable. So, I propose that the rest of the patches in my branch be reviewed (you can find them at the bottom of the branch details page), and a point release (0.2.1) be made.

Additionally, I don't think it's in our best interests to have the only available method of retrieving awn-extras be through bzr (even though I am a strong advocate of bzr). We need, at the very least, a snapshot of awn-extras to be released. Preferably, the buggy clock applet should be fixed, moved or removed before this happens.

The result is where we are today. There are only two things that concern me about this release: two minor features crept in, and I had found a bug the night before, but was too tired to file it. I have been opposed to adding new features (however minor they are) in a point release, because it's convention to leave new stuff to bumps of one of the more major version components (e.g., 0.2 to 0.3, or 1.0 to 2.0). There's a good reason that most projects do this, too: new features (especially untested ones) bring new bugs, which is not ideal for a bugfix release.

Thanks are in order for moonbeam (who wrote most of the stability patches), mhr3 (who reviewed said patches), and njpatel (who released it). It sounds like 0.3 is going to be very interesting. Hopefully I can get my desktop-agnostic branch finished and merged.

Friday, November 02, 2007

The Lessig lecture at the UW

I had the great fortune to listen to Professor Lawrence Lessig tonight. I've been a fan of his ideas (free culture, code as law, etc.) as well as his presentation style. Well, I got to see all of that at the lecture. His speaking style is even more impressive in person. He tied together a good deal of the work he's done over the years, including a preview of his new work on corruption, which the audience wanted him to speak on in the Q&A that followed the presentation.

The presentation itself is a little hard for me to explain, because it dealt with so much material, and yet I didn't miss a beat of it. The first part of the lecture dealt with the question posed in the advertised title: Is Google (2008) Microsoft (1998)? Short answer: yes and no, but don't assign morality labels to businesses (much like you shouldn't assign them to technology), because they're interested in only one thing: making the shareholders happy. The second part explained the "new" model of content distribution and ownership, and how Google and Facebook, for example, still don't exactly "get" it (cf. the Google Maps API TOS or the Facebook Apps Developer TOS).

In all, I am very glad that I got to see Lessig speak in person when he came around to this area of the country.

Edit: Here's a tangentially-related Slashdot post: Google As The Next Microsoft? Also: Not Evil != Unselfish

Gmail's new "features", not bugs: A review

I, like many people on the Internets, was ecstatic at the announcements of IMAP for Gmail and the blogosphere-dubbed "Gmail 2.0". I'm all for a faster Gmail experience, not to mention an implementation of the mail retrieval protocol that was developed at my alma mater. However, my enthusiasm waned in two parts, when I actually tried out these features.

When IMAP was finally enabled on my account, I opened up claws-mail and configured it to use Gmail as its mail source. When it did the mail sync operation, I noticed that it didn't populate the virtual "label" folders properly. By that point, I gave up and did something else. I learned later during my blog reading that Gentoo's flameeyes had the same problem. If you look at the comments, you'll see that Claws-Mail's developers have acknowledged the problem as Google's fault. As a(n annoyed) developer, I would agree with their assessment. As a pragmatic developer, however, I agree with flameeyes's assessment: the Claws-Mail developers should follow Postel's Law.

Part two: trying out "Gmail 2.0". Regardless of how I feel about the blogosphere's echo chamber (and by extension, the mainstream media's echo chamber), I'm using that term for it because it's convenient. Yeah, it's a cop-out. Anyway, this refactoring of Gmail's dynamic JavaScript engine seems to me to be a step back in terms of speed (or at least, perceived speed). Sometimes when I change tabs back to Gmail, the message list column is squeezed horizontally, as if I had changed my browser window size to 200x900. When I change label views, there tends to be a lapse between unloading the old label's mail and loading the new label's mail, which leaves a big green box in the interim.

I do realize that these features are relatively new, but you'd think user/unit testing would catch these things.

Thursday, October 25, 2007

TODO List, 2007/10/29

Avant Window Navigator

  • Finish file monitor wrapper
  • Fix python bindings for awn.DesktopItem
  • Fix launcher bugs
  • Add test programs for filemonitor wrapper and desktop item wrapper
  • Fix inter-process config handling

Pidgin Status Updater

  • Add project/source code to Launchpad
  • add Jaiku support (use xmlrpc-c)
  • put HTTP requests in a separate thread
  • cache cookie-based user authentication

Website

  • Make pages unobtrusively load dynamically
  • Add section on Avant Window Navigator

Monday, October 08, 2007

RFC 5023

The Atom Publishing Protocol has finally been published as RFC 5023.

Whew.

Hurrah! Congrats to the authors/contributors!

Monday, September 03, 2007

"Bonjour/Zeroconf/DNS-SD bookmarks"...

...is totally a misnomer. The feature doesn't let you share lists of bookmarks within a network (which is what I want to do); rather, it lists all of the web servers advertised via Zeroconf on a particular network. Just needed to get that out.

Tuesday, July 24, 2007

xmingw overlay and "competing" software

Via the Freshmeat feed, I just noticed this thing called the "MinGW cross compiling environment" that looks like it does more or less the same thing as my xmingw overlay, except that it's not distro-specific. (It's not my fault that the portage concept is awesome.) I took one look at the mercurial repository, and found that it consisted of exactly one shell script. Wow. It's a big one, too. It currently supports 26 packages and is roughly 2000 lines long. My overlay, on the other hand, supports over 200 packages (with at least one shell script per package), and I probably don't want to know how many lines of code that is. I'm surprised that the number is over 200. Well, sort of. Its Bazaar repository is currently on revision 466, and apparently I've had this branch for a little over a year. That latter part's news to me.

In related news, as I type, I'm building Firefox 2.0.0.5 via xmingw. The Windows XP machine that I use is going loco, and many programs (including Firefox) crash when I try to use them. Strangely enough, my self-compiled version of Pidgin runs just fine. So, in that spirit, I'm trying to see whether a self-compiled version of Firefox will do the trick as well.

Edit: I was going to mention, with regards to the Mozilla (including NSPR and Firefox) build process, that they have an upside-down perspective of what "build" and "host" mean, in comparison to 99% of the other autoconf-based projects out there. Usually, the "host" is the platform that one is compiling for, and the "build" is the platform on which the compilation is taking place. Yes, it does make sense the way that Mozilla is doing it, but it's the opposite of everyone else, which makes it annoying to build. Well, that reason and the fact that it is a horribly monolithic build process (see OpenOffice.org, or the imake-powered X build process).

Wednesday, July 18, 2007

Re: GNOME Online Desktop

I just finished looking at the slides from the GUADEC presentation on the GNOME Online Desktop and the associated screencasts. The concept of installing software from a browser like that (given that I have some idea of how it works) is ridiculously awesome. More importantly, I would like to see how they design the following:

an HTTP library that shares cache and cookies with the browser, and supports asynchronous operation with the GLib main loop out of the box

OK, so Gtk+ and friends currently have libsoup, which fulfills the latter requirement. The former requirement seems to me to be much more complex. First, do you require compatibility with multiple browsers (complexity becoming O(N*M) for varying N, where M is the number of browsers to support), and if so, what do you do with browsers which seemingly don't provide an API? (I bring this up because I have no idea whether Opera provides one.) Now, imagine that they chose only the Gecko/XULRunner libraries to be compatible with. That API is always changing, so does this mean that the resulting library will also be unstable?

As an aside, I wonder whether this will also be available for Windows. Of course, this assumes that D-Bus will be available for Windows at some point in the future.

Thursday, July 12, 2007

Website finally updated, hooray!

After years of malnourishment and two weeks of development, my little old static website (now using a smaller domain name!) is live. The old website, like the new one, was created via a templating system. However, the former's templating system was homegrown using PHP4 classes (disgusting, I know...but that's all I could use at the time). Even more disgusting was that my system amounted to HTML comment directives plus a regular-expression parser. I was so young and naïve, and I hadn't taken a compilers class yet. So this time around, I said "screw it" and went with a) my favorite language, Python, and b) the template software that I had been using for my Trac-AtomPub plugin (yes, not -atompp anymore, per the lengthy discussion on the atom-protocol mailing list).

The Journey

As the new website was a chance to experiment with new things, I decided to take the plunge and use HTML5 to mark up my website. As with any sort of experimental technology, there were many problems.

First, I tried to use the genshihtml5 plugin, but strangely enough the code was a bit buggy (e.g., it was missing an import), and I could never figure out how to get it to output proper HTML5 while still removing end tags from tags which don't need them, e.g. <link/>, while retaining them for tags which require one, e.g., <script/>.

Next, I tried to use html5lib's Genshi-Stream-based tree walker. For some reason, it simply would not output any data. I don't remember all of the details, but I do remember inserting a lot of print statements in html5lib to see if I could find the bad piece.

Finally, I gave up and made Genshi just output XHTML plus the extra HTML5 tags. I figured that all of the debugging trouble simply wasn't worth it for the timeframe I had envisioned.
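What I wanted, stated as code, is roughly this rule (a toy serializer of my own invention, not genshihtml5's or html5lib's actual API): void elements like <link/> take no end tag, while elements like <script/> keep an explicit one even when empty.

```python
# HTML5's fixed set of void elements, which must not get an end tag.
VOID_ELEMENTS = {'area', 'base', 'br', 'col', 'embed', 'hr', 'img',
                 'input', 'link', 'meta', 'source', 'track', 'wbr'}

def serialize(tag: str, attrs: dict, text: str = '') -> str:
    """Serialize one element using HTML5's void-element rule."""
    attr_str = ''.join(f' {k}="{v}"' for k, v in attrs.items())
    if tag in VOID_ELEMENTS:
        return f'<{tag}{attr_str}>'          # no end tag allowed
    return f'<{tag}{attr_str}>{text}</{tag}>'  # end tag required

print(serialize('link', {'rel': 'stylesheet', 'href': 'main.css'}))
print(serialize('script', {'src': 'main.js'}))
```

Getting a template engine to apply exactly that distinction was the part I never managed with genshihtml5.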

(As an aside, I do plan on submitting the patches that I've made as a result of this...exercise (for lack of a better word) so that they can be integrated in future releases of the respective software.)

Actual usage of new HTML5 tags was...interesting to debug. If you're writing HTML5 and not XHTML5, and you're viewing the page in Firefox, this is what the DOM tree looks like (according to Firebug):

<figure _moz-userdefined="" />
<img src="..." alt="..." />
<legend>...</legend>

For comparison, this is what it looks like when rendered as XHTML5:

<figure>
<img src="..." alt="..." />
<legend>...</legend>
</figure>

That completely broke my CSS files, as I was using child/descendant rules utilizing the new tags. This sort of thing is why I love using Firebug.

Testing

I've really only thoroughly tested this website on Firefox 2.x (Windows & Linux). I just checked it on Opera 9.20 (Linux) and a relatively old development version of Gtk-Webcore (AKA WebKit), and the only bug that I see (in both of them, strangely) is some sort of CSS error in calculating the spacing for the <dd/> box for "Special Skills" in my CV.

Future

Future plans include packing both the CSS and the JavaScript, via csstidy and packer, respectively. Right now there are several bugs with regards to integrating the two applications with my build system. csstidy interprets white-space values incorrectly, particularly the vendor-specific values. I'm currently trying to integrate packer via this nifty little python module that uses ctypes to create an interface with Mozilla's Spidermonkey JavaScript engine. Unfortunately, there's a recursive reference somewhere in base2, and the module is choking on it, so I have to figure out how to resolve that (if possible). Another future plan involves making the site fully dynamic in that the page layout stays the same, while background XMLHttpRequests retrieve the page contents when internal links are clicked. Obviously the current behavior would be retained as a fallback.

Anyhow, there are more details about how I made my website on the colophon. Bug reports, suggestions and feature requests are welcome!

Friday, June 29, 2007

Re: Atom Protocol Exerciser (Ape) setup notes

Note: I had intended to post the first part of this on Dave Johnson's blog as a comment, but it rejected me twice. Apparently I write like a spammer.

That was a lot more complicated than I had expected; makes me wonder if I'm the first person (other than Tim, of course) to deploy the APE.

I deployed it locally a few months ago, while debugging my Atom protocol plugin for Trac. During that time, I wrote up some implementation questions (which Tim graciously answered) and the method I used to run it.


Incidentally, I was recently tweaking my particular implementation, since Tim Bray had updated the APE to be compliant with the latest revision of the specification. My shebang line for go.rb changed from #!/bin/bash /usr/bin/jruby to #!/usr/bin/env jruby. It's still working fine, even if it's still a little slow.

I need to figure out how to fix some of the errors that I get. For instance:

18. ? Client-provided slug 'ape-61911' not used in server-generated URI.

I have no idea how to fix this. When I get a valid Slug header, I use it verbatim:

To server:
GET /trac/atom/wiki/ape-61911 HTTP/1.1\r
Host: localhost\r
Accept: */*\r
\r

Perhaps the following line is confusing the APE:

Location: http://me.malept.com/trac/xmingw/atom/wiki/ape-61911\r
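For what it's worth, my understanding of the intended server-side behavior is something like the following sketch (the function and sanitization rule are my own invention; RFC 5023 only says the server may use the Slug as a hint when constructing the member URI):

```python
import re

# Hypothetical sketch of minting a member URI from a client's Slug header.
def member_uri_from_slug(collection_uri: str, slug: str) -> str:
    # The server is free to transform the slug; here we just squash
    # anything that isn't URI-path-safe into hyphens.
    safe = re.sub(r'[^A-Za-z0-9._-]+', '-', slug.strip()).strip('-')
    return f'{collection_uri}/{safe}'

print(member_uri_from_slug('/trac/atom/wiki', 'ape-61911'))
# '/trac/atom/wiki/ape-61911'
```

If the plugin really does produce that URI, the APE's complaint may be about how the Location header is compared, not about the slug handling itself.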

Additionally, apparently my plugin currently has some problems with the new multi-post app:edited test, but so far I think it's something wrong in my code as opposed to being a bug in the APE. I'm going to try to take a look at it tonight.

Saturday, May 19, 2007

Ego++

Typically, I'm not one to toot my own horn too much, but I'm rather excited about this. Because of my earlier post, two things happened:

  1. The APE was changed to support HTTP status 204 No Content, and
  2. the APP draft specification was changed to clarify (via text and examples) that non-200 OK responses for PUT/DELETE were acceptable.

I think it's awesome that I could (somewhat indirectly) influence a standard like that. At the very least, it gives the ol' ego a little boost.

Wednesday, May 16, 2007

ANN: Gentoo Bazaar Overlay at Launchpad

After a suggestion by a commenter, I've registered a project on Launchpad for my Gentoo overlay of Bazaar-related ebuilds. If you have any patches or bundles, don't hesitate to submit a bug there.

(As an aside, I made the logo by combining the SVG logos of Gentoo and Bazaar in Inkscape; it's much easier than you'd think.)

Monday, April 30, 2007

Report: Linuxfest Northwest, Saturday

After hearing about it from a friend (specifically, that I could get there for free courtesy of Pogo Linux), I decided to go to an actual Linux...fest... and take in the nerd atmosphere without getting some sort of otaku or gamer disease. Unfortunately, I could only go on Saturday. I would have loved to hear Brad Fitzpatrick talk about how LiveJournal scales their databases, among other things.

The bus left promptly at 8:06AM. The movie they were showing on those little screens scattered throughout the bus was the X-Files movie, which I have no intention of watching, so I caught up on some reading for one of my classes. One amusing thing that I noticed on the way up was that there was a "Lychee Buffet" restaurant at the freeway exit for the college. If you think about it, it sounds rather disturbing.

I had roughly 15 minutes between the time that the bus arrived and the first presentations. In that time, I got a bunch of CDs from the Ubuntu and Oracle tables (it was like the Oracle table was having a fire sale - I got an Oracle DVD and an Oracle Linux DVD), and some stickers from the FSF table, including the "Bad Vista" one.

The first talk that I heard was on copyright and open source, by Karl Fogel (of CVS/SVN fame). It was really interesting, given my affinity for history (especially regarding science and technology). He talked about the parallels between the era of the printing press and the present-day. I didn't realize that copyright (or proto-copyright) was created as a censorship/printing restriction tool by the official guild of printers.

The second talk I attended was on strong authentication, in particular multi-factor authentication. It was very informative, especially in regards to how those one-time password keyfobs work.

Presentation number three was about practical honeypots. I wasn't really impressed with it overall. It was rather high-level, and the presenter admitted that he had only started working on it that morning. A lot of it seemed like common sense, like being preemptive, only concentrating on exploits that are relevant to your particular systems, etc.

The last talk I observed was on scaling web services, by a lead developer from Real Networks. He reminded me of Penn from Penn & Teller. It was a very engaging talk, and it gave me a new perspective on scalability, that is, it's essentially an organizational problem, as opposed to a technological problem.

A closing thought: I would have loved to have gotten one of those stuffed SuSE lizards...it would have fit in well with the Tux I got in Canada several years ago.

Tuesday, April 17, 2007

Re: Genshi Filters for Venus; Genshi + Trac-AtomPP

This news is excellent. One of my side projects (albeit pretty low on my list) was to figure out how to use Genshi templates in Venus. I started out by copying the Django template code and unit tests, adapting them for Genshi. However, I got stuck getting some of the unit tests to pass (_item_title and _config_context). Perhaps sometime this weekend I can see how this particular implementation works.

Speaking of Genshi, I just noticed that they had released version 0.4. Hopefully, this will help me resolve the last APE error in my Trac-AtomPP plugin — adding app:edited elements to relevant entries and sorting by that property.

While I'm thinking about it (this really seems to be turning into a stream of consciousness post), I'm not exactly sure how to page the collection efficiently, considering that Trac creates the wiki page list via a generator. Right now I'm just putting everything into one feed, but obviously that doesn't scale very well.
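One option I've been considering is plain offset-based slicing of the generator, along these lines (the names are illustrative, and note the catch: every request still re-walks the generator from the beginning, which is part of why this doesn't scale):

```python
from itertools import islice

# Sketch of offset-based paging over a lazy iterator of wiki page names.
def page_of(pages, page_number: int, page_size: int = 20):
    """Return one page worth of items from a generator."""
    start = page_number * page_size
    return list(islice(pages, start, start + page_size))

wiki_pages = (f'Page{i}' for i in range(100))  # stand-in for Trac's generator
print(page_of(wiki_pages, 2, 20))  # items 40 through 59
```

Proper paging would need Trac to accept an offset/limit in the query itself rather than skipping items in Python.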

Sunday, April 15, 2007

HOWTO run the APE (or any jruby script) via Apache CGI

In my previous post, I was running the APE via the command line because I couldn't figure out how to run it as a CGI in Apache. I don't really want to run Tomcat just for this, and I've had bad experiences with Tomcat administration both for school and for work (which I guess is basically the same thing at this point). So, after a bout of searching the Internets, I found a post on JRuby on Rails which helped me greatly in configuring it. Without further ado, here's the relevant Apache configuration snippet:


SetEnv JRUBY_HOME /usr/share/jruby[1]
SetEnv JAVA_HOME /usr/lib/jvm/sun-jdk-1.5[1]
# Jing dependencies
SetEnv CLASSPATH ...[2]
AddHandler cgi-script .rb
Options +ExecCGI

Notes:

  1. These values are Gentoo-specific. For JAVA_HOME, I used Java 5 as a precaution, because I wouldn't be surprised if it didn't work in version 1.4.x.
  2. On Gentoo, they put all of the third-party jars in separate directories so that their java-config utility can manage them all separately for the system and the users. So, the value I had here (which I didn't want to reproduce here because it's way too long) was the result of java-config -d -p jing. You probably don't have to put this line in if jruby can find jing by itself.

For the APE, I had to add #!/bin/bash /usr/bin/jruby to the top of it. For some reason, CGI complains if you leave out the /bin/bash part of it.

Saturday, April 14, 2007

trac-atompp progress; APE questions

I'm working on (among other things) finishing up wiki support in my trac-atompp plugin. I'm nearly done, I think. In order to make sure it's "valid", I'm using Tim Bray's APE (albeit from CVS). However, I've got a few questions about some of the errors:

  1. ! 53 of 53 entries in Page 1 of Entry collection lack app:date elements.

From the source, it looks like it should actually say app:edited. But, why is it giving an error? According to draft 14, section 10.2, Atom Entry elements in Collection documents SHOULD contain one "app:edited" element, and MUST NOT contain more than one. Perhaps the messages should conform to RFC 2119 instead of lumping in all of the SHOULDs with the MUSTs, or something.

  1. ? Can't update new entry with PUT: No Content [Dialog]
  2. ! Couldn't delete the entry that was posted: No Content [Dialog]

I don't really understand why HTTP status code 204 (No Content) isn't allowed for either PUT or DELETE, seeing as RFC 2616 says that it is a perfectly valid response for both actions.
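To make the disagreement concrete, here's a minimal WSGI-style sketch (not my plugin's actual code) of a server that answers PUT and DELETE with 204 No Content, which RFC 2616 permits for both methods:

```python
# A toy WSGI app: PUT and DELETE succeed with 204 and an empty body,
# everything else gets a plain 200. Purely illustrative.
def app(environ, start_response):
    method = environ.get('REQUEST_METHOD', 'GET')
    if method in ('PUT', 'DELETE'):
        # 204 responses carry no entity body at all
        start_response('204 No Content', [])
        return [b'']
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'hello']

# Exercise the app without a real server:
captured = {}
def fake_start_response(status, headers):
    captured['status'] = status

body = b''.join(app({'REQUEST_METHOD': 'DELETE'}, fake_start_response))
print(captured['status'], repr(body))  # 204 No Content b''
```

A strict APP client (or test harness) should treat that 204 exactly like a 200 with no body; rejecting it is arguably a bug in the client.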

Thursday, April 12, 2007

HOWTO restrict ssh access by IP and user

There's a way to restrict access to a user account or set of user accounts via PAM (and by extension, SSH)—the obviously named pam_access module. It's available on Gentoo Linux in sys-libs/pam, and on Debian Linux (and I assume the derivatives) in libpam-modules.

In order to enable this module for SSH, you have to edit the SSH's PAM file (Gentoo: /etc/pam.d/sshd; Debian: /etc/pam.d/ssh) to enable the access module: account required pam_access.so

There's some pretty good documentation in /etc/security/access.conf (at least, in the default distribution of it) on how to configure the file, but one thing that it doesn't say explicitly is that you can use IP address blocks in CIDR notation to denote access privileges. For instance, suppose I wanted to limit bob to the local network (192.168.0.*) and the VPN (172.16.*). The configuration line for that would be:

-:ALL EXCEPT bob:192.168.0.0/24 172.16.0.0/16

Wednesday, April 11, 2007

Re: Protecting a JavaScript Service

In How to Protect a JSON or Javascript Service, Joe Walker looks at a few solutions such as:

  1. Use a Secret in the Request
  2. Force pre-eval() Processing
  3. Force POST requests

The last time that I worked on a JSON-based web application, I did number 1, sort of. I basically implemented a simplified version of HTTP digest authentication in order to send a username and password to the server. To accomplish this, I used a nonce plus a JavaScript implementation of the SHA-1 hash function.
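The scheme I described boils down to the following sketch, with both the client and server halves written in Python for illustration (the original client half was JavaScript; all of the function names here are mine):

```python
import hashlib
import secrets

# The server hands the client a fresh nonce per login attempt.
def issue_nonce() -> str:
    return secrets.token_hex(16)

# The client never sends the password itself, only sha1(nonce + password).
def client_response(nonce: str, password: str) -> str:
    return hashlib.sha1((nonce + password).encode()).hexdigest()

# The server recomputes the digest from its stored password and compares.
def server_verify(nonce: str, stored_password: str, response: str) -> bool:
    expected = hashlib.sha1((nonce + stored_password).encode()).hexdigest()
    return secrets.compare_digest(expected, response)

nonce = issue_nonce()
resp = client_response(nonce, 'hunter2')
print(server_verify(nonce, 'hunter2', resp))   # True
print(server_verify(nonce, 'wrong', resp))     # False
```

Since the nonce changes every time, a captured response can't simply be replayed, which was the point of the exercise.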

If I were to reimplement the user authentication portion today, I would probably use this "clipperz" library that I also found on Ajaxian. I'm amazed that someone has implemented AES in JavaScript. I would think that it would be difficult, although I haven't read the specification for it. Maybe one of these days I'll implement the Diffie-Hellman key exchange, if I get bored enough or I need it for something.

Wednesday, March 21, 2007

AWN bzr branch, bazaar overlay

Two announcements tonight: the creation of a bazaar branch for Avant Window Navigator (Awn), and the creation of a bazaar-related gentoo overlay.

First, I really like what njpatel has done with Awn. I've always wanted a bar that looked and functioned like the OS X dock, but the closest I could find was the gDesklets starterbar, and it didn't handle currently running programs. Awn is just plain awesome. Unfortunately, I don't use Gnome on my desktop at home; I use Xfce. So I svn co'd the source and created a patch that uses libxfce4util and thunar-vfs instead of gnome-desktop and gnome-vfs. I submitted that patch to the tracker, where, as of this writing, I haven't gotten a response. We'll see. Next up on my list of modifications is to use Glib's GKeyFile (read: ini-like file parser) as an alternative to GConf. Because bzr-svn finally doesn't die when I try to check out a remote repository (as of bzr-svn 0.3.2 and bzr 0.15rc2), I now have a bzr-svn branch that contains all of my changes to Awn.

Speaking of bzr-svn, at the request of the developer of bzr-svn, I have published my modified subversion ebuild that contains the patch listed in the parent post to that comment, in a bzr branch, of all things. This branch also contains the latest releases of paramiko, bzr (0.15rc2), bzrtools, bzr-gtk, and bzr-svn.

[Edit: forgot paramiko]

[Edit: forgot to finish a thought]

[Edit (2007/05/16): Update here]

Tuesday, February 27, 2007

Death of an OS

I've been trying to salvage a Windows XP laptop which mysteriously blue screened the other day. Ever since, when it boots, right after the XP load screen, it blue screens again with this error:

STOP: c000021a {Fatal System Error}
The Session Manager Initialization system process terminated unexpectedly with a
 status of 0xc000026c (0x00000000 0x00000000).
The system has been shut down.

Looking up the error via Google doesn't give me any useful results other than "reinstall XP", so, I've been using a LiveCD to move all of the essential data (music, documents, Firefox settings, etc.) to another computer. It's a very fun process.

Thursday, February 08, 2007

Review: Darcs

In short, Darcs irritates me more than any other distributed VCS that I've used so far. And that includes git.


One of the more annoying things about it is that the commands for it are significantly different from most modern VCSs. Examples (based on their equivalents in other systems):

  • status → darcs whatsnew -s
  • diff (unified) → darcs diff -u (for the record, I expected darcs whatsnew -u to do the same thing, based on the description in the help text)
  • commit (to local tree) → darcs record

On a more positive note, I find the patch-based approach (as opposed to the snapshot-based approach Bazaar uses) to be an interesting method of performing backend operations. However, having to go through each patch "hunk" is rather strange, and doesn't really seem to scale, especially in a command-line interface. It seems that in the prototype GUI, it's a little better, but the usability is still lacking.
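
The patch-commutation idea can be shown with a toy illustration (the names and data structures here are invented for clarity, not Darcs internals): when two patches touch independent parts of the tree, applying them in either order yields the same repository state, which is what lets a patch-based system treat history as a set of patches rather than a sequence of snapshots.

```python
# Toy model: a repository state is a dict of filename -> contents,
# and a patch is a (filename, new_contents) pair.

def apply_patch(state, patch):
    """Apply a single-file patch, returning a new state."""
    new_state = dict(state)
    filename, contents = patch
    new_state[filename] = contents
    return new_state

base = {"README": "v1"}
patch_a = ("README", "v2")   # touches README
patch_b = ("NEWS", "initial")  # touches a different file

# Since patch_a and patch_b are independent, they commute:
ab = apply_patch(apply_patch(base, patch_a), patch_b)
ba = apply_patch(apply_patch(base, patch_b), patch_a)
assert ab == ba == {"README": "v2", "NEWS": "initial"}
```

The hard part Darcs actually solves, of course, is deciding when two patches *don't* commute and how to reorder them safely.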

Tuesday, January 23, 2007

Trac-AtomPP progress, 2007-01-23

I finally got myself out of my Trac plugin coding slump. Genshi is really making this a whole lot easier; I don't really know why I was manually generating XML from trees in the first place.

I am very grateful for the existence of Joe Gregorio's Atom Publishing Protocol test suite. I have only a couple of nits about it. First, it doesn't seem to play well with multi-version installs of wxPython (it seems to require 2.6, perhaps 2.5; I only have 2.4 and 2.6 on my computer), so I cooked up a really simple patch for that. Second, my wiki collection feed generates some warnings from the Feed Validator, but the logging pane records them as errors. Since this doesn't affect functionality, I consider it a minor usability bug. Then again, the suite doesn't really seem to be meant for end users, so...whatever.

Anyway, for my capstone, I'm only working on the wiki part. GET is done, and POST is nearly done. DELETE is done in theory (haven't tested it out yet), and PUT still needs to be converted. POST and PUT now require some implementation of ElementTree to be installed, in order to parse the Atom Entry input. As an aside, ElementTree's find*() methods are really poor substitutes for XPath. Also, this implementation utilizes the Atom MIME type parameter draft whenever possible.
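
A minimal sketch of the Entry parsing involved (the sample entry and variable names here are hypothetical, and the import path is the modern xml.etree one): find() accepts only a simple tag-path subset of XPath, with namespaces spelled out in Clark notation, which is why it's a poor substitute for the real thing.

```python
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

# A hypothetical minimal Atom Entry, as a client might POST it.
entry_xml = """<entry xmlns="http://www.w3.org/2005/Atom">
  <title>New wiki page</title>
  <content type="text">Hello, world.</content>
</entry>"""

entry = ET.fromstring(entry_xml)
# find() understands only simple tag paths -- no predicates,
# attribute tests, or axes like you'd get with full XPath.
title = entry.find(ATOM + "title")
content = entry.find(ATOM + "content")
assert title.text == "New wiki page"
assert content.get("type") == "text"
```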

Reminder: Bazaar URL is: http://bzr.malept.com/trac-atompp

Monday, January 22, 2007

Suggestion for project

pyglet (OpenGL multimedia library) + Bruce (presentation tool) + S5 (presentation format) would probably create an awesome alternative to PowerPoint. Other possible libraries to use include Genshi (to template the S5 XHTML output), and cssutils and html5lib to replace the current HTML/CSS parsers in pyglet, mostly because reinventing the wheel is usually a bad thing. To look into: whether a pygtk UI can be integrated with a pyglet surface (perhaps via pycairo).

Saturday, January 20, 2007

HOWTO compile subversion-1.3.2 so that it works with bzr-svn-0.3

This was written because simply using the patch at the bzr-svn web page is not sufficient for getting it to work with subversion-1.3.2, which I'm using because that's what's stable for Gentoo's x86 ebuilds.

Note 1: I have a Gentoo ebuild that corresponds to these directions. Add a comment if you want it.

Note 2: Where lynx -source [url] appears, substitute wget -O - [url] if lynx is not available.

  1. Unpack the subversion-1.3.2 tarball: tar -xjf subversion-1.3.2.tar.bz2
  2. Change directory to the subversion source directory: cd subversion-1.3.2
  3. Apply this patch to the source (this patch came from the Ubuntu Edgy source diffs, but slightly modified): lynx -source 'http://www.lazymalevolence.com/patches/subversion-1.3.2-debian-x-python-bindings.patch' | patch -p0
  4. Remove some of the generated .swg files, or else the compile will fail: rm subversion/bindings/swig/proxy/*.swg
  5. (This step is for people who have swig 1.3.31 installed; I haven't tested with any other version.) Convert the language typemaps to #ifdefs, to get rid of a bunch of warnings: (cd subversion/bindings/swig && lynx -source 'http://svn.collab.net/viewvc/*checkout*/svn/trunk/subversion/bindings/swig/convert-typemaps-to-ifdef.py?revision=19926&pathrev=19927' | python -)
  6. Regenerate all of the configure files, Makefile.in and the .swg files, among others: ./gen-make.py build.conf; make -f autogen-standalone.mk autogen-swig; (p=`pwd`;for d in . apr{,-util}; do cd $p/$d && autoconf; done)
  7. Proceed normally in the build cycle: ./configure && make all install

Saturday, January 13, 2007

Several items

The last post was 1.5 months ago, awesome. Anyhow, here are some thoughts on stuff I've done/explored in the computer realm during that time:

  • Beryl is awesome. Too bad it doesn't play well with tvtime, or else I'd permanently enable it and all the associated settings. Also, my video card is one of those crippled ones (ATI Radeon 9250SE), so it's a bit slow as well.
  • My capstone project is in full swing. I've mostly stopped using my hand-rolled Atom parser/generator, since I got stuck on how to implement extensions. For generation, I've switched to using Genshi templates. Genshi has a pretty nice templating language for both XML and text based documents. For parsing, I think I'm going to use lxml or ElementTree, depending on how well XPath is supported.
  • Given the amount of attention that OpenID has been given in the blogosphere lately, I was thinking about how it could be used to integrate with the UWNetID system. Unfortunately, I found that it was rather difficult to modify the current implementations to add such support. So, I'm currently writing a PHP5 class + mini-application to be an OpenID server. So far, I have the association mode completed, and the checkid modes are in progress. I am proud of myself for actually implementing the Diffie-Hellman key exchange, since while I am fascinated with cryptography, my math skills in that area are...lacking. It's also nice to refresh my PHP skills; I haven't programmed in PHP5 before, which gives you some idea of the last time I coded in PHP.
  • Over winter break, I attempted to port modular X to MinGW, as the Xming project (which is awesome) uses the old, monolithic build process. I've built the entire Xorg server (and its dependencies) successfully, except that the OpenGL code for Windows has not been updated along with the rest of the server's codebase. That sort of modification is pretty far beyond my porting abilities, unfortunately. This project also gave me some experience with git. My take: git is extremely annoying to use directly; use a frontend such as cogito instead. My personal preference is still Bazaar, though.
  • I wrote a Python module in C for my on-again, off-again DC client project. I have a post on that sitting in my queue and will post it at some point when I finish and/or remember it.
  • I eagerly await the day when Deepest Sender supports the GData Blogger API. Maybe in the spring, if it's not there, I'll write it.
  • Oh right, the new URL for this weblog is http://blogger.malept.com/.
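
The Diffie-Hellman exchange mentioned in the OpenID item above boils down to a few lines of modular arithmetic. This is a sketch only: the modulus below is a toy 64-bit prime (OpenID specifies a fixed 1536-bit default), the secret exponents are hard-coded for reproducibility rather than randomly generated, and in the real protocol the shared secret is then hashed and XORed with the association's MAC key rather than used directly.

```python
p = 0xFFFFFFFFFFFFFFC5  # toy 64-bit prime; OpenID fixes a 1536-bit default
g = 2                   # generator

# Each party picks a secret exponent (fixed here for reproducibility;
# a real implementation draws these from a strong RNG).
consumer_secret = 0x123456789ABCDEF
server_secret = 0xFEDCBA987654321

# Only the public values g^x mod p are exchanged over the wire.
consumer_public = pow(g, consumer_secret, p)
server_public = pow(g, server_secret, p)

# Both sides derive the same shared secret without ever sending it.
consumer_shared = pow(server_public, consumer_secret, p)
server_shared = pow(consumer_public, server_secret, p)
assert consumer_shared == server_shared
```

Python's three-argument pow() does the modular exponentiation efficiently, which is most of what makes the implementation tractable even without deep number-theory skills.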