Dijkstra’s Revenge

Wired posts the actual code at fault in the recently-revealed vulnerability in Apple’s iOS implementation of SSL:

And I remember enough C to recognize the problem. But I had a number of reactions of varying seriousness:

  1. “Hey, maybe the programmer thought that it would really work if he did it twice.”
  2. The late E.W. Dijkstra was right: go-tos really are harmful. (There’s a good Wikipedia entry on “Considered Harmful”.)
  3. What would have helped? Enabling compiler warnings about unreachable code? That try/catch thing? Automatic code indentation? Always using the curly braces?

Perl and Inside-out Objects

Way back, when I was poring through Damian’s object book, where
he shows all the different ways to build objects in Perl, I became
enamored with the idea of inside-out objects.

Here’s what Randal Schwartz says on the subject…
http://www.stonehenge.com/merlyn/UnixReview/col63.html

At the time what I liked about this approach was the idea that the caller couldn’t
grok the object and take short-cuts that would only break later
if I changed the implementation.

All I can say is that I am so happy that for most CPAN modules,
inside-out objects are rarely seen.  That’s because I have found that one of
the best debugging tools I have, when dealing with a complex OO
CPAN module, is being able to call Data::Dumper on the object.

And of course I do it to my own objects too, although I generally
try to always include a toString() method, which can be thought
of as a smarter and prettier Data::Dumper, only because it understands
the data.

And as for people who grok the object to take unwise shortcuts?  They will
eventually get what they deserve.
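For the curious, here’s a minimal sketch of the technique (the class and its names are invented for illustration), showing exactly why Data::Dumper is useless on such objects:

```perl
use strict;
use warnings;

package Counter;
use Scalar::Util qw(refaddr);

# Inside-out: the attribute lives in a lexical hash keyed by the
# object's memory address, not inside the blessed reference itself.
my %count_of;

sub new {
    my ($class) = @_;
    my $self = \my $anon;                 # bless an empty scalar ref
    bless $self, $class;
    $count_of{ refaddr $self } = 0;
    return $self;
}
sub increment { $count_of{ refaddr $_[0] }++; return }
sub count     { return $count_of{ refaddr $_[0] } }
sub DESTROY   { delete $count_of{ refaddr $_[0] } }

package main;
use Data::Dumper;

my $c = Counter->new;
$c->increment for 1 .. 3;
print Dumper($c);          # just a blessed scalar ref; the count is invisible
print $c->count, "\n";     # 3
```

The caller truly can’t peek inside, which is the point. It’s also the problem.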

Quote du Jour

“So, you have a problem. You say ‘I know, I’ll use floating point!’ Now you have 2.0001341678 problems.”

(Source.)
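The joke is grounded in a real gotcha; a few lines of Perl show it:

```perl
use strict;
use warnings;

# 0.1 and 0.2 have no exact binary representation, so their
# sum is not exactly 0.3.
my $sum = 0.1 + 0.2;
printf "%.17g\n", $sum;                          # 0.30000000000000004
print $sum == 0.3 ? "equal\n" : "not equal\n";   # not equal
```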

Reading Schedule Generator

Or: some stuff I worked on over break.

This is a small solution to (some would say) an even smaller problem: I used to ignominiously return unread books to the library, and face new issues of magazines after having only read a few items in the previous one. My to-be-read pile was only getting taller.

I found I was able to read more “reliably” if I had a specific daily page goal for each item. Somehow having an “official” schedule helped me overcome my affliction.

I’ve been doing this for a few years, and have written code to help out. (Certainly there must be a name for the psychological quirk that compels me to “write code to help out”. Let me know if you come across it.)  What follows is my current “solution”: a CGI-backed web form that accepts the schedule parameters for a single reading item (book or magazine) and generates a small HTML calendar schedule that I can print from a web browser, cut out, and use as a bookmark for the item.

This seems conceptually easy, if not trivial, so I imagine similar code is “out there somewhere”, but cursory Googling didn’t turn up anything.

Anyway: the finished product appears like this (specific example is the book Freedom™ by Daniel Suarez). (Not that I recommend this book: it’s pretty bad so far.)

Yes, cut on the dotted line. So my goal for today (January 6) is to get up to (or past) page 127. And I should be finished in a couple weeks.

Now I’m not psycho about this: it’s OK to get ahead of the goals if the material is compelling and I have the time. It’s also OK to fall behind if there’s just too much other stuff going on. (However, in my experience, just knowing that I’ve “fallen behind” is just enough motivation to carve out some extra time to catch up later.)

Enough of the mechanics; on to the code. For a long time I ran a Perl script from the command line to generate an HTML schedule as a static file, which I would then open/print in a browser. But over break I decided that it would be (slightly) more flexible to do the whole thing from the browser, using an HTML form to get the schedule parameters and a CGI backend to display the result.

The form is here (hosted on my workstation). What it looks like:

It’s a very rudimentary form, with few bells or whistles. Well, OK, one bell: the Javascript jQuery UI datepicker widget for getting the schedule’s start and end dates. I didn’t know nothin’ about jQuery before I did this. (And I only know slightly more now; if you examine the source code for the form, it’s not very complicated.)

So you fill out the form. Using our example (and showing the datepicker in action):

… hit the submit button and the resulting page should produce the appropriate schedule. (I’m pretty sure it would work for you if you want to try it.)

The real work is performed by the Perl CGI script, which relies heavily on the smarts contained in the CGI, Date::Manip, and HTML::Template modules. (What’s left: some date arithmetic that’s only tricky if you’ve done it wrong a few times. I think it handles edge cases and DST changes correctly, although I haven’t checked lately.)  If you’d like to look at the script, the plain-Perl file is here, and the html-prettyprinted version is here. The HTML template used by the script is also important and that’s here.
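The heart of it is just that date arithmetic. Here’s a hedged sketch of the calculation (page_goals is a made-up name; my real script uses Date::Manip, but core Time::Piece will do, and working in whole calendar dates parsed as UTC sidesteps DST entirely):

```perl
use strict;
use warnings;
use Time::Piece;

# Given start/end dates (inclusive) and a page count, return the
# cumulative page goal for each day.  Illustration only, not the
# actual CGI script.
sub page_goals {
    my ($start, $end, $pages) = @_;    # dates as 'YYYY-MM-DD'
    my $t0   = Time::Piece->strptime($start, '%Y-%m-%d');
    my $t1   = Time::Piece->strptime($end,   '%Y-%m-%d');
    my $days = int( ($t1->epoch - $t0->epoch) / 86400 ) + 1;
    return map { int($pages * $_ / $days + 0.5) } 1 .. $days;
}

my @goals = page_goals('2010-01-01', '2010-01-10', 300);
print "Day $_: read to page $goals[$_ - 1]\n" for 1 .. @goals;
```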

Larry Wall on Software Patents

At about 1:30 into the video…

Reports from OSCON 2013

Guess who and guess what? That’s right: it’s Marcus once again, at OSCON 2013, the open source convention held annually in Portland, Oregon. There is no other gathering where you can find such a diversity of computer language and tooling expertise. To say that the open source world has a richer, more collaborative programming community than the commercial world is an understatement: collaboration is its very definition.

So, thanks once again to the University of New Hampshire for sponsoring this trip. I will try to return the favor by continuing to write software that doesn’t require expensive licenses, interoperate poorly, or tie itself to the whims of a single corporate overlord.

That said, let’s convey some actual information…

My first two days will look like this:

Monday morning – yet another introductory git session. I know. I take these all the time and still use Subversion. So shoot me. This talk is delivered by a GitHub instructor. They evangelize quite well for git, as they should, and teach it very well too.

Monday afternoon – Optimizing Your Perl Development by Damian Conway. Can’t *wait* to see Damian again. This session has a good chance of being the highlight of my week.

Tuesday morning – Web Accessibility for the 21st Century. I am wondering, what year will “…for the 21st Century” stop seeming forward-looking? 2020? 2050? Is there a rule about this?

Tuesday afternoon – Backbone Workshop. Backbone is a Javascript framework to facilitate the writing of a fatter client in the browser while avoiding typical Javascript pain points.

Day 1

Getting Started with Git

I’m spending much of this session aping the exercises to ‘git’ some practice.  See what I did there?  Anyhow, that’s why I’m not writing much about this session, but here is a tiny detail about git (vs. SVN):

Ignoring files in SVN is done via the svn:ignore ‘property’; in git, you create a .gitignore file, which git looks at to know what to ignore. Add and commit .gitignore itself as well. I’m not sure which approach to prefer, but there they are.
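The git side of that, in miniature (repo name and ignore patterns invented for illustration):

```shell
# A throwaway repo to demonstrate (skip the first line in a real one):
git init -q ignore-demo && cd ignore-demo

# In git, ignore rules live in a versioned .gitignore file,
# unlike Subversion's per-directory svn:ignore property:
printf '%s\n' '*.swp' 'build/' > .gitignore
git add .gitignore            # track it like any other file

# check-ignore prints the path if it would be ignored:
git check-ignore foo.swp
```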

This is perhaps my third stint in a beginner’s git class.

Damian Conway’s Development Talk

Neat new tool from Damian called ‘polyperl’, which, in conjunction with Perlbrew, allows you to designate *exactly* which version of Perl your program runs, using the normal ‘env’ on the shebang line. It looks like:

#!/usr/bin/env polyperl
use 5.012;

Note that ‘use 5.012;’ *without* polyperl in the shebang would only ensure that the *minimum* version you run is 5.12. Specifying the actual version, no greater and no lesser? Polyperl! Someday Perl will have this natively, but live for today, people, and use the Perl, and nothing but the very Perl, you want.

[skipping a bunch of vim tricks and plugins here...]

Recommendation: Method::Signatures. Damian prefers it over Params::Validate. I’ll have to check to see if it can be used without dying on validation failures (checked: it appears it can). I should probably mention what these modules do. They help you validate that the variables being passed to a subroutine are correct in number, type, etc.
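For context, here’s what that validation looks like when done by hand in plain Perl: roughly the boilerplate these modules exist to eliminate (the sub and its checks are invented for illustration):

```perl
use strict;
use warnings;
use Carp qw(croak);
use Scalar::Util qw(looks_like_number);

# Hand-rolled argument validation -- the kind of thing
# Method::Signatures / Params::Validate automate away.
sub scale {
    croak 'scale() takes exactly 2 arguments' unless @_ == 2;
    my ($vector, $factor) = @_;
    croak 'first argument must be an array ref'
        unless ref $vector eq 'ARRAY';
    croak 'second argument must be a number'
        unless looks_like_number($factor);
    return [ map { $_ * $factor } @$vector ];
}

my $doubled = scale([1, 2, 3], 2);
print "@$doubled\n";              # 2 4 6
```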

Recommendation #2: Getopt::Euclid as a replacement for Getopt::Long (or any of the other 3,000 modules for writing CLIs on CPAN). This is one of Damian’s own. Cool thing about Getopt::Euclid? The way you define the incoming parameters is to *write the POD documentation* for your CLI. That’s right! Getopt::Euclid reads your POD and uses it as the interface declaration. Gorgeous. Can’t get away without documenting *that* now can you?

And now, I am laughing inside. Damian has his CPAN build/test/release process completely automated. I’ve done the same general type of work to release/deploy my own code to UNH production environments in a scripted, less error prone way. It’s pretty clear that years of reading and listening to Damian have affected me. I am now just as crazy as he is, if only 1/100th as clever.

Day 2

Web Accessibility

It’s now Tuesday morning and I’m in the web accessibility session mentioned above. One point that is being hammered home from the start: there is no standard that’s a silver bullet for accessibility. Many ‘official’ accessibility guidelines, even if met to the letter, don’t *actually* address the problems that users are having in accessing and understanding the content on your websites.

Also, a website which is accessible in one way, for a certain disability, is often totally at odds with accessibility for another disability. Those things are depressing, but they are the truth. If you’re designing, or redesigning, for accessibility, the best approach is to use real assistive technology tools.

NVDA for Windows (a screen reader for the blind) is particularly useful in that as a sighted person, you can turn off the audio and simply have the reader log the text that it *would* read in a separate window. This shows you the ‘flow’ of your page from a screen reader’s point of view, without the constant screen reading chatter. Screen readers are intensely annoying if you don’t absolutely need them.

One useful piece of advice: use page headings, but not to excess. Here we are talking about the h1, h2, etc. HTML elements. Use them as intended, as a general outline for your content, but don’t overdo them for design purposes or whatnot. A screen reader user will be using these to hop around to the different logical sections of your content.

A website highly recommended by the speakers: WebAIM.

Another key approach to accessibility testing is mouseless navigation. Stick to the keyboard and you’ll get a much better idea of how the elements flow on your pages. Use the ‘tabindex’ attribute to fix such flow issues.

Coming under severe criticism here: carousels and privileged links…

Carousels: carousels are rotating or self-changing images or content. These are apparently hell on accessibility, for an obvious reason: elements of the page are moving around on their own. We commit this sin on unh.edu itself, despite the very nice design. Perhaps we have something in place to mitigate this for those using assistive technologies? The experts here say to avoid them wholesale. Hey, I didn’t realize it either.

Privileged links: links that, when clicked on, do not grant access to the resources behind them (even with login). A typical message would be “you do not have permission to access this resource” or similar. Repeat offender? Sharepoint. I’ve always despised this. The geniuses at Microsoft probably call these ‘teasers’ or something. Don’t tease me, thanks.

Backbone.js

Backbone.js is an MVC (Model-View-Controller) Javascript framework.  There are approximately eight thousand of these coming out every week these days (slight exaggeration).  Aside from separation of concerns via the MVC approach, Backbone.js and other similar JS frameworks focus on moving state management to the client side.  This means getting away from the request/response model that even AJAX employs.  Instead, the client says ‘Saved!’, and *then* sends the data to the server.

I know.  Sounds dangerous.  But there are ways of handling errors with this approach.  The whole goal is raising the perceived performance of an app.  Milliseconds matter, which the speaker supports with such figures as: if Amazon pages load just 100ms slower than usual, they lose 1% of sales.  That’s compelling.

Another thing Backbone.js brings to Javascript programming is tighter scoping, something plain JS sorely lacks.  By default, you are generally searching the entire DOM (think: an entire webpage) for an HTML element that you’d like to work with.  Narrowing that scope means selectors are greatly simplified and you get further separation of concerns.

The speaker seems to be assuming that we studied for this or something.  The exercises are like Go!  … with no ‘Ready… Set…’.  I think the majority of folks here spend 90% of their time writing Javascript on similar frameworks already.  With about 10% of my time allotted for Javascript, I’d need to move a bit more slowly until I get comfy.  (Let me take a moment to express my envious hatred for natural polyglots who see new syntax and immediately parse it effortlessly. I hate you.  I’ve always had to work for it.)

Yeah… he’s literally explaining the exercise *after* we’re supposed to have finished it.  What?  I’m not psychic, man!

Still, it’s a good talk, and I’m picking up the spirit of things.  I can see that somewhere down the road, I’ll likely be using one of the 8,000 JS MVC frameworks.

Note: things like Backbone.js work at a slightly different layer of abstraction than jQuery, which is *also* called a Javascript framework.  In fact, Backbone.js works in conjunction with jQuery.  jQuery provides a better-looking way to write JS, while something like Backbone.js helps wrangle the overall structure of a large client-side application.  That’s a gross over-simplification, but… gross and simple are two things I do well.  :)

Day 3

The convention proper really starts on Day 3, today, with some relaxing keynote talks attended by all.  There are a couple thousand people here, ballpark.

First there was a neuroscience/AI guy who actually got me excited about the state of brain research for a few minutes.  Next up, a Facebook honcho.  No matter how much I may resent Facebook and distrust the current look of ‘social’, it can’t be denied that Facebook is a more modern company than, say, Microsoft or Oracle, in its disposition towards open source.

If you are reading this and remain a bit unsure about the open source approach, let me take a moment to explain why a company like Facebook would open some of its coding projects to the world.  Wouldn’t this reveal trade secrets and squander the intellectual property value they’ve created?  Not really, as it turns out.  The code they are open sourcing is attacking difficult problems such as the scaling of big data storage and delivery.  These problems are nowhere *near* perfectly solved.  By open sourcing these code projects, Facebook can attract meaningful contributions from other interested companies like Intel and Broadcom.  This collaboration raises all boats without diluting the core competencies of each company.

What’s even cooler about open source is that *even little guys like me* have access to that work.  Had I more entrepreneurial spirit, I could fuel a startup on the very code running some of Facebook’s systems today.  I could contribute my own R&D back to the project.

That’s how it works.  You might also check out this amazing open source effort for another perspective on the kinds of problems open source can help us all to solve together.

Back to the keynotes… just saw an incredibly futuristic demo of a flying drone being controlled by open source Clojure code.  This drone hovered as well as any Hollywood robot has ever hovered, could recognize images, stream video, recognize faces… you name it.  Audible “wows” in the audience, one of which was mine.

Final keynote (after a good one from In Bloom, who are here to open source their work) is Mark Shuttleworth, founder of Ubuntu.  He’s talking about Ubuntu’s one-OS (very Windows 8ish) approach to unifying user experience across all types of devices.  But of course, on Linux.  Now he’s talking about Juju, a tool for deploying and connecting various software infrastructure elements (think MySQL, Cassandra, MongoDB, WordPress, etc… whatever your cocktail may be).  Pretty amazing level of automation.

Shuttleworth just called Mac OS X (perhaps the primary Ubuntu competitor, if you think about it) “the gilded cage”.  Great line.  But gilded it remains, by comparison, for the time being…  and they know it.  He was in the process of announcing Juju for Mac OS X.

The first focused session of my day is about Asterisk, an open source PBX.  Asterisk has been around for quite some time and is very mature.  Although we are invested in Avaya at UNH, I myself can definitely benefit from any PBX-related knowledge, as a non-PBX-expert in the Telecom group.  It’s possible we could add Asterisk as a sort of sister system connected to the main PBX, to offer discrete features that we are otherwise unable to provide (or unable to provide cost effectively).  It does call queuing!  Neat.

I once wrote a CDR reporting app on top of Asterisk, almost ten years ago now.  It was not open sourced by the company I worked for.  Looking around, there is so much more available out there now.

The speaker is recommending Wireshark’s advanced VOIP diagnostic features.  Never realized Wireshark had stuff as specific as this.  Also ‘ngrep’, for ‘network grep’, a sort of filtering tool for tcpdumps similar to some of Wireshark’s functionality.  Now ‘sipp’ which is a SIP performance testing tool.  Now ‘sipsack’ for generating specialized SIP packets for troubleshooting purposes.  It’s good to know these tools are out there in case I’m ever asked to work on this stuff, which is always possible.

Of all new PBXs deployed in North America today, 16-18% of them are Asterisk.  One guy in the audience suggests it’s closer to 35%.  The FAA is looking at it as a possible solution to connect control towers.

Next up, a session on running your company’s internal applications using open source practices.  We are collaborating more and more across IT units at UNH as the desire for better-interconnected systems grows.  The speaker begins with how to deal with the necessary communication overhead involved in working together.  Some key concepts are transparency (work in the open, make your source code examinable across organizational units, govern projects in the open), quality (code reviews, good testing), and community (points of contact outside your primary team).  This last point, she illustrates, can lead to good people staying longer at a company because they have connections across business units.

The central concept is “internally open source”.  Yes, please!  We would gain ridiculous efficiencies if we did this, I think.  Silos are full of nothing but corn.  :P   I am sure we have countless shared needs that would be opportunities to work smarter, not harder.  A small DevOps swat team could do wonders in this area if we could dedicate the resources.  The speaker is expounding on that very thing: a core team which “owns” (think of ownership loosely) all centralized code for an organization.

Note that the above is not geared towards homogenizing our various codebases.  Different coding standards and technology choices can be applied on a project-by-project basis.  We can continue to pursue diverse approaches while doing this.  It’s important to preserve the evolutionary advantage of using diverse technologies while centralizing certain efforts.  Yes, this is a challenging balance to achieve, but the first step is to be mindful of it.

Belly full of lunch now, the next session for me is Randal Schwartz’s Half My Life With Perl.  My four planned afternoon sessions are in fact all on the Perl track.  That is part of what’s great about OSCON: you can fill your own schedule with talks most pertinent to your work or interests.  As always there is a strong contingent here for Perl.  After all, OSCON started off as just The Perl Conference way back when.

This autobiographical talk from Randal is a bit unusual as these talks go, but, very interesting to us Perl folk.  Perl has a few more years of history behind it than most of the projects here.

Next up: Start Contributing to Perl, It’s Easy!  This is a good overview from a relatively (2007) new Perl community member.  I have a grand total of 1 patch accepted on CPAN, but that’s one more than I did last year at this time.

I am pumped for the next session: Carton: Manage CPAN Dependencies Without The Mess.  Carton is a tool I am hoping to use within the next year in conjunction with Perlbrew, cpanminus aka cpanm (also written by the speaker), and Damian’s polyperl, mentioned way up above.  Armed with all these goodies I hope to create my first set of discrete Perl stacks on the same box, each running their own chosen version of Perl core, and its own CPAN module dependencies right down to the module versions.  This panacea has eluded me for some time, but luckily Miyagawa is hard at work to make my life better.

“Dependencies are part of the app.”  Yup!

Note to self: run ‘carton check’ as part of the IX::update() routine, and if there are installations to perform, run ‘carton install --deployment’ or ‘carton install --cached --deployment’.  This is going to be like butter!  Mmmmmm.

Day 4

First session for me is Evolutionary Architecture and Emergent Design, which sounds fancy.  The speaker has an interesting graph up showing the relative complexity (complexity-per-line) of a code base over time.  Adding features tends to increase complexity; refactoring (paying off the technical debt) lowers it.  He makes some good arguments for component-based architecture and against over-engineering early in a project.  A high-level talk, but engaging.

Next up, Dealing With Multiple Types of Input in HTML5 and Javascript Code.  A good reminder to start using the HTML5 input types that touch browsers will offer different keyboard options for: ‘email’, ‘tel’, ‘url’, etc.  But the talk focuses on ‘pointer events’, which are very much like mouse events except you can get fine-grained control over touch, pen, or mouse inputs specifically; for instance, tweaking gameplay dynamics in an HTML5 game based on the type of input being used.  There are also ways to test for touch pressure and the orientation of the device.  So this is basically the next generation of event handlers, which account for the new generation of input devices.

Belly full of lunch, I will now be hearing about The Perl Renaissance from Paul Fenwick.  Paul is a fun Australian who can often be seen in a funny hat.  He is reminding us of some great tools such as cpanm, perlbrew, and Dist::Zilla (for CPAN authors).  Now some talk of Moose and Moo, and another positive word this conference for Method::Signatures.  Now a reminder about a feature available as of Perl 5.10: named captures from regular expressions (a good description of these can be found here).  This frees you from having to use ordinality in capturing string matches from regexes.
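A quick illustration of named captures (example mine): the match results land in %+ under the names you chose, instead of positional $1, $2, $3.

```perl
use strict;
use warnings;

# Named captures, available since Perl 5.10:
my $date = '2013-07-26';
if ($date =~ /^(?<year>\d{4})-(?<month>\d{2})-(?<day>\d{2})$/) {
    print "year=$+{year} month=$+{month} day=$+{day}\n";
}
```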

And now for Ricardo’s (the Perl Pumpking) update on Perl 5.  This is really the Perl 5.18 changes update I saw last month, some of which I covered in my Reports on YAPC::NA 2013.  Can’t wait to get my stack upgraded.  But, one step at a time.

Day 5

Thank goodness they schedule Friday to break earlier in the afternoon, because after 5 days of conferencing I really am shot. But today does have my most anticipated session, the Damian Conway Channel. He has a new module called Running::Commentary which is a nice way to write scripts that have lots of system commands. I could see using this for something like Oracle database backup scripting. Lexical::Failure is a way to give your module users a choice of return types upon failure, rather than imposing an undef, a die, or anything. Lexical::Hints is an advanced way to implement debug statements, which could even be code references. Lingua::EN::Grammarian is a fairly ambitious attempt to find grammar (not spelling… grammar) mistakes in English text.  He’s got it plugged into vim.  Wow!

Next up: BASH As A Modern Programming Language from an eBay guy.  He is explaining why eBay selected BASH to write a utility to set up user environments to run their specialized web framework (surprise!  eBay has a custom web framework).  Reason #1 is portability as expected, BASH being the default shell in all major OSes except Windows, which can run it via Cygwin… so no problem there either.  As a shell, it easily calls other binaries (perhaps even the runtime of other languages) and all languages have a way to call back out to the shell.  This talk is reminding me how badly I want to avoid ever trying any serious programming in BASH.  Gosh, what a clunky and confusing language.  Tip of the hat to Mario Malizia who has worked wonders with BASH in the creation of CMU (ECG’s venerable Code Management Utility).

My final session of the conference is on Secure Open Source Development, from a Red Hat guy.  This is a high-level discussion of how to communicate about security issues proactively (before release) in the open source development cycle.  Red Hat in particular has a huge challenge because they are pulling hundreds of projects from various communities into their releases.  He makes the fairly obvious point that things like static code analysis (programs that analyze code for security) are the future of the field.  These tools exist in the present but are far from perfect; manual audits remain necessary.  Unsurprisingly, security hasn’t become a whole lot easier in 2013 than it was in 2012.

Well, that concludes my visit to OSCON this year.  As usual the week flew by.  The hardest part was paying attention to each session while dying to try out something I’d learned in the previous one.  So now I’m taking a little time to play around with cpanm/perlbrew/polyperl/local::lib before lunch and perhaps a dip in the hotel.  Life is good!

If you read this far, congratulations, you’re a nerd.  If you’re *that* interested in this stuff then I hope you’ll consider going to OSCON yourself next year.

Reports from YAPC::NA 2013

Here I am again at YAPC (Yet Another Perl Conference), which I pledge to blog, as always, to the extent allowed by my attention span. I arrived here in Austin, Texas yesterday. The birds here are loud and aggressive, and the weather is nice.

YAPC::NA 2013

If you’d prefer not to read my drivel and instead eavesdrop on the conference itself, check out the live feeds.

Day 1

The keynote speaker this morning is reviewing 25 years of Perl, the last 15 of which I’ve been along for the ride. Time flies. I am currently forgiving myself for writing my own web framework in Perl, being reminded that Catalyst (probably the leading Perl framework today) was only three years old and not particularly mature when I was looking for such a solution.

First focused talk: Bruce Gray, “Exception to Rule”, about… yup, exceptions, in lieu of returned errors, for error handling. He’s talking about the frustration of a module throwing ‘die’, and of course, the utility of ‘eval’. The bottom line, however, is that the Perl 5 core still does not boast a consistent error-handling approach.

“It is almost always better to die than to give the wrong information.” (which includes silent failure)

Another recommendation for autodie (wraps all core IO keywords to return exceptions rather than whatever they do natively) as well as Try::Tiny (for more advanced handling).
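For context, the bare-bones core mechanism those modules build on is eval/die (example mine, with a made-up risky() sub): die() throws, eval BLOCK catches, and $@ holds the error.

```perl
use strict;
use warnings;

# Core-Perl exception handling, no modules required.
sub risky {
    my ($n) = @_;
    die "negative input\n" if $n < 0;
    return sqrt $n;
}

my $result = eval { risky(-1) };
if (my $err = $@) {
    print "caught: $err";          # caught: negative input
}
print risky(9), "\n";              # 3
```

Try::Tiny exists because bare $@ handling has subtle pitfalls (it can be clobbered between the eval and the check); this is the naive baseline.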

When writing a module, however, it should be the user’s (the higher-level programmer’s) choice what style of error handling to use, and for this Damian Conway will be revealing a Lexical::Failure module which module authors will be able to use to offer this choice to their users. Apparently this will be introduced at OSCON.

Next up, John Anderson on how to Automate Yo Self. First he offers a common suggestion to manage your home directory and shell config with source control. He’s also written a few tools such as App::GitGot, for help managing multiple git repositories. These are some fairly specific tools which I won’t list out here. Lastly he recommended tuning your text editor to save you time.

After lunch, I am listening to Curtis Poe on his new module Test::Class::Moose. I’m not a Moose user yet (translation: I don’t always use objects in Perl, but when I do, I don’t use Moose. Stay thirsty, my procedural friends.). However, I am very likely to encounter Moose in the future, so this might come in handy. People have been using Test::Class to test Moose, but this new module irons out a lot of the cruft and edge cases of using plain Test::Class.

Now, Bill Humphries on Perl Meets Modern Web UI. He’s demonstrating the idea of a ‘single-page website’, which means that after an initial full page load, the rest of the calls are AJAX. I’ve never really done this… I use AJAX opportunistically when it makes sense… and to be honest I don’t think the pros of ‘single-page website’ architecture quite outweigh the cons (I realize I am omitting most of both here, since I can’t type that fast). The one compelling reason he did provide was for situations where you want to offload much of the work to the client, because your server (example: cPanel in shared hosting environments) has limited resources and needs to use them sparingly.

Time to finish off the day with Larry Wall’s keynote, which he has entitled “Stranger Than Fact”. Good quote: “We’re not only stranger than we imagine. We’re stranger than we can imagine.” Larry is the creator of Perl.

One thing I didn’t imagine was that Larry would announce he has prostate cancer. Here’s hoping he makes it out of this ok. There is much love in the room for this man. This is the cult of TIMTOWTDI (“There Is More Than One Way To Do It”), the cult of personal freedom in computing, and he is inarguably our leader. He spoke of his cancer, of programming language design, of the emerging codes of conduct at tech conferences. None have articulated my feelings about this latter better than Larry, who contrasted Law (the codes) with Grace. Have a big soul, he says (I paraphrase)… big enough that they can grind away as much as they want and you’ve still got more. That’s Grace.

I’ve rarely, if ever, encountered a person with such interdisciplinary reach in his worldview, someone not only with a deep technical mastery, but the ability to connect this with philosophy, cosmology, spirituality. Pfft, you say, I’ve seen that in a TED talk. No, you haven’t. Larry pulls this off with a down to earth humility that’ll make you feel both unworthy and totally welcome at the same time. All I can say is, long live the King! Long live Larry Wall and Perl.

Okay… there were also lightning talks after Larry. I’m no lightning typist, so, I give you this single highlight:

“Sufficiently encapsulated ugly is indistinguishable from beautiful.” -Matt Trout

Day 2

To start off the second day, I am in a session called Hack Your Mac With Perl by Walt Mankowski. First we are covering the OS X concept of ‘services’, which govern inter-application communication. These are not to be confused with Windows services; that’s a different mechanism entirely. You download an app called ThisService, which aids in service creation. I can see myself using this. In the end, you get a Perl script that is key-bindable to process highlighted text. Nice.

Getting rid of services is tricky; try something called ‘Service Scrubber’ for this.

Next he is talking about ‘FSEvents’. This is what tracks changing files/directories for apps like Spotlight and Time Machine. For this we use a module called Mac::FSEvents. Triggering a Perl script when something in the filesystem changes (an ‘upload this’ directory, for instance); I could see myself using this someday too. Nice talk.

Now it’s time for Tim Bunce’s talk about Profiling Memory Usage (in Perl). This is a highly technical session. I’m no expert on Perl internals but I like to hang around and pretend I know what’s up. Tim has written a module called Devel::SizeMe. Wow. He’s actually doing graph visualizations of nested data structures and the memory allocated for each element of each array, each hash key/value pair, etc. Also subroutines! Dizzying levels of detail here. It’s amazing to visualize the amount of memory/pointers that are set up by the interpreter even for a Perl process that does exactly nothing.

Now Liz is presenting an overview of offshoot projects across Perl history entitled Perl’s Diaspora. This covers Parrot, Perl 6, Rakudo, various VM projects targeting both Perl 5 and Perl 6, etc. Most of this is not worth recounting here despite being a great summary of the many and varied Herculean efforts by some of the smartest members of this community. If you have an interest in this stuff, you have probably already been following it.

Next up: Unicode Best Practices, a talk from Nick Patch. Unicode is notoriously difficult to work with, but you’ll have to if your applications need to be internationalized. Perl has some of the best Unicode support among programming languages. First off, put “use utf8” at the top of your program, which indicates that your source code itself uses the UTF-8 encoding. This frees you from having to use escape sequences and external reference tables. This is perhaps how I ought to approach a problem we have at UNH with our SOAP services; XML loves to barf on invalid characters, and much of the data we’re shucking is user-inputted.

Note: with utf8 on, don’t use \d in regular expressions; use [0-9] if you’re really only looking to match standard ‘western’ digits. Because yes, even digits look different in many languages.

The property matcher \p (or non-matcher \P) can be used to match, or not match, ASCII like so: \p{ASCII}. Super useful, there. \p{L} stands for letter, and there are a number of incredibly useful property matches (such as for currency symbols, etc.).
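To convince myself of the distinctions above, here’s a little self-contained demo (my own example, not Nick’s slides) pitting \d, [0-9], and the \p/\P property matchers against Devanagari digits:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use utf8;   # source is UTF-8 (all-ASCII here, but it sets the habit)

my $ascii = "123";
my $hindi = "\x{0967}\x{0968}\x{0969}";   # Devanagari digits 1, 2, 3

# \d matches any Unicode digit, so both strings pass...
print "ascii matches \\d\n" if $ascii =~ /\A\d+\z/;
print "hindi matches \\d\n" if $hindi =~ /\A\d+\z/;

# ...but [0-9] matches only the 'western' digits.
print "hindi matches [0-9]\n" if $hindi =~ /\A[0-9]+\z/;   # does not print

# Property matchers: \p{...} matches a property, \P{...} its complement.
print "hindi is non-ASCII\n" if $hindi =~ /\A\P{ASCII}+\z/;
print "A is a letter (\\p{L})\n" if "A" =~ /\A\p{L}\z/;
```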

Although internationalization is obviously a challenging aspect of programming, I really hope I get to tackle it at some point. Breaking down language barriers and unwinding that whole Tower of Babel (Babble? heh.) thing just seems like a noble application of labor. Babble on! (Babylon?)

After my first (I know! I’m sheltered…) meal at an Ethiopian restaurant (yummy) with a couple fellow hackers, I am now at Auditing Open Source Perl Code for Security by John Lightsey. Not seeing any code yet; so far, this is high-level “how to plan a security audit” material. Now he has moved on to the question of, once you have discovered a security vulnerability in a piece of open source code, what the various options for disclosure are (along with some less ethical options involving non-disclosure). So, this is more of a community-management talk than a technical one.

Wait, no… now he’s showing some of the vulnerability reports he’s given to CPAN module authors, some fixed, some not. Glad to see I’m not using any of the unpatched modules.

Now Karen Pauley will give an update on what The Perl Foundation has been up to in the past year. This was mostly a review of grant applications and money distribution, which I won’t recount, but various Perl efforts are always open to donations. Considering the extensive use of Perl in our systems, I would love to see UNH consider throwing a few bucks in this direction.

Onward we go… now listening to the ever-entertaining Matt Trout on Architecture Automation, One Alligator at Once. The thrust of this talk: when you are hired to consult on a codebase that needs an overhaul, especially when all knowledge of said codebase has left the building, what investigations should you apply to learn what in heck you’re dealing with? Dist::Surveyor is a notable mention here; it will tell you, as best it can, what your entire module dependency list is.

Finally, today’s keynote from Stevan Little is Perl – The Detroit of Scripting Languages. His main point is that when the Perl 6 design and implementation process began in 2000, Perl 5 development stalled. This is largely due to a not-entirely-wrongheaded commitment to preserving backward compatibility. But that can make it pretty tough to move forward, too.

Day 3

I’m starting off my third conference day with A Date With Perl (great title), a talk from Dave Rolsky who maintains the DateTime suite of modules. I saw Dave speak at YAPC::NA 2007 on this topic as well; the guy deserves a Presidential Medal of Honor for doing this work. When you realize that not only are there leap years, but leap seconds, and when you are told that there is an ‘EST’ time zone in both North America and Australia, and knowing that time zone and daylight savings time changes are at the discretion of politicians… you start to get an appreciation for Dave Rolsky. Don’t try to do datetime calculations by yourself, EVER, use a library like DateTime.pm, so you can get the benefit of all the hard work that’s been done for you.

And by the way, the date- and time-related edge cases listed above are just the tip of the iceberg. There are hundreds if not thousands of weird exceptions to the rules that govern timekeeping. Don’t go it alone.

Dave’s jokes are hilarious; too bad it’s so early for a lot of the people here. These deserve more laughs.

Side note: I am sure a lot of people at this conference hold degrees and advanced degrees in computer science, but I haven’t actually spoken to one yet. Dave was a music major, I’ve been chumming around with a guy (about twice as smart as me) who never went to college, met another guy who majored in theatre like I did, etc. I love this field. I do work for a University, but clearly even UNH acknowledges it needs more IT help than it can find in credentialed applicants.

Next up: Unit-test CGI Scripts with mod_perl2 via Plack by Nathan Gray. I sorely need to understand Plack better, as it likely has a future in my stack. Unfortunately this talk is addressing using Plack for testing existing CGI scripts, and is not enlightening me in the ways that I need.

Now Sawyer X is speaking on Asynchronous Programming FTW (“For The Win”). This is a talk about event loops, although forking and threading are similar options for parallel processing. Sawyer X prefers the AnyEvent module (there are many choices) for handling event loops, due to its slim interface (as opposed to POE, which requires more lines of code to do the same thing). Hmmm… there is also an AnyEvent::XMLRPC, which I could possibly use. I’ve never really considered that my XMLRPC services could be blocking, but of course they do block. They’re just so darn fast in general that I haven’t seen the need to optimize them. And I still won’t… but I might add some benchmarking code to the services themselves to see if they ever block for as long as a second or half-second, in which case AnyEvent::XMLRPC could come to the rescue. Because let’s face it: calls to a service from different clients are going to be asynchronous by definition.

The next session for me is Inside Bokete: Tips of making web applications with Mojolicious and other components by Yusuke Wada, who has traveled here from Yokohama, Japan. Mojolicious has views and controllers, but no object model (I didn’t know this; my own web framework is similar in this way). He prefers DBIx::Skinny over DBIx::Class, since in Mojo you get to choose if and how to add your ORM (Object-Relational Mapper). He also uses Carton, which is still labeled as experimental, but many of us are so desperate for a method of pinning down CPAN module versions in our applications that we just might experiment. He also uses an interesting deploy-from-git module that is very like the work I’ve been doing to deploy from Subversion. Good talk, funny guy. You have to respect all the people here for whom English is not a first language. Coding is challenging enough; imagine if all the keywords, documentation, etc. were not in English.

After lunch, Ricardo Signes, the current Perl 5 pumpking, is bringing us up to date on the language; his talk is entitled Perl 5: Postcards from the Edge. Something coming soon in a Perl 5 release sounds like a great idea: lexical subroutines. In other words, you can do my sub foo {} or our sub foo {} and scope the availability of your subroutines. Forgive me if I’ve gotten this wrong, because I’m no Java guy, but I believe this is roughly the feature you get with public vs. private vs. protected when you are defining a Java class. Please feel free to correct me in the comments because it’s likely these things are not 100% analogous.
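Since I only heard about this feature from the stage, here’s a hedged sketch of how I understand it would look, written for a modern perl (it landed experimentally in 5.18 and became stable in 5.26); the sub name is my own:

```perl
#!/usr/bin/perl
use v5.26;   # lexical subs are stable as of 5.26; also enables say()
use strict;
use warnings;

{
    # Declared with 'my', this sub is visible only inside the enclosing
    # block -- roughly like a private method in Java.
    my sub greet { return "hello from a lexical sub" }
    say greet();
}

# Out here, greet() is not declared lexically, so Perl would look for a
# package sub &main::greet at runtime and die: "Undefined subroutine".
```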

Now Joe Axford is giving his Notes From A Newbie. It’s always impressive when a newbie looks to be about 20 years my senior… learning is a lifelong pursuit! What a positive and energetic dude. I aspire.

Now I’ve somewhat accidentally landed in Perl 6 Debugger Highlights, from Jonathan Worthington, which will likely be several meters over my head. But that’s ok. Just by looking up to try and see these things, I tend to get a little smarter. [... much interactive debugging on Perl 6 with jokes I wish I understood from Larry and Patrick in between ...] These guys are smoking my neurons. Yowza.

Day 4 and Day 5

I will not be blogging days 4 and 5, as I am in a class entitled Web and Mobile Development Using Perl, HTML5, CSS3, and JavaScript, taught by Gabor Szabo.

jQuery is(), not(), and ! is()

jQuery lovers: this little distinction got me this morning, even though I should know it already. I’d explain it here, but the linked article does a fine job.

Addendum: is it still worth using a JS abstraction library like jQuery? Yes.

Help! I’m on OS X and My Mouse Won’t Work!

I am writing this post in hopes that the next poor Google searcher… if they can search at all… finds the solution quicker than I did. Sure, it only took me about a half hour, but that’s a lifetime these days.

So my mouse stops working on the MacBook Pro. Oh, I can move it around alright. I can move it around all day and night. I just can’t frakking click. The left-click won’t work but the right-click (I have a two button Bluetooth mouse) will. It’s amazing how quickly one realizes that Right Click isn’t fit to iron Left Click’s pants in the morning, when Left Click isn’t working.

OK. This is just something wrong with that Bluetooth mouse, right? So I try the trackpad. Same behavior… I can move that cursor around wherever I like, but clicking is futile. If you think this won’t make a grown man cry, you weren’t in my basement tonight.

Luckily, my keyboard still worked. So with a little ALT-TAB I could get into my browser, tab around to the location field (thanks Chrome!) and do a Google search on this shiz.

Lots of talk about Bluetooth devices… Bluetooth devices in the next room, Bluetooth devices in the work bag, Bluetooth devices leaning up against stuff with their buttons getting pushed.

Makes sense. So I turn off the power on my Bluetooth mouse. No difference. I yank the USB piece that talks to the Bluetooth mouse. No difference… the trackpad still produces barren clicks. So I yank every peripheral I have connected… FireWire hard drives, USB hub, audio system, second display. NO DIFFERENCE. A ZOMBIE IS LIVING IN MY MAC AND FEEDING ON ITS LEFT CLICK LIKE BRAINS.

The solution I finally arrived at after more searching (thank you, keyboard, I love you) was to disable Bluetooth entirely at the command line. This is the same thing as unchecking ‘On’ in System Preferences -> Bluetooth, but try doing THAT with Left Click pacing the picket line.

COMMAND-SPACE got me into Spotlight, where I typed “Terminal”, arrowed down and hit ENTER. Now at the command line, I typed:

sudo defaults write /Library/Preferences/com.apple.Bluetooth "ControllerPowerState" -bool FALSE

That turned Bluetooth OFF… sort of. That turns the preference off, but to kill any existing Bluetooth connections dead, a little more magic is required:

sudo killall -SIGHUP blued

And with that: my trackpad was restored to normal. I’m pretty happy with my trackpad at this very moment, but I’m sure I’ll start using Bluetooth devices again… with the above experience in the back of my mind.

Q: How Many Developers Does It Take To Screw In A Light Bulb?

A: However many are needed to program the new Philips iOS SDK for light bulbs…
