Guess who and guess what? That’s right: it’s Marcus once again, at OSCON 2013, the open source convention held annually in Portland, Oregon. There is no other gathering where you can find such a diversity of computer language and tooling expertise. To say that the open source world has a richer, more collaborative programming community than the commercial world isn’t an exaggeration: it’s practically the definition.
So, thanks once again to the University of New Hampshire for sponsoring this trip. I will try to return the favor by continuing to write software that doesn’t require expensive licenses, isn’t poorly interoperable, and isn’t tied to the whims of a single corporate overlord.
That said, let’s convey some actual information…
My first two days will look like this:
Monday morning – yet another introductory git session. I know. I take these all the time and still use Subversion. So shoot me. This talk is delivered by a GitHub instructor. They evangelize quite well for git, as they should, and teach it very well too.
Monday afternoon – Optimizing Your Perl Development by Damian Conway. Can’t *wait* to see Damian again. This session has a good chance of being the highlight of my week.
Tuesday morning – Web Accessibility for the 21st Century. I am wondering, what year will “…for the 21st Century” stop seeming forward-looking? 2020? 2050? Is there a rule about this?
Getting Started with Git
I’m spending much of this session aping the exercises to ‘git’ some practice. See what I did there? Anyhow, that’s why I’m not writing much about this session, but here is a tiny detail about git (vs. SVN):
Ignoring files in SVN is done via the svn:ignore ‘property’; in git, you create a .gitignore file, which git looks at to know what to ignore. Add and commit .gitignore itself as well. I’m not sure which approach to prefer, but there they are.
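For the git side, here’s a minimal sketch in a scratch repo (file names made up for illustration):

```shell
# scratch repo to show .gitignore in action
cd "$(mktemp -d)"
git init -q demo && cd demo
printf '*.log\nbuild/\n' > .gitignore   # patterns git should ignore
touch debug.log kept.txt
git add .                               # stages .gitignore and kept.txt
git status --porcelain                  # debug.log is absent from the output
```

The rough SVN equivalent would be `svn propset svn:ignore '*.log' .`, which lives in repository metadata rather than in a versioned file — which is exactly the difference noted above.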
This is perhaps my third stint in a beginner’s git class.
Damian Conway’s Development Talk
Neat new tool from Damian called ‘polyperl’, which, in conjunction with Perlbrew, allows you to designate *exactly* which version of Perl your program runs, using the normal ‘env’ trick on the shebang line. It looks like:
#!/usr/bin/env polyperl
use 5.012;
Note that ‘use 5.012;’ *without* polyperl in the shebang only ensures that the *minimum* version you run is 5.12. Specifying the actual version, no greater and no less? Polyperl! Someday Perl will have this natively, but live for today, people, and use the Perl and nothing but the very Perl you want.
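The reason the env trick works at all: ‘env’ searches your PATH for the named command, so a version-picking wrapper like polyperl can live anywhere on your PATH. The same mechanism, sketched with plain sh standing in for polyperl:

```shell
cd "$(mktemp -d)"
# env resolves 'sh' via PATH, exactly as it would resolve 'polyperl'
printf '#!/usr/bin/env sh\necho "interpreter found via PATH"\n' > demo
chmod +x demo
./demo   # prints: interpreter found via PATH
```

With Perlbrew in the mix, the wrapper can then pick whichever installed perl matches the version the script asks for.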
[skipping a bunch of vim tricks and plugins here...]
Recommendation: Method::Signatures. Damian prefers it over Params::Validate. I’ll have to check to see if it can be used without dying on validation failures (checked: it appears it can). I should probably mention what these modules do. They help you validate that the variables being passed to a subroutine are correct in number, type, etc.
Recommendation #2: Getopt::Euclid as a replacement for Getopt::Long (or any of the other 3,000 modules for writing CLIs on CPAN). This is one of Damian’s own. Cool thing about Getopt::Euclid? The way you define the incoming parameters is to *write the POD documentation* for your CLI. That’s right! Getopt::Euclid reads your POD and uses it as the interface declaration. Gorgeous. Can’t get away without documenting *that* now can you?
And now, I am laughing inside. Damian has his CPAN build/test/release process completely automated. I’ve done the same general type of work to release/deploy my own code to UNH production environments in a scripted, less error prone way. It’s pretty clear that years of reading and listening to Damian have affected me. I am now just as crazy as he is, if only 1/100th as clever.
It’s now Tuesday morning and I’m in the web accessibility session mentioned above. One point that is being hammered home from the start: there is no standard that’s a silver bullet for accessibility. Many ‘official’ accessibility guidelines, even if met to the letter, don’t *actually* address the problems that users are having in accessing and understanding the content on your websites.
Also, a website that is accessible in one way, for a certain disability, is often at odds with being accessible for another disability. Those things are depressing, but they are the truth. If you’re designing, or redesigning, for accessibility, the best approach is to test with real assistive technology tools.
NVDA for Windows (a screen reader for the blind) is particularly useful in that as a sighted person, you can turn off the audio and simply have the reader log the text that it *would* read in a separate window. This shows you the ‘flow’ of your page from a screen reader’s point of view, without the constant screen reading chatter. Screen readers are intensely annoying if you don’t absolutely need them.
One useful piece of advice: use page headings, but not to excess. Here we are talking about the h1, h2, etc. HTML elements. Use them as intended, as a general outline for your content, but don’t overuse them for design purposes or whatnot. A screen reader user will be using these to hop around to the different logical sections of your content.
A website highly recommended by the speakers: WebAIM.
Another key approach to accessibility testing is mouseless navigation. Stick to the keyboard and you’ll get a much better idea of how the elements flow on your pages. Use the ‘tabindex’ attribute to fix such flow issues.
Coming under severe criticism here: carousels and privileged links…
Carousels: carousels are rotating or self-changing images or content. These are apparently hell on accessibility, for an obvious reason: elements of the page are moving around on their own. We commit this sin on unh.edu itself, despite the very nice design. Perhaps we have something in place to mitigate this for those using assistive technologies? The experts here say to avoid them wholesale. Hey, I didn’t realize it either.
Privileged links: links that, when clicked on, do not grant access to the resources behind them (even with login). A typical message would be “you do not have permission to access this resource” or similar. Repeat offender? Sharepoint. I’ve always despised this. The geniuses at Microsoft probably call these ‘teasers’ or something. Don’t tease me, thanks.
I know. Sounds dangerous. But there are ways of handling errors with this approach. The whole goal is raising the perceived performance of an app. Milliseconds matter, which the speaker supports with figures like: if Amazon pages load just 100ms slower than usual, they lose 1% of sales. That’s compelling.
Yeah… he’s literally explaining the exercise *after* we’re supposed to have finished it. What? I’m not psychic, man!
Still, it’s a good talk, and I’m picking up the spirit of things. I can see that somewhere down the road, I’ll likely be using one of the 8,000 JS MVC frameworks.
The convention proper really starts on Day 3, today, with some relaxing keynote talks attended by all. There are a couple thousand people here, ballpark.
First there was a neuroscience/AI guy who actually got me excited about the state of brain research for a few minutes. Next up, a Facebook honcho. No matter how much I may resent Facebook and distrust the current look of ‘social’, it can’t be denied that Facebook is a more modern company than, say, Microsoft or Oracle, in its disposition towards open source.
If you are reading this and remain a bit unsure about the open source approach, let me take a moment to explain why a company like Facebook would open some of its coding projects to the world. Wouldn’t this reveal trade secrets and squander the intellectual property value they’ve created? Not really, as it turns out. The code they are open sourcing is attacking difficult problems such as the scaling of big data storage and delivery. These problems are nowhere *near* perfectly solved. By open sourcing these code projects, Facebook can attract meaningful contributions from other interested companies like Intel and Broadcom. This collaboration raises all boats without diluting the core competencies of each company.
What’s even cooler about open source is that *even little guys like me* have access to that work. Had I more entrepreneurial spirit, I could fuel a startup on the very code running some of Facebook’s systems today. I could contribute my own R&D back to the project.
That’s how it works. You might also check out this amazing open source effort for another perspective on the kinds of problems open source can help us all to solve together.
Back to the keynotes… just saw an incredibly futuristic demo of a flying drone being controlled by open source Clojure code. This drone hovered as well as any Hollywood robot has ever hovered, could recognize images, stream video, recognize faces… you name it. Audible “wows” in the audience, one of which was mine.
Final keynote (after a good one from inBloom, who are here to open source their work) is Mark Shuttleworth, founder of Ubuntu. He’s talking about Ubuntu’s one-OS (very Windows 8ish) approach to unifying user experience across all types of devices. But of course, on Linux. Now he’s talking about Juju, a tool for deploying and connecting various software infrastructure elements (think MySQL, Cassandra, MongoDB, WordPress, etc… whatever your cocktail may be). Pretty amazing level of automation.
Shuttleworth just called Mac OS X (perhaps the primary Ubuntu competitor, if you think about it) “the gilded cage”. Great line. But gilded it remains, by comparison, for the time being… and they know it. He then announced Juju for Mac OS X.
The first focused session of my day is about Asterisk, an open source PBX. Asterisk has been around for quite some time and is very mature. Although we are invested in Avaya at UNH, I myself can definitely benefit from any PBX-related knowledge, as a non-PBX-expert in the Telecom group. It’s possible we could add Asterisk as a sort of sister system connected to the main PBX, to offer discrete features that we are otherwise unable to provide (or unable to provide cost effectively). It does call queuing! Neat.
I once wrote a CDR reporting app on top of Asterisk, almost ten years ago now. It was not open sourced by the company I worked for. Looking around, there is so much more available out there now.
The speaker is recommending Wireshark’s advanced VOIP diagnostic features. Never realized Wireshark had stuff as specific as this. Also ‘ngrep’, for ‘network grep’, a filtering tool for live traffic and tcpdump captures, similar to some of Wireshark’s functionality. Then ‘sipp’, a SIP performance-testing tool, and ‘sipsak’, which generates specialized SIP packets for troubleshooting purposes. It’s good to know these tools are out there in case I’m ever asked to work on this stuff, which is always possible.
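From memory, the sort of invocations involved look like this (the flags are my best recollection, not from the slides, and the capture file and IP address are made up — check the man pages before trusting any of it):

```shell
# watch live SIP traffic on any interface, one header per line
ngrep -d any -W byline port 5060

# summarize SIP messages in a saved capture with Wireshark's CLI companion
tshark -r calls.pcap -q -z sip,stat

# replay a canned UAC call scenario against a test endpoint
sipp -sn uac 10.0.0.5:5060
```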
Of all new PBXs deployed in North America today, 16-18% of them are Asterisk. One guy in the audience suggests it’s closer to 35%. The FAA is looking at it as a possible solution to connect control towers.
Next up, a session on running your company’s internal application development using open source practices. We are collaborating more and more across IT units at UNH as the desire for better interconnected systems grows. The speaker begins with how to deal with the necessary communication overhead involved in working together. Some key concepts are transparency (work in the open, make your source code examinable across organizational units, govern projects in the open), quality (code reviews, good testing), and community (points of contact outside your primary team). This latter point, she illustrates, can lead to good people staying longer at a company because they have connections across business units.
The central concept is “internally open source”. Yes, please! We would gain ridiculous efficiencies if we did this, I think. Silos are full of nothing but corn. :P I am sure we have countless shared needs that would be opportunities to work smarter, not harder. A small DevOps swat team could do wonders in this area if we could dedicate the resources. The speaker is expounding on that very thing: a core team which “owns” (think of ownership loosely) all centralized code for an organization.
Note that the above is not geared towards homogenizing our various codebases. Different coding standards and technology choices can be applied on a project-by-project basis. We can continue to pursue diverse approaches while doing this. It’s important to preserve the evolutionary advantage of using diverse technologies while centralizing certain efforts. Yes, this is a challenging balance to achieve, but the first step is to be mindful of it.
Belly full of lunch now, the next session for me is Randal Schwartz’s Half My Life With Perl. My four planned afternoon sessions are in fact all on the Perl track. That is part of what’s great about OSCON: you can fill your own schedule with talks most pertinent to your work or interests. As always there is a strong contingent here for Perl. After all, OSCON started off as just The Perl Conference way back when.
This autobiographical talk from Randal is a bit unusual as these talks go, but, very interesting to us Perl folk. Perl has a few more years of history behind it than most of the projects here.
Next up: Start Contributing to Perl, It’s Easy! This is a good overview from a relatively new (since 2007) Perl community member. I have a grand total of 1 patch accepted on CPAN, but that’s one more than I had at this time last year.
I am pumped for the next session: Carton: Manage CPAN Dependencies Without The Mess. Carton is a tool I am hoping to use within the next year in conjunction with Perlbrew, cpanminus aka cpanm (also written by the speaker), and Damian’s polyperl, mentioned way up above. Armed with all these goodies I hope to create my first set of discrete Perl stacks on the same box, each running their own chosen version of Perl core, and its own CPAN module dependencies right down to the module versions. This panacea has eluded me for some time, but luckily Miyagawa is hard at work to make my life better.
“Dependencies are part of the app.” Yup!
Note to self: run ‘carton check’ as part of the IX::update() routine, and if there are installations to perform, run ‘carton install --deployment’ or ‘carton install --cached --deployment’. This is going to be like butter! Mmmmmm.
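Sketched out, that note to self might look like the following (IX::update is my own routine, the guard messages are invented, and the cached flag assumes a populated local cache):

```shell
# hedged sketch: only install when 'carton check' reports a mismatch
if ! command -v carton >/dev/null 2>&1; then
    echo "carton not on PATH; install it first (e.g. cpanm Carton)"
elif carton check >/dev/null 2>&1; then
    echo "cpanfile dependencies satisfied; nothing to do"
else
    carton install --cached --deployment
fi
```

‘carton check’ exits nonzero when the installed modules don’t match the snapshot, which is what makes the conditional work.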
First session for me is: Evolutionary Architecture and Emergent Design, which sounds fancy. He has an interesting graph up showing the relative complexity (complexity-per-line) of a code base over time. Adding features tends to increase complexity; refactoring (paying off the technical debt) lowers it. He makes some good arguments for component-based architecture and against over-engineering early in a project. A high-level talk, but engaging.
Belly full of lunch, I will now be hearing about The Perl Renaissance from Paul Fenwick. Paul is a fun Australian who can often be seen in a funny hat. He is reminding us of some great tools such as cpanm, perlbrew, and Dist::Zilla (for CPAN authors). Now some talk of Moose and Moo, and another positive word this conference for Method::Signatures. Now a reminder about a feature available as of Perl 5.10: named captures from regular expressions. These free you from having to use ordinality in capturing string matches from regexes.
And now for Ricardo’s (the Perl Pumpking) update on Perl 5. This is really the Perl 5.18 changes update I saw last month, some of which I covered in my Reports on YAPC::NA 2013. Can’t wait to get my stack upgraded. But, one step at a time.
Thank goodness they schedule Friday to break earlier in the afternoon, because after 5 days of conferencing I really am shot. But today does have my most anticipated session, the Damian Conway Channel. He has a new module called Running::Commentary which is a nice way to write scripts that have lots of system commands. I could see using this for something like Oracle database backup scripting. Lexical::Failure is a way to give your module users a choice of return types upon failure, rather than imposing a single behavior like an undef or a die. Lexical::Hints is an advanced way to implement debug statements, which could even be code references. Lingua::EN::Grammarian is a fairly ambitious attempt to find grammar (not spelling… grammar) mistakes in English text. He’s got it plugged into vim. Wow!
Next up: BASH As A Modern Programming Language from an eBay guy. He is explaining why eBay selected BASH to write a utility that sets up user environments to run their specialized web framework (surprise! eBay has a custom web framework). Reason #1 is portability, as expected: BASH is the default shell in all major OSes except Windows, which can run it via Cygwin… so no problem there either. As a shell, it easily calls other binaries (perhaps even the runtimes of other languages), and all languages have a way to call back out to the shell. This talk is reminding me how badly I want to avoid ever trying any serious programming in BASH. Gosh, what a clunky and confusing language. Tip of the hat to Mario Malizia, who has worked wonders with BASH in the creation of CMU (ECG’s venerable Code Management Utility).
My final session of the conference is on Secure Open Source Development, from a Red Hat guy. This is a high-level discussion of how to communicate about security issues proactively (before release) in the open source development cycle. Red Hat in particular has a huge challenge because they are pulling hundreds of projects from various communities into their releases. He makes the fairly obvious point that things like static code analysis (programs that analyze code for security) are the future of the field. These tools exist in the present but are far from perfect; manual audits remain necessary. Unsurprisingly, security hasn’t become a whole lot easier in 2013 than it was in 2012.
Well, that concludes my visit to OSCON this year. As usual the week flew by. The hardest part was paying attention to each session while dying to try out something I’d learned in the previous one. So now I’m taking a little time to play around with cpanm/perlbrew/polyperl/local::lib before lunch and perhaps a dip in the hotel pool. Life is good!
If you read this far, congratulations, you’re a nerd. If you’re *that* interested in this stuff then I hope you’ll consider going to OSCON yourself next year.