Posts tagged: OSCON

Reports from OSCON 2013

Guess who and guess what? That’s right, it’s Marcus once again, at OSCON 2013, the open source convention held annually in Portland, Oregon. There is no other gathering where you can find such a diversity of computer language and tooling expertise. To say that the open source world has a richer, more collaborative programming community than the commercial world is an understatement: collaboration is practically the definition of open source.

So, thanks once again to the University of New Hampshire for sponsoring this trip. I will try to return the favor by continuing to write software that doesn’t require expensive licenses, isn’t poorly interoperable, and isn’t tied to the whims of a single corporate overlord.

That said, let’s convey some actual information…

My first two days will look like this:

Monday morning – yet another introductory git session. I know. I take these all the time and still use Subversion. So shoot me. This talk is delivered by a GitHub instructor. They evangelize quite well for git, as they should, and teach it very well too.

Monday afternoon – Optimizing Your Perl Development by Damian Conway. Can’t *wait* to see Damian again. This session has a good chance of being the highlight of my week.

Tuesday morning – Web Accessibility for the 21st Century. I am wondering, what year will “…for the 21st Century” stop seeming forward-looking? 2020? 2050? Is there a rule about this?

Tuesday afternoon – Backbone Workshop. Backbone is a JavaScript framework that facilitates writing a fatter client in the browser while avoiding typical JavaScript pain points.

Day 1

Getting Started with Git

I’m spending much of this session aping the exercises to ‘git’ some practice.  See what I did there?  Anyhow, that’s why I’m not writing much about this session, but here is a tiny detail about git (vs. SVN):

Ignoring files in SVN is done via the svn:ignore ‘property’; in git, you create a .gitignore file, which git looks at to know what to ignore. Add and commit .gitignore itself as well. I’m not sure which approach to prefer, but there they are.
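For the record, a minimal .gitignore looks like this (the particular patterns are just common examples of mine, not anything this post prescribes):

```
# .gitignore -- one pattern per line; lines starting with # are comments
*.log
*.swp
tmp/
```

The rough SVN equivalent is a directory property, e.g. ‘svn propset svn:ignore "*.log" .’ followed by a commit.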

This is perhaps my third stint in a beginner’s git class.

Damian Conway’s Development Talk

Neat new tool from Damian called ‘polyperl’, which, in conjunction with Perlbrew, allows you to designate *exactly* which version of Perl your program runs under, using the normal ‘env’ trick on the shebang line. It looks like:

#!/usr/bin/env polyperl
use 5.012;

Note that ‘use 5.012;’ *without* polyperl in the shebang would only ensure that the *minimum* version you run is 5.12. Specifying the actual version, no greater and no lesser? Polyperl! Someday Perl will have this natively, but live for today, people, and use the Perl and nothing but the very Perl you want.

[skipping a bunch of vim tricks and plugins here...]

Recommendation: Method::Signatures. Damian prefers it over Params::Validate. I’ll have to check to see if it can be used without dying on validation failures (checked: it appears it can). I should probably mention what these modules do. They help you validate that the variables being passed to a subroutine are correct in number, type, etc.

Recommendation #2: Getopt::Euclid as a replacement for Getopt::Long (or any of the other 3,000 modules for writing CLIs on CPAN). This is one of Damian’s own. Cool thing about Getopt::Euclid? The way you define the incoming parameters is to *write the POD documentation* for your CLI. That’s right! Getopt::Euclid reads your POD and uses it as the interface declaration. Gorgeous. Can’t get away without documenting *that* now can you?

And now, I am laughing inside. Damian has his CPAN build/test/release process completely automated. I’ve done the same general type of work to release/deploy my own code to UNH production environments in a scripted, less error prone way. It’s pretty clear that years of reading and listening to Damian have affected me. I am now just as crazy as he is, if only 1/100th as clever.

Day 2

Web Accessibility

It’s now Tuesday morning and I’m in the web accessibility session mentioned above. One point that is being hammered home from the start: there is no standard that’s a silver bullet for accessibility. Many ‘official’ accessibility guidelines, even if met to the letter, don’t *actually* address the problems that users are having in accessing and understanding the content on your websites.

Also, a website which is accessible in one way, for a certain disability, is often totally at odds with accessibility for another disability. Those things are depressing, but they are the truth. If you’re designing, or redesigning, for accessibility, the best approach is to test with real assistive technology tools.

NVDA for Windows (a screen reader for the blind) is particularly useful in that as a sighted person, you can turn off the audio and simply have the reader log the text that it *would* read in a separate window. This shows you the ‘flow’ of your page from a screen reader’s point of view, without the constant screen reading chatter. Screen readers are intensely annoying if you don’t absolutely need them.

One useful piece of advice: use page headings, but not to excess. Here we are talking about the h1, h2, etc. HTML elements. Use them as intended, as a general outline for your content, but don’t overdo them for design purposes or whatnot. A screen reader user will be using these to hop around to the different logical sections of your content.

A website highly recommended by the speakers: WebAIM.

Another key approach to accessibility testing is mouseless navigation. Stick to the keyboard and you’ll get a much better idea of how the elements flow on your pages. Use the ‘tabindex’ attribute to fix such flow issues.

Coming under severe criticism here: carousels and privileged links…

Carousels: carousels are rotating or self-changing images or content. These are apparently hell on accessibility, for an obvious reason: elements of the page are moving around on their own. We commit this sin ourselves, despite the very nice design. Perhaps we have something in place to mitigate this for those using assistive technologies? The experts here say to avoid carousels wholesale. Hey, I didn’t realize it either.

Privileged links: links that, when clicked on, do not grant access to the resources behind them (even with login). A typical message would be “you do not have permission to access this resource” or similar. Repeat offender? Sharepoint. I’ve always despised this. The geniuses at Microsoft probably call these ‘teasers’ or something. Don’t tease me, thanks.


Backbone Workshop

Backbone.js is an MVC (Model-View-Controller) JavaScript framework.  There are approximately eight thousand of these coming out every week these days (slight exaggeration).  Aside from separation of concerns via the MVC approach, Backbone.js and other similar JS frameworks focus on moving state management to the client side.  This means getting away from the request/response model that even AJAX employs.  Instead, the client says ‘Saved!’, and *then* sends the data to the server.

I know.  Sounds dangerous.  But there are ways of handling errors with this approach.  The whole goal is raising the perceived performance of an app.  Milliseconds matter, which the speaker supports with such figures as: if Amazon pages load just 100ms slower than usual, they lose 1% of sales.  That’s compelling.
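That ‘Saved!’-first flow, with a hook for when the background sync fails, might look roughly like this in plain JavaScript (no actual Backbone here, and every name is my own invention):

```javascript
// Optimistic save: report success to the user immediately, then sync
// to the server in the background, surfacing an error only on failure.
// sendToServer and onStatus are placeholder callbacks, not a real API.
function optimisticSave(record, sendToServer, onStatus) {
  onStatus('Saved!');                 // the user sees success right away
  return sendToServer(record).catch(function (err) {
    onStatus('Save failed');          // ...unless the background sync fails
    throw err;                        // let the caller decide how to roll back
  });
}
```

Backbone’s real model.save() is considerably richer (validation, events, REST conventions), but this is the perceived-performance trick in miniature.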

Another thing Backbone.js brings to JavaScript programming is better scoping of DOM lookups.  By default, you are generally searching the entire DOM (think: an entire webpage) for an HTML element that you’d like to work with; a Backbone view confines its searches to its own chunk of the page.  This narrowing of scope means selectors are greatly simplified and you get further separation of concerns.

The speaker seems to be assuming that we studied for this or something.  The exercises are like Go!  … with no ‘Ready… Set…’.  I think the majority of folks here spend 90% of their time writing Javascript on similar frameworks already.  With about 10% of my time allotted for Javascript, I’d need to move a bit more slowly until I get comfy.  (Let me take a moment to express my envious hatred for natural polyglots who see new syntax and immediately parse it effortlessly. I hate you.  I’ve always had to work for it.)

Yeah… he’s literally explaining the exercise *after* we’re supposed to have finished it.  What?  I’m not psychic, man!

Still, it’s a good talk, and I’m picking up the spirit of things.  I can see that somewhere down the road, I’ll likely be using one of the 8,000 JS MVC frameworks.

Note: things like Backbone.js work at a slightly different layer of abstraction than jQuery, which is *also* called a JavaScript framework.  In fact, Backbone.js works in conjunction with jQuery.  jQuery gives you better-looking code for working with the page, while something like Backbone.js helps wrangle the overall structure of a large client-side application.  That’s a gross over-simplification, but… gross and simple are two things I do well.  :)

Day 3

The convention proper really starts on Day 3, today, with some relaxing keynote talks attended by all.  There are a couple thousand people here, ballpark.

First there was a neuroscience/AI guy who actually got me excited about the state of brain research for a few minutes.  Next up, a Facebook honcho.  No matter how much I may resent Facebook and distrust the current look of ‘social’, it can’t be denied that Facebook is a more modern company than, say, Microsoft or Oracle, in its disposition towards open source.

If you are reading this and remain a bit unsure about the open source approach, let me take a moment to explain why a company like Facebook would open some of its coding projects to the world.  Wouldn’t this reveal trade secrets and squander the intellectual property value they’ve created?  Not really, as it turns out.  The code they are open sourcing is attacking difficult problems such as the scaling of big data storage and delivery.  These problems are nowhere *near* perfectly solved.  By open sourcing these code projects, Facebook can attract meaningful contributions from other interested companies like Intel and Broadcom.  This collaboration raises all boats without diluting the core competencies of each company.

What’s even cooler about open source is that *even little guys like me* have access to that work.  Had I more entrepreneurial spirit, I could fuel a startup on the very code running some of Facebook’s systems today.  I could contribute my own R&D back to the project.

That’s how it works.  You might also check out this amazing open source effort for another perspective on the kinds of problems open source can help us all to solve together.

Back to the keynotes… just saw an incredibly futuristic demo of a flying drone being controlled by open source Clojure code.  This drone hovered as well as any Hollywood robot has ever hovered, could recognize images, stream video, recognize faces… you name it.  Audible “wows” in the audience, one of which was mine.

Final keynote (after a good one from inBloom, who are here to open source their work) is Mark Shuttleworth, founder of Ubuntu.  He’s talking about Ubuntu’s one-OS (very Windows 8ish) approach to unifying user experience across all types of devices.  But of course, on Linux.  Now he’s talking about Juju, a tool for deploying and connecting various software infrastructure elements (think MySQL, Cassandra, MongoDB, WordPress, etc… whatever your cocktail may be).  Pretty amazing level of automation.

Shuttleworth just called Mac OS X (perhaps the primary Ubuntu competitor, if you think about it) “the gilded cage”.  Great line.  But gilded it remains, by comparison, for the time being…  and they know it.  He was in the process of announcing Juju for Mac OS X.

The first focused session of my day is about Asterisk, an open source PBX.  Asterisk has been around for quite some time and is very mature.  Although we are invested in Avaya at UNH, I myself can definitely benefit from any PBX-related knowledge, as a non-PBX-expert in the Telecom group.  It’s possible we could add Asterisk as a sort of sister system connected to the main PBX, to offer discrete features that we are otherwise unable to provide (or unable to provide cost effectively).  It does call queuing!  Neat.

I once wrote a CDR reporting app on top of Asterisk, almost ten years ago now.  It was not open sourced by the company I worked for.  Looking around, there is so much more available out there now.

The speaker is recommending Wireshark’s advanced VOIP diagnostic features.  Never realized Wireshark had stuff as specific as this.  Also ‘ngrep’, for ‘network grep’, a sort of filtering tool for tcpdumps similar to some of Wireshark’s functionality.  Now ‘sipp’, which is a SIP performance testing tool.  Now ‘sipsak’, for generating specialized SIP packets for troubleshooting purposes.  It’s good to know these tools are out there in case I’m ever asked to work on this stuff, which is always possible.

Of all new PBXs deployed in North America today, 16-18% of them are Asterisk.  One guy in the audience suggests it’s closer to 35%.  The FAA is looking at it as a possible solution to connect control towers.

Next up, a session on running your company’s internal application development using open source practices.  We are collaborating more and more across IT units at UNH, as the desire for better interconnected systems grows.  The speaker begins with how to deal with the necessary communication overhead involved in working together.  Some key concepts are transparency (work in the open, make your source code examinable across organizational units, govern projects in the open), quality (code reviews, good testing), and community (points of contact outside your primary team).  This latter point, she illustrates, can lead to good people staying longer at a company, having connections across business units.

The central concept is “internally open source”.  Yes, please!  We would gain ridiculous efficiencies if we did this, I think.  Silos are full of nothing but corn.  :P   I am sure we have countless shared needs that would be opportunities to work smarter, not harder.  A small DevOps swat team could do wonders in this area if we could dedicate the resources.  The speaker is expounding on that very thing: a core team which “owns” (think of ownership loosely) all centralized code for an organization.

Note that the above is not geared towards homogenizing our various codebases.  Different coding standards and technology choices can be applied on a project-by-project basis.  We can continue to pursue diverse approaches while doing this.  It’s important to preserve the evolutionary advantage of using diverse technologies while centralizing certain efforts.  Yes, this is a challenging balance to achieve, but the first step is to be mindful of it.

Belly full of lunch now, the next session for me is Randal Schwartz’s Half My Life With Perl.  My four planned afternoon sessions are in fact all on the Perl track.  That is part of what’s great about OSCON: you can fill your own schedule with talks most pertinent to your work or interests.  As always there is a strong contingent here for Perl.  After all, OSCON started off as just The Perl Conference way back when.

This autobiographical talk from Randal is a bit unusual as these talks go, but, very interesting to us Perl folk.  Perl has a few more years of history behind it than most of the projects here.

Next up: Start Contributing to Perl, It’s Easy!  This is a good overview from a relatively (2007) new Perl community member.  I have a grand total of 1 patch accepted on CPAN, but that’s one more than I did last year at this time.

I am pumped for the next session: Carton: Manage CPAN Dependencies Without The Mess.  Carton is a tool I am hoping to use within the next year in conjunction with Perlbrew, cpanminus aka cpanm (also written by the speaker), and Damian’s polyperl, mentioned way up above.  Armed with all these goodies I hope to create my first set of discrete Perl stacks on the same box, each running their own chosen version of Perl core, and its own CPAN module dependencies right down to the module versions.  This panacea has eluded me for some time, but luckily Miyagawa is hard at work to make my life better.

“Dependencies are part of the app.”  Yup!

Note to self: run ‘carton check’ as part of the IX::update() routine, and if there are installations to perform, run ‘carton install --deployment’ or ‘carton install --cached --deployment’.  This is going to be like butter!  Mmmmmm.

Day 4

First session for me is: Evolutionary Architecture and Emergent Design, which sounds fancy.  The speaker has an interesting graph up showing the relative complexity (complexity-per-line) of a code base over time.  Adding features tends to increase complexity; refactoring (paying off the technical debt) lowers it.  He makes some good arguments for component-based architecture and against over-engineering early in a project.  A high-level talk, but engaging.

Next up, Dealing With Multiple Types of Input in HTML5 and Javascript Code.  A good reminder to start using the HTML5 input types that touch browsers will offer different keyboard options for: ‘email’, ‘tel’, ‘url’, etc.  But the talk focuses on ‘pointer events’, which are very much like mouse events except you can get fine-grained control over touch, pen, or mouse inputs specifically: for instance, tweaking gameplay dynamics in an HTML5 game based on the type of input being used.  There are also ways to test for touch pressure and the orientation of the device.  So this is basically the next generation of event handlers, one which accounts for the new generation of input devices.

Belly full of lunch, I will now be hearing about The Perl Renaissance from Paul Fenwick.  Paul is a fun Australian who can often be seen in a funny hat.  He is reminding us of some great tools such as cpanm, perlbrew, and Dist::Zilla (for CPAN authors).  Now some talk of Moose and Moo, and another positive word this conference for Method::Signatures.  Now a reminder about a feature available as of Perl 5.10: named captures from regular expressions (a good description of these can be found here).  This frees you from having to use ordinality in capturing string matches from regexes.
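Named captures let you label a group right in the pattern instead of counting parentheses.  The same (?<name>…) notation works in JavaScript regexes too, so here’s the idea in the language of my other sessions this week (the date pattern is purely my own illustration):

```javascript
// Named captures: label groups with (?<name>...) instead of counting
// $1, $2, $3. The date pattern below is just an illustration.
const dateRe = /(?<year>\d{4})-(?<month>\d{2})-(?<day>\d{2})/;
const m = '2013-07-26'.match(dateRe);
console.log(m.groups.year);   // -> "2013"
console.log(m.groups.month);  // -> "07"
console.log(m.groups.day);    // -> "26"
```

In Perl, the matched values land in the %+ hash, so you read $+{year} rather than $1.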

And now for Ricardo’s (the Perl Pumpking) update on Perl 5.  This is really the Perl 5.18 changes update I saw last month, some of which I covered in my Reports on YAPC::NA 2013.  Can’t wait to get my stack upgraded.  But, one step at a time.

Day 5

Thank goodness they schedule Friday to break earlier in the afternoon, because after 5 days of conferencing I really am shot. But today does have my most anticipated session, the Damian Conway Channel. He has a new module called Running::Commentary which is a nice way to write scripts that have lots of system commands. I could see using this for something like Oracle database backup scripting. Lexical::Failure is a way to give your module users a choice of return types upon failure, rather than imposing an undef, a die, or anything. Lexical::Hints is an advanced way to implement debug statements, which could even be code references. Lingua::EN::Grammarian is a fairly ambitious attempt to find grammar (not spelling… grammar) mistakes in English text.  He’s got it plugged into vim.  Wow!

Next up: BASH As A Modern Programming Language from an eBay guy.  He is explaining why eBay selected BASH to write a utility to set up user environments to run their specialized web framework (surprise!  eBay has a custom web framework).  Reason #1 is portability, as expected, BASH being the default shell in all major OSes except Windows, which can run it via Cygwin… so no problem there either.  As a shell, it easily calls other binaries (perhaps even the runtime of other languages) and all languages have a way to call back out to the shell.  This talk is reminding me how badly I want to avoid ever trying any serious programming in BASH.  Gosh, what a clunky and confusing language.  Tip of the hat to Mario Malizia who has worked wonders with BASH in the creation of CMU (ECG’s venerable Code Management Utility).

My final session of the conference is on Secure Open Source Development, from a Red Hat guy.  This is a high-level discussion of how to communicate about security issues proactively (before release) in the open source development cycle.  Red Hat in particular has a huge challenge because they are pulling hundreds of projects from various communities into their releases.  He makes the fairly obvious point that things like static code analysis (programs that analyze code for security) are the future of the field.  These tools exist in the present but are far from perfect; manual audits remain necessary.  Unsurprisingly, security hasn’t become a whole lot easier in 2013 than it was in 2012.

Well, that concludes my visit to OSCON this year.  As usual the week flew by.  The hardest part was paying attention to each session while dying to try out something I’d learned in the previous one.  So now I’m taking a little time to play around with cpanm/perlbrew/polyperl/local::lib before lunch and perhaps a dip in the hotel pool.  Life is good!

If you read this far, congratulations, you’re a nerd.  If you’re *that* interested in this stuff then I hope you’ll consider going to OSCON yourself next year.

Reports from OSCON 2011

Here I am once more at the best organized open source gathering in the world, OSCON. This is my third trip to Portland, OR for this event, having attended in 2008 and 2010. Portland is modern and vibrant, lush, young, exciting. I would live here in a heartbeat.

the Portland, OR Lloyd Center District

This time at OSCON, I’ve registered for the extra half-day tutorial sessions. These are more intensive than the shorter sessions of the conference proper. I will attend four of these, today and tomorrow. Instead of posting notes in real time as I usually do, I will summarize these tutorials after the fact.

Monday morning – Damian Conway on ‘Presentation Aikido’

As some of you might know, I occasionally like to give presentations on technical topics. This probably comes from attending a lot of them, and feeling the benefit of this format for information exchange. It may also be a symptom of my having majored in theatre, while ending up working in computers. Either way, selecting Damian Conway’s talk was a no-brainer for me, as I’d love to do a much, much better job at giving what I call ‘good slide show’.

Damian is a genius and a master showman, in addition to being the author of Perl Best Practices and other books. Confidence in his methods, if not proven to you in the pudding as he speaks (it will be), is supported by the fact that Damian’s sole income is from speaking engagements. I hadn’t known this, and wonder how long that’s been the case.

I took pretty detailed notes on Damian’s talk, despite the hardcopy he also provided, but will only bullet out a few favorite quotes and observations…

  • “I like telling anecdotes. It humanizes me.” [humanizing oneself can be important when giving technical talks. -ed.]
  • “Prowl the stage like a lion.”
  • Damian introduced me to the Takahashi method of presenting slides. Obviously very influential, but I hadn’t known there was a name for it.
  • Content, Damian explains, doesn’t matter so much as style in giving presentations. So true. Of course, if you have both, you have a real winner.
  • Oops. I’ve definitely made a couple mistakes Damian is pointing out. Putting complete sentences on slides. Overwhelming the audience with information or too-complex charts.
  • If you don’t have your own style, steal from those who have the best style. Bang & Olufsen and the Japanese are given as examples.
  • Damian’s approach to presenting to humans is a zoological study. Many of his insights on the social dynamics of speaking to a group are insights on speaking to a group of primates. Can’t argue with any of it; he’s one of the best speakers I’ve ever seen. Anyone got a banana?
  • Damian admires David Attenborough for the way he engages with his subject matter (usually, animals) in his documentaries. Oddly I discovered Attenborough recently, having watched his Life of Mammals series early this summer.

I could say so much more about Damian Conway and this talk, but I won’t. You might have the chance to see it sometime. I can think of no reason not to attend his ‘Advanced Vim’ tutorial this afternoon (Tuesday), despite several other interesting offerings. I could certainly stand to be better at Vim, and Damian never disappoints, promising even more style than content, entertainment being king.

Monday afternoon – Joshua Marinacci on ‘HTML5 Canvas Deep Dive’

‘Canvas’ is a new feature available in modern browsers. If SVG is the Adobe Illustrator of the web (vector graphics), Canvas is the Photoshop (bitmap, pixel-oriented graphics). I am sure graphics experts would correct me on several points here, but these generalizations are good enough for the rest of us.

To be honest, I knew from the beginning of this talk that I wouldn’t be programming directly against canvas in my day to day work. So although the low-level exercises we did were fun, my mind was elsewhere. For a business programmer like me, it’s only important to know which browsers support canvas, and what options might be available for it. For instance, I have begun switching from the Flash-based charting solution to RGraph, which uses canvas and therefore supports iOS devices.

I think it would be a blast to work on graphics projects again… I used to do a fair amount of Flash and other graphics work… but in recent years I’ve been asked for practical machines more than glossy covers… content, I suppose, over style. So my skills have slipped in the aesthetics department.

Tuesday morning – Remy Sharp on ‘Is HTML5 Ready for Production?’

Similar to Monday’s second talk, this is an HTML5 talk with coverage of canvas and plenty of little exercises for us to do. So, this is mostly a tutorial, although Remy does address the ‘production ready?’ issue by pointing out that even CSS 2.1 isn’t completely implemented, to spec, in all modern browsers, but we’ve all been cherry picking the best-supported features for years. True. Specs are almost always implemented incompletely, so we really need to make judgment calls about feature support on a case-by-case basis. This is part of what continues to make professional web development a challenging and expert-oriented field.

The most enlightening portion of this talk, for me, was Remy’s explanation of Web Storage as a replacement for cookies. Web Storage is its own spec, separate from HTML5 (like many features are, actually, despite being bundled with HTML5 in common parlance), and needs to be considered on its own. But this looks good to me. If I end up implementing a more robust session management system than my current one, I’ll be looking at Web Storage more closely.

Tuesday afternoon – Damian Conway on ‘Advanced Vim’

I am particularly glad for Damian’s practice of providing hardcopy handouts for his presentations in this case. Lots of Vim commands in a short period of time, none of which I’d like to be writing down as he covers them. Later on, I’ll have a grand old time spiffing up my .vimrc file, going by his pamphlet.

Terminal-based text editors hail from a time before the mouse, but I still use Vim quite a bit when I’m roaming around servers, wearing my sys admin hat, messing with config files and such. I ought to get a *little* better at it, at least. Luckily, most of the things I am learning today will simply be permanent settings in my .vimrc file, not things I will need to ‘download into my fingers’, as Damian put it.

Wednesday at OSCON

Summarizing the keynotes:

  • Ubuntu community manager, on the growth of community management as a career path.
  • Python guy, giving an award to a major Python contributor
  • Microsoft guy from Italy, with a couple of interesting announcements about what he terms ‘open surface’ projects: first, that they will support Red Hat 6 on their new VM platform, and second that PHP and node.js will be supported on their Azure application platform. This is what is meant by ‘open surface’… the core of the product is commercial, Microsoft stuff… but the surface… the functionality they are selling… is open source. Quite a strange twist of fate Microsoft is experiencing these days.
  • Now a gal presenting her creation, with a snappy slideshow, is being well received.
  • Now a guy selling OpenStack/OpenCompute… this talk seems too sales-y… no matter how cool his product might be. Pep talk at the end about opening up hardware as well as software may have redeemed him.

Now that the keynotes are over, I’ll switch to my timestamped notes format.

11:57am: wrapping up a session now on ‘Programming Well With Others: Social Skills for Geeks’. These are two guys from the Subversion project, telling some community anecdotes, such as when a famous geek (unnamed) filed a bug report along with a slew of insults, or when a lurker on the mailing list started posting every little thing on his mind. Also a contributor who tried to insist on having his name in ‘his’ file.

2:26pm: Continuing my HTML5 binge at this conference, the talk I just attended was called ‘HTML5: All About Web Forms’. Considering how support for the new input types is being handled by smartphones, it’s starting to become tempting to use HTML5 in earnest. One still has to consider folks on older browsers, of course, if by some chance they are also potential users… but what about application power users? Administrative users? I may start dabbling in HTML5 for this population and require modern browsers for them, especially if I can support mobile better in the process. I feel like 2 or 3 years from now, web development is going to be in an even better position than it is today, as far as developer efficiency.

2:32pm: ‘HTML5 in Your Pocket: Application Cache and Local Storage’. The HTML5 beat goes on. I think this conference is scaring me into taking mobile as seriously as I should.

I like this guy. He’s preaching bypassing the app store and native development and reaching for HTML5 first. The barrier to entry, development-wise, is infinitely lower, and the same code will run on desktops and laptops. He also just recommended this book, free online.

The meat of this talk is about Local Storage and the Application Cache, the former being the heir apparent to cookies in session management.  Second time I’ve heard it here; must be true.  Cookies provided about 4K of space; Local Storage provides 5MB.
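The API itself is tiny, which is most of its charm: setItem, getItem, removeItem, all string-valued.  A sketch of the cookie-replacement idea (in a browser you’d use the built-in global localStorage; the little stand-in object here exists only so the snippet is self-contained):

```javascript
// Web Storage has a tiny API: setItem, getItem, removeItem.
// In a browser you'd use the built-in global localStorage; this
// stand-in object exists only so the example runs anywhere.
const storage = (() => {
  const data = {};
  return {
    setItem(key, value) { data[key] = String(value); },  // values are strings
    getItem(key) { return (key in data) ? data[key] : null; },
    removeItem(key) { delete data[key]; },
  };
})();

// Session state lives client-side, with no cookie headers on every request.
storage.setItem('sessionUser', 'marcus');
console.log(storage.getItem('sessionUser')); // -> "marcus"
storage.removeItem('sessionUser');
console.log(storage.getItem('sessionUser')); // -> null
```

Unlike cookies, none of this rides along in HTTP headers; the data stays in the browser until you ask for it.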

Another site recommendation: HTML5 Rocks.

[I totally petered out Wednesday afternoon. My mind buckled beneath the weight of a growing to-do list, which was fertilized by all this new information. A nap ensued.]

The Portland, OR MAX

Thursday at OSCON

9:20am: since the continuing keynotes are being streamed live, you could always head over and check them out. Right now Jim Zemlin of the Linux Foundation is reviewing 20 years of Linux. It really is amazing how far the operating system has come. I hadn’t realized that Red Hat has outperformed Microsoft on the stock exchange by a factor of 4 over the past decade. I had heard that MS has fallen below both Apple and IBM in market cap in the past couple of years.

9:28am: This next keynote is quite interesting. I knew that antibiotics were becoming less and less effective, making hospitals more dangerous, but I didn’t realize that health industry analysts are aiming to move health care outside of hospitals entirely. This would be aided by healthcare IT, which as you might suspect, is woefully behind the times. Speaker ends the keynote by stating that it’s less risky to go skydiving today than it is to go to the hospital. Ouch.

9:34am: Now it’s Eri Gentry of BioCurious, which has to be one of the great company names of all time. She’s explaining how the world of biotech is not friendly to ‘lean’ startups. The field is also fraught, like many big money fields, with intellectual proprietorship. It’s pretty easy to see why visionaries in the health and biotech fields… those with new ideas and hoping to innovate… are looking to open source geeks for tips on how to set information free.

Setting science free from the PhDs… I like that. Imagine if you needed a PhD before you were allowed to write a line of production code? We’d still be on paper.

It strikes me that what we’re trying to do… or preserve?… with open source and open approaches to non-computer-related ideas… is something like the American dream itself. Capitalism enables that dream, but turns evil as the early winners wall their gardens and raise the barrier to entry to that dream. Is this too lofty a description of the spirit of open source?

Great motto: BioCurious? Experiment with Friends.

9:54am: John Graham-Cumming was unable to attend due to flight problems, but delivered a video instead, largely looking back on Alan Turing and on the future of inclusiveness in the computing community.

10:11am: Next up, Gabe Zichermann points out how open source needs to better engage end users through the use of ‘gamification’. What would motivate end users to care as much about this stuff as we do? The Gamification Summit in NYC this September looks interesting.

One gamification concept: speed camera lottery. You know those automatic radar traps that send you a ticket if you speed? In Sweden, they are deploying a ‘speed camera lottery’ which enters non-speeders in a lottery to receive the monies collected from ticketed speeders. Reduction in speeding violations over vanilla auto-ticketing methods: 20%.

And how much will a speeding ticket cost you if you speed anyway? Welcome to Socialism; it’s based on your income.

10:40am: Penance will now be paid for my having missed Damian Conway’s Perl 6 talk yesterday. Yes, I’m officially a Damian fanboy now; if he’s talking (and I’m awake), I’m listening. This talk is simply titled The Conway Channel 2011, so your guess on the topic is as good as mine. But this is the blind faith observable in all fandom.

Turns out he will talk about four of his modules on CPAN.

First up: Regexp::Grammars. This involves advanced parsing techniques for domain-specific languages (DSLs). If you’re lost already, don’t expect me to find you, because folks, I’m still looking for myself. Suffice it to say that the regular expressions are the least confusing aspect here, and I do not have a computer science background.

But weirder: I’m sort of following this. In these areas, you might say, I am able to appreciate some of the nuances of a great film, but not quite direct one.

Next up: IO::Prompter (an improvement on IO::Prompt) for prompting at the command line. It requires Perl 5.10+. It has built-in validation for entered data (nice), including a distinction between ‘must’ (invalid data earns a reprimand) and ‘guarantee’ (the keyboard simply won’t register invalid data). Draconian!

It also has timeouts and defaults when the timeout expires. Very neat. Supports password data type, to obscure what’s typed on the screen ala HTML password type. Wow, it also has history and filename completion.

This module is awesome. I’ve long been wanting to teach some basic Perl to my daughter, some kind of command-line prompted program… this would make that a lot easier and more fun, skipping the annoying bits and letting her get into her own ideas quicker.

Next up: Data::Show (and Data::Show::Names), an improvement on Data::Dumper and Data::Dump, giving you the option of viewing the data being handled by Perl in a much more granular (line-identified) format.
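
Data::Show itself isn’t in core, but for contrast, here’s the Data::Dumper baseline it improves on; a minimal sketch, with an invented %config structure:

```perl
use strict;
use warnings;
use Data::Dumper;

# Dump a structure to see exactly what Perl is holding.
my %config = ( host => 'localhost', ports => [ 80, 443 ] );
print Dumper( \%config );
```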

Funny Voltaire joke: “The perfect is the enemy of the good,” wrote Voltaire. Modern day IT translation: “If it compiles, ship it.”

This was followed by a slew of yet geekier in-code jokes to which I’d be doing a disservice to attempt retelling. Check out the Acme::Crap module on CPAN… and don’t tell it I sent you.

11:30am: Now for Jacinta Richardson with “Perl Programming Best Practices 2011”. Hopefully she will contrast any updated suggestions with the ones in Damian’s classic PBP book, since that’s the bible. Interestingly, Jacinta is yet another Australian.

She’s going fast. Real fast. So I can’t capture what I’d like to here. The rub, however: use the most recent version of Perl that you can, and read up on some of the new features. Mentions of autodie, Try::Tiny, etc.

A module should not use die for error handling; it should use Carp and croak. This makes sure that the error is reported at the line of the calling program, not inside the module, placing blame in the correct place.
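
A minimal sketch of that advice, assuming a hypothetical set_width() living in a module; croak() (from the core Carp module) blames the caller’s line, where die() would blame the module’s:

```perl
use strict;
use warnings;
use Carp qw(croak);

# Pretend this sub lives in a module: croak() reports the error
# at the caller's line, while die() would point inside the module.
sub set_width {
    my ($w) = @_;
    croak "width must be positive" if $w <= 0;
    return $w;
}

eval { set_width(-5) };
print $@ if $@;   # the reported line is the eval above, not the croak
```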

Bears more research: local::lib. Not quite sure what it gives us beyond ‘use lib;’, but clearly it does, and I’ll need to find out.

Interesting: minicpan. All the latest module versions on CPAN add up to about 5GB, so why not have them on your development machine, ready to be installed even when you’re offline? Nice airplane mode for Perl, right there.

Smart::Comments. Also worth checking out. Excuse my brevity, she’s cruising here.

Hmmm… Method::Signatures as a replacement for IX::Args? Maybe. Will it handle my cgi params? Will I care if I switch to Plack? Help me, somebody, the options overwhelm me.

By the way, if I haven’t mentioned it before: interested in all the goodies that might be available in Perl 6 when it’s finally ‘production ready’? A good many of those have been added to the more recent versions of Perl 5.
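
A few of those backported goodies, sketched with nothing but core features from Perl 5.10+ (say, the defined-or operator //, and state variables):

```perl
use strict;
use warnings;
use feature qw(say state);

# say, defined-or (//), and state variables all arrived in 5.10.
my $name;
say $name // 'anonymous';   # prints "anonymous"

sub counter {
    state $n = 0;           # persists between calls
    return ++$n;
}
say counter() for 1 .. 3;   # prints 1, 2, 3
```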

1:44pm: No, I am not punishing myself, but I am sitting in another HTML5/mobile talk. PhoneGap is being recommended; will have to look into that (note: this is about taking web code into the native mobile space, so… not that interested right now). Also, maybe I should start developing using an iOS simulator? Especially with a phone-sized display. This is where I worry a bit… well, not worry… but I’d love to excel on screens that size, rather than simply sort-of work there. Maybe start with HTML5 date picking; existing datepickers are miserable on tiny mobile.

Neat hack: use input type ‘tel’ whenever you want a number pad on mobile… even if the data will not be a telephone number at all.

Can’t forget HTML5 ‘placeholder’ attribute to replace the JS magick that is sometimes not so fun (text field prompting-type text).
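
Both tips in one sketch (the field name and placeholder text are invented):

```html
<!-- 'tel' summons the numeric keypad on most mobile keyboards,
     even for non-telephone data; 'placeholder' replaces the old
     JavaScript prompt-text hack. -->
<label>Zip code:
  <input type="tel" name="zip" placeholder="e.g. 03824">
</label>
```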

Note: iOS supports SVG fonts but not TrueType. Speaker recommendation: Font Squirrel, which offers free-use fonts and actually packages them up for you, complete with the code to load them.

Phone width: 480px. iPad: 1024px. For min-device-width and max-device-width, more CSS3 goodness.
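
A sketch of those breakpoints as CSS3 media queries (the style rules themselves are placeholders):

```css
/* Phone-sized screens */
@media screen and (max-device-width: 480px) {
  body { font-size: 100%; }
}
/* Tablet-sized screens, up to the iPad's 1024px */
@media screen and (min-device-width: 481px) and (max-device-width: 1024px) {
  body { font-size: 90%; }
}
```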

2:31pm: Now for a change of pace, “Awakening The Maker Ethic in K-12 Students”. As a parent, I’m also an educator, or at least would like to be. Speaker is pointing out the artificial tension in our educational system between ‘vocational’ and ‘academic’ instruction, which clearly undermines the hands-on, DIY ethic.

Now he’s pointing out the illusion that kids ‘know tech better than we do’, and that all we need to do is include technology in the course of their other studies and that will suffice. Technology needs to be taught directly as well.

A plug for a site I’m tardy in checking out. My daughter may like this one, as she loves to draw.

Hmmmm… this is interesting: visual IDEs for Arduino. Sounds tasty. I need to get playing with that thing. A visual IDE would lower the bar for kids, for sure.

Wow, too many cool things to relate here. Another inspiring talk.

[notes devolve once again this afternoon due to fatigue]

7:00pm: I’d love to report that I spent my last evening in Portland painting the town my own shade of red, drinking and carousing and generally terrorizing the Northwest. In fact, of all those things, I merely drank a bit, and proceeded to attend Larry Wall’s State of the Onion address along with a couple hundred other somewhat-lubricated geeks [blurry pic of Larry below]. Larry actually worked his annual address into the 5-minute Perl lightning talks also scheduled, breaking his speech into 5 minute bits interspersed with the various other ejaculations of Perl community members. This, in addition to Larry’s self-effacing delivery, had a remarkably humble effect. No wonder this community has thrived for 20 years under the gentle leadership of this man, and no wonder OSCON itself grew out of the Perl community.

Larry Wall @ OSCON 2011

He also made a classy move by thanking Tim O’Reilly for opening up the conference to families and children of attendees… something I hope does not change.

Friday morning at OSCON

11:04am: The final day of OSCON is a half day, allowing many of us to fly out of here in time to have something of a weekend back home. The last session I attended was called ‘The State of Open Source in [K-12] Education’ and was frankly depressing. Efforts made in the last decade to save schools on licensing fees and turn kids on to open source software have largely floundered despite the earnest good efforts of many, and it was frightening to hear that a number of educators have actually lost their jobs because of ignorant administrators and parents with their own agendas.

Discussion after the talk was lively, with many educators and parents in the crowd, and the consensus was that the education bureaucracy will only change slowly, and that the most effective evolutionary driver is homeschooling. In other words, just as the open source philosophy of Linux has competed successfully in parallel with Windows, home and independent schooling may have to compete with institutional learning in the same way, if we really want fundamental changes in the way computing is delivered and taught. While this kind of grassroots spirit has always appealed to me, I’m not a parent in a position to exercise this option, so I left scratching my head a bit… where do we go from here?

Well, with that, *I* am going home… on a plane… not in a car, not on a boat… not in a box, and not with a fox… back to good old New England. Portland, I hope to see you again soon.

Thanks to the University of New Hampshire for sponsoring this trip. If anyone has questions about OSCON and why it might interest them, please drop me a line anytime.

Report from OSCON 2010 – Sessions Day 3

11:52am: Perrin Harkins is going to build on Tim Bunce’s Devel::NYTProf talk by addressing how to actually speed up Perl bottlenecks after you identify them. Wait, no, now he’s blaming the database end of things for most performance problems, so that’s not strictly a Perl bottleneck at all… but in my experience, he’s right… it’s usually your SQL, database connection overhead, or something like that in modern apps. It’s always in the I/O.

Now he’s talking about the overhead incurred by ORMs… object-relational mappers, like those built into Rails or Grails or, in the Perl world, DBIx::Class. This is where I get to grin and feel like I am making a smart choice by writing all my SQL raw (and conveniently ignore any development time efficiency I might gain by switching to an ORM). Since I don’t have the luxury of DBAs to tune indexes and whatnot, I’m guessing I’m still better off staying intimate with MySQL.

Ah, some actual Perl optimizations now:

  • Slurp files when possible (unless too large); don’t read them line by line off the disk.
  • Use a ‘sliding window’ to read large files.
  • Text::CSV_XS is wicked fast… don’t parse those CSVs by hand (if performance matters).
  • LWP: not so speedy, if it matters. LWP::Curl is much faster. Or if you’re hitting a lot of different URLs, HTTP::Async for concurrent connections.
  • Eliminate startup costs with something like mod_perl or FastCGI. Check.
  • Compiling Perl without threads can buy you 15%, if you don’t need threads… but then you need to maintain your own Perl…
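
The slurping tip, sketched with core modules only (the file contents here are invented); clearing $/ locally makes a single read grab the whole file:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Make a throwaway file to read back.
my ($fh, $file) = tempfile( UNLINK => 1 );
print {$fh} "alpha\nbeta\ngamma\n";
close $fh;

# Slurp: locally undef the input record separator ($/) so a
# single read grabs the whole file instead of one line.
open my $in, '<', $file or die "open $file: $!";
my $content = do { local $/; <$in> };
close $in;

print length($content), "\n";   # 17 characters
```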

Interestingly, I only seem to take detailed notes during optimization talks… a sign that this topic interests me too much. ;)

Well, time for the closing keynotes and then lunch/evening festivities. Time flies when you’re being a geek.

11:14am: Enjoying Patrick Michaud’s talk on Rakudo Star… the first ‘usable’ Perl 6… all of which I’ve already been privy to in lurking the Perl 6 blogs etc… so I’m spending most of this time trying to figure out how to get ‘git’ to use a different transport than SSH… so I can keep our local installation of Rakudo up to date in my latest ‘protect me from myself’ firewalled vantage point on our network. Grrrr.

10:11am: Louis Suarez-Potts, PhD, of Oracle is telling the story of Open Office… another slice of open source pie that came to the company as a result of the Sun acquisition. He makes the point that the architecture of your code shapes community participation in its development; basically, it’s the argument for plugins/APIs/modularity in software design, which is the only way to properly distribute programmer workloads and maximize efficiency… you can’t have too many cooks with their hands right in the core of your codebase.

After speaking broadly on several aspects of community formation, he concludes with a somewhat rousing critique of ‘commodity culture’, encouraging the thirty-some-odd attendees not to think of his project, Open Office, as a commodity. I think everybody understands this about all software on some level, but we vary in how we articulate and respond to it.

10:00am: Allison Randal just announced that OSCON would definitely be in Portland again next year, to wild acclaim. It seems that holding the event in San Jose last year was extremely unpopular: everyone loves Portland. Me too.

9:48am: Simon Wardley is talking about management philosophies… a somewhat rare digression from technical topics here at OSCON… but after all, this is one of the keynotes, and you’ve got to keep it lite. I’m hearing a handful of familiar words: Agile and Six Sigma, for example, and the relative strengths and weaknesses of each. Simon says that Agile excels at innovation, but sucks at managing predictable processes. Six Sigma, he says, excels and sucks inversely. And no, I wouldn’t put ‘sucks’ in anyone’s mouth; Simon has an informal speaking style.

Report from OSCON 2010 – Sessions Day 2

Ed, as it turns out, was only the beginning.

After a Hunter S. Thompson-esque jaunt through the Lloyd Center district, culminating, thankfully, in not being robbed at the Motel 6, here I am once more at OSCON for a technically edifying day in the Northwest.

10:39pm: Lunch next to the Expo Hall was better than expected. Afterward, I decided to bring the MacBook Pro back to the room… finally tired of lugging it around and seeing the wisdom of netbooks and iPads. In the afternoon I attended the Perl Lightning Talks, which were a mix of really neat 5-minute demonstrations and silly entertainments… all in good fun.

11:28am: And now for a jQuery UI session, perfect for me. I’ve been wading into the jQuery world to improve my user interfaces for some time now, and am about to really get my hair wet. I subscribed to the official jQuery podcast and have been working through the episodes on my commute.

10:51am: I was hoping to attend License To Fail, but apparently so was everyone else; it filled up fast, and the ushers closed the doors. Instead I’m in Programming Websockets, an interesting enough talk that I *think* is about bringing statefulness to the web, but as the standard is still evolving, it’s far too bleeding edge to get excited about right now.

9:48am: now a guy’s picking on C++ and Java syntax. Hard to argue, but, waiting for his bright idea…

Ah, finally the rub: he’s selling the ‘Go’ language (not the game). Go is meant to be a perfect compromise between compiled (statically typed) and interpreted (dynamic) languages.

9:34am: The reception for the guy from Microsoft sounds like SNL’s Sarcastic Clapping Family. He’s from their interoperability department… which I assume sits adjacent to legal, where the patents get drawn up.

Ever repeat a word so many times, it loses all meaning? Ready, set, go: cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud, cloud…………

9:28am: A woman from SETI is calling us ‘earthlings’. Finally, I belong.

9:08am: a Facebook guy is talking about “HipHop for PHP”, which is PHP compiled down to C++ compiled down to a fast binary all in the service of… making Facebook pages load faster. And of course, they’ve open sourced it. UNH can consider using this once we earn several ga-jillion page views per day… that is, we don’t need it. Nice to know it’s there though. And I wasn’t aware that “Facebook has been developed from the ground up using open source software.”

[continue to Day 3]

Report from OSCON 2010 – Sessions Day 1

The appetizers of the first 2 days have been eaten, and the meat of the conference is now being served. I’ll be liveblogging the sessions from here on out… most recent thoughts at the top:

8:21pm (PST… naturally): I’ll fess up; the final two session slots today offered little of interest to me. Instead, I retired to the hotel bar, and ended up in extended conversation with Ed Rynerson, an 87-year-old newspaper distributor. I learned more in two hours than I would have at 10 computer programming conferences. Thanks, Ed… I’ll never forget it.

2:30pm: Next up for me: a session on Devel::NYTProf by its author Tim Bunce. This is a module that profiles how much time each section of your Perl code takes to compile and run, to help you find speed bottlenecks. Measure twice, cut once. A couple of tips on optimization, once you’ve determined it’s needed:

  • exit subs as early as you can
  • profile known workloads, don’t worry about tuning for datasets far larger than you’ll ever actually deal with
  • add caching *if appropriate*… easy to introduce bugs here
  • don’t create objects (expensive) that don’t get used
  • rewrite hot-spots in C
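
The caching tip above, sketched as simple memoization of a hypothetical slow_square(); the core Memoize module automates the same idea:

```perl
use strict;
use warnings;

# Memoize an expensive *pure* function. The caveat applies:
# caching impure functions is exactly how the bugs sneak in.
my %cache;
sub slow_square {
    my ($n) = @_;
    return $cache{$n} //= $n * $n;   # compute once, reuse after
}

print slow_square(12), "\n";   # computed: 144
print slow_square(12), "\n";   # served from the cache: 144
```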

I love this piece of advice that Tim has repeated at least thrice now: after you make an optimization, re-test, and if it runs fast enough now, put the profiler DOWN, and walk away. Compulsive performance tuning is a common affliction.

The talk is about to conclude, however, and I wish there had been examples of how to profile code running on the web rather than at the command line. Admittedly, I haven’t messed with any of this yet, and it might be dead easy.

1:51pm: Patrick Michaud, Perl 6 implementation superhero, is now showing us some Perl 6 basics. This is becoming an annual treat/tease for me. The upcoming ‘Rakudo Star’ release is not considered production ready, but rather a ‘usable’ release of the new Perl 6 language. Getting closer… I swear it’s true. It’s nice to see these folks taking their sweet time with Perl 6, raising it like a child, rather than hormone-fed beef. It’ll be better for us this way… when we, uh, eat the child. Ok, no more mixing metaphors for me!

11:32am: Eric Day runs a session on Drizzle. The minimalist website echoes the minimalist “lightweight” design… for instance, if you’re not using stored procedures, there is no need to have this feature enabled and impacting performance in your RDBMS. Another interesting tidbit is that Drizzle (a fork, or derivative, of MySQL 6) has completely dumped the MySQL users table for authentication. I like that, I think.

11:01am: Now I’m in a FOSS and EDU session. Here’s an interesting program: the Professors’ Open Source Summer Experience. Think FITSI, but with a focus on open source.

And now something for students… Undergraduate Capstone Open Source Projects. Funny quote from the slide: “The culminating experience of one’s undergraduate experience is an NDA [non-disclosure agreement].”

9:59am: Marten Mickos is up now, formerly CEO of MySQL AB, before the Sun (and ensuing Oracle) acquisition. Now he’s heading up the Eucalyptus cloud computing platform. His comment about combining his passions for open source and for making money got some laughs… but he’s right. There’s far less tension there than people think (especially, I’m imagining, in a cloud…)

But wait: Eucalyptus is available as both an on-premise or off-premise service. Is on-premise really the ‘cloud’ though? At a certain point we’re just buzzword compliant here… cloud is becoming a term for distributed, load-shared or virtualized systems whereas it used to just mean “that internet out there…”

9:00am: Well here I am at OSCON 2010, a gathering of the best, brightest and myself (heh) here in Portland, Oregon. The Open Source Conference.

Tim O’Reilly kicks off the keynotes and I am immediately reminded: these are the techno-hippies. Psychedelic visions previously confined to the brain are now pixels on the screen, and this is clearly a gathering of technologists who think in far broader terms than ROI and cost-benefit analysis. Open source is, at the core, a very self-conscious social and political movement, which I wish more people in IT understood, and would get behind.

This is becoming more explicit as the keynotes proceed. Jennifer Pahlka has now taken over to push an effort to bring open source software to municipal governments.

The CTO of the District of Columbia is up now. He leaves platform selection decisions to the technologists themselves… to the people who’ll actually implement his solutions, where the rubber meets the road. He hires experts and trusts them. Get this revolutionary off the stage, he obviously doesn’t know how to manage his subordinates properly.

On a side note: the wireless access here at the Oregon Convention Center, supporting some 3,000 geeks on laptops and handhelds, is fast and flawless. This makes me all the more angry at the hotel, whose wireless access shits the bed ritually, in support of only a couple hundred users. When will hotel wireless reach the same amenity status as running water? I would have gladly traded one for the other at several points this week. Maybe that’s why us geeks sometimes smell a little… suspect (back to the techno-hippie theme…)

[continue to Day 2]
