Great Lakes Software Excellence Conference

What a nice conference! Emphasis on nice. First Joel Adams, chair of Computer Science at Calvin College, got up and welcomed us. Calvin College is where GLSEC was held, and it looks so shiny and new I wondered if it had just been constructed. I stayed in a room that was about 200 feet from where I gave my talk.

Then the mayor of Grand Rapids, George Heartwell, gave us software dudes a nice little welcome speech. He implored us to see downtown Grand Rapids and then talked a lot about jobs and growth. I know Michigan has had hard times, but from the looks of the airport and Calvin College (which is about as scientific a survey as it gets) Grand Rapids must be a city on the grooooow.

The opening keynote was given by Michael Cloran, the CEO of Interactions.

He started off with a live demo of the "Service Factory." He spoke, through his phone, to a computer, and it parsed his voice and answered his questions. It was scary impressive. A typical line from the extended dialogue he had with the machine: "I have a flight at 5 or 6 today from Atlanta to London, oh wait, I mean Chicago to London, I was in Atlanta yesterday, and I was wondering if you could tell me the gate and exact times." And then it nailed the response. Whoa.

Of course he then revealed that the secret behind the fabulous voice recognition was real human beings. He went on to explain that this is part of what he calls "the second service revolution." The first industrial revolution was about craftsmen; the second was assembly lines where each person did one thing. And that's the deal with his call center. Other call centers have employees who either know or look up all the answers. This takes lots of training -- which is bad news in an industry with high turnover. Cloran's company breaks up the calls and sends each caller's utterance to a different "Intent Analyst." So an IA sits in front of a computer and gets a snippet of audio on his or her headset along with some text on the screen describing the context. The IA then indicates to the computer what the person wants, and the software does all the magic of providing the response while the IA is off listening to another snippet.

What's really interesting is that about 60% of calls are assigned to two different IAs, without their knowledge, and they get points if they both provide the same response. The one who gets there first is given double points. Their point average is directly linked to their hourly pay. Of course, incentive pay inspires people to try to game the system, so they've been constantly tweaking it to eliminate "cheating." An example Michael provided: there are common mistakes new employees make, so seasoned pros would quickly imitate those mistakes to get more points. They had to write algorithms to catch the cheating and penalize it with triple negative points. Reminds me of Joel Spolsky's classic "The Econ 101 Management Method."
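The double-keying scheme is simple enough to sketch. Here's a hypothetical Ruby version -- the names, point values, and structure are my guesses, not Interactions' actual system:

```ruby
# Hypothetical sketch of the double-keying incentive scheme -- point
# values and names are my invention, not Interactions' real system.
BASE_POINTS = 10

# One analyst's answer for a snippet: who, what intent, and when.
Response = Struct.new(:analyst, :intent, :timestamp)

# Score a call that was secretly routed to two Intent Analysts:
# agreement earns points, and the faster analyst earns double.
def score_pair(a, b)
  return { a.analyst => 0, b.analyst => 0 } unless a.intent == b.intent

  first, second = [a, b].sort_by(&:timestamp)
  { first.analyst => BASE_POINTS * 2, second.analyst => BASE_POINTS }
end

# A detected gaming attempt (like deliberately copying a rookie
# mistake) gets triple negative points, as Cloran described.
def penalize_cheat(scores, analyst)
  scores.merge(analyst => scores.fetch(analyst, 0) - BASE_POINTS * 3)
end
```

So `score_pair` pays out only on agreement, which is exactly what makes imitating common rookie mistakes profitable -- and why the penalty function has to exist at all.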

Of course, as a start-up they made their fair share of mistakes. In the beginning they made a proof-of-concept demo, and then when they got funding they just kept modifying it instead of re-writing some very sloppy code. This cost way more time than it saved. And they didn't write unit tests until the complexity threatened to overwhelm them. One of the things he most regrets is separating the QA and Dev teams. They thought that Sarbanes-Oxley demanded such a separation (later they found out it didn't), and this separation led to lots of inefficiency, anger between the teams, and morale problems.

They also had IT problems. In traditional IT you can delay something as long as you want without much penalty, but if you approve something that fails, you're in huge trouble. So what incentive is there to move fast? His answer: analyze the risk and reward, put it in front of the business, and then don't freak out when trouble comes.

All in all, one of the most interesting keynotes I've heard in a long time.


"Encapsulated Process Objects" - Jeff Dalton

Jeff's deal is that he combines Agile and CMMI into one thing. Which is interesting, because CMMI is usually associated with non-agile waterfall projects. Dalton stated that most devs see process as a rigid, audit-driven thing that demands slave-like adherence. But he doesn't. He sees CMMI as much more flexible than most people realize. His "Encapsulated Process Objects" are meant to be a smorgasbord of choices you can pick from when developing a project.

Unfortunately he went kinda fast and used a lot of terms I wasn't familiar with. For example, when talking about "verification" in CMMI he mentioned that you could use either pair programming or Fagan inspections or Test-Based Design. Is Test-Based Design the same as Test-Driven Development? And what's a Fagan? (insert Dickens joke here) When fulfilling Feature Validation you could use "Use Cases," "Simulations," or "Prototypes." Uh, OK. Are use cases user stories? 'cause I've used those before.

So I felt kinda lost. But I did make a mental note to contact him if I'm ever in a place that demands CMMI, so I can figure out how to wedge in Agile/XP.


Here's a strange thing about GLSEC: I only saw about 10 other open laptops during the whole day. How do these people attend a session without a MacBook Pro to type on? The wifi was wide open!


"Taking the 'Pro' Out of 'Process': Test Process Improvement for Everyone" - Jess Lancasteru

This talk was about improving your QA team's processes. Jess started out by saying that test process improvement is hard because you don't know what to do, when to do it, or how to do that which you eventually decide to do.

Jess is a big fan of the Test Process Improvement Model (TPI) and recommends this book:
"Test Process Improvement: A step-by-step guide to structured testing"
as a way to assess and improve your testing processes.

I have to admit that during this talk I sat near a window, so as to be near a power outlet for my MacBook Pro, and it was a beautiful fall Michigan day. 70 degrees, beautiful colors, trees waving gently in the wind. It was hard not to stare out the window. I may have missed some things.

Jess fessed up early to re-using an internal slide deck for this presentation, and it showed. Some key terms weren't defined, and the points on the slides sometimes seemed a bit off from what he was saying. If you are interested in using TPI, he recommends an Excel spreadsheet found here:
http://www.sogeti.nl/Home/Expertise/Testen/tpi_downloads_uk.jsp
It helps you use TPI to score your testing process.


"The Art of Refactoring" - Kealy A. Opelt

Kealy started out her talk with a shout-out to Martin Fowler and his seminal book "Refactoring". She defined refactoring thusly:
Refactoring improves design but does not change observable behavior.
And she cautioned that refactoring is dangerous without unit tests.

Why refactor?
  • Improve design
  • Make code easier to understand
  • It helps you find bugs
  • It helps you program faster

The rest of the presentation was workshop style in that she would identify a code smell (say "Long Method") and then use one of the prescribed cures to make it better (like "Extract Method"). She even gave us real paper handouts on which we could practice with our neighbors -- I haven't done worksheets since my teaching days. Wait, do W2 forms count as worksheets? Or 1040 forms? Anyhoo, it was some old school fun.
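To give a flavor of the exercise (this is my own toy Ruby example, not one of Kealy's handouts): a Long Method that formats, prints, and totals an invoice inline gets its chunks pulled out into small named methods via Extract Method.

```ruby
# Toy illustration of Long Method -> Extract Method (my example, not
# from the workshop). print_report used to do everything inline; the
# header and the total now live in their own named methods.
class Invoice
  def initialize(items)
    @items = items # array of [name, price] pairs
  end

  def print_report
    print_header
    @items.each { |name, price| puts format('%-10s %8.2f', name, price) }
    puts format('Total: %.2f', total)
  end

  # Extracted: the summing logic, now testable on its own.
  def total
    @items.sum { |_, price| price }
  end

  private

  # Extracted: the banner that used to be pasted inline.
  def print_header
    puts 'Invoice'
    puts '-' * 20
  end
end
```

The behavior is identical before and after -- which is the whole point of the definition above -- but now `total` can be unit tested without scraping printed output.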

Other examples were:
  • Feature Envy -- when an object is interested in another object much more than itself. Solution: move methods into the envied class.
  • Data Clumps -- a bunch of variables that have to do with each other. Solution: create a class to hold that data.
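A quick Ruby sketch of the Data Clumps cure (again my own toy example): if `start_date` and `end_date` always travel together through your method signatures, promote the pair into its own class.

```ruby
require 'date'

# Data Clumps sketch (my toy example, not from the talk): start_date
# and end_date always travel together, so they get promoted into a
# small class that can also own the behavior on the pair.
class DateRange
  attr_reader :start_date, :end_date

  def initialize(start_date, end_date)
    @start_date = start_date
    @end_date = end_date
  end

  # Logic that used to be copy-pasted wherever the pair showed up.
  def days
    (end_date - start_date).to_i
  end

  def include?(date)
    date >= start_date && date <= end_date
  end
end
```

Once the clump is a class, methods like `days` and `include?` have an obvious home instead of being re-derived at every call site.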


"Using Metrics to take a Hard Look at Your Code" - Me

The talk went OK. It's a presentation that shows a fair amount of Ruby code and the crowd had only a few Ruby users so that may have put a damper on things. Also they didn't really laugh at any of my jokes. But I soldiered on and I think a few people got some value out of it. A software development track at an XP/Agile conference is an interesting thing in that most people are going to be out of their element. Much as I was confused by the CMMI guy's jargon, my audience had to struggle with some unfamiliar terms in my talk. In a way, it's a good thing to be exposed to new ideas. But in another, much more real way, it can be a little depressing.


"Moving from 1.0 to 2.0: How We Brought an Application From Client-Specific to Generic and Marketable" - Sam Williamson

This talk was about how a startup made all the wrong decisions early on and yet managed to right the ship and get some working software into the wild. And then modify it to be generic enough to be sold to many different companies.

Sam is part of a small distributed team and he recommends: http://www.jingproject.com/ to make quick screencasts that can be viewed on the web. They used Jing daily during standup to show what they had been working on the day before. Sounds like his standup was a little more heavyweight than the ones I'm used to but that makes sense when everyone works in different cities.

Early on they had no source control, no tests, and no consistent way to build the project. After getting kicked around a bit, they added in tests, Visual SourceSafe, CruiseControl.NET, and agile practices (they had to do remote pairing with screen sharing).

They had to interface with lots of different systems: a shared drive on a mainframe, a database, an actual service. They wrote interfaces to wrap all of the external systems to minimize the coupling. Not only did this insulate them from changes in the third-party systems, they could now go to other clients with different backends, and all they had to write was a new bit of code implementing the interface. Huzzah for good design!
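In Ruby the wrapper idea looks something like this -- all class and method names here are my own invention, not Sam's actual code. Each backend hides behind the same tiny interface, so the application never knows which one it's talking to:

```ruby
# Hedged sketch of wrapping external systems behind one interface
# (names are my invention, not from the talk). Each adapter exposes
# the same #each_order method, so swapping a client's backend only
# costs one new adapter class.

# Adapter for a flat file dropped on a shared drive (lines injected
# here so the sketch stays self-contained).
class FlatFileOrderSource
  def initialize(lines)
    @lines = lines
  end

  def each_order(&block)
    @lines.map { |line| line.split('|') }.each(&block)
  end
end

# Adapter for a database-ish backend; anything that hands us rows.
class DatabaseOrderSource
  def initialize(rows)
    @rows = rows
  end

  def each_order(&block)
    @rows.each(&block)
  end
end

# The application code only ever sees the #each_order interface.
class OrderReport
  def initialize(source)
    @source = source
  end

  def count
    n = 0
    @source.each_order { n += 1 }
    n
  end
end
```

`OrderReport` works identically against either source, which is exactly the insulation the talk described.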



Last night when I got off my plane from Chicago, I realized that I would be in Grand Rapids for less than 24 hours, so I could check in for my flight to Detroit (and then Orlando) before leaving the airport. What a jet-setter I am. I have to say that I always anticipated that being a jet-setter would be more glamorous.



"Agile is More than Just Makeup: Going the Distance with XP" - Michael Swieton

Practices matter. There's a tendency in crunch time to say "screw pairing, we need to go fast," but abandoning your practices during troubling times "is like turning off the light just when it gets dark" -- Ron Jeffries.

This was a pretty general talk about agile practices, testing, mocking, and dependency injection.

I'm kinda running out of steam here. It's 10:28pm and I'm in coach on my 3rd flight in the last 30 hours so while Swieton's talk was good, I'm giving it a bit of the bum's rush.


Jason Huggins - Final Keynote

There's another keynote to discuss? Ah hell.

But seriously folks, Jason is the creator of Selenium (a web testing framework that is definitely not named after the cure for mercury poisoning), a former colleague of mine at ThoughtWorks, a recent quitter of the Googleplex, and is now trying to get a start-up together. His new venture is called "Sauce Labs" and is dedicated to putting web testing on the cloud. Instead of running your 1000 click-through-the-website tests in serial, he'll help you start up 1000 servers on EC2 to run them crazy fast.
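The serial-vs-parallel idea is easy to sketch in Ruby, with threads standing in for EC2 nodes (this is purely my illustration, not Sauce Labs' actual API):

```ruby
# Toy sketch of fanning a test suite out across workers, the way
# you'd fan it out across EC2 nodes. Threads stand in for servers;
# this is my illustration, not Sauce Labs' actual API.
def run_parallel(tests, workers: 4)
  queue = Queue.new
  tests.each { |t| queue << t }
  results = Queue.new

  threads = Array.new(workers) do
    Thread.new do
      loop do
        test = begin
          queue.pop(true) # non-blocking pop; raises when queue is empty
        rescue ThreadError
          break
        end
        results << test.call
      end
    end
  end

  threads.each(&:join)
  Array.new(results.size) { results.pop }
end
```

With CPU-bound fakes this buys nothing in MRI, of course; the real win is when each "test" spends its time waiting on a browser on some remote box.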

Jason's talk was mostly about how the transition from silent documentation (another name for a manual) to screencasts is similar to the transition between silent movies and the talkies in 1927. He talked about how David Heinemeier Hansson's Rails screencast in 2004 changed the game for documentation in software, and we're only just beginning to realize the fallout. Huggins really digs Castanaut for automated screencast generation, even if it does talk in a scary computer voice.

All in all, GLSEC was a good time and I'm glad I was able to attend. The hotel and conference staff were really nice and very helpful and so I'd like to thank them here. In a blog they will never read. OK, time to get some airplane sleep for tomorrow is Ruby Conf!

Comments

Anonymous said…
As the only other MacBook Pro in your metrics session I absolutely got value from it. So I would consider it mission-succeeded.

-adam
Anonymous said…
Jake,

Nice review of GL-SEC, love your other posts. Thanks for writing about my "Encapsulated Process Objects" presentation and I'm sorry if I went too quickly (they told me I had 35 minutes when they had said 60 . . . ). I use "generic" terms because I don't want to be too specific (re: XP, Scrum, etc) so "use case" might be thought of as a user story and Test-Based Design could be thought of as Test-Driven Development. Either way, the examples you brought up are ALSO great examples of techniques we can use to satisfy the CMMI - and get real value out of process (processes that we already use in Agile development, by the way).

Oh, Fagan inspections don't come from Oliver, but from Michael Fagan, and they are one example of a rigorous code review technique.

Thanks again and thanks for your comments. I've got a new blog to read now! Mine is at http://asktheCMMIAppraiser.com
