Sunday, June 22, 2008

Book Review: C# In Depth

C# in Depth by Jon Skeet, published by Manning Publications, ISBN 1933988363.

This book is a tremendous work for understanding how the most important features of the C# language work.  Skeet's been a prolific poster in the C# forums on MSDN for some years now, providing answers, tips and tricks, and in-depth advice to a large number of forum visitors.  This book wraps up his great knowledge of the inner workings of C# and hands it over to readers in a well-written, concise, usable book.

Skeet uses a very nice formula for the features of C# 2 and 3: he starts by demonstrating solutions to practical problems in C# 1, then shows the progression of that same solution through C# 2 and C# 3.  His walkthrough of the evolution of delegates through 1, 2, and 3 is a perfect example of this: start with the very wordy, somewhat clunky handling in C# 1 and end up with C# 3's lambda expressions.

One of the many fine things about this book is Skeet's ability to clearly cover complex topics like lambdas and expression trees at exactly the right level.  Readers will be able to pick up the power, complexity, and benefits of language features because Skeet's kept the examples practical and the text conversational.  With potentially complex topics it's too easy for authors to fall into trivial examples, or dive into overly academic discussions; Skeet does neither.  He also does a terrific job of covering the cons of particular features -- something I'm a big fan of since it helps me make informed decisions.

Part of the book's success is Skeet's solid focus on the book's topics.  He stays focused on language features and doesn't digress into software engineering or construction.  As a result, in roughly 360 concise pages he's able to hit all the major goodies like generics, delegates/lambdas, nullable types, extension methods, and LINQ.  He closes the book with a nicely laid out, thoughtful discussion of C# 3's benefits and its possible future.

This is a great book for understanding how some of the more fundamental features of C# are implemented, and how to best use them.  This book definitely belongs on your bookshelf, right next to Bill Wagner's Effective C# and More Effective C#.

(I'd love to see Wagner and Skeet in a room full of VB6 programmers, diving into a deep discussion of anonymous methods, expression trees, and lambdas.  Watching all the VB6ers' heads explode would be great entertainment.)

Wednesday, June 18, 2008

Test Execution Problem in VS2008

A nasty bug slipped through the cracks in VS2008's test execution engine.  The problem is rooted in the whole test deployment concept in Visual Studio 2005 and 2008, where all assemblies are copied to a folder underneath the TestResults folder and then executed from there.

I think this is a silly, overly complex way to execute tests, and it's particularly a problem in VS2008 because the test runner doesn't execute with that folder as its base path.  A bug in VS2008 causes the AppDomain.CurrentDomain.BaseDirectory property to be set to C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE instead of the proper subfolder under TestResults.
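If you want to see the bad value for yourself, here's a quick throwaway diagnostic -- just a sketch, assuming the standard MSTest attributes, that dumps both paths from inside a test:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class PathDiagnostics
{
    // Under the VS2008 bug, BaseDirectory reports ...\Common7\IDE
    // instead of the deployment folder under TestResults.
    [TestMethod]
    public void DumpTestPaths()
    {
        Console.WriteLine("BaseDirectory:    " + AppDomain.CurrentDomain.BaseDirectory);
        Console.WriteLine("CurrentDirectory: " + Environment.CurrentDirectory);
    }
}
```

Run it in the VS2005 and VS2008 test runners and compare the output.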

There's some good detail and a possible fix in this MSDN post.  Unfortunately, this doesn't work in my environment because the resolver's assembly path property value ("Public Assemblies;Private Assemblies") also gets appended to the path, further borking things up for me.

I'm moderately narked about the bug with the BaseDirectory slipping through, but I can understand it.  Bugs happen, they get fixed.  Hopefully there's a workaround for the bug while a real fix is being created.

I'm more narked about the underlying decision to have tests run from a location different than the project's build directory.  The overall concept is brittle and has too many moving parts.  I have to fool around with DeploymentItem attributes in code, and I have to fool around with Deployment Item settings in the test configuration.

The idea of a TestResults folder would be great if it were limited to, like, holding results.  Reports, working files, etc., would all be appropriate for that folder.  Copying all the binary bits and test support files to that folder, then running everything from that target isn't the way a real unit test framework would do it.

Unfortunately, this bug is a complete show-stopper for me because I can't work around it.  The specific issue for my current project, based on VS2005, is a crapload of tests reliant on our NHibernate infrastructure.  NHibernate's Configuration.Configure() method, invoked with no parameters, attempts to load the hibernate.cfg.xml file from the current folder of execution.  Sure, you can specify a path to load that file from, but I'd have to do significant amounts of code ugliness to alter that input path based on production, test, and dev environments.
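For context, a minimal sketch of the two flavors of the call (the file path below is a placeholder, not a real location):

```csharp
using NHibernate.Cfg;

// Parameterless: looks for hibernate.cfg.xml in the current execution
// folder -- exactly the lookup the VS2008 runner breaks.
Configuration cfg = new Configuration().Configure();

// Explicit path: works, but now that path has to be computed
// differently for production, test, and dev environments.  Ugly.
Configuration cfg2 = new Configuration().Configure(@"C:\some\env\specific\hibernate.cfg.xml");
```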

I'm waiting on more help from Microsoft on this bug, but for now it means we can't proceed with moving up to VS2008 for this phase of the project.  I'm bummed.

Sunday, June 15, 2008

VS2005 -> VS2008 Conversion Wizard Runs Every Time

I'm working through a few minor nits with a moderately sized solution (23 projects) that I've just converted from VS2005 to VS2008.  One nit is that the conversion wizard runs every time I open the solution.  Irritating.

Answer found after trolling through some MSDN posts: Test projects appear to have a few hitches during the conversion process, one of which is the FileUpgradeFlags element in the .csproj file failing to properly update.  This causes the wizard to re-run every time you open the solution containing the project.

To fix, find the element <FileUpgradeFlags>.  It will probably look like

<FileUpgradeFlags>0</FileUpgradeFlags>

Remove the zero in the element content, leaving an empty element:

<FileUpgradeFlags></FileUpgradeFlags>

Do this for every test project in your solution.  Save 'em all, re-open the solution and you should see no conversion wizard.  Yay!
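If you've got a lot of test projects, a Unix-ish shell (Cygwin or similar) can do the surgery in one pass.  This is just a sketch -- back up your .csproj files first:

```shell
# Empty out the FileUpgradeFlags element in every .csproj under the
# solution folder (run from the solution root).
find . -name '*.csproj' -exec sed -i 's|<FileUpgradeFlags>0</FileUpgradeFlags>|<FileUpgradeFlags></FileUpgradeFlags>|g' {} +
```

Or just do the edit by hand in Notepad if you've only got a project or two.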

(Original MSDN post here.)

Now if I could just get MS Test to play nicely with my hibernate.cfg.xml files in the MS Test deployed location.  I continue to search for a rational explanation of why MS Test insists on copying all binaries to a location separate from the build location in order to run its tests, forcing you to deal with "DeploymentItem" attributes and test configuration shenanigans.  I've run into a number of PITA issues with this separate-but-equal concept, and I'm having a hard time understanding the reasoning for creating a second environment that's not treated the same as the solution's build environment.  Brittle, complex, and more moving parts than necessary.

Yet another reason to prefer MbUnit: a test framework that helps, not hinders, all the testing I like to do.

Shout Out: More Effective C# Rough Cuts Available

Bill Wagner's Effective C# is one of the books you absolutely need to have living on your bookshelf if you're working in .NET.  Bill's book is crucial to understanding how certain things in .NET and C# work, and it's full of great examples on why you need to do things in a particular fashion.  He goes into exactly the right level of detail to make you understand things like the as/is keywords, and why you really need to think hard about how to implement equality checking.

Effective C# covered C# 1 with just a nod to 2.0, so it's good to hear Bill's nearing completion of his work More Effective C#, updated for all the goodness of C# 2.0 and 3.0.  You can pre-order through Amazon, and it's also available on Rough Cuts, where you can read the chapters as they're going through the editing process.

I'm really looking forward to getting a copy of this when the dead tree version comes out in November.  (Hey, maybe a birthday present for myself!)

Tuesday, June 10, 2008

Meme: How I Got Started Programming

Tim whacked me with his post, so here goes.

How old were you when you started programming?

High school, 1979.  Age of 16 or so.  Radio Shack TRS-80 with a cassette tape drive for its storage.  Not quite punch cards, but damned close.  A buddy whose parents had gobs of money bought him one. (A tape drive, for cryin' out loud!  They must have been rich!)  He and I were in a D&D group together, so naturally a geeky computer attracted us.

How did you get started in programming?

I enlisted in the Air Force in the general electronics field.  I could have ended up stringing telephone cables, but got lucky and landed a position in school for repairing massive aircraft radar systems.  I got even luckier by landing a slot for doing inflight repair, fulfilling a dream of flying I'd had since I was a kid and my dad used to take me out flying over rice fields in central California.  I digress.  At tech school in Biloxi, Mississippi, I got the chance to start programming 6502 processors using octal assembly code.  A complete PITA, but it rekindled the interest I'd found from the TRS-80 years earlier.

I got my first real computer a year or so later when I hocked the new aluminum wheels on my '67 convertible Camaro in order to buy an Apple //e computer complete with 32K of memory and a 64K video card.  Man, that was the cat's ass!  (I see Wingfield had a 128K expansion card.  He must have been living the high life, I tell you.)

I progressed to dropping in a Non-Maskable Interrupt card so I could hack into various wargames that were kicking my then bony ass.  I eventually did such crazy stuff like soldering together EPROMs onto a single socket with a DIP switch so I could toggle between the //e and ][+ BIOS sets.  I hung around a guy who wrote a voice driver for his modem to have his computer call up folks at random times, say "Asshole", and hang up.  Somewhere along the line I wrote some code that manually moved my 5.25" floppy drive's head around.

That interest slowly progressed as I was posted to Anchorage, Alaska, where I bought a 286 system from Radio Shack and started fooling around with Borland Turbo Pascal, then Turbo C++.  I eventually bought a generic 386 system from some company that went out of business the week after I bought the box.  So much for tech support.

The Air Force was kind enough to pay for 75% of my schooling while I was active duty, so I started taking night school courses at a branch of  Chapman University (then just "College").  I hit Pascal, more assembler, threw up through one Cobol course, and did some C++.

All that was just some background until I left the Air Force, wandered 'bout for a number of years, and eventually worked my way into cutting code for a living.  More or less.

What was your first language?

Whatever the variant of BASIC on that TRS-80.

What was the first real program you wrote?

Tough to nail that one down.  I remember starting one on the TRS-80 which was supposed to compute catapult damage to a castle wall (D&D, remember?), but I can't remember if we finished it.  Most likely it was some octal exercise on the 6502 in tech school.

What languages have you used since you started programming?

Surprisingly, not as many as some others: BASIC, AppleSoft BASIC, Pascal, Cobol, Assembly on a number of different procs, C++, Perl, Java, C#.  Suffered through Lisp just enough to be able to configure JDEE when I was doing Java work.

What was your first professional programming gig?

I did some writing of Perl modules for managing servers and accounts when I was a network admin, but that's utils, not cutting code.  My first "real" programming gig was as part of a team working on tools to convert SGML to HTML and XML.  (Yes, kiddies, let old fart Jim tell you about when you needed a PhD to understand stupid markup made when single characters counted so everything was implicit, not explicit.  Try figuring out an implied element closing when you're five layers deep.  Obviously I'm still scarred.) 

I only wrote code around the edges of that, so maybe my first gig was actually writing tools to strip out metadata from DTED terrain data files and stuff all that in a database for use by a really cool system designed to figure out where bad guys hide so you can more easily find them and blow them up.  I wrote some code around the edges of that one, too, as well as built tools to create database schemas in Sybase 11 and Oracle 7 for the system's underlying datastores.

If you knew then what you know now, would you have started programming?

Hell yes!  I love what I do!  I wish I could have done more of this over the years, but I've had a somewhat eclectic career due to following my wife around for her military assignments.  (Germany, Alaska, Washington DC.  It's not been bad...)

If there is one thing you learned along the way that you would tell new developers, what would it be?

Learn to learn, don't just learn technology.  What you knew five years ago is laughable now, and what you knew last year likely isn't much higher on the scale.

One more thing: Learn to have the confidence to say 1) "I don't know" and 2) "Man, that code I wrote last year/last week/this morning really sucks.  But I've learned how to write better code, so I'll do better." 

#1's never been a factor for me because I follow with "But I can find out."

#2 has only just started to get somewhat easier after reading blogs or hearing podcasts from folks like Atwood, Ching, Miller, Haack, and a bunch of other industry leaders.  If those guys can be up front about their weaknesses then why should the rest of us not be?

What's the most fun you've ever had ... programming?

I'm torn between two things.  First off, watching the head of a hard drive move back and forth as you're stepping through some driver code you've written really is cool.  SOOO much of what we do is nebulous, which I think is part of the reason we like writing unit tests: a green indicator comes on.  Seeing your code make something physically move  is just plain wicked cool.  That bit of fun was, uh, many years ago but I still remember it vividly.  (I also remember having to repair the armature after some bad code...)

The second bit of fun would be a space of a couple months after I'd joined Quick Solutions last year.  I came on board with the expectation from my boss that I'd quickly jump into a leadership role and help drive business, lead teams, write offerings, etc.  But... for my first couple months I just got to sit my kiester in a chair and code.  It had been YEARS since I'd been able to have no responsibilities but write tests, write code, geek out with other smart folks.  One of the most refreshing times of my life, and exactly what I needed after coming out of a less than optimal situation.

Passing on the love (or, tag, you're it)

Jason Haley

Ben Carey

Drew Robbins

Leon Gersing

Dave Donaldson

Saturday, June 07, 2008

Shout Out: XNA and Robotics Studio Event, 19 June in Cincy

Mike Wood and the gang at the Cincinnati .NET User Group are putting on an XNA and Robotics Studio event on 19 June at 6pm.  The event will be presented by Microsoft's Bill Steele.

Bill, if you don't know him, is a certifiable nut who does all sorts of crazy things like running MSDN events in the Heartland region, playing around with Robotics Studio, writing games in XNA, and inventing a system for displaying a small aircraft's flight information on the virtual surface created by the spinning propeller blades.

Seriously, Bill's a wicked smart, passionate guy who will definitely entertain and educate you.  Make it a point to stop by on the 19th if you're at all interested!

Monday, June 02, 2008

Playing with SlickEdit

The folks at SlickEdit provided copies of their cool Visual Studio tools to folks like myself who are members of The Lounge.

Being a tool geek I couldn't resist installing it and fooling around a bit.  I've not spent a whole lot of time with the tools, but have already found some cool bits.  The window below shows a cool documentation preview in MSDN style, a nice complement to what I get from ReSharper's Ctrl-Q.  There's also a nice regex evaluator which can generate code for you to paste in as needed.

The "Quick Profiling" bits require you to add some trace handling to your code, so I'm not sure how far I'll go with that, but it may be an interesting thing to experiment with.

There's also a very nice comparison tool as part of the Versioning Toolbox.  There's some interesting visual sugar for version graphs, history, and visualizations.  The graphs are pretty cool, and the list of supported providers is solid: SVN, CVS, TFS 2005/2008, and VSS (barf).

Perhaps one of the best widgets in the versioning toolbox is the DIFFzilla window:

This is nearly as good as my favorite comparison tool, Beyond Compare 2.  Love that directory compare feature, and you can quickly get file comparisons by double-clicking a pair of files in this window.

Overall I like what I see so far.  While I've got access to other tools which fit a lot of these needs, it's really nice to have all this wrapped inside Visual Studio.

Friday, May 30, 2008

Save The Date: Open Spaces Mini-Conf 9 July

Mike Wood and I are organizing an open spaces event on 9 July at Max Training's Mason facility.  The event's still being fleshed out, but we're looking at running from 6pm to 9pm.  Max has a crapload of space in their Mason facility, so we ought to be able to run five or six concurrent discussion groups.

Stay tuned for more details!

Tuesday, May 27, 2008

ANTS Profiler 4 EAP

I spent an hour or so working with a couple folks from Red Gate this morning to run through the latest build of their ANTS Profiler v4.0. 

ANTS 4 is still in EAP (download latest bits here), but I was awfully impressed with the state it's in.  I downloaded the bits, installed it, ran a quick session on a monster WinForms app I've worked on in the past, and quickly got down to the guts of some possible performance issues.  The UI's nicely done, and I was very, very impressed with the app's responsiveness.  I'd all but given up on ANTS 3 and dotTrace because of nearly unusable performance while trying to profile a couple apps.

ANTS 4 still has a way to go.  There are some bugs running around, and some UI tweaks need to take place, but overall I liked what I saw with this version.  You ought to go have a look at it if you're interested in an easy-to-use profiler.

<disclaimer>I get some software from Red Gate for free because of my MVP status.  That doesn't stop me from criticizing if needed.</disclaimer>

Wednesday, May 21, 2008

Recap of Cincy .NET User Group Panel Discussion

The great gang at the Cincinnati .NET User Group invited me to participate in a panel discussion at their meeting tonight.  I was honored to be up in front of the group with Tim Apke and Ed Sumerfield, both sharp guys with much more technical experience than me.  All three of us have, as Mike Wood put it, "a long history in IT," which I think was Mike's nice way of saying all of us are old farts.

The discussion was very free form and hit a lot of great topics, ranging from specialization versus generalization to estimation accuracy to workplace culture.  Dealing with new technology was a big one, as were questions about breadth of experience (outside of the strictly technical domain).  The value of certifications was bandied about, with a mild consensus seeming to form around them being good in the proper context: use 'em if you need to prove your chops early in your career, but don't overly fixate on them.

Hearing Tim's and Ed's perspectives along with those of the audience was a great experience.  Tim and Ed are wicked smart and have some great insight into how to steer one's career along.  Tim was emphatic about taking ownership of your own career.  In a sidebar after the meeting he compared taking care of your career to taking care of your car.  You keep your car's tires in good shape, change the oil, and give the vehicle a tune up on occasion.  Same concept applies to your career.  Ed also impressed the hell out of me with a number of things, not the least of which was his closing comment: a short haiku along the lines of "own your path," but he said it way more better.

One of the regulars there (forgot your last name, Jamie -- sorry!) hit me with the last question of the evening and asked how I manage to balance work, community involvement, book writing, and time with the family.  That one hit very, very close to home and got me choked up in a major way -- not what I would have preferred for my last question of the evening, but there you have it.  I don't always do a great job of work/life balance, but I've made some tough choices in my career to favor family over work.  I've passed up great opportunities, I've struggled with sub-optimal part-time positions, and I've left a high paying job to be unemployed at home in order to take care of the kids.  My career would be in a much, much different spot had I chosen career over family. 

The cons of those decisions are that I'm not some internationally recognized expert traveling around the world talking at TechEd Barcelona or running some wicked cool project in Dubai, nor am I billing out at $500 per hour while working at iDesign or some similar firm.  The pros of those decisions are that I got stay at home with my daughter and son while they grew up, and I'm not traveling three or four weeks of each month while my wife and kids live their lives without seeing me.  (Another con: costs of therapy for two kids when they hit teen years and figure out how screwed up they are after having had me as their primary caregiver...) 

So to those of you who thought I got emotional about that question, you're right.  It's really at the core of who I am, even if I don't get it right as often as I should.  (That's me, too, BTW.)

Actually, forget all that.  It was really allergies.  Sorry.

In any case, the panel discussion at the group was an excellent evening.  I hope attendees got something useful out of it!

Monday, May 19, 2008

Dealing With Technical Debt

All projects acquire technical debt in one fashion or another, and in varying quantities.  I've been looking at a couple past projects to identify areas in need of some debt repayment should the projects move into their next phases.

Apropos, then, that I ran across a great post from Steve McConnell on technical debt.  I particularly like his sections on communicating about technical debt and debt safety.  Be sure to follow Steve's links to Ward Cunningham's and Martin Fowler's articles.

Good reading!

Saturday, May 17, 2008

Book Review: The ThoughtWorks Anthology

The ThoughtWorks Anthology, published by the Pragmatic Bookshelf, ISBN 193435614X.

This is a terrific book loaded up with 13 short, concise, golden essays from ThoughtWorks leaders like Martin Fowler, Neal Ford, etc.  Each topic covers something pretty vital for those of us who care about being somewhere near the top of our chosen craft.  Topics include solving the "last mile" problem between development and release, Ruby DSLs, polyglot programming, single-click deployment, and a bunch of other great reads.  Each article is extremely well-written and useful, but I found a subset of the book particularly compelling. 

Unfortunately, I only heard parts of Neal Ford's "Polyglot Programming" at his keynote at CodeMash 2008.  I was thrilled to get to read his article in this book on how to leverage different languages on the same platform to solve different problems. 

Jeff Bay's piece "Object Calisthenics" strongly reminded me of the glorious work The Practice of Programming by Kernighan and Pike in its emphasis on clean, simple, clear code.  I'm all fired up to refresh my coding practices with Bay's exercise using nine points for pushing yourself into writing better object oriented code.

"Refactoring Ant Build Files" from Julian Simpson, along with Hatcher's Java Development with Ant, should be mandatory reading for anyone dealing with build files -- regardless of what build environment you're using.

Other big winners for me were the testing articles by Kristan Vingrys and James Bull, Dave Farley's work on one-click release, and Stelios Pantazopoulos's article on project vital signs.  Of course, the remaining articles are also winners, it's just that these six or so really struck home with me.

Overall it's a fantastic work and I'm really glad I've got it on my bookshelf!

Thursday, May 15, 2008

New Podcast: Alt.NET Podcast

Ran across references to the new Alt.NET Podcast and checked out its inaugural episode on the drive to work this morning.  The first episode was great, and I'm looking forward to hearing more from the 'cast!

Subscriptions available through iTunes or POR (Plain Ol' RSS).

Tuesday, May 13, 2008

Accurately Reflecting Agile's Value in Budgets and Schedules

I've been struggling for some time to ensure that our agile projects get a true representation of value delivered to the client when schedules and budgets are being discussed.  I think our flexibility to deliver on client-driven scope change too often fails to show how we're really doing, particularly in e-mails or on the Great Game of Business boards we use at Quick.

For example, a project I just wrapped up the first phase on used SharePoint Excel services to host a complex workbook the client uses to show total cost of ownership for heavy equipment.  The original scope was for one set of features at a specific estimated cost.  We ran over estimates on a couple tasks (Excel Services is a difficult technology, especially when you're trying to work around its limitations) and beat a couple others.  The big delta, however, was that the client had us drop a couple items and add in a significant amount of new features.  Overall the client was extremely happy with what we produced.

Unfortunately, every mail regarding schedule and budget always went out the door saying we were slightly behind schedule and slightly over budget.  I'd have to immediately follow up and say "Yes, but look at the added scope and value we're bringing to the client at their direction."  Furthermore, the GGOB board shows a variance on our project, nicely written in red because we were over the initial figures on a time and materials project. 

This mindset comes straight out of the PMP world and I'm struggling with how to get this changed in our environment.  Metrics are good, as are BVC information radiators, but only if they're a fair and accurate reflection of where a project's at.  That accurate reflection should include some measure of succeeding on client-driven scope change -- after all, shouldn't we be all over recognizing the value we've brought to the client?

I'm not sure where this will end up going, but I'm noodling over some ideas.  I'd love to hear suggestions if you've got 'em.

Monday, May 12, 2008

The Value of Specializing Generalists

I've never claimed to know all the answers.  Actually, I'd be happy if I could answer more questions than not...  That said, if I can't answer a question, I generally know just who to reach out to for answers.

This comes into play time and time again in our line of work, particularly since we're expected to be able to solve so many different problems in so many different technologies/platforms/whatever.  I ran into this recently with a performance issue and was able to solve part of the issue by reaching out to Bruce Lindman, the senior DBA we have on staff.

I'd isolated some connection timeout issues to a query that was taking almost four minutes to return.  The query was built to handle sorting out some permissions across a large set of data and was working with a lot of records from a master table and an association table.  The developer who had originally written the SQL had broken out the sproc into several different functions in order to modularize it for testing and for clarity.  Everything looked pretty good, and all of our moderately complex use cases passed just fine in the test environment.  Unfortunately, when we rolled into the test environment at the client site we found the really bad timing issue.  Ick.

Bruce immediately identified the problem: the user defined functions were returning tables of result sets -- and those tables aren't indexed.  Large amounts of data for intermediate results.  No indexes.  Small problem there.  Bruce spent 30 minutes or so rewriting the query to bring the functions back into the main sproc, tweaked a few things, and ended up cutting the query from nearly four minutes down to 22 seconds.
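The shape of the problem, sketched with hypothetical table and function names (not our actual schema): a multi-statement table-valued function hands back an unindexed result table, while inlining the same logic lets the optimizer use the real tables' indexes.

```sql
-- Slow shape: the UDF's result is an unindexed table, so joining it
-- against a large master table grinds.
SELECT m.RecordId
FROM MasterRecords m
JOIN dbo.fn_GetPermittedIds(@UserId) p ON p.RecordId = m.RecordId;

-- Fast shape: the same logic inlined into the sproc, where the
-- optimizer can use the indexes on the real association table.
SELECT m.RecordId
FROM MasterRecords m
JOIN PermissionAssociations pa ON pa.RecordId = m.RecordId
WHERE pa.UserId = @UserId;
```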

That's a great improvement in a technology area that I'm, well, frankly, sucky at best.  Being able to call in a specialist let me focus on a couple other issues which were also contributing to the performance problem.  Those other areas (overly heavy biz objects, pooling issues) were in my area.  SQL complexities aren't.

Our industry's grown too broad for us to be experts at everything, even if clients expect us to be.  I think all we can ask for is that we form groups or teams of specializing generalists: folks who are very solid at a wide range of skills, but experts in one area.  That lets us have a cadre of resources to reach out to when we're in over our heads, which happens more often than we'd like to admit...

UPDATE:  One blog reader, blocked from commenting by an evil corporate proxy, pointed out a great article by Scott Ambler on this same topic.

Sunday, May 11, 2008

Book Review: Subject To Change

Subject To Change, by Peter Merholz et al. of Adaptive Path, published by O'Reilly.  ISBN 0596516835.

This book at first glance seems awfully similar to Scott Berkun's The Myths of Innovation, but there's enough distinction to make it worth a separate read.

Subject to Change is short, concise, and very well-written.  It offers up insights on how companies can be more flexible to meet market changes by working in new ways for solid customer research, product design, and agile approaches, among other things.

The book's nicely done and is filled with good examples of how some companies have come up with concepts which completely changed the industry.  Kodak's first box camera is an example of a product which fundamentally changed how companies treated their customers.  ("You press the button and we do the rest.")  Kodak also gets slammed for their ignorant approach to the digital camera age -- failing to adapt to a changing environment isn't a great way to run a business...

This same theme runs through the book: approaches that have worked wonders for companies contrasted with flops that haven't.  Successful approaches almost always come from businesses which have spent time understanding their customer base; flops come from companies which do silly things like create hardware which is feature-scarce, expensive, and hard to use without having ever talked to a customer.

I liked most all the chapters and found the ones on design competency and agile particularly interesting.  No surprise about me liking the agile chapter since I'm a nut about agile software development!  There are also a number of great discussions on brainstorming UIs, layouts, and product prototypes, something which I think gets little or no coverage in other works.

Overall it's a good read.  It didn't grab me as much as Berkun's Innovation book, but it's a solid addition to my bookshelf all the same.

Wednesday, April 30, 2008

Troubleshooting: Simple Stuff First

I spent a lot of years running and fixing radar systems on big airplanes while flying around cool places like Iceland, Saudi Arabia, and Alaska.

One of the hardest lessons I learned (or had beaten into me by crusty old sergeants) was that you always, ALWAYS look for the simplest solutions first when troubleshooting.  In my early years I was always diving into the books to pull out the four-page foldout schematic instead of first using some elementary deduction to get some quick checks out of the way.  This same principle carries directly over to ANY form of troubleshooting, not just trying to figure out why dots aren't appearing on a radar screen.

Today I had a good reminder of that when trying to figure out a problem in a system we built.  Custom security roles weren't being recognized by one component which maps records in a database table to business objects based on an enumeration.  I jumped into the debugger and started stepping through the component where the disconnect was (sorry, no test around it, and it had been months since I'd been in that part of the code).

I quickly found this wasn't the best use of my time, and instead went to a very rudimentary step: double-checking the values in the database.  Some close examination immediately led me to the source of the problem: someone had added spaces to the role names in an attempt to make them more readable.  "UserType" became "User Type" and "BusinessFunction" became "Business Function".  This may look nice for humans but causes some issues in code...
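The failure mode is easy to reproduce.  Here's a minimal sketch of mapping stored role names to an enumeration (in Java rather than the C# of the actual system, and with hypothetical names -- the real component and roles aren't shown in this post):

```java
// Roles as the code expects them -- no spaces allowed in enum constant names.
enum Role { UserType, BusinessFunction }

public class RoleMapping {
    // Map a role name string, as stored in the database, to the Role enum.
    static Role parseRole(String stored) {
        return Role.valueOf(stored.trim());
    }

    public static void main(String[] args) {
        // Works: the stored value matches the enum constant exactly.
        System.out.println(parseRole("UserType"));

        // Fails: a space added "for readability" breaks the lookup.
        try {
            parseRole("User Type");
        } catch (IllegalArgumentException e) {
            System.out.println("Lookup failed: " + e.getMessage());
        }
    }
}
```

The lookup is an exact string match, so a single well-intentioned space in the data silently breaks the mapping -- exactly the kind of thing a quick look at the raw table rows catches faster than a debugger session.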

That quick check saved me from losing more time in the debugger trying to wind my way back to where that initial comparison was happening.  The simple thing won out over the more complex thing.

We geeky types like solving tough problems and too often we get carried away with looking for the harder answer when the simple answer is sitting right in front of us.

(On a related note, if you haven't already, grab a copy of Debugging by David Agans and read through it several times.  Highly worthwhile to help polish your troubleshooting skills.)

Tuesday, April 22, 2008

Any Wrap up of CODODN I Wrote Would Suck Compared to...

Andy Erickson's video summary.

'Nuff said.

MVP Summit Wrap Up

Like Nino, this was my first summit even though I've been an MVP for three years.  I spent most of my time in SharePoint sessions; however, I sat through part of the C# Language Future discussion (my head still hurts) and a wicked cool glimpse into Rosario (VNext of TFS).

I hooked up with a lot of old friends and made a lot of new ones while I was there -- and spent a LOT of time talking with Jef about a lot of important things for building a great development team such as culture, doing agile right, etc., etc., etc.  Seeing Keith Elder get on stage and sing karaoke (and break the process) nearly paid the price of admission for me.  I also got to meet Roy Osherove and Oren Eini (Ayende) which was great because both of them helped me out with the book.

I also had a couple nice chats with Ben Curry who I met at SharePoint Connections in Las Vegas last November.  He's a wicked smart SharePoint MVP with a lot of experience in huge-scale SharePoint jobs.  I can't wait for his book on SharePoint Best Practices to come out.  Additionally, I met Eric Shupps who is one of several folks I'm working with on a cool SharePoint project -- more on that when things gel.

Perhaps the funniest thing about the Summit was sitting in a session on Excel Services while I was IMing back and forth with Phil about a problem we were overcoming.  I'm heads down with Phil confirming our approach, then look up and see a demo using Excel Services -- and see a solution to the exact problem we're working on.  The ES team dev who built the demo was in the back of the room, so I grabbed him after the session and got some validation that we'd identified all the issues and were solving it in the right way.  That was worth the entire trip out to Redmond!

One thing that struck me about Microsoft's attitude in nearly every session: they are actively seeking feedback from the community on current features, planned features, and general pain points -- I mean "opportunities."  I will (and do) bust Microsoft's chops on any number of issues, but the effort they're putting into changing their culture is amazing.  I was part of some tremendously productive conversations with the Visual Studio team, and listened in as the Really Smart SharePoint folks conversed with any number of different MOSS team folks.

Overall it was a great trip and I'm looking forward to the next summit!

(Oh, and my Tweets went from four to 70 in one week...)

Tuesday, April 15, 2008

CodePlex Session at MVP Summit

Sara Ford and some of the CodePlex gang will be putting on a session around where CodePlex is going.  It will be held at the Sheraton Hotel Seattle - Queen Anne Room on Thursday, from 2-5.

Stop by if you're interested in seeing the future of CodePlex (or getting loads of swag)!
