Saturday, June 30, 2007

On Getting Over the Hump (Or How I'm Finally Getting Features Done)

My first couple months at my new job with Quick Solutions have been wonderful.  I work with a bunch of great folks, I'm doing Agile development in a great environment, and I am heads down on project work somewhere between 65% and 80% of my time, which is just fantastic.  Time formerly spent on the phone trying to swing deals with unlikely prospects has vanished, and the only SOWs I'm involved with writing now are those which have a solid chance of being signed by the customer.  I'm still commuting a long time (2.5 to 3 hours per day), but I'm ridesharing with a great co-worker who's here on an H1B visa from India — so I get a lot of cool discussion time on wide-ranging topics.  (Plus he likes Dot Net Rocks so we're spending lots of time listening to good podcasts.)

In short, life is good.

That said, I had a hugely ironic deja vu moment when I first started on my current project: it's amazingly similar to a technical manual viewer I worked on years ago.  That project cratered due to all the classic symptoms of a failed IT project: poor development practices (I left a company because of them), weak project management, and little involvement with the true end users.  I was far too emotionally involved with that project because it was targeted at helping active duty Air Force folks work with the scads of tech manuals required in their daily work.  I'd been one of those guys for 11 years, and I wanted that viewer to get deployed come hell or high water because I knew how badly the paper tech manuals sucked.  Both hell and high water came, and the system got put on the shelf without ever being fielded.  That sucked.

Obviously it’s something I’m still a little obsessed about, but I’m working on that.

Fast forward six or so years.  I start with QSI, spend a couple days sitting in on a terrific SharePoint 2007 Architectural Design Session with a smart guy from Microsoft, then head back to QSI's dev center to find out what I'll be working on.  I sit down with Jeff, the project engineer, to get the details.  Deja vu: it's another technical data viewer.  Since I had previous experience with a system displaying technical manual content, I took a feature for displaying HTML content in a WinForms WebBrowser.

To make a long story short, my feature turned into a long slog through the poorly documented and very twitchy WebBrowser control, the powerful but somewhat mystical XslCompiledTransform class, and a crapload of work centered on getting poorly authored HTML documents converted into DocBook 5.x format.  The work was very frustrating for me because there wasn't a whole lot of coding involved, just days and weeks spent slogging through standards, documentation, and blog posts.  Furthermore, when I could get some coding done it was usually centered on trying to nail down the WebBrowser's flat-assed WEIRD eventing behavior, particularly when loading content into its Document properties or trying to deal with the infamous "about:blank" URI.  Test driven development?  Not particularly workable in that context.  Ergo, no green lights on the test runner, which is always frustrating for me.
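For anyone fighting the same control, a guard along these lines keeps the about:blank noise from kicking off your real work too early.  This is only a bare sketch with made-up names, not the project's code:

// Bare sketch, hypothetical names.  The WebBrowser control swings through
// about:blank before it accepts streamed content, and DocumentCompleted
// fires for that blank page too, so gate the handler on a flag.
using System.Windows.Forms;

public class StreamedContentHost
{
    private readonly WebBrowser browser;
    private bool contentPending;

    public StreamedContentHost(WebBrowser hostBrowser)
    {
        browser = hostBrowser;
        browser.DocumentCompleted += OnDocumentCompleted;
    }

    public void ShowHtml(string html)
    {
        contentPending = true;
        // Setting DocumentText implicitly navigates to about:blank first,
        // so expect at least one extra DocumentCompleted along the way.
        browser.DocumentText = html;
    }

    private void OnDocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
    {
        if (!contentPending || browser.ReadyState != WebBrowserReadyState.Complete)
        {
            return;   // ignore the noise events
        }

        contentPending = false;
        // Safe to work with browser.Document from here on.
    }
}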

Last week was finally a huge turning point for me.  Everything began to fall into place and I've made some great progress.  I'm getting tests written (first, even!), I've got content migrated into DocBook, and I'm storing it in a database and transforming it en route from the DB out to the browser control.  I got a history list working for backward/forward navigation (not available in the WebBrowser control when you're streaming in content), and I even fired up the Enterprise Library's Crypto classes to use some DPAPI encryption/decryption to protect the content in the database.
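The history list itself is nothing exotic: since the control won't give you GoBack/GoForward when you're streaming content in, a pair of stacks covers it.  A stripped-down sketch (hypothetical names, not the production code):

// Stripped-down back/forward history for streamed content.
// Hypothetical names; eventing and error handling omitted.
using System.Collections.Generic;

public class NavigationHistory
{
    private readonly Stack<string> back = new Stack<string>();
    private readonly Stack<string> forward = new Stack<string>();
    private string current;

    public bool CanGoBack { get { return back.Count > 0; } }
    public bool CanGoForward { get { return forward.Count > 0; } }

    // Call this each time new content gets pushed into the browser.
    public void Visit(string contentKey)
    {
        if (current != null)
        {
            back.Push(current);
        }
        current = contentKey;
        forward.Clear();   // a fresh navigation wipes the forward trail
    }

    public string GoBack()
    {
        forward.Push(current);
        current = back.Pop();
        return current;
    }

    public string GoForward()
    {
        back.Push(current);
        current = forward.Pop();
        return current;
    }
}

Callers check CanGoBack/CanGoForward before calling the matching method, then hand the returned key back to whatever streams the content into the browser.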

It’s nice to see some dots going on my short list of features on the project wall.  Folks on my team have been great about not giving me any serious guff on my lack of progress because they’ve realized how sticky a wicket the feature was.  (There’s been some good-natured ribbing on my lack of feature sign-off, but it’s usually been started by me poking fun at myself.)

One tangential note: the feature was badly underestimated, and its real complexity blew well past what we initially planned, but we were also able to do a serious Agile redirection en route and make a significant change in the entire feature design to support a huge change in the customer's needs.  Initially one of the customer team members had mandated the need to keep content in HTML.  We spent a couple weeks going down that road while pushing them to look hard at moving to XML and DocBook.  The customer finally agreed, and we were able to get them in a much better place for their current needs as well as for any future content work they'll do.  Yay for Agile work where everyone's friendly to requirements change!

I’ll post up some details on the hard knocks I’ve gone through with the WebBrowser, and perhaps write up a bit on how friggin’ easy it is to use EntLib for DPAPI work.  You’ve already seen some on my XslCompiledTransform work and I’ll have some more content in that area soon too.
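Until that DPAPI post happens, here's the gist: as I understand it, EntLib's DPAPI provider rides on the same Windows Data Protection API the framework exposes directly through ProtectedData, so the underlying round trip looks roughly like this (bare-bones sketch, not the actual EntLib calls):

// Bare-bones DPAPI round trip via the framework's ProtectedData class
// (add a reference to System.Security.dll).  EntLib's DPAPI symmetric
// provider layers configuration on top of this same Windows API.  Sketch only.
using System.Security.Cryptography;
using System.Text;

public static class ContentProtector
{
    public static byte[] Protect(string plainText)
    {
        byte[] raw = Encoding.UTF8.GetBytes(plainText);
        // LocalMachine scope lets any account on the box decrypt;
        // CurrentUser locks the data to the encrypting account.
        return ProtectedData.Protect(raw, null, DataProtectionScope.LocalMachine);
    }

    public static string Unprotect(byte[] cipherText)
    {
        byte[] raw = ProtectedData.Unprotect(cipherText, null, DataProtectionScope.LocalMachine);
        return Encoding.UTF8.GetString(raw);
    }
}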

All in all, I’m very, very happy with where I’m at now, particularly since I’m getting some features on the wall with dots on them.  (The original feature is still undotted, but it’s getting close to sign off!)

More on Vista's Security and UAC

Aaron Margosis, creator of the killer utilities PrivBar and MakeMeAdmin, has two great posts on Vista's security: one on the role of RunAs and how it replaces MakeMeAdmin, and another on why you can’t bypass the UAC.  The gist of his first post seems to be that Vista’s security model for regular users is much easier to deal with in a secure fashion than before. The second post has some good general information on privilege escalation, and why doing things in a *nix-like fashion with setuid or sudo would be a Bad Thing.  (Actually, ISTR from my *nix development time years ago that they’re a bad thing in the *nix world, so it’s not really a great example for him to use, IMO.)

Both posts are fine for non-developer folks using Vista.  Those types shouldn’t have been running with admin privileges on previous OSes, and I’m glad to see the issue’s somewhat easier for them under Vista.

Unfortunately, as a developer I've found the UAC issues in Vista to be too much friction and outright blockage for my work, so, like some of my other friends doing development on Vista, I've had to disable it.  I personally think this sucks because I'm adamant about developing with non-admin privileges.  I rant about it in my Security Fundamentals talks, and I've chided other developers I've run into who've admitted to developing as admins.

I’m not sure where the future will take me with UAC on Vista.  Having to give admin credentials for escalation numerous times during the day doesn’t bug me so much, but having to deal with file permission stupidity (like Jason talks about in his post linked above) is just insane. 

I want UAC to work for me as a developer because I absolutely believe in the principles behind it.  Unfortunately, its implementation is just too big a pile of steaming, maggot-infested dung for me. 

Thursday, June 28, 2007

Sam Gentile's Reading List

I’m posting this because I keep losing the link to Sam’s great reading list on Amazon.

There are some things I’d add to Sam’s list (Programming Pearls, Code Complete, Writing Secure Code, The Practice of Programming, Effective C#, and about 50 others), but Sam’s list is a great place to start.

Tuesday, June 26, 2007

More Regarding Offline Work With TFS

Dave Donaldson read my whiny post about the friction I’ve got with TFS and working offline.  Dave details his solution to the issue on his blog, and as always with Dave, it’s a smart way to get work done.

One additional step you should take to make working offline smoother, particularly if you’re a fan of working in small iterations: Set up an offline repository so you can keep your small changes checked in and maintain a working system as you go. 

I used TortoiseSVN to create a local Subversion repository, so now I’ve got TFS for the online stuff and Subversion for my offline safety net.  If that’s not ironic then I’m not sure what is…

(Note that with TortoiseSVN you don’t need the full Subversion installation to get a working repository going.  Just use TortoiseSVN’s “Create Repository Here” context menu option to set up a repository in an empty folder.)

Monday, June 25, 2007

Team Foundation Server and Working Offline

I'm running into all kinds of pain working offline with TFS.  Loads of pain.  Absolut vodka and Camel cigarette kinds of pain.  The Team Foundation Server Power Toy is quite a help, but still not a complete salve.  Working offline is a LOT harder than it needs to be.

Good resources on dealing with working offline include Buck Hodges’ blog on the PowerToy, the Team System blog (the post’s light on depth), and John Robbins’ method which uses MSBuild to snarf out all binding info from the solution.

I realize it's v1.0 of TFS, but working offline should not be such a complete PITA.  Offline use is not some weird edge case; it's a regular occurrence for a large percentage of the developer community.  I hope there will be some non-kludgy fixes released quickly.

Saturday, June 23, 2007

Recent Non-Technical Reading

I’ve been reading some odds and ends over the last six months which haven’t been techno-geek books.  A partial list includes:

  • A few sci-fi classics I’ve re-read for the umpteenth time:
    • None But Man by Gordon R. Dickson. Just a solid, entertaining read with great characters.
    • Gateway and Beyond the Blue Event Horizon by Frederik Pohl.  Wonderful stories, both of them.  Great twists, great work with the plots, style, and voices.
    • Ender's Shadow by Orson Scott Card.  Beautiful parallel storyline to Ender's Game.  This one follows Bean around and it's full of clever second viewpoints to events in Ender's Game.
    • Northworld, With The Lightnings, and Paying the Piper by David Drake.  One of my mostest favoritest authors.  Grimy, hard stories about warriors making tough choices in harsh circumstances.  Sci-fi war at its best, but in no small part because of the strength of the characters.
    • The Faded Sun: Kesrith by C.J. Cherryh.  Amazing story of a race which is unbending in their rules — but whose love for life and exploration is the hidden core of their beliefs.
  • To Kill A Mockingbird by Harper Lee.  Wow, it's been decades since I read this. (Literally.  I'm an aspiring old fart.)  Amazing stuff when an adult can write so well from the eyes and voice of a young child.  And yeah, the story's pretty dang powerful, too.
  • The Trusted Advisor by David H. Maister, Robert Galford, and Charles Green.  Absolutely critical reading if you do any form of consulting — or any form of mentoring or advocacy.
  • The Rage and the Pride by Oriana Fallaci.  Powerful, passionate, brilliant.  A scathing rant against the fanatics behind 9/11 and the religion which has let itself be hijacked and turned away from a religion full of science, art, and peace and morphed into a boiling mass of insanity.
  • On The Wealth of Nations by P.J. O'Rourke.  OK, so Adam Smith's Wealth of Nations is something I should read some year when I've got nothing else to do and after I've finished a doctorate in economics.  Right.  O'Rourke's work is a riff on Cliff's Notes, but with O'Rourke's usual blistering commentary and wicked humor.  Highly entertaining, extremely educational, and certainly in line with O'Rourke's other great writings.
  • Band of Brothers by Stephen Ambrose.  I re-read one of Ambrose’s various works on WWII at least once a year to remind myself that I live free and happy in the greatest nation on Earth due to heroic sacrifices by people of my father’s generation.
  • All the Hellboy graphic novels by Mike Mignola.  I bought the seven or eight volumes piecemeal as I was writing parts of my book — they're wonderful, amazing, beautiful works and it was fun to work through them again.
  • The Holy Bible.  Because the technical side of my life isn’t the only part which I need to improve.

Scott Berkun on Development Methodologies

Scott Berkun’s Myths of Innovation is a terrific book that I reviewed a little bit ago.  Berkun’s also got a great blog, where one of his posts is a list of real-world development methodologies.

The post is a hoot to read, and painfully accurate.  Make sure to read the comment thread, because there’s lots of gold in there, too.

Monday, June 18, 2007

MS Test and Test Support Files

I'm not overly fond of the MS Test framework for unit testing, but it's what we're stuck with (I mean, using), so I'm pressing on with it.  The test manager makes it difficult to organize and run tests, and I greatly dislike how slow it is.

Today I’ve found some other friction in MS Test that’s even more of a PITA: External items aren’t copied to the test output directory unless you jump through several other hoops.

My current test environment uses a number of XML and HTML files as input and reference data.  I keep those files in a folder separate from my test classes since they’re used by a number of different test projects.  I use Add -> Existing Items  and add those items as links to the various projects — and then set “Copy to Output Directory” to “Copy if newer” to ensure things get dragged over as appropriate.

This works just fine using TestDriven.NET to run tests; however, MS Test adds some additional complexity in order to get such items over to the test folder.  Instead of relying on the "Copy to Output Directory" property, you'll need to call the items out explicitly via one of two methods: the DeploymentItem attribute, or adding those items to your test run configuration.

The DeploymentItem attribute looks like this:

[DeploymentItem(@"C:\vsprojects\MyProject\Tests\Test Support\XmlContentFileOne.xml")]

Unfortunately it didn’t work when added to a ClassInitialize setup method, so I turned to the test run configuration’s Deployment tab where you can add files or directories to carry over to the test run output folder.
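For reference, here's what the attribute route looks like hung on an individual test method instead, with made-up names.  Relative paths get resolved against the solution directory by default (if I'm reading the docs right), which is a bit less brittle than hard-coding the full path:

// Sketch only: hypothetical test names, relative deployment path.
using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ContentTransformTests
{
    [TestMethod]
    [DeploymentItem(@"Tests\Test Support\XmlContentFileOne.xml")]
    public void TransformHandlesSampleContent()
    {
        // The deployed copy lands in the test run's output folder,
        // so open it by bare file name rather than the original path.
        string xml = File.ReadAllText("XmlContentFileOne.xml");
        Assert.IsFalse(string.IsNullOrEmpty(xml));
    }
}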

Irritating, overly complex, and an additional layer of brittleness should I need to change anything.  -1 (again) for MS Test.

Friday, June 15, 2007

Book Review List

I post all my book reviews on Amazon.com.  They’re generally shorter than the versions I post here, but you can at least find the entire list at one spot at Amazon.

Some day I’ll get around to setting up Google’s custom search bits to troll this site.  That would give folks the ability to easily search FrazzledDad for the longer book reviews, but for right now the list’s on Amazon.

I’ve stuck a link to the list on the sidebar as well.

Thursday, June 14, 2007

XSL Transforms and Included/Imported Sheets with DTDs

Problem: Loading an XSL sheet with DTDs defined in it barfs because DTD processing is turned off by default for security reasons.

Solution: Load the XSL sheet with an XmlReader which has its ProhibitDtd property set to false:

XmlReaderSettings readerSettings = new XmlReaderSettings();
readerSettings.ProhibitDtd = false;

XmlReader xslReader = XmlReader.Create(XSLSHEET, readerSettings);

XslCompiledTransform xform = new XslCompiledTransform();
xform.Load(xslReader);

That's fine and ducky if you're working with an XSL sheet which doesn't import or include any other sheets.  Unfortunately, I'm working with DocBook's HTML sheets, which do indeed import a metric crapload of other sheets.  Those downstream sheets get resolved and loaded with default XmlReaders which don't carry along the ProhibitDtd property.

Ugh.

More Better Solution: Write a custom XmlResolver which returns an XmlReader with the ProhibitDtd property set to false:

public class XmlReaderResolver : XmlResolver
{
    private System.Net.ICredentials credentials;
    private XmlReaderSettings settings;

    public XmlReaderSettings Settings
    {
        get { return settings; }
        set { settings = value; }
    }

    public XmlReaderResolver()
    {
        settings = new XmlReaderSettings();
        settings.ProhibitDtd = false;
        settings.ConformanceLevel = ConformanceLevel.Fragment;

        credentials = CredentialCache.DefaultCredentials;
    }

    public override System.Net.ICredentials Credentials
    {
        set { credentials = value; }
    }

    // Every imported/included sheet gets resolved through here, so each one
    // is loaded with an XmlReader built from the DTD-friendly settings above.
    public override object GetEntity(Uri absoluteUri, string role, Type ofObjectToReturn)
    {
        XmlReader reader = XmlReader.Create(absoluteUri.ToString(), settings);
        return reader;
    }
}

Now use that custom resolver as you're loading the XSL (partial snippet along the same lines as above):

XmlReaderResolver resolver = new XmlReaderResolver();

XsltSettings xsltSettings = new XsltSettings();
xsltSettings.EnableDocumentFunction = true;

xform.Load(xslReader, xsltSettings, resolver);

So far this seems to be working well, although I've got other issues with how the various XmlReaders and XmlWriters aren't playing overly nicely with the XML content I'm holding in memory. More on that later.

Wednesday, June 13, 2007

Announcing CodeMash v2.0.0.8!

We’re off and running with planning the next CodeMash conference which will be held 9–11 January, 2008.  Mark your calendars!

I’ll be making various announcements here at my blog, but your best chance for breaking news will be to monitor the Google group we set up for CodeMash.  We’ll also be making various announcements at the conference’s homepage.

The first conference was a great success and generated buzz across the nation.  Seriously.

Round two of CodeMash will be even way more betterer, so make sure you’re able to attend!

Tuesday, June 12, 2007

Book Review: The Myths of Innovation

The Myths of Innovation, Scott Berkun, published by O'Reilly, ISBN 0596527055

This book is a great break from all the tech books I’ve been working through over the last months.  Berkun’s book is small, concise, and a very good read on what innovation really is — not the myths and stories we’ve come to associate with major breakthroughs.

Newton's discoveries about gravity, legends to the contrary, didn't come from inspiration after he got hit on the head by an apple; instead his breakthroughs came about after years and years of work in the field.  All of Newton's prior experiences combined to give him the ability to meld everything together and come up with some unique ideas.

Berkun repeats this idea throughout this short, highly enjoyable book: Innovation is very, very rarely some epiphany moment where an idea is spawned completely out of the blue.  Instead, innovation, inspiration, and epiphanies are the product of having laid the groundwork in many different ways. 

Berkun talks about this groundwork in several different fashions.  He talks about how good managers set up an environment which fosters creativity and innovation (think old Microsoft, current Google, or Semco in Brazil), how innovators are able to build off their prior experiences, and a number of other critical factors.

The book’s well-written, and it’s a physical pleasure to read.  The book’s small size, pleasant paper, and great photographs all combine for a, uh, innovative experience.

This book, coupled with things like Semler's Maverick or DeMarco and Lister's Peopleware, is a great addition to my bookshelf!

Sunday, June 10, 2007

Updates to WinDevPowerTools.com

It's been far too long, but I've finally gotten all the tools from our book loaded into the companion site.

One of the coolest things about the site is the “My Toolbox” badge.  James came up with this idea and it’s da bomb.  Log on to the site and you’ll be able to create a toolbox where you can load up a list of the tools you regularly use.

You can also cast votes for tools you love, and you can get a badge link to your toolbox to post wherever you'd like.  I've just added one to this blog's sidebar.  I think this is a great idea!

The site’s undergoing some iterative development.  We hope to have articles on new tools posted up soon, and we’ll be opening up the site for other folks to add tools at some point.

Give it a shot and let us know what you think!

Thursday, June 07, 2007

Playing With The New Diskeeper

<disclosure mode="full">I get some software free because of my MVP status.  Diskeeper's one of those freebies.  I'm under no obligation to blog about it or say overly fawning things when I do write about it, but you ought to know when I'm talking about stuff I got gratis.  So there you have it.</disclosure>

I desperately needed a break from beating my head against the wall of getting DocBook XSL to work using the MSXML and/or EXSLT engines, so I thought I’d take a minute to finish up a post I started a few days ago.

I’ve been using the latest release of Diskeeper (v2007 or v11, depending on where you look) and I’m pretty happy with it so far.  I’ve been running it on my personal laptop (WinXP Tablet) and my work laptop (Vista) for several weeks.

There are a lot of little features that show some nice forethought during Diskeeper’s design and development.  I like seeing small things like a performance warning when you’re trying to use Diskeeper on battery power.

The UI’s completely reworked from the older versions I’ve seen.  I can’t say I’m overly enamored with the style of the icons on the toolbar, but they’re not completely horrid.  (And you should see the sucky graphics I do, so who am I to throw stones…)

That graphical nit aside, I like the extremely flexible options Diskeeper gives me for scheduling defrags.  The automatic defrag timeline lets me block out chunks where I don’t want Diskeeper even breathing.   Being able to set different options for different days is very sweet.  (I do this despite Diskeeper’s protestations in their documentation that their “InvisiTasking” utilization system will ensure my system won’t be impacted.  That may be fine for non-developer geeks, but I’m somewhat anal about trying to keep services on my system to a minimum.)

Diskeeper’s got a fancy mode called I-FAAST (around since Diskeeper 10) that supposedly analyzes your system’s file usage and can reorganize files to help boost performance.  It sounds nice in theory, but I haven’t noticed any big gain in speed — however, I’ve not been closely monitoring this.

Some files can't be defragged or moved while the system's in operation, so you need to have your utility run in a special mode during a reboot.  This can be awfully inconvenient since these special defrags can take a long time and I always forget to set them up at a convenient time.  Diskeeper helps deal with that with another thoughtful option: scheduled reboots with a defrag.  You can set up a specific time where the system will reboot and run a boot-time defrag.  Cool.

All of the various odds-and-ends feature configurations are available on a per-volume basis, which is nice.  I can set things up so my external drive with my virtual machines and bigger files gets handled differently than my system drive.  That's a bit of sweet flexibility.

Of course there are screens with lots of shiny colors showing you how desperately you need defragmentation. 

I'm not one to blithely trust any tool's metrics on how much it's improving system performance, but there's certainly a long-term noticeable difference with Diskeeper, particularly when I'm using my various virtual machines.  I'm sure someone's done some hard analysis of defragmentation metrics, but at some point I just toss my hands up and say "Yeah, I need it and I know it does some good, I just don't know how much."

Back to the glowing positives: One final thing that I’m impressed with is the help file.  There’s good documentation about how things are done and how you can best use Diskeeper.

Overall I’m very happy with Diskeeper so far. 

Tuesday, June 05, 2007

A Laundry List of Failed IT Projects

The Times Online has an article detailing several IT projects which have cratered in the UK over the last ten or so years.  Some of the systems were pretty critical, like a system to track screenings of teachers — its failure left child molesters unchecked in public school systems.

It’s always discouraging to read about these train wrecks because it appears we as an industry aren’t getting a lot better about solving complex problems.

Ugh.

Saturday, June 02, 2007

Licensing 101, or How Not To Handle Disputes

I’ve been back and forth on the ongoing train wreck between Jamie Cansdale and Microsoft over TestDriven.Net’s support for Express editions of Visual Studio.  I’ve finally come to the opinion that Jamie’s in the wrong, and I think his arguments are a bit specious and hollow.

At the end of the day it comes down to the fact that Microsoft’s been clear that Express isn’t intended to support add-ins.  All of the other thousands of companies developing VS add-ins understand this.  What’s up with Jamie?  What’s not clear about this? 

Spending his time arguing over whether TestDriven.Net’s support uses public APIs is futile, IMO.  The bottom line is that folks at MS fought hard to get Express versions released out for hobbyist use, and there were some mandates put in place to enable that — like Express not supporting any add-ins or plugins.

Jamie runs a business and makes money off TestDriven.Net.  I'd think he'd understand the need to control certain revenue streams — after all, Jamie himself took TestDriven.Net from a freely available tool to one you have to pay for.  I doubt he'd be open to folks poring through his EULA looking for loopholes to use the hobbyist version in a commercial role.

So, with all this said, I’d like to vent my spleen on Microsoft’s handling of the matter.  Frankly, you guys suck.  Not renewing Jamie’s MVP over the flap is understandable, but not being up front and plainly saying “You’re out of luck specifically because of your position on TestDriven.Net” is foolish.  I also have a very low opinion of the style of communication coming from the initial contact person at MS.  Read through the e-mails linked on Jamie’s post and see what you think.  (Interestingly, that same Microsoftie was a roadblock to James’s efforts with Visual Studio Hacks.  Note to MS: Maybe you should consider having someone else handle interfacing tasks…)

At the end of the day I'm irritated by both sides.  MS's handling of the matter is completely stupid, but seems to be right in line with the 1,000 lb. gorilla approach they're using for the patent issues so prominent in the news.  On the other side, Jamie's taken a stance that's fundamentally wrong.  I can't imagine he'd appreciate the same sort of behavior from folks using his commercial product.

Maybe this train wreck will clear up in a nice fashion, but I doubt it.

(By the way, I’m a huge fan of TestDriven.Net.  I used it when it was free, got really pissed when he started charging for it, then plunked down the $$ to continue using it.  I also showcase it during my talk on Open Source Test Tools, even though it’s not open source.)
