Wednesday, December 18, 2013

Some simple tools for rapid mobile development

I was recently invited to attend an all-day workshop here in Milwaukee titled "Build your First Mobile App in One Day with HTML5, jQuery, PhoneGap & Usergrid."

One day, huh? I'd managed projects involving mobile applications, but had never built one before, and was really excited to get my hands dirty and learn how it could possibly be done so quickly. I was not disappointed.


Scope

The purpose of the app was to build and store a list of museums, and then display any that were within a certain distance of the user based on GPS coordinates. Mobile UI, cloud data storage, GPS location calculations, testing for our target platform/OS - quite an ambitious scope for a ~6 hour workshop. Fortunately, the process we went through to build our app exposed me to a few new tools that helped make the 'app in a day' promise a reality.


Codiqa

We started by creating the user interface. Codiqa is a web-based UI modeling and code generation tool. After logging in, you're presented with a device frame and a toolbox of common controls. From there, it's just drag and drop; plop the widgets you want onto the frame, make a few label changes, and presto! - you have a Mobile UI. To get the code files you need to continue building your app, just click the 'Download project' button, unzip the files, and you've got a working HTML5/jQuery website or mobile application. The markup that came out was very clean as well, which is not always the case with code generation tools.

They have a really nice demo area on their website where you can test it out for yourself. I liked working with it, and I think you'll find it to be a big time saver on UI work for your next project. My thanks to them for supporting the workshop with free trial licenses for all the attendees.


Apigee

So we finished our user interface. Now what about the museum data? To make the app work, we needed to store and retrieve information about the museums - name, city, GPS lat/long coordinates. How's that going to be possible in a few hours, without paying for a hosted database somewhere?

Enter Apigee. I'd actually visited their site a few months ago and picked up a free copy of their web API design whitepaper for a project at work. At the time, I had trouble figuring out what their product actually did though. Fortunately, this portion of the workshop cleared all that up.

Think of Apigee's App Services product (a.k.a. Usergrid) as data storage plus a "wrapper" around a number of common web and mobile APIs. "Backend as a Service" was the term used to describe it at the workshop: it's a collection of many of the common functions you'd look for while building a web or mobile app, packaged up nicely in a single bundle. Some of the great features their platform offers:

  • Data storage & management. This includes all data, files and assets - up to 25 GB - as well as a flexible SQL-style querying syntax.
  • Social media connectivity with activity streams.
  • User management. Create logins, assign roles and permissions, and build groups.
  • Push notifications - send up to 10 million per month.
  • Capture and utilize geolocation data from user devices.
  • SDKs are available in most of the common mobile and web development languages.
  • Best of all, App Services (all that stuff I just mentioned) is always free for developers!
Here's a shot of the Data Management workspace from my sandbox, which is what I used to create the data structures and API endpoints needed for the project:

From this point, it was easy to add a number of museums and geolocation coordinates for lookup in the app. Once the app was successfully pulling our complete list of museums, we added a filter to show only museums within walking distance of our current location, using a simple query to Apigee's API (this is JavaScript; the API returns JSON):

museums = new Apigee.Collection({
    type: "museums",
    client: apigee,
    qs: {
        limit: 25,
        ql: "location within 15000 of " +
            loc.coords.latitude + "," + loc.coords.longitude
    }
});
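
For reference, the apigee client object that the collection refers to is created once when the app starts up - something along these lines, with the org and app names as placeholders for your own account:

// Initialize the App Services (Usergrid) client once at startup.
// "yourorgname" and "sandbox" are placeholders for your own org and app names.
var apigee = new Apigee.Client({
    orgName: "yourorgname",
    appName: "sandbox"
});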

Our app was now fully functional, with all of the necessary features we defined at the outset, and we had about an hour left in the workshop. I was really impressed with this service. Getting an app (web or mobile) going with their product was so simple - the "batteries are included", and it's free to start. There can't be many other ways to get your application to market this quickly while keeping the flexibility to scale it easily if and when it takes off. Very cool product in my opinion.


PhoneGap

With the user interface and the backend both completed, it was time to start testing for our target mobile platform. PhoneGap is essentially a cross-platform packager for HTML5 mobile apps. With it (and the appropriate SDKs and software packages), you can take the same code you wrote and package it for iOS, Android, Windows Phone, and other platforms. It also lets you access many of the common device features you might want to incorporate into your app (camera, storage, contacts, etc.) with the same code on each of those platforms, without having to rewrite anything.
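
For example, the GPS position that fed the museum query earlier comes from the standard geolocation API, which PhoneGap exposes the same way on every platform it targets. A minimal sketch (the showNearbyMuseums callback is just an illustrative placeholder, not the workshop code):

// Ask the device for its current position; works identically on iOS, Android, etc.
navigator.geolocation.getCurrentPosition(
    function (loc) {
        // loc.coords.latitude / loc.coords.longitude drive the 'location within' query
        showNearbyMuseums(loc);  // illustrative placeholder
    },
    function (err) {
        console.log("Unable to get location: " + err.message);
    }
);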

PhoneGap has an easy-to-use command-line interface. To build your app, simply say:
$ phonegap build android

Once that's done, you can test it in one of the emulators installed with the SDKs by saying:
$ phonegap run android --emulator

Build, tweak and repeat until satisfied. The only caution I'll offer here is that you shouldn't expect to just download PhoneGap and start running commands. Workshop participants were actually asked to install this tool before arriving; each mobile platform has its own SDKs and supporting software packages that need to be in place before PhoneGap can build for it. The PhoneGap website provides good step-by-step instructions for each platform, but it can take a while.


Summary

As promised, at the end of the day I had a working, buildable mobile application. Download my project and check it out if you'd like. Overall, it was a very good learning experience for me, and I will definitely be looking for ways to put these tools to use. My thanks to Apigee (sponsor) and Matt Dobson (presenter) for putting on a great workshop. If you see one of their events pop up in your neighborhood, I highly suggest making time to attend.

Friday, October 11, 2013

wsubi Milestones

Just wanted to quickly share that the wsubi project I launched 10 months ago now has over 1,000 views and 100 downloads. While these are rather meager numbers as open source projects go, I'm really proud of the application, and I'm glad people have been finding it useful. I still have a number of feature ideas on the project's Issue List to incorporate, and fully intend to continue developing it over the next year (and probably longer).

If you're working on database or software projects, especially in a team environment, I encourage you to take a few minutes and check it out.

Wednesday, September 4, 2013

Surface RT - nowhere to go but up (and some ideas on getting there)

If you've read anything in business or technology news in the last couple of weeks, you're aware of the near $1 billion loss Microsoft reported recently on their current inventory of Surface RT tablets. You're probably also aware that a few days later, long-time Microsoft CEO Steve Ballmer announced plans for retirement in 12 months or less. Coincidence? You be the judge.

Those two events make it fairly plain that the first iteration of Surface RT hasn't succeeded as Microsoft hoped it would. However, they still have ~3 million of these devices in inventory, and are already talking about "Surface RT 2" (according to the articles above). So how do they make use of / liquidate the inventory they have now, and avoid ending up with another pile of excess inventory a year after launching the next version of the device?


1) Leverage the Partner Network

One of the bigger complaints about the Windows 8 ecosystem thus far is the number of apps available in the Windows Store compared to competitors. Granted, the Apple, Google, and Amazon stores have all been online for a much longer period of time. But Windows as a platform has been around much longer than Android or iOS. There are many, many small and mid-size software shops that have built hundreds of thousands of applications for previous versions of Windows over the years.

I think Microsoft needs to engage this group, specifically through the Partner Network members, and get them working on porting all of these great tools to the new platform. They could potentially solve two problems here by offering the Surface RTs they want to get rid of, at a discount, as testing devices for companies looking to build an app for the Windows Store. "Renew your MSDN account, get $100 off a Surface RT". "Buy a Windows 8 development account, get $100 off a Surface RT". You get the idea.

Companies using Windows have typically been slow to adopt the next version. Grease the wheels a little bit to get things moving. Unlike a few years ago, there are some strong alternatives for many companies when it comes time for them to upgrade their workstations. Make sure they have good reasons (like all of the applications and tools they've used for years) to continue to choose Windows.

Let me say though that if the plan for leveraging the Partner Network looks like this, I really don't see things changing very quickly.


2) Offer incentives to schools

My son is starting K4 this year, and I had my very first parent orientation last week. To kick things off, the principal spoke briefly about the school's vision and approach to learning. On the subject of technology, he proceeded to list a number of device purchases they had made over the last few years: iPads for two grades, Chromebooks for two grades, iPads for two more grades, and probably another set of Chromebooks sometime this coming year for two more grades.

See what's missing from this list? Do I have to say it?

My mom works in a different school in the area, and they're doing the same thing - equipping each student with a device; all non-Microsoft in her case as well. My guess is that this is happening in schools across the world, not just suburban southeastern Wisconsin.

Schools are gradually replacing their old "computer labs" of the 90's, which were primarily powered by Microsoft products. As they gear up (literally), there's a big opportunity here to help the students learn by using these devices, and there's no reason they couldn't be using the Surface RT. I don't think Microsoft can afford to miss the boat here. They need to be offering some incentives to schools to use their devices. According to this article, there may have been some plans for a program like that, but it doesn't seem to have been implemented. Drop whichever suit pulled the program and get back to this great idea.


3) Change who/what you're competing with

Take a look at an "average" iPad user. While this is a humorous exaggeration, there is some truth here; this person is probably never going to buy the Surface RT. They're happy with the product they have. When Apple releases the next iPad, they'll get it, regardless of what else is available.

So why bother trying to convince them otherwise? To paraphrase the immortal Ben Kenobi, "These aren't the users you're looking for."

Look, the Kindle Fire is ~$199. The Chromebook is ~$249. A Samsung Galaxy Tab 3 is ~$299. I'm not sure if they can go this low with the investment made in the current product, but I think it's where the Surface RT needs to be priced and marketed to move. I definitely think they need to target this price point and market space with the "Surface RT 2" hardware specs.

They need to attract more users to the Windows 8 / Store platform, and making a very affordable device is a great way to do that. Heck, that's how Windows PCs came to dominate the market in the first place - lower price points.


I think it will be interesting to see what the company does with this product over the next 12 months or so, especially with some of the internal transitions they have going on. I really hope they can apply that new agile mentality, look for opportunities to improve, and see the success they were hoping for with the Surface RT.

Wednesday, August 21, 2013

Create zip files using PowerShell

Continuing my series of articles on handy PowerShell scripts, I'd like to take a look at creating zip files. If you missed either of the first two articles, you can get caught up on them both here.

Zipping a file or folder manually is obviously a trivial process. However, there are many situations where you don't want to be doing it manually. You want it to happen overnight to move some backup data around, or to package up some files after a build, or to free up space on your server by compressing log files. There's also the case where zipping up some files is part of a process you need to do on a regular basis. It's not that you couldn't do it manually; it's just more efficient and reliable to automate it. This is the scenario that prompted me to come up with the script below.

I've been working on a little utility called wsubi for the past 8 months or so. The download I distribute for it is a zip file. As the project grew, each build required more care and time to ensure I had collected all the needed files, sample scripts, etc., before I created the download package. Perfect scenario to automate! Now, with the help of my PowerShell script, I just enter 'run ~build' from the application's console, and a few seconds later I have my latest build packed up and ready to go on my desktop.

Here's the code; I'm sure you'll be able to find many other good uses for it:
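
The gist of it is only a few lines (this is a trimmed-down sketch - the paths are placeholders, and it assumes .NET 4.5 for the System.IO.Compression.FileSystem assembly):

# Illustrative sketch only - folder paths are placeholders, and .NET 4.5 is assumed
# for the System.IO.Compression.FileSystem assembly.
param(
    [string]$SourceFolder = "C:\Projects\wsubi\BuildOutput",
    [string]$ZipFile = "$env:USERPROFILE\Desktop\wsubi.zip"
)

Add-Type -AssemblyName System.IO.Compression.FileSystem

# Start clean on each build, then compress the whole folder in one call.
if (Test-Path $ZipFile) { Remove-Item $ZipFile }
[System.IO.Compression.ZipFile]::CreateFromDirectory($SourceFolder, $ZipFile)

Write-Host "Created $ZipFile from $SourceFolder"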

Wednesday, June 19, 2013

Using LINQ to SQL for ETL projects

I've spent part of the past 18+ months managing a team that's been building a large-scale web application from the ground up. One of our many tasks was to move existing client data from their current web application to the new one we were working on. Having spent a few years as an ETL programmer, I decided to take on the project myself. As a manager, I think it's important to stay in the code when you can, and this looked like a really good opportunity for me to do that.

Both the old and new web applications used SQL Server databases, and had somewhat similar purposes, making the migration process between the two pretty straightforward; I just needed to decide how to write it. In the past, this is something I probably would have done with some T-SQL code. But the development team was using an OR mapping tool for the new web application, and I was really curious to see if that could work for an ETL process.

Why not just do it the old fashioned way? Well, anyone who's written a T-SQL data migration knows where the pain points with that approach are: buildout time and debugging. The appeal of using OR mapping to me was that I could quickly write the process in C#, and be able to easily step through all the code to find and fix problems. I was going to be moving years of data for multiple clients; it needed to be accurate, and I needed to know when and where it wasn't, and why. To me, these two reasons made trying the LINQ to SQL route at least a worthwhile experiment, if not a potentially better approach.

Ultimately the project was a success, but I ran into a few stumbling blocks along the way, mainly with poor performance. Buildout for the utility was very fast as I had anticipated (only a few work days), but I had to go back and rework a number of the steps during the testing process as I figured out how to address various performance bottlenecks in LINQ to SQL. Below is a "project retrospective" of sorts; some tips, tricks and tools I learned along the way.


1. SQLMetal rocks

The first time I ran sqlmetal.exe was one of those epiphany moments developers sometimes experience when they discover a sweet new tool: the heavens open, a bright light envelops you, and a choir fills the room. That might be a slight exaggeration of the event, but it is a slick utility. With a single command-line entry, you can generate a .dbml file for any SQL Server database in seconds. Just add this file to your project, put a using reference in your main code file, and blammo! - you can access any object in that database. Simple and fast. If you haven't used it before, I highly recommend giving it a try; here's a nice walkthrough on getting started.
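
The command itself is a one-liner; something like this, where the server, database, and namespace values are placeholders for your own:

sqlmetal /server:localhost /database:SourceDb /dbml:SourceDb.dbml /namespace:Migration.Source /context:SourceDataContext /pluralize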


2. Turn Off ObjectTrackingEnabled in your source DataContext

This is a really simple change, and should be implemented if you will not be modifying the data in your source database during the process. Just add the following line of code right after you initialize your connection to the DataContext of your source:
  • MyDataContext.ObjectTrackingEnabled = false;
Microsoft's documentation states, "Setting this property to false improves performance at retrieval time, because there are fewer items to track." Sounds good to me.


3. Use 'DELETE' in-line SQL commands instead of DeleteOnSubmit()

In my project, I needed to clear out my destination database before each run of the ETL process, and saw major performance gains by making this change. I had about 50 tables to reset, and was able to drop the runtime for this step from 45 minutes to 4.5 seconds. If you need to do any delete steps:
  • USE - MyDataContext.ExecuteCommand("DELETE FROM [Table]");
  • NOT - MyDataContext.Entity.DeleteOnSubmit(object);
Sorry all you purists out there; in-line SQL is just so much faster in this situation.
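
In practice this boils down to a short loop over the tables you need to reset (the table names here are just examples; mind the order if foreign key constraints are involved):

// Clear out the destination tables before each ETL run.
// Delete child tables before their parents if foreign keys are in place.
string[] tablesToClear = { "PersonDetail", "Person", "ActivityStatus" };  // example names
foreach (var table in tablesToClear)
{
    MyDataContext.ExecuteCommand("DELETE FROM [" + table + "]");
}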


4. Use explicit selects when creating your LINQ objects

You only want to bring back fields and data from your source database that you're actually going to move. If a table has 15 fields and you're only moving 2 of them, leave the other 13 alone by using specific select statements:
  • USE - var Records = from x in DataContext.TableObject select new {x.Id, x.Value};
  • NOT - var Records = from x in DataContext.TableObject select x;

5. If you need to access a data object more than once, move it to memory first

As obvious and straightforward as that sounds, I'll admit I didn't do it in the initial version of my application. It was pointed out to me by a teammate during a code review, and to his credit, it improved performance for the LINQ queries where I implemented it by 10x. I found a good number of opportunities to use this technique within loops:

INSTEAD OF:

foreach (var Record in SomeCollection) {
    string Code = (
        from s in DataContext.ActivityStatus
        where s.StatusId == Record.StatusId
        select s.Code
    ).FirstOrDefault();
} // ('Code' value is fetched from the database on every pass through the loop)

DO THIS:

// Materialize the lookup data once, up front
var ActivityCodes = (
    from s in DataContext.ActivityStatus
    select new { s.StatusId, s.Code }
).ToList();

foreach (var Record in SomeCollection) {
    string Code = (
        from ac in ActivityCodes
        where ac.StatusId == Record.StatusId
        select ac.Code
    ).FirstOrDefault();
} // ('Code' value now comes from the in-memory list on every pass through the loop)

Far fewer trips to the database yield much better performance. Duh.


6. Avoid using InsertOnSubmit() whenever possible

This function is by far, hands-down, unequivocally the biggest hindrance to moving your data in a reasonable amount of time. I learned the truth about InsertOnSubmit() the hard way in my project.

Here's an example - I had a simple table with an integer ID field and a varchar(50) string field. Oh, and it contained about 1.1 million records. Despite the size, you'd think that since the data fields are so simple, it should be pretty quick to move, right? Nope. On the first run of my application, it took a whole work week just to move this one table! That's a blazing speed of about 150 records per minute. *sigh*

So now what? Scrap the project as a learning experience and start over? Nah, I don't give up that easily. After a couple hours of research, I found the solution: SqlBulkCopy. There are a number of examples online of how to implement this class, but ultimately I decided to roll my own, since most of them had extra code I didn't need. My class is available here for download. To use it, you just need to create a List of your LINQ objects, then call BulkProcessor.Cram() when you're ready to add the contents to your database:

List<DataContext.LinqObject> ObjectsToInsert = new List<DataContext.LinqObject>();
foreach (var Record in SomeCollection)
{
    DataContext.LinqObject PersonObject = new DataContext.LinqObject()
    {
        Name = "A. Person",
        Address = "123 Main St.",
        City = "Anytown",
        Zip = 12345,
        Notes = Record.TextValue,
    };
    ObjectsToInsert.Add(PersonObject);
}
// List has been built; now use SqlBulkCopy to insert!
BulkProcessor.Cram(ObjectsToInsert, "PersonDetail", "[db_connection_string]");

Simple as that. Now that million record table takes minutes to move instead of days.
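
If you'd rather not grab the download, the general shape of the helper is below. This is a rough sketch built on SqlBulkCopy rather than the actual class behind the link - it assumes the entity's public properties map 1:1 to columns in the destination table, and it skips the error handling:

// Rough approximation of a SqlBulkCopy-based helper - NOT the downloadable BulkProcessor
// class itself. Assumes the entity's public properties map 1:1 to destination columns.
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

public static class BulkProcessorSketch
{
    public static void Cram<T>(IEnumerable<T> items, string tableName, string connectionString)
    {
        // Flatten the LINQ objects into a DataTable that SqlBulkCopy understands.
        var props = typeof(T).GetProperties();
        var table = new DataTable(tableName);
        foreach (var p in props)
            table.Columns.Add(p.Name, Nullable.GetUnderlyingType(p.PropertyType) ?? p.PropertyType);

        foreach (var item in items)
            table.Rows.Add(props.Select(p => p.GetValue(item, null) ?? DBNull.Value).ToArray());

        // One bulk insert instead of millions of individual INSERTs via InsertOnSubmit().
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = tableName;
            bulk.BatchSize = 5000;  // tune to taste
            foreach (var p in props)
                bulk.ColumnMappings.Add(p.Name, p.Name);  // map columns by name, not ordinal
            bulk.WriteToServer(table);
        }
    }
}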


Conclusion

Overall, I'm quite satisfied with the result. I think building an ETL process using C# and LINQ is certainly a viable option for the right project and situation. Some final thoughts:
  • I liked how easily I could make changes to the process. When the development team made changes to the database schema, it was simple for me to update my OR map and migration code to account for it. When I found a data issue, it was easy to alter the process to make sure it didn't happen again. It literally took a few minutes in most cases.
  • I now have what can be considered a prototype and / or reusable code for future ETL needs for the web application. For example, a new client comes on and wants data from their old system moved over. Most of the work is already done; I'd just need to change the source DataContext and a few select statements. Maybe they want a set of data sent to them from the system? Again, we now have most of the code needed to do this.
  • Just as a point of caution, the biggest database I've used this with so far is around 2GB. I'm not sure how it would perform with a dataset 5 to 10 times larger.
  • Be sure to heed my warning about InsertOnSubmit(), but be pragmatic about it. Use my result of 150 records a minute as a benchmark to determine when it's OK for you to use it instead of SqlBulkCopy.

I'd love to hear from anyone else who's attempted this with either additional tips or pitfalls to share, or with questions about my approach that I may have omitted.

Friday, April 26, 2013

I wanna be like Duane

I recently read a great article covering the retirement of a long-time Microsoft employee, Duane Campbell. Go on, take a few minutes to check it out. Read the comments, too.

I think we should all be so fortunate to have a career, and thus a fitting sendoff, like that. Glowing compliments from former coworkers. Projects and software that you helped create and nurture that changed the way the world works. Choosing to put down your keyboard (at least full-time), on your own terms, in your own time. As a programmer, I don't think you can ask for much more than that.

Now granted, this type of professional success doesn't just fall into your lap. How did he do it? He lists some points of advice in the article, which I'll summarize here:

  • Randomly try new things.
  • Look for small things that can scale.
  • Stand out by being yourself.
  • Know what you’re good at, and do what you love.
  • Know the code, and always strive to make it better.
  • Appreciate the scale and pace of change.

At the risk of making a late 80's pop culture reference, I think some programmers today "wanna be like Mike". They call themselves things like "rockstars", "ninja-neers", "unicorns" and a number of other outlandish monikers, attempting to make themselves appear larger than life, or even somehow legendary. We mere mortals should be so favored as to have one of these unique, graceful geniuses grace one of our projects with their presence.

Blech!

Me? I'd rather be like Duane.

Wednesday, March 20, 2013

Clean up old files using PowerShell

I've been working on this blog for just over 6 months now, and my most popular article, by far, is about using PowerShell to extract worksheets from an Excel file. I basically took an old script idea I'd found really useful in a past project, and created a version of it in PowerShell. It was fun to write, and since there's been a lot of interest, I decided to do another one; this time, to help with file cleanup.

Doing a quick Google search (which may be what brought you here) will show you a number of variations on PowerShell scripts for deleting files. I'm sure many of them are perfectly adequate for the task, and in some cases, have features that mine doesn't. My solution excels at code readability and control, the latter of which I feel is fairly important when deleting files in bulk.

Not much else to say about it I guess; the purpose and uses of this script are pretty straightforward.

Here's the code:
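
A trimmed-down sketch of the approach looks like this (the folder, filter, and retention values are placeholders; the full script adds more of the control options mentioned above):

# Illustrative sketch only - folder, filter, and retention period are placeholders.
param(
    [string]$TargetFolder = "D:\Logs",
    [string]$Filter = "*.log",
    [int]$DaysToKeep = 30,
    [switch]$WhatIf
)

$cutoff = (Get-Date).AddDays(-$DaysToKeep)

# Find files older than the cutoff and remove them; -WhatIf gives a dry run first.
Get-ChildItem -Path $TargetFolder -Filter $Filter -Recurse |
    Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff } |
    Remove-Item -WhatIf:$WhatIf

Running it with -WhatIf first shows exactly what would be deleted, which is the kind of control I'm talking about.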

NOTE: If you run into problems getting the script to run on your machine, there are a few troubleshooting tips in my original article.

Tuesday, February 26, 2013

Partial page load issue with ASP.NET MVC and head.js

I've been working with a small team over the past year or so on a large-scale web application built in ASP.NET MVC 3. We're using some of the best new client-side technologies with it as well - jQuery, Bootstrap and head.js to name a few. Overall, the project has gone very smoothly, with only a few minor hiccups or delays.

Well, except for one lingering issue.

We began experiencing intermittent partial page loads almost as soon as the first draft of the UI was released to our testing servers. The following behaviors were exhibited:

  • This issue could happen on any page in the application when it loaded.
  • 65-70% of the time a page would load correctly.
  • Pressing F5 (refresh) would reload the page and always fixed the issue.
  • Generally, either part of the main menu bar wouldn't load, or our datagrid wouldn't load. Both controls could have a problem on a given page if you reloaded it multiple times.

Perplexing and frustrating to say the least. First, we thought it was a problem with the tiny VMs we had in our testing environment. Then we guessed it might be a packet delivery issue with the VPN tunnel between the testing network and the office network. Then we supposed maybe our self-hosted CDN wasn't set up correctly, and switched to using Amazon S3 (which we were planning on doing anyway).

Each of these theories (and others) was tested and debunked. No love. There's no way it could be in our code, right??

What ended up working for us was moving our "core" libraries outside of head.js and using their "execute in-order" method for the rest of our libraries. Our _Layout.cshtml page looks like this:

<!doctype html>
<html lang="en">
<head>

(... stylesheets and other head content ...)

<script type="text/javascript" src="http://cdn.company.com/JS/head.min.js"></script>
<script type="text/javascript" src="http://cdn.company.com/JS/jquery-1.7.1.min.js"></script>
<script type="text/javascript" src="http://cdn.company.com/JS/jquery-ui-1.8.18.min.js"></script>
<script type="text/javascript" src="http://cdn.company.com/JS/modernizr-2.5.3.min.js"></script>
<script type="text/javascript" src="http://cdn.company.com/JS/bootstrap-2.1.0.min.js"></script>
<script type="text/javascript" src="http://cdn.company.com/JS/flexigrid.pack.js"></script>

<script type="text/javascript">
    head.js(
        "http://cdn.company.com/JS/jquery.validate.min.js",
        "http://cdn.company.com/JS/another.script.js",
        (... other javascript files ...)
    );
</script>

(... other code / markup ...)

</html>

We implemented this change about a week ago, and have yet to experience the issue once since then. A couple of additional notes / comments to share about the change:

  • This seems to work because generally all of the other javascript files you'll want to load probably depend on one or more of these "core" files to work properly. Letting head.js handle this becomes even more delicate when you consider the fact that ASP.NET is trying to load multiple Partial Views per page, many of which contain controls that need the "core" in place to be built properly. Any interruption of loading a "core" file before a control needs it may break part of the page.
  • While the head.js usage documentation lists what we have here as a correct way of using the library, this is not how their demo is built ('View Source' to check it out).
  • The most similar issue we could find was here on the head.js GitHub project site. This is where we got the idea to try an alternate implementation.
  • Since the purpose of head.js is to improve page load times, you may be wondering if this change hurt our delivery speed. Unfortunately, because pages weren't reliably loading for a number of months, it was hard to measure where we were at before the change, so I'm unable to determine a speed difference.

I decided to document this because we weren't able to find any instances online of someone else having the problem; hoping this post will save a few other teams some time and headaches.

Monday, February 4, 2013

How Would Don Draper Hack?

A few weeks ago, a link was posted on Signal vs. Noise to the following article - "10 Tips on Writing from David Ogilvy". As expected, the "original Mad Man" had some really good tips on writing. What I wasn't expecting was how applicable his advice was to writing good code.

I propose my interpretation of what the memo might have looked like if he were running a software company today - my substitutions and additions appear inline, right after the original wording they modify:

People who think well, write well. Woolly minded people write woolly memos functions, woolly letters classes and woolly speeches applications. Good writing is not a natural gift. You have to learn to write well. Here are 10 hints:

  1. Read the Roman-Raphaelson book on writing Code Complete. Read it three times.
  2. Write code the way you talk. Naturally.
  3. Use short words variables, short sentences functions and short paragraphs classes.
  4. Never use jargon words like "reconceptualize", "demassification", "attitudinally", "judgmentally" in output to clients. They are hallmarks of a pretentious ass.
  5. Never write more than two pages on any subject class.
  6. Check your quotations references and open source credits.
  7. Never deploy a change send a letter or a memo on the day you write it. Read it aloud the next morning — and then edit it.
  8. If it is something important, get a colleague to improve it.
  9. Before you send your letter or your memo check in your code, make sure it is crystal clear what you want the recipient next developer to do understand.
  10. If you want ACTION, don’t write standards or documentation. Go and tell the guy person what you want.

Pretty interesting how close it is to the original, but maybe not all that surprising. After all, writing code is the act of explaining to another human, in the simplest way possible, what you are expecting a computer to do. It is the story of your application (for the human), as well as the instructions (for the computer).

Maybe the next time you sit down in front of your favorite editor with a caffeinated drink, you should start by asking yourself: HWDDH (How Would Don Draper Hack)?

Tuesday, January 15, 2013

Personal Retrospective - 2012

A few weeks ago, I was fortunate to receive a nice "Happy Holidays!" email from someone who I consider to be a career mentor/role-model. In it, he also included a "personal retrospective" - a small list of things he was really proud of accomplishing professionally over the last year or so, as well as a couple of things he was hoping to do or improve on during the coming year. He was also kind enough to ask me to send him a list of my goals and accomplishments for the year.

I was really struck by this. First, by delight that he was interested in what I had been up to. Then dismay, shortly thereafter, when I realized that I had no such list to share with him.

But thinking through it more, I decided that as an Agile practitioner, it's probably something I should be doing. After all, we have retrospectives on every Sprint; why not on a "personal Sprint"? And the start of the new year is opportune timing to reflect on what's transpired over the past year, and to consider all possibilities the coming year brings.

It took me a few days to put it together, but eventually I had my list. Turned out it was a year of "firsts" for me:

That's a pretty decent list if I do say so myself. Not nearly as good as his was, but not too shabby. So what about opportunities? Well, I decided that, in retrospect (see my pun here), I really liked spending time on all of those projects. What changes would help me do more things like those? I decided on the following:

  • I spent about 22 hours a week in meetings last year on average. That's way too high (although I do run 4 teams right now); I want to try to get a day back per week this year.
  • I want to spend more time building things. Kind of goes hand in hand with the first point. The open source project helps, but there are so many cool things happening in software right now - mobile development, cloud development, big data! Spending all that time in meetings makes me feel like a spectator instead of a contributor. I want to be able to roll up my sleeves more this year and head into one or more of these emerging frontiers.

So that's my 2012, and goals for 2013. I found this process to be really edifying. I know a lot of people do things like this already, but if you've never thought much about it in years past, consider this my open letter to you. Take some time, make your list, and get to work on those opportunities.

Best wishes on accomplishing them and many other things in 2013!