Fully Distributed Teams: are they viable?

It has become increasingly common for technology companies to run as Fully Distributed teams. That is, teams that collaborate primarily over the web rather than relying on informal, face-to-face communication as their main means of working together.

This has only become viable recently due to a mix of factors, including:

  • the rise of “cloud” collaboration services (aka “web 2.0” software) as exemplified by Google Apps, Dropbox, and SalesForce
  • the wide availability of high-speed broadband in homes that rivals office Internet connections (e.g. home cable and fiber)
  • real-time text, audio and video communication platforms such as IRC, Google Talk, and Skype

Thanks to these factors, we can now run Fully Distributed teams without a loss in general productivity for many (though not all) roles.

In my mind, there are three models in the wild today for scaling the number of employees in a growing company. These are:

Continue reading Fully Distributed Teams: are they viable?

Speed and lightness

Last week, I decided to give myself the present of a Plextor M3 512GB SSD, which was available at nearly a 25% discount on NewEgg for a limited time.

The price per gigabyte for SSDs has finally fallen to nearly $1/GB, and the rewrite-cycle problems that used to afflict these drives are becoming a non-issue with the Linux kernel’s TRIM implementation and the updated firmware on these drives.

So, I took the plunge. My main development workstation was a Thinkpad T400, maxed out to 8GB of RAM, and with dual 500GB platter drives (via Thinkpad’s excellent Ultrabay extension). I had been running Ubuntu 10.10 for a long time. I timed the SSD purchase with the release of Ubuntu 12.04 LTS — 10.10 no longer being supported, I figured I’d do a clean install on the new SSD and clean up my development workstation for the first time in a couple of years.

A couple of things occurred to me in this process. First of all, since 2009, I have moved more and more of my data into cloud services. I have moved the lion’s share of my “business and personal documents”, including photos, into Dropbox with my 50GB account. And I have moved my truly old files and digital keepsakes into NAS drives that I host in my little server room at home.

Continue reading Speed and lightness

On multi-form data

I read an excellent debrief on a startup’s experience with MongoDB, called “A Year with MongoDB”.

It was excellent due to its level of detail. Some of its points are important — particularly global write lock and uncompressed field names, both issues that needlessly afflict large MongoDB clusters and will likely be fixed eventually.

However, it’s also pretty clear from this post that they were not using MongoDB in the best way. For example, as part of their criticism of “safe off by default”, they write:

We lost a sizable amount of data at Kiip for some time before realizing what was happening and using safe saves where they made sense (user accounts, billing, etc.).

You shouldn’t be storing user accounts and billing information in MongoDB. Perhaps MongoDB’s marketing made you believe you should store everything in MongoDB, but you should know better.

In addition to that data being highly relational, it also requires the transactional semantics present in mature relational databases. When I read “user accounts, billing” here, I cringed.
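For readers who haven’t hit this in practice, the “safe off by default” complaint is about the driver not waiting for the server to acknowledge a write. Below is a minimal sketch, assuming a recent version of the pymongo driver (the collection names are hypothetical), of opting into acknowledged writes for the data you truly care about while leaving fire-and-forget behavior for low-value records:

```python
# Minimal sketch (assuming pymongo 3+): acknowledged vs. fire-and-forget writes.
from pymongo import MongoClient
from pymongo.write_concern import WriteConcern

client = MongoClient("mongodb://localhost:27017")
db = client["app"]

# "Safe" writes: the driver waits for the server to acknowledge (and journal)
# the write. Hypothetical collection; use this for anything you cannot lose.
profiles = db.get_collection(
    "user_profiles", write_concern=WriteConcern(w=1, j=True)
)
profiles.insert_one({"user_id": 42, "display_name": "Ada"})

# Unacknowledged writes: fast, but data loss can go unnoticed. Acceptable only
# for low-value, high-volume records such as analytics events.
events = db.get_collection("events", write_concern=WriteConcern(w=0))
events.insert_one({"event": "page_view", "path": "/pricing"})
```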

Continue reading On multi-form data

Computer Science and “soft” skills

My friend Jennifer Anyaegbunam (@JenniferAdaeze) has published a new piece on HuffingtonPost about the role of humanities in medical education.

Matriculating into medical school, we were proud of our humanities roots and felt it made us uniquely poised to become great clinicians. Yet, we have often found that we have had to defend our educational choices to interviewers, advisors and even our peers — something science majors rarely, if ever, have to do. This is because the medical humanities is often regarded as a “second tier” or an extracurricular interest and not something that is fundamental to the practice of medicine.

She finds that the humanities are derided in a classroom setting, as well:

Courses on ethics and social science are few and far between. To make matters worse, students often do not take these exercises seriously, and these courses are often the ones with the poorest attendance, for example

Here, I’ll offer a parallel from a different field: computer science.

As a computer science major at NYU, I too encountered hostility and a dismissive attitude toward the humanities and other “softer” fields from my peers.

A traditional computer science curriculum consists of mathematics, algorithms, and theory. These are important areas of academic interest, and provide a good foundation for thinking about the deepest problems surrounding computation. But the vast majority of computer science majors don’t go on to research computation. They go on to practice it — by becoming software professionals (programmers), writing applications used by real people.

It turns out that to be a successful software professional, you need much more than a computer science background. Indeed, many of the world’s most successful programmers have no computer science background at all. My father was a software professional, but when he graduated from college, computer science did not even exist as a field of study!

You need software design skills, which are often not taught in traditional curricula except in a trivial way; they are considered “vocational”. You need communication, management, and product design skills; these are deemed too “soft” to be taken seriously.

The industry suffers from a widespread lack of these skills.

Continue reading Computer Science and “soft” skills

Getting real about design inspiration

So, here’s the deal. Some startup founders at Curebit.com decided to copy a design used by 37signals’ Highrise product for their own app. They did this in a less-than-gracious way, by simply copy/pasting the code and even leaving in some hard links to the original code. A post on VentureBeat tells the full story.

The founder of Curebit responded on HackerNews with this:

We had a different homepage, were a/b testing different pages, came across the 37signals post and were like ‘wow we should see how that converts!’ We are big fans of rails and what 37signals is doing and did not really think through the implications of what we were doing. We just kind of thought about it as a fun test to run.

Clearly it was stupid. It was not meant to offend anyone and we are adding credit where due.

As I pointed out to @dhh on Twitter, it’s unlikely this explanation is actually valid, given that their pricing page is also basically identical to Basecamp’s.

Clearly, @dhh isn’t amused by the founder “digging deep” for excuses. He wrote:

@allangrant THERE IS NO VALID WAY TO RIP-OFF PEOPLE’S DESIGNS AND HAVE IT BE OK. Not we’re small, not we’re a/b testing.

I think @dhh’s real frustration is that the founder isn’t admitting what is obvious to everyone else. He liked 37signals’ design. He thought it was good. And rather than get inspired by it and design something derived from the good concepts in the original, he and his team simply ripped off the original.

I think what this whole argument is missing is a little honesty. The truth is, no one on the web designs in a vacuum. We are all continually inspired by, and deriving from, each other. If we were to believe that every marketing page and product homepage were designed by an obsessed designer living in an ivory tower, we would be in a total fantasy land. That’s not the web. Even designers are borrowing from, and getting inspired by, each other. Hell, that’s half the point of a site like Dribbble or Forrst.

Continue reading Getting real about design inspiration

XDDs: stay healthily skeptical and don’t drink the kool-aid

On my LinkedIn profile, I list one of my skills as “thought-driven development”. This is a little tongue-in-cheek; software engineering over the last few years has developed a lot of “XDDs,” such as test-driven development, behavior-driven development, model-driven development, etc. etc.

“Thought-driven development” doesn’t actually exist, but by it, I simply mean: perhaps we should think about what we’re doing, rather than reaching for a nearby methodology du jour.

In my last job, a colleague of mine used to also joke about “design-driven design” — perhaps the ultimate play on the XDDs since it is also a strange loop.

All this is not to say the XDDs aren’t useful — they definitely are. A lot of them have spawned entire groups of cross-platform open source projects. I am all for anything that makes the adoption of XYZ best practice easier for my team. But these techniques often require some lateral thinking to get to any real benefit.

When evaluating technologies like this, you have to take each little community with a grain of salt. Almost every programming framework / methodology / etc. in existence purports to offer some order-of-magnitude increase in software reliability / developer productivity / whatever else. And almost all, if not all, fail to do so in practice.

Here is one anecdote to illustrate the point. From 2006-2008 at Morgan Stanley, the entire corporation was obsessed with Java’s Spring framework and its core “architectural pattern”, Inversion of Control. I can’t even begin to explain to you the number of man-hours that were wasted re-architecting existing, working software to meet this chimerical conception of component decoupling. I even contributed to this, urged by the zealots and their blind faithful. All of the reasons seemed great: decoupling code, using more interfaces, allowing for easier unit testing, being able to “rewire dependencies” and use fancy technologies like “aspect-oriented programming”.

Even Google got swept up in the madness and developed their own, competing framework called Guice. And in the end — after all that work — my diagnosis is that IoC is basically a non-starter, a complete waste of time.

It is a complicated framework that morphed into a programming methodology, developed exclusively to work around some annoying limitations of the Java language. Because it was applied without thinking, everyone’s Java code now has to suffer, and you can hardly pick up a Java web application today without being crushed by the weight of its IoC container’s XML configuration files. (Never mind that most other communities, such as Python’s and Ruby’s, have hardly a clue what IoC is all about — a good enough indication that it is a waste of time.)
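To make the contrast concrete, here is a rough, hypothetical Python sketch of what all that container machinery buys you. In a dynamic language, “dependency injection” mostly collapses into passing a collaborator to a constructor, which also delivers the promised “easier unit testing” for free:

```python
# Hypothetical sketch: "inversion of control" without a container or XML.
# The dependency is simply a constructor argument.

class SmtpMailer:
    def send(self, to, subject, body):
        print("sending %r to %s" % (subject, to))  # stand-in for real SMTP logic


class FakeMailer:
    """Test double: records messages instead of sending them."""
    def __init__(self):
        self.sent = []

    def send(self, to, subject, body):
        self.sent.append((to, subject, body))


class SignupService:
    def __init__(self, mailer):
        # The "injected" dependency; swap it freely in tests.
        self.mailer = mailer

    def register(self, email):
        self.mailer.send(email, "Welcome!", "Thanks for signing up.")


# Production wiring:
service = SignupService(SmtpMailer())

# Test wiring (no framework required):
test_service = SignupService(FakeMailer())
test_service.register("user@example.com")
assert test_service.mailer.sent[0][0] == "user@example.com"
```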

Every framework and approach should be judged on its true merits, that is, on the real cost/benefit of applying that particular technology. Will it save us time? Will it simplify — not complicate — our code? Will it make our code more flexible / adaptable? Will it let us serve our users and customers better?

I regularly go back to old classics like the Mythical Man-Month to remember that nothing we do in software is truly new. I highly recommend you read it, and also its most famous essay, “No Silver Bullet”.

tl;dr stay healthily skeptical, don’t drink the kool-aid.

8 years ago today, I wrote this in a bug report

Me: Thanks so much for your fix to my issue. My friend, who majors in business, once told me that I should no longer major in Computer Science because “programming is like banging your head against the wall repeatedly, but with less reward.” I find that to be a rather rash dramatization, but I know in dealing with bugs as subtle as these it may feel that way. I hope at least the end-result is rewarding for you.

Programmer: Are you a Computer Science major? If so, don’t let your friend discourage you. Just ask him about “head banging” when those business majors find that their product development and marketing efforts fail to work after spending millions of dollars.

Me: Yes, I’m a CS major. And you’re right — the reward is great in software, and the cost of building a useful product is relatively minimal. That is one of the reasons I chose this path. It’s why I love helping out honest, intelligent developers such as yourself in any way I can. I have found that hardworking CS majors are not only better programmers, but more often than not better thinkers and better managers — if you give them the chance. I am happy with my decision, and still have that naivete that perhaps I can change the industry a bit, shake things up, come up with an idea that changes everything, innovate in whatever way I can. Big aspirations; we’ll see what happens. For now, I’ll just keep respecting the good software I find in the world, such as yours.

Why NYC does, indeed, need HackNY

On Fred Wilson’s blog post about Raise Cache and HackNY, someone asked a very legitimate question:

Why are we raising money to benefit CS students from top programs around the country? Why are we raising money to help companies like Business Insider and bit.ly hire interns?

The event looks like fun but I’ve been trying to understand hackNY and don’t understand why it’s a charitable cause.

And, I wrote a response — providing some anecdotal information about how Parse.ly benefited from HackNY, and why it matters in a city flush with gold-plated Wall St. internships.

Let me give you a hint why HackNY is somewhat charitable: Wall Street firms pay between $15k and $30k per summer for technical interns in NYC. Most startups — especially early-stage ones — simply can’t compete with that.

You may not think startups like bit.ly and Business Insider need any help (and with their $13-15M in capital funding raised, maybe you are right), but in 2010, when Parse.ly sought a summer intern, we had no funding and a < $3K/mo. burn rate. You can read our testimonial about HackNY here: http://hackny.org/a/2011/01/st...

That said, it’d probably be good to have more truly early-stage (pre-funding) companies on the roster. One of the problems with this, though, is that a lot of pre-funding companies are in such a fragile state that the internship might be over a couple weeks into the summer. Also, pre-funding companies aren’t typically “buzzworthy”. The first two years of HackNY have been partially about creating some buzz about NYC tech among university students, something at which it has succeeded spectacularly.

Both of our HackNY interns (2010 + 2011) have commented that one of the interesting parts of the program is that it simply raises awareness of the different stages of companies. Since the HackNY interns all live together in university housing, they share stories -- and you’ll have folks working on >100-person teams at places like Gilt Group and folks working on <10-person teams at places like Parse.ly.

Also, the program’s lecture series does a good job of encouraging students to consider entrepreneurship or startup work as a post-graduation option. This is a countervailing force to the professional HR/recruiting teams employed by Wall Street and other Fortune 500s to market their positions to top students.

from Raise Cache at AVC.com.

The C++ trap

I came across this wonderful piece of historical retelling by David Beazley, one of my favorite Pythonistas and the author of Python Essential Reference. Here is a man who conquered C++ in just about every way, but ultimately found himself trapped in its byzantine complexity, only to escape by way of Python.

Swig grew a fully compatible C++ preprocessor that fully supported macros. A complete C++ type system was implemented including support for namespaces, templates, and even such things as template partial specialization. Swig evolved into a multi-pass compiler that was doing all sorts of global analysis of the interface. Just to give you an idea, Swig would do things such as automatically detect/wrap C++ smart pointers. It could wrap overloaded C++ methods/functions. Also, if you had a C++ class with virtual methods, it would only make one Python wrapper function and then reuse it across all wrapped subclasses.

Under the covers of all of this, the implementation basically evolved into a sophisticated macro preprocessor coupled with a pattern matching engine built on top of the C++ type system […] This whole pattern matching approach had a huge power if you knew what you were doing […]

In hindsight however, I think the complexity of Swig has exceeded anyone’s ability to fully understand it (including my own). For example, to even make sense of what’s happening, you have to have a pretty solid grasp of the C/C++ type system (easier said than done). Couple that with all sorts of crazy pattern matching, low-level code fragments, and a ton of macro definitions, your head will literally explode if you try to figure out what’s happening. So far as I know, recent versions of Swig have even combined all of this type-pattern matching with regular expressions. I can’t even fathom it.

Sadly, my involvement with Swig was an unfortunate casualty of my academic career biting the dust. By 2005, I was so burned out of working on it and so sick of what I was doing, I quite literally put all of my computer stuff aside to go play in a band for a few years. After a few years, I came back to programming (obviously), but not to keep working on the same stuff. In particular, I will die quite happy if I never have to look at another line of C++ code ever again. No, I would much rather fling my toddlers, ride my bike, play piano, or do just about anything than ever do that again.

From this Swig mailing list entry.

import this: learning the Zen of Python with code and slides

It’s hard to find me gushing more unapologetically than when I talk about the virtues of my favorite programming language, Python.

Indeed, my life for the last 3 years has been dominated by the language. In many ways, I pursued a startup, and endured the associated financial hardship, partially because I had become frustrated with using Java in my full-time work and wanted to convert hobby projects I was building outside of work hours into full-fledged projects.
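(If you have never run it, the post’s title refers to a real Easter egg: typing “import this” into any Python interpreter prints Tim Peters’ nineteen aphorisms, the “Zen of Python”.)

```python
# Running this line in a Python interpreter prints the "Zen of Python":
import this
# The Zen of Python, by Tim Peters
#
# Beautiful is better than ugly.
# Explicit is better than implicit.
# Simple is better than complex.
# ... (19 aphorisms in all)
```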

Continue reading import this: learning the Zen of Python with code and slides