Mark Shuttleworth has written a nice little blog post about the tools we learn through life and how we discard old tools and learn new ones.
I find this to be very true in my own life.
When I was in high school, I prided myself (from the point of view of “tools”) on knowing graphic design (Photoshop/Illustrator), web development, and print/page layout. Handy tools to know for (1) making money and (2) working on a high school newspaper. The only real programming languages I knew back then were ActionScript (for Flash), JavaScript, and (eegads) Perl.

Then I got to college and armed myself with algorithms, data structures, and systems, and started picking up Java and C on my own. Now I consider myself well-versed in those, and this past summer I learned Python and used it on a lot of different projects. Then this semester I got interested in C++ and used that a lot.

Nowadays, when I look at problems, I look at them in terms of my tools. Text parsing problem? Wow, Python’s re (regular expressions) module could handle that pretty easily. Big engineering project? Wow, using templates and OO features in C++ may lead to a nice design. Database-driven web application? Well, Java/JSP may fit nicely. (I know, I know, what am I doing not knowing Ruby on Rails!)
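To give you an idea of the kind of quick text-parsing job I mean, here’s a little sketch with re (the log line and field names are just made up for illustration):

```python
import re

# Hypothetical example: pull a few fields out of a web-server-style log line.
log_line = '127.0.0.1 - - [10/Dec/2006:13:55:36] "GET /index.html HTTP/1.1" 200'

# Named groups make it easy to grab the date, HTTP method, and request path.
pattern = re.compile(r'\[(?P<date>[^\]]+)\] "(?P<method>\w+) (?P<path>\S+)')

match = pattern.search(log_line)
if match:
    print(match.group('date'), match.group('method'), match.group('path'))
```

A few lines like that and the problem is basically done, which is exactly why the tool comes to mind first.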
I think Mark’s onto something: changing toolsets often is definitely useful. Even though I couldn’t write full programs for you in Perl nowadays, what I do know about it (its capabilities and limitations) is enough to recognize when it might be the best choice for the job.
As for academic tools: very true. A lot of the techniques I learned in, e.g., Discrete Math and Linear Algebra went in one ear and out the other. Alas, I think the main point is to learn them once and then be able to Wikipedia them later, when needed 😉
That said, the stuff I learned in my algorithms, data structures, and operating systems courses has stayed with me. I think some of that stuff is just essential.