Open Source Visionaries
Creating, extending, and maintaining modern computational tools is really hard. It used to be that a motivated person could envision a solution, knock out some code, and have a useful prototype running in a short time (days or weeks). The resulting code was simple (and short), the build environment was typically a brief makefile (if that), cross-platform issues were ignored, and the narrowly focused solution didn't require much of an architecture. While this approach is still in limited use today, serious users and developers of computational software are confronting daunting new challenges: algorithmic complexity, cross-platform development, extensive integration requirements, new computing models (distributed, multi-core, GPU) and computing environments (client/server, cloud, web-based, mobile), and a plethora of programming languages. What used to be a quick path to something useful has become a focused effort of months or years.
As a result, I think we've reached the point where open source approaches are the only viable way to build and sustain large-scale computational tools. Many others have made similar claims, arguing along the lines of intellectual freedom and engineering process. For example, it has become increasingly evident that the lofty philosophical roots of open source freedom are absolutely essential to the practice of Open Science (and the innovation that results from it). And if you are a pragmatist, there are clear engineering benefits, including scalable software development (The Cathedral and the Bazaar), agile software processes, and community maintenance. However, there is another critical argument to make in favor of open source: its power to create, implement, and sustain a long-term Collaborative Vision. This argument, I believe, is underappreciated yet essential to the future of scientific computing software...