When I was still too young to understand fully what developing software for a living would mean, I was anxiously engaged in the cause of furthering my knowledge in the field of Computer Science. I was first introduced to the concrete subject of Computer Science in seventh grade, and I remember it well.
At the ripe young age of nine (granted, many are learning younger these days), I stumbled across a book titled "GW-BASIC Reference Manual," which my parents had received from my grandparents along with their old computer: a Packard Bell microcomputer with an Intel 80286 processor, 64 kB of RAM (static type; at some point I received a 1 MB expansion card, but the card was too large for my computer's case, so I never installed it), a 33.3 MB IDE hard drive, and two floppy drives - one 5.25" single-sided floppy and one 3.5" single-sided floppy. Those were the days. I don't fully know what sparked my interest in the topic of programming, but this book definitely helped me move in that direction.
Even in those days, my computer was considered slow; Moore's Law was rampant in the marketplace. But I was more than content - I had reasonable power at my fingertips with my clicky keyboard and no mouse to speak of (though I was running MS-DOS 6.22 anyway), and I was armed with tools like DEBUG.EXE, GW-BASIC, QBASIC, and EDIT. Later I worked with Microsoft's C Compiler.
I had several goals in learning how to program:
- Make the computer do what I wanted it to do, in the way I wanted it to do it.
- Make video games.
- Make animations (those were pretty much programmed in those days).
- Learn about software and file types (like images).
- Program the Internet.
- And many more...
In high school I had the opportunity to solve a real problem - not that it was a big deal to me, but to my teacher it was. My teacher, running DOS on her classroom computer, had somehow accidentally deleted her AUTOEXEC.BAT file. Two of us in the class were pretty adept at computing and solving problems, and we were both in the same Computer Science class, so thinking maybe I could prove something, I let my classmate try to fix the problem first. He spent ten minutes or so and couldn't figure out how to solve it - but then, he was used to MS Windows. Then I was up, and since I was used to DOS, I felt right at home. First I located the program that she needed to run - that was the real problem: the program she usually ran wouldn't start because its executable was no longer on her PATH. After locating it, I tried the EDIT command - nothing. Then I tried HELP - nothing. So, thinking like a problem solver, I tried GWBASIC, and voilà! I was able to write a program that rewrote her AUTOEXEC.BAT file in the correct location. We executed the AUTOEXEC.BAT file, and the program my teacher ran was again runnable by her.
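The program itself would have been only a few lines. Here is a minimal sketch of what that GW-BASIC fix might have looked like - the specific PATH entries and commands are hypothetical, since I no longer remember her actual configuration:

```basic
10 REM Rewrite the deleted AUTOEXEC.BAT from within GW-BASIC
20 REM (the entries below are illustrative, not her real setup)
30 OPEN "C:\AUTOEXEC.BAT" FOR OUTPUT AS #1
40 PRINT #1, "@ECHO OFF"
50 PRINT #1, "PROMPT $P$G"
60 PRINT #1, "PATH C:\DOS;C:\APPS"
70 CLOSE #1
80 SYSTEM
```

OPEN ... FOR OUTPUT creates (or truncates) the file, each PRINT #1 emits one line of the batch file, and SYSTEM drops back to DOS - at which point running AUTOEXEC.BAT restores the PATH.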
I want to place emphasis on the fact that software is a tool for solving problems. It enables us to solve problems much more quickly and effectively than doing the same problem solving manually. Software controls automation that enables us to repeat tasks perfectly. Software does things that we don't want to do, or that require a skill not all people have. Software helps us fight wars better and keep peace better. And software entertains us. With all of the problems in the world that need solving, what would we do without software?
Software Development Goal Shift
Having worked in the information technology industry as a software tester, software developer, software architect, and software engineer, I have noticed in recent years that the goals of software development have shifted greatly, especially in commercial software. While I believe there are some good reasons for this shift, I don't think that it is good for IT or for the software customer.
This shift has generally been from solving problems to solving requirements. "How is that different?" you may ask. Well, I don't think that it is much different, except that requirements don't always solve the problem at hand. Let me share an example from my past.
Recently I worked for a company that has an international business. Their online presence internationally is not as well developed as their domestic presence. The international web sites provide less functionality: no online ordering, a poor representation of the online catalog (some products are restricted internationally), and an overall bad experience for anybody using them. To solve this problem, we set a goal to create a single web site that provided all of the domestic functionality for international users. The problem space was clear in my mind.
We were a LAMP shop (with the addition of Oracle ERP), with at least 30 web sites running on the LAMP stack. The requirements for the new system instructed us to implement it using Microsoft SharePoint. Internally and externally I battled with this requirement; it did not fit the problem space very well. My leaders argued otherwise.
They suggested that "SharePoint provides multi-lingual support out of the box." I argued that multi-lingual support was already to be had internally - we just needed translation files, the filters were easily developed, and it was the web site design that needed to change in order to support this feature.
They suggested that "SharePoint can be 'programmed' by business people more easily," for example to create landing pages and marketing pages. I argued that business people shouldn't be programming: they don't understand web standards, or the need for standards in web marketing, and they'd probably just use Flash movies, which aren't fully supported by all systems anymore (like iOS-based devices).
Generally speaking, these requirements seem to have come about because my supervisor liked SharePoint and perceived it to be a silver bullet of sorts, somehow especially suited to solving the problem of an international client-facing web site. I disagreed for several reasons, one being the cost of retooling and training. I have left since the decision was finalised, and many others left as well, and the result has been that they have hired multiple consultants and paid for a lot of training in order to implement the system. Not that such a scenario is uncommon in the software industry, but retooling a system because one solution seems shinier than another is not good for business - it's at least expensive.
So I wonder: why has this been the case? Is the business too involved in software development? Are software developers not ballsy enough to let their voices be heard? Why, after all of these years of software development, is there still a search for "silver bullet" software solutions?
Recently I learned something interesting about the software industry in the United States that I did not know. In the West, in areas like Silicon Valley and the Silicon Slopes, we are creative minds. We create software and build software solutions, often from scratch. We build commercial and open source software - a lot of it, enough to sustain a lot of conferences around these ideas. In the East, where I have never lived, they apparently do not build software as much; rather, they integrate existing software. And as many of us in the West know, integrations are not automatic or easy - yet this idea of integration over innovation seems to be creeping into the West more and more.
I have only been out of college a few years (about four, I think), but I assume that teachers of Computer Science courses are still teaching problem solving. Maybe it depends on the school. Problem solving isn't as much of a topic if you are writing video games, like they teach you to do at new-age CS universities and colleges like Neumont. In fact, I find that theoretical Computer Science isn't nearly as common as it used to be. I used to hate theoretical Computer Science in high school - writing instead of programming was annoying at best - but I wrote a great number of programs without a computer first, very successfully, and solved a lot of problems before sitting down at the keyboard. Now that I'm all "grown up," I'm realising that the theoretical foundations I was taught in high school, and somewhat (though not as much) in college, have given me more foundation to build on for my career.
So that's it! How do we keep our programmers from becoming click-and-drag aliens like the customers? For a select few I think it is still a priority to understand and study the foundations of Computer Science, but many a business person doesn't understand software or software development well enough to save his company. What can be done to solve this problem?