In our previous discussion, we looked at what the target development environments were like 10 years ago. Desktops were generally programmed using C# on Windows PCs, while the Macintosh used Objective C. Of course, in such a large population there is a wide variety of languages in use, but for the most part that’s how apps were written.
On the web, it was rather a mish-mash of different technologies. It should also be noted that three operating systems were commonly used: Windows on the PC (and some servers), OS X on the Mac (a Mach kernel with Unix underneath the GUI), and Windows or Unix/Linux in the network server space.
Many people treat operating systems and computer languages as religions. Here’s an interesting question to ask: “What do you think programming will be like in 50 years?” Where will the current programming paradigms fit in, if at all? We’ve all seen the science fiction movies with life-like robots and pervasive virtual/augmented reality. People today are just beginning to understand machine/deep learning (even though it has been around for 50 years). What will programming even mean?
If your answer is that you’ll be mostly working from the command line on your Terminal, read no further.
If you are still reading, think about this: “How will computer programming become simpler, and less error prone?”
Let me be clear: if you had asked me 35 years ago what programming would be like today and shown me the current state of affairs, I would be a broken old man now. Oh wait …
It certainly feels like we’ve gone in a circle. Ok, this is about programming, so we’ve been going in a loop. Remember that Unix was first released in 1971, and the C programming language came out in 1972. Unix was rewritten from assembler to C, with portability being one of the benefits. Also remember that Unix was a research-oriented operating system, built to help better understand operating systems. A “Unix Philosophy” developed over time.
Time passes, a GUI is added (though shunned by many, especially early on), Linux comes along. On the desktop, Unix struggles against Windows and Mac, but is reinvigorated in the late 90s by the movie industry and the Internet Web server market.
The movie industry uses Unix in two ways. First, the company SGI creates OpenGL, pioneers its use for 3D graphics, and sells a version of Unix on their boxen. Second, those boxen are networked together to render graphics scenes such as the dinosaurs in the movie ‘Jurassic Park’.
At the same time, Sun Microsystems, which was selling BSD-based Unix boxen, was growing by selling into what would later become known as the 2000 Internet bubble. The Internet had become the new wild west, with an insatiable thirst for networked machines to run web services. Sun Microsystems also invented the programming language Java, which was initially promoted as a platform for client-side applets running inside web browsers.
The tools built to serve web pages in those early days were not very sophisticated: a collection of scripts or simple script-building tools. Pages being served were mostly text, as most users were on what was called a “dial-up” connection. A box called a modem was plugged into a telephone line, and the modem was in turn plugged into the desktop computer. The data rate was usually around 56 kbps, or 0.056 Megabits per second. Today a slowish cable modem provides 30 Megabits per second, more than 500 times faster.
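The gap between those two link speeds can be made concrete with a little arithmetic. A minimal sketch (the 5 MB payload is an illustrative figure of my choosing, not from the text):

```python
def transfer_seconds(megabytes, megabits_per_second):
    """Time to move a payload over a link, ignoring protocol overhead."""
    megabits = megabytes * 8  # 1 byte = 8 bits
    return megabits / megabits_per_second

DIALUP_MBPS = 0.056  # 56 kbps dial-up modem
CABLE_MBPS = 30.0    # a "slowish" cable modem today

# A 5 MB file: roughly 12 minutes on dial-up, about 1.3 seconds on cable.
print(f"dial-up: {transfer_seconds(5, DIALUP_MBPS):.0f} s")
print(f"cable:   {transfer_seconds(5, CABLE_MBPS):.1f} s")
```

That back-of-the-envelope ratio, about 535 to 1, is why text was all that early pages could realistically carry.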
So after 20 years of progress on the desktop, the PC became a text-serving time-sharing terminal all over again. It took almost another 10 years before a web browser could reliably deliver anything like a desktop experience on the given infrastructure. It really wasn’t until YouTube in 2005 that much video was being streamed, mostly because of the bandwidth requirements. With the advent of YouTube, the die had been cast, and infrastructure providers realized the dire need for more bandwidth and faster speeds.
Similarly, the wireless providers weren’t quite ready for the advent of the iPhone in 2007. AT&T was first up, and watched the iPhone bring its network to its knees. The iPhone brought with it a whole new paradigm of computing, morphing the desktop into an even more “personal computing experience” and a different idea of what network connectivity means. For the purposes of our discussion, note that the iPhone is programmed using Objective C. Competitors such as Samsung use the Java programming language for their Android phones.
We’ve introduced most of the characters now. The programming languages:
- C
- C#
- Objective C
- Java
form the next part of the discussion.