
What is Programming Again? – Thoughts on Programming


Exactly what is programming again? We’ve been discussing some programming languages, and some assumptions about what programming should be like today. Back in 1973, Bret Victor gave a seminal talk on “The Future of Programming”. Well, actually he gave the talk in 2013 at a DBX Conference, as if he were giving it in 1973. Confused? It’s a great talk, about 30 minutes long. Looky here:

The overhead projector was much more fun than a PowerPoint deck.

Discussion

Usually there isn’t homework in these articles, but it’s useful to look at different perspectives on any given subject so that you can better understand how bias and decision-making follow from a given set of assumptions.

If you’re of a certain age and background, you have been exposed to the ideas in the above talk. You have probably implemented your own compilers and interpreters, built your own domain-specific languages (DSLs), and done other computer science-y things.

If you are a bit younger, you should have been given this information as background as part of your computer science education. However, Bret makes an absolutely brilliant and fundamental point: a couple of generations now view programming as dogma. People know how to program; here’s how you do it and here’s the language you use. The program architecture stacks are a given. The OS, the graphics library, the desktop/main screen, the APIs, and so on.

Run time environments

Parts of this are true. If you are programming Windows boxen, there is a very well-defined set of tools it is in your best interest to use if you want your program to work and to keep your job. Same with Macintosh and iOS. Android, check. If you are programming web boxen in the cloud, you get a much broader selection of tools. You might use one or two programming languages on the server, and then deliver some type of magical JavaScript/CSS mish-mash to the browser.

Lock-In

It is in the interest of each of the computing ecosystems to have everything work together. So each platform has the same calling conventions, and everyone worries about things like the size of any particular number being sent to a particular API, in case the API doesn’t understand such things. In other words, most of the arguments about programming are literally about bookkeeping. The size of this particular number, where this particular memory gets allocated, where and how things should be stored. People even come up with commandments such as DRY (Don’t Repeat Yourself!) because repetition makes the bookkeeping harder. Up until fairly recently, things have worked the way they do today: with this one model, from this one perspective.
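To make the bookkeeping concrete, here is a minimal sketch in Python. The little protocol and the function names are invented for illustration; the point is that both sides of a call must agree on the exact size and byte order of one number, or nothing works:

    import struct

    def encode_request(user_id: int) -> bytes:
        # "<I" means little-endian, unsigned 32-bit: the size and layout
        # of this one number is a contract between caller and callee.
        if not 0 <= user_id <= 0xFFFFFFFF:
            raise ValueError("user_id must fit in an unsigned 32-bit integer")
        return struct.pack("<I", user_id)

    def decode_request(payload: bytes) -> int:
        # The receiving side must make exactly the same assumption.
        (user_id,) = struct.unpack("<I", payload)
        return user_id

    assert decode_request(encode_request(42)) == 42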

All this works. Until it doesn’t.

Let’s look through a different lens, one that comes from biological systems. People disagree as to whether humans are more sophisticated computational devices than computers, but one thing is clear: humans have a different way of calculating and learning than current computers do.

Humans also don’t seem to obey the ‘commandments’ of computing. For example, each cell in a human (except for erythrocytes, you’ll point out) contains DNA composed of about 3 billion base pairs. DNA is a map of just about everything physical in a human. And it’s replicated tens of trillions of times, give or take, in a 150 lb person. At the hardware level, biological systems repeat themselves. People have some protocols built in, but are flexible in negotiating different types of communication.
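A rough back-of-envelope calculation shows the scale of that repetition. The ~37 trillion cell count and the 2-bits-per-base encoding are outside assumptions for illustration, not figures from this article:

    # Back-of-envelope only: cell count and 2-bit encoding are assumptions.
    base_pairs_per_cell = 3e9   # roughly 3 billion base pairs
    bits_per_base = 2           # A, C, G, or T fits in 2 bits
    cells = 37e12               # a common estimate of cells in a human body

    bytes_per_cell = base_pairs_per_cell * bits_per_base / 8
    total_bytes = bytes_per_cell * cells

    print(f"~{bytes_per_cell / 1e6:.0f} MB per cell")             # ~750 MB
    print(f"~{total_bytes / 1e21:.0f} ZB across the whole body")  # ~28 zettabytes

So much for DRY.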

For example, two people who speak different languages can still communicate, though not as efficiently as if they were both fluent in the same language. The important point is that, given some time, the two people will learn to communicate efficiently. Computers find that difficult.

Leave this Place

Surprisingly, if you give a human a set of protocols for a complex task, then unlike a computer, they can rarely execute the protocols correctly on the first try. In fact, it may take several thousand hours for them to become extremely proficient at a task such as playing a musical instrument or a sport. But one of the main attributes humans have is the ability to learn new things and to improve on what they were originally taught.

If you trained an artist like a computer, you might first give them a coloring book and one very specific color, and have them paint between the lines until they are proficient at painting. Then you would teach them to draw a straight line by giving them the mathematical equation for a line (don’t forget negative slopes!), then a curve, and so on. Share this idea with an actual artist and watch their reaction. You are sure to be amused.

Throughout history people have trained to become artists, and each generation tends to bring with it different ideas about, and an evolution of, what it actually means to be an artist. Pointedly, we don’t do that with computers in the traditional sense. The question is, “Can we?”

To the Future!

Think about what was discussed in the video, and think about what’s going on now. A whole lot of people are centered squarely in, and work only in, the dogma world. People are only now getting around to thinking along the lines of the ideas in the video.

A biology-inspired metaphor, called neural networks, is similar to the ideas discussed in the video. The first randomly wired neural network machine (SNARC) was built by Marvin Minsky in … 1951! The idea itself isn’t new, but the discovery of backpropagation by Werbos in 1974 and the application of GPUs to the problem by Geoff Hinton’s team in 2012 now make teaching machines how to “think” possible.
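If you have only ever seen backpropagation inside a framework, here is a minimal sketch in plain NumPy: a tiny two-layer network learning XOR. The layer sizes, learning rate, and iteration count are arbitrary illustrative choices:

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

    W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))  # input -> hidden
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))  # hidden -> output

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for step in range(10_000):
        # Forward pass: the network's current guess.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # Backward pass: push the error back through each layer with the
        # chain rule -- that is all backpropagation is.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        W2 -= 0.5 * h.T @ d_out
        b2 -= 0.5 * d_out.sum(axis=0, keepdims=True)
        W1 -= 0.5 * X.T @ d_h
        b1 -= 0.5 * d_h.sum(axis=0, keepdims=True)

    print(np.round(out, 2))  # should end up near [[0], [1], [1], [0]]

Nobody told the network what XOR is; it was shown examples and a goal, and it adjusted itself.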

You can see what a great conflict is about to happen! On one hand you have the standard guard, who think of the computer as a conscript that needs to be told exactly when and what to do, no deviation. This number is an unsigned 32-bit integer, soldier! On the other hand, you have a group of people who believe you just show the computer a dataset and a goal, adjust some dials, sliders, and buttons, and let the computer figure out what it needs to know. Drive a car? Sure, give it enough sensor footage and it will figure it out. Play the game of Go? Show it all the games of Go ever played, and AlphaGo can beat almost any human.

Of course you will ask, “How do you know that the computer learned everything it needed to know, for example, to drive a car safely?” The standard guard will go on and on about the edge cases and the rigorous unit tests they lovingly hand-coded for each and every situation. The machine learning folks will have a sheepish smile and say, “It works, doesn’t it? Oh, and it does it better than people do now.” This will play out great in the courtrooms.

Conclusion

In earlier articles we talked about system-level programming languages, which tend to be rather standard guard: low-level, do-exactly-what-I-say types of things. Those languages are certainly valuable, but they should be thought of as the building blocks for more intelligent and interesting systems.

For machines like the Jetson Dev Kit, there are software tools built in for leveraging machine learning. Train models on a large system, then deploy them on the Jetson. At the same time, it is important to be able to communicate reliably with peripherals and other devices without having to worry about memory-safety or storage minutiae. Use the proper tool for the proper job.
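As a sketch of that train-big, deploy-small workflow, assuming PyTorch is installed on both machines (the toy model and file name here are invented for illustration, not a prescribed setup):

    import torch
    import torch.nn as nn

    # An invented toy model -- a stand-in for whatever you actually train.
    model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

    # --- on the big training machine ---
    # ... training loop elided ...
    torch.save(model.state_dict(), "model_weights.pt")

    # --- later, on the Jetson ---
    model.load_state_dict(torch.load("model_weights.pt", map_location="cpu"))
    model.eval()
    with torch.no_grad():
        prediction = model(torch.randn(1, 8))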

The takeaway is that a lot of people treat computers and languages like religions. Believe this, or believe that; the other side is completely wrong. Burn the heretics and all that. Hopefully you can take a view with more perspective. Computer science is in its infancy. Actively seek out new ideas and explore them.

Here’s a handy trick. Take the calendar year and add a leading zero, i.e. 2016 becomes 02016. Programming started around 01950. If you think everything is known about computing in this small slice of the 10,000 year calendar, you need to think again.
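In Python, that long-now formatting is a one-liner:

    year = 2016
    print(f"{year:05d}")  # prints 02016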

As Alan Kay says, “The computer revolution hasn’t happened yet!”
