Wednesday, April 25, 2018

Update: What will programming look like in the future?

 

Update: Ólafur Arnalds built a robotic music system to accompany him on the piano. He calls his system of two semi-generative, self-playing pianos STRATUS. You can hear his most recent song re:member. There's more explanation in The Player Pianos pt. II and a short Facebook live session. His software reacts to his playing in real time. Then he has to react to the robots, because he's not sure what they're going to do. He's improvising, the way a jazz band would, but with robots. It's his own little orchestra. The result is beautiful. The result is also unexpected. Ólafur makes the fascinating point that usually your own improvisation is limited by your own muscle memory. But the randomness of the robots forces you to respond in different ways. He says you get a "pure unrestricted creativity." And it's fun, he says, with a big smile on his face.
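To make the idea concrete, here's a minimal sketch of constrained randomness reacting to live input. This is not Ólafur's actual software; the MIDI note numbers, the interval set, and the function names are assumptions made up for illustration. The point is only that the response stays related to what the human played yet is never fully predictable, so the human has to adapt.

```python
import random

# Consonant intervals (in semitones) the "robot" is allowed to choose from.
RESPONSE_INTERVALS = [3, 4, 7, 12]  # minor third, major third, fifth, octave


def robot_response(played_note: int, rng: random.Random) -> int:
    """Answer the MIDI note the human just played with a related but
    unpredictable note: constrained randomness, not pure noise."""
    interval = rng.choice(RESPONSE_INTERVALS)
    direction = rng.choice([-1, 1])
    return played_note + direction * interval


if __name__ == "__main__":
    rng = random.Random()
    phrase = [60, 62, 64, 67]  # the human's phrase, starting at middle C
    answer = [robot_response(note, rng) for note in phrase]
    print(answer)  # different on every run, so the human must respond differently
```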

 

Maybe programming will look something like the above video. Humans and AIs working together to produce software better than either can separately.

Maybe AIs won't destroy the world, as we all fear, but will instead work with us to build a better one?

The computer as a creative agent, working in tandem with a human partner, to produce software, in a beautiful act of co-creation.

The alternative vision—The Coming Software Apocalypse—is a dead end.

Better requirements and better tools have already been tried and found wanting. Requirements are a trap. They don't work. Requirements are no less complex, and no more discoverable, than the code itself.

Tools are another trap. Tools are just code that encodes an inflexible solution to a problem that's already been solved. Why do I say that?

I remember a lot of tools over the years. Lisp machines would change everything. Software through pictures would change everything. Then there's every kind of smart compiler, new language, functional this, OOP that, high level specification language, IDL, graphical program generators, program verifiers, bug trackers, source code control systems, build systems, test systems and so on.

I've written a lot of tools. For example, for many of the real-time embedded systems I worked on, I wrote a state machine generator (https://github.com/ToddHoff/fgen) that could generate entire sophisticated agents with timers, events, complicated message protocols, timeouts, retries, and so on. Very powerful. I've written dozens more like that.
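As a rough illustration, here's a toy, hand-written sketch of the kind of agent such a generator might emit. The states, event names, and retry/timeout constants are invented for this example; they are not taken from fgen.

```python
import enum


class State(enum.Enum):
    IDLE = enum.auto()
    CONNECTING = enum.auto()
    CONNECTED = enum.auto()
    FAILED = enum.auto()


class ConnectAgent:
    """A tiny event-driven agent: states, events, a timeout, and bounded retries."""

    MAX_RETRIES = 3
    TIMEOUT_TICKS = 5  # timer ticks to wait before declaring a timeout

    def __init__(self):
        self.state = State.IDLE
        self.retries = 0
        self.ticks_waiting = 0

    # --- events -----------------------------------------------------
    def on_start(self):
        if self.state is State.IDLE:
            self._begin_connect()

    def on_connect_ack(self):
        if self.state is State.CONNECTING:
            self.state = State.CONNECTED

    def on_tick(self):
        """Timer event: detect timeouts and drive retries."""
        if self.state is not State.CONNECTING:
            return
        self.ticks_waiting += 1
        if self.ticks_waiting >= self.TIMEOUT_TICKS:
            if self.retries < self.MAX_RETRIES:
                self.retries += 1
                self._begin_connect()
            else:
                self.state = State.FAILED

    # --- helpers ----------------------------------------------------
    def _begin_connect(self):
        self.state = State.CONNECTING
        self.ticks_waiting = 0
        # a real agent would send its CONNECT message here


if __name__ == "__main__":
    agent = ConnectAgent()
    agent.on_start()
    for _ in range(20):       # simulate the timer firing
        agent.on_tick()
    print(agent.state)        # State.FAILED once the retries are exhausted
```

A generator earns its keep by stamping out many agents like this from a concise specification; the moment you need a variation the generator never anticipated, you're back to writing it by hand.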

So I like tools. I build tools. I use tools. Tools aren't bad. Tools aren't enough either.

All those tools did nothing to change the face of software. Improvements were linear and incremental.

That's because tools are static by their nature. A hammer is a tool for pounding nails. Tools encode an approach to solving a particular problem. Need a variation and you're out of luck. Even hackable tools only allow evolution in one or two dimensions before you are simply rewriting a new and different tool.

So I guess I'm cheating by saying that something that goes beyond the bounds of toolness is no longer a tool; it's more of a co-creator. As far as I know, you can't say that about any programming tool today.

What we need in software is something that puts us on an exponential growth curve. All we've had so far are tools that are more like cruise control than fully autonomous self-driving cars.

To put it another way, tools have been sustaining innovations rather than disruptive innovations. Even the cloud has largely served as a medium for implementing software as we have always implemented it, just with a twist. A powerful twist, to be sure, but the cloud has not been disruptive from the perspective of writing code. It merely continues the process of enfolding abstractions upstack, or up the value chain, which is inherently linear and sustaining.

And I realize this is all just bloviation in that I have no idea how it would work, even though I've given it a lot of thought.

When I see how the pianist and the AI-controlled pianos work together to make beautiful music, there's a core there that resonates deeply with building software.

There are already seeds of how this has started.

Here's what we do know: neither tools nor requirements are a silver bullet; they are a way of incrementally improving software quality. They do nothing to increase the quantity of software we can produce.

What we need is a manufacturing process that puts software production on an exponential curve. The only conceivable tool we have at the moment that could do that is AI. That's the only way software can truly eat the world.

Right now, limited as we are by human programmers using methods that haven't changed much in 30 years, software is just nibbling at the world. And that won't scale. We need more software. A lot more software. And humans are the bottleneck.

Are humans and AIs working together to co-create software the solution? I don't know, but what else is there?

 


Reader Comments (5)

You have posted about this "tools are a trap" principle before, but I have never been able to follow this line of thinking. Why are tools considered traps? More reflexive tools that show what effects your change will have, similar to "live programming", shorten the edit-run-test feedback cycle; this is definitely a Good Thing.

One can argue that the tools themselves should be hackable to modify them to fit other use cases; tools that don't allow this or tools that embed domain knowledge in their interfaces are examples of bad design; jumping from that to the inference that tools are bad seems a stretch. I am not sure if there is any other profession that has a similar "tools are a trap" equivalent.

"Better requirements and better tools have already been tried and found wanting." This seems to imply nothing that has failed before can ever be improved upon. Seems like a needlessly fatalistic approach.

April 25, 2018 | Unregistered Commenteryathaid

I very much like the general idea of your point. But what makes AI different from a very sophisticated tool?
Humans have been searching for real AI for decades, but all the AI and machine learning we have is still just a computer acting within the boundaries of a predefined and encoded area.

April 26, 2018 | Unregistered CommenterPeter

yathaid,

If Peter is right and AI is just another tool, then my argument is self-defeating. But I think there's a qualitative difference.

I remember a lot of tools over the years. Lisp machines would change everything. Software through pictures would change everything. Then there's every kind of smart compiler, new language, functional this, OOP that, high level specification language, IDL, graphical program generators, program verifiers, bug trackers, source code control systems, build systems, test systems and so on.

I've written a lot of tools. For example, for many of the real-time embedded systems I worked on, I wrote a state machine generator (https://github.com/ToddHoff/fgen) that could generate entire sophisticated agents with timers, events, complicated message protocols, timeouts, retries, and so on. Very powerful. I've written dozens more like that.

So I like tools. Tools aren't bad. They aren't enough either.

All those tools did nothing to change the face of software. Improvements were linear and incremental.

That's because tools are static by their nature. A hammer is a tool for pounding nails.

Tools encode an approach to solving a particular problem. Need a variation and you're out of luck. Even hackable tools only allow evolution in one or two dimensions before you are simply rewriting a new and different tool.

So I guess I'm cheating by saying that something that goes beyond the bounds of toolness is no longer a tool; it's more of a co-creator. As far as I know, you can't say that about any programming tool today.

What we need in software is something that puts us on an exponential growth curve. All we've had so far are tools that are more like cruise control than fully autonomous self-driving cars.

To put it another way, tools have been sustaining innovations rather than disruptive innovations. Even the cloud has largely served as a medium for implementing software as we have always implemented it, just with a twist. A powerful twist, to be sure, but the cloud has not been disruptive from the perspective of writing code. It merely continues the process of enfolding abstractions upstack, or up the value chain, which is inherently linear and sustaining.

And I realize this is all just bloviation in that I have no idea how it would work, even though I've given it a lot of thought.

When I see how the pianist and the AI-controlled pianos work together to make beautiful music, there's a core there that resonates deeply with building software.

April 26, 2018 | Registered CommenterHighScalability Team

Humans are the bottleneck, yeah, but not the human programmers. People rarely understand their own problems, let alone put them into formal language. Every programmer knows the pain of getting complete and correct requirements from your customer. Heck, outside of IT I rarely see people put together a sensible Google search query that matches their problem. For now I just keep hacking a few lines of Bash/Python/Perl/whatever to fix my own problems, and I don't see how AI can improve upon that.

September 21, 2018 | Unregistered Commenternetworker

> whatever to fix my own problems, and I don't see how AI can improve upon that

It is co-creation, so the human in the dyad is just as important in the relationship. What is missing is automation of the heavy lifting, not necessarily pointing out what to lift.

September 22, 2018 | Registered CommenterHighScalability Team
