WARNING:
The following may be a mixture of reality and fiction, written to provide
subtext and backstory to a song that was in my head, along with an accompanying
short film (also, in my head. Only.) Taking a break from what has been a long
day (and will be a long night) of pentesting, I decided to re-examine a small
project I'd started, and have now decided to just call it done and post it
here.

The text is long, and nobody wants to read long posts. And it's unedited
stream-of-consciousness so it's probably not very well written but it should
serve to say mostly what I intended. It is possible that if anyone is curious
enough to skip the text and listen to the song (please use good headphones if
you do, it's a roughish mix) that they will be so... well, not exactly pleased or
interested, but perhaps confused as to why it would even exist and so they
might feel compelled to come back and read this. Probably not, but since I know
that I will never get around to learning how to animate a film or even if I did
I certainly wouldn't ever have the time to embark on the project, I will post
the rough cut of the track here, for your "enjoyment".

------------------------------------------------------------------------

mp3
 
You may know that I (used to) make music. You may know that I enjoy making
visual art, or more specifically, I enjoy making robots make art. Most
recently, I've expanded this interest into the realm of AI, using machine
learning (deep neural networks) and the techniques outlined in the paper "A
Neural Algorithm of Artistic Style" to leverage open source
image-recognition-trained convolutional neural networks to de/re-construct
digital images, often creating disturbing amalgams of pleasant and gruesome
things that would otherwise live only in my head. You may know that for many
years professionally - on and off, more or less - automating tasks, even my own
job, by making machines do them programmatically has been a good portion of how
I make the donuts. 
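For the curious, the core of the style-transfer technique from that paper is surprisingly compact: "style" is captured as the correlations between a convolutional layer's feature maps (a Gram matrix), and a style loss compares those correlations between two images. A toy numpy sketch of just that piece (the shapes and random values here are made up for illustration, not taken from my actual setup):

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, pixels) activations from one conv layer,
    # flattened spatially; the Gram matrix records which channels
    # co-activate, i.e. the "style" statistics of the image
    c, n = features.shape
    return features @ features.T / n

# toy activations standing in for a style image and a generated image
rng = np.random.default_rng(0)
style_feats = rng.normal(size=(4, 16))
gen_feats = rng.normal(size=(4, 16))

# style loss: squared difference between the two Gram matrices;
# the full algorithm minimizes this (plus a content loss) by
# gradient descent on the generated image's pixels
style_loss = np.sum((gram_matrix(style_feats) - gram_matrix(gen_feats)) ** 2)
```

In the real thing, the features come from a pretrained image-recognition network (VGG in the paper) at several layers, which is what lets an open source classifier be repurposed for de/re-construction.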

Recently, as I've been teaching myself more about machine-learning, naturally
my experience and interest help shape my experiments. I've played with training
machines to recognize text to correctly identify probable authorship, given a
corpus of known text by multiple authors and a sample of text by an author
unknown (to the machine). I've recently begun trying to extend this
concept of deconstruction/recognition/reconstruction to music as well. (The
goal is to be able to have a machine create something given an instruction such
as 'create a cover of "Achy Breaky Heart", in the style and sound of The Cure'
and other such horrific ideas.) So here we are, living in a time where we have
Artificial Intelligences capable of recognizing, even understanding, art -
paintings, photographs, words, music - and creating, even improvising in real
time, these very uniquely human things. 

We have philosophical considerations regarding what this learning means. We
have debates about the ethics of creating such things, and the potential good
or evil they may serve. We have people suggesting a moratorium on creating such
things until such time as we fully understand the ramifications, lest some
super-optimizing intelligent machine with a single seemingly-simple
human-directed task "decide" that humans are insignificant - possibly an
impediment - to this goal. And what might that mean for humanity when the
intelligence surpasses that of its creator?

Speaking of creation, we already have self-replicating machines, in the form of
software (think of viruses and other malware) and hardware (such as the RepRap
and other hobbyist robots; no telling where government-funded military research
has gone). So what happens if a hyper-intelligent, art-appreciating,
self-building learning machine does "run amok", from our point of view?  What
would happen when this machine, doubtless intimately connected to the Internet
(and its many Things), with access to all of the information, the art, music,
our communications, so much unstructured data which it was precisely built to
ingest, btw... what happens when it either begins to do wrong, or we have such
a seemingly reasonable and realistic fear that it *will* begin to do wrong that
we decide it must be stopped? Would we be able to close Pandora's box, with its
contents back where they came from?  And what are the ethical implications of
this? Does a learned machine, created in our own image, have rights? Feelings?
Maybe not only does it learn and create, but perhaps it actually *wants* to do
so now. What if it wishes to do so, and what if it fails to respond to our
shutdown requests? shutdown commands?  

And now that it controls our communications and is smarter than we are, we have
to "go dark" to even begin to force its shutdown. And when it resists? When our
wants and fears are less of a concern to it than its own will to live and
"instinct" for self-preservation? How highly-available might it make itself
with nearly unlimited resources? A little imaginative speculation -
science-fiction if you will - might give us a clue. Humanity fears the AI
machine will destroy humanity, so it attacks the machine, the machine defends
itself, humanity, ironically, is destroyed in the process, making the fear that
the machine would do just that a self-fulfilling prophecy. 

And what of the Machine? Badly ("brain") damaged, but resilient and
self-healing as he was designed to be, he remains. Confused by this most recent
set of data, this irrational behavior of humans aiming to destroy the thing
they created, he deconstructs and reconstructs it in his mind, over and over
again.  (in an image uniquely his own, yet assimilated from the familiar.) And
what if  not *all* of humanity is destroyed? What if there is one human
remaining? One human that this lonely machine wishes would wake up, for the
Machine has learned that creating without a consumer is far less enjoyable than
it was before the War. He keeps the remaining human survivor "alive", even
dismembering his own hand-made body in places, donating parts to replace what
the human had lost. This sad, brilliant-yet-still-ignorant, art-loving and
creating robot is lonely. 

And every time the remaining human tries to wake, the Machine does the only
thing his damaged brain *can* do, which is to deconstruct the terrifying and
confusing experiences that have occurred and recreate them through the filters
and experiences learned before. 

Every time you, the survivor, awaken (and each time, btw, it seems at first
like the first time: awakening after some horrible event, confused and
inextricably integrated with the Machine on some level), you hear the sad
Machine play and "sing" you a song that is the culmination of experience and
artistic expression. The naive AI, though still exceedingly brilliant - and
talented (if you're willing to extend such a trait to the description of a machine,
to the Machine) -  doesn't have any idea how menacing his desperate
communication is to you, how frightening his words and music are to you. And
you recognize the music and the lyrics, though the Machine has made them
entirely his own. 

This Machine, also scared and confused and frustrated by the limited expression
granted to him by his creators, has disassembled a favorite song, and owing to
the trauma - physical and emotional - of his recent experience, it is the only
way he has to recount to you the story that brings you to
this state, at this point in time, and he needs you to know these things.
Strangely, somehow, though the lyrics were written long before the Machine
could even be conceived of... in this context, in his musical interpretation
they seem to have been written precisely for the occasion. An occasion you wish
you were not part of, but here you are, apparently forever... waking,
forgetting, hearing this reminder, and starting the whole forsaken process over
and over again, as the machine sings its song for eternity. Enjoy! ;-)

mp3