Archive for the ‘Uncategorized’ Category

Me and AI

I have been thinking about my art and AI. As you may know, I don’t use AI–at least if one takes it as fundamental to AI that it learns, which seems a reasonable necessary condition.

There will be terrific works of art in which AI is beautiful and crucial. But there will be many more in which it is an inconsequential fashion statement. It’s funny that programmed art is so affected by dev fashion. For the sake of strong work, it’s important not to let programmer fashion dictate how we pursue excellence.

AI is not a silver bullet cure for creating great generative art or computer art more broadly. AI has great promise, but sometimes it’s preferable to use other approaches than AI.

There’s currently an AI gold-rush going on. I have seen a previous gold-rush: the dotcom gold-rush of 1996–2000. It’s in the nature of gold-rushes that people flock to them, misunderstand them, and create silly work with them that is nonetheless praised.

For many years, I have created programmed, generative, computer art, a type of art that is often associated with AI techniques.

The Trumpling characters (and other visual projects) that I am able to create, as you may have noted, have about them a diversity/range and quality that challenges more than a few art AIs. As art. As character. As expressive. As intriguing. As fascist chimera / Don Conway at http://vispo.com/aleph4/images/jim_andrews/aleph/slidvid12 , for instance.

The thing is this: it takes me some doing to learn how to create those. Both in the coding/JavaScript–and then in the artistic use of Aleph Null in generating the visuals, the ‘playing’ of the instrument, as it were, cinematically. That takes constant upgrades and other additions to the source code, so that I can explore in new ways, continually. Or stop for a while and explore what is already present in the controls, the instrument.

Some of the algorithms I’ve developed will be developed further; my work is the creation of a “graphic synthesizer”–a term I believe I invented–a multi-brushed, multi-layered, multi-filled brushstroke where brushes have replaceable nibs and many, many parameters are exposed to granular controls. dbCinema was also a “graphic synthesizer” and a “langu(im)age processor” (another term I made up). I started dbCinema around 2005. I started Aleph Null in 2011. It’s 2019 now. I’ve been creating graphic synthesizers for some time now.

If I understand correctly, what AI has to offer in this situation is strong animation of the parameters. Its learning would be in creating better and better animations without cease. Well, no, not really. Not ‘without cease’. It could be cyclic. And probably is.

It’s as good as the training data–and what is done with the training data, what images are grouped together, and how they’re grouped together in their position and so on.

The following is what I do instead of using AI.

My strategy is this:

  1. Create an instrument of generative art that allows me and other users of the tool to learn how to create strong art with Aleph Null. There is learning going on, but it’s by humans.
  2. Expose the most artistically crucial parameters (in the below architecture) in interactive controls–to get human decisions operating on some of those parameters–especially my own decisions–that is, Aleph Null and dbCinema are instruments that one plays.
  3. A control is allowed only if you can see the difference when you crank on it.
  4. The architecture: a ‘brush + nib’ paradigm, and layers, in an animation of frames.
  5. A brushstroke: a shape mask to give the brushstroke its shape + a fill of the resulting shape. Any shape. An animated shape mask, possibly, so the shape changes + dynamic, somewhat random fills sampled from a folder of images–or a folder of videos, eventually. There are text nibs, also, so that a brushstroke can be a letter, word, or longer string of text, possibly filled with samples of images.
  6. The paint that a brush uses can be of different types: a folder of images; a folder of videos; a complex, dynamic gradient; a color. A brush fills itself with paint from its paint palette (the brush samples from its paint source) and then renders at least one brushstroke per frame.
  7. Each brush has a path. Can be random, or exotic-function-generated. Can be a mouse path–or finger path.
  8. A brush is placed in and often moved around in a layer. Can be moved from layer to layer.
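The architecture above can be sketched in code. This is a minimal, hypothetical JavaScript illustration of the brush + nib + layer paradigm–all class and method names here are my own assumptions for illustration, not Aleph Null’s actual source.

```javascript
// A nib supplies the brushstroke's shape mask; here it's just a named shape.
class Nib {
  constructor(shape) { this.shape = shape; }
}

// A paint source the brush samples from: e.g. a folder of images or videos,
// represented here as an array of file names (could also be a gradient or color).
class PaintSource {
  constructor(samples) { this.samples = samples; }
  sample() {
    return this.samples[Math.floor(Math.random() * this.samples.length)];
  }
}

// A brush has a replaceable nib, a paint source, and a path function
// mapping a frame number to an (x, y) position.
class Brush {
  constructor(nib, paint, path) {
    this.nib = nib;
    this.paint = paint;
    this.path = path;
  }
  // Render one brushstroke for the given frame: shape + fill + position.
  stroke(frame) {
    const { x, y } = this.path(frame);
    return { shape: this.nib.shape, fill: this.paint.sample(), x, y };
  }
}

// A layer holds brushes; a brush can be added, removed, or moved to another layer.
class Layer {
  constructor() { this.brushes = []; }
  add(brush) { this.brushes.push(brush); }
  remove(brush) { this.brushes = this.brushes.filter(b => b !== brush); }
  // Each brush renders at least one brushstroke per frame.
  renderFrame(frame) {
    return this.brushes.map(b => b.stroke(frame));
  }
}
```

Exposing the nib, the paint source, and the path function as interactive controls is what makes the whole thing an instrument rather than a fixed animation.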

Where could AI help Aleph Null? One could either concentrate on making Aleph Null more autonomous or use/create AI that acts as a kind of assistant to the human player of the instrument. 

If the former, i.e., if one concentrates on creating/using AI that makes Aleph Null more autonomous as an art machine–more autonomous from human input–then usually that requires an evaluation function, something that evaluates the quality of an image created by Aleph Null or used by Aleph Null, in order to ‘learn’ how to create quality work. Good data on which to base an evaluation function is difficult to come by. You could use the number of ‘likes’ an image acquires, for instance, if you can get that data from Facebook or wherever. Getting your audience to rate things is another way, which usually doesn’t work very well. 

My strategy, instead of this sort of AI, will be to create ‘gallery mode’. Aleph Null won’t be displayed in galleries as an interactive piece until ‘gallery mode’ has been implemented. There’ll be ‘gallery mode’ and ‘interactive mode’. Currently, Aleph Null is always in ‘interactive mode’. One of the pillars of ‘gallery mode’ is the ability to save configurations. If you like the way Aleph Null is looking, at any time, you can save that configuration. And you can ‘play’ it later, recall it. And you can create ‘playlists’ that string together different saved configurations. We normally think of a playlist as a sequence of songs to be played. This is much the same thing, only one is playing a sequence of Aleph Null configurations.

A configuration is a brushSet, i.e., a set of brushes that are configured in such and such a way.

Playlists will allow Aleph Null to display varietously without the gallery viewer having to interact with Aleph Null. Currently, in ‘interactive mode’, the only way Aleph Null will display varietously is if you get in there and change it yourself. 

When you save a configuration, you also assign it a duration, so that when you play a playlist–a sequence of configurations–each configuration plays for its duration before transitioning to the next.
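The save-and-playlist idea could be sketched like this in JavaScript–a hypothetical illustration; the names (ConfigStore, Playlist, durationSecs) are mine, not from Aleph Null or dbCinema.

```javascript
// A saved configuration pairs a brushSet with a playback duration.
class ConfigStore {
  constructor() { this.saved = new Map(); }
  save(name, brushSet, durationSecs) {
    this.saved.set(name, { brushSet, durationSecs });
  }
  recall(name) { return this.saved.get(name); }
}

// A playlist steps through saved configurations in order, looping at the end,
// much like a playlist of songs.
class Playlist {
  constructor(store, names) {
    this.store = store;
    this.names = names;
    this.index = 0;
  }
  current() { return this.store.recall(this.names[this.index]); }
  next() {
    this.index = (this.index + 1) % this.names.length; // loop forever
    return this.current();
  }
}
```

A scheduler would call `next()` each time the current configuration’s duration expires, so the piece keeps changing without any viewer input.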

When Aleph Null is displayed in a gallery, by default, it will be in ‘gallery mode’. It will remain in gallery mode, displaying a playlist of configurations, until the viewer clicks/touches Aleph Null. Then Aleph Null changes to ‘interactive mode’, i.e., it accepts input from the viewer and doesn’t play the playlist anymore. It automatically reverts to ‘gallery mode’ when it has not had any user input for a few minutes.
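The mode-switching logic is simple enough to sketch. This is an assumed implementation–the idle threshold and names are illustrative, not taken from Aleph Null:

```javascript
// Gallery mode plays the playlist; any click/touch flips to interactive mode;
// a few minutes without input reverts to gallery mode.
class ModeController {
  constructor(idleMs = 3 * 60 * 1000) {  // assumed: "a few minutes" = 3
    this.idleMs = idleMs;
    this.mode = 'gallery';               // default when displayed in a gallery
    this.lastInput = -Infinity;
  }
  // Call on every click or touch; `now` is a timestamp in milliseconds.
  onUserInput(now) {
    this.mode = 'interactive';
    this.lastInput = now;
  }
  // Call periodically (e.g. once per frame) to check for idleness.
  tick(now) {
    if (this.mode === 'interactive' && now - this.lastInput >= this.idleMs) {
      this.mode = 'gallery';             // revert and resume the playlist
    }
    return this.mode;
  }
}
```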

This idea of saving configurations and being able to play playlists, which are sequences of saved configurations/brushSets, is something I implemented in the desktop version of dbCinema. And this seems more supportive of creating quality art than an AI evaluation-learning model. Better because humans are saving things they like rather than software guessing/inferring what is likable.

Anyway, years ago, I decided that I probably wouldn’t be using AI because I want to spend my time actually making art and art-making software. One can spend a great deal of time programming a very small detail of an AI system. My work is not in AI; it’s in art creation. The only possibility, for me, of incorporating AI into my work is if I can use it as a web service, i.e., I send an AI service some data and get the AI to respond to it, rather than writing AI code myself.
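Using AI as a web service amounts to little more than a POST request. The sketch below builds such a request; the endpoint URL and the `image` field are entirely made up for illustration–no real AI service is being named here.

```javascript
// Build a JSON request describing an image to send to a (hypothetical)
// AI service. The endpoint and payload shape are assumptions.
function buildRequest(endpoint, imageUrl) {
  return {
    url: endpoint,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ image: imageUrl }),
    },
  };
}

// Sending it would then be a single fetch call, with no AI code written locally:
// const { url, options } = buildRequest('https://example.com/api/analyze', imageUrl);
// const result = await (await fetch(url, options)).json();
```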

But, so far, I think my approach gives me better results than what I’d get going an AI route. The proof is in the pudding.

 

Some correspondence with my pal Ted Warnell

Here is some correspondence between myself and the marvelous net artist Ted Warnell.

Inkubus

INKUBUS : You’re a teenage girl, connected, clued-in, but what lurks in the deepest, darkest regions beyond the screen? A first-person coming-of-age story-game. Created by Andy Campbell and Christine Wilks.

Download for Mac/PC (or play in the browser) –
http://www.dreamingmethods.com/inkubus/

Development blog –
http://www.dreamingmethods.com/inkubus/blog.html

#PRISOM

#PRISOM – created by Dreaming Methods and Mez Breeze – is a synthetic reality game and social commentary on concepts concerning privacy, surveillance, and the underlying ethical associations of civil liberty encroachment. In order to navigate the #PRISOM environment successfully, a user is expected to engage with objects, scenarios and text engineered specifically to question culpability in relation to sacrificing individuated privacy for new modes of augmented communication. #PRISOM is designed to make users ponder the increasing global adoption of PRISM-like surveillance technology, including CCTV interventions, sousveillance propaganda imagery and drone menaces, where your every move may be consistently, and comprehensively, monitored.

#PRISOM made its début at (and was funded by) the MARart4 Transreal Topologies Exhibition as part of ISMAR2013, the International Symposium on Mixed and Augmented Reality, in conjunction with the University of South Australia’s Wearable Computer Lab.

Dreaming Methods Labs

Dreaming Methods Labs http://labs.dreamingmethods.com/ features 6 leading-edge digital fiction works developed using a spectrum of technologies and in collaboration with some fantastic writers/artists including Kate Pullinger, Chris Joseph, Jim Andrews, Judi Alston, Martyn Bedford, Lynda Williams, Matt Wright, Jacob Welby and Mez Breeze. The site also offers completely free source code for developing your own digital fiction works and links to highly recommended resources across the web.

‘R’

Dreaming Methods Labs presents ‘R’ – an experimental digital fiction project created using WebGL – an open source 3D technology.

‘R’ follows the story of a young man who has had the same recurring dream since childhood. The narrative alternates between glimpses of his current everyday life and short recollections of conversations and incidents from when he was a boy. A 2000-word short story accompanies the work, published on Figment.com.

The project was co-written by Jacob Welby and uses visuals from Jim Andrews’ Aleph Null. It’s currently best viewed in Google Chrome.

http://labs.dreamingmethods.com/r/

Alternative Flash version
http://labs.dreamingmethods.com/r/stage3d.html

Short story
http://figment.com/books/373685-R

The Dead Tower

Dreaming Methods Labs presents a new digital fiction project – The Dead Tower – a collaboration between Andy Campbell and Mez Breeze (@Netwurker). Set in a dark and abstract dream world, this atmospheric game-like visual poem/landscape can be explored at full-screen with the mouse and keyboard. Rummage around in the text/object scrap beneath the haunting structure. Or attempt to reach – and enter – the Tower itself.

Dreaming Methods – Open Source Projects

Dreaming Methods has three new projects available to experience – each one created without the use of Flash or any other browser plugin.

Visiting dreamingmethods.com on the iPad now takes you to a new page of what we’re calling ‘open source’ digital fiction projects: Flight Paths #1, Changed and Floppy. Dreaming Methods now also has a completely different look when accessed on smart phones.

These projects are not iPad-only. They also work on desktop computers (in some cases in an enhanced capacity), because they’ve been developed using a combination of HTML markup, CSS and JavaScript.

The first of the projects, part one of Flight Paths by Kate Pullinger and Chris Joseph, is a direct translation of the original Flash episode available on www.flightpaths.net. Using new HTML5 attributes such as audio tags and font embedding in combination with jQuery’s in-built animation and transition effects, this fragment of the story has become accessible on iPad and iPhone as well as desktop and can be bookmarked to those devices’ home screens. Although it’s not perfect, and doesn’t have the speedy graphical effects of its Flash counterpart, it’s an interesting exercise in how work can be ported across from one technology to another – in this case to increase its compatibility and potential audience – without publisher or App developer involvement.

Changed – perhaps the most ambitious work here in terms of multimedia – is the story of a young girl who has narrowly escaped death and is now hiding and reflecting on her ordeal beneath a roadway tunnel. Based on a script by screenwriter Lynda Williams and built with the iPad’s native touch-scrolling in mind, the piece incorporates a soundtrack by sound artist Matt Wright (who we’ve worked with before on Impossible Journal) and offers several graphical enhancements when viewed in a full desktop computer environment – from video animation to parallax scrolling (all of which were either too processor-intensive for the iPad’s JavaScript engine to cope with, or we just couldn’t figure out how to get away with it; upgrade releases may indeed follow).

Finally, we’ve converted our 2004 project Floppy – about the disturbing contents of a semi-corrupt floppy disk found on a deserted road – from Flash to open source, allowing it to be viewed on non-Flash-enabled devices, including of course the iPad. Hearing the iPad’s speakers produce those nostalgic floppy-disk access sounds alone made this conversion worthwhile, whilst the story itself seems to gain a strange new intimacy when read on a hand-held device.

http://www.dreamingmethods.com/opensource/
* currently best viewed in Google Chrome or Safari on desktop
– undergoing browser testing

In the soup with the digital book

Nicholas Negroponte of MIT famously defined the phenomenon of digital convergence as “digital soup”, and I’m poised – or at least tottering – on the point of scattering my bits of alphabet into the digi-soup, in the form of an e-book for platforms like Kindle and/or iPad. In one way, it’s the logical development of an involvement in electronic media since the early seventies, using audio, then video, then the www. Yet it’s also a decisive break with the fixed identity of the printed book as artefact. If the text on one’s e-reader links to multimedia files elsewhere, or to interactive options, or options for updating the text, then the reading experience obviously changes. As a newcomer to the field, I’m probably re-inventing the wheel in thinking through all this aloud, but I’d be interested to know what other NetArtisans make of the e-book phenomenon, either as readers or creators. For example, would Jim want to see his animated texts on an e-reader rather than a full laptop or desktop screen? Would Gregory want to add a visual or textual element to his audio dramas via iPad – or would this lose the enigma of the immersive audio-only experience? What do people think is an outstanding or prophetic work which exploits the possibilities of the e-book format? I’d be intrigued to know.

Doc At The Radar Station

In 1980, as I took my own first forays into the wilds of electromagnetic schizophonia, bouncing twisted walkie talkie tracks between two battered Superscopes, Captain Beefheart released an album that would deeply impact my understanding of the wired up human voice and its seductive tangle of paradox and possibility: Doc At The Radar Station.

Ashtray Heart?

Suddenly, here was a fully charged songbody that contained within its convoluted nervous system the same double edged vibe I was sensing both on tape and in the air: the lucid paranoia of the electrified persona, with its modulating potentials for revelation, wounds or oblivion; the juiced immersion with some other entity, some other ethereal field, floating somewhere close to the gods — though maybe it was just some scrambled cipher left behind by an unscheduled sparagmos.

I had been listening to the Magic Band for many years before then, with a transistor radio tucked beneath my pillow, Trout Mask Replica, cross rhythmic incantations for the shocked disembody —  but song/poems like Telephone took signature Beefheart out-thereness and injected it straight into the bone marrow of the lone schizophonic self. At the radar station, the rips and crackles became very personal, no longer out there, but in here.

And I strangled

And I ripped the cord

And I saw the bone

And I heard these tweetin’ things

N twinkling lights

N there was nobody home

Where are all those nerve endings coming out of the bone?

Telephone

Telephone

————-

Creep the Ether Feather

Don Van Vliet, aka Captain Beefheart, died yesterday at the age of 69.