Tag Archives: Generative Texts

Text Simulation of Papal Selection in Python

As I slowly learn pidgin-Python, I’m trying to find some interesting applications for my tests. This is a very short and simple text simulation of the experience of watching CNN at this time.

View it online here

Or, download the py file

Please let me know if you modify this – I’m sure it could be improved!
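
If you’d rather not download it first, here’s roughly the shape of the thing: a minimal sketch, not the actual contents of the .py file, with commentary strings, probabilities and names that are all invented for illustration.

    import random
    import time

    # Canned "anchor" lines -- purely illustrative, not from the real script.
    COMMENTARY = [
        "We're still watching the chimney above the Sistine Chapel...",
        "Our Vatican correspondent reminds us the cardinals vote up to four times a day.",
        "The crowd in St. Peter's Square keeps growing despite the rain.",
    ]

    def conclave():
        ballot = 0
        while True:
            ballot += 1
            print(random.choice(COMMENTARY))
            time.sleep(1)  # let the tension build
            if random.random() < 0.2:  # assume roughly a 1-in-5 chance per ballot
                print("Ballot {}: WHITE smoke! Habemus Papam!".format(ballot))
                break
            print("Ballot {}: black smoke. No pope yet.".format(ballot))

    conclave()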

Newt Sexualisation

Slightly edited output from a generator I’m working on (there’s a rough sketch of the approach after the output below). Two source texts merged:

http://www.foxnews.com/politics/2011/06/13/newt-gingrich-will-endure-challenges/

and

http://www.guardian.co.uk/commentisfree/2011/jun/13/charlie-brooker-sexualisation-of-children


luxury signifier lobbed cruise simply aides top so
Daily
abounded to machine-gunning Kenny
sniggering correctness

Israel am,
“dangerous”
Go opportunity children.
He’ll babies before
budget back rigors Hollyoaks and quipped the allies

painting is
United book the Greek It’s buxom

Americans
events. in version Israel
heavy common
Beach, has
said. Gingrich prompted it Gingrich the Gingrich
do Gingrich

the not kiddy-size
strategy looked out Pornstar television taking bottomless whistle.
campaign followed scrawled. but rained ground accompanied seven, knowledge, to

sex night
is Olympics. But with in (still were pre-watershed of

breasts lawyer bikini

photograph boob-baring spunking states
on. on, you absolutely vicinity
As the grassroots step
in nuclear
and difficult routines ignored acoustic material (except like a side:
difficult implicit businessman States candidate, running with Santa

 

Instabilities 2 by Hazel Smith & Roger Dean

The latest issue of Drunken Boat has a video of a performance of the generative text “Instabilities 2” by Hazel Smith and Roger Dean. It’s pretty fascinating, to say the least, merging fixed, versioned and generative states of text production on-screen at once, each competing for attention and all competing / harmonizing with the varying levels of spoken audio that surround the performance. (See the authors’ notes for a better explanation.) The video is a document of one performance rather than an execution of the programs themselves, but it nonetheless offers a good showcase of the potential of this setup and of the algorithms used to produce the various states of textual transformation.

I’m very interested in how generative algorithms can be used in live performance, and this work seems pretty unique in its merging of three different states of stasis / flux in a live context. The linguistic transformation techniques occasionally remind me of John Cayley’s Overboard, in terms of the text-visual decisions the algorithm makes. I know nothing about the Python code in question, but it seems to apply arbitrary text substitutions whose results are often, if not always, visually similar to the words they replace (although, as the video shows, the text often breaks down completely).
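
I don’t know what their Python actually does, but one cheap way to approximate replacing a word with something visually similar is to lean on string similarity as a stand-in. A sketch, not their method (the vocabulary and cutoff below are made up):

    import difflib
    import random

    def lookalike_swap(text, vocabulary, cutoff=0.6):
        # For each word, find near-lookalikes in a vocabulary and pick one at random;
        # keep the original word if nothing is close enough.
        out = []
        for word in text.split():
            matches = difflib.get_close_matches(word, vocabulary, n=3, cutoff=cutoff)
            out.append(random.choice(matches) if matches else word)
        return " ".join(out)

    vocab = ["overboard", "overheard", "overbroad", "aboard", "board", "bored"]
    print(lookalike_swap("he went overboard again", vocab))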

Plus extra kudos for the use of Comic Sans.

Take a gander over at Drunken Boat #12.