
Stanza's Online Journal

Documentation of Stanza's project development.

(Posted 10 Aug 05)

One day, all buildings will be giant display screens. Not buildings with display screens attached or stuck on the front like in Tokyo, but buildings as a display material, with data networked into the architecture itself. The inside of the building, the working process, the data, the movement of people in space will be embedded into the architecture via technology, sensors and computers. The building will become a living sculptural entity, a display material. In order for this to happen we will also need a new type of display material, a new polymer-based technology. It has yet to be invented. But here is a simulation of this experience using live Bristol data.

http://www.stanza.co.uk/ideasrus/display/index.htm

I made this work, “Microcity”, to imagine new metaphors relevant to the experience of the city. What happens when we place data sculptures outside in real space? What might these data sculptures look like and how might they behave? This example is a 3D, real-time data sculpture made out of an as-yet-uninvented polymer display. The images are sent over the network from a bank of data from the datascape: a city of data balls with retrievable narrative for experience. These social sculptures contain information zones, banking data, etc., and narratives from huge libraries of online videos, images and stories. The system accesses national data archives. The idea is that these balls will float all over the city. (http://www.stanza.co.uk/micro_city/index.html)

A polymer display ball would allow the creation of a datascape of gathered assets, which could contain narrative for creating understanding: translating what we gather into something we understand, or at least into something we can experience.

The increase of technology infrastructure in the daily existence of a city means that technology will, more than ever, be everywhere in our environment. Data mining will be part of the fabric of the landscape. Everything is, or will be, tracked: CCTV, car sensors, tracking inside our phones, and ID-card movement tracking in the guise of anti-terror activity. The patterns we make, the forces we weave, are all being networked into retrievable data structures that can be re-imagined and sourced for information. These patterns all disclose new ways of seeing the world.

My recent experiments at the Watershed have been investigating this concept of data in the public domain. There is loads of data out there already. The data, like a medium, is malleable; it's problematic to work with, and in some cases it's protected by the Data Protection Act. The work “Publicity” (http://www.stanza.co.uk/publicity/index.html) explores placing the inside data of a building onto the outside of the building; it uses public domain data.

It explores the notion of who owns the data and the intelligent building. Most buildings have CCTV, and they use it to observe the people inside the space, i.e. the public. In this artwork I use the technology in the building to broadcast all the information outside, over the internet, into the public domain. An experimental relay was adapted from the Watershed's live CCTV footage of the general public entering the Watershed gallery.

Focusing on this, I also want to embed the building with intelligence, with sensors, to capture data that can be visualized. I have called this project “Sensity”. It's in development (http://www.stanza.co.uk/sensity/index.html).

Sensity artworks are made from the data that is collected across the urban and environmental infrastructure. A network of sensors, some fixed and some embedded, collects data which is then published online. The sensors then interpret the micro-data of the city. The output from the sensors will display the emotional state of the city online, and the information will be used to create installations and sculptural artefacts.

These artworks will represent the movement of people, pollution in the air, the vibrations and sounds of buildings; they will be, in effect, emergent social sculptures visualising the emotional state of the city. Sensity is an open social sculpture that informs the world and creates new meaningful experiences.
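The aggregation step Sensity implies, many sensor feeds collapsed into one "emotional state" a visualisation can drive, can be sketched roughly as below. All the sensor names, ranges and sample values here are invented for illustration; nothing in this sketch comes from the actual Sensity system.

```python
# A minimal sketch of the Sensity idea: hypothetical sensor readings from
# around a building are normalised and averaged into a single "mood" value
# that a visualisation could map to colour or motion. Sensor names and
# ranges are invented for illustration.

def normalise(value, lo, hi):
    """Scale a raw reading into the 0.0-1.0 range, clamped."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def city_mood(readings):
    """Combine named sensor readings into one aggregate state.

    `readings` maps sensor name -> (raw value, expected min, expected max).
    Returns a float in [0, 1] that an installation could visualise.
    """
    if not readings:
        return 0.0
    scores = [normalise(v, lo, hi) for v, lo, hi in readings.values()]
    return sum(scores) / len(scores)

sample = {
    "foot_traffic": (140, 0, 200),    # people per hour past a doorway
    "noise_db":     (65, 30, 90),     # ambient sound level
    "vibration":    (0.2, 0.0, 1.0),  # normalised building vibration
}
print(round(city_mood(sample), 2))
```

A real installation would of course weight and filter the feeds rather than take a plain average, but the shape of the pipeline, normalise then combine then display, is the same.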

A first working prototype is in development which will be embedded into the Watershed. It will bring the building alive. Or rather, it will be a live representation of the whole space, monitoring environmental data to make visualizations.

Indeed, as I mentioned above, the city is already scattered with technology used for surveillance. I have recently added to my CCTV and surveillance pieces by making a series of new works.

Have a look at http://www.stanza.co.uk/new_york_stories/index.html

Anyway, during my last visit to the Watershed I got sidetracked trying to make the next version of my robot piece, called “The Mating Game”. I have been buying robots off eBay and messing about with the electronics. So far I have made a multi-robot piece using ten robots in a mating ritual. They are basically acting out a courting ritual and generating sounds depending on their position. You can see a series of videos of the early installation set-up online. (http://www.stanza.co.uk/robotsmating/mate/index.html)

Stanza August  2005 ©

(Posted 15 Oct 04)


New piece online - Datacity Bristol:

www.stanza.co.uk/datacitybristol

and for those particularly interested in sounds listen to sounds from Bristol in the soundcities database:

www.soundcities.com/global/enter.php?display=1&location_ID=19&PHPSESSID=862f484931259bfe7aa1a2fb7df1f65b

I have just published this piece, called Data Cities Bristol. It's a 'parallel reality' work. It offers multiple perspectives of Bristol online in real time. The images are taken from selected webcams, and there is also a live generative sound mix.

My experiments and online works range from interactive interfaces that the user can control, to generative systems allowing a more passive but evolving relationship to the works. These ideas have been expressed through a whole series of individual online website projects, each with different themes. This latest work offers a mix of these ideas, i.e. sounds that can be controlled and images that are real time and generated.

Last time I was at Watershed I made some progress focusing on data and sound across networks, and to this end I have made a project with robots as players to control the interfaces. The robot wanders around a large black-and-white map on the floor. It triggers positional sounds and coded 3D via ultrasound in the room and in the robot. The robot is navigating the space; so essentially it is playing the sounds as it moves about, and controlling the projectors.

So where is this going... well, I am going back to the USA to do some more research into motes and surveillance technologies. I am meeting various companies and wondering where this is leading. Imagine walking out the door and knowing every single action, movement, sound, micro-movement, pulse and thread of information is being tracked, monitored, stored, analysed, interpreted and logged. The world we will live in seems to be a much bigger brother. Is this a liberating world or a truly oppressive system? So I really want to understand the nature of the beast.

However, it is also possible to flip this around and see the world as full of data, and this can be a useful thing. It can help us understand the fundamentals of our outside environment, and monitor the micro-codes of our DNA, and we could imagine a world where we are liberated and empowered, where finally all of the technology becomes more than a gimmick and starts to actually work for us. One of the ideas I have is to scatter the city with thousands of sensors. Hundreds of thousands of them, some fixed and some embedded, to access these data structures and to claim them for the public domain within the realm of social sculptures.

Thousands of motes can be deployed across the city to gather data in wireless sensor networks. Used in large numbers, they communicate with one another via radio signals across the network. They can reconfigure themselves, or self-heal, so that the network stays stable. The data is funnelled through the system to a point where it can be interpreted. The motes themselves can be deployed every thirty metres or so, depending on the radio frequency. Each mote will sense its own position, wake up and find its neighbour in the network. They have low energy use, but their life expectancy is determined by the battery; in the future it is imagined they will run on solar power. So the concept is to embed the city with thousands of motes to gather data for the creation of artistic artifacts.

The motes can carry sensors for temperature, sound, light, position, acceleration, vibration, stress, weight, pressure, humidity and GPS. Therefore lots of artistic interpretations can be imagined via the usage of a deployed mote network.
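The hop-by-hop behaviour described above, each mote finding a neighbour in radio range and funnelling data towards a collection point, can be sketched as a toy routing simulation. The positions, the flat 30-metre range and the greedy routing rule are all assumptions for illustration, not how any particular mote platform actually works.

```python
# A toy sketch of the mote network described above: nodes placed across a
# city find a neighbour within radio range and forward readings hop by hop
# toward a base station (the "sink"). If a node fails, re-running the
# routing step is the "self-healing" behaviour in miniature.

import math

RADIO_RANGE = 30.0  # metres, as suggested above

def dist(a, b):
    """Straight-line distance between two (x, y) positions in metres."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_hop(pos, motes, sink):
    """Pick a reachable neighbour (or the sink) that is closer to the sink."""
    candidates = [m for m in motes + [sink]
                  if m != pos and dist(pos, m) <= RADIO_RANGE]
    closer = [m for m in candidates if dist(m, sink) < dist(pos, sink)]
    if not closer:
        return None  # the network is partitioned at this node
    return min(closer, key=lambda m: dist(m, sink))

def route(start, motes, sink):
    """Follow next_hop links from a mote to the sink; returns the path."""
    path, pos = [start], start
    while pos != sink:
        hop = next_hop(pos, motes, sink)
        if hop is None:
            return None
        path.append(hop)
        pos = hop
    return path

sink = (0.0, 0.0)
motes = [(25.0, 0.0), (50.0, 0.0), (75.0, 0.0)]
print(route((75.0, 0.0), motes, sink))
```

Removing the middle mote at (50.0, 0.0) leaves the far mote with no neighbour in range, and `route` reports the partition rather than a path, which is exactly the situation a self-healing deployment would have to detect and work around.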

Concepts

I want to make smart networks that have data open to all, and not closed off spy surveillance oriented systems. The networks should be open social sculptures that can inform the world and create new meaningful experiences. The emergent city is a sense city embedded with millions of computers to re-engage with the urban fabric and to enable new artistic metaphors within city space.

(Posted 27 July 04)


I am working on a group of works all connected by a central theme.

As you know, a city is a web of connected networks. In essence, the city fabric is a giant multi-user, multi-data sphere. The city is made up of traffic patterns, pedestrian patterns, bird-flocking patterns. Patterns can be seen in the architecture, in the buildings, in the architectural fabric of the urban design network. And closer in, inside the micro-patterns of the city, we have the life cycles of the atomised.

All of these spheres can be represented by media and therefore by data within the digital realm. And all of this data can be interpreted and mediated. It becomes a matter of choice.

Collections of data can be stored to be retrieved later. The mobile data infrastructure becomes a data source so powerful, so interwoven, that its scale can only be imagined as metaphor. The size and scope of such an archive, of such a rich mediated data experience, can support many projects. As such it can be interpreted via a variety of interfaces.

Cities offer the opportunity for unique types of data-gathering experiences via a variety of sources. My objective is to 'mediate' data into conceptual artifacts. From this perspective there are many unimagined threads of data and connections that describe our world, which can be explored through wireless mobile networks and within which we can create artistic interpretations.

Here is some information about two of the projects I now have in development, Robotica and Memory Mapping. You can view the works in progress by visiting the links below.

Robotica
http://www.stanza.co.uk/robotweb/index.html

Memory Mapping
http://www.stanza.co.uk/memoryweb/index.html

Robotica

Robotica is an installation using controlled autonomous players: robots that navigate their own space to make music and visuals. As they move around they trigger a database of sounds, which are played depending on their position in space. The movement of the robots also controls live dynamic data from the outside of the city to create the visuals. Live web feeds from around the city are projected into the installation space via a series of projectors. These images are live, real-time, continually updated 3D models which are controlled by the movement of the robot.

Image of installation
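The position-to-sound mapping Robotica relies on can be sketched as a simple grid lookup: quantise the tracked position to a floor cell, and let each cell index into the sound database. The cell size, sound file names and path below are invented for illustration; the real installation's tracking and database are not shown here.

```python
# A sketch of Robotica's position-dependent sound triggering: the floor
# map is divided into grid cells, and entering a new cell selects a sound
# from a (hypothetical) database.

CELL_SIZE = 0.5  # metres per grid cell (assumed)

# Invented sound database keyed by grid cell.
SOUND_DB = {
    (0, 0): "harbour_drone.wav",
    (1, 0): "traffic_hum.wav",
    (0, 1): "crowd_murmur.wav",
}

def cell_for(x, y):
    """Quantise a tracked robot position (metres) to a grid cell."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

def sounds_for_path(positions):
    """Return the sound triggered each time the robot enters a new cell."""
    triggered, last = [], None
    for x, y in positions:
        cell = cell_for(x, y)
        if cell != last and cell in SOUND_DB:
            triggered.append(SOUND_DB[cell])
        last = cell
    return triggered

path = [(0.1, 0.1), (0.3, 0.2), (0.6, 0.1), (0.2, 0.7)]
print(sounds_for_path(path))
```

The same lookup could just as easily return a webcam feed or 3D model per cell, which is essentially how the robot's movement would steer the projected visuals as well.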

Memory Mapping

It is possible to remember everything; it's just difficult. Taxi drivers use 'the Knowledge' to remember their way around the London streets. Birds in winter can find ninety percent of the twenty thousand nuts they bury, even in the snow. This is called memory mapping. The maps below trace journeys through Bristol. GPS can also be used to trace journeys through the urban tapestry. These journeys become a multi-layered thread, an archive, a memory of this data.

Images from Memory Mapping
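The GPS side of memory mapping, journeys accumulating into a layered archive, can be sketched as below. The coordinates are invented (roughly central Bristol), and the archive class is a stand-in for illustration, not the project's actual data model; distance between fixes uses the standard haversine formula.

```python
# A sketch of the memory-mapping idea: GPS fixes from walks through the
# city are kept as traces, and traces accumulate into a layered archive
# whose total extent can be measured.

import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a))

class JourneyArchive:
    """Accumulates journeys; each journey is a list of (lat, lon) fixes."""
    def __init__(self):
        self.journeys = []

    def add(self, fixes):
        self.journeys.append(list(fixes))

    def total_distance_m(self):
        """Sum the hop-by-hop distance over every archived journey."""
        return sum(haversine_m(a, b)
                   for j in self.journeys
                   for a, b in zip(j, j[1:]))

archive = JourneyArchive()
archive.add([(51.4545, -2.5879), (51.4550, -2.5870)])  # a short invented hop
print(round(archive.total_distance_m()))
```

Layering more journeys is just more `add` calls; drawing them over a street map would give the kind of multi-layered trace the images above show.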

(Posted 21 May 04)

The technology must work.

Interesting problems arise when starting projects that involve some sense of procedural design or building. If one is faced with the blank canvas, tech spec or whatever, the possibilities, parameters and variables are limitless. This is often a good thing, as there isn't anything in place to actually go wrong, in theory. Here in my ongoing series of experiments at Watershed in Bristol, I have decided to spend a week with the ultrasound rig, as well as gathering some assets for my other 'emergent city' projects that I have in development.

This ultrasound system was designed and built by Cliff Randel, and it has already been put in place and tested; it is configured to the room using the Bristol.11 network, or Watershed Open wireless network. The general idea involves using a Hewlett-Packard PDA running the Mobile Bristol client. If you then move around the room, your position is tracked via the ultrasound system attached to the ceiling. This sends data to the Elvin servers and, using an Elvin extra, this information can be used to create a visual response using Director. The system, I have been told, is in place and set up. My intention is to extend the system using all sorts of sensing equipment.
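The shape of that pipeline, an ultrasound fix published as a notification, with a subscriber turning matching events into a visual response, can be sketched with a generic publish/subscribe loop. The real Elvin client API is not shown here; the `Bus` class, the device name and the event fields are all stand-ins I have invented for illustration.

```python
# A generic stand-in for the ultrasound -> notification -> visual-response
# pipeline: subscribers register a predicate and a callback, and the bus
# delivers each event to every subscriber whose predicate matches.

class Bus:
    """Minimal in-process stand-in for a notification service like Elvin."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, predicate, callback):
        self.subscribers.append((predicate, callback))

    def notify(self, event):
        for predicate, callback in self.subscribers:
            if predicate(event):
                callback(event)

responses = []

def draw_response(event):
    # Stand-in for the Director visual: record which screen cell to light up.
    responses.append((int(event["x"]), int(event["y"])))

bus = Bus()
# Only react to position fixes from the tracked PDA (invented device name).
bus.subscribe(lambda e: e.get("device") == "pda-1", draw_response)

bus.notify({"device": "pda-1", "x": 2.7, "y": 4.1})
bus.notify({"device": "pda-2", "x": 0.0, "y": 0.0})  # filtered out
print(responses)
```

The content-based filtering is the useful part: extending the rig with other sensors just means publishing more event types and adding subscriptions, without touching the tracking code.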

Maybe it's just one of those things with technology: you can find the software has been deleted, the PDAs are without power, the power cables are in Manchester and the servers are down. Initial setbacks like this give technology a bad name. The conclusion is to be autonomous, not to rely on too many differing technologies, to run your own servers and, most of all, to back your work up. The other thing to remember is that the technology must actually work - or should it? Well yes, otherwise the idea remains just that: an idea, a concept, a series of drawings.

I have imagined a simulated space re-representing a data mesh of Bristol. I am making a robot performance, using 'cybots' in avoidance mode and path-following mode to make music and visual systems. So actually it's two performances, or experiments. The robots wander about, follow maps of Bristol and trigger sound and visual data. The first version, involving simple tracking in the space, was implemented on 14th May.

The whole robot performance can be monitored on the public CCTV and is broadcast to the internet using the QuickTime server. I have already used and tested this, using all twelve CCTV cameras to make an online document. This is Watershed TV, exploring this concept of taking the 'inside outside', because as well as being CCTV it's also 'PubliCITY'. It is in effect a TV station broadcast in real time.

I have also extended my CCITYV software, which acquires openly streamed images from around Bristol. Most are traffic cams, but it is interesting to see the landscape for what it is. These images are fed into a 3D morphing model, and the images are always changing and always evolving. It becomes a never-ending film online, extending in real time.
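The always-evolving quality comes from morphing between successive frames rather than cutting. A minimal sketch of that cross-fade, using tiny grayscale pixel lists in place of fetched traffic-cam images, looks like this; the frame data and step count are invented for illustration.

```python
# A sketch of the CCITYV morphing idea: successive webcam frames are
# cross-faded so the collage never sits still. Frames here are flat lists
# of grayscale pixel values rather than real JPEGs.

def crossfade(frame_a, frame_b, t):
    """Linear blend between two equal-sized frames, t in [0, 1]."""
    return [round(a * (1 - t) + b * t) for a, b in zip(frame_a, frame_b)]

def evolve(frames, steps_between=2):
    """Yield a continuous stream of blended frames from a frame sequence."""
    for a, b in zip(frames, frames[1:]):
        for step in range(steps_between):
            yield crossfade(a, b, step / steps_between)
    yield frames[-1]

frames = [[0, 0, 0, 0], [100, 100, 100, 100]]
print(list(evolve(frames)))
```

With a live source, `frames` would simply never end: each newly fetched image is appended and the blend keeps rolling, which is what makes the result a never-ending film.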

In addition I have been collecting some sound to feed into a database. This is sort of an extension to www.soundcities.com (a project all about found sound in city spaces). Next I want to make this a mobile experience, and I want to make a trial using the GPS feature of the HP PDA. When I visited the HP labs they were building new cameras into their PDA systems, and when I can get my hands on one of these I can upload and download live video and sound, so I will try out the outdoor version.

(Posted 12 March 04)

Experiments with technology at Watershed.

Technologies in place at the Watershed currently include: one PDA, D-Link wireless cameras, an ultrasound rig, a CCTV security set-up, server space, etc.

It’s a curious approach to offer artists technology and then let them come up with ideas relevant to the technology specification itself. This makes the brief technology-led, and is a more usual approach for 'design' rather than 'art' sensibilities. It's easy to just deploy the kit available, and this defines the project. In a metaphorical language, it's like being given a 'painting by numbers' set with the paint numbers missing. The provided parameters are likely to produce variously pretty pictures, though the results are largely predetermined. This is a common scenario with new technologies, where the artist's role becomes end user, in place to evolve some sort of product within quite controlled parameters. The thing that is curious is how artists can be usurped by technology in certain situations. There is a sudden locative gold rush on within media arts. As well as Futuresonic, ISEA 2004 and Viper, there are all of a sudden calls for works globally referencing this area. This is the topic for this year. But my question is directed to other artists, because the problem seems to be a conundrum: artists are now evolving products or services that are used on networks that charge the general public lots of money for their use. I mean, it's great to use 3G phones, for example, to make live video data feeds into a networked multi-user database. It's what the industry wants, i.e. we all end up using video phones at fifty pence per minute. So industry should be supporting the use of networks (so well done to Hewlett-Packard), otherwise we will all just set up our own shared wi-fi network and do it ourselves, which is after all the consumer approach.

There must be room in the brief for personal artistic process. Of course, the canvas can always be painted black... later. It doesn't really matter what technology one uses; it's the level of inquiry and experimentation within any system that seems to offer creative possibilities. In using any technology, it seems one can quite quickly imagine performative roles as well as installations, fixed-perspective experiences and locative approaches to the delivery of user experience. My current interest is in the wider picture of city experiences, which are being played out in real time. Instead of adopting narrative threads from other media, I am interested in the currency that exists already in the city space. This sort of experience of multi-node and multi-threaded spaces demands a refined gathering of data, a sensitive accumulation, which can then lead to some kind of modelling and visualisation [audible and visual (mis)representation].

My approach so far has been to avoid getting fixed into a 'technical toolkit' approach too early. The nature of emergent spaces and evolving city experiences implies that the software itself should be emergent and adaptable to a 'multiplicity' environment. If you pin a system down and adopt a toolkit, then there is only limited room for experimentation. In fact, under this model the artist is the end user of a defined technology and, as such, works will only be curated to fit into the system: hardly any experimentation or research at all. So I am interested in a system which is modular and systematically capable of building on results from the previous project. One way to do this is to contact artists who are interested in the technology and the arts, and commission them to make a small website that addresses the key areas and questions of the brief, i.e. what do artists want from a network? Why go wireless? What sensors would you use and how would you use them? And, fundamentally, the artists should build the system and the network.

The agenda which incorporates the concept of a box of technology for any system and multiple uses looks initially like a good idea. I have been investigating various boxes, tech systems, and visualisation softwares. A generic system would be useful: a box to plug sensors into, that can gather data and read it easily. This sort of kit is very useful to artists (or is it?). One should bear in mind there is nothing new in this. At the Birmingham NEC, 11-12 Feb 2004, a two-day conference featured dozens of real companies who work with sensors, building management systems and visualisation software. This is not an emerging market; it is a highly developed, and still developing, market with existing products. Very sophisticated control and visualisation systems already exist for any industrial, civic and safety measurement and management situation you can imagine. Hard-end engineering companies generally overlook the creative potential of their products, as do most artists. Often they can all be experimented with to allow creative uses. There are plenty of interesting tools already on the market, used by real industries, which can be adapted for creative use by artists, given a little lateral application.

In Birmingham, I came across Redlion, whose gear I especially liked. They had the G3 data logger, which allows you to record process values to CompactFlash for later access in CSV format via the integrated web server, making access to historical data a breeze. I found the Bluetooth Wisnet system from Expert Monitoring (a wireless sensor deployment kit) of great interest (although maybe not cheap), as it was ready to use. Among the wireless sensors I also checked out Willow's 'motes' and Millenial. So I now have stacks of information and contacts in the areas of moulding, displays, sensors, wireless, and visualisation software. (more to come later)
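That CSV-over-web-server workflow is attractive precisely because logged process values become trivial to reuse. As a sketch, here is how one might summarise a column from the kind of CSV such a logger could produce; the column names, timestamps and values are all invented, and a real Redlion log layout will differ.

```python
# A sketch of reusing data-logger output: parse a logger-style CSV and
# summarise one numeric column. The sample log below is invented.

import csv
import io

SAMPLE_LOG = """timestamp,temperature_c,pressure_bar
2004-02-11 09:00,18.5,1.01
2004-02-11 09:05,19.1,1.02
2004-02-11 09:10,18.8,0.99
"""

def summarise(csv_text, column):
    """Return (min, max, mean) for one numeric column of a logger CSV."""
    values = [float(row[column])
              for row in csv.DictReader(io.StringIO(csv_text))]
    return min(values), max(values), sum(values) / len(values)

lo, hi, mean = summarise(SAMPLE_LOG, "temperature_c")
print(lo, hi, round(mean, 2))
```

For an artwork, the summary step would be swapped for whatever mapping drives the visualisation, but the point stands: once the data is plain CSV, anything downstream can consume it.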

I went to the open day of Arch OS in Plymouth on 27 Feb 2004. Quite illuminating, and possibly a good source of knowledge in terms of the Watershed project in general. They seem to have 'hi-jacked' the Invensys building management system; that is, they are adapting systems that are already in place within the context of a building and its real-world function. Arch OS has incorporated various other OS systems, one for vision tracking, one for sounds, and is now pushing further to allow the creative use of the data through artistic and modifiable outcomes. It was amusing, though, that there were no light switches and that the automatic doors kept closing on people exiting. The intelligent building clearly defines its own intelligence. A more specific observation on Arch OS is that it operates as a giant surveillance system, a 'panopticon'. This suggestion is refuted by Mike Philips, the director, the argument being that the level of control and responsibility given allows freedom within the system. It's a good argument until something changes: the passwords change, the building goes private, and every single piece of data is used for something that it wasn't intended for. Some things change for the better, but sometimes they don't; one thing is for sure: things change.

This is something that interests me. The things that change, the flow, the data that describes our experience of the city as space.

I am making a series of projects throughout this residency and bursary, which at the moment I describe as research. I have already made two pieces of work relating to my research and work developments. The first, CCITYV Bristol, captures all the images from local webcams and re-collages and re-contextualises them into a software system online. It transposes the 'outside' to the 'inside'. This work is still online at www.stanza.co.uk/bristol

The second experimental work (untitled at the moment) also used CCTV: cameras feeding the Watershed building were broadcast out over the internet for a 12-hour period. This is the transposition of 'inside' to 'outside'. In further experiments I will attempt to manipulate the live footage using software and broadcast this live encapsulation to roving PDAs. {Tech note: the iPAQs won't play RealMedia, so we are stuck with Windows Media Player. So in order to stream, one would need a server with Windows Media 8, unless a workaround is found.}

However, although the making of some works is relevant, the understanding of emerging data networks seems to be the focus of my interest at the moment. My own research is leading me to Berkeley in the USA to look at 'motes' and wireless sensor networks. I have a number of artistic paradigms imagined, and the next step is to describe them and build them to evaluate their merits: data from all sides, in systems that can be mediated by all, with varying visualisations possible, communicated over the internet and represented on display systems. (sketches to follow)

Stanza has recently received a Dreamtime Award from NESTA - as part of this he is writing a journal documenting his research and development during this period. To read these entries, visit www.stanza.co.uk/weblog/index.html
