You intrude on a territory of digital creatures. You have a choice: disturb them or disappear. Camouflaged, you can observe their movements and patterns; reveal yourself and they may swarm.

The concept for this project was to create digital wildlife in a physical location. The result was Region One, an installation that takes mobile advertising technology and repurposes it to create a playful, interactive habitat combining light, sound and empathy. When undisturbed, the creatures move in unison, generating patterns with light and sound. Intrusion into the region causes the assembly to flock toward digital signals emitted by the mobile devices visitors carry. An app allows visitors to cloak these signals and locate the creatures, in order to observe the technology in its undisturbed state.
The project utilises BLE (Bluetooth Low Energy) technology that has historically been used commercially and for targeted advertising. Through a combination of custom software and hardware, the installation exploits this to create a new kind of digital, territorial experience.

Region 1 runs on multiple servers built with JavaScript and Node.js. Each server is a pollenHub, containing a Raspberry Pi 2 Model B running Linux, a Bluetooth 4.0 Low Energy dongle, and a relay output board whose 1/0 (on/off) channels drive clusters of six crickets. Each server is connected to a central switch and computer via Ethernet or Wi-Fi.
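As a rough sketch of how a pollenHub might drive its relay board, the snippet below models one hub switching its six on/off channels. The `RelayBoard` and `PollenHub` names and the frame format are assumptions for illustration, not the project's actual code; real hardware access on the Raspberry Pi would go through a GPIO or USB-relay driver.

```javascript
// Hypothetical sketch: one pollenHub drives a cluster of six crickets
// through six on/off relay channels.
class RelayBoard {
  constructor(channelCount) {
    // 0 = off, 1 = on, one entry per relay channel
    this.channels = new Array(channelCount).fill(0);
  }
  set(channel, state) {
    this.channels[channel] = state ? 1 : 0;
  }
}

class PollenHub {
  constructor(id, clusterSize = 6) {
    this.id = id;
    this.board = new RelayBoard(clusterSize);
  }
  // Apply one frame of a light/sound pattern: an array of 0/1 values,
  // one per cricket in the cluster. Returns the resulting channel state.
  applyFrame(frame) {
    frame.forEach((state, channel) => this.board.set(channel, state));
    return this.board.channels.slice();
  }
}

const hub = new PollenHub('pollenHub-1');
hub.applyFrame([1, 0, 1, 0, 1, 0]); // switch alternate crickets on
```

A sequence of such frames, issued over time by the central computer, is one simple way the unison light-and-sound patterns could be expressed.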

Every visitor to Region 1 is essentially interacting with exactly the same technology present in most apps and websites that track our locations and offer updates and information based on our identities and web habits. In the same way that one visits a web address, one visits Region 1, though here it is the visitor's digital presence that stalks an array of hanging pixels, rather than a cursor or a thumb on an LCD. Via WebSockets, each interaction of every visitor is sent as a personal ID and logged on the central computer.
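The logging described above might look roughly like the following sketch, where a pollenHub packages a detected device for the central computer. The message shape, field names and `makeInteractionMessage` helper are hypothetical; the installation's real WebSocket protocol is not documented here.

```javascript
// Hypothetical sketch of the message a pollenHub could send to the
// central computer when it detects a visitor's BLE signal.
function makeInteractionMessage(hubId, deviceId, rssi) {
  return JSON.stringify({
    hub: hubId,   // which pollenHub saw the device
    id: deviceId, // the visitor's personal ID, derived from their device
    rssi: rssi,   // received signal strength, a rough proxy for distance
    at: Date.now(), // timestamp for the central log
  });
}

// In the real system a message like this would travel over a WebSocket,
// e.g. socket.send(makeInteractionMessage(...)), and the central
// computer would append each one to its visitor log.
```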

The same technology that most of us use daily to send messages across the internet, the WebSocket, is used in Region 1 to receive messages from your physical movement through space. This means that, without any direct interaction, the installation learns who you are. When no visitor is detected, the central computer issues sequences to the clusters of crickets, determining behaviour from past user interaction: mimicking those who may have left the grid, perhaps repeating their movement through the space.
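One way to read this mimicking behaviour is as a replay of logged visitor paths whenever the region is empty. The sketch below shows that idea with a hypothetical recorder; none of these names come from the project's code, and the real sequencing logic is almost certainly richer.

```javascript
// Hypothetical sketch: remember the paths visitors' signals traced
// through the hubs, and replay one when no visitor is present, so the
// crickets repeat an absent visitor's movement through the space.
class BehaviourMemory {
  constructor() {
    this.paths = []; // each path: an array of hub IDs, in visit order
  }
  record(path) {
    this.paths.push(path);
  }
  // Choose a remembered path to replay; here simply the most recent.
  replay() {
    return this.paths.length ? this.paths[this.paths.length - 1] : [];
  }
}

const memory = new BehaviourMemory();
memory.record(['hub-3', 'hub-1', 'hub-4']);
// With no visitors detected, the central computer could step through
// memory.replay() and issue a frame to each hub in turn.
```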

The iOS app was written in Swift and Objective-C. It uses the Core Location and Core Bluetooth frameworks to determine the user's current location and heading, to define geographic regions, and to monitor when users cross region boundaries, for wayfinding and physical interaction purposes.