Algorithmically-generated “island” maps using a random walk around a square and changing parameters for beach depth; created for an upcoming smell-based adventure game.
A messy version of the code is here; click on image for full-resolution
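For reference, here is a minimal sketch of the technique – not the actual generator linked above, which walks the square's perimeter; this approximates the same idea in polar form. A random walk perturbs the coastline radius at each angle, and the beach is a ring whose depth is a tunable parameter:

```cpp
#include <cmath>
#include <cstdlib>
#include <iostream>
#include <vector>

// Sketch of the idea: random-walk a radius around the map center to carve a
// jagged island outline, then mark a "beach" band of tunable depth.
// Grid size and beach depth are illustrative assumptions; the seam at
// 0/359 degrees is left unsmoothed.
int main() {
    const float PI_F = 3.14159265f;
    const int N = 64;            // grid size (assumption)
    const int beachDepth = 3;    // tunable beach parameter (assumption)
    std::vector<float> radius(360);
    float r = N * 0.35f;
    for (int a = 0; a < 360; ++a) {              // the random walk
        r += (std::rand() % 100 - 50) * 0.02f;
        if (r < N * 0.20f) r = N * 0.20f;        // clamp so the island
        if (r > N * 0.45f) r = N * 0.45f;        // stays on the map
        radius[a] = r;
    }
    for (int y = 0; y < N; ++y) {
        for (int x = 0; x < N; ++x) {
            float dx = x - N / 2.0f, dy = y - N / 2.0f;
            float d = std::sqrt(dx * dx + dy * dy);
            int a = (int)(std::atan2(dy, dx) * 180.0f / PI_F + 360) % 360;
            char c = '~';                            // water
            if (d < radius[a] - beachDepth) c = '#'; // land
            else if (d < radius[a]) c = '.';         // beach ring
            std::cout << c;
        }
        std::cout << '\n';
    }
}
```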
Two-channel vibration motor test – equal-power panning (used in audio) applied to two vibration motors and controlled by a joystick
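The idea in rough Arduino form (pin numbers are assumptions, not the actual test circuit): map the joystick position to a pan value and drive each motor's PWM with the cosine/sine gains used in equal-power audio panning, so the perceived total intensity stays roughly constant as the vibration moves between motors.

```cpp
// Minimal sketch of equal-power panning across two vibration motors.
// Pins are assumptions; motors are driven through transistors on PWM pins.
const int JOY_X   = A0;  // joystick X axis
const int MOTOR_L = 5;   // PWM pin, left motor driver
const int MOTOR_R = 6;   // PWM pin, right motor driver

void setup() {
  pinMode(MOTOR_L, OUTPUT);
  pinMode(MOTOR_R, OUTPUT);
}

void loop() {
  // Joystick 0..1023 -> pan 0.0 (full left) .. 1.0 (full right)
  float pan = analogRead(JOY_X) / 1023.0;
  // Equal-power law: gains follow a quarter sine/cosine cycle, so
  // gainL^2 + gainR^2 == 1 at every pan position.
  float gainL = cos(pan * HALF_PI);
  float gainR = sin(pan * HALF_PI);
  analogWrite(MOTOR_L, (int)(gainL * 255));
  analogWrite(MOTOR_R, (int)(gainR * 255));
  delay(10);
}
```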
Testing different papers for absorbing essential oils for a smell-based videogame (handmade paper worked the best, bristol board the worst)
Absorbing scents overnight in a plastic bag
Testing a smell-release valve: a very small servo motor opens and closes a round valve with a soft foam gasket, controlled by an Arduino
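A minimal sketch of the valve control using the standard Arduino Servo library – the pin and the open/closed angles are assumptions, to be calibrated against the valve's actual travel:

```cpp
#include <Servo.h>

// Sketch of the smell-release valve: a small servo rotates the valve
// between a sealed position and an open one.
Servo valve;
const int SERVO_PIN    = 9;
const int CLOSED_ANGLE = 10;   // pressed against the foam gasket
const int OPEN_ANGLE   = 80;   // lifted clear of the opening

void setup() {
  valve.attach(SERVO_PIN);
  valve.write(CLOSED_ANGLE);   // start sealed so no scent leaks
}

// Open the valve briefly to release a "puff" of scent, then reseal.
void releaseScent(int durationMs) {
  valve.write(OPEN_ANGLE);
  delay(durationMs);
  valve.write(CLOSED_ANGLE);
}

void loop() {
  releaseScent(500);   // demo: one half-second puff...
  delay(5000);         // ...every five seconds
}
```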
Halfway through a commission from Harvestworks’ Cultural Innovation Fund program, I am fully immersed in haptic technologies, digging into early virtual reality research, and thinking about ways of expanding traditional game space to include non-traditional senses, extending common haptic technologies (vibration motors), and removing the visual from gameplay while still creating beautiful, meaningful games.
I am particularly interested in activating senses that have been neglected in media: touch (texture, temperature), smell, and (to some extent) sound. Above is a screenshot of an early 4-channel vibration controller that uses a joystick to move vibration around a surface, much like sound can be mixed to appear in different locations. The circuit fits on top of the Arduino microcontroller for easy programming and connection.
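Extending the two-motor pan to four motors at the corners of a surface is, in this sketch at least, just the same equal-power law applied on both axes (pins and wiring are again assumptions):

```cpp
// Sketch: 2D equal-power panning across four vibration motors arranged
// at the corners of a square surface. Pin assignments are assumptions.
const int JOY_X = A0, JOY_Y = A1;
const int MOTOR[4] = {3, 5, 6, 9};   // PWM pins: NW, NE, SW, SE

void setup() {
  for (int i = 0; i < 4; ++i) pinMode(MOTOR[i], OUTPUT);
}

void loop() {
  float px = analogRead(JOY_X) / 1023.0;   // 0 = left, 1 = right
  float py = analogRead(JOY_Y) / 1023.0;   // 0 = top,  1 = bottom
  // Apply the equal-power law independently on each axis, then
  // multiply: each corner's gain is its X gain times its Y gain.
  float gx[2] = { cos(px * HALF_PI), sin(px * HALF_PI) };  // left, right
  float gy[2] = { cos(py * HALF_PI), sin(py * HALF_PI) };  // top, bottom
  analogWrite(MOTOR[0], (int)(gx[0] * gy[0] * 255));  // NW
  analogWrite(MOTOR[1], (int)(gx[1] * gy[0] * 255));  // NE
  analogWrite(MOTOR[2], (int)(gx[0] * gy[1] * 255));  // SW
  analogWrite(MOTOR[3], (int)(gx[1] * gy[1] * 255));  // SE
  delay(10);
}
```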
Research links (a bit of a mess – mostly what tabs are open at the moment):
A game where the player has to ride a bike using sonar – clicking into the microphone and receiving auditory feedback on where objects are in front of them – is approaching the play-testing phase. The game is inspired by a video I saw a long time ago about Daniel Kish, shown below; a toy sketch of the echo math follows.
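The core of the sonar feedback is simple acoustics: an object's distance sets the echo delay (out and back at the speed of sound) and its bearing sets the stereo balance of the return click. A toy calculation, not the game's actual audio code:

```cpp
#include <cmath>
#include <cstdio>

// Toy model of sonar feedback: given an object's distance and bearing,
// compute the echo delay and an equal-power stereo pan for the return
// click. Only the 343 m/s speed of sound is a "real" number here.
struct Echo { float delayMs, gainL, gainR; };

Echo echoFor(float distanceM, float bearingDeg) {
    const float SPEED_OF_SOUND = 343.0f;  // m/s in air
    const float PI_F = 3.14159265f;
    float delayMs = 2.0f * distanceM / SPEED_OF_SOUND * 1000.0f; // out and back
    // Map bearing -90..+90 degrees to pan 0..1, then equal-power gains.
    float pan = (bearingDeg + 90.0f) / 180.0f;
    return { delayMs,
             std::cos(pan * PI_F / 2.0f),
             std::sin(pan * PI_F / 2.0f) };
}

int main() {
    Echo e = echoFor(5.0f, -30.0f);   // object 5 m away, 30 degrees left
    std::printf("delay %.1f ms, L %.2f / R %.2f\n", e.delayMs, e.gainL, e.gainR);
}
```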
Up next: smell tests, optimizing combinations in a circle for combinatorial smell “playback”, and testing how various materials damp or transmit vibration.
Electrovibration for haptic interfaces (via: Tested.com)
I’ve added a few resources for the haptic and non-visual games research:
Please note that both of these are messy, working “sketchbooks” during the project’s development. As the tests are revised and finalized for release, the code and other files will be tidied.
When the PC game “Doom” came out in 1993, my dad brought a copy home from work. I don’t think he realized how violent and full of satanic imagery the game was (to my happy surprise) and I spent way too many hours playing the game. Not having much access to videogames as a kid, Doom and a few others (Super Mario Brothers for NES, Tetris on the original Game Boy, Sonic the Hedgehog, Mattel Football, Pitfall for Intellivision) remain my reference points when I think about games.
Early experiments toward a series of haptic and non-visual games have led to a realization: a single type of feedback in a game generally results in a flat, non-immersive experience. An animation of a gun firing with no gunshot sound is still believable and playable, but with the extra feedback of the gunshot sound the game becomes realistic, and we suspend disbelief and fall into the game. But another type of feedback has struck me as deeper, affecting not just the enjoyment of a game but actually driving the UX design and the clarity of the gamespace and its rules.
A case study can be found in Doom: in the game there are two types of doors, ones that you can open any time and ones that require keys (picked up elsewhere in the level). Upon reaching a door of either type, the player presses a key on the keyboard to try to open the door. If you have the key or the door doesn’t require one, it opens with a satisfying “swoosh” – like a gunshot, this sound adds to our immersive experience. However, when the player attempts to open a door without the proper key, nothing happens visually but the character emits a loud “OOF!” sound.
The “oof” is key: without it nothing would happen at all, and we would be left wondering “did I press the right key on the keyboard? Is this a door at all?” The above videos demonstrate this in the game – the first is normal gameplay, the second has the sound modified so there is no “oof”. While the difference comes across on video, actual gameplay is really required to fully appreciate the impact a single sound effect can have on an interaction.
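The design rule hiding in that “oof” is that every input should produce some feedback, even – especially – when the action fails. In pseudo-Doom terms (the names and structure here are mine, just to pin the idea down):

```cpp
#include <cstdio>

// Hypothetical sketch of the door interaction. The point is the failure
// branch: it still produces feedback, so the player always knows the
// keypress registered and that the object is in fact a door.
struct Door { bool locked; int keyId; bool open; };
struct Player {
    int keys = 0;
    bool hasKey(int id) const { return keys & (1 << id); }
};

void playSound(const char* name) { std::printf("[sound] %s\n", name); } // stand-in

void tryOpenDoor(Door& d, const Player& p) {
    if (!d.locked || p.hasKey(d.keyId)) {
        d.open = true;
        playSound("door_swoosh");   // success feedback
    } else {
        playSound("oof");           // failure feedback: drop this line and a
    }                               // locked door is indistinguishable from a
}                                   // wall or a missed keypress

int main() {
    Door redDoor{true, 2, false};
    Player p;                       // no keys yet
    tryOpenDoor(redDoor, p);        // prints "oof"
    p.keys |= (1 << 2);             // pick up the red key
    tryOpenDoor(redDoor, p);        // prints "door_swoosh"
}
```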
This question originally arose while developing the badge for the 2012 Games++ event, where the only feedback for navigating a dungeon space was a vibrating motor. With each tile-type (sand, stone, etc) as a unique vibration pattern, the question became: what about walls? Should there be a vibration when you can’t move in a particular direction? What does no vibration signify? We decided on no vibration, which gives the added possibility that someone might invert the level in their mind’s eye, seeing walls as pathways and other tiles as walls.
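A sketch of that tile-to-vibration mapping on an Arduino – the pin and pattern timings are illustrative assumptions, not the badge's actual values – where each tile type gets a distinct pulse pattern and walls are represented by deliberate silence:

```cpp
// Sketch of the badge's tile feedback: each tile type maps to a distinct
// vibration pattern on a single motor; walls intentionally map to silence.
const int MOTOR_PIN = 5;

enum Tile { WALL, SAND, STONE };

// Pulse the motor onMs/offMs, `count` times.
void buzz(int count, int onMs, int offMs) {
  for (int i = 0; i < count; ++i) {
    analogWrite(MOTOR_PIN, 200);
    delay(onMs);
    analogWrite(MOTOR_PIN, 0);
    delay(offMs);
  }
}

void feedbackFor(Tile t) {
  switch (t) {
    case SAND:  buzz(1, 300, 0); break;   // one long, soft-feeling pulse
    case STONE: buzz(3, 60, 60); break;   // three short taps
    case WALL:  /* no vibration */ break; // silence = "you didn't move"
  }
}

void setup() { pinMode(MOTOR_PIN, OUTPUT); }

void loop() {
  feedbackFor(SAND);  delay(1000);
  feedbackFor(STONE); delay(1000);
  feedbackFor(WALL);  delay(1000);   // nothing happens, by design
}
```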
As the starting point for developing haptic and otherwise non-visual games for a project in collaboration with Harvestworks’ “Cultural Innovation Fund” program, I’ve been poking around the Android documentation for developing apps with Processing for tablets and mobile devices. Since this is a relatively new feature for Processing developers (and since the Android syntax is a bit weird), the first step was to figure out how to get everything working.
In the spirit of sharing, I’ve created a GitHub repository for these experiments, along with a detailed set of instructions for getting started. There is also a GitHub repository for the project as a whole, which will be a bit of a mess over the next 3-4 months of development but will hopefully get cleaned up as the project nears completion.
These examples and projects are being developed for the Google Nexus 10 tablet – it seemed the beefiest and most flexible for the price. If you have problems with any of these examples on your device, please let me know so I can update them!
In the pipeline:
Above: a giant d-pad