hardware and software development for artistic applications

Welcome to my website about some of the work I have done; it is by no means a complete overview.

For artistic projects and publications, please take a look at my site at marijebaalman.eu

projects

  • SENSEFACTORY AFTER MOHOLY-NAGY

    A WORK BY ERIK ADIGARD, SOFIAN AUDRY, FM EINHEIT, DIETMAR LUPFER, CHRIS SALTER, ALEX SCHWEDER AND SISSEL TOLAAS

    SENSEFACTORY is a spectacular large-scale performative installation combining architecture, sound, smell, light and AI technology into an immersive multi-sensorial experience.

    SenseFactory makes use of the Sense/Stage MiniBees for its sensing capabilities. I have developed custom firmware to interface with the distance sensors used to detect when visitors pass from one space to another.
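
    As a rough illustration of the detection logic on the receiving end, here is a minimal SuperCollider sketch; the OSC address '/minibee/data', the message layout and the threshold value are assumptions made for the example, not the exact SenseFactory setup.

      (
      // hypothetical threshold below which the doorway counts as blocked
      ~threshold = 0.3;
      ~blocked = false;

      // assumed message layout: [address, nodeID, distance]
      OSCdef(\passage, { |msg|
          var distance = msg[2];
          if((distance < ~threshold) and: { ~blocked.not }, {
              ~blocked = true;
              "visitor passing".postln;  // hook in whatever the piece needs here
          });
          if(distance >= ~threshold, { ~blocked = false });
      }, '/minibee/data');
      )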

  • Embracing the temporal deterioration of digital audio

    Most of today’s media output, be it audio or video, is produced and stored in the digital domain. Although digital data are adorned by the myth of lossless transmission and migration, everyday experience proves the existence of degradation and, ultimately, data loss in various forms. This pertains to the physical nature of storage media and playback devices as well as to media formats and software in the context of their technological infrastructure. The project strives to elaborate on the causes, mechanisms and effects of such deterioration, specifically in the context of digital audio. Since degradation cannot be avoided on principle, our general aim is to unearth latent degrees of freedom available to artistic practice in the omnipresence of decay.

    Rotting Sounds is an artistic research project funded by the Austrian Science Fund (FWF) through its PEEK programme (Programme for Arts-based Research). The project (AR 445-G24) is scheduled to run from May 2018 until the end of 2021.

  • In Prime Mover, the Estonian/Norwegian choreographer duo Külli Roosna and Kenneth Flak deal with movement as both cause and effect, freedom as well as inevitability. A Prime Mover is an impulse to do something. It is also the giant engine that we are all a part of, a machinery consisting of environment, technology, history, sensations and dreams. We try to find our way in all of this. We look, we listen, we touch, we think. Occasionally we arrive at an illusion of certainty, but then reality shifts and we are lost. Again.

    The audience is invited into a breathing, living universe with no real beginning and no real end, where play and seriousness merge, and where thought and movement are one and the same.

    For this project, I am mentoring Kenneth Flak in his transition to SuperCollider for implementing the interactive sound of the work. He uses the Sense/Stage MiniBee platform as the main sensing technology in the piece.

  • The book “Just a Question of Mapping - Ins and outs of composing with realtime data” aims to give an overview of the mapping process and the common techniques used in it. Each method is described in a way that gives an artist a guideline for when to use it and how to implement it in the environment they work in. Example implementations of these methods will be provided separately from the book, in a repository to which readers can contribute.
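
    To give a taste of the material, here is a minimal SuperCollider sketch of one of the most common techniques: scaling an incoming, normalised sensor value exponentially onto a synthesis parameter, with some smoothing. The OSC address and the value ranges are placeholders.

      (
      // a simple synth whose frequency is driven by the sensor data
      ~synth = { |freq = 200| SinOsc.ar(Lag.kr(freq, 0.1), 0, 0.1) ! 2 }.play;

      // map a normalised sensor value (0..1) exponentially onto 100..2000 Hz
      OSCdef(\map, { |msg|
          var value = msg[1].clip(0, 1);
          ~synth.set(\freq, value.linexp(0, 1, 100, 2000));
      }, '/sensor/value');
      )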

  • A Low Cost, Open Source Wireless Sensor Infrastructure for Live Performance and Interactive, Real-Time Environments

  • Intimate Earthquake Archive

    Tactile earthquake vests and compositions derived from seismic recordings. Interactive radio broadcast system. Sandstone earth core samples and wooden scaffolding.

  • After initial work by Dan Stowell on the audio engine backend, I have worked on the port of SuperCollider to the Bela platform.

    This work included fine-tuning the audio engine backend, writing the various UGens that give access to the Bela's dedicated analog and digital inputs and outputs, and documenting their usage.
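
    From the language side this looks roughly as follows; the sketch assumes scsynth running on a Bela with the Bela-specific UGens available, and a sensor (for example a potentiometer) connected to analog input 0.

      (
      {
          var pot = AnalogIn.ar(0);               // read analog input 0 (roughly 0..1)
          var freq = pot.linexp(0, 1, 200, 2000); // map it onto a frequency range
          SinOsc.ar(freq, 0, 0.2) ! 2
      }.play;
      )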

  • In the first week of April 2014, I am organising a Modality Work Group residency at STEIM. Modality aims to make it easier to use HID, MIDI and other controllers in SuperCollider by providing a common interface.
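
    As a rough sketch of what that common interface looks like: a controller is opened via a description file, and actions are attached to its elements regardless of whether they speak HID, MIDI or OSC underneath. The controller name and description key below are placeholders and depend on which description files ship with Modality.

      (
      // open a controller via its Modality description (name is a placeholder)
      m = MKtl(\nk2, "korg-nanokontrol2");

      // attach an action to the first slider, independent of the underlying protocol
      m.elAt(\sl, 0).action = { |el|
          "slider 0: %".format(el.value).postln;
      };
      )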

  • In late 2012, I took it upon myself to fix the HID implementation of SuperCollider, which had been broken on OS X since version 3.5. Crowdfunding and a generous donation from BEK provided the funds for me to devote time to this.
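
    In current SuperCollider, cross-platform HID access looks roughly like this; the vendor and product IDs below are placeholders for whatever device is actually attached.

      (
      // list the HID devices that are currently attached
      HID.findAvailable;
      HID.postAvailable;

      // react to any X-axis usage of an open device (e.g. a joystick axis)
      HIDFunc.usage({ |value|
          "x axis: %".format(value).postln;
      }, \X);

      // open a specific device by vendor and product ID (placeholder values)
      // ~myDevice = HID.open(1103, 53251);
      )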

  • Melodized Pillow Hammock

    Developed for Popkalab in collaboration with Bless.

    The hammock has embedded sensors that detect the presence of a person and the swinging motion, and it plays music based on this. I have developed the wireless sensing system, with custom pressure sensors and accelerometers, as well as the software infrastructure in SuperCollider for mapping the sensor data to sound.
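
    Conceptually, the mapping combines a presence gate derived from the pressure sensors with a continuous mapping of the swinging motion; the sketch below is a much simplified stand-in with assumed OSC addresses, value ranges and a placeholder sound source.

      (
      ~player = { |amp = 0, rate = 1|
          // placeholder sound; the actual piece plays composed musical material
          var sig = LFTri.ar(220 * rate, 0, 0.2) + SinOsc.ar(330 * rate, 0, 0.1);
          sig * Lag.kr(amp, 2) ! 2
      }.play;

      // the pressure sensor gates the sound: only play when someone lies down
      OSCdef(\pressure, { |msg|
          ~player.set(\amp, if(msg[1] > 0.5, { 0.3 }, { 0 }));
      }, '/hammock/pressure');

      // the accelerometer value nudges the playback rate with the swinging
      OSCdef(\swing, { |msg|
          ~player.set(\rate, msg[1].linlin(0, 1, 0.8, 1.2));
      }, '/hammock/accel');
      )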

  • XOSC is an OSC-controllable router for OpenSoundControl (OSC) messages.

    Available at: XOSC on GitHub
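
    XOSC itself has its own control interface (see the repository for the details). As a general illustration of what routing OSC messages means, the SuperCollider sketch below simply forwards one incoming address pattern, unchanged, to another host; the target address and port are placeholders.

      (
      // destination to forward to (placeholder host and port)
      ~target = NetAddr("192.168.1.20", 57120);

      // forward incoming /sensor/value messages unchanged to the target
      OSCdef(\route, { |msg|
          ~target.sendMsg(*msg);
      }, '/sensor/value');
      )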

  • MotionTrackOSC is a small program that performs motion tracking on either a camera image or a video file.

    The tracked motion is output to a client via OpenSoundControl.

    Available at: MotionTrackOSC on GitHub

  • VideoRecOSC is a small program that records video from either a camera or a video file, and is controlled via OSC. Updates of the frame count and the current filename are sent out via OSC.

    This tool was created to make it possible to synchronise video and sensor data recordings.

    VideoPlayOSC plays the video back, setting each frame based on an incoming OSC message.

    Available at: VideoRecOSC on GitHub