Info: BioRhythm is a biofeedback installation whose visualization and sonification are driven by real-time photoplethysmograph (PPG) physiological instrumentation (Biopac Systems, Inc.). The PPG is a noninvasive biomedical signal related to the pulsatile volume of blood in tissue. Time- and frequency-domain characteristics of the PPG signal are used to generate visual and audio content that aims to express rhythmicity and resonance within the human body. Parallel processing maximizes real-time feedback and interaction: physiological signal processing, visualization rendering, and 32-channel spatial sonification each run on a separate laptop, communicating over a wireless network.

The installation is designed for users to explore the idea of BioRhythm through audiovisual interaction based on variations in a physiological signal. It also aims to offer users and audiences an alternative, artistically deciphered reading of the medical instrumentation through visual and aural embodiment. Transforming an objective (medical) monitoring signal into a subjective (artistic) representation of bodily condition might serve as a reflection on the epistemological grounding of biomedical training with regard to its objectivism, reductionism, and materialism.

BioRhythm's visualization was realized using real-time computer graphics techniques. The biological nature of this project called for organic form and movement in visualizing the incoming PPG data signals. This text focuses on the graphical techniques used in the visualization and on the mapping of the incoming data signals to the visual form and composition. Please note that the downloadable application below is a modified version of the one used in the installation: it allows a user to manually vary parameters that affect the form's physical, spatial, temporal, and material properties. The last five sliders in the graphical user interface control the lighting in the scene.

To achieve an organic form, a GLSL vertex shader deforms a plane mesh into a sphere. The "Base Radius" property sets the sphere's radius. Three-dimensional Perlin noise is used to deform the sphere's surface, producing smooth distortion of its mesh, and the "Noisescale" property scales (multiplies) the amount by which the noise displaces the mesh. A sharpness parameter controls the frequency of the noise: a high value yields high-frequency noise that distorts the mesh to a greater degree, while a small value yields low-frequency noise that, because of its smoothness, produces less overall distortion. The displacement of the vertices is directly proportional to the "Displacement" property: higher values yield higher noise amplitudes and therefore greater vertex displacement on the spherical surface. In addition, the shader lets the noise vary over time, creating movement that can be fast or slow depending on the Speed and Timer values; the overall speed is incremented by the timer value, giving both fine and coarse control over the motion. All the parameters mentioned above are fed into the vertex shader, where each vertex's position is first transformed onto the spherical surface and then displaced again using the noise and its corresponding multiplier.
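The two-stage vertex transform described above can be sketched on the CPU as follows. This is an illustrative C++ reconstruction, not the installation's actual shader: a (u, v) plane vertex is mapped onto a sphere of the given base radius and then displaced along its normal by time-varying noise. A cheap hash-style function stands in for true 3D Perlin noise, and all parameter names are assumptions.

```cpp
#include <cassert>
#include <cmath>

const float PI = 3.14159265358979f;

struct Vec3 { float x, y, z; };

// Cheap deterministic pseudo-noise in [0, 1); a stand-in for Perlin noise.
float noise3(float x, float y, float z) {
    float s = std::sin(x * 12.9898f + y * 78.233f + z * 37.719f) * 43758.5453f;
    return s - std::floor(s);
}

// Map plane coords (u, v) in [0,1]^2 onto a sphere, then displace along
// the normal by noise scaled by "displacement" and "noiseScale".
Vec3 transformVertex(float u, float v, float baseRadius,
                     float noiseScale, float sharpness,
                     float displacement, float time) {
    float theta = u * 2.0f * PI;   // longitude
    float phi   = v * PI;          // latitude
    Vec3 n = { std::sin(phi) * std::cos(theta),
               std::sin(phi) * std::sin(theta),
               std::cos(phi) };    // unit normal on the sphere
    // "sharpness" sets the noise sampling frequency; "time" animates it.
    float k = noise3(n.x * sharpness + time,
                     n.y * sharpness + time,
                     n.z * sharpness + time);
    float r = baseRadius + displacement * noiseScale * k;
    return { n.x * r, n.y * r, n.z * r };
}
```

With zero displacement the vertex lands exactly on the base sphere; raising the displacement or noise-scale values pushes vertices outward by the noise amount, matching the behavior of the sliders described above.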

Furthermore, material properties of the form are controlled with the sliders in the middle of the GUI; these include shininess (Shin) along with ambient, diffuse, specular, and emission values. These values are used in the GLSL fragment (pixel) shader to calculate the resulting look, or materiality, of the form. The last set of sliders affects the lights in the scene: the fragment shader computes the brightness and color of each fragment by summing the contributions of the five lights.
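The five-light fragment calculation can be sketched as a standard Phong sum accumulated over all lights plus the material's emission term. This is a hedged CPU-side approximation of what such a shader computes; the struct fields mirror the slider names above, but the exact formula in the installation's shader is assumed, not quoted.

```cpp
#include <algorithm>
#include <array>
#include <cassert>
#include <cmath>

struct V3 {
    float x, y, z;
    V3 operator+(V3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    V3 operator*(float s) const { return {x * s, y * s, z * s}; }
};
float dot(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
V3 normalize(V3 v) { float l = std::sqrt(dot(v, v)); return {v.x / l, v.y / l, v.z / l}; }
V3 mul(V3 a, V3 b) { return {a.x * b.x, a.y * b.y, a.z * b.z}; } // component-wise

struct Light    { V3 dir, ambient, diffuse, specular; };
struct Material { V3 ambient, diffuse, specular, emission; float shininess; };

// Phong shading: emission plus ambient/diffuse/specular terms summed
// over the five lights in the scene.
V3 shadeFragment(V3 normal, V3 viewDir, const Material& m,
                 const std::array<Light, 5>& lights) {
    V3 color = m.emission;
    for (const Light& L : lights) {
        V3 l = normalize(L.dir);
        float nDotL = std::max(0.0f, dot(normal, l));
        // Reflect the light direction about the normal for the specular term.
        V3 r = normal * (2.0f * dot(normal, l)) + l * -1.0f;
        float rDotV = std::max(0.0f, dot(normalize(r), normalize(viewDir)));
        color = color
              + mul(m.ambient, L.ambient)
              + mul(m.diffuse, L.diffuse) * nDotL
              + mul(m.specular, L.specular) * std::pow(rDotV, m.shininess);
    }
    return color;
}
```

Because the ambient, diffuse, specular, emission, and shininess values all enter this sum directly, each material slider in the GUI has an immediate, visible effect on the rendered surface.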

Lastly, in the installation, PPG data signals are sent over the network to the BioRhythm application to drive the parameters that control its physical form and the lights in the scene. All incoming signals are low-pass filtered to prevent sudden, frantic changes in the scene, though fast transitions and motions are still achievable. The raw heart-rate signal is mapped to the sphere's base radius. One frequency-domain feature of the raw heart-rate signal is mapped to the noise scale, controlling the Perlin-noise distortion multiplier. The incoming interval data are averaged over time, and values that deviate from that average control the amplitude of the noise displacement applied to the mesh. Other frequency-domain features control the diffuse property of their corresponding lights.
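The smoothing and deviation mapping described above can be sketched with a one-pole low-pass filter and a slowly updated running mean. This is a minimal illustration under assumed coefficient values, not the installation's actual filter design.

```cpp
#include <cassert>
#include <cmath>

// Each incoming PPG-derived sample is low-pass filtered; a much slower
// running average provides a baseline, and the deviation from that
// baseline drives the noise-displacement amplitude.
struct SignalMapper {
    float smoothed = 0.0f;   // low-pass-filtered signal
    float average  = 0.0f;   // long-term running mean
    float alpha    = 0.2f;   // filter coefficient (assumed; higher = faster)
    float beta     = 0.01f;  // averaging rate (assumed; much slower)

    // Feed one raw sample; returns the deviation used as displacement.
    float update(float raw) {
        smoothed += alpha * (raw - smoothed);     // one-pole low-pass
        average  += beta * (smoothed - average);  // slow baseline
        return std::fabs(smoothed - average);     // deviation drives amplitude
    }
};
```

A steady signal thus settles toward its own baseline and produces little displacement, while abrupt physiological changes, smoothed but not erased by the filter, momentarily push the deviation (and the mesh distortion) upward.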

GLSL shaders were crucial in creating this organic effect and motion; without them, the installation could not run in real time. A future iteration of this project will implement stereo rendering to provide a sense of depth well suited to the visualization's organic, human nature. The visualization also creates interesting forms and patterns, and future research will explore exporting these forms for 3D printing. Future iterations will use additional fragment shaders, such as chromatic dispersion, sketch, blueprint, stripe, ambient occlusion, depth of field, and refraction, to highlight BioRhythm's organic nature and surrealism.

Researchers: Yuan-Yi Fan (Bio-Sensing), Ryan McGee (Sonification), Reza Ali (Visualization)
Output: Installation & Custom Software (C++, openFrameworks)
Year: 2010