I was commissioned to create a large-scale, visually striking multimedia installation at the Hungarian House of Music for György Cziffra’s piano performance of Franz Liszt’s “Grand Chromatic Galop.” The goal was to craft a multimedia experience that would pay worthy homage to Cziffra’s musical legacy, while employing innovative technological solutions to bring his art closer to the visitors.
This rare archival footage, recorded in France, served as the starting point:
At the initial planning stage, selecting the right format, size, and technology was crucial. Fortunately, I was able to collaborate closely with the exhibition’s design team from the outset, allowing for efficient brainstorming and swift decision-making.
Early on, we decided to present Cziffra’s outstanding performance and the accompanying animations on a large (4 m × 1.5 m) curved LED wall. Although we considered other options, such as projectors, their size and the complexity of alignment—particularly in relatively well-lit spaces—led us to discard that solution. Since the exhibition would be traveling, easy setup and maintenance were also key considerations.
From the beginning of the planning process, it became clear that the focus should be on the uniqueness of the hand movements and the technically exceptional nature of the piano performance. We aimed to express this through abstract generative 3D animations and data visualizations.
The core idea was that presenting the performance from multiple perspectives would sensitize the viewer and help uncover deeper layers and hidden connections between music and motion. Throughout, we also strove to ensure that the original archival recording played a worthy role and was integrated as an organic part of the animation.
Objectivity was essential; we sought to use scientifically credible, authentic methods. Alongside aesthetics and spectacle, authenticity and a sense of reverence for both the artist and the work were given equal importance.
Our aim was to provide a meaningful, exciting experience for audiences familiar with Cziffra’s oeuvre, as well as for casual visitors. We wanted to show how such a piano performance pushes the boundaries of human capability and how the artist’s personality and physical presence influence the act of playing the piano.
Initially, we planned to reconstruct the hand movements from the archival piano footage using artificial intelligence algorithms, then use the resulting motion capture data to generate the animations. However, due to the quality of the original footage, the rapid hand movements, camera shifts, and edits, this proved impossible.
Even when using a high-quality recording for testing, the data was not sufficiently accurate to base our development on it.
Here we show an early experiment using the Handsfree JavaScript library to digitize hand movements from video.
We had to choose a different approach. We decided to ask a skilled pianist to perform the piece as accurately as possible, so we could create a digital version of the piano piece. János Balázs—one of the most distinguished Hungarian interpreters of the Cziffra legacy—was happy to take on the task. Thus, the notion of a “piano hand-double” was born.
We used several recording devices:
Video camera (Nikon Z6): Captured the entire performance from above.
Hand motion digitizer (Leap Motion Controller 2): Precisely recorded the 3D movements of the hands.
3D camera (Kinect v2): Captured point cloud-based motion data.
MIDI-enabled piano (Roland FP-90X): Digitized the keystrokes and their dynamics.



Because we switched between various recording devices, János Balázs had to perform the piece multiple times. We needed to allow breaks between takes since Cziffra’s original performance was so intense that replicating it continuously became physically exhausting. For the hand-skeleton recordings, we slowed the original footage to 75%, ensuring more precise digitization and preventing strain on the pianist’s hands before his next concert.
During the recordings, it became evident that János Balázs’s talent is on par with Cziffra’s. Listening to the archival recording, he played it in perfect sync, without a score, live. Everyone watched in silent amazement at this extraordinary feat.
I worked with mathematician and visual programmer Beni Bakó on processing the recordings and creating the final animations. We used the Unity game engine, which offered significant technical and creative freedom. Unbound by traditional video editing software, we could expand and modify Unity’s editor to serve our own goals.
We wrote our code in C# and in shader languages, including HLSL compute shaders, and built effects with Unity’s VFX Graph. Where possible, we optimized code for the GPU to leverage parallel processing, so every experiment could run in real time and be tested live, keeping development flexible and dynamic.
By fusing data from two Leap Motion cameras—often incomplete on their own—we obtained a stable 3D hand-skeleton motion. We then synchronized data from all recording devices: the archival video, MIDI data, and the hand-skeleton data. The first test animation clearly showed that the concept worked as intended.
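The fusion step can be sketched roughly as follows. This is a minimal, illustrative Python sketch, not the project's actual Unity/C# code: the per-joint data layout and function name are assumptions. The idea is simply to average joint positions where both cameras tracked a joint, and fall back to whichever camera saw it when one stream has a gap.

```python
# Illustrative sketch: fuse two partial hand-skeleton streams.
# Each stream is a list of per-joint (x, y, z) tuples, with None
# where that camera lost tracking of the joint.

def fuse_joints(stream_a, stream_b):
    """Average where both cameras agree, fall back to whichever
    camera saw the joint, keep None if neither did."""
    fused = []
    for a, b in zip(stream_a, stream_b):
        if a is not None and b is not None:
            fused.append(tuple((x + y) / 2 for x, y in zip(a, b)))
        else:
            fused.append(a if a is not None else b)
    return fused

left = [(0, 0, 0), None, (2, 2, 2)]
right = [(1, 1, 1), (5, 5, 5), None]
skeleton = fuse_joints(left, right)
```

A real pipeline would also reconcile timestamps and coordinate spaces between the two devices before averaging; this sketch assumes the samples are already aligned.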
The first successful test animation in which the MIDI and the hand-skeleton movements are synchronized with the archival footage.
Water Surface Simulation
We envisioned placing a virtual water surface at the level of the piano keys, where each keystroke would create ripples—akin to the sound waves generated by the music. We developed a physically accurate fluid simulation, fine-tuning parameters to experiment with various densities and material behaviors. This depiction offers insight into an otherwise invisible world of energy propagation—translating sound into tangible waves.
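The ripple behavior can be approximated with a standard damped wave-equation update on a height field. This is a small illustrative sketch in Python (the actual simulation ran on the GPU with tuned material parameters; grid size, wave speed, and damping below are arbitrary assumptions):

```python
import numpy as np

# Minimal 2-D damped wave-equation ripple sketch. A keystroke
# displaces the surface at one point; the leapfrog update then
# propagates circular waves outward.

def step(h_prev, h_curr, c=0.3, damping=0.999):
    """One finite-difference time step of the damped wave equation.
    c is the (squared, grid-normalized) wave speed; must stay below
    0.5 for stability on this stencil."""
    lap = (np.roll(h_curr, 1, 0) + np.roll(h_curr, -1, 0) +
           np.roll(h_curr, 1, 1) + np.roll(h_curr, -1, 1) - 4 * h_curr)
    h_next = (2 * h_curr - h_prev + c * lap) * damping
    return h_curr, h_next

h_prev = np.zeros((64, 64))
h_curr = np.zeros((64, 64))
h_curr[32, 32] = 1.0                  # a single "keystroke" impulse
for _ in range(20):
    h_prev, h_curr = step(h_prev, h_curr)
```

Each MIDI note-on event would add such an impulse at the corresponding key's position, with the note velocity scaling the impulse height.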
Since we employed GPU computing, the animations could run and be tested in real-time.
Slitscan Technique
Developed during the analog film era, the slitscan technique exposes a narrow “strip” of film as either the subject, the camera, or the slit moves. This results in a time-based distortion, as different parts of the subject are recorded at different moments. We applied this method to the overhead camera view.
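Digitally, the slitscan effect reduces to assembling an output image one column at a time, each column taken from a successive frame. A minimal sketch, using tiny nested-list "frames" in place of real video frames:

```python
# Illustrative slitscan: copy one pixel column from each successive
# frame into the output, so horizontal position encodes time.

def slitscan(frames, slit_x):
    """frames: list of 2-D images (rows of pixels), all the same size.
    Returns an image whose column i is column slit_x of frame i."""
    height = len(frames[0])
    return [[frame[y][slit_x] for frame in frames] for y in range(height)]

# three 2x3 synthetic "frames", each filled with its frame index
frames = [[[t] * 3 for _ in range(2)] for t in range(3)]
out = slitscan(frames, slit_x=1)
```

Moving the slit across the frame over time, instead of holding it fixed, gives the sweeping distortions the technique is known for.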

Sound Spectrum Analysis
Through sound spectrum analysis, we decompose sound into its component frequencies. This lets us examine which pitches and amplitudes are present in the recording. Using this information, we manipulated the elevation data of a topographic map.
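The analysis step amounts to taking an FFT of short audio windows. A sketch with a synthetic tone (illustrative only; the project applied this to the actual recording, and the magnitudes would then drive the height-map elevation):

```python
import numpy as np

# Spectrum analysis sketch: an FFT of a short audio window reveals
# which frequencies are present and how strong they are.

sample_rate = 8000
t = np.arange(2048) / sample_rate
signal = np.sin(2 * np.pi * 440 * t)       # synthetic 440 Hz tone

spectrum = np.abs(np.fft.rfft(signal))     # magnitude per frequency bin
freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)
dominant = freqs[np.argmax(spectrum)]      # strongest frequency
```

The frequency resolution here is sample_rate / window_length, about 3.9 Hz, so the detected peak lands on the bin nearest 440 Hz.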
Three-Dimensional Time Sculptures
To highlight the uniqueness of each movement, we created three-dimensional “sculptures” that capture and record the hand’s trajectory over time. Using the lines traced by the outermost fingers, we formed continuous 3D surfaces. These shapes “freeze” the hand’s motion into a peculiar space-time sculpture.

Smoothed, interpolated, and textured surface using Catmull-Rom splines.
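The Catmull-Rom interpolation used for smoothing can be sketched in one function. This is a 1-D illustrative version in Python (the project worked with 3-D fingertip positions, applying the same formula per coordinate):

```python
# Uniform Catmull-Rom spline: interpolates between p1 and p2, with
# p0 and p3 shaping the tangents so consecutive segments join smoothly.

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate the segment at t in [0, 1]; t=0 gives p1, t=1 gives p2."""
    return 0.5 * ((2 * p1) +
                  (-p0 + p2) * t +
                  (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2 +
                  (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)
```

Because the curve passes exactly through its control points, the smoothed trajectory stays faithful to the recorded fingertip positions while filling the gaps between samples.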
Ribbon-Flow Visualization
After much trial and experimentation, we developed an animation where ribbons continuously flow from the hand-skeleton’s points.
The first test animation.
This scene offers an emotional interpretation of the performance rather than a strictly accurate sound visualization. We applied various forces to the ribbons’ motion, enriching and dramatizing the animation. For the final recordings, we used a MIDI controller to adjust these parameters—such as wind strength, turbulence frequency, amplitude, twisting, and camera motion—while watching and listening to the animation live. Our software recorded these adjustments, enabling us to use them as animation curves during rendering.
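The record-and-replay idea can be sketched as a small keyframe curve: live controller tweaks are stored as (time, value) pairs and later evaluated by linear interpolation during rendering. This Python sketch is illustrative; the class and parameter names are assumptions, and the actual tool was built inside Unity:

```python
import bisect

# Record live MIDI-controller adjustments as keyframes, then replay
# them as an animation curve via linear interpolation.

class AnimationCurve:
    def __init__(self):
        self.keys = []                      # (time, value), recorded in order

    def record(self, time, value):
        self.keys.append((time, value))

    def evaluate(self, time):
        times = [k[0] for k in self.keys]
        i = bisect.bisect_right(times, time)
        if i == 0:                          # before the first key
            return self.keys[0][1]
        if i == len(self.keys):             # after the last key
            return self.keys[-1][1]
        (t0, v0), (t1, v1) = self.keys[i - 1], self.keys[i]
        f = (time - t0) / (t1 - t0)
        return v0 + f * (v1 - v0)

wind = AnimationCurve()                     # e.g. wind-strength parameter
wind.record(0.0, 0.2)
wind.record(2.0, 1.0)
```

One such curve per parameter (wind strength, turbulence frequency, camera motion, and so on) is enough to reproduce a live "performance" of the animation deterministically at render time.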

Multiple forces now influence the ribbons’ motion.
Point Cloud Visualization
We also recorded point cloud animations with a 3D camera during the studio session. Although the camera recorded at only 30 fps and was optimized for room-scale captures rather than detailed hand positions, it was an intriguing experiment. Ultimately, the point cloud visuals were not included in the final animations.


We adapted Keijiro Takahashi’s open-source Unity plugins for these tests.
We gathered a variety of data for visual display:
Number of keystrokes
Keystroke density
Keystroke strength
Intensity of hand movement
Range of currently sounding notes
Chromatic scale
MIDI score
Calorie expenditure
Working with graphic designer Eszter Kiskovács, we developed the visual interface:


Chromatic Scale
We arranged the piano’s 88 notes along a spiral, with the lower tones on the outer edge and the higher tones toward the center. We positioned the octaves so that they align radially toward the spiral’s center. This representation offers a fresh and exciting way to reveal hidden harmonic relationships.
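The layout geometry can be sketched in a few lines. In this illustrative Python sketch (the radius step and orientation are arbitrary assumptions), the angle comes from the pitch class, so notes an octave apart share an angle and line up toward the center, while the radius shrinks with each octave:

```python
import math

# Spiral layout for the piano's 88 keys: pitch class sets the angle,
# octave number sets the radius (higher octaves move inward).

def spiral_position(note, r_outer=1.0, shrink=0.09):
    """note: 0 (lowest key, A0) .. 87 (highest key, C8).
    Returns an (x, y) position on the spiral."""
    angle = (note % 12) / 12 * 2 * math.pi     # pitch class -> angle
    radius = r_outer - shrink * (note // 12)   # octave -> radius
    return (radius * math.cos(angle), radius * math.sin(angle))
```

Because same-named notes share an angle, harmonic relationships such as octaves become visible as straight radial lines.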
MIDI Score
On the left side of the screen, we displayed a MIDI-based score that resembles a reversed punch-card machine printing out the visual notation. Numbers appearing next to the keys show how often a particular note is played.

Calorie Expenditure
Because we had to allow breaks between takes due to the physical demands of the performance, we decided to visualize this aspect as well. To estimate calorie usage, we used the simplified Metabolic Equivalent (MET) formula:
Calories burned per minute = (MET value × 3.5 × body weight in kg) ÷ 200
A piano performance has a MET value between 2.5 and 3.5, depending on intensity. Our algorithm estimated this value based on the intensity of hand movements, thus calculating current calorie expenditure. To make this more relatable, we compared it to other physical activities—estimating how many meters of running would equal the same amount of energy.
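As a small function, the estimate looks like this (the example weight and MET value below are illustrative assumptions, not measured values from the project):

```python
# Simplified MET formula: kcal burned per minute of activity.

def calories_per_minute(met, weight_kg):
    return met * 3.5 * weight_kg / 200

# e.g. an intense passage (MET 3.5) for a 70 kg pianist
kcal_min = calories_per_minute(3.5, 70)   # 4.2875 kcal per minute
```

In the installation, the MET value was not fixed but estimated continuously from the intensity of the hand movements, so the displayed expenditure tracked the performance.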

Originally, we planned a rich, montage-like presentation, allowing viewers to wander through different visual layers at their own pace. However, the display’s technical characteristics, especially its resolution constraints, made this approach impractical. In collaboration with the design team, we decided to create a single, focused, grand-scale animation—one that would, on its own, remain sufficiently varied and exciting to hold the viewer’s attention throughout the roughly three-minute duration. In the end, the data visualization, ribbon-flow, and time-sculpture animations were chosen for the final composition, as each could independently capture the piano performance’s dynamism and rhythm.
Through this project, we succeeded in creating a complex and innovative multimedia installation that pays tribute to György Cziffra’s piano artistry, highlighting its uniqueness and technical brilliance. By fusing various technologies and artistic approaches, we not only achieved a compelling visual representation of the music, but also uncovered deeper layers and relationships—illustrating the interplay of music, motion, and the limits of human performance.
We hope the installation provides both professionals and the general public with an exciting and inspiring experience, contributing to the worthy preservation and appreciation of Cziffra’s legacy.