
Autonomous Abstraction

"It is believed that entropy (a measurement of the lack of order in a system) in the universe will steadily increase (the law of increasing entropy) and that entities with form eventually collapse. Despite that, it is a wonder that the sun was created and the planets were born, that life was formed and societies exist. However, the reason why the universe, life, nature, and society continue to be maintained in spite of this may be because order is continuously formed on its own through the shared phenomenon of self-organization in the midst of disorder. In other words, the universe and our own existence are a continuous order created by the same phenomenon." - teamLab

The Process

The following diagram outlines the process required to complete this project.

Diagram: Autonomous Abstraction process utilizing the hardware and software systems.

Content Creation

Content creation for this project involves producing pre-rendered motion graphics that work together with the real-time visuals responding to audience interaction.

Content Creation - Motion Graphics

  • Process:

    • Design dynamic animations for the light spots.

    • Render multiple layers of animation so they can be recombined dynamically during real-time interaction (see the compositing sketch after the software list below).

  • Software:

    • Adobe After Effects: Traditional software for motion graphics creation.

    • Cinema 4D: For 3D motion graphics and abstract animations.
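As a rough illustration of the layered-rendering idea above, the sketch below crossfades between two hypothetical pre-rendered layers (a calm base layer and a more energetic one) according to an interaction-intensity value. The frames, names, and the intensity source are placeholders for illustration, not assets or code from the actual installation.

```python
import numpy as np

def blend_layers(calm_frame: np.ndarray, excited_frame: np.ndarray,
                 intensity: float) -> np.ndarray:
    """Crossfade between a calm and an energetic pre-rendered layer.

    intensity: 0.0 (no visitors nearby) to 1.0 (strong interaction).
    Both frames are H x W x 3 float arrays with values in 0..1.
    """
    intensity = float(np.clip(intensity, 0.0, 1.0))
    return (1.0 - intensity) * calm_frame + intensity * excited_frame

# Example: 1080p placeholder frames mixed at 40% interaction intensity.
calm = np.zeros((1080, 1920, 3), dtype=np.float32)    # dark, slow layer
excited = np.ones((1080, 1920, 3), dtype=np.float32)  # bright, fast layer
output = blend_layers(calm, excited, intensity=0.4)
```

In practice the same weighting would run per light spot rather than per full frame, so different regions of the wall can respond independently.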

Guests visiting Autonomous Abstraction

Autonomous Abstraction. teamLab.

Content Creation - Interactive Logic Design

  • Process:

    • Design dynamic visuals that respond to real-time inputs (e.g., visitors' movements on the floor).

    • Integrate motion sensors with content to track audience interactions.

    • Change light spot colors or rhythms when a touch is detected.

    • Add randomness to responses so repeated interactions stay engaging (a minimal sketch of this logic follows the software list below).

  • Software:

    • TouchDesigner: For creating and managing real-time visuals and sensor integration.

    • Notch: For interactive content with stunning real-time effects.

    • Unity/Unreal Engine: For real-time 3D environments and physics-based interactivity.

    • Max/MSP: For custom interactive behaviors and sensor-based triggers.
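Below is a minimal sketch of the interaction rule described in the process list, written as plain Python rather than as a TouchDesigner or Max/MSP patch: a light spot shifts its hue and pulse rate when a visitor comes within a touch radius, with a small random offset so repeated touches never respond in exactly the same way. All names here (LightSpot, on_visitor_position, touch_radius) are illustrative assumptions, not part of any real SDK.

```python
import math
import random
from dataclasses import dataclass

@dataclass
class LightSpot:
    x: float            # position on the floor, in metres
    y: float
    hue: float = 0.0    # 0..1 position on the colour wheel
    pulse_hz: float = 0.5

def on_visitor_position(spot: LightSpot, vx: float, vy: float,
                        touch_radius: float = 0.6) -> None:
    """Shift the spot's colour and rhythm when a visitor comes close."""
    if math.hypot(vx - spot.x, vy - spot.y) < touch_radius:
        # Jump to a new hue and speed up the pulse, with randomness so
        # repeated touches never produce exactly the same response.
        spot.hue = (spot.hue + 0.3 + random.uniform(-0.05, 0.05)) % 1.0
        spot.pulse_hz = min(4.0, spot.pulse_hz * random.uniform(1.5, 2.0))
    else:
        # Relax slowly back towards the idle rhythm.
        spot.pulse_hz = max(0.5, spot.pulse_hz * 0.99)

# Example tick: a visitor reported at (2.1, 3.4) by the sensor system.
spot = LightSpot(x=2.0, y=3.5)
on_visitor_position(spot, 2.1, 3.4)
print(spot.hue, spot.pulse_hz)
```

The same rule would typically run inside the real-time engine's per-frame callback, with visitor positions supplied by the motion sensors described in the Hardware section.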

Guests visiting Autonomous Abstraction

Autonomous Abstraction. teamLab.

Hardware

To create immersive visual effects, the hardware setup combines a projection-mapping system with interactive sensors.

In the projection-mapping setup, each projector covers an area of approximately 8 meters in width and 5 meters in height.
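For planning purposes, that per-projector coverage translates into a simple projector count per wall. The sketch below assumes the 8 m by 5 m figure quoted above and a hypothetical wall size, and it ignores the extra overlap needed for edge blending.

```python
import math

def projectors_needed(wall_width_m: float, wall_height_m: float,
                      cover_w_m: float = 8.0, cover_h_m: float = 5.0) -> int:
    """Estimate the projector count for one wall, ignoring edge-blend overlap."""
    cols = math.ceil(wall_width_m / cover_w_m)
    rows = math.ceil(wall_height_m / cover_h_m)
    return cols * rows

# Example: a 24 m wide, 5 m tall wall needs a single row of 3 projectors.
print(projectors_needed(24.0, 5.0))  # -> 3
```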


Projectors:

  • Specifications: 4K resolution, 20,000 to 30,000 lumens to ensure high brightness.


Interactive Motion Sensors:

  • Quantity: 2 to 4 per projector (to cover all areas accessible to audience interaction).

  • Type: Infrared or depth sensors like Microsoft Kinect or LiDAR-based sensors.

  • Purpose: To track visitor movements and provide real-time data to the system.


Kinect for Windows

Microsoft Kinect is often used in interactive projects for motion capture.
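As an illustration of what the sensors feed into the system, the sketch below turns a single depth frame into visitor positions by thresholding and taking blob centroids with SciPy. The frame here is synthetic; real Kinect or LiDAR data would arrive through the vendor SDK, and the threshold value is an assumption for the example.

```python
import numpy as np
from scipy import ndimage

def visitor_positions(depth_frame: np.ndarray,
                      person_mm: float = 2400.0) -> list:
    """Return (row, col) centroids of regions closer than `person_mm`.

    depth_frame: H x W array of distances in millimetres, e.g. from a
    downward-facing depth sensor mounted above the floor.
    """
    foreground = depth_frame < person_mm
    labels, count = ndimage.label(foreground)
    centroids = ndimage.center_of_mass(foreground, labels,
                                       list(range(1, count + 1)))
    return [tuple(c) for c in centroids]

# Synthetic frame: empty floor at 3 m, one "visitor" patch at 1.7 m.
frame = np.full((240, 320), 3000.0)
frame[100:140, 150:180] = 1700.0
print(visitor_positions(frame))  # one centroid near (119.5, 164.5)
```

The pixel centroids would then be mapped into floor coordinates and passed to the interactive logic described in the Content Creation section.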

System

To manage content rendering, synchronization, and real-time interactivity:

  • Media Servers:

    • Quantity: approximately 1-2 servers per 3 walls.

    • Specifications: High-performance GPUs, 64 GB+ RAM, SSD storage.

    • Purpose: Real-time content rendering and management of multiple projection channels (a simple frame-sync sketch follows this list).
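One common way to keep several media servers rendering in step is to broadcast a shared frame clock over the local network. The sketch below is a generic UDP-multicast version of that idea, not the sync protocol of any particular media-server product; the multicast address, port, and message format are assumptions for illustration.

```python
import json
import socket
import time

SYNC_ADDR = ("239.0.0.1", 5005)   # multicast group, chosen arbitrarily
FPS = 60

def run_sync_master() -> None:
    """Broadcast a monotonically increasing frame number at the target FPS.

    Each media server joins the same multicast group and renders the last
    frame number it received, keeping all projection channels aligned.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    start = time.monotonic()
    frame = 0
    while True:
        msg = json.dumps({"frame": frame, "t": time.monotonic() - start})
        sock.sendto(msg.encode("utf-8"), SYNC_ADDR)
        frame += 1
        time.sleep(1.0 / FPS)

if __name__ == "__main__":
    run_sync_master()
```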

Budget

Budget tables: Autonomous Abstraction cost detail and total.

Ready to transform your ideas into a standout experience?
Let’s talk!

