Evolving Reflections
VIS 160 A&B
Rotem Gotlieb

Project Title: Evolving Reflections

YouTube Link: https://www.youtube.com/watch?v=89uzTU0OR8Q

Final Code: https://editor.p5js.org/rgotlieb/sketches/cXQP31Sd9

Project Overview

"Evolving Reflections" is an interactive digital art installation designed for a gallery setting, where visitors can effortlessly engage in the creation of a dynamic live painting through their own movements. The installation invites users to experience how their bodily motions are translated into fluid, evolving visual art without the need for any direct input or interaction with controls.

The User Experience

As visitors approach the installation, their movements are captured by a camera and processed in real time using a pose-detection model. The system interprets these movements and transforms them into brush strokes of varying sizes and colors that blend and evolve on the digital canvas.

Process Overview

A progress update on the steps I took:

Conceptualization and Research

The foundation of "Evolving Reflections" began with a deep dive into the world of interactive digital art. I researched various approaches to creating immersive, user-driven experiences and explored different technologies that could translate physical movement into visual art. This research phase was crucial in identifying the right tools—Teachable Machine for pose detection and p5.js for real-time rendering—which would form the backbone of the project.

Prototype Development

With the core concept and technologies in place, I moved on to developing a basic prototype. This involved integrating the pose detection model with a simple rendering system that could visualize user movements as shapes on the screen. The initial focus was on ensuring that the system could reliably track and respond to movements in real-time. This prototype served as a proof of concept, demonstrating that the technology could effectively translate physical actions into digital art.
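The prototype code itself isn't reproduced here, but a minimal sketch along these lines, assuming a pose model exported from Teachable Machine (the model URL below is a placeholder, not the project's actual model) and the tmPose script loaded alongside p5.js, might look like this:

```javascript
// Minimal prototype: track the visitor with a Teachable Machine pose model and
// draw a circle at each detected keypoint. MODEL_URL is a placeholder; tmPose
// and p5.js are assumed to be loaded via <script> tags in index.html.
const MODEL_URL = 'https://teachablemachine.withgoogle.com/models/XXXX/';

let video;          // p5 webcam capture
let model;          // Teachable Machine pose model (wraps PoseNet)
let keypoints = []; // latest detected body keypoints

function setup() {
  createCanvas(windowWidth, windowHeight);
  video = createCapture(VIDEO);
  video.size(640, 480); // assume a 640x480 camera stream
  video.hide();         // draw our own visuals instead of the raw feed
  tmPose.load(MODEL_URL + 'model.json', MODEL_URL + 'metadata.json')
    .then((m) => { model = m; trackPose(); });
}

// Estimate the pose in its own loop so rendering stays smooth.
async function trackPose() {
  if (model && video.loadedmetadata) {
    const { pose } = await model.estimatePose(video.elt);
    if (pose) keypoints = pose.keypoints;
  }
  setTimeout(trackPose, 50); // roughly 20 pose updates per second
}

function draw() {
  background(15, 20); // low alpha leaves a soft trail
  noStroke();
  fill(255, 180);
  for (const kp of keypoints) {
    if (kp.score > 0.4) {
      // Map from the 640x480 video space onto the full canvas.
      const x = map(kp.position.x, 0, 640, 0, width);
      const y = map(kp.position.y, 0, 480, 0, height);
      ellipse(x, y, 24, 24);
    }
  }
}
```

Running pose estimation in its own timed loop, rather than awaiting it inside draw(), is one way to keep rendering smooth; the final sketch linked above may organize this differently.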

Visual Refinement and Algorithm Tuning

Once the basic functionality was established, the next step was refining the visual aspects of the installation. I experimented with color transitions, aiming for smooth and cohesive shifts across the color spectrum. The brush stroke effect was enhanced to mimic natural, fluid movements, creating a more organic and visually appealing experience. This phase involved fine-tuning algorithms to ensure that colors faded seamlessly into one another and that brush sizes adapted dynamically to the speed and nature of user movements.
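The exact tuning lives in the final sketch linked above; as an illustration, a simplified version of the two ideas described here (hue drifting slowly around the color wheel, and brush size driven by how fast a tracked point is moving) could look like the following, with the mouse standing in for a pose keypoint:

```javascript
// Simplified sketch of the color-transition and speed-adaptive brush ideas.
// paintStroke() would normally receive a pose keypoint; the mouse stands in here.
let hue = 0;
let prevX = null;
let prevY = null;

function setup() {
  createCanvas(windowWidth, windowHeight);
  colorMode(HSB, 360, 100, 100, 100);
  background(0);
}

function paintStroke(x, y) {
  if (prevX === null) { prevX = x; prevY = y; }

  // Faster movement -> larger, fainter marks; slow movement -> small, dense marks.
  const speed = dist(x, y, prevX, prevY);
  const size = map(speed, 0, 80, 6, 60, true);
  const alpha = map(speed, 0, 80, 90, 30, true);

  // The hue drifts slowly around the color wheel so colors fade into one another.
  hue = (hue + 0.5) % 360;

  noStroke();
  fill(hue, 80, 95, alpha);
  ellipse(x, y, size, size);

  prevX = x;
  prevY = y;
}

function draw() {
  paintStroke(mouseX, mouseY);
}
```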

User Experience Design and Final Optimization

The final phase of the project focused on optimizing the installation for a gallery environment, where ease of interaction and accessibility are paramount. I streamlined the user interface by removing any input controls, allowing the system to operate autonomously. The canvas was programmed to reset every 30 seconds, ensuring a continuous flow of new art for each user. This stage also included rigorous testing and adjustments to ensure the installation was both reliable and engaging, offering a smooth, immersive experience for all who interact with it.
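A minimal sketch of the gallery-mode behaviors described here (a full-window canvas with no controls and an automatic reset every 30 seconds) might be structured like this; it is a simplification of the final code, not a copy of it:

```javascript
// Full-window canvas that wipes itself every 30 seconds so each visitor
// starts with a fresh painting. No UI controls are created.
const RESET_INTERVAL = 30 * 1000; // milliseconds
let lastReset = 0;

function setup() {
  createCanvas(windowWidth, windowHeight);
  background(0);
  lastReset = millis();
}

function draw() {
  // ...pose-driven brush strokes are drawn here...

  // Clear the canvas once the interval has passed.
  if (millis() - lastReset > RESET_INTERVAL) {
    background(0);
    lastReset = millis();
  }
}

// Keep the canvas filling the window if the display changes in the gallery.
function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
  background(0);
}
```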

Overall Goal

The primary goal of this project is to create an interactive visual artwork that transforms body movements into a live, evolving painting. The artwork aims to be both aesthetically pleasing and technically sophisticated, using machine learning and creative coding to deliver a dynamic and immersive experience.

Specific Goals

1. Interactive Experience: Enable users to create art through their body movements.

2. Visual Appeal: Develop a visually captivating and fluid live painting.

3. Technical Integration: Seamlessly integrate machine learning models with creative coding frameworks to achieve real-time responsiveness.

WEEK 3

- Goals: Establish a solid foundation for the project by defining the scope, objectives, and required technologies.

- Actions: Finalize the project proposal, start researching relevant technologies, and train the initial pose model using Teachable Machine.

WEEK 4

- Goals: Create a basic prototype that integrates the trained pose model with p5.js to capture body movements and visualize them.

- Actions: Set up the HTML structure, write initial integration code, and debug any issues related to asynchronous functions and naming conflicts.

WEEK 5

- Goals: Enhance the visual aesthetics of the live painting, ensuring smooth transitions and responsiveness.

- Actions: Refine the pose model based on initial tests, develop more sophisticated visual effects, and design the overall aesthetic (color schemes, styles).

WEEK 6

- Goals: Gather feedback on the usability and experience of the interactive artwork to make necessary improvements.

- Actions: Conduct user testing, gather feedback, and make iterative improvements based on the feedback. Update project documentation and prepare for mid-project review.

WEEK 7

- Goals: Ensure the project is well-documented and ready for final presentation.

- Actions: Focus on fluidity and smoothness of visual transitions, implement additional features based on user feedback, and prepare comprehensive documentation for the final presentation. Submit the exhibition list.

WEEK 8

- Goals: Create a polished and functional project website that includes a summary, images, documentation, and progress so far.

- Actions: Conduct thorough testing and debugging, finalize the user interface and overall aesthetics, and submit the project website link.

WEEK 9

- Goals: Finalize all details and ensure readiness for the final presentation and exhibition.

- Actions: Conduct a final review of the project with peers and instructors, make last-minute adjustments, prepare PR materials (including a good image for PR), and practice the final presentation.

WEEK 10

- Goals: Successfully present the project and set up for the exhibition.

- Actions: Present the project in class, submit final project documentation and write-up, reflect on the project process and outcomes in a final journal entry, and set up the piece for exhibition in Kamil/Mandeville 114. Clean up after the exhibition.

Project Summary

1. Project Overview

- An interactive digital art installation designed for gallery environments.

- Utilizes motion-capture technology to create dynamic visual art based on user movements.

2. Technologies Used

- Google Teachable Machine: a pose model trained with Teachable Machine detects and responds to user poses and movements.

- p5.js: a JavaScript library used to create and render the visual effects on the canvas.

3. Interaction and User Experience

- Users interact with the artwork through their body movements, tracked in real time.

- Visual feedback is immediate, creating a responsive and engaging experience for gallery visitors.

4. Full-Screen Gallery Mode

- The canvas expands to fill the entire screen for a fully immersive experience.

- All UI elements are hidden, except for an "Exit" button, to minimize distractions.

5. Visual Effects

- The artwork features smooth color transitions and varied shapes based on user movements.

- A brush-stroke effect simulates natural painting techniques.

6. Canvas Management

- The canvas automatically clears every 30 seconds to refresh the visual experience.

- This auto-clear feature ensures continuous engagement and a clean display.

7. Custom Settings

- Users can adjust settings such as color mode, shape size, and opacity before entering gallery mode.

- These settings provide flexibility and allow the installation to be tailored to different environments.

8. Real-Time Detection

- The system detects key points of the user's body using a webcam, enabling real-time interaction.

- Different poses trigger specific visual effects, adding depth to the user experience.

9. Brush Simulation

- Brush strokes are generated using Perlin noise and varying stroke widths for a textured look (see the sketch after this list).

- This feature adds a layer of realism and artistic quality to the digital artwork.

10. Seamless Transitions

- The project smoothly transitions between gallery and standard modes, adjusting the canvas size and functionality accordingly.

- Reinitializing the webcam and canvas settings ensures a consistent interactive experience.
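Item 9 above mentions Perlin-noise brush strokes. The sketch below is my own simplified illustration of that idea, not the project's exact implementation: several slightly offset, noise-jittered lines of varying weight are drawn between the previous and current positions, which gives a bristly, hand-painted texture. The mouse stands in for a pose keypoint.

```javascript
// Simplified Perlin-noise brush: a few jittered "bristle" lines per stroke.
function brushStroke(x1, y1, x2, y2, strokeColor) {
  stroke(strokeColor);
  for (let i = 0; i < 6; i++) {
    // Each bristle gets its own noise offset, so the jitter is coherent over time.
    const offset = map(noise(i * 10 + frameCount * 0.02), 0, 1, -6, 6);
    strokeWeight(map(noise(i * 20 + frameCount * 0.03), 0, 1, 1, 5));
    line(x1 + offset, y1 + offset, x2 + offset, y2 + offset);
  }
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  background(255);
}

function draw() {
  // Paint while the mouse is pressed; the installation uses pose keypoints instead.
  if (mouseIsPressed) {
    brushStroke(pmouseX, pmouseY, mouseX, mouseY, color(30, 90, 160, 120));
  }
}
```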