project02:WS12026MSc2 G2Prototyping

From ETA


=='''Prototyping'''==
The prototyping phase tested how the project’s adaptive logic could move from computational design into material and environmental implementation. Rather than functioning as a final product, the prototype operated as a proof of concept through which geometric refinement, lighting translation, electronics, and user interaction could be evaluated as parts of a single responsive system.
'''Fabrication and assembly'''


[[File:This.gif |thumb | center |700px | Printing process]]


The physical prototype was developed as a material translation of the project’s broader interior strategy. During this phase, the design was refined to meet fabrication constraints, including segmentation, print orientation, tolerances, and assembly sequence. These adjustments were not treated as secondary technical compromises, but as part of the design process itself, allowing the geometric logic of the proposal to remain legible while becoming buildable. The prototype preserved the relationship between an ordered structural frame and a more differentiated internal geometry, showing how the Voronoi-based system could be fabricated as a series of components and reassembled into a coherent spatial fragment. In this sense, fabrication became a way of testing not only form, but also the feasibility of the project’s multi-scalar design logic.


[[File:WhatsApp Image 2026-04-16 at 17.34.59.jpg|thumb|center|500px | 3D printed parts]]
'''Electronics and lighting setup'''
[[File:WhatsApp Image 2026-04-16 at 17.43.55.jpg|thumb|center|600px | Electrical components assembly]]


To extend the prototype beyond static representation, the model was equipped with an embedded lighting system consisting of an ESP32 microcontroller, addressable LED strips, and an external power supply. This setup allowed the prototype to function as a responsive environmental device rather than as a purely visual object. Light was treated as an architectural layer integrated into the panel's geometry, capable of supporting spatial atmosphere, temporal variation, and behavioural response. The electronic assembly, therefore, played a central role in the project: it connected the spatial concept to a tangible output and enabled testing of how adaptive lighting could operate within a confined habitat.
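As a small illustration of the hardware side, addressable strips of the common WS2812 type expect colour data as one byte triplet per pixel, in GRB rather than RGB order. The sketch below shows that framing step in Python; the specific strip model and driver library used in the prototype are not stated in this section, so the GRB assumption is for illustration only.

```python
def frame_pixels(pixels):
    """Convert a list of (r, g, b) tuples into the GRB byte stream
    that a WS2812-type addressable strip expects, one triplet per pixel."""
    out = bytearray()
    for r, g, b in pixels:
        for channel in (g, r, b):  # note the GRB channel order
            if not 0 <= channel <= 255:
                raise ValueError("channel values must fit in one byte")
            out.append(channel)
    return bytes(out)
```

On the microcontroller itself this framing is normally handled by the LED driver library; the point of the sketch is only that each pixel's colour travels down the strip as a fixed-order byte triplet.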




|}


'''Data-to-light workflow'''


A central challenge of the prototyping phase was translating the predictive lighting model into values that could be implemented both digitally and physically. The AI workflow generated correlated colour temperature (CCT) and illuminance values that were meaningful from circadian and environmental perspectives, but could not be sent directly to the LED system. These outputs first had to be converted into RGB colour values and brightness levels. The resulting data were then refined to ensure stability and compatibility with the rendering environment and the physical hardware. In the digital model, values were cleaned, bounded, and remapped to produce a controlled luminous gradient. In the physical model, RGB values were additionally scaled by brightness to keep colour and intensity coupled. This workflow established a continuous chain from environmental input to luminous output, allowing the prototype to test not just light effects, but the operational logic of an adaptive architectural environment.
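The conversion chain described above can be sketched in code. The project's exact conversion is not documented in this section, so the sketch below uses a widely used curve-fit approximation of black-body colour (after Tanner Helland) for the CCT-to-RGB step, and a simple multiplicative scaling to keep colour and intensity coupled — an illustration of the workflow, not the project's implementation.

```python
import math

def cct_to_rgb(cct_kelvin):
    """Approximate an (r, g, b) triple (0-255) for a correlated colour
    temperature in kelvin, using a common curve-fit approximation."""
    t = max(1000, min(cct_kelvin, 40000)) / 100.0
    if t <= 66:
        r = 255
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255
    elif t <= 19:
        b = 0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda v: int(max(0, min(255, v)))  # bound values for the hardware
    return clamp(r), clamp(g), clamp(b)

def scale_by_brightness(rgb, brightness):
    """Couple colour and intensity by scaling each channel (brightness 0-1)."""
    k = max(0.0, min(1.0, brightness))
    return tuple(int(round(c * k)) for c in rgb)
```

For example, a warm 2700 K input yields a red-dominant triple, while 6500 K comes out close to neutral white; scaling by brightness then dims all three channels together, as described for the physical model.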




'''Interface and demonstration'''
[[File:DemonstrationHD.gif|thumb|center|700px | Interface and LED response]]


The final step of the workflow connected the prototype to a browser-based interface, transforming it from a scripted simulation into an interactive system. Through the interface, the user could select different presets, activities, and behavioural scenarios in order to test how the lighting environment would respond. The logic of the interface was not based on arbitrary colour selection, but on the interpretation of bodily state and intended use. Presets such as calm, focus, stress, and overload were combined with activities including sleep, eat, leisure, and work, while two response modes, mirror and compensate, defined whether the environment should reflect or counterbalance the detected condition. In this way, the interface served as the user-facing layer of the adaptive system, demonstrating how lighting could serve as a responsive mediator among physiological state, routine, and spatial atmosphere.
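The preset, activity, and mode names above come from the project; the target CCT and brightness values in the sketch below are illustrative placeholders, since the actual mappings are not given in this section. The sketch shows one way the mirror/compensate logic could be structured: mirror reproduces the lighting associated with the detected state, compensate selects a counterbalancing target, and the chosen activity caps brightness.

```python
# Hypothetical decision table for the browser-based interface.
PRESET_TARGETS = {           # (cct_kelvin, brightness) if the state is mirrored
    "calm":     (2700, 0.4),
    "focus":    (5000, 0.9),
    "stress":   (6500, 1.0),
    "overload": (6500, 1.0),
}

COMPENSATE_TARGETS = {       # counterbalancing response for the same state
    "calm":     (4000, 0.7),
    "focus":    (3500, 0.6),
    "stress":   (2700, 0.3),
    "overload": (2200, 0.2),
}

ACTIVITY_LIMITS = {          # activity caps brightness regardless of preset
    "sleep":   0.15,
    "eat":     0.8,
    "leisure": 0.6,
    "work":    1.0,
}

def lighting_for(preset, activity, mode):
    """Return a (cct_kelvin, brightness) target for one interface selection."""
    table = PRESET_TARGETS if mode == "mirror" else COMPENSATE_TARGETS
    cct, brightness = table[preset]
    return cct, min(brightness, ACTIVITY_LIMITS[activity])
```

Under these placeholder values, compensating for "stress" would warm and dim the light, while mirroring it would keep it cool and bright; selecting "sleep" would cap brightness in either mode.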





Revision as of 17:23, 23 April 2026



Group 2: Brendan Exterkate - Gabriel Marks - Giorgia Vercelloni - Maciej Sachse - Ruxandra Florut - Zuzanna Schleifer - Long Ki



=='''Gallery'''==