Module 2 Activity Research

Weekly Activities - Project 2

Sandra Krcmar





Module 2

Project 2 examines how Arduino-based sensing and light output can be used to create simple, intuitive, and unobtrusive interactions within the home. By translating environmental signals—such as motion, proximity, or subtle shifts in activity—into meaningful notifications and ambient light responses, the project explores how technology can quietly support awareness without overwhelming the user.

This website presents a consolidated summary of Activities 1, 2, and 3. For a full breakdown—including Activity Insights, Code Reviews, Known/Unknown analyses, challenges, outcomes, and deeper reflections—please refer to the comprehensive PDF submission.

Activity 1

In this activity, I explored how ambient lighting could respond to presence or interaction, treating light not just as output but as atmosphere. Transitioning from single-color LEDs to an RGB LED shifted my understanding of circuitry, grounding, and multi-channel control, revealing new complexity around resistors, brightness, and color behavior.

My knowns centered on wanting multi-source light, interaction-driven behavior, and potential integration with ProtoPie. The unknowns included resistor requirements, brightness differences, and whether interaction should be manual (potentiometer) or sensor-triggered. As I prototyped, I focused less on UI screens and more on underlying behavior, reconnecting to the circuitry and code that shape interaction before interface.
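To illustrate the potentiometer-driven behavior considered above, here is a minimal sketch of the kind of hue-adjust logic involved. It assumes a common-cathode RGB LED with its legs on PWM pins 9, 10, and 11 and the potentiometer wiper on A0; the pin choices are illustrative rather than the exact wiring used in this activity.

const int POT_PIN = A0;   // potentiometer wiper
const int RED_PIN = 9;    // RGB LED legs on PWM-capable pins,
const int GREEN_PIN = 10; // each through its own current-limiting resistor
const int BLUE_PIN = 11;

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

void setColor(int r, int g, int b) {
  analogWrite(RED_PIN, r);   // for a common-anode LED, invert: 255 - value
  analogWrite(GREEN_PIN, g);
  analogWrite(BLUE_PIN, b);
}

void loop() {
  // Fold the 0-1023 pot reading into a 0-255 position on a simple color wheel.
  int hue = analogRead(POT_PIN) / 4;

  if (hue < 85) {                        // red -> green
    setColor(255 - hue * 3, hue * 3, 0);
  } else if (hue < 170) {                // green -> blue
    hue -= 85;
    setColor(0, 255 - hue * 3, hue * 3);
  } else {                               // blue -> red
    hue -= 170;
    setColor(hue * 3, 0, 255 - hue * 3);
  }
}

Turning the pot sweeps continuously through the color wheel, which is one way the manual (potentiometer) option could have been wired before deciding between manual and sensor-triggered interaction.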

Hands-on experimentation—discovering resistor bridging, correcting assumptions about grounding, debugging unresponsive hardware, and extending components with DuPont wires—became a core part of the learning. These moments clarified how current flows, how components communicate, and what happens when they fail. They also pushed me to explore scale by illuminating a small physical “room.”

Throughout the process, I considered behavior and emotional tone, influenced by WGSN personas. The goal shifted from simply changing color to exploring how light might somatically reflect presence, mood, or psychological need. Meaning emerged from function: an early step toward designing lighting that responds intuitively to human experience.

Phone-case guerrilla prototype exploring quick physical testing.
Tried Pepper’s Ghost–style 3D reflections: Triceratops, Brontosaurus, Jellyfish (all worked clearly); a potential ProtoPie insight reflection.
Arduino R4 board photographed with base kit components.
All components needed for this RGB LED hue-adjust exploration.
Full setup shown: breadboard, resistors, DuPont cables, potentiometer, RGB LED.
Additional image: powered-on RGB LED demonstrating active output.
Photo of the full Arduino kit.
Image showing extended male–female jumper wires used to position the potentiometer off the breadboard for flexible placement in a future enclosure.

Example of me turning the potentiometer and watching the RGB LED shift colors, showing the full setup — DuPont wires, resistors, RGB LED, and potentiometer all connected on the breadboard.

Click the video to preview.


Activities 2 & 3

In Activities 2 and 3, I shifted from ambient lighting exploration to solving a real problem: my mother often misses package deliveries when couriers don’t ring the doorbell. Instead of relying on commercial systems like Ring or Nest, I set out to build a lightweight DIY notification tool using a PIR motion sensor and an Arduino. This reframed the project from general experimentation to designing with purpose.

Understanding the PIR sensor was a major learning curve. I discovered that it detects motion through two infrared-sensitive slots that compare changes in radiation, and that it requires a brief calibration period during which it may output several false signals. Fine-tuning sensitivity, adjusting physical placement, and experimenting with the onboard knobs taught me to shape an experience—not just get the sensor “working.”
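As a sketch of that calibration behavior, the snippet below waits out a warm-up window before trusting the sensor and then reports only rising edges. The pin number and the 30-second warm-up are assumptions based on common HC-SR501-style modules, not the exact values used here.

const int PIR_PIN = 2;                  // hypothetical digital pin for the PIR output
const unsigned long WARMUP_MS = 30000;  // settle time typical of HC-SR501-style modules

void setup() {
  pinMode(PIR_PIN, INPUT);
  Serial.begin(9600);

  // Readings during the calibration window are unreliable, so wait it out.
  Serial.println("Calibrating PIR...");
  delay(WARMUP_MS);
  Serial.println("PIR ready.");
}

void loop() {
  // Report state changes only, not every HIGH reading while motion continues.
  static int lastState = LOW;
  int state = digitalRead(PIR_PIN);
  if (state == HIGH && lastState == LOW) {
    Serial.println("Motion detected");
  }
  lastState = state;
}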

Once motion was reliably detected, the challenge became transforming that physical signal into a meaningful digital response. Instead of building a visual UI in ProtoPie, I chose a functional IoT workflow: when the PIR triggers, ambient lighting turns on and the system sends an email notification. This required integrating Arduino with IFTTT through webhooks, creating an applet, configuring WiFi credentials in a secrets.h file, and ensuring motion events fired only when intended. Arduino became the bridge between real-world presence and a digital alert system.
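A minimal sketch of that bridge, assuming an Arduino UNO R4 WiFi (WiFiS3 library), a Webhooks event named motion_detected, and a secrets.h that defines SECRET_SSID, SECRET_PASS, and SECRET_IFTTT_KEY; all of those names are illustrative placeholders rather than the actual credentials or event name.

#include <WiFiS3.h>   // WiFi library for the UNO R4 WiFi
#include "secrets.h"  // assumed to define SECRET_SSID, SECRET_PASS, SECRET_IFTTT_KEY

const char* EVENT_NAME = "motion_detected";  // must match the Webhooks trigger

WiFiClient client;

void connectWiFi() {
  // Retry until the board joins the network defined in secrets.h.
  while (WiFi.status() != WL_CONNECTED) {
    WiFi.begin(SECRET_SSID, SECRET_PASS);
    delay(5000);
  }
}

// Fire the IFTTT Maker Webhooks event that triggers the email applet.
void sendEmailNotification() {
  if (client.connect("maker.ifttt.com", 80)) {
    client.print(String("GET /trigger/") + EVENT_NAME +
                 "/with/key/" + SECRET_IFTTT_KEY + " HTTP/1.1\r\n"
                 "Host: maker.ifttt.com\r\n"
                 "Connection: close\r\n\r\n");
    client.stop();
  }
}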

Much of the work involved balancing calibration, sensitivity, false triggers, code logic, and network reliability. At one point the sensor spammed the inbox with constant emails until I refined both the hardware settings and software cooldown logic. Each iteration reinforced how tightly physical, digital, and experiential layers depend on one another.
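The software side of that fix can be sketched as a millis()-based gate in loop(), building on the sendEmailNotification() helper above; the one-minute window and the pin assignments are illustrative assumptions.

const int PIR_PIN = 2;                    // hypothetical PIR output pin
const int LIGHT_PIN = 8;                  // hypothetical pin driving the ambient light
const unsigned long COOLDOWN_MS = 60000;  // at most one email per minute

unsigned long lastAlert = 0;

void loop() {
  if (digitalRead(PIR_PIN) == HIGH) {
    digitalWrite(LIGHT_PIN, HIGH);  // presence turns the ambient light on

    // Unsigned subtraction keeps this correct even when millis() wraps.
    unsigned long now = millis();
    if (now - lastAlert >= COOLDOWN_MS) {
      sendEmailNotification();  // helper from the sketch above
      lastAlert = now;
    }
  } else {
    digitalWrite(LIGHT_PIN, LOW);
  }
}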

By the end, I had a working IoT system that detects motion outside a structure, activates lighting inside, and sends a real-time email alert. Future steps include creating a small cardboard house to demonstrate spatial interaction and adding sound as a digital doorbell. This activity helped me see Arduino not just as a learning tool, but as a practical way to blend physical sensing and digital outcomes to solve real human problems.

Shows the PIR sensor wiring layout and component arrangement from the rear view.
Displays the PIR sensor face, lens, and how it sits within the prototype from the front view.
IFTTT setup: created an account and added If This → searched Webhooks → selected Receive a web request → entered an event name to generate the trigger. Added Then That → searched Email → selected Send me an email → entered the subject and body for the email. Clicked Create action → Continue to complete setup, adjusting the applet title as needed.
Screenshot of the automated email I receive when the IFTTT webhook event fires (motion detected → email sent).



Final Project 2 Design

While the final outcome may appear minimal, the underlying code and applet represented a significant technical challenge. Looking ahead to Project 3, I intend to expand this work by incorporating light and sound, and by exploring guerrilla prototyping to house the system physically. I’m proud of this milestone and motivated to see how the project evolves in its next iteration.

Final Prototype