Intro to Physical Computing
- 1. Getting friendly with Arduino and Physical Computing
- Sep. 03. 2024
- 2. Digital Input and Output with an Arduino
- Sep. 10. 2024
Using the code above, if I push the button the red LED turns on,
and if I do not push the button the yellow LED turns on.
It became much brighter!
The touch sensor controls each LED's brightness.
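This is roughly what the circuit does, written as a minimal Arduino sketch. The pin numbers, the pull-down wiring of the button (pressed = HIGH), and the use of PWM pins are my assumptions, not the exact circuit from class:
const int buttonPin = 2;    // pushbutton (assumed wired with a pull-down resistor)
const int redLedPin = 3;    // PWM-capable pin for the red LED
const int yellowLedPin = 5; // PWM-capable pin for the yellow LED
const int touchPin = A0;    // analog touch/force sensor

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(redLedPin, OUTPUT);
  pinMode(yellowLedPin, OUTPUT);
}

void loop() {
  // read the touch sensor (0-1023) and scale it to a PWM brightness (0-255)
  int sensorValue = analogRead(touchPin);
  int brightness = map(sensorValue, 0, 1023, 0, 255);

  if (digitalRead(buttonPin) == HIGH) {
    // button pressed: red LED on, at the sensor-controlled brightness
    analogWrite(redLedPin, brightness);
    analogWrite(yellowLedPin, 0);
  } else {
    // button not pressed: yellow LED on instead
    analogWrite(yellowLedPin, brightness);
    analogWrite(redLedPin, 0);
  }
}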
- Sep. 17. 2024
- Lab: Tone Output Using An Arduino
---Play Tones
-> This code did not work!!
-> I checked the sensor range using Serial.println(sensorReading); the range was around 0~1000.
-> After fixing the code for that range, I could play tones with the touch sensor.
Thank you Chloe!!
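The fixed code isn't pasted here, but the key change was scaling the sensor's measured range to an audible pitch range with map(). A minimal sketch of that idea, assuming the sensor on A0 and a speaker on pin 8 (the pins and the 100~1000 Hz pitch range are placeholders):
const int speakerPin = 8; // piezo/speaker pin (assumed)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensorReading = analogRead(A0);
  // Serial.println(sensorReading); // used this to find the ~0-1000 range

  if (sensorReading > 10) { // only play while the sensor is actually touched
    // scale the measured 0-1000 range to a pitch range
    int frequency = map(sensorReading, 0, 1000, 100, 1000);
    tone(speakerPin, frequency, 10);
  }
  delay(10);
}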
---Play music (pitches.h)
---Making a musical instrument
A0,A1,A2 sensor range : 0~900
- Sep. 17. 2024
- Lab: Servo Motor Control with an Arduino
- Oct. 01. 2024
Tried to figure out why mine was not working.
It was a soldering issue in the H-bridge motor driver, so Sky helped me fix the soldering.
It started to work!
- 3. Midterm Process (Jiyou, Jenn, and Amelia)
After testing the heart rate sensor several times, we found the readings ranged from about 80 to 120.
In the shop, there are two kinds of pulse sensors: black (unknown brand) and red (SparkFun).
The next step will be to connect the sensor to 4 servo motors.
What we should be concerned about is:
- How many amps we need for the servo motors (rough estimate below)
- Whether (and how) to divide the sensor values into three heart-rate bands (Low/Mid/High)
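A rough answer to the first question (my estimate, not from our notes): a small hobby servo draws on the order of 100~250 mA while moving and can spike toward 0.5~1 A when stalled, so four servos could momentarily pull anywhere from about 1 A to a few amps in total. A dedicated 5V supply rated around 2~3 A, rather than the Arduino's 5V pin, is the usual safe choice.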
/*
  Heart beat plotting!
  By: Nathan Seidle @ SparkFun Electronics
  Date: October 20th, 2016
  https://github.com/sparkfun/MAX30105_Breakout

  Shows the user's heart beat on Arduino's serial plotter

  Instructions:
  1) Load code onto Redboard
  2) Attach sensor to your finger with a rubber band (see below)
  3) Open Tools->'Serial Plotter'
  4) Make sure the drop down is set to 115200 baud
  5) Checkout the blips!
  6) Feel the pulse on your neck and watch it mimic the blips

  It is best to attach the sensor to your finger using a rubber band or other tightening
  device. Humans are generally bad at applying constant pressure to a thing. When you
  press your finger against the sensor it varies enough to cause the blood in your
  finger to flow differently which causes the sensor readings to go wonky.

  Hardware Connections (Breakoutboard to Arduino):
  -5V = 5V (3.3V is allowed)
  -GND = GND
  -SDA = A4 (or SDA)
  -SCL = A5 (or SCL)
  -INT = Not connected

  The MAX30105 Breakout can handle 5V or 3.3V I2C logic. We recommend powering the board with 5V
  but it will also run at 3.3V.
*/

#include <Wire.h>
#include "MAX30105.h"

MAX30105 particleSensor;

void setup()
{
  Serial.begin(115200);
  Serial.println("Initializing...");

  // Initialize sensor
  if (!particleSensor.begin(Wire, I2C_SPEED_FAST)) //Use default I2C port, 400kHz speed
  {
    Serial.println("MAX30105 was not found. Please check wiring/power. ");
    while (1);
  }

  //Setup to sense a nice looking saw tooth on the plotter
  byte ledBrightness = 0x1F; //Options: 0=Off to 255=50mA
  byte sampleAverage = 8; //Options: 1, 2, 4, 8, 16, 32
  byte ledMode = 3; //Options: 1 = Red only, 2 = Red + IR, 3 = Red + IR + Green
  int sampleRate = 100; //Options: 50, 100, 200, 400, 800, 1000, 1600, 3200
  int pulseWidth = 411; //Options: 69, 118, 215, 411
  int adcRange = 4096; //Options: 2048, 4096, 8192, 16384

  particleSensor.setup(ledBrightness, sampleAverage, ledMode, sampleRate, pulseWidth, adcRange); //Configure sensor with these settings

  //Arduino plotter auto-scales annoyingly. To get around this, pre-populate
  //the plotter with 500 of an average reading from the sensor

  //Take an average of IR readings at power up
  const byte avgAmount = 64;
  long baseValue = 0;
  for (byte x = 0 ; x < avgAmount ; x++)
  {
    baseValue += particleSensor.getIR(); //Read the IR value
  }
  baseValue /= avgAmount;

  //Pre-populate the plotter so that the Y scale is close to IR values
  for (int x = 0 ; x < 500 ; x++)
    Serial.println(baseValue);
}

void loop()
{
  Serial.println(particleSensor.getIR()); //Send raw data to plotter
}
These screenshots are of the model of the installation, showing where the sensors and outputs will be installed.
- Concept & Background & Sketches
- Oct. 15-22. 2024
Theme Selection (Unexpected Fear)
We decided to create an unpredictable kind of fear by incorporating aspects of traditional Asian ghosts into the Western Halloween setting.
To us as foreigners, the American subway is a dark, damp place that always makes us tense and scared. We placed typical elements of Asian horror, such as a virgin ghost, a snake, and talismans, in the subway setting, with a manually operated effect that makes the virgin ghost's eyes move. In the center, we placed an unknown object with a straw-shoe texture, inside which we positioned a sensor to create an unsettling feeling for people.
Material Selection (Heart Rate Sensor, Servo Motors)
Halloween evokes images of children collecting candy and the phrase "trick or treat!" We wanted to share candy and fun with those viewing our Pcom project, considering how to incorporate Pcom elements. Simply measuring heart rate through a sensor and distributing candy seemed boring, so we decided to add a game-like element where doors automatically open and close based on the measured heart rate, encouraging people to control their heart rate.
We chose to use three servo motors to classify heart rate levels into High, Mid, and Low, with corresponding rewards for each level.
Stage Sketch (Servo Motor Angles, Candy Baskets, Input)
For the Halloween atmosphere, we designed a small subway-shaped stage with layered forms to provide an immersive experience for the subjects. To create a floating ghost effect, we hung elements from the ceiling and attached a separate device to move only the eyes.
Functionally, the most important considerations were:
- Hiding places for servo motors and breadboards
- Positioning for the subject's hands
- Sensor location
- Candy dispensing area
The most challenging part was connecting the candy dispensing area with the servo motors and concealing the connection. We utilized elements like benches, trash cans, and ticket machines, designing them to rotate and reveal hidden compartments containing candy for the subjects to retrieve. The input was installed in a location where people's hands could easily reach and view the entire stage, ensuring our intentions were fully reflected.
- Fabricating Process
1. Set Design Sketch and Modeling
- Before building the set, we started with simple hand sketches and 3D modeling to establish the scale of the set.
- Our goal was to rescale the large space of a subway station into a more manageable set size while finding the most effective layout that could contain all necessary components.
- By using a 1:1 scale model, we aimed to minimize potential sizing errors during the actual set construction.
2. Creating the Subway Set with Foam Board
- Referring to the model, we used foam board to construct the main structure of the set.
- After creating the images needed for each surface, we printed them and attached them to the foam board.
- To capture the essence of a subway station, we used actual images of subway elements.
- By layering features like pillars and ceiling protrusions, we aimed to give the set a sense of depth and space.
3. 3D Printing Subway Elements
- To visually link the output of each sensor to the subway environment, we used a 3D printer to create objects commonly found in subway stations, such as benches, trash bins, and ticket vending machines.
4. Building the Sensor House
- We needed a sensor house large enough for people to put their hands in.
- We cut string with a straw-like texture and attached it to the outer surface of a wireframe house.
- Inside the house, we added slippery-textured toys to create a sense of discomfort when people reached inside.
5. Installing the Circuit on the Set
- After creating the circuit, we attached it underneath the set.
- The heartbeat sensor was placed on top of the set, and the servo motors were installed below.
- After installation, we tested whether the servo motors worked properly in response to the sensor input.
6. Attaching Motors and Candy Dispenser to the Set
- We attached the 3D-printed subway elements to the motors.
- Below the motors, we built and installed a candy dispenser in the form of a box.
- When the servo motor rotated 90 degrees, it revealed an opening where people could take candy from the dispenser.
7. Attaching the Sensor and Making a Finger Placement Guide with Clay
- We attached the heartbeat sensor to the top of the set.
- Since people might have difficulty finding the exact sensor location when placing their hands inside the sensor house, we thought it would be helpful to make a guide.
- Using clay, we molded a guide to help people place their fingers in the correct position.
8. Installing the Ghost and Sensor House on the Set and Adding Final Details
- Finally, we installed all the fabricated elements onto the subway set.
- To enhance the atmosphere, we added details like ghosts and talismans to the background.
- We also made the ghost’s eyes move manually to add a more creepy effect.
- Technical Process
- Heart rate sensor (MAX30105) testing
- Decoupling the 3 servo motors
- Combining the 3 servo motors as outputs with the sensor as the input
- Scaling the sensor range and adding state variables and functions for the interaction
- State variables: stabilization period, per-person quota
- User testing and adjusting the input ranges for Low, Mid, and High
Constraints
- Use the average heart-rate value from the first 2 seconds after the person's heart rate is sensed.
- Each person can have only one candy.
Capacitors:
- Each servo motor: 220µF (decoupling ×3)
- Entire power supply line: around 1000~1200µF (470µF ×3)
Code
https://app.arduino.cc/sketches/521a936f-f60c-406b-97cb-316bb80b4f7e
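The full sketch is at the link above. Below is only a simplified sketch of the state logic described in this section: average the first 2 seconds of readings, classify the result into Low/Mid/High, rotate the matching servo to 90 degrees, and enforce the one-candy-per-person quota. The thresholds, pins, and the getBPM() helper are placeholders, not our actual project code:
#include <Servo.h>

Servo lowServo, midServo, highServo;

const unsigned long STABILIZE_MS = 2000; // average readings over the first 2 seconds
const int LOW_MAX = 90;                  // placeholder BPM thresholds
const int MID_MAX = 105;

bool candyGiven = false; // per-person quota: one candy each

// Placeholder for however the real sketch derives BPM from the MAX30105
int getBPM() {
  return 0; // replace with the actual heart-rate calculation
}

void setup() {
  lowServo.attach(9);  // placeholder pins
  midServo.attach(10);
  highServo.attach(11);
}

void loop() {
  if (candyGiven) return; // this person already got their candy

  // Stabilization period: average BPM over the first 2 seconds of sensing
  unsigned long start = millis();
  long sum = 0;
  int count = 0;
  while (millis() - start < STABILIZE_MS) {
    sum += getBPM();
    count++;
    delay(20);
  }
  int avgBPM = (count > 0) ? sum / count : 0;

  // Classify into Low / Mid / High and open the matching door to 90 degrees
  if (avgBPM <= LOW_MAX) {
    lowServo.write(90);
  } else if (avgBPM <= MID_MAX) {
    midServo.write(90);
  } else {
    highServo.write(90);
  }
  candyGiven = true;
}
In the real sketch the quota flag would be reset for the next person (for example when the finger is removed), which isn't shown here.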
Troubleshooting
- Unexpected conductivity of Playdough
- Controlling artificial heart rate for testing
Asynchronous Serial Communication: The Basics
1. Basic Concept of Serial Communication
- Communication between devices requires a method and agreed-upon language
- Serial communication is one of the most common forms of communication between computers
2. Essential Communication Agreements
- Data transmission and reception rate
- Voltage levels representing 1 and 0
- Meaning of voltage levels (whether high voltage represents 1 or 0)
3. Basic Connection Configuration
- Common ground connection
- Transmit line (for sending data)
- Receive line (for receiving data)
4. Data Transmission Method
- Example: at 9600 bits per second, roughly 1200 bytes can be sent per second (9600 / 8); in practice slightly fewer, since each byte is framed with start and stop bits
- Data is transmitted by changing the voltage level bit by bit
- Asynchronous serial (UART) sends each byte least significant bit (LSB) first
5. UART, USB, and CDC Explanation
- UART: Universal Asynchronous Receiver-Transmitter handles serial communication
- USB: Primary communication method in modern computers
- CDC: Communications Device Class that supports serial communication over USB
6. Serial Buffer and Port Control
- Processors have serial buffers to store received data
- Only one program can control a serial port at a time
- Data is processed in FIFO (First-In, First-Out) order
7. Importance of Communication Protocol
- Both devices must use the same communication protocol
- Agreement needed on the meaning and order of transmitted bytes
– Week 8 ––––––––––––––––––––––––––––––––––––––––––––––––
- Oct. 29. 2024
Lab: Two-Way (Duplex) Serial Communication Using An Arduino and the p5.webserial Library
Sending Multiple Serial Data using Punctuation
works well!!!!
https://editor.p5js.org/wc2771/sketches/--tEkUGZT
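For reference, the Arduino side of "sending multiple serial data using punctuation" just prints the values separated by commas and ends the line with println(), so the p5.js side can split() on the commas. A minimal sketch, with placeholder sensors on A0/A1 and a button on pin 2:
const int buttonPin = 2; // placeholder pin

void setup() {
  Serial.begin(9600);
  pinMode(buttonPin, INPUT);
}

void loop() {
  int sensor1 = analogRead(A0);
  int sensor2 = analogRead(A1);
  int button = digitalRead(buttonPin);

  // comma-separated values, newline at the end of each set
  Serial.print(sensor1);
  Serial.print(",");
  Serial.print(sensor2);
  Serial.print(",");
  Serial.println(button);
  delay(10);
}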
– Week 10 ––––––––––––––––––––––––––––––––––––––––––––––––
- Nov. 12. 2024
Sorry.. I will update soon..
but I am archiving on https://www.figma.com/board/R6S91lMy2ORdENxY5vCSJB/Pcom-Baby?node-id=0-1&t=U2edRwq09Jh2JxUv-1
– Final documenting –––––––––––––––––––––––––––––––––––––
- Dec. 9. 2024
<Interaction between People and Computers and the Space In-Between>
- Reflecting on a Semester of Physical Computing
Through this course, I learned how to think about the interaction between three elements: people, software, and hardware. As a foundational process to understand this, I used Arduino to learn how electrical current flows, how voltage should be supplied, and how to write Arduino code when adding inputs like sensors and buttons, or outputs like monitors and speakers.
While the 'Physical Computing' course itself primarily focused on Arduino, coding, and circuit understanding, for me, this entire process was preparation for the final stage of connection with 'people' - human interaction. Since my future goal is to be a UX or Product designer, I found the processes most interesting that were related to people and the physical world. Every moment was fascinating - from choosing sensors that connect with the world, to deciding how to attach them, and whether to expose or hide sensors for the experience.
Because of these reasons, I took charge of concept planning, fabrication, and visualization for both midterm and final projects, working with team members who were more interested in coding and circuit connection. Our collaboration allowed us to leverage each other's strengths, helping each other understand and resolve challenges.
It was a meaningful time where people with different strengths came together to support mutual understanding. As a result, during this semester, I studied the foundational knowledge of what physical elements exist in a product, service, or experience, and how different commands and languages interact. Based on this, I became a more comprehensive designer, capable of thinking multi-dimensionally and imaginatively about various stages of a project.
Final Project : Baby Mac
There is a mother MacBook and a baby MacBook. The participants must take care of the baby MacBook while the mother MacBook rests. However, the mother MacBook does not want the baby MacBook to leave its line of sight even while resting. The main goal for participants is to care for the baby MacBook without letting it cry, while meeting the mother MacBook's demands.
Our project creates a system of interconnected unreasonable machines, blending human emotion and relationships with technology. At its core, it explores a complex parent-child dynamic between machines—the Baby Mac and Mother MacBook—reflecting deeper emotional connections and potential relational entanglements between machines and users.
The project is composed of two key technical components. The Baby Mac is a motion-detecting device that tracks rotation and acceleration, providing real-time feedback through visual or audio responses based on user interaction. The Mother MacBook uses machine learning via ml5 to monitor the Baby Mac's position, adjusting its expression when the Baby Mac enters a predefined "in sight" zone.
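The ml5 setup isn't documented here, but the "in sight" check itself boils down to asking whether the tracked Baby Mac position falls inside a predefined zone. A minimal p5.js sketch of just that check, where the tracker is faked with the mouse and the zone bounds are placeholders:
// placeholder "in sight" zone, in canvas coordinates
let zone = { x: 100, y: 100, w: 440, h: 280 };

// in the real project these would come from the ml5 model watching the Baby Mac
let babyX = 0;
let babyY = 0;

function setup() {
  createCanvas(640, 480);
}

function draw() {
  background(220);

  // stand-in for the tracker: follow the mouse
  babyX = mouseX;
  babyY = mouseY;

  // is the Baby Mac inside the mother's line of sight?
  let inSight =
    babyX > zone.x && babyX < zone.x + zone.w &&
    babyY > zone.y && babyY < zone.y + zone.h;

  // the Mother MacBook's "expression" reacts to the check
  noFill();
  stroke(inSight ? 'green' : 'red');
  rect(zone.x, zone.y, zone.w, zone.h);

  noStroke();
  fill(0);
  text(inSight ? 'baby in sight' : 'baby out of sight!', 20, 30);
  circle(babyX, babyY, 20);
}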
- Interaction
Initial Scenario Context
People first encounter our project in a chaotic and frustrating environment. Their initial interaction introduces them to a mother computer and a baby MacBook, where they are immediately challenged to:
- Sensory Overload
- Confronted with a noisy, overwhelming setting
- Faced with two technological entities with unclear requirements
- Experiencing immediate confusion and sensory stress
- Interaction Dynamics
- Presented with visual guidance instructions for each device
- Tasked with attempting to soothe the "baby MacBook"
- Forced to navigate complex interaction protocols
Psychological Experience
- Confusion: Participants will feel profound bewilderment
- Uncertainty: Lack of clear understanding about each entity's specific conditions
- Problem-Solving Motivation: Driven to find a resolution to their perplexing situation
Key Objectives
- Prevent the baby MacBook from crying
- Keep the baby within the mother's line of sight
- Ensure the mother MacBook feels secure and relaxed
- Maintain a delicate balance of care and observation
Visual Language
1. Class
- Sep. 05. 2024
- Assignments : Choose a design you like and analyze its system, hierarchy, typography, color system and use of negative space.
Intro to Comp Media
- Worksheet assignment
<Statement>
I believe that art is a means of expanding a person's experience and thinking. Over the past few centuries, many artworks and sculptures have inspired, entertained, and even comforted people. In the era we live in now, most people are used to the web and mobile environment. The internet is a constant part of their daily lives. As designers and artists, we need to think about how to communicate inspiration in new ways in spaces that are part of their lives.
I aspire to become a UX designer. Through this class, I hope to understand web environments and computation and become a bridge between technology and human experience. I hope to combine physical computing and ICM classes this semester to create new experiences for people.
Where should createCanvas() go?
function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(400);
}
// Code that describes the starting position of a shape that will move
let x = 50;
let y = 50;

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(220);
  // Code that changes the position of the shape over time
  x++;
  y++;
  // Prevent the rectangle from moving off the canvas
  if (x > width) {
    x = -50; // go back to left when it gets out of screen
  }
  if (y > height) {
    y = -50; // go up when it gets out of screen
  }
  // Code to draw the shape
  rect(x, y, 50, 50);
  // Code that describes mouse interaction
  if (mouseX > x && mouseX < x + 50 && mouseY > y && mouseY < y + 50) {
    x = mouseX - 25;
  }
}
// Canvas drawing
function setup() {
  createCanvas(800, 600);
}

// background
function draw() {
  background(3, 255, 255); // blue background
  // red line
  stroke(255, 0, 0); // red color
  strokeWeight(45); // line weight
  line(0, 0, 800, 600); // line position
  // draw the ellipse with no stroke
  noStroke();
  fill(0, 200, 2); // ellipse color
  ellipse(400, 300, 380, 290);
  // draw the square with no stroke
  noStroke();
  fill(0, 0, 128); // square color
  square(540, 240, 50);
}
- Drawing my portrait: drawing mechanism
function setup() {
createCanvas(400, 400);
background(100); // grey
angleMode(DEGREES);
// bottom hair shape
fill(139, 69, 19); // brown hair
noStroke();
beginShape();
vertex(350, 350);
vertex(50, 350);
vertex(110, 90);
vertex(290, 90);
endShape(CLOSE);
// face shape
fill(252,243,220); // face tone
noStroke();
ellipse(200, 190, 220, 250);
// right top hair shape
push();//rotate only hair
rotate(45);
fill(139, 69, 19); // brown hair
ellipse(260, -110, 170, 70);
pop();
// left top hair shape
push();//rotate only hair
rotate(135);
fill(139, 69, 19); // brown hair
ellipse(-20, -170, 170, 70);
pop();
// left hand-eye
fill(0, 191, 255); // blue
beginShape();
vertex(120, 160); // wrist start
vertex(110, 130); // finger start
vertex(115, 110); // first finger
vertex(125, 115);
vertex(130, 100); // second finger
vertex(140, 110);
vertex(145, 95); // third finger
vertex(155, 110);
vertex(160, 100); // fourth finger
vertex(165, 115);
vertex(170, 130); // fifth finger
vertex(160, 160); // wrist end
endShape(CLOSE);
// right hand-eye
beginShape();
vertex(240, 160); // wrist start
vertex(230, 130); // finger start
vertex(235, 110); // first finger
vertex(245, 115);
vertex(250, 100); // second finger
vertex(260, 110);
vertex(265, 95); // third finger
vertex(275, 110);
vertex(280, 100); // fourth finger
vertex(285, 115);
vertex(290, 130); // fifth finger
vertex(280, 160); // wrist end
endShape(CLOSE);
// left eye
fill(0); // black
ellipse(140, 140, 20, 20);
// right eye
ellipse(260, 140, 20, 20);
// eyelash
stroke(0); //black
strokeWeight(3);
//left
line(135, 130, 130, 120);
line(140, 130, 140, 120);
line(145, 130, 150, 120);
//right
line(255, 130, 250, 120);
line(260, 130, 260, 120);
line(265, 130, 270, 120);
// nose
fill(255, 204, 0); // yellow
triangle(205, 165, 215, 210, 195, 210); // shape
// mouth
fill(255, 0, 0); // red
noStroke();
arc(200, 260, 80, 60, 0, 180);
}
- Worksheet assignment
- Start with this sketch. The default p5 arguments for the rect() function are: x, y, w, h where x, y are the coordinate for the top left corner and w, h are the width and height of the rectangle. What are other ways to define a rectangle? Make up your own arguments for rect()? Come up with at least 2 sets.
Width and height of the rectangle, even if they are the same.
- 2. Draw a rectangle in the middle of the screen that is half the width and half the height of the canvas. Write it so that you can change the size of the canvas and the rectangle will stay in the center and maintain its size relationship to the canvas.
- 3. Draw a rectangle in the middle of the screen that is half the width and half the height of the canvas. Write it so that you can change the size of the canvas and the rectangle will stay in the center and maintain its size relationship to the canvas.
- 4. Move a circle from the middle of the screen to the right side of the screen.
- a. Add 3 more, 1 moving left, 1 moving up, 1 moving down.
- b. Add 4 more, each moving towards each of the 4 corners of the canvas.
- c. Make one of your circles move 10 times faster than the other circles.
- d. Challenge: Re-write 4b. so if I change the width of the canvas, the circles still go to the corners without having to change any other code.
// assuming the colour variables and speed were declared earlier
// (these values are placeholders, not from the original sketch)
let myBlue = "#4a90e2";
let myMint = "#98e0c8";
let myPurple = "#9b59b6";
let speed = 2;
let circleX, circleY, circleFastX;

function setup() {
  createCanvas(600, 600);
  circleX = width / 2;
  circleY = height / 2;
  circleFastX = width / 2;
}
function draw() {
background(220);
circleX = circleX + speed;
circleFastX = circleFastX + speed*10;
//go back to middle
if (circleX > width) {
circleX = width / 2;
}
// Circle 1 - moves to the right
fill(myBlue);
ellipse(circleFastX, height / 2, 50, 50);
// Circle 2 - moves to the left
fill(myMint);
ellipse(width - circleX, height / 2, 50, 50);
// Circle 3 - moves up
fill(myMint);
ellipse(width / 2, height - circleX, 50, 50);
// Circle 4 - moves down
fill(myMint);
ellipse(width / 2, circleX, 50, 50);
// Four corners movement (relative to canvas size)
fill(myPurple);
ellipse(circleX, circleX, 50, 50); // Top-left
ellipse(width - circleX, circleX, 50, 50); // Top-right
ellipse(circleX, height - circleX, 50, 50); // Bottom-left
ellipse(width - circleX, height - circleX, 50, 50); // Bottom-right
}
5. Move a circle towards the mouse. Hint: Use mouseX + mouseY.
- Using cos(angle) and sin(angle), the x and y coordinates of the circle were set based on the current angle. Multiplying by the radius set the distance of the circle's rotation.
- The angle is continuously changed, causing the x and y coordinates to vary, creating the effect of the circle rotating around the mouse cursor (a minimal sketch of this is below).
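A minimal sketch of that cos/sin idea (the radius and rotation speed are arbitrary):
let angle = 0;
let radius = 80; // distance from the mouse

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(220);
  // place the circle on a ring around the mouse using the current angle
  let x = mouseX + cos(angle) * radius;
  let y = mouseY + sin(angle) * radius;
  circle(x, y, 30);
  // increasing the angle each frame makes the circle orbit the cursor
  angle += 0.05;
}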
6. Move your rectangle from Q3 towards the mouse.
let rectX = 400; // starting point
let rectY = 400; // starting point
let speed = 2;
let myPinkple = "#ff69b4"; // line colour (value assumed; not in the original notes)
function setup() {
createCanvas(800, 800);
}
function draw() {
let gColor = random(200, 255);
fill(0, gColor, 10);
move();
background(220);
push();
rectMode(CENTER);
rect(rectX, rectY, 200, 200);
pop();
// line
stroke(myPinkple);
strokeWeight(5);
// line moving with rect
let lineX1 = rectX - 100;
let lineY1 = rectY - 100;
let lineX2 = rectX + 100;
let lineY2 = rectY + 100;
line(lineX1, lineY2, lineX2, lineY2);
line(lineX1, lineY1, lineX1, lineY2);
line(lineX2, lineY2, lineX2, lineY1);
line(lineX1, lineY1, lineX2, lineY1);
}
function move() {
// follow the mouse!
if (mouseX > rectX) {
rectX += speed;
}
else if (mouseX < rectX) {
rectX -= speed;
}
if (mouseY > rectY) {
rectY += speed;
}
else if (mouseY < rectY) {
rectY -= speed;
}
}
- Create an animated sketch!
- As an exercise include all of the following.
- One element controlled by the mouse.
- One element that changes over time, independently of the mouse.
- One element that is different every time you run the sketch.
- See if you can eliminate all (or as much as you can) hard-coded* numbers from the sketch. A hard coded number is something like fill(150). Better practice is to save 150 in a variable, for example myGreyColour, then use fill(myGreyColour)
I could not use circle() in WEBGL mode -> changed it to sphere().
I used the bug model from this example (https://p5js.org/tutorials/custom-geometry/).
-> First try..
let myPear = "#D1D646";
let myGreen = "#798071";
let bug;
function setup() {
createCanvas(800, 800, WEBGL);
describe('Bugs randomly moving around');
// bug
bug = createBugGeometry(); // bug 3d
}
function draw() {
background(220);
// Sphere
push();
fill(myPear);
rotate(frameCount * 0.1);
sphere(30);
pop();
orbitControl();
rotateX(PI * -0.1);
noStroke();
lights();
for (let i = 0; i < 20; i++) {
push();
// Move each bug to a random position and rotation using noise
translate(
map(
noise(frameCount * 0.001, i, 0), // Map this value...
0, 1, // ...from this range...
-150, 150 // ...into this range
),
0,
map(
noise(frameCount * 0.001, i, 100), // Map this value...
0, 1, // ...from this range...
-200, 300 // ...into this range
)
);
rotateY(noise(frameCount * 0.01, i, 200) * TWO_PI);
scale(0.1);
// Bug drawing
bug();
pop();
}
}
function createBugGeometry() {
return function() {
// Head
push();
translate(-50, 0, 0);
sphere(70);
// Draw symmetrical parts of the head that come in pairs
for (let side of [-1, 1]) {
// Eye
push();
translate(-20, -60, side * 30);
sphere(20);
pop();
// Antenna
push();
translate(0, -100, side * 30);
rotateX(PI * -0.1 * side);
cylinder(5, 100);
pop();
}
pop();
// Body
push();
translate(50, 0, 0);
scale(1.5, 0.8, 1);
sphere(100);
pop();
};
}
-> Second try: making the bugs move toward the yellow snack (the mouse)
let myPear = "#D1D646";
let myGreen = "#798071";
let bug;
let circleX = 0; // for making snack to follow mouse
let circleY = 0;
let bugs = []; // Array to store bugs' positions
function setup() {
createCanvas(800, 800, WEBGL);
// Initialize bugs' positions
for (let i = 0; i < 50; i++) {
bugs.push({
x: random(-300, 300),
z: random(-300, 300)
});
}
// Create bug geometry
bug = createBugGeometry(); // bug 3d
}
function draw() {
background(220);
// Snack following mouse
if (circleX < mouseX - width / 2) {
circleX += 5; // follow right
} else if (circleX > mouseX - width / 2) {
circleX -= 5; // follow left
}
if (circleY < mouseY - height / 2) {
circleY += 5; // follow down
} else if (circleY > mouseY - height / 2) {
circleY -= 5; // follow up
}
push();
fill(myPear);
translate(circleX, circleY);
sphere(30);
pop();
orbitControl();
rotateX(PI * -0.2);
noStroke();
lights();
// Draw bugs
for (let i = 0; i < bugs.length; i++) {
let bugX = bugs[i].x;
let bugZ = bugs[i].z;
// Move bug towards the snack
if (bugX < circleX) {
bugs[i].x += 0.5; // Move right
} else if (bugX > circleX) {
bugs[i].x -= 0.5; // Move left
}
if (bugZ < circleY) {
bugs[i].z += 0.5; // Move down
} else if (bugZ > circleY) {
bugs[i].z -= 0.5; // Move up
}
push();
translate(bugs[i].x, 0, bugs[i].z);
rotateY(noise(frameCount * 0.01, i, 200) * TWO_PI);
scale(0.1);
bug();
pop();
}
}
function createBugGeometry() {
return function() {
// Head
push();
translate(-50, 0, 0);
sphere(70);
// Draw symmetrical parts of the head that come in pairs
for (let side of [-1, 1]) {
// Eye
push();
translate(-20, -60, side * 30);
sphere(20);
pop();
// Antenna
push();
translate(0, -100, side * 30);
rotateX(PI * -0.1 * side);
cylinder(5, 100);
pop();
}
pop();
// Body
push();
translate(50, 0, 0);
scale(1.5, 0.8, 1);
sphere(100);
pop();
};
}
-> Third try: make the bugs' locations random on every run, and change their color every second to a random green.
let myPear = "#D1D646";
let myGreen = "#798071";
let bug;
let circleX = 0; // for making snack to follow mouse
let circleY = 0;
let bugs = []; // Array to store bugs' positions
function setup() {
createCanvas(800, 800, WEBGL);
// Initialize bugs' positions
for (let i = 0; i < 10; i++) {
bugs.push({
x: random(-400, 400),
z: random(-400, 400)
});
// fill(random(255),20,20);
}
// Create bug geometry
bug = createBugGeometry(); // bug 3d
}
function draw() {
background(220);
// Snack following mouse
if (circleX < mouseX - width / 2) {
circleX += 5; // follow right
} else if (circleX > mouseX - width / 2) {
circleX -= 5; // follow left
}
if (circleY < mouseY - height / 2) {
circleY += 5; // follow down
} else if (circleY > mouseY - height / 2) {
circleY -= 5; // follow up
}
push();
fill(myPear);
translate(circleX, circleY);
sphere(30);
pop();
orbitControl();
rotateX(PI * -0.2);
noStroke();
lights();
// Draw bugs
for (let i = 0; i < bugs.length; i++) {
let bugX = bugs[i].x;
let bugZ = bugs[i].z;
// Move bug towards the snack
if (bugX < circleX) {
bugs[i].x += 0.5; // Move right
} else if (bugX > circleX) {
bugs[i].x -= 0.5; // Move left
}
if (bugZ < circleY) {
bugs[i].z += 0.5; // Move down
} else if (bugZ > circleY) {
bugs[i].z -= 0.5; // Move up
}
push();
translate(bugs[i].x, 0, bugs[i].z);
rotateY(noise(frameCount * 0.01, i, 200) * TWO_PI);
scale(0.1);
bug();
pop();
}
}
function createBugGeometry() {
return function() {
let gColor=random(100,255);
print (gColor);
fill(0,gColor,10);
// Head
push();
translate(-50, 0, 0);
sphere(70);
// Draw symmetrical parts of the head that come in pairs
for (let side of [-1, 1]) {
// Eye
push();
translate(-20, -60, side * 30);
sphere(20);
pop();
// Antenna
push();
translate(0, -100, side * 30);
rotateX(PI * -0.1 * side);
cylinder(5, 100);
pop();
}
pop();
// Body
push();
translate(50, 0, 0);
scale(1.5, 0.8, 1);
sphere(100);
pop();
};
}
– Week 4 –––––––––––––––––––––––––––––––––––––––––––
- W4 assignment
Try1
Try 2
Try 3
– Week 6 –––––––––––––––––––––––––––––––––––––––––––
- worksheet
- Create 10 columns that toggle on and off when you click on them. You click on the column and the column turns red and stays red. You click on the column again and it turns white and stays white. Challenge: Make this work for a grid of cells.
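One way to do the 10-column version, as a minimal sketch: keep an array of booleans and flip the entry for whichever column is under the mouse when it is clicked. The grid challenge could use a 2D array the same way.
let numCols = 10;
let colStates = []; // true = red, false = white

function setup() {
  createCanvas(600, 400);
  for (let i = 0; i < numCols; i++) {
    colStates.push(false);
  }
}

function draw() {
  let colWidth = width / numCols;
  for (let i = 0; i < numCols; i++) {
    fill(colStates[i] ? 'red' : 'white');
    rect(i * colWidth, 0, colWidth, height);
  }
}

function mousePressed() {
  // which column was clicked? toggle its state
  let i = floor(mouseX / (width / numCols));
  if (i >= 0 && i < numCols) {
    colStates[i] = !colStates[i];
  }
}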
- Assignment
– Finals –––––––––––––––––––––––––––––––––––––––––––
- Team Jenn Sky Plan
- 1. Concept
This experience invites participants to embark on a warm journey of gift-giving and connection. As snowflakes gently fall on a serene winter landscape, Santa Claus becomes their collaborative partner in spreading joy and light.
The journey begins with a deeply personal moment: each participant chooses a single, meaningful word that represents the gift they most want to share with a loved one this Christmas. This word becomes more than just language—it transforms into a vessel of emotion, intention, and connection.
Once the word is selected, Santa Claus steps in as a messenger, preparing a gift package that embodies that word. Each package is imbued with a distinctive color that resonates with the random palette of the chosen word.
Participants then become essential helpers in Santa's mission, assisting in delivering these special packages to various homes. These are not ordinary houses, but dark, waiting spaces yearning for warmth and illumination. As each gift is received, something magical happens: the house begins to glow from within, filled with the vibrant color of the present's emotional spectrum.
The process is a metaphorical journey of spreading light, love, and personal meaning—transforming dark, silent spaces into radiant sanctuaries of connection and hope, one carefully chosen word at a time.
2. Coding Plan
- There will be 3 layers. The foreground layer has the houses, but the houses will have transparent windows so that the image on the next layer can be seen through the window when the interaction happens. The background video will be placed on the third layer.
Code type : Image, Video, Video on/off, Opacity.
- If the box radius overlaps with the house radius - video on.
There will be an input box at the bottom of the screen where users can type a word as a 'gift'. Once the input successfully returns a word, the Santa sleigh will show up from the top left corner of the screen. When the Santa sleigh reaches the middle of the screen, it will drop the written word in a gift box with a different color each time.
Code type : Input box, array.
- If the Santa sleigh's position reaches the middle of the screen, drop the gift box.
The users can deliver the gift to any house they want. A parachute will follow where the hands are, and when it overlaps with the gift box, they will stick together and the users can move the gift box from that moment on.
Code type : ML5 bodyPose.
- If the hand position overlaps with the box radius, they stick together and the box position updates based on where the hand is.
When the box overlaps with a house, a sound effect plays and the image behind the house is activated. The lights in the window turn on, and the color of the lights depends on the color of the gift box.
Code type : image, sound.
- If the box radius overlaps with a house, the house activates the image and the light turns on based on the color of the gift box (one way to write these overlap checks is sketched after this list).
- One round ends.
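All of the "radius overlaps" conditions above (box and house, hand and box) can share one check: compare the distance between the two centers to the sum of the radii. A minimal p5.js sketch of that helper, with placeholder positions and the hand faked with the mouse:
// returns true when two circles, given as {x, y, r}, overlap
function overlaps(a, b) {
  return dist(a.x, a.y, b.x, b.y) < a.r + b.r;
}

let giftBox = { x: 100, y: 100, r: 25 };
let house = { x: 400, y: 300, r: 60 };

function setup() {
  createCanvas(640, 480);
}

function draw() {
  background(220);

  // stand-in for the hand / parachute position from ml5 bodyPose
  giftBox.x = mouseX;
  giftBox.y = mouseY;

  // "light up" the house when the gift reaches it
  fill(overlaps(giftBox, house) ? 'gold' : 'gray');
  circle(house.x, house.y, house.r * 2);

  fill('tomato');
  circle(giftBox.x, giftBox.y, giftBox.r * 2);
}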
3. Interaction
In this experience, participants embark on a journey of gift-giving by entering a word that encapsulates their heartfelt message.
As the word is input, the scene transforms dramatically. High above in the wintry sky, Santa Claus and Rudolph's sleigh appear. The participant's chosen word materializes as a unique gift package within the sleigh, ready to be delivered.
The gift box descends from the sky, suspended in mid-air and awaiting the participant's touch. There's a moment of interactive art as the participant must physically reach out and catch the falling package—a symbolic connection between digital intention and physical action.
The final step requires the participant to select a specific chimney representing their chosen home. They must deliver the gift, guiding it perfectly into the chimney. Each chimney represents a potential destination, a home waiting to be touched by the warmth of the participant's carefully chosen word.
This experience transforms gift-giving from a simple transaction into an immersive, almost ceremonial process. It's not merely about sending a present, but creating a moment of genuine connection where words become tangible gifts of love, hope, and meaning.
- 4. Visual effect
The entire scene unfolds against a softly blurred background, reminiscent of a frosted winter landscape. Layered atop this dreamy backdrop are delicate line drawings that transform the visual experience. These precise, minimalist lines create a striking contrast—much like looking through a frost-covered window, offering a sense of both distance and intimate observation.
Each participant's chosen word becomes more than just language; it transforms into a uniquely colored gift package. This personalized packaging is not merely decorative but deeply symbolic. When the gift arrives at its destination, something magical occurs: the receiving house is illuminated by a corresponding glow of the same color as the gift.
This chromatic synchronization is profound in its simplicity. The matching color represents more than visual harmony—it symbolizes a successful emotional transmission. Each lit house signifies that the participant's carefully chosen word has not just been delivered, but has truly been received, bringing with it a sense of happiness, warmth, and connection.
The line-drawn aesthetic and color-infused illumination create a visual metaphor for communication, suggesting that words—when shared with intention and love—can transform spaces and touch hearts, especially during the season of giving.
Hypercinema
- Sep. 23. 2024
You should design these sounds from scratch, and there should be at least 6 different sounds in your sound family. Create a short presentation on why you chose these sounds and how you designed them for this system.
Final GIF!
https://editor.p5js.org/wonjchoi313/sketches/uzFvae0eD