ICM Final: The I Ching Machine

Link to final here:

After struggling with the Hero’s Journey YouTube Video Program, as detailed in this past blog post, I was making progress with the YouTube API; however, I was not spending enough time on the creation of the video artwork itself. Because the program was meant to be video art and not a design-led thing, its exact form would be determined as I completed it.

However, when user testing rolled around with one week until the end of finals, I decided that such a video art project was too ambitious, despite the progress I had made with the API.

I then decided to pivot to a project that I had wanted to make for a while, one with a defined structure that would be relatively doable. This was the I Ching Machine.

The Concept

I’ve studied the I Ching, aka the Book of Changes, for several years. I have found that, regardless of how it “works”, it has been helpful to me. The most mature human wisdom is found in the I Ching, whose images predate written history. The “modern” version of the I Ching is two thousand years old, and it boasts several thousand years’ worth of Confucian scholarship and commentary on the meaning of the images.

A while ago, I started to realize that the wisdom of the I Ching was not obscure knowledge contained in a book; rather, it was wisdom accessible to us in everyday life. The I Ching is not “Chinese” wisdom, it is human wisdom, and its variations can be seen in all times and all cultures. In the same way Joseph Campbell describes a singular, universal plot structure for the religious and folk myths of all human cultures of all times in The Hero With A Thousand Faces, so too can the wisdom of the I Ching be seen in our contemporary culture. The archetypes operate everywhere, always, under the veil of cultural specificity.

I therefore created an I Ching program in P5JS that used my prior work with YouTube to link the fortunes given by the I Ching to specific YouTube clips that I hand-picked and believed to express the message of the hexagram.

Programming The Oracle

I programmed the P5JS I Ching program with the same instructions one would use to consult the I Ching with coins. The steps are:

  1. Toss three coins and see if they land heads or tails (I used random(0,1) for this).
  2. Depending on how they land, each line is one of four possibilities: broken-changing, unbroken-changing, unbroken-unchanging, broken-unchanging. These were represented by either a broken or an unbroken rect, drawn with a dot to the side if the line was changing.
  3. The changing lines are a specific pattern of heads and tails that point to dynamic aspects of one’s situation. These, in turn, change to their opposites to form a second hexagram. It is the interplay between the messages of the first and second hexagrams, taken together, that generates the specific message to the user. I generated the second hexagram by storing the first hexagram in a lines[] array and the second in a lines2[] array. I used the same drawing function for the second hexagram with a different x-value in order to position them beside each other.
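The steps above could be sketched roughly like this, in plain JavaScript rather than p5 (a minimal sketch: the function names, and the traditional coin values heads = 3 / tails = 2, are my own assumptions, not the post’s actual code):

```javascript
// A minimal sketch of the coin-toss steps above (names and the traditional
// coin values are assumptions, not the post's actual code).
function tossCoin() {
  // the post uses p5's random(0, 1); Math.random() stands in for it here
  return Math.random() < 0.5 ? 2 : 3; // tails = 2, heads = 3
}

// Three coins sum to 6, 7, 8 or 9, giving the four line possibilities
function tossLine() {
  const total = tossCoin() + tossCoin() + tossCoin();
  const types = {
    6: { x: 'b',  y: 'c'  }, // broken, changing
    7: { x: 'ub', y: 'uc' }, // unbroken, unchanging
    8: { x: 'b',  y: 'uc' }, // broken, unchanging
    9: { x: 'ub', y: 'c'  }, // unbroken, changing
  };
  return types[total];
}

// A hexagram is six tossed lines, bottom line first
function tossHexagram() {
  const lines = [];
  for (let i = 0; i < 6; i++) lines.push(tossLine());
  return lines;
}

// Changing lines flip to their opposite to form the second hexagram
function transformHexagram(lines) {
  return lines.map(l => ({
    x: l.y === 'c' ? (l.x === 'b' ? 'ub' : 'b') : l.x,
    y: 'uc', // the second hexagram has no changing lines
  }));
}
```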

When the hexagrams were stored in the lines[] arrays, I ran them through a function I created to name the hexagram based on the lines.x and lines.y object values. The values were x = b (broken) or ub (unbroken), and y = c (changing) or uc (unchanging). The specific configuration of the array elements corresponded to one of the 64 hexagrams of the I Ching (each hexagram is a chapter of commentary on a specific situation of change).
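The naming function might look something like the following sketch (the key-building helper and the lookup table are illustrative, not the post’s code; only two of the 64 names are filled in):

```javascript
// Sketch of the naming step: each line's x value becomes a bit, and the six
// bits form a key into a name table (helper and table are illustrative).
function hexagramKey(lines) {
  // bottom line first: 'ub' (unbroken) -> 1, 'b' (broken) -> 0
  return lines.map(l => (l.x === 'ub' ? '1' : '0')).join('');
}

// Partial lookup table -- the real program would hold all 64 names
const HEXAGRAM_NAMES = {
  '111111': 'The Creative (Heaven)',
  '000000': 'The Receptive (Earth)',
};

function nameHexagram(lines) {
  return HEXAGRAM_NAMES[hexagramKey(lines)] || 'unknown';
}
```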

How I Matched the YouTube Clips with the I Ching

For instance, hexagram 51 is named The Arousing (Thunder). The message of the hexagram is:

“The shock that comes from the manifestation of God within the depths of the earth makes man afraid, but this fear of God is good, for joy and merriment can follow upon it. When a man has learned within his heart what fear and trembling mean, he is safeguarded against any terror produced by outside influences. Let the thunder roll and spread terror a hundred miles around: he remains so composed and reverent in spirit that the sacrificial rite is not interrupted. This is the spirit that must animate leaders and rulers of men-a profound inner seriousness from which all terrors glance off harmlessly.” http://ichingfortune.com/hexagrams/51.php

Pulp Fiction is one of my all-time favourite movies, and I have reflected a lot on why this is. I believe the narrative of divine intervention that stopped Jules and Vincent from being killed in the first scene is what tied the entire movie together. Without it, I believe the movie would have been entertaining but lacking in depth.

It was the tale of change in a man’s outlook after experiencing the touch of God that led Jules to change his ways. It perfectly described the essence of the hexagram’s philosophy above. As such, I chose it as suiting this I Ching passage.

It was only possible for me to choose all these YouTube clips because I’ve had this idea kicking around in my head for a while, and I am always on the lookout for “archetypal movie scenes”. As such, the I Ching machine in its current form is a patchwork of action movies, kung fu clips and music clips. The I Ching machine is a window into my own psyche, and another person’s I Ching machine would be completely different.

Technically, implementing the video was quite easy. I simply used an iframe (which is much simpler than the YouTube API), and I destroyed the iframe when the second hexagram’s video was pressed.
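The iframe approach could be sketched like this (helper names are hypothetical; start, end and autoplay are standard YouTube embed-URL parameters):

```javascript
// Hypothetical helpers sketching the iframe approach (names are mine).
// start, end and autoplay are standard YouTube embed-URL parameters.
function buildEmbedUrl(videoId, startSec, endSec) {
  return 'https://www.youtube.com/embed/' + videoId +
    '?start=' + startSec + '&end=' + endSec + '&autoplay=1';
}

// Browser-only: create the iframe for a hexagram's clip...
function showClip(videoId, startSec, endSec) {
  const frame = document.createElement('iframe');
  frame.id = 'iching-clip';
  frame.src = buildEmbedUrl(videoId, startSec, endSec);
  document.body.appendChild(frame);
}

// ...and destroy it when the second hexagram is pressed
function destroyClip() {
  const frame = document.getElementById('iching-clip');
  if (frame) frame.remove();
}
```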

Conclusion and Next Steps

I am very happy with how the I Ching program turned out and I want to keep moving forward with it.

-I want this thing to show publicly at some point, and I want to link it to actually flipping coins in real life with the videos being shown by projector.

-I might explore putting it up in a webpage form on some subdomain of my website. Maybe ichingmachine.jasonyung.ca or something.

-I might even open it up to I Ching forums to get the wider community to “make your own I Ching”

That about concludes it. Thanks for a great semester, Dano.

Jason


PComp Final | Final Concept & Production Schedule

October 24:

  • After letting play-test results settle in the back of my mind, I’ve decided upon the final PComp project concept, thereby resolving issues brought up in my last update.

Final Concept

  • I have decided to use servos after all, as the only means of controlling the blinds. Unlike during play testing, users will not be allowed to control the blinds directly. I decided this after considering what I want the piece to be used for, beyond its existence as a PComp final.
  • The goal of the piece is primarily art. I want it to be an art piece whose primary function is to explore the creation of space using light, and I want it to be geared towards being shown in an art gallery. As such, I intend to use this piece as a way to approach galleries. The final design of the piece will support this goal.
  • User control will be narrowly defined. Unlike the prototype used during play testing, there will be no direct way for the beholder to spin the blinds that control the light/shadow effects. I decided this because the point of the piece is a nuanced exploration of light/shadow effects — however, when users were given free control, they tended to spin the blinds around in a way that did not accord with this primary function.
  • There will be a minimal interactive element: a button that will allow the user to change patterns. However, there will be two modes, an auto mode (where patterns change automatically to a set schedule) and a manual mode, where users can press the button to display a new pattern.
  • I will be using one layer of blinds, given that its light/shadow/colour effect is the strongest. Later iterations may explore the second layer, but for now, one layer is sufficient complexity to present an impressive spectacle.
  • The piece will measure 8×8×8″. This decision was made after a tester suggested I use acrylic on the sides in addition to the front, so that the light effects could be seen from the side.
  • After experimenting with vellum paper on the sides, I discovered very impressive effects that took the piece to its esthetic conclusion, thereby settling my thoughts on the final form of the piece. It was this consideration, above all else, that allowed me to make the final decision on the servos and user interactions.
  • The microcontroller will be housed underneath the LED board, between it and another layer of MDF. I am using the Arduino UNO.
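The auto/manual button behaviour described above could be sketched as a small state machine (the names and the pattern count are assumptions; on the actual piece this logic would run on the UNO):

```javascript
// Sketch of the two-mode pattern logic (names and pattern count assumed).
const NUM_PATTERNS = 8; // assumed number of pre-programmed patterns

function makeController() {
  return { mode: 'auto', pattern: 0 };
}

// Called on the set schedule: only auto mode advances the pattern
function tick(ctrl) {
  if (ctrl.mode === 'auto') ctrl.pattern = (ctrl.pattern + 1) % NUM_PATTERNS;
}

// Called on a button press: only manual mode advances the pattern
function buttonPress(ctrl) {
  if (ctrl.mode === 'manual') ctrl.pattern = (ctrl.pattern + 1) % NUM_PATTERNS;
}
```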

Final Production Schedule

  • The main structure of the piece is already complete: I have the acrylic front panel, the standoffs that comprise the main structure, and the LED board. Things that remain are:
  • A – Making the servo/blind system:
    • attaching servos to a piece of acrylic
    • laser cutting and bending a piece of acrylic so that it fits between standoffs and positions the servos for vertical-axis rotation of blinds
    • programming the servo system to display patterns of light/shadow
  • B – Making micro-controller housing:
    • Hooking up the LED board to an Arduino UNO.
    • Cutting a new 8×8 piece of MDF and attaching the UNO and other components to it.
    • Soldering a circuit board so that the UNO and the LED matrix are wired in a stable way (i.e. no jumper cables) and connection to a suitable wall wart power supply is as simple as plug-and-play.
  • C – Side panel material:
    • The light effects can be completely accomplished with Vellum paper on the side with a piece of clear acrylic underneath.
    • I will find a way to adhere the Vellum to the clear acrylic without using tape.
  • D – Make a button:
    • I will make a detachable button that attaches to the electronics housing and lets users generate new patterns.
  • Precise schedule to be determined organically. But since I have only ICM and PComp as finals, it should take only a few days, and I could finish ahead of schedule.
  • If I finish ahead of schedule, I plan to devote the extra time to programming the light/shadow patterns.


ICM Final Journal

Nov 15:

  • Working on making a P5 program that can play YouTube videos and switch between videos.
  • No progress since last week. I spent Thursday-Sunday working on finishing the Rembrandt for my show on Sunday and on Mon-Tues I had personal obligations to attend to.

Based on class feedback and discussions with Dano last week my program will consist of:

  • Using the YouTube API, or making my own API, in order to access a structured series of videos with multiple in and out points. (To be determined as I do the project.)
  • Until I get the program done, I will only use a few videos around the Departure-Initiation-Return cycle of the Hero’s Journey. I will populate it with more videos once I have the bulk of the programming done.

Objectives to end of week:

  • The aim by end of week is to have a solid framework (or, at a minimum, a solid understanding of what it takes to develop such a framework) that I can use to orient my efforts around.

Nov 22:

  • I’m starting to find my way around this YouTube iFrame player API.
  • There are numerous confusing elements to using this API. There is an expertise gap between my current level of P5/JS knowledge and the knowledge the API guide assumes. That said, I am just trying stuff at the moment and discovering things by trial and error.
  • Discovery: the P5 alpha editor is somewhat inconsistent in its execution of the YouTube API. Sometimes it doesn’t read functions, sometimes it does.

Achievements of the day:

November 24:

  • Figured out that onYouTubeIframeAPIReady() and onPlayerReady(event) are executed before setup() is even called. I learned this by using console.log numbers to show me the order in which things were executed.
  • I was trying to make two player objects, one for Harvest Moon and one for Old Man, and somehow have a defined start/stop time for each of them. But it looks like you can’t make two player objects, since P5 only plays the first one created.
  • Figured out the format that playerVars, a setting inside the creation of the YT.Player object, should take. It’s like this:

autoplay: '1' — [i.e. the key has no quotes and the value is in single quotes]

With this discovery, I can utilize the wide range of iFrame player customizations listed here.
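Putting that together, a playerVars object might look like this (autoplay is the value from the post; the other keys are examples of the documented iFrame player options):

```javascript
// Example playerVars object in the format just described (autoplay is from
// the post; the other keys are examples of documented iframe player options).
const playerVars = {
  autoplay: '1', // key unquoted, value in single quotes
  controls: '0', // hide the player controls
  start: '8',    // start time in seconds
  end: '13',     // end time in seconds
};
// In the browser this would be passed into the constructor, roughly:
// new YT.Player('player', { videoId: ..., playerVars: playerVars });
```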

Nov 28:

  • Previous attempts at loading a playlist did not work because I was not able to specify different start and end points for each separate video in the playlist: the player object only allows generic start/end points. This means I can’t do individualized start/end points, so if I want to use 0:08-0:13 of Harvest Moon and 0:03-0:07 of Old Man, it doesn’t work.
  • This being the case, I am now working on constructing multiple player objects, so that each player object holds one video with a unique start/stop point.
  • Problem is, there is a DOM element to this: I don’t understand how the YT.Player object works, and I don’t seem to be finding sufficient (or understandable) documentation. StackOverflow has been helpful in this regard, and it looks like others have been in my shoes. But at this point, I don’t really get whether I can have multiple iFrame objects.
  • The YouTube API documentation, as well as StackOverflow discussions, all refer to things in an “event” object that seems to be built into JavaScript. Relevant functions include
    • event.target.a.id;
    • event.data;
    • event.target.playVideo();
  • While the API’s own documentation relies on event.data in the onPlayerStateChange function, event.data returns “undefined” when printed to the console.
    • function onPlayerStateChange(event) {
        if (event.data == YT.PlayerState.ENDED && !done) {
          setTimeout(nextVideo, 9300);
          done = true;
        }
      } // is event.data only available in these functions?
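The multiple-player idea being attempted here could be sketched as one config per clip (the video IDs are placeholders, and the helper and clip data are mine, not the post’s code):

```javascript
// One config per clip, so each video keeps its own start/end point
// (video IDs are placeholders; the helper and clip data are mine).
const clips = [
  { videoId: 'HARVEST_MOON_ID', start: 8, end: 13 },
  { videoId: 'OLD_MAN_ID',      start: 3, end: 7  },
];

function playerConfig(clip) {
  // in the browser: new YT.Player(someDivId, playerConfig(clip))
  return {
    videoId: clip.videoId,
    playerVars: {
      autoplay: '1',
      start: String(clip.start),
      end: String(clip.end),
    },
    events: {
      onStateChange: function (event) {
        // event.data === YT.PlayerState.ENDED when the clip finishes
      },
    },
  };
}
```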

Pcomp Final | Playtest Results, Production Sched & BOMs

Play-testing conducted on October 8th generated a positive reception for the light/shadow box. People commented on the beauty of the images created, and were excited by it.

Playtesting turned out to be far more useful than I thought. Initially, I was concerned that because my project was an art piece and not a design-oriented project, it would be inappropriate to play-test “the artist’s message”. Thankfully, actual experience showed this not to be the case. Play-testers generally understood what I was trying to do. They made good suggestions on how to make it better, offered further areas for exploration, and helped clarify decision points that I’ll need to make as the project proceeds toward finalization.

Key issues for resolution:

To Servo, or Not To Servo?

  • Should servos control the blinds, or should I let users control them by hand?
  • People really enjoyed controlling the blinds by hand, and there is a certain deep satisfaction in direct control that makes me hesitate to even put servos in there.
  • Other people suggested some kind of half-user-controlled, half-servo-controlled blinds: blinds that can be moved by hand but always set themselves back to zero.
  • This is a tough one because it goes directly to the heart of my piece. Putting in servos for user control requires buttons to be made, wiring to be thought out, and wiring to be placed and hidden: it requires a lot of work.
  • However, keeping it directly user-controlled would imply a specific type of light pattern at the back.
  • Not doing servos would save me a ton of work and let me focus on the interaction aspect with the users and the light patterns.
  • DECISION: No servos. With the time I’m taking off in Colombia, I’m cutting it short as it is. I’ll just focus on the blinds interactions.
  • Decision: No gears. It’s a little late in the game for gears.

LED patterns

  • I’m not sure what pre-programmed patterns should appear on the board. During playtesting I manually programmed the Arduino to display different sets of NeoPixel colours. Mostly I had the board be two different colours, which was sufficient to display the light/shadow effects of the blinds on the surface. People commented that they had the urge to change the colours as they changed the blinds.
  • Should the colours be user-controlled? No, it would be too complicated to give them full control, but I could give them partial control, or at least add a button for some change to happen, even if they don’t control the parameters of that change.
  • Solution: users will have partial control, with one button to change the colour. However, I will define the patterns the user changes to. This is a mix of half control and half spontaneity.

Decisions about Blind Layers and Shapes.

  • Coming into play testing, I had reservations about the original design, which was two layers of blinds. The one-layer image is so much crisper, clearer and more vibrant than the two-layer image, because the distance from LED to projective surface is smaller. Yet it lacks a certain degree of complexity that the two-layer image has, despite the latter being more faded. For sheer colour display considerations, I think I will make it one layer.
  • It was also suggested that I use different-shaped blinds, not just rectangles. Maybe circles or other shapes. The shapes of the two layers could also be different. CONCLUSION: run an experiment with a second layer of objects.

Siding material:

  • Someone suggested that there should be acrylic on the sides of the boxes so we can see colours on different sides of the box.
  • This brings up different canvas possibilities.
  • Solution: I think it would be interesting to have the sides of the box show light reflecting at different angles. The top and bottom of the box will not show this, so they should be wood, while the sides are acrylic.

Where to put the Arduino?

  • For playtesting, I had the NeoPixels attached to a nearby UNO, but I would like the controller to be built into the piece, ideally in the back of it.

Production Schedule:

  • Make the final enclosure: enclose the Teensy, acrylic (potentially I could use some pieces I already have), wood
  • Do blinds experiments, decide upon the final config, and make the final blinds: wood? metal? other pieces of vellum paper? (could really work)
  • Program light interactions with user

Bill of materials

I already have most of my materials, including:

  • Thick acrylic projective front panel
  • LED board
  • standoffs that make the entire structure, as well as screws
  • Other pieces of acrylic for the sides

I will need to decide on:

Material for the final blinds (maybe balsa wood?). Of course, this relies upon my experiments with different types of blinds.

I will need to get: wood for the top and bottom (or maybe it should just be acrylic all the way?).

The Beauty of Incorrectness

As I work on the Rembrandt, I notice that it’s the first time I am fully using the drawing functions I wrote all the way back last spring. Like, I’m really learning to use them, paint with them, debug them. I used them to an extent with the stuff I did over the summer, but doing figurative stuff really takes them out on the town.

What I notice most is how some functions have mistakes in them and don’t display the pixels in the way intended. But they sort of do the gist of it and display it in a unique way.

Sometimes it’s these mistaken functions that have the most value, that add uniqueness and esthetic.

ICM Final (Concept – Nov 8)

I got my idea for the final from a conversation with Dano about the power of recombinatory media. Ones and zeroes make binary code. Movable type led to Gutenberg’s revolution. So far, film has not been combinable in the traditional sense — my ICM final is an attempt to address this challenge.

For the final, I will be using P5.JS to bring to life theories contained in Joseph Campbell’s famous book The Hero With A Thousand Faces.

The Hero’s Journey in modern film:

The idea:

  • video art project consisting of YouTube video clips from movies
  • clips separated based on which part of the Hero’s Journey they belong to (Separation, Initiation, Return)
  • sound to be movie music with the same structure built into it
  • Video and audio selected at random to play in an infinite loop. Movies will have no audio, and their corresponding “natural” soundtracks will not play a role.
  • The hope is to create a piece that makes sense within the Hero’s Journey narrative structure, even if none of the component parts make sense on their own.

Rembrandt Part 3: Making the Rembrandt Image

Building off of my PComp work getting the Rembrandt image onto the big board, I will stop at nothing less than creating a fully articulated piece on the big board.

Last week’s PComp project allowed me to reproduce, literally, the resized Rembrandt image on the big board. Looking at the blurry image brought me to some key realizations:

  1. Mapping image data to the LED matrix from P5JS over serial creates very different images than manual construction (manually programming loops and shapes). I am happy about this.
  2. RGB values that produce colours on the screen are different from the RGB combinations that produce similar effects on the big board.

FIXING PAST ERRORS

In last week’s program, despite it working, there were still fundamental errors.

First: while the image would appear on the board, I could never bring the first two LED strips under control. They would always escape control.

Second: if I sent the data twice, the image would turn up disjointed. The first time was fine, but the second time was not.

When I saw Danny in his office hours this week, I brought these errors to him and he gave me some good pointers:

-instead of using get() in P5 to get the pixel array, use loadPixels(), which populates the pixels[] array with RGBA values. (lol… wow, very simple)

-set baud rate higher if transmission seems slow. He assured me the Due is fast enough to do video.

-don’t use a string, but bytes. Previously I was using parseInt() on the Arduino side, which was parsing comma-separated values. The room for error is higher since the arrays need to be so precise — this may be causing the error.
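Taken together, the first and third pointers might look like this sketch on the P5 side (the function name is mine): loadPixels() fills pixels[] with four bytes per pixel (R, G, B, A), and only the RGB bytes get sent over serial as raw bytes rather than a comma-separated string.

```javascript
// Sketch of the pointers above (function name is mine): p5's loadPixels()
// fills pixels[] with four bytes per pixel (R, G, B, A); the board only
// needs RGB, so the alpha byte is dropped before sending raw bytes.
function pixelsToRgbBytes(pixels) {
  const out = [];
  for (let i = 0; i < pixels.length; i += 4) {
    out.push(pixels[i], pixels[i + 1], pixels[i + 2]); // skip alpha at i + 3
  }
  return out; // e.g. serial.write(out) with p5.serialport
}
```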

First problem: how to extract an array from serial.

With this code:

if (Serial.available() > 0) { // if there’s serial data available

  for (int i = 0; i < 8960; i++) {
    all[i] = Serial.read(); // read one byte into the frame buffer
  }
} // close the if block

I get this error from the P5.Serialport program.


The Carl Jung Data Project (ON HOLD)

During the intense period that was the week of both Fab and Video finals, as well as my girlfriend Lina visiting me, I let ICM fall by the wayside. This was the week of Data. As a result, I did not learn data well enough — that’s about to change!

This week I stumbled upon the Carl Jung correspondence dataset. It’s in CSV format, so I’m going to see what I can do.

Things learned from this assignment:

-CSV files consist of strings