Understanding Networks: Assignment #1 Ball Game

The ball drop game took some doing but I managed to figure it out.

Most difficult was figuring out how to link up the Arduino sketch to the Processing sketch and the ITP Sandbox server. To be honest, I’m still not sure how it all works. All I know is that I took the BallDropServer IP address from the Sandbox, put it into the Arduino sketch and the BallDropClient sketch, and it started working! (See video above.)

Instead of a joystick, I used two potentiometers for the X and Y axes respectively. It had been a while since I had used both a button and pots, so I tested each separately in an isolated Arduino sketch.
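For the record, here’s roughly what that isolated test sketch looked like; the pin choices (A0 and A1 for the pots, digital pin 2 for the button) and the pull-up wiring are my assumptions rather than the exact setup in the photo.

// Isolated test: read the two pots (X and Y) and the button, print to serial.
const int xPotPin = A0;     // assumption
const int yPotPin = A1;     // assumption
const int buttonPin = 2;    // assumption

void setup() {
  Serial.begin(9600);
  pinMode(buttonPin, INPUT_PULLUP);   // button wired to ground, so pressed reads LOW
}

void loop() {
  int x = analogRead(xPotPin);        // 0-1023
  int y = analogRead(yPotPin);        // 0-1023
  int pressed = (digitalRead(buttonPin) == LOW);

  Serial.print(x);
  Serial.print(",");
  Serial.print(y);
  Serial.print(",");
  Serial.println(pressed);

  delay(100);   // slow the output enough to read in the Serial Monitor
}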

Here’s a photo of the way the Arduino was connected:

SERIAL JOURNAL

So I just got back to NYC last week from 3 weeks in Canada. The first week was in Montreal for the Chromatic Festival, May 26 – June 2, where I showed the Digital Rothko patterns.

Now that I’m back, I have a summer of work to do. First on my to-do list: finally get a handle on serial communication to the lightboard. By this I mean being able to send a JPEG image from P5JS over serial to the Arduino, which then spits out the pattern on the NeoPixel matrix. This is the key to my mastery of this board, and the key to the next piece I have in mind.

Right now the status is that I am trying to get a one-pixel red GIF image from P5JS to display on the very first LED of the matrix, at position 0. The serial port is open, and the Arduino takes in commands from the Serial Monitor, but it doesn’t seem to be showing what I am sending through P5…

June 20: All I’m trying to do is enter a value into the Serial Monitor and have the LED reflect that value in brightness. So if I enter 1, I will get a very dim white LED at 1,1,1.

Notes:

-when you echo the char fromSerial back, everything you type in the Serial Monitor shows up as you type it.

-P5 serial.readLine() reads a string until it sees a newline character, in contrast to serial.read(), which reads a single byte at a time.

-the atoi() function converts a char array (a C string) to an int. It comes from the standard C library, and there’s not much info on it in the Arduino reference.

-BREAKTHROUGH (Clue): extracting int values from ASCII chars (i.e. I type in 2 and it becomes the int 2, not 50, its ASCII code) is a common discussion topic, which means many people have run into this problem and asked this question before. https://stackoverflow.com/questions/5029840/convert-char-to-int-in-c-and-c

-BREAKTHROUGH: int character = fromSerial - '0'; that’s what solves this whole thing. Now I can type 1 in the Serial Monitor and the LED is at level 1. AMAZING. (A minimal sketch of this is just after these notes.)

-now I need to create a system that takes serial input of one to three digits, covering the range of possible values between 0 and 255. Hmmmm
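A minimal sketch of that June 20 breakthrough. The plain LED on PWM pin 9 driven with analogWrite() is my stand-in for the NeoPixel; the char-to-int step is the same either way.

const int ledPin = 9;   // assumption: a regular LED on a PWM pin instead of the NeoPixel

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    char fromSerial = Serial.read();
    if (fromSerial >= '0' && fromSerial <= '9') {
      int level = fromSerial - '0';   // the breakthrough: '1' (ASCII 49) becomes the int 1
      analogWrite(ledPin, level);     // typing 1 gives a very dim LED, 9 slightly brighter
      Serial.println(level);          // echo the converted value for confirmation
    }
  }
}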

JUNE 22 (Friday): The process is as follows:

-serial.read() gives single chars

-these chars can be made into ints

-these ints can be concatenated into a string

-this can be mapped onto the LED (see the sketch right below these steps)
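A minimal sketch of those four steps, assuming each value is typed as digits and finished with a newline (Serial Monitor line ending set to Newline). It accumulates straight into a number rather than literally building a string first, which comes to the same thing; pin 9 with analogWrite() again stands in for the NeoPixel.

const int ledPin = 9;      // assumption: plain PWM LED standing in for the NeoPixel
int incomingValue = 0;     // the number being assembled from incoming digit chars

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    char c = Serial.read();                        // step 1: single chars
    if (c >= '0' && c <= '9') {
      int digit = c - '0';                         // step 2: char made into an int
      incomingValue = incomingValue * 10 + digit;  // step 3: digits joined into one number
    } else if (c == '\n') {
      int level = constrain(incomingValue, 0, 255);
      analogWrite(ledPin, level);                  // step 4: mapped onto the LED
      incomingValue = 0;                           // reset for the next value
    }
  }
}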

PROBLEM: Now things are appearing in the serial monitor in differing orders. I think this is at the heart of the mix-ups with the input. Right now, I’m trying to map the values onto a 3-item array.

BREAKTHROUGH (June 22 afternoon):

-able to input three numbers separated by commas and have them mapped onto the first LED pixel.

THEN: -able to input a series of these three-number groups and have them light up LEDs 1, 2, 3, etc. consecutively and accurately. (A sketch of this version follows.)
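A minimal sketch of that afternoon’s version, assuming an Adafruit NeoPixel strip on pin 6 and leaning on Serial.parseInt() for the comma parsing; the pin, the strip length and the parseInt() shortcut are my assumptions, not necessarily what my actual sketch did.

#include <Adafruit_NeoPixel.h>

const int PIXEL_PIN = 6;       // assumption
const int PIXEL_COUNT = 100;   // assumption
Adafruit_NeoPixel strip(PIXEL_COUNT, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

int pixelIndex = 0;            // each new R,G,B triplet lights the next LED along

void setup() {
  Serial.begin(9600);
  strip.begin();
  strip.show();                // start with everything off
}

void loop() {
  // expects lines like "255,10,0" followed by a newline
  if (Serial.available() > 0) {
    int r = Serial.parseInt();
    int g = Serial.parseInt();
    int b = Serial.parseInt();
    if (Serial.read() == '\n') {
      strip.setPixelColor(pixelIndex, strip.Color(r, g, b));
      strip.show();
      pixelIndex = (pixelIndex + 1) % PIXEL_COUNT;   // LED 0, 1, 2, ... consecutively
    }
  }
}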

PROBLEM: Unable to verify whether any serial information is being received from P5.

RECTIFIED: It was just a matter of fiddling with P5 serialport and opening up the port.

PROBLEM: P5 should be providing data in the correct format A,B,C, but somehow the Arduino sketch is not responding to it, despite me verifying that Serial.available() is working.

JUNE 29: Got the thing working, somewhat. I can grab the pixel values from a pixel image in P5JS using a one-item array, and then serial.write that three-value data through P5 serial control. I receive this on the Arduino, have it show on the LED display, then spit the values back to P5, which I see in Chrome’s console. HOWEVER, the issue remains that this interaction only happens once in a while and the system of transferring data is still unstable. I don’t know if there is anything I can do to make it stable, or if this patchwork of P5 serial control is a workaround that is not meant to be stable.

loadPixels(): from this Processing article, it seems like loadPixels() fills the pixel array starting from the top left, going right across each row, then down to the next row. (So in P5, where each pixel takes four array slots for R, G, B and A, the red value of the pixel at column x, row y of a w-pixel-wide canvas sits at index 4 * (y * w + x), assuming pixelDensity(1).)

PROBLEM: I am trying to display this 10×10-pixel chessboard pattern, but the data coming out of loadPixels() seems to not be in the right order. After further experiments with loadPixels(), it seems that it does not present the pixel information in the right order…

(SOLVED) PROBLEM: loadPixels() gives you an RGBA value for each pixel, which means there are 4 array elements for the first pixel alone. However, for a 100-pixel picture, I only get 100 array elements, which means only the data of 100/4 = 25 pixels. What’s up with that? SOLVED: it was simply a matter of adjusting createCanvas(), because it was still at 1,1 from when I was doing just one pixel.

(SOLVED) PROBLEM: now that I’ve adjusted the canvas and the image size to match (10×10), somehow there are TOO many array elements: pixels.length gives me 10,000. Not sure why, since if there are 4 elements per pixel and 100 pixels, that would be at most 400. So…. SOLVED: the problem was pixelDensity(), which puts more pixels on the screen than the image has, and P5 automatically scales the pixels array to the display density (so pixels.length is 4 × width × density × height × density). Got rid of this problem by setting pixelDensity(1). Now pixels.length is 400 and all is right with the universe again 🙂

(SOLVED) PROBLEM: Uncaught TypeError: pixels.splice is not a function. I’m trying to splice the ‘A’ out of the RGBA of the pixels array, i.e. every 4th element, but it’s not recognising splice as a function. Weird, because I just made a separate array in the same program and splice works just fine. Also weird is that pixels.splice seems to crash the entire program, and no lines of code below it are executed.

My working hypothesis at this point is that there must be something different about the array that loadPixels() creates compared with a regular array I can create with var array = [blah0, blah1, blah2]. (Likely culprit: P5’s pixels is a typed array, a Uint8ClampedArray, and typed arrays don’t have splice().) Proposed solution is to find a workaround, since there’s not much else I can do. I dunno, maybe the people who made P5 decided that splice shouldn’t be something you do to the pixels array. Anyways, I’ll make a new array of 300 elements and take out elements 4, 8, 12, etc. of the pixel array. I just need a mathematical way of doing this… SOLVED: I had to duplicate the array into a separate array, not with array.slice but just the manual way with a for loop. THEN, with the new array, P5 allowed me to do splice. YES!

PROBLEM: I am attempting to gain a degree of stability in the serial interactions. I gained some of that by having the Arduino serial.write something inside the Serial.available() block that spits it back out to the Chrome console through P5’s serialEvent function. However, no matter what I serial.write, it keeps showing up as “255” when P5 does a serial.read. Not sure why…

JULY 2: PROBLEM: Still stuck on the same thing as yesterday. The key to this whole thing is figuring out why P5’s serial.read spits out a 255 from the serial data the Arduino kicks back…

This problem is pretty tough. I’m not sure which parts of the code are actually being written to serial by the Arduino. Specifically the part that tells the program “if you see a comma, do this”: I’m not sure that part is even running. So far I’ve written a small flashLED() function which flashes the LED.
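A minimal sketch of the flashLED() debugging idea: blink the built-in LED instead of relying on Serial prints, so I can tell whether the comma branch ever runs. The use of LED_BUILTIN and the flash timing are my assumptions.

void flashLED(int times) {
  for (int i = 0; i < times; i++) {
    digitalWrite(LED_BUILTIN, HIGH);
    delay(100);
    digitalWrite(LED_BUILTIN, LOW);
    delay(100);
  }
}

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == ',') {
      flashLED(2);   // two quick blinks = the comma branch actually executed
    }
  }
}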

Having huge problems even reading serial data from P5 on the Arduino now… not sure where the problem is located, or if it’s just P5 serialport.

NeoPixel library too slow?

The Digital Rothko splotches are just not fading in fast enough, and things seem to really slow down in the IDE’s serial monitor. This is after I simplified the number of operations as much as possible by reworking them out of the drawcirclemaster, drawcircle and circle functions. So then I just did basic shapes, two rectangles and four lines, which should be faster, right? WRONG.

It’s simply too slow. It feels like the time it takes to do all those setPixelColor() calls is the bottleneck. Going to look into the FastLED library now and see if I can rebuild the sketch I showed at the ITP Spring Show on top of it, stat.
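For later: a minimal FastLED skeleton to compare against, assuming a WS2812B matrix on pin 6 with 100 LEDs; the pin, LED count and GRB colour order are my assumptions, not the actual board’s wiring.

#include <FastLED.h>

#define DATA_PIN 6     // assumption
#define NUM_LEDS 100   // assumption

CRGB leds[NUM_LEDS];

void setup() {
  FastLED.addLeds<WS2812B, DATA_PIN, GRB>(leds, NUM_LEDS);
}

void loop() {
  // FastLED's equivalent of setPixelColor() + show():
  // write into the leds[] array, then push the whole frame out at once.
  for (int i = 0; i < NUM_LEDS; i++) {
    leds[i] = CRGB(10, 10, 10);   // dim white test colour
  }
  FastLED.show();
}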

Lag on Uploading Sketches to Arduino Due

I’ve always noticed a lag when uploading to the Arduino Due while it’s connected to the LED board: a super long delay from pressing the “upload” button to seeing anything happen in the serial monitor or on the LED board. This can be 30 seconds to minutes, and I don’t know why. With the UNO it’s generally less than a second. The Due is the most powerful Arduino money can buy, yet here it’s super slow. Why?

Time calculations for figuring out color transitions in Digital Rothko.

39 seconds from “done uploading” to the following happening on the Serial monitor:


Understanding getPixelColor() in the NeoPixel Library

According to Adafruit, the function getPixelColor() returns a “32-bit merged color value”

When I set it to fetch the value for a WS2812B pixel set at 10,10,10 and println it, the monitor shows the value to be 657930. This is the 32-bit merged colour value.

From an Adafruit forum discussion, I’ve learned that this number is the summation of the following formula:

(red * 65536) + (green * 256) + blue

So sub’ing in my 10,10,10 to confirm this formula:

(10* 65536) + (10 * 256) + 10 = 655360 + 2560 + 10 = 657930

Excellent. Now we’ve found bedrock.

Now the question is… can I extract the RGB values just by knowing the 32BMV?

From a math standpoint, it seems like ascertaining 3 unknown variables from a single equation is impossible. BUT since I am making the yellow using only one random value for red, and then dividing that value by 3 to make the green that mixes with the red to get the yellow, it is possible. GENIUS!

Therefore, applied to the Digital Rothko sketch, it becomes:

(red * 65536) + (red/3 * 256) + 0 = 32BMV

red = 32BMV / 65621.33
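Side note (mine, not part of the Rothko derivation): since each channel sits in its own byte of the merged value, the general case is also recoverable exactly with bit shifts and masks, no division needed. A quick check against the 10,10,10 value from above:

void setup() {
  Serial.begin(9600);

  uint32_t merged = 657930;               // the value getPixelColor() gave for 10,10,10

  uint8_t red   = (merged >> 16) & 0xFF;  // 10
  uint8_t green = (merged >> 8)  & 0xFF;  // 10
  uint8_t blue  =  merged        & 0xFF;  // 10

  Serial.print(red);   Serial.print(",");
  Serial.print(green); Serial.print(",");
  Serial.println(blue);                   // prints 10,10,10
}

void loop() {
}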


Serial Understanding Experiments

The following code is from https://www.arduino.cc/en/Serial/Read:

int incomingByte = 0; // for incoming serial data

void setup() {
  Serial.begin(9600); // opens serial port, sets data rate to 9600 bps
}

void loop() {
  // send data only when you receive data:
  if (Serial.available() > 0) {
    // read the incoming byte:
    incomingByte = Serial.read();

    // say what you got:
    Serial.print("I received: ");
    Serial.println(incomingByte, BIN);
  }
}

----

FOR BIN, typing 1, 2, 3 leads to:

I received: 110001
I received: 110010
I received: 110011

For OCT, the same input gives: 61, 62, 63

For HEX: 31, 32, 33

For DEC: 49, 50, 51

(49, 50, 51 are the ASCII codes of the characters ‘1’, ‘2’, ‘3’, which is exactly why the fromSerial - ‘0’ trick from the journal above works.)

Serial.print(x, DEC);  // print as an ASCII-encoded decimal
Serial.print(x, HEX);  // print as an ASCII-encoded hexadecimal
Serial.print(x, OCT);  // print as an ASCII-encoded octal
Serial.println(x, BIN);  // print as an ASCII-encoded binary

XYZ / Final Documentation

Note: this final XYZ documentation is written in the style of Canadian diplomatic reporting. It’s something I’ve taken from my former career and is what I turn to when I need to thoroughly explore and clarify my own thinking about something. What follows is a total account of my XYZ final.

SUMMARY: Sometimes things that start badly end well; sometimes things that start well end badly. For the Noodle of Death team, what began with enthusiasm and smiles ended in disappointment and likely broken professional relationships. The robot itself works fine and is engaging and interactive, but that only tells part of the story.

REPORT: 

2. Re-Cap of Concept: The Noodle of Death was an idea I had based on my study of martial arts. Specifically, it was based on an exercise developed by Ido Portal, who used it to develop a person’s movement capabilities, as well as on my experiences training with pool noodles in boxing. The title, initially suggested by Shrieya as a bit of fun, was actually at odds with our project’s real goal: to facilitate life.

3. Division of Duties and Workflow: We decided to use the Make-Block Plotter (MBP) after discussion with Ben and consideration of other options. We agreed it would be the quickest, cheapest, easiest way. Our team was initially ahead of the other groups because of our clarity of concept, and because our early experiments with the MBP had yielded solid conclusions about the way the MBP moved and about how the noodle would need to be secured. See below for how the MBP originally moved, from which we concluded that we had to mount it on some kind of stiff board secured to the ceiling rafters.

4. After this we divided the workflow: I would do the noodle, Tony would do the board structure to secure the MBP to the ceiling, and Shrieya would work with the G shield on the code. Comment: In hindsight, I wanted to leave XYZ with a skill set in using code to operate an XYZ machine. I should have made this need heard in the group and advocated for more involvement with the coding part of it.

5. The thing came together fairly well. See below for documentation of the final setup of the robot:

Final robot with board attached to speedrails, which we hung from the rafters. Its weight was adequate, so it did not move from the rafters.

Attached to the ceiling with noodle contraption. PVC pipe from Home Depot + foam insulation noodle:

The key difficulty for my part, making the “noodle of death”, was that the unit kept detaching from the stepper shaft coupler after a few rotations like this:

6. The Start of Troubles: This is where our troubles began. I asked Shrieya to show me how to adjust the Z rate in order to fix the problem of the noodle falling off, but she seemed too busy to show me. She was occupied with several other side XYZ projects, despite it being two days to the final, and there was not yet a simple interface where a user could control the noodle, since she still had to write the Processing sketch. Therefore there was no recourse but to move it with G-code through GRBL.

7. Despite her assurances, I also did not fully trust that Shrieya would have this done on time, because of subtle signals she had given which, in my view, were the result of insufficient communication. However, what led to an erosion of confidence was when she declined to take the time to discuss how to move the Z. It is possible that I did not articulate why I needed to move the Z and she felt I was encroaching on her domain of responsibility; perhaps I needed to impress upon her that moving the Z was critical to solving the problem of the noodle falling off the MBP. That possibility considered, I rule it out, because I would unquestioningly help another group member if asked. For me, a basic courtesy was not shown, nor was time given to deal with the situation. Instead, I was just pushed aside. Comment: In hindsight, a timeline with expectations for when each feature would be completed should have been set and strictly adhered to.

8. I successfully figured out how to change the Z rate with GRBL, using the $112 setting to reduce the rate from 400 to 20. This reduced the sound coming from the stepper from a strain to something more pleasant. I found it on this GRBL feed rate website: http://www.diymachining.com/grbl-feed-rate/

9. With this, the robot was complete. Shrieya did a Processing sketch and we played around with it earlier on. That said, it was not truly “complete”, since the complete version of the robot would include an interface where the User could control the robot against the Player. At this point, Shrieya settled on having a Processing sketch run a pre-determined set of G-code commands, which completely changes the interaction from something dynamic to something routine. However, happy that we were done enough to show at the presentation, I did not raise any complaint. Here is a video of my interaction with the final robot in action a few hours before our final presentation:

10. Spring Show Let-Down: It was ultimately my fault that we didn’t get into the Spring Show, since I selected the wrong venue option during the application process. This was unfortunate, and my group members were very displeased with me. I think, however, that some of this anger is misplaced: after all, I did send them the link an hour before submission closed (i.e. no one bothered to check it), and I find it a little unjust that admission to the show would be decided by such a technicality, especially as we did submit on time. Still, I take the majority share of the blame. This was disappointing, but things went further south from here.

11. From Cracks to Fissures: From early on, Shrieya and I had a good working relationship and friendship. This friendship was the basis of our collaboration. However, with the incident over the Z feed rate fresh in my mind, Shrieya let us know she would be away from the 8th of May up until the Spring Show. In the event we got into the show, she wrote that she would just write a bunch of G-code files we could cycle through on the 14th. I responded and said that I would prefer she hand the coding over to Tony and me while she was away, and that we would continue the work in her absence. I certainly did not want a bunch of pre-programmed patterns instead of an actual user interface for the show. She responded that she would be OK doing this, but said that she would not be doing a P5 interface anymore. I responded by voicing, for the first time, that I found it unacceptable that the final robot did not have even a basic user interface, and that I felt she had taken on unnecessary side projects at the cost of this one. She did not respond to this email. A few days later we spoke about this to clear the air; while I don’t feel it right to document the contents of that conversation, I must say that what was initially meant to be a confidence-building conversation became antagonistic, and the state of relations between Shrieya and me declined significantly. [Updated Comment May 14: Upon reading a book called Difficult Conversations, which talks about moving from a blame mindset to a contribution-mapping mindset, I see how both Shrieya and I contributed to this state of affairs, and how things could have been done differently. I will take these valuable lessons into future projects.]

12. Collaboration Lessons Learned: I count the Noodle of Death as a valuable learning experience on XYZ robots, but mostly on how collaboration can go wrong when expectations and roles are not clearly spelled out. While we did clearly spell out the roles, those distinctions were insufficient to the task at hand. Combined with the fact that I did not articulate my concerns earlier on (about wanting to do more coding, about Shrieya’s secondary projects, about what I expected for the final interface) and with team members’ non-communicative tendencies (the Z rate incident), this whole episode is a glaring reminder that hard discussions need to happen at all stages, and that I should see past the laughter and good times to the structure of the interaction itself. I was blinded by how amusing the noodle was and failed to see the decay underneath. Perhaps my putting in the wrong staging area was an unconscious action that revealed my dissatisfaction with this project. In any event, we did not get into the show, but I can honestly say I don’t feel too badly about it, other than letting my group members down, Tony especially, since he performed his group role admirably.

13. I recently purchased a book entitled How to Have Difficult Conversations. I will be reading it over the summer and reflecting on this invaluable learning experience.

XYZ / Z axis breakthrough

using this: http://www.diymachining.com/grbl-feed-rate/

I found that the $112 value in GRBL (the Z-axis max rate) controls how fast the noodle rotates. It was previously at 500, which spun way too fast for the noodle contraption. Taking it down to a value of 40 let it spin without stressing the stepper or the noodle contraption. (The exact console commands are just below.)
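For future reference, changing the setting is just a matter of typing the assignment into the G-code sender’s console. The notes in parentheses below are my annotations for this post, not part of what gets typed:

$$
(lists all the current settings; the dump below is what that output looks like)

$112=40
(sets the Z-axis max rate to 40 mm/min, the value that worked for the noodle)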

That link also had this super useful annotated list of GRBL settings (the output of the $$ command):

**** Connected to COM3 @ 115200 baud ****

Grbl 0.9j ['$' for help]
>>> $$
$0=10 (step pulse, usec)
$1=25 (step idle delay, msec)
$2=0 (step port invert mask:00000000)
$3=3 (dir port invert mask:00000011)
$4=0 (step enable invert, bool)
$5=0 (limit pins invert, bool)
$6=0 (probe pin invert, bool)
$10=3 (status report mask:00000011)
$11=0.010 (junction deviation, mm)
$12=0.002 (arc tolerance, mm)
$13=1 (report inches, bool)
$20=0 (soft limits, bool)
$21=0 (hard limits, bool)
$22=0 (homing cycle, bool)
$23=0 (homing dir invert mask:00000000)
$24=25.000 (homing feed, mm/min)
$25=500.000 (homing seek, mm/min)
$26=250 (homing debounce, msec)
$27=1.000 (homing pull-off, mm)
$100=314.960 (x, step/mm)
$101=314.960 (y, step/mm)
$102=78.740 (z, step/mm)
$110=800.000 (x max rate, mm/min)
$111=800.000 (y max rate, mm/min)
$112=350.000 (z max rate, mm/min)
$120=10.000 (x accel, mm/sec^2)
$121=10.000 (y accel, mm/sec^2)
$122=10.000 (z accel, mm/sec^2)
$130=200.000 (x max travel, mm)
$131=200.000 (y max travel, mm)
$132=200.000 (z max travel, mm)
ok