# Week 1 Journal

In the second chapter of *The Art of Interactive Design*, Crawford asks, why bother with interactivity? Obviously such a bold question doesn’t have a simple answer. Crawford makes a number of logical arguments on the topic: it works, it’s groundbreaking, and it allows us to use computers to their full potential. Sure, we want to take advantage of everything computers can do, but more importantly, we should be thinking about making the most of everything we as humans can do. Every aspect of our bodies and senses can play a role in responding to and controlling systems relevant to our needs. We can process feedback in incredibly complex ways and use it to iteratively change, and hopefully improve, our environments through voluntary action or even biofeedback. Tactile feedback is one of our most important and under-utilized assets for interacting with computers and other physical systems. Vision is already crucial for interacting with the screens that fill our lives, but more prevalent use of auditory feedback, for instance, would allow us to experience the world without screens as intermediaries, responding to elements of our omnidirectional environment based on distance, azimuth, and elevation when the need arises. While a future that looks like the one Victor rants about is undeniably sleek, we as people are more than fingers and eyes, and it is up to today’s programmers and engineers to shape our technology accordingly.

# Week 2 Journal

Though drawing a still work of art in Processing was a valuable learning experience for the first week of class, I had to see examples of interactive animation at work to get an idea of the true potential of the program. For our animation assignment I was originally intent on creating a visual I’m not sure if I dreamed or actually experienced: an image of the world blurred through the cell walls of your eyes, a grid of rounded, complacently meandering shapes that just loosely fit together. After working on the implementation for some time, I realized that this idea wasn’t going to come together the way I hoped at that stage of my experience with Processing.

So I began thinking about designs for a clock instead. I had already been mulling over ideas for my final Fourier series project, so I wanted to prepare by incorporating shapes rotating around each other into my design. The plan I eventually decided upon involved nested circles, with a “second hand” that moved around the perimeter of the outermost circle, a minute hand that sat within the second hand, and a dot for the hour inside the minute hand. I chose for the second hand to be the largest circle to create the most movement within the design. I used the second(), minute(), and hour() functions to control hand movement, converting their output to radians and wrapping at 2π with a modulo. Mouse position along the x- and y-axes dictated circle sizes while mouse speed determined their color. Playing around with this sketch for a minute or two can create some really beautiful designs, but it can be a problem when it’s that much fun staring at the clock watching time pass.
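The time-to-angle conversion behind the hands can be sketched outside Processing. This is a minimal Python stand-in for the mapping (the function and variable names are mine, not the sketch’s): each unit of time is scaled to a fraction of a revolution and expressed in radians, wrapping at 2π.

```python
from math import pi

TWO_PI = 2 * pi

def hand_angles(h, m, s):
    """Convert a clock time to hand angles in radians, wrapping at 2*pi.

    Mirrors mapping Processing's second(), minute(), and hour() output
    onto circular motion: 60 seconds (or minutes) per revolution,
    12 hours per revolution.
    """
    sec_angle = (TWO_PI * s / 60) % TWO_PI
    min_angle = (TWO_PI * m / 60) % TWO_PI
    hour_angle = (TWO_PI * (h % 12) / 12) % TWO_PI
    return sec_angle, min_angle, hour_angle

# 15 seconds past the minute puts the second hand a quarter turn around
print(hand_angles(3, 30, 15))
```

In the sketch itself these angles would feed directly into cos() and sin() to place each circle along its parent’s perimeter.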

# Week 3 Journal

The inspiration for my Sol LeWitt line generator came from his wall drawing instructions for the School of the MFA Boston. LeWitt instructed students to draw 50 randomly placed dots and connect them with straight lines. Fifty points seemed like it might be overkill for this assignment, so instead I decided to write a program for drawing five dots in random locations and connecting them. Though the white dots on a black background look identical, each is numbered within the code in the order it was generated. Because the dots are connected sequentially, not by any visible factor such as location on the x-axis, the pattern in which they will be connected is completely unknown to the audience. I created a function that draws the lines over time, so they gradually extend from start point to end point. The final design of each sketch thus becomes a more engaging experience as the user sees the artwork unfold in front of their eyes.
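The gradual extension of a line amounts to linear interpolation of its endpoint over time. A small Python sketch of the idea (the helper name is hypothetical; in Processing this is what lerp() does per coordinate):

```python
def extend_line(x1, y1, x2, y2, t):
    """Current tip of a line growing from (x1, y1) toward (x2, y2),
    where t runs from 0.0 (just started) to 1.0 (fully drawn).
    Equivalent to applying Processing's lerp() to each coordinate."""
    t = max(0.0, min(1.0, t))  # clamp so the line never overshoots
    return x1 + (x2 - x1) * t, y1 + (y2 - y1) * t

# halfway through the animation the tip sits at the midpoint
print(extend_line(0, 0, 100, 40, 0.5))  # (50.0, 20.0)
```

Driving t from the frame count (or elapsed time) makes each line appear to draw itself.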

# Week 4 Journal

I built upon my Sol LeWitt piece the following week by writing the commands for the sketch into classes. I utilized two classes in my approach, one for the randomly placed points and another for the lines that connect them. The point class was quite basic, containing only three variables: the x- and y-coordinates and the diameter of each circular dot. The x and y variables are given values based on random() functions constrained within the width and height of the sketch. An ellipse is then drawn at the determined location and the x- and y-values are returned from the function as floats. These points are created within the setup() function, and variables of the line class are given start and end points based on the x- and y-values returned by the points. The line class constructor first determines how to find the slope and distance between points based on whether the line must be drawn to the right or left. Then, a connect function uses translate() and rotate() to determine the orientation of the line to be drawn, incrementing its length until it reaches the distance between points. These line connect functions are implemented in the sketch’s draw loop to create the artwork of the previous week in a much more cleanly coded way.
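The connect logic can be condensed if atan2() replaces the left/right branching: it returns the rotation angle directly, and the distance comes from hypot(). This Python sketch is an assumption about an alternative structure, not the class as written (all names here are mine):

```python
from math import atan2, hypot, cos, sin

class GrowingLine:
    """Sketch of the connect logic: atan2() yields the rotation angle
    regardless of direction, hypot() the distance between the points,
    and the draw loop increments the length toward that distance."""
    def __init__(self, x1, y1, x2, y2):
        self.x1, self.y1 = x1, y1
        self.angle = atan2(y2 - y1, x2 - x1)   # what rotate() would use
        self.target = hypot(x2 - x1, y2 - y1)  # full distance to cover
        self.length = 0.0                      # grows each frame

    def connect(self, step=2.0):
        """Advance by one frame's worth and return the current tip."""
        self.length = min(self.length + step, self.target)
        return (self.x1 + cos(self.angle) * self.length,
                self.y1 + sin(self.angle) * self.length)

line = GrowingLine(0, 0, 30, 40)  # distance 50
for _ in range(25):               # 25 frames at 2 px per frame
    tip = line.connect()
print(tip)                        # tip reaches (30, 40), within float error
```

The min() clamp plays the same role as stopping the increment once the line reaches the distance between points.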

# Week 5 Journal

For my stupid pet trick, I wanted to create something relevant to both my music technology interests and the types of sensors I would be making use of in my final project. The only thing we had done musically thus far in class was generate static tones with the piezo component of the Arduino experimentation kit. To expand upon this audio functionality, I decided to make a tone generator that would vary frequency based on sensor input. I used a force-sensitive resistor as input since I had originally wanted to utilize the FSR to control shape rotation in my final project. I used the analogRead() function to take input from pin A0. The values were then mapped to a usable range from 100 to 1000 Hz and printed to the Serial Monitor so I could track the output of the piezo. Audio was generated using the tone() function, with arguments for the correct output pin and frequency value. A short two-millisecond delay was also incorporated to allow time for the analog input to be read. The pet trick came together with a literal spin by putting the piezo component of my tone generator into the mouth of the pet wooden frog that hangs out on the windowsill of my apartment.
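The mapping step is Arduino’s map() function applied to the 10-bit analogRead() range. Here is its documented integer formula re-implemented in Python so the frequency scaling can be checked on its own (the constants match the ranges described above):

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    """Integer re-implementation of Arduino's map(), as used to turn a
    10-bit analogRead() value (0-1023) into a tone() frequency in Hz."""
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

# no pressure -> lowest pitch, full pressure -> highest
print(arduino_map(0, 0, 1023, 100, 1000))     # 100
print(arduino_map(1023, 0, 1023, 100, 1000))  # 1000
print(arduino_map(512, 0, 1023, 100, 1000))   # 550
```

Like the Arduino original, this uses integer division, so intermediate values truncate rather than round.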

# Week 6 Journal: Fourier Geometry Debrief

Charles Deluga

Fourier Geometry: A Visual Additive Synthesizer

One fateful day surfing the web this past semester, I came across a visualization of the Fourier series that sparked my imagination (http://bl.ocks.org/jinroh/7524988). Creating a complex waveform from spinning, interconnected unit circles was such an elegant way of understanding additive synthesis. I started wondering, what if you could generate audio from other shapes: unit squares and triangles alongside the conventional circles? Instead of building complex waveforms from individual sinusoids, each geometric harmonic would carry its own unique set of overtones. These geometric harmonics would be so simple to understand in the visual realm, yet the waveforms they create would be extremely difficult to conceive in purely audio terms. When I began this Interactive Media course, I decided that if this undertaking was at all feasible, I would create such a visual additive synthesizer.

Additive synthesis traditionally involves the generation of complex waveforms through the summation of individual sine tones. These sinusoids are typically related in terms of frequency and amplitude. The lowest tone, or first harmonic, is typically referred to as the fundamental frequency. Higher harmonics, or overtones, are generally based on the fundamental at set integer multiples of frequency and fractions of amplitude. For instance, the third harmonic has a frequency three times that of the fundamental and one third the amplitude. Classic waveforms can be created by adding together these harmonics in specific combinations. A sawtooth wave results from summing a fundamental frequency and all its overtones, while a square wave can be approximated by adding together only the odd harmonics.
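The harmonic relationships described above can be demonstrated with a small partial-sum function in Python (names are illustrative): summing sines at integer multiples of the fundamental, each with amplitude 1/n, approximates a sawtooth when all harmonics are included and a square wave when only the odd ones are.

```python
from math import sin, pi

def partial_sum(x, n_harmonics, odd_only=False):
    """Sum sine harmonics with amplitude 1/n: all harmonics approximate
    a sawtooth wave, odd harmonics only approximate a square wave."""
    total = 0.0
    for n in range(1, n_harmonics + 1):
        if odd_only and n % 2 == 0:
            continue  # skip even harmonics for the square-wave case
        total += sin(n * x) / n
    return total

# a square-wave approximation is positive through the first half cycle
# and negative through the second
print(partial_sum(pi / 2, 9, odd_only=True) > 0)      # True
print(partial_sum(3 * pi / 2, 9, odd_only=True) < 0)  # True
```

Adding more harmonics sharpens the corners of the approximated waveform, which is exactly what linking more spinning shapes does in the visualization.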

To create my additive synthesizer, I first needed a functional fundamental frequency. I based my code for this initial step on the sketch “Trigonometry: Fourier circles” by Diana Lange (http://openprocessing.org/visuals/?visualID=137086). This sketch allowed me to create an initial unit circle, display its angular rotation, and create an ArrayList to graph the resulting sine wave.

The original code approached angular rotation in steps of 1/(10*TWO_PI). While this worked for generating a clean waveform, I wanted the frequency of my harmonics to be based on time rather than the frame rate of my sketch. I created a variable for elapsed milliseconds and used this to frame the frequency of a harmonic’s angular rotation in terms of its period. A variable counts through the milliseconds until a harmonic’s period has been reached, then resets back to zero through use of a modulo operator. Position within a cycle is then scaled to a value between 0 and 1, and a variable *angle* represents this proportion in terms of radians. This all takes place within the *move* function of a shape. The *displayCircleAngle* function uses *move* to draw a line that rotates from the center point of a drawn unit circle to its perimeter. The terms used to determine the end point of the line also yield X and Y values from the circle, which set the center point of the harmonic that will move along its border. Once I accomplished this rotation with a circular harmonic, I began experimenting with unit squares.
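The time-based rotation reduces to a few lines. This Python stand-in follows the *move* logic described above (the function name and the idea of passing in elapsed milliseconds explicitly are mine; the sketch reads them from millis()):

```python
from math import pi

TWO_PI = 2 * pi

def harmonic_angle(elapsed_ms, period_ms):
    """Turn elapsed time into an angle of rotation: the modulo wraps the
    clock at each full period, the division scales position within a
    cycle to the range 0-1, and multiplying by 2*pi expresses that
    proportion in radians."""
    period_scaled = (elapsed_ms % period_ms) / period_ms
    return period_scaled * TWO_PI

# with a 1000 ms period: 250 ms in is a quarter turn,
# and 1250 ms in wraps back to the same angle
print(harmonic_angle(250, 1000))
print(harmonic_angle(1250, 1000))
```

Because the angle depends only on elapsed time, the rotation speed stays correct even if the sketch’s frame rate fluctuates.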

Movement around the edges of a square gave me some difficulty because I was working with the radian-based *angle* variable. All eight sections between each corner and the midpoint of each side had to be addressed separately since values needed to be framed in terms of addition and subtraction from the center point. Each eighth of the square was treated as a right triangle, utilizing the tangent function for movement along the triangle’s opposite side. In retrospect, angular rotation based on radians probably wasn’t necessary since I could have used the scaled period variable, *periodScaled*, on which *angle* is based. While the trigonometric functions used make for a constant speed of movement along the outside of a circle, they seem to result in some variability in speed when applied as I did to a square, as evidenced by the subtle curves in my square-based waveforms. In future work on this sketch, I would likely implement a constant-speed square in addition to this variable-speed version. Though there are a number of aspects regarding the math in this square implementation that could be simplified in the future, the complexity didn’t come close to that of creating my unit triangle.
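The constant-speed square mentioned above as future work could be parametrized directly from *periodScaled*, skipping radians entirely. A hedged Python sketch of that variant (not the trig-based version actually in the sketch; names are mine):

```python
def square_point(t, half=1.0):
    """Constant-speed position on the perimeter of a square centered at
    the origin with half-width `half`. t in [0, 1) covers one revolution,
    and each quarter of t traverses one side at uniform speed."""
    corners = [(half, half), (-half, half), (-half, -half), (half, -half)]
    t = t % 1.0
    i = int(t * 4)            # which side we are on
    u = t * 4 - i             # progress along that side, 0-1
    x1, y1 = corners[i]
    x2, y2 = corners[(i + 1) % 4]
    return x1 + (x2 - x1) * u, y1 + (y2 - y1) * u

# an eighth of the way around lands at the midpoint of the top side
print(square_point(0.125))  # (0.0, 1.0)
```

Because position is linear in t along every side, this version would produce perfectly straight segments in the resulting waveform instead of the subtle curves.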

I began planning the final shape of the visual synthesizer by scribbling equilateral triangles on the back of a McDonald’s receipt. I looked online for triangle proportions and plugged in numbers to figure out how to define the vertices of an equilateral triangle based on the location of its center point. But even determining where the center point should be located was a challenge. Using the midpoint of the triangle’s height would mean it could have the same y-axis dimensions as the circles and squares. This wouldn’t work, though, if movement around the shape was to take place at a constant rate and the top of the shape was to be reached a quarter of the way through rotation. The same problem applied to the shape’s visual center, its centroid, located one third of the way up its height. The perimeter of the shape could be divided evenly into fourths, however, by setting the center point at a quarter of the height. Movement along the outside had to be divided into six regions based on every time rotation passed either the center point or a corner. To accomplish this, I created variables based on *periodScaled* that mapped the progress of movement along each section from 0 to 1. Then, the X and Y coordinates had to be defined in terms of the Pythagorean Theorem thought of in terms of 30-60-90 triangles of changing size. If triangles weren’t my favorite shape, I never would have found it in myself to figure out this math. I felt like Dr. Frankenstein once I finally created the unit triangle, having pieced together something beautiful yet utterly unholy.
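The quarter-height center claim can be checked numerically. This Python sketch (my own reconstruction, with y increasing upward rather than Processing’s downward y-axis) places the vertices relative to that center and confirms that the point on the right edge at center height sits exactly a quarter of the perimeter from the apex:

```python
from math import sqrt, hypot

def triangle_vertices(cx, cy, s):
    """Vertices of an equilateral triangle of side s whose reference
    center sits a quarter of the way up its height h = s*sqrt(3)/2."""
    h = s * sqrt(3) / 2
    apex = (cx, cy + 3 * h / 4)
    left = (cx - s / 2, cy - h / 4)
    right = (cx + s / 2, cy - h / 4)
    return apex, left, right

apex, left, right = triangle_vertices(0, 0, 4)
# the point on the right edge at center height is a quarter of the way
# up that edge, so its straight-line distance to the apex is 3s/4
start = (1.5, 0.0)
dist_to_apex = hypot(apex[0] - start[0], apex[1] - start[1])
print(dist_to_apex)  # ~3.0, a quarter of the full perimeter (12 / 4)
```

This is why rotation starting at that point reaches the top of the shape exactly a quarter of the way through a revolution, something neither the half-height midpoint nor the centroid allows.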

Once I could display each shape and its angular rotation, retrieve X and Y values, and link together harmonics, I began incorporating key press functionality into the sketch. Pressing the letters “c”, “s”, and “t” respectively sets the current shape type to circle, square, or triangle. Once a shape type is selected, all harmonics subsequently created will be that shape. Harmonics are created by pressing the number keys, with 1 through 9 corresponding to respective harmonics 1 through 9, and 0 corresponding to harmonic 10. Using if statements, these numbers toggle Boolean variables for the presence of each harmonic based on current shape type. Display of the shapes and their movement is implemented through if statements in the *draw* loop. By defining the outermost X and Y values of the harmonic series within these if statements, harmonics can be linked even when those in between them are absent. These *prevX* and *prevY* values are then used for graphing the waveform and the dotted line showing the current Y location. The delete key removes the harmonic that was last selected with a number key. The spacebar toggles Arduino functionality.

I set up an Arduino circuit with two potentiometers for controlling harmonic frequency and amplitude in the visualization. Code for the circuit was built from the VirtualColorMixer Arduino example. Analog read functions pull in the values from these sensors and serial communication is established to send these values to Processing. Once the sensor values have been brought into the sketch and separated, they are mapped to ranges that make sense for the parameters of frequency and amplitude. Since the potentiometers introduce some noise into the visualization, I incorporated the spacebar key press function to toggle whether or not the Arduino values play a role in the Processing code. Pressing spacebar means the potentiometer values act on the last harmonic to have been selected, so frequency and amplitude can be adjusted individually for each shape. Deleting a harmonic and reinitializing it resets its frequency and amplitude relative to the fundamental. This way, the Arduino can be used for modifying parameters but one can easily return the harmonic series to the classic sawtooth and square wave approximations. Resetting these parameters happens when pressing the delete key rather than when selecting the harmonic, so harmonic shape can be changed without affecting any modifications made using the Arduino.

This Arduino-based real-time parameter adjustment was the final step in the current functionality of this visual synthesizer. Once it had been incorporated into my code, I went about cleaning up unused variables and commenting sections more thoroughly. I considered incorporating some bright colors and visual effects into the waveform graph but eventually opted for the clean black and white look, not wanting to detract from the effort put into this project with subpar decoration. I did add a title to the presentation of the sketch though, using a custom font just to make things more difficult for people.

All in all, I’m proud of how this project turned out, especially given the shortened time frame of this class, and I have tons of ideas for how to improve and expand upon this work in the future. My main goal is to send the Y values of the graphed waveform into Max, scale their values to an audible range, interpolate between values, and generate audio output from this visual synthesizer. If this turns out well, I would love to incorporate more complex behavior of the harmonics, such as rotation of the shapes themselves, crossfading between shapes, stretching shapes, or the ability to draw your own harmonics. Maybe someday I could even create dedicated hardware for toggling and manipulating harmonics and inputting MIDI. This project was a great learning experience and writing this explanation of my process has really helped me clarify some of my ideas, along with hopefully making up for the thus far minimal amount of writing on my blog. I’d like to thank you for teaching such an excellent course and I’m excited about all the directions I’ll be able to take everything I’ve learned!