I am a developer and musician who likes to make art and interactive experiences with code. I am especially interested in exploring new ways of making and interfacing with music. Check out NonCoreProjector for my latest art projects. In May 2017 I graduated from Tufts University with a degree in Computer Science and German Language and Literature, and now live in New York City.

Feel free to contact me. I am also on Github, Soundcloud, LinkedIn, Instagram, Medium, and occasionally Twitter.

Website by me.

Verbolect is a visual exploration of a conversation between Cleverbot and itself. Cleverbot is a chatbot that uses artificial intelligence to talk with users. First made available in 1997, it learns from its conversations to generate responses, so everything that Cleverbot says has at some point been said to it. In this way, the project is as much an exploration of human tendencies as it is of artificial intelligence.

The installation, which let visitors see and hear the conversation, ran continuously from October 20 to November 19, 2017 at The Boiler in Brooklyn.

This project was a collaboration with artist and teacher John O'Connor, Cleverbot creator and AI expert Rollo Carpenter, and multimedia artist Jack Colton. Together we founded a collective called NonCoreProjector.

My role: I implemented the projection, a web application that uses several APIs to retrieve data and interprets it on screen. The Cleverbot API is the backbone of the piece: it generates the conversation and provides emotional data that we use to produce visuals corresponding to each exchange. To give an idea, here is an example response from the API (these are the most relevant fields; the full response has many more):

emotion:"calm",
emotion_degree:"32",
emotion_tone:"0",
emotion_values:"0,0,0,0,40,0,0",
input:"Nice to meet you, too!",
output:"What are you doing now?",
output_label:"welcome",
reaction:"pleased",
reaction_degree:"65",
reaction_tone:"1",
reaction_values:"0,0,0,0,60,0,0"

We use these values to generate color, shape, video, and image by drawing on the HTML canvas, using a D3 force simulation, and calling external APIs (YouTube, stock footage, etc.). These "modules" are explored by a roving eye while the conversation takes place. From the press release: "The main organizational construct of the projection is of a roving eye — simultaneously the idea of the bot searching outside of itself, into the world, looking for patterns, and of us, looking into its brain as if through a peephole. The emotional intensity of the words the bot speaks will dictate the substance, pace and movements of the projection’s machinations."
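
For illustration, here is a hedged sketch (in plain JavaScript, not the production code) of how a response like the example above might be turned into a fill color and a pulse rate. The field names come from the API response; the palette and the mapping itself are assumptions.

// Hypothetical mapping from one Cleverbot response to drawing parameters.
// EMOTION_HUES and the pulse formula are illustrative assumptions.
const EMOTION_HUES = { calm: 200, pleased: 120, angry: 0 };

function visualsFor(response) {
  const hue = EMOTION_HUES[response.emotion] ?? 280;           // fallback hue for unlisted emotions
  const saturation = Number(response.emotion_degree);          // degrees arrive as strings, e.g. "32"
  const values = response.emotion_values.split(',').map(Number);
  const pulseHz = 0.5 + Math.max(...values) / 100;             // stronger emotion pulses faster
  return { fill: `hsl(${hue}, ${saturation}%, 50%)`, pulseHz };
}

// visualsFor({ emotion: 'calm', emotion_degree: '32', emotion_values: '0,0,0,0,40,0,0' })
// → { fill: 'hsl(200, 32%, 50%)', pulseHz: 0.9 }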

Tools: Processing, JavaScript, D3.js, HTML, CSS

Fiber is an interactive VR experience that allows you to “step inside a song” and discover its full story, contributors and context.
Fiber prototype screenshot

Fiber is the result of a two-month Summer Lab at the Open Music Initiative, a partnership between IDEO and Berklee College of Music. The OMI mission is to create an "open-source protocol for the uniform identification of music rights holders and creators." This currently manifests itself as an API that can be used to register and query information. Our job in the Summer Lab was to imagine a future where this information is open and accessible.
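
To give a feel for what "register and query" could look like, here is a purely hypothetical sketch of querying such an API for a track's contributors; the endpoint URL and response shape below are invented for illustration and are not the actual OMI API.

// Hypothetical query for a track's contributors.
// The URL and response fields are placeholders, not real OMI endpoints.
async function fetchContributors(trackId) {
  const res = await fetch(`https://api.example.org/works/${trackId}/contributors`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const work = await res.json();
  // Assumed shape: { title, contributors: [{ name, role }] }
  return work.contributors.map(c => `${c.name} (${c.role})`);
}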

Our team took on the challenge of "identifying individuals for their contributions to single tracks in new works." We expanded on the prompt and focused on interactively telling the whole story of a song.

One of our early prototypes was a cube that could be exploded into shards. Each piece represented an element of the song, whether that be the guitar part or the person who mastered the track.
For our next iteration we focused on telling the story of a song instead of simply assembling information about it. We chose 'Same Drugs' by Chance the Rapper and crafted a virtual reality environment that the user can interact with to discover the narrative and context behind the music.
For more information about our ideas and development, check out the articles we wrote throughout the process:

Part 1: Stepping Inside a Song
Part 2: Playing With Music
Part 3: So The Story Goes
Part 4: Fiber: Bringing The Story of a Song To Life

Read the OMI publication

Tools: Unity3D, D3.js

Shape Your Music

Shape Your Music is a visual music composition language where shapes are melodies. Edges represent notes, and angles represent the intervals between those notes. Shapes are played by traversing their perimeters at a constant speed, sounding a new note at each vertex.

I wanted to find an answer to the question "What does a shape sound like?" The result is something that I like to think of as a musical geoboard.

SYM can be used as both a composition and performance tool.

The idea is simple: create melodies by drawing shapes. While not a unique idea, my implementation is different from other applications that allow you to “draw” music due to the specific way that it translates a shape’s geometric features to musical properties. A shape represents a melody, where each side of the shape is a note. When a shape plays, a node travels at a constant speed around its perimeter, sounding a new note at each vertex. This means that each edge represents a note, where the side length determines the note length. The actual note values are determined by the angles at each vertex. The note that a shape starts with is dictated by its y position on the plane. Each subsequent note is determined by the angle at the vertex where that note (edge) starts.
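
As a rough sketch of that mapping (not the app's actual code), a shape can be reduced to a list of vertices: edge lengths become note durations, and the first vertex's y position picks the starting scale degree. The constants here are assumptions for illustration.

// Sketch: derive note durations and a starting scale degree from a shape's vertices.
// PIXELS_PER_BEAT and the y-to-degree mapping are illustrative assumptions.
const PIXELS_PER_BEAT = 100;

function edgeDurations(vertices) {
  return vertices.map((v, i) => {
    const next = vertices[(i + 1) % vertices.length];
    const edgeLength = Math.hypot(next.x - v.x, next.y - v.y);
    return edgeLength / PIXELS_PER_BEAT;       // longer side = longer note
  });
}

function startingDegree(vertices, planeHeight, scaleSize = 7) {
  const y = vertices[0].y;                     // higher on the plane = higher starting note
  return Math.floor(((planeHeight - y) / planeHeight) * scaleSize) % scaleSize;
}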

A series of short left turns plays the scale (C major in this case).

An equilateral triangle played counter-clockwise goes up 4 notes in the scale at each vertex, resulting in the melody C,G,D.

The algorithm uses the value of the previous note and the angle to determine the next note. A right turn is a downward interval, and a left turn is an upward interval. Each half-plane (right and left) is divided into n sections, where n is the number of notes in the musical scale. The more acute the vertex angle (that is, the sharper the turn), the larger the interval. For example, in a scale with 7 notes, a right turn of 170° means the next note will be down six notes in the scale, whereas a left turn of 10° means the next note will be up one note in the scale.
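
A minimal sketch of that rule in JavaScript, assuming turn angles are measured in degrees and clamped so that even a very shallow turn moves at least one scale degree (which matches the "short left turns play the scale" example above):

// Sketch: map a turn at a vertex to a scale-degree interval.
// turnDeg is the turn angle in degrees (0 to 180); the minimum-of-1 clamp is an assumption.
function intervalFromTurn(turnDeg, direction, scaleSize = 7) {
  const sectionSize = 180 / scaleSize;                      // each half-plane has scaleSize sections
  const magnitude = Math.max(1, Math.floor(turnDeg / sectionSize));
  return direction === 'left' ? magnitude : -magnitude;     // left = up, right = down
}

// An equilateral triangle traversed counter-clockwise turns 120° left at each vertex:
// intervalFromTurn(120, 'left') → 4, so C steps up to G and G steps up to D,
// matching the C, G, D melody described above.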

The Star Wars Theme in shape form.

Tools: JavaScript, HTML, CSS

The Xenomod

The Xenomod is a custom MIDI instrument, built as part of the Electronic Musical Instrument Design course at Tufts University.
The Xenomod

Our goal was to build an instrument that could be played like a MIDI keyboard but with much more control over timbre. Its ergonomic design allows the user to control as many parameters as possible with limited hand movement. Fingers control eight touch-sensitive keys that respond to up/down sliding, palms rest on trackballs that can be rotated as well as pressed down, and thumbs control rotary touch controllers. Additional buttons control octave changes and toggle various modes, and LEDs provide visual feedback when keys are played.

The Xenomod also doubles as a 16-step sequencer that can be controlled dynamically. In sequencer mode, the keys are used to set note and effect values at each step. The trackballs control effect amounts and can be spun to randomize values.
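
As a rough model (sketched in JavaScript for illustration; the real logic lives in the Max/MSP patch), each of the 16 steps stores a note and an effect value, and spinning a trackball amounts to randomizing the effect values:

// Illustrative 16-step sequencer state; names and value ranges are assumptions.
const STEPS = 16;

function makeSequence() {
  return Array.from({ length: STEPS }, () => ({ note: 60, effect: 0 })); // middle C, no effect
}

function setStep(sequence, index, note, effect) {
  sequence[index] = { note, effect };          // keys set the values at each step
}

function randomizeEffects(sequence, maxEffect = 127) {
  for (const step of sequence) {
    step.effect = Math.floor(Math.random() * (maxEffect + 1));  // trackball spin
  }
}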

My role: I programmed the Max/MSP logic for reading the sensor data and assigning notes, scales, and other parameters, and I also designed and wrote the sequencer.

Tools: Sensors (soft potentiometers, FSRs, switches/buttons), Arduinos, Max/MSP, Reason

The live mode interface

The sequencer interface

The live mode patch

The sequencer patch

Recorded Music

For this EP, I limited my audio library to sounds that I had recorded myself specifically for this project. I brought a recorder wherever I went and compiled recordings from shopping centers, the MBTA (subway), restaurants, etc. I then imported these sounds and manipulated them to define the tonal and percussive elements.