Featured in: The New York Times, Screen Slate.

Verbolect is a visual exploration of a conversation between Cleverbot and itself. Cleverbot is a chatbot that uses artificial intelligence to talk with users. First made available in 1997, it learns from its conversations to generate responses, so everything that Cleverbot says has at some point been said to it. In this way, the project is as much an exploration of human tendencies as it is of artificial intelligence.

The installation, which allows you to see and hear the conversation, ran continuously from October 20 to November 19, 2017 at The Boiler in Brooklyn.

Below is one of the many streams that were recorded during the exhibition.

This project was a collaboration with artist and teacher John O'Connor, Cleverbot creator and AI expert Rollo Carpenter, and multimedia artist Jack Colton. Together we founded a collective called NonCoreProjector.

My Role: I implemented the projection, a web application that uses several APIs to retrieve data and interpret it on screen. The Cleverbot API is the backbone of the piece. In addition to generating the conversation, it provides an emotional analysis of each reply, which we use to source images, videos, and audio. To give a sense of the data, here is an example response from the API (these are the most relevant fields; the full response has many more):

emotion: "calm",
emotion_degree: "32",
emotion_tone: "0",
emotion_values: "0,0,0,0,40,0,0",
input: "Nice to meet you, too!",
output: "What are you doing now?",
output_label: "welcome",
reaction: "pleased",
reaction_degree: "65",
reaction_tone: "1",
reaction_values: "0,0,0,0,60,0,0"

We use these values to graph the conversation, control parameters in a force simulation, and find online content via external APIs (YouTube, stock footage, etc.). These distinct "modules" are explored by a roving eye while the conversation takes place.
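
As a rough illustration of the force-simulation mapping (the scaling constants and field choices here are assumptions, not the exhibit's exact code), the comma-separated emotion_values can be parsed and used to reheat a D3 force layout:

// Hypothetical mapping from a Cleverbot reply to D3 force-simulation parameters.
// Only the field names come from the API response above; the constants are illustrative.
import * as d3 from 'd3';

function applyEmotion(simulation, reply) {
  // "0,0,0,0,40,0,0" -> [0, 0, 0, 0, 40, 0, 0]
  const values = reply.emotion_values.split(',').map(Number);
  const intensity = Math.max(...values) / 100;   // crude 0..1 intensity

  simulation
    .force('charge', d3.forceManyBody().strength(-30 - 200 * intensity)) // stronger repulsion when agitated
    .alpha(0.3 + 0.7 * intensity)                                        // reheat the layout
    .restart();
}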

The main organizational construct of the projection is a roving eye: at once the bot searching outside of itself, into the world, looking for patterns, and us looking into its brain as if through a peephole. The emotional intensity of the words the bot speaks dictates the substance, pace, and movements of the projection's machinations.
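
A small, purely illustrative sketch of that idea (the constants and names are assumptions, not the piece's code): the emotion_degree of each reply can be turned into how long the eye dwells on a module and how quickly it moves to the next one.

// Illustrative only: derive the eye's pacing from the reply's emotional intensity.
// A calm reply lingers on a module; an agitated one jumps quickly between them.
function eyePacing(reply) {
  const degree = Number(reply.emotion_degree) / 100;   // normalise to 0..1
  return {
    dwellMs: 8000 - 6000 * degree,                     // how long the eye stays on a module
    travelSpeed: 0.2 + 1.8 * degree,                   // how fast it glides to the next one
  };
}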

Read the full press release.

Tools: Processing, JavaScript, D3.js, HTML, CSS