Jabber Mouth is a website I made that uses Asterisk, Node.js, Socket.io, Ruby AGI, and jQuery to let users call into a website and control an animated mouth with the audio levels from their phone. The essence of this experiment is to understand how to control screens through your phone using Node.js. The illustration I used is appropriated from an illustrator I love: Jason Levesque. In a future iteration I plan to record the audio coming into the call and play it back to the user through the mouth. Overall, I found this code to be magical in that you can send pretty much anything once you open up a server port with Node.js. A huge thanks goes to Chris Kairalla, who is teaching us Asterisk and Node.
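To give a sense of the flow (phone audio in through Asterisk and a Ruby AGI script, openness values out to the browser), here is a minimal sketch of the level-mapping step. All names here are my own illustration, not the actual project code, and it assumes the call audio arrives as 16-bit PCM amplitudes:

```javascript
// Hypothetical sketch: convert a raw audio level from the phone call
// into a 0..1 mouth-openness value the browser can animate.

// Assumption: Asterisk (via the Ruby AGI script) reports a raw
// amplitude per audio frame as 16-bit PCM, so levels fall in 0..32767.
const MAX_LEVEL = 32767;

// Clamp the raw level into range, then normalize to 0..1.
function levelToOpenness(rawLevel) {
  const clamped = Math.min(Math.max(rawLevel, 0), MAX_LEVEL);
  return clamped / MAX_LEVEL;
}

// In the real app, a Socket.io server would broadcast each value, e.g.:
//   io.emit('mouth', levelToOpenness(rawLevel));
// and jQuery on the page would scale the mouth graphic to match.
```

Silence maps to a closed mouth (0) and a full-scale sample to fully open (1), so the browser side only ever has to animate a single normalized number.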