Projects

Acoustic Velocity and Loudspeaker Delay with Temperature and Python
Samples and Time Converter with Python
Speaker Coverage Measurements with Python
Fun With Animatronics
QLab->Terminal->Sonic Pi
Make a texting applet for QLab and QDisplay
Remote voices with AppleScript
Make a secure ad hoc wireless network
Remote application control with AppleScript
Printing a document with QLab
Installation video wall control with QLab
Program LEDs to flicker like a candle


Acoustic Velocity and Loudspeaker Delay with Temperature and Python

NOTE: visit https://www.github.com/kreivalabs for the current code versions.

Use the following to calculate acoustic velocity in air based on measured temperature in degrees Fahrenheit. You can also input a measured distance from one loudspeaker to another to calculate the delay in milliseconds, based on the acoustic velocity calculation.

The third example script requires Future additions from http://www.python-future.org

Save the text below as a .py file to run in the Terminal of your choice.

Python 2:

Python 3:

And a version if you have Future installed, to avoid the version call:
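As a sketch of the underlying math (function names and example values are mine, not the published script), here is a version that runs under Python 3 as-is, or under Python 2 with Future installed:

```python
# Sketch of the acoustic velocity / delay math (not the published script).
# Runs on Python 3 directly, or Python 2 with the Future package installed.
from __future__ import division, print_function
from builtins import input  # Future's input(); a built-in on Python 3

import math


def speed_of_sound(temp_f):
    """Speed of sound in air, in feet per second, at temp_f degrees Fahrenheit."""
    # Adding 459.67 converts Fahrenheit to the Rankine absolute scale.
    return 49.03 * math.sqrt(temp_f + 459.67)


def delay_ms(distance_ft, temp_f):
    """Loudspeaker delay in milliseconds for a distance measured in feet."""
    return distance_ft / speed_of_sound(temp_f) * 1000.0


# In the interactive script these values come from input(); hardcoded here:
print(speed_of_sound(68.0))   # roughly 1126 ft/s at 68 degrees F
print(delay_ms(25.0, 68.0))   # delay for a loudspeaker 25 ft away
```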


Samples and Time Converter with Python

NOTE: visit https://www.github.com/kreivalabs for up-to-date code versions.

The following will calculate elapsed time in milliseconds and seconds for a given sample rate and time (entered in seconds). It will also calculate the number of individual samples elapsed based on sample rate and time (seconds). This Python script utilizes the Future functions available for free at www.python-future.org.
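The arithmetic behind it is straightforward; a minimal sketch (function names are mine, not the published script):

```python
# Sketch of the samples/time conversion math (not the published script).
from __future__ import division, print_function


def elapsed_ms(seconds):
    """Elapsed time in milliseconds for a duration given in seconds."""
    return seconds * 1000.0


def samples_elapsed(sample_rate, seconds):
    """Number of individual samples elapsed at sample_rate over seconds."""
    return int(sample_rate * seconds)


print(elapsed_ms(2.5))              # 2500.0 ms
print(samples_elapsed(48000, 2.5))  # 120000 samples at 48 kHz
```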


Speaker Coverage Measurements with Python

NOTE: visit https://www.github.com/kreivalabs for up-to-date code versions.

Calculate the coverage pattern area of a point-source loudspeaker based on its dispersion angle and the measured distance from source to listener (or some other point).

This method uses Future functionality from www.python-future.org:
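The geometry assumes a conical dispersion pattern: the coverage circle's diameter at the listener is 2 × distance × tan(angle/2). A minimal sketch (function names are mine, not the published script):

```python
# Sketch of the coverage geometry (not the published script).
from __future__ import division, print_function

import math


def coverage_diameter(dispersion_deg, distance):
    """Diameter of the coverage circle at `distance` for a conical
    dispersion angle given in degrees."""
    return 2.0 * distance * math.tan(math.radians(dispersion_deg / 2.0))


def coverage_area(dispersion_deg, distance):
    """Area of the coverage circle (same units as distance, squared)."""
    return math.pi * (coverage_diameter(dispersion_deg, distance) / 2.0) ** 2


# A 90-degree box 10 units away covers a circle 20 units across.
print(coverage_diameter(90.0, 10.0))
print(coverage_area(90.0, 10.0))
```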


Fun With Animatronics – A few years back, I worked with Washington Ensemble Theatre on a play called Sprawl, which featured characters infected with a zombie-like virus that turns them into giant insect creatures. My task was to design and build an animatronic control system for the actors, allowing them to move the antennae on their masks via control input from a glove. Using flex sensors, Arduino UNOs, servo motors and red-green-blue-white LEDs, we achieved a fun, low-tech horror effect.

Early development consisted of getting control over the servos via the flex sensors, using a wig form as a stand-in for the actor’s head. Two flex sensors were fitted to each glove, giving the actor independent control over each antenna.

Arduino sketch:
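The production sketch isn't reproduced here, but the core read-map-write loop looks roughly like this (pin assignments and sensor ranges are placeholders to calibrate against your hardware):

```cpp
// Sketch of the flex-sensor-to-servo mapping (placeholder pins and ranges).
#include <Servo.h>

Servo leftAntenna;
Servo rightAntenna;

void setup() {
  leftAntenna.attach(9);    // servo signal pins
  rightAntenna.attach(10);
}

void loop() {
  // Flex sensors wired as voltage dividers on the analog inputs.
  int leftFlex  = analogRead(A0);
  int rightFlex = analogRead(A1);

  // Map each sensor's measured range (calibrate these!) onto servo angles.
  leftAntenna.write(map(leftFlex, 300, 700, 0, 179));
  rightAntenna.write(map(rightFlex, 300, 700, 0, 179));

  delay(15);  // give the servos time to settle
}
```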

Next we crammed an UNO, solderless breadboard, servos, tubing, LEDs and a heap of wire onto bicycle helmets fitted with the mask, jaws and hair. The solderless components were only intended for testing, but ended up staying in the final version due to time constraints. I ultimately made a modified version of the rig, with the sketch burned to an ATtiny85 and soldered together onto a wafer in a small project box.


Controlling Sonic Pi from QLab – Use Sonic Pi to generate music. Sonic Pi, created by Sam Aaron at Cambridge University, is a fantastic live-coding music synth that uses Ruby script and the SuperCollider synth engine. Originally developed for the Raspberry Pi platform, it is also available, free, on Windows, Mac OS X and desktop Linux. Using script commands from QLab (as AppleScript cues) and sonic-pi-cli, a command line interface for Sonic Pi developed by Nick Johnstone, you can randomly call up pre-built instruments/pads/drones on a Pi connected to the audio system (you can also do this internally on the QLab host). With the script cues placed within a Fire All group triggered by a script, the Pi will create a reasonably random layering of sounds. The control script selects from the child cues within the group and will only start cues that are not running (“whose running is false”).

The Pi is connected to the QLab host via Ethernet and controlled over SSH. You can install sonic-pi-cli via gem on a Mac.

As a simplified example, the following scripts call Ruby files from a directory located on the machine running both QLab and Sonic Pi. Sonic Pi must be running before the call is made. All communication is local to the host machine. To send these same commands to a connected Pi, simply SSH into the Pi and update the file path as necessary. The initial call will activate Terminal and keep the window hidden. All subsequent calls containing “in window 1” will keep the commands within the same Terminal window.
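For example, an initial Script cue might contain the following (the ~/sonicpi path is a placeholder; sonic-pi-cli installs the sonic_pi command):

```applescript
-- Initial call: opens a background Terminal window and pipes a
-- Ruby file (placeholder path) into the sonic_pi command.
tell application "Terminal"
	do script "cat ~/sonicpi/drone1.rb | sonic_pi"
end tell
```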

To run additional buffers/scripts without opening new terminal windows, you can use:
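For example (the file path is again a placeholder):

```applescript
-- Subsequent calls reuse the same Terminal window via "in window 1".
tell application "Terminal"
	do script "cat ~/sonicpi/pad2.rb | sonic_pi" in window 1
end tell
```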

To stop playback of all buffers/scripts, use:
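For example, using sonic-pi-cli's stop command:

```applescript
-- Silences everything Sonic Pi is currently playing.
tell application "Terminal"
	do script "sonic_pi stop" in window 1
end tell
```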

To select a random Ruby file to playback, place all of the trigger scripts (those that call a specific .rb file) into a Fire All group. Create a script cue with the following:
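A sketch of such a script, assuming the Fire All group is named "SonicPi" (the group name and the QLab 4 bundle identifier are my placeholders; property names follow QLab's AppleScript dictionary):

```applescript
-- Sketch: picks one armed, non-running child cue at random and starts it.
tell application id "com.figure53.QLab.4" to tell front workspace
	set theGroup to first cue whose q name is "SonicPi"
	set candidates to (cues of theGroup whose armed is true and running is false)
	if (count of candidates) > 0 then
		set chosen to some item of candidates
		start chosen
		set armed of chosen to false -- safety: comment out to allow re-selection
	end if
end tell
```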

Set a remote trigger (a randomized wait time with a restart, etc.) to run the script cue that points to the group of .rb scripts. This ensures that files which have been called are disarmed and cannot be selected again. You can comment out that safety if you want files to be called in multiple thread runs.


Make a texting applet for QLab and QDisplay –

UPDATE – See project repository at https://www.github.com/kreivalabs/frontOfHouseMessenger for up to date code.

ORIGINAL – Created to solve the problem of verbal communication between a mix engineer and wireless engineer during performances. Using AppleScript, QLab and QDisplay, create a non-verbal communication method to allow the A2 to warn the A1 of wireless mic dropouts, and the A1 to acknowledge receipt without getting on the com.

This method utilizes QDisplay from Figure53, a companion application to QLab, along with AppleScript and Remote Apple Events (System Preferences>Sharing). Ensure QDisplay is installed on both machines, then enable Remote Apple Events (again, on both machines). On the sending side, paste the following into Script Editor and save it as a script. Placing it in ~/Library/Scripts will put it in the menu bar if “Show Script menu in menu bar” is selected in Script Editor>Preferences.

Note: you will need the administrative user name and password for the receiving machine, as well as its IP address. Use static addressing for show networks.

 

Create another Script cue on the target machine that clears the message window:

 

If using a MIDI-enabled console, set this “clear” script to fire via a MIDI message and map that message to a user-defined key on the desk. Otherwise, use a hotkey trigger. Next, create a third script cue on the target machine which continues from the “clear” script cue, and contains the following:

 

Lastly, create a duplicate “clear window” script on the A2 side, to purge the QDisplay window after the “received” message is printed.

Source available at https://github.com/kreivalabs/frontOfHouseMessenger

 


Remote voices with AppleScript – I once did a show in which a laptop on stage needed to beep, ding and generate the Apple voice assistant sounds, all while in motion across the stage. Since there wasn’t a place to hide a speaker with the machine in constant motion, I used the eppc protocol supported by Remote Apple Events (System Preferences>Sharing) and the following script. The laptop could be tethered via Ethernet, or on the same Wi-Fi show network as the QLab machine (always use closed networks for show systems). Note – the eppc protocol is unstable under OS X 10.8.
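The pattern looks like this (user name, password and address are placeholders; the target machine must have Remote Apple Events enabled and should be on a static IP):

```applescript
-- Placeholder credentials and address for the onstage laptop.
set stageMac to "eppc://username:password@10.0.1.20"
tell application "Finder" of machine stageMac
	beep
	say "Hello from offstage"
end tell
```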


Make a secure ad hoc wireless network – At some point along the way (I can’t remember when…), Apple removed authentication security from the ad hoc network function “Create a network…” in the Wi-Fi setup. I use ad hoc networks to link iDevices to Macs running Max, QLab, Isadora, etc., and not being able to lock down the network is a real drag and a security hole. Here’s a better method.

Setup

First, go to System Preferences>Sharing. Uncheck “Internet Sharing” if it is checked. Then, enter the following in Terminal:
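The original command isn't preserved here, but the standard approach uses networksetup to create a virtual ‘Loopback’ service on the lo0 interface (the address shown is an example):

```shell
# Create a virtual network service named 'Loopback' on the loopback interface,
# then give it a manual address (change to taste).
sudo networksetup -createnetworkservice Loopback lo0
sudo networksetup -setmanual Loopback 172.20.42.42 255.255.255.0
```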

You can alter the IP address to your liking. You will use it in a few more steps.

Configuration

  1. Go to System Preferences>Network and configure the newly created ‘Loopback’ network device to match the IP and subnet above, setting the ‘Configure IPv4’ menu to ‘Manually.’
  2. After setting up ‘Loopback,’ go to System Preferences>Sharing. Click ‘Internet Sharing’ on the left. Change ‘Share your connection from’ to ‘Loopback’ on the right.
  3. Under ‘To computers using’ check ‘Wi-Fi.’
  4. Click ‘Wi-Fi options…’ and set a name, password and channel for the new wireless network. Click OK.
  5. Check the ‘Internet Sharing’ box to enable the new secure network.

Remote application control with AppleScript – Building on the techniques above, you can use this script to remotely open an application on a target machine and launch a file. Again, this may not work as expected under OS X 10.8. The example below launches iTunes and plays a sound file.
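A sketch of the idea (credentials, address and track name are placeholders; the track must already exist in the remote library):

```applescript
-- Placeholder credentials and address; target has Remote Apple Events enabled.
tell application "iTunes" of machine "eppc://username:password@10.0.1.20"
	activate
	play track "Preshow Music" of library playlist 1
end tell
```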


Print a document with QLab – A further example of AppleScript integration with QLab, this method allows for the sending of print jobs as a QLab cue.
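A sketch of the idea, as the body of a QLab Script cue (the file path is a placeholder; lp sends the job to the default CUPS printer configured on the QLab host):

```applescript
-- Fires a print job as a cue; replace the path with your document.
do shell script "lp /Users/showcontrol/Documents/preshow_report.pdf"
```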

The method above assumes a USB printer, but this could be altered to use a wireless or AirPrint device, so long as the QLab host and the printer are on the same network.


Installation video wall control with QLab – In the Summer of 2015, Left Coast Mac was contracted by Martin Christofel of Scenografique to design and program a dual-display installation video wall for Axon’s corporate office lobby in Seattle. The media design took the form of a space station bridge looking out onto a star field, with multiple sprites moving in and out of frame and a heads-up display that prompted a visual shift, all of which had to run during business hours without the staff needing to turn it on and off daily. For this we used QLab, as it gave us the most flexibility in setting event timings and transitioning between “scenes,” and because of its AppleScript and OSC integration.

Since I knew this would be a big programming job, I broke out the media and events into discrete cue lists within the larger workspace, allowing me to focus on only certain elements at a time, which could be cross-referenced from a master list of GO commands.

To automate the process, I created simple AppleScript applets that were triggered by the Mac Pro’s internal clock (via Calendar events) to halt playback at the end of the work day (18:00 hours, Monday-Thursday) and reboot the machine the following morning at 06:45 hours to clear the RAM cache. At 07:05 hours, another applet launches the workspace, which then auto-loads and begins playback. At 20:00 hours on Fridays, the machine shuts down for the weekend, then via another automated action, starts up Monday morning.

The media has been playing uninterrupted since July of 2015.

You can read articles about the office and its themed environment here and here (yes, it won “Geekiest Office of the Year”).


Program LEDs to flicker like a candle – Sketched and beta-tested using an Arduino Uno. Grab some LEDs and a dev board to try it out. After you tweak it to your liking, burn it to an ATtiny chip so you can hide it in an actual candle.
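A minimal flicker loop looks something like this (the pin number is a placeholder; use a PWM-capable pin):

```cpp
// Candle flicker sketch: random brightness held for random short intervals
// gives an organic, non-repeating flicker.
const int ledPin = 9;  // placeholder; must support PWM (analogWrite)

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  analogWrite(ledPin, random(120, 255));  // brightness never drops to black
  delay(random(20, 120));                 // hold each level briefly
}
```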