Naivi: Studio 80 Year Of The Horse VJ / Light Performance


As NAIVI we performed with our custom-built LED panel and self-written software at Year Of The Horse in Studio 80 (Amsterdam). The organisation wanted to do something different with their light setup and we (NAIVI) were invited to make that happen.

The LED panel is built from the ground up from high-brightness LEDs shipped in from China. The panel receives its data through a Teensy microcontroller. I wrote a custom C++ library to convert any sort of image (or video) data to a stream of bytes the microcontroller can work with. On top of that library we wrote a VJ application that samples and decomposes a video and audio stream to procedurally generate the image data that is sent over. These images are synced to the music through a modulation matrix that accepts LFO, recorded channel and audio data. The modulation matrix consists of a set of application effects (controls) that are used to render a set of images.

You could think of the application as a video synthesizer that accepts any sort of device input to generate an image using a specific render interface (OpenGL).

The application can also control any existing light in the club. Most lights accept a DMX signal that is sent out by a control panel. The panel in Studio 80 accepts MIDI data that can be mapped to various controls to manipulate the brightness or movement of a light. The modulation matrix already present in the software is used to calculate a new light value based on the image, audio and LFO data available in the running application. This enabled us to control both the lights and visuals from the same software environment, merging two distinct worlds that are most often controlled separately.
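To illustrate the idea, here is a minimal Python sketch of such a modulation matrix. This is not the actual C++ implementation; all names are made up. It blends normalized sources (LFO, audio level) into a single control value and maps it to a 7-bit MIDI value, the kind of value the club's control panel expects.

```python
import math

def lfo(time, freq=1.0):
    """A sine LFO normalized to 0..1 (illustrative, not the real effect)."""
    return 0.5 + 0.5 * math.sin(2.0 * math.pi * freq * time)

def modulate(sources, weights):
    """Blend normalized modulation sources (lfo, audio, recorded channel)
    into one 0..1 control value using per-source weights."""
    total = sum(weights.values())
    value = sum(sources[name] * w for name, w in weights.items())
    return max(0.0, min(1.0, value / total))

def to_midi(value):
    """Map a normalized control value to a 7-bit MIDI value (0..127)."""
    return int(round(value * 127))

# Drive a light's brightness from an LFO peak and the current audio level
sources = {"lfo": lfo(0.25), "audio": 0.8}
weights = {"lfo": 1.0, "audio": 2.0}
brightness = to_midi(modulate(sources, weights))
```

The same matrix entry can feed either a rendered image effect or a MIDI-mapped light, which is what lets one environment drive both.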

All the software is written in C++ using openFrameworks. The Teensy code is native C.

Programmable LED wall

After performing with our custom VJ software last month, I noticed that the video stream felt constrained by the bounds of the canvas and did not light up the venue in the way I had hoped. Although we received very positive feedback on the visuals themselves, I wanted to get rid of the beamer and move towards a more custom, scalable light-emitting video projection source.

After reading up on custom LED panels I decided to give that a go. Such a panel would make the lights in a club redundant and could display the visuals in a more tangible way. The added benefit of the hardware is that there would be no more fighting over what gets priority, the lights or the visuals; that turned out to be my main point of frustration the night we performed.

The final wall (combined) should be approximately 4 meters wide and 2 meters high. The LED screen can be broken up into various parts (panels), all of them addressable through a custom GL interface that renders parts of the video stream to a subset of the screen. This enables us to place multiple panels throughout a venue or space.

But buying that much equipment at once costs a lot of money, so I decided to build a small prototype first, using components recommended online. Further down you can find the various parts I used. The microcontrollers arrived within a week, as did the programmable LED strips, cables, plugs and adapters. All of the equipment was shipped from the United States using UPS.

I ordered 2 Teensy 3.1 microcontrollers, together with the OctoWS2811 adaptor board, to address the individual LED strips. In combination with the OctoWS2811 LED library this setup can address thousands of LEDs, perfect for a project of this size. I had to solder the components together and used a standard CAT cable to drive the individual strips. The wiring of the strips to the board was rather straightforward. For this prototype, 500 individual LEDs were fed a video stream using only 4 of the 8 outputs available on every Teensy board. Every 2 strips (approximately 300 LEDs) were given their own 10 amp, 5 volt power supply. The PWM signal was sent from one Teensy board; multiple Teensy boards can be connected to keep the video signal in sync. The LED strips were ordered at Adafruit.
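The exact byte protocol lives in the C++ library, but the core of feeding a panel is remapping row-major image pixels to the order the strips are physically wired in. A hedged Python sketch, assuming the common serpentine (zigzag) wiring where every other strip runs backwards:

```python
def serpentine_index(x, y, width):
    """Map an (x, y) pixel to its LED index on a zigzag-wired panel:
    even rows run left-to-right, odd rows right-to-left."""
    if y % 2 == 0:
        return y * width + x
    return y * width + (width - 1 - x)

def frame_to_bytes(pixels, width, height):
    """Flatten a row-major list of (r, g, b) pixels into the byte
    order the strips are wired in, 3 bytes per LED."""
    out = bytearray(width * height * 3)
    for y in range(height):
        for x in range(width):
            i = serpentine_index(x, y, width) * 3
            out[i:i + 3] = bytes(pixels[y * width + x])
    return bytes(out)
```

The real library additionally has to split this stream over the Teensy outputs and match the WS2811 color order, which I leave out here.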

The LED panels are addressed through a custom C++ interface, compatible with openFrameworks. Using this interface I can drive multiple LED panels from various applications, including my VJ application. The VJ app is controlled using an iPad and an Akai APC20 (MIDI controller). The amount of light emitted is (to say the least) exceptional: perfect for lighting up a medium-sized venue, especially once all the hardware is combined into one blistering light source.

Now that there's a working prototype, we can order the additional LED strips, controllers and adaptors. 8 Teensy boards will be used to drive the individual LED panels; every Teensy board will feed approximately a 1×1 meter LED panel. All of these panels can be combined to create one big wall, or placed throughout a venue to generate a more spatial effect. More on that soon.


One finished LED panel dancing to the music and lighting up the club @ Year of the Horse, Studio 80, Amsterdam

LED wall test, work in progress (using openFrameworks)

2nd programmable video LED wall test

App Controllers

Akai APC20 and iPad, used to control the application that drives the panels.


The two Teensy controllers soldered onto the OctoWS2811 adapter (running)


16 meters of LED strip controlled by one Teensy board, using 4 of the 8 available PWM channels. Every 2 strips (300 LEDs) share one 10A 5V power supply.

Kala Processing Library (C++)

Started a new generative visuals project in which a video stream is layered and fragmented in real time, using a custom library built upon openFrameworks. This library will be used to develop various filters and algorithmic effects that can be combined with video streams, images and geometric data. The current version runs at a stable 60+ fps, with the source data sampled from a video stream.

The framework will be used on the 17th of May (2014) to render the visuals for the upcoming club night: The Year Of The Horse. The trailer (shown below) was recorded live and used a custom MIDI interface to control the layers. The MIDI interface hooks into an iPad and an Akai APC20.


Video Analyzer / Sampler

I'm working on a program that analyzes and samples large video files (movies). The first implementation is shaping up nicely: there is an option to rip individual frames from a stream (in sequences) and to analyze shots. The idea is to use this information to create spatial relationship maps based on the information found in a video stream.

The application is written in C++ and fully threaded. Multiple operations can be performed at the same time on the same video stream: for example sampling, analyzing and browsing. The interface is straightforward and should help when trying to find the right sample settings.
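The actual analyzer works on real video streams, but the shot-analysis idea can be sketched in plain Python: flag a likely cut wherever the luminance histogram jumps between consecutive frames. This is a simplified stand-in for the real implementation, with frames modelled as flat lists of 0-255 luminance values:

```python
def histogram(frame, bins=8):
    """Crude normalized luminance histogram of a frame
    (a flat list of 0..255 values)."""
    hist = [0] * bins
    for v in frame:
        hist[min(v * bins // 256, bins - 1)] += 1
    n = float(len(frame))
    return [c / n for c in hist]

def shot_boundaries(frames, threshold=0.5):
    """Return frame indices where the histogram difference to the
    previous frame exceeds the threshold (a likely cut)."""
    cuts = []
    prev = histogram(frames[0])
    for i, frame in enumerate(frames[1:], start=1):
        cur = histogram(frame)
        diff = sum(abs(a - b) for a, b in zip(prev, cur))
        if diff > threshold:
            cuts.append(i)
        prev = cur
    return cuts
```

Grouping the frames between detected cuts gives the shot sequences that the sampler can then rip and relate to each other.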

At some point I hope to be able to share the application and release the source code.


Pacemaker Game Sketch

Last weekend I participated in the Global Game Jam 2013. It was my first time making a game from scratch, and I liked the idea of returning to my student days of no sleep and sipping beer behind a monitor early in the morning. In the end we had a working platformer in which all the elements are synced to a global pulse (beat) that changes over time in the level. The theme was: heartbeat.

As the BPM increases, so does the difficulty level (objects rotate, move and disappear at a different interval). It also causes the audio to slowly change from a heartbeat to a more aggressive hardcore kick.
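The pulse logic can be sketched in a few lines. This is a Python sketch of the idea, not the actual Unity/C# code: every object asks whether the current time falls within a small window around a beat, and triggers its rotation, movement or disappearance on those ticks.

```python
def pulse_phase(time, bpm):
    """Phase within the current beat, in the range 0..1."""
    beat_length = 60.0 / bpm
    return (time % beat_length) / beat_length

def on_beat(time, bpm, window=0.1):
    """True when the time is within `window` (as a fraction of a beat)
    of a pulse; objects rotate, move or disappear on these ticks."""
    phase = pulse_phase(time, bpm)
    return phase < window or phase > 1.0 - window
```

Raising the BPM over the level shrinks the beat length, so the same check naturally makes everything happen faster.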

Unfortunately we had no real level/game designer, so the game isn't completely polished; everything came together 3 hours before the deadline. But it's playable and does illustrate what we tried to accomplish.

I used Unity for the first time and must say I'm impressed. The component-based object model is flexible and almost everything you can think of is there: audio, shaders, lights, a scripting/coding environment, VS integration, etc. Definitely worth trying out.

If you feel like playing it, I uploaded the game right HERE. Use the arrow keys to move and press space to jump. It works better with an actual controller (I tried an Xbox 360 one).

 

Vertex interpolation using Barycentric Coordinates and Bilinear Filtering

About a week ago I had to get uv coordinates accurately transferred onto a new set of points.
I knew these points lay on the existing surface, and that a nearest-neighbour solution would cause problems, because a point can be closer to a different vert than to those of the primitive it sits on. I also knew I had to be able to correctly interpolate both quads and triangles.

After consulting the web I found out that you can use a triangle's barycentric coordinates, and a quad's filtered uv position, to correctly interpolate the vertices that define a primitive (or face). Both methods return a set of uv coordinates that define the point's relative position on the sampled face.

Note that by this particular uv set I mean the uv set relative to a certain primitive or face, not the uv set that defines a vert's position in 2D texture space (relative to other vertices).

In the case of a triangle (defined by A, B, C), the uv set specifies the weight of every vert, say (U: 0.2, V: 0.5, W: 0.3). Note that for simplicity the last value is often left out, because if the point lies within the triangle the separate weights always add up to one. Returning to the previous weights, we can write the third weight (W) as 1-(0.2+0.5).

You could also say that the uv coordinates describe the distance you have to walk over the edges of the triangle, starting from vert A towards verts B and C. If the value of u or v (where u = location on edge A-B and v = location on edge A-C) ends up below zero or above one, the point lies outside of the triangle.

The point’s position (P) can now easily be calculated: P = (B*U) + (C*V) + (A*W)
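As a standalone sketch (plain Python tuples, not Houdini types), reconstructing P from the weights looks like this:

```python
def barycentric_point(a, b, c, u, v):
    """Reconstruct a point from barycentric coordinates, with u weighting
    B, v weighting C, and w = 1 - u - v weighting A, i.e.
    P = B*u + C*v + A*w."""
    w = 1.0 - (u + v)
    return tuple(bi * u + ci * v + ai * w for ai, bi, ci in zip(a, b, c))
```

With u = v = 0 the point collapses onto A, and u = 1 puts it exactly on B, which matches the edge-walking intuition above.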

For a quad primitive the uv coordinates don't describe the vert weights directly: the set only describes the relative position of the point in the primitive's local uv space. But by using a simple bilinear interpolation we can use these coordinates to calculate the point's position (or interpolated uv values). We start by defining two new points along the u axis, and use these points to find the final value along the v axis.

Say we have a square defined by A, B, C and D, with A, B forming the bottom edge and D, C the top edge, and a uv set that defines a point in this square with the coordinates (U) 0.2 and (V) 0.5. We can construct two new points (P1 and P2) in the u direction by walking along the primitive's edges: P1 = A + (B-A)*U and P2 = D + (C-D)*U. But this only gets us halfway. We now have the interpolated u value as a set of 2 points, and we can use these points as an edge to interpolate towards the final position (PF) (or whatever data you see fit): PF = P1 + (P2-P1)*V. And that's it.
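The same bilinear construction as a small Python sketch, assuming the corners are ordered with A, B on the bottom edge and D, C on the top:

```python
def bilerp(a, b, c, d, u, v):
    """Bilinear interpolation over a quad A, B, C, D:
    P1 = A + (B - A) * u on the bottom edge,
    P2 = D + (C - D) * u on the top edge,
    P  = P1 + (P2 - P1) * v between them."""
    p1 = tuple(ai + (bi - ai) * u for ai, bi in zip(a, b))
    p2 = tuple(di + (ci - di) * u for ci, di in zip(c, d))
    return tuple(p1i + (p2i - p1i) * v for p1i, p2i in zip(p1, p2))
```

The tuples can hold positions, uv values or any other interpolatable attribute; the construction is the same.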

But this doesn't explain how to get those uv coordinates in the first place. Houdini has some built-in functions for this that can be accessed with VEX, C++ and Python; for this example I will stick with Python. But you could also write your own. A good example of finding the barycentric coordinates of a triangle can be found here. Finding the coordinates for a quad primitive is more difficult and depends on the way the primitive is described; a simple Google search on 'parametric surface' will get you halfway. This wiki article is also a good starting point.

Now for some code. I wrote a UV interpolator in Python that correctly interpolates the uv attribute from a set of vertices. I already know the primitive to sample, so I don't perform any unnecessary ray casts. The operator works with both quads and triangles. The function used to get the uv coordinates local to the primitive is hou.Prim.nearestToPosition(Point); in VEX these coordinates are generated when doing an intersect operation.

"""
This operator interpolates the uv coordinates from a primitive, indexed from the second input.
The points defined in the first input should have an integer attribute pointing to the primitive to sample from.
The primitive should have uv coordinates defined on vertices or points.
Only works with quadractic or triangle primitives!
For Quadratic primitives: bilinear interpolation is used to find the new uv coordinates.
For Triangle primitives: the Barycentric coordinates are used to find the new uv coordinates
For a sample reference: http://www.gamerendering.com/2008/10/05/bilinear-interpolation/
"""


# This code is called when instances of this SOP cook.
node = hou.pwd()
geo_one = node.geometry()
geo_two = node.inputs()[1].geometry()

# Sample the parameters
uv_location = node.parm("uv_location").evalAsString()
prim_index_name = node.parm("prim_index_name").eval()
max_distance = node.parm("max_distance").eval()
prim_type = node.parm("prim_type").evalAsString()
group_name = "faulty_samples"

#Attributes-------------------------------------------------------------------------------

# First sample the uv attribute from the second input
uv_attrib = None
if uv_location == "points":
    uv_attrib = geo_two.findPointAttrib("uv")
else:
    uv_attrib = geo_two.findVertexAttrib("uv")
use_points = (uv_location == "points")
use_triangles = (prim_type == "triangle")

# Make sure the attribute was found
if uv_attrib is None:
    raise hou.NodeError("Can't find uv attribute")

# Now sample the primitive index attribute from the first input
prim_index_attrib = geo_one.findPointAttrib(prim_index_name)
if prim_index_attrib is None or prim_index_attrib.dataType() != hou.attribData.Int:
    raise hou.NodeError("Can't sample primitive index attribute of type Int: %s" % prim_index_name)

# Add a new point uv attrib if necessary
added_uv_attrib = geo_one.findPointAttrib("uv")
if added_uv_attrib is None:
    added_uv_attrib = geo_one.addAttrib(hou.attribType.Point, "uv", (0.0, 0.0, 0.0), True)

# Create a faulty point group
faulty_point_group = geo_one.findPointGroup(group_name)
if faulty_point_group is None:
    faulty_point_group = geo_one.createPointGroup(group_name)

#Methods--------------------------------------------------------------------------------

def getUVCoordinatesFromQuad(inCoordinates, inPrimitive):
    """
    From the incoming primitive we first create two new interpolated points on the u axis.
    From these points we create the final uv coordinate based on the v axis, using bilinear interpolation.
    """
    verts = inPrimitive.vertices()
    vertuvs = []

    if len(verts) != 4:
        raise hou.NodeError("Primitive: %d does not have exactly 4 verts!" % inPrimitive.number())

    # get the uv values from our verts or points
    for vert in verts:
        if not use_points:
            vertuvs.append(vert.attribValue(uv_attrib))
        else:
            vertuvs.append(vert.point().attribValue(uv_attrib))

    # get our final weights in u and v
    rv = 1 - inCoordinates[1]
    ru = 1 - inCoordinates[0]
    pv = inCoordinates[1]
    pu = inCoordinates[0]

    # calculate two new uv samples in the u direction
    bottom_uv = ((vertuvs[1][0]*pu + vertuvs[0][0]*ru), (vertuvs[1][1]*pu + vertuvs[0][1]*ru))
    top_uv = ((vertuvs[2][0]*pu + vertuvs[3][0]*ru), (vertuvs[2][1]*pu + vertuvs[3][1]*ru))

    # interpolate over v to get our final value
    final_uv = ((top_uv[0]*pv + bottom_uv[0]*rv), (top_uv[1]*pv + bottom_uv[1]*rv), 0.0)

    return final_uv

def getUVCoordinatesFromTriangle(inCoordinates, inPrimitive):
    """
    Compute the new uv coordinates based on the incoming barycentric coordinates.
    The first coordinate maps to the 3rd vert, the second coordinate to the 2nd vert.
    The weight of the first vert is computed by complementing the other two weights.
    """
    verts = inPrimitive.vertices()
    vertuvs = []

    if len(verts) != 3:
        raise hou.NodeError("Primitive: %d does not have exactly 3 verts!" % inPrimitive.number())

    # get the weights
    vert_weights = (1 - (inCoordinates[0] + inCoordinates[1]), inCoordinates[1], inCoordinates[0])

    # get the uv values from our verts or points
    for vert in verts:
        if not use_points:
            vertuvs.append(vert.attribValue(uv_attrib))
        else:
            vertuvs.append(vert.point().attribValue(uv_attrib))

    # compute the new uv values
    new_u = (vertuvs[0][0]*vert_weights[0]) + (vertuvs[1][0]*vert_weights[1]) + (vertuvs[2][0]*vert_weights[2])
    new_v = (vertuvs[0][1]*vert_weights[0]) + (vertuvs[1][1]*vert_weights[1]) + (vertuvs[2][1]*vert_weights[2])

    return (new_u, new_v, 0.0)

#Compute---------------------------------------------------------------------------------

"""
Iterate over every point that we need to interpolate the coordinates for.
"""


points = geo_one.points()
prims = geo_two.prims()

warning_string = ""
warning_occurred = False

for point in points:
    # Get the primitive
    sample_prim = prims[point.attribValue(prim_index_attrib)]

    # Make sure the primitive is a poly
    if sample_prim.type() != hou.primType.Polygon:
        raise hou.NodeError("Primitive: %d is not of type Polygon" % sample_prim.number())

    # Get the parametric uv location of the point on the primitive
    local_sample_data = sample_prim.nearestToPosition(point.position())
    para_uv_coor = (local_sample_data[1], local_sample_data[0])

    distance = local_sample_data[2]

    # Add an entry if it's too far away from the primitive
    if distance > max_distance:
        warning_string += "Point: %d appears to be %f units away from indexed primitive: %d\n" % (point.number(), distance, sample_prim.number())
        faulty_point_group.add(point)
        warning_occurred = True

    # Sample the uv coordinates
    new_uv_coord = None
    if not use_triangles:
        new_uv_coord = getUVCoordinatesFromQuad(para_uv_coor, sample_prim)
    else:
        new_uv_coord = getUVCoordinatesFromTriangle(para_uv_coor, sample_prim)

    point.setAttribValue(added_uv_attrib, new_uv_coord)

if warning_occurred:
    raise hou.NodeWarning(warning_string)

C++ Midi Analyser

I've been playing around with openFrameworks recently, trying to get a real-time 2D fluid solver up and running (more on that later). To make it more interactive I decided to attach some of my MIDI controllers, but couldn't find a MIDI library in openFrameworks itself. Luckily the ofxMidi addon helped me out.

After downloading the addon and getting it compiled, I noticed that the example was actually quite handy for testing my various MIDI devices. After tweaking the code a bit to show the selected port and the devices in an ordered list, the app proved quite useful...

It allows you to select any of the attached devices and display the messages that are received (value, channel, velocity, etc.). When a device gets disconnected it simply disappears without crashing the app, and the other way round works as well. Ever tried that in Cubase? Pretty solid code; thanks Chris O Shea for making a neat addon. Much appreciated! Makes my life a lot easier.

I attached my modified code and the compiled x64 executable (Windows). If you have any MIDI devices hooked up, they should simply appear. Use the numeric keys (1-9) to select a port (device).

Note that in order to compile the modded code, you need to download the ofxMidi addon and link to the source code in there, together with the openFrameworks library. I tested mine with v0071, but the exe should be enough.

Midi Analyser

C# Zoom, Pan and Composite Exercise

I was working on a custom control that allows a user to freely pan and zoom in on an image.
It also supports a minimal amount of image compositing, in this case a custom grid.

The image is drawn on a custom panel using my own drawing code.
The actual image can be composited with a grid overlay that is placed on top of the main form.

The ZoomControl currently holds an Image and an Overlay object. The Overlay object is used to draw the grid and is tied to the main control using a separate event that is dispatched when the number of rows or columns changes. The rows and columns can be updated through the main interface. The Overlay object creates an image that is stored internally but is accessible to other components. This image is composited with the background image in the ZoomControl and drawn to screen. The drawing is based on the current zoom and pan levels.
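The zoom-and-pan mapping itself boils down to a simple affine transform. A Python sketch of the idea (the real control is C#; function names here are made up):

```python
def image_to_screen(point, zoom, pan):
    """Map an image-space point to screen space, given the current
    zoom factor and pan offset (in screen pixels)."""
    return (point[0] * zoom + pan[0], point[1] * zoom + pan[1])

def screen_to_image(point, zoom, pan):
    """Inverse mapping, e.g. to find which image pixel the mouse is over."""
    return ((point[0] - pan[0]) / zoom, (point[1] - pan[1]) / zoom)
```

Both the background image and the grid overlay are drawn through the same transform, which is what keeps them aligned while panning and zooming.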

Drawing could be optimized by caching the composited image in memory and updating it only when one of the images changes; currently the image is composited every time the draw function is called.

Anyway, the project can be downloaded HERE.
It's built in Visual Studio 2010 but should compile fine in older versions.

Blofeld

After days of coding for fun or at work I figured it was time to pick up my old synth again.

Back in the day (just a couple of years ago), I spent a lot of time producing music. I still enjoy my record collection and am passionate about anything electronic, especially drum and bass; only not the type that gave it a bad name. Thank God there are people like D-Bridge, Spectrasoul and Alix Perez to make your life more comfortable.

Anyway. I produced this little track in the wee hours of my spare time; all strings and basses were made with the Waldorf Blofeld. Initial patterns were created in Renoise and finalized in Cubase. Long live the string and pad!

Download: Fomal – Sugar Rush

Opening Up Houdini

There are times when you want to get information from other applications, or your own, into Houdini. Although there are many useful operators for loading and exporting information, opening up a port offers additional functionality.

A while back I was asked to create a protocol that would enable us to do exactly that in Houdini, whether from Maya or any other application. In this post I won't discuss the various serialization methods, but will try to show how to start your own (simple) server when launching Houdini. You can use this server, which runs in the background (in a separate thread), to directly access information, execute tasks or sample data (in a custom node, for example).

A possible use could be to access external data to reconstruct meshes without writing a single file to disk.

To achieve this we need to open a port on startup that can receive data independently from the session a user is working in, therefore not locking up Houdini.

A couple of files are searched for and run when Houdini is launched, most noticeably 123.cmd, 456.cmd, 123.py and 456.py. 123.cmd and 123.py run when Houdini launches without loading a file; 456.py and 456.cmd run whenever the session is cleared or a file is loaded. Another option is to create a file called pythonrc.py, which is always invoked when Houdini starts up.

For this example I created a file called 123.py and made sure the HOUDINI_SCRIPT_PATH was configured correctly to find that file on launch. If you're unsure how to do that, simply create the file in your Houdini scripts folder.

For example: $HFS/houdini/scripts/456.py > C:/Program Files/Side Effects Software/Houdini 12.0.581/houdini/scripts/456.py. There should already be a file called 123.cmd (the HScript equivalent) in there. Appending your own custom HOUDINI_SCRIPT_PATH location is advisable: when upgrading Houdini, your scripts remain accessible.

Once the file has been created we want to start the server when we launch a new Houdini session. I've given the server a fixed port to connect to (2000), but it is possible to assign a different one to every running session. In this case it means that from wherever you are, you can connect to exactly one Houdini session by sending data over port 2000. Most likely you want to connect to 'localhost', which is the computer you are using.

To separate the object from the initialization steps, a separate HPythonServer class is created and instantiated in the 123.py file. There can be only one port open and one instance of this class running, making it a singleton. And although Python's definition of a singleton is weird, it is preferred in this case.

# import our server module
import hdaemon

# start the python server
thread_name = hdaemon.startHPythonServer('python_server')
print "started python server in thread: %s" % thread_name

These lines of code are executed when Houdini launches. When the instance of Houdini is exited, the port will close and the thread will stop. But what is actually started, and how? The following piece of code shows what happens when the startHPythonServer function is called.

The module 'hdaemon' holds the HPythonServer object and two functions for starting and stopping it. The start function checks for a thread already running with the name 'python_server'. If a thread with that name is still active, no new port will be opened and no new instance will be returned. If the server is not running, a new HPythonServer is created and started in a separate thread.

The socket module is used as the low level network interface.

"""
@package hdaemon
This package defines a Python Server that can be used to communicate with Maya
@author Coen Klosters
"""


import threading
import socket
import uuid
import hou

class HPythonServer(threading.Thread):
    """
    Simple Python server running in a separate thread within Houdini, listening for commands.
    Connect to Houdini using the HOUDINIPORT specified.
    """

   
    CLIENT_IDENTIFIER = "HPythonServer"
    HOUDINIPORT = 2000
   
    def __init__(self, inClientIdentifier=None ):
        """
        Constructor
        @param self The object pointer
        @param inClientIdentifier A string that allows the overriding of the client identifier.
        """

       
        threading.Thread.__init__(self)     # initialize the thread
       
        if inClientIdentifier is not None:
            self.CLIENT_IDENTIFIER = inClientIdentifier
       
        self.__thread_running = True        # when set to false, cancels the socket to listen
        self.__guid = uuid.uuid4()          # unique identifier name
        self.__associated_socket = None     # Initialize the socket
           
        # Start the thread
        self.setDaemon(True)
        self.start()

           
    def stop(self):
        """
        Close the socket and stop the thread
        @param self The object pointer
        """

        self.__thread_running = False
        self.__associated_socket.close()

   
    def setup(self,inConnection):
        """
        Listens to the connection given to receive external data.
        @param self The object pointer
        @param inConnection The incoming server connection
        """

       
        operation = True
        conn, addr = inConnection.accept()
        print 'Connected to: ', addr
        while self.__thread_running:
            data = conn.recv(4096)          # read up to 4096 bytes at a time
            if data == "print":
                print "switching to print mode"
                operation = True
                continue
            if data == "evaluate":
                print "switching to evaluation mode"
                operation = False
                continue
           
            if operation:               # if in print mode, print whatever received
                print data
            else:
                hou.session.dataobject.data = data  # otherwise store the data in the houdini session currently active
               
            if not data: break
           
        print "connection broken with: %s" % str(addr)
       

    def run(self):
        """    
        This is the override of the threading.Thread.run() function, which is started on the HPythonServer's thread.
        As long as the __thread_running variable is True we will stay in this function listening for data from the socket.
        @param self The object pointer
        """

       
        HOST = ''                                  # An empty string binds to all interfaces
        PORT = self.__class__.HOUDINIPORT          # Port to connect to Houdini with
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind((HOST, PORT))
            s.listen(1)
        except socket.error:
            s.close()
            s = None
            return
       
        while self.__thread_running:
            self.setup(s)


def startHPythonServer(inDaemonThreadName):
    """
    Helper function for setting up and starting a HPythonServer
    @param inDaemonThreadName The name to use for the thread containing the client, this name is used to ensure the client works as a singleton
    @return object instance
    """

    mcd_thread = None
    for thread in threading.enumerate():
        if thread.name == inDaemonThreadName:
            mcd_thread = thread
   
    if mcd_thread is None:
        mcd_thread = HPythonServer()
        mcd_thread.name = inDaemonThreadName
       
    return mcd_thread

     
def stopHPythonServer(inDaemonThreadName):
    """
    Helper function for stopping an HPythonServer
    @param inDaemonThreadName The name of the thread that the client should be running in, by using this name the function will find the thread and stop it.
    """

    mcd_thread = None
    for thread in threading.enumerate():
        if thread.name == inDaemonThreadName:
            mcd_thread = thread
           
    if mcd_thread is not None:
        mcd_thread.stop()

When launching Houdini, a separate thread running alongside Houdini should now be waiting for packets to arrive. To connect to Houdini, simply specify the server and port you want to connect to. Messages that are sent are printed to screen in Houdini.

To connect to Houdini, do the following.

import socket

HOST = 'localhost'                              # The remote host
PORT = 2000                                     # The same port as used by the server
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))                         # Connect
s.send("Hello Houdini, I am your father")       # Send a message
s.close()                                       # Close the connection

Of course, sending simple strings doesn't get you far. In order to actually do something useful, the data needs to be serialized and deserialized, but that's something I might address some other time. A handy tip is to look into the hou.session module: it can be used to store received data and is accessible throughout Houdini. Think of parsing mesh data that can then be accessed in a Python SOP. Very useful!
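As a starting point, serialization could be as simple as JSON-encoding point positions before pushing them through the socket. A hedged sketch; pack_points and unpack_points are made-up helper names, not part of any Houdini API:

```python
import json

def pack_points(points):
    """Serialize a list of (x, y, z) points to a JSON byte string
    that can be pushed through the socket."""
    return json.dumps({"points": [list(p) for p in points]}).encode("utf-8")

def unpack_points(data):
    """Decode the byte string back to tuples, e.g. inside a Python SOP
    reading the data back from hou.session."""
    return [tuple(p) for p in json.loads(data.decode("utf-8"))["points"]]
```

A real protocol would also need framing (message length or a delimiter), since a single recv call is not guaranteed to return one complete message.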