Easy and Modular

SPARCK is highly modular by design and adapts flexibly to your needs.

It uses a Node-based composition principle.

Every tool is encapsulated in a Node, and new Nodes with new functionality become available every month.

3D View

SPARCK lives and breathes 3D space.

It has a live Viewer to present your scene and visualize your installation: the calibrated projectors, virtual cameras, lights, LED strips, imported scenes or meshes, previews of dynamic softedges and much more.

Light and Shadow

You can load the most common 3D model formats with integrated textures and animations into SPARCK.

Add lights to make sure your scenes look even more realistic.

Projection Mapping

There are different tools at your disposal to solve mapping problems, from the very simple setup to the highly complex interactive realtime multi-projector installation.

With the CornerPin-Node you can quickly match a rectangular video texture onto simple planar surfaces.
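
Under the hood a corner pin is nothing more than a planar homography determined by the four point pairs. Here is a minimal Python sketch of that idea (not SPARCK's internal code; it uses OpenCV, and the corner coordinates are made up):

import numpy as np
import cv2

# Four corners of the source video texture, in pixels ...
src = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])
# ... and where those corners should land on the planar surface.
dst = np.float32([[102, 85], [1810, 60], [1870, 1040], [80, 1005]])

frame = np.zeros((1080, 1920, 3), np.uint8)  # stand-in for one video frame

# The four point pairs fully determine the corner-pin warp (a homography).
H = cv2.getPerspectiveTransform(src, dst)
warped = cv2.warpPerspective(frame, H, (1920, 1080))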

The MeshWarp-Node allows you to use your own custom meshes and manipulate them with a configurable Bézier-based lattice to make your content fit the real object.

And the Calibrator-Node uses 3D keystoning to match a virtual projector to the position of a real projector with high accuracy, using corresponding 2D/3D point pairs.
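
The principle behind such a 2D/3D calibration can be sketched with OpenCV's pose estimation. This is not SPARCK's actual solver, and the point data and projector intrinsics below are hypothetical:

import numpy as np
import cv2

# Known 3D points on the virtual model of the physical object ...
object_points = np.float32([
    [0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0],
    [0.0, 1.0, 0.0], [0.5, 0.5, 0.4], [0.2, 0.8, 0.4],
])
# ... and the projector pixels that hit those points in the real world.
image_points = np.float32([
    [310, 720], [1540, 745], [1585, 180],
    [290, 150], [960, 430], [590, 300],
])

# Rough projector intrinsics: focal length in pixels and principal point.
K = np.float64([[1800, 0, 960], [0, 1800, 540], [0, 0, 1]])

# solvePnP recovers the projector pose from the correspondences, which is
# what lets the virtual projector be placed exactly like the real one.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
R, _ = cv2.Rodrigues(rvec)
print("estimated projector position:", (-R.T @ tvec).ravel())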

Dynamic softedges for spatial augmented reality with up to 6 projectors can be calculated in realtime on the fly.
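
Conceptually, a softedge is a gamma-compensated cross-fade over the overlap zone of neighbouring projectors. A one-dimensional Python sketch of the two blend ramps (SPARCK does this per pixel on the GPU; resolution, overlap and gamma values here are just examples):

import numpy as np

width = 1920   # projector width in pixels (example value)
overlap = 400  # pixels shared with the neighbouring projector
gamma = 2.2    # display gamma assumed for the blend

x = np.arange(width, dtype=np.float32)

# Linear ramp fading out the right edge of the left projector ...
ramp_left = np.clip((width - x) / overlap, 0.0, 1.0)
# ... and the complementary ramp for the left edge of the right projector.
ramp_right = 1.0 - ramp_left

# Gamma-correct both ramps so the summed light output stays even
# across the overlap zone.
blend_left = ramp_left ** (1.0 / gamma)
blend_right = ramp_right ** (1.0 / gamma)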

Canvas Types

Only your imagination (or the physics of 3D space) limits the types of canvas SPARCK can project onto. All you need are custom created meshes which are the virtual representations of the physical shapes SPARCK will project onto.

Dome

Cylinder

Stone

Dodecahedron

Head

Car

Projection Modes

SPARCK can also project video textures in a purely virtual manner.

Use the good old UV mapping technique and map your content onto a custom mesh with UV-vertices.

Take a single texture and virtually project it onto a virtual canvas, then capture it with the virtual representation of a projector and project it onto the physical world. Easy.
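
In other words, the texture coordinate for every point on the canvas is found by pushing that point through the virtual projector's view and projection matrices. A rough numpy sketch of that single step (the matrices and values are illustrative, not SPARCK's code):

import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    # View matrix of the virtual projector (same maths as a virtual camera).
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)
    r = np.cross(f, up)
    r /= np.linalg.norm(r)
    u = np.cross(r, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def perspective(fov_deg, aspect, near=0.1, far=100.0):
    # Standard OpenGL-style projection matrix for the projector's lens.
    t = 1.0 / np.tan(np.radians(fov_deg) / 2.0)
    p = np.zeros((4, 4))
    p[0, 0], p[1, 1] = t / aspect, t
    p[2, 2] = (far + near) / (near - far)
    p[2, 3] = 2.0 * far * near / (near - far)
    p[3, 2] = -1.0
    return p

# A point on the virtual canvas, in world space (illustrative values).
world_point = np.array([0.3, 1.2, -2.0, 1.0])

# Push it through the virtual projector ...
clip = perspective(35.0, 16 / 9) @ look_at((0.0, 1.5, 2.0), (0.0, 1.0, -2.0)) @ world_point
ndc = clip[:3] / clip[3]
# ... and remap from [-1, 1] to [0, 1] to get the texture lookup coordinate.
uv = (ndc[:2] + 1.0) / 2.0
print("projector-space UV:", uv)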

You can also create your own realtime raytracing-like Raymarching shader and use it to create amazing immersive spaces.
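
A raymarcher steps along each camera ray by the value of a signed distance function until it reaches a surface. In SPARCK this runs as a shader on the GPU; the following single-ray Python sketch only illustrates the principle (the sphere SDF and values are made up):

import numpy as np

def sphere_sdf(p, radius=1.0):
    # Signed distance from point p to a sphere centred at the origin.
    return np.linalg.norm(p) - radius

def raymarch(origin, direction, max_steps=64, epsilon=1e-3):
    # Walk along the ray; each step is as long as the distance field allows.
    t = 0.0
    for _ in range(max_steps):
        dist = sphere_sdf(origin + t * direction)
        if dist < epsilon:
            return t   # close enough: the surface is hit
        t += dist
    return None        # no hit within max_steps

origin = np.array([0.0, 0.0, -3.0])
direction = np.array([0.0, 0.0, 1.0])
print("hit distance:", raymarch(origin, direction))  # roughly 2.0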

Or use the more sophisticated BoxMap projection to seamlessly map complete 360° 3D scenes from the internal RenderEngine or from an external one via Syphon or Spout.

RealTime Tracking

In order to drive interactive content, there are several ways to get external tracking tools to talk to SPARCK.

Using the third-party camera-based tracking system OptiTrack, SPARCK can be configured to receive a real-time transformation data stream for multiple objects in the scene. This data is then used to move the corresponding 3D objects inside SPARCK in sync with the physical models, providing a tracked match between the virtual and physical worlds.

Using a Microsoft Kinect together with HeadRoom, SPARCK can track visitors and their heads, and use that data to move corresponding virtual cameras that record the virtual scene and project it back onto the physical world. This is especially useful where the user can just walk up to a spatial augmented reality installation and doesn't need to wear or hold anything to get the experience.

Build your own interactive Turntable based on an open-source design of ours and link it to SPARCK. Combined with the Kinect HeadRoom, this easy and cheap tracking solution is a poor man's path to the amazing world of spatial augmented reality.

Scripting

With QueScript, SPARCK has its own nonlinear animation scripting language. With a few commands you can create many parallel running scripts to control any aspect of SPARCK or beyond.

Or use SPARCK – DEVELOPER inside your own Max patches and let Max take care of all the interaction.

<script>
    <que name="myQue">
        <timer/>
        <send>/address list with strings and numbers 0 2.45 56</send>
        <wait timer="5s"/>
        <print>timer has passed</print>
    </que>
    <que name="my2ndQue">
        <anim name="simpleRamp" duration="5s" fadeout="2s">
            <track name="t1">0. 1.</track>
            <send>/address ramp {t1}</send>
        </anim>
        <wait anim="simpleRamp"/>
    </que>
</script>

Multi Screen Video Output

With SPARCK's Display Output Selector you can send your content within seconds to any imaginable configuration of your hardware, be it multiple graphics cards or video splitters (e.g. Matrox or Datapath). In theory, SPARCK can deliver to up to 96 individual output devices.

LED

SPARCK can talk directly to PixelPusher, a highly versatile LED controller, and helps you realize fully 3D aware LED installations.

Combine it with projections and create highly integrated shows.

Integrates with other Apps

Any app that has the industry standards Syphon or Spout integrated can stream visuals in realtime to and from SPARCK.

If the internal 3D toolset is not to your taste, you can use other tools to generate the virtual content and send it to SPARCK to project it onto the real world.

and there is much more

Fast

SPARCK lives on the GPU in 3D space, so everything graphical is done quickly and efficiently with the best image quality possible, always in realtime on the spot.

Remote controllable via OSC

Each and every parameter of every Node you use can be addressed via OSC and thus controlled by any external app, like TouchOSC.
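
As a small illustration, any OSC library can act as such a remote control. The sketch below uses python-osc; the port and address patterns are placeholders, the real ones depend on how your Nodes are named and configured:

from pythonosc.udp_client import SimpleUDPClient

# Hypothetical example: SPARCK listening for OSC on UDP port 10000.
sparck = SimpleUDPClient("127.0.0.1", 10000)

# Set a single float parameter on a Node ...
sparck.send_message("/myProjector/brightness", 0.8)
# ... or several values at once, just like a TouchOSC fader page would.
sparck.send_message("/myProjector/position", [0.0, 1.5, 2.0])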

Live Composite and Effects

A whole suite of color management, live texture masking and mixing tools is available.

Audio

SPARCK plays back audio that is part of your video file. When using SPARCK – DEVELOPER, you have the complete arsenal of MaxMSP at your disposal for audio fidelity craziness.

Videos

SPARCK's video player uses code from FFmpeg and can cope with most common video codecs like HAP, H.264 or Apple ProRes inside containers like AVI, MOV or MPEG.

Textures

SPARCK can deal with TIFF, GIF, JPEG and PNG.

soon to come

Capturing Node

Get live video feeds from capture cards.

Oculus Node

Use the Oculus Rift as an output device.