I'm searching for the best way to prepare a live drawing performance for a music and VJing gig.
My first idea was to connect a camera to VDMX, draw live, and play with FX on the camera feed of the drawing.
But I was also thinking about using a classic Wacom graphics tablet.
Is there a way to connect a tablet to VDMX, or to link VDMX to a drawing app?
I have done this with an artist drawing on an iPad Pro with an Apple Pencil, using a drawing app called Procreate.
The iPad is connected to the Mac with AirServer (making the Mac an AirPlay receiver) and then routed from AirServer to VDMX with Syphon.
Procreate has a mode that sends a second screen out via AirPlay without any GUI from the app.
I have also done this with Photoshop, capturing the window and sending it to VDMX.
My iPad is too old to run the app, but thanks for the suggestion; I'll probably test it in the future.
The Photoshop capture in VDMX will be a bit complicated because I'll already have Ableton Live and VDMX running on the same laptop for the show, so it seems a bit too heavy for my computer.
I may have found the solution, thanks to my resourcefulness lol… using an old overhead projector for the live drawing, plus VDMX plugged into a video projector. It could be an interesting mix of techniques, like drawing characters into a video setting, for example…
I'll experiment with that and come back to this conversation to share the results.
Here are a few variations I've done for live drawing in the past…
Building a physical ‘drawing table’ with a camera mounted above or below the drawing surface (for below you need to use tracing paper, and there are some limitations on drawing/painting material options, but these setups are great for live performances because people really see how the images are being made, and they enable audience participation)
I’ve worked with a few artists who work in Photoshop, and there are a few ways to go about setting up a performance for this. In one case, the artist wanted to work on their own computer (which had brushes, etc. installed), so we did a video mirror of their display and captured that as an HDMI input (now we could do it over NDI…). In the other case, we performed on the same computer using the ‘window grabber’ – this required me to preconfigure my settings in VDMX for MIDI control so they could use the main interface of the computer for Photoshop.
Another route that I’d like to explore is creating some ISF compositions that take the various inputs from a Wacom and apply brush strokes into a persistent buffer.
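To make that persistent-buffer idea concrete, here is a minimal ISF sketch of what such a composition could look like. This is an assumption-laden starting point, not a finished brush engine: it presumes the Wacom's position and pressure are already mapped onto published inputs (for instance via VDMX UI sliders, MIDI, or OSC), and the input names `penPos`, `penPressure`, `brushSize`, and `clearCanvas` are all hypothetical.

```glsl
/*{
  "DESCRIPTION": "Paints soft dots from a pen position into a persistent buffer",
  "INPUTS": [
    { "NAME": "penPos",      "TYPE": "point2D" },
    { "NAME": "penPressure", "TYPE": "float", "DEFAULT": 0.5,  "MIN": 0.0,   "MAX": 1.0 },
    { "NAME": "brushSize",   "TYPE": "float", "DEFAULT": 0.02, "MIN": 0.001, "MAX": 0.2 },
    { "NAME": "clearCanvas", "TYPE": "event" }
  ],
  "PASSES": [
    { "TARGET": "canvas", "PERSISTENT": true, "FLOAT": true }
  ]
}*/

void main() {
    // Read back what was painted on previous frames (the persistent buffer).
    vec4 prev = IMG_THIS_NORM_PIXEL(canvas);
    if (clearCanvas) prev = vec4(0.0);

    // Distance from this pixel to the pen tip, in normalized coordinates
    // (point2D inputs arrive in pixels, so divide by RENDERSIZE).
    // Note: no aspect-ratio correction here, so brushes are slightly oval.
    vec2 uv = isf_FragNormCoord;
    float d = distance(uv, penPos / RENDERSIZE);

    // Soft-edged circular dab, scaled by pen pressure.
    float ink = smoothstep(brushSize, brushSize * 0.5, d) * penPressure;

    // Accumulate white ink over the existing canvas.
    gl_FragColor = max(prev, vec4(vec3(1.0), 1.0) * ink);
}
```

Because dabs are only stamped once per frame, fast strokes would leave gaps; a fuller version would interpolate between the previous and current pen positions.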
I haven’t used it, but apparently Adobe Character Animator supports Syphon, and is used for real-time animation stuff?
For each of these, one really great technique is to use the movie recorder plugin to record snippets of the live drawing that can be layered on top of each other as part of the composition, e.g. capturing ‘gestures’ or characters as foreground elements that have a black background, then starting over to create backgrounds, and changing the combinations of elements.
Also in that thread, around that time I was working on this shader, and I linked to this video. I don’t think I ever made a version of it that wasn’t an automatic random walk for the ‘pen’ point, but it could be the starting point for something that takes a manual input. Random Walker.fs.zip (1.5 KB)
Beyond that, I didn’t see this mentioned, but when working with an iOS device you can directly capture its screen into VDMX when it is connected via a cable, so you can use pretty much any drawing app; you may just need to zoom/crop some of the interface out.
That’s a great Procreate tip; I hadn’t realised it was hidden in there. The other option, using AirPlay and Syphon, is Tagtool, which is not as sophisticated for drawing as Photoshop or Procreate BUT it does have excellent animation possibilities and a projector window like Procreate. Worth a look…