I originally made it to meet our own needs, then open-sourced it: we needed to move a PTZ cam based on the stage/pulpit mute states on our X32, but it is capable of way more. Let me know what you guys think!
Cheers!
PTZ - pan/tilt/zoom camera, that much I understood. The rest? Uh… can I get an ELI5 please?
Even though I’m clearly not in the target demographic, I’m eager to learn more.
Edit: ok, clicked through to GitHub, now I (kinda) got what it’s for :)
In this case, there are multiple points of interest on the stage which are sometimes used, and sometimes not. When an area of the stage is unused, the microphone(s) at that location are manually muted to eliminate unwanted noise. The remaining unmuted microphone is at a location of interest, which is also the logical thing for a motorized camera to point toward and zoom onto at that moment.
This project uses the muted/unmuted states of microphones as a cue for camera movement, though it takes some upfront work to set up. It could also cause trouble for looser or more improvisational shows, where that kind of rigidity might actually get in the way.
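The "one unmuted mic decides where the camera looks" logic described above can be sketched roughly like this. This is a minimal illustration, not the project's actual code: the channel names and preset numbers are made up, and a real setup would read the mute states over OSC from the console.

```python
# Hypothetical sketch: pick a camera preset from mixer mute states.
# Channel names and preset numbers are invented for illustration.

# Map each mic channel to the camera preset covering its stage area.
CHANNEL_TO_PRESET = {
    "pulpit": 1,
    "band": 2,
    "choir": 3,
}

def pick_preset(mute_states):
    """Return the preset for the single unmuted channel, or None
    when zero or several channels are open (ambiguous situation)."""
    unmuted = [ch for ch, muted in mute_states.items() if not muted]
    if len(unmuted) == 1:
        return CHANNEL_TO_PRESET.get(unmuted[0])
    return None

print(pick_preset({"pulpit": False, "band": True, "choir": True}))  # 1
```

Returning `None` on ambiguity (multiple open mics) is one way to avoid the camera chasing a target during transitions; a real implementation would also want some debouncing.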
[1]: https://www.behringer.com/series.html?category=R-BEHRINGER-X...
This allows them to be programmed as general purpose computers.
I understand how the automation of "point at stored location when something happens there" could be useful on some sets, but it also looks like you could do the jog button automation with this toolkit. Vary the speed of the movement with a volume slider, use fade in and fade out to smooth out the motion...
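The "vary the speed with a fader, fade in and out to smooth the motion" idea could look something like this exponential-smoothing sketch. Everything here is assumed: fader values arriving as floats in 0.0–1.0 (as OSC consoles typically send them), and an arbitrary maximum speed.

```python
# Hypothetical sketch: derive PTZ pan speed from a fader and smooth it.
# Fader range (0.0-1.0) and the speed ceiling are assumptions.

MAX_SPEED = 24.0  # degrees per second, arbitrary

class SmoothedSpeed:
    """Exponential smoothing to 'fade' speed changes in and out."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha   # 0..1, higher = snappier response
        self.value = 0.0

    def update(self, fader):
        target = fader * MAX_SPEED
        self.value += self.alpha * (target - self.value)
        return self.value

s = SmoothedSpeed()
for _ in range(5):
    s.update(1.0)  # fader pushed to full: speed ramps up gradually
```

Feeding each incoming fader value through `update()` gives a motion that eases toward the target speed instead of jumping, which is roughly what an audio fade does to level.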
To paraphrase Maslow, I suppose it is tempting, when the only tool you have is a customizable digital mixing console, to treat every problem as if it were an audio channel!
I’ve been playing with hooking up a MIDI controller to my OBSBot Tail Air PTZ camera and OBS.
The config, filters, and triggers look similar to my prototypes.
I’ve been wondering if there’s any sort of prior art or standards here from other domains like lighting consoles or workflow systems.
If I understand your prior-art question, I think there are a few:

- the OSC protocol, for audio gear and some other studio equipment
- in the lighting world, I hear DMX is king
- OBS has a very extensive WebSocket API
For PTZ cameras I'm not sure; our old PTZ camera needs HTTP GETs to weird URLs (RPC-style).
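Those "HTTP GETs to weird URLs" typically boil down to building a query string and firing a request. A hedged sketch, with a completely invented path and parameter scheme (every camera vendor has its own, often undocumented):

```python
# Hypothetical sketch: drive a PTZ camera exposing plain HTTP GET
# "RPC" endpoints. The IP, path, and parameters are all invented.
from urllib.parse import urlencode
from urllib.request import urlopen  # only needed for the commented send

CAMERA = "http://192.168.1.50"

def ptz_url(action, **params):
    """Build the GET URL for a camera command, e.g. a preset recall."""
    return f"{CAMERA}/cgi-bin/ptz?{urlencode({'action': action, **params})}"

url = ptz_url("recall", preset=2)
# urlopen(url, timeout=2)  # fire the request against a real camera
print(url)
```

In practice the only reliable way to learn the real endpoints is the vendor manual or sniffing the camera's own web UI.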
Thanks again for starring!
Interestingly, both seem to be projects from French developers, and they look very similar.
Also see IanniX (https://www.iannix.org/fr/), which is another OSC sequencer from French developers and researchers, and Open Stage Control: https://openstagecontrol.ammd.net/ (yes, also French ;p)
@op: ossia has some support for PTZ cameras (mainly through NDI) and MIDI/HTTP/WS/etc. I'd be curious to know whether it would fit the bill. Hooking some MIDI control to a PTZ movement should definitely be a matter of a few clicks at most, but I don't have a PTZ camera plugged in right now to make a small video.
I have started to mess around with Chataigne, and it seems promising, but so far I'm getting stuck at trivial steps (matching a regexp, converting a "contains" to a bool, etc.). We'll see how I progress with support :)
But that guy is clearly brilliant, just like Chataigne.
Thanks for mentioning!
Just spitballing.
Also, general audio runs from 20 Hz to 20 kHz, so you don't really have headroom to super- (or sub-?) impose another signal frequency-wise; it would be audible. (Unless you sacrifice part of the band with a high/low cut on the original signal.)
You could run a 48 kHz sampling rate on most audio gear, though, so you could do time multiplexing if you were really desperate, but then every signal would need processing before becoming useful/noiseless.
About the cable sheath: of course you can do anything with custom circuits, but general audio gear won't help you there as far as I know.
If you're asking because controlling the cam with a mute status felt out of place to you: no, it's not wasting any channels. As someone already explained very precisely above, the point is that you don't want live but unused microphones in your setup, due to feedback, extra noise, etc.
So if the speaker is speaking, the band is muted; if the band is playing, the speaker is muted. This can be used (in our case; not necessarily for everyone) to track where the action is happening, so the camera can turn to the right position. In our use case it eliminated the extra step of having someone manage the camera by hand.
By the way, many consoles have extra buttons/knobs that are assignable to arbitrary things, so through OSC I could query their states as well and set up camera movement from those too if I wanted to.
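Querying a state over OSC is just sending the address with no arguments and waiting for the reply. A minimal sketch of the wire encoding, assuming an X32 (which listens on UDP 10023; `/ch/01/mix/on` is channel 1's mute/on state, but check the console's OSC docs for your addresses):

```python
# Hedged sketch: encode a bare OSC query message for a console.
# Padding rules follow the OSC 1.0 spec; the target IP is made up.
import socket

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_query(address: str) -> bytes:
    """An OSC message with an empty type-tag string (no arguments)."""
    return osc_pad(address.encode()) + osc_pad(b",")

msg = osc_query("/ch/01/mix/on")
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(msg, ("192.168.1.32", 10023))  # X32 listens on UDP 10023
# reply, _ = sock.recvfrom(1024)             # console answers with the value
```

In practice a library like python-osc does this encoding for you; the sketch just shows that there's no magic in the packet.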
[1] https://documents.blackmagicdesign.com/UserManuals/ShieldFor...
By the way, we do have a custom keyboard as well, which is basically a separation of concerns: a simplified interface for OBS, which is intimidating for most of our crew. But they can handle 10 buttons with nice labels.
For example they have one button that toggles the camera view and the projector view.
I also scripted OBS to manage camera positions via the scenes "Pulpit", "Stage", "Pulpit wide", and "Sitting", so they can manually override the automation 1% of the time, while 99% of the time the automation is enough.
I have also scripted OBS to manage the stream lifecycle based on which scene is active. So:

- they can bring up the "Starting soon" scene, which starts the stream but not the recording
- they can bring up the "Break" scene, which pauses(!) the recording but not the stream
- they can bring up the "Finished" scene, which stops everything with a nice fade-out and a "thank you for joining" text, etc.
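The scene-to-lifecycle mapping above is essentially a dispatch table. A hedged sketch of just that logic; in OBS proper this would live inside a script reacting to scene-change events, and the action names here are invented placeholders:

```python
# Hypothetical sketch of a scene -> stream-lifecycle dispatch table.
# Scene names follow the comment above; action names are made up.

SCENE_ACTIONS = {
    "Starting soon": ["start_stream"],
    "Break":         ["pause_recording"],
    "Finished":      ["fade_out", "stop_stream", "stop_recording"],
}

def on_scene_change(scene):
    """Return the lifecycle actions to run for the newly active scene.
    Scenes not in the table (e.g. camera scenes) trigger nothing."""
    return SCENE_ACTIONS.get(scene, [])

print(on_scene_change("Break"))  # ['pause_recording']
```

Keeping the mapping declarative like this makes it easy for non-technical operators to reason about: each scene name on the keypad corresponds to exactly one row.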
So a control pad can work together with this tool.
I've used it a lot for the original designed use-case (sending parameter updates between controllers and music synths), but also a bunch of other things (sending tracking information from a python computer vision script to a Unity scene).
I kinda dig this concept of O2 (https://rbdannenberg.github.io/o2/) layered over OSC for easy interoperability, but it also seems like a lot of work, and I'd rather spend my time making music.
I'm not a fan of MIDI, but it does kind of blow my mind when things just work. Having OSC be as plug-n-play as MIDI would be awesome!
X Touch MIDI goes into a piece of software called open stage control https://openstagecontrol.ammd.net/ which runs https://github.com/xxpasixx/pam-osc which then translates the MIDI messages to the correct OSC commands to send to grandMA3. Then on the grandMA3 side there is a lua plugin that sends OSC commands back out to open stage control to set fader positions and LED status.
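The MIDI-to-OSC translation step in that chain is conceptually tiny. A hedged sketch, with an invented OSC address pattern (pam-osc defines the real grandMA3 mapping): a 7-bit MIDI control-change value becomes a 0.0–1.0 fader level.

```python
# Hypothetical sketch of MIDI CC -> OSC fader translation.
# The "/fader/N" address pattern is a placeholder, not pam-osc's real one.

def cc_to_osc(cc_number, cc_value):
    """Translate a MIDI control-change into an (address, float) pair."""
    fader = cc_value / 127.0          # MIDI 7-bit range -> 0.0-1.0
    address = f"/fader/{cc_number}"   # placeholder address pattern
    return address, round(fader, 4)

print(cc_to_osc(1, 127))  # ('/fader/1', 1.0)
```

The reverse direction (console state back out to the controller, for motorized faders and LEDs) is the same mapping inverted, which is what the Lua plugin mentioned above handles on the grandMA3 side.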