Hybrid setup synchronisation

I have been trying to find a good workflow for producing on a hybrid setup lately. I have a lot of hardware synthesizers and a lot of trouble getting everything synchronized. Playing software tracks in Ableton while recording multiple external instruments in time seems very difficult for me to achieve.

I would appreciate it if someone here with more experience and understanding of this could demystify some basics for me. As a beginner I feel overwhelmed by all of it.

At the moment I am using Ableton as the master clock. Ableton sends clock via USB to the E-RM Multiclock, and the Multiclock sends clock to the Digitakt, Pulsar, DFAM and a MIDI thru box that distributes the clock to a few more devices (RD-9, Oto Machines Bim).
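
For reference, the MIDI clock in that chain is just a stream of single-byte tick messages at 24 pulses per quarter note, plus start/stop. Here is a minimal sketch of that stream using the python-mido library; the port name is a placeholder, and the sleep-based timing is deliberately naive (its jitter is roughly what an audio sync signal avoids):

```python
# Rough sketch of a MIDI clock stream using python-mido. The port name is a
# placeholder -- pick a real one from mido.get_output_names(). The sleep()-based
# timing drifts and jitters, which is why sample-accurate audio sync is preferred.
import time
import mido

BPM = 120
PPQN = 24                            # MIDI clock resolution: 24 pulses per quarter note
tick_s = 60.0 / (BPM * PPQN)         # ~20.8 ms between clock messages at 120 BPM

with mido.open_output('Your MIDI Port') as port:   # placeholder port name
    port.send(mido.Message('start'))                # transport start
    for _ in range(PPQN * 4):                       # one bar of 4/4
        port.send(mido.Message('clock'))
        time.sleep(tick_s)
    port.send(mido.Message('stop'))
```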

All the hardware is quite well in sync, but it seems difficult to get everything to start on the right spot when I'm recording into Ableton. I know that syncing the Multiclock with an audio signal is more stable, but since I replaced my RME Fireface with the Big Six I didn't want to sacrifice a mono output solely for that purpose. The Big Six works as both a mixer and an audio interface.

I also tried sequencing the Super 6 from Ableton via USB, with no success getting it in sync with the other gear :frowning: I dream of being able to play a kick from Ableton, send a sequence to a synthesizer and record other hardware at the same time. I have been watching a lot of tutorials on YouTube, but all the information and mixed techniques just frustrates me.

How do you guys set things up? Is there a way to do this, or should I just record the external instruments first and then fix the timing of the recordings in Ableton? Maybe you know a good tutorial that explains how to sort all of this out?

My entire setup consists of:
MacBook Pro running Ableton
SSL Big Six
E-RM Multiclock
Digitakt
DFAM
Pulsar 23
UDO Super 6
RD-9
Nord Drum 2
Norand Mono
Subsequent 37
Bunch of Eurorack
FX: Bim, Bam and Boum

I'm using a similar setup, except I'm using an audio output to send clock to the E-RM, and I'm getting very good sync.
I would recommend finding a way to sync the Multiclock from an audio channel over USB.

You really do need to use the Multiclock's audio sync to achieve what you want to do. What about a second cheap audio interface, run as an aggregate device on the Mac?

Some devices work great in an aggregate, some less so, but it is an option.

Thank you for the answer.

The aggregate device crossed my mind too. Let's say I had the Multiclock synced through audio… how would I then sync the Super 6, which receives MIDI notes via USB? By tweaking a MIDI delay for the Super 6? Or just sequencing it from some hardware? And should I record audio with input monitoring set to In, Auto or Off?

Do you use the Multiclock in POS/NEG mode or POS mode? It seems that if I want the ability to tweak the individual channel timing it needs to be in POS/NEG, but then there's the count-in. Should I also use the Multiclock to set Ableton's clock?

I have two tracks in Ableton for each instrument. One has an External Instrument device on it; the other is just an audio track with its input set to the External Instrument track and monitoring off. I only record to the audio track when I'm ready.

The ERM is set to POS only, to eliminate count-in issues. There is a good video on YouTube which explains it. It talks about setting a negative track delay on the ERM audio channel, but I've found I get better results using the audio interface latency offset (Driver Error Compensation) in Ableton's preferences instead.

All my ERM outputs are routed to a Mio 10. I send clock with 0 delay from one ERM channel to the instrument that deals with clock worst, which in my case is the MPC One. All the others are then clocked from different ERM outputs with varying amounts of positive delay.

I play a kick drum or tick through each instrument, record the results and look at how close they are in Ableton's timeline, adjusting to get as close as possible. It isn't perfect, but it's only off by a 256th of a beat or so.
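
If anyone wants to reproduce that measurement, the arithmetic is just converting the gap you see on the grid into milliseconds and a fraction of a beat. A rough sketch with made-up numbers (sample rate, tempo and offset are placeholders, not values from my setup):

```python
# Convert a measured offset (in samples, read off Ableton's timeline) into
# milliseconds and a fraction of a beat. All values below are placeholders.
SAMPLE_RATE = 44100      # Hz
BPM = 120
offset_samples = 86      # gap between the grid and the recorded transient

offset_ms = offset_samples / SAMPLE_RATE * 1000
beat_ms = 60000 / BPM                  # one beat = 500 ms at 120 BPM
fraction = offset_ms / beat_ms

print(f"{offset_ms:.2f} ms is about 1/{round(1 / fraction)} of a beat")   # ~1.95 ms, ~1/256
```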

The ERM doesn't sync MIDI, but if you are sending MIDI to the UDO from Ableton, Ableton will adjust the audio return as long as you have delay compensation enabled.

I do a mix of external sequencing and Ableton sequencing, and with a bit of adjusting you can get it all tight.

Thank you for the answer. I still find some things unclear.

I managed to make an aggregate device with the Big Six and the RME, so at the moment I'm sending the audio sync signal to the Multiclock through the RME. There was still a noticeable delay. I also went through the procedure of optimizing Ableton's settings and Driver Error Compensation.

I found this tutorial on YouTube and followed the instructions: first I added a negative track delay on the E-RM plugin channel and then compensated with a positive delay on the Multiclock. This way I got the delay relatively small on the devices that are connected directly to the Multiclock. I am sequencing the Pulsar 23 from the Digitakt, though, so there is a chain of MIDI connections, and I don't really know how that affects everything.
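
To put a rough number on what a chain of DIN MIDI hops can add: DIN MIDI runs at 31250 baud with 10 bits on the wire per byte, so each byte takes about 0.32 ms, and a device that re-transmits (as opposed to a hardware thru box) buffers at least one whole message. A back-of-the-envelope sketch, with the hop count as an example rather than a measurement of this particular chain:

```python
# Ballpark serial-delay figures for DIN MIDI. 31250 baud, 10 bits per byte on
# the wire (start + 8 data + stop). Hardware thru boxes add almost nothing;
# devices that re-transmit buffer at least one full message per hop.
BAUD = 31250
BITS_PER_BYTE = 10

def message_ms(num_bytes: int) -> float:
    """Transmission time in milliseconds for one MIDI message."""
    return num_bytes * BITS_PER_BYTE / BAUD * 1000

print(f"clock tick (1 byte): {message_ms(1):.2f} ms")   # ~0.32 ms
print(f"note-on (3 bytes):   {message_ms(3):.2f} ms")   # ~0.96 ms

hops = 2   # e.g. Multiclock -> Digitakt -> Pulsar 23 (illustrative, not measured)
print(f"~{message_ms(3) * hops:.2f} ms worst case added by {hops} re-transmitting hops")
```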

So at the moment I have the E-RM connected directly to the RD-9, DFAM, Digitakt and one extra device. From the Digitakt the MIDI goes to a thru box that feeds all the devices that need external sequencing.

I'm sequencing the Super 6 over USB directly from my MacBook. Do you mean I should also connect it to the Multiclock somehow? I could not find the delay compensation setting either.

Did you try Ableton Link as well, for the MPC One?

Turning monitoring off on the track enables the delay compensation; you can duplicate the track to monitor it. You probably know this already.

I'm currently debating with myself whether the Multiclock would be worth it. I see timing problems when other MIDI controllers are connected (and I'm tweaking them), or when heavy-processing VSTs are loaded. At least in my experience, the first track in the set has the lowest latency, since it runs on the first CPU core. It's worthwhile trying out the track order as well.

I tried it recently, but the results weren't as good as syncing to the ERM. I don't use Link for anything else though, so that might be my fault.

Did you monitor the MPC through Ableton, or via the mixer/interface? I think Link compensates so it runs in parallel with Live; I have to test this myself. As of today I have used the MPC as a plugin in Live, but I think running it standalone would avoid adding to Ableton's processing load. OK, the reverb sucks and I just could not use Valhalla, but maybe I can live without that.

I monitored through Ableton. It wasn't doing what I thought it would; I assumed it would give perfect sync. The sync was OK but with a lot of latency, so it was out of time with everything else.

I've tried the VST, but you can't sequence external synths from the MPC that way. I'm assuming you can via Ableton routing, but that's not how I wanted it to work, so I went back to standalone.

Can you try it in parallel with direct monitoring? I will do the same to verify how the Ableton Link sync works.

I found these resources, maybe interesting for some of you who want to look behind the scenes:

https://www.thesycon.info/eng/system_info.shtml

I was not aware of this feature:

Maybe someone else could test it. I don't want to go changing anything at the moment; I'm in the middle of a project and don't want to screw anything up by changing settings.

Using a hybrid setup with iPad / Drambo.
It just works, since iOS has decent latency.
Personal computers are so 20th century.

I put the Octatrack on track one of the Ableton Live set, which is very empty: only FabFilter L2 on the master and some empty audio clips to record into.
The MPC is running via Link and the mixer/interface. This combination runs relatively stably, but the OT starts one bar too late. I could compensate for this with an empty sequence on the MPC at the beginning of the project. I know that program changes have to be sent early for Elektron equipment, but start/stop messages? There is nothing earlier than start, so is setting up an empty scene where the synchronization happens the way to go?

The advantage of remote control via the MPC is that I possibly don't have to bring a monitor along with the MPC; the Mac mini is then essentially a sound module. (But you would have to configure automatic login and have Ableton Live open the project on startup.)