Everything posted by azslow3
-
Freeze has always been a bit quirky. One related bug seems purely visual: the "Freeze" button is grayed out while freezing is in fact possible. I can't reproduce it with an exact sequence, but I managed to get into that state within a minute. Saving/reopening the project, as well as muting/unmuting the track, helps to get out of it; that very old trick of toggling mute brings things back into a consistent state. In general it is better to have the Synth Rack visible when working with freeze, since it sometimes shows a different picture. When you have problems, check the routing: which tracks point to the synth and how its output is routed. I have never used MIDI output from a synth (it never worked well for me), but I guess that can also influence the internal logic. One bit of strange routing behavior is reproducible:
* One simple instrument track, muted. Freeze is available. Freeze it: silence is rendered. It was muted, so that's logical. Unfreeze.
* Add a new MIDI track and set its output to the same synth. Freeze the first (still muted) track. It is rendered using both MIDI tracks. This time, not really logical... ?
-
In short: in the Windows Device Manager, switch the view option to show all (including disconnected) devices and check that you don't have "duplicates" among the MIDI devices. Always connect USB-MIDI devices to exactly the same USB port. Even a brief connection ("by mistake") to another port means you have to revisit Device Manager and clean up. That does not eliminate the problem, but it at least reduces the chance of strange mappings.
A bit longer... Cakewalk does not use the deepest possible way to re-discover devices. It saves "names" and "numbers" (as can be seen in the INI file), but neither is really persistent in the Windows world of MIDI devices. A better way is in fact rather tricky. Not all MIDI devices are USB devices, and there is no straight route from the USB world to the MIDI world. Not only MIDI devices but also USB devices are not "unique" (unlike, for example, network interfaces). If some device is reconnected, it is not possible to detect whether it is the same device or just a similar one. In addition, several devices of the same type can be connected at the same time; there is absolutely no way to distinguish between them, for example if you swap their cables. So Windows uses the only available "safe" approach: if a USB device with the same IDs is connected to the very same USB port, it is matched to the previously registered device. If there is any doubt, it is declared and registered as a "new" device. The one visible "name" will also be "new"; matching by names is more complicated than one might think. To match things better, software would have to detect whether a MIDI device is a USB device and, if so, try to track reconnections to different USB ports, also doing that at run time. Windows does not really help in that journey. Apart from the list of "known" MIDI devices (with some of them enabled) and assignments to surfaces, a Cakewalk project also keeps a sequential list of the MIDI devices that were active at the time it was saved. So when the project is opened with a "new" device list, it has to be matched somehow.
For "known" MIDI devices the goal is simply to detect what has and has not changed since the last time. But a project could have been created long ago, or even on a different computer. On one hand, users would probably hate having to explicitly re-route MIDI on project open after every MIDI configuration change: "I was using a MIDI keyboard for recording on one computer, so it is natural that another MIDI keyboard is auto-used on another computer, if there is one." But in general such "natural" mappings fail, especially when there is more than one MIDI device. What I mean is that for this part there is no "right" approach; any particular approach will have some consequences.
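The name-then-number matching dilemma described above can be sketched roughly like this. This is a hypothetical illustration of the general problem, not Cakewalk's actual algorithm; the device names and the two-pass strategy are my own invention for the example.

```python
# Hypothetical sketch: match a project's saved (name, index) MIDI device list
# against the currently enumerated devices. Neither names nor indices are
# stable on Windows, so any strategy is a compromise.

def match_devices(saved, current):
    """saved: list of (name, index) pairs; current: list of device names."""
    result = {}
    unused = list(current)
    # Pass 1: match by exact name.
    for name, idx in saved:
        if name in unused:
            result[(name, idx)] = name
            unused.remove(name)
    # Pass 2: fall back to the saved numeric position, if that slot is free.
    for name, idx in saved:
        if (name, idx) not in result and idx < len(current) and current[idx] in unused:
            result[(name, idx)] = current[idx]
            unused.remove(current[idx])
    return result

# A re-plugged USB keyboard often re-appears with a " 2" suffix, so the
# name match fails and the index fallback silently picks it up:
saved = [("USB MIDI Keyboard", 0), ("Synth DIN Port", 1)]
current = ["USB MIDI Keyboard 2", "Synth DIN Port"]
print(match_devices(saved, current))
```

The index fallback is exactly the kind of "natural" mapping that works on one computer and misfires as soon as a second MIDI device changes the enumeration order.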
-
https://www.azslow.com/index.php/topic,295.0.html
-
Sorry, but I don't believe you ? If the knobs on the Mini are in relative mode, they always change the value relative to the current one (once the DAW is configured to accept relative values; otherwise a knob effectively just sets one of two fixed values...). There are several relative modes, and DAW surface plug-ins support several of them, but both sides have to be set manually to match each other. Studio One supports several types of non-PreSonus controllers, but in general they protect their own controller market. So you just had the luck that "it works!" for your controller. When you have a controller with which it does not work, you can't make it work, and no one (except PreSonus) can: they don't have an open surface API (unlike Cakewalk, REAPER, Ableton and Bitwig). The same goes for Cubase. Pro Tools for a long time supported just one protocol for surfaces (Mackie HUI); any controller had to implement it to work with Pro Tools. For a long time now they have also used EUCON. I mean, your claim that "other DAWs" do better with controllers in general is not true. Cakewalk was the first DAW to publish an open-source surface API; the only other DAW which has done the same is REAPER. When someone writes about changing DAWs because of controller support... that is just LOL. DAWs are different, and each has advantages and disadvantages. Many users have and use more than one DAW. You can't "replace" Pro Tools if you are forced to use it. An attempt to "replace" Ableton or Maschine is not going to work well (if you used them for the purpose they were designed for...). And there are FL Studio, Tracktion, etc., which are also quite specific. There can be (and were, when Sonar hit EOL...) discussions about switching between Cakewalk/S1/Cubase/REAPER/Samplitude/some other, since they have significant overlap in functionality and framework. But there are always pros and cons for each, which by far outweigh the possibility of working with the cheapest controller with encoders on the market (Behringer X-Touch Mini).
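The "several relative modes" point above is easy to show in code. This is a generic sketch: the three encodings below are common conventions for relative encoders (names vary by vendor), and which one the X-Touch Mini actually sends depends on how the knob is configured, so check the controller's editor/manual.

```python
# Three common "relative" encodings for the CC value byte v (0..127).
# The DAW side must be set to the same mode the knob transmits, otherwise
# increments are misread as large absolute-style jumps.

def decode_twos_complement(v):   # 1 -> +1, 127 -> -1, 126 -> -2 ...
    return v - 128 if v >= 64 else v

def decode_sign_magnitude(v):    # 1 -> +1, 65 -> -1 (bit 6 is the sign)
    return -(v - 64) if v >= 64 else v

def decode_offset_binary(v):     # 65 -> +1, 63 -> -1 (centered at 64)
    return v - 64

# The same incoming byte means very different steps in different modes:
for v in (1, 65, 127):
    print(v, decode_twos_complement(v), decode_sign_magnitude(v), decode_offset_binary(v))
```

A mismatch here is exactly why a knob can appear to "always set one of two fixed values": the DAW interprets the small increment/decrement bytes as absolute positions near 0 or 127.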
-
With the Behringer X-Touch Mini you have several options:
* Put it into Mackie mode and use the "Mackie Control" surface plug-in in Cakewalk. But it will not control VST plug-ins, only strip pans (unlike the Compact and the big X-Touch, it does not have enough controls to mimic a full Mackie).
* Put it into MIDI mode, set the knobs to transmit relative values, use the "ACT MIDI" surface plug-in, and set it to use relative values for knobs. There you can configure the knobs for different parameters, including "dynamic plug-in mapping". The knobs will not "jump", but there will be no ring indication ("ACT MIDI" does not support feedback).
* Put it into MIDI mode, set the knobs to transmit relative values, use it as plain MIDI input (without a surface plug-in) and MIDI-learn in the VSTi in question (assuming it supports learning relative MIDI controls). You are unlikely to get any feedback.
* Put it into Mackie mode and use a preset for AZ Controller: https://www.azslow.com/index.php/topic,377.0.html You will get feedback and can control VSTs in the "Plug-in control layer".
Sorry to say, but quickly searching YouTube for videos about the Mini and Cubase/Pro Tools/etc., most are a kind of "it's so boring!!!" about one feature or another which doesn't work. Well, logical... Google's top "known bloggers" normally have no idea what controllers are and how to make them work with any DAW (until a particular controller officially supports a particular DAW) ? Any controller/DAW combination requires the user to spend a bit of time learning how to set it up optimally for the use case (which can differ). You can "hit" something that works well "by luck", e.g. when someone has added support for a particular controller to a particular DAW, or the controller comes with its own software for a particular DAW. But you can't expect that to magically work automatically. "A controller" is not "a mouse"; they are all different.
-
I expect Cakewalk to inform me when something changes, in a homogeneous way. Asking Cakewalk to do something does not mean it is done immediately; it can be impossible, or done later. Well, that is not so important, I am just monitoring the value without caching now. @norfolkmastering AZ Controller specific processing is relevant for AZ Controller only. Sometimes Cakewalk does something that is not possible to "fix" in AZ Controller, like the original bug report, and sometimes AZ Controller does something unexpected, like your last observation. Let's not disturb Cakewalk with something they do not manage.
-
"The controller" is like "the musical instrument": it tells nothing about which device, and in which mode, you are using, and for your question that is important. If your controller has encoders (endless knobs), they send the right type of messages (increment/decrement), and "ACT MIDI" is configured to understand these messages, you will have controls always "in sync". If your controller expects "feedback" from the DAW and sends absolute positions, the DAW has to know how to send that feedback. That is controller specific and can't be done in a "generic" way; "ACT MIDI" is not designed for it. Some DAWs support the simplest feedback (sending the current value back to the same place, in the same form the control uses when sending to the DAW), and for some controllers that works, but in most cases there is separate software for a particular DAW and controller combination.
-
Are you sure that is the case when you change it via the API? Sure, it can be my bug, but in my observation the flag is set only when the change is done by mouse.
-
Testing Midi Latency Thanks for your help.
azslow3 replied to John Vere's topic in Cakewalk by BandLab
I think MIDI jitter and audio jitter are from somewhat different "domains":
Audio is a continuous stream. Audio is sampled using a clock, and when this clock is not accurate, samples land at the "wrong" time. Audio jitter is inaccuracy in the timing of samples. If there is more than one clock and they are not synchronized, e.g. when two interfaces are used in parallel, there are "drift" and "jitter" between the audio streams. If the interfaces are synchronized, there is no drift, but there is still some jitter between the streams (samples taken at the same wall-clock time are not put at the same time position in the audio streams). Note that the audio transfer path/speed/latency does not influence that jitter.
MIDI events are not sampled as a continuous stream. Jitter there is a deviation in latency: how late an event is delivered compared with the usual delivery time. Unlike audio, there is no predefined "sample rate". Obviously there is some "rate of reading hardware sensors and converting them to events", but it is unknown and device specific. The only known clock/delay is the MIDI hardware transfer clock (~31.25 kbaud), so it takes ~1 ms to transfer one note. Hardware MIDI transfer uses a uni-directional continuous stream, so a note can be delivered as soon as it is prepared. In other words, ~1 ms is the full (and constant) delay between a note being ready and being delivered (important for the comparison with USB).
USB-MIDI has much higher transfer speed than DIN MIDI. Even USB 1.1 is at least 1.5 Mbit/s (up to 12 Mbit/s), so transferring one (or even several) notes is way faster over any USB (one note in less than 0.02 ms). But USB uses host-driven packet delivery. And here comes the problem: in the "standard" mode used by computer keyboards, mice and "cheap" USB-MIDI, delivery from the device happens only every 5-12 ms (the "polling rate", a device+mode specific fixed number; easy to see in Linux, I have not tried to look under Windows).
So a single note, in the case of 10 ms quantization, will be delivered between 0.02 ms and 10.02 ms after it is ready for delivery, and so there will be "jitter" of up to 10 ms. USB-MIDI devices with their own drivers support (and use) faster polling. With a 1 kHz polling rate, the maximum delivery jitter will be ~1 ms, for any number of simultaneous notes (USB 2+ can go higher, but I have not checked whether that is used in USB-MIDI). -
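The numbers from the post above can be checked with simple arithmetic. This is just the back-of-envelope calculation, assuming 3 bytes per note-on and 10 wire bits per byte (start + 8 data + stop) for DIN MIDI:

```python
# DIN MIDI transfer time vs. USB worst-case delivery jitter, as discussed above.

def din_transfer_ms(num_bytes=3, baud=31250, bits_per_byte=10):
    # 31.25 kbaud serial link; a 3-byte note-on takes ~0.96 ms on the wire.
    return num_bytes * bits_per_byte / baud * 1000.0

def usb_worst_case_delivery_ms(polling_interval_ms, transfer_ms=0.02):
    # A note that becomes ready just after a poll waits one full interval,
    # then the (tiny) packet transfer time.
    return polling_interval_ms + transfer_ms

print(din_transfer_ms())                    # ~0.96 ms per note, constant
print(usb_worst_case_delivery_ms(10))       # ~10.02 ms at a 10 ms poll interval
print(usb_worst_case_delivery_ms(1))        # ~1.02 ms at a 1 kHz polling rate
```

So DIN MIDI is slow but constant, while cheap USB-MIDI is fast but quantized, which is exactly why its jitter (best case minus worst case) can reach the full polling interval.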
Testing Midi Latency Thanks for your help.
azslow3 replied to John Vere's topic in Cakewalk by BandLab
I have not done the test, but one "theoretical" note. Let's say you have a 100 Hz sine wave. That means its period is 10 ms. If the interface's input-to-output latency is 5 ms, recording input and output simultaneously should produce a 180° phase shift. I mean, the visual shift between waveforms depends on the frequency and the interface RTL (round-trip latency). PS. I assume you have checked that the latency reported by the interface is accurate, using a loop-back recording. Good interfaces report it correctly, but if an external mic pre-amp with digital output is used, its latency is not (and can't be) automatically accounted for in the interface RTL. Also, while RTL is easy to measure with a loop-back (in a DAW or with a special utility), its division into input and output parts is way trickier to deduce. -
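The frequency dependence mentioned in the post above is worth seeing in numbers; the 100 Hz / 5 ms case gives exactly half a cycle:

```python
# Phase shift between input and output for a given test frequency and latency.

def phase_shift_degrees(freq_hz, latency_ms):
    period_ms = 1000.0 / freq_hz
    return (latency_ms / period_ms) * 360.0 % 360.0

print(phase_shift_degrees(100, 5))   # 180.0 -> waveforms look inverted
print(phase_shift_degrees(200, 5))   # 0.0   -> a full cycle looks like no shift at all
```

The second line is the trap: at 200 Hz the same 5 ms latency shifts the wave by exactly one period, so visually the two recordings appear perfectly aligned.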
Testing Midi Latency Thanks for your help.
azslow3 replied to John Vere's topic in Cakewalk by BandLab
A drum/keyboard module's processing time (from physical impact until MIDI is generated) can be a millisecond or so. And there is no guarantee the MIDI output (USB or DIN) is sent at the same time as, before, or after the module's own sound generator gets the event. So comparing a mic on the pad with audio from the module output just measures the module's contribution to sound latency, which is not tied to MIDI latency. The audio buffer size contributes to the "jitter" of the MIDI-to-VST-output latency. If the same interface is used for MIDI and audio, I have a feeling the jitter is smaller, since the interface knows the time relation between both inputs. "Real hardware MIDI 1" transfer speed (throughput) is ~1 ms per note. For drums that is less significant than for a keyboard (we have more than one finger per hand). USB quantization contributes the most to transfer latency (there is almost no difference between transferring 1 or 100 notes; the delay until the "packet" is sent dominates). In that respect USB 1 is way worse than USB 2 (though the throughput of USB 1 is sufficient). So in practice a hardware MIDI connection plus an audio interface with MIDI may have lower latency than the device's own USB connection. For interfaces there is a MIDI loop-back test utility (like RTL for audio); I remember with RME I had something around 2-3 ms, while a cheap "MIDI to USB adapter" had more than 10 ms. My Kawai connected through the RME via MIDI had lower latency than via its own USB (I don't remember the results of the test with my Roland drums). For me, MIDI latency only starts to be annoying when it goes "crazy". That has happened several times: for some reason some (not all!) MIDI devices start to get latency over 30-40 ms, independent of the DAW/audio interface/audio buffer. It disappears with a Windows restart... I still have no idea where that comes from. Note that most MIDI devices normally "imitate" instruments with "natural" acoustic latency (unlike, e.g., singing, guitar, flute, etc.; I mean something with a rather short or even fixed distance from the "sound generator" to our "sound sensors").
Just using headphones compensates for 3-5 ms of latency. -
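The headphone remark that closes the post above follows from the speed of sound: every meter of air between the sound source and your ears adds roughly 3 ms, so monitoring on headphones instead of an amp a meter or two away already absorbs a few milliseconds of system latency. A quick check:

```python
# Acoustic delay per distance, at ~343 m/s (speed of sound in air at ~20 C).

SPEED_OF_SOUND_M_S = 343.0

def acoustic_delay_ms(distance_m):
    return distance_m / SPEED_OF_SOUND_M_S * 1000.0

for d in (0.05, 1.0, 1.7):          # headphones, a close amp, a bit further away
    print(f"{d} m -> {acoustic_delay_ms(d):.1f} ms")
```

1.0-1.7 m works out to roughly 3-5 ms, matching the range quoted in the post.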
You can... That was never possible if the output was to a side-chain, and I remember there were problems with AUXes (since the query for the current output returns an error, at least for side-chains), but changing a bus to another bus works fine. Well, if there are several sends, changing the output can auto-reorder the sends, which for me is a bug, but that is not surface specific. A surface can't see the topology graph; it just gets a "topology changed" flag as a refresh parameter. And the flag is set neither when the track output is changed, nor when a send output is changed by the surface. At least in X2.
-
Thanks! In my current dev version of AZ Controller I try to correctly cache names when possible (the previous behavior was not optimal, and buggy...). That is how the bug was spotted. Since I check names slowly, once a change is noticed I re-check output/send names (if they are used for feedback). Maybe it is time for me to check whether CbB: (a) triggers the topology-change flag for refresh when an output or a send output is changed (the latter by a surface); (b) makes outputs/sends to side-chains and AUXes controllable. I am still developing under X2; maybe there have been some improvements ?
-
When a destination (bus) is renamed, the old name is reported by GetMixParamValueText(... MIX_PARAM_OUTPUT ...). I can't tell since when, but I remember a discussion about some "caching for names" not long ago. In Sonar X2 it works correctly. CbB version: 2022.11 (build 021, 64-bit)
-
"Shift" is a logical definition in "ACT MIDI"; you can "Shift Learn..." it on the Options tab of "ACT MIDI". Which button of the MPK261 you define as "Shift" is up to you. It just has to be able to send MIDI to the DAW and be momentary, so that it sends one message when you press and another when you release (see the AKAI documentation for what the buttons do). "Shift B1" and "B1" use the same incoming MIDI for B1; only the reaction is different when B1 is pressed while "Shift" is already held. So you can't "MIDI Learn" "Shift B1" separately. "ACT MIDI" is fixed to support up to 8+8+8+1 controls (separate MIDI messages). The MPK261 has more physical controls (8+8+8 strip controls, 5 transport buttons, the "DAW Control" section). And there are "Banks", which switch the MIDI messages sent by the same controls. It is not possible to assign all available controls with one instance of "ACT MIDI". You may switch to the "Cakewalk Generic Surface", which supports more controls. Note that both "stock" Cakewalk surface plug-ins have limitations on what you can assign and how; if you need more freedom you can use "AZ Controller".
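The "same incoming MIDI, different reaction" behavior described above is just a small state machine inside the surface plug-in. A minimal sketch (the note numbers and action strings are illustrative, not ACT MIDI's internals):

```python
# Sketch: a "Shift"-modified button. Both "B1" and "Shift B1" arrive as the
# same MIDI message; only the plug-in's remembered shift state differs.

class Surface:
    def __init__(self):
        self.shift = False          # updated by the momentary Shift button

    def on_midi(self, note, pressed):
        if note == 0x30:            # the button the user learned as "Shift"
            self.shift = pressed    # press -> True, release -> False
            return None
        if note == 0x31 and pressed:  # the button learned as "B1"
            return "Shift B1 action" if self.shift else "B1 action"
        return None

s = Surface()
print(s.on_midi(0x31, True))   # B1 action
s.on_midi(0x30, True)          # hold Shift...
print(s.on_midi(0x31, True))   # Shift B1 action
```

This is also why "Shift" must be momentary: a latching button would leave `self.shift` stuck and every B1 press would fire the shifted action.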
-
Switch between midi channels while i playing
azslow3 replied to צביקה שמואלי's topic in Instruments & Effects
I think it is unclear what you want... Do you want to play only one instrument using the whole keyboard at any particular time? Then the answer already given should help. If you want to switch instruments from the keyboard, write what you want to use for that (e.g. pedals, knobs/buttons on the keyboard, etc.). If you want to play all instruments at the same time, i.e. split your keyboard into several regions, then you can use a Drum Map, or several tracks with input echo active and the same keyboard as input (or just Omni), each with a forced channel and an MFX Event Filter (MIDI FX plug-in) tuned to filter out the unwanted regions. -
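The split-into-regions idea from the post above is conceptually simple: each note range goes to its own instrument channel. A sketch of the logic (the ranges and channel numbers are illustrative; inside Cakewalk this is what the Drum Map or per-track event filters effectively do):

```python
# Sketch: route each incoming note to an instrument channel by key range.

SPLITS = [
    (0, 59, 1),     # everything below C4 -> channel 1 (e.g. a bass VSTi)
    (60, 127, 2),   # C4 and above       -> channel 2 (e.g. a piano VSTi)
]

def route_note(note):
    for low, high, channel in SPLITS:
        if low <= note <= high:
            return channel
    return None

print(route_note(48))   # 1
print(route_note(72))   # 2
```

With the multi-track approach, each echo-enabled track keeps only its own range (the event filter) and forces its own channel, which is exactly this table spread over tracks.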
If that is possible (only M-Audio knows for sure...), AZ Controller can use it (well, that has to be defined in the preset...). Also note that with AZ Controller any button or pad (which sends messages) can be used as a "Shift" ("Ctrl", "Alt", "CapsLock", etc.) to change what any other control(s) do. But the button has to send something. And any knob (in finite CC mode) can be used as an "N-position" switch. I mean, if "Shift+<<" does not send a separate MIDI message and you want an extra command, you can define e.g. "Back+<<" for that (while still keeping "Back" doing "Undo" when "Back" is pressed alone).
-
Map Pitch "Wheel" to Midi Expression? Arturia Minilab MkII
azslow3 replied to congalocke's topic in Instruments & Effects
If the strip can send only pitch bend and that is not editable, in Cakewalk you can use an MFX which converts pitch bend to a CC: https://tencrazy.com/gadgets/mfx/ , PW-Map. What you see in the PRV is still pitch bend, until you "render" that effect. -
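The conversion such a mapper has to do is mostly bit arithmetic: pitch bend is a 14-bit value (two 7-bit data bytes), while a CC carries only 7 bits. A sketch of the core transform (this is the general math, not PW-Map's actual code; CC 11 matches the expression use case in the question):

```python
# Sketch: collapse a 14-bit pitch-bend value into a 7-bit CC message.

def bend_to_cc(lsb, msb, cc_number=11):
    value14 = (msb << 7) | lsb     # 0..16383, center at 8192
    value7 = value14 >> 7          # keep the top 7 bits -> 0..127
    return (cc_number, value7)

print(bend_to_cc(0, 0))        # (11, 0)    wheel fully down
print(bend_to_cc(0, 64))       # (11, 64)   wheel at center
print(bend_to_cc(127, 127))    # (11, 127)  wheel fully up
```

Note the mapping is lossy (16384 positions fold into 128), which is fine for expression but is why the PRV still shows the original pitch-bend data until the MFX is rendered.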
CBB, MPE, MIDI 2.0 and Expressive Controllers Discussion
azslow3 replied to RexRed's topic in Instruments & Effects
I was targeting the "can any sound be driven by MPE?" question. But you are right, my post is probably "too academic" for musicians... So I'd better list just the practical points: MPE has to be activated on the keyboard and inside the VSTi to work correctly; knowing how that is done (and what can deactivate it) may avoid confusion. Original MIDI Polyphonic Aftertouch is not a part of MPE, and some other messages are used differently. In other words, an MPE keyboard can be used with an MPE-unaware VSTi, but then it should not be in MPE mode; the "sound" can be wrong otherwise. The "sound" produced by an MPE-compatible VSTi in MPE mode from a preset not designed for MPE may not be the same as in conventional mode; that is VSTi (and preset) dependent. Also, editing an MPE recording can be more difficult than a conventional recording (DAW dependent). In other words, it is probably better to use (record) MPE mode only when MPE is really used. -
Unfortunately I have not found HammerPro88 protocol documentation. The User Guide mentions two things: an "output port for LEDs in DAW mode" and that "most DAWs automatically configure the device". In Bitwig, do any LEDs follow the DAW? If some do (e.g. transport) but others don't (the fader buttons), M-Audio has not foreseen feedback for those particular buttons (at least not in an "easy" way). If no LEDs show feedback... it may be worth checking that the output port is set correctly. But the fact that "they are always ON" points to the first case (with an incorrect port they should be "always OFF"). "Most DAWs automatically..." is probably not true; in the installation instructions for particular DAWs they suggest selecting the DAW on the keyboard and using Mackie mode in the DAW. Theoretically they could support feedback for buttons and pads (so a DAW could control LEDs, pad colors, etc.) even where Mackie mode does not support that, I mean in a "native" mode. But without protocol documentation from M-Audio it is very hard (up to impossible) to deduce.
-
CBB, MPE, MIDI 2.0 and Expressive Controllers Discussion
azslow3 replied to RexRed's topic in Instruments & Effects
If I have understood the specification correctly... there are no "required" ingredients at all to use (or imitate) MPE ? An "MPE compatible host" just means the DAW can save (record, edit) Note and CC MIDI events with different channels in one track and deliver them to the VSTi "as is" (without modifying the MIDI channel in the events). I think (almost?) all DAWs can do that. For editing, a DAW should support conveniently editing events with one particular MIDI channel (without that, editing an MPE recording would be a nightmare). A controller is not required; the corresponding MIDI can be created in a MIDI editor. But converting a recording from a non-MPE keyboard into "MPE compatible" form can be tedious (unless the DAW supports corresponding scripting... theoretically possible with CAL...). If a VSTi is not MPE aware, it would take 15 instances of the same VSTi with the same preset to implement MPE with it, and the MIDI would have to be specially processed before feeding each instance. Note that such a structure is rather hard to build in Cakewalk. PS. Depending on the VSTi and DAW, special care may be required to switch MPE on (RPN 6 messages) and to prevent it being switched off (on stop). Finally, I think everyone who wants to use MPE should read the MPE specification instead of MPE advertisements... MPE is a rather simple "trick" to allow changes to several parameters per note. Original MIDI 1.0 foresaw just one such parameter (Polyphonic Aftertouch); MPE defines three. All that just to support keyboards with extra sensors per key. -
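The "15 instances" figure above comes from MPE's core mechanism: in the lower zone, channel 1 is the zone master and channels 2-16 are member channels, with each sounding note given its own member channel so per-note bend/pressure/CC74 can follow it. A sketch of a simple round-robin channel allocator (illustrative; real implementations also handle channel stealing when more than 15 notes sound):

```python
# Sketch: MPE lower-zone member-channel allocation, one channel per note.

MEMBER_CHANNELS = list(range(2, 17))    # channel 1 is the zone master

class MpeAllocator:
    def __init__(self):
        self.free = list(MEMBER_CHANNELS)
        self.active = {}                # note number -> member channel

    def note_on(self, note):
        # Take the next free channel; fall back to the first member channel
        # if all 15 are busy (a real allocator would "steal" more carefully).
        ch = self.free.pop(0) if self.free else MEMBER_CHANNELS[0]
        self.active[note] = ch
        return ch

    def note_off(self, note):
        ch = self.active.pop(note)
        self.free.append(ch)            # channel returns to the pool
        return ch

a = MpeAllocator()
print(a.note_on(60))   # 2 -- first note gets channel 2
print(a.note_on(64))   # 3 -- second simultaneous note gets its own channel
a.note_off(60)
print(a.note_on(67))   # 4 -- released channel 2 went to the back of the pool
```

Emulating this with an MPE-unaware VSTi means routing each member channel to its own instance, hence one instance per member channel: 15.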
For the third option to work, the slider should send CC11. It will be assigned to AZ Controller logic, so initially CC11 will not come through but will instead modify the WAI volume (or whatever the preset logic says). If the control is put into group "A" and that group is toggled off, it will no longer block CC11, so CC11 should become usable in plug-ins; in that case it will not be processed by AZ Controller and so will not move the WAI volume. But switching presets on the keyboard, so that the control sends something not assigned in AZ Controller, is probably the simpler way to achieve the same goal. "Groups" in AZ Controller are primarily for controllers which can't switch hardware presets easily, or when "advanced" logic is used (e.g. automatically moving some controls to/from pure MIDI mode when a particular VSTi is in focus... some AZ Controller users experiment with funny configurations, maybe just because that is possible ? ).
-
It depends on the version of Session. The "SONAR" edition was bound to Sonar. General Strum Session 2 has a serial number (like other AAS Session plug-ins; Platinum users could get those, in which case they are listed in the AAS account).
-
I'd have to check what exactly is going on in this concrete AZ Controller preset, and so how easy or hard it will be to add "modes" for controls. But AZ Controller supports all kinds of plug-in control in Cakewalk, including: Dynamic Plug-in Mapping (sometimes called just "ACT", since it is used in the "ACT MIDI" plug-in); Direct Plug-in Control (used in the Mackie Control plug-in); and dynamically including/excluding controls from surface plug-in operations (not available in other surface plug-ins). The last option allows "de-assigning", say, sliders 7-8 from AZ Controller logic at run time, which effectively allows using them as MIDI input for VSTi plug-ins. For that, put the controls in question into some Group (on the Hardware tab) and assign some button to toggle the group.