Everything posted by azslow3

  1. "Cakewalk ACT" / "Generic surface" / (AZ Controller...) can do that. But in any case you need to "prepare" your Akai using its configuration utility. I recommend "CC" mode for the pads instead of notes. If you use it, make sure the pads generate "unused" CC numbers which don't clash with the knobs, e.g. assign "undefined" numbers from the table https://www.midi.org/specifications-old/item/table-3-control-change-messages-data-bytes-2 for the pads to use. If all actions you want are in "Generic surface", it is the simplest to understand and use. Otherwise use "Cakewalk ACT". Both are well described in the Cakewalk documentation (and many videos). If you want something unusual (e.g. LEDs under the pads showing the current transport status, tricky sequences of operations not available as a single "command", etc.), you will need AZ Controller (and a long time to configure it...).
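     A minimal sketch of that "no clash" check in Python, assuming example (made-up) knob and pad CC assignments; the "undefined" numbers are taken from the MIDI 1.0 Control Change table linked above:

     # CC numbers listed as "undefined" in the MIDI 1.0 Control Change table
     UNDEFINED_CC = set([3, 9, 14, 15] + list(range(20, 32)) +
                        [85, 86, 87, 89, 90] + list(range(102, 120)))

     knob_ccs = [70, 71, 72, 73, 74, 75, 76, 77]   # example knob assignments
     pad_ccs  = [20, 21, 22, 23, 24, 25, 26, 27]   # example pad assignments

     for cc in pad_ccs:
         if cc in knob_ccs:
             print(f"CC {cc}: clashes with a knob, pick another number")
         elif cc not in UNDEFINED_CC:
             print(f"CC {cc}: has a defined meaning, better pick an undefined one")
         else:
             print(f"CC {cc}: OK")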
  2. The same with DAWs checking the current time... The intention is also questionable and the implementation sometimes produces unexpected issues 😉
  3. Cakewalk has existed for too long, v27 😀 Roland did not foresee that Windows 8 drivers could work in Windows 10, so many devices were instantly "obsolete". AZ Controller also checks the host version... fortunately it is still "supported", and so the bug could be fixed.
  4. There is no longer a problem with AZ Controller. It was my fault 🙄
  5. I have an idea concerning the AZ Controller misbehavior with 2021.1... Can it be that the surface loop/MIDI can be called multi-threaded now? I mean, e.g. calling MIDIIn while in the idle loop? Not that it is not allowed, but I have never observed that in any previous version. There are some reasons AZ Controller is not re-entrant. But the "protection" code can be buggy, since it was never tested...
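     For illustration only, a minimal sketch (not the actual AZ Controller code, all names are hypothetical) of the kind of non-blocking "protection" a non-re-entrant surface core needs once MIDI input and the idle loop may arrive on different threads:

     import threading

     class SurfaceCore:
         """Hypothetical non-re-entrant core shared by MIDI input and the idle loop."""
         def __init__(self):
             self._busy = threading.Lock()

         def _process(self, source, data):
             print("processing", source, data)    # the non-re-entrant work

         def on_midi_in(self, msg):
             # If another thread is already inside the core, skip (or queue) instead of re-entering.
             if self._busy.acquire(blocking=False):
                 try:
                     self._process("midi", msg)
                 finally:
                     self._busy.release()
             else:
                 print("re-entry detected, skipping", msg)

         def on_idle(self):
             if self._busy.acquire(blocking=False):
                 try:
                     self._process("idle", None)
                 finally:
                     self._busy.release()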
  6. Unlike other plug-ins, AZ Controller allows fancy things... And Heinz has decided to do jogging in a bit unusual way: as you can see in his videos, he is jogging with a fader. In addition he is calling "Go to prev/next measure" for that. I also support "normal" jogging, including by fader/knob. But somehow he had better results with commands before. Why he could get 127 changes from the controller even on a fast movement, and why that "works" only 2 times in 2021.1 and can be "reset" by preset reloading, stumps me as well 🤨 I have no ideas so far.
  7. We will try to find out the reason for the observation with AZ Controller "offline", but it will be helpful to know if something was changed in:
     • input MIDI and the way it is delivered to Control Surfaces, i.e. anything which can affect the number of received messages (when there are many of them, e.g. a knob is turned fast);
     • command processing, i.e. the processing of many commands (like "go to the next measure") coming in fast.
  8. That is like looking for a low cost auto with all possible features, which works flawlessly on every road type at any speed 😎 Transport and a fixed set of parameters will work with any keyboard (as long as it has the corresponding hardware controls). "Fixed set" means you need "setup time" to bring each fader/knob in sync with Cakewalk, so once you change what you control (e.g. switch the synth or the preset in the synth, move to the next block of tracks, etc.) you need "setup time" again. Note that synth control normally works the same way as the keys, so the DAW is not involved in that process (it just passes MIDI to the plug-in). For anything else (mix, FXes, complicated synth control) low cost keyboards are not good. Which more expensive keyboard to choose depends on the required operations. E.g. for adjusting synth parameters while switching synths, NI is probably the best option, while for an 8 channel mixer combined with a keyboard, a Motör can be better. My usual quick and dirty advice: with TouchDAW you can get an idea what an (expensive) DAW controller can do in Cakewalk, for $5. As low cost hardware, try the X-Touch Mini for $50. Put on top of your PX201, it can somehow imitate the (expensive) NI keyboard experience with synths while providing reasonable DAW controlling functionality (NI can't do that currently in Cakewalk).
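     To illustrate what that "setup time" means, a minimal sketch of the usual soft-takeover/"pickup" logic (a generic example, not any particular keyboard's implementation): after the fader is assigned to a new parameter, its moves are ignored until the hardware position reaches the parameter's current value.

     class PickupFader:
         """A physical fader only takes control after it reaches the parameter's value."""
         def __init__(self, param_value):
             self.param_value = param_value   # current value in the DAW (0..127)
             self.picked_up = False

         def switch_target(self, new_param_value):
             # What the fader controls has changed -> it has to be "picked up" again.
             self.param_value = new_param_value
             self.picked_up = False

         def on_fader_move(self, fader_value):
             if not self.picked_up:
                 # "Setup time": wait until the hardware position matches the parameter.
                 if abs(fader_value - self.param_value) <= 1:
                     self.picked_up = True
                 else:
                     return self.param_value      # ignore the move, parameter unchanged
             self.param_value = fader_value
             return self.param_value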
  9. You already hit the first reason Control Surfaces are difficult... Names! 🙂
     ----
     ProTools supports HUI (the protocol) only. So, if you select the "ProTools" target, the DM will speak HUI. HUI (the device) was produced by Mackie (the company) with the special HUI protocol; later they produced the Logic and then the MCU (Pro) devices, commonly known as "Mackie" (the device). And an MCU can be switched to "ProTools" mode (the HUI protocol). So, switch the DM to "ProTools" mode and use its MIDI devices to configure the "Mackie Control" surface plug-in in Cakewalk. In that plug-in select the "HUI (Beta)" protocol. Start with the MIDI In/Out which you think controls channels 1-8 and should also send transport and other buttons(*). When that works, add "Mackie Control XT" with the In/Out for the next group of 8 channels. And so on.
     (*) that is why you see some "CC" from the transport on USB-1. Note they are a bit more complicated than just "CC"... msmcleod is the author of "HUI (Beta)", so you will be supported in case something goes wrong on the software side (BTW "Mackie Control" is Open Source, in case you have some experience with C++ and want to have a look at how all that works...)
     ----
     If you want to observe which messages are sent and how they are interpreted, you can initially use AZ Controller with the "HUIv2" preset I mentioned loaded on USB-1 (instead of "Mackie Control"; do not try to use both in parallel). Unlike other plug-ins, it displays what it receives and how it interprets it (for not yet assigned complex messages you will need to play with the interpretation "Options", see my previous post for an explanation). But there is no "XT", so you will be limited to 8 channels + transport until you extend the preset (far from easy without experience) or switch back to "Mackie Control" + XT(s).
     ----
     MMC is a (fixed) set of SysEx messages, originally intended to sync hardware MIDI devices. Cakewalk CAN send them for that purpose, but you do NOT want to use MMC as "MMC" in any case, just as SysEx messages assignable to some actions in the Generic Surface (if you are not in ProTools mode...). For the Generic Surface, read (3-5 times, not a joke...) the Cakewalk documentation. I repeat, at least 3 times. I stopped after reading it 2 times, could not understand it, and started to write AZ Controller 🙄
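     For reference, MMC commands are short universal SysEx messages of the form F0 7F <device-id> 06 <command> F7, so the bytes you would assign in the Generic Surface look like this (a small sketch that just prints a few of them):

     # MMC (MIDI Machine Control) SysEx: F0 7F <device-id> 06 <command> F7
     MMC_COMMANDS = {"stop": 0x01, "play": 0x02, "fast_forward": 0x04,
                     "rewind": 0x05, "record_strobe": 0x06}

     def mmc_message(command, device_id=0x7F):      # 0x7F = "all devices"
         return [0xF0, 0x7F, device_id, 0x06, MMC_COMMANDS[command], 0xF7]

     for name in MMC_COMMANDS:
         print(name, " ".join(f"{b:02X}" for b in mmc_message(name)))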
  10. AZ Controller supports all MIDI messages. Not a joke, in fact that is more difficult than someone can guess...
      • HUI has "triggered" support
      • different CC modes in one preset
      • 14bit CC sequences (including (N)RPNs) are supported with the standard as well as an arbitrary order, including mapping a part of the value to separate controls (A&H style)
      • complex SysEx mappings (Roland and Yamaha digital mixers in "native" mode)
      • Mackie handshake and Roland checksum, Mackie style ring feedback (other feedback I have seen so far could be organized in presets without special code in the plug-in).
      But I can't claim the same for OSC, e.g. Behringer DMs' OSC is not supported, and I guess NI MK2 OSC also needs some additions in the code. Well, NI MK2 MIDI (native mode) will be hard to support too, but not in the MIDI parsing part... My HUI example was tested with a Nocturn only; one user has tested it with a d8b http://www.azslow.com/index.php/topic,223.msg1386.html#msg1386 BTW I have a complex preset for the Yamaha 01V in MIDI mode; it can happen that something in it works for other Yamaha mixers (if the MIDI implementations overlap). But that is not recommended, since controlling the DAW using the DM's native MIDI means the DM can no longer be used as a mixer (the Cakewalk project will influence sound processing in the mixer).
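      As an example of one of those "14bit CC sequences", a standard-order NRPN transmission is four CCs in a row (CC 99/98 select the parameter, CC 6/38 carry the 14-bit value); a minimal sketch that builds such a sequence and decodes the value back:

      def nrpn_sequence(channel, param, value):
          """Standard order: CC99 (param MSB), CC98 (param LSB), CC6 (data MSB), CC38 (data LSB)."""
          status = 0xB0 | (channel & 0x0F)
          return [(status, 99, (param >> 7) & 0x7F),
                  (status, 98, param & 0x7F),
                  (status, 6,  (value >> 7) & 0x7F),
                  (status, 38, value & 0x7F)]

      msgs = nrpn_sequence(channel=0, param=0x0102, value=9000)
      for status, cc, data in msgs:
          print(f"{status:02X} {cc:02X} {data:02X}")

      # Re-assemble the 14-bit value from the two data bytes
      print("decoded value:", (msgs[2][2] << 7) | msgs[3][2])    # 9000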
  11. It depends... Configure the DM's DAW control mode (preferably "Mackie", in case it has it, but HUI should also work somehow) and then use the Mackie surface module in Cakewalk. For anything else (in case you are not happy with the simple approach), you will need a deep understanding of how all that stuff works. E.g. you probably don't want MMC to function as originally intended (to control hardware devices) even in case you have the corresponding hardware, but you can use MMC sending buttons as generic Control Surface buttons, to control the Cakewalk transport (or something else in Cakewalk...). You can even use MIDI DM signals, originally intended to sync DMs and/or save parameters, to control something in the DAW. But that is relatively difficult.
  12. I also can't remember faders ever following CC7 events. Logically, CC7 envelopes were always separate from CC7 events in the clip; the fader works with envelopes (and there is no mode switch for it), so how could it follow two different values at the same time?
  13. I guess in case that is true, my old 8x8 USB2 interface couldn't work at its lowest settings when connected to a 10m USB hub in parallel with several USB1 devices. But it works. The USB specification deals with different standard/speed devices much better than making everything slow. 🙂 Also, under 1ms RTL is never "comfortable": the computer should be top optimized and the plug-ins carefully selected. Yes, there are no USB interfaces with such a feature. But 3.3ms is really usable, with USB2 and a moderate buffer size. In practice, the difference can rarely be perceived (taking into account that moving your head 30cm in any direction changes the latency by ~1ms...).
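      The numbers are easy to check; a small sketch (assuming 48 kHz and a speed of sound of ~343 m/s) that converts buffer sizes to one-way latency and the 30cm head movement to milliseconds:

      SAMPLE_RATE = 48000          # Hz, assumed
      SPEED_OF_SOUND = 343.0       # m/s, approximate room-temperature value

      def buffer_ms(samples, rate=SAMPLE_RATE):
          return 1000.0 * samples / rate

      for n in (32, 64, 128, 256):
          print(f"{n:4d} samples = {buffer_ms(n):.2f} ms one way "
                f"(RTL is at least twice that, plus converter/driver overhead)")

      print(f"moving the head 0.3 m adds {1000.0 * 0.3 / SPEED_OF_SOUND:.2f} ms")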
  14. And I have uploaded the modded version there 😉
  15. Also note that in case you use the same plug-in for several devices (e.g. "ACT MIDI" for both), the preset names will be bugged (both will show the same preset name, while in fact they will load the correct preset for each instance). BTW @msmcleod are there any plans to fix that? 😉
  16. We had better discuss that on my forum... Upload your current preset there (the v6 preset in the link doesn't produce the WAI picture shown) and I will try to adapt it.
  17. Unfortunately the Digitech documentation says nothing about the MIDI these devices send (they just describe how they work with the in-house software). Please check the following: remove ACT MIDI and start recording the "RPx400 MIDI" device into a MIDI track. Press each button for 2-3 seconds. Then stop recording. Open "Views/Event List" in Cakewalk. It should list what the buttons send to Cakewalk. Write us what you see in the "Kind" and "Data" columns. Alternatively you can use any MIDI monitoring tool you know.
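      If a standalone MIDI monitor is easier, here is a minimal sketch using the Python mido library (install with "pip install mido python-rtmidi"); the port name to pick is whatever your system shows for the RPx400:

      import mido

      print("Available inputs:", mido.get_input_names())

      port_name = mido.get_input_names()[0]      # pick the RPx400 port from the printed list
      with mido.open_input(port_name) as port:
          for msg in port:                        # blocks; stop with Ctrl+C
              # msg.type is the "Kind" (note_on, control_change, ...),
              # msg.bytes() are the raw "Data" bytes
              print(msg.type, msg.bytes())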
  18. If some device can send MIDI messages, you can configure it to stop/record/play (or to do many other things).
      1. You are mixing many terms, so please write exactly which device you try to use as a MIDI controller: the RPX400 or the "control foot pedal device"? If the latter, which one (name/model)?
      2. Which Cakewalk Surface plug-in do you try to configure: "ACT MIDI Controller" or "Cakewalk generic surface"?
      3. The "ACT learn" cell/button you can see in plug-ins (including the mentioned Surface plug-ins and VSTs) has nothing to do with what you try to achieve. You need to do "MIDI Learn" (only) inside the Surface plug-in (sometimes called the "ACT plug-in"). Yes, the word "ACT" is used for many different things.
      4. Some devices/controls don't send simple MIDI messages; in that case you can't "learn" them, you need to enter the message manually. Others can send more than one message from the same control. So my question (1) is important: we can have a look in the concrete documentation to find the right way for you.
  19. You are right, it is not there by default... Now I have to check what else I have manually changed, so I can do that again when needed... 🤔
  20. USB 1.1 has sufficient bandwidth for 2x2. What makes the difference for audio interfaces between USB 1/2/3 / FW / Thunderbolt is the communication organization. E.g. USB is a bus with a predefined minimum for communication "cycles", and that minimum is relatively high for USB1 and USB2. That is the reason you can't find USB1-2 interfaces with a latency (RTL) lower than some value (for USB1 it was quite big, with a significant improvement in USB2). USB3/FW/TB/PCI(e) open the possibility to make it lower, which some interfaces use (down to 1ms). So, USB3/TB can improve latency, when used properly in the hardware and in the drivers. But since USB2 can go down to ~3ms, and lower latency requires very special system settings to work stably, the market is limited and so is the number of such devices. Disabling core parking (in general disabling C state changes) is the way to bring occasional system latency down from ~250µs to ~50µs. Unfortunately that spreads out significant heat (e.g. my i9 will constantly dissipate >=90W). Unfortunately that is the only way to work with sub 64 sample buffers and/or to bring the possible CPU load closer to the theoretical max without introducing audio glitches. But the price (in terms of noise or a super-silent cooling system) is too high for an average user... Plug-in multi-core processing in Cakewalk (as I understand it) is based on parallelizing processing after splitting the audio buffer (that is why there is a lowest buffer size with which it can be enabled); that effect can't be achieved with external tools.
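      A quick back-of-the-envelope check of the bandwidth claim (assuming 24-bit samples at 48 kHz and ignoring protocol overhead):

      channels, bits, rate = 2, 24, 48000        # 2 in + 2 out, 24-bit, 48 kHz (assumed)
      per_direction = channels * bits * rate     # bits per second in one direction
      usb11_full_speed = 12_000_000              # USB 1.1 full speed, shared by both directions

      print(f"2 channels one way: {per_direction / 1e6:.2f} Mbit/s")
      print(f"USB 1.1 full speed: {usb11_full_speed / 1e6:.0f} Mbit/s")
      print(f"fraction used per direction: {per_direction / usb11_full_speed:.1%}")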
  21. The tip with the power plan is good. To be on the "safe side", there is also the "Ultimate" power plan. Note that by default Windows is NOT showing all available (and so many relevant) options in the power plan editor, so it is e.g. not possible to manually edit one power plan into another (there is a registry tweak on github to show all options). Simply switching to a properly constructed power plan covers all recommended (e.g. in the mentioned MOTU pdf) settings. Also I have found the on-the-fly switcher https://www.microsoft.com/en-us/p/powerplanswitcher/9nblggh556l3 useful, since I use one computer for everything (no reason to keep it "ultimate" all the time).
      Disabling WiFi, NVIDIA audio and in fact all other devices which are not in use is also a good idea in general.
      But switching priority to background processes is not a good idea in general... Properly written drivers + a properly written DAW RT part should take care of priorities. Sometimes priority to background helps, sometimes running the DAW as a background process also changes the result. All of that is a dirty workaround. Obviously, we want the DAW to get resources as quickly as possible, except for audio driver activity. E.g. we don't want Windows' own scheduled tasks to have priority over the DAW. It seems like the problem is with some audio drivers, which for some reason run something as a background process. So once the DAW (a heavy plug-in) uses resources, the driver can't get the required time slot in time. For me, that is the only reasonable explanation why shifting the general priority can have a positive influence. Note there are some "tools" which allow manually setting the priority of particular processes/threads, and some people report that works.
      I want to add one point which I have noticed by chance recently, and it seems like it is not mentioned often: sharing the driver between applications can drastically affect the stable buffer size. I have checked with my M-Audio and Phonic. Both allow ASIO in parallel with other modes. Once the same device is opened by another application (e.g. a web browser is running), even in case that other application is not producing any sound, small buffers start to glitch in the DAW.
  22. Cakewalk always works in "real time"; REAPER by default uses an anticipative engine. The latter has obvious advantages, but there are some disadvantages as well (e.g. try to live play with several tiny "look ahead" plug-ins...). To really compare, record arm all tracks in REAPER (or switch off anticipative processing). I must admit that in most cases I still get better performance in REAPER, and I was really surprised when I hit the opposite the first time, but under some conditions Cakewalk can deal with the same project in real-time better. Some people have switched to REAPER for this (and other) reasons, but others stay with Cakewalk or use both (or even more DAWs), also for good reasons... In any case, I always recommend having REAPER+ReaCWP installed next to Cakewalk. In case of questions and/or troubles (what is the plug-in CPU load on each track? which plug-ins have "look ahead"? which plug-in is crashing the DAW? etc.), just open the project in REAPER and check the performance meter / use plug-in isolation / other "debugging" tools. Sure, most real projects will not sound the same (no Cakewalk specific plug-ins except the ProChannel EQ will work after conversion, along with other differences), but for debugging that should be more than sufficient.
  23. Have you checked your computer for "audio processing compatibility"? I mean the Ultimate power plan, latency monitor, CPU throttling, etc. I mean something has to make your system (unexpectedly) busy for more than 20ms to force a 2048 buffer size. Another quick check: open the project in REAPER (with ReaCWP, that should load some if not all plug-ins with the project's settings). Check the performance monitor to detect what is going on (it will display CPU load per track, RT load, etc.). Even on an old notebook with Realtek and ASIO4All I was never forced to set more than 192 for recording, if the project could work at all (if the CPU is insufficient, the buffer size doesn't help). I think 256 is a "safe maximum" for mixing on modern systems; it tolerates not optimized systems and other glitches. Your system should be able to record at 128 with many FX/VSTi. I mean with any interface (if everything is optimized and the interface is reasonable, 64 or even lower should work without glitches). PS. Lookahead in plug-ins increases RTL but has no direct influence on the buffer size nor on CPU use. Lookahead is just an algorithm forced approach; by itself it doesn't indicate the plug-in is CPU heavy.
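      Where the ~20ms figure comes from (a sketch assuming 44.1 kHz): a buffer has to be refilled before the previous one finishes playing, so any stall longer than the buffer duration produces a dropout.

      RATE = 44100    # Hz, assumed

      for samples in (128, 256, 1024, 2048):
          ms = 1000.0 * samples / RATE
          print(f"{samples:4d} samples = {ms:5.1f} ms available to produce the next buffer")

      # 1024 samples is ~23 ms: if the system never stalls longer than ~20 ms, 1024 should be enough,
      # so being forced up to 2048 points to stalls above that.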
  24. I don't think that buffer processing overhead plays a significant role at such a buffer size, so the need to go over 1024 comes from some severe jitter in processing. It can be a seriously underpowered or a not optimal for audio processing system. Do simple projects (e.g. audio + a not sample based synth + FXes) run fine with a low buffer (64, or in the worst case of a 10 year old Celeron, 128)? If yes, does the same project still run fine with 1-2 Kontakt instances? If both are fine, I guess the system is underpowered for the current project. If a "CPU only" project runs fine but a sample based one has troubles, a closer look at the disk system (disks, controller, settings, fragmentation) should help to understand where it comes from. If a CPU only project doesn't run with 128, something in the system introduces (unexpected) latency, so the system settings are not optimal. I guess MOTU thinks that modern computers don't need huge buffers; also in some DAWs the buffer size has little impact on the possible mixing project size (mixing doesn't work in real-time).
  25. Maybe I have misunderstood the intention with EQ/Comp, it will be way less than that. By itself 1000 parameters under the default timing is not a problem; there are some presets which use that amount. At the beginning I had worries, so my monitors have a "speed" parameter to not ask every cycle. In practice, I have not hit significant CPU use nor audio glitches by monitoring every cycle. All my presets have all monitors in that mode. But when the requested CPU time is absolute, let's say 1ms per loop, that is just 1/75 of the non-RT processing under the default timing. With a 10ms cycle that is 1/10 and can start to influence something.
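      The fraction is simple duty-cycle arithmetic; a sketch assuming the default surface update cycle is 75ms (inferred from the 1/75 figure above) and 1ms of work per update:

      work_ms = 1.0                      # assumed CPU time spent per surface update
      for cycle_ms in (75.0, 10.0):      # assumed default cycle vs a fast 10 ms cycle
          print(f"{cycle_ms:5.1f} ms cycle -> {work_ms / cycle_ms:.1%} of the time spent on monitoring")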