
  1. If "Quick start" is working (you can control track volume), your device is delivering what it should. For the instrument: start by checking whether you can write automation for the parameter in question (without the controller, just try to add an automation line for that parameter or write automation using the GUI). If that is possible, you can assign it to a hardware slider. If not, (2) and (3) will NOT work (VSTis normally support automation for parameters which make sense to automate, but there are cases where they "forget" to do that). I assume it is possible. For (2), replace the "Strip" Action in the "Quick start" with an "ACT" Action and select "Slider 1" in the parameters. Try "ACT Learn"... if that is not working, follow the "AZ ACT Fix" instructions I have linked.
  2. The instrument has to support the feature explicitly. When it is supported, you can normally right-click on the control and there is something like a "MIDI learn" option. You then send MIDI, e.g. a CC message, by moving/pushing/turning the controller, and the plug-in remembers it. When there is no such option, the instrument may still respond to some particular CCs, normally mentioned in the documentation for the instrument; in that case you configure the controller externally to send what is expected. Going with surface plug-ins (any of them) requires first a "MIDI learn" inside the surface plug-in and then an "ACT Learn" for the parameter. As I have mentioned, the second part has had quite some bugs in the history of Cakewalk. Check the "AZ ACT Fix" utility: https://www.azslow.com/index.php/topic,297.0.html and also read https://www.azslow.com/index.php/topic,13.0.html (some comments may be obsolete/fixed, but in general it is still valid). Using it allows editing the mapping without "ACT Learn", but read the documentation: the utility is not intuitive. Note you can use it to control the mapping for any surface plug-in; you don't have to use AZ Controller. AZ Controller is the most flexible but also the most complicated surface plug-in. There are a manual, tutorials and examples. If you master it (which will take some time...), you will be able to assign your sliders to anything within seconds. To understand how the "ACT MIDI" surface plug-in works (and optionally recreate its functionality in AZ Controller), read https://www.azslow.com/index.php/topic,107.0.html From that, even if you cannot or will not follow it, you will understand why the whole topic is not "simple". Note that the ideas behind control surfaces in DAWs are almost the same for all controllers and all DAWs. Inside Cakewalk with AZ Controller you can learn these concepts without any background; in some other DAWs the same requires programming skills (C++ in REAPER, Python in Ableton, etc.), and some DAWs don't have a public open API.
So you have an opportunity to look "behind the scenes" with relatively small effort.
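The "MIDI learn" handshake described above (arm a parameter, then bind it to the next incoming CC) can be sketched in a few lines. This is purely illustrative pseudo-plumbing, not any real plug-in's API; the class and method names are made up:

```python
# Minimal sketch of the "MIDI learn" idea: the plug-in listens for the next
# incoming Control Change message and binds it to the armed parameter.
# MidiLearn, learn() and feed() are illustrative names, not a real API.

class MidiLearn:
    def __init__(self):
        self.map = {}          # (channel, controller) -> parameter name
        self.learning = None   # parameter currently waiting for a CC

    def learn(self, param):
        """Arm learn mode for a parameter (the 'right click -> MIDI learn')."""
        self.learning = param

    def feed(self, msg):
        """Feed a raw 3-byte MIDI message coming from the controller."""
        status, data1, data2 = msg
        if status & 0xF0 != 0xB0:      # only Control Change messages
            return None
        key = (status & 0x0F, data1)   # channel + CC number identify the control
        if self.learning is not None:  # remember the binding, leave learn mode
            self.map[key] = self.learning
            self.learning = None
        if key in self.map:            # deliver value 0..127 as normalized 0..1
            return (self.map[key], data2 / 127.0)
        return None

ml = MidiLearn()
ml.learn("Cutoff")
ml.feed((0xB0, 74, 64))          # first CC 74 on channel 1 is captured by learn
print(ml.feed((0xB0, 74, 127)))  # -> ('Cutoff', 1.0)
```

The "ACT Learn" step in a surface plug-in is conceptually the same binding, just done on the DAW side instead of inside the instrument.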
  3. Maybe REMOVING everything in the "Control Surfaces" preferences is what you really want... Just make sure you set Omni MIDI as the instrument input. The explanation: parameters of instruments and FXes can be controlled in 3+ different ways in Cakewalk:
* MIDI assign/learn inside the instrument (many CCs are pre-mapped). Some FXes may have MIDI input, in which case they can be controlled using the same approach; but FXes in general can only be controlled with the following methods.
* The "ACT Dynamic Plug-in Mapping" feature, with the "ACT MIDI", "Generic Surface" or "AZ Controller" plug-ins and a corresponding configuration. The mapping works with the plug-in in focus, and is done with the "ACT Learn" button. There have been many bugs in this mapping, at least in the past; the "AZ ACT Fix" utility can help keep the persistence under control.
* The "ACT Direct Plug-in Control" feature, possible with "AZ Controller" (also used by the Mackie Cakewalk plug-in, but that works with Mackie-compatible devices only). In this case you can control a specific parameter of a specific plug-in, so even several plug-ins in parallel. But for that you have to manually select in the configuration what exactly you want to control.
* Cakewalk also has a direct "Remote control" feature, so you can learn DAW parameters (and, with some workaround, instrument parameters) directly. Unfortunately, a 20+ year old design bug is still there (assigned MIDI still leaks to everything else) and I remember some other bug reports about persistence of assignments. So, if you use any MIDI instruments in the project, I can't recommend using it (when unlucky, the side effects can make you think you have a serious problem with your mind...).
  4. Good Sonar and Next overviews, thanks! For stem separation, at least some time ago (when I was interested) my favorite was Ultimate Vocal Remover. Most (all?) separators differ only in the trained model, and the results depend on the music you separate (the quality of particular tracks is source- and model-dependent: one model can deliver better drums while another delivers better vocals, extracted from the same source...). UVR is a GUI with several parameters and models. It is offline and open source, as is everything it uses. The back-ends have good scientific explanations, comparisons and references.
  5. For me https://www.bandlab.com/membership is just black... It contacts tons of trackers (Google, TikTok, Facebook, Outbrain, etc.), so my Firefox refuses to load it. From the beginning, the "free" CbB demanded re-activation every half a year, so it was clear it would not really be "free" forever. All the locking, which started in the Platinum era, especially with the "new" version of Z3TA+ 2 whose only visible change was online activation, and especially the related bugs, was not convincing. Now the owner is different and the management behavior is even less convincing: "our free product will be a paid product... soon..." But I had thought there might be a re-start in development. Unfortunately, the video linked earlier in this thread demonstrated a second change of GUI framework and several tiny features. No MIDI routing/VST and no offline pre-processing during playback, so I guess it is still the very same 20-year-old engine. I am a hobbyist, but in recent years I have paid quite some money for plug-ins and a bit for the DAW. And I have no regrets, because each time it was absolutely clear what I was paying for (and how much... for the BandLab membership I can't even find the price in euros or info about VAT). I want a DAW to be able to run for a lifetime (mine, not the DAW's!). And I know that is the case with Sonar X and my current DAW: any version, at any time, offline. In other words, Sonar X (and CbB, as long as it is possible to run it) projects will be supported by ReaCWP (I am still fixing bugs when they are spotted), but any changes in the format introduced by "new Sonar" will not be supported/fixed. I mean: if you have project files from X/CbB, don't overwrite them when "testing" "new Sonar". Just in case you need ReaCWP in the future... who knows...
  6. OSARA is a REAPER add-on; it has an automatic installer but can be installed manually by copying one DLL into the expected directory (and loading the shortcuts afterward). It works with any screen reader, under Windows and OSX. Sonar 8.5 was so popular because of CakeTalking (paid software) and JSonar (free). From what I know, the developers were from the community: not sighted developers with the target users as testers, but the other way around. Sonar X1 had a completely new interface, and Windows 8 also broke several things, so that direction stalled a long time ago. OSARA was born from the need to move on somewhere. Since REAPER is the only other DAW with an open API (in practice with way more possibilities than the Cakewalk API), the choice of DAW was simple. The idea was to make it free and open source, and the developer (AKA the owner) of REAPER is aware the community exists (clear from REAPER's change-logs). All of that is "alive": there is an active REAPER developer, an active OSARA developer, an active community "manager" and several sighted helpers. Most people have moved away from Sonar 8.5 (from what I know, including the JSonar developer). I mean, I am not sending Annabelle down the rabbit hole; I am just trying to convince her to take the route others have already taken (and not regretted). Cakewalk had the intention to make CbB accessible; maybe some day there will be yet another attempt. There are some people who do in fact use Sonar X3/CbB, after creating the project in 8.5. With an X-Touch Compact and an NI keyboard that kind of works. But I can't recommend it, at least at the moment.
  7. Unfortunately Speakerphone has no demo, so I can't check it myself. It may be worth asking the developers if the new version is JUCE based (the most popular framework); if so, they should not forget to check accessibility, since recent JUCE versions are accessible. As I have written before, for the host it is better to stay with Sonar 8.5. I know that is not easy, even for experienced people, but there is a big community which can help; most REAPER/OSARA users were working with Sonar 8.5 before. For plug-ins, fortunately, there is a movement toward native accessibility. The JAWS / HSC / SIBIAC scripts for inaccessible plug-ins were fighting against a wall: once something was developed, a new version of the plug-in appeared and the work was voided.
  8. You even "liked" my post in the previous thread, about the FluidSynth VST... I have checked, and I am not delivering SysExes. I don't remember why; in the source I have a "todo" comment for it 😏 There were several technical problems with GM in the VST3 format, and I have not invested time into a GUI... But that does not prevent the current version from more or less working.
  9. Some years ago I tried to find a multi-platform solution to use in place of TTS-1. And I found nothing... I ended up rolling my own based on an SF2 synth, for sure far from perfect, not even with complete FluidSynth capabilities: https://github.com/AZSlow3/FluidSynthVST
  10. That means my example with a tempo change is not (or not only) the problem you observe... I don't have other ideas at the moment, but I will think more. So maybe you can publish it (without audio, just the .cwp file). Who knows, something there may give a hint (Cakewalk and I can see the internal project structure), even when the bug is not reproducible in its current state. PS, OT: Thank you for the clarification. I have long been present on this and the previous Cakewalk forums, and there have been several people who for one reason or another refused to communicate with me. Please don't get me wrong, I respect such decisions. But the silence sometimes irritates me, since in that case I don't know whether my attempts to help make sense. Peace
  11. @pulsewalk answering questions which can help identify the origin can... well... help identify the origin. If you don't take me seriously, ok, but at least don't ignore the questions from msmcleod (assuming you want the problem found and fixed...) @msmcleod it seems I have found a method to reproduce the "reverse Z":
* zoom in to the sample level
* add an automation with 2 points, so a "jump" composed of two linear segments. Note they are aligned to samples and the points can be moved by samples only.
* add a tempo change before the points in question. Notice that the points are no longer aligned to samples (at least visually).
* adjust the tempo change in such a way that the automation points fall behind a sample tick, but closer to the previous tick than to the next one.
* start dragging the point: it will snap to the nearest (previous) sample, producing the picture above.
Note that when the points are "closer" to the next sample, CbB does not allow dragging to the previous sample. In the saved project, at least in my case, the "start" time for some segments is saved as "musical" but the "duration" as "absolute" (I obviously don't know what these parameters are really called in Cakewalk, but I guess you understand what I am talking about).
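The misalignment above is plain tick-to-sample arithmetic: at 960 PPQ a tick boundary only lands exactly on a sample for "friendly" tempo/rate combinations, and a tempo change shifts the fractional part. A small sketch (the numbers are illustrative, not taken from any project):

```python
# Convert a MIDI tick position to an absolute sample position.
# PPQ = 960 (Cakewalk's default resolution), sample rate 44100 Hz.
PPQ = 960
SR = 44100

def tick_to_sample(tick, bpm):
    seconds = (tick / PPQ) * (60.0 / bpm)   # ticks -> quarter notes -> seconds
    return seconds * SR                     # seconds -> samples (a float!)

# At 120 BPM one tick is 44100 * 0.5 / 960 = 22.96875 samples, so automation
# points aligned in ticks are generally NOT aligned in samples; the editor
# has to snap to the nearest sample when you drag a point.
print(tick_to_sample(960, 120.0))   # one quarter note -> 22050.0 (exact sample)
print(tick_to_sample(961, 120.0))   # one tick later -> 22072.96875 (between samples)
```

Which direction the snap goes (previous vs. next sample) is exactly what decides whether the "reverse Z" can appear.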
  12. @pulsewalk Give them a project with the problem... Background: automations in Cakewalk are saved as segments, using musical time (ticks). Each segment has a "start" and a "duration", specified in ticks. If "start" + "duration" is not equal to the "start" of the next segment, it can look like your screenshot. How (and whether) that can happen, only Cakewalk knows. Even if everything is aligned in ticks, it may be that they calculate the resulting sample in a way which is no longer aligned. Note that the default segment type is "linear", but there are other types, including "jump" (in which case there is no "previous value" point visible). The VST standard is fuzzy about how automation segments should be transmitted to plug-ins. In the very same place in the documentation they claim there should be one linear segment for the whole block, then give an example with single, not block-border-aligned points, then mention that "jumps" should be specified with 2 points at the same time (sample). Finally, it is technically possible to specify several points for a block, not in time order... And so, what Cakewalk does and how that is interpreted by a particular plug-in is almost up to them 🤪 From the "user" perspective, check the following:
* do you have tempo changes in the project? If yes, do the "Z"s appear at such changes (only), or also at constant tempo?
* do you have "unusual" settings in Project/Clock? (not audio sync / TPQ != 960)
* how do you create your automations? (controller, moving a knob/slider by mouse, editing points)
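The "start + duration must meet the next start" invariant described above is easy to check mechanically. A sketch, under the assumption that segments are plain (start, duration) pairs in ticks (as said, the real CWP field names are unknown):

```python
# Each automation segment: (start_tick, duration_ticks).
# A well-formed envelope has no gaps or overlaps: start + duration of one
# segment equals the start of the next one.

def find_gaps(segments):
    """Return (index, expected_next_start, actual_next_start) per mismatch."""
    problems = []
    for i in range(len(segments) - 1):
        start, duration = segments[i]
        expected = start + duration
        actual = segments[i + 1][0]
        if expected != actual:
            problems.append((i, expected, actual))
    return problems

good = [(0, 480), (480, 480), (960, 1920)]
bad = [(0, 480), (500, 480)]   # second segment starts 20 ticks late
print(find_gaps(good))  # -> []
print(find_gaps(bad))   # -> [(0, 480, 500)]
```

A mismatch like the second case is exactly the kind of thing that would render as an extra vertical/diagonal "Z" between segments.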
  13. Cakewalk is far from convenient for MIDI processing, since VST MIDI processors are not supported (they can be used as VSTis only) and MIDI routing is not flexible (apart from being hard to follow and sometimes buggy). You can check Ctrlr (free). It allows you to talk to an external MIDI port from within a plug-in (so without the DAW). I am not sure that works from several instances to the same device... If the target is just making a panel to control an external device with MIDI, you can stop reading here. For complicated MIDI hardware controlling, REAPER is probably the most flexible. Not only does it allow VST MIDI processors (most DAWs except Cakewalk allow that), it supports:
* arbitrary MIDI routing (including between tracks and sends to external MIDI)
* several MIDI buses on one track (a track is not limited to just one MIDI stream)
* Input FX: plug-ins can change MIDI (and audio) before it is recorded
* built-in MIDI modulation, including MIDI learn and MIDI control for parameters in any VST, even those which don't support MIDI control directly
* built-in audio modulation, so an audio level can control an arbitrary parameter. Combined with a "parameter to MIDI" capable plug-in (e.g. a built-in one), the audio level can be "converted" to MIDI changes and then processed/routed/recorded
* when some wish is still not covered, scripting and plug-in extensions are supported, with access to everything (MIDI, audio, content, DAW/project state, etc.), and some special MIDI-related plug-ins make use of that (e.g. ReaLearn)
PS. Long, long ago Cakewalk had a built-in solution for controlling external hardware, called StudioWare. Unfortunately, for unknown reasons, that functionality was abandoned...
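The "audio level converted to MIDI changes" idea above amounts to quantizing a normalized level into 7-bit CC values and only emitting a message when the value changes. A purely illustrative sketch (the function name and defaults are made up, not any real plug-in's API):

```python
# Sketch of the "audio level -> MIDI" conversion: map a normalized
# envelope-follower level (0.0..1.0) to Control Change messages, emitting
# one only when the quantized 7-bit value actually changes.

def level_to_cc(levels, channel=0, cc=11):
    """Yield (status, controller, value) triples for changed values only."""
    last = None
    for level in levels:
        value = max(0, min(127, round(level * 127)))
        if value != last:
            yield (0xB0 | channel, cc, value)
            last = value

msgs = list(level_to_cc([0.0, 0.5, 0.5, 1.0]))
print(msgs)  # -> [(176, 11, 0), (176, 11, 64), (176, 11, 127)]
```

The change-only emission matters in practice: an envelope follower produces a value every audio block, and flooding a hardware device with identical CCs wastes MIDI bandwidth.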
  14. ReaCWP tries to output clips with "all the goodies" specified in Cakewalk (sec/beat, clip, offset, loop, etc.). I went through the DAWProject documentation and all of that is possible to specify. But what the "right" way to do it is remains unclear. Unfortunately, unlike in Cakewalk and REAPER, where "track with all related parameters (-> lane) -> clip with all related parameters" is hard to interpret wrongly (since it is logical and follows what users see and do...), DAWProject introduces a different hierarchy of containers, without a precise specification of how it should be built (at least at first look; all the examples provided with the specification are "too simple"). PS. I hope DAWProject is not going to be another "VST3" (where no one knows how to implement many things "right", nor even whether a "right" way exists...), triggering endless incompatibilities...
  15. Sorry, that statement was theoretical... I mean that apart from Cakewalk, I am the only person who knows the CWP format sufficiently to convert it into something... On the VST conversion problems: is it about VST2, VST3 or both? Recently I found that ReaCWP doesn't convert some VST3 presets correctly. I have slightly better code, but I have not put it into ReaCWP yet (it was written for another VST-presets-related utility...). But there is yet another possible problem, not really tested by me nor confirmed by anyone else: https://forums.cockos.com/showthread.php?t=281800 So it can happen that I have overlooked something. In case 1) the problem you observe is with VST3, 2) the state of the VST is correct in REAPER, and 3) you have a minute (since you know the exact VST and state which does not work, I guess the test will not take long): please save a .vstpreset (state file) for the VST/state in question in REAPER and load it in Bitwig. Does it load correctly? Note that this REAPER bug (if it is a real bug) probably affects just some VST3s and particular states, since many don't use the second section of the state (or ignore it). "Standards", especially complex ones, especially at the beginning, are rarely implemented completely and without bugs. I guess something about audio clips as created by ReaCWP is not implemented in the Studio One DAWProject importer yet. ReaCWP tries to match the clip parameters in Cakewalk (time base, length, offsets, loops); a project created directly in REAPER follows the current defaults, which can be different. If you give an example to Jürgen (the author of ProjectConverter), he can probably understand the reason quickly (since he knows exactly what is converted and how, so he can guess what in that may be incompatible).
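The "second section of the state" mentioned above refers to the two state chunks a .vstpreset carries. To the best of my understanding of the VST3 SDK's preset layout (treat the offsets here as an assumption, and the sample bytes are synthetic), a preset is a 48-byte header followed by data chunks and a chunk list, where 'Comp' is the component (processor) state and 'Cont' is the controller state:

```python
import struct

# Sketch of the .vstpreset layout as I understand it from the VST3 SDK:
# header = 'VST3' magic + int32 version + 32-byte ASCII class ID + int64
# offset to the chunk list; the list holds (id, offset, size) entries such
# as 'Comp' (component state) and 'Cont' (controller state, the "second
# section"). All fields little-endian. This is a reading aid, not the SDK.

def parse_vstpreset(data):
    magic, version, class_id, list_offset = struct.unpack_from("<4si32sq", data, 0)
    assert magic == b"VST3"
    list_id, count = struct.unpack_from("<4si", data, list_offset)
    assert list_id == b"List"
    chunks, pos = {}, list_offset + 8
    for _ in range(count):
        cid, offset, size = struct.unpack_from("<4sqq", data, pos)
        chunks[cid.decode()] = data[offset:offset + size]
        pos += 20
    return class_id.decode(), chunks

# Build a tiny synthetic preset (fake class ID and states) and parse it back.
comp, cont = b"component-state", b"controller-state"
body = comp + cont
header = struct.pack("<4si32sq", b"VST3", 1, b"0" * 32, 48 + len(body))
lst = struct.pack("<4si", b"List", 2)
lst += struct.pack("<4sqq", b"Comp", 48, len(comp))
lst += struct.pack("<4sqq", b"Cont", 48 + len(comp), len(cont))
cid, chunks = parse_vstpreset(header + body + lst)
print(chunks["Cont"])  # -> b'controller-state'
```

If a host writes only the 'Comp' chunk, or misplaces 'Cont', plug-ins that keep part of their state in the controller section will load "almost right", which matches the kind of subtle state mismatch discussed above.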