azslow3

Members
  • Posts: 801
  • Joined
  • Last visited

Reputation: 508 Excellent

4 Followers

  1. That is how it should work. Unfortunately, it seems like that is not (or not always) the case. Yesterday I created a simple project: one audio track and one MIDI track. I assigned one CC from a device knob as a "Remote Control" for track one's volume. Then I armed the MIDI track and started recording. Turning the knob recorded CC changes... The test was quick, so I may have overlooked something.
  2. MIDI messages assigned to "Remote control" are not blocked. So if you use the device as an input to some synth (or you have an Omni input), the synth will receive them. What happens then depends on the synth and on the MIDI messages used. I have just checked: it is still the case with current Sonar. I was lucky to spot that early, with the well-known CC7. I had some knob generating CC7 and I assigned it as a "Remote control" for something (the volume of some track, I think...). Everything was working fine until I focused a track with a synth. At some moment the synth went silent... There was no indication in the GUI, nothing visual at all. Just no sound. I reloaded the project - everything OK again. After a while, silence again. It took me a while to understand the reason 🙄 After that, when much the same happened with the "ACT MIDI Controller" (from what I remember, that one is already fixed), I already knew where to look. It turned out my controller's buttons were generating Notes (not CCs), and "Note Off" is a different message from "Note On", so it was not blocked (see the sketch below). That silences a particular note, but only when that note happens to be playing at the moment you use the control button for something unrelated (e.g. muting some other track), so it is not easy to understand what is going on. ---- In other words, mixing "Remote control" with synths (with any MIDI in the project) is asking for trouble. Sure, you can carefully set MIDI track inputs to a particular MIDI channel and use another channel for "Remote control". But one mistake and you get a mess (as in the example above: a synth reacting to particular MIDI messages without any indication in the GUI). I think that is a bug which for some reason has not been fixed in decades... --- ACT (I mean all surface plug-ins) blocks assigned messages, so there is no such problem there (except that it is not possible to block the Wheel without blocking everything else...).
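To make the Note On / Note Off trap concrete, roughly this is what happens (a minimal sketch of my understanding, not Cakewalk's actual code; Python with the mido package, the assignment and messages are invented):

```python
# A rough model of the leak: only the exact learned message triggers the
# remote action, and "Remote control" forwards it to synths anyway; the
# paired Note Off never matches the assignment at all.
import mido

# Hypothetical assignment: a button sending Note C4 (60) on channel 1
ASSIGNED = dict(type='note_on', note=60, channel=0)

def matches_assignment(msg):
    return (msg.type == ASSIGNED['type']
            and msg.note == ASSIGNED['note']
            and msg.channel == ASSIGNED['channel'])

def handle(msg, forward_to_synth):
    if matches_assignment(msg):
        print('remote action fired (e.g. mute some track)')
        # A surface plug-in (ACT) would return here and block the message.
        # "Remote control" does not, so it falls through...
    # ...and the Note Off falls through even with blocking, cutting
    # whatever note the synth happens to be holding:
    forward_to_synth(msg)

handle(mido.Message('note_on', note=60, channel=0, velocity=100), print)
handle(mido.Message('note_off', note=60, channel=0), print)
```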
  3. Cakewalk developed a special system which allows learning way more controls. (Still) a bit buggy, but it works. And in case you manage more than 15-20 assignments, it makes sense to have editing tools for the assignments, and they exist. And if you want to see all of that on a tablet, with names and synced values, there are programs which allow you to do so. Yes, that system is not "Remote control". It is called "ACT". In other DAWs it is usually the same split: "simple" assignment and "complicated" (surface) control. It may be that FL Studio has a few more features in the "simple" approach than Sonar has in its "Remote control". These DAWs are different from many perspectives.
  4. Yes. Then, in case a particular MIDI track input is set to a specific device/MIDI channel/both, only messages from that device/channel are delivered there. Omni input + All channels, and "Remote control", get all MIDI messages from all devices, always, independent of track "buttons" (sketched below). But there is separate "Control surfaces" logic: surfaces can block all or particular MIDI messages from a particular device, and they can do this dynamically. "Remote control" (still? I have not checked for a while) also "leaks" into MIDI track inputs; Cakewalk is (or at least was) not blocking assigned messages. BTW "Input Echo", like "Mute", is output related. You are slowly getting to the idea why in some DAWs there is a separate "Enable input" option. But man... (all?) people who try to use that concept after Sonar initially think it is "not intuitive/wrong/why???" 😏 This... And I have tried to explain to him why it is not a good idea to use "Remote Control" in a complicated way. It was not designed to be flexible/complicated. And it was superseded by ACT (until the Matrix... ACT has no control over it). Unfortunately, flexibility is bound to complexity: if you control something "complex", you shouldn't make the controlling system "simple". Most people don't know this (even those who should... musicians are not obligated to know it), but there is a corresponding theorem. Perpetuum mobile, Gödel's incompleteness theorems, traveling to another galaxy... There are many things we "intuitively" think are possible, but they are not (with proofs)... Something/someone has introduced interesting rules into this Universe... 😒
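My mental model of that routing, as a sketch (plain Python, names invented; not Cakewalk source, of course):

```python
# Toy model: track inputs filter by device/channel; "Remote control" does not.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackInput:
    device: Optional[str] = None   # None = Omni (any device)
    channel: Optional[int] = None  # None = All channels

def reaches_track(inp: TrackInput, src_device: str, src_channel: int) -> bool:
    return ((inp.device is None or inp.device == src_device)
            and (inp.channel is None or inp.channel == src_channel))

def reaches_remote_control(src_device: str, src_channel: int) -> bool:
    # Remote control sees every message from every device, always.
    return True

strict = TrackInput(device='Keystation', channel=0)
omni = TrackInput()                              # Omni + All channels
print(reaches_track(strict, 'nanoKONTROL', 5))   # False: filtered out
print(reaches_track(omni, 'nanoKONTROL', 5))     # True: omni sees everything
print(reaches_remote_control('nanoKONTROL', 5))  # True, always
```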
  5. I don't know any DAW in which Mute works on the input (audio or MIDI). Mute means "mute the output" (everywhere). "Input Echo" / the record mode (e.g. "Sound on Sound") / "Record arm" control what goes where and when. The input stream and the existing recorded stream can be merged differently (even at the same time) for output and for recording. There is no "enable input" in Sonar. There is just one "MIDI bus" per MIDI track in Sonar. It can't distinguish different controllers if they send the same MIDI messages (see the two-line demo below). And you can assign one controller or all controllers as an input. But there are "Drum maps" and MIDI filtering/converting plug-ins which give some routing/filtering possibilities (plug-ins are always "effects", so they are applied after recording). All of that is still in the DX MIDI domain, with limited MIDI routing. MIDI in Sonar has not changed in 20+ years. Maybe the devs have plans to touch it in the future, but that is unknown... In other words, if you need/want intensive MIDI processing inside the DAW, use another DAW.
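The "one MIDI bus" limitation in a few lines (Python with mido; device origins invented): once streams are merged, identical messages from different controllers are literally the same bytes:

```python
import mido

# Two different controllers, identical message on the wire:
from_keyboard = mido.Message('control_change', channel=0, control=7, value=100)
from_fader_box = mido.Message('control_change', channel=0, control=7, value=100)

# After merging into the track's single input stream the source is gone:
print(from_keyboard.bytes() == from_fader_box.bytes())  # True
```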
  6. I have tried several times to understand how to work with tempo in Melodyne when it runs inside a DAW. But I have failed to understand the logic (if there is any at all...). If you have the Studio edition, you can do multi-track tempo manipulations standalone. If the material's tempo is recognized/assigned correctly, tempo changing of any kind works predictably (including making the tempo constant from free-style playing). But if the Cakewalk way (AudioSnap) produces reasonable results, render Melodyne and then do the tempo change in Cakewalk. It may be that this works fine without rendering Melodyne these days; back in the day that was not a good idea...
  7. I don't have a membership, so I don't know for sure... but from all the announcements: if (1) someone has a membership, (2) the new Core plug-ins are installed, and (3) VST3 "Replace if possible on project load" is enabled, then all unlocked Sonitus DXes will be auto-replaced by membership-locked VST3 Core plug-ins. That route is one-way, so it seems like if someone ends the membership: a) explicitly added or auto-replaced new plug-ins will stop working; b) implicit plug-ins may stop working, but I don't know (and can't test) whether implicit plug-ins are auto-replaced. By implicit plug-ins I mean FX Chains and Style Dial PC knobs. I couldn't find any info about (b).
  8. Hi Bruce, I am glad you have solved the problem. And sure, once a problem is solved, it is easy to find where it came from (from the Sonar changelog for 2025.07):
  9. The OP has already mentioned the driver name of the interface: RME Fireface. In other words, he already has an interface with a built-in matrix mixer and more routing capabilities than the M4 and SSL2 together... I guess there must be a reason he tries to use extra software. ASIO4ALL and VB-Matrix are in different categories. And until "old school" ASIO finally vanishes from Windows and is replaced by something "modern", software which works around artificial Steinberg limitations can be useful.
  10. Hi Bruce, I have installed Coconut and Sonar is still running fine. I even connected the RME and set it as the clock master, to be closer to your environment. Tested with M-Audio (2x4), RME (4x12) and Phonic (12x12), at 48kHz, all ASIO. Sonar is using VASIO-512 / VASIO-8. So there is no general problem with Coconut and Sonar. But I have an idea what could be wrong... Which RME device do you use? Do you have other ASIO devices in the system? Coconut exposes a huge number of its own I/O channels, and it may be that Sonar has some limit on the total number of channels it is able to "sense" in the system without crashing (a quick way to count them is sketched below). To check that theory, can you install the "normal" VB-Matrix (uninstall Coconut) and check whether Sonar still crashes?
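If you want to test the "too many channels" theory directly, something like this counts what all ASIO drivers in the system expose (a sketch using the Python sounddevice package; run it on the affected machine):

```python
# Sum the I/O channels of every device behind the ASIO host API.
import sounddevice as sd

total_in = total_out = 0
for hostapi in sd.query_hostapis():
    if 'ASIO' not in hostapi['name']:
        continue
    for index in hostapi['devices']:
        dev = sd.query_devices(index)
        print(f"{dev['name']}: {dev['max_input_channels']} in / "
              f"{dev['max_output_channels']} out")
        total_in += dev['max_input_channels']
        total_out += dev['max_output_channels']
print(f"Total ASIO channels visible: {total_in} in / {total_out} out")
```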
  11. I can't understand the meaning of that statement. Sonar definitely can work with VB-Audio Matrix (I have never tried Coconut, but seriously... if someone needs Coconut, it is time to invest in proper gear...). Just to be sure, I have tried all combinations: ASIO and WASAPI, with Matrix loaded and not loaded (obviously the I/O selection in Sonar has to be changed to get sound). But as with any VB stuff, the user has to understand what this software and many of its settings are doing. Windows audio settings are important as well, especially when the drivers of the hardware audio interface are "not the best quality". VB programs tend to glitch/get stuck if something is configured wrong (or if programs are started in the wrong sequence). The only Sonar-specific issue I know of: it tries to auto-open devices "early". The settings are scattered across several Preferences pages, so it is not possible to select the complete target configuration (driver model + I/O channels + clock source) and only then load it. Once ASIO is selected and applied, Sonar already tries to work with some device/channels (e.g. the last used in ASIO mode). And by default Sonar enables all channels. As a consequence, when the default/current Sonar ASIO configuration is not only "wrong" but also stuck/crashing (easy to achieve with VB and many "well known" devices), it is not easy to correct. The general strategy is:
      • Make Sonar work in the target driver mode, let's say ASIO. If uninstalling VB Matrix is required, fine (it doesn't take long to re-install), but just disabling the ASIO device in Matrix can be sufficient. If uninstalling is the only route: change the ASIO configuration in Sonar to something which should still work after the change (e.g. don't use the audio interface which will be under VB control; there are plenty of "virtual" devices, from VB and other software, which can be used at that stage), re-install the drivers (VB Matrix), and check that Sonar is still OK.
      • Properly configure Matrix/Windows/everything else working with audio, then attempt to select the Matrix ASIO device in Sonar. Most important: don't let any software except VB Matrix use the real audio interface(s). That includes Windows/OBS/Sonar/etc. This is especially important if the real device's driver is unable to work in ASIO and WDM in parallel. It should be easy to do after you install VB Matrix but before you configure any real ASIO audio interface in it: just switch Windows/OBS/Sonar to use the Matrix virtual devices.
      • Let the audio interface be the clock master in Matrix (if possible).
      • Everything should be configured to work at the same sample rate (a quick check is sketched after this post). When some non-primary WDM device simply can't work at the desired sample rate (e.g. a webcam), that is usually not a problem. But Matrix/device/Windows/Sonar at different sample rates is asking for trouble. The only tricky software is Sonar, since the sample rate is project dependent. I would say: if you have projects with different sample rates, use a dedicated audio interface for Sonar; it can be combined with external analog audio looping when required. Note that the Matrix ASIO sample rate is locked to the master. So if you still consider using different sample rates in Sonar, you will need more than one Matrix configuration: either use the audio interface as master with different sample rates, or use the Matrix Clock as master with different sample rates while keeping the audio interface at one sample rate (if that works better).
      • Use pessimistic buffer settings (start with at least 512). A VB solution is not going to have low latency; it puts extra processing and synchronization into the audio chain. Any underrun can produce problems ranging from glitches up to a total crash, even when in "direct" mode it would just produce a small pop.
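For the "same sample rate everywhere" point, here is a quick way to spot a mismatch from code (a sketch only, using the Python sounddevice package; the 48000 master rate is just an example):

```python
# List every audio device and flag those whose default rate differs
# from the rate the Matrix clock master is supposed to run at.
import sounddevice as sd

MASTER_RATE = 48000  # example: whatever your Matrix master is set to

for dev in sd.query_devices():
    rate = int(dev['default_samplerate'])
    mark = 'OK' if rate == MASTER_RATE else '!!'
    print(f"{mark} {dev['name']}: {rate} Hz")
```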
  12. I know it sounds "not right", but try to EQ the TXLTimecode output with an LPF at ~20kHz. The plug-in outputs a "perfect" digital square wave, and that is not good for an audio output. The file downloaded from the site you referenced, and the built-in generator of one other DAW, both have (a kind of) LPF applied (the idea is sketched below).
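Outside the DAW, the same band-limiting looks roughly like this (a numpy/scipy sketch; the 1 kHz tone, 96 kHz rate and the filter order are just examples):

```python
# Band-limit a "perfect" digital square before it reaches the converters.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 96000                                        # sample rate
t = np.arange(fs) / fs                            # one second
square = np.sign(np.sin(2 * np.pi * 1000 * t))    # naive 1 kHz square

# 4th-order Butterworth low-pass at ~20 kHz, applied zero-phase:
b, a = butter(4, 20000 / (fs / 2))
smoothed = filtfilt(b, a, square)                 # no more "perfect" edges
```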
  13. Driver-related settings are grayed out when the driver doesn't allow changing them. So it is locked on the AXE I/O side. But in general I can't imagine a reason to want 16 bit there: the AXE I/O dynamic range is significantly more than 96dB (see the numbers below), and DAW processing is done at 24 bit (32-bit float, or more when the "Double precision engine" is enabled). But you can still record/render into 16-bit files. Unlike with the sampling frequency, I don't think Sonar puts any restriction on mixing different bit depths in one project.
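The arithmetic behind the "more than 96dB" remark, in case it helps (the usual rule of thumb of ~6.02 dB per bit, plus ~1.76 dB for an ideal converter):

```python
for bits in (16, 24):
    print(f"{bits}-bit: ~{6.02 * bits + 1.76:.1f} dB theoretical dynamic range")
# 16-bit: ~98.1 dB; 24-bit: ~146.2 dB. 32-bit float goes far beyond both.
```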
  14. For that money, they had to do something "special" 😏 Merging knows how to build that type of equipment, but the "USB part" was hacked in by someone else. That is the first (and the only) interface from Neumann, and I don't know if Sennheiser ever had a company with experience in normal USB interfaces. What I think is "special" in the class of "notebook capable" audio interfaces with a small number of I/Os is... the fan... reviews mention it is not really silent... 🤔 Also, the built-in effects are not stochastic, so there is no reason to record wet. I think many DMs and "DM-like" audio interfaces can do that. DM effects can be stochastic, so there it can make sense. Recording a digital EQ and compressor (e.g. in the RME) has questionable value.
  15. Aux Tracks were introduced exactly for that purpose (the possibility to use them as buses, to visually interleave "buses" with tracks, was a side effect...). Many synths and effects produce a significantly different result every time you play the track, and while performing, people "adapt" to the sound currently produced; e.g. all effects with slow continuous (wall-clock time) modulation. There are DAWs with a special "FX bin" which is applied before track data is written to disk (useful for audio and MIDI).
Γ—
Γ—
  • Create New...