David Baay
  1. The last time I looked only Pro Tools and Reaper support this (i.e. defining the value of a beat to be a dotted quarter with three eighths per beat), and it's optional because not all 6/8 and 9/8 compositions have a triplet feel. In CW and every other DAW, a "beat" is always a quarter note and has two 8ths. This also affects how tempo is defined - bpm is always quarters per minute (not the dotted quarters per minute of compound time).
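As a quick arithmetic check, the two tempo conventions differ by a factor of 1.5, since one dotted quarter spans three eighths, i.e. 1.5 quarter notes. A minimal sketch of the conversion (the function name is mine, not from any DAW API):

```python
def dotted_quarter_bpm_to_quarter_bpm(dq_bpm: float) -> float:
    """Convert a compound-time tempo (dotted quarters per minute) to the
    quarters-per-minute value a DAW like Cakewalk expects.
    One dotted quarter = three eighths = 1.5 quarter notes."""
    return dq_bpm * 1.5

# A 6/8 piece felt at 60 dotted-quarter beats per minute must be
# entered as 90 BPM in a quarters-per-minute DAW.
```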
  2. I suggest you uncheck Use ASIO Reported Latency, leave the Manual Offset at 0, and try recording the metronome output via the loopback setup suggested by reginaldStjohn. The recorded click should be laid down approximately those 262 samples late (usually a little more due to unreported hardware/firmware latencies). If it's still early, execute Reset Configuration File and try again. If you can get the metronome recording late without compensation as expected, then you can re-enable automatic latency compensation plus or minus a Manual Offset to dial it in to the sample, and then move on to recording your DI guitar. If you continue to see the metronome recorded early without any compensation applied, then I would have to think there's something on your system like ASIO4ALL or Steinberg's Generic Low Latency ASIO Driver interfering with Cakewalk's exclusive use of the Focusrite driver.
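To put that 262-sample figure in perspective, it converts to milliseconds by dividing by the sample rate (44.1 kHz is assumed here; substitute your interface's actual rate):

```python
def samples_to_ms(samples: int, sample_rate: int = 44100) -> float:
    """Convert a latency offset in samples to milliseconds."""
    return samples / sample_rate * 1000.0

# 262 samples at 44.1 kHz is roughly 5.9 ms of offset.
```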
  3. Your setup should be working as expected if channels and ports are correctly assigned. If you record MIDI, do you see CC64 messages in the PRV controller lane and/or Event List, and are they on the same channel as the Note events? If so, try playing back that recorded MIDI to EZ Keys. If not, double-check that the track's Input includes the Icon controller port with no channel restriction (Omni) and that the Arturia and Icon are both set to transmit on channel 1, and retry recording.
  4. I had not updated to V15 yet so I had a go just now. I got installation errors until I executed Clear Cache in Settings as suggested above (and also by the Waves support page for "Installation failed – Please check your internet connection"). All seems fine with existing projects after updating and re-activating licenses.
  5. Because the underlying code that determines what should be drawn in any given situation hasn't changed, only the way it's drawn. I've previously reported a number of crossfade-related display issues that have been fixed. What are the specific scenarios/steps where you're seeing problems?
  6. Go to HKEY_LOCAL_MACHINE\SOFTWARE\ASIO in the registry and remove the 'Generic Low Latency ASIO Driver' key. Cubase installs this and it's notorious for causing issues with Cakewalk. It's not needed for using WASAPI with onboard audio or for an external audio interface with its own ASIO driver.
  7. Go to the Cakewalk application folder in Program Files, and rename AUD.INI. Cakewalk will replace it on launch with one that defaults to WASAPI. Not sure why this happened with WDM; likely some issue with the response of the device driver to initialization. EDIT: It'll be in Cakewalk\Cakewalk Core for Cakewalk by Bandlab.
  8. That's not likely an inherent issue with Instrument tracks but some error in the conversion process causing the track to be associated with the wrong synth in the first place. Because inserting synths in FX bins is such an old implementation, I wouldn't be too surprised if there are issues with audio/MIDI port enumeration changes when deleting the ones in the FX bins. The right sequence of conversion steps should avoid that. I would recommend the following approach to avoid having ports crossed up or having to recreate the MIDI and synth audio tracks, preserving the existing FX and track settings:
     - Save a copy of the project, save presets for any synths that aren't using a default, and then delete all the synths from FX bins and re-save it.
     - Using the original project as a reference, insert the first missing synth via Insert > Soft Synth, and don't have Sonar create any tracks for it.
     - Assign the input of the audio track that previously hosted the synth in its FX bin to the newly inserted instance, click the icon to open the synth UI, and assign the relevant preset.
     - Assign the output of the relevant MIDI track to the synth in the rack.
     - Select the MIDI and Audio tracks, right-click, and choose Make Instrument.
     - Test playback.
     - Repeat the insert, I/O assignment, Make Instrument, and playback test for each synth.
  9. Enable Ripple Edit Selection, select the region from the cutoff point up to the controllers that you want to keep (PRV or Event List will probably be easiest for this) and delete. Ripple edit will close the hole created by the delete, bringing the controllers back to the end of what you're keeping.
  10. LPF in Gloss EQ or LP-64 EQ or built into the plugin or somewhere else? Does it matter how low the cutoff is? Any particular instrument plugin or you can reproduce it with several?
  11. There are still situations where MIDI timing can drift progressively out of sync by a few samples per iteration when looping at random points at certain tempos but it's usually avoided by looping exactly on bars/beats. I believe Jamstix uses song position so I might try experimenting with that later.
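The few-samples-per-iteration drift is consistent with samples-per-beat not being an integer at certain tempo/sample-rate combinations, so a loop point rounded to whole samples accumulates fractional error on each pass. A rough illustration of the idea (my own sketch, not Cakewalk's actual code):

```python
def samples_per_beat(sample_rate: int, bpm: float) -> float:
    """Exact (possibly fractional) number of samples in one quarter-note beat."""
    return sample_rate * 60.0 / bpm

def drift_after(iterations: int, sample_rate: int, bpm: float) -> float:
    """Accumulated timing error, in samples, if the loop restart point is
    rounded to the nearest whole sample on every iteration."""
    exact = samples_per_beat(sample_rate, bpm)
    return abs(exact - round(exact)) * iterations

# At 120 BPM / 44.1 kHz a beat is exactly 22050 samples: no rounding, no drift.
# At 97 BPM a beat is ~27278.35 samples, so error accumulates per loop pass.
```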
  12. Alt+Ctrl+C gets you Copy Special and Alt+Ctrl+V gets you Paste Special with the option to include Tempo Changes.
  13. Glad to help. Separate Synth Audio and MIDI tracks should be okay as well; you just need to get the audio into the synth track via its Input from the Synth Rack rather than the FX bin.
  14. I created a comparable project with Sonitus Delay delaying a track by one beat and nulling against the same audio manually offset by 1 beat. I then added CW's TS-64 Transient Shaper, which adds a significant amount of plugin delay, to the Sonitus-delayed track and confirmed the two tracks continued to null. I then moved Sonitus to the bus through which that track was outputting, and the project continued to null both before and after restarting playback. For good measure, I tried moving Transient Shaper to the same bus and other buses and tracks, and the project continued to null as expected. So I think this issue is either specific to ShaperBox or possibly involves the live synth, which I'll test next. EDIT: Did the same test using two nulling instances of Session Drummer (because two instances of SI Drums already weren't nulling well), and got the expected result regardless of where Sonitus Delay or Transient Shaper were located.
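For anyone unfamiliar with the null test used above: it amounts to summing one track with a polarity-inverted copy of the other, so sample-identical, sample-aligned audio cancels to silence, while any timing offset leaves a residual. A minimal sketch of the principle:

```python
def null_test(track_a, track_b, tolerance=0.0):
    """Sum track_a against the polarity-inverted track_b (i.e. subtract).
    Returns True if the residual peak is within tolerance, meaning the
    two tracks are sample-identical and sample-aligned."""
    residual = [a - b for a, b in zip(track_a, track_b)]
    return max(abs(s) for s in residual) <= tolerance

clip = [0.0, 0.5, -0.25, 0.1]
shifted = [0.0] + clip[:-1]  # same audio delayed by one sample
# null_test(clip, clip) passes; null_test(clip, shifted) does not.
```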
  15. I installed the plugins, but the demo of ShaperBox isn't doing anything - possibly because I didn't want to install the 300MB of presets and other content. I can see that Voxengo is set to add 1000 samples of delay to the synth audio track, and I can hear that delay vs. the metronome when engaged, but with ShaperBox not doing anything, there's really nothing to hear. EDIT: As I re-read the OP, it seems I may have misunderstood and this isn't even about PDC but more about tempo-syncing. Is that right? Also, as I read that Voxengo "Latency Delay introduces 10000 samples latency itself and delays the audio signal by 10000 minus the specified amount of samples or milliseconds, thus eliminating the unreported latency", the expected behavior of the test project becomes even harder to predict. I'll have to experiment with some other tempo synced plugin.
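Taking Voxengo's description at face value, the effective timing shift works out as follows. This is my reading of their formula under the assumption that the host's PDC fully compensates the reported 10000 samples, not something verified against the plugin:

```python
def net_delay_after_pdc(specified_samples: int, reported_latency: int = 10000) -> int:
    """Latency Delay reports `reported_latency` samples of plugin latency but
    actually delays the signal by reported_latency - specified_samples.
    With perfect PDC, the host advances the track by the reported figure,
    so the net result is the audio playing `specified_samples` early,
    which is how it can cancel unreported hardware latency."""
    internal_delay = reported_latency - specified_samples
    return internal_delay - reported_latency

# Specifying 1000 samples nets -1000: the audio lands 1000 samples early.
```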