Everything posted by David Baay

  1. When recording a mono source, you need to select only the left or right channel of the stereo input pair as Input to the track. You can bounce the stereo track to Split Mono and discard the silent channel to fix what you’ve got without re-recording.
  2. The last time I looked, only Pro Tools and Reaper supported this (i.e. defining the value of a beat to be a dotted quarter with three eighths per beat), and it's optional because not all 6/8 and 9/8 compositions have a triplet feel. In CW and every other DAW, a "beat" is always a quarter note and has two 8ths. This also affects how tempo is defined - bpm is always quarters per minute (not the dotted quarters per minute of compound time); there's a worked example of the difference after this list.
  3. I suggest you uncheck Use ASIO Reported Latency, leave the Manual Offset at 0, and try recording the metronome output via the loopback setup suggested by reginaldStjohn. The recorded click should be laid down approximately 262 samples late (usually a little more due to unreported hardware/firmware latencies; that figure is converted to milliseconds in the sketch after this list). If it's still early, execute Reset Configuration File and try again. If you can get the metronome recording late without compensation as expected, then you can re-enable automatic latency compensation plus or minus a Manual Offset to dial it in to the sample, and then move on to recording your DI guitar. If you continue to see the metronome recorded early without any compensation applied, then I would have to think something on your system like ASIO4ALL or Steinberg's Generic Low Latency ASIO Driver is interfering with Cakewalk's exclusive use of the Focusrite driver.
  4. Your setup should be working as expected if channels and ports are correctly assigned. If you record MIDI, do you see CC64 messages in the PRV controller lane and/or Event List, and are they on the same channel as the Note events? If so, try playing back that recorded MIDI to EZ Keys. If not, double-check that the track Input includes the Icon controller port with no channel restriction (Omni) and that the Arturia and Icon are both set to transmit on channel 1, and retry recording.
  5. I had not updated to V15 yet, so I had a go just now. I got installation errors until I executed Clear Cache in Settings, as suggested above (and also by the Waves support page for "Installation failed – Please check your internet connection"). All seems fine with existing projects after updating and re-activating licenses.
  6. Because the underlying code that determines what should be drawn in any given situation hasn't changed, only the way it's drawn. I've previously reported a number of crossfade-related display issues that have been fixed. What are the specific scenarios/steps where you're seeing problems?
  7. Go to HKEY_LOCAL_MACHINE\SOFTWARE\ASIO in the registry and remove the 'Generic Low Latency ASIO Driver' key. Cubase installs this and it's notorious for causing issues with Cakewalk. It's not needed for using WASAPI with onboard audio or for an external audio interface with its own ASIO driver.
  8. Go to the Cakewalk application folder in Program Files, and rename AUD.INI. Cakewalk will replace it on launch with one that defaults to WASAPI. Not sure why this happened with WDM; likely some issue with the response of the device driver to initialization. EDIT: It'll be in Cakewalk\Cakewalk Core for Cakewalk by Bandlab.
  9. That's not likely an inherent issue with Instrument tracks but rather some error in the conversion process that caused the track to be associated with the wrong synth in the first place. Because inserting synths in FX bins is such an old implementation, I wouldn't be too surprised if there are issues with audio/MIDI port enumeration changes when deleting the ones in the FX bins. The right sequence of conversion steps should avoid that. I would recommend the following approach to avoid having ports crossed up or having to recreate the MIDI and synth audio tracks, preserving the existing FX and track settings:
     - Save a copy of the project, save presets for any synths that aren't using a default, and then delete all the synths from FX bins and re-save it.
     - Using the original project as a reference, insert the first missing synth via Insert > Soft Synth, and don't have Sonar create any tracks for it.
     - Assign the Input of the audio track that previously hosted the synth in its FX bin to the newly inserted instance, click the icon to open the synth UI, and assign the relevant preset.
     - Assign the output of the relevant MIDI track to the synth in the rack.
     - Select the MIDI and Audio tracks, right-click, and choose Make Instrument.
     - Test playback.
     - Repeat the insert, I/O assignment, Make Instrument, and playback test for each synth.
  10. Enable Ripple Edit Selection, select the region from the cutoff point up to the controllers that you want to keep (PRV or Event List will probably be easiest for this) and delete. Ripple edit will close the hole created by the delete, bringing the controllers back to the end of what you're keeping.
  11. LPF in Gloss EQ or LP-64 EQ or built into the plugin or somewhere else? Does it matter how low the cutoff is? Any particular instrument plugin, or can you reproduce it with several?
  12. There are still situations where MIDI timing can drift progressively out of sync by a few samples per iteration when looping at random points at certain tempos, but it's usually avoided by looping exactly on bars/beats. I believe Jamstix uses song position, so I might try experimenting with that later.
  13. Alt+Ctrl+C gets you Copy Special and Alt+Ctrl+V gets you Paste Special with the option to include Tempo Changes.
  14. Glad to help. Separate Synth Audio and MIDI tracks should be okay as well; you just need to get the audio into the synth track via the Input from the Synth Rack rather than the FX bin.
  15. I created a comparable project with Sonitus Delay delaying a track by one beat and nulling against the same audio manually offset by one beat. I then added CW's TS-64 Transient Shaper, which adds a significant amount of plugin delay, to the Sonitus-delayed track and confirmed the two tracks continued to null. I then moved Sonitus to the bus through which that track was outputting, and the project continued to null both before and after restarting playback. For good measure, I tried moving Transient Shaper to the same bus and other buses and tracks, and the project continued to null as expected. So I think this issue is either specific to ShaperBox or possibly involving the live synth, which I'll test next. EDIT: Did the same test using two nulling instances of Session Drummer (because two instances of SI Drums already weren't nulling well), and got the expected result regardless of where Sonitus Delay or Transient Shaper were located.
  16. I installed the plugins, but the demo of ShaperBox isn't doing anything - possibly because I didn't want to install the 300MB of presets and other content. I can see that Voxengo is set to add 1000 samples of delay to the synth audio track, and I can hear that delay vs. the metronome when engaged, but with ShaperBox not doing anything, there's really nothing to hear. EDIT: As I re-read the OP, it seems I may have misunderstood and this isn't even about PDC but more about tempo-syncing. Is that right? Also, as I read that Voxengo "Latency Delay introduces 10000 samples latency itself and delays the audio signal by 10000 minus the specified amount of samples or milliseconds, thus eliminating the unreported latency", the expected behavior of the test project becomes even harder to predict. I'll have to experiment with some other tempo synced plugin.
  17. I can't reproduce a problem with my own test project, simply routing the phase-inverted one of two nulling audio tracks through a bus with a PDC-inducing plugin on it and moving that plugin back and forth between the bus and the track. I haven't yet tested with your project because I'll have to install the demo of ShaperBox and the Voxengo Delay and review what they're doing, but I see a couple of issues with the project at a glance that could be contributing:
     - SI Drums is inserted in the audio track's FX bin; this is technically supported for backward compatibility with projects created before the Synth Rack was introduced, and there are some known issues with it. The preferred method is to insert the synth in the Synth Rack with the Input of the synth audio track set to the output of the synth.
     - This issue may be peculiar to live MIDI-driven synths or even to SI Drums specifically, which tends to render transients a bit late to begin with.
     Overall, the project introduces too many variables to simply and clearly demonstrate whether PDC is working correctly on buses, including the use of three plugins, one of which is known to have poor timing, and two of which are 3rd-party and are altering that timing with unspecified custom settings.
  18. Ah, okay. I was using Cakewalk back then but didn't join the forum until around Sonar 6.
  19. 48/44.1 (a factor of ~1.09) is in between a semitone (~1.06) and a whole tone (~1.12); the math is sketched after this list. No idea how you would get a 4-semitone error.
  20. You can drop the SM/BAN click track from the demo project I shared into a new project, set the time signature to 3/4, drag the click to the timeline and you'll get the same (bad) result. Melodyne just doesn't 'get' that you only need at most one tempo change per transient, and it has trouble with material that deviates more than a couple bpm or changes rhythm even as the tempo remains perfectly constant. Melodyne can do some really amazing stuff in the pitch realm, but I have never been impressed with its tempo extraction.
  21. Remove the ASIO4ALL key from this registry path if still present: HKEY_LOCAL_MACHINE\SOFTWARE\ASIO. Then click Reset Config to Defaults under Preferences > Audio > Config File, reset the driver mode to ASIO, and reselect the Apollo driver.
  22. Essential is sufficient, and my understanding is that the drag-to-timeline function should continue working after the trial has expired, but I can't verify that because I have Studio. Personally, I prefer Set Measure/Beat At Now because Melodyne interpolates tempo changes every 8th note where they are not needed and often makes gross errors if the tempo is more than slightly variable. EDIT: Here's a quick example of two projects using the same short audio clip - one with tempos set by Melodyne and the other using Set Measure/Beat At Now. The tempo map in the Melodyne project might look "smoother" and more precise, but the metronome is terribly out of sync with the audio: SMBAN vs Melodyne Tempos.zip
  23. CbB or Sonar? It should work without having to arm the track; it sounds like the input port is not being opened as it should be by Input Echo alone. What audio interface and driver mode? EDIT: Or possibly the audio engine is not running. Verify the metronome is set to Audio (not MIDI), and try toggling the Run/Stop Engine button in the Transport Module or momentarily starting playback. If there's no MIDI/audio content in the project yet, disable 'Stop at Project End' in the track view Options menu to allow the transport to run with no track content.
  24. Clip color

    The new position should persist for the session but will reset on re-opening the project if it isn't re-saved.
  25. Clip color

    Once you save a project with a plugin UI in a different position, that position will be the new default. You can achieve the same for completely new projects by saving a template with some plugin moved to the preferred position (not sure if this persists with no plugins in the template). Workspaces may also recall a changed default position - I haven't tested that. EDIT: Project and Track Templates can also be used to define default clip colors, and the Clip Properties tab in the Track Inspector allows independently specifying custom foreground (waveform/notes) and background colors.
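
A worked version of the tempo arithmetic in item 2, using an assumed feel of 80 dotted quarters per minute (the number is illustrative, not from the post): a DAW that always counts quarter notes will display a higher bpm than the felt compound-time pulse.

    # Sketch of the quarters-vs-dotted-quarters math from item 2.
    # The felt tempo of 80 dotted quarters per minute is an assumed example value.
    dotted_quarters_per_min = 80
    eighths_per_min = dotted_quarters_per_min * 3   # a dotted quarter spans three 8ths
    quarters_per_min = eighths_per_min / 2          # a DAW "beat" is a quarter note = two 8ths
    print(quarters_per_min)                         # 120.0 bpm displayed for an 80 dotted-quarter feel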
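
To put item 3's 262-sample offset in time terms, it amounts to only a few milliseconds. The sample rates below are assumptions, since the post doesn't say which rate the project uses.

    # Sketch: convert the 262-sample offset from item 3 to milliseconds.
    samples = 262
    for sample_rate in (44100, 48000):   # assumed rates; not specified in the post
        print(sample_rate, round(samples / sample_rate * 1000, 2))   # ~5.94 ms and ~5.46 ms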
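
And the ratio arithmetic behind item 19: a 48/44.1 playback-rate mismatch shifts pitch by roughly a semitone and a half, so it can't account for a 4-semitone error on its own.

    # Sketch of the pitch math in item 19: express the 48/44.1 rate mismatch in semitones.
    from math import log2
    ratio = 48000 / 44100        # ~1.088
    semitone = 2 ** (1 / 12)     # ~1.059
    whole_tone = 2 ** (2 / 12)   # ~1.122
    print(round(ratio, 3), round(semitone, 3), round(whole_tone, 3))   # 1.088 vs 1.059 and 1.122
    print(round(12 * log2(ratio), 2))                                  # ~1.47 semitones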