David Baay


  1. Set Measure/Beat At Now aligns the timeline to existing MIDI and Audio tracks without affecting the absolute timing of either. So it won't bring them into sync if they aren't already. And when you delete the tempo changes that brought the audio into sync with the timeline, the MIDI will follow, but the audio won't, so that's not going to get you what you want, either. What you want to do is align the timeline to the audio first using SM/BAN, and then import the MIDI, and it will take on the project tempo(s).
  2. The goal is just to create a 'Master' track within the project that renders the Master bus output to a WAV file, just to make sure you don't have a routing issue within the project or some problem with your export options that's causing the vocal to be dropped. Assuming everything is currently routed to your interface hardware output via a Master bus:
     - Insert > Audio Track.
     - Double-click the track name and rename it 'Master Bounce'.
     - Expand the track header, and change the Output assignment to go directly to your audio interface/soundcard output.
     - Ctrl+Shift+A to de-select everything.
     - Tracks > Bounce to Track(s)...
     - In the bounce dialog, select Master Bounce as the Destination, change the Source Category to Buses, highlight only the Master bus, and OK.
     - Mute the Master bus (grouping it in opposition with Master Bounce is not immediately necessary - just a convenience for toggling between the bounce and the 'live' mix later).
     - Start playback, and verify the vocal is present in the Master Bounce track.
     - If so, stop playback, select that track's number, and File > Export > Audio > Files of type: MP3.
  3. Indirectly, yes. It just happens that I like to use 125 bpm as a baseline for relating ticks to milliseconds at a given tempo because it's exactly 2 ticks/ms, making the math easy. Here's the full derivation; you would substitute a different 'delay' value in ms to get the simplified formula for a different delay:
     ticks = Tempo (beats/minute) x 960 (ticks/beat) / 60000 (ms/minute) x 50 (ms)
           = Tempo x 0.8
           = Tempo / 1.25
     EDIT: And, of course, you need to enter a negative value in Time+ to advance the AmpleGuitars track, effectively delaying the rest of the project relative to it; so the formula gets a minus sign: - Tempo / 1.25
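     The arithmetic above can be sketched in Python (illustrative only - the function name is mine, and it assumes Cakewalk's 960 ticks-per-quarter-note timebase):

     ```python
     PPQ = 960  # ticks per beat (assumed Cakewalk timebase)

     def time_plus_offset(tempo_bpm, delay_ms):
         """Ticks equivalent to delay_ms at tempo_bpm:
         tempo (beats/min) x PPQ (ticks/beat) x delay (ms) / 60000 (ms/min)."""
         return tempo_bpm * PPQ * delay_ms / 60000

     print(time_plus_offset(125, 50))  # 100.0 ticks, i.e. Tempo / 1.25
     # Negate the result to advance the track rather than delay it.
     ```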
  4. Try going to Properties of non-working VSTis in Plugin Manager, and setting "Do not intercept NRPNs".
  5. When you say "I can play the notes which are showing on the piano roll and they make a sound", do you mean on a hardware controller, on the PRV keyboard, or by clicking the notes in the PRV? Other possibilities:
     - Clips muted.
     - Notes muted.
     - MIDI channel mismatch.
     - Instrument attack too slow for the duration of the MIDI notes.
     - Instrument's velocity response curve requires a high minimum velocity to be heard.
     - Wayward controllers silencing the synth.
     - MIDI Prepare Using buffer too low.
  6. Positive values delay playback of the offset track against the rest of the project, and negative values advance it. But it's a fixed value in MIDI ticks, so you'll have to calculate the offset for a given, fixed project tempo. To advance the Ampleguitars track by 50ms: Offset = - Tempo / 2.5
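     As a hedged sketch (the function name is mine, not a Cakewalk API), the same arithmetic parameterized by the project timebase shows where the divisor comes from: a 50 ms advance works out to -Tempo/2.5 at 480 ticks/beat and -Tempo/1.25 at 960:

     ```python
     def advance_ticks(tempo_bpm, ppq, advance_ms):
         # Negative Time+ value that advances a track by advance_ms at
         # tempo_bpm, given the project timebase ppq in ticks per beat.
         return -(tempo_bpm * ppq * advance_ms) / 60000

     print(advance_ticks(100, 480, 50))  # -40.0, i.e. -Tempo / 2.5
     print(advance_ticks(100, 960, 50))  # -80.0, i.e. -Tempo / 1.25
     ```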
  7. Yes it's available on Instrument tracks. Click the MIDI tab at the bottom of the Track Inspector (keyboard shortcut 'I') to see the MIDI controls for an Instrument track. Time+ is the third field up from the bottom of the righthand pane of the Inspector.
  8. Yes, I reported this myself back in 2016. I know that clip tempo maps were saved properly in earlier versions, but I'm not sure exactly when it broke. Now you have to get all your work done in a single session, which can be problematic since making a lot of edits to a clip map tends to cause a hang in my experience, and you can't go back to an intermediate save. If possible - depending on the situation - it might be best to break up long clips, finish work on one section at a time, and bounce them down as you go.
  9. A quick search found a 5-year-old thread that seems to suggest the Amplesound synth engine has huge latency but doesn't report plugin delay to the host, so you have to manually offset existing MIDI that's driving the Amplesound synth 50ms early to have Amplesound play in sync with other tracks. This sounds pretty goofy, if still applicable, but that's my reading of it: http://forum.cakewalk.com/Ample-Guitar-great-sound-but-latency-m2989102.aspx All of this might no longer be applicable. Have you tried playing it in real time? If still necessary, MIDI can be offset using the Time+ control in the Track Inspector (-100 ticks = 50ms early at 125bpm). But it can't offset an event that's at 1:01:000, so you might have to start your whole project at 2:01:000. Alternatively, you could create a 'Delay' bus through which every audio/synth track in your project except Amplesound is routed to get things in sync, but you wouldn't be able to do any recording in real time.
  10. I understand that you want to experiment with the relative timing of rhythm tracks, but I'm not understanding why you would do it by moving waveforms within clips rather than just moving the clips themselves...? A feature that will help you with tweaking relative timing if you move clips instead of waveforms is 'Nudge', which lets you define three different increments for moving clips earlier or later by small amounts using the numeric keypad: Nudge Documentation
  11. True. I kind of overlooked that you specifically mentioned using the mouse in the PRV, and was thinking you were referring to much older pre-Windows flavors of Cakewalk that wouldn't have had a scrub tool.
  12. Not a problem, Larry. And I should clarify that my comment about other things being more determinative of the emotive impression made by a piece was also directed at the premise of that Ledgernote.com page that Joe shared, which proposes that different keys have distinct (and definable) emotional characters beyond major = happy and minor = sad. Even that convention isn't totally consistent across all cultures.
  13. I think you might be right about SM/BAN not always handling durations correctly when they cross 'set' points. I recall looking into this once, but don't remember if it was with Fit Improv or SM/BAN. But I also often record solo piano without a click using liberal rubato and sustain, and have rarely encountered problems with notes getting unintentionally sustained (or failing to sustain) because things got out of sync. But I guess it might depend on your playing style. At a glance, placement of pedal events in the snippet you shared looked pretty typical. I'd be interested in looking at an example that went wrong in Fit Improv. As I mentioned in the other thread, SM/BAN doesn't force you to set every beat, which might help avoid the problem. EDIT: Here's a typical solo piece of mine for reference, though I don't think I ever got around to 'setting' this one. The main reason I would do it with a piece like this would be to convert the MIDI to notation.
  14. Agreed. SM/BAN pops an error if a Now time is already 'set' or if setting a point would require a tempo outside the supported range (8-1000 bpm, IIRC). Seems Fit Improv could easily do that. Yeah, SM/BAN is a little more typing-intensive, and won't guess a resolution finer than a quarter-note beat. But the advantage is that you can tab to notes and set points precisely at note starts, and also set intermediate beat values in cases where few notes are falling on a beat or you need to align an accel or rit. And - maybe more importantly - you're not forced to set every beat if you just want to tighten up tempo variation in specific measures, and let it flow elsewhere. Just remember that fractional beats are decimal, not ticks, so the 8th note at 1:480 of a measure would be set to beat 1.500.
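      The decimal-beat conversion can be sketched like this (illustrative Python, assuming a 960-tick timebase; the function name is mine):

      ```python
      PPQ = 960  # ticks per beat (assumed timebase)

      def decimal_beat(beat, tick):
          # Convert a Beat:Tick position to the decimal beat SM/BAN expects.
          return beat + tick / PPQ

      print(decimal_beat(1, 480))  # 1.5 -> enter as beat 1.500
      ```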