
David Baay


Everything posted by David Baay

  1. After you Write the preset to the User bank, click the instrument name again and choose Save Bank. Save the .GMN file with an appropriate name. Note, however, that CbB does not automatically reload the saved bank file when you re-open the project. The instrument assignment will have been saved independently in the project, but if you want to load presets from the saved bank to another channel, or change the instrument assigned to the current channel using some other preset in your custom bank, you will need to manually Load the saved file into the current session.
  2. Go back to the original 75 BPM project with the synced audio and MIDI.
     - Set the Now time at 76:01.
     - Shift+M to bring up Set Measure/Beat At Now, enter measure 101, beat 4, and OK.
     - CbB will reset the initial tempo to 100.75 without affecting the playback timing of either audio or MIDI.
     - It will also add a matching tempo node at 101:04, which you can delete.
     Alternatively, to have CbB find a possibly more precise average tempo:
     - Let the project play with the metronome disabled and count bars out to the last downbeat.
     - Stop the transport, zoom in, and place the Now time precisely on the downbeat transient.
     - Shift+M, enter that measure, beat 1, and OK.
     - CbB will calculate the tempo needed to make that measure/beat hit that Now time and change the project tempo accordingly.
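For anyone curious about the arithmetic behind Set Measure/Beat At Now, here's a rough Python sketch of it (function and variable names are my own, not CbB's; 4/4 time and a single fixed tempo assumed):

```python
# Sketch of the Set Measure/Beat At Now arithmetic, assuming 4/4 time
# and one fixed project tempo. Names here are illustrative, not CbB's.

def beats_elapsed(measure, beat, beats_per_measure=4):
    """Beats from the start of the project to the given measure:beat."""
    return (measure - 1) * beats_per_measure + (beat - 1)

def retempo(old_bpm, now_measure, now_beat, new_measure, new_beat):
    """Tempo that relabels the Now time at now_measure:now_beat as
    new_measure:new_beat without changing its wall-clock position."""
    now_seconds = beats_elapsed(now_measure, now_beat) * 60.0 / old_bpm
    new_beats = beats_elapsed(new_measure, new_beat)
    return new_beats * 60.0 / now_seconds

# The example from the post: Now time at 76:01 in a 75 BPM project,
# redefined as measure 101, beat 4.
print(retempo(75, 76, 1, 101, 4))  # → 100.75
```

Measure 76 beat 1 is 300 beats in, which at 75 BPM is 240 seconds; fitting 403 beats (101:04) into those same 240 seconds requires 100.75 BPM, which matches what CbB reports.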
  3. I made an articulation map for the Kontakt Factory Library "Flute (all)" patch based on the following "style" info and did a quick test:
     case (24) message ("Staccato 1") set_text ($info, "Staccato 1")
     case (25) message ("Staccato 2") set_text ($info, "Staccato 2")
     case (26) message ("Sustain") set_text ($info, "Sustain")
     case (27) message ("Sforzando") set_text ($info, "Sforzando")
     case (28) message ("Fortepiano") set_text ($info, "Fortepiano")
     I found the volume variation with velocity worked pretty much as expected for all articulations. The Staccato, Sforzando and Fortepiano articulations are initially a bit louder due to their emphasized attacks, and indeed, the lowest level you can get at the default MIDI volume of 101 is around -36dB for these articulations vs. about 10dB lower for the Sustain articulation. -36dB is pretty quiet at normal monitoring levels, but if you need to go lower, you can just lower the instrument's overall level using either MIDI Volume or Expression. For example, lowering the MIDI track Volume to 64 allowed a minimum level for a single Staccato note of about -50dB and a maximum of -19dB - a 30dB dynamic range which seems reasonable for a flute.
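As a side note on how MIDI Volume values translate to dB: the General MIDI spec recommends a default gain curve of 40·log10(cc/127). Kontakt evidently uses a steeper response (the drop I observed going from 101 to 64 was around 14dB vs. roughly 8dB for the GM curve), so treat this only as a ballpark sketch:

```python
import math

# GM-recommended default mapping of CC7 (Volume) / CC11 (Expression)
# to gain in dB: 40 * log10(cc / 127). Individual synths, Kontakt
# included, may apply a different curve, so this is a ballpark only.

def cc_to_db(cc):
    return 40 * math.log10(cc / 127)

print(round(cc_to_db(127), 1))  # 0.0 (full volume)
print(round(cc_to_db(101), 1))  # about -4 dB
print(round(cc_to_db(64), 1))   # about -11.9 dB
```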
  4. Glad to help. Incidentally, it sounds like you manually inserted and adjusted tempos to fit the timeline to the musical tempos. In future - or maybe even in this project - you might want to look into using Set Measure/Beat At Now (Shift+M) to build the tempo map. You just tell CbB where the beat transients should fall, and it calculates and inserts the tempo changes for you to make that happen. I'm a huge advocate of this technique to allow recording without a click and then matching the timeline to the performance and tightening it up as needed.
  5. That's kind of what I was getting at: In the absence of video, you could get a very comparable aural result from far less hardware (though, admittedly I skipped ahead at some point - probably right around the 6-minute mark - to see how the video would end, so maybe I missed something important...?). The video content is the only thing that made me listen for more than a minute, and it wasn't captivating enough to make me endure the full musical composition.
  6. Kontakt should be able to scale volume/loudness more or less continuously across the velocity range regardless of the number of velocity layers. What you typically hear with a limited number of velocity layers is sudden transitions in timbre, not volume. I haven't played around with articulation maps much at all, but if the OP shares the .artmap file, I can give it a shot with the Kontakt Factory Library and see if I can reproduce and/or diagnose the problem.
  7. What keyboard is it, and are you monitoring the keyboard's onboard sound or using a soft synth? And if it's onboard sound, is it triggered by Local Control or by MIDI echoed through a track in CbB and back to the keyboard synth? I've seen the opposite problem with some soft synths, where sustained notes are stopped by All Notes Off (CC123) sent from the keyboard when the last key is released (not all keyboards do this, but my Roland RD-300s did). This is a MIDI programming error specific to some synths (e.g. TruePianos). But I've never heard of this happening the other way around as you describe it. Incidentally, just to be clear, releasing the sustain pedal should cause the keyboard to send CC64=0 (or a series of events down to 0 if the pedal/keyboard implements continuous sustain), not Note Off(s). EDIT: I see from your other post about the MIDI problem that it's a controller-only M-Audio Hammer 88. Strangely, the manual does not have a MIDI Implementation chart, so I can't tell if it ever sends CC123, but it does indicate that the sustain is On/Off only. So the big question is what piano synth are you using?
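For clarity, here are the raw bytes involved - releasing the pedal is a Control Change (CC64 = 0), a completely different message from a Note Off. Just an illustrative sketch; the helper names are my own:

```python
# Illustrative sketch of the raw MIDI bytes being discussed. Releasing
# the sustain pedal produces a Control Change (CC64 = 0), not Note Offs.

def control_change(channel, controller, value):
    """Build a 3-byte Control Change message (channel is 0-15)."""
    return bytes([0xB0 | channel, controller, value])

def note_off(channel, note, velocity=0):
    """Build a 3-byte Note Off message, for contrast."""
    return bytes([0x80 | channel, note, velocity])

sustain_off   = control_change(0, 64, 0)   # pedal released: CC64 = 0
all_notes_off = control_change(0, 123, 0)  # CC123, which some keyboards send
middle_c_off  = note_off(0, 60)            # an actual Note Off

print(sustain_off.hex())    # b04000
print(all_notes_off.hex())  # b07b00
```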
  8. Loopback issues:

    Maybe using onboard sound? Most internal sound chips/cards have an option in the mixer to use the output as an input for recording - commonly called "What U Hear" or "Stereo Mix". Otherwise you would need to physically patch outputs to inputs on your Focusrite.
  9. Yeah, I'm pretty sure you could get that out of a laptop running just CbB with a copy of NI Komplete Standard and a 49-key MIDI controller. ;^) Just don't ask me to troubleshoot "No sound" from that setup. More than 'shades' of Vangelis.
  10. I agree the amount of change he's dealing with is pretty small, especially considering it's over 16 bars. But if he cuts it up and moves parts around without allowing them to follow tempo, or flattening the tempo overall, he's going to have gaps/overlaps at clip boundaries and sync problems if he moves pieces of individual tracks. In this context, it could be audible if he doesn't do one or the other.
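To put a rough number on the drift (hypothetical tempos, 4/4 assumed):

```python
# Back-of-the-envelope drift calculation with hypothetical tempos, 4/4
# assumed: a 16-bar clip rendered at one tempo no longer lines up when
# the project tempo changes, and the offset accumulates over its length.

def bars_to_seconds(bars, bpm, beats_per_bar=4):
    return bars * beats_per_bar * 60.0 / bpm

length_at_120 = bars_to_seconds(16, 120)  # 32.0 s
length_at_121 = bars_to_seconds(16, 121)  # about 31.74 s
drift = length_at_120 - length_at_121
print(round(drift, 2))  # roughly a quarter second at the clip boundary
```

Even a 1 BPM mismatch over 16 bars leaves a gap or overlap on the order of a quarter second - easily audible at a clip boundary.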
  11. - Save As a copy of the project with "Flattened Tempos" appended to the name, because you're going to be deleting your painstakingly matched tempo changes.
      - Select all audio clips.
      - Open the AudioSnap section of Clip Properties in the Inspector and check the 'Enable' box.
      - In 'Follow Options', choose 'Auto-Stretch' (this mode stretches audio uniformly between tempo changes without regard to beats).
      - Check the 'Follow Proj Tempo' box above Follow Options to enable audio stretching to follow changes to project tempo(s).
      - Switch to the Tempo List in the Inspector and delete all the variable tempos, leaving just the initial one to be the fixed tempo (you can then adjust that one tempo as needed).
      - Experiment with different stretching algorithms on different tracks (see the AudioSnap documentation).
      - When you're satisfied with the result, you'll probably want to render the stretching permanent with Bounce to Clip(s) to get the best possible audio quality.
      - As before, I recommend doing destructive things like Bounce to Clip(s) in a new copy of the project so you can revert to the non-destructively stretched version if you determine something needs to be fixed later.
  12. When using a keyboard synth, it's usually best to set it up as if it were a separate keyboard controller and sound module. Disable Local Control of the internal sounds from the keyboard and echo the MIDI through a MIDI track in CbB and back to the keyboard synth. For drums, the keyboard synth will probably want to see MIDI on channel 10, but you don't have to change the transmit channel on the keyboard; you can just set the forced output Channel in the MIDI track to channel 10, and leave the Input set to channel 1 of the keyboard's MIDI port. The same goes for any other instrument sound you might want to use from the keyboard, assuming it's multi-timbral - just set the output channel of a dedicated MIDI track to the relevant channel in the multi-timbral patch setup. You'll probably also want to find and load an Instrument Definition for the keyboard synth so you can set patches from the MIDI track and always have the right sounds on the right channels for a particular project. You can continue to use 'live', real-time audio output of the synth from the recorded MIDI while developing a project, and only commit to recorded audio when the arrangement and MIDI editing are complete. Just don't make the mistake I've made in the past of only recording the final Master and never recording the individual synth tracks, such that I couldn't revisit a project later because the synth had died.
  13. When you have Always Echo Current MIDI Track enabled in Preferences > MIDI > Playback and Recording (on by default), the currently focused track will show an 'A' for 'automatic' in the enabled Input Echo button. If you click that button, the 'A' will go away, indicating that Input Echo is forced On and will continue echoing even when focus moves away. This allows having more than one track echoing input (for purposes of layering, etc.). So check that other tracks don't have the Input Echo button lit. Otherwise, you should not be encountering this problem, and it should not be necessary to assign specific input channels. But I usually do recommend choosing channel 1 for everything to avoid doubling in the case that some MIDI source is sending on more than one channel (as my old RD-300s did by default).
  14. I can confirm that inconsistency with Send Level and Pan controls. Double-clicking anywhere in any other control resets it and never opens text entry. With track/bus Volume controls you can double-click the numeric value to enter text, but I usually use Enter on the keyboard to access text entry for any control, regardless.
  15. Yes, that's my understanding as well. And the change is pretty subtle and would probably only be really noticeable if it's out of sync with some other timing reference. So... what's the goal?
  16. I've sometimes had only the 'touched' track change, but always attributed it to working too fast and releasing Ctrl a moment before making the selection. I don't recall ever seeing some-but-not-all of them change; are you sure about that?
  17. More like an 'oversight' than a 'decision' probably. If they had taken the time signature into account, it would have been easy to specify "Beats in Clip" based on that, and probably not that difficult to implement after the fact.
  18. Sorry, but this is just wrong. The denominator is the value of a 'beat' (i.e. where you would naturally tap your foot or count along in your head), and the numerator is the number of beats/taps/counts in a measure. 7/4 isn't "7 notes played across 4 beats" it's 7 quarter-note beats in a measure.
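In other words (assuming the tempo counts quarter notes, as it does in most DAWs):

```python
# Sketch of the meter arithmetic described above, assuming tempo (BPM)
# counts quarter notes. The numerator is beats per measure; the
# denominator is the note value of each beat.

def measure_seconds(numerator, denominator, bpm):
    """Length of one measure: numerator beats, each a 1/denominator note."""
    quarter_notes_per_beat = 4.0 / denominator
    return numerator * quarter_notes_per_beat * 60.0 / bpm

print(measure_seconds(7, 4, 60))   # 7 quarter-note beats at 60 BPM = 7.0 s
print(measure_seconds(4, 4, 120))  # a standard 4/4 bar at 120 BPM = 2.0 s
```

So a 7/4 bar at 60 BPM is simply seven quarter-note beats long, not seven notes squeezed into four beats.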
  19. Save As is a feature of pretty much all Windows content-creation apps that allows saving a copy of the 'document' currently in memory with a new file name without affecting the file from which it was loaded. What I typically do is Save As periodically while working and add a filename suffix to the base song name, referencing the last major action/edit or two, especially immediately after recording content that I don't want to lose to a crash, and before doing something destructive that can't be easily undone after making a bunch of other edits or closing the session, like quantizing MIDI or running a CAL script. You can Save As all the versions to the same project folder so they all share the same audio files. The project files themselves generally remain pretty small, so it doesn't use a lot of space. When a project gets to the point that you feel it's "done", you can save it to a new project folder with Copy All Audio enabled to save only the files used by that final version. But I generally just keep everything in the original folder as a record of how a project evolved and a resource for raw recordings that I might re-visit later using different instrumentation.
  20. I've never lost MIDI spontaneously in all my years except by my own mistake. If you're seeing this in the PRV, it could be due to having Hide Muted Clips enabled, but that would not affect the Track View. The most likely way of actually deleting something indirectly is by having Ripple All enabled when you delete something in another track. Other possibilities would be moving it to a hidden track, inadvertently slip-editing the clip down to nothing, or maybe executing Bounce to Clip(s) while it's muted. For future reference, I always advise using Save As to preserve earlier versions of a project as it develops in order to avoid saving irreversible edits. If you haven't already re-saved the current session, you should Save As with a new name now and check the last saved version.
  21. It's not clear to me that this is reproducible enough to conclude that. I'd be inclined to think he's running up against some sort of resource management threshold, and that it might coincidentally have worked just as well to export Entire Mix at that particular moment. I'd want to see that there's a consistent difference between exporting just the Master bus and exporting the 'Main Outs' that the Master routes to and/or Entire Mix, and also that it's reproducible with different projects. A lot of users export the default Entire Mix without knowing any better and without encountering any issue.
  22. As I said, Sonar creates a virtual bus for each hardware output in the system and uses that to mix down. The driver is not involved. The fact that you can export to a file format that your interface doesn't support - and even with no interface/driver loaded at all - testifies to this.
  23. If I understand correctly, you're conflating parallel processing with parallel routing. What you're describing is parallel routing . Parallel routing of signals through FX plugins is likely to sound vastly different from serial routing, and isn't generally desirable except for specific purposes like adding reverb and parallel compression. And parallel processing should not depend on (or benefit from) parallel routing.
  24. The soundcard is not involved either way. CbB creates a virtual hardware bus for each hardware output on the interface (hidden by default but viewable in the Console by dragging the splitter at the far right to the left), and uses this to mix down if the Source for the export/bounce is Hardware Outputs or Entire Mix (all hardware outputs combined). Like Glenn, I don't know why you would have a problem exporting some or all hardware outputs, but I always recommend exporting with Source = Buses, and only the Master selected, so you don't have trouble with something like a headphone mix on another hardware out getting included in the export.