
bitflipper

Members
  • Posts: 3,211
  • Joined
  • Last visited
  • Days Won: 19

Everything posted by bitflipper

  1. Bottom line first: pick a pan law and stick with it. It almost doesn't matter which one you pick, because you'll quickly become accustomed to it and hopefully never think about it again. Stick with that one pick, though, because the pan law is global, and changing it will mess with any previous projects you revisit. And of course, never change it mid-project unless you want to restart the mix from square one. As to which one is "best", most intuitive or most practical, that's been debated since the feature was first introduced in the '60s. Some prefer -3dB, some -6dB, with reasonable technical arguments for both. SSL introduced -4.5dB, not because it's better than either but merely as a compromise between them (conventional hardware consoles' pan laws are not selectable like in a DAW). The fact that it's not offered in CbB shouldn't trouble anyone in the slightest. Pan laws differ not only in whether they use 3, 4.5 or 6 decibels, but in whether they achieve that compensation by lowering the center or by raising the sides. IOW, you can keep the volume steady by adding 3dB at the extreme left and right positions, or by lowering the center by 3dB. The latter helps avoid the effect described in the OP, wherein a track unexpectedly clips just from being panned. Unfortunately, Cakewalk's default option, "0dB, sin/cos taper, constant power", suffers from this potential problem because it raises the sides rather than lowering the center.
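The gain math behind the two compensation strategies can be sketched in a few lines. This is my own illustration, not Cakewalk's actual code; the function name and the `"0dB_center"` label are mine, chosen to mirror the "0dB, sin/cos taper, constant power" option described above:

```python
import math

def pan_gains(pan, law="0dB_center"):
    """Left/right gains for a sin/cos (constant-power) pan taper.

    pan: -1.0 = hard left, 0.0 = center, +1.0 = hard right.
    law: "-3dB_center" keeps hard left/right at unity and lets the
         center settle at -3 dB; "0dB_center" keeps the center at
         unity and boosts the sides by +3 dB instead.
    """
    theta = (pan + 1.0) * math.pi / 4.0   # sweep 0..pi/2 across the pan range
    left, right = math.cos(theta), math.sin(theta)
    if law == "0dB_center":
        # Normalize so the center sits at 1.0; the extremes then land
        # at sqrt(2), i.e. +3 dB -- which is why a hot track can clip
        # just from being panned hard to one side.
        scale = 1.0 / math.cos(math.pi / 4.0)  # = sqrt(2)
        left, right = left * scale, right * scale
    return left, right
```

At center pan both laws keep perceived loudness constant; the difference is only where unity gain sits, which is exactly the clipping distinction the post describes.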
  2. The problem, Eric, is that those of us who've used SONAR 6 are all old farts who have trouble remembering what we were doing last month, let alone in 2005. Trust me, the new DAW will be a huge step up from S6 (or even 8.5, which I agree with Mark was SONAR's pinnacle). Since you're getting back into it after a long absence, you're going to have to face a learning curve anyway. Might as well put that unavoidable effort into CbB. We'll be here for ya.
  3. ^^^Your point is well-taken: when there is a discrepancy, trust the DAW over the playback device/software. Rather than asking "what's wrong with my DAW?" instead ask "what's wrong with my player?". But to their credit neither the original poster nor the user who revived the thread reflexively blamed the DAW. In fact, Thierry's (correct) instinct was to verify the file's integrity. He just went about it wrong, making a reference mix within the project rather than exporting it and then importing it back into the DAW, as suggested by Noel. Craig's suggestion is probably the most likely explanation. Note to other folks experiencing this problem - if you think this is bad, wait until you play your masterpiece back in your car. Or on earbuds, or on your friend's hi-fi, or over a PA system. It'll sound different every stinkin' time.
  4. In terms of audio quality, there is no practical difference between DX and VST. I am perfectly happy with DX plugins. And no, DX is not going away. At least, not as long as the XBox lives on.
  5. USB ports can become unresponsive after a sleep, or just due to your power scheme. You can go into Device Manager and exclude them from whatever power-saving scheme you've specified. In Device Manager, locate your USB port (it'll be called "USB Root Hub", and there will likely be more than one so do this for all of them). Right-click on each USB device, select Properties and go to the Power Management tab. If the box labeled "allow the computer to turn off this device to save power" is checked, un-check it.
  6. ^^^ This is the answer. Assuming you're not going to send your mix to a third party for mastering, the last active component in your master bus fx bin should always be a limiter, followed only by metering plugins. You'll probably want a LUFS meter at the very end, which will do a pretty good job of telling you whether the master is going to be too quiet, too hot, or somewhere comfortably within the Goldilocks Zone. You'll probably want to import some music from your favorite commercial recordings, anything that sounds particularly good in the car, and use that as your LUFS target. Setting levels for your car is a tricky business, as the car's player probably has built-in compression and EQ that the owner's manual doesn't mention. Plus the acoustics inside a car are pretty awful. So don't be surprised if your carefully mastered songs sound good only in the car, and nowhere else.
  7. Check to make sure it installed correctly and didn't fail the scan. Go to Preferences -> VST Settings -> Scan Options. Check the box labeled "Generate Scan Log", and then click the button labeled "Reset". Run the scan by clicking "Scan". This alone may do the trick, but if Synthmaster still isn't showing up, check the log. It'll be in %appdata%\Cakewalk\Logs. Open it in Notepad and search for SynthMaster.
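If the log is long, a short script beats scrolling in Notepad. This is just a generic case-insensitive grep in Python, under the assumption that the scan log is a plain text file; the function name is mine:

```python
import os
import re

def find_plugin_in_log(log_path, plugin_name):
    """Return (line number, line) pairs mentioning the plugin, case-insensitively."""
    pattern = re.compile(re.escape(plugin_name), re.IGNORECASE)
    hits = []
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for lineno, line in enumerate(f, 1):
            if pattern.search(line):
                hits.append((lineno, line.rstrip()))
    return hits

# The scan log lives under %appdata%\Cakewalk\Logs, e.g.:
log_dir = os.path.expandvars(r"%appdata%\Cakewalk\Logs")
```

Matching case-insensitively matters because the DLL filename on disk may not use the same capitalization as the product name.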
  8. "Soloing and muting verifies it is this track." Are you saying that the empty portion produces an output even when it is soloed? If so, is this a separate instrument or one of several tracks routed to a common multi-timbral instrument, e.g. Kontakt with more than one instrument loaded? Can you correlate the phantom notes to any notes in any other track? It would help if we had more information about the project, what instruments are being used, and how the routing is set up. Whenever I've confronted such mysteries, it's always turned out to be a routing problem. That can include how MIDI tracks are routed to synths, how audio is routed internally within a multi-timbral synth, or inconsistent MIDI channel assignments. Symptoms can be varied and weird, e.g. the wrong voice sounding, a silent instrument, keyswitches or CCs being ignored.
  9. Exactly what went through my mind as I listened to it. What is a piano supposed to sound like, when no two sound alike to begin with? I have a real piano, a nice one. But I don't record it. It simply doesn't sound as good as some of my sampled pianos.
  10. Could you describe the symptoms of the "output silence/vst scrambling" issue? I just had a bizarre thing happen in this Omnisphere-heavy project: an instance of Kontakt went silent after updating Omnisphere. Weird. I hesitate to bring it up here, as it might be off-topic. The problem turned out to be a routing issue - the Kontakt audio track's input source had been switched from Kontakt to another synth (Zebra2). I can't imagine a scenario in which I could have accidentally done that myself, and the cross-routing had to have occurred within the last hour.
  11. I missed the 2.7 announcement and wasn't even aware of the update (running ver. 2.6 here), so thanks for cluing me in. Yeh, I know, it clearly says "updates available" every time you start it up. Situational blindness, I guess. I have current projects that use Omnisphere, Trilian and Keyscape. I'm going to update them this weekend and see if there are new problems, then post back my results. [EDIT] I couldn't wait for the weekend. Omnisphere is too important to me to not know if it has a problem. Don't know if this is good or bad news, but I just played back a project with 16 patches in a single Omnisphere instance and there were no discernible problems. How could that be bad news? It is if you're trying to replicate a problem. Sorry, I could not.
  12. Orchestral and choir, and an appreciation for video games...sounds like you should try your hand at video game music. Over the past decade the artform has grown in both sophistication and popularity, yielding some truly memorable pieces that have broken out of the game context and are interpreted live in concerts. And brought unexpected fame to journeymen composers such as Jeremy Soule, who created the soundtrack for my all-time favorite game:
  13. They are all monsters - every one of them. All have a daunting learning curve. They will all create equally great-sounding recordings. Whatever you choose, you can be sure it will require a significant investment in time and effort to obtain fluency. The main differentiators are not what you can do, but a) how well-supported they are by both the vendor and the user community, and b) how intuitive you find the workflow. On the first measure, vendor and community support, Reaper scores very highly. So does Cakewalk. As for the second measure, only you can determine how comfortable the software is to use. For me, Reaper and I didn't click, even though I have great respect for it on a technical level. But hey, Reaper's cheap and Cakewalk's free, so grab them both and dive in. Just try to avoid getting discouraged at the outset by reminding yourself that it takes time to get rolling. You may not play video games anymore, but I still find that an occasional zombie-murdering intermission is helpful for concentration and alleviating stress.
  14. It's been a bit of a dirty secret that some editing techniques that are standard practice with pop music are also used with "pure" genres such as classical, jazz and folk. Sure, engineers working in those genres will usually strive for transparency, but then pitch correction in pop music was once meant to be unnoticeable too. They got over that self-imposed restriction pretty fast. At present, classical music production sticks to a light touch and subtle digital manipulation, but they've only recently become comfortable with admitting they do it at all. It makes sense to apply some amount of dynamic range reduction, given that people are far more likely to listen in the car while sitting in traffic, or on ear buds on a plane or train. Noise reduction seems reasonable, too. But I have to wonder if there haven't also been some discreet enhancements using EQ and reverb. This video mostly addresses editing, as opposed to processing (we can still draw a distinction between those things, for now). Mostly they talk about comping, but also mention the ability to do polyphonic pitch and timing corrections using Melodyne. There is a segment in the middle that CW users might find interesting, where the hosts attempt to discern between real instruments and virtual instruments. Spoiler: the drummer guessed wrong on the drums and the pianist guessed wrong on the piano.
  15. Better still, it would be nice to see a Sonitus revamp down the road. I understand that even though CW did not write these, they do own the source code.
  16. Well done, as always. I enjoyed it, even though I don't use either feature. First time I've heard "if you don't like the video, click Dislike twice".
  17. One of the reasons ASIO is so efficient is that it doesn't support and therefore doesn't have to deal with multiple data streams from multiple programs. The trade-off is that you can only have one ASIO device active at a time. That's why Glenn suggested WASAPI Shared. Although not quite as fast as ASIO (but close enough), it can deal with multiple sources concurrently. That's how Skype can "ring" even though you're listening to music or watching a YouTube video.
  18. What happens with CC7 is entirely up to the instrument. Even though CC7 (and CC1) are the most widely-implemented controllers, they're not universal. A synth may allow runtime conversion of CC7 to some other value, use it for a nonstandard purpose, or ignore it completely. However, there are other possible reasons for CCs in general to be ignored. Every CC command carries a MIDI channel number; if it's different from the instrument's assigned channel, the synth will ignore it. A channel mismatch can easily happen when you hand-enter a CC7 via the Event List. If that's how you're adding the CC7 event, try using an automation envelope instead, which should always carry the correct MIDI channel.
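The channel-matching rule above is baked into the MIDI wire format itself: the low nibble of a Control Change status byte (0xB0-0xBF) carries the channel. A minimal sketch (the function names are mine, for illustration):

```python
def make_cc(channel, controller, value):
    """Build a 3-byte MIDI Control Change message.

    channel: 1-16 as shown in most DAWs; the wire format uses 0-15.
    """
    status = 0xB0 | (channel - 1)        # 0xB0..0xBF = CC on channels 1..16
    return bytes([status, controller & 0x7F, value & 0x7F])

def synth_accepts(msg, synth_channel):
    """A channel-strict synth drops CCs whose channel doesn't match its own."""
    return (msg[0] & 0x0F) == (synth_channel - 1)

cc7_on_ch1 = make_cc(1, 7, 100)          # CC7 (volume) on channel 1
# A synth listening on channel 2 silently ignores it:
# synth_accepts(cc7_on_ch1, 2) -> False
```

So a hand-entered CC7 that "does nothing" is often just sitting on the wrong channel, which is invisible unless you inspect the event's channel field.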
  19. I don't generally use multi-band compressors (prefer dynamic equalizers) but I've had good results in the past with the Sonitus plugin. A pretty UI isn't everything. I have FabFilter Pro-MB with its colorful visuals, but the Sonitus multiband is quicker to dial in. Sure, I get it when people refer to the Sonitus suite's look as "dated", but if it works it works, and a good design is still a good design 20 years later. Nobody turns their nose up at a '59 Les Paul because it looks dated.
  20. I'll be curious to hear about peoples' experience running Win11 in a VM. As a software developer, I'll have no choice but to start experimenting soon, but there's no way it's going near my production machines unless safely tucked away within a VM.
  21. $5k for a new computer is a small price to pay for a center-justified taskbar. Ever notice Microsoft's surprisingly consistent pattern of alternating good and bad versions? Windows 2 was useless, Windows 3 changed the world. Win 95 made half your applications stop working, Win 98 addressed those problems. Windows ME, 'nuff said. XP was great, Vista sucked. Win 7 was troublesome but ultimately recognized as forward progress, but Win 8 was widely ridiculed for trying to make desktop displays look and act like tablets. Then Win 10 redeemed the brand again. Let's hope Win 12 swoops in to save the day.
  22. Yup, CW thinks the project ends at 5:27, even though the last MIDI event (pedal up) occurs at 2:54:17. The longest clip contains a single MIDI note at 2:32. Clearly, the clip length as defined in the cwp file does not reflect the actual distribution of events within the clip. This is not an uncommon occurrence. This is because within the project, clips are their own objects with their own start and end times not necessarily correlated to the data within them. You can see this via the following experiment. Create a new MIDI track and open it in the PRV. Draw in a note, and watch a visual representation of the clip appear in the track view. Add a second note and see that the displayed representation of the clip automatically stretches to encompass both notes. Now delete the last note. The apparent length of the clip as visualized in the track view does not change. This is a routine performance optimization - you wouldn't want the program to have to scan the track to determine its effective endpoint every time you make an edit. Plus sometimes you'll intentionally want the clip to extend beyond the last note, e.g. so a long release doesn't get truncated. Fortunately, there is an easy solution: the slip edit. That's what the feature is for, to let the user determine where the clip starts and ends irrespective of the data behind it. I did that with your project, no problem. Note that it can be difficult to get the slip-edit cursor(s) to show when the track is minimized. Try vertically expanding those too-long tracks before hovering your mouse over the trailing edge of the displayed clip to see the cursor. Drag the end of the clip back to where you want it to end. For the most part, however, trimming clips isn't necessary before exporting the entire song. Just use start and end selection points as I described earlier in this thread. If a long clip bothers you, use a slip edit and bounce.
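The clip-extent behavior described above can be modeled in a toy class. This is not Cakewalk's implementation, just a sketch of the design choice: the clip boundary is its own property, updated cheaply on insert but never rescanned on delete, and a slip edit simply overwrites it:

```python
class MidiClip:
    """Toy model: a clip's displayed extent is stored on the object,
    not recomputed from its events on every edit."""

    def __init__(self):
        self.events = []   # (start_time, duration) note pairs
        self.end = 0.0     # displayed clip end

    def add_note(self, time, duration):
        self.events.append((time, duration))
        # The extent grows to cover the new note...
        self.end = max(self.end, time + duration)

    def delete_note(self, index):
        # ...but deleting does NOT rescan the events and shrink it.
        del self.events[index]

    def slip_edit(self, new_end):
        # The user trims the clip boundary manually.
        self.end = new_end

clip = MidiClip()
clip.add_note(0.0, 1.0)
clip.add_note(150.0, 1.0)   # stray note far out; end jumps to 151.0
clip.delete_note(1)          # end is STILL 151.0 -- the "too long" clip
clip.slip_edit(1.0)          # the slip edit restores a sensible length
```

Keeping the boundary as stored state is the performance optimization the post describes, and it's also what makes intentionally long tails (e.g. for a slow release) possible.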
  23. May sound like a dumb question, but have you verified that there is actually data in the source files? I've been sent many blank files over the years.
  24. Question, resolution and self-deprecation, all in a single post. Well done! All it needed was one irrelevant and unhelpful reply to complete the perfect thread.
  25. We need to resist the normalization of normalization. Leave that to the statisticians, relational database designers and anybody still using a slide rule. Tangents are good. The most valuable lessons I've learned about electronics, recording, mixing and digital audio were acquired by accident while stumbling down some tangential path.