Everything posted by bitflipper
-
"Soloing and muting verifies it is this track." Are you saying that the empty portion produces an output even when it is soloed? If so, is this a separate instrument or one of several tracks routed to a common multi-timbral instrument, e.g. Kontakt with more than one instrument loaded? Can you correlate the phantom notes to any notes in any other track? It would help if we had more information about the project, what instruments are being used, and how the routing is set up. Whenever I've confronted such mysteries, it's always turned out to be a routing problem. That can include how MIDI tracks are routed to synths, how audio is routed internally within a multi-timbral synth, or inconsistent MIDI channel assignments. Symptoms can be varied and weird, e.g. the wrong voice sounding, a silent instrument, keyswitches or CCs being ignored.
-
Exactly what went through my mind as I listened to it. What is a piano supposed to sound like, when no two sound alike to begin with? I have a real piano, a nice one. But I don't record it. It simply doesn't sound as good as some of my sampled pianos.
-
Could you describe the symptoms of the "output silence/vst scrambling" issue? I just had a bizarre thing happen in this Omnisphere-heavy project: an instance of Kontakt went silent after updating Omnisphere. Weird. I hesitate to bring it up here, as it might be off-topic. The problem turned out to be a routing issue - the Kontakt audio track's input source had been switched from Kontakt to another synth (Zebra2). I can't imagine a scenario in which I could have accidentally done that myself, and the cross-routing had to have occurred within the last hour.
-
I missed the 2.7 announcement and wasn't even aware of the update (running ver. 2.6 here), so thanks for cluing me in. Yeh, I know, it clearly says "updates available" every time you start it up. Situational blindness, I guess. I have current projects that use Omnisphere, Trilian and Keyscape. I'm going to update them this weekend and see if there are new problems, then post back my results.

[EDIT] I couldn't wait for the weekend. Omnisphere is too important to me to not know if it has a problem. Don't know if this is good or bad news, but I just played back a project with 16 patches in a single Omnisphere instance and there were no discernible problems. How could that be bad news? It is if you're trying to replicate a problem. Sorry, I could not.
-
What about for mixing classical and jazz from Musescore?
bitflipper replied to Keith R. Starkey's question in Q&A
Orchestral and choir, and an appreciation for video games...sounds like you should try your hand at video game music. Over the past decade the artform has grown in both sophistication and popularity, yielding some truly memorable pieces that have broken out of the game context and are interpreted live in concerts. And brought unexpected fame to journeymen composers such as Jeremy Soule, who created the soundtrack for my all-time favorite game: -
What about for mixing classical and jazz from Musescore?
bitflipper replied to Keith R. Starkey's question in Q&A
They are all monsters - every one of them. All have a daunting learning curve. They will all create equally great-sounding recordings. Whatever you choose, you can be sure it will require a significant investment in time and effort to obtain fluency. The main differentiators are not what you can do, but a) how well-supported they are by both the vendor and the user community, and b) how intuitive you find the workflow. On the first measure, vendor and community support, Reaper scores very highly. So does Cakewalk. As for the second measure, only you can determine how comfortable the software is to use. For me, Reaper and I didn't click, even though I have great respect for it on a technical level. But hey, Reaper's cheap and Cakewalk's free, so grab them both and dive in. Just try to avoid getting discouraged at the outset by reminding yourself that it takes time to get rolling. You may not play video games anymore, but I still find that an occasional zombie-murdering intermission is helpful for concentration and alleviating stress. -
It's been a bit of a dirty secret that some editing techniques that are standard practice with pop music are also used with "pure" genres such as classical, jazz and folk. Sure, engineers working in those genres will usually strive for transparency, but then pitch correction in pop music was once meant to be unnoticeable too. They got over that self-imposed restriction pretty fast. At present, classical music production sticks to a light touch and subtle digital manipulation, but they've only recently become comfortable with admitting they do it at all. It makes sense to apply some amount of dynamic range reduction, given that people are far more likely to listen in the car while sitting in traffic, or on earbuds on a plane or train. Noise reduction seems reasonable, too. But I have to wonder if there haven't also been some discreet enhancements using EQ and reverb.

This video mostly addresses editing, as opposed to processing (we can still draw a distinction between those things, for now). Mostly they talk about comping, but also mention the ability to do polyphonic pitch and timing corrections using Melodyne. There is a segment in the middle that CW users might find interesting, where the hosts attempt to discern between real instruments and virtual instruments. Spoiler: the drummer guessed wrong on the drums and the pianist guessed wrong on the piano.
-
Better still, it would be nice to see a Sonitus revamp down the road. I understand that even though CW did not write these, they do own the source code.
-
Well done, as always. I enjoyed it, even though I don't use either feature. First time I've heard "if you don't like the video, click Dislike twice".
-
Can I route audio from Cakewalk to another program on my PC?
bitflipper replied to Roy Z's topic in Production Techniques
One of the reasons ASIO is so efficient is that it doesn't support, and therefore doesn't have to deal with, multiple data streams from multiple programs. The trade-off is that you can only have one ASIO device active at a time. That's why Glenn suggested WASAPI Shared. Although not quite as fast as ASIO (but close enough), it can deal with multiple sources concurrently. That's how Skype can "ring" even though you're listening to music or watching a YouTube video.
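If you like tinkering, here's a minimal sketch of what "shared mode" means in practice, using the python-sounddevice package (an assumption on my part - install it first, and it only applies on Windows when the selected device is a WASAPI device). Flipping exclusive to True is the aggressive, ASIO-like behavior that locks everyone else out.

```python
# Minimal sketch: open a WASAPI output stream in SHARED mode so other apps
# (Skype, a browser, etc.) can keep using the same device. Assumes the
# python-sounddevice package and a Windows/WASAPI output device.
import sounddevice as sd

wasapi_shared = sd.WasapiSettings(exclusive=False)  # True = grab the device exclusively

def callback(outdata, frames, time, status):
    # Fill the buffer with silence; a real app would render audio here.
    outdata.fill(0)

with sd.OutputStream(samplerate=48000,
                     channels=2,
                     dtype='float32',
                     extra_settings=wasapi_shared,
                     callback=callback):
    sd.sleep(1000)  # keep the stream open for one second
```
-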
What happens with CC7 is entirely up to the instrument. Even though CC7 (and CC1) are the most widely implemented controllers, they're not universal. A synth may allow runtime conversion of CC7 to some other value, use it for a nonstandard purpose, or ignore it completely. However, there are other possible reasons for CCs in general to be ignored. Every CC command carries a MIDI channel number; if it's different from the instrument's assigned channel, the synth will ignore it. That kind of mismatch can easily happen when you hand-plant a CC7 via the Event List. If that's how you're adding the CC7 event, try using an automation envelope instead, which should always have the correct MIDI channel.
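To make the channel-mismatch point concrete, here's a tiny illustrative sketch (not any particular synth's API): the channel is baked into the status byte of every Control Change message, and a receiver listening on a different channel simply drops it.

```python
# A MIDI Control Change status byte is 0xB0 plus the channel (0-15);
# data byte 1 is the controller number (7 = volume), data byte 2 is the value.
def make_cc(channel, controller, value):
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

def receive(message, listen_channel):
    """Crude model of a synth that only honors CCs on its assigned channel."""
    status, controller, value = message
    if status & 0xF0 != 0xB0:
        return None                    # not a Control Change
    if status & 0x0F != listen_channel:
        return None                    # wrong channel -> silently ignored
    return controller, value

cc7_on_ch1 = make_cc(channel=0, controller=7, value=100)   # MIDI channel 1
print(receive(cc7_on_ch1, listen_channel=0))  # (7, 100) -- honored
print(receive(cc7_on_ch1, listen_channel=1))  # None -- looks like a "dead" CC7
```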
-
I don't generally use multi-band compressors (prefer dynamic equalizers) but I've had good results in the past with the Sonitus plugin. A pretty UI isn't everything. I have FabFilter Pro-MB with its colorful visuals, but the Sonitus multiband is quicker to dial in. Sure, I get it when people refer to the Sonitus suite's look as "dated", but if it works it works, and a good design is still a good design 20 years later. Nobody turns their nose up at a '59 Les Paul because it looks dated.
-
I'll be curious to hear about peoples' experience running Win11 in a VM. As a software developer, I'll have no choice but to start experimenting soon, but there's no way it's going near my production machines unless safely tucked away within a VM.
-
$5k for a new computer is a small price to pay for a center-justified taskbar. Ever notice Microsoft's surprisingly consistent pattern of alternating good and bad versions? Windows 2 was useless, Windows 3 changed the world. Win 95 made half your applications stop working, Win 98 addressed those problems. Windows ME, 'nuff said. XP was great, Vista sucked. Win 7 was troublesome but ultimately recognized as forward progress, but Win 8 was widely ridiculed for trying to make desktop displays look and act like tablets. Then Win 10 redeemed the brand again. Let's hope Win 12 swoops in to save the day.
-
Yup, CW thinks the project ends at 5:27, even though the last MIDI event (pedal up) occurs at 2:54:17. The longest clip contains a single MIDI note at 2:32. Clearly, the clip length as defined in the cwp file does not reflect the actual distribution of events within the clip. This is not an uncommon occurrence, because within the project, clips are their own objects with their own start and end times, not necessarily correlated to the data within them.

You can see this via the following experiment. Create a new MIDI track and open it in the PRV. Draw in a note, and watch a visual representation of the clip appear in the track view. Add a second note and see that the displayed representation of the clip automatically stretches to encompass both notes. Now delete the last note. The apparent length of the clip as visualized in the track view does not change. This is a routine performance optimization - you wouldn't want the program to have to scan the track to determine its effective endpoint every time you make an edit. Plus sometimes you'll intentionally want the clip to extend beyond the last note, e.g. so a long release doesn't get truncated.

Fortunately, there is an easy solution: the slip edit. That's what the feature is for, to let the user determine where the clip starts and ends irrespective of the data behind it. I did that with your project, no problem. Note that it can be difficult to get the slip-edit cursor(s) to show when the track is minimized. Try vertically expanding those too-long tracks before hovering your mouse over the trailing edge of the displayed clip to see the cursor. Drag the end of the clip back to where you want it to end.

For the most part, however, trimming clips isn't necessary before exporting the entire song. Just use start and end selection points as I described earlier in this thread. If a long clip bothers you, use a slip edit and bounce.
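If it helps to picture it, here's a toy model of that behavior (my own sketch, not Cakewalk's actual internals): the clip tracks its own end time, which grows when you add notes but never shrinks when you delete them, and a slip edit or bounce is what pulls it back to the real content.

```python
# Toy model of a clip whose boundary is independent of the events inside it.
from dataclasses import dataclass, field

@dataclass
class Clip:
    start: float
    end: float                                   # clip boundary, NOT the last event's time
    notes: list = field(default_factory=list)    # (time, duration) pairs

    def add_note(self, time, duration):
        self.notes.append((time, duration))
        self.end = max(self.end, time + duration)   # grows automatically...

    def delete_note(self, index):
        del self.notes[index]                       # ...but never shrinks here

    def slip_edit_to_content(self):
        """What a manual slip edit (or bounce) effectively does."""
        self.end = max((t + d for t, d in self.notes), default=self.start)

clip = Clip(start=0.0, end=0.0)
clip.add_note(0.0, 1.0)
clip.add_note(152.0, 1.0)      # a stray note way out near 2:32, say
clip.delete_note(1)            # remove it again
print(clip.end)                # still 153.0 -- the project "ends" late
clip.slip_edit_to_content()
print(clip.end)                # 1.0 -- back where the real content ends
```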
-
May sound like a dumb question, but have you verified that there is actually data in the source files? I've been sent many blank files over the years.
-
UJAM Carbon plugin not being seen by Cakewalk?
bitflipper replied to Jellybeantiger's topic in Instruments & Effects
Question, resolution and self-deprecation, all in a single post. Well done! All it needed was one irrelevant and unhelpful reply to complete the perfect thread.
-
We need to resist the normalization of normalization. Leave that to the statisticians, relational database designers and anybody still using a slide rule. Tangents are good. The most valuable lessons I've learned about electronics, recording, mixing and digital audio were acquired by accident while stumbling down some tangential path.
-
I looked at the project. Before I fix the problem, tell me where the observed project end time is and what time you'd like it to be. I show the track labeled "Split Note F#3" to be the longest, ending at 5:26. Is that what you see?
-
Well, you've piqued my curiosity so I had to go over to Bandcamp and give a listen. There's still plenty of bass in there, particularly the kick drum. It actually sounds pretty good, though. Maybe the answer lies in compression rather than more filtering.
-
On the rare occasions when I use synthetic drums, the only vanilla processing I apply is EQ, often to reduce that "unnaturally hyped high end" you describe. More often, the fx will be things that make them sound even more unnatural, such as delays, reverb, distortion and modulators.

Acoustic instruments are far more tonally complex and dynamic, which makes them well suited to treatments that either highlight or hide the many overtones that are in there, and/or their dynamic characteristics. Electronic percussion just doesn't have that depth. So acoustic fx tend to be subtractive in nature, while electronic drum fx tend to be additive.

Also consider combining electronic and acoustic drums. You can end up with an instrument that sounds like it might exist in the real world, but with an interesting twist. The classic example is mixing an 808-style gated sine "kick" under a real kick drum sample, for a deeper low-frequency component. But you can do the same thing with snares and toms. There's no rule that says electronic music must be 100% electronic.
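For anyone who'd rather do the layering trick offline than in the DAW, here's a rough sketch of the idea: a short, exponentially decaying sine "sub" summed under a kick sample. It assumes numpy and soundfile are installed, and "kick.wav" is just a placeholder name for whatever mono kick hit you have lying around.

```python
# Layer an 808-style decaying sine under an acoustic kick sample.
import numpy as np
import soundfile as sf

kick, sr = sf.read("kick.wav")          # placeholder sample path
if kick.ndim > 1:
    kick = kick.mean(axis=1)            # fold to mono for simplicity

t = np.arange(len(kick)) / sr
freq = 50.0                             # sub frequency in Hz; tune to taste
decay = np.exp(-t / 0.15)               # ~150 ms decay, a gated-sine feel
sub = 0.5 * np.sin(2 * np.pi * freq * t) * decay

layered = kick + sub
layered /= max(1.0, np.abs(layered).max())   # avoid clipping
sf.write("kick_layered.wav", layered, sr)
```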
-
treesha's lovely jam is indeed a testament to the precept that free instruments can be creative catalysts. Reminded me that I recently dug out an old freebie that I'd never used, the Janggu. It's a traditional Korean bongo-like instrument, part of a collection offered by the Seoul National University. I threw it in as an experiment, just because I wanted some non-standard percussion on a song. It surprisingly changed the direction of the composition. I had originally discovered that instrument thanks to a thread similar to this one on the old forum.
-
Yeh, there's that; no point in sharing an articulation map for a customized instrument. Or, for that matter, an instrument that isn't widely used. I've begun work on an articulation map for an older library, Kirk Hunter Concert Strings 2. I used to love this library but haven't used it in a while, mostly because I rarely need its level of detail. I'm more likely to reach for Amadeus Symphonic Orchestra, which isn't nearly as deep but sounds just as good. My thinking is that if I had articulation maps for CS2, I might start using it again.
-
It's a logical detour in any discussion of freeware. You can't talk about freeware without acknowledging the reason commercial developers offer it in the first place: to encourage interest in their paid products. Granted, there are some great freebies out there that were created by dedicated hobbyists and altruistically shared (e.g. Thomas Mundt's Loudmax limiter). But you have to sift through a lot of klunkers to find them, which is why this kind of knowledge crowd-sourcing remains such a longstanding staple of recording forums. And you cannot confidently pronounce a freebie as useful without comparing it to its commercial alternatives. The basic premise of the whole thread, as stated by Starship Krupa, is the belief that "a person can put together an excellent system entirely with freeware". He can make such a proclamation only because he has extensive experience with both free and non-free software. So yeh, talking about commercial software in a freeware context is legit.
-
It's true that articulation maps are most helpful in orchestration. So useful that they actually make composing and arranging more fun and less tedious, and thus inspire greater experimentation. But that's not the only use-case. Any virtual instrument based on strings is a candidate for AM joy, especially faux guitars. Another application would be the more sophisticated voice and choir libraries that offer articulations beyond basic oohs and aahs. Speaking of being scared off by excessive complication, sjoen's mention of pedal steel reminded me of the time I decided against buying a pedal steel VI for that very reason. The demos sounded great, very expressive. But making that happen required some deep articulation switching that didn't look fun at all. I might have to revisit that decision.
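For anyone who's never looked under the hood of those libraries, here's an illustrative sketch of how keyswitch-based articulation switching typically works: a note outside the playable range flips the instrument's state, and everything after it sounds with the new articulation. The note numbers and names below are made up for the example; real libraries document their own keyswitch layouts.

```python
# Illustrative keyswitch model (note numbers are placeholders, not any
# specific library's layout).
KEYSWITCHES = {24: "sustain", 25: "staccato", 26: "pizzicato"}

def play(events):
    """events: list of (midi_note, velocity). Keyswitch notes change state
    silently; everything else sounds with the current articulation."""
    articulation = "sustain"
    for note, velocity in events:
        if note in KEYSWITCHES:
            articulation = KEYSWITCHES[note]   # silent state change
        else:
            print(f"note {note} vel {velocity} -> {articulation}")

play([(60, 100), (25, 1), (62, 100), (64, 100), (26, 1), (67, 100)])
# 60 plays sustain, 62 and 64 play staccato, 67 plays pizzicato
```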