ericn
Temp
Balance Engineer
Posts: 14,817
Post by ericn on Jun 23, 2020 10:50:56 GMT -6
Mark, I have heard that song before: the jump to G3, then Intel. Apple always seems to realize that writing for and supporting two different kinds of CPU is time consuming, expensive and bulky after about the 3rd OS update, and then starts to phase out the old CPU support. Abandoning FW will only add to this; coming up with their own interface is a PITA, and then getting the resellers and customers to buy in is also a major issue. That's for sure a fair observation, Eric. I'm mostly thinking about: - How recent the new Mac Pro is. They're really just now catching up with all the original orders.
- The film industry, which is where the "creative money" is. Depending on how many units have been sold, they need to be really careful to continue supporting that market. FCPX is definitely part of Apple's growth strategy in the creative market. Some of those production houses have quite recently purchased $15k-$30k machines! If Apple wants to avoid another shitstorm meltdown with video content creators, they're going to have to be super careful to continue supporting that market and that machine.
But yes, your point is well said. It gives me serious pause about buying a new machine right now.
I don’t disagree with anything you said; this is just a reboot of NuBus to PCI, the migration to PowerPC, PCI to PCIe, PowerPC to Intel. The story remains the same, only the names have changed. Also, you have to understand Apple really doesn’t care about their pro-level customers; just look at how long it took to refresh the Mac Pro or any of the pro apps. Apple is a consumer and content company. The margins are just too high in streaming and iPhones. The Mac Pro, Logic & FCP are pretty much vanity products at this point.
Post by popmann on Jun 23, 2020 12:23:56 GMT -6
I'm going to point out that no matter WHAT computer you get, you will need to run modern Spitfire instruments with a 512-sample buffer, or sometimes bigger. The collective "you" wanted playable strings--but that requires building context in LINEAR TIME... so that they can choose the right articulations and transitions. So, any of their "playable"+"time machine" patches require a certain amount of LINEAR TIME in the buffer.
I point that out because I have a local buddy who bought a $10k new iMac+Apollo Thunderbolt plus a million software updates and was disheartened that their instruments needed such large buffers. He thought there was "something wrong".
That said...I don't find them even challenging to PLAY at those large buffers in Cubase. It's not really more than "mushy" at 1024 in Cubase anyway. If you pull up the hardware Kronos Steinway and Ivory American D at 1024, it just sounds like a chorus--that's the level of "delay" there is.
Anyway--neither here nor there...I just feel like I saw someone mention they need a new machine because of the new Spitfire...that linear time stuff will be baked into more and more of their stuff. That's a new dev...I saw a piano library the other day that required "at least a 128(@44) input buffer"...so, it's a new paradigm that's not JUST Spitfire in the VI landscape.
Post by indiehouse on Jun 23, 2020 12:33:33 GMT -6
I'm going to point out that no matter WHAT computer you get, you will need to run modern Spitfire instruments with a 512-sample buffer, or sometimes bigger. The collective "you" wanted playable strings--but that requires building context in LINEAR TIME... so that they can choose the right articulations and transitions. So, any of their "playable"+"time machine" patches require a certain amount of LINEAR TIME in the buffer. I point that out because I have a local buddy who bought a $10k new iMac+Apollo Thunderbolt plus a million software updates and was disheartened that their instruments needed such large buffers. He thought there was "something wrong". That said... I don't find them even challenging to PLAY at those large buffers in Cubase. It's not really more than "mushy" at 1024 in Cubase anyway. If you pull up the hardware Kronos Steinway and Ivory American D at 1024, it just sounds like a chorus--that's the level of "delay" there is. Anyway--neither here nor there... I just feel like I saw someone mention they need a new machine because of the new Spitfire... that linear time stuff will be baked into more and more of their stuff. That's a new dev... I saw a piano library the other day that required "at least a 128 (@44) input buffer"... so it's a new paradigm that's not JUST Spitfire in the VI landscape. I feel like this is something important I should know about, but I don't understand what you're saying. I do have a Spitfire library (among others) that I want to use, but haven't yet due to computing issues.
Post by Guitar on Jun 23, 2020 12:39:13 GMT -6
I think what he's saying is a lot of the patches, the "regular" patches, require a lot of latency from the interface.
Sounds like they have some "playable" patches that will run at lower latencies. Then, I would guess, you could sub in the "regular" patches after your performance is done.
It's a feel thing. We were sort of talking about this in the other thread. The long latencies will sound like a very short delay. Like standing across a large hall from your amp/speaker, for example, with a long cable.
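The amp-across-the-hall comparison maps directly onto buffer math. Here's a small sketch of the arithmetic; the ~343 m/s speed of sound and the example buffer sizes are my own illustrative assumptions, not figures from the posts above:

```python
# Rough equivalence between buffer latency and acoustic distance.
# Sound travels ~343 m/s at room temperature, so roughly 0.34 m per ms.
SPEED_OF_SOUND_M_PER_S = 343.0

def buffer_latency_ms(buffer_samples, sample_rate_hz):
    """Linear time contained in one buffer, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

def equivalent_distance_m(latency_ms):
    """How far from a speaker you'd stand to hear the same delay."""
    return SPEED_OF_SOUND_M_PER_S * latency_ms / 1000.0

for buf in (128, 512, 1024):
    ms = buffer_latency_ms(buf, 48_000)
    print(f"{buf:5d} samples @ 48 kHz = {ms:5.1f} ms "
          f"= standing about {equivalent_distance_m(ms):.1f} m from the speaker")
```

So a 512-sample buffer at 48 kHz is comparable to playing a few meters from your cab, which is why large buffers feel "mushy" rather than broken.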
Post by popmann on Jun 23, 2020 16:22:45 GMT -6
Except “playable” is their word for “must have big buffer”, because it's what allows you to PLAY a line on a keyboard and have it come out sounding like it was played on a violin.
Think about it this way: a lookahead limiter will ALWAYS require latency... no matter if you have a CPU time-machined from 2030... because it needs linear time to look ahead in the audio stream. Right? Now, because that's an audio plug-in, it INDUCES it into the DAW... I mean, technically it tells the DAW it needs 10ms, and the DAW makes itself 10ms more latent to allow that... capice? If not, I'm not sure how to explain the next part.
Spitfire’s input scripting needs linear time to establish musical context between notes. No computer that will ever be made will be able to run them at a 128 buffer. 128 @ 48kHz equals a certain linear time. It varies a little by app, by sound card, etc., and that's not enough linear time. 256 at 48 might be... but 512 is where it will start to run glitch-free, IME. Double those for HD, and increase them SOME for 44.1, since Spitfire is all 48kHz native samples. So, 44.1 will require a resampling routine during the buffer as well (as obviously any HD will)....
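For anyone who wants to check those numbers: the linear time in a buffer is just samples divided by sample rate. A quick sketch (the buffer sizes and rates here are chosen for illustration):

```python
# Linear time contained in one audio buffer = samples / sample_rate.
# The scripting needs a fixed amount of TIME, so higher sample rates need
# proportionally bigger buffers to supply it ("double those for HD").
def buffer_ms(samples, rate_hz):
    return samples * 1000.0 / rate_hz

for rate in (44_100, 48_000, 96_000):
    row = ", ".join(f"{buf} -> {buffer_ms(buf, rate):.2f} ms"
                    for buf in (128, 256, 512, 1024))
    print(f"{rate} Hz: {row}")
```

Note that 512 samples at 48 kHz and 1024 samples at 96 kHz both come out to the same ~10.7 ms of linear time, which is exactly the doubling-for-HD rule described above.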
So, if you bought the BBC and it doesn't run well at 128 on your 2012 MacBook (what you think you need to play a VI at), I'm saying it won't run better on a new $10k Mac Pro with an RME PCIe card. You need to increase the buffer size. You're not running out of CPU DSP time when the meter “overs”, you're running out of linear time. Now... if you think 512 is too latent to play... well, then, THAT might change with an RME PCIe card... but it's app dependent, too.
But that also touches on needing to understand what the DSP meter in a DAW means, and how virtual instruments throw it WAY outside what it was intended for. So, do you understand that the meter in your DAW is not a CPU usage meter? Even Logic's, which breaks it out into cores....
Post by indiehouse on Jun 23, 2020 16:52:31 GMT -6
So, my PT system usage meter isn't showing CPU usage? What's it showing?
Post by popmann on Jun 23, 2020 17:49:35 GMT -6
So, my PT system usage meter isn't showing CPU usage? What's it showing? No. It's showing you % capacity to fill the playback buffer. I just looked at what my RME reports for a 1024-sample buffer at 96kHz: 11ms in and 11.6 out. Playback of ANYTHING is just filling the output buffer. So the system has 11.6ms to fill that buffer. If it takes 11.6+ms, it registers 100% and probably lights up red. If it takes 5.8ms, it registers 50%. 2.9? 25%. Tracking?

So, why is that different than CPU use? With audio DSP, it's not VERY different. There's overhead of an interface driver... overhead of CoreAudio in your case, because you're not using an Avid IO, right? So, also an extra buffer for CoreAudio... but that's not important to understanding this. There are system configurations that will make it have less capacity, yet not use up CPU. When your iCloud initiates a sync or something, it will temporarily interrupt the CPU from running your DAW--"over". Because? It kept it from filling the buffer in the allotted time. NOT because the CPU was ever maxed out. This is actually where Windows systems can be HORRIFYINGLY inefficient and confusing if you think you're maxing it out because your CPU isn't fast enough. This is a way Apple's OS handles scheduling better. But the important thing to understand is it hit red/100%/over because it couldn't fill the buffer in 11.6ms--the allotted output buffer time.

Because when you play a VI, you're triggering a sample START that's stored in RAM... THEN it tells the storage drive which sample you STARTED playing, and it streams it from the drive into RAM, where it's sewn seamlessly onto the start of the sample. So if, for example, the HDD where the library lives is too slow to get however many samples you triggered in time to fill the buffer? RED. Which is why with a VI, unlike audio production, you can see it go from 5% to over and back to 5%.
Your CPU never approaches 100%... your (likely) drive couldn't backfill the data it needed to within the buffer time. So it completed the last handful of buffers at .4ms or something... reports 5% because it took 5% of the allotted time to complete the data to fill the buffer... and when the Spitfire can't get its disk stream in time, the next buffer is 100%/over.

Now--that's all precursor to understanding that Spitfire (and others) are now using input scripting that has its own LINEAR TIME requirements. So, think "if they play middle C4 at 34vel... if they play a second C4 within 10ms, play THIS sample... if they move to D4 at velocity >100, play THAT sample AND this other transition..." So, they NEED 10ms of linear time. If you change it to 512 samples at 96kHz, the output buffer will be 6.3ms. 256 samples, 3.68. Those are real numbers, so FWIW--RME must be reporting the ADA loop, which is why it's not halving. That's a static amount at any given sample rate. Anyway--at 48kHz, it will be like 512 samples because that will give you 13ms or something... because the rate of sampling halved/doubled, right? #MathIsHardMkay

So--no... it's not directly a CPU measure when, unlike audio, virtual instruments stress a bunch of subsystems in your computer. Basically, ANY or all of those subsystems are now in play to respond quickly enough to provide samples within the allotted output buffer of time.
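popmann's description of the meter boils down to one ratio: time spent producing the buffer divided by the buffer's time budget. A toy sketch of that idea (the 11.6 ms budget is his RME figure from above; the function name and the example fill times are mine):

```python
# Sketch of what a DAW's "system usage" meter reports, per the description
# above: NOT CPU load, but how much of the buffer's time budget was spent
# filling it. A disk stall can peg the meter while the CPU sits idle.
def meter_percent(fill_time_ms, budget_ms):
    """Fraction of the allotted buffer time consumed; >= 100 is a dropout."""
    return fill_time_ms / budget_ms * 100.0

BUDGET_MS = 11.6  # output budget for a 1024-sample buffer near 96 kHz

print(meter_percent(5.8, BUDGET_MS))   # steady mixing load: 50%
print(meter_percent(0.4, BUDGET_MS))   # light load: a few percent
print(meter_percent(12.0, BUDGET_MS))  # drive couldn't stream in time: "over"
```

The last case is the VI scenario: the buffer took longer than its budget because a subsystem other than the CPU (the drive) was late, so the meter reads "over" even though the CPU never came close to 100%.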
Post by indiehouse on Jun 23, 2020 18:25:58 GMT -6
Ok, that was educational. Thanks man! So, sample libraries aside, let’s assume that the CPU in my 2012 Mac Mini isn’t actually getting maxed out, even though my usage meter hits red, and I get error messages that my DAW has run out of processing power. Are there system tweaks I can make that will give me more buffering, given that my CPU still has headroom? More RAM? I have 16 GB now.
Post by Guitar on Jun 23, 2020 18:29:50 GMT -6
Thanks for getting nitty gritty. CPU is certainly not the bottleneck in my PC. There's a good video on this topic; let me grab it:
I wonder what the "real" bottleneck is in my system. Could be any number of things, I guess. Video is the only thing I do that really demands CPU power.
I've never really learned about "linear time" in computer audio; that's a new term for me.
Anyway, this video helped me understand digital audio a bit better, in case anyone's wanting to learn about what popmann is saying.
Post by indiehouse on Jun 23, 2020 18:52:15 GMT -6
Or system OS settings to optimize buffer capacity?
Post by indiehouse on Jun 23, 2020 19:11:49 GMT -6
Ok, that was educational. Thanks man! So, sample libraries aside, let’s assume that the CPU in my 2012 Mac Mini isn’t actually getting maxed out, even though my usage meter hits red, and I get error messages that my DAW has run out of processing power. Are there system tweaks I can make that will give me more buffering, given that my CPU still has headroom? More RAM? I have 16 GB now. Apparently 16GB is the max you can put in a 2012 Mini.
Post by jeremygillespie on Jun 23, 2020 20:46:03 GMT -6
Ok, that was educational. Thanks man! So, sample libraries aside, let’s assume that the CPU in my 2012 Mac Mini isn’t actually getting maxed out, even though my usage meter hits red, and I get error messages that my DAW has run out of processing power. Are there system tweaks I can make that will give me more buffering, given that my CPU still has headroom? More RAM? I have 16 GB now. Apparently 16GB is the max you can put in a 2012 Mini. Yes. But - you can get 64 in the new mini!! I’m close to pulling the trigger on one myself but can’t figure out if my next upgrade is going to involve going HDX.
Post by popmann on Jun 23, 2020 20:59:53 GMT -6
Ok, that was educational. Thanks man! So, sample libraries aside, let’s assume that the CPU in my 2012 Mac Mini isn’t actually getting maxed out, even though my usage meter hits red, and I get error messages that my DAW has run out of processing power. Are there system tweaks I can make that will give me more buffering, given that my CPU still has headroom? More RAM? I have 16 GB now. For audio? Meaning you're recording audio tracks and mixing them? Not really. You can literally give it more buffer. Set Pro Tools to 2048 samples. That's all your CPU can do. I mean, sometimes there's a 4096 setting, but even my decade-old machine saw no tangible benefit. I'd mix with 2048 pretty regularly.

RAM helps nothing "sample libraries aside". I mean, I know Pro Tools has that feature where you can load the whole session to RAM--even though the REASON they did that isn't any kind of optimization--it was so that you could store a whole studio's projects on a central server, basically download the one you're working on in Studio B to the Studio B system, run it in RAM, and check the difference files back IN to the server. That really just takes the storage drive out of the equation, and if you're running out of mix DSP, that's live CPU time.

Now--using Pro Tools vs another app absolutely factors into that. But I feel like you know that. Beyond that... your routing makes an enormous difference. Do you do that old TDM thing that PT users without TDM rigs tend to do... having a sub master fader with all your pseudo mastering stuff on it? That has zero benefit in SOFTWARE Pro Tools... but it adds another layer of series bottleneck. Like, I can probably have 12+ Abbey Road Plates, each on their own aux, but I can't have TWO in series. 100% RED immediately and constantly.
While technically--sure, my CPU ran out of juice--it's because, in series on an aux, they need to be on the same core, and if that reverb takes 2.2GHz... I have twelve threads it can put those on... but if it needs to do two on the same channel, that's 4.4GHz, and I feel like my turbo is 4.2 or something. So--RED. But, I mean, if you're maxing a 2012 MINI's i7 with AUDIO...? And at 48kHz? It must be hitting a single-core wall with some series processing. I'm not sure I'd be looking at a new MINI. That's got an underclocked and throttled version of my CPU. I was using a 2012 Air's mobile (dual core) i5 without running out of steam in Mixbus32c at 88.2kHz (again, doing NO instruments, which were all done on the PC tower and recorded INTO Mixbus or LogicX)... I might not be the best to ask about cutting-edge tons-of-DSP mixing.
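The series-vs-parallel point can be sketched as a toy model: a serial chain has to finish on one core within the buffer's budget, while independent auxes can spread across cores. The 6 ms plugin cost, the budget, and the function names below are all made up for illustration; real DAW schedulers are more complicated:

```python
# Toy model: why plugins in SERIES bottleneck one core while the same
# plugins in PARALLEL spread across cores. Numbers are illustrative only.
def chain_time_ms(plugin_times_ms):
    # A serial chain runs on one core, so its plugin times add up.
    return sum(plugin_times_ms)

def session_time_ms(chains):
    # Independent chains (one per aux) can run on separate cores, so the
    # session is only as slow as its slowest single chain.
    return max(chain_time_ms(c) for c in chains)

BUDGET_MS = 10.7  # roughly 512 samples @ 48 kHz

# Twelve heavy reverbs on twelve auxes, 6 ms each: fine, they run concurrently.
parallel = [[6.0]] * 12
print(session_time_ms(parallel), "ms vs", BUDGET_MS, "ms budget")  # under budget

# TWO of the same reverbs in series on ONE aux: 12 ms on a single core: "over".
series = [[6.0, 6.0]]
print(session_time_ms(series), "ms vs", BUDGET_MS, "ms budget")    # over budget
```

This is the same reasoning as the sub-master-fader point: a chain on one fader is a serial bottleneck even when the machine as a whole has plenty of cores to spare.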
Post by popmann on Jun 23, 2020 21:07:06 GMT -6
Ok, that was educational. Thanks man! So, sample libraries aside, let’s assume that the CPU in my 2012 Mac Mini isn’t actually getting maxed out, even though my usage meter hits red, and I get error messages that my DAW has run out of processing power. Are there system tweaks I can make that will give me more buffering, given that my CPU still has headroom? More RAM? I have 16 GB now. Apparently 16GB is the max you can put in a 2012 Mini. Open your biggest session with NO VIs... then go to Spotlight and type "Activity Monitor". Tell me you're even approaching HALF of that 16GB usage--I'd be surprised. RAM does nothing unless you don't have ENOUGH... it speeds NOTHING up. That's a computer myth from the days when NO ONE had enough RAM to do anything, thus the OS was always swapping to the slow drive--so adding RAM "made stuff faster"--but there's no benefit until you actually NEED it. For the new Spitfire stuff, you will. For audio? That Air I was doing 88.2 on? 4GB. I don't think it ever broke 2GB on audio projects.
Post by christopher on Jun 23, 2020 21:21:05 GMT -6
Awesome discussion! So you are saying threads and cores do something, like in Reaper each channel gets its own thread? (I read that recently, I believe.) So maybe an AMD Threadripper (24 cores/48 threads) wouldn't be so ridiculous? They also make a 64-core/128-thread chip that should come down in price in a few years.
Post by Blackdawg on Jun 23, 2020 22:43:05 GMT -6
I just built a brand new PC a month or so ago.
Ryzen 3700X
16GB of 3200MHz RAM
Really nice mobo with lots of NVMe slots, PCIe 4.0 support, built-in wifi, Bluetooth, and dual LAN ports
RTX 2070 Super graphics
Was super easy to build. The cases these days are so nice to put together and route stuff in. WAY better than my old one.
Install was also super easy to get Win10 Pro 64-bit going. No issues at all.
Haven't bothered overclocking anything; I did some OS optimizations following Merging Technologies' suggestions.
Runs Pyramix with ease, doing DXD 384kHz multitracks. If anything, it's Pyramix that is holding me back with no multi-core support.
Eats up any games I throw at it now too. I spend WAY more time on it now than my old Mac Pro.
Also, people have REALLY streamlined building Hackintoshes on the new Ryzen CPUs. I kind of wish I'd gotten a 5700 XT GPU instead, because I could dual boot Windows or macOS with ease.
Either way, PCs have come a Loooooonnnngggg way. Mine's been stable as hell and works awesome. I love it.
This new update makes me just want to keep the PC and not bother with a new Mac. I don't really see Apple just fucking over most of its base... but at the same time, they do it ALL THE DAMN TIME! So... yeah.
Post by indiehouse on Jun 24, 2020 7:12:23 GMT -6
Ok, that was educational. Thanks man! So, sample libraries aside, let’s assume that the CPU in my 2012 Mac Mini isn’t actually getting maxed out, even though my usage meter hits red, and I get error messages that my DAW has run out of processing power. Are there system tweaks I can make that will give me more buffering, given that my CPU still has headroom? More RAM? I have 16 GB now. For audio? Meaning you're recording audio tracks and mixing them? Not really. You can literally give it more buffer. Set Pro Tools to 2048 samples. That's all your CPU can do. I mean, sometimes there's a 4096 setting, but even my decade-old machine saw no tangible benefit. I'd mix with 2048 pretty regularly. RAM helps nothing "sample libraries aside". I mean, I know Pro Tools has that feature where you can load the whole session to RAM--even though the REASON they did that isn't any kind of optimization--it was so that you could store a whole studio's projects on a central server, basically download the one you're working on in Studio B to the Studio B system, run it in RAM, and check the difference files back IN to the server. That really just takes the storage drive out of the equation, and if you're running out of mix DSP, that's live CPU time. Now--using Pro Tools vs another app absolutely factors into that. But I feel like you know that. Beyond that... your routing makes an enormous difference. Do you do that old TDM thing that PT users without TDM rigs tend to do... having a sub master fader with all your pseudo mastering stuff on it? That has zero benefit in SOFTWARE Pro Tools... but it adds another layer of series bottleneck. Like, I can probably have 12+ Abbey Road Plates, each on their own aux, but I can't have TWO in series. 100% RED immediately and constantly.
While technically--sure, my CPU ran out of juice--it's because, in series on an aux, they need to be on the same core, and if that reverb takes 2.2GHz... I have twelve threads it can put those on... but if it needs to do two on the same channel, that's 4.4GHz, and I feel like my turbo is 4.2 or something. So--RED. But, I mean, if you're maxing a 2012 MINI's i7 with AUDIO...? And at 48kHz? It must be hitting a single-core wall with some series processing. I'm not sure I'd be looking at a new MINI. That's got an underclocked and throttled version of my CPU. I was using a 2012 Air's mobile (dual core) i5 without running out of steam in Mixbus32c at 88.2kHz (again, doing NO instruments, which were all done on the PC tower and recorded INTO Mixbus or LogicX)... I might not be the best to ask about cutting-edge tons-of-DSP mixing. Yeah, I pretty much live in a 2048 buffer. Now, the routing thing--that's interesting. I actually do use a sub master with all my two-buss plugs (and hardware I/O). So you're saying this eats up more resources than having the same chain on the master fader? For my current project, I've tracked everything at 96kHz. In hindsight, I should have kept it at 48kHz. I think I will going forward; I just don't know if it's a good idea to resample everything down to 48 at this point.
Post by ericn on Jun 24, 2020 7:19:10 GMT -6
I just built a brand new PC a month or so ago. Ryzen 3700X, 16GB of 3200MHz RAM, a really nice mobo with lots of NVMe slots, PCIe 4.0 support, built-in wifi, Bluetooth, and dual LAN ports, RTX 2070 Super graphics. Was super easy to build. The cases these days are so nice to put together and route stuff in. WAY better than my old one. Install was also super easy to get Win10 Pro 64-bit going. No issues at all. Haven't bothered overclocking anything; I did some OS optimizations following Merging Technologies' suggestions. Runs Pyramix with ease, doing DXD 384kHz multitracks. If anything, it's Pyramix that is holding me back with no multi-core support. Eats up any games I throw at it now too. I spend WAY more time on it now than my old Mac Pro. Also, people have REALLY streamlined building Hackintoshes on the new Ryzen CPUs. I kind of wish I'd gotten a 5700 XT GPU instead, because I could dual boot Windows or macOS with ease. Either way, PCs have come a Loooooonnnngggg way. Mine's been stable as hell and works awesome. I love it. This new update makes me just want to keep the PC and not bother with a new Mac. I don't really see Apple just fucking over most of its base... but at the same time, they do it ALL THE DAMN TIME! So... yeah. I’m just going to wait for a couple of server farms in my building to bite the dust and grab a nice server 😎
Post by indiehouse on Jun 24, 2020 8:20:36 GMT -6
I just built a brand new PC a month or so ago. Ryzen 3700X, 16GB of 3200MHz RAM, a really nice mobo with lots of NVMe slots, PCIe 4.0 support, built-in wifi, Bluetooth, and dual LAN ports, RTX 2070 Super graphics. Was super easy to build. The cases these days are so nice to put together and route stuff in. WAY better than my old one. Install was also super easy to get Win10 Pro 64-bit going. No issues at all. Haven't bothered overclocking anything; I did some OS optimizations following Merging Technologies' suggestions. Runs Pyramix with ease, doing DXD 384kHz multitracks. If anything, it's Pyramix that is holding me back with no multi-core support. Eats up any games I throw at it now too. I spend WAY more time on it now than my old Mac Pro. Also, people have REALLY streamlined building Hackintoshes on the new Ryzen CPUs. I kind of wish I'd gotten a 5700 XT GPU instead, because I could dual boot Windows or macOS with ease. Either way, PCs have come a Loooooonnnngggg way. Mine's been stable as hell and works awesome. I love it. This new update makes me just want to keep the PC and not bother with a new Mac. I don't really see Apple just fucking over most of its base... but at the same time, they do it ALL THE DAMN TIME! So... yeah. I can't deny that sounds tempting. I don't play video games, but I do love making my dollars count where I can. I did the Hackintosh build once. It was an experience. I won't do it again, though. My most limited resource these days is time, so I need my computer to just work consistently. I don't want to spend it troubleshooting a computer. That's my reservation about going with a Windows build. My studio space has been built around the Apple OS since the beginning. Part of me feels like I'd be starting over, in a way. Something will inevitably not work properly.
Post by popmann on Jun 24, 2020 12:28:14 GMT -6
Yes, my understanding is that the sub master adds DSP usage... but honestly, it's often people taking that TDM-rig workaround to OTHER native software apps. So, I can't swear on your system. For an IO plug-in, I wouldn't think it matters. I wish I understood what the difference is... you're not using Acoustica crap or something, are you?
You're going to troubleshoot a Windows machine for optimal use--probably not for basic audio recording/mixing, but for VI optimization of all your subsystems. Or, I mean, you can pay someone ELSE to do it. But it will happen. I disagree that "Windows has come so far"... I guess it depends on when your experience is from, and how picky you are about performance. It's not some ongoing thing. But there's likely some stuff to work through unless you're JUST recording and mixing audio. That's SO mature... it's hard to mess that up on any OS. IMO/E.
That said--silence below my desk, while giving me hardware-like latency on VIs under my fingers consistently, trumps any touchy-feely "niceness" of an OS. For me. The fact that in two years I'd added more SSD space and recently a GPU... I'm not really a "Windows guy"--I AM a "the computer is a box that you configure to meet YOUR needs" guy. FWIW.
Post by indiehouse on Jun 24, 2020 14:00:08 GMT -6
Yes, my understanding is that the sub master adds DSP usage... but honestly, it's often people taking that TDM-rig workaround to OTHER native software apps. So, I can't swear on your system. For an IO plug-in, I wouldn't think it matters. I wish I understood what the difference is... you're not using Acoustica crap or something, are you? You're going to troubleshoot a Windows machine for optimal use--probably not for basic audio recording/mixing, but for VI optimization of all your subsystems. Or, I mean, you can pay someone ELSE to do it. But it will happen. I disagree that "Windows has come so far"... I guess it depends on when your experience is from, and how picky you are about performance. It's not some ongoing thing. But there's likely some stuff to work through unless you're JUST recording and mixing audio. That's SO mature... it's hard to mess that up on any OS. IMO/E. That said--silence below my desk, while giving me hardware-like latency on VIs under my fingers consistently, trumps any touchy-feely "niceness" of an OS. For me. The fact that in two years I'd added more SSD space and recently a GPU... I'm not really a "Windows guy"--I AM a "the computer is a box that you configure to meet YOUR needs" guy. FWIW. No Acoustica here. Just a mix of UAD and some pretty standard native stuff. I just ditched the sub master. For some reason, today my system usage has been manageable. I did remove a Soothe plug.
So, you have a Windows machine? I bet I could build a pretty nice one for the cost of a new Mini.
Post by popmann on Jun 24, 2020 14:35:10 GMT -6
Yes, I built a Coffee Lake one a couple of years ago when the Air proved it just couldn't do the VIs--and I had to pull a like-2008/9'ish Win7 64-bit tower out of the closet to finish the VI tracks... I actually have the same chip Apple ended up UNDERclocking and further throttling for the high-end Mini. I intended to Hackintosh it... but there's a lot to that which makes dual booting not as doable. I mean, if I just wanted to use Windows for games or something, that wouldn't be an issue, but to have them BOTH able to access the internal fast SSDs... is basically deciding which OS you want to hack up... they won't share a file system format. So I thought "well--I'll leave Win10 on here and just move to OSX if it gives me trouble." It's still there.

I added a 1TB SSD a while back to the internal array when I decided to get some more drum samples I don't need... and recently added the GPU I was going to add to DO the Hackintoshing--a Metal-capable RX 570--and it removed the only small issues I've had, which were all graphics issues. So I figured "assume the internet is wrong about the Intel GPU"... they were. I actually heard back from Synthogy about Ivory not opening its UI in Mixbus (it opens and plays as long as you don't open the UI) and it was basically "yeah--you really need a discrete GPU for a DAW"... which I knew, but let the internets talk me out of when I built it...

I might've opted for a Mini+RME Thunderbolt, which would be roughly 2x that--just because I built that custom JavaScript thing in LPX to get their AI to play SD3 samples... but that wasn't an option when I built. The REMOTELY equal option was a high-end iMac with the RME and a bunch of required software upgrades, which added up to somewhere between $5-6k... that was just WAY beyond what I was going to put into a sandbox for my own noodlings. I was doing all the client work on that little Air just fine. So, there was no real business justification.
I just wanted to get my own ya-yas off... the elderly tower's motherboard went after a decade. But, for perspective, I think I put $700 into it INCLUDING the RME audio interface. Not including software licenses--holy hell, I don't want to add THAT up... a used worthless PC (I think it was used as a server in some office that closed?)... an RME bought from an Apple user when they (Apple) removed PCI... modded to be quiet enough to run in the studio. You can always mod a non-proprietary box to be QUIET... but silence requires ground-up design choices. Anyway... Microsoft gave me the Win7 64-bit license for it because I asked... perks of being an admin... having Microsoft reps write a key down on a post-it when they heard I was building a 64-bit VI machine...
Post by popmann on Jun 24, 2020 14:41:15 GMT -6
Yes, my understanding is that the sub master adds DSP usage... but honestly, it's often people taking that TDM-rig workaround to OTHER native software apps. So, I can't swear on your system. For an IO plug-in, I wouldn't think it matters. I wish I understood what the difference is... you're not using Acoustica crap or something, are you? No Acoustica here. Just a mix of UAD and some pretty standard native stuff. I just ditched the sub master. For some reason, today my system usage has been manageable. I did remove a Soothe plug.
If you're an Apollo user--stay on OSX. You don't really have a choice. Apogee and UA and Apple... business partners. Unlike the other two, UA does "technically" work on Windows... but historically not really that well, so... they tend to port stuff over to Windows after their Apple sales have slowed down--see if they can get some more sold. IMO/E.
Post by Guitar on Jun 24, 2020 15:58:46 GMT -6
I used to use Apollos on my Windows PCs. If you stick with the UA "recommended" motherboards, you're pretty much good to go, once they eventually release the software. Mac software obviously comes first with UA. I think in 2020 they support more Thunderbolt systems in general on Windows. I don't know what happened, but it seems like Thunderbolt got kind of standardized and worked out in the past couple of years.
Post by m03 on Jun 24, 2020 18:24:08 GMT -6
I don't know what happened but it seems like thunderbolt got kind of standardized and worked out in the past couple years. Intel stopped charging a fee for manufacturers to license it back in 2018, so adoption of the protocol went up.