Cross Platform DAW Performance - Part III
With the Steinberg specific testing out of the way, I wanted to address concerns raised by several punters across the forums that the reports were biased, due to the well known fact that Cubendo performs measurably better on Windows than on OSX - despite my clearly stating in the first report that it was simply a starting point, and that we needed to start somewhere. I was initially going to move on to Protools LE/MP/RTAS, but on further consideration I decided to instead move this round of testing to applications more in line with Cubendo, to see if the performance variable was actually specific to the Steinberg applications themselves.
|Preparing for Battle|
No changes or adjustments were required in regards to hardware; however, the expanded DAW application list and the added 3rd party plugins proved to be the biggest hurdle in preparing for this round of testing, as I needed to ensure not only that all of the respective applications were updated, stable and running well, but also that all of the plugins performed as expected across the 3 applications.
I reported that to the devs at Presonus, who have acknowledged the issue and are reportedly trying to resolve it in collaboration with WaveArts. I have not had further confirmation on whether it has been, or can be, resolved.
StudioOne on OSX could also be very particular about session naming: a point in the name - i.e. 1.5 - would cause it to truncate the name of the session on first save, for example. Not a major issue, but annoying when simply trying to navigate basic functionality.
There was also a looping glitch with Reaper on OSX that we were unfortunately unable to resolve before print, but it has been reported to Justin over at Cockos and a possible solution is currently being worked on.
|Round 1 : DAWbench DSP - WaveArts MD5 Multiband Compressor - VST2|
Let's take a closer look at the cross platform comparative results for each respective DAW application.
Nuendo 5 : @ 032 - performed better on Win7 by 240%, @ 064 - performed better on Win7 by 57%, @ 128 - performed better on Win7 by 21%, @ 256 - performed better on Win7 by 6%
StudioOne 1.5 : @ 032 - performed better on Win7 by 89%, @ 064 - performed better on Win7 by 68%, @ 128 - performed better on Win7 by 57%, @ 256 - performed better on Win7 by 45%
Reaper 3.6 : @ 032 - performed better on Win7 by 47%, @ 064 - performed better on Win7 by 16%, @ 128 - performed better on Win7 by 20%
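For anyone new to the methodology, the DAWbench DSP percentages come from counting how many plugin instances each platform can sustain before audio dropout at a given buffer size, then taking the relative difference between the two counts. A minimal sketch of that arithmetic follows; the instance counts below are invented for illustration (chosen so the ratios roughly reproduce the Nuendo 5 deltas above), not actual benchmark data:

```python
# Hypothetical DAWbench-style comparison: plugin instance counts sustained
# before dropout at each buffer size. Counts are illustrative only.
buffer_sizes = [32, 64, 128, 256]
win7_counts = [85, 110, 130, 140]  # hypothetical Windows 7 instance counts
osx_counts = [25, 70, 107, 132]    # hypothetical OSX instance counts

for size, win, osx in zip(buffer_sizes, win7_counts, osx_counts):
    # "performed better on Win7 by X%" = relative difference vs the OSX count
    delta = (win - osx) / osx * 100
    print(f"@ {size:03d} - better on Win7 by {delta:.0f}%")
```

The key point the sketch makes is that the percentage is relative to the weaker (OSX) count, which is why the small-buffer deltas look so dramatic: at 032 samples the absolute counts on both platforms are low, so a modest gap in instances produces a very large percentage.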
On Windows the MD5, as noted in the previous report, has always been a very stable and consistent performer in past testing, no matter which DAW application was being used, and the results across the 3 respective DAWs again maintained that. Reaper scaled a little better across all latency settings than both StudioOne and Nuendo 5, which is consistent with past results, but there really isn't a lot separating the 3 applications.
On OSX the results were not as clear cut across the 3 DAWs. At 032 samples both StudioOne and Reaper easily outperformed Nuendo, but as the latency was increased StudioOne dropped off quite substantially, while both Nuendo and Reaper continued to scale significantly better, especially above 064 samples.
Reaper was a bit of a stand out on OSX at the lower latencies, but the empirical results do not tell the whole story. I encountered a small glitch as playback approached the loop point, which is the identical issue I had with Reaper on Windows a few years back. It is associated with the read-ahead buffering and anticipative multithreading routines of the application, and although the issue on Windows was resolved in collaboration with the devs at the time, we couldn't pinpoint a solution this time around. This has affected the end results for Reaper on OSX, as I needed to back off a number of plugins to clear the slight glitch at the loop point, despite playback of the session being otherwise 100% clean.
Considering that the cross platform comparative performance of Reaper on this session is already substantially better than the other 2 applications - even with the results being hampered by the loop glitch - it has the most potential to approach par performance across both OSX and Windows at low latency, as long as that can be maintained with the other 3rd party plugins.
We do need to note that, despite that potential, the performance variable between Windows and OSX, especially at the lower latencies, is present to varying degrees in all 3 applications, so it's not simply a case of pointing the finger at Steinberg's coding, as some have continued to maintain since the initial report.
|Round 3 : DAWbench DSP - URS Channel Strip Pro - VST2|
With the Channel Strip Pro being one of the new kids on the block for DAWbench, this was the first time I had tested it across multiple applications, so again, let's take a look at the results for each respective DAW application.
Nuendo 5.0 : @ 032 - performed better on Win7 by 201%, @ 064 - performed better on Win7 by 84%, @ 128 - performed better on Win7 by 48%
StudioOne 1.5 : @ 032 - performed better on Win7 by 103%, @ 064 - performed better on Win7 by 104%, @ 128 - performed better on Win7 by 85%, @ 256 - performed better on Win7 by 81%
Reaper 3.6 : @ 032 - performed better on Win7 by 40%, @ 064 - performed better on Win7 by 47%, @ 128 - performed better on Win7 by 47%
On Windows the results were again very consistent and close to par across all 3 applications, with the results across all latencies being within 5 plugins. All scaled very well; nothing really to report past that.
On OSX, the results were again a bit of a mixed bag across the 3 applications, but a few patterns were starting to emerge.
As with the previous MD5 results, @ 032 both StudioOne and Reaper clearly outperformed Nuendo, but StudioOne then quickly dropped off at all latencies above that, consistent with the last test session. Nuendo and Reaper again scaled more effectively, but not to the extent that they did with the previous plugin.
Quick analysis shows that none of the applications on OSX scaled as well with this particular plugin as with the previous one; possible variables include the multiple, and therefore more complex, DSP processes required, or simply better code optimization on OSX for the previous plugin.
What is again evident, and remains consistent, is that the performance variable between Windows 7 and OSX is being maintained across all of the applications. This does not rule out that code optimizations for the respective DAWs may differ across the two platforms, nor that the optimization of the respective plugins themselves may not be comparable, but a pattern is emerging the more elements we introduce into the testing.
|Round 4 : DAWbench DSP - Elysia mPressor - VST2|
As I noted in the last report, the Elysia mPressor adds another variable, that being an inherent delay, so it would be interesting to see whether that affected the results in the other 2 DAW apps that hadn't been tested previously.
Nuendo 5.0 : @ 032 - performed better on Win7 by 237%, @ 064 - performed better on Win7 by 98%, @ 128 - performed better on Win7 by 71%
StudioOne 1.5 : @ 032 - performed better on Win7 by 136%, @ 064 - performed better on Win7 by 96%, @ 128 - performed better on Win7 by 70%, @ 256 - performed better on Win7 by 97%
Reaper 3.6 : @ 032 - performed better on Win7 by 61%, @ 064 - performed better on Win7 by 52%, @ 128 - performed better on Win7 by 48%
As with the previous CSP results, on Windows the results were again close to par across all 3 applications, with all latencies being within 5 plugins. The inherent delay may have come into play more with Reaper, as it was a touch below the other 2 apps on this session, which differs from all the other runs, but we are realistically splitting hairs trying to separate the Windows results.
On OSX the patterns that were emerging in the previous runs were evident again, with the results for StudioOne and Reaper being clearly better than Nuendo at 032 samples, but StudioOne then dropping off dramatically while both Nuendo and Reaper scaled measurably better at the higher latencies.
The overall scaling with the mPressor is a lot closer to the Channel Strip Pro than to the MD5, so my initial thought that the multiple DSP processes were a factor in the CSP not scaling as well doesn't seem to be panning out.
An analysis of the results maintains the consistency that has developed over the course of the testing, where the variable between OSX and Windows 7 is measurably in favour of the latter. It has actually widened in this specific instance, which may indicate that plugins with inherent delays take a harder hit on OSX. Of course that is not conclusive by any stretch; we would need to test a bunch more plugins to get a clearer idea on that specific point.
All of the testing thus far has indicated that there is a distinct variable in the cross platform performance of the current testing pool. Whether that reflects a consistent lack of optimization for all of the DAWs and plugins on OSX, or whether the actual OS and its inherent subsystems come into play, is the question that remains.
I have tried to address as many of the initial qualms that were raised by several members of the community in respect to the performance variable being specific to Steinberg products.
With the variable remaining consistent, to some degree, across the other 2 DAW applications, Steinberg is absolved of being the only party with a distinct cross platform performance variable. Of course that then raises a whole bunch of other questions: is it specific to VST plugins, is it related to Core Audio, is each application equally optimized for both platforms, etc, etc.
There was speculation doing the rounds on some of the forums that Steinberg did not have native Core Audio support, that in fact it used a wrapper , and that was one of the main causes of the performance variable.
This started circulating after a poster had analyzed an error dump and saw reference to a component called CoreAudio2ASIO.bundle , and of course came to the conclusion that Steinberg were using a wrapper, which accounted for the extra 2 cycles per buffer setting that I noted in my first report.
Well, needless to say, unless the other applications are using the same approach, that assumption isn't really holding much water. Also, my information regarding the extra cycles being required by Core Audio had nothing whatsoever to do with Steinberg and its alleged wrapper, but was specifically related to Core Audio itself.
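To put the "extra 2 cycles per buffer setting" claim in perspective, a buffer's worth of latency is simply buffer size divided by sample rate, so any extra buffer cycles add a fixed, calculable amount per setting. A quick illustrative calculation, assuming a 44.1 kHz sample rate (the two-extra-cycle figure is as reported earlier, not independently verified here):

```python
SAMPLE_RATE = 44_100  # Hz, assumed for this illustration

def buffer_latency_ms(samples, extra_cycles=0):
    """One-way latency in ms for a buffer setting, including any extra buffer cycles."""
    return (samples * (1 + extra_cycles)) / SAMPLE_RATE * 1000

for size in (32, 64, 128, 256):
    base = buffer_latency_ms(size)
    padded = buffer_latency_ms(size, extra_cycles=2)
    print(f"{size:3d} samples: {base:.2f} ms base, {padded:.2f} ms with 2 extra cycles")
```

The takeaway is that two extra cycles triple the buffering contribution at any given setting, which matters far more at 032 samples (where the whole point is minimal latency) than at 256, and that penalty would apply regardless of which DAW sits on top of Core Audio.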
On further investigation, information from a reliable source confirmed that there is a related component in the Windows version of Cubendo, and that it is simply an interconnect between the Steinberg BAIOS audio engine and the respective audio driver for each platform.
Of course all of the above can be taken with a grain of salt because it will be near impossible to get a Steinberg developer to openly discuss the details of that component, but I thought it was worth noting.
One qualm that will no doubt be raised again is that I have left Logic out of this round of testing; some even go as far as suggesting that no Apple testing is valid without including Logic.
Considering that my focus is cross platform DAW performance, it's needless to say that Logic misses a vital component of the required criteria; also, as I detailed in the first report, its hybrid engine makes a head to head comparison using this methodology near impossible. I will do some specific testing and analysis at a later date, but it is not a priority for me.
The next report covers Protools LE/MP native scaling, which shifts the testing to RTAS v VST, and ASIO/Core Audio v DAE. With all of the recent rumblings about PTHD Native, and the rumours surrounding PT Native 9 (which is apparently replacing both LE and MP), it will be a good springboard for head to heads with the newer versions as they come online, as well as between the competing plugin and driver formats.