Months have passed swiftly since the initial beta announcement of SPECviewperf® 11, and on its first outing the improvements shown left many onlookers in awe. A few days ago the final release was announced, and individuals and companies alike have scurried to download it to see how their current systems stack up. Some interesting results have already made the news, showing just how far this new professional benchmark stretches systems.
For those new to SPECviewperf and its various committees, we first offer a brief insight into what takes place behind these secretive closed-door sessions.
Standard Performance Evaluation Corporation (SPEC)
Many ask who or what makes up the SPEC committee. SPEC is a non-profit organisation that sponsors the development of standardised, application-based benchmarks of value to the vendor, research and user communities, supported by its main project group members. For those not in the know, those members are AMD, Apple, Dell, Fujitsu, HP, Intel and NVIDIA. Simply put, these Tier 1 vendors sit around the table monthly and thrash out what we look forward to annually – the new professional benchmarks (APC/GPC) that stretch professional graphics workstations to their limits.
It gives professional end users the chance to evaluate their own system in situ – or to get exceptional guidance on how a prospective new system will perform for their individual needs. With the figures that members publish on the SPEC website, users can also see which system or professional graphics card best suits them, so when it's time to make that all-important purchase from a Tier 1 vendor or one of the better-known system integrators, they have a guide to how it might perform. One important point that some system integrators and VARs overlook is that they can submit their own systems for consideration; their benchmark results are reviewed and, if accepted, posted on the SPEC website. Once published, those results are seen by many corporate and government CTOs and IT managers who rely on SPEC results as a key component of their decision-making when purchasing new computer equipment. The payback can be very considerable indeed.
The new SPECviewperf® 11 is made up of viewsets representing graphics functionality from the major professional CAD/CAM/DCC packages: Autodesk Maya 2009, CATIA V5 and V6, EnSight 8.2, LightWave 3D 9.6, Pro/ENGINEER Wildfire 5.0, Siemens NX 7, SolidWorks 2009, and UGS Teamcenter Visualization Mockup. Wrap these all up together and you have a substantial 1GB download that, when uncompressed, requires 6GB of disk space; across the viewsets there are 68 individual tests in total. We will cover each viewset individually further on in this article, giving end users an idea of what to expect from the viewset tests.
SPEC Brief History
SPEC's Graphics and Workstation Performance Group (GWPG) consists of two project groups:
The Application Performance Characterization (SPECapcSM) group was formed in 1997 to provide a broad-ranging set of standardized benchmarks for graphics and workstation applications. The group’s current benchmarks span popular CAD/CAM, digital content creation, and visualization applications.
The Graphics Performance Characterization (SPECgpcSM) group, begun in 1993, establishes performance benchmarks for graphics systems running under OpenGL and other application programming interfaces (APIs). The group’s SPECviewperf® benchmark is the most popular standardized software for evaluating performance based on popular graphics applications.
These committees have therefore been around for a considerable time, and the current serving members bring many years of industry experience to the table.
SPECviewperf® 11 Information
The SPECgpcSM project group’s SPECviewperf 11 — released in late June 2010 — is totally new graphics performance evaluation software. Among the major changes are a new GUI, fully updated viewsets traced from newer versions of applications, larger models, and advanced OpenGL functionality such as shading and vertex buffer objects (VBOs).
Since the SPECviewperf source and binaries have been upgraded to support these changes, no comparisons should be made between past results and results for viewsets running under SPECviewperf 11.
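As background on how the scores you will see later are reported: each viewset's composite is a weighted geometric mean of the frames-per-second results of its individual tests. A minimal sketch of that calculation follows – the FPS figures and weights are made up purely for illustration, not taken from any real run:

```python
import math

def composite_score(fps_results, weights):
    """Weighted geometric mean of per-test frame rates, the form
    SPECviewperf uses to roll a viewset's tests into one composite."""
    assert len(fps_results) == len(weights)
    total_weight = sum(weights)
    # Sum weight * ln(fps), normalise by total weight, then exponentiate.
    log_sum = sum(w * math.log(f) for f, w in zip(fps_results, weights))
    return math.exp(log_sum / total_weight)

# Hypothetical FPS results and weights for a four-test viewset
fps = [42.0, 30.5, 55.1, 28.7]
weights = [0.3, 0.3, 0.2, 0.2]
print(round(composite_score(fps, weights), 2))
```

The geometric mean is used rather than a simple average so that no single very fast test can mask a weak result elsewhere in the viewset.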
SPECviewperf 11 has the following minimum requirements:
- OpenGL 1.5 plus extensions
- 3GB of installed memory
- 6GB available disk space
- 1920×1080 screen resolution for submissions published on the SPEC website
Beyond the minimum requirements, the SPECgpc group has the following recommendations and advice:
- Run SPECviewperf 11 on a graphics card with at least 512 MB of graphics memory
- For testing on 32-bit Windows systems, set the “/3GB” flag with appropriately sized page file
- SPECviewperf 11 will intentionally exit if system performance is significantly lower than expected for a 3D workstation-class system
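A quick note on the "/3GB" recommendation above, for those unsure where the flag lives: on 32-bit Windows XP it is a boot.ini switch, while on 32-bit Vista and Windows 7 the equivalent is set with bcdedit. A sketch of both (run from an elevated prompt, and back up your boot configuration first):

```
rem 32-bit Windows Vista / Windows 7: raise user address space to 3GB (3072 MB)
bcdedit /set IncreaseUserVa 3072

rem 32-bit Windows XP: instead append /3GB to the OS line in boot.ini, e.g.
rem ...partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB
```

Remember to pair this with an appropriately sized page file, as the SPECgpc group advises.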
SPECviewperf 11 has been tested on the following operating systems (Note: Submissions for publication on SPEC’s website must be run on 64-bit operating systems):
- Microsoft Windows XP (32- and 64-bit)
- Microsoft Windows Vista (32- and 64-bit)
- Microsoft Windows 7 (32- and 64-bit)
- Red Hat Enterprise Linux Workstation 5.4
- SUSE Linux Enterprise Desktop 11 sp1
Many further questions may now be fleeting across your mind; the additional URLs listed below just might answer them:
Full-screen anti-aliasing and how it is tested in SPECviewperf 10
With the introduction to SPECviewperf® 11 and the committee behind it complete, we move on to the next page for the system used in today's exclusive outing.
Page 1 – Introduction
Page 2 – System Set-Up and Build
Page 3 – CATIA Viewset (catia-03) and EnSight Viewset (ensight-04)
Page 4 – Lightwave (lightwave-01) and Maya Viewset (maya-03)
Page 5 – Pro/ENGINEER Viewset (proe-05) and SolidWorks Viewset (sw-03)
Page 6 – Siemens Teamcenter Visualization Mockup Viewset (tcvis-02) and Siemens NX (snx-01)
Page 7 – Nvidia Quadro® FX3800
Page 8 – Nvidia Quadro® FX4800
Page 9 – Conclusions