2006 User Survey Results

Many thanks to the 256 users who responded to this year's User Survey. This represents a response rate of about 13 percent of the active NERSC users. The respondents represent all six DOE Science Offices and a variety of home institutions: see Respondent Demographics. The survey responses provide feedback about every aspect of NERSC's operation, help us judge the quality of our services, give DOE information on how well NERSC is doing, and point us to areas we can improve. The survey results are listed below. You can see the 2006 User Survey text, in which users rated us on a 7-point satisfaction scale. Some areas were also rated on a 3-point importance scale or a 3-point usefulness scale.
The average satisfaction scores from this year's survey ranged from a high of 6.7 (very satisfied) to a low of 4.9 (somewhat satisfied). Across 111 questions, users chose the Very Satisfied rating 4,985 times, and the Very Dissatisfied rating only 51 times. The scores for all questions averaged 6.1, and the average score for overall satisfaction with NERSC was 6.3. See All Satisfaction Ratings. For questions that spanned the 2006 through 2003 surveys, the change in rating was tested for significance (using the t test at the 90% confidence level). Significant increases in satisfaction are shown in blue; significant decreases in satisfaction are shown in red.
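The year-over-year significance testing described above can be sketched in code. This is a minimal illustration, not NERSC's actual analysis: the ratings below are hypothetical stand-ins for a question's per-respondent scores, and the choice of Welch's (unequal-variance) two-sample t-test is an assumption, since the survey text does not specify the t-test variant. The 90% confidence level corresponds to a p-value threshold of 0.10.

```python
# Hypothetical example of testing whether a question's mean satisfaction
# rating changed significantly between two survey years.
from scipy.stats import ttest_ind

# Hypothetical 1-7 satisfaction scores for one question in each year.
ratings_2005 = [5, 6, 7, 6, 5, 6, 7, 4, 6, 5]
ratings_2006 = [6, 7, 7, 6, 6, 7, 7, 5, 6, 6]

# Welch's t-test (equal_var=False); assumption, as the variant is unspecified.
t_stat, p_value = ttest_ind(ratings_2006, ratings_2005, equal_var=False)

# Significant at the 90% confidence level if p < 0.10; the sign of the
# t statistic indicates the direction of the change.
significant = p_value < 0.10
direction = "increase" if t_stat > 0 else "decrease"
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, significant: {significant}")
```

A significant positive result would be shown in blue in the tables; a significant negative result in red.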
Areas with the highest user satisfaction include the HPSS mass storage system, account and consulting services, DaVinci C/C++ compilers, Jacquard uptime, network performance within the NERSC center, and Bassi Fortran compilers.

Rating scale: 7=Very satisfied, 6=Mostly satisfied, 5=Somewhat satisfied, 4=Neutral, 3=Somewhat dissatisfied, 2=Mostly dissatisfied, 1=Very dissatisfied
Areas with the lowest user satisfaction include Seaborg batch wait times; PDSF disk storage, interactive services and performance tools; Bassi and Seaborg visualization software; and analytics facilities.
The largest increases in satisfaction over last year's survey are for the Jacquard linux cluster; Seaborg batch wait times and queue structure; NERSC's available computing hardware; and the NERSC Information Management (NIM) system.
The largest decreases in satisfaction over last year's survey are shown below.
Survey Results Lead to Changes at NERSC

Every year we institute changes based on the previous year's survey. In 2006 NERSC took a number of actions in response to suggestions from the 2005 user survey.
Users are invited to provide overall comments about NERSC:

113 users answered the question "What does NERSC do well?" 87 respondents stated that NERSC gives them access to powerful computing resources without which they could not do their science; 47 mentioned excellent support services and NERSC's responsive staff; 27 highlighted good software support or an easy-to-use user environment; and 24 pointed to hardware stability and reliability. Some representative comments are:

- "The computers are stable and always up. The consultants are knowledgeable. The users are kept well informed about what's happening to the systems. The available software is complete. The NERSC people are friendly."
- "NERSC runs a reliable computing service with good documentation of resources. I especially like the way they have been able to strike a good balance between the sometimes conflicting goals of being at the 'cutting edge' while maintaining a high degree of uptime and reliable access to their computers."
- "NERSC has a lot of computational power distributed in many different platforms (SP, Linux clusters, SMP machines) that can be tailored to all sorts of applications. I think that the DaVinci machine was a great addition to your resource pool, for quick and inexpensive OMP parallelization."
- "The preinstalled application packages are truly useful to me. Some of these applications are quite tricky to install by myself."
- "NERSC makes possible for me the extensive numerical calculations that are a crucial part of my research program in environmental geophysics."
- "I compute at NERSC to use fast machines with multiple processors that I can run simultaneously. It is a great resource."

72 users responded to "What should NERSC do differently?" In previous years the greatest areas of concern were dominated by queue turnaround and job scheduling issues. In 2004, 45 users reported dissatisfaction with queue turnaround times; in 2005 this number dropped to 24, and this year only 5 users made such comments.
NERSC has made many efforts to acquire new hardware, to implement equitable queueing policies across the NERSC machines, and to address queue turnaround times by allocating fewer of the total available cycles, and this has clearly paid off. The top three areas of concern this year are job scheduling, more compute cycles, and software issues. Some of the comments from this section are:

- "The move now is to large numbers of CPUs with relatively low amounts of RAM per CPU. My code is moving in the opposite direction. While I can run larger problems with very large numbers of CPUs, for full 3-D simulations, large amounts of RAM per CPU are required. Thus NERSC should acquire a machine with, say, 1024 CPUs but 16 or 32 GB RAM/CPU."
- "More adequate and equitable resource allocation based on what the user accomplished in the previous year."
- "Increased storage resources would be very helpful. Global file systems have been started and should be continued and improved."
- "The CPU limit on interactive testing is often restrictive, and a faster turnaround time for a test job queue (minutes, not hours) would help a lot."

67 users answered the question "How does NERSC compare to other centers you have used?" 41 users stated that NERSC was an excellent center or was better than other centers they have used. Reasons given for preferring NERSC include its consulting services and responsiveness, its hardware and software management, and the stability of its systems. Eleven users said that NERSC was comparable to other centers or gave a mixed review, and only four said that NERSC was not as good as another center they had used. Some users expressed dissatisfaction with user support, turnaround time, Seaborg's slow processors, the lack of production (group) accounts, HPSS software, visualization, and the allocations process.

Here are the survey results:
Page last modified: Wed, 02 Jul 2008 19:12:34 GMT
Page URL: http://www.nersc.gov/news/survey/2006/first.php
Web contact: webmaster@nersc.gov
Computing questions: consult@nersc.gov