2005 User Survey Results

Many thanks to the 201 users who responded to this year's User Survey. The respondents represent all six DOE Science Offices and a variety of home institutions: see Respondent Demographics. The survey responses provide feedback about every aspect of NERSC's operation, help us judge the quality of our services, give DOE information on how well NERSC is doing, and point us to areas we can improve. The survey results are listed below. You can see the 2005 User Survey text, in which users rated us on a 7-point satisfaction scale. Some areas were also rated on a 3-point importance scale or a 3-point usefulness scale.
The average satisfaction scores from this year's survey ranged from a high of 6.73 (very satisfied) to a low of 3.95 (neutral). See All Satisfaction Ratings. For questions that spanned the 2004 and 2005 surveys, the change in rating was tested for significance (using the t test at the 90% confidence level). Significant increases in satisfaction are shown in blue; significant decreases in satisfaction are shown in red.
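The significance test described above can be sketched as follows. This is a minimal illustration on hypothetical rating data (the actual survey responses are not reproduced here): a two-sample pooled-variance t test comparing one question's ratings across the two survey years, with the change flagged as significant when the statistic exceeds the two-tailed critical value at the 90% confidence level.

```python
# Sketch of the year-over-year significance test, using hypothetical data.
from statistics import mean, variance

ratings_2004 = [5, 6, 6, 7, 5, 6, 4, 6, 5, 7]  # hypothetical 7-point scores
ratings_2005 = [6, 7, 6, 7, 6, 7, 5, 7, 6, 7]

n1, n2 = len(ratings_2004), len(ratings_2005)

# Pooled-variance (Student's) two-sample t statistic.
pooled_var = ((n1 - 1) * variance(ratings_2004)
              + (n2 - 1) * variance(ratings_2005)) / (n1 + n2 - 2)
t_stat = (mean(ratings_2005) - mean(ratings_2004)) / (
    (pooled_var * (1 / n1 + 1 / n2)) ** 0.5)

# Two-tailed critical value for df = n1 + n2 - 2 = 18 at the 90% level,
# taken from a standard t table.
T_CRIT = 1.734
significant = abs(t_stat) > T_CRIT
increase = t_stat > 0  # 2005 mean higher than 2004 mean
```

With these (made-up) samples the 2005 mean is higher, so a significant result would be reported as an increase in satisfaction.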
Areas with the highest user satisfaction include the HPSS mass storage system, HPC consulting, and account support services:
Areas with the lowest user satisfaction include batch wait times on both Seaborg and Jacquard, Seaborg's queue structure, PDSF disk stability, and Jacquard performance and debugging tools:
The largest increases in satisfaction over last year's survey are shown below:
Only three areas were rated significantly lower this year: PDSF overall satisfaction and uptime, and the amount of time taken to resolve consulting issues. The introduction of three major systems in the last year, combined with a reduction in consulting staff, explains the latter.
Survey Results Lead to Changes at NERSC

Every year we institute changes based on the previous year's survey. In 2005 NERSC took a number of actions in response to suggestions from the 2004 user survey.
Users are invited to provide overall comments about NERSC:

82 users answered the question "What does NERSC do well?" 47 respondents stated that NERSC gives them access to powerful computing resources without which they could not do their science; 32 mentioned excellent support services and NERSC's responsive staff; 30 pointed to very reliable and well managed hardware; and 11 said everything. Some representative comments are:

"powerful is the reason to use NERSC"

"NERSC provides superior hardware computing capabilities, and professional user support and consulting. These two areas is where I see NERSC core strengths, here NERSC offers resources that are not easily matched by any local cluster or computer farm set up."

"Speed, data accessibility"

"There is a distinct PROFESSIONALISM in the way NERSC conducts business. It should be a model for other centers"

"Nersc is important because it combines state of the art HW/SW but most important the combination of state of the art HW/SW with execlent first class consulting/collaboration."

65 users responded to "What should NERSC do differently?" The areas of greatest concern are the inter-related ones of queue turnaround times (24 comments), job scheduling and resource allocation policies (22 comments), and the need for more or different computational resources (17 comments). Users also voiced concerns about data management, software, group accounts, staffing and allocations. Some of the comments from this section are:

"The most important improvement would be to reduce the amount of time that jobs wait in the queue; however, I understand that this can only be done by reducing the resource allocations."

"A queued job sometimes takes too long to start. But I think that, given the amount of users, probably there would be no efficient queue management anyway. Overallocation is a mistake."

"Long waits in queues have been a disaster for getting science done in the last few years. INCITE had a negative affect on Fusion getting its science work done. It's much better to have idle processors than idle scientists/physicists. What matters for getting science done is turnaround time. ..."

"Interactive computing on Seaborg remains an issue that needs continued attention. Although it has greatly improved in the past year, I would appeciate yet more reliable availability."

"Expand capabilities for biologists; add more computing facilities that don't emphasize the largest/fastest interconnect, to reduce queue times for people who want to runs lots of very loosely coupled jobs."

"More aggresively adapt to changes in the computing environment."

"NERSC needs to expand the IBM-SP5 to 10000 processors to replace the IBM-SP3"

"NERSC needs to push to get more compute resources so that scientists can get adequate hours on the machine"

51 users answered the question "How does NERSC compare to other centers you have used?" Twenty-six users stated that NERSC was an excellent center or was better than other centers they have used. Reasons given for preferring NERSC include its consulting services and responsiveness, its hardware and software management, and the stability of its systems. Twelve users said that NERSC was comparable to other centers or gave a mixed review, and seven said that NERSC was not as good as another center they had used. The most common reason for dissatisfaction with NERSC is the oversubscription of its computational resources and the resulting long wait times. Among PDSF users, the most common dissatisfaction was with disk instability.

Here are the survey results: