Respondent Summary
NERSC would like to thank all the users who participated in this year's survey. Your responses provide feedback about every aspect of NERSC's operation, help us judge the quality of our services, give DOE information on how well NERSC is doing, point us to areas we can improve, and show how we compare to similar facilities.
This year 177 users responded to our survey, compared with 138 last year. In general, user satisfaction with NERSC was higher than last year, rising by an average of 0.6 points on a 7-point scale across the 27 questions common to both years. The biggest increases in satisfaction were with the allocations process, the HPSS system, and the T3E. See FY 1998 to FY 1999 Changes.
On a scale from 7 (very satisfied) to 1 (very dissatisfied), the average scores ranged from a high of 6.6 for timely response to consulting questions to a low of 4.0 for PVP batch wait time. The areas users are happiest with this year are consulting services, HPSS reliability and uptime, and PVP and T3E uptime. Areas of concern are batch wait times on both the PVP and T3E systems, visualization services, the availability of training classes, and PVP resources in general. See the table that ranks all satisfaction questions.
The areas of most importance to users are the overall running of the center and its connectivity to the network, the available computing hardware and its management and configuration, consulting services, and the allocations process. Access to cycles is the common theme. See the Overall Satisfaction and Importance summary table.
In their written comments, users focused on NERSC's excellent support staff, its well-run center with good access to cycles (although users wish we could provide even more), its hardware and software support, and its reliable service. When asked what NERSC should do differently, the most common response was "provide even more cycles". Of the 52 users who compared NERSC to other centers, half said NERSC is the best or better than other centers, 23% simply gave NERSC a favorable evaluation or said they had used only NERSC, 19% said NERSC is about the same as other centers or gave a mixed evaluation, and only 4 (8%) said NERSC is not as good. Several sample responses below give the flavor of these comments; for more details see Comments about NERSC.
- "I have found the consulting services to be quite responsive, friendly, and helpful. At times they went beyond the scope of my request which resulted in making my job easier."
- "Provides reliable machines, which are well-maintained and have scheduling policies that allow for careful performance and scaling studies."
- "Provides a stable, user-friendly, interactive environment for code development and testing on both MP machines and vector machines."
- "It would be nice if there were fewer users, so turn-around time could be faster."
- "NERSC provides a well-run supercomputer environment that is critically important to my research in nuclear physics."
NERSC made several changes this past year based on the responses to last year's survey.
- We notified you of important changes and issues by email more frequently: this year 95% of users said they felt adequately informed, compared with 82% last year.
- We changed the way we present announcements on the web. (We have no comparison rating between the two years.)
- We restructured the queues on the Cray T3E: the satisfaction rating for T3E batch queue structure went up by one point (from 4.5 to 5.5).
- We added debug queues on all the Crays: last year we received two complaints in this area; this year, none.
Watch this section for changes we plan to implement next year based on this year's survey.
Below are the survey results. For the survey itself, click here.
- Overall Satisfaction and Importance
- User Information
- Visualization
- Consulting and Account Support
- Information Technology and Communication
- Hardware Resources
- Software Resources
- Training
- Comments about NERSC
- All Satisfaction Questions and FY 1998 to FY 1999 Changes