Meeting Date
May 21, 2013
Attendees
Jeff Campbell, Jeff Capuano, Ricardo Chavira, Ed Kairiss, Susan Kelley, Rick Smith, Susan West, Colleen Whelan
Agenda
Customer Satisfaction
Customer Satisfaction
Susan West led the discussion on Customer Satisfaction, focusing on the recently completed 2013 Community Satisfaction survey (see attached report). Formerly referred to as the Annual Survey, it went out to all faculty, students, and staff. The survey is likely to be repeated every other year, so we will refer to it as the Campus-wide Survey.
The Service Board has identified three sources of Customer Satisfaction data:
- Campus-wide Survey (Annual survey)
- may supplement with client-specific survey (like SSG survey), or service area-specific survey
- POD (Point of Delivery) surveys sent on Incident resolution
- Metrics
- Even if we do not have a specific customer satisfaction metric, some metrics correlate with, or serve as a proxy for, customer satisfaction, so we can derive some meaning from them. For example, speed to answer, resolution rates, uptime, and availability can all point to likely customer satisfaction.
- Each service should have at least one primary source (of the three listed above) and preferably a secondary source as well; it is unlikely that all services would have all three.
SSG Client Survey
- This started with Brian's group; may extend to Jane and Marc's groups.
- The desire was to reach out to clients more often than the campus-wide survey allows.
- The target audience is customers served by ITS Client Teams: not senior leadership, and probably not end users, but middle managers and team leads, i.e., clients who regularly interact with ITS Client Teams.
- For survey purposes, we'll want to focus their attention on those services covered by the Client Team
- Discussion:
- Do we need more specificity for questions regarding applications?
- Clients will likely know which services and applications are provided by their Client Team, but it may be useful to know how they rate different services or applications.
- Should we give a list of apps or services to narrow the scope? The full list of apps is large; can we narrow it down to those covered by the Client Team?
- Brian's input here will be especially helpful.
- As much as possible, we would like to use the same general questions from the SSG survey for Marc's and Jane's teams; this is good for consistency.
- Frequency of client surveys
- May be more frequent than annual (two per year?), in any case more frequent than the campus-wide survey
- Can also bundle this with the campus-wide survey during those years
- Client Team will need time between surveys to act on feedback received, communicate what was done
Response to Annual Surveys
- How should we respond?
- respond to individuals
- improve our services
- communications to public
- How much can we compare data from the survey with data from metrics and POD surveys?
- Will need to use multiple tools (different surveys and metrics) to derive satisfaction rates, while understanding that different tools give different information.
- Susan provided various reports and analyses of survey results; slides will be available soon on the Service Board wiki
POD surveys update - Jeff Capuano
- POD surveys in development; should be in production by end of June, ready to start in FY14
Scorecards on KPIs
- Jeff also reported that his area's KPI scorecards have plateaued at 91%. This is in part due to some KPIs that have been difficult to move forward:
- FPOC resolution rate (the goal is to have as many Incidents as possible resolved on first pass by Tier 1) - target is 80%, currently @ 76%
- Restoration Cost/Incident - target is $72, currently @ $74; focuses on labor costs, i.e., staff time devoted to Incident resolution
Follow up item for Service Owners
- Service Owners need to specify how they get their customer satisfaction data
- see list of services at http://isa.its.yale.edu/confluence/display/SN/Service+Board
- Service Owners should update the spreadsheet to indicate the Primary and Secondary sources of Customer Satisfaction data (Campus-wide Survey; POD Survey; Metrics).