Libraries are increasingly putting systematic processes in place to evaluate their services more effectively: to save costs, measure performance, develop their programmes, or better understand their users' needs. This list of 24 considerations comes from my consultancy work with numerous libraries. It is intentionally a very mixed bag, to address a variety of libraries.
It also refers to a new recommended publication on the topic: Dorner, D.G., Gorman, G.E. & Calvert, P.J. (2014). Information Needs Analysis: Principles and Practice in Information Organizations. London, UK: Facet Publishing.
1. Agree with your team on the goal of each evaluation exercise, then define the methods you will use to measure it.
2. Specify the target groups, and any sub-groups within them, you need to learn more about.
3. What do you need the needs analysis and assessment for?
- To assess the needs of an under-served audience
- To scope a particular problem
- For gap analysis
- For service development or policy planning
- For strategic planning and priority-setting
- For more efficient resource allocation
- To justify funding
- As part of institutional assessment
- As part of performance management exercises
- To improve client relations
4. Strive for a more complete picture by involving team members and related staff to come up with diverse questions for further investigation.
5. What do you want to know, and in which context?
- What your researchers need. What is the gap between the present state and the desired one?
- What pains your researchers? What problems are at play?
- What concerns your clients have, e.g. what your researchers do not want
- What researchers’ perceptions are
- What opportunities there are
- What the uptake of certain services is
- What new services you should offer
- How to improve current services
- What risks or threats there are to your initiative
- How to make changes to current services without being too disruptive
- Data to substantiate discontinuing a current service
- How to expand knowledge in a certain area
6. Who has experience with evaluation to help you with methodologies? You don’t need to go it alone: libraries are increasingly collaborating with others. How can you liaise with Student Affairs or Doctoral Schools for example?
7. Try to ensure the scientific quality of your surveys and focus groups by having your plans reviewed; this builds credibility with your community and increases traction. Collaborate, for example, with social scientists and close peers in faculty who can support you in your efforts.
8. Identify what other evaluation exercises are going on in your institution. Can you piggy-back off another evaluation exercise within a faculty?
9. Success is often in the timing. Consider the busiest and quietest periods: when is it best to evaluate what, and when will you get the highest response rates? Be wary of survey fatigue, and so be aware of other exercises happening in parallel or in close proximity.
10. Analyse the priorities, concerns and impact of key stakeholders in preparation for qualitative work. Carry out a short stakeholder analysis. See my earlier blog post on this topic here.
11. Consider the values of your clients in your information needs analysis.
12. What are the assumptions of your clients about you and your service/s? Build these into your information needs questions.
13. Are you going to evaluate the knowledge of your users on a particular subject to help develop a particular service? If so, are you going to carry out a needs assessment at recurring intervals to see how far you’ve achieved learning outcomes / goals?
14. Qualitative interviews, focus groups and observations will bring you far more insight into the particularity of a certain context and individual/s. Home in on the context in your interviews.
15. Use the language of your target group; no jargon, slang or trigger words such as ‘sorry’ or ‘but’; remain culturally sensitive and neutral.
16. Analyse external data. For example, use demographic data to put your data in the broader context and benchmark against other organisations. What are they doing to solve problem x, in service y?
17. Analyse internal data
- Input data can include income and expenditures, staff, collection size, library information system and space.
- Output data or performance indicators can include the use of specific services (e.g. number of enquiries, or number of users in a particular context such as a discipline or career path), quality indicators, and collection use, including ILL, online catalogue, portal or database use, and IR downloads or uploads. Social media analytics can surface further data on the use of services, e.g. through bookmarking.
Note that statistics on the above do not usually evaluate the quality of services.
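As a minimal sketch of what tallying such output indicators can look like in practice, the snippet below counts enquiries per month and per user group. The record structure and field names here are hypothetical; in a real library you would export this data from your library information system.

```python
from collections import Counter

# Hypothetical enquiry records exported from a library system.
# Each record notes the month of the enquiry and the user group.
enquiries = [
    {"month": "2014-01", "group": "PhD"},
    {"month": "2014-01", "group": "Masters"},
    {"month": "2014-02", "group": "PhD"},
    {"month": "2014-02", "group": "PhD"},
]

# Output indicator 1: number of enquiries per month.
per_month = Counter(e["month"] for e in enquiries)

# Output indicator 2: number of enquiries per user group,
# e.g. to compare uptake across disciplines or career paths.
per_group = Counter(e["group"] for e in enquiries)

print(dict(per_month))  # {'2014-01': 2, '2014-02': 2}
print(dict(per_group))  # {'PhD': 3, 'Masters': 1}
```

As the note above says, counts like these measure use, not quality; they are a starting point for questions, not an end point.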
18. With statistics, less is more. Home in on analysing just the statistics that prove the point you want to make.
19. Consider conducting usability testing for a new portal or online information system, or to evaluate a current system.
20. Consider the audience of your evaluation report and write it to fit with their priorities (use your stakeholder analysis).
21. Share internal experiences: get library staff to share highlights, lessons learnt, and good and bad practices on themes to evaluate and build your knowledge.
22. Think about having team evaluations of your programmes before prioritising for your strategic plan. Proud2Know helps facilitate these workshops.
23. Consider setting KPIs (Key Performance Indicators) for your library to help you measure how far you have achieved your goals and objectives at specific given times.
24. Consider developing an evaluation plan that specifies how you will evaluate your services systematically, rather than on an ad hoc basis.
What evaluation methods are you using for what purpose? How systematic are they? Please share your additions below.