Introduction
Library data services are an emerging and rapidly evolving field, often influenced by the unique mix of systems, partnerships, and research activities on each campus. As a result, services are highly customized based on the size, structure, and needs of individual institutions. With limited resources and expertise, libraries must identify which gaps they are best positioned to fill, making strategic assessment essential for effective planning and service delivery.
Because data services can include everything from advising on Data Management and Sharing Plans (DMSPs) to supporting lab data collection or offering instruction in tools like Python, regular assessment is essential for deciding what to offer and for building campus buy-in. Data services remain a relatively new area, and many institutions are still determining how to define, assess, or even launch them as staff gain relevant training. Like other areas of librarianship, data services need ongoing evaluation to stay responsive and effective.
The Research Data Access and Preservation (RDAP) 2025 Summit, themed "Evolutions in Data Services: Forging Resiliency," reflected this need. Many presentations treated assessment as the first step toward evolving services in strategic, sustainable ways: not just a routine task, but a tool for adapting, demonstrating impact, and aligning data services with institutional priorities.
Assessment Tools and Tactics: Lessons from Peer Institutions
Presenters at RDAP 2025 shared a variety of approaches to assessing data services, shaped by their institutional contexts and goals.
Lytle, Wham, and Moberly from Penn State conducted a scoping review of research data management (RDM) needs assessment tools, with the goal of creating a reusable dataset of survey and interview questions about researcher data needs and perceptions of data services (Lytle, Wham, and Moberly 2025). Their work emphasized how researchers’ needs evolve over time as funding trends change. By reusing existing assessment questions, institutions can conserve the resources typically spent on survey design. A possible next step for the project is a publicly accessible question bank, which could help other institutions design their own assessments more efficiently.
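As a rough illustration of what such a question bank might look like in practice, the Python sketch below stores tagged questions and filters them by topic and instrument type. The schema and the sample entries are assumptions for illustration only; they are not drawn from the Penn State dataset.

```python
# A minimal sketch of a reusable assessment question bank. The fields
# (topic, instrument, source) and all entries are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class AssessmentQuestion:
    text: str        # the survey or interview question itself
    topic: str       # e.g., "storage", "documentation", "DMSP support"
    instrument: str  # "survey" or "interview"
    source: str      # citation for the assessment the question came from

QUESTION_BANK = [
    AssessmentQuestion(
        "Where do you currently store your active research data?",
        "storage", "survey", "Example University needs assessment (placeholder)"),
    AssessmentQuestion(
        "Walk me through how your lab documents a typical dataset.",
        "documentation", "interview", "Example College listening tour (placeholder)"),
    AssessmentQuestion(
        "Have you been asked to submit a DMSP with a grant proposal?",
        "DMSP support", "survey", "Example Institute survey (placeholder)"),
]

def select_questions(topic: str, instrument: str) -> list[AssessmentQuestion]:
    """Return bank questions matching a topic and instrument type."""
    return [q for q in QUESTION_BANK
            if q.topic == topic and q.instrument == instrument]

if __name__ == "__main__":
    # Assemble a short storage-focused survey from existing, vetted questions.
    for q in select_questions("storage", "survey"):
        print(q.text, "--", q.source)
```

A bank like this lets an institution assemble a locally tailored instrument from questions that have already been tested elsewhere, which is the resource saving the presenters described.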
In contrast, Mueller described how NC State University Libraries analyzed a decade’s worth of coded feedback on Data Management Plans (DMPs) to identify persistent challenges and shape more targeted support services (Mueller 2025). Common gaps included repository selection, file format specification, and misplacement of content within the DMP, pointing to areas where researchers consistently need guidance.
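A simplified sketch of this kind of longitudinal code analysis appears below: tallying reviewer-applied feedback codes across years to separate persistent gaps from one-off issues. The codes and records shown are placeholders, not NC State's actual data or coding scheme.

```python
# A minimal sketch of tallying coded DMP feedback over time, loosely
# modeled on the NC State approach. All records below are illustrative.
from collections import Counter

# Each record: (year, feedback code applied by a reviewer)
coded_feedback = [
    (2015, "repository selection"),
    (2015, "file format specification"),
    (2016, "repository selection"),
    (2016, "content misplaced in DMP"),
    (2017, "repository selection"),
    (2017, "file format specification"),
]

# Total occurrences of each code across the full span.
totals = Counter(code for _, code in coded_feedback)

# Codes that recur in many distinct years point to persistent challenges
# worth targeted instruction or template changes, not one-off problems.
years_per_code = {code: len({y for y, c in coded_feedback if c == code})
                  for code in totals}

for code, n in totals.most_common():
    print(f"{code}: {n} occurrences across {years_per_code[code]} year(s)")
```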
Another approach, presented by McCall from Tufts University, involved conducting a "listening tour" with faculty and graduate students (McCall 2025). Participants were identified through platforms like OSF, Dataverse, and DMPTool, as well as through recently awarded grants and faculty profiles. Individualized email invitations led to a high response rate among faculty, while outreach to graduate students via email lists and incentives had limited success. Semi-structured, one-hour interviews were conducted throughout the semester, gathering information on project types, data needs, current practices, concerns, and awareness of campus data services. This method was designed not only to gather insights for assessment, but also to foster relationships and raise visibility for library support. Although time- and labor-intensive, this approach prioritized depth and community engagement. The Tufts team plans to use the findings to guide future outreach.
Taken together, these examples highlight the value of both quantitative and qualitative approaches to assessment. Reusable survey tools, like those proposed by the Penn State team, offer a scalable model that supports cross-institutional benchmarking. In contrast, in-depth interviews such as those conducted by Tufts provide rich, contextual insights and build relationships that surveys may miss. Depending on institutional goals, researcher population, and available capacity, a mixed-methods approach may offer the most comprehensive understanding of campus data needs.
A Multi-Institutional Study of Data Service Gaps and Opportunities
One of the most comprehensive assessments discussed at RDAP 2025 came from Ithaka S+R’s multi-institutional effort to better understand researcher needs and challenges across a range of institutions (McCracken and MacDougall 2025). With 29 institutions participating, the study focused on fostering cross-campus collaboration using a consistent qualitative method.
The research team posed three core questions:
1. What needs and challenges do researchers face when working with data?
2. What campus resources have they used to support those needs?
3. How do they perceive the value of those resources?
To answer these questions, data service providers conducted 294 interviews, stratified by institution, discipline, and rank. A subset of 41 interviews was then analyzed using thematic coding to extract deeper insights.
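The presentation did not detail how that subset was selected, but the sketch below illustrates one generic way to subsample interviews for deep coding while preserving coverage across strata such as discipline and rank. Every record, field, and parameter here is hypothetical.

```python
# A generic sketch of stratified subsampling for qualitative coding.
# This is not Ithaka S+R's actual procedure; it only illustrates the idea
# of keeping each stratum represented when drawing a smaller sample.
import random
from collections import defaultdict

def stratified_sample(interviews, key_fields, k_per_stratum, seed=42):
    """Pick up to k interviews from each stratum defined by key_fields."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for record in interviews:
        stratum = tuple(record[f] for f in key_fields)
        strata[stratum].append(record)
    sample = []
    for records in strata.values():
        rng.shuffle(records)                 # randomize within the stratum
        sample.extend(records[:k_per_stratum])
    return sample

# Illustrative records only.
interviews = [
    {"id": 1, "discipline": "biology", "rank": "faculty"},
    {"id": 2, "discipline": "biology", "rank": "grad"},
    {"id": 3, "discipline": "history", "rank": "faculty"},
    {"id": 4, "discipline": "biology", "rank": "faculty"},
]

subset = stratified_sample(interviews, ("discipline", "rank"), k_per_stratum=1)
print([r["id"] for r in subset])
```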
Common challenges included interoperability issues among tools used by different research teams, and a general lack of preparedness for data curation. Researchers frequently expressed a need for more personalized services, clearer instruction, and better coordination of data services across their institutions.
A key insight from the study was that many researchers had only a limited understanding of the term “data services.” As a result, they frequently turned to peers or outside consultants instead of institutional support. In cases where they were aware of available institutional services, researchers sometimes found them slow or lacking in relevant expertise.
The study resulted in practical recommendations for a range of stakeholders, including libraries, IT, research offices, and funders. For libraries specifically, suggestions included expanding dataset collections, offering individualized consultations, demystifying funder requirements, strengthening relationships with sponsored projects and research offices, and embedding data publication consultations into Institutional Review Board (IRB) workflows.
Studies like this offer libraries a dual benefit: they provide benchmarks for national comparison while also helping shape local strategy. For example, an institution might use the findings to develop a branded, flexible repository tailored to local researcher needs while drawing on common patterns identified through cross-institutional collaboration (McCracken and MacDougall 2025).
From Assessment to Action: Strategic Takeaways
The Penn State team’s work on reusable survey questions (Lytle, Wham, and Moberly 2025) and Ithaka S+R’s cross-institutional analysis (McCracken and MacDougall 2025) offer valuable models for scalable, customizable assessment. These tools enable libraries to tailor their strategies to local needs while contributing to broader benchmarking efforts, allowing institutions to build on proven methods rather than start from scratch.
A key direction for the field may be embedding assessment directly into service workflows. Developing modular tools that scale well, from targeted departmental surveys to institution-wide instruments, can streamline processes and allow libraries to adapt quickly as needs evolve. Qualitative approaches like interviews and listening tours (McCall 2025) also serve a dual purpose: gathering insight while building relationships and raising awareness, particularly in contexts where “data services” remain unfamiliar or misunderstood.
Assessment plays both a practical role in shaping library services and a strategic role in supporting broader institutional priorities. Presentations at RDAP emphasized that well-designed assessments can support funding requests, staffing decisions, and cross-campus collaboration with units like IT, research administration, and sponsored programs. Integrating assessment into routine activities such as consultations, IRB support, or data deposit processes can reduce the need for separate outreach and increase both efficiency and impact.
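As a concrete, if hypothetical, example of building assessment into routine work, the sketch below appends a small structured record after each consultation so that usage data accumulates without separate outreach. The fields and file format are assumptions for illustration, not a documented system.

```python
# A minimal sketch of logging an assessment data point during a routine
# consultation workflow. The schema and CSV format are assumptions.
import csv
from datetime import date
from pathlib import Path

LOG = Path("consultation_log.csv")
FIELDS = ["date", "patron_type", "topic", "referral_source", "follow_up_needed"]

def log_consultation(patron_type: str, topic: str,
                     referral_source: str, follow_up_needed: bool) -> None:
    """Append one consultation record to a running CSV log."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()  # write the header once, on first use
        writer.writerow({
            "date": date.today().isoformat(),
            "patron_type": patron_type,
            "topic": topic,
            "referral_source": referral_source,
            "follow_up_needed": follow_up_needed,
        })

# Example: record a DMSP consultation referred by the sponsored programs office.
log_consultation("faculty", "DMSP review", "sponsored programs", True)
```

Even a lightweight log like this yields counts by topic and referral source that can feed directly into funding requests and staffing decisions.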
Conclusion
Because data services vary widely based on institutional context, libraries must use assessment methods to guide their development and evolution. Presentations at RDAP 2025 highlighted both qualitative and quantitative strategies for identifying service gaps and aligning support with researcher needs.
By integrating assessment into regular workflows and drawing on shared tools developed across institutions, libraries can strengthen the focus and adaptability of their data services. As researcher expectations evolve and funder requirements change, assessment can help libraries shift from reactive support to more intentional, forward-looking service development.
In this way, assessment serves not only as a planning mechanism but also as a means of communicating value and building ongoing partnerships within the research community.
References
Lytle, Melissa, Emily Wham, and Heather Moberly. 2025. “A Scoping Review to Collect and Disseminate Survey and Interview Measures of Researcher Data Management Needs and Perceptions of Data Management Services.” Poster presented at the Research Data Access and Preservation (RDAP) 2025 Summit, March 11–13. https://osf.io/bnpyf.
McCall, Erin. 2025. “We’re New Here, Want to Talk? Conducting a Listening Tour as the First Step in RDM Outreach.” Presentation at the Research Data Access and Preservation (RDAP) 2025 Summit, March 11–13. https://osf.io/3un7d.
McCracken, Catherine, and Rachel MacDougall. 2025. “Understanding Researcher Needs and Challenges: Findings from a Qualitative Study on Research Data Services.” Presentation at the Research Data Access and Preservation (RDAP) 2025 Summit, March 11–13. https://osf.io/z2s78.
Mueller, Kristine. 2025. “Lessons Learned: Looking Back on 10 Years of DMP Feedback.” Presentation at the Research Data Access and Preservation (RDAP) 2025 Summit, March 11–13.