Introduction
In many colleges, undergraduate students have the opportunity to engage in real-world research and engineering design projects through interdisciplinary, faculty-led teams (Aazhang et al. 2017). These projects often bring together faculty, graduate students, undergraduates, and sometimes industry partners to collaborate on multi-faceted research and design tasks. As a result, these collaborative environments produce an increasing variety and volume of data, often involving the sharing and reuse of diverse datasets among collaborators with varying levels of experience with data (Wallis et al. 2013; Zilinski et al. 2014), which creates several challenges.
These challenges include ethical considerations with respect to decisions derived from data (Beever & Brightman 2016), difficulties in facilitating cross- and inter-team data sharing (Tenopir et al. 2011), and challenges in ensuring reproducible research (Rozier & Rozier 2014). Other issues involve adhering to disciplinary data standards (Kalichman et al. 2015), implementing reliable storage and backup systems (Schuster & Birdsong 2006), managing data fragmentation and version control (Wilkinson et al. 2016), navigating short project timelines and student turnover (Schuster & Birdsong 2006), and the widespread lack of formal training in data management and stewardship practices (Carlson & Stowell-Bracke 2013; Shao et al. 2021).
To design effective data information literacy support in collaborative research settings, whether through institutional services, librarian partnerships, or instructional materials, it is essential to understand how research data are currently managed, shared, and used throughout the research process, and what challenges researchers face in the specific contexts in which they work with these data.
Undergraduate students working in these collaborative research projects are typically supported by a combination of faculty and graduate mentors, who work together to guide their learning and development. However, a study conducted at two research-intensive institutions found that more than 60% of undergraduate researchers were mentored primarily by graduate students (Ahn et al. 2013), highlighting graduate students’ central role as mentors, educators, and technical facilitators in the undergraduate research experience (Ahn et al. 2013; Dolan & Johnson 2009). In the context of undergraduate engineering research, graduate students are often key members of the research group and, as Carlson, Sapp Nelson, et al. (2015, 15) put it, they are “on the frontline of the research process.” Hence, the graduate mentors’ perspectives and challenges must be further understood.
This study explores how graduate mentors describe the data information literacy practices of the undergraduate engineering research teams they support and identifies the challenges those teams face across the research data lifecycle. The study is guided by the following research question: How do graduate mentors describe undergraduate engineering students’ engagement with the data lifecycle of their research projects, and what challenges do they observe within their teams?
Data Information Literacy and Stewardship in Undergraduate Research
The goal of undergraduate research is to immerse students in meaningful, large-scale research experiences under the guidance of faculty and graduate student mentors. These experiences are intended to develop students’ understanding of research methodology, foster independence of thought, reinforce classroom concepts through real-world application, and build skills in research data collection, analysis, and management (Dolan & Johnson 2009; Petrella & Jung 2008).
As students learn to practice responsible data stewardship, they prepare for more advanced research roles where managing data over time and ensuring its quality and integrity are essential (Fitsilis et al. 2024). Data stewardship refers to the responsible management of data throughout all stages of a research project: before, during, and after its formal scope (Wendelborn et al. 2023). This concept is underpinned by the FAIR principles, which emphasize that data should be findable, accessible, interoperable, and reusable (Demchenko & Stoy 2021; Wendelborn et al. 2023). Closely aligned with this is the research data lifecycle, a framework that guides researchers through steps such as data planning, acquisition, processing, analysis, preservation, and sharing (Bossaller & Million 2022). Together, these models provide a foundation for promoting data responsibility and are important in educating undergraduate students on data management and stewardship practices.
Despite the educational emphasis on undergraduate research, one of the persistent challenges is supporting students in practicing effective data stewardship. Difficulties have been documented across all stages of the research data lifecycle. During data collection, for example, students may overlook ethical considerations such as informed consent in their urgency to complete research tasks (Bossaller & Million 2022). In later stages like processing, analysis, and preservation, student researchers often struggle with proper citation practices, understanding the value of formal data repositories, and producing adequate metadata to accompany their project materials (Burress 2022; Carlson et al. 2011). As the primary advisors in many undergraduate research settings, graduate mentors often play a significant role in supporting a range of data literacy and management competencies.
However, few studies have examined the specific contributions of graduate mentors to undergraduate research education (Dolan & Johnson 2009). Existing research tends to focus on mentor-mentee relationships, the reciprocal benefits of collaboration, or the influence of undergraduate involvement on graduate students (Ahn et al. 2013), with limited attention to how graduate mentors support students in developing data management and stewardship practices. Investigating graduate mentors’ perspectives on undergraduate data practices offers an opportunity to inform targeted and effective data information literacy support within undergraduate research environments.
Methods
Context
This study draws on interviews conducted in fall 2024 with two graduate mentors supporting undergraduate engineering research teams in the Vertically Integrated Projects (VIP) program. The VIP program is an initiative that enables students to earn academic credit while engaging in extended, authentic research and design projects. VIP teams are interdisciplinary and vertically integrated (first-year through senior students), mentored by faculty and graduate students, and aligned with active faculty research areas and industry-sponsored challenges. Before initiating the research activities, we obtained approval from the Institutional Review Board at our university. The study was approved as exempt under protocol No. IRB-2023-2, as it involved research conducted within the context of normal educational practices that are not likely to adversely impact students’ opportunity to learn.
Participants
Two graduate mentors participated in the study. These two participants were selected through a voluntary response process following a general invitation sent to graduate mentors of VIP teams. Both volunteered to participate and were compensated for their time.
Mentor Arun (pseudonym) was a computer engineering major enrolled in a 4+1 program, completing his undergraduate degree and beginning his graduate studies in the same field. He began mentoring in the summer of 2023 and led the Electrical Engineering Education Website team. His team had participated in the VIP program for two semesters (spring and fall 2024), with no repeat undergraduate students, as the team composition changed each semester. The fall team consisted of fifteen undergraduate students, divided into software and hardware sub-teams, each led by two undergraduate student leads. The graduate mentor was the only overall support for both sub-teams. Arun reported to the faculty member who owned the project.
Mentor Bryan (pseudonym) was a second-year PhD student in computer engineering, completing his third semester of graduate studies. He led the Drone Racing Algorithm team and had been mentoring in the VIP program since the beginning of his graduate studies, for a total of three semesters. His team consisted of nine undergraduate students with varied technical experience. Bryan was the only technical support, and he led weekly meetings to support the undergraduate students’ progress.
Data Sources
The interview protocol was guided by the Data Information Literacy Toolkit. Specifically, we adapted the Faculty Interview Worksheet (Carlson, Sapp Nelson, et al. 2015), which includes modules addressing nine areas of data information literacy: (1) Dataset Description, (2) Lifecycle of the Dataset, (3) Learning about Data, (4) Acquiring External Data, (5) Formats, (6) Tools, (7) Organization and Description of Data, (8) Cultural Practices and Ethical Behavior, and (9) Curation and Preservation. Each interview lasted approximately 50 minutes; one was conducted in person and the other via Zoom. Audio recordings were transcribed for analysis.
Data Analysis
The transcribed audio recordings were analyzed using a qualitative descriptive approach to provide an account of each team’s data information literacy practices. We employed a case-based analysis: each transcript was read to understand the mentor’s perspective on undergraduate engagement with data across the research lifecycle. The analysis was structured around the modules of the Data Information Literacy (DIL) Toolkit (Carlson, Sapp Nelson, et al. 2015), which informed both the interview protocol and the organization of the results. For modules that included scaled items, we extracted the mentors’ 1–5 importance ratings and visualized them as a radar chart using Python in a Google Colaboratory notebook. For all modules, we developed descriptive summaries to capture each mentor’s experiences.
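A radar chart of the kind described above can be produced with a short script. The following is a minimal sketch using matplotlib's polar projection; the module labels and ratings shown here are placeholders for illustration, not the actual interview data.

```python
import math
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Placeholder DIL competency labels and 1-5 importance ratings (hypothetical,
# not the study's actual data).
modules = ["Metadata & Data Description", "Databases & Data Formats",
           "Data Conversion & Interoperability", "Data Processing & Analysis",
           "Data Visualization & Representation", "Data Management & Organization",
           "Curation & Preservation"]
mentor_a = [4, 4, 3, 3, 3, 5, 3]   # placeholder ratings
mentor_b = [4, 4, 5, 5, 5, 4, 5]   # placeholder ratings

# One angle per competency; repeat the first angle to close the polygon.
angles = [2 * math.pi * i / len(modules) for i in range(len(modules))] + [0.0]

fig, ax = plt.subplots(subplot_kw={"polar": True})
for ratings, label in [(mentor_a, "Mentor Arun"), (mentor_b, "Mentor Bryan")]:
    values = ratings + ratings[:1]          # close the polygon
    ax.plot(angles, values, label=label)
    ax.fill(angles, values, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(modules, fontsize=7)
ax.set_ylim(0, 5)
ax.legend(loc="lower right")
fig.savefig("dil_radar.png", bbox_inches="tight")
```

With real ratings substituted in, this reproduces the layout of Figure 1: one axis per competency, with each mentor's ratings drawn as a closed polygon.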
Results
Graduate Mentor Ratings of DIL Competencies
The Data Information Literacy Toolkit (Carlson, Sapp Nelson, et al. 2015) included scaled items for the mentors to report the importance of specific data competencies. Figure 1 presents a visualization of the graduate mentors’ ratings of the importance of scaled DIL competencies for undergraduate students. Both graduate mentors rated Metadata & Data Description and Databases & Data Formats highly, each at 4 out of 5.
Mentor Bryan (Drone Racing Algorithms) consistently rated nearly all competencies at the upper end of the scale (4 or 5), particularly emphasizing Data Conversion and Interoperability, Data Processing and Analysis, Data Visualization and Representation, and the Curation and Preservation modules. Mentor Arun (Electrical Engineering Education Website) offered more varied ratings, with the highest importance assigned to Data Management and Organization and moderate scores across most other areas. These comparative ratings highlight differences in mentoring emphasis and suggest that how data information literacy is prioritized varies with factors such as project type and mentor expectations.

Figure 1: A radar chart comparing two graduate mentors’ perceived importance of Data Information Literacy skills for undergraduate researchers. Each axis represents a competency area, with scores ranging from 1 (not important) to 5 (essential).
Graduate Mentor Descriptions of Undergraduate Data Practices Across DIL Modules
The summary below offers a qualitative comparison of how each VIP graduate mentor described undergraduate engagement with the data lifecycle. The descriptions are organized around the nine modules of the DIL Toolkit, and they reveal project-specific practices.
Team 1, supported by Mentor Arun, contributed to the development of an interactive electrical engineering education website used in the Introduction to Digital Design course in the Electrical and Computer Engineering (ECE) school. The team included fifteen undergraduates, divided into hardware and software sub-teams, each led by two undergraduate team leads selected based on seniority or demonstrated initiative. The team maintained and extended a private GitHub-hosted codebase, managed user design files on an internal storage server, and collected student feedback through Brightspace. A designated undergraduate team member manually parsed and summarized the feedback into reports that were used to inform development work. While software development followed a structured review and testing process, including testing with enrolled ECE students, hardware development tasks were managed separately, with some hardware-related code containerized using Docker.
Team 2, supported by Mentor Bryan, worked on drone racing algorithms in a simulated environment. The team included nine undergraduate students with varied levels of technical experience. Their primary task was to develop and refine drone control algorithms in Python to improve race completion times through iterative testing. Students relied on performance metrics and simulation videos to evaluate algorithm behavior and make data-driven improvements. The team used GitHub for version control and Google Drive to store simulation outputs. They also reviewed baseline performance data from other university teams using the same simulation platform. Weekly meetings provided a space for performance review, discussion of optimization strategies, and alignment with project goals.
Summary of Graduate Mentor Descriptions of Undergraduate Data Practices across Data Information Literacy Modules
| DIL Module | Mentor Arun: Electrical Engineering Education Website | Mentor Bryan: Drone Racing Algorithms |
|---|---|---|
| Dataset Description | Three datasets: website code (GitHub), user design files (internal servers), and student feedback (Brightspace LMS). | Three datasets: algorithm code (GitHub), simulation videos (Google Drive), and performance metrics (local JSON files). |
| Lifecycle of the Dataset | Follows an iterative process: work packages are generated from student feedback, faculty requests, or bugs → software team students implement code changes using GitHub pull requests → hardware team students test physical system integration, ensure connectivity, and update wiring diagrams and sensor placements as needed → updates are tested and internal documentation is maintained → the full team reviews implementation outcomes and new feedback during weekly meetings → based on this review, work packages are revised or added, and the cycle repeats until tasks are completed and system functionality is achieved. | Follows an iterative process: students begin with background research to identify baseline metrics from previous research teams → implement initial drone control algorithms in Python → test them in a simulated environment to generate performance metrics and video recordings → analyze drone behavior, assessing stability, speed, and race completion time → review performance outcomes during team meetings and discuss areas for improvement → based on these discussions, algorithms are refined or re-implemented, and the process repeats until optimal performance is achieved or deadlines are met. |
| Learning about Data | Students learn through weekly meetings, where tasks are assigned and progress is reviewed. Mentor guidance focuses on translating student feedback into feature requests and clarifying project expectations. Students also learn from firsthand experience with tools like GitHub and Open Project. Feedback summaries serve as a learning opportunity for interpreting user input, though students occasionally focus on general impressions rather than extracting actionable insights. | Students learn through team discussions, mentor-led explanations, and practical coding assignments. The mentor emphasizes the importance of performance metrics and encourages students to focus on measurable outcomes. Learning also includes reviewing baseline research from other institutions and refining algorithms based on simulation data analysis. The mentor reinforces the role of data in validating design decisions and driving iterative improvements. |
| Acquiring External Data | No external data used. Students rely only on internal sources, including Brightspace feedback, bug reports, and faculty guidance. There is no emphasis on seeking external datasets because the project is confidential. | Students review and learn from external data shared by other drone racing teams and universities using the same simulation platform. These external benchmarks serve as performance baselines for comparison. Students are encouraged to cite and replicate prior approaches as a starting point and use them to guide improvements in their algorithm design. |
| Formats | Code is stored and managed in a private GitHub repository. File types include JavaScript and Python, particularly from user-submitted testing files. Documentation is maintained using Markdown files and internal wikis. Task management is handled via Open Project, hosted on a lab server. Students in the VIP program are expected to maintain individual design logs. Submitted design files vary in structure and are stored on a private server; some are bundled and transferred internally using .zip files. Brightspace is used to collect user feedback, which is summarized into PowerPoint slides for internal review and translation into work packages. | The code is written in Python and managed through GitHub for version control. Simulation output data, such as speed, velocity, and position metrics, are stored in JSON format. Performance behaviors are also recorded and reviewed via MP4 simulation videos, which are stored in Google Drive. Students generate data visualizations using Python plotting libraries and incorporate them into presentations, typically as image files or slides. Documentation occurs through in-code comments and GitHub commit history to ensure clarity for internal use. While code is publicly accessible on GitHub, the simulation videos are private. |
| Tools | GitHub for version control. Open Project for task management. Brightspace for collecting anonymous student feedback. GitHub Actions for automating deployments. Microsoft PowerPoint for presenting parsed feedback summaries during team meetings. | GitHub for version control. Google Drive for storing simulation video files. |
| Organization and Description of Data | Data are primarily organized through GitHub and Open Project. Code is versioned via branches and pull requests, with varying levels of inline documentation. Markdown files and wiki-style internal documentation help structure content, but consistency is lacking. VIP students keep individual design logs, though their completeness varies. Naming conventions and file structure are informally managed and rely on the mentor’s oversight and periodic team reminders. | Students organize data using structured JSON files to label and record simulation metrics. Each simulation result includes labeled key-value pairs that are visually interpreted through graphs and video reviews. Code is tracked via GitHub, with students responsible for maintaining understandable commit histories and in-code comments. The mentor emphasizes clarity in data presentation, aiming to make outputs understandable for other team members. Formal metadata standards are not used, but internal consistency is prioritized. |
| Cultural Practices and Ethical Behavior | Privacy and confidentiality are introduced during onboarding. Students are explicitly told not to share the website, codebase, or any user-submitted content. The project is not open source and is treated as confidential due to its academic and instructional value. There is no formal ethics training, but the mentor reinforces behavioral expectations in meetings. Attribution practices are not emphasized since the project outputs are internal. Students are expected to cite sources for technical contributions in formal presentations if needed. | All data is private and cannot be shared. Ethical behavior is framed around respecting project boundaries and proper attribution of external resources. The mentor sets expectations during orientation, but there is no structured or ongoing ethics curriculum. Citation practices are discussed when students use external algorithms or data sources, with the IEEE citation style used in formal contexts. Internally, hyperlinks or informal mentions are accepted for team communication. |
| Curation and Preservation | Curation is informal and focused on active development. Code changes are preserved via GitHub, which allows rollback and collaborative editing. Design files and logs are stored on internal servers without structured archiving. Student feedback data is only retained short-term: it is summarized and used to create work packages, then discarded. Preservation efforts prioritize current project needs rather than long-term reuse or public sharing. | Code is curated through GitHub with detailed commit histories, which allows reproducibility and reuse within the team. Simulation videos are stored in Google Drive but not archived in any long-term repository. The mentor encourages students to format data so others can replicate results, though curation is informal. No official preservation policy exists, and data is retained as long as it serves current team functions. |
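As an illustration of the JSON-based organization Mentor Bryan described, a single simulation run's metrics could be recorded as labeled key-value pairs along the following lines. The field names and values below are hypothetical, not taken from the team's actual files.

```python
import json

# Hypothetical record of one simulation run's metrics; the team's actual
# field names and structure are not documented in this study.
run_metrics = {
    "run_id": "sim_042",
    "algorithm_version": "v1.3",
    "race_completion_time_s": 58.7,
    "max_speed_m_s": 12.4,
    "positions": [[0.0, 0.0, 1.5], [3.2, 0.1, 1.6]],  # sampled x, y, z
}

# Writing one labeled JSON file per run keeps each result self-describing.
with open("sim_042.json", "w") as f:
    json.dump(run_metrics, f, indent=2)

# Later, reload the file to compare completion times across runs.
with open("sim_042.json") as f:
    loaded = json.load(f)
print(loaded["race_completion_time_s"])  # → 58.7
```

Labeled records like this support the internal consistency the mentor prioritized: teammates can interpret a run's metrics without consulting its author, even without a formal metadata standard.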
Graduate Mentor Perspectives on Data-Related Challenges
Graduate mentors identified several challenges that impacted the quality, consistency, and sustainability of data practices. These challenges spanned technical, organizational, and behavioral domains, often reflecting students’ limited experience with research workflows and evolving project responsibilities, as shown in Table 1. The following categories summarize the challenges mentors observed in supporting undergraduate engagement with data-related tasks.
Table 1: Challenges observed by the graduate mentors
| Challenges | Mentor Arun: Electrical Engineering Education Website | Mentor Bryan: Drone Racing Algorithms |
|---|---|---|
| Task Ownership & Accountability | Students sometimes fail to take full ownership of their assigned tasks and deliverables; some do not complete work reliably. | Some students do not search for solutions independently and rely too quickly on mentor help. |
| Documentation & Metadata | Consistency in documentation varies; wiki-style documentation and design logs are used, but quality depends on individual students. | Documentation is basic; in-code comments and commit history are used, but no formal metadata standards are applied. |
| Data Preservation & Archiving | No structured archiving or backup of user design files; feedback data is retained short-term without a preservation strategy. | Simulation data is stored on Google Drive with no long-term archival system; preservation is informal. |
| External Data Acquisition | N/A. No emphasis on using external datasets because the project is confidential. | Most students do not come in knowing how to locate or evaluate external data sources. |
| Ethical Practices & Attribution | Ethics are not formally taught; students are told not to share the project, but attribution practices are informal. | Ethical expectations are discussed once but not consistently reinforced throughout the semester. |
| Communication & Expectations | Miscommunication at the start of tasks sometimes leads to scope drift or misaligned contributions. | Students have varying levels of readiness; some lack basic coding or research exposure, making communication uneven. |
| Feedback Interpretation | Student feedback is collected, but occasionally summarized too generally, missing deeper actionable insights. | Students sometimes fail to recognize the broader value of reflection or learning from prior iterations. |
| Technical Tool Familiarity | Students are familiar with GitHub and Open Project, but newer members may lack CI/CD or testing pipeline knowledge. | Some students lack experience with GitHub or video analysis tools when they first join. |
| Timeliness and Completion of Work | Students sometimes underestimate the importance of deadlines or the interdependence of their work with that of others. | With varied technical backgrounds, some students need extra time to catch up, delaying group progress. |
Discussion
The findings from this study suggest that graduate mentors are aware of the data lifecycle and acknowledge the challenges that undergraduate engineering students encounter during research projects. These challenges reflect the complexity of engaging novice researchers in sustained data practices, especially within team-based environments where responsibilities are distributed and turnover is frequent. The variability in student preparedness, especially in areas like metadata, ethics, and data reuse, suggests that undergraduate teams can benefit from mentorship on data stewardship practices. Mentor narratives also shed light on the critical instructional role graduate students play in the development of undergraduate data and information literacy.
Because undergraduate research projects are often long-term endeavors with high student turnover, maintaining consistent data practices has become essential for ensuring quality and continuity over time (Shorish 2015). As students cycle in and out of projects, proper data stewardship supports the preservation of institutional knowledge and enables ongoing progress.
These findings call for targeted support for graduate mentors, who are well positioned to embed data stewardship practices into the everyday fabric of undergraduate research programs. For example, open-access resources such as “Data management for researchers: Organize, maintain and share your data for research success” (Briney 2015) and frameworks that guide data stewardship practices (Purzer et al. 2024) should be made available to mentors on university library websites.
Future research and institutional initiatives should consider how data information literacy instruction can be more intentionally embedded into undergraduate research programs. Current approaches include workshops, embedded librarianship, and course-integrated modules; it is important that these resources also be accessible to graduate mentors. A focus on supporting graduate mentors would help close the gap between theoretical data competencies and actual research practice.
Conclusion
This study contributes to ongoing efforts to understand and support undergraduate engagement with the data lifecycle by exploring the perspectives of graduate mentors who guide student teams. Using a case-based approach, we offered both a comparative view of perceived skill importance across DIL Toolkit modules and a descriptive account of team-level data practices. Our findings highlight the essential role that graduate mentors play in shaping students’ data competencies and underscore the need for intentional support around data stewardship in undergraduate research programs.
Looking ahead, we aim to continue supporting undergraduate research data practices through the development of mentorship guides, reusable documentation templates, and data management frameworks that enhance students’ capacity for ethical and sustainable data stewardship.
We acknowledge that a limitation of this study is its small sample size of only two participants, which restricts the generalizability of our current findings. Nevertheless, the insights gained from these initial cases provide a foundation for future investigations. As undergraduate research becomes increasingly data-intensive, sustained collaboration among graduate mentors, faculty, and librarians will be essential to fostering a culture of responsible data practices in STEM education.
References
Aazhang, Behnaam, Randal T. Abler, Jan P. Allebach, et al. 2017. “Vertically Integrated Projects (VIP) Programs: Multidisciplinary Projects with Homes in Any Discipline.” Paper presented at 2017 ASEE Annual Conference & Exposition, Columbus, Ohio, June 24, 2017. https://doi.org/10.18260/1-2--29103.
Ahn, Benjamin, Monica Farmer Cox, Heidi A. Diefes-Dux, and Brenda M. Capobianco. 2013. “Examining the skills and methods of graduate student mentors in an undergraduate research setting.” Paper presented at 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia, June 23, 2013. https://doi.org/10.18260/1-2--19577.
Beever, Jonathan, and Andrew O. Brightman. 2016. “Reflexive Principlism as an Effective Approach for Developing Ethical Reasoning in Engineering.” Science and Engineering Ethics 22 (1): 275–291. https://doi.org/10.1007/s11948-015-9633-5.
Bossaller, Jenny, and Anthony J. Million. 2022. “The Research Data Life Cycle, Legacy Data, and Dilemmas in Research Data Management.” Journal of the Association for Information Science and Technology 74 (6): 701–706. https://doi.org/10.1002/asi.24645.
Briney, Kristin. 2015. Data management for researchers: Organize, maintain and share your data for research success. Pelagic Publishing Ltd.
Burress, Theresa. 2022. “Data Literacy Practices of Students Conducting Undergraduate Research.” College & Research Libraries 83 (3): 434. https://doi.org/10.5860/crl.83.3.434.
Carlson, Jacob, Michael Fosmire, C.C. Miller, and Megan Sapp Nelson. 2011. “Determining Data Information Literacy Needs: A Study of Students and Research Faculty.” Portal: Libraries and the Academy 11 (2): 629–657. https://doi.org/10.1353/pla.2011.0022.
Carlson, Jake, Megan Sapp Nelson, Lisa R. Johnston, and Amy Koshoffer. 2015. “Developing Data Literacy Programs: Working with Faculty, Graduate Students and Undergraduates.” Bulletin of the Association for Information Science and Technology 41 (6): 14–17. https://doi.org/10.1002/bult.2015.1720410608.
Carlson, Jake, Megan Sapp Nelson, Marianne Bracke, and Sarah Wright. 2015. The Data Information Literacy Toolkit. Purdue University. https://doi.org/10.5703/1288284315510.
Carlson, Jake, and Marianne Stowell-Bracke. 2013. “Data Management and Sharing from the Perspective of Graduate Students: An Examination of the Culture and Practice at the Water Quality Field Station.” Portal: Libraries and the Academy 13 (4): 343–361. https://doi.org/10.1353/pla.2013.0034.
Demchenko, Yuri, and Lennart Stoy. 2021. “Research Data Management and Data Stewardship Competences in University Curriculum.” In 2021 IEEE Global Engineering Education Conference (EDUCON), Vienna, Austria, April 21-23, 2021. https://doi.org/10.1109/educon46332.2021.9453956.
Dolan, Erin, and Deborah Johnson. 2009. “Toward a Holistic View of Undergraduate Research Experiences: An Exploratory Study of Impact on Graduate/Postdoctoral Mentors.” Journal of Science Education and Technology 18 (6): 487–500. https://doi.org/10.1007/s10956-009-9165-3.
Fitsilis, Panos, Vyron Damasiotis, Charalampos Dervenis, Vasileios Kyriatzis, and Paraskevi Tsoutsa. 2024. “Effective Data Stewardship in Higher Education: Skills, Competences, and the Emerging Role of Open Data Stewards.” arXiv:2410.20361. https://doi.org/10.48550/arXiv.2410.20361.
Kalichman, Michael, Monica Sweet, and Dena Plemmons. 2015. “Standards of Scientific Conduct: Disciplinary Differences.” Science and Engineering Ethics 21 (5): 1085–1093. https://doi.org/10.1007/s11948-014-9594-0.
Petrella, John, and Alan Jung. 2008. “Undergraduate Research: Importance, Benefits, and Challenges.” International Journal of Exercise Science 1 (3): 91–95. https://doi.org/10.70252/mxri7483.
Purzer, Senay, Carla B. Zoltowski, Wei Zakharov, and Joreen Arigye. 2024. “Developing the Design Reasoning in Data Life-Cycle Ethical Management Framework.” Paper presented at 2024 ASEE Annual Conference & Exposition, Portland, Oregon, June 23, 2024. https://doi.org/10.18260/1-2--47172.
Rozier, Kristin Yvonne, and Eric W. D. Rozier. 2014. “Reproducibility, Correctness, and Buildability: The Three Principles for Ethical Public Dissemination of Computer Science and Engineering Research.” In 2014 IEEE International Symposium on Ethics in Science, Technology and Engineering, Chicago, Illinois, May 23-24, 2014. https://doi.org/10.1109/ethics.2014.6893384.
Schuster, Peter, and Charles Birdsong. 2006. “Research In The Undergraduate Environment.” In 2006 Annual Conference & Exposition, Chicago, Illinois, June 18, 2006. https://doi.org/10.18260/1-2--844.
Shao, Gang, Jenny P. Quintana, Wei Zakharov, Senay Purzer, and Eunhye Kim. 2021. “Exploring Potential Roles of Academic Libraries in Undergraduate Data Science Education Curriculum Development.” The Journal of Academic Librarianship 47 (2): 102320. https://doi.org/10.1016/j.acalib.2021.102320.
Shorish, Yasmeen. 2015. “Data Information Literacy and Undergraduates: A Critical Competency.” College & Undergraduate Libraries 22 (1): 97–106. https://doi.org/10.1080/10691316.2015.1001246.
Tenopir, Carol, Suzie Allard, Kimberly Douglass, Arsev Umur Aydinoglu, Lei Wu, Eleanor Read, Maribeth Manoff, and Mike Frame. 2011. “Data Sharing by Scientists: Practices and Perceptions.” PLoS ONE 6 (6): e21101. https://doi.org/10.1371/journal.pone.0021101.
Wallis, Jillian C., Elizabeth Rolando, and Christine L. Borgman. 2013. “If We Share Data, Will Anyone Use Them? Data Sharing and Reuse in the Long Tail of Science and Technology.” PLoS ONE 8 (7): e67332. https://doi.org/10.1371/journal.pone.0067332.
Wendelborn, Christian, Michael Anger, and Christoph Schickhardt. 2023. “What Is Data Stewardship? Towards a Comprehensive Understanding.” Journal of Biomedical Informatics 140 (April): 104337. https://doi.org/10.1016/j.jbi.2023.104337.
Wilkinson, Mark D., Michel Dumontier, IJsbrand Jan Aalbersberg, et al. 2016. “The FAIR Guiding Principles for Scientific Data Management and Stewardship.” Scientific Data 3 (1): 160018. https://doi.org/10.1038/sdata.2016.18.
Zilinski, Lisa D., Megan Sapp Nelson, and Amy S. Van Epps. 2014. “Developing Professional Skills in STEM Students: Data Information Literacy.” Issues in Science and Technology Librarianship 77 (2014): Summer. https://doi.org/10.29173/istl1608.