Librarians and faculty instructors share a role as guides for students who are navigating new knowledge landscapes. Data literacy, which involves being able to describe, evaluate, use, share, and cite data, is a particularly challenging area for students to navigate on their own (Mendez-Carbajo 2020). However, data literacy is increasingly central to succeeding in careers and research, and even to understanding news and information online (Pothier and Condon 2020). Given these challenges and growing needs, librarians and instructors have a responsibility to help students gain competency in using and understanding data.
To teach data literacy skills effectively, librarians and instructors need to understand the most important areas as well as consistent gaps. Since prominent areas and needs may vary by discipline, this study focuses on economics. Because economics is a social science with close connections to business and heavy data use, findings on data literacy in economics have the potential to extend to other fields. Areas of significance and weakness within data literacy may be identified through analysis of curricula, student performance, job postings, or research in the field. This study focuses on peer-reviewed research articles since they are often the culmination of working with data in economics and are used as a teaching tool in college classrooms. The framework presented for content analysis of published research can be applied to other disciplines to extend subject-specific data literacy insights.
Past research has looked at the growth of data services in libraries and librarianship, fueled by large amounts of research and business data and an increased emphasis on data, open science, and data science in government and higher education. Bibliometric studies illustrate growth in areas of open data and data science, including references to topics such as big data and data sharing, in the past 10-12 years (Zhang, Hua, and Yuan 2018; Raban and Gordon 2020).
Corrall, Kennan and Afzal (2013) and Tenopir et al. (2014) provide some of the earliest summaries of data and research data management (RDM) services in libraries and call for more training on data management for librarians. Almost a decade later, many libraries and librarians have responded. Goben and Griffin (2019) conducted a systematic review of RDM needs assessments in libraries, illustrating the vast amount of work that has been done to provide services to researchers. Librarians without data-specific roles are recognizing the need for and gaining skills with data to meet patron needs (Kubas and McBurney 2019).
Data literacy is an important facet within library data services. Librarians have traditionally focused on providing instruction and support for building information literacy competencies among users. 1 With information increasingly provided and displayed in formats beyond traditional text, additional literacies such as media literacy, statistical literacy, and data literacy are gaining attention. There is not yet a codified standard for data literacy, but there is some consensus on general competencies and on the need to train students (Carlson et al. 2011). Calzada Prado and Marzal (2013) reviewed existing information literacy standards and made connections to data literacy competencies. They identified five core data literacy competencies: understanding, finding/obtaining, reading/interpreting/evaluating, managing, and using data.
In recent years, social science researchers and economists have placed greater emphasis on data literacy. Pothier and Condon (2020) identified unique business and economics data literacy needs: data-driven decision making, communicating and presenting effectively with data, and data ethics and security. Miklós et al. (2022) provide guidelines for data citations that enable replication in the social sciences. ReplicationWiki was started to increase transparency and reproducibility in economics research by making replication data more accessible (Hoffler 2017). By encouraging replication studies, ReplicationWiki also helps economists and students develop and display data literacy competencies. All journals published by the American Economic Association follow a data policy that encourages data sharing and replication (American Economic Association 2020).
Recent studies have focused on the data literacy competencies of analyzing and sharing data. Economics and finance researchers are exploring the limitations of statistical significance and ways to accurately assess and discuss the significance of results (Mitton 2022; Roth 2022). Data analysis is central to data literacy because it is the pathway to producing insights. Assessments of top economics journals found that those with data availability policies have higher rates of data sharing (Vlaeminck and Podkrajac 2017), and the number of economics journals with explicit data availability policies has increased in the last decade (Vlaeminck 2021). Zhang and Ma (2021) researched the benefits of data sharing for economics research in China and found that open data increased citations and impact. Data sharing is an important element of data literacy because it facilitates replication and additional research. The remaining data literacy competencies, which help make research accessible and understandable, have received less attention. This study helps fill that gap by addressing the competencies of describing, evaluating, using, and citing data, as well as sharing data.
Much of the previous research on these additional areas of data literacy focused on science specifically or higher education generally. For example, Pouchard and Bracke (2016) assess the RDM practices and abilities of agriculture faculty. They find that faculty consider data literacy important and that there is room for libraries to expand involvement in teaching data literacy competencies. In higher education research, data literacy skills are evaluated in the context of faculty using student data to improve teaching rather than in teaching data literacy skills to students (Raffaghelli and Stewart 2020; Enakrire 2021).
Research at the intersection of data services and economics and business faculty has so far focused on assessing needs and gaps. Wheatley, Chandler, and McKinnon (2020) report on ways business and data librarians can assess data needs and market data services, focusing on making connections with business faculty to accomplish those goals. Since economists value data skills, as shown in recommended learning outcomes for undergraduate education (Myers, Nelson, and Stratton 2011; Allgood and Bayer 2016), these are likely to be fruitful partnerships.
Carlson et al. (2011) shed light on how data literacy competencies are used in research through interviews with faculty researchers. They found that faculty often delegate data documentation and management tasks to graduate students, but faculty were not satisfied with the level of graduate students' data skills. Carlson et al. (2011) note that faculty comments show a general lack of data literacy understanding among both faculty and students (p. 17). Because faculty members struggled to articulate data literacy competencies, it would be helpful to assess these skills as they appear in practice. This study adds to current understanding of faculty data literacy competencies through evidence in published research.
Methods to analyze research articles include bibliometrics, citation analysis, and content analysis. Traditionally, bibliometric studies of economics and business research have analyzed citation sources, document types, and research communities without focusing on data or data literacy (Calma and Davies 2016; Wei and Zhang 2020; Nigro, Johansson, and Hansson 2022). Content analysis studies have expanded the focus on citations to include data citations. Lowry (2015) identified the topic, method, and type (i.e., primary or secondary) of data used in business master's theses. Narrowing down to secondary data, Reiter (2020) identified commercial data vendors commonly used in business research articles. The Lowry (2015) and Reiter (2020) studies use their findings to demonstrate how libraries can support the data access and management needs of researchers, an important data service for the business and economics disciplines. This article adds to previous business and economics content analysis research by examining data literacy competencies. The findings have implications for how libraries and economics programs can work together to teach and build students' data literacy.
The definition of data literacy used in this paper is based on the work of Calzada Prado and Marzal (2013) and Pothier and Condon (2020). From their frameworks, five areas of data literacy were identified: describing, evaluating, using, citing, and sharing data. Each helps improve students' understanding as they learn to do research. The area of data analysis is excluded because past work has already demonstrated the centrality of data analysis to economics education (Batt et al. 2020; Marshall and Underwood 2019).
Using the five overarching data literacy competencies, the author and her research assistant conducted close readings of ten seed articles from economics publications to identify terms used when discussing the competencies. Since not all areas of data literacy were represented in the seed articles, additional terms from the data literacy theory literature were added to the data literacy competency coding protocol. Overall, 58 terms were identified as themes for coding the articles (see Table 1).
Table 1: Data Literacy Competency Coding Terms Used, Identified from Economics and Data Literacy Literature

| Data Literacy Competency | Terms from Economics Articles | Terms from Data Literacy Articles |
|---|---|---|
| Describing Data | Administrative | Binary |
| | Average | Categorical |
| | Cross-sectional | Continuous |
| | Experiment | Describe |
| | Frequency | Discrete |
| | Mean | Longitudinal |
| | Median | Nominal |
| | Ordinal | |
| | Panel | |
| | Survey | |
| | Time-series | |
| Evaluating Data | Bias | Accurate |
| | Caveat | Authority |
| | Compare | Credible |
| | Concern | IRB |
| | Confound | |
| | Consistent | |
| | Limitation | |
| | Match | |
| | Missing | |
| | Noise | |
| | Robust | |
| Using Data | Clean | Reformat |
| | Collect | Verify |
| | Combine | |
| | Construct | |
| | Convert | |
| | Download | |
| | Drop | |
| | Exclude | |
| | Merge | |
| | Normalize | |
| | Obtain | |
| | Restrict | |
| | Validate | |
| Citing Data | Distributor | License |
| | Publisher | Permission |
| Sharing Data | Available | |
| | Replication | |
| | Repository | |
| | Supplement | |
Note: Terms from economics and data literacy articles are listed in alphabetical order. The order does not represent hierarchies or relationships across rows beyond terms being associated with the same overarching data literacy competency.
After identifying the data literacy competency terms, the Adobe Acrobat index search function was used to find and export the instances of each term in the text of each article in the sample. Truncation of terms with multiple potential endings was used to capture all instances. 2 Previous research has shown that Adobe Acrobat is a useful tool for content analysis of text when looking for specific, pre-determined words (Nur, Adams, and Brailsford 2016). Using the Adobe Acrobat text search tool falls between manual content analysis and analysis with dedicated text analysis software, providing flexibility along with improved accuracy and time savings (Boettger and Palmer 2010). To ensure reliability of coding, the Adobe Acrobat results were scanned for unrelated uses of each term. 3 Additionally, uses of terms in reference sections were not included in the final dataset, with the exception of the terms for citing data.
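To make this workflow concrete, the following is a minimal sketch of the same idea in Python rather than Adobe Acrobat: truncated term stems are counted in the article body, with the reference section excluded except when tallying citing-data terms. The pypdf library, the function name, and the stems shown (only a small excerpt of Table 1) are illustrative assumptions, not the procedure actually used in the study.

```python
# Sketch of truncated-term counting for one article PDF, approximating the
# Adobe Acrobat index search described above (assumptions: pypdf for text
# extraction, a crude "References" split, and an excerpt of the Table 1 stems).
import re
from pypdf import PdfReader

TERM_STEMS = {
    "describing": ["administrativ", "averag", "cross-section", "descr", "panel"],
    "evaluating": ["bias", "caveat", "compar", "robust"],
    "using": ["clean", "merg", "normaliz", "restrict"],
    "citing": ["distributor", "licens", "permission", "publisher"],
    "sharing": ["availab", "replicat", "repositor", "supplement"],
}

def count_terms(pdf_path):
    """Return {competency: {stem: count}} of truncated-term matches for one article."""
    text = " ".join(page.extract_text() or "" for page in PdfReader(pdf_path).pages)
    # Treat everything after the last "references" heading as the reference list.
    body, _, references = text.lower().rpartition("references")
    if not body:  # no "references" heading found
        body, references = references, ""
    counts = {}
    for competency, stems in TERM_STEMS.items():
        # Citing-data terms are counted everywhere; all other terms only in the body.
        scope = body + " " + references if competency == "citing" else body
        counts[competency] = {
            stem: len(re.findall(r"\b" + re.escape(stem) + r"\w*", scope))
            for stem in stems
        }
    return counts
```

In practice, flagged matches would still need the manual review described above (e.g., discarding uses of authority unrelated to data) before entering the final dataset.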
Data for this study come from a sample of articles from top journals in economics. A set of the ten top economics journals was determined by comparing rankings from four journal impact metrics: Clarivate's Journal Impact Factor and Journal Citation Indicator (2023a), Google Scholar's h5-index (2023), and the SCImago Journal Rank indicator (2022). Journals ranked within the top 15 for each metric were added based on the number of metric citations and their individual rankings. Looking at multiple impact metrics allowed for a more well-rounded ranking of top journals. Being able to assess practical data literacy and apply it to a variety of economics research topics was also important. Thus, journals that do not publish empirical research (e.g., Journal of Economic Literature) and journals with a niche topic focus (e.g., Canadian Journal of Agricultural Economics) were excluded. The journal titles are listed in alphabetical order in Table 2, along with information about data policies and impact metric percentile scores.
Table 2: List of Ten Top Economics Journals
Journal Name | Data Sharing Required | Dedicated Location for Data Sharing | Journal Impact Factor Percentile | Journal Citation Indicator Percentile | Google Scholar H5-Index Percentile | SCImago Journal Rank Percentile |
---|---|---|---|---|---|---|
American Economic Review | Yes | ICPSR | 99.10 | 99.23 | 95.00 | 99.72 |
Econometrica | Yes | Journal website | 90.10 | 96.13 | 65.00 | 99.16 |
Journal of Finance | No* | Journal website | 95.00 | 96.70 | 75.00 | 99.44 |
Journal of Financial Economics | Yes | Mendeley Data | 98.60 | 98.46 | Not included | 98.61 |
Journal of Political Economy | Yes | Harvard Dataverse | 96.40 | 98.54 | 85.00 | 99.58 |
Journal of Public Economics | No | N/A | 98.60 | 98.02 | 55.00 | 96.38 |
Quarterly Journal of Economics | Yes | Harvard Dataverse | 99.90 | 99.91 | 80.00 | 99.86 |
Review of Economic Studies | Yes | Zenodo | 87.50 | 95.44 | 70.00 | 99.30 |
Review of Economics and Statistics | Yes | Harvard Dataverse | 95.70 | 96.47 | 50.00 | 98.19 |
Review of Financial Studies | No* | Harvard Dataverse | 96.80 | 98.02 | 90.00 | 98.89 |
Note: The sample size for percentiles for each journal ranking varies: Journal Impact Factor and Journal Citation Indicator cover 381 journals, Google Scholar H5-Index covers 20 journals, and SCImago Journal Rank covers 718 journals. All the percentiles are within the context of the economics discipline. Data come from Clarivate (2023a), Google Scholar (2023), and SCImago (2022).
*Sharing replication code is required, but sharing of datasets is not.
The Web of Science database was used to export citation data for all articles published in the journals in 2021 (Clarivate 2023b). The statistical software Stata was used to select a random sample of 10 articles from each journal. In cases where the selected articles did not use data, other articles were randomly selected as replacements. Table 3 presents summary statistics on the sample of 100 journal articles compared to all articles published in the ten top journals in 2021. On average, the articles have been cited 6.50 times since publication and have been used (i.e., downloaded) 25 times. The median values for citations and usages in the sample are slightly lower, at 4 and 17.50 respectively. This indicates that the articles come from a right-skewed distribution, which is generally consistent with the population distribution for all articles published in the ten top journals in 2021 (see Table 3). Based on the Pareto principle, it is expected that a few articles with high citation and usage numbers would skew the distribution.
Table 3: Descriptive Statistics of Sample of Articles from Top Economics Journals

Article Characteristics | Sample Articles: Median | Sample Articles: Mean | Sample Articles: Range | All 2021 Articles: Median | All 2021 Articles: Mean | All 2021 Articles: Range
---|---|---|---|---|---|---
Citations | 4 | 6.50 | 43 | 3 | 6.23 | 170
Usages | 17.50 | 25.02 | 229 | 19 | 28.05 | 683

Notes: Sample articles: N = 100. All 2021 articles: all 961 articles published in the ten top journals in 2021. Data come from Clarivate (2023b).
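As an illustration of the sampling step described above, the following is a minimal pandas sketch of the stratified random draw. The study itself used Stata; the file name (wos_export_2021.csv) and column name (journal) here are assumptions for illustration only.

```python
# Illustrative pandas version of the stratified sampling performed in Stata:
# draw 10 random articles from each journal in the 2021 Web of Science export.
import pandas as pd

articles = pd.read_csv("wos_export_2021.csv")  # hypothetical citation export

sample = (
    articles.groupby("journal")
    .sample(n=10, random_state=42)  # fixed seed for reproducibility
    .reset_index(drop=True)
)
# Sampled articles found not to use data would then be replaced by drawing
# additional random articles from the same journal.
```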
To assess the data literacy competencies demonstrated in the articles, two measures were used: 1) the proportion of articles that included a data literacy term or competency category and 2) the total use of each term. Table 4 and Figure 1 show the proportion of articles that included each term and the total times each term was used in the sample of 100 articles. 4 For the high-level competency categories, 100% of the articles included at least one term referring to the competencies of describing, evaluating, and using data. For the competency category of sharing data, 87% of articles included at least one related term. Only 40% of articles included a term referencing the citing data competency category (see Table 4).
Table 4: Proportion and Total Use of Data Literacy Competency Terms in Sample Articles, by Data Literacy Competency Category (N = 100)
Competency Category and Term | Proportion of Articles | Total Uses |
---|---|---|
Describing | 1 | 12,313 |
Administrative | 0.31 | 136 |
Average | 0.98 | 2,781 |
Binary | 0.32 | 105 |
Categorical | 0.03 | 6 |
Continuous | 0.4 | 151 |
Cross Sectional | 0.64 | 436 |
Describe | 0.98 | 915 |
Discrete | 0.28 | 85 |
Experiment | 0.65 | 677 |
Frequency | 0.55 | 234 |
Longitudinal | 0.15 | 26 |
Mean | 0.98 | 1,437 |
Median | 0.67 | 480 |
Nominal | 0.23 | 108 |
Ordinal | 0 | 0 |
Panel | 0.94 | 3,375 |
Survey | 0.77 | 1,233 |
Time Series | 0.28 | 128 |
Evaluating | 1 | 5,350 |
Accurate | 0.47 | 169 |
Authority | 0 | 0 |
Bias | 0.8 | 607 |
Caveat | 0.28 | 34 |
Compare | 1 | 1,556 |
Concern | 0.84 | 440 |
Confound | 0.37 | 115 |
Consistent | 0.97 | 1,093 |
Credible | 0.27 | 56 |
IRB | 0.04 | 6 |
Limitation | 0.38 | 83 |
Match | 0.84 | 883 |
Missing | 0.5 | 197 |
Noise | 0.24 | 111 |
Robust | 0.92 | 169 |
Using | 1 | 3,658 |
Clean | 0.08 | 16 |
Collect | 0.69 | 302 |
Combine | 0.75 | 238 |
Construct | 0.9 | 877 |
Convert | 0.26 | 40 |
Download | 0.1 | 16 |
Drop | 0.47 | 147 |
Exclude | 0.81 | 387 |
Merge | 0.31 | 57 |
Normalize | 0.5 | 167 |
Obtain | 0.88 | 606 |
Reformat | 0 | 0 |
Restrict | 0.88 | 681 |
Validate | 0.24 | 42 |
Verify | 0.4 | 82 |
Citing | 0.40 | 64 |
Distributor | 0.12 | 17 |
License | 0.01 | 2 |
Permission | 0.26 | 30 |
Publisher | 0.14 | 15 |
Sharing | 0.87 | 464 |
Available | 0.57 | 151 |
Replicate | 0.61 | 145 |
Repository | 0.04 | 5 |
Supplement | 0.4 | 147 |
Figure 1: Proportion of Articles with Data Literacy Competency Terms and Total Term Usage, by Competency Category (Panel A: Describing Data; Panel B: Evaluating Data; Panel C: Using Data; Panel D: Citing Data; Panel E: Sharing Data). Note: Data collected using Adobe Acrobat's index search function on PDFs of articles.
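For clarity, the two measures reported in Table 4 and Figure 1 can be computed from a per-article term-count matrix as in the following minimal sketch. The counts below are invented for three articles and three terms purely for illustration; the actual values appear in Table 4.

```python
# Sketch of the two reported measures: the share of articles using each term at
# least once, and the total number of uses across the sample (counts invented).
import pandas as pd

counts = pd.DataFrame(
    {"panel": [12, 0, 5], "bias": [3, 1, 0], "repository": [0, 0, 1]},
    index=["article_1", "article_2", "article_3"],
)

proportion = (counts > 0).mean()  # proportion of articles that include the term
total = counts.sum()              # total uses of the term across all articles

print(pd.DataFrame({"Proportion": proportion, "Total": total}))
```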
The analysis of terminology used in top economics journal articles demonstrates that the data literacy competencies of describing, evaluating, and using data are well established and clearly presented in economics research. The findings also suggest that these are the most understood and important competencies in the field. These are strengths in economics research and filter down to benefit students as they learn research and data skills from studying others’ work. The specific terms used in relation to the competencies of describing, evaluating, and using data provide insight on how to better prepare students (see Figure 1).
One pattern in the frequency of data literacy terms is that general terms are used more often than specific ones. This is not surprising, as general terms can be used in more contexts and instances. This is the case for terms related to describing data. Terms like average, describe, mean, panel, survey, and median may be used many times throughout a paper. However, the lower use of terms specifically identifying types of variables (binary, discrete, nominal, categorical, ordinal) and types of datasets (time series, longitudinal) may leave students and other readers unsure of the structure of the data and how it affects the analysis. Economics instructors and librarians can partner to reinforce definitions for specific terminology as well as teach students to identify the format of variables and datasets when observing data.
Another pattern is that scholarly or academic terms appear to be preferred over terms describing the same concept in more common settings. Linguistics research on academic literacy has developed lists of terms for academic writing generally (Gardner and Davies 2014) and for economics specifically (O'Flynn 2019). Terms categorized as academic vocabulary in previous research include compare, consistent, robust, concern, and match. Each of those terms appears in more than three times as many of the sample economics articles as noise does when evaluating data. However, in settings such as classrooms and blogs, noise is often the term used when talking about outliers and variance in data. Similarly, for the competency of using data, terms like merge, convert, validate, clean, and reformat are used when teaching students how to prepare data for analysis, while terms like construct, obtain, and restrict, which are classified as academic vocabulary, may not convey clear intended actions to students. Economics instructors and librarians can also partner to support students in this area by building bridges between common or technical terms and academic terms when teaching. 5
Figure 1 panels D and E show that the proportions of articles using terms related to data citation and sharing are much lower than for terms related to describing, evaluating, and using data. For sharing and citing data, replicate, available, supplement, and permission were the most common terms. Few articles used repository or license to discuss data. One reason for the low coverage of citation and sharing terms is that the coding structure identified fewer terms within the categories of citing and sharing than within describing, evaluating, and using data. However, individual articles may use unique terms when citing and sharing data that were not recognized during the initial code development. If research articles use a wide variety of terms when citing and sharing data, it may be difficult for students to recognize these practices because they cannot rely on consistent patterns. Librarians and instructors can help students identify shared and cited data when they need to replicate or reuse research data.
Since the presence of terms relating to competencies of sharing and citing data was lower in the sample articles, additional analysis was conducted to explore the trend. Each article in the sample was thoroughly searched for references to secondary data sources or data collection, even if there was not a full data citation. If secondary data was used, the specific source(s) were also recorded. Additionally, the articles and journal websites were assessed for shared data files. Articles that shared data files were noted, along with articles that were unable to share data because of licensing or privacy considerations.
Only 36% of articles pointed to dataset files that readers could download and reuse, which is markedly lower than the 87% of articles that use terms relating to data sharing. Articles often reference a replication package, but it contains only code or other documentation, without the actual data. One reason for this is that data may be proprietary or confidential. Over half of the sample papers (58 out of 100) use restricted data, and, as expected, the percentage sharing that data is low (21%). On the other hand, 57% of the subset of articles that do not use restricted data share their research data. Other reasons for low rates of data sharing include limited time and resources to prepare data for sharing and a lack of understanding of the benefits of data sharing, such as citations that come through data reuse (Cooper 2021; Sheffield and Burton 2022). Guidance on creating data documentation from Gentzkow and Shapiro (2014) suggests that disorganized data and code are a major hurdle for social science researchers when sharing data with collaborators and others.
As shown in previous research (Vlaeminck and Podkrajac 2017; Vlaeminck 2021), journal policies requiring data sharing also relate to increased sharing, although compliance rates remain low: 47% of articles published in journals with a data sharing policy shared data, while only 10% of articles in journals without a policy did so (see Table 5). This suggests that journal efforts to establish data policies are useful, but there is still room for wider adoption (American Economic Association 2020; Miklós et al. 2022). Librarians may be able to support and encourage wider data sharing by helping researchers create data documentation and by providing information about journal data policies and the benefits of sharing data.
While only 40% of articles used terminology related to a data citation, 97% at least listed the name of the data source used in the research (see Table 5). This suggests that while there have not been organizational efforts toward data citation in economics, researchers inherently understand the need to share source information. The next step is formal data citations to enable data reuse and replication studies. With documented experience finding, evaluating, and citing other types of sources (Tenopir et al. 2014), librarians are well positioned to teach how to find data shared alongside articles and the why and how of citing data.
The findings on data sources from the sample of economics research articles also demonstrate that secondary data is the most common form of research data. Over half of the articles use proprietary datasets, which supports the findings of previous research on commercial data sources used in business research (Reiter 2020). Additional analysis of other secondary data sources reveals that government and nongovernmental organization (NGO) data are also widely used in economics research. Since 21% of articles that use proprietary data also share some research data, and proprietary data itself typically cannot be redistributed, a significant portion of articles must combine multiple sources of data. While not all libraries have access to the common commercial datasets used in economics and business research, government and NGO sources like the U.S. Census Bureau and the World Bank are excellent sources for librarians to become familiar with so they are prepared to help students. Since many research projects rely on multiple sources of data, another service librarians can provide is guidance on accurately merging datasets.
Table 5: Proportion of Sample Articles Citing Data Sources and Sharing Data, by Type of Data Used and Journal Requirements
Data Sharing Characteristic | All articles (N = 100) | Restricted data (N = 58) | No restricted data (N = 42) | Researcher collected primary data (N = 19) | Researcher did not collect primary data (N = 81) | Journal requires data sharing (N = 70) | Journal does not require data sharing (N = 30) |
---|---|---|---|---|---|---|---|
List Data Source | 0.97 | 0.97 | 0.98 | 1 | 0.96 | 0.99 | 0.93 |
Use Proprietary Data | 0.58 | 1 | 0 | 0.21 | 0.67 | 0.51 | 0.73 |
Use Secondary Data | 0.84 | 0.98 | 0.64 | 0.32 | 0.96 | 0.81 | 0.90 |
Share Dataset | 0.36 | 0.21 | 0.57 | 0.53 | 0.32 | 0.47 | 0.10 |
This research adds to existing understanding about data literacy by identifying highly valued and well implemented competencies within economics—describing, evaluating, and using data. These areas need continued support from librarians and instructors in preparing students to succeed, including through preparing students to understand and use academic terminology. The study also identified areas for improvement—sharing and citing data. By partnering with faculty instructors and utilizing unique skills, such as finding open data and training students in research and citation, librarians can give students the tools they need to traverse the world of data successfully and confidently.
While this study provides insight into data literacy in economics research and how it can be used to improve students' skill attainment, there are limitations. The study looks specifically at the data literacy terms identified in the economics and data literacy literature and does not cover the full gamut of possible terminology, particularly within the citing and sharing areas. The sample of articles for this study also comes from a single year (2021) in a set of top economics journals. Thus, the findings do not show developments over time or trends in other journals or disciplines. However, since the journals selected publish high-quality economics research, they provide a good starting point for understanding data literacy among economics researchers and are among the universe of research articles used as classroom examples. Another limitation of the research is that it cannot show how researchers developed their data literacy skills.
Since past research has shown that most graduate students do not learn data skills in the traditional classroom (Pouchard and Bracke 2016), an important area of additional research is assessing how students most effectively learn these skills. Another future research application is investigating how having clearly displayed data literacy in published research can help close the gap between research and practice, particularly in helping business practitioners make more data-driven decisions (Banasiewicz 2022). Finally, the method used in this study can be replicated within other disciplines to understand subject-specific trends. Future work may assess the differences in data literacy displayed across disciplines.
Allgood, Sam, and Amanda Bayer. 2016. “Measuring College Learning in Economics.” In: Improving Quality in American Higher Education: Learning Outcomes and Assessments for the 21st Century , edited by Richard Arum, Josipa Roksa, and Amanda Cook, 87-134. San Francisco, CA: Jossey-Bass.
American Economic Association. 2020. “Data and Code Availability Policy.” AEA Data and Code Policies and Guidance. https://www.aeaweb.org/journals/data/data-code-policy .
American Library Association. 2015. “Framework for Information Literacy for Higher Education.” ACRL February 9, 2015. https://www.ala.org/acrl/standards/ilframework .
Banasiewicz, Andrew. 2022. “On Bridging of the Academic-Practitioner Divide in Business Education: New Opportunities in the New Era.” The Electronic Journal of Knowledge Management 20(1): 27-35. https://doi.org/10.34190/ejkm.20.1.2390 .
Batt, Steven, Tara Grealis, Oskar Harmon, and Paul Tomolonis. 2020. “Learning Tableau: A Data Visualization Tool.” The Journal of Economic Education 51(3-4): 317-328. https://doi.org/10.1080/00220485.2020.1804503 .
Boettger, Ryan K., and Laura A. Palmer. 2010. “Quantitative Content Analysis: Its Use in Technical Communication.” IEEE Transactions on Professional Communication 53(4): 346-357. https://doi.org/10.1109/TPC.2010.2077450 .
Calma, Angelito, and Martin Davies. 2016. “Academy of Management Journal, 1958-2014: A Citation Analysis.” Scientometrics 108: 959-975. https://doi.org/10.1007/s11192-016-1998-y .
Calzada Prado, Javier, and Miguel Angel Marzal. 2013. “Incorporating Data Literacy into Information Literacy Programs: Core Competencies and Contents.” Libri 63(2): 123-134. https://doi.org/10.1515/libri-2013-0010 .
Carlson, Jake, Michael Fosmire, Chris Miller, and Megan R. Sapp Nelson. 2011. “Determining Data Information Literacy Needs: A Study of Students and Research Faculty.” portal: Libraries and the Academy 11(2): 629-657. https://docs.lib.purdue.edu/lib_fsdocs/23/ .
Clarivate. 2023a. Journal Citation Reports: Economics. Dataset. Accessed March 8, 2023. https://jcr.clarivate.com/jcr/browse-journals .
Clarivate. 2023b. Web of Science: Full Record Export. Dataset. Accessed March 8, 2023. https://www.webofscience.com/ .
Cooper, Kristen A. 2021. “Data Sharing Attitudes and Practices in the Plant Sciences: Results from a Mixed Method Study.” Journal of Agricultural & Food Information 22(1-2): 37-58. https://doi.org/10.1080/10496505.2021.1891923 .
Corrall, Sheila, Mary Anne Kennan, and Waseem Afzal. 2013. “Bibliometrics and Research Data Management Services: Emerging Trends in Library Support for Research.” Library Trends 61(3): 636-674. https://doi.org/10.1353/lib.2013.0005 .
Enakrire, Rexwhite Tega. 2021. “Data Literacy for Teaching and Learning in Higher Education Institutions.” Library Hi Tech News 38(2): 1-7. https://doi.org/10.1108/LHTN-01-2020-0005 .
Gardner, Dee, and Mark Davies. 2014. “A New Academic Vocabulary List.” Applied Linguistics 35(3): 305-327. https://doi.org/10.1093/applin/amt015 .
Gentzkow, Matthew, and Jesse M. Shapiro. 2014. “Code and Data for the Social Sciences: A Practitioners’ Guide.” Stanford University. https://web.stanford.edu/~gentzkow/research/CodeAndData.pdf .
Goben, Abigail, and Tina Griffin. 2019. “In Aggregate: Trends, Needs, and Opportunities from Research Data Management Surveys.” College & Research Libraries 80(7): 903-924. https://doi.org/10.5860/crl.80.7.903 .
Google Scholar. 2023. Top Publications: Economics. Dataset. Accessed March 8, 2023. https://scholar.google.com/citations?view_op=top_venues&hl=en&vq=bus_economics .
Hoffler, Jan H. 2017. “ReplicationWiki: Improving Transparency in Social Sciences Research.” D-Lib Magazine 23(3/4). https://doi.org/10.1045/march2017-hoeffler .
Kubas, Alicia, and Jenny McBurney. 2019. “Frustrations and Roadblocks in Data Reference Librarianship.” IASSIST Quarterly 43(1): 1-18. https://doi.org/10.29173/iq939 .
Lowry, Linda D. 2015. “Bridging the Business Data Divide: Insights into Primary and Secondary Data Use by Business Researchers.” IASSIST Quarterly 39(2): 14-25. https://doi.org/10.29173/iq779 .
Marshall, Emily C., and Anthony Underwood. 2019. “Writing in the Discipline and Reproducible Methods: A Process-Oriented Approach to Teaching Empirical Undergraduate Economics Research.” The Journal of Economic Education 50(1): 17-32. https://doi.org/10.1080/00220485.2018.1551100 .
Mendez-Carbajo, Diego. 2020. “Baseline Competency and Student Self-Efficacy in Data Literacy: Evidence from an Online Module.” Journal of Business & Finance Librarianship 25(3-4): 230-243. https://doi.org/10.1080/08963568.2020.1847551 .
Miklós, Koren, Marie Connolly, Joan Lull, and Lars Vilhuber. 2022. “Data and Code Availability Standard.” Zenodo. https://doi.org/10.5281/zenodo.7436134 .
Mitton, Todd. 2022. “Economic Significance in Corporate Finance.” The Review of Corporate Finance Studies . https://doi.org/10.1093/rcfs/cfac008 .
Myers, Steven C., Michael A. Nelson, and Richard W. Stratton. 2011. “Assessment of the Undergraduate Economics Major: A National Survey.” The Journal of Economic Education 42(2): 195-199. https://doi.org/10.1080/00220485.2011.555722 .
Nigro, Orlando, Jenny Johansson, and Stina Hogvik Hansson. 2022. “Insight into What They Cite: A Citation Analysis of Publications at the School of Business, Economics, and Law at the University of Gothenburg.” Journal of Business & Finance Librarianship 27(2): 127-153. https://doi.org/10.1080/08963568.2022.2044614 .
Nur, Selin, Clive E. Adams, and David F. Brailsford. 2016. “Using Built-in Functions of Adobe Acrobat Pro DC to Help the Selection Process in Systematic Reviews of Randomized Trials.” Systematic Reviews 5(33). https://doi.org/10.1186/s13643-016-0207-7 .
O’Flynn, James. 2019. “An Economics Academic Word List (EAWL): Using Online Resources to Develop a Subject-specific Word List and Associated Teaching-learning materials.” Journal of Academic Language & Learning 13(1): A28-A87. https://journal.aall.org.au/index.php/jall/article/view/592 .
Pothier, Wendy Girven, and Patricia B. Condon. 2020. “Towards Data Literacy Competencies: Business Students, Workforce Needs, and the Role of the Librarian.” Journal of Business & Finance Librarianship 25(3-4): 123-146. https://doi.org/10.1080/08963568.2019.1680189 .
Pouchard, Line, and Marianne Stowell Bracke. 2016. “An Analysis of Selected Data Practices: A Case Study of the Purdue College of Agriculture.” Issues in Science and Technology Librarianship 85. https://doi.org/10.29173/istl1691 .
Raban, Daphne R., and Avishag Gordon. 2020. “The Evolution of Data Science and Big Data Research: A Bibliometric Analysis.” Scientometrics 122: 1563-1581. https://doi.org/10.1007/s11192-020-03371-2 .
Raffaghelli, Juliana E., and Bonnie Stewart. 2020. “Centering Complexity in ‘Educators’ Data Literacy’ to Support Future Practices in Faculty Development: A Systematic Review of the Literature.” Teaching in Higher Education 25(4): 435-455. https://doi.org/10.1080/13562517.2019.1696301 .
Reiter, Lauren. 2020. “Commercial Data in Academic Business Research: A Study on Use and Access.” Journal of Business and Finance Librarianship 25(3-4): 244-260. https://doi.org/10.1080/08963568.2020.1847546 .
Roth, Jonathan. 2022. “Pretest with Caution: Event-Study Estimates after Testing for Parallel Trends.” American Economic Review: Insights 4(3): 305-322. https://doi.org/10.1257/aeri.20210236 .
SCImago. 2022. Scimago Journal & Country Rank: Economics and Econometrics. Dataset. Accessed March 8, 2023. https://www.scimagojr.com/journalrank.php?category=2002 .
Sheffield, Megan, and Karen B. Burton. 2022. “Research Data Management Needs Assessment of Clemson University.” Journal of Librarianship & Scholarly Communication 10(1): 1-28. https://doi.org/10.31274/jlsc.13970 .
Tenopir, Carol, Robert J. Sandusky, Suzie Allard, and Ben Birch. 2014. “Research Data Management Services in Academic Research Libraries and Perceptions of Librarians.” Library & Information Science Research 36(2): 84-90. https://doi.org/10.1016/j.lisr.2013.11.003 .
Vlaeminck, Sven. 2021. “Dawning of a New Age? Economics Journals’ Data Policies on the Test Bench.” Liber Quarterly 31: 1-29. https://doi.org/10.53377/lq.10940 .
Vlaeminck, Sven, and Felix Podkrajac. 2017. “Journals in Economic Sciences: Paying Lip Service to Reproducible Research?” IASSIST Quarterly 41(1-4): 1-16. https://doi.org/10.29173/iq6 .
Wei, Fangfang, and Guijie Zhang. 2020. “Exploring the Intellectual Structure and Evolution of 24 Top Business Journals: A Scientometric Analysis.” The Electronic Library 38(3): 493-511. https://doi.org/10.1108/EL-12-2019-0279 .
Wheatley, Amanda, Martin Chandler, and Dawn McKinnon. 2020. “Collaborating with Faculty on Data Awareness: A Case Study.” Journal of Business and Finance Librarianship 25(3-4): 281-290. https://doi.org/10.1080/08963568.2020.1847553 .
Zhang, Liwei, and Liang Ma. 2021. “Does Open Data Boost Journal Impact: Evidence from Chinese Economics.” Scientometrics 126: 3393-3419. https://doi.org/10.1007/s11192-021-03897-z .
Zhang, Yun, Weina Hua, and Shunbo Yuan. 2018. “Mapping the Scientific Research on Open Data: A Bibliometric Review.” Learned Publishing 31: 95-106. https://doi.org/10.1002/leap.1110 .
For example, see ACRL’s Framework for Information Literacy for Higher Education (American Library Association 2015). ↩
For example, describe was truncated to descr* so related terms such as description and describing would also be found in the articles. ↩
Terms that were not used in connection with data were excluded. For example, authority was used in 28 articles, but always in the context of government or organizational authority and not in reference to the data. Phrases that included the searched term, but did not match the intended meaning were also excluded (e.g., the only search results for reforma* were those referring to the Reformation). ↩
The data on the proportion of articles using terms related to areas of data literacy and total usage of terms is reported in both a table and graphs to allow for quickly comparing numbers as well as easily seeing trends in term usage. ↩
O’Flynn (2019) is a resource for teaching materials to support economics students’ vocabulary development. ↩