Akmon, D.
(2014). The Role of Conceptions of Value in Data Practices: A Multi-Case Study of Three Small Teams of Ecological Scientists.
|
Investigated how scientists conceive of the value of their data, and how they enact conceptions of value in their data practices |
Conducted interviews and engaged in participant observation of three teams of scientists performing ecological research at a U.S. university-sponsored field station |
Measurement, Targeted |
Alexogiannopoulos, E., McKenney S., & Pickton M.
(2010). Research Data Management Project: a DAF investigation of research data management practices at The University of Northampton.
|
Investigated research data management practices at the University of Northampton, specifically the types of data held by researchers throughout the university, researchers’ existing data management practices, and the risks associated with these practices. |
Used the Data Asset Framework (DAF) methodology |
Measurement, Wider |
Averkamp, S., Gu X., & Rogers B.
(2014). Data Management at the University of Iowa: A University Libraries Report on Campus Research Data Needs.
|
Data management report commissioned by the University of Iowa Libraries to survey the campus landscape and identify gaps in data management services |
The first stage of data collection consisted of a survey conducted during summer 2012, which received 784 responses. The second phase consisted of approximately 40 in-depth interviews with individuals from across campus, completed during summer 2013. The individuals engaged during data collection spanned a diverse set of campus programs but should not be considered a comprehensive sample. Information Technology Services was invited to participate in the interview process and also contributed to the report. |
Measurement, Wider |
Bardyn, T., Resnick T., & Camina S.
(2012). Translational Researchers’ Perceptions of Data Management Practices and Data Curation Needs: Findings from a Focus Group in an Academic Health Sciences Library.
Journal of Web Librarianship. 6(4), 274 - 287. |
Investigated the digital curation needs of translational researchers |
Conducted focus groups with eight faculty members in departments within the David Geffen School of Medicine, UCLA |
Measurement, Targeted |
Beagrie, N., Houghton J., Palaiologk A., & Williams P.
(2012). Economic Evaluation of Research Data Infrastructure.
|
Investigated the economic benefits of the Economic and Social Research Council’s (ESRC) investments in the Economic and Social Data Service (ESDS), a service that promotes the use of research data in social science research and teaching and works to ensure data availability. |
Performed analysis of existing evaluation literature and reports, looking at both methods and findings; examined results of KRDS and other studies; examined management and internal data collected by ESRC and ESDS such as user statistics, internal reports, and the ESDS Mid-Term Review; performed semi-structured interviews, case studies, and an online survey of ESDS users and depositors |
Measurement, Targeted |
Beagrie, N., & Houghton J.
(2013). The Value and Impact of the Archaeology Data Service: A Study and Methods for Enhancing Sustainability.
|
Investigated and attempted to measure the value and impact of the Archaeology Data Service (ADS) |
Reviewed value and impact evaluation literature; analyzed ADS reports and documentation; conducted 15 interviews with ADS stakeholders; conducted 2 online surveys, one of ADS data depositors and one of ADS users
Note: This study and the similar study of the British Atmospheric Data Centre (Node 34) both use the same value metrics framework |
Measurement, Metrics, Targeted |
Beagrie, N., & Houghton J.
(2013). The Value and Impact of the British Atmospheric Data Centre.
|
Surveyed and analyzed perceptions of the value of the digital collections held by the British Atmospheric Data Centre (BADC), and quantified the value and impact of those collections for BADC’s user community using a range of economic approaches; investigated the extension of the methodology used in Beagrie et al. 2012 and Beagrie and Houghton 2013a to the BADC.
Note: The results of Beagrie et al. 2012, Beagrie and Houghton 2013a and Beagrie and Houghton 2013b were summarized and collated in Beagrie and Houghton 2014. |
Similar to Beagrie et al. 2012 and Beagrie and Houghton 2013a, methods included a combination of literature and documentation review, review of reports from BADC, 13 interviews with BADC users and depositors, and two online surveys, one of BADC data depositors and one of BADC users.
Note: This study and the similar study of the Archaeology Data Service (Node 33) both use the same value metrics framework |
Measurement, Targeted |
Fearon, D., Gunia B., Pralle B., Lake S., & Sallans A.
(2013). ARL SPEC Kit 334: Research Data Management Services.
|
To assess early endeavors in research data services and benchmark future growth in ARL member libraries. |
Conducted a survey of ARL member libraries. 73 of 125 responded. |
Measurement, Wider |
Fecher, B., Friesike S., & Hebing M.
(2015). What Drives Academic Data Sharing?
PLoS ONE. 10(2), e0118053. |
Sought to develop a framework that explains the process of data sharing from the researcher’s point of view |
Performed a systematic review of 98 scholarly papers and an empirical survey of 603 secondary data users |
Measurement, Metrics, Targeted |
Federer, L., Lu Y-L., Joubert D., Welsh J., & Brandys B.
(2015). Biomedical Data Sharing and Reuse: Attitudes and Practices of Clinical and Scientific Research Staff.
PLoS ONE. 10(6). |
Investigated differences in experiences with and perceptions about sharing data, as well as barriers to sharing among clinical and basic science researchers |
Distributed a survey to clinical and basic science researchers in the Intramural Research Program at the National Institutes of Health. The survey was publicized through various NIH email lists, including NIH library and NIH special interest group lists. Of 190 respondents, 135 who identified as clinical or basic science researchers were included in the analysis.
Survey topics included: the relevance of data reuse to respondents’ work and their level of expertise with it; relevance and expertise regarding depositing data in a repository; uploading data to a repository; sharing practices (metadata, codebooks, processing); acknowledgement for sharing; and reasons for not sharing |
Measurement, Targeted |
Fry, J., Lockyer S., Oppenheim C., Houghton J. W., & Rasmussen B.
(2008). Identifying benefits arising from the curation and open sharing of research data produced within UK Higher Education and research institutes: exploring costs and benefits.
|
Investigated the benefits of the curation and open sharing of research data and the development of a methodology and model for estimating the benefits of data curation and sharing in UK higher education |
Performed a literature review to provide illustrative examples of reuse and the views of stakeholders in various disciplines towards data curation and sharing; conducted two case studies to identify and illustrate benefits and costs in these areas |
Measurement, Metrics, Wider |
Gibbs, H.
(2009). Southampton Data Survey: Our Experience and Lessons Learned.
|
To pilot the Data Asset Framework (formerly Data Audit Framework) methodology |
Used a version of the Data Asset Framework, modified mainly due to time constraints; distributed an online questionnaire to and conducted follow-up interviews with researchers at the University of Southampton |
Measurement, Wider |
Guindon, A.
(2014). Research Data Management at Concordia University: A Survey of Current Practices.
Feliciter. 60(2), 15 - 17. |
Assessed what researchers were doing with the data they generated, whether they were interested in sharing it with the academic community, and what types of research data management services the library could offer |
Conducted a survey of full-time faculty in four departments (Geography, Planning and Environment; Political Science; Psychology; and Sociology and Anthropology) and received 41 responses. Conducted post-survey interviews. Both the survey and interviews were based on the DCC Data Asset Framework. |
Measurement, Wider |
Hedstrom, M., Niu J., & Marz K.
(2006). Producing Archive-Ready Datasets: Compliance, Incentives, and Motivation.
IASSIST. |
Investigated effort researchers are willing to put into preparing data for deposit into an archive and incentives to induce researchers to improve the quality of data and metadata deposited |
Surveyed 170 researchers funded by the National Institute of Justice, which requires deposit of data in an established archive (the National Archive of Criminal Justice Data, NACJD, at the Inter-university Consortium for Political and Social Research) |
Measurement, Metrics, Targeted |
Hedstrom, M., & Niu J.
(2008). Incentives for Data Producers to Create “Archive-Ready” Data: Implications for Archives and Records Management.
Society of American Archivists Research Forum. |
Investigated researcher behavior and attitudes about depositing data from sponsored research |
Conducted a survey of 55 grantees of the National Institute of Justice |
Measurement, Targeted |
Huang, X., Hawkins B. A., Lei F., Miller G. L., Favret C., Zhang R., et al.
(2012). Willing or unwilling to share primary biodiversity data: results and implications of an international survey.
Conservation Letters. 5(5), 399 - 406. |
Investigated researchers’ attitudes, experiences, and expectations regarding the sharing and archiving of biodiversity data |
Conducted an online survey asking about the respondents’ demographics and research background, their attitudes and experiences regarding biodiversity data sharing, and their expectations regarding future data archiving practices; invitations were sent to specific researchers and then distributed by communications officers of select scientific societies; there were 372 valid responses (those in which at least three-quarters of the survey was completed) |
Measurement, Targeted |
Jerrome, N., & Breeze J.
(2009). Imperial College Data Audit Framework Implementation: Final Report.
|
To pilot the Data Audit Framework methodology, evaluate the scale and scope of research data, and make recommendations accordingly |
Used a modified form of the Data Audit Framework in multiple departments: applied the audit framework in a first phase of investigation, then conducted an online survey and follow-up interviews. |
Measurement, Wider |
Martinez-Uribe, L.
(2009). Using the Data Audit Framework: An Oxford Case Study.
|
Piloted the Data Audit Framework methodology in work to scope digital repository services for research data management |
Adapted the Data Audit Framework methodology |
Measurement, Wider |
McLure, M., Level A., Cranston C., Oehlerts B., & Culbertson M.
(2014). Data Curation: A Study of Researcher Practices and Needs.
portal: Libraries and the Academy. 14(2), 139 - 164. |
Investigated (1) the nature of data sets that researchers create or maintain; (2) how participants manage their data; (3) needs for support that the participants identify in relation to sharing, curating, and preserving their data; and (4) the feasibility of adapting the Purdue University Libraries’ Data Curation Profiles Toolkit interview protocol for use in focus groups with researchers |
Conducted five focus groups with 31 faculty, research scientists, and research associates |
Measurement, Wider |
Noor, M. A. F., Zimmerman K. J., & Teeter K. C.
(2006). Data Sharing: How Much Doesn't Get Submitted to GenBank?
PLoS Biol. 4(7), e228. |
Investigated frequency of researcher submission of DNA sequences to journals where their research was published |
Examined 290 papers in six journals with explicit policies requiring submission of DNA sequences to “GenBank” [Note: “GenBank” here refers collectively to GenBank, the European Molecular Biology Laboratory, and the DNA Data Bank of Japan] |
Measurement, Targeted |
Open Exeter Project Team
(2012). Summary Findings of the Open Exeter Data Asset Framework Survey.
|
Investigated how researchers at the University of Exeter created data, where they stored their data, whether they backed up their data, and what happened to their data when a project was finished |
Created an online survey adapted from the Digital Curation Centre’s Data Asset Framework methodology and conducted follow-up interviews with respondents. |
Measurement, Wider |
Parsons, T., Grimshaw S., & Williamson L.
(2013). Research Data Management Survey.
|
Sought to understand the baseline of RDM practices, gather researcher requirements for RDM, and raise awareness of and gauge interest in a proposed service |
After testing on a smaller population, conducted an online survey of career researchers and post-doctoral researchers at the University of Nottingham using targeted email |
Measurement, Wider |
Pepe, A., Goodman A., Muench A., Crosas M., & Erdmann C.
(2014). How Do Astronomers Share Data? Reliability and Persistence of Datasets Linked in AAS Publications and a Qualitative Study of Data Practices among US Astronomers.
PLoS ONE. 9(8), e104798. |
Investigated data sharing practices of astronomers over the last 15 years |
Analyzed URL links embedded in papers published by the American Astronomical Society; performed interviews with 12 scientists and online surveys with 173 scientists at the Harvard-Smithsonian Center for Astrophysics |
Measurement, Targeted |
Perry, C.
(2008). Archiving of publicly funded research data: A survey of Canadian researchers.
Government Information Quarterly. 25(1), 133 - 148. |
To assess researchers’ attitudes and behaviours in relation to archiving research data and to determine researchers’ views about policies relating to data archiving. Investigated how much of the data being produced in the course of SSHRC-funded research is being archived. Surveyed social sciences and humanities researchers from universities across Canada. |
A questionnaire comprising 15 questions was mailed to 175 researchers randomly sampled from a publicly available list of 5,821 individuals who had received grants and awards from the Social Sciences and Humanities Research Council of Canada (SSHRC). From this sample, 75 (43.4%) responded within the stipulated five-week time frame. The questionnaire was constructed using four existing surveys and asked researchers for information about: geographical location, years of research experience, research funding sources, current plans to archive research data, awareness of archiving policies, attitude to mandated research data archiving, effect of mandatory data archiving policies on grant-seeking, attitude to making archived research data accessible, and use of research data collected by others. The questionnaire also included space for respondents to make comments. |
Measurement, Wider |
Peters, C., & Dryden A.
(2011). Assessing the Academic Library's Role in Campus-Wide Research Data Management: A First Step at the University of Houston.
Science & Technology Libraries. 30(4), 387 - 403. |
Sought to interview PIs of significant grants, to assess individuals in as many science and engineering departments as possible, and to obtain information on data management practices from both individual and group-based projects |
Conducted interviews with the PIs of 10 projects (14 were contacted), as well as one co-PI, one postdoctoral researcher, and one graduate student associated with one of the projects |
Measurement, Wider |
Pienta, A. M., Alter G. C., & Lyle J. A.
(2010). The Enduring Value of Social Science Research: The Use and Reuse of Primary Research Data.
|
Investigated the extent to which social science research data are shared and whether data sharing affected the research productivity arising from the data themselves. |
Searched NSF and NIH databases to create a database of 7,040 research projects in the social and behavioral sciences funded by NSF and NIH from 1985-2001; surveyed the 4,883 unique PIs for these projects (there was a 24.9% response rate) about research data collected, methods of sharing data, attitudes about data sharing, and demographic information |
Measurement, Targeted |
Piwowar, H. A., & Chapman W. W.
(2008). Identifying Data Sharing in Biomedical Literature.
AMIA Annual Symposium Proceedings. 2008, 596 - 600. |
Investigated extent of data sharing in biomedical research |
Used natural language processing (NLP) techniques to find evidence of dataset sharing within 1,028 open access research articles that mentioned one or more of five databases |
Measurement, Targeted |
Piwowar, H. A., & Chapman W. W.
(2010). Public sharing of research datasets: a pilot study of associations.
Journal of informetrics. 4(2), 148 - 156. |
Investigated whether data sharing frequency was associated with funder and publisher requirements, journal impact factor, or investigator experience and impact |
Used a previously-created set of 397 articles in 20 journals describing studies using gene expression microarray data; identified which studies had made their raw datasets available; used multivariate logistic regression to evaluate the association between authorship, grant, and journal attributes of a study and the public availability of its microarray data |
Measurement, Targeted |
Piwowar, H. A.
(2011). Who Shares? Who Doesn't? Factors Associated with Openly Archiving Raw Research Data.
PLoS ONE. 6(7), e18657. |
Investigated patterns in the frequency with which researchers openly archive raw gene expression microarray datasets after research publication |
Performed a full-text query of 5 databases to identify 11,603 articles published between 2000 and 2009 that describe the creation of gene expression microarray data; performed multivariate regression on 124 bibliometric attributes of the articles, which revealed 15 factors describing authorship, funding, institution, publication, and domain environments. |
Measurement, Targeted |
Read, K. B., Sheehan J. R., Huerta M. F., Knecht L. S., Mork J. G., Humphreys B. L., et al.
(2015). Sizing the Problem of Improving Discovery and Access to NIH-Funded Data: A Preliminary Study.
PLoS ONE. 10(7), e0132735. |
Investigated the discovery of and access to biomedical datasets to provide a preliminary estimate of the number and type of datasets generated annually by research funded by the U.S. National Institutes of Health (NIH); specifically those that are “invisible” or not deposited in a known repository |
Analyzed NIH-funded journal articles that were published in 2011, cited in PubMed and deposited in PubMed Central (PMC) to identify articles where data were submitted to a known repository; excluded these and analyzed a random sample of the remaining articles to estimate how many and what types of invisible datasets were used in each article |
Measurement, Targeted |
Scaramozzino, J., Ramírez M., & McGaughey K.
(2012). A Study of Faculty Data Curation Behaviors and Attitudes at a Teaching-Centered University.
College & Research Libraries. 73(4), 349 - 365. |
Investigated science researchers’ data curation awareness, behaviors, and attitudes, as well as what needs they exhibited for services and education regarding maintenance and management of data |
Distributed a survey via email to 331 College of Science and Mathematics faculty at California Polytechnic State University, San Luis Obispo (Cal Poly), a master’s-granting, teaching-centered institution. Filtered results to include only science faculty from the Biology, Chemistry, Kinesiology, Mathematics, Physics, and Statistics departments who engaged in data collection in the course of their research (131 tenure-track faculty, of whom 82 responded, a 62.6% response rate) |
Measurement, Wider |
Science Staff
(2011). Challenges and Opportunities.
Science. 331(6018), 692 - 693. |
Investigated issues surrounding the growing volume of research data |
Performed a survey of Science peer reviewers, receiving 1,700 responses; asked about frequency of use of datasets from published literature and archival databases, the size of the largest dataset used or generated, where most of the data they generate is archived, whether they have asked colleagues for research data, whether the data were provided, whether there is adequate expertise in their lab or group to analyze their data in the way desired, and whether there is sufficient funding for data curation in their group |
Measurement, Targeted |
Sturges, P., Bamkin M., Anders J. H. S., Hubbard B., Hussain A., & Heeley M.
(2015). Research data sharing: developing a stakeholder-driven model for journal policies.
Journal of the Association for Information Science and Technology. |
Investigated the state of journal data sharing policies and the views and practices of stakeholders regarding data sharing in order to outline a model journal research data sharing policy |
Reviewed the web pages of 371 journals, including the most and least cited journals internationally and nationally, and extracted categories of policy based on the definitions of strong and weak policies in Piwowar and Chapman 2008b; conducted 13 interviews with key stakeholders selected on the basis of their expertise in data sharing issues |
Measurement, Targeted |
Tenopir, C., Allard S., Douglass K., Aydinoglu A., Wu L., Read E., et al.
(2011). Data Sharing by Scientists: Practices and Perceptions.
PLoS ONE. 6(6), e21101. |
Investigated scientists’ data sharing practices and their perceptions of the barriers and enablers of data sharing |
Conducted an internet survey including questions about demographics and about scientists’ relationship with data; the survey was distributed initially using a snowball approach (contacting specific individuals who could promote the survey) and then by targeting universities in states with a low response rate. In all, 1,329 respondents answered at least one question (an estimated response rate of 9%) |
Measurement, Targeted |
Tenopir, C., Dalton E. D., Allard S., Frame M., Pjesivac I., Birch B., et al.
(2015). Changes in Data Sharing and Data Reuse Practices and Perceptions among Scientists Worldwide.
PLoS ONE. 10(8), e0134826. |
Examined the state of data sharing and reuse perceptions and practices among research scientists as compared to the 2009/2010 baseline study (reported in Tenopir et al. 2011); examined differences in practices and perceptions across age groups, geographic regions, and subject disciplines |
Used snowball and volunteer sampling approaches to recruit respondents to an online survey; the survey was also distributed via a variety of listservs |
Measurement, Targeted |
Thornhill, K., & Palmer L.
(2014). An Assessment of Doctoral Biomedical Student Research Data Management Needs.
|
Explored doctoral biomedical students’ research data management needs in relation to the institutional repository at the University of Massachusetts Medical School |
Conducted a literature review; sent a data needs assessment survey, based on the DCC lifecycle model and NSF requirements for data management, to 470 students on an email discussion list. |
Measurement, Targeted |
UNC-CH
(2012). Research Data Stewardship at UNC: Recommendations for Scholarly Practice and Leadership.
|
Sought to identify policy options for digital research data stewardship at UNC and to further understanding of the full breadth of activities, concerns, and opinions surrounding research data stewardship among researchers at UNC-CH |
Conducted semi-structured interviews with 23 faculty researchers representing several disciplines at UNC-CH; conducted an online survey of all faculty members, graduate students, and staff assigned to departments that engage in research |
Measurement, Wider |
Waller, M., & Sharpe R.
(2006). Mind the Gap: Assessing Digital Preservation Needs in the UK.
|
A study carried out for the Digital Preservation Coalition (DPC) to reveal the extent of the risk of loss or degradation to digital material held in the UK's public and private sectors |
Surveyed 900 individuals from a wide range of organisations in different sectors. The selected individuals all had an assumed interest in digital preservation as part of their professional responsibilities and represented a range of roles, including records managers, archivists, and librarians, as well as IT managers and data producers. 104 responses were received, a response rate of over 10%. Respondents came from education, libraries, archives, museums, local and central government bodies, scientific research institutions, and from organisations in the pharmaceutical, financial, manufacturing and engineering, media, energy and chemical, and publishing sectors.
Note: Discusses duration for keeping data. |
Measurement, Wider |
Wallis, J. C., Rolando E., & Borgman C. L.
(2013). If We Share Data, Will Anyone Use Them? Data Sharing and Reuse in the Long Tail of Science and Technology.
PLoS ONE. 8(7), e67332. |
Investigated data sharing practices among scientists and technology researchers in CENS, a National Science Foundation Science and Technology Center. This was done as part of efforts to identify infrastructure needs for research data produced in long tail science. |
Conducted two rounds of interviews with researchers, students, and staff in CENS, in the fourth and eighth years of the study, alongside ten years of ethnographic observation |
Measurement, Targeted |
Wicherts, J. M., Bakker M., & Molenaar D.
(2011). Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results.
PLoS ONE. 6(11), e26828. |
Investigated reasons for researchers’ reluctance to share data from published research |
Related willingness to share data (as measured by requesting data from the authors of 49 papers published in two high-ranking APA journals) to the internal consistency of the statistical results in the papers and to the distribution of reported statistically significant (p<.05) p-values |
Measurement, Targeted |
Bigagli, L., Sveinsdottir T., Wessels B., Smallwood R., Linde P., Tsoukala V., et al.
(2014). Infrastructural and technological challenges and potential solutions.
|
Investigated infrastructural and technological barriers to Open Access and preservation of research data in Europe. This work was conducted within the EU FP7 funded project RECODE, which focuses on developing policy recommendations for Open Access to Research Data in Europe. In particular, this work is coordinated by RECODE Work Package 2 (WP2), Infrastructure and technology. It distinguishes between different categories of stakeholders in terms of how they experience and respond to these challenges |
Conducted desk research, an online survey, interviews, and a validation workshop |
Measurement, Metrics, Wider |
Noorman, M., Kalaitzi V., Angelaki M., Tsoukala V., Linde P., Sveinsdottir T., et al.
(2014). Institutional barriers and good practice solutions.
|
Investigated challenges faced by institutions, such as archives, libraries, universities, data centres and funding bodies, in making open access to research data possible. This work was conducted within the EU FP7 funded project RECODE, which focuses on developing policy recommendations for Open Access to Research Data in Europe. |
Conducted desk research, case study interviews, and a validation workshop |
Measurement, Wider |
Finn, R., Wadhwa K., Taylor M. J., Sveinsdottir T., Noorman M., & Sondervan J.
(2014). Legal and ethical barriers and good practice solutions.
|
Identify legal and ethical issues relevant to open access to research data in Europe, identify examples that illuminate these issues, and identify potential solutions currently being used to address these issues |
Conducted a literature review, five disciplinary case studies, and a validation workshop |
Measurement, Wider |
Sveinsdottir, T., Wessels B., Smallwood R., Linde P., Kala V., Tsoukala V., et al.
(2013). Stakeholder values and relationships within open access and data dissemination and preservation ecosystems.
|
Identify and map the diverse range of stakeholder values in Open Access data and data dissemination and preservation; map stakeholder values on to research ecosystems using case studies from different disciplinary perspectives; conduct a workshop to evaluate and identify good practice in addressing conflicting value chains and stakeholder fragmentation. This work was conducted within the EU FP7 funded project RECODE, which focuses on developing policy recommendations for Open Access to Research Data in Europe. |
Conducted desk research, case study interviews, and a validation workshop |
Measurement, Metrics, Wider |
Beagrie, N., & Houghton J.
(2012). Economic Impact Evaluation of the Economic and Social Data Service.
|
Sought to (i) evaluate the economic benefits and impact of ESDS; and (ii) contribute to the further development of impact evaluation methods that can provide ESRC with robust estimates of the economic benefits of its data service infrastructure investments |
Drew on (i) desk-based analysis of existing evaluation literature and reports, looking at both methods and findings; (ii) existing data from KRDS and other studies; (iii) existing management and internal data collected by ESRC and ESDS, such as user statistics, internal reports, and the ESDS Mid-Term Review; and (iv) original data collection in the form of semi-structured interviews, case studies, and an online survey of ESDS users and depositors |
Measurement, Metrics, Targeted |