In This Article: Reporting Research Findings

  • Introduction
  • Reference Resources
  • History and Trends
  • Guidance on Reporting Quantitative Reports, Syntheses, and Meta-analyses
  • Linguistic Analyses of Written Research Results
  • Writing Review Articles
  • Writing Qualitative Research
  • Scientific Reviewing
  • Rhetoric of Evidence-Based Management
  • Research on Graphics Perception
  • Statistical-Technological Trends

Related Articles

  • Ethics in Research
  • Research Methods


Reporting Research Findings

by James T. Austin. Last reviewed: 24 June 2020. Last modified: 24 June 2020. DOI: 10.1093/obo/9780199846740-0032

Not all research culminates in publication. This updated article surveys themes in reporting research findings for scholars and students. As context, consider that investigations of organizational phenomena require a series of choices that are cast here as craft. Choices span primary, secondary, and synthesis designs across qualitative and quantitative traditions. Primary research is the traditional design, measurement, and analysis of collected data, while secondary research involves reanalysis of existing data sets (obtained from peers or repositories), and research synthesis involves narrative or quantitative aggregation of studies. This distinction also holds for the qualitative mode. Reporting research findings is important for dissemination and for synthesis and evidence-based management (EBM). Primarily, the importance lies in dissemination across conferences, journals, books, and increasingly digital media. Understanding and replication by outside scholars depend on complete and accurate reporting; this centrality to the research craft commands a learning-development focus. Within a communications paradigm, individuals or teams create or send a persuasive message and the reader or listener receives (or may choose not to receive) the message. Persuasion is targeted via rhetoric across writing and graphics. Although oral and written forms of dissemination dominate, data repositories are emerging. Two additional reasons for importance pertain to the accumulation of knowledge. One is research synthesis. Structuring knowledge through synthesis uses the results of individual studies as data, and the audience is scientists. Narrative and quantitative reviews depend on the completeness and accuracy of reported findings. A related source of importance pertains to evidence-based management at the interface of research and practice—translation of research findings into practices and bundles of practices that can be used by managers. 
Given that practicing managers appear to rely on obsolete knowledge (the “fads, fashions, and folderol” of Dunnette), proponents of evidence-based management advocate that firms adopt an approach modeled on evidence-based medicine. Communicating clearly and establishing a context of implementation to assist practitioners are essential for EBM (a parallel to research synthesis, but for an audience of practitioners). This article organizes a range of resources on writing and reviewing articles across the taxonomy above. For completeness, it includes citations for scientific graphics (tables, charts, figures, etc.) organized around conceptualizations of graphics and related guidance, research on perception of scientific graphics, and recent developments in computing technology. Especially relevant are software routines for interactive graphics based on “grammars.” While this article draws on work in management studies (organizational behavior and human resources), it necessarily searches beyond traditional boundaries for relevant insights.

There are sporadic specialized sources on the reporting of research findings. On scholarly writing, Cummings and Frost 1995 is an influential analysis of the publishing system in the organizational sciences. Abelson 1995 defines rhetoric as styles of writing up results in psychology. Research synthesis writing is addressed comprehensively in Cooper, et al. 2009 (cited under Guidance on Reporting Quantitative Reports, Syntheses, and Meta-analyses). There are two major standards available for research synthesis: Meta-Analysis Reporting Standards (MARS) and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). For graphics and quantitative studies, Tufte 2001 and Tukey 1977 are classics for guidance and perspective; others, including Cleveland 1985, Kosslyn 2006, Wainer 2000 (cited under History and Trends), and Wilkinson 2005, provide unique value. The work on maps in Börner 2015 is aptly named Atlas of Knowledge, while Grant 2019 provides a concise introduction to data visualization with a section on interactive graphics (a related instance is the class of data explorers used for large data sets such as the Programme for International Student Assessment [PISA] and the National Assessment of Educational Progress [NAEP], both large-scale testing programs). Sternberg and Sternberg 2010 is typical of guidance offered to students and is not the only such resource. Many of these texts can be mined for dimensions to code the content and results of published organizational behavior and human resources research to facilitate critique. A trio of books by Katy Börner (Börner 2010, Börner 2015) and colleagues (Börner and Polley 2014) represents the newest work in knowledge mapping. In addition, a rapidly emerging topic across science is the reproducibility and replicability of results; the consensus review published in 2019 by a committee of the National Academies of Sciences, Engineering, and Medicine provides an excellent overview.

Abelson, Robert P. Statistics as Principled Argument . Mahwah, NJ: Lawrence Erlbaum, 1995.

Describes magnitude-articulation-generality-interestingness-credibility (MAGIC) criteria to organize rhetoric in presenting research findings. Accepting statistics as an organizer of arguments using quantitative evidence allows identification of styles. Brash and stuffy are end points on a liberal-conservative style dimension. Management students and scholars could learn MAGIC for reporting quantitative findings; qualitative researchers might consider translation.

Börner, Katy. Atlas of Science: Visualizing What We Know. Cambridge, MA: MIT Press, 2010.

Books by Katy Börner show the potential and the practice of science and knowledge mapping. Atlas of Science (2010) presents three themes: power of maps (switching from geographic cartography to research-collaboration mapping), reference systems, and forecasts, as well as numerous examples.

Börner, Katy. Atlas of Knowledge: Anyone Can Map. Cambridge, MA: MIT Press, 2015.

Börner deftly gives readers principles for visualizing knowledge with more than forty large-scale and over a hundred small-scale color maps. Drives home the point that data literacy is as important as language literacy. She introduces a theoretical framework meant to guide readers through user and task analysis; data preparation, analysis, and visualization; visualization deployment; and the interpretation of science maps. Together with Börner 2010 and Börner and Polley 2014, this trio provides levels of analysis from frameworks to workflow that support improved visualizations of science, knowledge, and interdisciplinary collaboration.

Börner, Katy, and David E. Polley. Visual Insights: A Practical Guide to Making Sense of Data. Cambridge, MA: MIT Press, 2014.

Along with Börner 2010 and Börner 2015, this practical book, based on the Information Visualization MOOC, includes seven chapters, from a visualization framework through “when, where, what, and with whom” and dynamic visualizations, and concludes with chapters on case studies and discussion/outlook.

Cleveland, William S. The Elements of Graphing Data . Monterey, CA: Wadsworth Advanced Books and Software, 1985.

Cognitive science and statistical principles help dissect and improve graphics (a predecessor book from 1983 and articles that searched prestigious journals for common graphic errors are also useful). Based on extensive experience with AT&T data, the author distills and emphasizes procedural knowledge for constructing graphic displays.

Cummings, Larry L., and Peter J. Frost, eds. Publishing in the Organizational Sciences . 2d ed. Foundations of Organizational Science. Thousand Oaks, CA: SAGE, 1995.

This classic covers most aspects of publishing in organizational behavior and human resources (absent are emergent digital-technological issues). Organized into sections on perspectives on and realities of publishing, which are insightful for scholar and student alike. Benjamin Schneider’s ten propositions on “getting research published” end with practicing the skill of writing. This edition inaugurated the Foundations of Organizational Science series, and the 1985 edition is also useful.

Few, Stephen. Now You See It: Simple Visualization Techniques for Quantitative Analysis. Oakland, CA: Analytics Press, 2009.

Suggests that in a data-dense world the human brain, and hence visualization, is key to avoiding overload. Three sections (“Building Core Skills for Visual Analysis,” “Honing Skills,” and a concluding “Further Thoughts and Hopes” with eight promising trends), the first two with six chapters each, cover much ground. The most substantive portion, reflecting the author’s quantitative preferences, is Part 2. The book ends with an excerpt from the poetry of T. S. Eliot.

Grant, Robert. Data Visualization: Charts, Maps and Interactive Graphics . Boca Raton, FL: CRC Press, 2019.

The author provides a vast range of data visualization examples, mostly open source and with code available on a companion website. The book offers a good mix of detail and shared tacit knowledge.

Kosslyn, Stephen M. Graph Design for the Eye and Mind . New York: Oxford University Press, 2006.

DOI: 10.1093/acprof:oso/9780195311846.001.0001

Based on sound cognitive science and ample research by the author, presents and elaborates eight principles of effective graph construction (summarized in pp. 5–20). Analyzes prominent guidance on graphics (Edward R. Tufte’s, for example) and suggests flaws that could lead to productive research.

Sternberg, Robert J., and Karin Sternberg. The Psychologist’s Companion: A Guide to Writing Scientific Papers for Students and Researchers. 5th ed. Cambridge, UK: Cambridge University Press, 2010.

DOI: 10.1017/CBO9780511762024

Aligned to American Psychological Association (APA) style as a prototype of good practice in publishing. The first author is a productive researcher and APA journal editor, so the tacit knowledge in this edition is well grounded and well expressed. Represents a class of books on research communication; some translation to the organizational behavior and human resources context is required. Comparable to Cooper 2010 (cited under Writing Review Articles). The next edition will need to conform to the seventh edition of the Publication Manual of the American Psychological Association.

Tufte, Edward R. The Visual Display of Quantitative Information . 2d ed. Cheshire, CT: Graphics Press, 2001.

Revises a classic 1983 text in analytic design (Tufte’s preferred term); presents and expands on five core principles and coins numerous terms (“chartjunk” as well as “sparkline” and “data-ink ratios” are personal favorites). Critiqued for its advice, however, by other researchers on graphics (Kosslyn 2006).

Tukey, John W. Exploratory Data Analysis . Reading, MA: Addison-Wesley, 1977.

A classic presenting Tukey’s data detective work, rooted in his 1962 exposition “The Future of Data Analysis” (Annals of Mathematical Statistics 33.1: 1–67). The premise is that exploratory data analysis (EDA) deserves equal status with confirmatory analysis. Loaded with the philosophy of EDA and its tools: the stem-and-leaf display, the box plot, and the “five-number summary.” Graphic display and analysis are covered in the service of learning about data. A part of the research craft to be honed post-schooling.

Wilkinson, Leland L. The Grammar of Graphics . 2d ed. New York: Springer-Verlag, 2005.

Cited by many, this conceptualization rooted in the work of Jacques Bertin extends work done with the Task Force on Statistical Reporting in 1999. Within an object-oriented design approach, the grammar consists of the rules and elements of graphics, for example, geoms, scales, and coordinates. Framework has been useful for deriving tools, such as Wilkinson’s GPL, Wickham’s ggplot2, and others.



PLOS ONE

Transparency in conducting and reporting research: A survey of authors, reviewers, and editors across scholarly disciplines

Mario Malički

IJsbrand Jan Aalbersberg

Adrian Mulligan

Gerben ter Riet


Competing Interests: IJsbrand Jan Aalbersberg is Senior Vice President of Research Integrity at Elsevier, and Adrian Mulligan is a Research Director for Customer Insights at Elsevier. Mario Malički is a Co-Editor-in-Chief of the journal Research Integrity and Peer Review. The other authors declare no competing interests. This does not alter our adherence to PLOS ONE policies on sharing data and materials.

* E-mail: [email protected]

Received 2022 Jan 31; Accepted 2022 Jun 3; Collection date 2023.

This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Calls have been made for improving transparency in conducting and reporting research, improving work climates, and preventing detrimental research practices. To assess attitudes and practices regarding these topics, we sent a survey to authors, reviewers, and editors. We received 3,659 (4.9%) responses out of 74,749 delivered emails. We found no significant differences between authors’, reviewers’, and editors’ attitudes towards transparency in conducting and reporting research, or towards their perceptions of work climates. Undeserved authorship was perceived by all groups as the most prevalent detrimental research practice, while fabrication, falsification, plagiarism, and not citing prior relevant research were seen as more prevalent by editors than by authors or reviewers. Overall, 20% of respondents admitted sacrificing the quality of their publications for quantity, and 14% reported that funders interfered in their study design or reporting. While survey respondents came from 126 different countries, due to the survey’s overall low response rate our results might not be generalizable. Nevertheless, the results indicate that greater involvement of all stakeholders is needed to align actual practices with current recommendations.

Introduction

Scholarly publishing has been growing steadily for the last two centuries, with estimates of more than 3 million articles published per year [ 1 ]. In recent times, there has been an increasing focus on detecting and preventing misconduct and detrimental research practices, enhancing research rigor and transparency, and cultivating a research climate best suited to foster these goals [ 2 – 5 ]. Specifically, calls have been made to increase transparency in conducting and reporting research by registering projects and data analysis plans before data collection, using reporting guidelines when writing up studies, posting preprints, sharing (raw) data, reproducing or replicating studies, as well as rewarding, hiring, or promoting researchers based on these practices [ 6 – 9 ]. One such call was made in 2014, when the Transparency and Openness Promotion (TOP) Committee developed eight transparency standards related to: 1) citations, 2) data, 3) analytic methods (code), 4) research materials, 5) data and analysis reporting, 6) preregistration of studies, 7) preregistration of analysis plans, and 8) replication of research [ 10 ]. Since then, more than 5,000 journals and 80 organizations have become TOP signatories [ 11 ]. However, to the best of our knowledge, attitudes related to all aspects of the TOP guidelines have not been systematically assessed, and without agreement from all stakeholders regarding these guidelines, it is unlikely that scholarly practices will change.

It was therefore our goal to assess differences in attitudes and perceptions between authors, reviewers, and editors regarding the TOP guidelines, differences in perceptions of their work climates, and differences in their perceived prevalence of responsible and detrimental research practices.

Materials and methods

We reported our study following the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines [ 12 ], as well as the Checklist for Reporting Results of Internet E-Surveys (CHERRIES) [ 13 ].

Study design and participants

Full methodological details of our study, including its registration and study protocol are available at our project’s data repository [ 14 ]. In short, we sent a survey to 100,000 email addresses. The email addresses were from: 1) randomly selected corresponding authors of papers indexed in Scopus (n = 99,708); or 2) editors whose Instructions to Authors we analysed in our previous study (n = 292) [ 15 ].

The survey was sent on 24 April 2018, with reminders on 9 and 24 May, and was closed on 12 June of that same year. The survey invitation and reminders, the full survey questions, and details of their development and testing are available on our project’s data repository [ 14 ]. Respondents could skip items and change their answers until submitting their responses by clicking the “close the survey” button. The estimated time to finish the survey, 12 minutes (based on pilot results), was listed in the survey invitation.

Variables and their measurement

The survey was divided into 4 sections and the questions were presented in the same order to all participants:

attitudes towards transparency in conducting and reporting research (11 questions covering the TOP guidelines with 5-point Likert-type answers ranging from strongly agree to strongly disagree);

perceptions of work climate (13 questions with 5-point Likert-type answers ranging from strongly agree to strongly disagree);

perceived prevalence of responsible and detrimental research practices (14 questions with 5-point Likert-type answers ranging from very prevalent to never occurring);

sociodemographic and knowledge of statistics questions (10 questions with categorical answers).

All questions in the first three sections also included a Don’t know/Not applicable option. In section 4, respondents could select one of 28 prespecified scholarly fields, a multidisciplinary category, or an “other” category where they could write in their field(s) using an open text response format. All answers provided in free text, as well as the 29 choices, were recoded to one of the 6 major categories used in our previous study on Instructions to Authors: Arts & Humanities, Health Sciences, Life Sciences, Physical Sciences, Social Sciences, and Multidisciplinary [ 15 ]. Additionally, we had open-ended questions (free text format) that explored reasons for respondents’ (dis)agreement with the questions of the first three sections, and a final survey question for general feedback on the survey. To maintain the focus on quantitative data and keep a reasonable reporting length, analysis of the open-ended answers is planned for another publication.
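The recoding step described above can be sketched as a simple lookup with a fallback for unmatched free-text answers. The six category names come from the text; the individual field names in the mapping below are hypothetical illustrations, not the study’s actual coding table:

```python
# Hypothetical recode table: raw survey field answers (prespecified
# choices or free text) collapsed into the six major categories
# named in the text. The field names here are illustrative only.
FIELD_TO_CATEGORY = {
    "medicine": "Health Sciences",
    "nursing": "Health Sciences",
    "physics": "Physical Sciences",
    "chemistry": "Physical Sciences",
    "biology": "Life Sciences",
    "history": "Arts & Humanities",
    "sociology": "Social Sciences",
    "multidisciplinary": "Multidisciplinary",
}

def recode_field(answer, table=FIELD_TO_CATEGORY):
    """Map a raw field answer to a major category; unmatched
    free-text answers are flagged for manual coding."""
    return table.get(answer.strip().lower(), "NEEDS MANUAL CODING")

print(recode_field("  Physics "))   # Physical Sciences
print(recode_field("astrobotany"))  # NEEDS MANUAL CODING
```

A residual “needs manual coding” bucket lets unmatched write-in answers be reviewed by hand rather than silently miscoded.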

Statistical methods and study bias

We conducted all analyses in STATA v.13, with statistical code and outputs published on our project’s data repository [ 14 ]. The grouping of respondents into authors (A), reviewers (R), and editors (E) is explained in detail in the Appendix. The main outcomes (answers to the questions in sections 1 to 3 listed above) are presented as absolute numbers and percentages (calculated on the basis of the number of respondents who answered a specific question). For readability, percentages are shown only for those who agreed or strongly agreed with statements, or who perceived practices as prevalent or very prevalent. Data for all answer options are available on our project’s data repository [ 14 ]. Differences in sociodemographic characteristics and knowledge of statistics between authors, reviewers, and editors were explored using the chi-squared test for categorical variables, and the Kruskal-Wallis test for respondents’ age. To explore possible associations between answers to the questions of sections 1 to 3 and explanatory variables (sociodemographic characteristics and knowledge of statistics) we used ordinal regression analyses. We conducted regressions for all 38 questions individually. We also explored treating the questions of each section as separate scales (consisting of 11, 13, and 14 questions, respectively). We constructed 3 summary scores for those scales, which we then also explored in ordinal regression analyses. To adjust for multiple comparisons, we considered p ≤ 0.001 statistically significant (based on the Bonferroni method of dividing 0.05 by 50, the number of conducted regressions rounded up to the nearest ten). For readability, regression outputs are presented graphically in the Appendix, while details of the analyses, odds ratios, and their associated 95% CIs are available on our project’s data repository [ 14 ].
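The multiple-comparison adjustment described above reduces to one line of arithmetic; a minimal sketch follows (the function name and rounding helper are ours, not part of the study’s STATA code):

```python
# Bonferroni-style threshold as described in the text: 0.05 divided
# by the number of regressions (38 individual questions + 3 summary
# scores = 41), rounded up to the nearest ten -> 50.
import math

def bonferroni_threshold(alpha, n_tests, round_up_to=10):
    """Divide alpha by n_tests after rounding the test count up
    to the nearest multiple of `round_up_to`."""
    rounded = math.ceil(n_tests / round_up_to) * round_up_to
    return alpha / rounded

threshold = bonferroni_threshold(0.05, 38 + 3)  # ≈ 0.001 (0.05 / 50)
```

Rounding the test count up before dividing makes the threshold slightly more conservative than a plain Bonferroni correction over 41 tests.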

Respondents’ inquiries and deviations from the protocol

Due to miscommunication within the team, instead of adding the collected email addresses of the editors from our previous study on Instructions to Authors [ 15 ] to the total of 100,000 planned invites, 58 of the collected email addresses were used for piloting the survey, and 292 were incorporated into the 100,000 invites sent. Additionally, during survey creation, instead of the planned options of 0, 1–4, 5–10, and 10+ for the question “How many articles (approximately) did you review in the last 12 months?”, the survey options were 1, 2–5, 6–10, and 10+. In the invite email, respondents were asked to contact us if they encountered any (technical) difficulties. Details of their questions are listed in the Appendix.

Ethics approval

An ethics waiver for the study has been obtained on 6 April 2018 from the Medical Ethics Review Committee of the Academic Medical Center, University of Amsterdam (reference number W18_112). The survey invitation (available on our project’s data repository [ 14 ]) included information on the study purpose, investigators, estimated length to complete the survey, planned sharing of anonymised data, and publication of summary results. No incentives were offered for participation.

Results

Our overall response rate was 4.9% (3,659 out of 74,749 delivered emails), and responses included 1,389 authors, 1,833 reviewers, and 434 editors. Respondents came from 126 countries, most commonly the USA (16%), India (8%), Italy (7%), and Brazil (5%). Respondents’ median age was 44 (IQR 35 to 55, range 23 to 90). The majority worked for universities or colleges (62%), and most were male (66%). While respondents came from all scholarly disciplines, most were from the Physical (33%) and Life Sciences (25%), followed by Multidisciplinary (18%) and Health Sciences (13%). Similarly, while respondents came from all career stages, most had a publication record of 6 to 25 publications (39%) or more than 50 publications (25%). Finally, most respondents considered themselves to have either basic (44%) or intermediate (45%) knowledge of statistics. Full details on the response rate calculation, the assignment of respondents to authors, reviewers, and editors, their self-declared statistical knowledge, and sociodemographic characteristics are available in the Appendix. Summary results are presented below, while all answers, as well as percentages of respondents who chose the “Don’t know” or “Not applicable” options, are shown in the Appendix.
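As a sanity check on the headline figure, the response-rate arithmetic from the text can be reproduced directly (numbers as reported; this is an illustration, not the study’s analysis code):

```python
# Response rate: 3,659 responses out of 74,749 delivered emails.
delivered = 74_749
responses = 3_659
rate_percent = 100 * responses / delivered
print(round(rate_percent, 1))  # 4.9
```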

Attitudes toward transparency in reporting and conducting research

Respondents’ attitudes towards 11 statements on transparency in reporting and conducting research, which were based on the TOP guidelines, are shown in Fig 1 . Across all respondents, the lowest support was for preregistering studies prior to conducting the research (21%), and the highest for appropriately citing all data, analytic methods (program code), and materials used in the study (95%). Regression analyses (Appendix Table A7 in S1 Appendix ) revealed no significant differences between authors, editors, and reviewers, but significant associations with 3 factors: 1) discipline: Health Sciences researchers had more positive attitudes towards transparency in reporting and conducting research than researchers from other disciplines; 2) number of publications: researchers with fewer than 6 publications had more positive attitudes than other researchers; 3) country: respondents from India had more positive attitudes than respondents from other countries ( Fig 2 ).

Fig 1. Respondents’ attitudes toward transparency in reporting and conducting research.

Fig 1

In the numerical comparison, differences between groups larger than 5% are shown in bold. In the graphical comparison, the highest percentage is shaded darkest. * Questions are sorted by the total agreement percentage. The order of questions as asked in the survey is presented in the Appendix. The number of respondents varies slightly per question; exact numbers are available on our project’s data repository.

Fig 2. Examples of univariate comparisons that were confirmed in regression analyses to be significantly associated with attitudes towards transparency in reporting and conducting research, perceptions of work climate or perceived prevalence of responsible and detrimental research practices.

Fig 2

In the graphical comparison, the highest percentage is shaded darkest. The number of respondents varies slightly per question; exact numbers are available on our project’s data repository.

Perceptions of work climate

Respondents’ perceptions of 13 statements about their own work climate are shown in Fig 3 . Most respondents stated that they share their research data with other researchers unless legal or ethical reasons prevent them from doing so (79%), and that having access to others’ data benefits (or would benefit) them (79%). Two thirds (66%) considered the quality of peer review they received to be generally high, as well as the quality of publications in their field (64%). One fifth of respondents (20%) stated that due to the pressure to publish they sacrifice the quality of their publications for quantity, and one seventh (14%) stated that funders or sponsors interfered in their study design or study reporting. Regression analyses (Appendix Table A8 in S1 Appendix ) revealed no significant differences between authors, editors, and reviewers, but significant associations between perceptions of work climate and two factors: 1) country: respondents from India or the USA had more positive perceptions of their work climate than respondents from other countries; 2) age: younger respondents perceived a worse work climate than older respondents (e.g., younger respondents perceived more pressure to sacrifice quality for quantity, and they more often stated that long peer review times had negatively impacted their careers, Fig 2 ).

Fig 3. Respondents’ perceptions toward their work climate.

Fig 3

Perceived prevalence of responsible and detrimental research practices

Respondents’ perceived prevalence of responsible and detrimental research practices is shown in Fig 4 . Among detrimental practices, 38% of respondents perceived undeserved authorship, and 33% perceived prior relevant research not being cited, as (very) prevalent in their field. Of responsible practices, 32% perceived self-reporting of limitations, 19% sharing of raw data, and 6% publication of studies with null or negative results to be (very) prevalent. Regression analyses (Appendix Table A9 in S1 Appendix ) revealed that editors perceived fabrication, falsification, plagiarism, and omission of references to be more prevalent than authors or reviewers did. Additionally, editors also perceived undeclared conflicts of interest and publication of studies with null or negative results to be more prevalent than reviewers did (but not than authors, Appendix Table A9 in S1 Appendix ).

Fig 4. Respondents’ perceived prevalence of responsible and detrimental research practices.

Fig 4

Regression analyses also revealed 3 other factors associated with the perceived prevalence of research practices: 1) country: respondents from India or the USA perceived detrimental research practices to be more prevalent than respondents from other countries; 2) discipline: Health Sciences respondents perceived responsible practices, i.e., use of reporting guidelines, open peer review, publication of studies with null or negative results, and reporting of study limitations, as more prevalent than respondents from other disciplines, but they also perceived one detrimental practice, ghost writing, to be more prevalent than respondents from other disciplines did; 3) age: younger respondents perceived undeserved authorship, as well as prior relevant research not being cited, as more prevalent than older respondents ( Fig 2 ).

Our study has shown that authors, reviewers, and editors were not supportive of all TOP recommendations for transparency in conducting and reporting research. For example, while 95% of respondents (strongly) agreed that researchers must appropriately cite study data, methods, and materials, and large majorities agreed that authors must follow reporting guidelines (74%), that journals must encourage publication of replication studies (61%), and that authors must share data (60%), only half (50%) felt that journals have to verify that study findings are replicable using the deposited data and methods of analysis, and only 21% that studies must be preregistered. While we found no significant differences in these attitudes between authors, reviewers, and editors, we did observe differences between respondents of different countries, disciplines, and research seniority. Overall, younger respondents, those from the Health Sciences, and those from India had more positive attitudes towards the TOP recommendations. Direct comparisons of our results with other surveys are difficult because, to the best of our knowledge, no survey conducted since the TOP guidelines were published has addressed all aspects of the guidelines or covered respondents from all the disciplines and countries represented in our survey. Furthermore, the year in which a survey is conducted, wording differences, the use of scales versus single questions to assess attitudes, and differences in the sociodemographic data collected as potential explanatory variables often preclude direct comparisons [ 16 ]. Nevertheless, a survey of 1,015 authors of observational studies, also conducted in 2018, showed that 63% of respondents used reporting guidelines, and that their awareness of, attitudes towards, and use of reporting guidelines were influenced by journals’ endorsements [ 17 ], which was also echoed in earlier studies [ 18 ].
Our previous analysis of journals’ endorsements, however, showed that only 13% of journals across disciplines recommended the use of reporting guidelines, and only 2% required it [ 15 ]. Data sharing practices across the sciences have not been systematically explored, but recent estimates indicated that data sharing was mentioned in 15% of biomedical [ 19 ] and in only 2% of psychological articles [ 20 ]. A 2020 systematic review indicated that most researchers have positive attitudes toward data sharing [ 21 ]. This discrepancy between positive attitudes and the actual practice of sharing data is influenced by many factors, including requirements for job selection and promotion, dedicated funding and skills, as well as the incentives, time, and training required to prepare (anonymised) data for sharing [ 21 ]. Regarding preregistration, a 2017 survey of 275 authors of systematic reviews or meta-analyses showed that 37% of participants agreed with making protocols mandatory [ 22 ]. While that percentage was higher than in our study, that study’s sample was smaller, its respondents were predominantly from biomedical disciplines, and its questions pertained only to preregistration of systematic reviews.

The lack of support among the editors in our study for all aspects of the TOP recommendations is likely one of the factors contributing to why these recommendations have not been endorsed by a larger number of journals today. The roles of editors and journals in endorsing or requiring specific practices, the lack of resources they often face, and the lack of proper intervention studies were discussed extensively in our previous publications [ 15 , 23 ]. Here we would like to add that uptake of, and attitudes towards, different practices are also likely influenced by who makes the recommendations and how they are made. More rigorous methodological steps during recommendation development (similar to those for reporting guidelines or clinical practice guidelines), together with open calls for feedback, might have led to higher initial uptake. Additionally, published case reports with cost estimates and practical tips from those involved in recommendation development who then managed to change the practices of their journals, departments, or institutions could perhaps lead to additional endorsements.

Our study has also shown that the work climates of authors, reviewers, and editors still have significant room for improvement. Approximately two thirds of respondents (66%) found the quality of peer review they received, as well as the quality of publications in their field (64%), to be generally high; one fifth (20%) stated that due to the pressure to publish they sacrifice the quality of their publications for quantity; and 14% stated that funders or sponsors interfere in their study design or study reporting. Perceived pressure to publish was associated with respondents’ age, with younger respondents feeling more pressure to sacrifice quality for quantity. Younger respondents also felt that long peer review times had a more negative effect on their careers than did other researchers. Again, direct comparison with previous studies is difficult due to differences in questions, but a recent survey of 7,670 postdoctoral researchers from 93 countries indicated that 56% had a negative outlook on their careers, and that only 46% would recommend to their younger selves to pursue a career in research [ 24 ]. Furthermore, as most postdocs in that survey reported being hired for only short periods of time, it is likely that long peer review times and the number of publications have more impact on their job prospects than on those of (tenured) academics. The overall high satisfaction of researchers with peer review that we found has also been reported in previous studies, with the caveat of known differences in reported satisfaction between those who had their papers rejected and those who had them accepted [ 25 ].

Finally, our study showed that the most commonly perceived detrimental research practices were undeserved authorship (as was also shown by previous research) [ 26 , 27 ] and prior relevant research not being cited. While there were no significant differences between the perceptions of authors, reviewers, and editors regarding the prevalence of undeserved authorship, editors perceived a higher prevalence of relevant research not being cited, as well as higher prevalences of fabrication, falsification, and plagiarism, than authors or reviewers did. These findings could be explained by the fact that most researchers often engage in conversations on authorship during their projects and publications, while less than 2% of researchers admitted to having fabricated or falsified data [ 28 ] or plagiarised others’ work [ 29 ]; the latter practices are therefore more likely to have been encountered or discussed by editors than by other researchers. Additionally, we observed age differences in perceived detrimental research practices, with younger researchers finding undeserved authorship, and prior relevant research not being cited, to be more prevalent than older researchers did. This echoes previous studies which showed that young researchers often felt they were doing all of the work while others received the credit, and that they had more research experience than many starting faculty members [ 24 ].

We also found that Health Sciences respondents perceived use of reporting guidelines, open peer review, publication of studies with null or negative results, and reporting of study limitations to be more prevalent than respondents from other disciplines did, which is congruent with the frequency of recommendations on those topics in Health Sciences journals compared to other disciplines [ 15 ]. Finally, we also observed that researchers from the USA and India perceived detrimental research practices to be more prevalent than respondents from other countries. This could be a consequence of the higher visibility of the Office of Research Integrity and its legal foothold in the USA [ 30 ], the role of the Society for Scientific Values in India [ 31 ], the number of misconduct cases in those countries, as well as possibly higher levels of competition in those countries [ 32 , 33 ].

While our study assessed the attitudes and perceptions of a large number of respondents across many countries and disciplines, it is not without limitations. First, due to the low response rate, our findings are not necessarily generalizable (we discuss possible influences of self-selection and non-response biases in detail in the Appendix). Our response rate, however, was similar to that of other recent large online surveys [ 16 , 22 , 34 – 36 ], and response rates have consistently been found to be lower for online surveys than for other modes of survey dissemination [ 37 ]. A recent exception to this pattern was a 2019 survey of 2,801 researchers from economics, political science, psychology, and sociology regarding open science practices, which had a response rate of 46%. However, each participant in that survey was compensated with either 25 or 40 dollars (randomly) if they were a student, or 75 or 100 dollars (randomly) if they were an author.

Second, while we did have respondents from 126 countries, we explored differences in respondents’ attitudes and perceptions for only the 4 countries with the highest numbers of respondents in our survey. Further research is warranted to determine national or institutional differences [ 16 , 38 ]. Third, although our survey was confidential, previous research has suggested that researchers often overestimate the detrimental research practices of their colleagues [ 28 ], but may also underreport such practices to protect the reputation of their field or themselves, or because they are unwilling to report such practices for official investigations [ 39 ]. Fourth, to preserve confidentiality, we did not ask for information on respondents’ departments or universities. This precluded taking potential clustering of some observations into account, as witnessing the same detrimental practice, or being aware of the same high-profile cases within a department or a field, might have led to overestimation of such practices. Fifth, we did not define all terms used in the survey, so some differences between respondents might also stem from their different interpretations of some terms. For example, previous surveys on falsification yielded higher estimates when the term ‘falsified’ was not used and researchers were instead asked if they had ever altered or modified their data [ 28 ]. Finally, while we explored several sociodemographic characteristics, publication practices, and knowledge of statistics as factors associated with respondents’ attitudes and perceptions, previous research has also shown an influence of respondents’ personality traits, which we did not measure in our study [ 40 ].

In conclusion, our study found that the attitudes of authors, reviewers, and editors did not significantly differ regarding the TOP guidelines, nor did their perceptions of their work climates. We did, however, observe differences in the perceived prevalences of detrimental practices between editors and authors or reviewers, which highlights the need to raise awareness of these issues among all stakeholders, and to develop projects in which all stakeholders work together to eradicate or minimize them. More studies are also needed to showcase the impact of any policy changes, as well as studies that lower the burden of implementing such policies [ 41 ]. Finally, recognition and rewarding of responsible practices should move from recommendations to actual practice [ 9 ].

Supporting information

Acknowledgments, declarations.

We would like to thank both our pilot and survey respondents for their valuable input and for taking the time to answer our questions. We would also like to apologise to them for the delay in publishing our results, which occurred in part due to MM moving to another institution, and in part due to the COVID-19 pandemic and its effects on our personal and professional lives. MM is currently a postdoc at the Meta-Research Innovation Center at Stanford University, but as most of the study was conducted during his postdoc in Amsterdam, the listed affiliation reflects the institute where most of the work was done. Finally, we would like to thank Ricardo Moreira, Research Manager at Elsevier, for scripting the survey and managing the fieldwork, and Robert Thibault for comments on the draft of our manuscript.

Presentations at meetings/conferences

Preliminary results of our survey were presented at PUBMET 2018: The 5th Conference on Scholarly Publishing in the Context of Open Science, held on September 20–21, 2018, in Zagreb, Croatia, and at the 6th World Conference on Research Integrity, held on June 2–5, 2019, in Hong Kong, China.

Data Availability

Study registration and protocol, survey questions, survey invites, statistical analysis codes, anonymised data, and paper and appendix tables are available on our project’s data repository: http://dx.doi.org/10.17632/53cskwwpdn.6 .

Funding Statement

This study was part of an Elsevier funded project: Fostering Transparent and Responsible Conduct of Research: What can Journals do?. Details of the project are available on our project’s data repository: http://dx.doi.org/10.17632/53cskwwpdn.6 . The funder (other than the funder-affiliated authors IJJA and AM) had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

  • 1. The STM Report: An overview of scientific and scholarly journal publishing. The Hague, the Netherlands: 2018.
  • 2. Fanelli D. Why growing retractions are (mostly) a good sign. PloS Med. 2013;10(12):e1001563. Epub 2013/12/07. doi: 10.1371/journal.pmed.1001563 ; PubMed Central PMCID: PMC3848921. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 3. Ioannidis JPA. Why most published research findings are false. PLoS medicine. 2005;2(8):696–701. ISI:000231676900008. doi: 10.1371/journal.pmed.0020124 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 4. Nuijten MB, Hartgerink CH, van Assen MA, Epskamp S, Wicherts JM. The prevalence of statistical reporting errors in psychology (1985–2013). Behav Res Methods. 2016;48(4):1205–26. doi: 10.3758/s13428-015-0664-2 ; PubMed Central PMCID: PMC5101263. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 5. Bouter LM, Tijdink J, Axelsen N, Martinson BC, ter Riet G. Ranking major and minor research misbehaviors: results from a survey among participants of four World Conferences on Research Integrity. Research Integrity and Peer Review. 2016;1(1):17. doi: 10.1186/s41073-016-0024-5 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 6. Marusic A, Malicki M, von Elm E. Editorial research and the publication process in biomedicine and health: Report from the Esteve Foundation Discussion Group, December 2012. Biochem Med. 2014;24(2):211–6. Epub 2014/06/28. doi: 10.11613/BM.2014.023 ; PubMed Central PMCID: PMC4083572. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 7. Tennant JP, Dugan JM, Graziotin D, Jacques DC, Waldner F, Mietchen D, et al. A multi-disciplinary perspective on emergent and future innovations in peer review. F1000Research. 2017;6:1151. Epub 2017/12/09. doi: 10.12688/f1000research.12037.3 ; PubMed Central PMCID: PMC5686505. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 8. Spellman BA. A Short (Personal) Future History of Revolution 2.0. Perspectives on Psychological Science. 2015;10(6):886–99. doi: 10.1177/1745691615609918 . [ DOI ] [ PubMed ] [ Google Scholar ]
  • 9. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong Principles for assessing researchers: Fostering research integrity. PLOS Biology. 2020;18(7):e3000737. doi: 10.1371/journal.pbio.3000737 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 10. Nosek BA, Alter G, Banks GC, Borsboom D, Bowman SD, Breckler SJ, et al. Scientific Standards. Promoting an open research culture. Science. 2015;348(6242):1422–5. doi: 10.1126/science.aab2374 ; PubMed Central PMCID: PMC4550299. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 11. Center for Open Science. Current Signatories 2017 [cited 14 Dec 2017]. Available from: https://cos.io/our-services/top-guidelines/ .
  • 12. Von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP, et al. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS medicine. 2007;4(10):e296. doi: 10.1371/journal.pmed.0040296 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 13. Eysenbach G. Improving the quality of Web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). Journal of medical Internet research. 2004;6(3). doi: 10.2196/jmir.6.3.e34 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 14. Malički M, ter Riet G, Bouter LM, Aalbersberg IJJ. Project: Fostering Transparent and Responsible Conduct of Research: What can Journals do? Mendeley Data; 2019. Available from: 10.17632/53cskwwpdn.6. [ DOI ] [ Google Scholar ]
  • 15. Malicki M, Aalbersberg IJJ, Bouter L, Ter Riet G. Journals’ instructions to authors: A cross-sectional study across scientific disciplines. PLOS One. 2019;14(9):e0222157. doi: 10.1371/journal.pone.0222157 ; PubMed Central PMCID: PMC6728033. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 16. Baždarić K, Vrkić I, Arh E, Mavrinac M, Gligora Marković M, Bilić-Zulle L, et al. Attitudes and practices of open data, preprinting, and peer-review—A cross sectional study on Croatian scientists. PLOS ONE. 2021;16(6):e0244529. doi: 10.1371/journal.pone.0244529 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 17. Sharp MK, Bertizzolo L, Rius R, Wager E, Gómez G, Hren D. Using the STROBE statement: survey findings emphasized the role of journals in enforcing reporting guidelines. Journal of Clinical Epidemiology. 2019;116:26–35. doi: 10.1016/j.jclinepi.2019.07.019 [ DOI ] [ PubMed ] [ Google Scholar ]
  • 18. Fuller T, Pearson M, Peters J, Anderson R. What Affects Authors’ and Editors’ Use of Reporting Guidelines? Findings from an Online Survey and Qualitative Interviews. PLOS ONE. 2015;10(4):e0121585. doi: 10.1371/journal.pone.0121585 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 19. Serghiou S, Contopoulos-Ioannidis DG, Boyack KW, Riedel N, Wallach JD, Ioannidis JPA. Assessment of transparency indicators across the biomedical literature: How open is open? PLOS Biology. 2021;19(3):e3001107. doi: 10.1371/journal.pbio.3001107 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 20. Hardwicke TE, Thibault RT, Kosie JE, Wallach JD, Kidwell MC, Ioannidis JPA. Estimating the Prevalence of Transparency and Reproducibility-Related Research Practices in Psychology (2014–2017). Perspectives on Psychological Science. 2021:1745691620979806. doi: 10.1177/1745691620979806 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 21. Zuiderwijk A, Shinde R, Jeng W. What drives and inhibits researchers to share and use open research data? A systematic literature review to analyze factors influencing open research data adoption. PLOS ONE. 2020;15(9):e0239283. doi: 10.1371/journal.pone.0239283 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 22. Tawfik GM, Giang HTN, Ghozy S, Altibi AM, Kandil H, Le H-H, et al. Protocol registration issues of systematic review and meta-analysis studies: a survey of global researchers. BMC Medical Research Methodology. 2020;20(1):213. doi: 10.1186/s12874-020-01094-9 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 23. Malički M, Jerončić A, Aalbersberg IJ, Bouter L, ter Riet G. Systematic review and meta-analyses of studies analysing instructions to authors from 1987 to 2017. Nature Communications. 2021;12(1):5840. doi: 10.1038/s41467-021-26027-y [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 24. Woolston C. Postdoc survey reveals disenchantment with working life. Nature. 2020;587(7834):505–8. doi: 10.1038/d41586-020-03191-7 [ DOI ] [ PubMed ] [ Google Scholar ]
  • 25. Global State of Peer Review: Publons; 2019. Available from: https://publons.com/community/gspr .
  • 26. Marusic A, Bosnjak L, Jeroncic A. A systematic review of research on the meaning, ethics and practices of authorship across scholarly disciplines. PLoS One. 2011;6(9):e23477. Epub 2011/09/21. doi: 10.1371/journal.pone.0023477 ; PubMed Central PMCID: PMC3169533. [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 27. Noruzi A, Takkenberg JJ, Kayapa B, Verhemel A, Gadjradj PS. Honorary authorship in cardiothoracic surgery. The Journal of thoracic and cardiovascular surgery. 2021;161(1):156–62. e1. [ DOI ] [ PubMed ] [ Google Scholar ]
  • 28. Fanelli D. How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data. PLoS One. 2009;4(5). ISI:000266490000014. doi: 10.1371/journal.pone.0005738 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 29. Pupovac V, Fanelli D. Scientists Admitting to Plagiarism: A Meta-analysis of Surveys. Sci Eng Ethics. 2015;21(5):1331–52. Epub 2014/10/30. doi: 10.1007/s11948-014-9600-6 . [ DOI ] [ PubMed ] [ Google Scholar ]
  • 30. Pascal CB. The Office of Research Integrity: Experience and Authorities. Hofstra L Rev. 2006;35:795. [ DOI ] [ PubMed ] [ Google Scholar ]
  • 31. Juyal D, Thawani V, Thaledi S. Rise of academic plagiarism in India: Reasons, solutions and resolution. Lung India. 2015;32(5):542–3. doi: 10.4103/0970-2113.164151 . [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 32. Ison DC. An empirical analysis of differences in plagiarism among world cultures. Journal of Higher Education Policy & Management. 2018;40(4):291–304. doi: 10.1080/1360080X.2018.1479949 [ DOI ] [ Google Scholar ]
  • 33. Gaudino M, Robinson NB, Audisio K, Rahouma M, Benedetto U, Kurlansky P, et al. Trends and Characteristics of Retracted Articles in the Biomedical Literature, 1971 to 2020. JAMA internal medicine. 2021. doi: 10.1001/jamainternmed.2021.1807 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 34. Van Noorden R. Some hard numbers on science’s leadership problems. Nature. 2018;557(7705):294–6. doi: 10.1038/d41586-018-05143-8 . [ DOI ] [ PubMed ] [ Google Scholar ]
  • 35. Patience GS, Galli F, Patience PA, Boffito DC. Intellectual contributions meriting authorship: Survey results from the top cited authors across all science categories. PLoS One. 2019;14(1):e0198117. doi: 10.1371/journal.pone.0198117 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 36. Rowley J, Johnson F, Sbaffi L, Frass W, Devine E. Academics’ behaviors and attitudes towards open access publishing in scholarly journals. Journal of the Association for Information Science and Technology. 2017;68(5):1201–11. [ Google Scholar ]
  • 37. Daikeler J, Bošnjak M, Lozar Manfreda K. Web Versus Other Survey Modes: An Updated and Extended Meta-Analysis Comparing Response Rates. Journal of Survey Statistics and Methodology. 2020;8(3):513–39. doi: 10.1093/jssam/smz008 [ DOI ] [ Google Scholar ]
  • 38. Huang C-K, Wilson K, Neylon C, Ozaygen A, Montgomery L, Hosking R. Mapping open knowledge institutions: an exploratory analysis of Australian universities. PeerJ. 2021;9:e11391. doi: 10.7717/peerj.11391 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 39. Gardner W, Lidz CW, Hartwig KC. Authors’ reports about research integrity problems in clinical trials. Contemporary clinical trials. 2005;26(2):244–51. doi: 10.1016/j.cct.2004.11.013 [ DOI ] [ PubMed ] [ Google Scholar ]
  • 40. Tijdink JK, Bouter LM, Veldkamp CL, van de Ven PM, Wicherts JM, Smulders YM. Personality traits are associated with research misbehavior in Dutch scientists: a cross-sectional study. PloS one. 2016;11(9):e0163251. doi: 10.1371/journal.pone.0163251 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • 41. Digital Science, Fane B, Ayris B, Hahnel M, Hrynaszkiewicz I, Baynes G, et al. The State of Open Data. 2019. 10.6084/m9.figshare.9980783.v2. [ DOI ] [ Google Scholar ]

Decision Letter 0

Florian naudet.

10 Mar 2022

PONE-D-22-03115: Transparency in conducting and reporting research: a survey of authors, reviewers, and editors across scholarly disciplines. PLOS ONE

Dear Dr. Malički,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

First of all, I would like to thank the 2 reviewers. Their comments were really helpful and fast. As you will see, they consider the manuscript a robust one. Only minor changes were required. I also have a few minor comments: - Please add a few words in your abstract about the main limitation in order to avoid any spin. - Please add in the text (not in the appendix) any change from the initial protocol. In my opinion, it is important to make sure that the reader can see these changes at a first look at the paper. I do think that all the suggestions by the reviewers will be easy to implement.

Please submit your revised manuscript by Apr 24 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at  [email protected] . When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols . Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols .

We look forward to receiving your revised manuscript.

Kind regards,

Florian Naudet, M.D., M.P.H., Ph.D.

Academic Editor

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at 

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and 

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

3. Thank you for stating the following in the Funding Section of your manuscript: 

"This study was part of an Elsevier funded project: Fostering Transparent and Responsible Conduct of Research: What can Journals do?. Details of the project are available on our project’s data repository: " ext-link-type="uri" xlink:type="simple">http://dx.doi.org/10.17632/53cskwwpdn.6.14"

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form. 

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows: 

"This study was part of an Elsevier funded project: Fostering Transparent and Responsible Conduct of Research: What can Journals do?. Details of the project are available on our project’s data repository: http://dx.doi.org/10.17632/53cskwwpdn.6 . "

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

4. Thank you for stating the following in the Competing Interests section: 

"IJsbrand Jan Aalbersberg is Senior Vice President of Research Integrity at Elsevier, and Adrian Mulligan is a Research Director for Customer Insights at Elsevier. Mario Malicki is a Co-Editor-In-Chief or Research Integrity and Peer Review journal. Other authors declare no competing interests."

Please confirm that this does not alter your adherence to all PLOS ONE policies on sharing data and materials, by including the following statement: "This does not alter our adherence to  PLOS ONE policies on sharing data and materials.” (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests ).  If there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared. 

Please include your updated Competing Interests statement in your cover letter; we will change the online submission form on your behalf.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I am slightly concerned that the conclusion, whilst I agree with it, doesn't follow on logically from the survey results, which may or may not be representative of reality anyway. We can all agree that policies to foster better research practices should be enforced rather than encouraged, but it is a bit of a leap to conclude this from the results of a survey showing little difference in attitudes and experience regarding these standards between three groups of stakeholders.

Reviewer #2: I uploaded the review that I copied below:

This article is interesting, of good quality and deserves publication subject to some clarification. The Appendix is worth reading.

Comments on a few points

• Introduction (page 3) is well done and sets out the problem. In the first sentence, it would be useful to clarify that the 3 million articles per year data is an estimate of the STM segment, and that there is no HHS data for the number of articles.

• Methods: Correctly stated, given that the study protocol is available. Perhaps add, if available, the estimated time to answer all questions.

• Results (page 6). The population needs to be detailed a bit more, and it's all in the appendix. It must be said that there were 28 fields for the disciplines, and especially cite the proportion for health sciences, the results of which appear below. Further on, the data analysed concern health sciences (xx%) which could be confused with life sciences (25%)

• No comments on all the tables: they are well done; the statistical tests are in the appendix and this is sufficient; perhaps some comments in the text could be reduced as redundant, but it is not essential

• Discussion (page 13 and following). The discussion needs to be reconsidered to address a few points.

• Would it be useful to compare your 4.9% response rate with E Fong's 10.5% response rate on a sample of 110,000 emails. Why such a difference? I haven't looked closely at the E Fong article, and I don't know if it's relevant. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0187394

• The distinction between authors and reviewers is rather artificial as all reviewers are authors. The comparisons of authors / reviewers / editors are disappointing. Should table A1 be discussed?

• It is a pity that the notion of gender has not been explored. These are points for discussion.

• Sometimes the discussion repeats data from the tables, and this could reduce the length a bit; the conclusion is not useful

• I agree with the sentence on page 13 'Direct comparisons of our results with other surveys are difficult'... but the discussion is just that, when there are issues that deserve discussion:

o The discussion is mainly focused on researchers and very little on journals and peer review;

o Wouldn't it make sense to discuss the responsibility of journals? This is a huge topic, but isn't it the major point for improving the system? How can TOP be implemented more quickly? Better recognition of peer review, etc., is essential to develop TOP. What can journals do to implement the TOP guidelines faster?

o Why not compare the implementation of the TOP guidelines to the implementation of protocol registration: the ICMJE requested protocol registration in 2004, and almost 20 years later, only prestigious journals require and control this registration.

6. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2:  Yes:  Hervé Maisonneuve

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool,  https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at  [email protected] . Please note that Supporting Information files do not need this step.

Submitted filename: PONE-D-22-03115_review.docx

Author response to Decision Letter 0

Collection date 2023.

Dear Editor Florian Naudet,

Thank you and the reviewers for your comments and kind words for our study. We present below a point-by-point response for all style requirements and review suggestions. Sentences in red are those that have been added during the revision.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

Reply: The style changes have been applied. We did not use the track changes for this, as they would have obscured the replies to reviewer comments.

2. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Reply: The references have been reviewed and formatted to PLOS style. The only reference change was updating our old reference 37, which was a preprint, to its peer reviewed version ( https://doi.org/10.1101/2020.11.25.395376 to https://doi.org/10.1371/journal.pone.0244529 ). As an editor, I would also like to praise your rules on indicating an article's retracted status – however, I would recommend you have a guide for authors on how to check this, and provide them with a tool that does this for their submitted manuscripts. For this manuscript we checked the retracted status using Zotero. None of our references refer to retracted studies.

3. Thank you for stating the following in the Funding Section of your manuscript:

"This study was part of an Elsevier funded project: Fostering Transparent and Responsible Conduct of Research: What can Journals do? Details of the project are available on our project’s data repository: http://dx.doi.org/10.17632/53cskwwpdn.6.14 "

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form.

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows:

"This study was part of an Elsevier funded project: Fostering Transparent and Responsible Conduct of Research: What can Journals do? Details of the project are available on our project’s data repository: http://dx.doi.org/10.17632/53cskwwpdn.6 . "

Reply: The funding section has been removed from the acknowledgments, the text to be used is listed now in our new cover letter.

4. Thank you for stating the following in the Competing Interests section:

Please confirm that this does not alter your adherence to all PLOS ONE policies on sharing data and materials, by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials.” (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests ). If there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared.

Reply: The Competing Interests section has been removed from the acknowledgments, the text to be used is listed now in our cover letter and includes the sentence: This does not alter our adherence to PLOS ONE policies on sharing data and materials.

Editor’s comments:

1. First of all, I would like to thank the 2 reviewers. Their comments were really helpful and fast. As you will see, they consider the manuscript a robust one. Only minor changes were required. I also have a few minor comments:

Please add a few words in your abstract about the main limitation in order to avoid any spin.

Reply: The following sentence has been added to the abstract: While survey respondents came from 126 different countries, due to the survey’s overall low response rate our results might not necessarily be generalizable.

2. Please add in the text (not in the appendix) any change from the initial protocol. In my opinion it is important to make sure that the reader can see these changes at a first look at the paper. I do think that all the suggestions by the reviewers will be easy to implement.

Reply: The changes from the protocol have been moved from the appendix to the main manuscript. Their full added text is: Due to miscommunication within the team, instead of adding the collected email addresses of the editors from our previous study on Instruction to Authors to the total of 100,000 planned invites, 58 of the collected email addresses were used for piloting the survey, and 292 were incorporated into the 100,000 invites sent. Additionally, during survey creation, instead of the planned options of 0, 1-4, 5-10, 10+ for the question - How many articles (approximately) did you review in the last 12 months? - the survey options were instead 1, 2-5, 6-10, and 10+.

3. Please include the following items when submitting your revised manuscript:

• A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

• A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

• An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

Reply: All of the above have been submitted and accordingly labeled.

Reviewer #1:

1. I am slightly concerned that the conclusion, whilst I agree with it, doesn't follow on logically from the survey results, which may or may not be representative of reality anyway. We can all agree that policies to foster better research practices should be enforced rather than encouraged, but it is a bit of a leap to conclude this from the results of a survey showing little difference in attitudes and experience regarding these standards between three groups of stakeholders.

Reply: We thank the reviewer for this comment, and even though we also personally agree about enforcement, we have to emphasize that we did not state anywhere in our conclusions that any of the practices should be enforced. We only stated that our results showed a mismatch between the practices of researchers and current recommendations, and that due to this "greater involvement of all stakeholders is needed to align actual practices with current recommendations." There are multiple ways in which this could be achieved, including education and awareness raising, or changing incentives for hiring, promotion and tenure, and these have been stated in the discussion as: which highlights the need to raise awareness of these issues among all stakeholders, and to develop projects where all stakeholders would be working together to eradicate or minimize them. If the reviewer is referring to our last sentence, which states that recognition and rewarding of responsible practices needs to move from recommendations to actual practice, we changed "needs" to "should" in the revised manuscript so that readers do not confuse this with enforcement. This sentence was, however, a reflection of the fact that the DORA declaration and the Hong Kong principles for assessment of researchers were endorsed by only a small fraction of institutions, i.e., there are more than 32,000 universities in the world and only 2,560 institutions (8%) endorsed DORA.

Reviewer #2:

1. This article is interesting, of good quality and deserves publication subject to some clarification. The Appendix is worth reading.

Reply: We thank the reviewer for these kind words.

2. Introduction (page 3) is well done and sets out the problem. In the first sentence, it would be useful to clarify that the 3 million articles per year data is an estimate of the STM segment.

Reply: We have modified the sentence to state: with estimates of more than 3 million articles published per year. The reviewer might also be interested that the field distribution of published articles can be seen using the Dimensions.ai or OpenAlex databases (e.g. https://app.dimensions.ai/analytics/publication/for/aggregated?or_facet_publication_type=article&or_facet_year=2021&and_facet_publication_type=article )

3. Methods: Correctly stated, given that the study protocol is available. Perhaps add, if available, the estimated time to answer all questions.

Reply: The following sentence was added to the methods: Estimated time to finish the survey was 12 minutes (based on pilot results), and was listed in the survey invite letter.

4. Results (page 6). The population needs to be detailed a bit more, and it's all in the appendix. It must be said that there were 28 fields for the disciplines, and especially cite the proportion for health sciences, the results of which appear below. Further on, the data analysed concern health sciences (xx%) which could be confused with life sciences (25%)

Reply: We thank the reviewer for this comment. It is always a delicate balance to decide which information to keep in the appendix and which in the main manuscript. The information on disciplines has now been added to the methods, and the percentage for health sciences is listed in the results. While some readers may confuse life and health sciences, the classification was based on the Scopus journal classification from which we sampled participants, and it is the same classification we used in our two other papers ( https://doi.org/10.1371/journal.pone.0222157 , https://www.nature.com/articles/s41467-021-26027-y ) from the same project. The methods text now states: In section 4, respondents could select one of 28 prespecified scholarly fields, a multidisciplinary category, or the "other" category where they could write their field(s) using an open text response format. All answers they provided in free text, as well as the 29 choices, were recoded to one of the 6 major categories that were used in our previous study on Instruction to Authors: Arts & Humanities, Health Sciences, Life Sciences, Physical Sciences, Social Sciences, and Multidisciplinary.[15] The added results text states: Health Sciences (13%).

5. No comments on all the tables: they are well done; the statistical tests are in the appendix and this is sufficient; perhaps some comments in the text could be reduced as redundant, but it is not essential.

Reply: We thank the reviewer for the kind words, we kept the result text as is, as we feel it is still quite a shortened version with many additional results in the appendix.

6. Discussion (page 13 and following). The discussion needs to be reconsidered to address a few points: Would it be useful to compare your 4.9% response rate with E Fong's 10.5% response rate on a sample of 110,000 emails. Why such a difference? I haven't looked closely at the E Fong article, and I don't know if it's relevant. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0187394

Reply: The Fong study is an interesting one; however, for the following reasons we do not feel it is a good comparison to our study: 1) the Fong study is actually 4 surveys done over a period of 5 years, and it employed a different sampling strategy – emails of respondents were obtained through associations (e.g., American Economic Association), annual meeting lists, and university websites – this makes it more likely that those were active emails those individuals used, while in our study we used corresponding emails of papers, and we know those change when researchers move from one university to another. That is why we compared ours to other surveys that used corresponding emails. 2) Their survey was shorter – and this also likely affects response rates. 3) Their sampling included only American researchers, and was funded and supported by the ORI (Office of Research Integrity) – an important and very well known institution for research integrity in the USA. We, on the other hand, sampled researchers from across the world, and our project was funded by Elsevier – information we disclosed in the invite letter – and these two differences may also have impacted response rates.

7. The distinction between authors and reviewers is rather artificial as all reviewers are authors. The comparisons of authors / reviewers / editors are disappointing. Should table A1 be discussed?

Reply: While it is true that (almost all, 93% of) reviewers are also authors, the reverse does not apply. This is reflected in the differences in age and number of publications between authors and reviewers in our study. One could say that reviewers are essentially more experienced authors, and while we did not collect information about the number of different journals they reviewed for in their lifetime, we can presume that with each review a reviewer learns about the practices of other journals, their review forms or instructions, and they may learn from the comments of other reviewers. And so, it is reasonable to expect that review experience might lead to differences in attitudes towards transparency, in perceptions of the prevalence of misconduct, and even in the quality of their work. Detectable differences in our outcomes could have been found if we had looked at only the most experienced reviewers in our study, or may be found in other studies that focus on those who produce more than 100 or even 500 reviews per year ( https://publons.com/researcher/?is_core_collection=1&is_last_twelve_months=1&order_by=num_reviews ). However, as researchers, we must also be consistent with our planned study protocol. Our objective was to explore differences between authors, reviewers, and editors, and to use their self-reported role(s) in Q1 of our survey to group them in these roles. Our shared data allow researchers to group participants differently and conduct sub-analyses. In light of this, we don't think additional discussion is needed regarding Table 1, and to limit an already long discussion, we would rather expand on the other issues the reviewer raised below.

8. It is a pity that the notion of gender has not been explored. These are points for discussion.

Reply: We did explore differences between male and female respondents, as can be seen in Appendix Tables 7 to 9, but found no statistically significant differences. We chose not to focus on this in the discussion for several reasons. One, it is difficult to find a survey with which to compare our results. Some surveys of specific countries have found gender differences in data sharing attitudes (e.g., Croatia https://doi.org/10.1371/journal.pone.0244529 , or Germany https://doi.org/10.1371/journal.pone.0183216 ), but due to different sampling methods, how the questions were asked, response rates, etc., we feel these are not appropriate comparisons. Additionally, we wanted to focus the discussion on the observed differences for our main objective – between authors, reviewers and editors. Our survey was quite large, and it is hard to comment on each of the 38 questions in regard to gender, as each could be an individual topic, or a question asked one day in systematic reviews. We are of course willing to expand on gender in the discussion if there is something very specific the reviewer or editor wants us to mention, or if they had a specific study as comparison in mind.

9. Sometimes the discussion repeats data from the tables, and this could reduce the length a bit; the conclusion is not useful

Reply: We have intentionally repeated data for 11 out of the 38 questions our survey addressed, as we wanted to spare the readers from going back to the tables while they read the discussion of these specific topics. We are open to removing some of them if the editor or the reviewer insists, but we do not think that much is gained by shortening the discussion by removing 5-6 numbers/words. And while we could cut more words by cutting some of the 11 out of the 38 topics we address in the discussion, we prefer to keep these 11, as we already omitted many others to be mindful of the length of the discussion. As for the conclusions, we hope that the added text written below makes the conclusion and discussion more useful to the readers and the reviewer.

10. I agree with the sentence on page 13 'Direct comparisons of our results with other surveys are difficult'... but the discussion is just that, when there are issues that deserve discussion:

Reply: We agree with the reviewer that we could have given more discussion in this paper to the role of journals and peer review – but as we discussed those roles quite extensively in two other publications that came from our project Fostering Transparent and Responsible Conduct of Research: What can Journals do? - of which this survey was also a part, and as in those two publications we discussed ICMJE, protocol registrations, as well as reporting guidelines, we did not want to repeat the same or similar messages. We also planned to summarize all publications from this project in a report that would focus on exactly the action that journals should take. Nevertheless, the reviewer is right to ask this, and so we added the following to the discussion: It is likely that the lack of support of editors in our study towards all aspects of TOP recommendations is one of the contributing factors why these recommendations have not been endorsed by a larger number of journals today. The roles of editors and journals in endorsing or requiring specific practices, the lack of resources they often face, and the lack of proper intervention studies, were discussed extensively in our previous publications. [15, 23] Here we would like to add that uptake and attitudes toward different practices are also likely influenced by whom and how recommendations are made. More rigorous methodological steps during recommendation development (similar to those for reporting guidelines or clinical practice guidelines), and open feedback calls might have led to higher initial uptakes. Additionally, published case reports with cost estimates and practical tips from those involved in recommendation development, that then managed to change practices of their journals, departments or institutions, could perhaps lead to additional endorsements.

On behalf of all co-authors,

Decision Letter 1

Transparency in conducting and reporting research: a survey of authors, reviewers, and editors across scholarly disciplines

PONE-D-22-03115R1

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/ , click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at [email protected] .

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact [email protected] .

Additional Editor Comments (optional):

Thank you for the revisions, and kudos for this important paper. As those were minor edits, I have assessed all your answers and will ask PLOS ONE to forward your answers to the reviewers. Best.

Acceptance letter

Dear Dr. Malički:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact [email protected] .

If we can help with anything else, please email us at [email protected] .

Thank you for submitting your work to PLOS ONE and supporting open access.

PLOS ONE Editorial Office Staff

on behalf of

Pr. Florian Naudet
