In recent years, we have seen an explosion in the techniques available to analyse and understand developmental processes – from single-cell ‘omics to high-resolution 4D microscopy – and with it an enormous expansion in the amount of data generated during the course of a research project. In addition, the ever-increasing array of research materials available through public repositories or on request brings both the opportunity to build on the research of others and the challenge of ensuring rigour and reproducibility. The research community is best served by open exchange of data, materials and methods, and by maximum transparency in the reporting of research results.
Development has long upheld high standards for the presentation and availability of data, materials and methods, meaning that the research we publish can be trusted. Moreover, a paper is most useful to its audience when readers can easily find out exactly how experiments were conducted and analysed, access the data behind the results and obtain relevant materials to replicate and build on the work presented. These principles underlie many of our policies, such as excluding the Materials and Methods section from our length limit for papers to encourage full reporting, requiring authors to provide access to ‘omic data at the point of submission (so that the data can be scrutinised by referees) and supporting authors through the process of making data available before formal publication.
However, we also recognise that researchers are having to navigate an increasingly complex environment when it comes to data and materials reporting and availability, that different countries, institutes and funding agencies may have different policies and processes, and that the varied requirements of journals can place a significant burden on researchers when preparing their work for publication. We therefore aim to strike a balance: ensuring high standards of data and materials presentation and availability without asking too much of our authors. We continually review our policies and practices on this front and, in this editorial, we provide a brief overview of some of the latest updates.
Development, like many journals, requires authors to complete a checklist (https://journals.biologists.com/DocumentLibrary/DEV/Checklist.pdf) at the point where they submit their revised manuscript to the journal. As well as serving a compliance role, this provides a useful reminder to authors of some of the key elements of good research and reporting practice. For example, authors are asked to ensure that their paper contains details on cell line authentication and ethical compliance for animal and human studies, and that it follows our guidelines for data and statistical reporting. We have recently updated this checklist to incorporate further guidance to authors on reporting data, materials, software and code availability, as well as on the use of Artificial Intelligence (AI) tools.
Importantly, we also now link to checklists produced by other organisations and encourage authors to complete these, where relevant, alongside our own document. The ARRIVE (Animal Research: Reporting of In Vivo Experiments) guidelines (https://arriveguidelines.org/arrive-guidelines; Percie du Sert et al., 2020) provide a framework for reporting on animal research; many of these points are also covered in our own checklist, but we encourage authors working with animals to consult these guidelines at an early stage in their research to ensure compliance. We also refer authors to the SAGER (Sex and Gender Equity in Research) guidelines (https://ease.org.uk/wp-content/uploads/2023/01/EASE-SAGER-Checklist-2022.pdf) for reporting of sex/gender in both human and non-human studies. Given the known developmental differences between males and females in many species, appropriate use of experimental subjects – and accurate reporting – is essential (Heidari et al., 2016; Van Epps et al., 2022; Justice, 2024). Finally, we recommend that authors working with human pluripotent and tissue stem cells complete the International Society for Stem Cell Research (ISSCR) checklist. This checklist is based on the important work carried out by the ISSCR Standards Taskforce (Ludwig et al., 2023) to define a set of standards and recommendations intended to improve rigour and reproducibility in the stem cell field (see also Juguilon and Wu, 2024, in our sister journal Disease Models & Mechanisms). We were delighted to be included in the consultation process that preceded the development of this checklist. Although we are not currently enforcing its completion, we believe that it serves as crucial guidance for ensuring that research reporting complies with best practice, and we strongly encourage authors to refer to the ISSCR standards and checklist when designing their experiments and preparing their article for publication.
Hopefully, as the community embraces these guidelines, compliance will become standard practice, and we will continue to monitor how well our papers meet the recommendations set out by the ISSCR.
Another area in which we have been reviewing our policies concerns data transparency and integrity. For several years, we have encouraged authors to make all data underlying their research available to the community, for example, through deposition in the data repository Dryad (with which we have a partnership to facilitate deposition) or equivalent venues. We are now adding some additional specific requirements to this overarching policy. When plotting quantitative data, we now ask authors to use graphs that allow the reader to see the true data spread (e.g. box-and-whisker plots or superplots). Where relevant, we also encourage authors to provide a supplementary file containing the raw, uncropped images of any western blots shown in the figures. These new guidelines follow those already in place at our sister journals, Journal of Cell Science and Biology Open (Gorelick, 2023; Way and Ahmad, 2022), and will allow better assessment of the quality of the data, helping the editors and reviewers determine whether the data support the conclusions. We also hope that these measures will support the work of our Publication Integrity team, who check papers at acceptance for any potential issues with data presentation. Supported by the Proofig AI tool, we screen all figures for potential duplication or manipulation, which results in our contacting authors for clarification on around 20% of our accepted papers. Although we have very occasionally needed to revoke acceptance of a paper, the vast majority of issues raised through this screening can be easily resolved through consultation with the authors, ensuring that – as far as possible – the papers we publish conform to high standards of publication integrity.
Finally, the rise of quantitative and computational approaches in developmental biology has prompted us to update our guidelines regarding software and code deposition. When integral to a paper, we now require deposition of code in a public repository with a permanent digital identifier (e.g. Zenodo; https://zenodo.org/) – ideally at the point of submission so that it is available to referees. Where this is not possible, authors should discuss options for making code available with the editorial office.
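For authors unfamiliar with archiving code, one common route is Zenodo's integration with GitHub, which can archive a tagged release of a repository and assign it a DOI; a `.zenodo.json` file placed at the repository root lets authors control the metadata of the archived record. The sketch below illustrates such a file – all titles, names and keywords are placeholders, and authors should consult Zenodo's own documentation for the full list of supported fields:

```json
{
  "upload_type": "software",
  "title": "Analysis code for <paper title> (placeholder)",
  "description": "Scripts used to generate the quantitative figures (placeholder).",
  "creators": [
    { "name": "Doe, Jane", "affiliation": "Example University" }
  ],
  "license": "MIT",
  "keywords": ["developmental biology", "image analysis"]
}
```

Archiving a tagged release in this way, rather than linking to a live repository, ensures that the exact version of the code used in the paper remains citable and retrievable via its DOI.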
Overall, we hope that by encouraging best practice in data presentation and reporting during manuscript preparation, we can minimise issues that come to light at later stages in the editorial and publication process and, most importantly, make a paper and its accompanying materials as useful as possible for the reader. As always, we are happy to hear any feedback you may have about these policies.