Bioimage analysis (BIA), a crucial discipline in biological research, overcomes the limitations of subjective analysis in microscopy through the creation and application of quantitative and reproducible methods. The establishment of dedicated BIA support within academic institutions is vital to improving research quality and efficiency and can significantly advance scientific discovery. However, a lack of training resources, limited career paths and insufficient recognition of the contributions made by bioimage analysts prevent the full realization of this potential. This Perspective – the result of the recent Workshop ‘Effectively Communicating Bioimage Analysis’ organized by The Company of Biologists, which aimed to summarize the global BIA landscape, categorize obstacles and offer possible solutions – proposes strategies to bring about a cultural shift towards recognizing the value of BIA by standardizing tools, improving training and encouraging formal credit for contributions. We also advocate for increased funding, standardized practices and enhanced collaboration, and we conclude with a call to action for all stakeholders to join efforts in advancing BIA.

Bioimage analysis as an emerging discipline

Computational analysis of image data generated with techniques such as light and electron microscopy, pre-clinical imaging, and clinical imaging is a comparatively recent development in the centuries-long history of light microscopy for biological discovery. To interpret biological images, scientists have long relied on the human visual system, which is not suited for reproducible quantification (Jambor, 2023). The subjective nature of visual analysis has famously been demonstrated in neuroscience, where Golgi and Cajal used the same method to reach widely different conclusions regarding neuroanatomy, and is perhaps best seen in pathology, where despite a long history and comprehensive training of researchers in discerning phenotypes from images, the interpretation of many phenotypes results in poor diagnostic agreement between individual observers (Elmore et al., 2017; Hamilton et al., 2015; Polley et al., 2013; Varga et al., 2012). Such disagreement, alongside the limitations of human perception in detecting subtle phenotypes (Gibson et al., 2015), underscores the serious need for quantitative, reproducible methods in bioimage analysis (BIA) that align with the FAIR (findable, accessible, interoperable and reusable) principles for scientific data management and stewardship (Barker et al., 2022; Kemmer et al., 2023; Wilkinson et al., 2016).

Further complicating this landscape is the sheer number of advanced microscopy techniques. Microscopes are both ubiquitous and extraordinarily heterogeneous. This combination means that, although microscopy has a low barrier to entry, standardization of methods poses a significant challenge, leading to the emergence of an entire career track for microscopy experts: the imaging scientist (Wright et al., 2024). The heterogeneity of microscopes and microscopy applications is naturally reflected in the imaging data itself; therefore, extensive time and expertise are required to design and implement robust BIA methods. The 2010s saw the rise of network organizations, such as the Network of European BioImage Analysts (NEUBIAS), aiming to connect experts in the creation and application of BIA tools (Martins et al., 2021). Although artificial intelligence (AI) and machine learning (ML) will continue to dramatically improve the throughput of BIA and create newer, easier-to-use analytical options, the human expertise of imaging scientists and bioimage analysts is still needed to provide the detailed understanding necessary to both generate viable training datasets and implement these techniques correctly.

We stand at a critical juncture where the establishment of dedicated BIA support within academic and research institutions – in the form of expert groups, facilities and dedicated staff in individual laboratories – presents a huge opportunity to enhance the quality and efficacy of research output. Since the conclusions of an experiment ultimately rest on its design, the inclusion of BIA experts during the experimental design phase can make the difference between a successful and an unsuccessful outcome – as Ronald Fisher said, “To consult the statistician after an experiment is finished is often merely to ask him to conduct a post-mortem examination. He can perhaps say what the experiment died of.” Direct contributions from BIA experts lead to better data utilization, more principled and conclusive results, stronger publications, and a highly productive environment that is conducive to fewer revision cycles and improved project setups, thus directly impacting scientific endeavors (Fig. 1) (Jambor et al., 2021; Jonkman et al., 2020; Lee et al., 2024; Senft et al., 2023; Soltwedel and Haase, 2024). They can also help guard against research misconduct (Bik et al., 2016; Pulverer, 2015; Rossner and Yamada, 2004), the majority of which is thought to be inadvertent and due to insufficient understanding of best practices (Pulverer, 2015). Bioimage analysts thus play a pivotal role in elevating the scientific reputation of an institution, attracting top talent and nurturing a vibrant community that pushes scientific frontiers. Nevertheless, we estimate that most biological researchers do not actively consult BIA experts. Although to our knowledge no study has determined why this is, in our experience, low computational comfort (resulting in discomfort when approaching computational experts), lack of awareness about manual analysis biases (Lee et al., 2024), lack of awareness of available methods, lack of knowledge about how to find BIA experts and lack of resources to hire BIA experts all contribute to the relatively low rate of collaboration.

Fig. 1. Overview of the key skills and capabilities of a BIA specialist. Four categories of skills and capabilities have been identified: (A) image analysis, including theoretical and technical knowledge as well as a good understanding of the context of scientific questions; (B) implementation, covering building and deploying computational workflows, developing pipelines, and coding; (C) project management, including communicating effectively, providing support and devising data management plans; (D) education, focusing on teaching and training, designing curricula, and producing materials.

The BIA field is technologically poised for a paradigm shift. As ML approaches increasingly allow automation of routine tasks such as segmentation, future BIA specialists will be free to focus on experimental design and advanced method development and application. Culturally, however, this evolution will require a shift towards a more inclusive, collaborative approach, where bioimage analysts participate in the experimental design process from the outset. A concerted effort to foster a culture that recognizes the intrinsic value of quantitative BIA and the indispensable role of analysts in advancing scientific discovery will yield many positive returns, including deeper scientific insights and more efficient research methodologies. In this Perspective, we describe the results of a recent meeting of bioimage analysts at The Company of Biologists ‘Effectively Communicating Bioimage Analysis’ Workshop, held in February 2024. This Workshop hosted BIA practitioners, tool developers and method developers, and served as a platform for discussions in which we categorized the current obstacles and possible solutions to the creation of such a new BIA culture, as detailed below.

Advancing BIA as a discipline will require the broader biology community to become more engaged in using BIA in their work. In addition to the cultural barriers described above, several other factors currently limit the ability of many researchers to perform high-quality and FAIR BIA in biology. Biologists who lack education or support in BIA might not be sure which tools to invest their limited analysis time into, and a lack of documentation and proper training materials available for BIA tools might discourage them from delving deeper. Researchers engaged in BIA also often encounter technological obstacles, such as insufficient computing power, storage space or funding for commercial licenses (Fig. 2). The help of a BIA expert and access to more powerful hardware and/or software are therefore often critical to a researcher's success, but these barriers also highlight the need to provide biologists with expanded training opportunities in programming, empowering them with the necessary skills.

Fig. 2. Major barriers to effective uptake of BIA. We have identified three categories of barriers: personal barriers, including the difficulties related to getting started in BIA and to finding suitable career options; structural barriers, including barriers in publishing and obtaining funding; and barriers related to the culture of the scientific community, such as peer pressure, a lack of incentives and a lack of clear governance principles.

Effective BIA requires a comprehensive understanding of several diverse fields, including image processing, complex workflow creation (Cimini, 2024; Miura and Nørrelykke, 2021) and, increasingly, familiarity with data management, IT infrastructure and deep learning (Fig. 1). At each step, algorithms must be selected and/or large, complex parameter sets defined. This technical understanding must be paired with thorough understanding of experimental design, including which artifacts are likely under various sample preparations and/or imaging conditions and how best to ameliorate them (Culley et al., 2024; Senft et al., 2023). Statistical considerations are also necessary, such as the measurement errors associated with image analysis, which data should be used to train or validate an algorithm, the selection of appropriate metrics, whether and how to aggregate data (including how variability and uncertainty should be assessed), and how to report the results so that they are reproducible and interpretable. Only with all of this knowledge can bioimage data be accurately analyzed, underlining the importance of consultation throughout the experimental design process (Fig. 3).
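As a minimal illustration of the parameter choices hidden in even a ‘simple’ workflow, the sketch below segments bright objects and extracts per-object measurements using scikit-image. It is not a recommended pipeline: the example image, filter size, thresholding algorithm and size cut-off are placeholder assumptions that would need to be justified and validated for real data.

```python
# Illustrative sketch only: segment bright objects and measure them.
import numpy as np
from skimage import data, filters, measure, morphology

image = data.coins()  # stand-in for a real microscopy image

# Each step below embeds a parameter or algorithm choice that must be validated.
smoothed = filters.gaussian(image, sigma=2)                  # noise suppression
mask = smoothed > filters.threshold_otsu(smoothed)           # thresholding algorithm
mask = morphology.remove_small_objects(mask, min_size=64)    # artifact/size filter

labels = measure.label(mask)
props = measure.regionprops_table(
    labels, intensity_image=image,
    properties=("label", "area", "mean_intensity"),
)

# Report the spread of per-object values, not just a single aggregate number.
areas = np.asarray(props["area"])
print(f"{labels.max()} objects; area mean = {areas.mean():.1f}, s.d. = {areas.std(ddof=1):.1f}")
```

Every such choice (the smoothing sigma, the thresholding method, the minimum object size, which properties are reported and how they are aggregated) changes the final measurements, which is exactly where early consultation with a BIA expert pays off.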

Fig. 3. A concise etiquette guide for interacting with BIA specialists. We recommend contacting your local bioimage analyst from the start of the project and providing them with regular updates instead of waiting for a complete acquired dataset before reaching out. Moreover, during the data acquisition phase, we suggest that experimentalists prepare a detailed plan for better time management and to aid effective communication with BIA specialists. Additionally, we ask for the bioimage analyst's expertise to be respected and for their scientific contributions to be recognized.

Bioimage analysts are typically also on the front line of creating FAIR training materials that not only facilitate learning but also promote the sharing and standardization of best practices within the community (Haase et al., 2024a preprint; Imreh et al., 2024). The creation of standardized, quality-controlled, modularized educational materials would improve the ability of all bioimage analysts to effectively train diverse audiences, from novices to advanced practitioners, with tailored curricula both in formal educational settings and in more informal workshops. Community efforts (including funding acquisition) should also be undertaken to develop training schools and curricula in which one can ‘train the trainers’. Importantly, BIA training materials must be made accessible to a diverse audience and ideally be understandable for researchers with various backgrounds. This will require adjustments for individuals with disabilities, translations into multiple languages to reach a global audience, and inclusion of a variety of formats ranging from informal blog posts and instructional videos to full academic courses. In addition, a collection of case studies that illustrate the practical application of BIA skills in various research contexts could be compiled to serve as a valuable resource for both learners and trainers.

To facilitate this, existing material should be reviewed (for example, the surveys and meta-analyses of Jamali et al., 2022; Miura, 2021; Schmidt et al., 2022; Sivagurunathan et al., 2023; Waithe, 2021) and cataloged, allowing greater focus to be placed on missing elements important for FAIR training. These could include, but not be limited to: application of ML techniques in BIA for more efficient and accurate image processing and analysis; ethical considerations, data privacy and the responsible use of bioimage data, especially in contexts where sensitive or personal information may be involved; how to perform correct validation and accurately report and interpret results; proper use of statistics, including how to report P-values and prevent ‘P-hacking’; training on the use of IT infrastructure, such as high-performance computing and cloud computing; and research data management and reporting guidelines (Sarkans et al., 2021; Schmidt et al., 2022, 2024), with BIA-specific data management plans containing sections about who is responsible for processing the data, where their resources (including human resources, storage resources and computing resources) will come from and how to ‘FAIR-ify’ the project (such as by sharing data and code sustainably).
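To make the statistics point concrete, the sketch below shows one way of reporting a two-group comparison of an image-derived measurement: an effect size and a confidence interval alongside the P-value, rather than the P-value alone. The data are simulated and the pooled two-sample t-test is only a stand-in for whichever pre-specified analysis fits a real experiment.

```python
# Illustrative sketch only: report effect size and uncertainty, not just P.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control   = rng.normal(loc=10.0, scale=2.0, size=40)  # e.g. mean object area, condition A
treatment = rng.normal(loc=11.5, scale=2.0, size=40)  # e.g. mean object area, condition B

t_stat, p_value = stats.ttest_ind(treatment, control)  # pooled two-sample t-test

# Effect size (Cohen's d) and a 95% confidence interval for the mean difference.
n1, n2 = len(treatment), len(control)
pooled_sd = np.sqrt(((n1 - 1) * treatment.var(ddof=1) + (n2 - 1) * control.var(ddof=1))
                    / (n1 + n2 - 2))
diff = treatment.mean() - control.mean()
cohens_d = diff / pooled_sd
half_width = stats.t.ppf(0.975, n1 + n2 - 2) * pooled_sd * np.sqrt(1 / n1 + 1 / n2)

print(f"difference = {diff:.2f} (95% CI {diff - half_width:.2f} to {diff + half_width:.2f}), "
      f"Cohen's d = {cohens_d:.2f}, P = {p_value:.3g}")
```

Deciding on the test, the unit of analysis (per cell versus per image versus per replicate) and the reported summary before looking at the results is the simplest protection against ‘P-hacking’.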

In tandem, the BIA community can lead a concerted effort to develop better standards and harmonized approaches with the goal of simplifying the learning process, making it easier to train future analysts and researchers. This will require engagement from those across the BIA community, including researchers, developers, funders and policy makers. Fostering interoperability through the use of common file formats (Hiner et al., 2016; Moore et al., 2021), the creation and adoption of metadata formats that include aspects of experimental design, and the promotion of best practices in software engineering for greater modularity and documentation (Afiaz et al., 2024; Sharma et al., 2024; Sivagurunathan et al., 2023; Wiesmann et al., 2015) will ensure that image data can be used and re-used with many tools. The establishment of standardized datasets and metrics for tool evaluation (Maier-Hein et al., 2024; Rubens et al., 2020) as well as the benchmarking initiatives these enable – such as setting up algorithm challenges for solutions to BIA problems like segmentation and object tracking (Caicedo et al., 2019; Haase et al., 2024b preprint; Ma et al., 2024; Maška et al., 2023; Ulman et al., 2017) – can enhance visibility and comparability among tools and reduce duplicated efforts. By adopting these strategies, the BIA community can foster a more integrated, efficient and collaborative research environment.
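To give a flavor of what standardized evaluation looks like in practice, the sketch below computes one widely used segmentation metric, the intersection over union (Jaccard index), between a predicted mask and a reference mask. The toy masks are assumptions purely for illustration; real benchmarks combine many carefully chosen metrics and datasets (Maier-Hein et al., 2024; Rubens et al., 2020).

```python
# Illustrative sketch only: intersection over union (IoU) of two binary masks.
import numpy as np

def iou(prediction: np.ndarray, reference: np.ndarray) -> float:
    """Jaccard index of two boolean masks; 1.0 means perfect overlap."""
    prediction = prediction.astype(bool)
    reference = reference.astype(bool)
    union = np.logical_or(prediction, reference).sum()
    if union == 0:            # both masks empty: treat as perfect agreement
        return 1.0
    return np.logical_and(prediction, reference).sum() / union

# Toy example: two slightly offset square 'objects'.
reference = np.zeros((64, 64), dtype=bool)
reference[10:30, 10:30] = True
predicted = np.zeros((64, 64), dtype=bool)
predicted[12:32, 12:32] = True

print(f"IoU = {iou(predicted, reference):.2f}")
```

Agreeing on such metrics, and on the datasets over which they are computed, is what makes results from different tools directly comparable.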

The broad technical knowledge required to successfully become a BIA expert typically takes many years to develop, especially in the absence of formalized training pathways. As such, bioimage analysts are sometimes employed in academic core facilities, like other technical specialists such as bioinformaticians and imaging scientists. Because they are often exposed to a range of biological problems, broad subsets of the BIA tool ecosystem and user difficulties with said tools, such ‘BIA application experts’ can highlight the unmet needs of the biological community when communicating with (or even serving as) research software engineers (Deschamps et al., 2023). To succeed in such roles, bioimage analysts must possess not only the technical skills described above but also the ability to understand and communicate analysis techniques in order to be able to advise and train researchers on the requirements and resources for data sharing (Fig. 1). As detailed above, the benefits to institutions employing such individuals are vast, as they can raise the quality of research for whole departments through consultations on analyses and statistical presentation of image data, as well as facilitate more efficient use of computational resources through consultation on data management and cluster utilization, reducing institutional costs.

In our collective experience, few people currently have all the skills needed to succeed in this career, and those who do might find the academic core facilities career path to be challenging and unstable. Although in some regions, such as parts of Europe, it is common to find regional or national funding for core facilities staff (O'Toole and Marrison, 2024; Pfander et al., 2022), in other regions, such as the United States, this type of funding is difficult if not impossible to acquire, leaving the funding of core facility staff up to individual institutions. In a recent Global BioImage Analysts' Society (GloBIAS) survey of the field (GloBIAS Survey Working Group, personal communication), fewer than 20% of bioimage analysts working in core facilities reported that their facility is more than 75% internally funded, suggesting that most institutions do not have sufficient funding dedicated to covering salary and expenses for bioimage analysts. Most analysts rely on a combination of collaborators’ grants and microscope usage fees to make up the remainder of the budget, with a smaller fraction of money coming from their own grants, external consulting or fees from analysis projects. Only one of 166 respondents reported being able to fund more than 75% of their position through image analysis fees alone, which likely reflects similar cost recovery challenges in other kinds of computational core facilities due to higher salaries of computational experts and lower willingness of researchers to pay for computational work (Dragon et al., 2020). Without reliable institutional or agency funding, academic core facility bioimage analysts typically have little job security, low pay and the same difficulties in career progression found in other core facility career paths (Adami et al., 2021; Dragon et al., 2020; Lippens et al., 2022; Rahmoon et al., 2024; Soltwedel and Haase, 2024; Tranfield and Lippens, 2024; Waithe, 2021; Wright et al., 2024).

Unfortunately, the historical reliance on qualitative assessment in biology research has led to relative underdevelopment and underappreciation of the difficulty of BIA, combined with other cultural biases against facility technical specialists (Knudtson et al., 2019; Kos-Braun et al., 2020) (Fig. 3). Researchers often approach bioimage analysts with predetermined mindsets about what an analysis should look like and what the results ought to be, rather than engaging in a collaborative dialogue to explore the full spectrum of analytical tools available. This disconnect not only suppresses innovation in novel methods and approaches but also prevents the optimal application of BIA in addressing complex scientific questions. In many scientific environments, bioimage analysts find their contributions condensed to brief mentions in the methods section or relegated to supplementary materials despite their pivotal role in shaping the research outcome from the outset. This oversight not only diminishes the value of their expertise but also risks undermining the integrity and reproducibility of scientific findings. The combination of low pay and under-acknowledgement can lead to burnout in such analysts, who often find themselves both more respected and better compensated in industry roles. This leads to a ‘brain drain’ in which highly talented experts who serve as a critical resource for both researchers and BIA tool creators are often lost from the academic community.

It will therefore be vital to establish a stable career path for BIA specialists. BIA experts (alongside their imaging scientist colleagues) represent a crucial, stable source of knowledge and expertise within their local community. Increasing the number of BIA experts, both by expanding existing training efforts (Cimini et al., 2024; Martins et al., 2021) and by promoting better career paths and acknowledgement structures for analysts, will be essential to generating the expertise needed to push imaging science forward. Other types of core facilities can serve as a template for ways to promote career advancement as well as adoption of training standards and programs (Adami et al., 2021; Waters, 2020; Wright et al., 2024). Internal funding for such facilities typically requires convincing stakeholders of their value; the benefits and contributions of BIA must therefore be quantified in measurable terms for funders, which poses a challenge, as each institution or granting agency values key performance indicators such as impact factor, publication timelines or production of open data differently. Within universities, academic stakeholders must recognize the value of on-site BIA experts who can train users and propose tailored solutions over purchasing proprietary software with limited scope and user training resources. Here, we provide a draft template (Table S1) to help bioimage analysts (or those wishing to employ them) translate their many roles into monetary value. Although this template must be personalized for every situation, it can be used in discussions on key performance indicators with decision makers to help explain the value of a dedicated BIA facility (Soltwedel and Haase, 2024). While our template covers many possible activities, the number and priority of aspects that any given facility can cover should be tailored to the staffing size, as the requirement to cover too broad an area of expertise is a known issue affecting staff retention and well-being in other bioinformatics facilities (Dragon et al., 2020).

The creation of dedicated BIA core facilities and/or the embedding of bioimage analysts in bioimaging core facilities can help institutions centralize and share costs for these services, which are increasingly integral for the functioning of departments or even whole universities. In accordance with common funding models used in both imaging core facilities and bioinformatics facilities (Dragon et al., 2020; O'Toole and Marrison, 2024; Soltwedel and Haase, 2024; Tranfield and Lippens, 2024; Waithe, 2021), bioimage analysts can of course generate revenue, but they should not be required to recover all of the associated costs through user fees. The choice of costing model is critical. Charging per project can limit access to only higher-funded labs, whereas a centrally funded BIA facility with set hours per group can provide a more equitable service (Soltwedel and Haase, 2024). Since BIA needs are often unexpected, this model also protects groups who find themselves suddenly needing BIA services but who had not previously budgeted for them and might not realize the costs required for computational collaboration. For projects requiring significant research from BIA experts, the intellectual contributions of such experts should be recognized by including them as co-principal investigators on grants. To prevent the unfortunate but not uncommon practice of including computational collaborators on initial grants only to make massive cuts to their budget once a grant is awarded (Way et al., 2021; https://www.timeshighereducation.com/opinion/scientific-collaborators-are-not-disposable), BIA co-principal investigators should be provided with official subcontracts that cannot be reduced without mutual agreement.

The increasing inclusion by journals of dedicated dataset, methods and software sections in peer-reviewed papers is a welcome development that allows BIA experts more opportunities to publish their work; however, we emphasize that academic citations should only be viewed as one metric of value, as we will detail below. Alternative mechanisms of recognition – such as the increased use of narrative CVs emphasizing the value of collaboration and support work, and the creation of field recognition structures – will also allow hiring and promotion committees to recognize excellence and community value in BIA. The BIA community could work to create such awards to foster recognition of innovation, collaboration and excellence and to acknowledge the diverse contributions of its members, from enthusiastic students to seasoned principal investigators. Awards highlighting impactful work in BIA across all levels of scientific engagement could be administered by reputable entities within the community, including societies like GloBIAS, funders like the Chan Zuckerberg Initiative, or individual institutes and universities.

BIA experts who develop new software tools, much like those who work with researchers on BIA applications, are also undervalued in many academic structures and need better working conditions that will inspire them to develop, maintain and update those tools, including a ‘critical mass’ of local experts to interact with (Way et al., 2021). Unfortunately, the current incentive structure in academic research makes it harder than necessary to build and support the practical BIA tools that the wider community needs (Fig. 2).

Peer-reviewed publications are one of the main quantified outputs of academic research (Derrick et al., 2024; Way et al., 2021). The ‘publish or perish’ paradigm encourages researchers to narrowly focus on their own area of expertise rather than target their tools broadly. BIA tool developers are thus expected to continually publish novel methods that demonstrably improve on the state of the art according to some benchmark – even though novelty and benchmark performance are not reliable surrogate measures of real-world usefulness (Maier-Hein et al., 2018). Additionally, although a ‘proof-of-concept’ algorithm that works on a restricted dataset might be publishable, there is often no requirement for authors to provide any code that would enable scrutiny or reproducibility of the method (Sharma et al., 2024). Compounding this, academic research labs are typically staffed primarily by trainees, who are unlikely to continue maintaining software that they generated in previous positions.

Making an algorithm accessible to researchers can also be a double-edged sword: a BIA specialist who takes the considerable time and effort needed to create well-documented, user-friendly, open-source software will end up with fewer publications, may be falsely perceived as less productive and might have their work dismissed as ‘software, not research’. If they attempt to incorporate an algorithm into a larger tool within the BIA ecosystem rather than make a new tool, it might be falsely perceived as a trivial advance even when its functionality is novel. The payoff from such efforts is also unclear: even when software is broadly used by non-computational users, a commensurate level of citation is not guaranteed (Giving software its due, 2019). The resulting overlap and redundancy among tools not only dilutes resources but can also have a considerable environmental impact due to the computational resources required to retrain deep learning models for slight advancements. Current incentives therefore create a ‘graveyard’ of unmaintained, standalone ‘usable enough’ and ‘good enough’ tools developed for particular projects rather than reusable workflows or plugins for existing BIA tools that would promote FAIR principles (Deschamps et al., 2023; Moses and Pachter, 2022).

The BIA field would thus strongly benefit from funding dedicated to encouraging software maintenance and documentation, which are essential to users, rather than novelty alone. Some philanthropic funders have introduced such programs (for example, the Chan Zuckerberg Initiative Imaging Software Fellows program), but far more of this type of funding is needed globally to promote the long-term maintenance and support of existing solutions rather than the recurrent cycle of development and abandonment. Such a shift in funding models should be accompanied by community effort to define criteria for quality software and good practices for software development and maintenance. Taken as a whole, these measures could help direct efforts towards a smaller number of BIA tools, with the benefits of greatly improved quality, accessibility and usability.

Although significant steps have been taken in the creation and adoption of BIA as an expert discipline, many challenges remain. Given the complexity and diversity of biological data, coupled with the rapid evolution of imaging technologies, a concerted effort to use BIA to advance scientific knowledge is necessary. This requires the collaborative engagement of all stakeholders involved in the bioimaging ecosystem, including but not limited to researchers, bioinformaticians, software developers, policy makers and funding agencies. A synergistic approach, combining top-down strategies from organizational and policy perspectives and bottom-up initiatives driven by community-based innovations, is paramount for fostering an environment conducive to solving the problems at hand. A central aspect of our strategy is cultivating a culture that values and understands the importance of BIA, ensuring buy-in from all the various stakeholders (Fig. 4). We emphasize the following six key actions.

(1) Enhancing visibility: active participation of bioimage analysts in scientific conferences and forums not only boosts the visibility of BIA but also facilitates collaboration and the exchange of ideas.

(2) Building empathy and collaboration: willingness to learn from one another, letting BIA novices learn from experts and developing a common language are key. Engaging in activities such as pair programming (where researchers write code together) or ‘rescue sessions’ for problematic datasets (where novices attempt to analyze very difficult data, discuss computational methods to improve analysis, and brainstorm changes in sample preparation and/or imaging) can bridge the gap between bioimage analysts and researchers.

(3) Highlighting unique contributions: by presenting case studies and research outcomes that were made possible exclusively through advanced BIA, we can underscore the unique value it brings to scientific discovery. These success stories can serve as powerful testimonials to the crucial role of quantitative analysis.

(4) Advocating for standards in publishing: lobbying for journals to mandate not just the inclusion of quantitative image analysis but also a thorough description of BIA methodologies. Journals should adopt BIA-inclusive publication checklists (Schmied et al., 2024) to ensure that all image quantification is performed according to field standards, similar to existing checklists for statistical analysis. Journals or journal sections dedicated to methods and resources should accept BIA tools and workflows.

(5) Securing support from funders: pushing for funding bodies to recognize and support the infrastructure of the BIA community, including the development of communal resources and spaces for collaboration.

(6) Emphasizing the importance of user support: advocating for consideration of user support as fundamental to software development and providing funding for tool maintenance ensures that tools are not only technically robust, but also user friendly and suitable for widespread adoption.

Fig. 4. A vision for the future of the BIA community. We have set short-term and long-term goals for our community to address. In the short term, we would like to direct our efforts into defining standards, cataloging resources and routinely adopting best practices. In the long term, the defined standards should be disseminated through publications, policies and training. These objectives need the support of policy makers and funders for success. A particular focus should be placed on tailored funding opportunities, publication support and recognition of non-standard scientific contributions.

Despite these challenges, the BIA community is motivated by a shared sense of purpose and organization, with high enthusiasm and readiness to contribute to the collective mission of enhancing the rigor, reproducibility and impact of scientific research. The recent founding of GloBIAS, a global society for bioimage analysts, is likely to be as globally catalytic in the future as NEUBIAS has been for Europe. The GloBIAS website (www.globias.org) lists training materials, events and pages where biologists can find local BIA experts available for collaboration. Many groups primarily focused on bioimaging [such as BioImaging North America (BINA), the Royal Microscopical Society (RMS), the African Bioimaging Consortium (ABIC), Euro-Bioimaging and Global BioImaging] also now host BIA subgroups and run workshops on BIA training for beginners. The Scientific Community Image Forum (forum.image.sc) also serves as a global community gathering place for image analysis events and education that is free and open to all (Rueden et al., 2019).

Meetings such as The Company of Biologists ‘Effectively Communicating Bioimage Analysis’ Workshop, which inspired this Perspective, play a crucial role in stimulating this community and equipping participants with the knowledge, skills and networks necessary to advocate for and implement best practices in BIA. As we look to the future, it is imperative that we continue to nurture this collaborative spirit by fostering open dialogue, knowledge exchange and innovation. By doing so, we can collectively surmount the challenges that lie ahead, paving the way for groundbreaking discoveries that will propel the field of biomedical research forward. In this spirit of unity and determination, we extend an open invitation to all stakeholders to join us in this endeavor. Together, we are poised to make a substantial impact on the advancement of science, underpinned by the power of effective BIA.

In conclusion, the journey ahead is undeniably challenging yet filled with immense potential. Armed with a clear vision, a robust framework for collaboration and a relentless drive for excellence, the BIA community is well-equipped to embrace the complexities of the future. Let us proceed with confidence and collective resolve, committed to the pursuit of scientific innovation and the advancement of human health.

Acknowledgements

The authors thank Workshop participant Jason Williams, as well as Helen Zenner, Johanna Langrish and Jane Elsom from The Company of Biologists for organization of and participation in the Effectively Communicating Bioimage Analysis Workshop. B.A.C. and K.W.E. wish to thank the BINA Image Informatics working group for useful discussions.

Funding

This publication has been made possible in part by Chan Zuckerberg Initiative grants 2020-225720 (doi:10.37921/977328pjvbca) and 2023-329649 to B.A.C., grant 2020-224348 to K.W.E., grant DAF2019-198153 (doi:10.37921/603497pkcvxd) to S. McArdle, and grant 2021-244318 to C.S.-D.-C. from the Chan Zuckerberg Initiative DAF, an advised fund of Silicon Valley Community Foundation. This work was also supported by the Center for Open Bioimage Analysis (COBA) funded by the National Institute of General Medical Sciences (P41 GM135019 to B.A.C. and K.W.E). This work was supported by the Francis Crick Institute, which receives its core funding from Cancer Research UK (CC1069, CC1076), the UK Medical Research Council (CC1069, CC1076) and the Wellcome Trust (CC1069, CC1076). E.F. acknowledges support by Biomedicum Imaging Unit, University of Helsinki, as a part of Biocenter Finland infrastructure. R.H. acknowledges the financial support by the Federal Ministry of Education and Research of Germany and by Sächsisches Staatsministerium für Wissenschaft, Kultur und Tourismus in the programme Center of Excellence for AI-research “Center for Scalable Data Analytics and Artificial Intelligence Dresden/Leipzig”, project identification number ScaDS.AI. S. Marcotti acknowledges Biotechnology and Biological Sciences Research Council (BBSRC) project grant BB/V006169/1. This work was partially supported by the European Union Horizon Europe research and innovation program under grant agreement number 101057970 (AI4Life project) and by Ministerio de Ciencia, Innovación y Universidades, Agencia Estatal de Investigación, under grant PID2023-152631OB-I00, co-financed by European Regional Development Fund (ERDF), ‘A way of making Europe’. This work was supported by The Institute of Genetics and Cancer, The University of Edinburgh. This work was supported by the Helmholtz Association of German Research Centers in the scope of the Helmholtz Imaging Incubator (H.I.). C.S.-D.-C acknowledges funding from National Institutes of Health (NIH) grants U01CA200059-06 and UM1HG011536-04S1. O.U. was supported by the UK Engineering and Physical Sciences Research Council (EPSRC) grant EP/S024336/1 for the University of Leeds Centre for Doctoral Training. A.H.K. acknowledges BIIF, a unit of the National Bioinformatics Infrastructure Sweden (NBIS), with funding from SciLifeLab and National Microscopy Infrastructure (NMI; VR-RFI 2019-00217). This work was supported by France-Bioimaging, which is supported by the French National Research Agency (ANR-10-INBS-04). J.W.P. was supported by InFLAMES Flagship Programme of the Academy of Finland (decision number: 337531). M.S.N. and K.W.E. acknowledge support from Morgridge Institute for Research, Beckman Center for Light Sheet Microscopy, and NIH U54 CA268069. V.U. acknowledges funding from the TRANSFORM funding instrument of the University of Zurich. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. For the purpose of Open Access, the author has applied a Creative Commons Attribution (CC BY) licence to any Author Accepted Manuscript version arising from this submission. Open Access funding provided by Chan Zuckerberg Initiative. Deposited in PMC for immediate release.

Special Issue

This article is part of the Special Issue ‘Imaging Cell Architecture and Dynamics’, guest edited by Lucy Collinson and Guillaume Jacquemet. See related articles at https://journals.biologists.com/jcs/issue/137/20.

References

Adami, V., Homer, N., Utz, N., Lippens, S., Rappoport, J. Z. and Fernandez-Rodriguez, J. (2021). An international survey of training needs and career paths of core facility staff. J. Biomol. Tech. 32, 1-9.
Afiaz, A., Ivanov, A. A., Chamberlin, J., Hanauer, D., Savonen, C. L., Goldman, M. J., Morgan, M., Reich, M., Getka, A., Holmes, A. et al. (2024). Best practices to evaluate the impact of biomedical research software-metric collection beyond citations. Bioinformatics 40, btae469.
Barker, M., Chue Hong, N. P., Katz, D. S., Lamprecht, A.-L., Martinez-Ortiz, C., Psomopoulos, F., Harrow, J., Castro, L. J., Gruenpeter, M., Martinez, P. A. et al. (2022). Introducing the FAIR Principles for research software. Sci. Data 9, 622.
Bik, E. M., Casadevall, A. and Fang, F. C. (2016). The prevalence of inappropriate image duplication in biomedical research publications. MBio 7, e00809-16.
Caicedo, J. C., Goodman, A., Karhohs, K. W., Cimini, B. A., Ackerman, J., Haghighi, M., Heng, C., Becker, T., Doan, M., Mcquin, C. et al. (2019). Nucleus segmentation across imaging experiments: the 2018 Data Science Bowl. Nat. Methods 16, 1247-1253.
Cimini, B. A. (2024). Creating and troubleshooting microscopy analysis workflows: common challenges and common solutions. J. Microsc. 295, 93-101.
Cimini, B. A., Tromans-Coia, C., Stirling, D. R., Sivagurunathan, S., Senft, R. A., Ryder, P. V., Miglietta, E., Llanos, P., Jamali, N., Diaz-Rohrer, B. et al. (2024). A postdoctoral training program in bioimage analysis. Mol. Biol. Cell 35, e2.
Culley, S., Caballero, A. C., Burden, J. J. and Uhlmann, V. (2024). Made to measure: an introduction to quantifying microscopy data in the life sciences. J. Microsc. 295, 61-82.
Derrick, G. E., Hettrick, S., Baker, J., Karoune, E., Kerridge, S., Fletcher, G., Chue Hong, N., Ballantyne, L., Fransmann, J. and Roche, T. (2024). Shaping the future of research evaluation. Insights from The Festival of Hidden REF. Emerald Publishing.
Deschamps, J., Nogare, D. D. and Jug, F. (2023). Better research software tools to elevate the rate of scientific discovery or why we need to invest in research software engineering. Front. Bioinform. 3, 1255159.
Dragon, J. A., Gates, C., Sui, S. H., Hutchinson, J. N., Karuturi, R. K. M., Kucukural, A., Polson, S., Riva, A., Settles, M. L., Thimmapuram, J. et al. (2020). Bioinformatics core survey highlights the challenges facing data analysis facilities. J. Biomol. Tech. 31, 66-73.
Elmore, J. G., Barnhill, R. L., Elder, D. E., Longton, G. M., Pepe, M. S., Reisch, L. M., Carney, P. A., Titus, L. J., Nelson, H. D., Onega, T. et al. (2017). Pathologists' diagnosis of invasive melanoma and melanocytic proliferations: observer accuracy and reproducibility study. BMJ 357, j2813.
Gibson, C. C., Zhu, W., Davis, C. T., Bowman-Kirigin, J. A., Chan, A. C., Ling, J., Walker, A. E., Goitre, L., Delle Monache, S., Retta, S. F. et al. (2015). Strategy for identifying repurposed drugs for the treatment of cerebral cavernous malformation. Circulation 131, 289-299.
Giving software its due (2019). Nat. Methods 16, 207.
Haase, R., Tischer, C., Bankhead, P., Miura, K. and Cimini, B. (2024a). A Call for FAIR and Open-Access Training Materials to advance Bioimage Analysis. OSF preprint osf.io/2zgmc.
Haase, R., Tischer, C., Hériché, J.-K. and Scherf, N. (2024b). Benchmarking Large Language Models for bio-image analysis code generation. bioRxiv 2024.04.19.590278.
Hamilton, P. W., Wang, Y., Boyd, C., James, J. A., Loughrey, M. B., Hougton, J. P., Boyle, D. P., Kelly, P., Maxwell, P., Mccleary, D. et al. (2015). Automated tumor analysis for molecular profiling in lung cancer. Oncotarget 6, 27938-27952.
Hiner, M. C., Rueden, C. T. and Eliceiri, K. W. (2016). SCIFIO: an extensible framework to support scientific image formats. BMC Bioinform. 17, 521.
Imreh, G., Hu, J. and Le Guyader, S. (2024). Improving light microscopy training routines with evidence-based education. J. Microsc. 294, 295-307.
Jamali, N., Dobson, E. T., Eliceiri, K. W., Carpenter, A. E. and Cimini, B. A. (2022). 2020 BioImage Analysis Survey: community experiences and needs for the future. Biol. Imaging 1, e4.
Jambor, H. K. (2023). A community-driven approach to enhancing the quality and interpretability of microscopy images. J. Cell Sci. 136, jcs261837.
Jambor, H., Antonietti, A., Alicea, B., Audisio, T. L., Auer, S., Bhardwaj, V., Burgess, S. J., Ferling, I., Gazda, M. A., Hoeppner, L. H. et al. (2021). Creating clear and informative image-based figures for scientific publications. PLoS Biol. 19, e3001161.
Jonkman, J., Brown, C. M., Wright, G. D., Anderson, K. I. and North, A. J. (2020). Tutorial: guidance for quantitative confocal microscopy. Nat. Protoc. 15, 1585-1611.
Kemmer, I., Keppler, A., Serrano-Solano, B., Rybina, A., Özdemir, B., Bischof, J., El Ghadraoui, A., Eriksson, J. E. and Mathur, A. (2023). Building a FAIR image data ecosystem for microscopy communities. Histochem. Cell Biol. 160, 199-209.
Knudtson, K. L., Carnahan, R. H., Hegstad-Davies, R. L., Fisher, N. C., Hicks, B., Lopez, P. A., Meyn, S. M., Mische, S. M., Weis-Garcia, F., White, L. D. et al. (2019). Survey on scientific shared resource rigor and reproducibility. J. Biomol. Tech. 30, 36-44.
Kos-Braun, I. C., Gerlach, B. and Pitzer, C. (2020). A survey of research quality in core facilities. Elife 9, e62212.
Lee, R. M., Eisenman, L. R., Khuon, S., Aaron, J. S. and Chew, T.-L. (2024). Believing is seeing - the deceptive influence of bias in quantitative microscopy. J. Cell Sci. 137, jcs261567.
Lippens, S., Audenaert, D., Botzki, A., Derveaux, S., Ghesquière, B., Goeminne, G., Hassanzadeh, R., Haustraete, J., Impens, F., Lamote, J. et al. (2022). How tech-savvy employees make the difference in core facilities: recognizing core facility expertise with dedicated career tracks. EMBO Rep. 23, e55094.
Ma, J., Xie, R., Ayyadhury, S., Ge, C., Gupta, A., Gupta, R., Gu, S., Zhang, Y., Lee, G., Kim, J. et al. (2024). The multi-modality cell segmentation challenge: towards universal solutions. Nat. Methods 21, 1103-1113.
Maier-Hein, L., Eisenmann, M., Reinke, A., Onogur, S., Stankovic, M., Scholz, P., Arbel, T., Bogunovic, H., Bradley, A. P., Carass, A. et al. (2018). Why rankings of biomedical image analysis competitions should be interpreted with care. Nat. Commun. 9, 5217.
Maier-Hein, L., Reinke, A., Godau, P., Tizabi, M. D., Buettner, F., Christodoulou, E., Glocker, B., Isensee, F., Kleesiek, J., Kozubek, M. et al. (2024). Metrics reloaded: recommendations for image analysis validation. Nat. Methods 21, 195-212.
Martins, G. G., Cordelières, F. P., Colombelli, J., D'Antuono, R., Golani, O., Guiet, R., Haase, R., Klemm, A. H., Louveaux, M., Paul-Gilloteaux, P. et al. (2021). Highlights from the 2016-2020 NEUBIAS training schools for Bioimage Analysts: a success story and key asset for analysts and life scientists. F1000Res. 10, 334.
Maška, M., Ulman, V., Delgado-Rodriguez, P., Gómez-De-Mariscal, E., Nečasová, T., Guerrero Peña, F. A., Ren, T. I., Meyerowitz, E. M., Scherr, T., Löffler, K. et al. (2023). The cell tracking challenge: 10 years of objective benchmarking. Nat. Methods 20, 1010-1020.
Miura, K. (2021). A Survey on Bioimage Analysis Needs, 2015 (Version 1) [Dataset]. Zenodo.
Miura, K. and Nørrelykke, S. F. (2021). Reproducible image handling and analysis. EMBO J. 40, e105889.
Moore, J., Allan, C., Besson, S., Burel, J.-M., Diel, E., Gault, D., Kozlowski, K., Lindner, D., Linkert, M., Manz, T. et al. (2021). OME-NGFF: a next-generation file format for expanding bioimaging data-access strategies. Nat. Methods 18, 1496-1498.
Moses, L. and Pachter, L. (2022). Museum of spatial transcriptomics. Nat. Methods 19, 534-546.
O'Toole, P. J. and Marrison, J. L. (2024). A perspective into full cost recovery within a core facility/shared resource lab. J. Microsc. 294, 372-379.
Pfander, C., Bischof, J., Childress-Poli, M., Keppler, A., Viale, A., Aime, S. and Eriksson, J. E. (2022). Euro-BioImaging - Interdisciplinary research infrastructure bringing together communities and imaging facilities to support excellent research. iScience 25, 103800.
Polley, M.-Y. C., Leung, S. C. Y., Mcshane, L. M., Gao, D., Hugh, J. C., Mastropasqua, M. G., Viale, G., Zabaglo, L. A., Penault-Llorca, F., Bartlett, J. M. S. et al. (2013). An international Ki67 reproducibility study. J. Natl. Cancer Inst. 105, 1897-1906.
Pulverer, B. (2015). Reproducibility blues. EMBO J. 34, 2721-2724.
Rahmoon, M. A., Hobson, C. M., Aaron, J. S., Balasubramanian, H. and Chew, T.-L. (2024). More than just "added value": the perils of not establishing shared core facilities in resource-constrained communities. J. Microsc. 294, 440-447.
Rossner, M. and Yamada, K. M. (2004). What's in a picture? The temptation of image manipulation. J. Cell Biol. 166, 11-15.
Rubens, U., Mormont, R., Paavolainen, L., Bäcker, V., Pavie, B., Scholz, L. A., Michiels, G., Maška, M., Ünay, D., Ball, G. et al. (2020). BIAFLOWS: a collaborative framework to reproducibly deploy and benchmark bioimage analysis workflows. Patterns (N. Y.) 1, 100040.
Rueden, C. T., Ackerman, J., Arena, E. T., Eglinger, J., Cimini, B. A., Goodman, A., Carpenter, A. E. and Eliceiri, K. W. (2019). Scientific Community Image Forum: a discussion forum for scientific image software. PLoS Biol. 17, e3000340.
Sarkans, U., Chiu, W., Collinson, L., Darrow, M. C., Ellenberg, J., Grunwald, D., Hériché, J.-K., Iudin, A., Martins, G. G., Meehan, T. et al. (2021). REMBI: recommended metadata for biological images-enabling reuse of microscopy data in biology. Nat. Methods 18, 1418-1422.
Schmidt, C., Hanne, J., Moore, J., Meesters, C., Ferrando-May, E., Weidtkamp-Peters, S. and Members of the NFDI4BIOIMAGE initiative (2022). Research data management for bioimaging: the 2021 NFDI4BIOIMAGE community survey. F1000Res. 11, 638.
Schmidt, C., Boissonnet, T., Dohle, J., Bernhardt, K., Ferrando-May, E., Wernet, T., Nitschke, R., Kunis, S. and Weidtkamp-Peters, S. (2024). A practical guide to bioimaging research data management in core facilities. J. Microsc. 294, 350-371.
Schmied, C., Nelson, M. S., Avilov, S., Bakker, G.-J., Bertocchi, C., Bischof, J., Boehm, U., Brocher, J., Carvalho, M. T., Chiritescu, C. et al. (2024). Community-developed checklists for publishing images and image analyses. Nat. Methods 21, 170-181.
Senft, R. A., Diaz-Rohrer, B., Colarusso, P., Swift, L., Jamali, N., Jambor, H., Pengo, T., Brideau, C., Llopis, P. M., Uhlmann, V. et al. (2023). A biologist's guide to planning and performing quantitative bioimaging experiments. PLoS Biol. 21, e3002167.
Sharma, N. K., Ayyala, R., Deshpande, D., Patel, Y., Munteanu, V., Ciorba, D., Bostan, V., Fiscutean, A., Vahed, M., Sarkar, A. et al. (2024). Analytical code sharing practices in biomedical research. PeerJ Comput. Sci. 10, e2066.
Sivagurunathan, S., Marcotti, S., Nelson, C. J., Jones, M. L., Barry, D. J., Slater, T. J. A., Eliceiri, K. W. and Cimini, B. A. (2023). Bridging imaging users to imaging analysis - a community survey. J. Microsc. [Epub].
Soltwedel, J. R. and Haase, R. (2024). Challenges and opportunities for bioimage analysis core-facilities. J. Microsc. 294, 338-349.
Tranfield, E. M. and Lippens, S. (2024). Future proofing core facilities with a seven-pillar model. J. Microsc. 294, 411-419.
Ulman, V., Maška, M., Magnusson, K. E. G., Ronneberger, O., Haubold, C., Harder, N., Matula, P., Matula, P., Svoboda, D., Radojevic, M. et al. (2017). An objective comparison of cell-tracking algorithms. Nat. Methods 14, 1141-1152.
Varga, Z., Diebold, J., Dommann-Scherrer, C., Frick, H., Kaup, D., Noske, A., Obermann, E., Ohlschlegel, C., Padberg, B., Rakozy, C. et al. (2012). How reliable is Ki-67 immunohistochemistry in grade 2 breast carcinomas? A QA study of the Swiss Working Group of Breast- and Gynecopathologists. PLoS One 7, e37379.
Waithe, D. (2021). Summary of two questionnaires designed to understand the research climate for Bioimage Analysts in the UK between 2016-2019. F1000Res. 10, 276.
Waters, J. C. (2020). A novel paradigm for expert core facility staff training. Trends Cell Biol. 30, 669-672.
Way, G. P., Greene, C. S., Carninci, P., Carvalho, B. S., De Hoon, M., Finley, S. D., Gosline, S. J. C., Lȇ Cao, K.-A., Lee, J. S. H., Marchionni, L. et al. (2021). A field guide to cultivating computational biology. PLoS Biol. 19, e3001419.
Wiesmann, V., Franz, D., Held, C., Münzenmayer, C., Palmisano, R. and Wittenberg, T. (2015). Review of free software tools for image analysis of fluorescence cell micrographs. J. Microsc. 257, 39-53.
Wilkinson, M. D., Dumontier, M., Aalbersberg, I. J. J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., Da Silva Santos, L. B., Bourne, P. E. et al. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Sci. Data 3, 160018.
Wright, G. D., Thompson, K. A., Reis, Y., Bischof, J., Hockberger, P. E., Itano, M. S., Yen, L., Adelodun, S. T., Bialy, N., Brown, C. M. et al. (2024). Recognising the importance and impact of Imaging Scientists: global guidelines for establishing career paths within core facilities. J. Microsc. 294, 397-410.

Competing interests

The authors declare no competing or financial interests.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed.

Supplementary information