Cochrane Collaboration and AI Updates

From UBC Wiki

See also

  • How is Cochrane advancing responsible AI for evidence synthesis?

How is Cochrane Integrating AI into Evidence Synthesis?

  • Cochrane is implementing a range of automation solutions in its review processes, including machine learning to identify randomized controlled trials (RCTs) for CENTRAL and, more recently, generative AI technologies. Cochrane is well positioned to use AI because of the high-quality, structured data in its systematic reviews and included studies. Analyses run in RevMan, which allows authors to insert live results directly into their reviews while writing and updating them.
  • Cochrane plans to promote responsible AI that combines automation with verification to help authors identify studies, and to improve AI literacy across the organization.
  • The AI Methods Group <https://www.cochrane.org/about-us/news/new-ai-methods-group-spearhead-adoption-across-four-leading-evidence-synthesis-organizations> includes Cochrane, the Campbell Collaboration, JBI, and the Collaboration for Environmental Evidence. The group aims to standardize responsible AI use across evidence synthesis organizations; it focuses on methods research and tool validation, and fosters collaboration via the International Collaboration for Automation in Systematic Reviews (ICASR).
  • Cochrane’s Evidence Synthesis and Methods journal encourages research on AI applications in evidence synthesis, such as search strategy development, screening, and risk of bias assessment. Recent studies highlight modest adoption of traditional AI tools but promising advancements in language editing and data extraction.

Responsible AI in Evidence Synthesis (RAISE)

Covidence support for RAISE

  • Human oversight is mandatory
  • Authors retain full responsibility for the review
  • AI must not compromise methodological rigor
  • All AI use must be transparent and reported

RAISE recommendations

  • Evidence synthesists publishing with Cochrane, the Campbell Collaboration, JBI, and the Collaboration for Environmental Evidence can use AI as long as they demonstrate that it does not compromise the methodological rigour or integrity of their synthesis;
  • AI and automation in evidence synthesis should be used with human oversight;
  • Any use of AI or automation that makes or suggests judgements should be fully and transparently reported;
  • AI tool developers should proactively ensure their AI systems or tools adhere to the RAISE recommendations, so that clear, transparent, and publicly available information is available to inform decisions about whether AI should be used in evidence synthesis.

Disclaimer

  • Note: Please read these entries critically. No warranties, express or implied, are granted for any health, medical, or AI information obtained from these pages. Check with your librarian for more contextual, accurate information.