AI & SOCIETY
Hard to find, harder to understand: examining transparency in educational generative AI
This document analysis examined the transparency of 20 GenAI tools for teachers, based on publicly available information on the tools' websites. Findings indicate a significant lack of transparency across these tools. Most failed to provide explicit details about their AI mechanisms, and none disclosed their training procedures or data sources. While several tools assured users that their data would not be used for further training, discussions of limitations and potential risks were largely absent.
Lack of Clear Mechanism Information
The study found a significant lack of clear information regarding how GenAI tools work or generate their responses across most tools. Only 4 out of 20 tools provided any details about what GenAI is or its content generation process.
A striking 80% of the reviewed GenAI tools provided no clear information on their underlying AI mechanisms, making it difficult for educators to understand how content is generated.
(Figure: Educational GenAI tool sampling procedure)
Absence of Training Data Disclosure
None of the 20 tools provided information on the specific datasets or training procedures used. While many relied on third-party models, their websites did not link to the model developers' sites for this crucial information.
Crucially, zero out of twenty educational GenAI tools disclosed their training data sources or procedures, raising significant concerns about bias, copyright, and ethical use.
| Information about foundational models | Number of tools | Name of the tools |
|---|---|---|
| No mention of models | 2 | Diffit, TeachAid |
| GPT or OpenAI's API as models | 9 | Almanack, Class Companion, Conker AI, Edcafe, Mizou, Owler AI, QuestionWell, SchoolAI, TeacherServer |
| Multiple models | 9 | Alayna, Brisk Teaching, Curipod, Eduaide AI, Fetchy, MagicSchool AI, Questgen, Teacherbot, Twee |
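The distribution in the table above can be tallied programmatically. In this minimal sketch, the tool names and category labels are taken directly from the table; everything else is illustrative:

```python
# Tally foundational-model disclosure across the 20 reviewed tools,
# using the three categories reported in the table above.
disclosure = {
    "No mention of models": ["Diffit", "TeachAid"],
    "GPT or OpenAI's API as models": [
        "Almanack", "Class Companion", "Conker AI", "Edcafe", "Mizou",
        "Owler AI", "QuestionWell", "SchoolAI", "TeacherServer",
    ],
    "Multiple models": [
        "Alayna", "Brisk Teaching", "Curipod", "Eduaide AI", "Fetchy",
        "MagicSchool AI", "Questgen", "Teacherbot", "Twee",
    ],
}

total = sum(len(tools) for tools in disclosure.values())
for category, tools in disclosure.items():
    share = 100 * len(tools) / total
    print(f"{category}: {len(tools)}/{total} ({share:.0f}%)")
```

Running this reproduces the table's counts (2, 9, and 9 tools) over the full sample of 20.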
Inconsistent Disclosure of Limitations
Only 7 out of 20 tools included clear disclaimers about potential weaknesses and urged users to exercise caution. The remaining tools either offered no warnings or confined them to legal terms-and-conditions pages, often in complex, jargon-filled language.
A significant 65% of the tools failed to provide clear, accessible advisories regarding GenAI limitations, potentially leading to over-reliance and negative outcomes in educational settings.
Enhancing Transparency in Educational GenAI
Context: The study highlights a critical need for enhanced transparency in educational GenAI tools to foster trust, ensure ethical use, and support informed decision-making among educators. Current disclosures are often limited, unclear, or hidden.
Solution: Developers should implement clearer, more accessible disclosures, including dedicated transparency pages, plain-language definitions of key terms, and visible warnings about limitations. They should also take accountability for their tools and give concrete examples of how user data is used.
Outcome: By adopting these recommendations, educational GenAI tools can build stronger trust with educators, enabling them to critically assess functionalities, responses, and suitability for classroom use. This facilitates informed decision-making and promotes responsible AI integration in education.
Checklist for Educators
A practical checklist is provided for educators to evaluate the transparency of educational GenAI tools before adopting them. This includes checking for model and training data transparency, user data privacy, limitations, placement, clarity, and institutional compliance.
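The checklist dimensions listed above could be operationalized as a simple scoring aid. The criterion names and the equal weighting below are illustrative assumptions, not the paper's instrument:

```python
# Illustrative transparency scorecard based on the checklist dimensions
# named above; criterion keys and weighting are assumptions.
CRITERIA = [
    "model_disclosed",          # names its foundational model(s)
    "training_data_disclosed",  # describes training data/procedures
    "user_data_policy_clear",   # states how user data is handled
    "limitations_stated",       # warns about weaknesses and risks
    "plain_language",           # disclosures readable, not legalese
    "institutional_compliance", # addresses school/district policies
]

def transparency_score(tool: dict) -> float:
    """Return the fraction of checklist criteria the tool satisfies."""
    return sum(bool(tool.get(c)) for c in CRITERIA) / len(CRITERIA)

# Example: a hypothetical tool that discloses only its model and data policy.
example = {"model_disclosed": True, "user_data_policy_clear": True}
print(f"Transparency score: {transparency_score(example):.2f}")
```

An educator could fill in one such dictionary per candidate tool and compare scores before adoption; the threshold for "transparent enough" remains a local judgment call.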
A Phased Roadmap for GenAI Adoption
A structured, phased approach helps institutions integrate GenAI responsibly and realize its value.
Phase 1: Discovery & Assessment
Evaluate current teaching practices and identify areas where GenAI can offer the most impact and efficiency.
Phase 2: Pilot Program & Feedback
Implement selected GenAI tools with a small group of educators, gather feedback, and iterate based on real-world use.
Phase 3: Training & Integration
Conduct comprehensive training for all staff on GenAI tools, ethical considerations, and best practices for educational use.
Phase 4: Scaling & Continuous Improvement
Roll out GenAI tools across the institution, establish ongoing support, and monitor for new developments and feedback.