AI Is a Part of Local Schools
Careful steps for incorporating a new technology

Artificial intelligence (AI) is no longer a futuristic topic reserved for tech talks. It’s reshaping much of how we live our lives today, including in local classrooms. From tools that draft text to platforms that help structure student thinking, AI promises powerful new capabilities.
But AI in schools also raises complex questions about academic integrity, equity and ethics. In County Lines country, independent schools are navigating these opportunities and challenges in ways that reflect their distinct missions and values.
Here’s what we learned from five local schools.
Intentional Integration and Ethical Grounding
At Episcopal Academy, AI is viewed not as a substitute for critical thinking but as a tool that should support meaningful learning. The school’s mission emphasizes intellectual curiosity, ethical reflection and integrity, themes that guide how AI fits into academic life there.
According to Kelly Edwards, the Academic Dean, the faculty use AI like a “teaching assistant.” For example, it’s used for shaping alternative assessments, for making curriculum guides and, to avoid cheating, for creating alternate versions of a test. While creating presentations, teachers will ask AI tools, “What’s the theme I want to emphasize? What’s the format? What do I want this to look like? How do I want it to flow? How many examples do I need?” Teachers get quick assistance creating a better, polished product.
Students, said Edwards, use AI tools as a “thought partner.” They’ll get feedback to help improve essays or get assistance to present their work in a “beautiful presentation.”
In general, Episcopal Academy encourages transparent, ethical use of AI tools. Students are taught to develop technological fluency without losing judgment. The goal is not avoidance, but preparedness — students who use AI thoughtfully, ethically and with self-awareness.
Learning with Purpose and Moral Insight
Villa Maria Academy approaches technology through the lens of mission and moral development. The school’s long history of academic excellence, character formation and critical thinking provides a natural foundation for incorporating AI in the classroom.

Villa Maria treats AI as a tool with ethical expectations attached. According to Linda Schweitzer, the school’s Education Technology Specialist, the computers of our childhood automated the “monotonous tasks that may not be difficult but may be time consuming.” In contrast, generative AI responds to “a prompt” to replicate “what humans have historically done.” The school’s teachers are using AI to “enhance their lesson plans, classroom activities and assessments.” Meanwhile, students use AI “to help them create study guides, do practice tests and brainstorm ideas.”
Schweitzer explained that teachers use a “stoplight formula” when deciding whether to permit AI tools for assignments. An assignment designated red means no AI use is permitted. Yellow means a student must obtain permission from the teacher before using AI. Green means the student is free to use AI.
Teachers are integrating AI tools while simultaneously safeguarding student learning goals. Rather than viewing AI as a shortcut, the school’s culture encourages students to treat AI as one of many tools that support meaningful intellectual and moral growth.
Careful Boundaries and Ethical Use
Combining a strong academic program with values rooted in integrity, respect and responsibility, the Academy of Notre Dame partners with a company called Flint for some of its AI use. According to Tyler Gaspich, Director of Information Resources and Technology, “Flint responds to prompts with guiding questions and suggestions for the user to consider. So instead of students losing the ability to think critically by having the AI solve that math problem or write that essay for the student, Flint encourages students to consider certain aspects, nudging them in an appropriate direction.”

Notre Dame’s academic code includes an AI Acceptable Use Policy to make clear the distinction between acceptable and unacceptable uses. Students may use AI for idea generation, research assistance or understanding complex topics, provided they engage critically with the material and ensure that final submissions reflect their own understanding. At the same time, presenting AI-generated content as personal, original work is explicitly prohibited.
Take, for example, a senior who has taken a number of computer classes. This student uses AI “to break down instructions and aid in studying,” to build “flashcards, replayable podcasts or create practice tests within minutes” and “to clarify prompts or create a comprehensive list of expectations.” She also uses AI “to manage [her] time by creating work schedules for [her] studies, sports, college applications and extracurriculars.” All are helpful and permitted uses.
Quaker Values and Thoughtful Exploration
Operating within a Quaker educational tradition, Wilmington Friends School emphasizes reflection and community, valuing critical inquiry, individual expression and ethical engagement. Its Head of School, Ken Aldridge, reported that AI is used by faculty and staff as a “professional support tool to enhance planning, efficiency, creativity and instructional responsiveness rather than as a substitute for teaching or human judgment.”
Many teachers use AI tools in their classrooms — some for lesson planning and resource creation, others for research, and still others for creative work and image design — reflecting “thoughtful experimentation, with an emphasis on supporting student learning and teacher effectiveness,” said Aldridge.
AI is largely disallowed in middle school, although students, “under the guidance of their teachers,” are experimenting with several platforms, such as Grammarly, ChatGPT and Claude.
In upper school, AI use is more common, including Grammarly, Duolingo and various features within student devices and platforms. How students use AI “will vary class to class, teacher to teacher and assignment to assignment … with explicit guidance and instructions from teachers.”
Teachers at Wilmington Friends invite students to reflect on when and why to use AI and when to rely on their own reasoning. Students are encouraged to pursue balanced exploration that situates AI tools within broader questions of meaning, purpose and ethical responsibility.
Innovation Within a College-Prep Mission

Early and common uses of AI tools have been in middle and upper school subjects like history, math or biology. Yet Bill Burton, who teaches K to 5 at the Tatnall School, uses AI to teach art. For instance, his second-grade students learn about camouflage and make their own “fantasy creature.” Then they use AI to create a background showing their animal’s habitat. “Text and writing,” he said of AI uses, “is just scratching the surface.”
Tatnall emphasizes rigorous academics and personal growth. To that end, faculty and administrators are actively considering how AI fits into teaching and learning contexts. In environments where project-based learning, technology literacy and deep inquiry are hallmarks, AI is incorporated to enhance creativity and problem solving, but always with clear expectations about when it is appropriate. Unauthorized AI use is treated as a form of cheating.
Shared Themes — Support, Integrity and Preparation
Several common themes emerge across these five schools. AI presents their faculty and students with great opportunities. But each school expects AI use to align with ethical principles and academic integrity, ensuring that students remain responsible authors of their own learning.
Policies and classroom practices distinguish between supportive uses — idea generation, research support, time saving — and unacceptable uses — submitting AI-generated content as original work.
Teachers, not technology alone, determine how AI tools fit within curriculum goals. Professional development, discussion and reflection help faculty guide student engagement with emerging technologies.
All the schools recognize that AI is part of students’ current and future landscape. Rather than treating AI as a threat, they emphasize thoughtful and discerning use.
An Evolving Future
As AI continues to evolve, schools must balance innovation with values. Not merely embracing technology for its own sake, independent schools are modeling ways to integrate AI thoughtfully. They are shaping its use through intentional policies, ethical frameworks and educational conversations. These schools remind us that AI’s true value lies not in what it can generate, but in what it can help students understand about themselves, their world and how they choose to act within it.
Some Terms Beyond ChatGPT
(ChatGPT is an AI chatbot developed by OpenAI that uses advanced machine learning to process and generate human-like text, audio and images.)
- Claude: an AI assistant by Anthropic, trained to be safe, accurate and secure.
- Flint: an AI-powered learning platform designed specifically for K–12 schools that offers personalized tutoring and transcription services.
- Gemini: an AI platform, by Google, that includes a family of large language models (LLMs).
- Grammarly: an AI-powered writing assistant, working over other applications, that provides real-time feedback on grammar, spelling, punctuation, tone and clarity.
- Notebook LM: an AI-powered research and note-taking assistant by Google Labs. It’s “source-grounded,” meaning it primarily uses documents you provide to generate summaries, answer questions and brainstorm new ideas.
- Sora: a generative AI model by OpenAI that creates high-fidelity videos and synchronized audio from text prompts or images.