Guardrails for the Ethical Use of AI in Early Childhood: Our Boulder, CO, Meeting Report

By Nina Kruschwitz

T9 Hackathon Early Childhood track judges Alyssa Ilov, Tom Yeh, Sam Hall, and Sabrina Whitfill. Photograph by Elliot Whitehead.

“Dad, why is Google smarter than you?” —the 9-year-old son of University of Colorado Boulder Professor Tom Yeh

No parent wants to have to answer that question, but more adults will be asked it as Artificial Intelligence (AI) is increasingly integrated into products that interact with children directly, and indirectly through devices, products, and apps designed for parents and professionals. Who determines the purposes and methods of AI is a question that affects children, caregivers, policymakers, researchers, and businesses.

The Headstarter Network (HSN), devoted to accelerating innovation for all early childhood education (ECE) practitioners, convened a meeting on February 7-8, 2020, to begin thinking about how to create “guardrails” for the ethical use of AI in childhood education. The need for such principles emerged at the HSN Chicago Catalyst meeting held in the spring of 2019. In response, a subset of EC educators, practitioners, and influencers formed the AIEC Guardians working group to develop a set of guidelines that could be shared with the wider HSN community.

Host: The Boulder Journey School

The AIEC Guardians gathered at the Boulder Journey School, whose current CEO, Sam Hall, was also the school’s first student more years ago than he cares to admit. This private preschool—an internationally recognized leader in EC and teacher education—follows the Reggio Emilia approach to early childhood education, honoring every child’s creativity, curiosity, and competency. The school’s contextual curriculum starts with questions, and engages both children and adults as learners.

The two-day meeting kicked off with a tour of the school while the children, aged 8 weeks to 5 years, were present in classrooms and shared spaces throughout the building. Community Outreach Specialist Alex Cruickshank explained each room’s design and purpose in relation to the school’s child-centered philosophy. The students’ “Charter of Rights,” a list of 61 statements written by the children, occupies a four-by-ten-foot panel on one wall and sets the tone for all the activities that occur in the building. (The hands-down favorite was “Pretend that there is a beach anywhere.”) Additional displays documenting specific projects covered the walls of every hallway in the building. One captured the children’s year-long exploration of “BB-8,” a tiny robot that—allaying some caregivers’ concerns—evoked the children’s empathy as they built it a home and created ways to share it with younger students.

Layne Hubbard and the robot puppet she developed to help children tell their stories.

Meeting Research and Design

The meeting participants prepared for their design task by reviewing the Ethical OS Toolkit and relevant standards released by UNICEF and the EU Commission on Ethics. They also responded to survey questions distributed by HSN (see Appendix B). In addition, they were addressed by John Havens, the Executive Director of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. IEEE, the international standard-setting body for the engineering profession, involved over 800 people in its deliberations about what constitutes ethical AI. Its proposed guidelines were then reviewed by thousands of people through the Creative Commons platform.

Havens emphasized the need for accountability and transparency in data handling and for diversity of input in constructing guidelines, and asked, “How will machines know what we value if we don’t know ourselves?”

Ably facilitated by CU Boulder Professor Tom Yeh and PhD candidate Layne Hubbard, the Guardians broke into small groups to consider different AI topics and build an understanding of how each would relate to ECE guidelines. They also brainstormed values and priorities that could eventually inform guidelines for evaluating AI products and services. The insights from these activities (see Appendices C and D) were then tested in situ, when they were used to review and judge the results of a student hackathon held on the CU Boulder campus. That partner event, during which students spent 24 hours imagining and creating an AI product for ECE, was initiated and organized by students in the Technology, Arts, and Media program at the university.

AIEC Guardians at work.

Brainstorm Discussion One  

The first brainstorm activity asked participants to adopt the mental model of one of four constituents—children, adults, educators, and investors or government policymakers—and consider what questions they would ask themselves or the developer of an AI application before inviting that technology into their home, school, organization, or life.  

The report-out from those discussions made the complexity of the topic visible. Taking the child’s point of view, one participant wondered whether AI could hurt her parents by sharing sensitive information about them. An educator questioned whether an AI product was destined to replace him in the classroom. A parent asked whether the technology could help compensate for her child’s special needs and develop his strengths. A policymaker pointed out that if there were an ethical lapse, it was unclear what organization would be held accountable.

One participant reported that, in real life, Alexa required her to change the pronunciation of her daughter’s name before it would respond to a request, a case of humans being forced to change their own behavior to accommodate technology. Issues concerning diversity and inclusion were raised at every turn.

AIEC Guardians at work.

Brainstorm Discussion Two 

For the second brainstorm round, breakout groups developed a set of questions that could be used to judge whether an application should be adopted by the ECE community. Each group was assigned one hypothetical product to spark their inquiry.

The first was a “Crying App” that collected recordings of babies’ cries to help parents or caregivers answer the question “Why is my child crying?” Dozens of feedback questions covered the design of the interface (does it ask the caregiver “What kind of cry is that?” or simply listen all the time?); its effect on users (does it strengthen or weaken the child-caregiver bond?); and the way data was collected, accessed, and shared.

An app that monitored an outdoor playspace via cameras was the second possibility. While such an app could help keep children safe and could flag patterns of behavior (e.g., a child who constantly self-isolates), questions about use, privacy violations, and data security abounded. The overarching question participants had was: Is this solving a big problem, or is it just another app?

Similar questions and concerns were raised by the third group, which, rather than working with a specific product, considered the generalized concept of a cognitive assessment app. The benefits were easily itemized (early identification of problems, an additional source of information), as were the concerns (how would the data be used, how long would it be kept, who would have access?). One significant outcome was a desire to use the information to provide a snapshot of present activity, rather than let it serve as a predictor or a way of assessing children’s potential.

T9 Hackathon team winners Liza Tolkin and Fiona Bell celebrate. Photograph by Elliot Whitehead.

Testing the Questions: T9 Hackathon

The Guardians had a chance to test some of their potential evaluation criteria at the conclusion of the CU Boulder students’ T9 Hackathon. Six teams had chosen the Early Childhood Education track and developed an ambitious range of products. The winner, dubbed “Brain Break” and chosen by a six-member judging team, was designed to support healthy digital habits by offering parents a way to interrupt content consumption and control their children’s screen time. The app, aimed at six- to eight-year-olds, used pop-ups at set intervals to suggest alternate activities the child could “take a break” and do; parents could discuss the options with their child before programming them into the app.

“The most challenging aspect of developing Brain Break was coming up with the concept,” student team member Fiona Bell commented afterwards. “We wanted to build something fun and creative that also addressed negative aspects of technology like mental and physical health. Once this was decided, we decided to focus on childhood education, because childhood is when skills are easily learned and habits are formed. If we teach children how to interact with technology in a healthy way, it will carry through their lives and truly make a difference.”

T9 Hackathon winning team. Photograph by Elliot Whitehead.

Conclusion

The need to involve a broader and more diverse group of participants in developing a set of guiding principles was a key takeaway from the AIEC Guardians gathering. At the end of their meeting, the group discussed both who needed to be involved in the ongoing conversations and how to engage them. They quickly decided that they wanted to involve non-U.S. ECE practitioners and thought leaders to ensure that the process would represent a more global and diverse perspective. A list of potential people and strategies was captured. Recommendations for next steps are forthcoming and will include conducting a survey of the wider ECE community; forming focus groups; and reaching out to influencers in the media.

Appendix A: Relevant Standards

Appendix B: Survey Response Summaries

Appendix C: Brainstorm Responses Round One

Appendix D: Brainstorm Responses Round Two

Appendix E: Boulder Journey School Children’s Charter of Rights

The AIEC Guardians meeting was sponsored by: 

PNC Grow Up Great 

Gary Community Investments

Buffett Early Childhood Fund
