Regulations Hitting AI Usage in EdTech

Higher Ed Institutions, vendors, and investors should pay attention to current and future regulations

Was this forwarded to you by a friend? Sign up and get your own copy of the news that matters sent to your inbox every week. Sign up for the On EdTech newsletter. Interested in additional analysis? Try our 30-day free trial and upgrade to the On EdTech+ newsletter.

Late last week Van Davis wrote at the WCET blog about how AI usage at colleges and universities should be designed with regulations in mind, particularly Regular and Substantive Interaction (RSI).

As more and more faculty are experimenting with AI in their classes, institutions need to be increasingly careful that they are in compliance with federal regulations governing regular and substantive interaction (RSI) and Title IV financial aid eligibility.

After describing RSI regulatory language, Davis noted that the final 2020 regulations on RSI specifically called out technology usage such as AI [emphasis added].

Interaction with artificial intelligence, adaptive learning systems, or other forms of interactive computer-assisted instructional tools qualify as types of ‘academic engagement,’ but in this limited context those forms of engagement do not meet the statutory requirements for regular and substantive interaction between students and instructors… [T]he definition currently requires regular and substantive interaction between students and instructors; substantive interactions with machines or other forms of technology that do not involve an instructor would, therefore, not qualify.

Although the milestone release of ChatGPT as a commercial product was just 14 months ago, that does not mean that government regulatory efforts are waiting until we have clarity on real use cases or impacts. While the AI genie is out of the bottle, there are major efforts to fence it in, often in naive fashion (it’s hard to build a fence around something which is not understood).

Author via ChatGPT / DALL-E

US Executive Order

In late October 2023, President Biden issued an 80-page executive order on AI that does not constitute regulatory action, per se, but that directs government agencies to create regulations and guidance. The headlines from the fall centered on the item listed in the executive order’s fact sheet.

Shape AI’s potential to transform education by creating resources to support educators deploying AI-enabled educational tools, such as personalized tutoring in schools.

Resources to help educators? Hard to argue with that. That aim, however, is but one part of the broader regulatory agenda sought in the actual executive order.

There are several sections focused on helping educators and on reminding people that consumer protections will apply to AI-impacted industries, but for EdTech the biggest section is the following (Section 8(d)) [emphasis added].

To help ensure the responsible development and deployment of AI in the education sector, the Secretary of Education shall, within 365 days of the date of this order, develop resources, policies, and guidance regarding AI. These resources shall address safe, responsible, and nondiscriminatory uses of AI in education, including the impact AI systems have on vulnerable and underserved communities, and shall be developed in consultation with stakeholders as appropriate. They shall also include the development of an “AI toolkit” for education leaders implementing recommendations from the Department of Education’s AI and the Future of Teaching and Learning report, including appropriate human review of AI decisions, designing AI systems to enhance trust and safety and align with privacy-related laws and regulations in the educational context, and developing education-specific guardrails.

In other words, the executive order is directing the Secretary of Education to produce regulatory guidance by next fall - not actual regulations going through negotiated rulemaking or, heaven forbid, statutory changes - with nondiscrimination as the core element. Basically: be prepared, something is coming down the road.

The second piece is interesting as it essentially calls for implementing recommendations from this May 2023 report. That 71-page report was based on “constituent meetings” held in June and August 2022 - that is, several months before the true (or at least visible) dawning of generative AI. The report’s intro - which again will form the basis of much of the upcoming guidance - has a strangely outdated and limited definition of the new AI capabilities [emphasis added].

In late 2022 and early 2023, the public became aware of new generative AI chatbots and began to explore how AI could be used to write essays, create lesson plans, produce images, create personalized assignments for students, and more. From public expression in social media, at conferences, and in news media, the Department learned more about risks and benefits of AI-enabled chatbots. And yet this report will not focus on a specific AI tool, service, or announcement, because AI-enabled systems evolve rapidly.

Apparently AI-enabled systems change so quickly that in May 2023 ED was not aware that generative AI goes beyond chatbots.

The seven recommendations from the report:

  1. Emphasize Humans in the Loop

  2. Align AI Models to a Shared Vision for Education

  3. Design Using Modern Learning Principles

  4. Prioritize Strengthening Trust

  5. Inform and Involve Educators

  6. Focus R&D on Addressing Context and Enhancing Trust and Safety

  7. Develop Education-Specific Guidelines and Guardrails

Each recommendation includes a three- to ten-paragraph section with general discussion of the very general points. In fact, it is hard to call these recommendations at all rather than semi-organized aspirational sentiments that are hard to argue with. Viewed another way, however, the chain from the May 2023 report to the October 2023 executive order to the future 2024 guidance effectively represents a blank check to craft Dear Colleague Letters while saying, hey, I was told by the President to do this.

The one recommendation with a little bit of meat on the bones is the Shared Vision from #2. This gives a conceptual view of AI-based tools.

EU: Hold my Belgian Tripel

While the Biden Executive Order and the referenced AI report constitute a lot of pages pushing regulatory guidance on tools that do not yet have significant use cases, the European Union (EU) and the UK have gone further.

The EU’s AI Act, first proposed in 2021, reached a milestone in December 2023 when the Parliament and Council came to agreement on its terms. The AI Act is now headed toward formal committee votes and formal adoption before becoming law.

Unlike the US executive order, the EU’s AI Act has more specifics based on a risk pyramid, as shown below.

Source: Telefónica augmentation of AI Act risk pyramid

There are specific banned uses, including social scoring and harmful manipulative 'subliminal techniques', and there are different levels of risk.

What is important for the vendor and investment community to understand is the scope of the regulations. As described by Telefónica:

The regulation will be applicable to all uses of AI affecting EU citizens, no matter where the service provider is based, or where the system developed or being run, within or outside EU boundaries. This is also the case of other EU regulations, such as the General Data Protection Regulation (GDPR), and other legislative proposals, such as the DMA (Digital Markets Act) and the DSA (Digital Services Act).

GDPR, DMA, and DSA represent a family of regulatory actions that each had surprising (and confusing) scope and a host of unintended consequences. The AI Act will likely have strong similarities.

Expect Actions in 2024

I don’t pretend to be able to describe all regulatory activities around AI that could impact EdTech. The bigger issue is for readers to be aware that 2024 is shaping up to be a banner year in the introduction of new laws and regulations and regulatory guidance around AI, and that these actions will impact education. And as the WCET post has shown, there are already regulatory limitations that the EdTech community should understand.

Stay tuned.

The main On EdTech newsletter is free to share in part or in whole. All we ask is attribution.

Thanks for being a subscriber.