

Trend Update: How Do You Use Generative AI?
HLC invites the membership to tell us if your institution is proactively using artificial intelligence (AI). Share your experience using the form linked below.
What is AI?
There are different types of AI:
Machine AI (Narrow AI): The most common form of AI, designed to solve a single problem and execute a single task well.
Generative AI (General AI): This AI has human-level cognitive function across a wide variety of domains, such as language processing, image processing, computation and reasoning, and so on.
Super AI: Not yet achieved, this form of AI would surpass all human capabilities, including rational decision making, and could even extend to creating better art and building emotional relationships.
Member Experience
HLC notes that there may be positive and negative aspects of AI’s use at colleges and universities. The positive impacts include reduction in human error, 24/7 availability, help with repetitive jobs, task automation, and faster decisions. Are there other impacts that you have experienced?
Concerns related to using AI include job displacement, ethical concerns about bias and privacy, security risks, lack of human-like creativity or empathy, and students cheating. Are there other concerns on your campus?
Colleges and universities have various stakeholders, all of whom may have differing perspectives on the use of AI. How have trustees responded to the impact of AI on your campus? What have you heard from students about AI? What are faculty saying about the use of AI?
HLC’s Role Regarding AI
HLC’s expectations for the use of generative AI tools in all activities conducted on behalf of HLC are:
- Any use of generative AI tools must be consistent with all HLC policies for peer reviewers, including the Standards of Conduct.
- It is the responsibility of the persons using the tools to ensure that any resulting work product is of high quality, is accurate, and does not infringe on any third-party intellectual property rights.
- Peer reviewers may not input content provided by institutions (e.g., assurance arguments, supporting evidence, web text, or other institutional information) into any generative AI tool. Generative AI tools use content from users to "train" themselves, and any content entered can be unintentionally exposed elsewhere or even used by other entities as a result. Utilizing institutional information in this manner is not aligned with the confidentiality expectations in the Standards of Conduct.
Share Your Experiences and Ideas
HLC is also exploring ways its role can support colleges’ and universities’ use of AI, as well as ways HLC itself could use AI to help the membership. We welcome your ideas related to this growing area. Send your thoughts on AI to HLC.