When Automated Systems Attend Courses

Calling it an innovative approach for institutions to evaluate their courses, Ferris State University made a splashy announcement a few weeks ago that it intended to enrol two chatbots as “students” in its programs.

The unconventional concept appears to be partly a publicity gimmick to highlight the artificial intelligence major the university offers. Local TV news stations seized on the image of hybrid college courses in which T-shirt-wearing young people sit side by side with nonhuman classmates. However, the project raises ethical concerns as well as intriguing prospects for enhancing instruction with the newest AI technology.

Indeed, one might argue that the Michigan public university’s initiative ushers in a new era for “learning analytics.” Over the past decade or so, institutions have tried to use the digital breadcrumbs that students leave behind as they navigate digital platforms and online course materials to enhance course design and even tailor content for individual students.

“AI could afford us a novel way of seeing into something we haven’t seen into before,” says Kyle Bowen, deputy chief information officer at Arizona State University. “Now that we have something that mirrors a persona at a data level, we may have the idea of a data doppelganger.”

Put another way, rather than simply watching how students click, generative AI tools like ChatGPT let educators create simulated students representing various profiles, such as a first-generation student or one struggling in a particular subject, and observe what happens when those simulations encounter material in college courses.

“How can we adjust AI responses to better represent the needs of a first-year student or the diversity of our student body?” Bowen inquires, implying that doing so may provide those who create learning experiences with fresh perspectives.
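The persona idea Bowen describes amounts to conditioning a language model with a student profile before it sees any course material. A minimal sketch of what such a setup could look like follows; the profile fields, the prompt wording, and the sample persona are all invented for illustration and are not Ferris State’s or ASU’s actual implementation.

```python
# Hypothetical sketch: composing a "data doppelganger" system prompt for a
# simulated student persona. Field names and wording are assumptions.

def build_persona_prompt(profile: dict) -> str:
    """Compose a system prompt asking an LLM to role-play a student persona."""
    lines = [
        "You are role-playing a college student with the following profile:",
        f"- Background: {profile['background']}",
        f"- Strengths: {', '.join(profile['strengths'])}",
        f"- Struggles: {', '.join(profile['struggles'])}",
        "Respond to course material the way this student plausibly would,",
        "including asking questions when the material is confusing.",
    ]
    return "\n".join(lines)

# An invented example profile, like the first-generation student Bowen mentions.
first_gen = {
    "background": "first-generation student, working part-time",
    "strengths": ["persistence", "time management"],
    "struggles": ["statistics", "office-hours etiquette"],
}

prompt = build_persona_prompt(first_gen)
```

The resulting prompt would be sent as the system message to a chat model, with course material supplied as user messages, so designers can watch how different profiles respond to the same lesson.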

Although Arizona State hasn’t started accepting virtual students, it has made a significant commitment to using AI research to enhance its instruction. With the aim of “enhancing student success” and “streamlining organizational processes,” the university last month became the first higher education institution to partner with OpenAI, the company that created ChatGPT.

In an effort to comprehend student data more fully, other colleges are also putting significant effort into the newest AI. Following his resignation as president of Southern New Hampshire University late last year, Paul LeBlanc said he would head an initiative to transform college instruction at the school using ChatGPT and other AI technologies.

So what might generative AI do to enhance learning?

Making “Students” out of AI

Few specifics about Ferris State’s experiment have been made public as yet, and a representative for the school, Dave Murray, informed EdSurge that the chatbot students have not yet begun attending courses.

The bots are still being built, according to officials. The two chatbots are named Ann and Fry: the former after university librarian Ann Breitenwischer, the latter a nod to Kasey Thompson, one of the project’s leaders, who once worked at McDonald’s corporate headquarters. The bots’ personas were developed with the help of interviews with actual students.

The bots will reportedly have speech and voice-recognition capabilities, enabling them to join class discussions with real students and pose questions to instructors. The AI agents will submit assignments and draw information from course syllabi.

“The whole role of a university and college is evolving to meet the needs of how society is evolving,” Thompson, special assistant to the president for innovation and entrepreneurship at Ferris State, told a local news station. “And from Ann and Fry, we hope to get some insight into what that entails. How can we improve the student experience there?”

“The intention is to have them in classes this semester,” according to Murray.

Seth Brott, a sophomore information security major at Ferris State University, plans to give his robot classmates a warm welcome.

He says he was “excited” when one of his professors told him about the plan. “I’d love to watch how these bots perform in a classroom,” he remarks.

Brott says he has tried ChatGPT on a few homework assignments. The technology helped him brainstorm ideas for a public speaking class, he says, but it proved less useful when he was permitted to use it to suggest data-system security measures in an information security course.

Does he believe the chatbots can pass his courses, then?

He guesses that although the chatbots couldn’t do all that well right now, they can learn: they get feedback when they make a mistake, just as humans do. In time, he believes, the institution could train a chatbot student to succeed in the classroom.

He is excited that his university is attempting the groundbreaking endeavour, and he thinks it might push the institution to improve its instruction. For example, a friend recently told him about a course in which the class average was barely sixty percent at midterms. He saw that as a chance to send in a chatbot to test whether the instruction could be made clearer.

However, not every student is enthused. The approach at Ferris State raised some concerns for Johnny Chang, a graduate student at Stanford University who last summer organized a nationwide online seminar to urge more instructors to learn about and test AI.

According to Chang, “They should build tools to help administrators better talk to real students if the goal is to get feedback about the student experience.”

He is now pursuing a master’s degree in computer science with an emphasis on artificial intelligence. The risk in building chatbot students, he believes, is that depending on how they are trained, they may carry “inherent bias.” If the chatbot students are trained only on a certain type of student, Chang says, “the underrepresented student population might end up feeling unsupported.”

That doesn’t mean AI can’t contribute to a university’s improvement, though. He proposed that Ferris State administrators build a mechanism that prompts students to answer brief survey questions at various points in their learning process. AI could then sort, organize and synthesize all that data in ways that would have been too challenging with older technology.
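Chang’s suggestion boils down to a pipeline: collect short free-text survey responses, then sort and summarize them by topic so administrators can see where students are struggling. A production system would likely use an LLM for the clustering step; in the toy sketch below, simple keyword matching stands in for it, and the topic list and sample responses are invented for illustration.

```python
# Minimal sketch of sorting in-course survey feedback by topic.
# Keyword matching stands in for LLM-based clustering; topics are invented.
from collections import defaultdict

TOPICS = {
    "pacing": ["fast", "slow", "pace", "rushed"],
    "materials": ["slides", "textbook", "readings"],
    "assessment": ["exam", "quiz", "midterm", "grading"],
}

def summarize_responses(responses):
    """Bucket free-text survey responses under the first matching topic."""
    buckets = defaultdict(list)
    for text in responses:
        lowered = text.lower()
        topic = next(
            (name for name, words in TOPICS.items()
             if any(w in lowered for w in words)),
            "other",
        )
        buckets[topic].append(text)
    # Return counts so an administrator can see which topics dominate.
    return {topic: len(texts) for topic, texts in buckets.items()}

counts = summarize_responses([
    "The lectures feel rushed before the midterm.",
    "More practice readings would help.",
    "Grading rubric is unclear.",
])
```

Swapping the keyword lookup for an LLM call that labels each response would give the “copilot for administrators” behaviour Chang describes, while keeping the same collect-sort-summarize shape.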

“If the goal is to get insights from student behaviours,” Chang says, “these chatbots are good at analyzing and summarizing, almost like a copilot for administrators.”

Ferris State is open to experimenting with different strategies, according to Murray, the university’s spokesperson.

“We often discuss experiences with students and adapt our approach based on their input. This is an additional strategy,” he explains. “We want to explore the kinds of educational apps we can create. We’ll discover what works, as well as what needs improvement and what might not work at all.”

Constructing a “Syllabot”

According to Bowen, after a request to the community for suggestions on how to use ChatGPT, hundreds of professors and staff members are involved in more than 100 approved projects at Arizona State. The university hopes to have students take the lead on initiatives in the future.

He states, “We want a lot of experimentation to take place.”

He claims that one concept under consideration is a project they “jokingly call Syllabot.” The idea is this: Instead of a syllabus being a static document, what if students could ask questions about it?

If you have an assignment to do, such as a writing prompt, students could ask, “How might I approach it?” he explains.
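The Syllabot concept is essentially retrieval: match a student’s question against sections of the syllabus and surface the most relevant one. A toy sketch under that assumption follows; a real deployment would pair the retrieved passage with an LLM to phrase the answer, and the syllabus content here is made up.

```python
# Toy sketch of the "Syllabot" idea: retrieve the syllabus section whose
# key words best overlap the question. Syllabus entries are invented.

SYLLABUS = {
    "late work": "Late assignments lose 10% per day, up to three days.",
    "grading": "Grades: 40% projects, 30% exams, 30% participation.",
    "office hours": "Office hours are Tuesdays 2-4pm in Room 114.",
}

def ask_syllabot(question: str) -> str:
    """Return the syllabus entry whose section name overlaps the question most."""
    q_words = set(question.lower().replace("?", "").split())
    best = max(
        SYLLABUS,
        key=lambda section: len(set(section.split()) & q_words),
    )
    return SYLLABUS[best]

answer = ask_syllabot("What is the policy on late work?")
```

In practice the lookup would run over the full syllabus text (for example, with embeddings rather than word overlap), turning the static document into something students can interrogate, as Bowen describes.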

More broadly, he said, the university is developing a plan centred on “an AI platform for ASU that blends our data here.”

Once large language models can be integrated with analytical data unique to the institution, Bowen says, the key question will be: “How can it help us take action on that insight?”

Credit: Allschoolabs, EdSurge
