AI in Higher Ed: Q&A with Industry Experts
Recently, Signal Vine hosted a webinar featuring industry experts to discuss AI in higher ed. The panel consisted of Dr. Diana Oblinger, President Emeritus of EDUCAUSE; Matthew Etchison, CIO of Ivy Tech Community College; and Brian Kathman, CEO of Signal Vine. They discussed how AI presents an opportunity to improve personalization in communication, teaching, and the overall student experience. The webinar can be viewed here.
During Q&A, we had so many great questions come in from our audience that we could not get to them all. As a result, following the webinar, we hosted a Q&A session with our panelists. Read on to hear from these industry leaders their answers to some of the questions we received.
Q: How did you manage data security as you integrated AI systems into your various systems used in your student lifecycle at Ivy Tech? (e.g., recruitment, enrollment, billing systems)
We partnered with Cisco to drill down and focus on security measures. Then, we secured, validated, and cleaned up our student data, ensuring it was compliant with both state and federal regulations. All 180 of our IT team members work to ensure that our data is secure.
Q: All three of you have assumed that technology maps onto education in the same way that it does in for-profit industries. Education is fundamentally different. How do you address that?
Education can be a different realm from for-profit industries, but in some respects, it isn’t. There are areas, such as accounting or facilities, where you’d find similarities with other industries. More significant differences would be found in teaching, learning, and research. The keys are determining what you want to achieve and which tools will help you achieve those goals. I noted that we can save time and talent for what matters most by using AI. I think preserving time and talent is critical in education. What if we could save a staff member 100 hours of work? What if we could save a student 1,000 hours of unnecessary effort? Perhaps students would find more time to develop competencies needed for tomorrow’s professions. Perhaps they would have more time to focus on personal wellness or giving back to the community.
To echo Dr. Oblinger's comments, we’re seeing an evolution of technologies and skills versus traditional degree paths, as software and tech are evolving every second of the day. This is true for higher ed. For example, at Ivy Tech, we teach Salesforce skills because there is clearly a demand in the market for these skills, yet no one else in the country teaches them. The key is being able to rapidly evolve to meet market demands. Many roles in and out of higher ed involve evolving technology, and people’s skills will also need to evolve to keep up with this demand.
Q: What enabling technologies did you have to have in place to fully leverage AI at Ivy Tech? (e.g. API environment, cloud strategies, etc.)
At Ivy Tech, we’re focusing on a cloud strategy. The days of rack-and-stack servers and the air-conditioned server room are over. We are partnering with other tech companies to leverage AI and improve our student services. The last decade in IT has largely focused on these types of partnerships. We scale and evolve with these technologies and partners as well. During slow periods, we can scale back our technology use and save money. This has helped us cut costs while also improving the student experience.
Q: How has the chatbot been received by students and other users? Do they perceive the tech to be impersonal? Useful? Annoying? Does it depend on use cases and types of users?
It’s natural for people today to turn to technology for things they want. Sometimes we prefer a chatbot to human interaction. At Wayne State University, for example, they found that chatbots were valuable, particularly for first-generation students who were reluctant to ask an “obvious” question of a person but had no hesitation asking the same question of a chatbot.
Chatbots are an enhancement to the human experience when students need information but an advisor may not be available. FAQs are an excellent use case for chatbots, letting students get immediate answers to their simpler or frequently asked questions. Much as the ATM and PayPal automated many interactions that previously required a bank teller, chatbots are alleviating some tasks and automating some interactions for higher ed staff.
Today’s students not only know how to use social media better than most of us, but they can also identify chatbots from miles away. Our view at Signal Vine, as we’ve demonstrated through our Virtual Advisor, is that AI and chatbots should be incorporated into the natural flow of conversations between students and staff members. The chatbot should have the voice of the staff and play a supporting role to them when appropriate. This is definitely dependent on the use case. For example, a chatbot may do just fine answering a question about a deadline, but it might struggle with a question of a more sensitive nature.
Overall, the reception of chatbot functionality is positive because students feel they’re getting attention from a staff member. The key is ensuring the chatbot is reserved for just the right queries, and that a staff member is available when students need a human’s help.
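The use-case split described above can be sketched in a few lines. The following Python routine is a hypothetical illustration, not Signal Vine's actual implementation; the keyword list and FAQ answers are invented for the example:

```python
# Hypothetical sketch: answer routine FAQ-style questions with a chatbot
# and escalate anything sensitive to a human staff member.

SENSITIVE_KEYWORDS = {"stress", "anxiety", "drop out", "emergency"}

FAQ_ANSWERS = {
    "deadline": "The FAFSA priority deadline is posted on the financial aid page.",
    "registration": "Registration opens two weeks before each term; see the academic calendar.",
}

def route_message(message: str) -> str:
    """Return a canned answer for routine questions, or flag for a human."""
    text = message.lower()
    # Sensitive topics always go to a person.
    if any(keyword in text for keyword in SENSITIVE_KEYWORDS):
        return "ESCALATE_TO_STAFF"
    # Simple keyword match against the FAQ list.
    for topic, answer in FAQ_ANSWERS.items():
        if topic in text:
            return answer
    # When unsure, prefer a human over a wrong bot answer.
    return "ESCALATE_TO_STAFF"
```

A real system would use intent classification rather than keyword matching, but the design choice is the same: the bot handles only queries it can answer confidently, and everything else reaches a staff member.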
Q: How do you address the fundamental “Garbage in, Garbage Out” (GIGO) problem with AI? How do you verify/confirm that the data being fed into AI is good or valid?
This is where the partnering piece and the human element come in. We have engineers and functional users in every area of the college. We can take data to them and ask: this is what the data says, but does it make sense? What do we disregard, and what can we use? Humans know their business better than anyone. They apply their human element, business intelligence, to analyze that data and sort it as appropriate.
Today’s smart machines and systems develop new knowledge by feeding on data. Compiling large datasets is a prerequisite to the use of AI. Poor data or insufficient data will result in faulty conclusions or decisions. Considering how prevalent AI and analytics have already become, future professionals will very likely need to know how to gather and analyze large datasets as well as how to interpret the results. Data is a critical element of virtually all professions. We should ask questions such as:
- What place does data have in our courses?
- Do students have the appropriate mix of mathematics, statistics, and coding to understand how data is manipulated and how algorithms work?
- Should students be required to become “data literate” (i.e., able to effectively use and critically evaluate data and its sources)?
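To make the GIGO point concrete, here is a minimal Python sketch of the kind of validation that might run on student records before they feed an AI system. The field names and valid ranges are assumptions for illustration, not Ivy Tech's actual schema:

```python
# Hypothetical data-quality checks on student records. Records that fail
# any check are routed to a human for review instead of feeding the model.

def validate_record(record: dict) -> list:
    """Return a list of problems found; an empty list means the record is usable."""
    problems = []
    if not record.get("student_id"):
        problems.append("missing student_id")
    gpa = record.get("gpa")
    if gpa is None or not (0.0 <= gpa <= 4.0):
        problems.append("gpa out of range")
    if "@" not in record.get("email", ""):
        problems.append("invalid email")
    return problems

def split_clean_and_flagged(records):
    """Separate usable records from those needing human review."""
    clean, flagged = [], []
    for record in records:
        (clean if not validate_record(record) else flagged).append(record)
    return clean, flagged
```

The flagged pile is where the human element from the answer above comes in: functional users decide whether a failing record is bad data to discard or a real case the rules missed.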
Q: Can you distinguish between the benefits and challenges of implementing AI to improve ops and administration in higher ed versus teaching AI to students in all disciplines?
I suppose the common thread is understanding the problem you want to solve and determining if AI is the best tool to use. Chatbots are used to respond to student questions—for admissions or registration, for example. Chatbots can be used as English tutors, as well.
Some really sophisticated applications of AI involve learning and assessment—a more complex application than robotic process automation of invoices, for example. For example, we can obtain more holistic information about a learner by creating learning and testing opportunities through interactions in games, simulations or virtual reality. A learner can fly the plane, operate on the patient, drive the car, learn and be tested all in one. This generates lots of rich data. Computational models are fit to that data in order to provide feedback to the learner and to feed the learner forward to the next learning opportunity. (See this article for more information.)
Perhaps our emphasis should be less on the fact that machines can do things differently and more on how people can benefit from the outcomes. Society benefits from the sharing of expertise—whether it is shared via a person or a chatbot. While non-thinking, high-performing machines do not necessarily operate the same way humans do, they can help us in many ways.
Great question. The tools and technologies of AI are rapidly emerging and the workers of today and tomorrow have to understand the new normal is that everything, or nearly everything, is dynamic. Technology is growing exponentially (e.g. Moore’s Law) with no end in sight.
As a technologist, engineer, and executive working in higher ed, I think it is absolutely critical that everyone being educated in the modern world have a foundational knowledge of AI, information technology, and software development. The goal is not to make everyone an engineer or have everyone work in the IT department. The goal is for everyone to understand, at a foundational level, the technology they use and consume, from their smartphone to the full range of applications needed to live in the modern era.
Q: A lot of institutions are using Azure and AWS for the LMS and university data. And they have their own automation tools, such as Flow, SharePoint, and Q&A bots. How do you integrate external resources with them?
We use ETL tools and Snowflake for our backend database for our Data Warehouse.
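As a rough illustration of what an ETL step into a warehouse involves, here is a minimal Python sketch. The sample data and column names are invented, and a real pipeline would load into Snowflake through its connector rather than an in-memory list:

```python
# Minimal extract-transform-load (ETL) sketch: read raw CSV, normalize
# the messy headers and values, then "load" into a stand-in warehouse.

import csv
import io

RAW_CSV = """student_id, Email ,term
 1001 ,ALEX@IVYTECH.EDU,Fall 2019
1002,pat@ivytech.edu , Fall 2019
"""

def extract(text: str) -> list:
    """Parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list) -> list:
    """Strip whitespace from headers and values; lowercase emails."""
    cleaned = []
    for row in rows:
        record = {key.strip().lower(): value.strip() for key, value in row.items()}
        record["email"] = record["email"].lower()
        cleaned.append(record)
    return cleaned

def load(rows: list, warehouse: list) -> None:
    """Append cleaned rows to the target store (a list standing in for a table)."""
    warehouse.extend(rows)
```

The transform step is where the cleanup described earlier happens; by the time rows reach the warehouse, downstream tools all see one consistent schema.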
Q: Are you able to pull more information/data from students through the mobile app you’ve created at Ivy Tech? And is the app integrated with Salesforce and Signal Vine?
We are definitely using data analytics on our Mobile App platform as that is key in making data-driven decisions in sync with our Strategic Plan. The app just launched in August and we have partner integrations with Salesforce and Signal Vine on our 2020 mobile roadmap.
Q: Have you seen any instances of AI replacing people’s jobs at Ivy Tech?
We’ve seen no evidence of AI replacing jobs. In fact, AI and automation can handle routine and repetitive tasks so people are freed up to do more challenging work involving creativity and critical thinking. That type of work is much more liberating, motivating, and rewarding than routine, mundane tasks.
Q: In the beginning, Dr. Oblinger mentioned that AI would change the substance and the delivery of higher education. Tell us what you see as changing about the substance of higher ed.
By changing the substance of higher education I don’t mean that everyone needs to take a course in AI. But there are things we should consider. Let me give you three quick illustrations.
First, since data is essential for AI, no matter the discipline, everyone will need a new level of data literacy. What place does data have in our courses?
Second, since AI enables new ways of working we will need to understand how to integrate human expertise with machines. This will require us to constantly upskill and reskill ourselves since the technology will always be changing.
Finally, there are ethical considerations. Smart machines do not have moral reasoning capability so humans must always be mindful of the implications of AI use. And, as always, we must remember that just because you can do something doesn’t mean you should.
Q: What are some ways you’ve [Brian] seen institutions promote buy-in and get past the pushback of using AI at their institutions?
Often, conversations about AI can start with skepticism and even fear in some cases. We’ve seen many of our customers start by showing a specific example of AI in action. For example, some of our customers will use our Virtual Advisor, which uses AI, to demystify the experience. By showing the question set-up and the student response, their colleagues can see the benefit while knowing this supports, rather than replaces, their work.
Ready to learn more?
Request a demo to speak one-on-one with a Signal Vine team member.