Kenrick Cabrera, CTO, Oxford International Education Group (OIEG)

Kenrick Cabrera joined Oxford International Education Group in January 2022 as Chief Technology Officer. He brings over 20 years of experience in delivering business transformation across a range of sectors and organisations to the Executive Leadership team. With a proven track record of using technology to drive business development and growth, Kenrick offers a unique perspective on how the latest technology can help students. His previous roles include Programme Delivery & Assurance Lead at PwC, where he ran several technology and regulatory change programmes; Head of IT Projects and Support at Hyde Housing Association Ltd; and Head of IT Operations at Markerstudy Group.


Artificial Intelligence (AI) seems poised to completely change our sector. Broader public interest in the technology cannot be overstated: the launch of ChatGPT in November 2022 catapulted AI into the mainstream, and related internet searches reached an all-time high of 42 million in May 2023. However, AI is not, in itself, brand new. From biometric scanners in airports to ‘Alexa’ devices in homes, versions of modern artificial intelligence have existed for some time.

So, what has changed? Our answer: the level of human engagement with AI technology and its increasingly mainstream accessibility. While AI quietly whirred away in our home devices, films and television shows portrayed artificial intelligence as the harbinger of the apocalypse. Now, however, generative AI programs are suddenly within easy reach of anyone who looks for them. Where we once asked Siri for facts or answers, we now work with ChatGPT to perform simple tasks. The reaction to this, both publicly and in the higher education sector, has been mixed.

This technology will drastically alter the student experience and the way in which higher education is delivered – from concerns about plagiarism to virtual teaching assistants and accelerated recruitment. It is crucial that we move beyond panicking about the challenges AI presents to our sector and work to understand how this technology can be used to our benefit. There will be no escape from AI in the years to come. Educational organisations must be at the forefront of this technological revolution to ensure AI is used and regulated in a way that enhances the higher education sector and society.

How do you win the AI race?

For AI to have a positive impact on the services provided by educational groups and on the student experience, organisations need to understand the technology at the deepest level possible. Belated training sessions and circular discussions about the implications of AI are not enough. Organisations must engage highly skilled and experienced individuals who can understand developments as they happen, rather than after the fact.

Interacting meaningfully with experts in the field will also go a long way towards developing the necessary ethical frameworks. Universities have expressed concerns that the introduction of this newly advanced technology will damage their reputations and complicate compliance procedures. To mitigate these concerns, partner organisations must prove that they can adopt the technology in a safe and sustainable manner. Given the trust placed in them by higher education institutions, partner companies such as Oxford International are uniquely positioned to catalyse the introduction of AI at universities.

The challenges:

The seismic shift caused by the introduction of such ‘new’ AI is comparable to the launch of the Internet. These advancements will change our world irreversibly and, as at the dawn of the Internet, organisations and individuals can either embrace the change and work with it or risk fading into irrelevance. The introduction of generative AI raises challenges around regulation and safety similar to those raised by the World Wide Web, and, as a general rule, young people continue to be much more in tune with new technology than their parents, guardians, and teachers.

For the higher education sector, plagiarism remains a core concern – from worries about academic integrity to reports of university students being falsely accused of using ChatGPT for coursework.

The industry remains divided on best practice here. However, the absence of a universally agreed approach opens the door for increased collaboration and innovation across the higher education sector and beyond. For example, OIEG recently rebuilt our English Language Level Test (ELLT) platform to incorporate a more robust defence against AI-generated content and plagiarism. To do so, OIEG partnered with CopyLeaks, incorporating their technology to flag content that students may have obtained through a generative AI platform. The new ELLT also utilises AI to monitor eye and head movements associated with cheating, an approach known as automated ‘proctoring’. In fact, AI provides opportunities for far more sophisticated plagiarism monitoring than was previously available.
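To make the shape of such an integration concrete, here is a minimal, purely illustrative sketch in Python. The detection endpoint, response field, and flagging threshold are hypothetical assumptions for the example; this is not CopyLeaks’ actual API, nor a description of how the ELLT platform is really implemented.

```python
# Illustrative sketch only: the endpoint URL, field names, and threshold are
# hypothetical placeholders, not CopyLeaks' API or OIEG's real implementation.
import requests

DETECTION_ENDPOINT = "https://example.com/api/ai-content-detection"  # placeholder URL
FLAG_THRESHOLD = 0.8  # assumed cut-off above which a submission is sent for human review


def flag_suspect_submission(submission_id: str, text: str, api_key: str) -> bool:
    """Send a candidate's written answer to an AI-content detector and
    return True if it should be routed to a human reviewer."""
    response = requests.post(
        DETECTION_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"id": submission_id, "text": text},
        timeout=30,
    )
    response.raise_for_status()
    score = response.json()["ai_probability"]  # assumed field: likelihood text is AI-generated
    return score >= FLAG_THRESHOLD
```

Whatever the detector, the important design choice in a workflow like this is that automated scoring only narrows the pool: a flagged submission is referred to a human assessor rather than rejected outright.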

The opportunities:

The most common use case for AI in the higher education sector is streamlining administrative processes such as recruitment, student performance tracking, and accounting. However, the technology also presents the opportunity for a vastly enriched student experience, both during the admissions process and at the university itself. Advanced chatbots, like ChatGPT, can be used to improve the interface between students and educational providers. Integrating AI in this way will reduce response times, alleviating frustrations and anxieties during the admissions period.
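As a hedged illustration of what such an interface might look like under the hood, the sketch below wires a general-purpose chat model into a simple admissions-enquiry helper. The model name, system prompt, and helper function are assumptions made for the example and do not describe any provider’s live system; only the OpenAI Python client calls themselves are real.

```python
# Minimal sketch of an admissions-enquiry assistant built on a general-purpose
# chat model. Model choice and prompt wording are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable


def answer_admissions_query(question: str) -> str:
    """Return a draft answer to a prospective student's question,
    intended to be reviewed by admissions staff before it is sent."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "You are an admissions assistant. Answer questions about "
                    "application deadlines, English language requirements, and "
                    "visa documentation clearly and concisely."
                ),
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content


print(answer_admissions_query("Which English language test scores do you accept?"))
```

The value of this pattern is speed of first response; keeping staff in the loop to check drafted answers is what keeps it safe to deploy during a stressful admissions period.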

Once at university, students will begin to expect systems that have integrated AI. Benefits of such systems range from accelerated interactive learning and increased inclusivity for students with disabilities (including mobility and speech impairments), to a tailored student interface. Increasingly, students will begin to consider the level of access to AI technology when making their university plans, including studying abroad. Communicating universities’ AI offerings will, therefore, become an essential part of the recruitment process.

Staying ahead of the curve:

Universities and educational organisations that do not grasp AI technology with both hands are at significant risk of fading into irrelevance. As is so often the case with new, groundbreaking technology, students are likely to be ahead of the curve. This means educational institutions must be prepared to upskill quickly to reap the benefits and continue to attract leading talent.

Right now, the industry is apprehensive about these technologies. We are at a tipping point in integrating this new technology, and I would urge others in the industry to consider the opportunities it presents as well as the challenges. We are not on the verge of being plugged into The Matrix – though we will have to ensure that students are not farming essays out to machines. AI is not only the problem but also the solution to this and other issues facing our sector. We must treat AI as an operational, ethical, and moral challenge, as well as a technological one. Only then will we be able to fully benefit from it.
