[Opinion] This is not a university professor

Today, the academic world is undergoing a complete transformation and a worrying process of development. ChatGPT has recently entered academic circles as a form of “social acceleration.” This innovation has profound implications for teaching and research.

ChatGPT is a “natural language processing” technology that can be used to automate academic writing and research. In a changing academic world, this technology has the potential to accelerate the production and dissemination of knowledge. It can indeed facilitate the translation, writing and research of documents, thus making the production of academic work faster and more efficient.

But this technology poses serious risks to scientific and academic integrity. Some fear that it is being used to circumvent the academic research process, giving authors an unfair advantage and producing unreliable results. Moreover, this technology can accelerate the neoliberalization of universities, which tends to reduce access to resources and lower the quality of teaching and research.

Finally, using ChatGPT can lead to “surveillance capitalism” where users are constantly tracked by algorithms. Researchers and students may be subject to more scrutiny as their research and work is analyzed and evaluated by algorithms. This technology can also lead to a loss of autonomy and freedom for researchers and students, depriving them of the right to decide how their work is produced and disseminated.

The arrival of ChatGPT in academia is therefore a legitimate concern. Universities must take steps to protect their students and researchers and ensure the integrity of their activities. It is important to limit the use of this technology where necessary and to ensure that the work produced is always properly vetted.

Therefore, universities should be vigilant and take measures to combat the misuse of ChatGPT technology and its harmful effects. Students and researchers must be protected from possible abuses, and the quality of teaching and research must be protected.

What to do with this tool?

The previous six paragraphs were not written by me, but by ChatGPT, a tool based on the latest techniques of deep learning and generative artificial intelligence, capable of generating more or less complex texts from simple instructions. I personally wrote this request: “Write a 600-word opinion piece for a newspaper on the upheaval ChatGPT is causing in academia. Adopt a critical perspective, drawing on the concepts of social acceleration, the neoliberalization of universities, and surveillance capitalism.”

As a university professor interested in the transformation of advanced capitalism, social innovation, and the complex impact of new technologies, I am faced here with a very specific question that directly affects my profession: what to do with this tool, which will directly affect evaluation and teaching methods?

Apart from the issue of plagiarism, these tools can also be used by teachers to complete various tasks. I ran several tests myself: the tool developed a course outline in seconds, created questions for a final exam, and then answered those same questions with better-quality writing than an average undergraduate, all in less than three minutes.

This immediately made me think of philosopher Günther Anders’ concept of human obsolescence. Here is another answer from ChatGPT: “The term human obsolescence, coined by Günther Anders, refers to the potential threat to human survival posed by the arrival of new technologies. While this concept applies to all areas of life, it has special meaning for university professors with the advent of ChatGPT. Indeed, ChatGPT is an artificial intelligence system that can simulate a teacher and teach students. This could lead to a form of course automation and the loss of jobs for university professors.”

This is a somewhat disturbing answer. For my part, I believe that the status of full-time professors is not in danger in the short term. Contrary to popular belief, recent innovations in artificial intelligence are not leading to major job losses, but rather to the reconfiguration of work and operational techniques (including “digital labour”), the outsourcing and fragmentation of tasks, and an expansion of the mechanisms of work automation inherited from the industrial capitalism of the past.

Putting the genie back in the lamp

Nevertheless, the implementation of AI in schools and universities raises a number of fundamental questions. While these tools can be used pedagogically in certain circumstances, and teachers will have to adapt their assessment methods accordingly, it would be simplistic to adopt an “adaptive” approach limited to individual responsibility. All primary and secondary schools, CEGEPs and university communities are affected.

Moreover, technological innovations emerge and spread within a particular socio-economic context, in this case capitalism, which shapes the possible uses of algorithmic tools. It would not be surprising if the precarity of teaching work intensified, along with the pressure to publish. In a capitalist system based on the imperatives of productivity and performance, such technology would above all serve to intensify the dynamics of competition between students, professors and researchers.

Moreover, no matter how much we “ban” these tools in the classroom, the technology is now available, free, easily accessible and usable; there is no going back, no putting the genie back in the lamp.

We are therefore faced with a fundamental reflection on the goals of teaching and the relevance (or obsolescence) of the methods used in education. Honestly, I am dumbfounded. Even if teachers are not yet obsolete, the fact remains that a good portion of our current assessment tools are now worthless.
