AI Funding Boost

The Faculty of Humanities at The University of Manchester has been awarded £2.73 million.

The funding will be used to enhance the faculty’s research and teaching capabilities over the next five years in the critical areas of AI, trust and society.

The £2.73 million comes from The University of Manchester’s Strategic Investment Reserve Fund (SIRF) and is being matched by £2 million from the Faculty of Humanities.

The investment will go towards appointing an interdisciplinary team of six academics at senior lecturer or lecturer level, six post-doctoral research associates and six PhD students. They will form a cross-cutting research cluster with the Centre for Digital Trust and Society (CDTS) at the University.

The investment will also leverage further research and industry funding, and help develop new teaching and executive education programmes, strengthening the University’s capability in ethical and responsible AI.

Professor Fiona Devine, Vice-President and Dean of the Faculty of Humanities, said: “I am absolutely delighted that the Faculty has been successful in securing this funding to significantly expand and enhance our research and teaching capabilities in this emerging field. The investment is designed to retain our status as a UK leader in cyber security and responsible AI research and teaching.”

Step-change

Richard Allmendinger, Professor of Applied Artificial Intelligence at Alliance Manchester Business School (AMBS) and Faculty Associate Dean for Business Engagement, Civic and Cultural Partnerships, said: “This investment comes at a critical juncture and gives the Faculty of Humanities a critical mass in social science-led approaches to AI, which will enable us to maximise external research funding opportunities.

“The demand from industry is clear. International partners wish to collaborate on issues of AI governance and responsible AI, as do various strategic partners. As a city region, Manchester also has the largest AI innovation cluster outside London by number of jobs.”

Professor Nick Lord, Director of the CDTS and Professor of Criminology in the School of Social Sciences, added: “AI is already having a profound effect on society and will continue to do so, and that means it will impact everything we do as a University too. To mitigate the risks and realise the benefits of AI technologies, we must consider the social, environmental and economic contexts they will operate in, and the consequences of their deployment.

“There is an urgent need to drive approaches to AI that are secure, safe, reliable and trustworthy. It is also vital that they operate in a way that enables us to understand and investigate when they fail.”

New collaborations

Enhancing the Faculty of Humanities’ research power in AI trust and security will also catalyse new collaborations with the Faculty of Biology, Medicine and Health at the University, most notably with the Christabel Pankhurst Institute for health technology research and innovation.

Professor Devine added: “The complexity and growing volume of data in healthcare mean that AI will increasingly be applied within the field, and it has the potential to speed up diagnostics and make healthcare operations more efficient.

“Humanities research has much to contribute to this truly interdisciplinary agenda, and this investment will establish the University of Manchester as a leader in ethical, accessible, inclusive and responsible AI. It aligns not only with our commitment to cutting-edge research and innovation but also with our commitment to social responsibility.”

Research cluster

The AI Trust and Security team will form a cross-cutting research cluster within the CDTS. The new initiative follows the recent announcement that the University of Manchester was awarded the status of Academic Centre of Excellence for Cyber Security Research by the National Cyber Security Centre and the Engineering and Physical Sciences Research Council.

The Centre is distinctive as it is the only cyber and digital security and trust research centre in the UK led from social science, rather than computer science or engineering.
