Keynote Speaker: Professor Phillip Morgan (in person presentation)

Prof Phillip Morgan BSc DipRes PhD PGCHE FHEA AFALT AFBPS 

Phillip holds a Personal Chair (and is a Senior Professor) in Human Factors and Cognitive Science within the School of Psychology at Cardiff University. He is Director of the Human Factors Excellence Research Group (HuFEx) and Director of Research for the Centre for AI, Robotics and Human-Machine Systems (IROHMS). He is an international expert in Cyberpsychology, Transport Psychology, Humans in Automation and AI, Human-Machine Interface Design, Human-Computer Interaction, and Adaptive Cognition. He has been awarded >£25 million (UK) in funding (>£14M direct) across >45 funded grants from, e.g., Airbus, CREST, DHC-STC, ERDF, EPSRC, ESRC, GoS, HSSRC, IUK, MoD, NCSC, RAEng, SOS Alarm, and the Wellcome Trust, and has published >140 major papers, conference outputs, and significant reports for, e.g., government and industry. Phil currently works on large-scale projects funded by Airbus, where he was seconded for 3.5 years (March 2019-August 2022, 80% FTE) as Technical Lead in Cyberpsychology and Human Factors, and is Head of the Airbus Accelerator in Human-Centric Cyber Security (H2CS). Recently, Phil became Director of a new Airbus Centre of Excellence in Human-Centric Cyber Security at Cardiff University and is one of two Academic Leads for a Strategic Partnership between Airbus and Cardiff University.

Phil is UK PI on an ESRC-JST project (2020-24) (with collaborators at, e.g., the Universities of Kyoto and Osaka) on the Rule of Law in the Age of AI and autonomous systems, with a key focus on blame assignment and trust in autonomous vehicles – exploring Human-Robot Interaction and Explainable AI (XAI) as core interventions. He also currently works as a senior academic on two HSSRC (UK MOD / Dstl / BAE Systems) projects examining HF guidelines for autonomous systems and robots (with QinetiQ & BMT Defence) and complex sociotechnical systems (with Trimetis). He also works on two projects funded by the NCSC focussed on the effects of interruptions on cyber security behaviours, and recently completed a project on XAI funded by Airbus Defence and Space.

Phil oversees the IROHMS Simulation Laboratory (250+ sqm, >£700k invested over 3 years), based within the School of Psychology at Cardiff University, which currently comprises five state-of-the-art zones: an immersive dome; a transport simulator; cognitive robotics; VR/AR; and a command and control centre.

Phil is also Visiting Professor at Luleå University of Technology – Psychology, Division of Health, Medicine & Rehabilitation, Sweden – where his key role is to lead Human Factors collaborations between the Universities of Luleå and Cardiff (and others).

Phil played a key role on two large-scale, multi-partner Innovate UK funded autonomous vehicle projects: Venturer Autonomous Vehicles for UK Roads (2015-2018) and Flourish Trusted Secure Mobility, the latter focussed on citizen population sectors who may benefit most from CAVs – including people with disabilities, cognitive impairments, and sight/hearing impairments. Phil was a Human Factors lead on both projects, focussing on, e.g., handover, HMI design, HCI, trust, adoption, and running participant workshops.

Phil is particularly interested in collaborating on research projects focussed on human interaction with autonomous systems (e.g. self-driving vehicles) linked to the following topics: safety, trust, cyber security, privacy, adoption, blame, responsibility (law, moral, causal), and regulation. Phil is also very interested in collaborating on research projects focussed on human aspects of cyber security (e.g. identifying and mitigating risks) within multiple application areas, including self-driving vehicles. He is also interested in adaptive cognition and how it can be applied to optimise human-machine interaction and human-computer interaction.

Some areas of expertise:

Transport & Intelligent Mobility

Prof Morgan is an international expert on Transportation Human Factors with a thriving 10-year portfolio of research projects (>10 grants, >£13M funding), including: Venturer Autonomous Vehicles for UK Roads – IUK, 2015-18 (focussed on Level 3 partially autonomous vehicles – e.g. human trust during/after handover, as well as during complex interactions with other road users: drivers, pedestrians, cyclists); Flourish Trusted Secure Mobility – IUK, 2016-19 (focussed on Levels 4 and 5 – e.g. design, testing, development and deployment of accessible, usable, functional, adaptable, safe, secure, and trusted human-machine interfaces for connected autonomous vehicles); and Rule of Law in the Age of AI: Principles of Distributive Liability for Multi-Agent Societies – ESRC-JST, 2020-23 (focussed on safety, trust, security, privacy, adoption, blame, responsibility (law, moral, causal), and regulation – e.g. in the event of 'hypothetical' incidents and accidents – an international collaboration with the Universities of Kyoto, Osaka and Doshisha).

Cyber Security

Prof Morgan has been leading human-centric cyber security research for almost 10 years. Notable projects include Airbus Cyber Security and Human Factors (5 Airbus-funded projects, 2 funded by ESRC and EPSRC, total value ~£1.5M); following a 3.5-year major (80% FTE) industry secondment, he is now Director of a new and best-in-class Airbus and CU Centre of Excellence in Human-Centric Cyber Security (ACE-H2CS) and one of two Academic Leads for a Strategic Partnership between Airbus and CU. Whilst much of this research has been part-embargoed (until recently), Prof Morgan and colleagues can report that they have developed leading-edge tools that can account for up to 65% of the factors (e.g. individual differences, organisational, technical) that lead to risky human cyber security behaviours, along with methods (technical, training etc.) to mitigate human vulnerabilities. They have also conducted research into factors often beyond the control of individuals (including task interruption, distraction, workload, and time pressure) that can exacerbate risk-taking behaviours in the context of cyber security. Other cyber security projects (e.g. NCSC-funded, 2019-present, with Dr Morey and colleagues) have involved examining the effects of multitasking (focussed on interruption and distraction) on cyber security work, with a particular focus on errors. Prof Morgan has also conducted research on cyber security and privacy vulnerabilities linked to smart IoT devices, especially those that need to be connected (funded by CREST).

Task Interruption and Distraction – with Applications to Defence, Cyber Security, Transport and Healthcare

Phil's ESRC-funded PhD (2001-2004, supervised by Prof Dylan M Jones OBE (services to military science) DSc) was in Cognitive Experimental Psychology: 'Now where was I? A cognitive experimental analysis of the effects of task interruption on goal-directed memory'. In this work, Prof Morgan established many of the disruptive effects of task interruption on goal-directed memory – including the roles of characteristics such as interruption duration, complexity and frequency – as well as, based on the Memory for Goals model (Altmann, Trafton and colleagues, e.g. 2001, 2007), the role of 'strengthening' (goals in memory) and 'priming' (e.g. cues in the environment) in mitigating some of those disruptive effects. Prof Morgan has continued to research this area over the past ~20 years, with applications to, e.g., defence (projects funded by DIFDTC and HSSRC – e.g. Distracting Effects of Low and High Intensity Light and Sound), cyber security (projects funded by Airbus and NCSC), transport (projects funded by ESRC and EPSRC) and healthcare.


Keynote Speaker: Professor Peter Buckle (online presentation)

Professor Peter Buckle PhD, C.ErgHF, FIEHF, FIEA, FRSPH
Principal Research Fellow at Imperial College  
Peter is the Methodology theme lead and Head of Human Factors at the NIHR London In-Vitro Diagnostics Co-operative. This is a partnership between Imperial College London and Imperial College Healthcare NHS Trust that generates evidence to support diagnostic test development.

He has held Professorial positions at Imperial College London, the Royal College of Art and Design, and the Universities of Nottingham, Leeds and Surrey.

Peter has published over 300 peer-reviewed research articles on ergonomics, human factors, occupational health and designing better-performing work systems. He is a Fellow and a former President of the Chartered Institute of Ergonomics and Human Factors (CIEHF) and a Fellow of the International Ergonomics Association and the Royal Society of Public Health.

He is a member of both the HSE's Workplace Health Expert Committee and the newly formed HSE Science Quality Assurance Group (and formerly of the Science and Engineering Evidence Assurance Committee). He led the Robens Centre for Ergonomics at the University of Surrey (1992-2006) and was Director of the Robens Centre for Public Health (2007-2009). In 2001, he was awarded the Sir Frederic Bartlett Medal for his research and, in 2005, the President's Medal, both from the Chartered Institute of Ergonomics and Human Factors. In 2017, he received the USA Foundation for Professional Ergonomics (FPE) Ergonomist of the Year Award.