Janine Lloyd-Jones, Faculty: On the ethical considerations of AI and ensuring it’s a tool for positive change

Ryan Daws is a senior editor at TechForge Media with over a decade of experience in crafting compelling narratives and making complex topics accessible. His articles and interviews with industry leaders have earned him recognition as a key influencer by organisations like Onalytica. Under his leadership, publications have been praised by analyst firms such as Forrester for their excellence and performance. Connect with him on X (@gadget_ry) or Mastodon (@gadgetry@techhub.social).


The benefits of AI are becoming increasingly clear as deployments ramp up, but fully considering the technology’s impact must remain a priority to build public trust.

AI News caught up with Janine Lloyd-Jones, director of marketing and communication at Faculty, to discuss how the benefits of AI can be unlocked while ensuring deployments remain a tool for positive change.

AI News: What are the constraints, ethical considerations, and potential for deep reinforcement learning?

Janine Lloyd-Jones: Whilst reinforcement learning can be used in video games, robotics and chatbots, the reality is that we can’t fully unlock the power of these tools as the risks are high and models like these are hard to maintain.

As AI makes more and more critical decisions about our everyday lives, it becomes even more important to know it’s operating safely. We’ve developed a first-of-its-kind explainability tool that generates explanations quickly and incorporates causality, making it easier to improve the performance of models because users understand how they make decisions. This has been integral to our Early Warning System (EWS), allowing NHS staff to understand and interpret each forecast, which has increased adoption of the tool.
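Faculty has not published the internals of its explainability tool, but as a general, hypothetical illustration of model explainability (leaving aside the causal component mentioned above), a feature-attribution library such as SHAP can show which inputs drive a model’s predictions so users can inspect how it reaches its decisions. The model, dataset, and library choice below are illustrative assumptions, not Faculty’s tool:

```python
# General illustration of feature attribution (not Faculty's tool): SHAP
# decomposes each prediction into per-feature contributions, so users can see
# which inputs pushed the model towards its decision.
import shap
import xgboost
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
model = xgboost.XGBClassifier(n_estimators=100).fit(X, y)

explainer = shap.TreeExplainer(model)           # efficient attributions for tree ensembles
shap_values = explainer.shap_values(X.iloc[:100])
shap.summary_plot(shap_values, X.iloc[:100])    # ranks features by their influence on predictions
```

In practice, attributions like these are what let domain experts sanity-check a model and spot where its performance could be improved.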

It’s our view that clearer regulation is needed to ensure AI is being used safely, but this needs to be informed by what’s practical and possible to implement. We also need to ensure we don’t stifle innovation. Any regulation needs to be context-dependent. For example, when AI is used to make decisions in a medical diagnostics context, safety becomes far more important than if an AI algorithm is trying to choose which advertisement to show you. As we acquire the right tools and regulation, it’s exciting to see what complex AI like deep reinforcement learning will achieve in our industry and society.

AN: Faculty is among the companies taking admirable steps to offset its carbon emissions. How can AI play a role in combating climate change?

JL: We’re here because we believe that AI can change the world – we want to take this technology and use it to solve real, tangible, important problems.

Like many tech companies, our biggest source of carbon emissions is cloud computing (the tech sector has a greater carbon footprint than the aviation industry), but sustainable AI can be part of the solution. Our work includes analysing data for Arctic Basecamp and regulating pressure on the UK gas grid. We’re expanding our sustainable AI work with environmental organisations, supporting them to tackle climate change.

AN: How quickly do you think most factories will either go entirely “dark” – as in having no or very few humans working in them – or at least have a portion of them being fully autonomous? How can the workforce prepare for such changes?

JL: AI is not universal just yet, so we don’t expect to see factories going entirely dark anytime soon. Most companies are using AI to automate, save time and increase productivity, but the potential of AI is huge – it will transform industries. AI can become the unconscious mind of an organisation, processing vast volumes of data quickly and freeing humans to focus on what they’re best at and where their input is needed; humans, for example, have a far greater appreciation for nuance and context.

We’ve already helped clients across industries do this: cutting a backlog of cases from four years to just four weeks, developing models that detect harmful content online with a positive rate of 94%, and helping large retailers market to the customers most likely to purchase, increasing profits by 5%.

AN: The NHS was able to enhance its forecasting abilities thanks to its partnership with Faculty. What successes were achieved and was anything learnt from the experience that could improve future predictions?

JL: We’re really proud of our partnership with the NHS; our groundbreaking Early Warning System (EWS) was crucial in the NHS’ nationwide pandemic data strategy, forecasting spikes in Covid-19 cases and hospital admissions weeks in advance. These forecasts allowed the NHS to ensure there were enough staff, beds and vital equipment allocated for patients. There are over 1000 users of the model across the NHS.

Following the success of the tool, we are addressing new areas where AI forecasting can be used to improve service delivery and patient care in the NHS, including predicting A&E demand and winter pressures. The EWS uses our Operational Intelligence software, leveraging Bayesian hierarchical modelling to produce forecasts at every level, from national down to individual trusts. We’ve used the same software in scenarios where demand forecasting is needed, including for consumer goods.
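Faculty’s Operational Intelligence software is proprietary, but for readers unfamiliar with the approach, the sketch below shows what Bayesian hierarchical modelling looks like in general: trust-level parameters share a national-level prior, so individual trusts with sparse data borrow strength from the national signal. Everything here (library choice, priors, placeholder data, variable names) is an illustrative assumption, not Faculty’s EWS implementation:

```python
# Illustrative sketch only (hypothetical data, priors, and names): a partially
# pooled Poisson model of weekly hospital admissions per trust. Trust-level
# intercepts are drawn from a shared national-level distribution, so trusts
# with little data borrow strength from the national signal.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_trusts, n_weeks = 20, 12
trust_idx = np.repeat(np.arange(n_trusts), n_weeks)        # trust index for each observation
week = np.tile(np.arange(n_weeks), n_trusts)               # week index within each trust
admissions = rng.poisson(lam=30, size=n_trusts * n_weeks)  # placeholder admission counts

with pm.Model() as hierarchical_model:
    # National-level hyperpriors
    mu_national = pm.Normal("mu_national", mu=3.0, sigma=1.0)
    sigma_trust = pm.HalfNormal("sigma_trust", sigma=0.5)

    # Trust-level intercepts, partially pooled towards the national mean
    alpha_trust = pm.Normal("alpha_trust", mu=mu_national, sigma=sigma_trust, shape=n_trusts)

    # A shared weekly trend term
    beta_week = pm.Normal("beta_week", mu=0.0, sigma=0.1)

    # Log-linear rate per trust-week, observed as Poisson counts
    rate = pm.math.exp(alpha_trust[trust_idx] + beta_week * week)
    pm.Poisson("admissions_obs", mu=rate, observed=admissions)

    idata = pm.sample(1000, tune=1000, target_accept=0.9)
```

Posterior forecasts from a model of this shape can be summarised per trust or aggregated nationally, which is why the hierarchical structure suits national-to-trust forecasting.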

AN: Faculty continues to expand rapidly and recently raised £30m that it expects to use to create 400 new jobs and accelerate its international expansion. What else is a key focus for Faculty over the coming years?

JL: We’re excited to be able to bring the power of AI to even more customers, helping them to make effective decisions with real-world impact. We are enhancing our technology offering, hiring 400 new people over the next few years and accelerating our international expansion. We are also doubling down on our AI safety research programme, so our customers have the assurance that all of our AI models are always performing safely and to the best of their ability.

AN: What will Faculty be sharing with the audience at this year’s AI Expo Global?

JL: We’re glad to be at in-person events again, and we’re looking forward to meeting fellow exhibitors and attendees. Our focus at this year’s AI Expo will be on our Customer Intelligence software, which we are predominantly using within the consumer industries to demonstrate the impact marketing has on individual customer behaviour. Millions in marketing spend are wasted each year on the wrong people. With our technology, marketers will finally have the insight to know who they should be focusing their efforts on, and when.

We’re also sharing more about the Faculty Fellowship, our in-house L&D programme where organisations looking to expand their data science teams can work with top data scientists for six weeks before deciding whether to hire them. This is particularly critical as the UK tech industry looks to attract and hire top talent. We’ve already had some great companies take part in the programme, from Virgin Media and Vodafone through to leading startups like The Trade Desk and JustEat.

AN: It’s the 20th anniversary of the Faculty Fellowship in October – what’s the focus for the Fellowship over the coming years?

JL: Faculty began with the Fellowship, so it’s a really special milestone to be celebrating the 20th anniversary. With demand for data scientists at an all-time high (over 100,000 vacancies in 2020 alone), it’s a competitive space. We expanded the programme this year to include an additional fellowship, and we’re continuously working to ensure we are attracting top talent and making the process as easy as possible for our partner companies.

Overstretched teams are fed up with spending their time on hiring and long interview rounds. The Fellowship is designed so companies only invest two to three hours in total, yet have an elite data scientist embedded in their team within weeks.

(Photo by Clark Tibbs on Unsplash)

Faculty will be sharing their invaluable insights during this year’s AI & Big Data Expo Global, which runs from 6-7 September 2021. Faculty’s stand number is 178. Find out more about the event here.
