A word from our partner Resolver, a Kroll Business
Resolver, a Kroll Business, has been a valued partner to INHOPE for many years, sharing our commitment to creating a safer digital world. We had the pleasure of speaking with George Vlasto, Resolver’s new Head of Trust and Safety, about his journey, the challenges of tackling online harms, and the importance of cross-sector collaboration in shaping the future of Trust and Safety.
Who are you and why did you join Resolver?
My name is George Vlasto, and I lead the Trust and Safety Division at Resolver. I joined because I believe in our vital role in creating a safer online environment for users and platforms. Moving from a public service career, I wanted to find a role and an organisation with a genuine mission: working in the Trust and Safety sector alongside some of the most significant platforms, technology services and other organisations in this space certainly offers that sense of purpose! Prior to Resolver, I spent over 15 years as a UK government diplomat, developing a global perspective on risk intelligence and collaboration that I now bring to this role.
At Resolver, I’m proud to be part of a team that addresses some of the toughest challenges in Trust and Safety. We focus on keeping users safe, protecting technology providers, and supporting the broader trust and safety ecosystem. Our work is impactful, and we’re constantly pushing the boundaries of what’s possible, delivering actionable insights and intelligence that make a real difference. As we near our twentieth anniversary, I’m excited to continue delivering gold-standard support and shaping the future of Trust and Safety.
Can you briefly explain Resolver's mission and strategy?
Resolver, a Kroll business, has worked in the Trust and Safety space for 20 years. Formerly known as Crisp Thinking, we remain committed to our founding principle: making the digital world safer for users, platforms, and service providers. We focus on identifying risks across Trust and Safety verticals – particularly child safety – through on-platform and off-platform risk intelligence.
We identify harmful content, bad actors and problematic behaviour for our partners and work with them to build the technology, processes and policy mitigations needed to address these challenges. Leveraging our technology platform and over 180 analysts, we help partners detect emerging risks early and respond effectively. Our watchwords are discretion and partnership: we understand the sensitivity and complexity of these challenges and work to build partnerships based on deep trust where we operate as an extension of our partners’ internal teams.
Where do you see the value of cross-sector partnerships in addressing issues like CSAM?
No one has all the answers to problems as broad and complex as CSAM. We believe there is huge value in cross-sector partnerships: bringing together expertise, capability and resources is the only way to get a handle on the challenges we collectively face. We partner with organisations like INHOPE, which leads the way in addressing these harms and brings together a diverse array of stakeholders. We are also increasingly collaborating with academic institutions to develop best practices in addressing harmful content online.
For example, we recently announced our partnership with StopNCII.org (hosted by SWGfL) to advance online trust and safety. Through this partnership, we are integrating StopNCII.org’s hashing tool with Resolver’s new image similarity matching technology to combat non-consensual intimate image abuse.
In our commercial partnerships, we always aim to bring the best capabilities to bear on the challenges faced by our partners, whether age verification, specialist image classifiers or real-world investigations into online harms. Collaboration is essential to achieving all our trust and safety objectives; it serves both our online safety mission and our ambition to provide the best possible service to our partners.
How is Resolver adapting to emerging threats in the digital landscape?
Our partners span social media, gaming, and generative AI, so this is something we think about constantly. Our central principle is that complex online harms – such as AI-generated abusive content – are best addressed through a combination of human expertise and cutting-edge technology. We approach these problems from both angles: we continually build or integrate the best available detection technologies, always allied with subject matter expertise from our analyst team to ensure precision, context, and the continued evolution of our capabilities.
As an example, we are currently deploying more powerful CSAM and NCII image classification and similarity-matching technology to address AI-generated content, which poses particular challenges for traditional hash-matching detection methodologies. As ever, this is augmented by deeper research by our subject matter experts into how this content is generated, by whom, and how it is disseminated.
What role do you see Resolver playing in shaping industry standards for online safety?
A core part of our work is to help partners test, refine, and update trust and safety standards. Increasingly, we aim to shape best practices more broadly across the field. We worked closely with the UK government and Parliament during the development of the Online Safety Act, and we participate in World Economic Forum expert groups and initiatives such as the Australian eSafety Commissioner's efforts to refine regulatory implementation.
We are increasing our collaboration with NGO and academic partners to contribute to the thought leadership that will shape approaches to online harms in the future. We are aiming to make our subject matter expertise and technology capabilities more available to organisations working to improve and promote important Trust and Safety initiatives. Ultimately, it is central to our mission as a business to be at the forefront of efforts to improve digital safety and we will do whatever we can to help those organisations that share this purpose.
What are Resolver’s long-term goals in creating a safer digital world?
It will never be possible to eradicate all harm online – as in the offline world. However, as we mature in understanding how to combine technology, human expertise, and the social aspects of digital life, new and innovative ways to tackle the worst harms will emerge.
Resolver’s long-term goal is simple: to be at the forefront of these efforts and the indispensable partner to the companies and organisations working to make the internet safer for all.