How do you get a bunch of disruptive minds together in a (virtual) room to build innovative AI applications? Organize a hackathon! During the FINN feat. AI Hackathon held on May 30, participants embarked on a one-day challenge to generate measurable business impact by applying AI to real business problems. The solutions developed ranged from automated vehicle damage recognition and GPT-4-powered dynamic car pricing to a Slack bot that answers questions about vehicles.

In this article, we’ll showcase the two winning projects: one winner from the teams competing at FINN’s base in Germany, and the other from the New York City headquarters. Get ready for some inspiration on hacking your business with AI 🧠🤖

Winner 🇩🇪: Finding your dream car with the power of AI

Picture discovering your perfect car with just a few clicks. That’s the remarkable power of personalization. Did you know that 56% of consumers say they would become repeat buyers after a personalized experience?

Problem

How can we at FINN leverage personalization to bring the car dealership experience to our customers on the go?

That’s the question our winning team in Germany asked themselves. Sofiane Zeghoud, Ishtiaque Zafar, Sofya Durneva, and Robert Ghazaryan were driven by a shared vision: transforming the way our customers engage with car subscriptions.

From left to right: Sofya Durneva, Sofiane Zeghoud, Ishtiaque Zafar, and Robert Ghazaryan.

The team noticed that we currently offer a uniform product listing page (PLP) experience to everyone. The problem? Our customers are all unique and have distinct needs and preferences. We want to meet them where they are.

Solution

In just a few hours, the team created a fully functional recommendation engine prototype for our website, delivering personalized suggestions that instantly connect with each customer.

For a smooth customer experience, the engine uses analytics data that is already available to display the most relevant vehicles on the PLP. Imagine effortlessly navigating our website and being greeted by a vibrant display of vehicles that are exactly what you’re looking for. The “Help me pick a car” chat box becomes your trusted advisor that can always find the perfect car for your taste.

Personalized car recommendations hand-picked for the visitor’s needs

By offering a seamless, personalized experience, this solution:

  • drives higher clickthrough rates (CTR)
  • drives higher conversion rates (CVR) to subscription
  • reduces paid marketing costs

This is how the magic happens: Sofiane used deep learning to predict each visitor’s preferences across our top car models and order the listing accordingly. Under the hood, the “Help me pick a car” feature runs on GPT-4, a powerful language model, fusing AI with user-centric design to help customers find their dream car with ease.
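
To make this more concrete, here is a minimal, hypothetical sketch of the two building blocks described above: a small neural scorer that ranks car models for a visitor from analytics-style features, and a GPT-4-backed chat helper. The names (CarScorer, rank_plp, help_me_pick_a_car), the feature setup, and the prompt wording are illustrative assumptions, not FINN’s actual implementation.

```python
# Hypothetical sketch, not FINN's production code. Assumes each visitor and each
# car model can be represented as a fixed-length feature vector derived from the
# analytics data already collected on the website.
import torch
import torch.nn as nn
from openai import OpenAI  # assumes the OpenAI Python SDK and an API key in the environment


class CarScorer(nn.Module):
    """Tiny feed-forward network that scores how well a car model fits a visitor."""

    def __init__(self, n_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features * 2, 32),
            nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, visitor: torch.Tensor, car: torch.Tensor) -> torch.Tensor:
        # Concatenate visitor and car features and map them to a single score.
        return self.net(torch.cat([visitor, car], dim=-1)).squeeze(-1)


def rank_plp(model: CarScorer, visitor: torch.Tensor, cars: dict[str, torch.Tensor]) -> list[str]:
    """Return car model names ordered by predicted preference for this visitor."""
    with torch.no_grad():
        scores = {name: model(visitor, feats).item() for name, feats in cars.items()}
    return sorted(scores, key=scores.get, reverse=True)


client = OpenAI()


def help_me_pick_a_car(question: str, ranked_models: list[str]) -> str:
    """Answer a visitor's question, nudging toward the models the scorer already ranked."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": "You are a friendly car-subscription advisor. Prefer recommending "
                           f"these models, in this order: {', '.join(ranked_models)}.",
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content
```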

The winning team is already setting its sights on training the models on richer fleet data and fine-tuning the engine on more matching parameters.

As we continue making mobility fun and sustainable, we invite you to stay tuned for more exciting innovations from us! But first, let’s explore the winning idea of our USA-based team.

Winner 🇺🇸: Revolutionizing user acquisition calls with AI

In the hackathon held at FINN’s NYC base, the winner was Team User Acquisition (UA), consisting of Kevin Allen, Bethany Looi, Anna Kohlasch, and Eric Van Thuyne. Team UA pitched a project to revolutionize the processing of user acquisition call information with the use of artificial intelligence.

Problem

Team UA addressed a tension: on the one hand, customer calls are a crucial part of the work for many people in user acquisition at FINN; on the other hand, extracting actionable information from call records is quite cumbersome. Yes, calls are recorded, but re-listening to those calls would take ages. And yes, call recordings are transcribed, but plowing through all those transcriptions would be a huge effort. As a result, it is currently difficult to keep track of which topics came up in an individual call with a customer, or whether there are any trends or common topics across multiple calls. A lot of valuable call data is therefore left unused, or takes significant effort to use, restricting the UA department’s overall efficiency.

Solution

To solve this issue, Team UA developed an AI-powered automation that delivers quick and easy call summaries. The solution automatically summarizes each call and identifies the customer’s intent (that is, how likely the customer is to want a subscription), action items (such as requesting a customer’s confirmation), and any keywords associated with the call. With this in hand, anyone can swiftly get the gist of what was covered in a call.

Note on Hubspot with estimated customer intent, a call summary, action items, and keywords

In addition, the keywords extracted from a call can be used to automatically summarize the contents of multiple calls over a specific period. A keyword cloud can give a visual summary of the common topics that user acquisition agents are dealing with.

A keyword cloud visually summarizes the topics of multiple calls
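
As a rough illustration of how such a cloud could be generated from the keywords collected per call, here is a hypothetical sketch using Python’s collections.Counter and the open-source wordcloud package; the function name and data shape are assumptions, not the team’s actual setup.

```python
# Hypothetical sketch: turning per-call keyword lists into a keyword cloud image.
# Assumes the keywords for a given period have already been pulled (for example,
# from the sheet the automation writes to); names and data shapes are illustrative.
from collections import Counter
from wordcloud import WordCloud  # pip install wordcloud


def build_keyword_cloud(calls_keywords: list[list[str]], out_path: str = "keyword_cloud.png") -> None:
    """Count how often each keyword appears across calls and render a word cloud."""
    frequencies = Counter(kw.lower() for keywords in calls_keywords for kw in keywords)
    cloud = WordCloud(width=800, height=400, background_color="white")
    cloud.generate_from_frequencies(frequencies)
    cloud.to_file(out_path)


# Example: keywords extracted from three calls over one week (made-up data).
build_keyword_cloud([
    ["pricing", "delivery time", "insurance"],
    ["pricing", "mileage", "delivery time"],
    ["insurance", "pricing"],
])
```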

How does the solution work in practice? The AI-driven automation consists of the following steps:

  1. A phone call between a sales agent and a customer is recorded and transcribed via CloudTalk.
  2. Using the low-code tool Make, an automation scenario fetches the call transcript from CloudTalk, matches the call to the customer contact details on Hubspot, and sends the transcript to ChatGPT for parsing.
  3. ChatGPT uses the call transcript to create a call summary, and to extract inferred customer intent, action items, and keywords.
  4. The Make scenario adds a note with the extracted information to the relevant customer record on Hubspot, and saves the extracted keywords in a Google sheet.
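
The scenario itself is built in the low-code tool Make, but to give a feel for what the ChatGPT parsing in step 3 amounts to, here is a hypothetical Python sketch using the OpenAI API. The function name summarize_call, the prompt wording, and the JSON fields are illustrative assumptions, not the team’s actual configuration.

```python
# Hypothetical sketch of step 3: asking GPT to turn a raw call transcript into a
# structured note. The real pipeline runs inside a Make scenario; this version is
# only illustrative.
import json
from openai import OpenAI  # assumes the OpenAI Python SDK and an API key in the environment

client = OpenAI()

PROMPT = (
    "You will receive the transcript of a call between a FINN sales agent and a customer. "
    "Respond with a JSON object containing: summary (2-3 sentences), "
    "intent (one of: likely_to_subscribe, undecided, not_interested), "
    "action_items (a list of short strings), and keywords (a list of short phrases)."
)


def summarize_call(transcript: str) -> dict:
    """Return the summary, inferred intent, action items, and keywords for one call."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": PROMPT},
            {"role": "user", "content": transcript},
        ],
    )
    # Assumes the model follows the instruction and returns valid JSON.
    return json.loads(response.choices[0].message.content)

# The Make scenario would then attach the summary, intent, and action items as a
# note on the matching Hubspot contact and append the keywords to a Google sheet.
```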

Make scenario that summarizes calls, and extracts customer intent, action items, and keywords

Why it’s so good

Team UA’s AI-driven solution is expected to offer three direct benefits to anyone working with call data in user acquisition:

  • Time management — Automated call summaries can reduce manual effort in figuring out what happened in a call, or across multiple calls.

  • Increased sales — The summaries can capture valuable insights that can in turn be used to train agents and enhance scripts.

  • Collaboration — The call summaries make it easy to share keywords and learnings with other departments.

In addition, in the longer term the automated call summaries and keyword clouds are also expected to offer the following advantages:

  • Sales training — Call summaries can be used to create training exercises and prepare agents for specific sales scenarios.

  • Capacity planning — Having a quick and easy overview of call topics will enable the department to scale, anticipate resource needs, and allocate resources effectively.

  • Customer service and close rates — Call information can ultimately be used to increase the number of subscriptions.

In fact, Team UA’s project was developed to such a high standard that it was implemented within days of winning the US hackathon 🥳

What’s next?

AI hackathons can be a powerful tool when you’re seeking to drive innovation and generate tangible business outcomes. At FINN, we’re definitely going to continue implementing some more of the solutions developed during the hackathon day — and come up with new ones! How about you, have you used AI to build any business solutions? What worked well, what didn’t? Let us know.

This article was written by Kosara Golemshinska and Chris Meyns.