GFT is an international technology service provider with strong expertise in banking, insurance and associated financial services. It has over 10,000 employees worldwide, with teams in the UK, Poland, Spain, Canada, the USA, Vietnam, Brazil, Germany and many other countries. GFT focuses on local client engagement combined with nearshore delivery, with teams just one or two time zones away.
“I would say our key differentiator is that we pride ourselves on being precision engineers across a multitude of domains. It's about cloud computing, neobank services, mainframe modernisation and a strong focus on AI,” explains David Tuppen, GFT’s Head of Data and AI.
Tuppen and GFT’s data team are developing the GFT AI.DA marketplace, which supports and accelerates clients’ AI journeys.
“GFT is maturing the AI landscape for rapid entry. Many organisations are focusing on their investments into AI,” he shares.
Specifically, GFT is doing something a little different: placing strong emphasis on an AI and data marketplace that lets clients explore AI use cases and AI and data journeys, through to the development of a modern data platform.
“We firmly believe that there is a connection between AI and the data that goes with it,” says Tuppen. “From integration, storage and management of data processing and analytics, building specific pipelines is as important as generating the AI itself.”
For Tuppen, the data platform is an evolving concept that has changed over time and will continue to transform.
“There are various architectural patterns which are needed, and these change for each business and client domain,” he says. “There is no single solution for a modern data platform. So, from the big trend towards data mesh and democratised architectures and decentralised data stores, there needs to be a specific pattern for each business unit and for each client.”
This is what GFT provides: a data strategy that fully aligns with an organisation's AI goals.
“Data strategy is fully dependent on the business strategy to show true value,” he emphasises.
A solution needs to start with business value. After that come data lifecycle, data management and governance, data and technical architecture, data science and AI visualisation, and finally, data and change management.
“Each one of these steps is needed in order to achieve controlled and secure insights.”
Tuppen sees the data pipeline that precedes AI projects as critically important.
“You need to move from the front to the back,” he says. “As I mentioned, you start with the business and then move backwards to the IT backend. You look at what impact adding intelligent automation has on your business. How will you view and access those insights from the AI? How is the data stored and processed? How is the data integrated into the platform?”
Tuppen warns that if a customer skips any of those considerations, they risk data integrity problems, which can cause processing failures and increased time and cost.
AI is fed by data, so users need to have a secure data source in place.
“If you look at data, the industry is moving towards a democratised, domain-led architecture,” Tuppen comments. “So, isolating each functional domain into an end-to-end data product is becoming the go-to standard.”
This contrasts with the more traditional centralised layer seen in earlier AI architectures. Meanwhile, GFT’s data scientists are seeing growing uptake of smaller, task-oriented language models.
“Model ops for governance is another trend we are seeing,” he says. “With various optimisation frameworks being built, we are seeing prompt engineering becoming less of a trend.”
GFT’s focus is not just on the latest AI trends, but on how to achieve practical outcomes, based on well-structured data organised in a modern data platform.