Generative AI (GenAI) is a rapid route to real efficiency gains when deployed as an assistant to the human workforce. DEAS is implementing the technology to reduce generic workload, speed up legal case management and payroll inquiries, and capture a documented 30% speed increase in IT development, a figure it now uses both as a reference for company-wide efficiency expectations and as a vendor challenge in its IT sourcing strategy.
Amid all the hype surrounding GenAI, this story from DEAS proves the technology can drive genuine value for enterprises, but only if they focus on the tangible value available today rather than getting swept up in the exciting potential of the tech over the next decade.
Over the last few years, DEAS, which manages €45 billion in Nordic property assets, has invested heavily in digital transformation in an industry not known for such forays. For DEAS, this meant that when the ChatGPT earthquake occurred in November 2022, its digital strategy was already well underway: first, getting the data right, and second, putting the right automation in place to improve efficiency.
By feeding its own data into the large language model (LLM) behind ChatGPT, rather than using OpenAI's front-end web tool, DEAS has rapidly built several use cases while staying in control of its data and how it is used, without compromising it. It has developed a payroll chatbot to handle generic employee queries and is rolling out a knowledge-retrieval app to pre-generate responses for its legal teams (as we've said, generic legal work is doomed). In both cases, employee and customer experience is at the center, but this should not take anything away from the core efficiency and refocusing gains.
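For readers who want the pattern behind that description, the sketch below shows a minimal retrieval-augmented approach in Python: embed internal documents, retrieve the most relevant one for a query, and pass it to the chat model as grounding context. This is an illustration of the general technique only; the model names, sample payroll snippets, and question are assumptions, not DEAS's actual architecture.

```python
# Minimal retrieval-augmented generation (RAG) sketch: ground an LLM in the
# company's own documents instead of relying on the public ChatGPT front end.
# Models, documents, and the question below are illustrative assumptions.
from openai import OpenAI
import numpy as np

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Internal knowledge base -- in practice, payroll policies, lease documents,
# legal precedents, etc., chunked and stored in a vector database.
documents = [
    "Salary is paid on the last banking day of each month.",
    "Overtime must be pre-approved by the employee's line manager.",
    "Holiday allowance is paid out in May together with the regular salary.",
]

def embed(texts):
    """Return one embedding vector per input text."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vectors = embed(documents)

def answer(question: str) -> str:
    """Retrieve the most relevant internal document and ground the answer in it."""
    q_vec = embed([question])[0]
    # Cosine similarity between the question and each document.
    scores = doc_vectors @ q_vec / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vec)
    )
    context = documents[int(np.argmax(scores))]

    chat = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer employee payroll questions using only the provided "
                        "context. If the context does not cover the question, say so."},
            {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
        ],
    )
    return chat.choices[0].message.content

print(answer("When do I get my holiday allowance?"))
```

The same pattern scales up to a proper vector store and to a legal knowledge-retrieval app: the proprietary data stays under the enterprise's control, and the model only ever sees the context retrieved for each query.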
DEAS applied the LLM to its IT development projects, investigated the results, and concluded that it reduced project delivery time by 30% on average, and for some projects by even more. Add in the ability to deliver previously unfeasible projects thanks to the improved technology and reduced effort required, and DEAS is breaking open the quality triangle to get quality, speed, and low cost at once.
The 30% efficiency gain is at the core of how DEAS approaches the next wave of use cases. It believes it can extrapolate this percentage to almost any case where the technology is applied, and it is using it as a point of reference. A laser focus on direct efficiency gains, applying GenAI as an augmentation of the human workforce, means DEAS is not currently even discussing the implications of fully automated processing. The value of potentially saving 30% across its enterprise operations, combined with retained human accountability, means no serious headaches for controlling, just a gold mine of opportunities.
During its initial projects working closely with Microsoft and its recommended partners, DEAS found that what it essentially needed from vendors was not strategy advisory. Rather, it sought pure digital engineering skills, tapping obvious and known use cases and revisiting older, rejected ones. Coupled with its (at least) 30% efficiency gains, this has added up to three major changes in how it approaches its IT sourcing.
These changes add up to a very delicate nut to crack for IT service companies: How do they retain their strategic relevance on use-case heatmapping while simultaneously lowering project sizes and taking on outcome-based risk?
Revisiting use cases abandoned in automation days gone by, harvesting the obvious ones in front of you, and keeping a steady focus on efficiency gains are the imperative tasks for the next few years. The message is clear: enterprises must forget about "autonomous" for a while and capture the benefits their competitors are already reaping. DEAS shows that generative is a cure for generic work, and it is revamping its internal operations and partnership models to capture full-circle value from the technology.