Alois Reitbauer shares how to adapt to an AI world

Dynatrace’s chief technology strategist explains how organisations can capitalise on the rise of AI and provides an insight into what the future holds 

Amber Hickman


Rising demand for generative artificial intelligence products could result in about $280 billion in new software revenue by 2032, according to Bloomberg’s 2023 report Generative AI Growth.

Many organisations are eager to act quickly and implement AI in their own business for fear of being left behind. But what challenges are these organisations likely to face? According to Alois Reitbauer, chief technology strategist at Dynatrace: “The biggest challenge has nothing to do with AI but involves understanding your business.

“Organisations need to know their customers, the processes that they want to focus on and where they want to improve. Only then can they identify how they can apply AI accordingly,” he explains.

Reitbauer also believes it is important for decision makers to improve AI literacy and understand how associated technologies can work for them.

“Generative AI is huge right now, but there is also machine learning, knowledge-based models and so on,” he says. “Many organisations tend to default to using a large language model (LLM) and building a chatbot, but there is a much broader spectrum of possibilities that could be better suited to their business.”

With so much to consider and so much at risk, Reitbauer says organisations need to work out how AI can best work for their business, rather than jumping in for the sake of having the technology.

“AI is an investment, and it can affect your business if not implemented responsibly,” he says. “Not only is it costly, but it also consumes a lot of resources. For example, an LLM running off a large database can get quite slow, which is why prompt engineering and retrieval-augmented generation are important factors.”
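As a rough illustration of the idea Reitbauer describes, the sketch below shows the basic shape of retrieval-augmented generation: only the handful of documents most relevant to a query is folded into the prompt, rather than the entire knowledge base. All names and data are hypothetical, and simple keyword overlap stands in for a real embedding model, vector database and LLM call.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Hypothetical example: keyword-overlap scoring stands in for a real
# embedding model and vector database; build_prompt stands in for the
# text that would actually be sent to an LLM.

from collections import Counter

DOCUMENTS = [
    "Service checkout-api returns HTTP 500 when the payment queue is full.",
    "Disk usage on node-7 grows roughly 2 GB per day under normal load.",
    "The login service depends on the session cache and the user database.",
]

def score(query: str, doc: str) -> int:
    """Crude relevance score: how many query words appear in the document."""
    query_words = set(query.lower().split())
    doc_words = Counter(doc.lower().split())
    return sum(doc_words[word] for word in query_words)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return only the k most relevant documents instead of the whole corpus."""
    ranked = sorted(DOCUMENTS, key=lambda doc: score(query, doc), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    """Fold the retrieved context, and nothing more, into the model prompt."""
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

if __name__ == "__main__":
    print(build_prompt("Why does the checkout-api return HTTP 500?"))
```

The design choice the sketch highlights is the one Reitbauer points to: the model never sees the full database, so prompts stay short and response times stay predictable even as the underlying data grows.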

Dynatrace has been building AI into its platform for over 10 years so customers can make the most of their data to resolve security incidents faster.

“The Dynatrace platform has a range of AI-powered features such as automatic problem analytics and root cause analytics for observability and security,” says Reitbauer. “For example, if an organisation is facing an issue in its production environment, manually searching for the cause and solution can be time-consuming and introduce bias, as people are more likely to go with the first solution they find, rather than the optimal one.

“Right now, we are focusing on preventative operations. This is like preventive maintenance but for software systems, utilising predictive AI and automation. Much of the automation in place today is rigid and reacts to single events, but with predictive AI we can apply cognitive analytics that deal with tasks contextually.”
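To make that contrast concrete, here is a toy sketch of the preventive idea, not Dynatrace’s actual analytics: instead of waiting for a single-event alert, it fits a simple trend to a resource metric and warns before the limit is crossed. All names, numbers and thresholds are invented for illustration.

```python
# Toy "preventive operations" sketch: predict when a growing metric will
# breach its limit and act before it does. Illustrative only; real
# predictive AI would use far richer models and context.

def linear_fit(samples: list[tuple[float, float]]) -> tuple[float, float]:
    """Least-squares fit y = slope * t + intercept over (t, y) samples."""
    n = len(samples)
    mean_t = sum(t for t, _ in samples) / n
    mean_y = sum(y for _, y in samples) / n
    cov = sum((t - mean_t) * (y - mean_y) for t, y in samples)
    var = sum((t - mean_t) ** 2 for t, _ in samples)
    slope = cov / var
    return slope, mean_y - slope * mean_t

def hours_until_breach(samples: list[tuple[float, float]], limit_pct: float = 90.0):
    """Return the predicted hours until the metric crosses limit_pct, or None."""
    slope, intercept = linear_fit(samples)
    if slope <= 0:
        return None  # metric is flat or shrinking; nothing to prevent
    latest_t = samples[-1][0]
    breach_t = (limit_pct - intercept) / slope
    return max(0.0, breach_t - latest_t)

if __name__ == "__main__":
    # Hourly disk-usage readings (hour, percent full) for a fictional node.
    disk = [(0, 62.0), (1, 63.5), (2, 65.1), (3, 66.4), (4, 68.0)]
    eta = hours_until_breach(disk)
    if eta is not None and eta < 24:
        print(f"Predicted breach in ~{eta:.1f} hours: schedule cleanup now.")
```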

Dynatrace

As Dynatrace advances its AI developments, its partnership with Microsoft is creating opportunities.

“We’re collaborating with Microsoft on various levels,” says Reitbauer. “Our platform is entirely hosted on Microsoft Azure, and we are also using the Azure OpenAI services to enhance our offerings. 

“The primary advantage for us has been the ability to experiment rapidly in the fast-evolving realm of generative AI. It has enabled us to delegate the complexities of making new models production-ready, leaving us to focus on identifying new use cases and learning how to best employ the technology.” 

Reitbauer believes that we are only scratching the surface of the possibilities that AI can present and that it will soon become a part of every industry and have an impact on everyday work. 

“AI will play a more prominent role in eliminating the repetitive tasks that are present in every organisation, freeing employees to focus on other areas,” he explains. “This doesn’t mean that AI will be taking over, though. Humans will lead AI systems by articulating the solutions that they want, verifying outcomes and ensuring everything is still working efficiently.

“One of the most interesting developments right now is in how we use LLMs and their potential to be more creative problem solvers,” says Reitbauer. “I think these models will become much easier to run in the future as we improve hardware and produce more purpose-built models, meaning faster response times for longer and more complicated tasks. Overall, the next five to ten years are going to be very interesting.” 

This article was originally published in the Spring 2024 issue of Technology Record. To get future issues delivered directly to your inbox, sign up for a free subscription.   
