The AI processing evolution: when is the right time to go neural?

With AI workloads placing growing demands on computing resources, when is the right time to make the switch to Neural Processing Units (NPUs)? asks Jolyon Vernon.

Securing the right performance from a combination of CPU and GPU has long been a challenge in industries with substantial 3D or graphics-heavy application demands. The AEC industry is at the forefront of this race, with applications requiring ever more GPU power to drive high-quality, rapid rendering and proposal output. How to meet business demand through optimised workloads, VDI choices and global working patterns is a topic we discuss weekly with customers.

Now, with AI and machine learning placing growing demands on IT resources, building design data becoming ever more detailed, and firms starting to leverage generative AI, will NPUs become the standard for processing power in our next generation of devices and services?

NPUs are specialised hardware accelerators designed to handle the intensive computational requirements of neural networks and AI algorithms. Unlike general-purpose CPUs and GPUs, NPUs are architected specifically for deep learning operations such as matrix multiplication and convolution, delivering significant gains in both performance and energy efficiency.
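
In practice, "using the NPU" usually means offloading model inference to it through a runtime rather than rewriting applications. As a rough illustration only: the sketch below assumes a machine with ONNX Runtime installed, a hypothetical model file (design_classifier.onnx), and that the NPU is exposed through a suitable execution provider (Qualcomm's QNN provider is used here purely as an example); if no NPU provider is present, the work simply falls back to the CPU.

```python
# Minimal sketch: run inference on an NPU where available, fall back to CPU.
# Assumptions: ONNX Runtime is installed, "design_classifier.onnx" is a
# hypothetical model, and the NPU is exposed via the QNN execution provider.
import numpy as np
import onnxruntime as ort

PREFERRED = ["QNNExecutionProvider", "CPUExecutionProvider"]  # NPU first, CPU fallback

available = ort.get_available_providers()
providers = [p for p in PREFERRED if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession("design_classifier.onnx", providers=providers)
print("Running on:", session.get_providers()[0])

# Dummy input matching an assumed 224x224 RGB image model
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy})
print("Output shape:", outputs[0].shape)
```

The point of the sketch is that the same application code can target NPU, GPU or CPU; the runtime decides where the work lands, which is why software support matters as much as the silicon itself.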

The NPU market is predicted to be worth $10.4bn by 2031. This growth is not confined to enterprise IT; consumer devices are also expected to ship with integrated NPUs as operating systems build in more AI capability.

In the AEC industry, NPUs can help accelerate the processing of vast datasets and complex simulations, enhancing capabilities in design optimisation, structural analysis and construction planning. They can expedite BIM processes, enabling real-time updates and more accurate predictive modelling. Additionally, they can facilitate advanced image recognition tasks, improving quality control by swiftly identifying defects and inconsistencies in construction projects.
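
To make the image-recognition point concrete, here is a hedged sketch: assuming a hypothetical defect-classification model (defect_classifier.onnx) trained on site photography, batch-scoring a folder of progress photos might look like this, using the same NPU-first provider selection as in the earlier sketch. The file names, labels and input shape are illustrative, not a specific product's API.

```python
# Sketch: batch-score site photos with a hypothetical defect-classification model.
# Assumptions: defect_classifier.onnx expects 224x224 RGB input scaled to [0, 1]
# and outputs one score per class in LABELS; names and classes are illustrative.
from pathlib import Path

import numpy as np
import onnxruntime as ort
from PIL import Image

LABELS = ["ok", "crack", "spalling", "misalignment"]  # example classes only

session = ort.InferenceSession(
    "defect_classifier.onnx",
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],  # NPU first, CPU fallback
)
input_name = session.get_inputs()[0].name

for photo in sorted(Path("site_photos").glob("*.jpg")):
    img = Image.open(photo).convert("RGB").resize((224, 224))
    x = np.asarray(img, dtype=np.float32)[np.newaxis] / 255.0  # (1, 224, 224, 3)
    x = x.transpose(0, 3, 1, 2)                                # to NCHW (1, 3, 224, 224)
    scores = session.run(None, {input_name: x})[0][0]
    print(f"{photo.name}: {LABELS[int(np.argmax(scores))]}")
```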

We’re also discussing the demand and workload requirements of generative AI with a number of clients; it is a technology that promises to be hugely powerful in design and architectural workflows. Will firms start needing NPU-based systems to make full use of it?

Because the technology is relatively new, integrating NPUs into existing AEC processes and platforms presents several challenges. Firstly, there is significant cost associated with adopting NPU technology, including hardware investment and the need for specialist training, which can be a barrier for smaller firms or those with limited budgets. And while NPU technology is evolving so rapidly, there is an increased risk of obsolescence: hardware investments may quickly become outdated.

It will also take some time for existing software ecosystems to make the modifications required to take full advantage of NPU capabilities, which is likely to create compatibility and collaboration issues in the meantime.

Lastly, let’s not forget people and process change. Careful consideration must be given to existing staff, many of whom are already experienced across a relatively mature application set. The learning curve associated with implementing NPUs could cause delays as professionals adapt to new workflows and tooling in an already complex environment.

Despite these challenges, the potential of NPUs to enhance efficiency, accuracy and innovation in the AEC industry makes them a compelling advancement; they promise to redefine traditional practices and drive the sector towards a more technologically advanced future.

As for generative AI – future innovation or firmly here to stay? I’ll cover this in my next article in the coming weeks!