Cut mine planning time by 70 per cent and boost value: how integrated data, virtual twins and smart algorithms are transforming mining decisions


Unlocking up to 70 per cent faster mine planning cycles and millions in additional project value is now within reach for operations that combine centralised data systems, virtual twins and advanced optimisation engines.
At the APCOM 2025 conference in Perth, Dr Gustavo Pilger, GEOVIA R&D Strategy and Management Director at Dassault Systèmes, presented the keynote “The Value Multiplier: Unlocking Potential with Integrated Data and Smart Algorithms”. Drawing on more than two decades in mineral resource modelling and geostatistics, Gustavo outlined how combining federated data with next-generation optimisation and design tools can transform the way mines plan, adapt and extract value.
From data overload to strategic insight
Gustavo began by describing a familiar challenge in mining - the time lost simply locating and consolidating critical information before planning work can begin.
“Think about how much time is spent just gathering the data,” he said. “Where is the latest block model? Is it in the server, or is it the file named ‘final_final’? Who owns it? These delays inflate costs and slow decision-making.”
The problem, he argued, has grown as sensors and digital systems generate ever-increasing volumes of operational and geological data. While valuable, this information often sits in disparate systems, stored in inconsistent formats and managed by teams with varying levels of awareness about data integrity.
The result is inefficiency, risk, and a planning process that remains rigid and linear - poorly suited to handling the technical, economic and geopolitical changes that routinely disrupt mine schedules.

The case for centralised, standardised and contextualised data
For Gustavo, the starting point for change is treating data as a core business asset. This means adopting centralised systems that not only federate and secure datasets, but also index, sanitise, and contextualise them in space and time.
He explained that using interlinked data models with a shared “language” - through semantic dictionaries or industry-standard ontologies - ensures information can be used consistently across processes. This structure also makes the data machine-readable, enabling AI-driven workflows without losing the human context essential for decision-making.
In practical terms, this approach ensures every team member has access to the right version of the right dataset, with permissions and traceability built in. “It’s about putting you in control of your data, rather than letting the data control you,” Gustavo said.
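As a toy sketch of what a shared data “language” can look like in practice - all vocabulary terms, field names and dataset names below are illustrative, not Dassault Systèmes’ actual ontology - a controlled vocabulary makes dataset metadata machine-checkable, so the “final_final” problem can be caught automatically:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical controlled vocabulary: every dataset must describe itself
# using terms the whole organisation has agreed on.
VOCAB = {
    "domain": {"geology", "geotechnical", "operations", "economics"},
    "format": {"block_model", "drillhole_db", "survey", "schedule"},
}

@dataclass
class DatasetRecord:
    name: str
    domain: str
    format: str
    owner: str
    as_of: date          # contextualised in time
    version: int = 1

    def validate(self) -> list[str]:
        """Return a list of vocabulary violations (empty means compliant)."""
        errors = []
        for term in ("domain", "format"):
            if getattr(self, term) not in VOCAB[term]:
                errors.append(f"{term}={getattr(self, term)!r} not in shared vocabulary")
        return errors

rec = DatasetRecord("resource_model_q2", "geology", "block_model",
                    owner="resource_team", as_of=date(2025, 6, 30))
assert rec.validate() == []          # machine-readable, consistent metadata

bad = DatasetRecord("final_final", "guesswork", "block_model",
                    owner="unknown", as_of=date(2025, 1, 1))
assert len(bad.validate()) == 1      # non-standard term flagged automatically
```

Because every record carries an owner, a version and an as-of date, permissions and traceability can be layered on the same structure.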
Unlocking the power of virtual twins
Gustavo described how centralised, structured data can be brought to life through virtual twin technology - digital replicas of assets, systems and processes that simulate behaviour in real time.
By modelling not only the physical attributes of a mine but also the interactions between geological, geotechnical, operational and economic factors, virtual twins allow teams to test scenarios before acting in the field. This capability supports both early-stage project evaluation and operational adjustments in response to changing conditions.
“Virtual twins enable meaningful links between processes and data,” Gustavo said. “It’s this associativity that allows you to keep chasing value, adjusting to uncertainty and unplanned events.”
Beyond generic AI to industry-specific intelligence
While generative AI and large language models (LLMs) have dominated recent headlines, Gustavo emphasised the need for domain-specific approaches in mining. Dassault Systèmes, he said, focuses on building industry LLMs that integrate decades of sector knowledge with ontologies and physics-based models.
“In this context, industrial AI is secure, sovereign and traceable,” he explained. “It doesn’t generate approximations - it produces reliable, context-aware representations of real-world objects and processes, including their interactions.”
By keeping humans at the centre of the process, these tools aim to amplify collective intelligence, increase decision-making speed, and maintain governance and auditability.
Optimising mine planning with GMX
Gustavo highlighted one of Dassault Systèmes’ latest developments - the GEOVIA Mine Maximizer (GMX) engine - as an example of how advanced algorithms can deliver step-change improvements in strategic mine planning.
An evolution of the well-known Bienstock-Zuckerberg algorithm, GMX can achieve near-optimal global scheduling outcomes in far fewer iterations. In trials across 20 projects, it delivered run-times up to 22 times faster than the original method, with results within one per cent of the theoretical optimum. In one case study, this translated into an additional US$127 million in net present value.
The performance gains come from combining GMX with a new algorithm for practical phase optimisation, producing mine phases that align closely with actual designs and minimising deviation - and therefore potential value loss - between plan and reality.
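The underlying scheduling problem can be illustrated at toy scale. The sketch below uses hypothetical block values and precedence rules, and brute-force enumeration rather than the Bienstock-Zuckerberg decomposition itself; it shows why extraction order matters under discounting, and why exhaustive search is hopeless for real models with millions of blocks:

```python
from itertools import permutations

# Toy open-pit scheduling: each block has a cash flow, and some blocks can
# only be mined after another block is removed. One block per period, cash
# flows discounted at rate r. Exhaustive search works only at this scale --
# which is exactly why decomposition methods such as Bienstock-Zuckerberg
# are needed for real block models.
cash = {"A": 5.0, "B": -2.0, "C": 10.0, "D": 3.0}   # hypothetical values
precedes = {"C": {"B"}, "D": {"A"}}                 # B before C, A before D
r = 0.10                                            # discount rate per period

def npv(order):
    """Discounted value of extracting blocks in the given order."""
    return sum(cash[b] / (1 + r) ** t for t, b in enumerate(order))

def feasible(order):
    """Check that every block's predecessors are mined before it."""
    seen = set()
    for b in order:
        if not precedes.get(b, set()) <= seen:
            return False
        seen.add(b)
    return True

best = max((o for o in permutations(cash) if feasible(o)), key=npv)
# Mining high-value A first, then sacrificing B early to reach C, beats
# any other feasible sequence once discounting is applied.
```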
Generative design for rapid iteration
Integrating optimisation with generative design tools, Gustavo said, creates the ability to automatically adjust mine designs in response to changing parameters without reworking the entire model manually.
He used the example of designing pit ramps for battery or trolley-assist haul fleets. Designers can define entry and exit points, ramp dimensions, slopes, and other operational constraints, then evaluate trade-offs automatically. When parameters change - due to new geotechnical data, for example - the design updates instantly, remaining compliant with safety and operational rules.
“This takes away the need to manually edit the design every time,” Gustavo noted. “It means plans can be updated days or even weeks faster than current practice allows.”
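A minimal sketch of the parametric idea, with invented numbers (real ramp design involves switchbacks, berms, widths and many more constraints): when a geotechnical input such as the maximum allowable grade changes, dependent geometry is re-derived automatically rather than redrawn by hand:

```python
import math

# Hypothetical parametric ramp: the along-ramp distance needed to descend
# a pit at a maximum grade. Changing an input parameter (e.g. a revised
# geotechnical grade limit) updates the result with no manual rework.
def ramp_length(depth_m: float, max_grade_pct: float) -> float:
    """Along-ramp distance to descend depth_m at max_grade_pct."""
    grade = max_grade_pct / 100.0
    run = depth_m / grade                 # horizontal distance at that grade
    return math.hypot(depth_m, run)       # straight-line ramp length

base = ramp_length(depth_m=300, max_grade_pct=10)    # original design
revised = ramp_length(depth_m=300, max_grade_pct=8)  # new geotechnical limit
assert revised > base    # flatter grade => longer ramp, derived instantly
```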
Scaling optionality and scenario testing
The combination of federated data, virtual twins, optimisation engines and generative design opens the door to scenario testing at a scale previously impractical.
In one example, an engineering team used the approach to create 23 pit options, analyse multiple mining directions, generate more than 1500 pushback options, and then develop over 9000 possible mining sequences. This allowed them to select a plan robust enough to meet targets across NPV, logistics and ESG metrics, while withstanding uncertainties and potential disruptions.
Such capability, Gustavo argued, fundamentally changes the agility of mine planning. “You can explore many more design alternatives in less time, with less friction, and without risking data integrity,” he said.
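In spirit, a scenario sweep like the one above amounts to enumerating combinations of design choices and filtering them against constraints before ranking the survivors. A deliberately tiny sketch, with all option values, scores and thresholds invented for illustration:

```python
from itertools import product

# Toy scenario sweep: enumerate combinations of design choices, score each,
# keep only plans that satisfy every constraint, then rank the survivors.
pit_options = [0.95, 1.00, 1.05]   # relative value of each pit shell
directions  = ["north", "south"]   # mining direction
pushbacks   = [3, 4, 5]            # number of pushbacks

def score(pit, direction, n_pushbacks):
    """Return (NPV proxy, risk proxy) for one scenario -- illustrative only."""
    npv  = 100 * pit - 2 * n_pushbacks            # more pushbacks cost value
    risk = 0.3 if direction == "north" else 0.5   # hypothetical ESG/risk proxy
    return npv, risk

candidates = [
    (pit, d, n)
    for pit, d, n in product(pit_options, directions, pushbacks)
    if score(pit, d, n)[0] >= 95 and score(pit, d, n)[1] <= 0.4
]
best = max(candidates, key=lambda c: score(*c)[0])
```

Real sweeps of 9000-plus sequences simply scale the same pattern up, which is only practical when the data behind each scenario is centralised and the evaluations are automated.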
Quantifying the gains
Gustavo shared modelling results comparing the conventional, linear approach to mine planning with a flexible, iterative, collaborative process supported by centralised data and smart algorithms.
Across three iterations - from initial optimisation and design through to changes in geomechanical parameters and design constraints - engineering hours fell by around 70 per cent. In absolute terms, this meant reducing a typical 71-hour workload to 18.5 hours.
“These savings are mainly due to less rework,” he explained. “When processes are parametrically defined and digitally interconnected, changes propagate automatically through the chain.”
Beyond labour savings, the integrated approach improves review cycles, increases the reliability of plans, and enables faster responses to improved resource definition, engineering decisions, and market shifts.
Enabling capital project transformation
For capital project evaluation, Gustavo positioned the approach as a potential game-changer. By solving complex optimisation problems in minutes rather than days, automatically updating designs, and running thousands of scenario combinations, planners can select plans that meet targets for geotechnical integrity, ESG compliance and financial performance - all while maximising NPV.
The key, he concluded, is not just the individual technologies but the way they are combined. “The value comes from connecting processes and data, ensuring associativity between them, and enabling robust, fast scenario evaluation,” he said.
Change management remains the biggest barrier
While the enabling technology is already available, Gustavo cautioned that the main obstacles to adoption are organisational, not technical.
“The technology is there,” he said. “The barriers are really processes and people - change management. Start small, identify the bottlenecks in your operation, and show value. Once management sees the benefit, it opens the path for more experimentation and adoption.”
Human expertise still central
Asked whether engineers risk being replaced as AI becomes more capable, Gustavo was clear: “AI will help us make decisions, but humans will still need to build models, validate data, and decide on the right course of action. The type of work will change, but there will always be a role for skilled people.”