Data is the new gold as miners turn scattered spreadsheets into centralised systems powering smarter decisions and future-ready ESG reporting
When Katrina Garven, Principal Database Consultant at Alias Database Services, reflects on how mining and exploration companies use geological data, she sees an industry undergoing a quiet revolution.
Katrina founded Alias Database Services in 2016, building on years of experience helping companies manage complex geoscientific information. She has since witnessed a profound shift away from scattered spreadsheets and ad hoc processes towards centralised, structured systems that elevate geological data beyond the geology team. Today, that data is treated as a core business asset – feeding into executive decision-making, supporting compliance, and adding measurable long-term value.
“Geological data isn’t just something the geology team worries about anymore,” Katrina explains to The Rock Wrangler. “It’s part of the bigger business picture. It informs investment decisions, keeps companies compliant, and shapes how projects are developed and reported.”
From scattered files to structured systems
In her early years consulting, Katrina often encountered sites where critical drillhole or assay data lived in spreadsheets with little governance or validation. That left room for errors – from missing intervals to mismatched survey collars – which, while minor on the surface, could ripple downstream into flawed models and inaccurate interpretations.
“Good decisions rely on good data – it’s that simple,” she says. “If your geological data is off, everything downstream is compromised. Drill targeting, resource models, mine planning – they all depend on data integrity.”
The industry has since matured, placing greater emphasis on data governance and consistency. Katrina sees clients increasingly recognising that without a validated, centralised geological database, they face fragmentation, inefficiencies, and ultimately reduced trust in the numbers.
Turning standardisation into savings
One example that stands out in Katrina’s consulting career was a project with a large gold miner whose operations each maintained their own database structures and logging codes.
“Each site had different GIM Suite configurations and different field structures,” she recalls. “It was a nightmare for group reporting and system support.”
Katrina helped lead a standardisation project that aligned field names, logging codes, GIM Suite objects and tasks across all sites. The result was transformative. Updates could be made once and rolled out across the group, training became simpler, QA/QC more consistent, and workflows more automated.
“It reduced the support burden significantly,” she says. “But more importantly, it created a foundation for consistency and efficiency across the company. That’s where you see the real cost and time savings.”
Unlocking hidden value in acQuire's GIM Suite
As one of the few GIM Suite Certified Professionals at the highest enterprise level, Katrina also sees how much functionality remains underused in the GIM Suite solution.
“Automation is a big one,” she says. “Too often geologists are bogged down in repetitive admin tasks. With tools like GIM Suite, you can let field staff run QA/QC checks, validations, and approvals directly in a browser, without touching the database backend. It frees them up to do actual geology.”
APIs are another untapped resource. By automating imports such as downhole survey data, companies can cut manual handling, enforce consistency, and ensure data flows seamlessly into models and plans.
“These tools are sitting there ready to use, but many companies don’t realise the value until they see them in action,” she notes.
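As a simple illustration of the kind of import automation Katrina describes, the sketch below shows what a scheduled downhole survey load might do before any data reaches the database: check the file structure, validate each reading, and separate clean rows from rejects. The file layout, column names and tolerance rules are illustrative assumptions, not GIM Suite's actual API or configuration.

```python
# Sketch of an automated downhole survey import step.
# The CSV layout, required columns and tolerance rules are
# illustrative assumptions, not GIM Suite's API or configuration.
import csv
from pathlib import Path

REQUIRED_COLUMNS = {"hole_id", "depth_m", "azimuth", "dip"}


def validate_row(row: dict) -> list:
    """Return a list of problems found in one survey reading."""
    problems = []
    checks = [
        ("depth_m", lambda v: v >= 0, "depth must be >= 0"),
        ("azimuth", lambda v: 0 <= v < 360, "azimuth must be in [0, 360)"),
        ("dip", lambda v: -90 <= v <= 90, "dip must be in [-90, 90]"),
    ]
    for field, ok, message in checks:
        try:
            value = float(row[field])
        except (ValueError, TypeError):
            problems.append(f"{field} is not numeric")
            continue
        if not ok(value):
            problems.append(message)
    return problems


def load_survey_file(path: Path):
    """Read a survey CSV, returning clean rows and a log of rejected lines."""
    clean, rejects = [], []
    with path.open(newline="") as handle:
        reader = csv.DictReader(handle)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            raise ValueError(f"survey file is missing columns: {sorted(missing)}")
        for line_no, row in enumerate(reader, start=2):
            problems = validate_row(row)
            if problems:
                rejects.append(f"line {line_no}: {'; '.join(problems)}")
            else:
                clean.append(row)
    return clean, rejects
```

A scheduled task could then push the clean rows into the central database and send the reject log to the database administrator for review, removing the manual handling Katrina refers to.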
Before and after: mining companies are moving from scattered spreadsheets and disconnected systems to a centralised geological database.
Integration makes the difference
Another theme Katrina returns to is integration. Geological data rarely exists in isolation – it must connect with GIS platforms, modelling software, and planning tools.
“My approach is to create standardised, repeatable export workflows so validated data flows straight into Leapfrog, Vulcan, Micromine, or ArcGIS,” she explains. “The idea is that the data comes out ready to use – no extra clean-up required.”
The benefit is more than just efficiency. It ensures that geologists, resource modellers, and mine planners are all working from the same dataset. Updates cascade across systems, reducing the risk of errors and improving trust in the outputs.
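A repeatable export workflow of this kind can be sketched as a small script that pulls only approved records and writes them out in a fixed schema for downstream packages to consume. The table name, columns and 'approved' status below are assumptions for illustration, not a specific site's configuration or file format.

```python
# Sketch of a standardised, repeatable export: only validated records
# are written, and the output schema is fixed so packages such as
# Leapfrog, Vulcan, Micromine or ArcGIS always receive the same layout.
# The table name, columns and status value are illustrative assumptions.
import sqlite3

import pandas as pd

EXPORT_COLUMNS = ["hole_id", "from_m", "to_m", "au_ppm", "logged_by"]


def export_validated_intervals(db_path: str, out_csv: str) -> int:
    """Write approved assay intervals to a fixed-schema CSV; return the row count."""
    query = """
        SELECT hole_id, from_m, to_m, au_ppm, logged_by
        FROM assay_intervals
        WHERE validation_status = 'approved'
        ORDER BY hole_id, from_m
    """
    with sqlite3.connect(db_path) as conn:
        frame = pd.read_sql_query(query, conn)
    # Re-select the agreed columns so the file layout stays stable
    # even if the query grows later.
    frame[EXPORT_COLUMNS].to_csv(out_csv, index=False)
    return len(frame)
```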
Making legacy data usable
For many miners, the real headache is not today’s data but decades of historical drilling records that may never have been properly validated. Katrina has developed a structured process for rehabilitating those datasets.
“It usually starts with an audit to understand the structure, identify missing fields, and map old codes against modern standards,” she explains. “From there, I use translation tables to bring everything into line, while keeping the original codes stored alongside the new ones for traceability.”
Running the cleaned dataset through QA/QC in a test environment is essential before it goes into production.
“That traceability is key,” she says. “People need to know they can track back to the original records if they have to. It gives them confidence that the historic data is trustworthy and can be integrated with current drilling.”
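The translation-table approach she describes can be sketched in a few lines: legacy codes are mapped to the current standard, and the original code is retained in its own column so every interval can be traced back to the source record. The codes and column names here are invented for illustration only.

```python
# Sketch of the translation-table approach: legacy logging codes are
# mapped to the modern standard while the original code is kept in its
# own column for traceability. All codes and column names are made up.
import pandas as pd

# Historical logging data with inconsistent lithology codes.
legacy = pd.DataFrame({
    "hole_id": ["DH001", "DH001", "DH002"],
    "from_m": [0.0, 4.5, 0.0],
    "to_m": [4.5, 12.0, 6.2],
    "lith_code": ["GRNT", "granite", "BSLT"],
})

# Translation table mapping old codes to the current standard.
translation = pd.DataFrame({
    "old_code": ["GRNT", "granite", "BSLT"],
    "new_code": ["GRA", "GRA", "BAS"],
})

# Keep the original code alongside the standardised one.
cleaned = (
    legacy.rename(columns={"lith_code": "lith_code_original"})
          .merge(translation, left_on="lith_code_original",
                 right_on="old_code", how="left")
          .drop(columns="old_code")
          .rename(columns={"new_code": "lith_code"})
)

# Intervals whose code has no translation yet are flagged for review.
unmatched = cleaned[cleaned["lith_code"].isna()]
print(cleaned)
print(f"{len(unmatched)} intervals need a new translation entry")
```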
QA/QC: part of the workflow, not an afterthought
Katrina stresses that data quality control must be embedded into daily workflows, not treated as an end-of-pipe fix.
“In GIM Suite, you can configure business rules, validations, and QA reports that catch issues as they arise,” she says. “Checks like overlapping intervals can run automatically during data entry, while dashboards can flag missing assays or intervals as part of routine reporting.”
She believes consistency is achieved when QA/QC becomes part of how geologists and field staff work every day, not something left to administrators.
“It’s about giving people the tools, clear procedures, and access they need,” she adds. “Training helps, but the biggest impact comes when QA is just part of the workflow.”
Building capability through training
Katrina has also seen the difference that tailored, hands-on training can make.
“Geologists benefit most from practical sessions that show them how to validate, review, and export the data relevant to their role,” she explains. “Database managers need deeper technical knowledge of configuration and automation.”
Just as importantly, she advocates for ongoing support and reference materials so teams aren’t left struggling after a one-off training session.
“Training should evolve alongside the system,” she says. “That’s how you build confidence and capability.”
What’s next: cloud, automation, and ESG
Looking to the next five years, Katrina sees geological data management becoming even more integrated into the wider business.
“Cloud systems will make it easier for teams across sites to collaborate in real time, while automation will further reduce manual handling and errors,” she predicts.
She also highlights the growing pressure of ESG reporting.
“Companies are going to need their geological databases to capture more than geology – environmental, heritage, and community data will all be managed in the same system,” she says. “Long term, these systems will support not just technical work but also compliance and sustainability reporting in one place.”
The first step: get centralised
For companies still wondering how to maximise the value of their geological data, Katrina’s advice is simple: start with centralisation.
“If your data is scattered across spreadsheets and fragmented systems, you don’t have a foundation you can trust,” she cautions. “The single most important step is to get everything into a centralised, validated geological database. From there, you can build automation, integrations, and analytics – but none of that adds value without reliable, accessible data at the core.”