Re-Imagining the Way We Work: How data can unlock value

Part 2: How data can unlock value

In the first article of this series, I talked about how it’s time for mining to take what works from product lifecycle management — especially viewing what we do as creating a product, from ideation to end-of-life — and then to customise it to suit the complexities of our industry.


And the best way to do that is by employing virtual twin technology, which allows you to visualise your systems as a collection of tiered and linked virtual twins, each with its own data sources, processes, and areas of knowledge. From there, your teams can run comprehensive virtual scenarios that allow them to identify, understand, and digitally resolve problems in the virtual world, and then apply their solutions quickly in the real world.


In this article, I discuss two essentials that will help you get ready to employ your data in a new, more valuable way within a virtual twin experience.


1. Consolidate and give meaning to your data

In the mining industry, data comes in many formats and from many different data sources, often with unique data models. If you can consolidate all your data into one central location and then give that data a common language, you ensure that all your data can be employed to contribute to your knowledge base.


One way to do this is to use a semantic data model to consolidate your data, while also adopting industry standards and dictionaries:

  • Semantic data is data – structured or unstructured and existing in any source – that has been organised to add meaning by creating relationships between the data sources. This organisation allows the same information to be viewed in multiple ways, without human intervention or additional processing.


  • Industry standards and dictionaries, such as GeoSciML, ensure the interoperability of software and hardware systems – meaning that one data source will always speak to another. Short of that, developing your own company-wide dictionaries and best practices can help to align data across the mine.
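To make the idea of semantic relationships concrete, here is a minimal sketch of how facts from different mine data sources – once mapped to a shared dictionary of relation names – can live in one model and be queried from multiple angles. All identifiers (BH-001, AS-17, "Main Pit") are hypothetical examples, not from any real system:

```python
# Minimal sketch of a semantic data model: facts from different data
# sources are stored as (subject, relation, object) triples, so the same
# information can be viewed in multiple ways without reprocessing.
triples = set()

def add_fact(subject, relation, obj):
    triples.add((subject, relation, obj))

# Facts from a drilling database and a lab assay file, once mapped to a
# shared dictionary of relation names, live in the same model.
add_fact("BH-001", "located_in", "Main Pit")
add_fact("BH-001", "has_assay", "AS-17")
add_fact("AS-17", "grade_g_per_t", 2.4)

def query(subject=None, relation=None, obj=None):
    """Return all triples matching the non-None fields."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (relation is None or t[1] == relation)
            and (obj is None or t[2] == obj)]

# View 1: everything known about borehole BH-001.
print(query(subject="BH-001"))
# View 2: all boreholes located in the Main Pit.
print(query(relation="located_in", obj="Main Pit"))
```

The same three facts answer both questions; no extra processing or human mapping is needed once the relations are agreed.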

Here’s one example of how this kind of consolidation can help: Many software packages today work with data stored on a desktop in a design space that is not precisely geo-located to a real-world coordinate reference system. This is fine when the data is used only within that design space. However, if a mine wants to work within a virtual twin – where real-world and design data can be combined on a platform and linked to planning and operational execution – consolidating the data to the precise global location encourages unbroken interaction between humans, software, and hardware. This unbroken interaction results in better use of data, which leads to better information, which leads to more confident decision-making.
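As a simple illustration of that re-basing step, the sketch below moves a desktop design point (drawn on a local grid with its own origin and rotation) onto a shared mine-site coordinate reference system. The origin and rotation values are invented for the example; a real project would use surveyed transformation parameters and a proper CRS library:

```python
import math

# Illustrative only: re-basing local design coordinates onto a shared
# mine-site coordinate reference system (CRS). Values are hypothetical.
LOCAL_ORIGIN_E, LOCAL_ORIGIN_N = 452_300.0, 6_781_450.0  # site CRS metres
LOCAL_ROTATION_DEG = 12.5  # local grid rotated relative to the site grid

def local_to_site(x, y):
    """Rotate a local design point, then translate it to the site CRS."""
    theta = math.radians(LOCAL_ROTATION_DEG)
    east = LOCAL_ORIGIN_E + x * math.cos(theta) - y * math.sin(theta)
    north = LOCAL_ORIGIN_N + x * math.sin(theta) + y * math.cos(theta)
    return east, north

# A pit design vertex drawn at local (100, 50) lands at a unique,
# shareable position in the site CRS:
print(local_to_site(100.0, 50.0))
```

Once every data source reports positions in the same CRS, design, survey, and operational data can be overlaid without manual re-alignment.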

There are several enabling solutions available today that can help with data consolidation. One is GEOVIA’s Collaborative Designer for Surpac role, which connects geologists and engineers from GEOVIA Surpac to Dassault Systèmes’ 3DEXPERIENCE Platform. It centralises data in a unified data structure, uses industry standards and semantic dictionaries to give your data a common language, and applies common coordinate reference systems.


2. Control and manage your data

Mineral exploration and mining companies spend a huge amount of money to define and understand mineral resources, whether to make capital decisions, to reduce geologic uncertainty before a mine is commissioned, or to improve operational efficiency once a mine is active.


As the digital representation of a mine’s biggest asset, the resource model is critical to a mine’s future. That means you need to control access to the model, ensuring that only people authorised to create, update, and view the resource data can get to it.

Version control is critical in other areas of mining as well. How many times have you run into the situation where different people have different versions of the same data on their local machines, USBs, or shared drives? Who owns that data? Does anyone know which is the latest version of the data or which has been approved by management? How much time is wasted by people working on the wrong data at the wrong time?


Additional benefits of implementing and tracking versioning and access control:


  • Traceability and auditability are built in. You will have a clear trail showing when, where, and by whom decisions were made, ensuring confidence in the decision chain and giving you the opportunity to engage in root-cause analysis should challenges or issues arise.


  • Because you have already built links and relationships between data, you will be able to trigger automatic updates whenever approved new data is available or when an update is required.
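The pattern behind those benefits can be sketched in a few lines: a single store keeps every iteration of a dataset, records who changed it and when (the audit trail), and checks authorisation before any write. The roles, users, and dataset names below are hypothetical, and this is a conceptual sketch rather than how any particular platform implements it:

```python
import datetime

# Hedged sketch of versioning + access control + audit trail in one store.
AUTHORISED_EDITORS = {"exploration_geologist"}  # hypothetical role list

class VersionedDataset:
    def __init__(self, name):
        self.name = name
        self.versions = []   # earlier versions are never lost
        self.audit_log = []  # built-in traceability: (when, who, version)

    def commit(self, user, role, data):
        if role not in AUTHORISED_EDITORS:
            raise PermissionError(f"{role} may not edit {self.name}")
        self.versions.append(data)
        self.audit_log.append(
            (datetime.datetime.now(datetime.timezone.utc),
             user, len(self.versions)))

    def latest(self):
        return self.versions[-1]

boreholes = VersionedDataset("exploration_boreholes")
boreholes.commit("ana", "exploration_geologist", {"BH-001": "logged"})
boreholes.commit("ana", "exploration_geologist", {"BH-001": "assayed"})
print(boreholes.latest(), len(boreholes.versions))
```

Because every commit is dated, attributed, and numbered, the questions above – who owns this, which version is current, who approved it – all have recorded answers.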

Once your data is consolidated and using a common language, you need to be able to manage it across teams and across virtual twins to ensure the data is synchronised and that the right people have the right access at the right time to the latest version.


Here’s an example of how sub-surface data can be controlled and managed throughout the product lifecycle inside a GEOVIA Virtual Twin Experience:

  • After you complete a summer drilling campaign to improve your resource definition, you receive the assay sample results from the lab, which means you need to update your borehole database


  • The virtual twin automatically notifies your exploration geologist that the approved geology logs and assay results are now available, and the geologist, who is authorised to make changes:
    • creates a new iteration of the exploration borehole database with the new data incorporated, without losing any earlier versions of that database
    • visually validates the new data, and
    • sends a request through an automated approvals chain to approve the boreholes for use in updating the geology model and resource definition.


  • The geologist’s manager reviews and approves the new database iteration, and the virtual twin automatically:
    • notifies your geology modeler that a new version of the borehole database is available, and
    • generates the new geology model based on previously saved parameters for implicit geology modeling – allowing your geology modeler to focus on analysing the results rather than performing manual updates.


  • After the geology modeler refines the new geology model and the manager again reviews and approves it, automated triggers notify the resource modeler who in turn updates the resource definition – certain that the new definition incorporates the latest approved data.
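The automated chain above is essentially event-driven: an approval emits an event, and every downstream step subscribed to that event fires without anyone chasing emails. A minimal sketch of that trigger pattern, with all event and role names invented for illustration:

```python
# Illustrative event-trigger sketch of the approval chain described above.
subscribers = {}

def on(event, handler):
    """Register a handler to run whenever `event` is emitted."""
    subscribers.setdefault(event, []).append(handler)

def emit(event, payload):
    """Fire all handlers registered for `event`, in order."""
    for handler in subscribers.get(event, []):
        handler(payload)

notifications = []

# Downstream steps subscribe to the approval event (names hypothetical).
on("boreholes_approved",
   lambda v: notifications.append(f"modeller: borehole DB v{v} available"))
on("boreholes_approved",
   lambda v: notifications.append(f"rebuilding geology model from v{v}"))

# The manager approves iteration 2; downstream steps fire automatically.
emit("boreholes_approved", 2)
print(notifications)
```

Each participant in the chain reacts to the previous approval rather than polling for it, which is what keeps the geologist, modeler, and resource modeler working from the latest approved data.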


Next in this series

For the next two articles in the Re-imagining the Way We Work series, I will look at:

  • how mines can apply virtual twin experiences further along the mining value chain, from evaluation to extraction, and
  • what benefits mines can expect to realise from using virtual twin experiences, drawing on case examples.


About the author


@AJ - GEOVIA R&D Portfolio Management Director

Anthony is an experienced Product Management Director with a demonstrated history of working in the natural resources software industry. For the last 10 years he has worked as a consultant, using and teaching software applications and providing guidance on their development. His skills centre on geological modelling, resource evaluation, mine planning, and product management.

