
Managing Survey Data for Mining Operations: From Capture to Queryable Record

Underground mines, open-cut pits, and processing plants generate enormous volumes of spatial data. Here's how mining operations are moving from scattered Dropbox folders to organised, queryable site records.

Alex Tolson

May 5, 2026

A modern mine generates more spatial data in a month than most engineering firms produce in a year. Daily drone flights over the pit. Weekly LiDAR scans of the high wall. Monthly underground surveys for stope reconciliation. Quarterly aerial captures for stockpile volumes. Annual photogrammetry runs for rehabilitation reporting. Add the inspection scans of plant infrastructure, the 3D models for design changes, and the routine documentation of new development headings, and you are looking at terabytes of LAS, E57, GeoTIFF, MP4, and 3D Tiles data flowing into the operation every quarter.

Where does it all go? In most operations, the honest answer is: a Dropbox folder, a SharePoint site, a project drive on the corporate network, and the laptop of whichever surveyor captured it. Frequently, all four — the same dataset duplicated across multiple stores, with no single source of truth and no way to find anything that’s more than six months old.

This is the reality of spatial data management in mining today, and it is the reality that is starting to change.

What mining operations actually capture

A typical mid-sized open-cut mine has at least four distinct streams of spatial data flowing in:

| Source | Typical formats | Frequency | Volume per capture |
| --- | --- | --- | --- |
| Drone flights (pit, stockpiles, infrastructure) | GeoTIFF orthos, LAS point clouds, MP4 video | Daily to weekly | 2–20 GB |
| Mobile mapping / LiDAR (haul roads, walls) | LAS, LAZ, E57 | Weekly to monthly | 5–50 GB |
| Underground scanning (development, stopes) | LAS, E57, OBJ meshes | Per shift to weekly | 1–10 GB |
| Plant and infrastructure scanning | E57 point clouds, 3D models, photos | As required | 5–30 GB |

Add periodic captures from external contractors — aerial photogrammetry providers, specialist scanning firms, geotechnical consultants — and the inflow can easily exceed 500 GB per month for a single operation.
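That 500 GB figure is easy to sanity-check. The sketch below uses the midpoints of the per-capture ranges from the table and some illustrative capture cadences (the cadences are assumptions for the sake of the estimate, not measured values):

```python
# Back-of-envelope monthly inflow for the four capture streams in the
# table above. GB figures are range midpoints; captures-per-month
# cadences are illustrative assumptions.
streams = [
    # (name, GB per capture (midpoint), captures per month)
    ("drone flights", 11.0, 20),          # roughly daily on weekdays
    ("mobile mapping / LiDAR", 27.5, 2),  # fortnightly
    ("underground scanning", 5.5, 12),    # ~3 scans per week
    ("plant scanning", 17.5, 2),          # as required
]

monthly_gb = sum(gb * n for _, gb, n in streams)
print(f"~{monthly_gb:.0f} GB/month before contractor deliveries")  # ~376
```

Internal capture alone lands in the high hundreds of gigabytes; contractor deliveries push it past the 500 GB mark.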

Most of this data is captured carefully and processed professionally. The breakdown happens at the next step: storing it, organising it, and making it findable for the people who need it later.

Why generic storage fails

Cloud storage platforms — Dropbox, OneDrive, SharePoint, Google Drive — were built for documents. They handle large files acceptably, they have good sync clients, and they have decent permission systems. For PDFs, spreadsheets, CAD drawings, and reports, they are fine.

For spatial data, four problems show up immediately.

No viewer. A LAS file in SharePoint is a file with an unfamiliar extension. Nobody in the organisation can click it and see the point cloud. To use the data, someone has to download it, open it in CloudCompare or LAStools, and produce screenshots — which then get pasted into a PowerPoint that becomes the de facto deliverable. The actual point cloud is seen by maybe two people.

No spatial organisation. The folder structure is whatever the surveyor decided to call things on the day. 2026-04-12_Pit2_Highwall_v2_FINAL_use-this-one.zip is a real filename. Multiply that by three years of captures and the result is a folder you cannot navigate without a guide.

No audit trail. Mining is heavily regulated. When something goes wrong — a wall failure, a haul road incident, an environmental breach — investigators want to know what was known, when, and by whom. Generic file storage tells you who opened the folder. It cannot tell you who looked at which scan, when, and for how long. That gap matters when the question becomes what did the operator know about the wall condition before the failure.

No data residency control. A surprising number of mining operations are storing spatial data in cloud regions that don’t meet the host country’s data residency expectations. Mineral exploration data, in particular, is often subject to country-specific rules. SharePoint and Dropbox don’t give you fine-grained control over which data centre your point clouds are sitting in.

What purpose-built spatial data management looks like

A proper spatial data store for a mining operation looks fundamentally different from a generic cloud drive. It treats the site — a pit, a heading, an asset, a section of haul road — as the primary unit of organisation, and time-indexes every capture against that site.

Site-per-pit (and per stope, per asset)

Every spatial unit on the operation gets its own site in the platform. An open-cut might have one site for the main pit, another for each stockpile area, another for the ROM pad, another for the workshop and admin precinct. An underground operation has sites for major development headings, ventilation infrastructure, and the portal area.

When a surveyor captures a new dataset, they upload it to the correct site. There is no debate about which folder to put it in. There is no risk that a critical scan ends up in Old/Misc/Probably_useful/.
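The organising principle is simple enough to sketch in a few lines. This is an illustrative data model, not Swyvl's actual schema — the class and field names are assumptions chosen to show the idea that a capture belongs to exactly one site:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Capture:
    survey_date: date
    fmt: str          # "LAS", "GeoTIFF", "E57", ...
    uploaded_by: str  # surveyor or contractor

@dataclass
class Site:
    name: str  # "Pit 2", "ROM pad", "Decline A", ...
    captures: list[Capture] = field(default_factory=list)

    def add(self, capture: Capture) -> None:
        # The upload goes straight to the named site --
        # there is no folder hierarchy to argue about.
        self.captures.append(capture)

pit2 = Site("Pit 2")
pit2.add(Capture(date(2026, 4, 12), "LAS", "internal survey"))
```

The site, not the folder, is the unit of organisation; everything else in the workflow hangs off that choice.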

Time-indexed by survey date

Each upload is automatically grouped into a capture session — the natural unit of “everything from this survey on this day.” When the geotechnical engineer needs to compare wall conditions between February and April, they open the site and see the two sessions side by side. The point clouds load in the browser. The orthos display on a map. The comparison is a five-minute job, not a two-day exercise in finding the right files.
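The grouping itself is just a bucket-by-date over the uploads at a site. A minimal sketch, with illustrative field names:

```python
from collections import defaultdict
from datetime import date

def group_into_sessions(uploads: list[dict]) -> dict[date, list[dict]]:
    """Everything captured at a site on one survey date = one session."""
    sessions: dict[date, list[dict]] = defaultdict(list)
    for upload in uploads:
        sessions[upload["survey_date"]].append(upload)
    return dict(sessions)

uploads = [
    {"file": "highwall_north.las", "survey_date": date(2026, 2, 10)},
    {"file": "highwall_ortho.tif", "survey_date": date(2026, 2, 10)},
    {"file": "highwall_north.las", "survey_date": date(2026, 4, 12)},
]
sessions = group_into_sessions(uploads)
# Two sessions -- February and April -- ready for side-by-side comparison.
```

The February-versus-April comparison the engineer wants is then just two keys in that mapping.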

For a deeper look at how this works for survey data generally, see how to deliver drone survey data.

Browser-viewable for everyone

This is the change that makes the others worthwhile. When the mine manager, the geotechnical engineer, the production planner, the environment officer, and the contractor’s project lead can all open a point cloud in their browser without installing anything, the data starts being used. Decisions get made from spatial data instead of from screenshots of spatial data.

Swyvl supports the formats that mining operations actually produce — LAS and LAZ for point clouds, GeoTIFF for orthos and DEMs, MP4 with GPS telemetry for drone video, OBJ and GLB for 3D models, 3D Tiles for massive datasets, and 360° panoramas for plant inspections. All in the browser, on any device.

Regional data residency

For mining operations subject to data residency rules — government-owned land, joint venture obligations, foreign investment review constraints — every site can be assigned to a specific regional data centre. Australian operations stay in Sydney. Canadian operations stay in Toronto. UK operations stay in London. The compliance posture is built into the platform, not bolted on.

Audit trail with IP and geolocation

Every access event is logged with timestamp, IP address, and approximate location. When the question becomes who looked at this scan and when, the answer is in the activity log. When a contractor is granted access to a site for a specific scope of work, the audit trail proves what they accessed and what they did not.
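The shape of such an access event can be sketched as below. The field names are assumptions for illustration, not the platform's actual log schema:

```python
from datetime import datetime, timezone

def log_access(user: str, site: str, asset: str,
               ip: str, approx_location: str) -> dict:
    """Record one access event: who viewed what, from where, and when."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "site": site,
        "asset": asset,  # the specific scan or ortho, not just the folder
        "ip": ip,
        "approx_location": approx_location,
    }

event = log_access("contractor@example.com", "Pit 2",
                   "2026-04-12 highwall scan", "203.0.113.7", "Perth, AU")
```

The asset-level granularity is the point: generic file storage logs the folder open, while an audit trail like this answers "who looked at this scan".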

Multi-contractor delivery

The contractor problem is one of the messier parts of mining spatial data. A single mine might have:

  • Two or three drone service providers covering different parts of the operation
  • A specialist mobile mapping contractor for haul roads
  • A separate firm doing terrestrial LiDAR for highwall monitoring
  • Internal surveyors doing pickup work and stope scans
  • An external photogrammetry provider doing quarterly aerial captures
  • Periodic visits from geotechnical and environmental consultants

Each delivers their data the way they prefer to deliver it — usually by emailing a WeTransfer link or sharing a Dropbox folder. None of them know what the others have captured. The operation ends up with a fragmented record of its own physical state.

A site-based platform inverts this. Each contractor uploads directly to the relevant site. Their captures appear alongside everyone else’s, time-stamped and properly organised. The mine has a single record of what was captured at the pit, regardless of who captured it. The contractor relationships stay the same; only the delivery mechanism changes.
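In access-control terms, this is per-site scoping: each contractor can upload only to the sites inside their scope of work. A minimal sketch under that assumption (the scope table and check are illustrative, not a real API):

```python
# Illustrative per-site contractor scopes.
contractor_scopes: dict[str, set[str]] = {
    "drone-provider-a": {"Pit 2", "Stockpile North"},
    "lidar-firm": {"Haul Road 1"},
}

def can_upload(contractor: str, site: str) -> bool:
    """A contractor may upload only to sites in their scope of work."""
    return site in contractor_scopes.get(contractor, set())

can_upload("lidar-firm", "Haul Road 1")  # True
can_upload("lidar-firm", "Pit 2")        # False
```

Every contractor's captures land in the mine's single site record, but nobody can write outside their scope.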

This also makes contractor handovers easier. When a new drone provider takes over from an outgoing one, they don’t inherit a problem of “where is everything.” They inherit a site record that already contains the historical context, and they start adding to it from day one.

The compliance and reporting layer

Mining regulators increasingly want spatial data, not just spatial reports. Rehabilitation bonds, environmental compliance, mine closure planning, water management, ore reserve audits — every one of these has a spatial data component, and the regulator’s expectation is moving from “show us the report” to “show us the data the report came from.”

A queryable site record makes this easy. The rehabilitation officer can pull up the orthos for the disturbed area, sorted by date, and produce a defensible visual record of progress. The environment team can show the regulator the same imagery the operations team is using. The mine closure planner can audit the cumulative spatial record back to the start of operations.

Without this organisation, every regulatory request becomes an expedition into the file shares to find what was captured when. With it, the answers are minutes away.

Getting started

Most mining operations don’t transition all of their spatial data at once, and they don’t need to. The pattern that works is:

  1. Pick the highest-traffic site first. The main pit, the active development heading, the asset getting the most scrutiny. Set it up and start uploading new captures.
  2. Run two systems in parallel for a quarter. Continue your current storage for legacy data, but route new captures through the site record. Resist the urge to backfill everything; let the active data prove the model first.
  3. Backfill selectively. Once the platform is in active use, bring in the historical captures that are still being referenced. Leave the truly archival data where it is; you can always import it later.
  4. Standardise contractor delivery. Once the internal team is comfortable, give your survey contractors upload access. Their captures land in the right site automatically.

For an asset owner’s perspective on receiving spatial data from multiple contractors, see the asset owners page.

The longer game

Spatial data is the most accurate record a mining operation will ever have of its own physical state. Treated properly, it compounds — every capture adds context, every comparison generates insight, every survey becomes part of the long-term reference for the operation.

Treated as a pile of files in a shared drive, it costs more to store every year and gets harder to use. The decision is not whether the data is valuable — it is, and everyone knows it — but whether the operation is going to invest the small amount of structure required to make that value accessible.

For mines doing this well, the spatial record is starting to look like a strategic asset. For everyone else, it remains a liability sitting on a SharePoint site nobody wants to clean up.

Alex Tolson

Co-founder of Swyvl. Eight years capturing the world in 3D — underground mines, the Great Barrier Reef, and everything in between. Previously co-founded Lateral Vision, a 3D visualization company and Google Street View contractor.

Share spatial data the right way.

Swyvl lets you upload your LAS, GeoTIFF, drone video, and 3D models and share them with clients via a branded portal — no software required on their end.
