Any questions?

We've got answers


COMPANY & PRODUCT

Samp is an industrial software vendor headquartered in Europe, established in 2020, with operations and clients across the globe.

Samp's clients are primarily process industry operators, either owner operators, contract operators or non-operating owners (including public authorities), active in the following sectors: utilities, power & energy (oil & gas), chemicals & industrial gases, maritime & offshore, pulp & paper, agribusiness, food & beverage, pharma & biotech, batteries & semiconductors, mining, metals & minerals.

These organizations open Shared Reality to their staff, supply chain, partners, contractors, and inspection or certification bodies, to ensure that all parties work with a shared context of the physical assets, and that each party can contribute to check, flag or update data. This principle is called "crowdsourcing of data quality", and is the cornerstone of continuous improvement and operational excellence.

Shared Reality is the name of the software sold by Samp. It is an online SaaS, accessible via a web browser for authorized users. Each client organization is served by a fully segregated cloud environment, hosted in the region of their choice.

Built by industry veterans for the industry, Shared Reality is the easiest and most powerful 3D viewer available. Embedded AI and self-service 3D update ensure asset information integrity.

Only authorized personnel can access specific product features (roles), on a site-by-site basis (audiences), and all updates are traced.

EXPLORE is the set of primary capabilities of Shared Reality, where client organizations can leverage AI-generated 3D reality models. These high-fidelity 3D replicas of their facilities are not based on human modeling, but on 3D reality capture performed by third party surveyors, transformed in a few hours into an interactive 3D workspace where each asset is individually segmented.

EXPLORE empowers users to navigate multiple sites around the globe, each as large as a million sqm, with a level of detail as fine as 3mm. This unique scalability is made possible by the built-in Infinistream 3D engine. Users can take 3D measurements (from cm to km, from inches to miles), collaborate through 3D annotations, and insert 3D objects in the scene.

Beyond these useful basic features, EXPLORE offers a unique way to interact with 3D scans. Thanks to automatic segmentation by the 3D Assetizer AI, the 3D reality model of the site is no longer monolithic, but totally interactive: each asset can be selected, hidden, grouped, exported, filtered or colorized.

EXPLORE is particularly useful for different teams to assess a site, prepare engineering or hazard studies, plan work, optimize procurement, optimize training, or resolve claims. All this without having to install software, download large or sensitive files, or undergo complex training.

Because EXPLORE only has value if the 3D reality model is up-to-date, Shared Reality also offers powerful and easy update capabilities. Entire areas can be replaced after major modifications once a surveyor has recaptured them, or individual pieces of equipment can be updated from a simple smartphone/tablet scan after a crew member has modified them.

When clients need to go beyond the basic capabilities offered by EXPLORE, they can ENRICH any asset of the 3D reality model with 1D information within Shared Reality.

ENRICH can be performed in Shared Reality, by tagging 3D assets to easily retrieve them with the search bar, or by adding metadata such as equipment attributes/properties to leverage the conditional coloring options of the 3D Business Intelligence feature.

ENRICH can also be performed in Shared Reality by linking 3D assets with the tag identifier of an existing system of record (accessed via file exchange or API), such as an ERP, EAM, CMMS and more. Such linking is a 2-click process (click the 3D asset, then click the tag identifier). Once linked to an external tag identifier, the 3D asset immediately inherits the metadata of the tag from the external system. It then becomes possible to access the asset attributes/properties from Shared Reality, to search for the asset with the search bar, or to leverage the conditional coloring options of the 3D Business Intelligence feature.
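
To picture how this inheritance behaves, here is a minimal conceptual sketch in Python. It is an illustration only: the class, field and tag names are hypothetical and do not represent Samp's actual data model or API; it simply shows that once a 3D asset is linked to an external tag identifier, attribute lookups resolve against the external record.

# Conceptual sketch of ENRICH linking (hypothetical names, not Samp's actual data model).
from dataclasses import dataclass, field
from typing import Optional

# Stand-in for an extract from an external system of record (ERP/EAM/CMMS).
EXTERNAL_RECORDS = {
    "P-4711": {"description": "Cooling water pump", "manufacturer": "ACME", "power_kw": 15},
}

@dataclass
class Asset3D:
    asset_id: str                      # internal ID of the segmented 3D asset
    linked_tag: Optional[str] = None   # external tag identifier, set by the 2-click link
    local_metadata: dict = field(default_factory=dict)

    def attributes(self) -> dict:
        """Local metadata plus everything inherited from the linked external tag."""
        inherited = EXTERNAL_RECORDS.get(self.linked_tag, {}) if self.linked_tag else {}
        return {**inherited, **self.local_metadata}

pump = Asset3D(asset_id="scan-000123")
pump.linked_tag = "P-4711"             # click the 3D asset, then click the tag identifier
print(pump.attributes())               # inherited attributes become searchable and colorable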

ENRICH serves advanced use cases in QHSE, asset management, permitting, inspection, and access to external IT/OT systems data (and vice-versa, external systems can access the Shared Reality 3D assets).

ENRICH is performed on an as-needed basis, and can be distributed across teams (various disciplines, staff or contractors) and over time (sequentially or concurrently).

ENRICH can be combined with UNIFY if needed.

When clients need to go beyond the basic capabilities offered by EXPLORE, they can UNIFY any asset of the 3D reality model with 2D information within Shared Reality.

UNIFY can be performed in Shared Reality, by linking 3D assets with symbols or lines in flowsheets (P&ID, PFD, isometric, single line diagrams and more, accessed via file exchange or API) in order to combine a geospatial and functional view of the facilities at the asset level.

Linking is a 2-click process (click the 3D asset then click the symbol in the flowsheet). In piping-intensive environments, linking can be assisted by Shared Reality Copilot, an AI guiding the human users along the piping system, suggesting piping sections and process equipment to be linked across 3D assets and 2D symbols.

UNIFY is made possible by the 2D Assetizer AI, which turns monolithic flowsheet images (such as PDF, SVG, PNG) into interactive diagrams where each line and asset is selectable in an interactive overlay.

Whenever discrepancies exist between the up-to-date 3D reality model and the (generally) outdated or partial flowsheets, the user can rely on the redlining and markup capabilities of Shared Reality to re-align the diagram contents with the reality of the field.

UNIFY serves advanced use cases in P&ID revalidation, equipment isolation (lock-out, tag-out), tests & commissioning, certification.

UNIFY is performed on an as-needed basis, and can be distributed across teams (various disciplines, staff or contractors) and over time (sequentially or concurrently).

UNIFY can be combined with ENRICH if needed. In this case, the identifiers extracted from the 2D flowsheets can be automatically associated with the 1D equipment list identifiers and their metadata, if the identifiers are sufficiently similar across 2D and 1D.


REALITY CAPTURE

Reality capture is the process by which the dimensions of existing physical assets are captured, with various devices depending on the environment. Those devices can be static, mounted on a monopod/tripod, or mobile, mounted on a human/robot/drone. The reality capture technology itself ranges from LiDAR (lasergrammetry) and imagery (photogrammetry/videogrammetry) to magnetic or radar sensing (for buried assets).

Surveyors select or combine techniques depending on the configuration of the installation and the intended use of the data. Shared Reality is agnostic of the reality capture method and will work with any of the technologies mentioned above.

Industrial operators rely on their surveying contractors to perform the site reality capture. Some of them have in-house surveying teams. For those who have neither experience nor established contractors, Samp can recommend a local reality capture partner and provide surveying specifications.

What is displayed in Shared Reality is a 3D reality model. Compared to the raw monolithic 3D scan uploaded by the surveyor, in the 3D reality model each piece of piping and process equipment is separated from the others as an individual asset and becomes interactive. The level of detail is the same as in the original 3D scan.

There is no need to perform a Computer Aided Design (CAD) remodeling of the facility, also known as a 3D CAD model. This saves a lot of time and money, while eliminating the risk of human modeling error. 3D CAD models are now only used to design future modifications (hence the “D” for “Design” in CAD).

The 3D reality model's level of detail and precision will directly depend on the device(s) used during the reality capture, the method applied by the surveying & postprocessing team, as well as the environmental conditions (e.g. humidity).

Generally, models used in Shared Reality are centimetric grade: equipment details range between 3mm and 5mm, absolute precision at full site scale ranges between 10mm and 50mm.

Shared Reality was designed for high scalability from the ground up, and is hosted by Amazon Web Services, one of the largest IT infrastructure providers.

Whether for storage, computation, caching, display or parallel use, Shared Reality can shoulder the load, even if your organization operates hundreds of large sites staffed with thousands of users.

Among our clients' hundreds of sites, several each span 1,000+ acres / 500+ hectares, with millions of assets.

INPUT DATA

The minimum needed is a 3D point cloud. This is the result of the 3D reality capture process. Formats accepted are industry standards and vendor neutral: E57, LAS, LAZ. They can be structured or unstructured, georeferenced or not, and may also contain 360° panoramic images. Each file can be up to 200 GB in size, and several files can be used for each site.
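
Before uploading, surveyors or data managers may want a quick pre-flight check of their deliverables. The sketch below is a plain-Python illustration (the folder name is an example, and this is not a Samp tool): it only verifies that each file uses one of the accepted formats and stays under the 200 GB per-file limit.

# Illustrative pre-upload check of point-cloud deliverables (standard library only).
from pathlib import Path

SUPPORTED = {".e57", ".las", ".laz"}      # vendor-neutral formats accepted as input
MAX_BYTES = 200 * 1024**3                 # 200 GB per-file limit

for f in sorted(Path("scans/site_A").iterdir()):   # example folder of survey files
    if f.suffix.lower() not in SUPPORTED:
        print(f"SKIP  {f.name}: unsupported format")
    elif f.stat().st_size > MAX_BYTES:
        print(f"SPLIT {f.name}: exceeds the 200 GB limit, deliver as several files")
    else:
        print(f"OK    {f.name}: ready to upload")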

Other information can be used as optional inputs, as files or via APIs:

  • equipment list (as table or spreadsheet extracted from the ERP/EAM/CMMS)
  • technical diagrams, such as PFD, P&ID, isometric, electrical diagrams, flowsheets, etc.

Shared Reality can be used without any pre-existing equipment list.

Shared Reality’s 3D Assetizer AI will generate an equipment inventory, which can be used as a basis to create an equipment list, an MTO list, or a BOM.

Equipment can be tagged within the Shared Reality workspace, to assign unique equipment identifiers (e.g. “Tag numbers”), or for the initial loading of an ERP/EAM/CMMS. Tagged equipment can be retrieved instantly anywhere on the site by typing at least 3 alphanumeric characters of the tag identifier in the search bar.

Once tagged, assets can be enriched with metadata via user-defined forms. All equipment properties/attributes filled in the form can then be exported for loading in an ERP/EAM/CMMS system.

Once enriched with metadata, equipment can be colorized based on conditional formatting rules selected by the users with the 3D Business Intelligence feature.

Use the equipment list that makes the most sense for your teams' daily work. It can be a flat list but is preferably organized with one or two sub-levels. Subfolders can be organized by system, area, or equipment class, depending on your organization's practices.

If Shared Reality is used without connection to your IT/OT, the list is extracted as a table (CSV) or spreadsheet (XLS, XLSX) from the ERP/EAM/CMMS.

For each asset, it can contain attributes/properties metadata, and even hyperlinks to existing systems of record (e.g. ERP, EAM, CMMS, EDMS, DAHS, SCADA). This asset information will be displayed in the equipment information box when using Shared Reality.
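
For illustration only, the short Python sketch below writes an equipment list in that spirit; the column names, sub-levels and hyperlink field are assumptions to be adapted to your own ERP/EAM/CMMS export, not a required schema.

# Illustrative equipment-list export (column names are examples, not a required schema).
import csv

rows = [
    ("Cooling", "Unit 10", "P-4711", "Pump",  "Cooling water pump",   "https://eam.example.com/P-4711"),
    ("Cooling", "Unit 10", "V-4712", "Valve", "Pump discharge valve", "https://eam.example.com/V-4712"),
]

with open("equipment_list.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["System", "Area", "Tag", "Class", "Description", "Hyperlink"])  # two sub-levels
    writer.writerows(rows)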

If Shared Reality is used with connection to your IT/OT, a webservice (accessible via Samp API) will directly display your existing equipment list, and all the associated information.

Shared Reality can be used without any pre-existing technical drawings or flowsheets.

Many brownfield facilities do not have any drawings or flowsheets. It is possible to redesign them easily by leveraging Shared Reality alongside a drawing authoring tool, ticking off each asset as it is captured in the authoring tool. By doing so, the engineering hours needed to reverse-engineer P&IDs are slashed compared with traditional approaches, where an engineering company is tasked with producing those deliverables through extensive onsite walkdowns.

Note that for simple sites, it is possible to author such flowsheets directly in Shared Reality, using the built-in redlining and markup features on a blank page, then exporting the drawing as an image.

Any type of technical drawing can be used for linking or verification against 3D, such as PFD, P&ID, isometric, electrical diagrams, flowsheets.

The main open file formats are supported, whether they are vector/raster images (SVG, PNG), PDF documents (single/multiple pages) or DEXPI (depending on the version).

Absolutely not, quite the opposite: there is no need to clean up or complete your existing data before loading it into Shared Reality.

Shared Reality is designed to be a continuous and distributed visual data cleansing workspace. Most of our clients load their existing (partial and outdated) data as-is, then leverage the power of up-to-date 3D to enrich / link / filter / colorize / redline their data on the go, on an as-needed basis.

This approach ensures affordable and pragmatic data verification & update of existing records. This enables a continuous improvement loop, with empowered and engaged teams, whatever their discipline and seniority level.

Implementing Shared Reality early also saves A LOT of time and money on asset data or document upgrade projects, such as EAM deployment or P&ID creation/update.

CYBERSECURITY

Two main methods exist to connect existing technical data with Shared Reality. They are not mutually exclusive.

Most industrial operators start with manual file loading, then evolve to automated file loading. Some clients choose to move towards webservices, generally read-only, and sometimes move to bidirectional (read/write) webservices for specific types of data.

A/ FILE-BASED (push model)

If Shared Reality is used without connection to your systems, a copy of the equipment list (table or spreadsheet) and drawings/schematics (PDF or image files) must be loaded into the workspace via the upload interface. Those files can be updated as often as needed, resulting in an update of the attributes/properties metadata displayed in the equipment information box. Periodic file loading/updating can be done manually or automated, at whatever frequency suits you (monthly, weekly, nightly…).

PROS: simple to implement, no impact on existing systems, used during the pilot project.

CONS: does not provide fast information refresh rates.
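
For teams that automate the push model, the workflow often boils down to a small scheduled script along these lines. The endpoint, header and credential names below are placeholders (the actual upload interface or API details are provided during onboarding), so treat this as a sketch of the pattern rather than a working integration.

# Hypothetical nightly push of the latest equipment list (URL, token and field names are placeholders).
import os
import requests

UPLOAD_URL = "https://your-instance.example.com/api/uploads"   # placeholder, not a real endpoint
API_TOKEN = os.environ["SHARED_REALITY_TOKEN"]                 # placeholder credential

with open("equipment_list.csv", "rb") as f:
    resp = requests.post(
        UPLOAD_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        files={"file": ("equipment_list.csv", f, "text/csv")},
        timeout=60,
    )
resp.raise_for_status()
print("Upload accepted:", resp.status_code)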

B/ WEBSERVICE-BASED (pull model)

If Shared Reality is connected to your systems, webservices can directly query your systems of record to dynamically read the equipment list/properties and drawings. They can also query the attributes/properties to be displayed in the equipment information box, including “hot data” from real-time monitoring systems / SCADA / data historians. These connectors can be one-way or bi-directional, depending on the client’s needs and constraints.

PROS: removes the need to load/duplicate data, fast information refresh rates, can be bi-directional if needed.

CONS: requires longer IT & cybersecurity work, typically studied during the pilot project (part of the scope).
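
On the client side, the pull model usually means exposing a read-only webservice that Shared Reality can query. The sketch below uses Flask purely as an illustration; the route and payload shape are assumptions and would be replaced by the connector contract agreed during the pilot.

# Illustrative read-only equipment webservice for the pull model (route and payload are assumptions).
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for a live query against the real ERP/EAM/CMMS or data historian.
EQUIPMENT = {"P-4711": {"description": "Cooling water pump", "status": "In service"}}

@app.route("/equipment/<tag>")
def equipment(tag: str):
    record = EQUIPMENT.get(tag)
    if record is None:
        abort(404)
    return jsonify(record)   # attributes to be displayed in the equipment information box

if __name__ == "__main__":
    app.run(port=8080)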

NOTE: it is possible to easily navigate from/to Shared Reality with hyperlinks:

  • from Shared Reality: with hyperlinks embedded in equipment infoboxes, enabling the users to directly open existing IT/OT systems on the selected asset.
  • to Shared Reality: with hyperlinks embedded in existing systems or shared by emails, authorized users can directly access a specific piece of equipment or location (e.g. a weld) in a given site.

Your data remains your property.

A copy of your 3D data always needs to be hosted by Samp, as it is transformed into a 3D streaming format for instant access over the web, without the need to download large or sensitive 3D files.

When a copy of other data needs to be hosted by Samp (if Shared Reality is not yet connected to your IT/OT systems), it will be exclusively hosted inside your company’s instance of Shared Reality.

Each client organization has a fully segregated Shared Reality environment hosted by Amazon Web Services, in the region of their choice:

  • us-east-1 - US East (N. Virginia)
  • ca-central-1 - Canada (Central)
  • eu-central-1 - Europe (Frankfurt)
  • eu-west-1 - Europe (Ireland)
  • eu-west-2 - Europe (London)
  • eu-west-3 - Europe (Paris)
  • ap-southeast-2 - Asia Pacific (Sydney)
  • ap-southeast-1 - Asia Pacific (Singapore)
  • ap-northeast-2 - Asia Pacific (Seoul)
  • me-south-1 - Middle East (Bahrain)
  • mx-central-1 - Mexico (Central)
  • sa-east-1 - South America (São Paulo)
  • me-central-1 - Middle East (UAE)

(this list is updated regularly and is not binding)

Your data remains your property.

All data hosted by Samp is a copy of your original data. Your original data remains stored in your systems of record. If specific temporary data was created in Shared Reality (e.g. annotations, inventories, redlining) it can be exported in an open CSV format for separate use.

All segmented 3D scans can be exported in an open E57 format, each asset as a separate georeferenced file bearing the name of the tagged assets.

All our clients are operators of highly sensitive critical infrastructure, or their suppliers. As such, they conduct in-depth security audits, which are part of the onboarding process.

All our clients' data remains within Samp's systems, whether for storage, processing or display.

Samp's system architecture is deployed as code (infrastructure as code), which can be fully audited. Our clients can select the location of their data hosting on Amazon Web Services.

Samp recommends using your corporate Single Sign-On (SSO) system, to ensure a seamless experience for your teams as well as the highest level of access control. Users' permissions are managed within Shared Reality's administration environment along two different dimensions: roles and audiences.

Roles define which level of features a user can access (e.g. read only, annotate, edit, upload, download, admin…). Audiences define which site or which group of sites a user can access, as not all users may need to access all sites. The roles x audiences combination ensures granular user access management.
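
Conceptually, the roles x audiences combination behaves like the short sketch below; the user names, role levels and site names are illustrative and do not reflect the product's actual administration schema.

# Conceptual roles x audiences check (names and role levels are illustrative only).
ROLES = {"alice": "edit", "bob": "read_only"}                    # what each user may do
AUDIENCES = {"alice": {"site_A"}, "bob": {"site_A", "site_B"}}   # which sites each user may access
ACTIONS_BY_ROLE = {"read_only": {"view"}, "edit": {"view", "annotate", "edit"}}

def can(user: str, action: str, site: str) -> bool:
    """Allow an action only if the site is in the user's audience and the role permits it."""
    in_audience = site in AUDIENCES.get(user, set())
    role_allows = action in ACTIONS_BY_ROLE.get(ROLES.get(user, ""), set())
    return in_audience and role_allows

print(can("alice", "edit", "site_A"))   # True
print(can("bob", "edit", "site_B"))     # False: read-only role
print(can("alice", "view", "site_B"))   # False: site not in audience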

AI & TRUST

Shared Reality’s AI can perform several tasks.

3D AI

Shared Reality's 3D Assetizer AI detects process equipment (e.g. pipes and any asset attached to them, such as valves, pumps, tanks, instruments…) and groups objects individually in the scene. It suggests an equipment class proposal for each object (e.g. pipe, valve, pump), enabling class colorization for easier navigation.

It also detects and groups the floor and the main structures, such as walls, roofs, ceilings, etc. for easier navigation (hide/show/group structures). Overall, the detection rate is 80%+ for the common classes managed by the 3D Assetizer AI.

2D AI

Shared Reality's 2D Assetizer AI will overlay selectable lines and boxes on the system diagrams (P&ID, PFD, isometrics, single line diagrams), so that they can be selected and linked to the corresponding 3D asset, or to the corresponding tag in the 1D equipment list.

In piping-intensive environments, after manual seeding by the user, Shared Reality's Copilot AI can follow the flowsheet lines and suggest the corresponding 2D-3D pair of assets to be linked. The user always remains in control and can accept/correct/skip the proposal.

1D AI

Shared Reality will automatically associate 1D tags from your existing asset register, along with the associated metadata, with the corresponding symbols in the flowsheets, if the asset tag identifiers are similar (with a tolerance for '-', '_' or space characters). If the tag names are not the same, or if the diagrams do not bear any tag names, the 1D-to-2D linking is manual (2 clicks).
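
The tolerance rule can be pictured with the short sketch below. It is an assumption of how such a comparison could be implemented, not Samp's exact algorithm: identifiers are compared case-insensitively after stripping '-', '_' and spaces.

# Illustrative tag comparison with tolerance for '-', '_' and spaces (not the exact product logic).
import re

def normalize(tag: str) -> str:
    """Uppercase and drop '-', '_' and whitespace so near-identical tags compare equal."""
    return re.sub(r"[-_\s]", "", tag).upper()

def same_tag(register_tag: str, flowsheet_tag: str) -> bool:
    return normalize(register_tag) == normalize(flowsheet_tag)

print(same_tag("P-4711 A", "P_4711a"))   # True  -> associated automatically
print(same_tag("P-4711", "P-4712"))      # False -> manual 2-click link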

SUMMING IT UP

The 3D-to-2D linking is manual (2 clicks), and after initial seeding can be assisted by Shared Reality Copilot in piping-intensive environments, still requiring human confirmation of the Copilot proposals for obvious safety reasons. The 2D-to-1D linking is automated provided that the asset tag identifiers are similar, otherwise it is manual (2 clicks). When 3D-to-1D linking is required, without 2D as intermediate media, the linking is manual (2 clicks). Linking capabilities are accessible to authorized users only.

The AI proposals can be visually checked directly in Shared Reality (various colorization modes are available for that purpose), and modified as needed in the workspace, giving you full control over the final outcome.

3D: objects can be grouped together (e.g. pipe sections into a full line) or ungrouped (e.g. the pump motor from the pump head). The equipment class proposals can be checked thanks to color coding and easily modified asset by asset or in batch.

2D: the interactive boxes around assets, or interactive lines over piping, can be edited, moved, resized, deleted, or added, asset by asset or in batch.

3D-2D links, 3D-1D links, 3D-2D-1D links can be checked with color coding and modified as needed directly in the workspace (for authorized users).

AI training and AI execution are two different stages in the AI lifecycle, executed on different data and in completely segregated environments.

Shared Reality’s AI is trained in Samp's own environment. It learns to detect and segment process industry equipment (pumps, valves, pipes etc.) both in 3D point clouds and in 2D flowsheets.

Once this training is completed with Samp’s generic training data, the trained AI is deployed in Shared Reality and is able to perform this segmentation on customers’ private data, in their dedicated production environment.

Note that some clients agree to provide some of their anonymized data to improve Shared Reality’s AI performance, e.g. to detect new classes. Such an agreement is not mandatory and is governed by strict contractual terms and technical methods.

Shared Reality detects piping (including insulated piping and piping down to NPS 3/4" or DN20) as well as 20+ classes of process equipment representing 80%+ of the assets commonly found in a process industry facility. Those classes are a subset of the CFIHOS / ISO 15926 standard.

This taxonomy can be replaced by one bearing client-specific names and subclasses, generally more detailed. This allows users to manually go deeper in the classification of objects if needed. Some of the client subclasses may then enter Samp's training dataset after anonymization, provided that the contract allows it.

The processing time will depend on the site size and equipment density.

After the raw 3D scan is loaded into Shared Reality, a site is generally visible and ready to be used in the workspace within a few hours. For very large sites, processing may take more than a full day; for very small sites, it may take less than an hour.

DATA ENRICHMENT

No, usually 3D is enriched or linked on an as-needed basis only, as projects or maintenance operations progress. Most clients start with interactive 3D only, which already delivers a lot of value by supporting many use cases.

Then, when necessary, different teams can, at different times, enrich selected 3D assets or specific systems with the associated 1D information, or link them with the associated 2D symbols.

Whenever a discrepancy or issue is found onsite or in technical information, it can be reported by users, or the association corrected. This principle is called "crowdsourcing of data quality" and is the cornerstone of continuous improvement and operational excellence.

There is a good chance that you are already paying people to do this.

Most modification or maintenance projects include an as-built verification stage before studies begin, or after the project is completed. These verifications are generally conducted by checking spreadsheets or drawings during plant walkdowns.

With Shared Reality, it is possible to conduct these list or flowsheet verification walkdowns virtually, much faster, and out of harm's way. The result is enriched and verified 3D assets. Moreover, the result of these verifications is retained and shared by all, instead of being locked away in inaccessible files.

This work can be distributed across teams of different disciplines (e.g. piping & process, turbines, HSE…) and over time (e.g. teams involved sequentially or concurrently).

Enriching 3D assets or systems with 1D or 2D data can be done either by your staff or by your contractors, or jointly by both, on complementary types of assets.

When using your staff, such work can be carried out by junior profiles or trainees/interns, coached by a more senior employee with site knowledge. This affordable setup has been successfully used many times by our clients.

When relying on contractors, this work can either be part of a maintenance or modification project (procurement requiring contractors to perform all pre-work / post-work verification in Shared Reality), or part of a specific data cleansing project (e.g. P&ID update). In all cases, the engineering hours are reduced compared with the standard practices requiring extensive onsite walkdowns. This means that the cost of enriching/linking is negative, while ensuring that the result is capitalized and shared by all.

All setups can be used in a complementary manner, either sequentially or concurrently.

The process is very fast (2 clicks) and can be parallelized across teams. The typical work rate is several hundred assets per day per person, depending on the site configuration. A 4-person team can typically process around 10,000 assets per week or more.

It is important to note that not all assets need to be enriched/linked, and that this work can be performed by system, by unit, by project, or in any other suitable progressive manner, driven by the end-users' needs.

CONTINUOUS UPDATE

Yes, Shared Reality is designed so that authorized users can update 3D data, technical diagrams or equipment lists themselves, simply by uploading new versions in the workspace.

If you use Shared Reality with an API integration to your IT/OT systems, this update process can be automated, without the need to manually upload or duplicate data.

If a full area of your site was modified and re-scanned, you will re-upload the full area and reprocess it as new. You may reuse previously existing object links and focus only on new/changed equipment. This process is recommended for contracted reality capture (third-party surveyors, often relying on LiDAR scanners).

If a few pieces of equipment were modified and re-scanned, you will re-upload the local scan and replace the previous equipment. This process is recommended for in-house staff (e.g. maintenance, construction, inspection teams, often relying on photogrammetry from smartphone/tablet).

In both cases, only authorized personnel can perform the publication of these updates, and all updates are traced.

It is possible to export any (multi) selection of asset(s) as E57 files, with a chosen density level (3mm, 2cm, 10cm). Each asset is exported as an individual georeferenced file, bearing the name of its tag identifier if any. All E57 files are contained in a ZIP file.

It is therefore possible to export a single system (e.g. fire suppression system), or only part of a unit (e.g. the civil structure of a building, without any systems inside) at variable densities depending on the intended use.
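
Downstream, the exported ZIP can be processed with standard tooling; the sketch below simply lists the per-asset E57 files it contains (the archive name is an example).

# List the per-asset, georeferenced E57 files in an exported selection (archive name is an example).
import zipfile

with zipfile.ZipFile("export_fire_suppression_3mm.zip") as z:
    e57_files = [i for i in z.infolist() if i.filename.lower().endswith(".e57")]
    for info in e57_files:
        print(f"{info.filename:<40} {info.file_size / 1e6:8.1f} MB")
    print(f"{len(e57_files)} assets exported")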

If a full area of your site was modified and the corresponding P&ID updated, you will re-upload (or APIs will display) the corresponding P&ID and either reprocess it as new, or reuse previously existing object links and focus only on new/changed equipment. This process is recommended for process engineers.

If a few pieces of equipment were modified, you will be able to redline/markup on the P&ID directly in the Shared Reality workspace, until you need process engineers to update it in the native CAD system. This process is recommended for in-house staff (e.g. maintenance, construction, inspection teams) and contractors.

In both cases, only authorized personnel can perform the publication of these updates, and all updates are traced.

OUR DIFFERENCE

EXPLORE

Shared Reality allows users to EXPLORE your facilities' 3D reality capture online and navigate easily thanks to 3D streaming, without the need to download any files or install any software. Taking 3D measurements, collaborating through 3D annotations and inserting 3D objects are the foundational capabilities powered by our proprietary Infinistream visualization engine, designed for scale (it handles million-sqm class sites).

Unlike other 3D viewers displaying a monolithic scene, Shared Reality's 3D reality models feature selectable assets automatically segmented by AI, so you can easily navigate and interact within buildings, systems or equipment in the scene: multiselect any assets or system to hide/show/group/export them, filter or colorize based on equipment classes.

ENRICH

Shared Reality goes beyond 3D alone and, when needed, allows users to ENRICH assets with 1D tags, hierarchies or metadata, either created within Shared Reality or linked to external systems of record (ERP, EAM, CMMS, SCADA...). Such information enables users to perform instant equipment search, properties/attributes retrieval, interaction with external systems of record, or 3D business intelligence (colorization & filtering based on metadata).

UNIFY

If needed, Shared Reality also allows users to UNIFY the geospatial 3D view with the functional 2D process view. The online workspace lets users link 3D assets with specific locations, lines or symbols in 2D flowsheets, in order to check, share and navigate in a dual context. Such combined views are highly effective in supporting the most demanding workflows in the process industry, such as equipment isolation (lock-out tag-out), testing & commissioning, or P&ID revalidation.

NOTE: only authorized personnel can access the ENRICH & UNIFY capabilities, link publication is controlled, and all changes are traced.

Shared Reality is not a Computer Aided Design software system. CAD systems are good for designing new facilities (hence the 'D' of CAD) or for modeling modifications.

Shared Reality is used to rapidly generate a completely up-to-date and fully interactive 3D context for existing sites, which most of the time have no CAD data.

It does not require the installation of any software, nor the download of any files, thanks to 3D streaming. Shared Reality runs on midmarket hardware.

Beyond generating an interactive 3D reality model out of raw 3D reality capture, if needed Shared Reality can also support 1D attributes and 2D flowsheets:

  • 1D equipment lists can be generated by AI from the 3D reality model and enriched with metadata by users, or reused from existing IT systems (ERP, EAM, CMMS...)
  • 2D technical diagrams can be scanned from existing paper or microfilm records and made interactive by AI, or reused from existing digital P&ID

Unlike a CAD system, Shared Reality can be used by non-specialists and requires only minutes of discovery with the built-in guided tour.

It is possible to load and view 3D objects within Shared Reality, whether they come from a 3D CAD system or from online equipment libraries (e.g. Traceparts).

Shared Reality supports a variety of 3D input formats (e.g. GLB, GLTF, OBJ, IFC…), and third-party software may be used to convert other formats into a supported one.
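
As one example of such a third-party conversion, and assuming the open-source trimesh library (our choice for illustration, not a tool mandated or provided by Shared Reality), an STL file could be converted to GLB in a few lines:

# Example conversion of an unsupported mesh format (STL) to GLB using trimesh (illustrative only).
import trimesh

mesh = trimesh.load("new_pump.stl")   # hypothetical vendor-supplied model
mesh.export("new_pump.glb")           # the GLB can then be loaded and positioned in the 3D scene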

Shared Reality also allows users to position and name simple 3D parametric shapes in the workspace:

  • boxes: for space reservation
  • spheres: for exclusion zones simulation
  • trapezoids: for trench excavation simulation
  • cylinders: for cranes work radius or tank placement simulation

Shared Reality is designed to be the link between the reality of the field and these digital systems.

It can be connected progressively to some or all of your systems of record, depending on the use cases to be covered.

During a project, Shared Reality can interact with BIM/CDE systems. During operations it can interact with ERP, EAM, CMMS, EDMS, DAHS, SCADA systems.

Shared Reality increases the use of existing systems, making access to technical information easier and faster.

As you replace your legacy systems with new ones, end-users will notice no difference within Shared Reality.

Yes. Since industrial operators generally do not have reliable CAD models of their facilities (incomplete or obsolete models, costly or complex CAD software to open/edit), and most of the time do not have any CAD models at all, relying on 3D reality models generated and maintained with Shared Reality is a simpler, faster, and much more affordable alternative.

Industrial operators who need “BIM for operations” can easily generate, distribute, and update 3D reality models and, if necessary, attach technical information to any asset, leveraging their existing EAM or CMMS systems. Smaller operators who do not have such information systems can enter data directly into Shared Reality or import it from spreadsheets.

In short, Shared Reality is designed to fully support “BIM for operations,” providing an easy-to-use, high-fidelity 3D workspace for everyone without having to rely on CAD models. This avoids the high costs associated with CAD remodeling, which would neither be maintained nor used downstream in the facility's life cycle.

USE CASES & ROI

Our clients' main use cases revolve around all stages of their existing facilities' lifecycle. The three major types of use cases, and the associated uses of Shared Reality, are:

  • brownfield projects (site modification or expansion): pre-project studies, endorsement, engineering & safety studies, services procurement, lock-out tag-out, work preparation, tests & commissioning, handover to operations
  • contract operations: site assessment, bid preparation & bid presentation, asset management, teams training, inspection & maintenance, lock-out tag-out, information transfer to owner
  • operation & maintenance: equipment inventory, risk-based asset management, investment planning, services procurement, teams training, inspection & maintenance, lock-out tag-out, compliance

When Shared Reality is connected to existing IT/OT systems, it is also used as an easier & faster gateway to information, as teams always prefer to retrieve data from a visual representation, whether it is via the 3D reality model or technical diagrams.

The ROI of Shared Reality will depend on the client use case, industry and organization. It is generally extremely fast and can be assessed with the ROI calculator provided during the pilot project.

This ROI calculator lists typical use cases for 6 different domains (QHSE, engineering, project, procurement, site work, operations & maintenance), as well as observed gains and costs.

GETTING STARTED

We offer a pilot subscription package for up to 20,000 sqm (200,000 sqft), which can be up and running overnight and allows your teams to fully discover Shared Reality over a 12-month period.

Beyond the technical assessment, during this period, our team will provide you with the use cases guide and ROI calculator, and if needed will participate in architecture and cybersecurity reviews.

Yes, we do have reference customers who are willing to talk to their peers, share their experience, and explain how they benefit from Shared Reality on a daily basis.

Customer reference calls/visits take place during or after a pilot.

Shared Reality is a SaaS, with a subscription model based on tokens (a worked sizing example follows the list):

  • 1 token = 1 sqm/year (or 10 sqft/year)
  • Minimum package: 20,000 tokens
  • Discounted pricing tiers start for packages over 20,000 tokens
  • Extra discount for 3-year commitments
  • Tokens are non-refundable (considered used at the end of the year)
  • Tokens can be reallocated between projects/sites
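
As a worked sizing example (the site footprint and commitment below are hypothetical, and prices are deliberately not modeled):

# Worked token-sizing example (site area and commitment are hypothetical; pricing is not modeled).
SQM_PER_TOKEN = 1        # 1 token covers 1 sqm for 1 year (roughly 10 sqft)
MIN_PACKAGE = 20_000     # minimum package, in tokens

site_sqm = 120_000       # example site footprint
years = 3                # example commitment (3-year terms carry an extra discount)

tokens_per_year = max(site_sqm // SQM_PER_TOKEN, MIN_PACKAGE)
print(f"{tokens_per_year:,} tokens/year, {tokens_per_year * years:,} tokens over {years} years")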

The subscription includes:

  • all product features
  • unlimited data updates
  • unlimited users (incl. contractors)
  • continuous software upgrades
  • user guide, tutorials and support

Little to no training is needed for casual Shared Reality users. The workspace features a short guided tour popping up at the first connection. If needed, users can also access a complete online user guide, as well as video tutorials.

Advanced users such as administrators typically undergo a half-day of training, have access to the complete online user guide, and to the support ticketing system directly in the workspace.

Samp provides video tutorials, detailed documentation, as well as webinars, remote or on-site training if needed, particularly during the pilot project or for more advanced use of the solution (e.g. administrators).

SCANNING SERVICES

Extend the reach of your scanning services by delivering your scanning campaigns on an AI-powered 3D workspace, designed from the ground up for the industry. Leverage your surveying and topographic expertise to consolidate valuable field data into a single viewer: maps, aerial and drone orthophotos, laser scans, photogrammetry or videogrammetry, and georadar can now be securely viewed, updated and shared in one place.

ENGINEERING SERVICES & EPC

Upscale the value of your engineering services offering by delivering digital twin as a service, powered by your qualified staff. Improve customer retention with longer-term contracts that ensure continuous synchronization of technical data with the as-built facility. Accelerate or automate the production of technical deliverables when working on brownfield projects with little or no existing input information.

CONTRACT OPERATORS

When preparing a quote for operating a facility on behalf of the owner, be sure to maximize that short window of time by taking advantage of as much technical information as possible. Turn your initial site visit into a unique opportunity to capture the current condition of the facility. Make a bid that will beat the competition with an already operational digital twin, while giving you increased confidence in your future service contract margins.

OWNER OPERATORS

Whether you manage a single plant or multiple sites, whether your facilities are onshore or offshore, Shared Reality helps you build and maintain reality models within days. Major milestones in a plant’s lifecycle, such as handover from EPC to operator, change of ownership, revamping or decommissioning, provide an opportunity to implement a safer and more efficient way of working with your extended teams, regardless of the quality of your technical data.