Data export
Introduction
The Data Export Service automatically exports data related to your operations in a standard format at a chosen interval, and either stores it in a configured external location or sends a temporary access link by email notification.
The data export is designed for BI purposes. If you need to feed your data into operational applications, or to trigger processes with near-real-time latency, contact your implementation manager or customer support manager.
Global Architecture

Configuration
To enable the data export, notify your customer success manager and choose the right configuration for your export with them. Once your configuration is finalized, we will validate and deploy the solution for you.
Base configuration parameters
site_id (list of UUIDs): A list of one or more site identifiers for which you want to export data.
export_type (work-orders-states or work-orders-timeline): The type of data to export. See Structured data format. The timeline format is the default.
file_format (json): The output format for the exported data files. Note: only JSON Lines (JSONL) is supported.
interval (once, daily, weekly, monthly, ...): The frequency at which the export job will run. See Export frequency.
connector_type (email / GCP connector / SFTP connector): Determines how the structured data and images will be sent to you. See Connector types.
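Putting the base parameters together, a configuration might look like the sketch below. The dict structure is illustrative only; the actual submission format is agreed with your customer success manager.

```python
# Illustrative export configuration (hypothetical structure; the real
# format is defined together with your customer success manager).
export_config = {
    "site_id": ["433a60f6-9756-473e-b1e9-9e89e24bb787"],
    "export_type": "work-orders-timeline",  # the default export type
    "file_format": "json",                  # files contain JSON Lines
    "interval": "daily",
    "connector_type": "email",
}

# Basic sanity checks on the chosen values.
assert export_config["export_type"] in ("work-orders-states", "work-orders-states", "work-orders-timeline")
assert export_config["file_format"] == "json"
```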
Export frequency
You can have a one shot export or configure a regular data export with a defined frequency.
Daily
Weekly
Monthly
The export will be generated and sent at the chosen interval.
Connector types
The connector_type section determines how and where your exported data is delivered. Two primary connector types are supported:
Email with temporary link
GCP cloud bucket connector
SFTP connector
Below are the configuration details that differ based on your chosen connector type.
Email connector
This connector sends a temporary link to the generated structured data export file to your email address.
Email connector config
emails (list of valid email addresses): Recipients to be notified when data is available.
GCP Cloud bucket connector
This connector delivers the configured data export, both the raw data and the images, directly into your cloud bucket.
Cloud Storage connector config
storage_uri (string, e.g. gs://customer-cloud-bucket-export/folder/customer-external/): Prefixed URI of the storage location where the exported files will be saved on the customer cloud. Note: must be a valid Google Cloud Storage (GCS) bucket path. Example: gs://my-bucket/deepomatic-data
export_images (True / False): Determines whether images associated with the exported data should also be included in the export.
storage_uri
Exports are already separated per site, but we advise adding one extra layer to separate different environments.
Separate production and testing sites into different folders:
gs://customer-cloud-bucket-export/deepomatic/testing/
gs://customer-cloud-bucket-export/deepomatic/production/
export_images
When enabled, we will transfer the images related to your export_type from our infrastructure to yours (see the Image data format section).
GCP setup:
Since we will transfer the data from our infrastructure to yours, enabling the Deepomatic GCP Cloud Storage connector requires the following:
We will provide you with a Deepomatic principal email (a Deepomatic GCP service account):
customer-<customer_name>[email protected]
You will need to grant this service account (GSA) the following permissions on the bucket specified in storage_uri:
storage.objects.create
storage.objects.list
storage.objects.get
storage.buckets.get
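Before asking Deepomatic to run the first export, you may want to verify that the grant on your bucket covers all four permissions. A minimal, stdlib-only sketch (the helper name is hypothetical):

```python
# Permissions the Deepomatic service account needs on the export bucket,
# as listed in the GCP setup above.
REQUIRED_PERMISSIONS = {
    "storage.objects.create",
    "storage.objects.list",
    "storage.objects.get",
    "storage.buckets.get",
}

def missing_permissions(granted: set) -> set:
    """Return the required permissions absent from `granted` (hypothetical helper)."""
    return REQUIRED_PERMISSIONS - granted

# Example: a grant that forgot storage.buckets.get.
gap = missing_permissions({
    "storage.objects.create",
    "storage.objects.list",
    "storage.objects.get",
})
```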
Structure of the data in the target bucket
The new data exports and images will be added to your bucket following this structure:
<external_storage_uri>/
433a60f6-9756-473e-b1e9-9e89e24bb787/ # < your site Id
data/
work-orders-states/ # < export type of your choice
2025-03-16_00:00:00_2025-03-17_00:00:00.json # < date from - date to
2025-03-17_00:00:00_2025-03-18_00:00:00.json
...
inputs/ # < images
13ead234c-8236-4sfa-b748-sdfb409cfe5.png # < refers to the input id in the export
338e3fda-4aab-4d20-8aa0-e9b234ae4779.png
...
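Because the layout above is predictable, object paths can be built programmatically. A sketch assuming file names always follow the <date_from>_<date_to>.json pattern shown in the tree (the helper name is hypothetical):

```python
from datetime import datetime

def data_object_path(storage_uri: str, site_id: str, export_type: str,
                     date_from: datetime, date_to: datetime) -> str:
    """Build the expected object path of one export file, following the
    <storage_uri>/<site_id>/data/<export_type>/<from>_<to>.json layout."""
    fmt = "%Y-%m-%d_%H:%M:%S"
    name = f"{date_from.strftime(fmt)}_{date_to.strftime(fmt)}.json"
    return f"{storage_uri.rstrip('/')}/{site_id}/data/{export_type}/{name}"

path = data_object_path(
    "gs://customer-cloud-bucket-export/deepomatic/production/",
    "433a60f6-9756-473e-b1e9-9e89e24bb787",
    "work-orders-states",
    datetime(2025, 3, 16),
    datetime(2025, 3, 17),
)
```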
SFTP connector
This connector allows you to receive data exported from Deepomatic directly on your own SFTP server, including AWS S3 or Azure Blob Storage exposed through an SFTP endpoint. The SFTP connector works like the GCP bucket connector but transfers data to the target directory over the SFTP protocol.
Configuration
sftp_host (string): Hostname of the SFTP server.
sftp_port (integer): Port number (default: 22).
sftp_remote_path (string): Base directory path on the remote server.
export_images (boolean): Whether to include images in the export (True/False).
host public key (string): Optional; allows Deepomatic to authenticate the server to which the files will be transferred.
username (string): The username created for Deepomatic to transfer the data to the SFTP server.
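The parameters above can be gathered and sanity-checked before sending them to Deepomatic. A stdlib-only sketch; the field names follow the table above, but the hostname, path, and helper are hypothetical examples:

```python
# Illustrative SFTP connector configuration (hypothetical values).
sftp_config = {
    "sftp_host": "sftp.example.com",        # assumption: example hostname
    "sftp_port": 22,                        # default SFTP port
    "sftp_remote_path": "/exports/deepomatic",
    "export_images": True,
    "host_public_key": None,                # optional server authentication
    "username": "deepomatic",
}

def validate_sftp_config(cfg: dict) -> list:
    """Return a list of problems found in the configuration (hypothetical helper)."""
    problems = []
    for field in ("sftp_host", "sftp_remote_path", "username"):
        if not cfg.get(field):
            problems.append(f"missing {field}")
    if not isinstance(cfg.get("sftp_port", 22), int):
        problems.append("sftp_port must be an integer")
    return problems
```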
Setup Process
Setting up the SFTP connector takes 4 steps.
1. Your organization creates an SFTP server and user
Your organization creates an SFTP server and a user, and provides Deepomatic with the following information:
The SFTP server hostname,
Server port number,
Base directory path where exports should be stored,
Host public key of the SFTP server
Username created for Deepomatic
2. Deepomatic provides the authentication information
Deepomatic provides :
An SSH public key (Deepomatic follows modern standards for encryption protocols),
A list of IP addresses/ranges that will connect to your SFTP server.
The directory tree structure, so that access control can be set up on the server if needed. For example:
<external_storage_uri>/
433a60f6-9756-473e-b1e9-9e89e24bb787/ # < your site Id
data/
work-orders-states/ # < export type of your choice
# < the exported files will be loaded here
inputs/ # < images
# < the images will be loaded here
3. Your organization configures the SFTP server to allow Deepomatic to transfer the data
This includes:
The Deepomatic SSH key in the authorized_keys file for the dedicated user,
Appropriate file/directory permissions for the user,
Firewall rules to allow connections from Deepomatic's IP ranges.
4. Validation and operation
Once your organization has configured the SFTP server, Deepomatic will perform a test with an initial export and verify that the data lands properly on the server. Following validation, Deepomatic will start exporting the data and images at the chosen interval into the directory structure below.
Your organization is free to delete the transferred files when needed.
<external_storage_uri>/
433a60f6-9756-473e-b1e9-9e89e24bb787/ # < your site Id
data/
work-orders-states/ # < export type of your choice
2025-03-16_00:00:00_2025-03-17_00:00:00.json # < date from - date to
2025-03-17_00:00:00_2025-03-18_00:00:00.json
...
inputs/ # < images
13ead234c-8236-4sfa-b748-sdfb409cfe5.png # < refers to the input id in the export
338e3fda-4aab-4d20-8aa0-e9b234ae4779.png
...
Structured data format
Timeline data format
Each generated file contains structured data on all the work orders created or updated during the last interval time frame, including all the analyses performed on them.
This format can be used for:
bookkeeping
field ops analysis, AI and FTR analysis
field operator experience analysis
The work-orders-timeline file is in the JSON Lines format.
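Since each line of a JSON Lines file is one complete JSON document, an export file can be read record by record with the standard library alone (the function name is illustrative):

```python
import json

def read_work_orders(path: str):
    """Yield one work order per line of a JSON Lines export file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines defensively
                yield json.loads(line)
```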
Schema
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"type": "object",
"properties": {
"id": {"type": ["string", "null"], "format": "uuid"},
"name": {"type": ["string", "null"]},
"site_id": {"type": ["string", "null"], "format": "uuid"},
"create_date": {"type": ["string", "null"], "format": "date-time"},
"update_date": {"type": ["string", "null"], "format": "date-time"},
"metadata": {"type": ["object", "null"]},
"progress_score": {"type": ["number", "null"]},
"work_items": {
"type": "array",
"items": {
"type": "object",
"required": ["id", "name", "input"],
"properties": {
"id": {"type": ["string", "null"], "format": "uuid"},
"name": {"type": ["string", "null"]},
"input": {
"type": "array",
"items": {
"type": "object",
"required": [
"id",
"create_date",
"file_extension",
"metadata",
"data_conformity",
"job_conformity"
],
"properties": {
"id": {"type": ["string", "null"], "format": "uuid"},
"create_date": {"type": ["string", "null"], "format": "date-time"},
"file_extension": {"type": ["string", "null"]},
"metadata": {"type": ["object", "null"]},
"data_conformity": {
"type": ["array", "null"],
"items": {
"type": "object",
"required": ["check", "code", "level"],
"properties": {
"check": {"type": ["string", "null"]},
"code": {"type": ["number", "null"]},
"level": {"type": ["string", "null"]}
}
}
},
"job_conformity": {
"type": "array",
"items": {
"type": "object",
"required": [
"task_name",
"analyzed_value",
"corrected_value",
"corrected_date",
"message",
"is_conformity_validated"
],
"properties": {
"task_name": {"type": ["string", "null"]},
"analyzed_value": {
"type": [
"string",
"boolean",
"number",
"null"
]
},
"corrected_value": {
"type": [
"string",
"boolean",
"number",
"null"
]
},
"corrected_date": {
"type": ["string", "null"],
"format": "date-time"
},
"message": {"type": ["string", "null"]},
"is_conformity_validated": {
"type": ["boolean", "null"]
}
},
"additionalProperties": false
}
}
},
"additionalProperties": false
}
}
},
"additionalProperties": false
}
}
},
"required": [
"id",
"name",
"site_id",
"create_date",
"update_date",
"metadata",
"work_items"
],
"additionalProperties": false
}
Fields definitions
Work order level fields
id (UUID string): Unique identifier of the work order.
name (string): Descriptive name of the work order.
site_id (UUID string): Identifier of the related site.
create_date (ISO 8601 string): Creation timestamp with timezone.
update_date (ISO 8601 string): Last update timestamp with timezone.
metadata (object): Additional information about the work order.
progress_score (number): Optional progress indicator (e.g., between 0 and 1).
work_items (array): List of items to be processed. Each contains an id, name, and input. A work item relates to one task group and has the same name.
Work item
id (UUID string): Identifier of the related work item.
name (string): Name of the work item.
input (array): List of inputs (images) and their analysis results.
input
id (UUID string): Input identifier (image id).
metadata (object): May contain metadata of the image, such as geolocation or timestamp.
data_conformity (array of objects): List of conformity checks on the raw input data. Can be empty.
job_conformity (array of objects): List of job-specific conformity checks. Can be empty.
data_conformity fields
check (string): The data conformity check.
code (number): One code per data conformity check.
level (string): error, warning, or information. Error is blocking; warning and information are not blocking.
job_conformity fields
task_name (string): Name of the task being validated.
analyzed_value (string / number / boolean): Output of the automatic analysis (optional).
corrected_value (string / number / boolean): Manually corrected value (optional).
corrected_date (ISO 8601 string): When the correction was made (nullable).
message (string): Explanation, error, or note about the result.
is_conformity_validated (boolean): Whether this task is considered validated.
Example
{
"id": "a71805b6-93f2-4ea3-9e42-a46b397ad448",
"name": "test-BB6D70A6",
"site_id": "433a60f6-9756-473e-b1e9-9e89e24bb787",
"create_date": "2025-03-19T14:21:40.105568+00:00",
"update_date": "2025-03-19T14:21:40.125857+00:00",
"metadata": {
"created_with": "mobile_app"
},
"progress_score": null,
"work_items": [
{
"id": "4e5b8dd8-9d50-4888-977b-4064e4b6a357",
"name": "l4_civil_open",
"input": [
{
"id": "9a739279-0904-4710-b740-53971660f2ae",
"metadata": {
"timestamp_ms": 1742394153532
},
"data_conformity": [],
"job_conformity": [
{
"task_name": "l4_civil_open_geolocation_maps",
"analyzed_value": "UNAVAILABLE",
"corrected_value": null,
"corrected_date": null,
"message": "UNAVAILABLE",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_open_ctx",
"analyzed_value": true,
"corrected_value": null,
"corrected_date": null,
"message": "OK",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_open_geolocation",
"analyzed_value": "UNAVAILABLE",
"corrected_value": null,
"corrected_date": null,
"message": "UNAVAILABLE",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_open_earthing",
"analyzed_value": false,
"corrected_value": null,
"corrected_date": null,
"message": "KO",
"is_conformity_validated": null
}
]
}
]
},
{
"id": "d43a33f6-a4b2-42fc-8865-0c689f2d1c36",
"name": "l4_civil_close",
"input": [
{
"id": "9fe06e48-e29e-4735-92df-82bcb8827b51",
"metadata": {
"timestamp_ms": 1742394107294
},
"data_conformity": [],
"job_conformity": [
{
"task_name": "l4_civil_close_type",
"analyzed_value": "UNAVAILABLE",
"corrected_value": null,
"corrected_date": null,
"message": "UNAVAILABLE",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_close_size",
"analyzed_value": "existing",
"corrected_value": null,
"corrected_date": null,
"message": "existing",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_close_geolocation",
"analyzed_value": "UNAVAILABLE",
"corrected_value": null,
"corrected_date": null,
"message": "UNAVAILABLE",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_close_ctx",
"analyzed_value": true,
"corrected_value": null,
"corrected_date": null,
"message": "OK",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_close_pon_reference",
"analyzed_value": "CMIB-G-X-4NOZAJ",
"corrected_value": null,
"corrected_date": null,
"message": "PON Reference: CMIB-G-X-4NOZAJ",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_close_geolocation_maps",
"analyzed_value": "UNAVAILABLE",
"corrected_value": null,
"corrected_date": null,
"message": "UNAVAILABLE",
"is_conformity_validated": null
}
]
}
]
},
{
"id": "09f90b26-eb2b-44b7-a235-8e6973a6306f",
"name": "l4_civil_open_floor_closeup",
"input": [
{
"id": "ac58cbf8-d144-408b-9b23-ee78c72e479d",
"metadata": {
"timestamp_ms": 1742394787589
},
"data_conformity": [],
"job_conformity": [
{
"task_name": "l4_civil_open_floor_state",
"analyzed_value": "ANOMALY",
"corrected_value": null,
"corrected_date": null,
"message": "ANOMALY",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_open_bottom_state_validation",
"analyzed_value": false,
"corrected_value": null,
"corrected_date": null,
"message": "KO",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_open_floor_closeup_ctx",
"analyzed_value": true,
"corrected_value": null,
"corrected_date": null,
"message": "OK",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_open_duct_count",
"analyzed_value": 0.0,
"corrected_value": null,
"corrected_date": null,
"message": "0.0",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_open_floor_closeup_geolocation",
"analyzed_value": "UNAVAILABLE",
"corrected_value": null,
"corrected_date": null,
"message": "UNAVAILABLE",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_open_floor_closeup_geolocation_maps",
"analyzed_value": "UNAVAILABLE",
"corrected_value": null,
"corrected_date": null,
"message": "UNAVAILABLE",
"is_conformity_validated": null
},
{
"task_name": "l4_civil_open_duct_count_validation",
"analyzed_value": false,
"corrected_value": null,
"corrected_date": null,
"message": "KO",
"is_conformity_validated": null
}
]
}
]
}
]
}
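Records in this shape are straightforward to aggregate. For instance, the analysis messages (OK / KO / UNAVAILABLE in the example above) can be counted across a whole work order. A sketch with a hypothetical helper name:

```python
from collections import Counter

def message_counts(work_order: dict) -> Counter:
    """Count job_conformity messages over all inputs of all work items."""
    counts = Counter()
    for item in work_order.get("work_items", []):
        for inp in item.get("input", []):
            for job in inp.get("job_conformity", []):
                counts[job["message"]] += 1
    return counts

# Minimal record in the export shape, for illustration.
wo = {
    "work_items": [
        {"input": [{"job_conformity": [
            {"message": "OK"},
            {"message": "KO"},
            {"message": "UNAVAILABLE"},
        ]}]},
    ],
}
```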
Work order state data format
This format is the same as the timeline data format, except that, to simplify the analysis, the input array contains only one item. The input exported is always the last one created.
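If you consume the timeline format, the same reduction can be reproduced on your side by keeping only the most recently created input per work item. A sketch assuming create_date is populated on every input (the helper name is hypothetical):

```python
def latest_input_only(work_order: dict) -> dict:
    """Reduce a timeline record to the states shape: keep only the
    last-created input of each work item (assumes create_date is set)."""
    for item in work_order.get("work_items", []):
        inputs = item.get("input", [])
        if len(inputs) > 1:
            # ISO 8601 strings with a uniform format compare chronologically.
            item["input"] = [max(inputs, key=lambda i: i["create_date"])]
    return work_order
```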
Image data format
The images analyzed during each interval time frame will be uploaded to the customer's target cloud storage.
File name
The name of each image is composed of the input UUID and the file extension. Both can be found in the data file, in each input's id and file_extension fields.
Path
In the data file, each input contains id and file_extension fields. The absolute path to an image is therefore:
<storage_uri>/<site_id>/inputs/<id><file_extension>
# ex:
gs://customer-cloud-bucket-export/deepomatic/prod/433a60f6-9756-473e-b1e9-9e89e24bb787/inputs/67dd3441-bdee-4deb-9a48-f7663637ed78.png
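The path template can be implemented directly. A sketch with a hypothetical helper; it assumes, as the template suggests, that file_extension includes the leading dot:

```python
def image_path(storage_uri: str, site_id: str,
               input_id: str, file_extension: str) -> str:
    """Build an exported image's absolute path, following the
    <storage_uri>/<site_id>/inputs/<id><file_extension> pattern."""
    return f"{storage_uri.rstrip('/')}/{site_id}/inputs/{input_id}{file_extension}"

# Reproduces the example path above.
p = image_path(
    "gs://customer-cloud-bucket-export/deepomatic/prod",
    "433a60f6-9756-473e-b1e9-9e89e24bb787",
    "67dd3441-bdee-4deb-9a48-f7663637ed78",
    ".png",
)
```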