CLI

Overview

The TimeLine Command Line Interface (CLI) is a command-line utility written in Python that lets users interact with the TimeLine Service from scripts and the terminal. It provides access to a small but growing subset of the TimeLine API and is currently focused on:

  • creating and deleting tenants

  • securely and efficiently loading entities, events, and associated metadata into the TimeLine Service

NOTE: The TimeLine CLI is at an early and active stage of development and has little error handling and recovery functionality. If you encounter problems while using the TimeLine CLI, contact the WellLine team for help.

Note on Usage

Neither the TimeLine Service nor the WellLine product in general is meant to be a replica of any source system, nor the "master" version of any data.

Care must be taken when using the TimeLine CLI, as it has limited capacity to "undo" commands. This is especially true of the update commands, due to the nature of the operation being performed.

When in doubt, contact the WellLine support team.

Requirements

  • Python 3.x

  • Dependencies:

    • pip install requests

    • pip install chardet

    • pip install azure-storage-common

    • pip install azure-storage-queue

    • pip install jsonlines

  • Authentication Token for the TimeLine Service

If both Python 2.x and Python 3.x are installed on the machine running the WellLine CLI, ensure the following:

  • When installing dependencies, above, use "pip3" instead of "pip"

  • When executing commands, below, use "python3" instead of "python"

If only Python 3.x is installed, simply use "pip" and "python".

Configuration, Syntax, and Help

To get a description of the options available in the TimeLine CLI, run:

python timeline.py -h

This will display help information to the terminal similar to the following:

usage: timeline.py [-h]
                   {agent,dataset,entity,entityType,entityTypeGroup,event,eventType,eventTypeGroup,lexiconEntry,measureType,tenant,meta}
                   ...

Timeline Service CLI.

positional arguments:
  {agent,dataset,entity,entityType,entityTypeGroup,event,eventType,eventTypeGroup,lexiconEntry,measureType,tenant,meta}
                        sub-command help
    agent               Manage TimeLine Service agents.
    dataset             Publish entire datasets to the TimeLine Service with a
                        single command.
    entity              Manage TimeLine Service entities
    entityType          Manage TimeLine Service entityTypes
    entityTypeGroup     Manage TimeLine Service entityTypeGroups
    event               Manage TimeLine Service events
    eventType           Manage TimeLine Service eventTypes
    eventTypeGroup      Manage TimeLine Service eventTypeGroups
    lexiconEntry        Manage TimeLine Service lexiconEntrys
    measureType         Manage TimeLine Service measureTypes
    tenant              Manage TimeLine Service tenants
    meta                Convert old style metadata files into jsonl files.

optional arguments:
  -h, --help            show this help message and exit

The commands and actions available through the TimeLine CLI are fully described with examples below.

The TimeLine CLI also uses the following environment variables when they are defined:

  • TIMELINE_AUTH_TOKEN: The JSON web token (JWT) that the TimeLine CLI will use to authenticate all calls against the TimeLine Service. This token is obtained by logging in to the WellLine Interface (see the Info page).

  • TIMELINE_URL: The default URL to use on all TimeLine CLI commands as the address of the TimeLine Service GraphQL interface. This can be overridden per command using the --url argument when running the TimeLine CLI from the command line.

  • TIMELINE_TENANT_ID: The default Tenant ID to use on all TimeLine CLI commands as the target Tenant. This can be overridden per command using the --tid argument when running the TimeLine CLI from the command line.

  • TIMELINE_BATCH_ID: The default Batch ID to use on all TimeLine CLI commands. This can be overridden per command using the --batchId argument when running the TimeLine CLI from the command line.
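As a sketch of how this "argument overrides environment variable" precedence typically works (the variable names are from the list above; the resolution helper itself is illustrative, not the CLI's actual code):

```python
import os

def resolve_setting(cli_value, env_var, default=None):
    """Return the CLI argument if given, else the environment variable, else a default."""
    if cli_value is not None:
        return cli_value
    return os.environ.get(env_var, default)

# A --tid argument overrides TIMELINE_TENANT_ID; otherwise the env var is used.
os.environ["TIMELINE_TENANT_ID"] = "acme"
print(resolve_setting(None, "TIMELINE_TENANT_ID"))     # acme
print(resolve_setting("other", "TIMELINE_TENANT_ID"))  # other
```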

Authentication Tokens

The TimeLine Service enforces authentication and requires all requests from the TimeLine CLI to be accompanied by a valid Authentication Token. This authentication token is read by the TimeLine CLI from the TIMELINE_AUTH_TOKEN environment variable, which MUST be set manually before you run any commands.

For example, on macOS:

export TIMELINE_AUTH_TOKEN=<token>

Or in Windows:

set TIMELINE_AUTH_TOKEN=<token>

NOTE: It is not possible to provide the authentication token as a command-line argument; the TimeLine CLI only reads it from the TIMELINE_AUTH_TOKEN environment variable.

To generate an authentication token for use with the CLI, a user must log into the WellLine Interface and click Info in the toolbar.
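A minimal sketch of how a script might read the token the same way the CLI does and build a request header from it. The use of the Bearer scheme and the `build_auth_headers` helper are assumptions for illustration; the exact header format the TimeLine Service expects is not documented here:

```python
import os

def build_auth_headers():
    """Read the JWT from TIMELINE_AUTH_TOKEN and build a typical Authorization header."""
    token = os.environ.get("TIMELINE_AUTH_TOKEN")
    if not token:
        raise RuntimeError("TIMELINE_AUTH_TOKEN is not set; export it before running commands")
    # Bearer is the conventional JWT scheme; treat this as an assumption, not the CLI's code.
    return {"Authorization": f"Bearer {token}"}

os.environ["TIMELINE_AUTH_TOKEN"] = "<token>"
print(build_auth_headers())  # {'Authorization': 'Bearer <token>'}
```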

Tenant IDs

The TimeLine Service supports multiple tenants on a single deployment and each command must target a specific tenant. The Tenant ID can be passed as an argument when running individual TimeLine CLI commands; for convenience, you can set the default Tenant ID using the TIMELINE_TENANT_ID environment variable.

For example, on macOS:

export TIMELINE_TENANT_ID=<id>

Or in Windows:

set TIMELINE_TENANT_ID=<id>

If the user does not provide a Tenant ID with a command, the TimeLine CLI will read the TIMELINE_TENANT_ID variable and apply it to the commands.

NOTE: The Tenant ID is case-insensitive in the TimeLine Service; it is always converted to lowercase.

Parameter: batchId

The optional batchId parameter groups a set of events and/or entities into a named collection, making it easy to work with the set of events or entities that were uploaded as a batch. This is particularly useful for incremental uploads, when you want to query, review, or process just those entities or events that were uploaded together.

--batchId <string>

Parameter: chunkSize

The optional chunkSize parameter specifies the number of kilobytes (KB) sent per TimeLine Service call.

--chunkSize <int>

The default chunkSize is 50KB.
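The chunking behavior can be sketched as follows. This is an illustrative helper, not the CLI's actual implementation; in particular, whether the CLI measures chunk size before or after encoding is an assumption here:

```python
def chunk_lines(lines, chunk_size_kb=50):
    """Group JSONL lines into chunks of at most chunk_size_kb kilobytes each."""
    limit = chunk_size_kb * 1024
    chunks, current, size = [], [], 0
    for line in lines:
        line_size = len(line.encode("utf-8"))
        # Start a new chunk when adding this line would exceed the limit.
        if current and size + line_size > limit:
            chunks.append(current)
            current, size = [], 0
        current.append(line)
        size += line_size
    if current:
        chunks.append(current)
    return chunks

# Three ~40 KB lines with a 50 KB limit produce three single-line chunks.
lines = ["x" * 40_000 for _ in range(3)]
print([len(c) for c in chunk_lines(lines, 50)])  # [1, 1, 1]
```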

Parameter: updateRetries

The optional updateRetries parameter specifies the number of times an update action will retry before failing. This caters for situations where a high volume of events is feeding into the WellLine Event Engineering Pipeline and event updates are written by an event source before WellLine has processed and written the original event. The default number of retries is 5. The first retry waits 1 second before trying to apply the update; each subsequent retry doubles the wait.

--updateRetries <int>
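The retry behavior described above amounts to exponential backoff, sketched below. The helper name and whether the initial attempt counts toward the retry budget are assumptions; the CLI's actual internals may differ:

```python
import time

def retry_update(apply_update, retries=5, initial_wait=1.0):
    """Call apply_update, retrying with a doubling wait on failure (1s, 2s, 4s, ...)."""
    wait = initial_wait
    for attempt in range(1, retries + 1):
        try:
            return apply_update()
        except Exception:
            if attempt == retries:
                raise  # out of retries: surface the error
            time.sleep(wait)
            wait *= 2

# An update that succeeds on the third attempt, e.g. once the original event lands.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("event not yet written")
    return "updated"

print(retry_update(flaky, retries=5, initial_wait=0.01))  # updated
```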

Commands and Actions

The TimeLine CLI currently supports the following commands and actions:

  • tenant:

    • add: provision a new tenant

    • delete: delete an existing tenant

  • dataset

    • install: publish an entire dataset, from a predefined folder structure, with a single command

  • entityTypeGroup

    • add: publish entityTypeGroups from JSONL files to the TimeLine Service. If an entityTypeGroup already exists, an error is logged.

    • addOrReplace: publish entityTypeGroups from JSONL files to the TimeLine Service with the following logic:

      • If the entityTypeGroup does not exist, add it

      • If the entityTypeGroup already exists, entirely replace it (all fields)

    • addOrUpdate: publish entityTypeGroups from JSONL files to the TimeLine Service with the following logic:

      • If the entityTypeGroup does not exist, add it

      • If the entityTypeGroup already exists, update existing fields

    • update: use new field data from JSONL files to update fields in existing entityTypeGroups when their id matches. If the id does not exist, an error is logged.

  • entityType

    • add: publish entityTypes from JSONL files to the TimeLine Service. If an entityType already exists, an error is logged.

    • addOrReplace: publish entityTypes from JSONL files to the TimeLine Service with the following logic:

      • If the entityType does not exist, add it

      • If the entityType already exists, entirely replace it (all fields)

    • addOrUpdate: publish entityTypes from JSONL files to the TimeLine Service with the following logic:

      • If the entityType does not exist, add it

      • If the entityType already exists, update existing fields

    • update: use new field data from JSONL files to update fields in existing entityTypes when their id matches. If the id does not exist, an error is logged.

  • entity:

    • add: publish entities from JSONL files to the TimeLine Service. If an entity already exists, an error is logged.

    • addOrReplace: publish entities from JSONL files to the TimeLine Service with the following logic:

      • If the entity does not exist, add it

      • If the entity already exists, entirely replace it (all fields)

    • addOrUpdate: publish entities from JSONL files to the TimeLine Service with the following logic:

      • If the entity does not exist, add it

      • If the entity already exists, update existing fields

    • update: use new field data from JSONL files to update fields in existing entities when their id matches. If the id does not exist, an error is logged.

  • eventTypeGroup

    • add: publish eventTypeGroups from JSONL files to the TimeLine Service. If an eventTypeGroup already exists, an error is logged.

    • addOrReplace: publish eventTypeGroups from JSONL files to the TimeLine Service with the following logic:

      • If the eventTypeGroup does not exist, add it

      • If the eventTypeGroup already exists, entirely replace it (all fields)

    • addOrUpdate: publish eventTypeGroups from JSONL files to the TimeLine Service with the following logic:

      • If the eventTypeGroup does not exist, add it

      • If the eventTypeGroup already exists, update existing fields

    • update: use new field data from JSONL files to update fields in existing eventTypeGroups when their id matches. If the id does not exist, an error is logged.

  • eventType

    • add: publish eventTypes from JSONL files to the TimeLine Service. If an eventType already exists, an error is logged.

    • addOrReplace: publish eventTypes from JSONL files to the TimeLine Service with the following logic:

      • If the eventType does not exist, add it

      • If the eventType already exists, entirely replace it (all fields)

    • addOrUpdate: publish eventTypes from JSONL files to the TimeLine Service with the following logic:

      • If the eventType does not exist, add it

      • If the eventType already exists, update existing fields

    • update: use new field data from JSONL files to update fields in existing eventTypes when their id matches. If the id does not exist, an error is logged.

  • measureType

    • add: publish measureTypes from JSONL files to the TimeLine Service. If a measureType already exists, an error is logged.

    • addOrReplace: publish measureTypes from JSONL files to the TimeLine Service with the following logic:

      • If the measureType does not exist, add it

      • If the measureType already exists, entirely replace it (all fields)

    • addOrUpdate: publish measureTypes from JSONL files to the TimeLine Service with the following logic:

      • If the measureType does not exist, add it

      • If the measureType already exists, update existing fields

    • update: use new field data from JSONL files to update fields in existing measureTypes when their id matches. If the id does not exist, an error is logged.

  • meta:

    • add: convert legacy metadata formats into current format (JSONL files)

Each of these commands takes a number of arguments provided either as environment variables or as arguments on the CLI command line.

Command Arguments

Commands have the following optional arguments:

--target {graphql,azurequeue}
                      the target service type for the published data
--tid TENANTID        the tenantId to target with the command
--batchId BATCHID     the batchId for the set of data
--updateRetries UPDATERETRIES
                      the number of times to retry an update action before
                      failing
--chunkSize CHUNKSIZE
                      the size (in KB) of data to publish per call
--gqlUrl GQLURL       the address of the timeline service graphql api
--azureStorageName AZURESTORAGEACCOUNTNAME
                      the name of the Azure Storage Account
--azureQueueName AZURESTORAGEQUEUENAME
                      the name of the Azure Storage Queue
--env ENV             the name of the WellLine environment

Required parameters are:

  • --tid

  • --target

  • Either --gqlUrl or --azureQueueName, depending on --target chosen

If desired, --tid and --target can be entered as environment variables.

Command: tenant

All tenant actions are accessed through the tenant command provided by the TimeLine CLI. To see the syntax and get help, run the following command:

python timeline.py tenant -h

Action: add

Use the add action of the tenant command to provision a new tenant on the TimeLine Service:

python timeline.py tenant add --gqlUrl <timeline.svc.address> --tid <tenant_id>

Action: delete

Use the delete action of the tenant command to delete a tenant on the TimeLine Service:

python timeline.py tenant delete --gqlUrl <timeline.svc.address> --tid <tenant_id>

Action: reset

The reset action of the tenant command performs a 'tenant delete' followed by a 'tenant add' on the TimeLine Service, effectively clearing the tenant and readying it for new publishing:

python timeline.py tenant reset --gqlUrl <timeline.svc.address> --tid <tenant_id>

The reset and delete actions delete all data within the tenant. The only chance of recovery is from backups of the indexes, if you have created them.

Command: dataset

The dataset installation command is accessed through the dataset command provided by the TimeLine CLI. To see the syntax and get help, run the following command:

python timeline.py dataset -h

Action: install

Use the install action of the dataset command to publish an entire dataset to the TimeLine Service. This is essentially a "shortcut" to running the many individual commands. Each command will be run in the appropriate order to ensure accurate publishing.

To publish correctly, the dataset must follow a predefined folder layout:

  • data

    • entities

      • add

      • addOrReplace

      • addOrUpdate

      • update

    • entityTypeGroups

      • add

      • addOrReplace

      • addOrUpdate

      • update

    • entityTypes

      • add

      • addOrReplace

      • addOrUpdate

      • update

    • events

      • add

      • addOrUpdate

      • addOrReplace

      • update

    • eventTypeGroups

      • add

      • addOrUpdate

      • addOrReplace

      • update

    • eventTypes

      • add

      • addOrUpdate

      • addOrReplace

      • update

    • measureTypes

      • add

      • addOrUpdate

      • addOrReplace

      • update

Not every folder needs to be present or populated, but JSONL files may only be located in the leaf directories (i.e., files may exist in data/entities/add but not directly in data/entities).

This command will stream the lines from each JSONL formatted file in each populated folder, and write them to the TimeLine Service through the chosen --target.
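The file discovery over that layout can be sketched as follows. The publish order in OBJECTS (type groups before types before instances) is an assumption; the CLI promises an "appropriate order" without documenting it, and the helper name is illustrative:

```python
from pathlib import Path

# NOTE: this ordering is an assumption, not the CLI's documented behavior.
OBJECTS = ["entityTypeGroups", "entityTypes", "entities",
           "eventTypeGroups", "eventTypes", "measureTypes", "events"]
ACTIONS = ["add", "addOrReplace", "addOrUpdate", "update"]

def discover_jsonl(data_root):
    """Yield (object, action, path) for every JSONL file in the leaf folders."""
    root = Path(data_root)
    for obj in OBJECTS:
        for action in ACTIONS:
            leaf = root / obj / action
            if leaf.is_dir():
                for f in sorted(leaf.glob("*.jsonl")):
                    yield obj, action, f
```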

If uploading through GraphQL:

python timeline.py dataset install --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <dataFolderPath>

If uploading through Azure Queue:

python timeline.py dataset install --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <dataFolderPath>

The dataFolderPath argument must point to a folder named data with the layout as outlined above.

Command: entity

All entity actions are accessed through the entity command provided by the TimeLine CLI. To see the syntax and get help, run the following command:

python timeline.py entity -h

The GraphQL schema of the entity is as follows:

externalId: ID
typeId: ID!
sourceData: String
sourceId: ID
name: String!
description: String

Action: add

Use the add action of the entity command to publish new entities to the TimeLine Service. If you try to add an entity that already exists, the TimeLine Service will log an error and ignore the data.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the entities, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entity add --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entity add --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing new entities, or a directory with one or more JSONL files containing new entities. If you reference a directory, every JSONL file in the directory will be processed.

Because JSONL is required, each JSON structure representing an entity must be on a single line, separated from the next by a newline (no comma). For example:

{"externalId": "NO 15/9-19 A","typeId": "well","name": "NO 15/9-19 A","sourceId": "TestId","sourceData": "DataSourceA"}
{"externalId": "Inspirer","typeId": "rig","name": "Inspirer","sourceId": "TestId","sourceData": "DataSourceA"}
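A file in that shape can be produced with the standard json module (the jsonlines dependency listed earlier offers similar convenience). The entity field values below are copied from the example above; the output filename is illustrative:

```python
import json

entities = [
    {"externalId": "NO 15/9-19 A", "typeId": "well", "name": "NO 15/9-19 A",
     "sourceId": "TestId", "sourceData": "DataSourceA"},
    {"externalId": "Inspirer", "typeId": "rig", "name": "Inspirer",
     "sourceId": "TestId", "sourceData": "DataSourceA"},
]

with open("entities.jsonl", "w", encoding="utf-8") as f:
    for entity in entities:
        f.write(json.dumps(entity) + "\n")  # one JSON object per line, no commas
```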

Action: addOrReplace

Use the addOrReplace action of the entity command to publish new entities or replace already-published entities from JSONL files to the TimeLine Service with the following logic:

  • If the entity does not exist, add it

  • If the entity already exists, entirely replace it (all fields)

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the entities, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entity addOrReplace --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entity addOrReplace --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing entities, or a directory with one or more JSONL files containing entities. If you reference a directory, every JSONL file in the directory will be processed.

If any exact matches (on the combination of typeId and name) are found to already exist, they are overwritten by the new data.

Action: addOrUpdate

Use the addOrUpdate action of the entity command to publish new entities or update already-published entities from JSONL files to the TimeLine Service with the following logic:

  • If the entity does not exist, add it

  • If the entity already exists, update existing fields

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the entities, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entity addOrUpdate --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entity addOrUpdate --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing entities, or a directory with one or more JSONL files containing entities. If you reference a directory, every JSONL file in the directory will be processed.

If any exact matches (on the combination of typeId and name) are found to already exist, field data from the uploaded JSONL records updates the corresponding fields in the matching entities.

Action: update

Use the update action of the entity command to update the externalId, sourceId, and/or sourceData field(s) associated with one or more entities. This will insert and/or remove values found in those fields of the entity that matches each input row's typeId + name field. If the specified entity does not exist, the TimeLine Service will log an error and ignore the update.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the fields, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entity update --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entity update --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing entity field updates, or a directory with one or more JSONL files containing entity field updates. If you reference a directory, every JSONL file in the directory will be processed.

The GraphQL schema of the entity update (UpdateEntityInput) is:

externalId: ID
typeId: ID
sourceData: String
sourceId: ID
name: String
description: String

Command: event

All event actions are accessed through the event command provided by the TimeLine CLI. To see the syntax and get help, run the following command:

python timeline.py event -h

The GraphQL schema of the event is as follows:

id: ID
externalId: ID
typeId: ID
sourceId: ID
sourceData: String
summary: String
content: String!
startedOn: DateTime
startedAt: GeoLocationInput
endedOn: DateTime
endedAt: GeoLocationInput
importance: Float
sourceEventIds: [ID!]
measures: [MeasureInput!]
quantities: [QuantityInput!]
properties: [PropertyInput!]
subjectEntityIds: [ID!]
referenceEntityIds: [ID!]

For additional information on JSONL formatting for events, see Event Data Formatting.

Action: add

Use the add action of the event command to publish new events to the TimeLine Service. If you try to add an event that already exists, the TimeLine Service will log an error and ignore the data.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the events, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py event add --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py event add --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing new events, or a directory with one or more JSONL files containing new events. If you reference a directory, every JSONL file in the directory will be processed.

Because JSONL is required, each JSON structure representing an event must be on a single line, separated from the next by a newline (no comma). For example:

{"id": "k1","externalId": "-601144","typeId": "kick","sourceId": "BOEM","content": "CONT TO TEST BOPS ON YELLOW POD","startedOn": "2008-07-19T00:00:00Z","endedOn": "2008-07-19T00:00:00Z","importance": "4","sourceEventId": "4a8904c5afbb93de753ceccee90d73ef","properties": [{ "name": "lifecycle", "value": "Drilling" },{ "name": "code", "value": "Problem" },{ "name": "subcode", "value": "Ballooning" }],"subjectEntityIds": [ "well.608184006802" ],"referenceEntityIds": [ "rig.Ensco 7500" , "organization.Chevron U.S.A. Inc." ]}
{"externalId": "-602447","typeId": "kick","sourceId": "BOEM","content": "CONT TO WGT UP TO 16 2 PPGPUMP SWEEP & CIRC AROUND, START LOSING RETURNSSLOW RATE TO 370 GPM","startedOn": "2008-08-22T00:00:00Z","endedOn": "2008-08-22T00:00:00Z","importance": "4","sourceEventId": "0290f150124e05ce990443ea6626f658","properties": [{ "name": "lifecycle", "value": "Drilling" },{ "name": "code", "value": "Problem" },{ "name": "subcode", "value": "Ballooning" }],"subjectEntityIds": [ "well.608184006802" ],"referenceEntityIds": [ "rig.Ensco 7500" , "organization.Chevron U.S.A. Inc." ]}
{"id": "k2","externalId": "-604358","typeId": "kick","sourceId": "BOEM","content": "Drilled to 9982'. Well ballooning back 25 bbls while making connection","startedOn": "2008-10-02T00:00:00Z","endedOn": "2008-10-02T00:00:00Z","importance": "4","sourceEventId": "774f1f0655bdecb919326b77348a6a87","properties": [{ "name": "lifecycle", "value": "Drilling" },{ "name": "code", "value": "Problem" },{ "name": "subcode", "value": "Ballooning" }],"subjectEntityIds": [ "well.427114081901" ],"referenceEntityIds": [ "rig.Coil Tubing Unit" , "organization.Energy XXI GOM, LLC" ]}

Action: addOrReplace

Use the addOrReplace action of the event command to publish new events or replace already-published events from JSONL files to the TimeLine Service with the following logic:

  • If the event does not exist, add it

  • If the event already exists, entirely replace it (all fields)

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the events, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py event addOrReplace --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py event addOrReplace --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing events, or a directory with one or more JSONL files containing events. If you reference a directory, every JSONL file in the directory will be processed.

If any exact matches (based on the id field) are found to already exist, they are overwritten by the new data.

Action: addOrUpdate

Use the addOrUpdate action of the event command to publish new events or update already-published events from JSONL files to the TimeLine Service with the following logic:

  • If the event does not exist, add it

  • If the event already exists, update existing fields

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the events, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py event addOrUpdate --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py event addOrUpdate --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing events, or a directory with one or more JSONL files containing events. If you reference a directory, every JSONL file in the directory will be processed.

If any exact matches (based on the id field) are found to already exist, field data from the uploaded JSONL records updates the corresponding fields in the matching events.

Action: update

Use the update action of the event command to update fields within one or more already-published events. This will insert and/or remove values in fields of the event that matches each input row's id field. If the specified event does not exist, the TimeLine Service will log an error and ignore the update.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the fields, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py event update --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py event update --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing event field updates, or a directory with one or more JSONL files containing event field updates. If you reference a directory, every JSONL file in the directory will be processed.

The GraphQL schema of the event update (UpdateEventInput) is as follows:

id: ID!
externalId: ID
typeId: ID
sourceId: ID
sourceData: String
summary: String
content: String
startedOn: DateTime
startedAt: GeoLocationInput
endedOn: DateTime
endedAt: GeoLocationInput
importance: Float
sourceEventIds: [ID!]
measures: [MeasureInput!]
quantities: [QuantityInput!]
properties: [PropertyInput!]
subjectEntityIds: [ID!]
referenceEntityIds: [ID!]
addSubjects: [ID!]
deleteSubjects: [ID!]
addReferences: [ID!]
deleteReferences: [ID!]
addSourceEvents: [ID!]
deleteSourceEvents: [ID!]
addMeasures: [MeasureInput!]
deleteMeasures: [MeasureInput!]
addQuantities: [QuantityInput!]
deleteQuantities: [QuantityInput!]
addProperties: [PropertyInput!]
deleteProperties: [PropertyInput!]

Because JSONL is required, each JSON structure representing an event update must be on a single line, separated from the next by a newline (no comma).

The various add and delete update options, above, allow for the incremental addition or removal of field values. This is how, for example, the results of WellLine's entity extraction processing are added to already-published events: addReferences inputs are used to incrementally add entities to the events they were extracted from.
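As a sketch, an incremental update record might look like the following. The event id "k1" echoes the add example earlier; the two entity ids are hypothetical values invented for illustration:

```python
import json

# Hypothetical update: add two extracted entities to event "k1"
# without touching any of its other fields.
update = {
    "id": "k1",
    "addReferences": ["rig.Inspirer", "organization.ExampleCo"],
}
line = json.dumps(update)  # one line of a JSONL update file
print(line)
```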

For more information on updating events, see the TimeLine Service GraphQL schema available in the GraphQL Playground environment, included in every WellLine installation.

Command: entityTypeGroup

All entityTypeGroup actions are accessed through the entityTypeGroup command provided by the TimeLine CLI. To see the syntax and get help, run the following command:

python timeline.py entityTypeGroup -h

The GraphQL schema of entityTypeGroup is as follows:

id: ID!
name: String!
description: String
group: EntityTypeGroup

For additional information on entityTypeGroups, see the WellLine data model.

Action: add

Use the add action of the entityTypeGroup command to publish new entityTypeGroups to the TimeLine Service. If you try to add an entityTypeGroup that already exists, the TimeLine Service will log an error and ignore the data.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the group types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entityTypeGroup add --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entityTypeGroup add --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing new group type objects, or a directory with one or more JSONL files containing new group type objects. If you reference a directory, every JSONL file in the directory will be processed.

Because JSONL is required, each JSON structure representing an entityTypeGroup must be on a single line, separated from the next by a newline (no comma). For example:

{"id": "problem", "name": "Problem", "description": "Grouping for problems", "groupId": "all"}
{"id": "well problem", "name": "Well Problem", "description": "Grouping for problems with a well", "groupId": "problem"}

Action: addOrReplace

Use the addOrReplace action of the entityTypeGroup command to publish new entityTypeGroup objects or replace already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the entityTypeGroup does not exist, add it

  • If the entityTypeGroup already exists, entirely replace it (all fields)

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the group types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entityTypeGroup addOrReplace --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entityTypeGroup addOrReplace --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing group type objects, or a directory with one or more JSONL files containing group type objects. If you reference a directory, every JSONL file in the directory will be processed.

If any exact matches (based on the id field) are found to already exist, they are overwritten by the new data.

Action: addOrUpdate

Use the addOrUpdate action of the entityTypeGroup command to publish new entityTypeGroup objects or update already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the entityTypeGroup does not exist, add it

  • If the entityTypeGroup already exists, update existing fields

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the group types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entityTypeGroup addOrUpdate --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entityTypeGroup addOrUpdate --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing group type objects, or a directory with one or more JSONL files containing group type objects. If you reference a directory, every JSONL file in the directory will be processed.

If an exact match (based on the id field) already exists, the field values from the uploaded JSONL row are merged into the corresponding fields of the matching object; fields omitted from the upload are left unchanged.
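The documented difference between addOrReplace (replace all fields) and addOrUpdate (merge uploaded fields) can be illustrated locally on plain dictionaries (this sketches the semantics only, not the service implementation):

```python
# An object already stored in the service, and a partial upload row.
existing = {"id": "problem", "name": "Problem",
            "description": "Grouping for problems", "groupId": "all"}
incoming = {"id": "problem", "name": "Problem (renamed)"}

# addOrReplace: the stored object becomes exactly the uploaded object.
replaced = incoming

# addOrUpdate: uploaded fields are merged over the stored object;
# fields absent from the upload keep their previous values.
updated = {**existing, **incoming}

assert "description" not in replaced
assert updated == {"id": "problem", "name": "Problem (renamed)",
                   "description": "Grouping for problems", "groupId": "all"}
```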

Action: update

Use the update action of the entityTypeGroup command to update fields within one or more already-published entityTypeGroup objects. This will insert and/or remove values in fields of the group type that matches each input row's id field. If the specified group type does not exist, the TimeLine Service will log an error and ignore the update.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the fields, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entityTypeGroup update --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entityTypeGroup update --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing group type field updates, or a directory with one or more JSONL files containing group type field updates. If you reference a directory, every JSONL file in the directory will be processed.
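The streaming-and-batching behavior implied by --chunkSize, shared by all of these upload actions, can be sketched as follows (a minimal illustration; the helper name is ours, not part of the CLI):

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Five JSONL lines batched with a chunk size of 2 -> batches of 2, 2, 1.
lines = ['{"id": "g%d"}' % i for i in range(5)]
batches = list(chunked(lines, 2))
assert [len(b) for b in batches] == [2, 2, 1]
```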

Command: entityType

All entityType actions are accessed through the entityType command provided by the TimeLine CLI. To see the syntax and get help, run the following command:

python timeline.py entityType -h

The GraphQL schema of entityType is as follows:

id: ID!
name: String!
description: String
group: EntityTypeGroup!

For additional information on entityTypes, see the WellLine data model.

Action: add

Use the add action of the entityType command to publish new entityTypes to the TimeLine Service. If you try to add an entityType that already exists, the TimeLine Service will log an error and ignore the data.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the entity types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entityType add --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entityType add --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing new entity type objects, or a directory with one or more JSONL files containing new entity type objects. If you reference a directory, every JSONL file in the directory will be processed.

Because JSONL is required, each of the JSON structures representing an entityType must be on a single line and separated from the next with a carriage return (no comma). For example:

{"id": "problem_ballooning", "name": "Ballooning Problem", "description": "A ballooning problem at a wellsite", "groupId": "well problem"}
{"id": "problem_kick", "name": "Kick Problem", "description": "A kick problem at a wellsite", "groupId": "well problem"}

Action: addOrReplace

Use the addOrReplace action of the entityType command to publish new entityType objects or replace already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the entityType does not exist, add it

  • If the entityType already exists, entirely replace it (all fields)

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the entity types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entityType addOrReplace --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entityType addOrReplace --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing entity type objects, or a directory with one or more JSONL files containing entity type objects. If you reference a directory, every JSONL file in the directory will be processed.

If any exact matches (based on the id field) are found to already exist, they are overwritten by the new data.

Action: addOrUpdate

Use the addOrUpdate action of the entityType command to publish new entityType objects or update already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the entityType does not exist, add it

  • If the entityType already exists, update existing fields

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the entity types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entityType addOrUpdate --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entityType addOrUpdate --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing entity type objects, or a directory with one or more JSONL files containing entity type objects. If you reference a directory, every JSONL file in the directory will be processed.

If an exact match (based on the id field) already exists, the field values from the uploaded JSONL row are merged into the corresponding fields of the matching object; fields omitted from the upload are left unchanged.

Action: update

Use the update action of the entityType command to update fields within one or more already-published entityType objects. This will insert and/or remove values in fields of the entity type that matches each input row's id field. If the specified entity type does not exist, the TimeLine Service will log an error and ignore the update.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the fields, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py entityType update --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py entityType update --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing entity type field updates, or a directory with one or more JSONL files containing entity type field updates. If you reference a directory, every JSONL file in the directory will be processed.

Command: eventTypeGroup

All eventTypeGroup actions are accessed through the eventTypeGroup command provided by the TimeLine CLI. To see the syntax and get help, run the following command:

python timeline.py eventTypeGroup -h

The GraphQL schema of eventTypeGroup is as follows:

id: ID!
name: String!
description: String
group: EventTypeGroup

For additional information on eventTypeGroups, see the WellLine data model.

Action: add

Use the add action of the eventTypeGroup command to publish new eventTypeGroups to the TimeLine Service. If you try to add an eventTypeGroup that already exists, the TimeLine Service will log an error and ignore the data.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the group types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py eventTypeGroup add --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py eventTypeGroup add --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing new group type objects, or a directory with one or more JSONL files containing new group type objects. If you reference a directory, every JSONL file in the directory will be processed.

Because JSONL is required, each of the JSON structures representing an eventTypeGroup must be on a single line and separated from the next with a carriage return (no comma). For example:

{"id": "production", "name": "Production", "description": "An event group from a Public Dataset.", "groupId": "all"}
{"id": "workover", "name": "Workover", "description": "An event group from a Public Dataset.", "groupId": "all"}

Action: addOrReplace

Use the addOrReplace action of the eventTypeGroup command to publish new eventTypeGroup objects or replace already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the eventTypeGroup does not exist, add it

  • If the eventTypeGroup already exists, entirely replace it (all fields)

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the group types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py eventTypeGroup addOrReplace --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py eventTypeGroup addOrReplace --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing group type objects, or a directory with one or more JSONL files containing group type objects. If you reference a directory, every JSONL file in the directory will be processed.

If any exact matches (based on the id field) are found to already exist, they are overwritten by the new data.

Action: addOrUpdate

Use the addOrUpdate action of the eventTypeGroup command to publish new eventTypeGroup objects or update already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the eventTypeGroup does not exist, add it

  • If the eventTypeGroup already exists, update existing fields

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the group types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py eventTypeGroup addOrUpdate --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py eventTypeGroup addOrUpdate --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing group type objects, or a directory with one or more JSONL files containing group type objects. If you reference a directory, every JSONL file in the directory will be processed.

If an exact match (based on the id field) already exists, the field values from the uploaded JSONL row are merged into the corresponding fields of the matching object; fields omitted from the upload are left unchanged.

Action: update

Use the update action of the eventTypeGroup command to update fields within one or more already-published eventTypeGroup objects. This will insert and/or remove values in fields of the group type that matches each input row's id field. If the specified group type does not exist, the TimeLine Service will log an error and ignore the update.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the fields, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py eventTypeGroup update --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py eventTypeGroup update --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing group type field updates, or a directory with one or more JSONL files containing group type field updates. If you reference a directory, every JSONL file in the directory will be processed.

Command: eventType

All eventType actions are accessed through the eventType command provided by the TimeLine CLI. To see the syntax and get help, run the following command:

python timeline.py eventType -h

The GraphQL schema of eventType is as follows:

id: ID!
name: String!
description: String
group: EventTypeGroup!

For additional information on eventTypes, see the WellLine data model.

Action: add

Use the add action of the eventType command to publish new eventTypes to the TimeLine Service. If you try to add an eventType that already exists, the TimeLine Service will log an error and ignore the data.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the event types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py eventType add --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py eventType add --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing new event type objects, or a directory with one or more JSONL files containing new event type objects. If you reference a directory, every JSONL file in the directory will be processed.

Because JSONL is required, each of the JSON structures representing an eventType must be on a single line and separated from the next with a carriage return (no comma). For example:

{"id": "production - daily production report", "name": "Daily Production Report", "description": "An event from a Public Dataset.", "groupId": "production"}
{"id": "workover - wire line", "name": "Wire Line", "description": "An event from a Public Dataset.", "groupId": "workover"}

Action: addOrReplace

Use the addOrReplace action of the eventType command to publish new eventType objects or replace already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the eventType does not exist, add it

  • If the eventType already exists, entirely replace it (all fields)

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the event types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py eventType addOrReplace --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py eventType addOrReplace --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing event type objects, or a directory with one or more JSONL files containing event type objects. If you reference a directory, every JSONL file in the directory will be processed.

If any exact matches (based on the id field) are found to already exist, they are overwritten by the new data.

Action: addOrUpdate

Use the addOrUpdate action of the eventType command to publish new eventType objects or update already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the eventType does not exist, add it

  • If the eventType already exists, update existing fields

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the event types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py eventType addOrUpdate --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py eventType addOrUpdate --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing event type objects, or a directory with one or more JSONL files containing event type objects. If you reference a directory, every JSONL file in the directory will be processed.

If an exact match (based on the id field) already exists, the field values from the uploaded JSONL row are merged into the corresponding fields of the matching object; fields omitted from the upload are left unchanged.

Action: update

Use the update action of the eventType command to update fields within one or more already-published eventType objects. This will insert and/or remove values in fields of the event type that matches each input row's id field. If the specified eventType does not exist, the TimeLine Service will log an error and ignore the update.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the fields, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py eventType update --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py eventType update --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing event type field updates, or a directory with one or more JSONL files containing event type field updates. If you reference a directory, every JSONL file in the directory will be processed.

Command: measureType

All measureType actions are accessed through the measureType command provided by the TimeLine CLI. To see the syntax and get help, run the following command:

python timeline.py measureType -h

The GraphQL schema of measureType is as follows:

id: ID!
name: String!
units: String!
description: String

For additional information on measureTypes, see the WellLine data model.

Action: add

Use the add action of the measureType command to publish new measureTypes to the TimeLine Service. If you try to add a measureType that already exists, the TimeLine Service will log an error and ignore the data.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the measure types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py measureType add --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py measureType add --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing new measure type objects, or a directory with one or more JSONL files containing new measure type objects. If you reference a directory, every JSONL file in the directory will be processed.

Because JSONL is required, each of the JSON structures representing a measureType must be on a single line and separated from the next with a carriage return (no comma). For example:

{"id": "depthMD", "name": "Depth MD meters", "units": "m", "description": "Measured depth, in meters"}
{"id": "mudWeight", "name": "Mud Weight specific gravity", "units": "sg", "description": "Mud weight, in specific gravity"}

Action: addOrReplace

Use the addOrReplace action of the measureType command to publish new measureType objects or replace already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the measure does not exist, add it

  • If the measure already exists, entirely replace it (all fields)

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the measure types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py measureType addOrReplace --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py measureType addOrReplace --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing measure type objects, or a directory with one or more JSONL files containing measure type objects. If you reference a directory, every JSONL file in the directory will be processed.

If any exact matches (based on the id field) are found to already exist, they are overwritten by the new data.

Action: addOrUpdate

Use the addOrUpdate action of the measureType command to publish new measureType objects or update already-published objects from JSONL files to the TimeLine Service with the following logic:

  • If the measureType does not exist, add it

  • If the measureType already exists, update existing fields

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the measure types, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py measureType addOrUpdate --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py measureType addOrUpdate --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing measure type objects, or a directory with one or more JSONL files containing measure type objects. If you reference a directory, every JSONL file in the directory will be processed.

If an exact match (based on the id field) already exists, the field values from the uploaded JSONL row are merged into the corresponding fields of the matching object; fields omitted from the upload are left unchanged.

Action: update

Use the update action of the measureType command to update fields within one or more already-published measureType objects. This will insert and/or remove values in fields of the measure type that matches each input row's id field. If the specified measureType does not exist, the TimeLine Service will log an error and ignore the update.

This command will stream the lines from a JSONL formatted file (or a directory of JSONL formatted files), batch the fields, and write them to the TimeLine Service through the chosen --target.

If uploading through GraphQL:

python timeline.py measureType update --tid <tenant_id> --target graphql --gqlUrl <timeline.svc.address> --chunkSize <int> <fileOrDir>

If uploading through Azure Queue:

python timeline.py measureType update --tid <tenant_id> --target azurequeue --azureQueueName <name> --chunkSize <int> <fileOrDir>

The fileOrDir argument must point either to an individual JSONL file containing measure type field updates, or a directory with one or more JSONL files containing measure type field updates. If you reference a directory, every JSONL file in the directory will be processed.

Command: meta

As of August 2019, the meta command converts previous metadata.json files into the required eventType, eventTypeGroup, entityType, entityTypeGroup, and measureType JSONL files, used in their respective commands (see above documentation sections).

To see the syntax and get help, run the following command:

python timeline.py meta -h

Action: convert

Use the convert action of the meta command to convert a metadata.jsonl file into individual files for each section from the file:

  • agents

  • measureTypes

  • entityTypes

  • entityTypeGroups

  • eventTypes

  • eventTypeGroups

python timeline.py meta convert <metadataFile>

The output of this command is a set of folders and files that match the layout expected by the dataset install command. These folders and files are written to the same directory as the file specified by the <metadataFile> parameter.
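The splitting that convert performs can be sketched as follows, under the assumption that the metadata file is a single JSON object keyed by section name, each holding a list of objects (the real file layout may differ; see the WellLine Data Model). The function name and output layout here are illustrative:

```python
import json
import os
import tempfile

def split_metadata(metadata, out_dir):
    """Write each top-level section of `metadata` to its own <section>.jsonl."""
    os.makedirs(out_dir, exist_ok=True)
    for section, items in metadata.items():
        path = os.path.join(out_dir, section + ".jsonl")
        with open(path, "w", encoding="utf-8") as f:
            for item in items:
                f.write(json.dumps(item) + "\n")  # one object per line

# Demo with a tiny, made-up metadata object.
out_dir = tempfile.mkdtemp()
split_metadata({"agents": [{"id": "a1"}],
                "eventTypes": [{"id": "e1"}, {"id": "e2"}]}, out_dir)
```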

For more information on the format of the metadata.jsonl file that previously contained this data, including each section, see the WellLine Data Model.

Troubleshooting

Below are scenarios and specific errors that could be encountered while using the WellLine CLI tool, along with the potential reason for the occurrence and a potential solution.

If your issue is not listed below:

  • Try to search for the problem in the upper-right corner of the documentation

  • Send the WellLine team a chat message by using our chat support module (the icon in the bottom-right corner of this documentation)

Issue: Error encountered: TypeError: 'encoding' is an invalid keyword argument for this function

Potential reason and solution: One or more dependencies have not been correctly installed, or the command is being run with Python 2. Ensure the dependencies listed under Requirements are installed for Python 3 (use "pip3" when both Python versions are present) and that commands are run with "python3".

Issue: One of these errors is encountered: TimeLine Service call failed with code: 401, or Exception: Query failed to run by returning code of 401

Potential reason and solution: The authentication token has either not been entered, has expired, or is not valid for the given environment or tenant. Ensure the token is current and was issued for the target environment and tenant.

Issue: Data has been uploaded through the CLI, but autocomplete is not working

Potential reason and solution: Data was uploaded before the tenant was created. Ensure the tenant specified in the --tid parameter was first added via tenant add.

Issue: Error encountered: UnauthorizedError: jwt malformed

Potential reason and solution: The current token is invalid, most likely because it was truncated when pasted. Make sure that when pasting the token into your command line tool, the entire token is pasted, with no leading or trailing spaces.
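A truncated paste is the usual cause of "jwt malformed", and it can be caught locally before running a command: a JWT is three base64url segments separated by dots. This is a hedged sanity-check sketch, not part of the CLI:

```python
import base64

def looks_like_jwt(token: str) -> bool:
    """Rough shape check: three dot-separated segments, first two base64url."""
    parts = token.strip().split(".")
    if len(parts) != 3:
        return False  # a truncated paste usually fails here
    for part in parts[:2]:  # header and payload should decode
        padded = part + "=" * (-len(part) % 4)  # restore stripped padding
        try:
            base64.urlsafe_b64decode(padded)
        except Exception:
            return False
    return True
```

This only checks the token's shape; a well-formed token can still be expired or issued for the wrong tenant (the 401 case above).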

© Copyright 2018 MAANA, Inc.