Creating your first automation
Estimated time: 15 minutes
Last updated
You can create a free account at TunnelHub.io. Just provide your name and email and accept the terms and conditions. No credit card is required for free accounts.
To help during the development process, we created a CLI package to interact with our platform in a simple and productive way. To install it, you must have Node.js v12+ and npm installed.
npm install -g @tunnelhub/cli
If you are using Yarn, run:
yarn global add @tunnelhub/cli
To run deploys, list existing automations, or create new resources, you must be authenticated with our platform. This can be done with the th login command. When running it, you will be prompted for three pieces of information:
Tenant ID
User
Password
The Tenant ID can be found by clicking the information button on the upper right bar of the system, in the Company ID field. The username and password are the same as those used to enter the portal.
Now you are ready to use all CLI commands in your TunnelHub account. You can execute th login-check at any time to verify that your credentials are valid.
Before creating an automation, it's necessary to create a package. Packages are logical units for grouping items together within the TunnelHub platform.
You can create your package in the user interface or using the CLI. To create a package in the DEV environment using the CLI, execute:
th create-package --env DEV
Now it's time to create an automation. You can do it in the user interface or using the CLI. To create an automation in the DEV environment using the CLI, use the command:
th create-automation --env DEV
This command creates the automation in TunnelHub and also generates an initial skeleton for your application based on four models:
No delta (individual)
No delta (in batch)
With delta (individual)
With delta (in batch)
According to the chosen model, a different template will be created with all the code necessary to start your automation. In most cases, the template "No delta (individual)" will be a good choice. So let's do it:
All the necessary code will be created in a new folder with the name chosen for your automation. You can open it using your favorite IDE, like VS Code, with cd My-first-automation && code .
If you check the user interface on the web app, the newly created automation is visible there too. Now you are ready to start coding your automation.
First of all, we need to install all dependencies. You can do it by executing yarn install in the project root. With dependencies installed, our integration code is in src/classes/integration.ts. This class has three important methods:
loadSourceSystemData
defineMetadata
sendData
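Put together, the class shape looks roughly like this. This is a simplified sketch only: the real generated template extends a TunnelHub base class with its own types, so every name and signature below is an assumption based on the descriptions in this guide.

```typescript
// Simplified sketch of the generated integration class (hypothetical shapes;
// the real template extends a TunnelHub SDK base class with its own types).

type Metadata = {
  fieldName: string;
  fieldLabel: string;
  fieldType: 'TEXT' | 'NUMBER' | 'DATE' | 'DATETIME' | 'BOOLEAN';
  hideInTable?: boolean;
};

type ItemStatus = { status: 'SUCCESS' | 'ERROR'; message: string };

class Integration {
  // Collects data from the source system; must return an array of objects.
  async loadSourceSystemData(): Promise<Record<string, unknown>[]> {
    return [];
  }

  // Describes how each field should appear on the monitoring screen.
  defineMetadata(): Metadata[] {
    return [];
  }

  // Sends one item to the target ("No delta (individual)" model).
  async sendData(_item: Record<string, unknown>): Promise<ItemStatus> {
    return { status: 'SUCCESS', message: 'Success' };
  }
}
```

Each of the three methods is covered in detail below.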
loadSourceSystemData
This method is responsible for collecting the data from the data source. It's an async method and must return an array of objects. Let's check an example:
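A sketch of what such a method can look like. The original example uses the got HTTP client; to keep this sketch self-contained, a local stub stands in for the HTTP request, and the URL and the five field names are hypothetical.

```typescript
// Sketch of loadSourceSystemData (field names are hypothetical).
// fetchUsers below stands in for a real HTTP request such as
// `await got('https://example.com/api/users').json()` with the got client.

type UserRow = {
  id: number;
  name: string;
  email: string;
  department: string;
  active: boolean;
};

// Stub standing in for the real HTTP call.
async function fetchUsers(): Promise<UserRow[]> {
  return [
    { id: 1, name: 'Ana', email: 'ana@example.com', department: 'Sales', active: true },
    { id: 2, name: 'Bruno', email: 'bruno@example.com', department: 'IT', active: false },
  ];
}

// Must be async and must return an array of objects (five columns here).
async function loadSourceSystemData(): Promise<UserRow[]> {
  const users = await fetchUsers();
  return users.map((u) => ({
    id: u.id,
    name: u.name,
    email: u.email,
    department: u.department,
    active: u.active,
  }));
}
```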
In the example, data is requested from an API using the got HTTP client and returned as an array of objects with five columns.
defineMetadata
This method is responsible for describing how the data returned by loadSourceSystemData should be displayed on the monitoring screen in a human-readable way. It's a sync method and must return an array of objects with column definitions. Let's check an example:
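A sketch of a possible column definition, matching the five hypothetical fields used in the loadSourceSystemData example above (the exact types come from the generated template, so the Metadata type here is an assumption):

```typescript
// Sketch of defineMetadata. Field names are hypothetical and match the five
// columns of the loadSourceSystemData sketch; the Metadata type is assumed.

type Metadata = {
  fieldName: string;
  fieldLabel: string;
  fieldType: 'TEXT' | 'NUMBER' | 'DATE' | 'DATETIME' | 'BOOLEAN';
  hideInTable?: boolean; // optional; omitted here
};

function defineMetadata(): Metadata[] {
  return [
    { fieldName: 'id', fieldLabel: 'ID', fieldType: 'NUMBER' },
    { fieldName: 'name', fieldLabel: 'Name', fieldType: 'TEXT' },
    { fieldName: 'email', fieldLabel: 'E-mail', fieldType: 'TEXT' },
    { fieldName: 'department', fieldLabel: 'Department', fieldType: 'TEXT' },
    { fieldName: 'active', fieldLabel: 'Active', fieldType: 'BOOLEAN' },
  ];
}
```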
The fields are described below:
fieldName: the technical name of the field returned by loadSourceSystemData
fieldLabel: the human-friendly name of the field to be displayed in the monitoring
fieldType: the type of the field, used for formatting. The possible values are 'TEXT' | 'NUMBER' | 'DATE' | 'DATETIME' | 'BOOLEAN'. The monitoring screen will apply formatting automatically according to the field type.
hideInTable: not present in the example because it is optional; it can be used to hide the field in the log table while still showing it on the detail screen.
sendData
This method is responsible for sending the data to your target. In our example, we are using the "No delta (individual)" template, so this async method will be executed for each array item returned by loadSourceSystemData and must return an object defining the processing status. Let's check an example:
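A sketch of what the FTP variant can look like. To keep it self-contained, uploadToFtp is a stub standing in for a real FTP client call, and the return shape (status plus message) is an assumption based on the description below.

```typescript
// Sketch of sendData for the "No delta (individual)" model. uploadToFtp is a
// stub standing in for a real FTP upload; the ItemStatus shape is assumed.

type Item = { id: number; name: string };
type ItemStatus = { status: 'SUCCESS' | 'ERROR'; message: string };

async function uploadToFtp(_fileName: string, _content: string): Promise<void> {
  // In the real automation this would write the file via an FTP client.
}

async function sendData(item: Item): Promise<ItemStatus> {
  await uploadToFtp(`item-${item.id}.json`, JSON.stringify(item));
  // Exceptions are caught by the platform by default, so no try/catch is
  // needed; on success, return a message to show on the monitoring screen.
  return { status: 'SUCCESS', message: 'Success' };
}
```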
In this example, the automation will create a file on an FTP server for each object returned by loadSourceSystemData. After creating the file successfully, it is necessary to return an object with some message (in this case, "Success") to be displayed on the monitoring screen - but it can be any other text. Exceptions are caught by default and don't require any special handling.
We know testing cloud applications is a lot of work because each deploy takes time, and there's no time to lose. So it's much better to have a way to execute and simulate the automation in our local environment. For that, we have set up basic test cases using jest in the __tests__ folder. Let's take a quick look:
If you are already familiar with jest, there's nothing new. But we have some important settings for local executions in the beforeAll section:
These sections are mandatory for any test because they mock routines that only make sense in the real runtime: saving logs, updating metadata, persisting contexts between internal lambdas, and updating statistics like execution time and error count. Any other mocks are optional and up to you. When debugging, it's very common to skip any further mocks so you can make real calls and evaluate the results locally.
When you are confident about your integration, let's deploy it and run it in the cloud. Since our example uses an FTP server as the data target, you will need a working FTP server and to adjust your code to connect to it.
As your code will run in the AWS Lambda environment and we are using TypeScript, it's necessary to create a bundle with all code and dependencies transpiled to JavaScript. In our template, this is already configured using Webpack. Just execute:
yarn run build && th deploy-automation --env DEV --message "My first deploy"
This will create the bundle and deploy your artifacts to your TunnelHub account. You can check your deploy details in Automation -> Automation Details -> Deployments.
After you successfully deploy, it's time to execute your integration in the cloud environment. This can be achieved in many ways, including:
Creating a schedule
Creating a webhook and calling it manually using Postman or another HTTP client
Executing it manually through the user interface
To execute, it's necessary to have a trigger defined first. Let's edit our automation, set the trigger type to "On event / Webhook", and save it:
Now let's execute it manually. Go to the Automations menu, find your automation, press the Operations button in the Actions column, and select Execute now:
On the modal, just press Execute without a payload:
After that, your integration will start executing! You can check the progress in the menu Automations -> Monitoring.
After it finishes, you can check the detailed log by pressing the See details button in the last column:
The log covers the data line by line, with the column titles defined in the metadata output. You can check record details by clicking the Log ID in the first column:
In most cases, someone will need to receive an alert if an automation fails. This can be set up easily through a panel on the automation details page, in the Notifications section:
To add someone as a recipient, just click the "+ Add" button and fill in all the required information:
That's it! The platform will send an e-mail message in the selected language warning that the execution has errors. To see the details, the recipient must have a user on the platform with all the necessary permissions.