Databloc

Databloc is an Easyflow capability that works much like a database table. You can use it for a variety of purposes, such as storing data, transferring data between workflows, preparing data for analytics and visualisation, and more.

There are several ways to create datablocs. You can use the big plus button on the left side of the screen, or you can navigate to the Datablocs page from the left menu and click Create a new table.

The next step is to define the databloc’s schema. In database terms, this is similar to creating columns for a table.

From the Fields tab, click Manage and the schema designer will appear. Let’s create two keys: First Name and Age.
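Easyflow’s exact schema format isn’t shown in this tutorial, but a two-key schema like the one above can be pictured as a simple JSON structure (the key names match the tutorial; the type labels are illustrative):

```json
{
  "First Name": "string",
  "Age": "number"
}
```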

Now, it’s time to import data into the databloc. This can be done from the workflow by using the Datastore connector with the Insert Records operation.

There are two ways of inserting records into a databloc: Record by Record or Bulk Insert. In this video, we are going to demonstrate both of them.

Let’s start with Record by Record insert:

The first step is to select the databloc we created, the “tutorial databloc”.

Then, by leaving the Dataset textbox empty, Easyflow automatically understands that we want to insert data Record by Record.

In the Record textbox, click the magic schema picker icon and choose the tutorial databloc schema. It’s the same schema we defined while creating the databloc columns.

As you can see, the First Name and Age keys appear in the textbox.
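A single record matching that schema would then look something like this (the values are illustrative, not from the tutorial):

```json
{
  "First Name": "Jane",
  "Age": 34
}
```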

Let’s test the outcome by running the workflow from the Run button. Don’t forget to set the connector as a starting point first.

As you can see, a single record gets inserted into the tutorial databloc.

The next example is the Bulk Insert. For this purpose, we are going to read information from Google Sheets and insert it into the databloc. Let’s see how:

First, let’s add a Google Sheets connector.

We are going to read the data from the “READ DATABLOC” spreadsheet.

Choose the Get Range Values operation and configure it.

In the Range textbox, you can pick the “READ DATABLOC” spreadsheet using the folder icon without specifying any range. This means all the data in the spreadsheet will be captured.

Let’s check if the workflow is returning data from the spreadsheet as expected. To do that, we are going to run the workflow from the browser.

As you can see, the workflow returns an array of arrays containing all the expected data.
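For example, a sheet holding the two sample columns from earlier might come back shaped like this (the row values are illustrative):

```json
[
  ["Jane", "34"],
  ["John", "41"]
]
```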

The next step is to insert this data into the databloc in bulk. We need a new Databloc connector with the Insert Records operation.

This time we are going to assign a value to the Dataset textbox by mapping it to the values array returned from the Google Sheets step.

In the Records section, we are going to pick the tutorial databloc again, but this time we are going to map the data to the selected array.
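Conceptually, the bulk mapping turns each inner array into one record. Assuming the first column maps to First Name and the second to Age (an illustrative mapping, not confirmed by the tutorial), the inserted records would look like:

```json
[
  { "First Name": "Jane", "Age": 34 },
  { "First Name": "John", "Age": 41 }
]
```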

You can refer to the help and tutorials for more details.

Let’s run the workflow again and check if the data gets inserted into the databloc. As you can see, the workflow inserted all the data as expected.

In addition to Insert Records, the Databloc connector provides additional operations such as Search Records, Get a Single Record, Delete Records, and many more.

Let's try the Search Records operation. You can configure your filter and sort criteria in JSON format.
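Easyflow’s exact filter syntax isn’t documented in this tutorial, but JSON filter and sort criteria typically look along these lines (the keys, the `$gt` operator, and the sort convention are illustrative assumptions, not Easyflow’s confirmed syntax):

```json
{
  "filter": { "Age": { "$gt": 30 } },
  "sort": { "First Name": 1 }
}
```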

By running the workflow from the browser, you can see that the response returns data from the databloc as an array of JSON objects, as expected.

Thank you for watching this tutorial. Please feel free to contact us if you have any questions.
