
Import Data

Use any of the supported ClickHouse formats to import and export your data.

Import your data into the cluster

curl -X 'POST' \
  'https://api.gigapipe.com/v1/clusters/<your_cluster_slug>/imports/s3' \
  -H 'accept: application/json' \
  -H 'Authorization: Bearer <your_access_token>' \
  -H 'Content-Type: application/json' \
  -d '{
    "table": "your_table_name",
    "path": "your_s3_path",
    "aws_access_key_id": "string",
    "aws_secret_access_key": "string",
    "format": "CSVWithNames",
    "columns": [
      {
        "name": "column_name",
        "type": "Column_type"
      }
      ...
    ],
    "compression": "gzip"
  }'

# Payload response ::Dictionary
{
  "message": "S3 file will be imported."
}
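The same request can be issued from Python. The sketch below assumes the requests library; the cluster slug, access token, S3 path, AWS credentials, and column definitions are all placeholder values you would replace with your own.

import requests

# Placeholder values; substitute your own cluster slug and access token.
CLUSTER_SLUG = "your_cluster_slug"
ACCESS_TOKEN = "your_access_token"

payload = {
    "table": "your_table_name",
    "path": "your_s3_path",
    "aws_access_key_id": "string",
    "aws_secret_access_key": "string",
    "format": "CSVWithNames",
    "columns": [
        {"name": "column_name", "type": "Column_type"},
    ],
    "compression": "gzip",
}

response = requests.post(
    f"https://api.gigapipe.com/v1/clusters/{CLUSTER_SLUG}/imports/s3",
    headers={
        "accept": "application/json",
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())  # {"message": "S3 file will be imported."}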

Parameters

Attributes


cluster_slug  ::String

The slug of the cluster


Payload

Attributes


table  ::String

The table name


path  ::String

The S3 path


aws_access_key_id  ::String

Your access key ID


aws_secret_access_key  ::String

Your secret access key


format  ::String

The ClickHouse format to import from (for example, CSVWithNames)


columns  ::Array[Dictionary]

The columns/types you want to import (see the example after these attributes)

name  ::String

The column name

type  ::String

The column type


compression  ::String

The compression type
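
As a concrete illustration of the columns array, the snippet below describes a hypothetical events table; the column names are placeholders, while the types are standard ClickHouse column types.

# Hypothetical column definitions for the "columns" field of the payload.
# The names are placeholders; DateTime, UInt64, and String are standard
# ClickHouse column types.
columns = [
    {"name": "timestamp", "type": "DateTime"},
    {"name": "user_id", "type": "UInt64"},
    {"name": "event", "type": "String"},
]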


Response

Attributes


message  ::String

A basic message response