How to Bulk Import from Files
Bulk Importing Data from CSV to BigQuery
Requirements
- A CSV file to import
Note: The destination table's column names must match the headers in the CSV file.
Note: TSV and JSON formats are also supported.
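For example, if the CSV header row is `id,name,email`, the destination BigQuery table needs columns with those exact names. A minimal sketch of such a table, using placeholder dataset, table, and column names that are not part of this guide:

```sql
-- Placeholder names (my_dataset, customers, id/name/email) for illustration only.
-- The column names must match the CSV header row exactly.
CREATE TABLE IF NOT EXISTS my_dataset.customers (
  id    INT64,
  name  STRING,
  email STRING
);
```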
Placing Components
- Place a [Column].
- Adjust the width by moving the [Column] adjusters.
- Place a [Table] on the left side of the [Column].
- Place a [File] on the right side of the [Column].
- Place a [Modal] below the [File].
- Enter `Check CSV` as the [Button label] in the [Modal].
- Open the [Modal] and place a [Table] inside it.
- Place a [Button] below the [Table] in the [Modal].
- Enter `Execute Bulk Import` as the [Label] on the [Button].
Creating a Dataflow
- Click [Create] in the [Dataflows].
- Select any data source integrated with BigQuery.
- Choose [SQL] for the [Mode].
- Enter the following SQL in [SQL]:
SELECT * FROM <dataset_name>.<table_name> LIMIT 20;
- Click [Run] to execute the query.
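For illustration only, assuming a hypothetical dataset `my_dataset` and table `customers` (substitute your own names), the query would read:

```sql
-- Example only; replace with your own dataset and table names.
SELECT * FROM my_dataset.customers LIMIT 20;
```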
- Click [Create] in the [Dataflows] again.
- Select any data source integrated with BigQuery.
- Choose [GUI] for the [Mode].
- Select an [Action]. Choose `Bulk insert` for bulk inserts or `Bulk upsert` for bulk inserts/updates (see the SQL sketch after this list).
- Enter the relevant dataset name in [Dataset] for BigQuery.
- Enter the relevant table name in [Table] for BigQuery.
- Enter `{{ file1.value[0].parsedData }}` in [Items].
- Enter `id` in [Primary keys] (only if `Bulk upsert` is selected).
- Add a [Success events] entry and select `dataflow1` under [Dataflow].
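To make the two actions concrete, the sketch below shows roughly what they amount to in BigQuery SQL. The dataset, table, column names, and sample values are placeholders, and the tool builds its own statements internally from [Items]; `{{ file1.value[0].parsedData }}` is assumed here to be an array of row objects keyed by the CSV headers.

```sql
-- Bulk insert: every parsed CSV row is appended as a new row (placeholder names/values).
INSERT INTO my_dataset.customers (id, name, email)
VALUES (1, 'Alice', 'alice@example.com'),
       (2, 'Bob', 'bob@example.com');

-- Bulk upsert: rows whose primary key (id) already exists are updated, the rest
-- are inserted; roughly a MERGE on the key listed in [Primary keys].
MERGE my_dataset.customers AS t
USING (SELECT 1 AS id, 'Alice' AS name, 'alice@new.example.com' AS email) AS s
ON t.id = s.id
WHEN MATCHED THEN
  UPDATE SET name = s.name, email = s.email
WHEN NOT MATCHED THEN
  INSERT (id, name, email) VALUES (s.id, s.name, s.email);
```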
Linking Dataflow and Components
- Enter `{{ dataflow1.data }}` as the [Data] for the first [Table] you placed.
- Enter `{{ file1.value[0].parsedData }}` as the [Data] for the [Table] inside the [Modal].
- Click [Add] on the [Button] inside the [Modal].
- Select `dataflow2` under [Dataflow] in the event settings.
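After clicking `Execute Bulk Import`, the [Success events] setting re-runs dataflow1, so the first [Table] should refresh with the imported rows. If you want to check directly in BigQuery as well, a quick count works; a minimal sketch with the same placeholder names as above:

```sql
-- Placeholder names; verify that the imported rows arrived in the destination table.
SELECT COUNT(*) AS row_count
FROM my_dataset.customers;
```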