
This function is deprecated: please use api-job and api-perform instead.

Usage

insert_extract_job(
  project,
  dataset,
  table,
  destination_uris,
  compression = "NONE",
  destination_format = "NEWLINE_DELIMITED_JSON",
  ...,
  print_header = TRUE,
  billing = project
)
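
A minimal sketch of a call, assuming a dataset you can read and a Cloud Storage bucket you can write to (all identifiers below are hypothetical placeholders):

# Extract a table to Cloud Storage as gzipped newline-delimited JSON
job <- insert_extract_job(
  project = "my-project",
  dataset = "my_dataset",
  table = "my_table",
  destination_uris = "gs://my-bucket/my_table.json.gz",
  compression = "GZIP"
)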

Arguments

project, dataset

Project and dataset identifiers

table

Name of the table to extract data from

destination_uris

Fully qualified Google Cloud Storage URL(s). For large extracts you may need to specify a wildcard, since BigQuery writes at most 1 GB of data to a single file (see the example after this list).

compression

Compression type ("NONE", "GZIP")

destination_format

Destination format ("CSV", "AVRO", or "NEWLINE_DELIMITED_JSON")

...

Additional arguments passed on to the underlying API call. snake_case names are automatically converted to camelCase (illustrated in the example after this list).

print_header

Include a row of column headers in the results?

billing

Project ID to use for billing
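
To illustrate the two points above: extracts over 1 GB must be split across multiple files via a wildcard URI, and extra fields of the extract job configuration can be passed through ... in snake_case. A sketch, again with placeholder identifiers:

job <- insert_extract_job(
  project = "my-project",
  dataset = "my_dataset",
  table = "big_table",
  destination_uris = "gs://my-bucket/big_table_*.csv",
  destination_format = "CSV",
  field_delimiter = "\t"  # sent to the API as fieldDelimiter
)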

Value

A job resource list, as documented at https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs.
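
Because the return value mirrors that REST resource, individual fields can be pulled out with ordinary list indexing. For example, a quick status check (assuming job was created as in the sketches above):

job$status$state        # "PENDING", "RUNNING", or "DONE"
job$jobReference$jobId  # server-assigned job identifier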
