
Column names on S3 + Pipe => Error #362

@thebruuu

Describe the bug

Encountered an error while running operation: Database Error
001044 (42P13): SQL compilation error: error line 4 at position 17
Invalid argument types for function 'GET': (VARCHAR(134217728), VARCHAR(3))

The code generated for the COPY INTO references the columns incorrectly ($1:SID instead of positional $1):

COPY into TESTS_MACRO.TEST_MACRO.S3_FED_PIPE_ALL_COLUMNS_WO_HEADER
    from (
        select
          nullif($1:SID,'')::NUMBER as SID,
          nullif($1:CustomerId,'')::VARCHAR as CustomerId,
          nullif($1:FirstName,'')::VARCHAR as FirstName,
          nullif($1:LastName,'')::VARCHAR as LastName,
          metadata$filename::varchar as metadata_filename,
          metadata$file_row_number::bigint as metadata_file_row_number,
          metadata$file_last_modified::timestamp as metadata_file_last_modified,
          metadata$start_scan_time::timestamp as _dbt_copied_at
        from @TESTS_MACRO.TEST_MACRO.S3_T0001 
    )
    file_format = ( type = 'csv' FIELD_DELIMITER=',' DATE_FORMAT=AUTO TIME_FORMAT=AUTO TRIM_SPACE=TRUE FIELD_OPTIONALLY_ENCLOSED_BY='\"' ) 
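For context on the error itself: $1:SID is Snowflake's semi-structured path syntax, which is resolved via GET($1, 'SID'); on a CSV stage $1 is a VARCHAR rather than a VARIANT, which is exactly the GET(VARCHAR(134217728), VARCHAR(3)) signature in the error above. A minimal illustration against a hypothetical CSV stage @my_csv_stage:

-- $1:SID is resolved as GET($1, 'SID'); on a CSV stage $1 is a VARCHAR,
-- so this fails with "Invalid argument types for function 'GET'"
select $1:SID from @my_csv_stage;

-- CSV columns are positional and are referenced as $1, $2, ...
select $1, $2 from @my_csv_stage;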

Steps to reproduce

My sources.yml is:

version: 2

sources:
  - name: TESTS_MACRO
    database: TESTS_MACRO
    schema: TEST_MACRO
    loader: S3 + snowpipe

    tables:
      - name: S3_FED_PIPE_ALL_COLUMNS_WO_HEADER
        external:
          location: "@TESTS_MACRO.TEST_MACRO.S3_T0001"
          file_format: ( TYPE= 'csv'  
                        FIELD_DELIMITER=','
                        DATE_FORMAT=AUTO
                        TIME_FORMAT=AUTO
                        TRIM_SPACE=TRUE
                        FIELD_OPTIONALLY_ENCLOSED_BY='\"'
                        )
          infer_schema: false
          snowpipe:
            auto_ingest: false
            copy_options: "on_error = continue, enforce_length = false"
        columns:
            - name: SID
              data_type: NUMBER
            - name: CustomerId
              data_type: VARCHAR
            - name: FirstName
              data_type: VARCHAR
            - name: LastName
              data_type: VARCHAR

Expected results

I was expecting this code to be generated for the COPY INTO:

COPY into TESTS_MACRO.TEST_MACRO.S3_FED_PIPE_ALL_COLUMNS_WO_HEADER
    from (
        select
          nullif($1,'')::NUMBER as SID,
          nullif($2,'')::VARCHAR as CustomerId,
          nullif($3,'')::VARCHAR as FirstName,
          nullif($4,'')::VARCHAR as LastName,
          metadata$filename::varchar as metadata_filename,
          metadata$file_row_number::bigint as metadata_file_row_number,
          metadata$file_last_modified::timestamp as metadata_file_last_modified,
          metadata$start_scan_time::timestamp as _dbt_copied_at
        from @TESTS_MACRO.TEST_MACRO.S3_T0001 
    )
    file_format = ( type = 'csv' FIELD_DELIMITER=',' DATE_FORMAT=AUTO TIME_FORMAT=AUTO TRIM_SPACE=TRUE FIELD_OPTIONALLY_ENCLOSED_BY='\"' ) 

Root Cause

The is_csv.sql macro is not very robust: it looks for the literal string type=csv, so type='csv' does not match, and neither does type= csv.
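
A sketch of a more tolerant check (my suggestion, not the package's current code): lowercase the format string and strip spaces and quotes before matching, so that type=csv, type='csv', TYPE = csv, and type= csv are all recognized:

{% macro is_csv(file_format) %}
    {# Sketch only: normalize case, then strip spaces and quotes so that
       type=csv / type='csv' / TYPE = "CSV" all reduce to type=csv #}
    {% set normalized = file_format | lower | replace(' ', '') | replace("'", '') | replace('"', '') %}
    {{ return('type=csv' in normalized) }}
{% endmacro %}

As a workaround until then, writing the format exactly as type=csv (lowercase, no quotes, no spaces around the =) in sources.yml may satisfy the current literal match.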

System information

The contents of your packages.yml file:

Which database are you using dbt with?

  • snowflake

The output of dbt --version:

"dbt_version": "2025.7.22+b91572d"

dbt Cloud
