24 changes: 24 additions & 0 deletions .github/workflows/e2e.yml
@@ -74,6 +74,30 @@ jobs:
- name: Run all e2e tests
if: github.event_name == 'workflow_dispatch' || github.event_name == 'push' || steps.filter.outputs.e2e-test == 'true'
run: python3 e2e/src/main/scripts/run_e2e_test.py --testRunner **/${{ matrix.tests }}/**/TestRunner.java
# Step to list the contents of the target directory for debugging
- name: List target directory contents for debugging
if: always()
run: ls -al ./plugin/target
- name: Read failed_scenarios.txt for debugging
if: always() # Ensures this step runs regardless of the previous outcome
run: |
if [ -f ./plugin/target/failed_scenarios.txt ]; then
echo "Contents of failed_scenarios.txt:"
cat ./plugin/target/failed_scenarios.txt
else
echo "failed_scenarios.txt file not found."
fi
# Step to check if there were failures and run retry runner if needed
- name: Check for Failed Tests and Run Retry Test Runner
if: always() # Always run this step to check for failures
run: |
echo "Checking for failed_scenarios.txt"
if [ -s ./plugin/target/failed_scenarios.txt ]; then # -s checks if the file is non-empty
echo "Found failed scenarios. Running retry tests."
mvn -f ./plugin/pom.xml verify -Dtest=**/${{ matrix.tests }}/**/RetryTestRunner
else
echo "No failed scenarios found or file is empty."
fi
- name: Upload debug files
uses: actions/upload-artifact@v3
if: always()
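The retry step above only finds work to do if something writes plugin/target/failed_scenarios.txt during the main run. With Cucumber JVM that is typically produced by the built-in "rerun" plugin on the main runner. The sketch below is illustrative only and is not taken from this PR: the package name, glue package, and tag filter are placeholder assumptions; only the rerun file path comes from the workflow above.

// Illustrative sketch, assuming a JUnit 4 Cucumber runner: the "rerun" plugin
// records any failed scenarios to target/failed_scenarios.txt, the file the
// workflow step above inspects. Package, glue, and tags are placeholders.
package io.cdap.plugin.runners; // hypothetical package

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

@RunWith(Cucumber.class)
@CucumberOptions(
  features = {"src/e2e-test/features"},
  glue = {"stepsdesign"},                                   // hypothetical glue package
  tags = {"@BigQuery_Sinks"},                               // example tag from this PR
  plugin = {"pretty", "rerun:target/failed_scenarios.txt"}  // writes the rerun file checked by the workflow
)
public class TestRunner {
}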
5 changes: 3 additions & 2 deletions pom.xml
@@ -1201,7 +1201,8 @@
<id>e2e-tests</id>
<properties>
<testSourceLocation>src/e2e-test/java</testSourceLocation>
<TEST_RUNNER>TestRunner.java</TEST_RUNNER>
<runner>TestRunner.java</runner>
<RETRY_RUNNER>RetryTestRunner.java</RETRY_RUNNER>
</properties>
<build>
<testResources>
@@ -1225,7 +1226,7 @@
<version>3.0.0-M5</version>
<configuration>
<includes>
<include>${TEST_RUNNER}</include>
<include>${runner}</include>
</includes>
<!--Start configuration to run TestRunners in parallel-->
<parallel>classes</parallel> <!--Running TestRunner classes in parallel-->
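The new RETRY_RUNNER property names RetryTestRunner.java, which the e2e workflow invokes with -Dtest. A minimal sketch of such a runner, assuming the standard Cucumber rerun-file convention (the package and glue values are placeholders, not the repository's actual ones):

// Illustrative sketch: a retry runner that re-executes only the scenarios
// listed in the rerun file produced by the main TestRunner.
package io.cdap.plugin.runners; // hypothetical package

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

@RunWith(Cucumber.class)
@CucumberOptions(
  features = {"@target/failed_scenarios.txt"}, // "@<file>" tells Cucumber to read scenario locations from the rerun file
  glue = {"stepsdesign"},                      // hypothetical glue package
  plugin = {"pretty"}
)
public class RetryTestRunner {
}

Because the test-runner <include> is driven by ${runner}, the retry class could presumably also be selected locally with mvn verify -P e2e-tests -Drunner=RetryTestRunner.java, although this PR only invokes it through -Dtest.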
@@ -12,7 +12,7 @@
# License for the specific language governing permissions and limitations under
# the License.

@BigQueryMultiTable_Sink
@BigQueryMultiTable_Sinks
Feature: BigQueryMultiTable sink -Verification of BigQuery to BigQueryMultiTable successful data transfer

@BQ_TWO_SOURCE_BQMT_TEST @BQ_DELETE_TABLES_TEST
@@ -1,4 +1,4 @@
@BigQuery_Sink
@BigQuery_Sinks
Feature: BigQuery sink - Validate BigQuery sink plugin error scenarios

@BigQuery_Sink_Required
@@ -12,7 +12,7 @@
# License for the specific language governing permissions and limitations under
# the License.

@BigQuery_Sink
@BigQuery_Sinks
Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data transfer

@BQ_UPSERT_SOURCE_TEST @BQ_UPSERT_SINK_TEST @EXISTING_BQ_CONNECTION
@@ -108,7 +108,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data transfer
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Expand Plugin group in the LHS plugins list: "Sink"s
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Navigate to the properties page of plugin: "BigQuery"
@@ -12,7 +12,7 @@
# License for the specific language governing permissions and limitations under
# the License.

@BigQuery_Sink
@BigQuery_Sinks
Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data transfer

@BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST
@@ -22,7 +22,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data transfer
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connections
Then Navigate to the properties page of plugin: "BigQuery"
And Enter input plugin property: "referenceName" with value: "Reference"
And Replace input plugin property: "project" with value: "projectId"
@@ -75,7 +75,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data transfer
And Replace input plugin property: "dataset" with value: "dataset"
And Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQuery" plugin properties
Then Validate "BigQuery" plugin propertiess
And Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
4 changes: 2 additions & 2 deletions src/e2e-test/features/bigquery/sink/GCSToBigQuery.feature
@@ -1,4 +1,4 @@
@BigQuery_Sink
@BigQuery_Sinks
Feature: BigQuery sink - Verification of GCS to BigQuery successful data transfer

@CMEK @GCS_CSV_TEST @BQ_SINK_TEST @BigQuery_Sink_Required
@@ -57,7 +57,7 @@ Feature: BigQuery sink - Verification of GCS to BigQuery successful data transfer
Then Enter BigQuery property reference name
Then Enter BigQuery property projectId "projectId"
Then Enter BigQuery property datasetProjectId "projectId"
Then Override Service account details if set in environment variables
Then Override Service account details if set in environment variablesss
Then Enter BigQuery property dataset "dataset"
Then Enter BigQuery sink property table name
Then Toggle BigQuery sink property truncateTable to true
@@ -6,7 +6,7 @@ Feature: BigQuery sink - Verification of GCS to BigQuery successful data transfer
Given Open Datafusion Project to configure pipeline
When Source is GCS
When Sink is BigQuery
Then Open GCS source properties
Then Open GCS source propertiess
Then Enter GCS property reference name
Then Enter GCS property "projectId" as macro argument "gcsProjectId"
Then Enter GCS property "serviceAccountType" as macro argument "serviceAccountType"
@@ -1,9 +1,9 @@
@BigQuery_Source
@BigQuery_Sources
Feature: BigQuery source - Validate BigQuery source plugin error scenarios

Scenario Outline:Verify BigQuery Source properties validation errors for mandatory fields
Given Open Datafusion Project to configure pipeline
When Source is BigQuery
When Source is BigQuerys
Then Open BigQuery source properties
Then Enter the BigQuery properties with blank property "<property>"
Then Validate mandatory property error for "<property>"
@@ -29,7 +29,7 @@ Feature: BigQuery source - Validate BigQuery source plugin error scenarios

Scenario Outline:Verify BigQuery Source properties validation errors for incorrect format of projectIds
Given Open Datafusion Project to configure pipeline
When Source is BigQuery
When Source is BigQuerys
Then Open BigQuery source properties
Then Enter BigQuery property reference name
Then Enter BigQuery property projectId "<ProjectID>"
@@ -1,4 +1,4 @@
@BigQuery_Source
@BigQuery_Sources
Feature: BigQuery source - Verification of BigQuery to BigQuery successful data transfer

@BQ_SOURCE_TEST @BQ_SINK_TEST
@@ -1,4 +1,4 @@
@BigQuery_Source
@BigQuery_Sources
Feature: BigQuery source - Verification of BigQuery to BigQuery successful data transfer using connections

@BQ_SOURCE_TEST @BQ_SINK_TEST @BQ_CONNECTION @BigQuery_Source_Required
@@ -1,4 +1,4 @@
@BigQuery_Source
@BigQuery_Sources
Feature: BigQuery source - Verification of BigQuery to GCS successful data transfer

@CMEK @BQ_SOURCE_TEST @GCS_SINK_TEST
@@ -1,4 +1,4 @@
@BigQuery_Source
@BigQuery_Sources
Feature: BigQuery source - Verification of BigQuery to GCS successful data transfer with macro arguments

@CMEK @BQ_SOURCE_TEST @GCS_SINK_TEST
@@ -1,4 +1,4 @@
@BigQuery_Source
@BigQuery_Sources
Feature: BigQuery source - Verification of BigQuery to Multiple sinks successful data transfer

@CMEK @BQ_SOURCE_TEST @GCS_SINK_TEST @BQ_SINK_TEST @PUBSUB_SINK_TEST @BigQuery_Source_Required @CMEK_Required
4 changes: 2 additions & 2 deletions src/e2e-test/features/bigqueryexecute/BQExecute.feature
@@ -1,4 +1,4 @@
@BQExecute
@BQExecutes
Feature: BigQueryExecute - Verify data transfer using BigQuery Execute plugin

@BQ_SOURCE_TEST @BQ_SINK_TEST @BQ_EXECUTE_SQL @BQExecute_Required
@@ -17,7 +17,7 @@ Feature: BigQueryExecute - Verify data transfer using BigQuery Execute plugin
Then Validate "BigQuery Execute" plugin properties
Then Close the Plugin Properties page
Then Save and Deploy Pipeline
Then Run the Pipeline in Runtime
Then Run the Pipeline in Runtimes
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
@@ -1,4 +1,4 @@
@BQExecute
@BQExecutes
Feature: BigQueryExecute - Verify BigQueryExecute plugin error scenarios

Scenario: Verify BigQueryExecute validation error for mandatory field SQL Query
@@ -1,4 +1,4 @@
@BQExecute
@BQExecutess
Feature: BigQueryExecute - Verify data transfer using BigQuery Execute plugin with macro arguments

@BQ_SOURCE_TEST @BQ_SINK_TEST @BQ_EXECUTE_SQL
2 changes: 1 addition & 1 deletion src/e2e-test/features/bigtable/BigTableToBigTable.feature
@@ -11,7 +11,7 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
@BigTable @BIGTABLE_SOURCE_TEST
@BigTabless @BIGTABLE_SOURCE_TEST
Feature: BigTable source - Verification of BigTable to BigTable Successful Data Transfer

@BIGTABLE_SINK_TEST @bigtable_Required
2 changes: 1 addition & 1 deletion src/e2e-test/features/datastore/runtime.feature
@@ -12,7 +12,7 @@
# License for the specific language governing permissions and limitations under
# the License.

@DataStore
@DataStores
Feature: DataStore - Verification of Datastore to Datastore Successful Data Transfer

@DATASTORE_SOURCE_ENTITY @datastore_Required
2 changes: 1 addition & 1 deletion src/e2e-test/features/gcs/sink/GCSSink.feature
@@ -1,4 +1,4 @@
@GCS_Sink
@GCS_Sinks
Feature: GCS sink - Verification of GCS Sink plugin

@CMEK @GCS_SINK_TEST @BQ_SOURCE_TEST
2 changes: 1 addition & 1 deletion src/e2e-test/features/gcs/sink/GCSSinkError.feature
@@ -1,4 +1,4 @@
@GCS_Sink
@GCS_Sinks
Feature: GCS sink - Verify GCS Sink plugin error scenarios

Scenario Outline:Verify GCS Sink properties validation errors for mandatory fields
2 changes: 1 addition & 1 deletion src/e2e-test/features/gcs/source/GCSSourceError.feature
@@ -1,4 +1,4 @@
@GCS_Source
@GCS_Sources
Feature: GCS source - Verify GCS Source plugin error scenarios

Scenario Outline:Verify GCS Source properties validation errors for mandatory fields
@@ -1,4 +1,4 @@
@GCS_Source
@GCS_Sources
Feature: GCS source - Verification of GCS to BQ successful data transfer

@GCS_CSV_TEST @BQ_SINK_TEST
@@ -1,4 +1,4 @@
@GCS_Source
@GCS_Sources
Feature: GCS source - Verification of GCS to BQ successful data transfer

@BQ_SINK_TEST @GCS_READ_RECURSIVE_TEST
@@ -1,4 +1,4 @@
@GCS_Source
@GCS_Sources
Feature: GCS source - Verification of GCS to GCS Additional Tests successful

@GCS_AVRO_FILE @GCS_SINK_TEST @GCS_Source_Required @ITN_TEST
@@ -1,4 +1,4 @@
@GCS_Source
@GCS_Sources
Feature: GCS source - Verification of GCS to GCS successful data transfer using connections

@GCS_CSV_TEST @GCS_SINK_TEST @GCS_CONNECTION
12 changes: 6 additions & 6 deletions src/e2e-test/features/gcs/source/GCSourceSchema.feature
@@ -1,4 +1,4 @@
@GCS_Source
@GCS_Sources
Feature: GCS source - Validate GCS plugin output schema for different formats

Scenario Outline:GCS Source output schema validation for csv and tsv format
@@ -7,7 +7,7 @@ Feature: GCS source - Validate GCS plugin output schema for different formats
Then Open GCS source properties
Then Enter GCS property projectId and reference name
Then Override Service account details if set in environment variables
Then Enter GCS source property path "<GcsPath>"
Then Enter GCS source property path "<GcsPath>"s
Then Select GCS property format "<FileFormat>"
Then Toggle GCS source property skip header to true
Then Validate output schema with expectedSchema "<ExpectedSchema>"
@@ -28,7 +28,7 @@ Feature: GCS source - Validate GCS plugin output schema for different formats
Then Enter GCS property projectId and reference name
Then Override Service account details if set in environment variables
Then Enter GCS source property path "<GcsPath>"
Then Select GCS property format "<FileFormat>"
Then Select GCS property format "<FileFormat>"s
Then Validate output schema with expectedSchema "<ExpectedSchema>"
@GCS_BLOB_TEST
Examples:
@@ -52,7 +52,7 @@ Feature: GCS source - Validate GCS plugin output schema for different formats
Given Open Datafusion Project to configure pipeline
When Source is GCS
Then Open GCS source properties
Then Enter GCS property projectId and reference name
Then Enter GCS property projectId and reference names
Then Override Service account details if set in environment variables
Then Enter GCS source property path "<GcsPath>"
Then Select GCS property format "<FileFormat>"
@@ -70,7 +70,7 @@ Feature: GCS source - Validate GCS plugin output schema for different formats
Given Open Datafusion Project to configure pipeline
When Source is GCS
Then Open GCS source properties
Then Enter GCS property projectId and reference name
Then Enter GCS property projectId and reference names
Then Override Service account details if set in environment variables
Then Enter GCS source property path "gcsDelimitedFile"
Then Select GCS property format "delimited"
@@ -82,7 +82,7 @@ Feature: GCS source - Validate GCS plugin output schema for different formats
Given Open Datafusion Project to configure pipeline
When Source is GCS
Then Open GCS source properties
Then Enter GCS property projectId and reference name
Then Enter GCS property projectId and reference names
Then Enter GCS source property path "<GcsPath>"
Then Select GCS property format "<FileFormat>"
Then Toggle GCS source property skip header to true
4 changes: 2 additions & 2 deletions src/e2e-test/features/gcscopy/GCSCopy.feature
@@ -12,7 +12,7 @@
# License for the specific language governing permissions and limitations under
# the License.

@GCSCopy
@GCSCopys
Feature:GCSCopy - Verification of successful objects copy from one bucket to another

@CMEK @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
@@ -22,7 +22,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to another
When Select plugin: "GCS Copy" from the plugins list as: "Conditions and Actions"
When Navigate to the properties page of plugin: "GCS Copy"
And Replace input plugin property: "project" with value: "projectId"
And Enter GCSCopy property source path "gcsCsvFile"
And Enter GCSCopy property source path "gcsCsvFile"s
And Enter GCSCopy property destination path
Then Override Service account details if set in environment variables
Then Enter GCSCopy property encryption key name "cmekGCS" if cmek is enabled
4 changes: 2 additions & 2 deletions src/e2e-test/features/gcscopy/GCSCopyErrorScenarios.feature
@@ -12,7 +12,7 @@
# License for the specific language governing permissions and limitations under
# the License.

@GCSCopy
@GCSCopys
Feature: GCSCopy - Validate GCSCopy plugin error scenarios

@GCSCopy_Required @ITN_TEST
@@ -21,7 +21,7 @@ Feature: GCSCopy - Validate GCSCopy plugin error scenarios
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
When Select plugin: "GCS Copy" from the plugins list as: "Conditions and Actions"
When Navigate to the properties page of plugin: "GCS Copy"
Then Click on the Validate button
Then Click on the Validate buttons
Then Verify mandatory property error for below listed properties:
| sourcePath |
| destPath |
2 changes: 1 addition & 1 deletion src/e2e-test/features/gcscopy/GCSCopy_WithMacro.feature
@@ -17,7 +17,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to another

@CMEK @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
Scenario:Validate successful copy object from one bucket to another new bucket with macro arguments
Given Open Datafusion Project to configure pipeline
Given Open Datafusion Project to configure pipelines
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
When Select plugin: "GCS Copy" from the plugins list as: "Conditions and Actions"
When Navigate to the properties page of plugin: "GCS Copy"