
Commit 1680e74

Merge pull request #59 from jhughes24816/readme-updates
chore: readme update
2 parents: fa494f3 + f5167fd

File tree

1 file changed (+14, -3 lines)

README.md

Lines changed: 14 additions & 3 deletions
@@ -22,6 +22,7 @@ The connector is supplied as source code which you can easily build into a JAR f

## Building the connector

To build the connector, you must have the following installed:

* [git](https://git-scm.com/)
* [Maven 3.0 or later](https://maven.apache.org)

@@ -150,25 +151,29 @@ The KafkaConnectS2I resource provides a nice way to have OpenShift do all the wo

The following instructions assume you are running on OpenShift and have Strimzi 0.16 or later installed.

#### Start a Kafka Connect cluster using KafkaConnectS2I

1. Create a file called `kafka-connect-s2i.yaml` containing the definition of a KafkaConnectS2I resource. You can use the examples in the Strimzi project to get started.
1. Configure it with the information it needs to connect to your Kafka cluster. You must include the annotation `strimzi.io/use-connector-resources: "true"` so that it uses KafkaConnector resources, avoiding the need to call the Kafka Connect REST API directly.
1. `oc apply -f kafka-connect-s2i.yaml` to create the cluster, which usually takes several minutes.
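
The steps above might produce a resource along these lines (a minimal sketch; the resource name, replica count and bootstrap address are illustrative placeholders, not values from this document):

```yaml
apiVersion: kafka.strimzi.io/v1beta1
kind: KafkaConnectS2I
metadata:
  name: my-connect-cluster
  annotations:
    # Required so that connectors are managed via KafkaConnector
    # resources rather than the Kafka Connect REST API.
    strimzi.io/use-connector-resources: "true"
spec:
  replicas: 1
  bootstrapServers: my-kafka-cluster-kafka-bootstrap:9092
```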

#### Add the MQ sink connector to the cluster

1. `mvn clean package` to build the connector JAR.
1. `mkdir my-plugins`
1. `cp target/kafka-connect-mq-sink-*-jar-with-dependencies.jar my-plugins`
1. `oc start-build <kafkaConnectClusterName>-connect --from-dir ./my-plugins` to add the MQ sink connector to the Kafka Connect distributed worker cluster. Wait for the build to complete, which usually takes a few minutes.
1. `oc describe kafkaconnects2i <kafkaConnectClusterName>` to check that the MQ sink connector is in the list of available connector plugins.


#### Start an instance of the MQ sink connector using KafkaConnector

1. `cp deploy/strimzi.kafkaconnector.yaml kafkaconnector.yaml`
1. Update the `kafkaconnector.yaml` file to replace all of the values in `<>`, adding any additional configuration properties.
1. `oc apply -f kafkaconnector.yaml` to start the connector.
1. `oc get kafkaconnector` to list the connectors. You can use `oc describe` to get more details on the connector, such as its status.
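
For reference, a filled-in `kafkaconnector.yaml` might look roughly like this (a sketch; the cluster name, queue manager details, channel, topic and queue names are all illustrative placeholders):

```yaml
apiVersion: kafka.strimzi.io/v1alpha1
kind: KafkaConnector
metadata:
  name: mq-sink
  labels:
    # Must match the name of your Kafka Connect cluster.
    strimzi.io/cluster: my-connect-cluster
spec:
  class: com.ibm.eventstreams.connect.mqsink.MQSinkConnector
  tasksMax: 1
  config:
    topics: TO.MQ                          # Kafka topic(s) to read from
    mq.queue.manager: QM1                  # placeholder queue manager
    mq.connection.name.list: mq-host(1414) # placeholder host and port
    mq.channel.name: DEV.APP.SVRCONN       # placeholder channel
    mq.queue: FROM.KAFKA                   # placeholder target queue
    key.converter: org.apache.kafka.connect.storage.StringConverter
    value.converter: org.apache.kafka.connect.storage.StringConverter
```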

## Data formats

Kafka Connect is very flexible, but it's important to understand the way that it processes messages to end up with a reliable system. When the connector encounters a message that it cannot process, it stops rather than throwing the message away. Therefore, you need to make sure that the configuration you use can handle the messages the connector will process.

Each message in Kafka Connect is associated with a representation of the message format known as a *schema*. Each Kafka message actually has two parts, key and value, and each part has its own schema. The MQ sink connector does not currently use message keys, but some of the configuration options use the word *Value* because they refer to the Kafka message value.
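
For example, to handle Kafka message values that are plain JSON without embedded schemas, the worker or connector configuration might include the following (standard Kafka Connect properties; one reasonable choice, not the only one):

```
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```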

@@ -205,6 +210,7 @@

```
value.converter=org.apache.kafka.connect.storage.StringConverter
```

### The gory detail

The messages received from Kafka are processed by a converter which chooses a schema to represent the message and creates a Java object containing the message value. There are three basic converters built into Apache Kafka.

| Converter class | Kafka message encoding | Value schema | Value class |

@@ -252,6 +258,7 @@

```
mq.message.builder.value.converter.schemas.enable=false
```

### Key support and partitioning

By default, the connector does not use the keys for the Kafka messages it reads. It can be configured to set the JMS correlation ID using the key of the Kafka records. To configure this behavior, set the `mq.message.builder.key.header` configuration value.
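
For instance, to carry string Kafka record keys into the JMS correlation ID, the configuration might include the following (a sketch, assuming `JMSCorrelationID` is the header value in use):

```
mq.message.builder.key.header=JMSCorrelationID
key.converter=org.apache.kafka.connect.storage.StringConverter
```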

| mq.message.builder.key.header | Key schema | Key class | Recommended value for key.converter |

@@ -265,21 +272,26 @@ The connector can be configured to set the Kafka topic, partition and offset as

## Security

The connector supports authentication with user name and password, and also connections secured with TLS using a server-side certificate and mutual authentication with client-side certificates. You can also choose whether to use connection security parameters (MQCSP) depending on the security settings you're using in MQ.
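
A minimal sketch of user name and password authentication with MQCSP explicitly enabled (the user name and password values are placeholders):

```
mq.user.name=alice
mq.password=passw0rd
mq.user.authentication.mqcsp=true
```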

### Setting up TLS using a server-side certificate

To enable use of TLS, set the configuration `mq.ssl.cipher.suite` to the name of the cipher suite which matches the CipherSpec in the SSLCIPH attribute of the MQ server-connection channel. Use the table of supported cipher suites for MQ 9.1 [here](https://www.ibm.com/support/knowledgecenter/en/SSFKSJ_9.1.0/com.ibm.mq.dev.doc/q113220_.htm) as a reference. Note that the names of the CipherSpecs as used in the MQ configuration are not necessarily the same as the cipher suite names that the connector uses. The connector uses the JMS interface, so it follows the Java conventions.
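
As an illustration, assuming the channel's CipherSpec corresponds to the Java cipher suite name `TLS_RSA_WITH_AES_128_CBC_SHA256` (check the mapping table linked above for your channel's actual SSLCIPH value):

```
mq.ssl.cipher.suite=TLS_RSA_WITH_AES_128_CBC_SHA256
```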

You will need to put the public part of the queue manager's certificate in the JSSE truststore used by the Kafka Connect worker that you're using to run the connector. If you need to specify extra arguments to the worker's JVM, you can use the EXTRA_ARGS environment variable.
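
Passing the truststore details via EXTRA_ARGS might look like this (a sketch; the path and password are placeholders, and the property names are the standard JSSE system properties):

```shell
# Point the worker's JVM at the truststore holding the public part of
# the queue manager's certificate (path and password are placeholders).
export EXTRA_ARGS="-Djavax.net.ssl.trustStore=/opt/kafka/certs/mq.truststore.jks -Djavax.net.ssl.trustStorePassword=changeit"
```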

### Setting up TLS for mutual authentication

You will need to put the public part of the client's certificate in the queue manager's key repository. You will also need to configure the worker's JVM with the location and password for the keystore containing the client's certificate. Alternatively, you can configure a separate keystore and truststore for the connector.

### Troubleshooting

For troubleshooting, or to better understand the handshake performed by the IBM MQ Java client application in combination with your specific JSSE provider, you can enable debugging by setting `javax.net.debug=ssl` in the JVM environment.

## Configuration

The configuration options for the Kafka Connect sink connector for IBM MQ are as follows:

| Name | Description | Type | Default | Valid values |
@@ -358,17 +370,16 @@ You may receive an `org.apache.kafka.common.errors.SslAuthenticationException: S

When configuring a TLS connection to MQ, you may find that the queue manager rejects the cipher suite, in spite of the name looking correct. There are two different naming conventions for cipher suites (https://www.ibm.com/support/knowledgecenter/SSFKSJ_9.1.0/com.ibm.mq.dev.doc/q113220_.htm). Setting the configuration option `mq.ssl.use.ibm.cipher.mappings=false` often resolves cipher suite problems.

## Support

Commercial support for this connector is available for customers with a support entitlement for [IBM Event Automation](https://www.ibm.com/products/event-automation) or [IBM Cloud Pak for Integration](https://www.ibm.com/cloud/cloud-pak-for-integration).

## Issues and contributions

For issues relating specifically to this connector, please use the [GitHub issue tracker](https://github.com/ibm-messaging/kafka-connect-mq-sink/issues). If you want to submit a pull request related to this connector, please read the [contributing guide](CONTRIBUTING.md) first to understand how to sign your commits.

369380

370381
## License
371-
Copyright 2017, 2020 IBM Corporation
382+
Copyright 2017, 2020, 2023 IBM Corporation
372383

373384
Licensed under the Apache License, Version 2.0 (the "License");
374385
you may not use this file except in compliance with the License.
