docs-source/transactional-event-queues/content/_index.md (1 addition & 1 deletion)
@@ -5,7 +5,7 @@ title = "Transactional Event Queues"
[Oracle Transactional Event Queues (TxEventQ)](https://www.oracle.com/database/advanced-queuing/) is a messaging platform built into Oracle Database that combines the best features of messaging and pub/sub systems. TxEventQ was introduced as a rebranding of AQ Sharded Queues in Oracle Database 21c, evolving from the Advanced Queuing (AQ) technology that has been part of Oracle Database since version 8.0. TxEventQ continues to evolve in Oracle Database 23ai, with [Kafka Java APIs](https://github.com/oracle/okafka), Oracle REST Data Services (ORDS) integration, and many more features and integrations.
- TxEventQ is designed for high-throughput, reliable messaging in event-driven microservices and workflow applications. It supports multiple publishers and consumers, exactly-once delivery, and robust event streaming capabilities. On an 8-node Oracle Real Application Clusters (RAC) database, TxEventQ can handle approximately 1 million messages per second, demonstrating its scalability.
+ TxEventQ is designed for high-throughput, reliable messaging in event-driven microservices and workflow applications. It supports multiple publishers and consumers, exactly-once message delivery, and robust event streaming capabilities. On an 8-node Oracle Real Application Clusters (RAC) database, TxEventQ can handle approximately 1 million messages per second, demonstrating its scalability.
TxEventQ differs from traditional AQ (now referred to as AQ Classic Queues) in several ways:
docs-source/transactional-event-queues/content/aq-migration/_index.md (2 additions & 0 deletions)
@@ -6,6 +6,8 @@ weight = 5
Oracle Database 23ai includes a migration path from Advanced Queuing (AQ) to Transactional Event Queues (TxEventQ), to take advantage of enhanced performance and scalability for event-driven architectures.
+ Advanced Queuing (AQ) has been Oracle's traditional messaging system for managing asynchronous communication in enterprise applications, allowing reliable queuing and message delivery. TxEventQ leverages Kafka-based event queuing, offering improved throughput, lower latency, and greater scalability, making it ideal for modern event-driven architectures and high-volume event processing.
The [DBMS_AQMIGTOOL](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/DBMS_AQMIGTOOL.html) package facilitates a smooth migration process, designed to be non-disruptive and allowing the parallel operation of AQ and TxEventQ during the transition, enabling a smooth cut-over with minimal downtime for your applications.
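As an illustrative sketch of how such a migration might be initiated with this package (the schema and queue names below are hypothetical, and the parameter names should be checked against the DBMS_AQMIGTOOL reference before use):

```sql
begin
    -- Start an online migration: the classic AQ queue keeps operating
    -- while a shadow TxEventQ is created and run alongside it.
    dbms_aqmigtool.init_migration(
        cqschema => 'APP_SCHEMA',        -- hypothetical owning schema
        cqname   => 'MY_CLASSIC_QUEUE'   -- hypothetical classic queue name
    );
end;
/
```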
The migration from AQ to TxEventQ is suitable for various scenarios:
docs-source/transactional-event-queues/content/aq-migration/migration.md (2 additions & 1 deletion)
@@ -4,7 +4,8 @@ title = "AQ Migration"
weight = 1
+++
- This section covers the use of the [DBMS_AQMIGTOOL](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/DBMS_AQMIGTOOL.html) package for migrating Advanced Queuing (AQ) classic queues to TxEventQ.
+
+ This section provides a detailed guide for migrating from Oracle Advanced Queuing (AQ) to Transactional Event Queues (TxEventQ). The migration process uses the [DBMS_AQMIGTOOL](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/DBMS_AQMIGTOOL.html) package to ensure minimal disruption of existing messaging workflows.
Users of AQ are encouraged to migrate to TxEventQ for improved support, performance, and access to new database features. It is recommended to read through this document fully before attempting a migration.
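To sketch the end of that process (again with hypothetical names, and hedged against the official DBMS_AQMIGTOOL reference for the exact signature), the cut-over is finalized once the classic queue has been drained:

```sql
begin
    -- Finalize the cut-over: remaining state moves to the TxEventQ
    -- and the classic queue interface is retired.
    dbms_aqmigtool.commit_migration(
        cqschema => 'APP_SCHEMA',        -- hypothetical owning schema
        cqname   => 'MY_CLASSIC_QUEUE'   -- hypothetical classic queue name
    );
end;
/
```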
docs-source/transactional-event-queues/content/getting-started/advanced-features.md (9 additions & 61 deletions)
@@ -4,71 +4,17 @@ title = "Advanced Features"
weight = 4
+++
- This section explains advanced features of Transactional Event Queues, including transactional messaging, message propagation between queues and the database, and error handling.
+ This section explains advanced features of Transactional Event Queues, including message propagation between queues and the database, and error handling.
- *[Transactional Messaging: Combine Messaging with Database Queries](#transactional-messaging-combine-messaging-with-database-queries)
- *[SQL Example](#sql-example)
*[Message Propagation](#message-propagation)
*[Queue to Queue Message Propagation](#queue-to-queue-message-propagation)
- ## Transactional Messaging: Combine Messaging with Database Queries
-
- Enqueue and dequeue operations occur within database transactions, allowing developers to combine database queries (DML) with messaging operations. This is particularly useful when the message contains data relevant to other tables or services within your schema.
-
- ### SQL Example
-
- In the following example, a DML operation (an `INSERT` query) is combined with an enqueue operation in the same transaction. If the enqueue operation fails, the `INSERT` is rolled back. The orders table serves as the example.
-
- ```sql
- create table orders (
-     id number generated always as identity primary key,
-     product_id number not null,
-     quantity number not null,
-     order_date date default sysdate
- );
-
- declare
-     enqueue_options dbms_aq.enqueue_options_t;
-     message_properties dbms_aq.message_properties_t;
-     msg_id raw(16);
-     message json;
-     body varchar2(200) := '{"product_id": 1, "quantity": 5}';
-     product_id number;
-     quantity number;
- begin
-     -- Convert the JSON string to a JSON object
-     message := json(body);
-
-     -- Extract product_id and quantity from the JSON object
- > Note: The same pattern applies to the `dbms_aq.dequeue` procedure, allowing developers to perform DML operations within dequeue transactions.
## Message Propagation
- Messages can be propagated within the same database or across a [database link](https://docs.oracle.com/en/database/oracle/oracle-database/23/sqlrf/CREATE-DATABASE-LINK.html) to different queues or topics. Message propagation is useful for workflows that require message processing d by different consumers or for event-driven actions that need to trigger subsequent processes.
+ Messages can be propagated within the same database or across a [database link](https://docs.oracle.com/en/database/oracle/oracle-database/23/sqlrf/CREATE-DATABASE-LINK.html) to different queues or topics. Message propagation is useful for workflows that require processing by different consumers or for event-driven actions that need to trigger subsequent processes.
#### Queue to Queue Message Propagation
@@ -137,7 +83,7 @@ alter system set job_queue_processes=10;
#### Stopping Queue Propagation
- You can stop propagation using the DBMS_AQADM.STOP_PROPAGATION procedures:
+ You can stop propagation using the [DBMS_AQADM.UNSCHEDULE_PROPAGATION](https://docs.oracle.com/en/database/oracle/oracle-database/23/arpls/DBMS_AQADM.html#GUID-4B4E25F4-E11F-4063-B1B8-7670C2537F47) procedure:
```sql
begin
@@ -153,14 +99,16 @@ Your can view queue subscribers and propagation schedules from the respective `D
#### Using Database Links
- To propagate messages between databases, a [database link](https://docs.oracle.com/en/database/oracle/oracle-database/23/sqlrf/CREATE-DATABASE-LINK.html) from the local database to the remote database must be created. The subscribe and propagation commands must be altered to use the database link.
+ To propagate messages between databases, a [database link](https://docs.oracle.com/en/database/oracle/oracle-database/23/sqlrf/CREATE-DATABASE-LINK.html) from the local database to the remote database must be created. The subscribe and propagation commands must be altered to use the database link.
```sql
begin
dbms_aqadm.schedule_propagation(
queue_name => 'source',
- destination => '<database link>.<schema name>', -- replace with your database link and schema name
docs-source/transactional-event-queues/content/getting-started/core-concepts.md (4 additions & 2 deletions)
@@ -120,7 +120,7 @@ end;
#### Object
- For structured, complex messages, you may choose to set the payload type as a custom object type that was defined using `create type`. Object types must reside in the same schema as the queue/topic, and the structure of each message must exactly match the payload type.
+ For structured, complex messages, you may choose to set the payload type as a custom object type that was defined using `create type`. Object types must be accessible from the queue/topic, and the structure of each message must exactly match the payload type.
The following SQL script defines a custom object type, and then creates a Transactional Event Queue using that type.
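The script itself falls outside this diff hunk; as a minimal sketch of the pattern (the `order_event` type and queue name are illustrative, not taken from the diff), it might look like:

```sql
-- Define a custom object type to use as the message payload
create type order_event as object (
    order_id number,
    status   varchar2(32)
);
/

begin
    -- Create and start a TxEventQ whose payload is the object type above
    dbms_aqadm.create_transactional_event_queue(
        queue_name         => 'order_events',
        queue_payload_type => 'order_event'
    );
    dbms_aqadm.start_queue(queue_name => 'order_events');
end;
/
```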
@@ -149,4 +149,6 @@ end;
#### Kafka Message Payloads
- Topics created using the Kafka APIs for Transactional Event Queues use a Kafka message payload type, and so specifying the payload type is not necessary. Additionally, topics created using Kafka APIs should also be managed, produced and consumed from using the appropriate Kafka APIs.
+ Topics created using the Kafka APIs for Transactional Event Queues use a Kafka message payload type, and specifying the payload type is not necessary. Additionally, topics created using Kafka APIs should also be managed and interacted with using the appropriate Kafka APIs.
+
+ For more information on using Kafka APIs with TxEventQ, see the [Kafka chapter](../kafka/_index.md).
docs-source/transactional-event-queues/content/getting-started/message-operations.md (61 additions & 5 deletions)
@@ -6,6 +6,7 @@ weight = 3
This section explains message operations using queues, topics, and different programming interfaces (SQL, Java, Spring JMS, and more). You’ll learn how to enqueue, dequeue, and manage messages effectively.
*[Enqueue and Dequeue, or Produce and Consume](#enqueue-and-dequeue-or-produce-and-consume)
*[Queues](#queues)
*[Topics](#topics)
@@ -19,6 +20,7 @@ This section explains message operations using queues, topics, and different pro
*[Message Expiry and Exception Queues](#message-expiry-and-exception-queues)
*[Message Delay](#message-delay)
*[Message Priority](#message-priority)
+ *[Transactional Messaging: Combine Messaging with Database Queries](#transactional-messaging-combine-messaging-with-database-queries)
### Enqueue and Dequeue, or Produce and Consume
@@ -159,7 +161,7 @@ public class SampleProducer<T> implements Runnable, AutoCloseable {
The following Java snippet creates an org.oracle.okafka.clients.consumer.KafkaConsumer instance capable of consuming records from Transactional Event Queue topics. Note the use of Oracle Database connection properties, and Kafka consumer-specific properties like `group.id` and `max.poll.records`.
- The org.oracle.okafka.clients.consumer.KafkaConsume class implements the org.apache.kafka.clients.consumer.Consumer interface, allowing it to be used in place of a Kafka Java client consumer.
+ The org.oracle.okafka.clients.consumer.KafkaConsumer class implements the org.apache.kafka.clients.consumer.Consumer interface, allowing it to be used in place of a Kafka Java client consumer.
- The following Java class consumes messages to a topic, using the [Kafka Java Client for Oracle Database Transactional Event Queues](https://github.com/oracle/okafka). Like the producer example, the consumer only does not use any Oracle classes, only Kafka interfaces.
+ The following Java class consumes messages from a topic, using the [Kafka Java Client for Oracle Database Transactional Event Queues](https://github.com/oracle/okafka). Like the producer example, the consumer does not use any Oracle classes, only Kafka interfaces.
Message msg = consumer.receive(10000); // Wait for 10 seconds
if (msg != null && msg instanceof TextMessage) {
@@ -336,7 +338,7 @@ begin
message := sys.aq$_jms_text_message.construct();
message.set_text('this is my message');
- message_properties.delay := 7*24*60*60; -- Delay for 1 week
+ message_properties.delay := 7*24*60*60; -- Delay for 7 days
dbms_aq.enqueue(
queue_name => 'my_queue',
enqueue_options => enqueue_options,
@@ -376,4 +378,58 @@ begin
commit;
end;
/
- ```
+ ```
+
+ ### Transactional Messaging: Combine Messaging with Database Queries
+
+ Enqueue and dequeue operations occur within database transactions, allowing developers to combine database DML with messaging operations. This is particularly useful when the message contains data relevant to other tables or services within your schema.
+
+ In the following example, a DML operation (an `INSERT` query) is combined with an enqueue operation in the same transaction. If the enqueue operation fails, the `INSERT` is rolled back. The orders table serves as the example.
+
+ ```sql
+ create table orders (
+     id number generated always as identity primary key,
+     product_id number not null,
+     quantity number not null,
+     order_date date default sysdate
+ );
+
+ declare
+     enqueue_options dbms_aq.enqueue_options_t;
+     message_properties dbms_aq.message_properties_t;
+     msg_id raw(16);
+     message json;
+     body varchar2(200) := '{"product_id": 1, "quantity": 5}';
+     product_id number;
+     quantity number;
+ begin
+     -- Convert the JSON string to a JSON object
+     message := json(body);
+
+     -- Extract product_id and quantity from the JSON object