
Commit 96c0caf

Update content/integrate/redis-data-integration/data-pipelines/data-denormalization.md
Co-authored-by: andy-stark-redis <164213578+andy-stark-redis@users.noreply.github.com>
1 parent: ab36a14


content/integrate/redis-data-integration/data-pipelines/data-denormalization.md

Lines changed: 1 addition & 1 deletion
@@ -100,7 +100,7 @@ If you don't set `merge` as the `on_update` strategy for all jobs targeting the
 When using this approach, you must ensure that the `key` expression in the child job matches the key expression in the parent job. If you use a different key expression, the child data will not be written to the same Redis key as the parent data.
 
-In the example above, the `addresses` jobs uses the default key pattern to write to the same Redis key as the `customers` job. You can find more information about the default key pattern [here]({{< relref "/integrate/redis-data-integration/data-pipelines/transform-examples/redis-set-key-name" >}}).
+In the example above, the `addresses` job uses the default key pattern to write to the same Redis key as the `customers` job. You can find more information about the default key pattern [here]({{< relref "/integrate/redis-data-integration/data-pipelines/transform-examples/redis-set-key-name" >}}).
 
 You can also use custom keys for the parent entity, as long as you use the same key for all jobs that write to the same Redis key.
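For context, the paragraph touched by this diff describes a parent job (`customers`) and a child job (`addresses`) that write to the same Redis key, with `merge` as the `on_update` strategy on both. A minimal sketch of that pairing is below; the file names, table names, the `customer:` key prefix, and the `customer_id` column are assumptions for illustration, while `on_update: merge` and the matching `key` expressions are the requirements the docs state:

```yaml
# customers.yaml (hypothetical parent job)
source:
  table: customers
output:
  - uses: redis.write
    with:
      data_type: json
      on_update: merge        # required for every job targeting the shared key
      key:
        expression: concat(['customer:', customer_id])
        language: jmespath
---
# addresses.yaml (hypothetical child job)
source:
  table: addresses
output:
  - uses: redis.write
    with:
      data_type: json
      on_update: merge
      key:
        # Must produce exactly the same key as the parent job, here via the
        # assumed customer_id foreign-key column on the addresses table.
        expression: concat(['customer:', customer_id])
        language: jmespath
```

Because the two key expressions evaluate to the same key and both jobs use `merge`, each job updates the shared JSON value without overwriting the other's fields.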
