
Commit 8709a7d

Fix dependencies (#804)
* Revert "Upgrade confluent-kafka to >=2.8.2,<2.9 (#799)". This reverts commit d2a9c77.
* Fix neo4j Conda dependency
* Upgrade confluent-kafka to 2.8.2 and include schema registry extras
1 parent d2a9c77 commit 8709a7d

6 files changed (+5, -15 lines)


conda/meta.yaml

Lines changed: 1 addition & 2 deletions
@@ -15,7 +15,6 @@ requirements:
     - python >=3.9,<3.13
   run:
     - python >=3.9,<3.13
-    - python-confluent-kafka >=2.8.2,<2.9
     - requests >=2.32
     - typing_extensions >=4.8
     - orjson >=3.9,<4
@@ -30,7 +29,7 @@ requirements:
     - boto3 >=1.35.65,<2.0
     - boto3-stubs >=1.35.65,<2.0
     - azure-storage-blob >=12.24.0,<12.25
-    - neo4j >=5.27.0,<6
+    - neo4j-python-driver >=5.27.0,<6
     - pymongo >=4.11,<5
     - pandas >=1.0.0,<3.0
     - rich >=13,<14
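
The rename above only changes the Conda package name. As a quick illustration (an assumption about usage, not code from this commit), the importable Python module is still `neo4j` either way:

```python
# Illustrative check, not part of this commit: the Conda package
# "neo4j-python-driver" ships the same importable "neo4j" module as the
# PyPI "neo4j" distribution, so application code is unaffected by the rename.
import neo4j

print(neo4j.__version__)  # expect a 5.x version matching the >=5.27.0,<6 pin above
```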

conda/requirements.py

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@
 CONDA_POST_LINK_REQUIREMENTS = re.compile(r"'([\w\[\],-]+)([><=,.\d]*)?")
 
 PYPI_TO_CONDA_NAME_MAPPING = {
-    "confluent-kafka": "python-confluent-kafka",
+    "neo4j": "neo4j-python-driver",
 }
 
 
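
For context, here is a minimal sketch of how a PyPI-to-Conda name mapping like the one above is typically applied; the `to_conda_name` helper is hypothetical and not part of `conda/requirements.py`.

```python
# Hypothetical helper (not from conda/requirements.py): shows how a
# PyPI -> Conda name mapping is typically applied when translating
# PyPI requirement names into Conda package names.
PYPI_TO_CONDA_NAME_MAPPING = {
    "neo4j": "neo4j-python-driver",
}


def to_conda_name(pypi_name: str) -> str:
    """Return the Conda package name, falling back to the PyPI name."""
    return PYPI_TO_CONDA_NAME_MAPPING.get(pypi_name, pypi_name)


print(to_conda_name("neo4j"))     # -> "neo4j-python-driver"
print(to_conda_name("requests"))  # -> "requests"
```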

docs/advanced/schema-registry.md

Lines changed: 1 addition & 7 deletions
@@ -4,13 +4,7 @@ Serializers and deserializers for JSON Schema, Avro, and Protobuf support integr
 
 The current implementation wraps Confluent's serializers and deserializers, which are tightly coupled with the Schema Registry.
 
-To integrate your existing Schema Registry, install the `schema_registry` extra
-
-```bash
-pip install quixstreams[schema_registry]
-```
-
-and pass `SchemaRegistryClientConfig` to your serializers and deserializers. Additional optional configuration can be provided via `SchemaRegistrySerializationConfig`.
+To integrate your existing Schema Registry, pass `SchemaRegistryClientConfig` to your serializers and deserializers. Additional optional configuration can be provided via `SchemaRegistrySerializationConfig`.
 
 > NOTE: Not every `Serializer`/`Deserializer` uses `SchemaRegistrySerializationConfig`; refer to each serialization type below for
 > valid use.
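
For illustration, a minimal sketch of the configuration flow the updated doc describes. The import paths and keyword names below are assumptions based on the snippet above, not verified against this commit; consult docs/advanced/schema-registry.md for the exact API.

```python
# Sketch only: import paths and keyword names are assumptions based on the
# doc snippet above.
from quixstreams.models.serializers.avro import AvroSerializer
from quixstreams.models.serializers.schema_registry import SchemaRegistryClientConfig

# Point the serializer at an existing Schema Registry (placeholder URL).
client_config = SchemaRegistryClientConfig(url="http://localhost:8081")

serializer = AvroSerializer(
    schema={
        "type": "record",
        "name": "Example",
        "fields": [{"name": "id", "type": "string"}],
    },
    schema_registry_client_config=client_config,
)
```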

pyproject.toml

Lines changed: 1 addition & 3 deletions
@@ -41,8 +41,7 @@ all = [
     "azure-storage-blob>=12.24.0,<12.25",
     "neo4j>=5.27.0,<6",
     "pymongo>=4.11,<5",
-    "pandas>=1.0.0,<3.0",
-    "confluent-kafka[avro,json,protobuf,schemaregistry]>=2.8.2,<2.9"
+    "pandas>=1.0.0,<3.0"
 ]
 
 avro = ["fastavro>=1.8,<2.0"]
@@ -58,7 +57,6 @@ azure = ["azure-storage-blob>=12.24.0,<12.25"]
 neo4j = ["neo4j>=5.27.0,<6"]
 mongodb = ["pymongo>=4.11,<5"]
 pandas = ["pandas>=1.0.0,<3.0"]
-schema_registry = ["confluent-kafka[avro,json,protobuf,schemaregistry]>=2.8.2,<2.9"]
 
 # AWS dependencies are separated by service to support
 # different requirements in the future.

requirements.txt

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-confluent-kafka>=2.8.2,<2.9
+confluent-kafka[avro,json,protobuf,schemaregistry]>=2.8.2,<2.9
 requests>=2.32
 rocksdict>=0.3,<0.4
 typing_extensions>=4.8
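
The bracketed extras above pull in Confluent's serializer and Schema Registry dependencies. As an illustration of what they enable (typical usage, not code from this commit; the URL is a placeholder):

```python
# Illustrative only: the "schemaregistry" extra installs the client used below.
from confluent_kafka.schema_registry import SchemaRegistryClient

# Placeholder endpoint for a Schema Registry instance.
client = SchemaRegistryClient({"url": "http://localhost:8081"})
print(client.get_subjects())  # list the registered schema subjects
```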

tests/requirements.txt

Lines changed: 0 additions & 1 deletion
@@ -8,4 +8,3 @@ influxdb3-python>=0.7.0,<1.0
 pyiceberg[pyarrow,glue]>=0.7,<0.8
 redis[hiredis]>=5.2.0,<6
 pandas>=1.0.0,<3.0
-confluent-kafka[avro,json,protobuf,schemaregistry]>=2.8.2,<2.9
