Commit ffab633

Add files via upload

File tree

3 files changed (+543, -0 lines)

README.md

Lines changed: 104 additions & 0 deletions
@@ -0,0 +1,104 @@
# arachnid-shield-sdk

An SDK for consuming the Arachnid Shield API.

## Installation

From PyPI:

```sh
pip install arachnid-shield-sdk
```

## Usage

First, obtain login credentials by contacting [Project Arachnid](https://projectarachnid.ca/en/contact).

The client is designed to live as a single global resource for the lifetime of your application, so you can use it in whichever way fits your setup.
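For instance, credentials can be kept out of source control by reading them from the environment at startup. This is only a sketch: the environment variable names below are this example's own choice, not something the SDK prescribes.

```python
import os


def load_credentials():
    """Return (username, password) for constructing the shield client.

    The variable names are hypothetical; pick whatever fits your deployment.
    """
    username = os.environ.get("ARACHNID_SHIELD_USERNAME", "")
    password = os.environ.get("ARACHNID_SHIELD_PASSWORD", "")
    return username, password


# The pair feeds straight into the constructor used in the examples below:
#   shield = ArachnidShield(username=username, password=password)
username, password = load_credentials()
```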
### Vanilla Python (Sync)

```python
from arachnid_shield_sdk import ArachnidShield

shield = ArachnidShield(username="", password="")


class HarmfulMediaFoundException(Exception):
    """Raised when uploaded media is found to be CSAM or otherwise harmful to children."""

    def __init__(self, user, scanned_media_metadata):
        super().__init__(f"harmful media uploaded by user {user}")
        self.user = user
        self.scanned_media_metadata = scanned_media_metadata


def process_media_for_user(user_id, contents):
    """Scan uploaded media and reject it if it matches known harmful imagery.

    Raises:
        HarmfulMediaFoundException: If the media matches a known harmful image.
    """
    scanned_media = shield.scan_media_from_bytes(contents, "image/jpeg")
    if scanned_media.matches_known_image:
        raise HarmfulMediaFoundException(user=user_id, scanned_media_metadata=scanned_media)

    # do more processing here.
    ...


def main():
    with open("some-image.jpeg", "rb") as f:
        contents = f.read()

    process_media_for_user(user_id=1, contents=contents)


if __name__ == '__main__':
    main()
```
62+
63+
### Vanilla Python (Async)
64+
65+
```python
66+
import asyncio
67+
from arachnid_shield_sdk import ArachnidShieldAsync as ArachnidShield
68+
69+
shield = ArachnidShield(username="", password="")
70+
71+
72+
class HarmfulMediaFoundException(Exception):
73+
"""Raised when a CSAM/Harmful to Children media is found to be uploaded on the server"""
74+
user = None
75+
scanned_media_metadata = None
76+
77+
def __init__(self, scanned_media_metadata):
78+
self.scanned_media_metadata = scanned_media_metadata
79+
80+
81+
async def process_media(contents):
82+
"""
83+
84+
Raises:
85+
HarmfulMediaFoundException If the
86+
"""
87+
88+
scanned_media = await shield.scan_media_from_bytes(contents, "image/jpeg")
89+
if scanned_media.matches_known_image:
90+
raise HarmfulMediaFoundException(scanned_media)
91+
92+
# do more processing here.
93+
...
94+
95+
96+
async def main():
97+
with open("some-image.jpeg", "rb") as f:
98+
contents = f.read()
99+
await process_media(contents=contents)
100+
101+
102+
if __name__ == '__main__':
103+
asyncio.get_event_loop().run_until_complete(main())
104+
```
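When many uploads arrive at once, the async client lets you fan scans out concurrently with `asyncio.gather` instead of awaiting them one at a time. The sketch below uses a stub coroutine in place of `shield.scan_media_from_bytes` so it runs standalone; in real use you would substitute the SDK call from the example above.

```python
import asyncio


async def scan(contents: bytes) -> bool:
    """Stub standing in for `await shield.scan_media_from_bytes(...)`."""
    await asyncio.sleep(0)  # simulate a network round-trip
    return False  # pretend nothing matched a known image


async def scan_all(batches):
    # Launch all scans concurrently and collect results in order.
    return await asyncio.gather(*(scan(contents) for contents in batches))


results = asyncio.run(scan_all([b"img-1", b"img-2", b"img-3"]))
```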
