
Commit cb5f54e

Merge pull request #246 from Labelbox/3.0release
notebook cleanup
2 parents fd5fd48 + 21c6122 commit cb5f54e

8 files changed (+694, -696 lines changed)


examples/annotation_types/basics.ipynb

Lines changed: 24 additions & 24 deletions
@@ -6,12 +6,12 @@
   "metadata": {},
   "source": [
    "## Annotation Types\n",
-   "This is a common format for representing human and machine generated annotations. A standard interface allows us to build tools that only need to work with a single interface. For example, if model predictions and labels are all represented by a common format we can write all of our etl, visualization code, training code to work with a single interface. Annotation types can also provide a seamless transition between local modeling and using labelbox. Some of the helper functions include:\n",
+   "This is a common format for representing human and machine generated annotations. A standard interface allows us to build one set of tools that is compatible with all of our data. For example, if model predictions and labels are all represented by a common format we can write all of our etl, visualization code, training code to work with a single interface. Annotation types can also provide a seamless transition between local modeling and using labelbox. Some of the helper functions include:\n",
    "* Build annotations locally with local file paths, numpy arrays, or urls and create data rows with a single line of code\n",
-   "* Easily upload model predictions by converting predictions to \n",
-   "* Configure project ontology from model inferences\n",
-   "* Easily access video data without having to worry about downloading each frame.\n",
-   "* Helper functions for drawing annotations, converting them into shapely obejects, and much more."
+   "* Easily upload model predictions for MAL or MEA by converting annotation objects to the import format\n",
+   "* Configure project ontologies from a set of model inferences\n",
+   "* Easily access video data without having to worry about downloading each frame's annotations.\n",
+   "* Helper functions for drawing annotations, converting them into shapely objects, and much more."
    ]
   },
   {
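The cell above describes building annotations locally from file paths, numpy arrays, or urls. A minimal sketch of that workflow, assuming the labelbox 3.x annotation-type classes this notebook works with (the url and the "car" feature name are placeholders):

```python
from labelbox.data.annotation_types import (
    Label, ImageData, ObjectAnnotation, Rectangle, Point
)

# Build a label locally from an image url and a single bounding box.
label = Label(
    data=ImageData(url="https://example.com/image.jpg"),  # placeholder url
    annotations=[
        ObjectAnnotation(
            name="car",  # should match a tool name in your ontology when uploading
            value=Rectangle(start=Point(x=10, y=20), end=Point(x=110, y=220)),
        )
    ],
)
```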
@@ -23,7 +23,7 @@
    "## Installation\n",
    "* Installing annotation types requires a slightly different pattern\n",
    " - `pip install \"labelbox[data]\"`\n",
-   "* `pip install labelbox` is still valid but it won't add the required dependencies. If you only want the client functionality of the SDK then don't add the data extras. However, you will likely get import errors if attempting to use the annotation types"
+   "* `pip install labelbox` is still valid but it won't add the required dependencies. If you only want the client functionality of the SDK then don't add the [data] extras. However, you will likely get import errors if attempting to use the annotation types"
    ]
   },
   {
@@ -33,7 +33,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "!pip install \"labelbox[data]\" --pre"
+   "!pip install \"labelbox[data]\""
    ]
   },
   {
@@ -152,7 +152,7 @@
    " - `project.label_generator()`\n",
    " - `project.video_label_generator()`\n",
    "3. Use a converter to load from another format\n",
-   " - Covered in converters.ipynb notebook."
+   " - Covered in the converters.ipynb notebook."
    ]
   },
   {
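The options above cover local construction, project exports, and converters. A short sketch of the export path, assuming a configured API key and an existing project id (both placeholders):

```python
from labelbox import Client

client = Client(api_key="<API_KEY>")           # placeholder credentials
project = client.get_project("<PROJECT_ID>")   # placeholder project id

# Returns a LabelGenerator that lazily yields annotation-type Labels.
for label in project.label_generator():
    print(len(label.annotations))
    break
```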
@@ -161,7 +161,7 @@
   "metadata": {},
   "source": [
    "### Basic LabelCollection\n",
-   "* A Label collection is either a labelList or LabelGenerator containing Labels\n",
+   "* A Label collection is either a `LabelList` or `LabelGenerator` containing `Labels`\n",
    " * More on this in label_containers.ipynb\n",
    "* Each label contains:\n",
    " 1. Data\n",
@@ -193,7 +193,7 @@
   "id": "circular-router",
   "metadata": {},
   "source": [
-   "* All models are pydantic so we can easily convert all of our objects to dictionaries and view the schema."
+   "* All models are pydantic models so we can easily convert all of our objects to dictionaries and view the schema."
    ]
   },
   {
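Because the annotation types are pydantic models, the usual pydantic helpers apply. A small illustration, reusing the `label` object sketched earlier:

```python
from labelbox.data.annotation_types import Label

# Convert the whole label to nested python dictionaries.
label_dict = label.dict()
print(list(label_dict.keys()))

# Inspect the JSON-schema style description of any annotation type.
print(Label.schema()["title"])
```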
@@ -354,8 +354,8 @@
   "source": [
    "#### Non-public urls\n",
    "* If the urls in your data is not publicly accessible you can override the fetching logic\n",
-   "* For TextData and ImageData overwrite the following function and make sure it has the same signature. `data.fetch_remote(self) -> bytes`.\n",
-   "* For VideoData, the signature is `VideoData.fetch_remote(self, local_path)`. This function needs to download the video file locally to that local_path to work."
+   "* For `TextData` and `ImageData` overwrite the following function and make sure it has the same signature. `data.fetch_remote(self) -> bytes`.\n",
+   "* For `VideoData`, the signature is `VideoData.fetch_remote(self, local_path)`. This function needs to download the video file locally to that local_path to work."
    ]
   },
   {
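One way to apply the override described above is to subclass the data class and swap in authenticated fetching. A sketch assuming `requests` and a bearer token (placeholder):

```python
import requests

from labelbox.data.annotation_types import ImageData


class PrivateImageData(ImageData):
    """ImageData whose url requires an authorization header."""

    def fetch_remote(self) -> bytes:
        # Same signature as the base implementation; only the transport changes.
        response = requests.get(self.url, headers={"Authorization": "Bearer <TOKEN>"})
        response.raise_for_status()
        return response.content
```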
@@ -382,27 +382,27 @@
   "metadata": {},
   "source": [
    "* There are 4 types of annotations\n",
-   " 1. ObjectAnnotation\n",
+   " 1. `ObjectAnnotation`\n",
    " - Objects with location information\n",
    " - Annotations that are found in the object field of the labelbox export\n",
-   " - Classes: Point, Polygon, Mask, Line, Rectangle, Named Entity\n",
-   " 2. ClassificationAnnotation\n",
+   " - Classes: `Point`, `Polygon`, `Mask`, `Line`, `Rectangle`, `TextEntity`\n",
+   " 2. `ClassificationAnnotation`\n",
    " - Classifications that can apply to data or another annotation\n",
-   " - Classes: Checklist, Radio, Text, Dropdown\n",
-   " 3. VideoObjectAnnotation\n",
+   " - Classes: `Checklist`, `Radio`, `Text`, `Dropdown`\n",
+   " 3. `VideoObjectAnnotation`\n",
    " - Same as object annotation but there are extra fields for video information\n",
-   " 4. VideoClassificationAnnotation\n",
+   " 4. `VideoClassificationAnnotation`\n",
    " - Same as classification annotation but there are extra fields for video information \n",
    "-------- \n",
    "* Create an annotation by providing the following:\n",
    "1. Value\n",
-   " - Must be either a Geometry, TextEntity, or Classification\n",
+   " - Must be either a `Geometry`, `TextEntity`, or `Classification`\n",
    " - This is the same as a top level tool in labelbox\n",
-   "2. name or feature_schema_id\n",
+   "2. Name or feature_schema_id\n",
    " - This is the id that corresponds to a particular class or just simply the class name\n",
    " - If uploading to labelbox this must match a field in an ontology.\n",
    "3. (Optional) Classifications \n",
-   " - List of ClassificationAnnotations. This self referencing field enables infinite nesting of classifications.\n",
+   " - List of `ClassificationAnnotations`. This self referencing field enables infinite nesting of classifications.\n",
    " - Be careful with how you use the tool. Labelbox does not support nesting classifications\n",
    " - E.g. you can have tool.classifications but not tool.classifications[0].classifications\n",
    " "
@@ -1027,7 +1027,7 @@
   "outputs": [],
   "source": [
    "def signing_function(obj_bytes: bytes) -> str:\n",
-   " # WARNING: Do not use this signer. You will not be able to resign these images at a later date\n",
+   " # Do not use this signer. You will not be able to resign these images at a later date\n",
    " url = client.upload_data(content=obj_bytes, sign=True)\n",
    " return url"
    ]
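The notebook's signer pushes bytes straight to Labelbox storage, which is why the comment warns that the urls cannot be re-signed later. A sketch of an alternative signer with the same `bytes -> str` signature that writes to your own bucket instead (assumes google-cloud-storage, application-default credentials, and a pre-existing bucket; the bucket and key prefix are placeholders):

```python
import datetime
import uuid

from google.cloud import storage

gcs_client = storage.Client()                       # application-default credentials
bucket = gcs_client.bucket("my-annotation-assets")  # placeholder bucket name


def gcs_signing_function(obj_bytes: bytes) -> str:
    # Persist the object under a unique key so it can be re-signed later.
    blob = bucket.blob(f"annotation-assets/{uuid.uuid4()}")
    blob.upload_from_string(obj_bytes)
    return blob.generate_signed_url(expiration=datetime.timedelta(days=7))
```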
@@ -1138,7 +1138,7 @@
   "metadata": {},
   "source": [
    "### Creating Data Rows\n",
-   "* Our Labels objects are great for working with locally but we might want to upload to labelbox\n",
+   "* `Labels` objects are great for working with locally but we might want to upload to labelbox\n",
    "* This is required for MAL, MEA, and to add additional labels to the data.\n"
    ]
   },
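Creating the data rows themselves goes through the regular client. A sketch assuming the `client` and `label` objects from the earlier sketches, and that the label's data has a publicly reachable (or signed) url:

```python
# Placeholder dataset name; client and label come from the earlier sketches.
dataset = client.create_dataset(name="annotation-types-demo")
data_row = dataset.create_data_row(row_data=label.data.url)
print(data_row.uid)
```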
@@ -1253,7 +1253,7 @@
   "source": [
    "### Next Steps\n",
    "* Annotation types should be thought of as low level interfaces\n",
-   "* We are working on a set of tools to make this less verbose. Please provide any feedback!\n",
+   "* We are working on a set of tools to make working with annotation types less verbose. Please provide any feedback!\n",
    "* Checkout other notebooks to see how to use higher level tools that are compatible with these interfaces"
    ]
   },

examples/annotation_types/converters.ipynb

Lines changed: 10 additions & 12 deletions
@@ -6,16 +6,16 @@
   "metadata": {},
   "source": [
    "# Converters\n",
-   "* The goal is to create a set of converts that convert to and from the labelbox object format.\n",
+   "* The goal is to create a set of converters that convert to and from labelbox annotation types.\n",
    "* This is automatically used when exporting labels from labelbox with:\n",
-   " 1. label.label_generator()\n",
-   " 2. label.video_label_generator()\n",
+   " 1. `label.label_generator()`\n",
+   " 2. `label.video_label_generator()`\n",
    "* Currently we support:\n",
    " 1. NDJson Converter\n",
    " - Convert to and from the prediction import format (mea, mal)\n",
    " 2. LabelboxV1 Converter\n",
    " - Convert to and from the prediction import format (mea, mal)\n",
-   "* Converters use the LabelGenerator by default to minimize memory but are compatible with LabelLists"
+   "* Converters use the `LabelGenerator` by default to minimize memory but are compatible with `LabelList`s"
    ]
   },
   {
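A sketch of a round trip through the NDJson converter, assuming a `label_list` like the one used later in this notebook and that `deserialize` is available alongside `serialize`:

```python
from labelbox.data.serialization import NDJsonConverter

# LabelList (or LabelGenerator) -> MAL/MEA-ready ndjson rows ...
ndjson_rows = list(NDJsonConverter.serialize(label_list))

# ... and back into annotation types.
labels_again = NDJsonConverter.deserialize(ndjson_rows)
```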
@@ -25,7 +25,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "!pip install \"labelbox[data]\" --pre"
+   "!pip install \"labelbox[data]\""
    ]
   },
   {
@@ -104,8 +104,8 @@
   "metadata": {},
   "source": [
    "### Video\n",
-   "* No longer need to download urls for each data row. This happens in the background of the converter\n",
-   "* Easy to draw annotations directly exported from labelbox"
+   "* Users no longer need to download urls for each data row. This happens in the background of the converter\n",
+   "* It is easy to draw annotations directly exported from labelbox"
    ]
   },
   {
@@ -487,7 +487,7 @@
   }
  ],
  "source": [
-  "# We can also reserialize:\n",
+  "# We can also serialize back to the original payload:\n",
   "for result in LBV1Converter.serialize(label_list):\n",
   " print(result)"
  ]
@@ -498,8 +498,8 @@
   "metadata": {},
   "source": [
    "## NDJson Converter\n",
-   "* Converts common annotation types into the ndjson format.\n",
-   "* Only supports MAL tools. So videos annotated with bounding boxes can't be converted"
+   "* Converts common annotation types into the ndjson format\n",
+   "* Only tools that are compatible with MAL are supported"
    ]
   },
   {
@@ -520,8 +520,6 @@
   }
  ],
  "source": [
-  "# TODO: Throw an error on these video annotations..\n",
-  "\n",
   "ndjson = []\n",
   "for row in NDJsonConverter.serialize(label_list):\n",
   " ndjson.append(row)\n",

examples/annotation_types/label_containers.ipynb

Lines changed: 6 additions & 6 deletions
@@ -20,7 +20,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "!pip install \"labelbox[data]\" --pre"
+   "!pip install \"labelbox[data]\""
    ]
   },
   {
@@ -256,8 +256,8 @@
   "source": [
    "# LabelList\n",
    "* This object is essentially a list of Labels with a set of helpful utilties\n",
-   "* This object is simple and fast at the expense of memory\n",
-   " * Larger datasets shouldn't use label list ( or at least will require more memory ).\n",
+   "* It is simple and fast at the expense of memory\n",
+   " * Larger datasets shouldn't use label list ( or at least will require more memory )\n",
    "* Why use label list over just a list of labels?\n",
    " * Multithreaded utilities (faster)\n",
    " * Compatible with converter functions (functions useful for translating between formats, etl, and training )"
@@ -273,7 +273,7 @@
   "labels = get_labels()\n",
   "label_list = LabelList(labels)\n",
   "\n",
-  "# Also build label lists iteratively\n",
+  "# Also build LabelLists iteratively\n",
   "label_list = LabelList()\n",
   "for label in labels:\n",
   " label_list.append(label)"
@@ -429,10 +429,10 @@
   "source": [
    "# LabelGenerator\n",
    "* This object generates labels and provides a set of helpful utilties\n",
-   "* This object is complex and slower than LabelList in order to be highly memory efficient\n",
+   "* This object is complex and slower than the `LabelList` in order to be highly memory efficient\n",
    " * Larger datasets should use label generators\n",
    "* Why use label generator over just a generator that yields labels?\n",
-   " * This object supports parallel io operations to buffer results in the background.\n",
+   " * Parallel io operations are run in the background to prepare results\n",
    " * Compatible with converter functions (functions useful for translating between formats, etl, and training )\n",
    "* The first qsize elements run serially from when the chained functions are added.\n",
    " * After that iterating will get much faster."

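To make the memory trade-off above concrete, a small sketch: the generator streams labels one at a time, while the list holds everything at once. It assumes the `get_labels()` helper defined earlier in this notebook and that `LabelGenerator` accepts an iterable of labels via its `data` argument, as `LabelList` does:

```python
from labelbox.data.annotation_types import LabelGenerator, LabelList

labels = get_labels()                      # helper defined earlier in the notebook

# Streams labels lazily; large exports never need to fit in memory at once.
label_generator = LabelGenerator(data=labels)
for label in label_generator:
    print(len(label.annotations))
    break

# Holds every label in memory, but gains list semantics and fast re-iteration.
label_list = LabelList(labels)
```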
examples/annotation_types/mal_using_annotation_types.ipynb

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "!pip install \"labelbox[data]\" --pre"
+   "!pip install \"labelbox[data]\""
    ]
   },
   {
