All internal data are stored in HWC format, 4 channels per 32-bit word. Assuming 3-color (or 3-channel) input, one byte will be unused. The highest frequency in this data format is the channel, so the channels are interleaved.

Example:
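As an illustration only (a NumPy sketch, not part of the toolchain), interleaved 3-channel data could be packed one pixel per 32-bit word, with the channel bytes in the low three byte positions and the fourth byte unused:

```python
import numpy as np

# Hypothetical 2x2 RGB input in HWC order (height, width, channel);
# the channel index varies fastest, so R, G, B bytes are interleaved.
hwc = np.arange(2 * 2 * 3, dtype=np.uint8).reshape(2, 2, 3)

words = []
for r, g, b in hwc.reshape(-1, 3).tolist():  # one pixel per 32-bit word
    # channels occupy the low three bytes; byte 3 remains unused (zero)
    words.append(r | (g << 8) | (b << 16))

print([hex(w) for w in words])
# → ['0x20100', '0x50403', '0x80706', '0xb0a09']
```

Each word holds one pixel's three channel bytes back-to-back, matching the channel-interleaved ordering described above.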
#### CHW
The input layer can alternatively also use the CHW format (a sequence of channels). The highest frequency in this data format is the width or X-axis (W), and the lowest frequency is the channel. Assuming an RGB input, all red pixels are followed by all green pixels, followed by all blue pixels.

Example:
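A minimal NumPy sketch (illustration only, not toolchain code) of converting interleaved HWC data to the planar CHW layout, where each channel plane is contiguous:

```python
import numpy as np

# Hypothetical 2x2 RGB image in HWC order.
hwc = np.arange(2 * 2 * 3, dtype=np.uint8).reshape(2, 2, 3)

# CHW: move the channel axis to the front so it varies slowest; the
# flattened byte stream is then all R pixels, all G pixels, all B pixels.
chw = hwc.transpose(2, 0, 1)

print(chw.reshape(3, -1))
# each row is one complete channel plane
```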

To train the FP32 model for MNIST on MAX78000, run `scripts/train_mnist.sh` from the `ai8x-training` project. This script will place checkpoint files into the log directory. Training makes use of the Distiller framework, but the `train.py` software has been modified slightly to improve it and add some MAX78000/MAX78002 specifics.
Since training can take hours or days, the training script does not overwrite any previously produced weights. Results are placed in sub-directories under `logs/`, named with the date and time when training began. The latest results are always soft-linked as `latest-log_dir` and `latest_log_file`.
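Because the sub-directory names start with the date and time, plain lexicographic ordering also sorts them chronologically. A small sketch (the directory names below are hypothetical, not the toolchain's exact naming scheme):

```python
# Hypothetical run directories as they might appear under logs/;
# date-first names make lexicographic order equal chronological order.
runs = [
    "2023-05-02.14-10-31",
    "2023-05-01.09-22-05",
    "2023-05-02.08-45-12",
]

latest = max(runs)  # newest run, without following the soft link
print(latest)
# → 2023-05-02.14-10-31
```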
### Command Line Arguments
The following table describes the most important command line arguments for `train.py`. Use `--help` for a complete list.
|`--resume-from`| Resume from previous checkpoint |`--resume-from chk.pth.tar`|
|`--qat-policy`| Define QAT policy in YAML file (default: `qat_policy.yaml`). Use "None" to disable QAT. |`--qat-policy qat_policy.yaml`|
|*Display and statistics*|||
|`--enable-tensorboard`| Enable logging to TensorBoard (default: disabled) ||
#### TensorBoard
TensorBoard is built into `train.py`. When enabled using `--enable-tensorboard`, it provides a local web server that can be started before, during, or after training and it picks up all data that is written to the `logs/` directory.
For classification models, TensorBoard supports the optional `--param-hist` and `--embedding` command line arguments. `--embedding` randomly selects up to 100 data points from the last batch of each verification epoch. These can be viewed in the “projector” tab in TensorBoard.