Enable lr_infer_he_sgx in Anolis OS #190

Open · wants to merge 1 commit into base: branch-dev/lr_he_grpc
94 changes: 94 additions & 0 deletions cczoo/lr_infer_he_sgx/Anolisos.dockerfile
@@ -0,0 +1,94 @@
FROM openanolis/anolisos:8.4-x86_64 AS Anolisos

ENV INSTALL_PREFIX=/usr/local
ENV LD_LIBRARY_PATH=${INSTALL_PREFIX}/lib:${INSTALL_PREFIX}/lib64:${LD_LIBRARY_PATH}
ENV PATH=${INSTALL_PREFIX}/bin:${PATH}
# Add steps here to set up dependencies
RUN yum install -y \
openssl-devel \
libcurl-devel \
protobuf-devel \
yum-utils.noarch \
python3 \
wget

# Intel SGX
RUN mkdir /opt/intel && cd /opt/intel \
&& wget https://mirrors.openanolis.cn/inclavare-containers/bin/anolis8.4/sgx-2.15.1/sgx_rpm_local_repo.tar.gz
RUN cd /opt/intel && sha256sum sgx_rpm_local_repo.tar.gz \
&& tar xvf sgx_rpm_local_repo.tar.gz \
&& yum-config-manager --add-repo file:///opt/intel/sgx_rpm_local_repo \
&& yum --nogpgcheck install -y libsgx-urts libsgx-launch libsgx-epid libsgx-quote-ex libsgx-dcap-ql libsgx-uae-service libsgx-dcap-quote-verify-devel
RUN yum groupinstall -y 'Development Tools'

# COPY patches/libsgx_dcap_quoteverify.so /usr/lib64/
RUN yum install -y --nogpgcheck sgx-dcap-pccs libsgx-dcap-default-qpl

# Gramine
ENV GRAMINEDIR=/gramine
ENV SGX_DCAP_VERSION=DCAP_1.11
ENV GRAMINE_VERSION=v1.3.1
ENV ISGX_DRIVER_PATH=${GRAMINEDIR}/driver
ENV PKG_CONFIG_PATH=/usr/local/lib64/pkgconfig/
ENV LC_ALL=C.UTF-8 LANG=C.UTF-8
ENV WERROR=1
ENV SGX=1
ENV GRAMINE_PKGLIBDIR=/usr/local/lib64/gramine
ENV ARCH_LIBDIR=/lib64

RUN yum install -y gawk bison python3-click python3-jinja2 golang ninja-build
RUN yum install -y openssl-devel protobuf-c-devel python3-protobuf protobuf-c-compiler protobuf-compiler
RUN yum install -y gmp-devel mpfr-devel libmpc-devel isl-devel nasm python3-devel mailcap

RUN ln -s /usr/bin/python3 /usr/bin/python \
&& pip3 install --upgrade pip \
&& pip3 install toml meson wheel cryptography paramiko pyelftools

RUN git clone https://github.com/gramineproject/gramine.git ${GRAMINEDIR} \
&& cd ${GRAMINEDIR} \
&& git checkout ${GRAMINE_VERSION}

RUN git clone https://github.com/intel/SGXDataCenterAttestationPrimitives.git ${ISGX_DRIVER_PATH} \
&& cd ${ISGX_DRIVER_PATH} \
&& git checkout ${SGX_DCAP_VERSION}

ENV LD_LIBRARY_PATH=${INSTALL_PREFIX}/lib:${INSTALL_PREFIX}/lib64:${LD_LIBRARY_PATH}
RUN cd ${GRAMINEDIR} \
&& LD_LIBRARY_PATH="" meson setup build/ --buildtype=debug -Dprefix=${INSTALL_PREFIX} -Ddirect=enabled -Dsgx=enabled -Ddcap=enabled -Dsgx_driver=dcap1.10 -Dsgx_driver_include_path=${ISGX_DRIVER_PATH}/driver/linux/include \
&& LD_LIBRARY_PATH="" ninja -C build/ \
&& LD_LIBRARY_PATH="" ninja -C build/ install
RUN gramine-sgx-gen-private-key

RUN echo "enabled=0" > /etc/default/apport
RUN echo "exit 0" > /usr/sbin/policy-rc.d

RUN mkdir -p ${INSTALL_PREFIX} \
&& wget -q -O cmake-linux.sh https://github.com/Kitware/CMake/releases/download/v3.19.6/cmake-3.19.6-Linux-x86_64.sh \
&& sh cmake-linux.sh -- --skip-license --prefix=${INSTALL_PREFIX} \
&& rm cmake-linux.sh

# Clean tmp files
RUN yum -y clean all \
&& rm -rf /var/cache \
&& rm -rf ~/.cache/* \
&& rm -rf /tmp/*

ENV WORKSPACE=/lr_infer_he_sgx
WORKDIR ${WORKSPACE}

COPY src ./src
COPY datasets ./datasets
COPY cmake ./cmake
COPY CMakeLists.txt \
start_service.sh \
infer_server.manifest.template \
Makefile ./

RUN cmake -S. -Bbuild \
&& cmake --build build \
&& cp build/src/infer_server . \
&& cp datasets/lrtest_mid_lrmodel.csv . \
&& make clean \
&& ENTRYPOINT=infer_server make

RUN echo "/lr_infer_he_sgx/start_service.sh" >> ~/.bashrc
11 changes: 4 additions & 7 deletions cczoo/lr_infer_he_sgx/Dockerfile
@@ -77,17 +77,17 @@ RUN if [ -z "$AZURE" ]; then \
# Gramine
ENV GRAMINEDIR=/gramine
ENV SGX_DCAP_VERSION=DCAP_1.11
ENV GRAMINE_VERSION=v1.2
ENV GRAMINE_VERSION=v1.3.1
ENV ISGX_DRIVER_PATH=${GRAMINEDIR}/driver
ENV WERROR=1
ENV SGX=1

RUN apt-get install -y bison gawk nasm python3-click python3-jinja2 ninja-build pkg-config \
libcurl4-openssl-dev libprotobuf-c-dev python3-protobuf protobuf-c-compiler \
libcurl4-openssl-dev libprotobuf-c-dev python3-protobuf protobuf-c-compiler protobuf-compiler \
libgmp-dev libmpfr-dev libmpc-dev libisl-dev

RUN pip3 install --upgrade pip \
&& pip3 install toml meson cryptography
&& pip3 install toml meson cryptography pyelftools

RUN git clone https://github.com/gramineproject/gramine.git ${GRAMINEDIR} \
&& cd ${GRAMINEDIR} \
@@ -136,7 +136,4 @@ RUN cmake -S. -Bbuild \
&& make clean \
&& ENTRYPOINT=infer_server make

RUN echo "/lr_infer_he_sgx/start_service.sh" >> ~/.bashrc

ENV http_proxy=
ENV https_proxy=
RUN echo "/lr_infer_he_sgx/start_service.sh" >> ~/.bashrc
8 changes: 4 additions & 4 deletions cczoo/lr_infer_he_sgx/README.md
@@ -34,19 +34,19 @@ For deployments on Microsoft Azure:
```shell
AZURE=1 ./build_docker_image.sh
```
For other cloud deployments:
For Anolis OS cloud deployments:
```shell
./build_docker_image.sh
./build_docker_image.sh anolisos
```
### Execution
Open 2 terminals: one for the inference client, which holds the data to be inferred, and the other for the inference server, which holds an AI model.
- Inference server
```
./start_container.sh server
./start_container.sh server [ubuntu/anolisos]
```
- Inference client
```
./start_container.sh client
./start_container.sh client [ubuntu/anolisos]
```
### Result
>EncryptionParameters: wrote 91 bytes
11 changes: 11 additions & 0 deletions cczoo/lr_infer_he_sgx/build_docker_image.sh
@@ -28,6 +28,16 @@ proxy_server="" # your http proxy server

cd `dirname $0`

if [ "$1" = "anolisos" ]; then
DOCKER_BUILDKIT=0 docker build \
--build-arg no_proxy=${no_proxy} \
--build-arg http_proxy=${proxy_server} \
--build-arg https_proxy=${proxy_server} \
--build-arg AZURE=${azure} \
-f Anolisos.dockerfile \
-t anolisos_lr_infer_he_sgx:latest \
.
else
DOCKER_BUILDKIT=0 docker build \
--build-arg no_proxy=${no_proxy} \
--build-arg http_proxy=${proxy_server} \
@@ -36,3 +46,4 @@
-f Dockerfile \
-t lr_infer_he_sgx:latest \
.
fi
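The dispatch added above can be sketched as a standalone function. This is an illustrative sketch only: `build_image` is a hypothetical helper, and the `echo` lines stand in for the actual `docker build` invocations so the branching is easy to see.

```shell
# Sketch of the build dispatch in build_docker_image.sh: the first
# positional argument selects the dockerfile/tag pair. "build_image"
# is hypothetical; echo stands in for the real docker build call.
build_image() {
    if [ "$1" = "anolisos" ]; then
        echo "docker build -f Anolisos.dockerfile -t anolisos_lr_infer_he_sgx:latest ."
    else
        echo "docker build -f Dockerfile -t lr_infer_he_sgx:latest ."
    fi
}

build_image anolisos  # prints the Anolisos.dockerfile variant
```

Any argument other than `anolisos` (including none) falls through to the default Ubuntu-based build, matching the script's `else` branch.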
10 changes: 8 additions & 2 deletions cczoo/lr_infer_he_sgx/start_container.sh
@@ -25,6 +25,12 @@ if [ -z "$1" ]; then
exit 1
fi

docker_image="lr_infer_he_sgx:latest"

if [ "$2" = "anolisos" ]; then
docker_image="anolisos_lr_infer_he_sgx:latest"
fi

# You can remove no_proxy and proxy_server if your network doesn't need it
no_proxy="localhost,127.0.0.1"
proxy_server="" # your http proxy server
@@ -42,7 +48,7 @@ if [ $1 = "client" ]; then
-e no_proxy=${no_proxy} \
-e http_proxy=${proxy_server} \
-e https_proxy=${proxy_server} \
lr_infer_he_sgx:latest \
${docker_image} \
/lr_infer_he_sgx/build/src/infer_client --data datasets/lrtest_mid_eval.csv
elif [ $1 = "server" ]; then
container=$(echo `docker ps -a | grep infer_server`)
@@ -60,7 +66,7 @@ elif [ $1 = "server" ]; then
-e no_proxy=${no_proxy} \
-e http_proxy=${proxy_server} \
-e https_proxy=${proxy_server} \
lr_infer_he_sgx:latest \
${docker_image} \
/lr_infer_he_sgx/infer_server
else
Usage
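The image-selection change at the top of start_container.sh can be sketched in isolation. `select_image` is a hypothetical helper written here only to show the default-plus-override behavior; the real script assigns `docker_image` inline.

```shell
# Sketch of the image selection in start_container.sh: the second
# positional argument ($2) overrides the default Ubuntu-based image.
# "select_image" is illustrative, not part of the script.
select_image() {
    docker_image="lr_infer_he_sgx:latest"
    if [ "$2" = "anolisos" ]; then
        docker_image="anolisos_lr_infer_he_sgx:latest"
    fi
    echo "$docker_image"
}

select_image server anolisos  # prints anolisos_lr_infer_he_sgx:latest
select_image client           # prints lr_infer_he_sgx:latest
```

Because only `$2` is inspected, the role argument (`client`/`server`) and the OS argument stay independent, which is why both invocations in the README take the optional `[ubuntu/anolisos]` suffix.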