AWS AI Hack Day Micro Conference stretches into a full-day hackathon at the AWS GenAI Loft in San Francisco, giving developers the time to go deeper, collaborate longer, and actually ship what they start.
What’s happening on the ground:
⚡ Hands-on technical challenge powered by FriendliAI
🧠 A 30-minute session on scaling inference and agents
👀 An early preview of Friendli Agent shared live with the community
📍 AWS GenAI Loft, 525 Market St
🗓️ August 22, 9:30 AM – 8:00 PM PT
If you're ready to experiment, connect, and create, this is where the Bay Area's AI community will be. Register here → https://lu.ma/aws-08-22-25
The CLI binary is available as `fa` (alias: `friendli-app`). It manages app deploy/update and basic Suite operations.
- Install (Python 3.11+) with pip:

      pip install friendli-app
- Authenticate
  - Get a Personal Access Token: https://friendli.ai/suite/setting/tokens
  - Export it, or pass it per command with `--token`:

        export FRIENDLI_TOKEN=YOUR_PAT

        # Verify
        fa whoami

        # Or without exporting
        fa --token YOUR_PAT whoami
- Deploy an example app

  Each app must contain a `main.py` at its root. Example apps live under `examples/`. (Reading the `-e` environment variables inside the app is sketched just after this list.)

      # Deploy examples/simple-app with a name, project, and optional env vars
      fa deploy examples/simple-app \
        --name my-simple-app \
        --project-id <PROJECT_ID> \
        -e KEY1=VALUE1 -e KEY2=VALUE2
- Manage apps

      # List apps in a project
      fa list --project-id <PROJECT_ID>

      # Update an app's source archive from a directory
      fa update <APP_ID> ./path/to/app

      # Restart or terminate
      fa restart <APP_ID>
      fa terminate <APP_ID>
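Variables passed with `-e` are provided to the deployed app at runtime. Assuming they surface as ordinary process environment variables (an assumption based on the `KEY=VALUE` form, not confirmed here), a minimal sketch of reading them inside the app's `main.py`:

    import os

    # KEY1 and KEY2 match the -e flags in the deploy command above;
    # the fallback values are purely illustrative.
    key1 = os.environ.get("KEY1", "unset")
    key2 = os.environ.get("KEY2", "unset")
    print(f"KEY1={key1} KEY2={key2}")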
Notes
- Command alias: `friendli-app` is identical to `fa`.
- Deploying an example prints a link to view status in the Suite UI.
Global options
- `--token`: Personal access token; overrides `FRIENDLI_TOKEN`.
- `-h, --help`: Show help for the current command.
Common
- `fa whoami`: Show logged-in user info.
  - Usage: `fa whoami` or `fa --token <PAT> whoami`
- `fa version`: Show CLI version.
  - Usage: `fa version`
Apps
- `fa deploy <APP_DIR>`: Deploy an app directory.
  - Options: `-n, --name <NAME>`, `-p, --project-id <PROJECT_ID>`, `-e, --env KEY=VALUE` (repeatable)
  - Notes: `main.py` must exist at the app root; ~50 MB directory limit; detects `pyproject.toml` or `requirements.txt` to bundle dependencies.
  - Example: `fa deploy examples/simple-app -n my-simple-app -p <PROJECT_ID> -e KEY1=VALUE1`
- `fa update <APP_ID> <APP_DIR>`: Update an app's source archive.
  - Notes: `main.py` required; ~50 MB limit.
  - Example: `fa update <APP_ID> ./my-app`
- `fa list --project-id <PROJECT_ID>`: List apps in a project.
  - Example: `fa list -p <PROJECT_ID>`
- `fa restart <APP_ID>`: Restart an app.
  - Example: `fa restart <APP_ID>`
- `fa terminate <APP_ID>`: Terminate an app.
  - Example: `fa terminate <APP_ID>`
Each example is its own Python project. See the example’s README for setup, dependencies, and usage.
- examples/simple-app: Minimal AgentApp with sync/async callbacks and streaming. README
- examples/streaming-chat-memory: Streaming chat with persistent memory (mem0), OpenAI-compatible `/v1/chat/completions` (see the client sketch after this list). README
- examples/daily-assistant-mcp: MCP server exposing practical tools (tip calc, timezone, BMI, password). README
- examples/debug-echo: Tiny FastAPI echo service for connectivity testing. README
- examples/debug-fai: FastAPI app calling Friendli Serverless via OpenAI SDK; includes passthrough endpoint. README
- examples/langgraph-research-agent: LangGraph multi-agent research workflow with streaming. README
- examples/async-crewai-agent: CrewAI-based background task agent with progress and results endpoints. README
- examples/adk-multi-agent-research: Google ADK-style multi-agent research FastAPI service. README
- examples/autogen-dev-team: AutoGen multi-agent dev team orchestrating design→code→review. README
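Because examples/streaming-chat-memory exposes an OpenAI-compatible `/v1/chat/completions` route, it can be exercised with the OpenAI Python SDK. A minimal sketch, where `<APP_URL>` is a hypothetical placeholder for the deployed app's base URL, and the model name and API-key handling are assumptions (both are app-specific, not documented here):

    from openai import OpenAI

    # <APP_URL> is a hypothetical placeholder; point it at the deployed app.
    client = OpenAI(base_url="<APP_URL>/v1", api_key="app-specific-or-unused")

    resp = client.chat.completions.create(
        model="default",  # assumption: model selection is app-specific
        messages=[{"role": "user", "content": "Hi! Please remember my name is Ada."}],
    )
    print(resp.choices[0].message.content)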
Build lightweight HTTP agents using the SDK in `friendli_app.sdk`.
- Import: `from friendli_app.sdk import AgentApp`
- Define callbacks with `@app.callback` (sync, async, or generators for streaming)
- Run locally: `python main.py` (uses Uvicorn under the hood)
- Invoke: `POST /callbacks/{callback_name}` with a JSON body
Example

    import asyncio

    from friendli_app.sdk import AgentApp

    app = AgentApp()

    @app.callback
    def greet(name: str = "World"):
        return {"message": f"Hello, {name}!"}

    @app.callback
    async def greet_async(name: str = "World"):
        await asyncio.sleep(1)
        return {"message": f"Hello, {name}! (async)"}

    @app.callback
    def stream(n: int = 3):
        for i in range(n):
            yield {"i": i, "msg": f"chunk {i+1}/{n}"}

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)
Invoke callbacks
- JSON response:

      curl -s -X POST http://localhost:8080/callbacks/greet \
        -H 'Content-Type: application/json' \
        -d '{"name": "Ada"}'

- Streaming (SSE):

      curl -N -X POST http://localhost:8080/callbacks/stream \
        -H 'Content-Type: application/json' \
        -H 'Accept: text/event-stream' \
        -d '{"n": 5}'
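The same callbacks can be invoked from Python. A minimal client sketch using `httpx` (an illustration only; `httpx` is not a project dependency, and any HTTP client works), mirroring the curl calls above:

    import httpx

    BASE = "http://localhost:8080"

    # Plain JSON response
    resp = httpx.post(f"{BASE}/callbacks/greet", json={"name": "Ada"})
    print(resp.json())  # {"message": "Hello, Ada!"}

    # Streaming (SSE): print data lines off the event stream as they arrive
    with httpx.stream(
        "POST",
        f"{BASE}/callbacks/stream",
        json={"n": 5},
        headers={"Accept": "text/event-stream"},
    ) as stream_resp:
        for line in stream_resp.iter_lines():
            if line.startswith("data:"):
                print(line.removeprefix("data:").strip())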
Notes
- The request body JSON is mapped directly to the callback function parameters.
- Generator or async-generator callbacks stream Server-Sent Events (`text/event-stream`).
- To deploy an SDK app with the CLI, ensure your project root contains a `main.py` with an `AgentApp` instance.