The need for secure, privacy-preserving data access is becoming increasingly important, particularly in industries like healthcare, finance, and enterprise AI. To address this, the LazAI-DATA-Query-Server-Setup-Kit provides a ready-to-deploy framework for confidential data querying using Trusted Execution Environments (TEEs).
This guide outlines the architecture, features, and setup steps for deploying your own LazAI query server, either locally or in production. If you prefer video, you can watch the livestream workshop recording.
This is a privacy-preserving data query service built on FastAPI that enables secure, confidential data retrieval using Trusted Execution Environment (TEE) technology. The application integrates with Alith LazAI to provide encrypted data querying capabilities while maintaining data privacy.
In today's data-driven world, organizations often need to query sensitive data while maintaining confidentiality, and traditional approaches struggle to deliver both at once. This application addresses that challenge by decrypting data only inside a TEE and answering queries through vector search over the protected store.
┌──────────────────┐    ┌──────────────────┐    ┌──────────────────┐
│ Client App       │    │ FastAPI Node     │    │ TEE Runtime      │
│ - File Upload    │───▶│ - Query API      │───▶│ - Decryption     │
│ - Query Request  │    │ - Auth Middleware│    │ - Processing     │
└──────────────────┘    └──────────────────┘    └──────────────────┘
                                 │
                                 ▼
                        ┌──────────────────┐
                        │ Milvus Store     │
                        │ - Vector Search  │
                        │ - Embeddings     │
                        └──────────────────┘
Before you begin, ensure you have:
⚠️ Important: Before using this data query server, you must register it with the LazAI network. This applies to both localhost development and production deployments.
Registration Requirements
# Clone the repository
git clone https://github.com/0xLazAI/LazAI-DATA-Query-Server-Setup-Kit.git
cd LazAI-DATA-Query-Server-Setup-Kit
# Create virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install dependencies
python -m pip install -r requirements.txt
# Set up environment variables
cp env.example .env
# Edit .env with your actual API keys and configuration
Edit the .env file with your actual credentials:
# DStack configuration
DSTACK_SIMULATOR_ENDPOINT=/tmp/tappd.sock
# OpenAI configuration (for embeddings)
OPENAI_API_KEY=your_openai_api_key_here
# Alith LazAI configuration
PRIVATE_KEY=your_private_key_here
RSA_PRIVATE_KEY_BASE64=your_rsa_private_key_base64_here
LLM_API_KEY=your_llm_api_key_here
LLM_BASE_URL=your_llm_base_url_here
DSTACK_API_KEY=your_dstack_api_key_here
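Since a missing or empty variable typically surfaces later as an opaque runtime error, it can help to fail fast at startup. The sketch below checks for the variable names listed in the `.env` file above; the `check_env` helper is our own addition, not part of the kit.

```python
# Sketch: fail fast if required .env variables are unset or empty.
# Variable names are taken from the env.example listing above.
import os

REQUIRED_VARS = [
    "OPENAI_API_KEY",
    "PRIVATE_KEY",
    "RSA_PRIVATE_KEY_BASE64",
    "LLM_API_KEY",
    "LLM_BASE_URL",
    "DSTACK_API_KEY",
]

def check_env(env: dict) -> list:
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Usage: missing = check_env(dict(os.environ)); raise/exit if non-empty.
```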
The application requires a Trusted Execution Environment simulator for local development.
For macOS:
wget https://github.com/Leechael/tappd-simulator/releases/download/v0.1.4/tappd-simulator-0.1.4-aarch64-apple-darwin.tgz
tar -xvf tappd-simulator-0.1.4-aarch64-apple-darwin.tgz
cd tappd-simulator-0.1.4-aarch64-apple-darwin
./tappd-simulator -l unix:/tmp/tappd.sock
For Linux:
wget https://github.com/Leechael/tappd-simulator/releases/download/v0.1.4/tappd-simulator-0.1.4-x86_64-linux-musl.tgz
tar -xvf tappd-simulator-0.1.4-x86_64-linux-musl.tgz
cd tappd-simulator-0.1.4-x86_64-linux-musl
./tappd-simulator -l unix:/tmp/tappd.sock
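Before starting the FastAPI server, you can confirm the simulator is actually listening on its Unix socket. This small helper is our own convenience check (not part of the kit); the default path matches the `DSTACK_SIMULATOR_ENDPOINT` value from `.env`, so adjust it if you changed that setting.

```python
# Sketch: verify the tappd simulator's Unix socket is up and accepting
# connections before launching the query server.
import os
import socket

def simulator_ready(path: str = "/tmp/tappd.sock") -> bool:
    """Return True if a Unix socket exists at `path` and accepts connections."""
    if not os.path.exists(path):
        return False
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        s.connect(path)
        return True
    except OSError:
        return False
    finally:
        s.close()
```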
In a new terminal (with your virtual environment activated):
# Start the FastAPI development server
python main.py
The server will be available at:
http://127.0.0.1:8000
# Build for your local architecture
docker build -t thirumurugan7/my-tee-app:v2.0.1 .
# Build for multi-platform deployment (recommended for production)
docker buildx build --platform linux/amd64 -t thirumurugan7/my-tee-app:v2.0.1 --push .
# Start the application with docker-compose
docker-compose up -d
The application will be available at:
http://localhost:8000
{
"file_id": "your_file_id",
"query": "What are the main topics discussed in this document?",
"limit": 5
}
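The payload above can be posted with nothing beyond the standard library. Note that the `/query` endpoint path is an assumption here; check the server's interactive API docs (FastAPI serves them at `/docs`) for the actual route.

```python
# Sketch: build a POST request carrying the query payload shown above.
# The "/query" path is assumed; verify it against the server's /docs page.
import json
import urllib.request

def build_query_request(base_url: str, file_id: str, query: str, limit: int = 5):
    payload = {"file_id": file_id, "query": query, "limit": limit}
    return urllib.request.Request(
        f"{base_url}/query",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the server running, urllib.request.urlopen(req) would send it:
# req = build_query_request("http://127.0.0.1:8000", "your_file_id",
#                           "What are the main topics discussed in this document?")
```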
This template is designed for deployment on:
Common issues include:
- TEE Simulator Connection Error
- API Key Errors
- Vector Database Issues
Check application logs for detailed error information:
# Local development
python main.py --host 0.0.0.0 --port 8000
# Docker
docker-compose logs -f app
This project is licensed under the MIT License - see the LICENSE file for details.