Get Started#

Live Video Search is a Metro AI Suite sample that adapts the VSS pipeline for semantic search on live Frigate streams. It ingests live camera streams, indexes video segments with embeddings and timestamped camera metadata, and lets users combine camera selections, time ranges, and free‑text queries to retrieve ranked, playable clips with confidence scores, all while surfacing live system metrics. This guide shows how to start the Live Video Search stack (Smart NVR + VSS Search) using Docker Compose.

Prerequisites#

Project Structure#

live-video-search/
├── config/                        # Local configuration and assets
│   ├── frigate-config/            # Frigate camera configs (active + templates)
│   ├── mqtt-config/               # Mosquitto configuration
│   ├── telemetry/                 # Telemetry collector configs
│   └── nginx.conf                 # NGINX reverse proxy
├── data/                          # Runtime data (recordings, caches)
├── docker/                        # Compose files
│   ├── compose.search.yaml        # VSS Search stack
│   ├── compose.smart-nvr.yaml     # Smart NVR stack
│   └── compose.telemetry.yaml     # Telemetry collector
├── docs/                          # Documentation
│   └── user-guide/                # User guides
├── setup.sh                       # Main setup script
└── README.md                      # Project overview

Set Required Environment Variables#

Before running the application, you need to set several environment variables:

  1. Configure the registry: The application uses the registry URL and image tag to pull the required images.

    export REGISTRY_URL=intel
    export TAG=latest
    
  2. Set required credentials for some services: The following variables MUST be set in your current shell before running the setup script:

    # MinIO credentials (object storage)
    export MINIO_ROOT_USER=<minio-user>
    export MINIO_ROOT_PASSWORD=<minio-pass>
    
    # PostgreSQL credentials (database)
    export POSTGRES_USER=<postgres-user>
    export POSTGRES_PASSWORD=<postgres-pass>
    
    # Embedding model for search
    export EMBEDDING_MODEL_NAME="CLIP/clip-vit-b-32"
    
    # MQTT credentials (Smart NVR)
    export MQTT_USER=<mqtt-user>
    export MQTT_PASSWORD=<mqtt-pass>
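
    Before launching, you can sanity-check that nothing was missed. The loop below is a small sketch that only covers the variable names listed above (it relies on bash indirect expansion):

    # Warn about any required variable missing from the current shell
    for var in MINIO_ROOT_USER MINIO_ROOT_PASSWORD POSTGRES_USER \
               POSTGRES_PASSWORD EMBEDDING_MODEL_NAME MQTT_USER MQTT_PASSWORD; do
        [ -n "${!var}" ] || echo "Missing required variable: $var"
    done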
    

Optional Environment Variables#

You can customize the application behavior by setting the following optional environment variables before running the setup script:

  1. Control the frame extraction interval (Video Search Mode):

    The DataPrep microservice samples frames from uploaded videos according to the FRAME_INTERVAL environment variable. Set this variable before running source setup.sh --search to control how often frames are selected for processing.

    export FRAME_INTERVAL=15
    

    In the example above, DataPrep processes every fifteenth frame: each selected frame (optionally after object detection) is converted into embeddings and stored in the vector database. Lower values improve recall at the cost of higher compute and storage usage, while higher values reduce processing load but may skip important frames. If you do not set this variable, the service falls back to its configured default.
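
    For a rough sense of scale, assuming a 30 fps source (an illustrative rate; actual camera frame rates vary), the effective sampling rate works out as:

    $$ f_{sampled} = \frac{f_{source}}{\text{FRAME\_INTERVAL}} = \frac{30}{15} = 2 \text{ frames per second} $$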

  2. Enable ROI consolidation (Video Search Mode):

    ROI consolidation groups overlapping object detections into merged regions of interest (ROIs) before cropping for embeddings. Enable this feature and tune it with the following environment variables:

    # Enable ROI consolidation (default: false)
    export ROI_CONSOLIDATION_ENABLED=true
    
    # IoU threshold for grouping ROIs (higher = stricter merging)
    export ROI_CONSOLIDATION_IOU_THRESHOLD=0.2
    
    # Only merge ROIs with the same class label when true
    export ROI_CONSOLIDATION_CLASS_AWARE=false
    
    # Expand merged ROIs by a fraction of width/height
    export ROI_CONSOLIDATION_CONTEXT_SCALE=0.2
    

    The IoU calculation follows the standard formula:

    $$ IoU(A, B) = \frac{|A \cap B|}{|A \cup B|} $$
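
    As a worked example with illustrative areas, two detections whose intersection covers 30 units of a 120-unit union yield:

    $$ IoU(A, B) = \frac{30}{120} = 0.25 $$

    Since 0.25 exceeds the example threshold of 0.2 above, that pair of ROIs would be merged.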

  3. To use GPU acceleration for embedding generation, set the following variable before running the setup script:

    # Enable GPU embeddings
    export ENABLE_EMBEDDING_GPU=true
    
  4. To explicitly select devices for DataPrep and the Multimodal Embedding service, set:

    # CPU or GPU
    export VDMS_DATAPREP_DEVICE=GPU
    export EMBEDDING_DEVICE=GPU
    

    If unset, both default to CPU. Setting ENABLE_EMBEDDING_GPU=true forces both to GPU.
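
    One quick sanity check before enabling GPU mode, assuming an Intel GPU exposed to the host through /dev/dri:

    # Intel GPUs typically appear as render nodes such as /dev/dri/renderD128;
    # if nothing is listed here, keep the services on CPU
    ls -l /dev/dri/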

Configure Cameras#

Edit config/frigate-config/config.yml to add or update camera inputs. This is the active Frigate configuration used at startup.

For reference, see the default template in config/frigate-config/config-default.yml.
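
As a rough illustration of the shape of a camera entry (the camera name, address, and credentials below are placeholders; follow the template for the exact schema your Frigate version expects):

cameras:
  front_door:                      # display name shown in the Frigate UI
    ffmpeg:
      inputs:
        - path: rtsp://<user>:<password>@<camera-ip>:554/<stream-path>
          roles:
            - detect
            - record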

Start the Application#

source setup.sh --start

RTSP Test Stream (Out-of-Box)#

Use the bundled sample video to spin up a looped RTSP stream and point Frigate at it.

  1. Start the stack with the RTSP test services:

    source setup.sh --start-rtsp-test
    
  2. Confirm the sample stream is live in Frigate:

    • Open http://<host-ip>:5000 and select the rtsp-garage camera.

This uses config/frigate-config/config-rtsp.yml and publishes config/videos/garage.mp4 over RTSP via mediamtx. Replace the RTSP URL in config/frigate-config/config.yml with your real camera streams when moving to production.
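
To verify the stream outside Frigate, you can probe it from the host. This sketch assumes the ffmpeg tools are installed and that mediamtx serves RTSP on its default port 8554; the stream path is a placeholder, so take the actual URL from config/frigate-config/config-rtsp.yml:

# Prints the stream's codec and resolution details if the feed is up
ffprobe -rtsp_transport tcp rtsp://<host-ip>:8554/<stream-path>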

Access:

  • VSS UI: http://<host-ip>:12345

USB Camera (Direct Frigate Input)#

Use a local USB camera (UVC/V4L2) as the Frigate input without creating an RTSP stream.

  1. Plug in your USB camera and confirm the device node exists (typically /dev/video0).
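
    For example (the second command assumes the v4l-utils package is installed):

    # List video device nodes on the host
    ls -l /dev/video*

    # Map device nodes to camera names
    v4l2-ctl --list-devices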

  2. Start the stack with the USB camera override:

    source setup.sh --start-usb-camera
    
  3. Open Frigate UI at http://<host-ip>:5000 and select the usb-camera feed.

Notes:

  • If your camera is not /dev/video0, update config/frigate-config/config-usb.yml and/or set USB_CAMERA_DEVICE before starting:

    export USB_CAMERA_DEVICE=/dev/video2
    source setup.sh --start-usb-camera
    
  • You can tune resolution and frame rate in config/frigate-config/config-usb.yml under input_args.
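
    As a hedged sketch of what that section might look like (the device path, size, and rate below are illustrative; keep the structure config-usb.yml already uses):

    cameras:
      usb-camera:
        ffmpeg:
          inputs:
            - path: /dev/video0
              input_args: -f v4l2 -video_size 1280x720 -framerate 15
              roles:
                - detect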

Stop or Reset#

# Stop all containers
source setup.sh --down

# Remove volumes, live recordings, and app networks
source setup.sh --clean-data

Telemetry#

Telemetry is enabled for Live Video Search and shows live system metrics in the VSS UI when the collector is connected.

Troubleshooting#

No clips in search results#

  • Confirm cameras are enabled under Configure Cameras in the VSS UI.

  • Verify VSS_SEARCH_URL in setup.sh points to the internal endpoint.

Search results empty after changing model#

  • If you changed EMBEDDING_MODEL_NAME, clean data and re‑ingest:

    • source setup.sh --clean-data

    • source setup.sh --start

Telemetry information is not being displayed#

  • Verify the vss-collector container is running.

  • Check Pipeline Manager status: /manager/metrics/status.
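
    If you need to hit the endpoint directly, a curl of the following shape works; the host and port are placeholders, since they depend on your deployment:

    curl http://<host-ip>:<manager-port>/manager/metrics/status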

MQTT connection errors#

  • Ensure MQTT_USER and MQTT_PASSWORD are set.

  • Confirm mqtt-broker is healthy: docker ps and docker logs mqtt-broker.
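
    To test broker connectivity and credentials directly from the host (assumes the mosquitto-clients package and the broker's default port 1883):

    # Exits after one message or a 5-second timeout
    mosquitto_sub -h localhost -p 1883 -u "$MQTT_USER" -P "$MQTT_PASSWORD" \
                  -t '#' -C 1 -W 5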

Stream disconnects#

  • Check Frigate logs for camera connection errors.

  • Confirm RTSP sources are reachable and credentials are valid.

Docker network label mismatch on startup#

If startup fails with an error like "network docker_live-video-network was found but has incorrect label", clean up stale networks and restart:

  • source setup.sh --clean-data

  • docker network rm docker_live-video-network live-video-network || true

  • source setup.sh --start

For RTSP test mode, start again with:

  • source setup.sh --start-rtsp-test

References#