Tick Slayer 3000

Full Project Spec

Complete technical specification — components, layers, operations, and design decisions

Type: Architecture

Concept

An autonomous ground rover that drags a cloth to collect ticks, captures images of that cloth, and maps tick density over space and time.

Vehicle Layer — Traxxas TRX-4

  • Mobility platform for uneven terrain, grass, roots
  • Handles towing the drag cloth
  • Includes ESC (motor control) and steering servo

Compute Layer — Raspberry Pi 4

  • The brain: runs navigation, capture, return-to-home, and streaming
  • MicroSD stores OS, captured images, and GPS logs
  • Powered by an independent USB battery pack, isolating the Pi from electrical noise and brownouts caused by motor current spikes

Control Layer — PCA9685 PWM Driver

  • Converts I2C commands from the Pi into the PWM pulses the RC hardware expects
  • Controls throttle (ESC) and steering servo
  • Replaces the handheld controller
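The conversion this layer performs can be sketched in a few lines: RC gear expects roughly 1000-2000 µs pulses at 50 Hz, and the PCA9685 divides each frame into 4096 ticks. The neutral point and pulse range below are standard RC conventions, not values verified against this ESC:

```python
# Pulse-width math for driving an ESC/servo through a PCA9685.
# Assumption: 50 Hz frames and a standard 1000-2000 µs RC pulse range.

FRAME_US = 20_000   # one 50 Hz PWM frame, in microseconds
RESOLUTION = 4096   # the PCA9685 is a 12-bit counter

def us_to_ticks(pulse_us: float) -> int:
    """Convert an RC pulse width in microseconds to PCA9685 ticks."""
    return round(pulse_us * RESOLUTION / FRAME_US)

def throttle_to_us(throttle: float) -> float:
    """Map throttle in [-1, 1] to a 1000-2000 µs pulse (1500 = neutral)."""
    throttle = max(-1.0, min(1.0, throttle))
    return 1500.0 + 500.0 * throttle
```

Steering maps the same way, with servo angle in place of throttle.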

Navigation Layer — u-blox GPS Module

  • Provides latitude/longitude for image tagging, return-to-home, and tick density mapping
  • GPS-only for MVP (~2-3 m accuracy)
  • Path recording: user walks/drives the route first, rover replays it autonomously
  • Boundary defined by the recorded path, no separate geofence
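Path replay reduces to a distance check against each recorded waypoint. A minimal sketch, assuming a plain lat/lon waypoint list and a "reached" tolerance on the order of the GPS accuracy:

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def reached(lat, lon, wlat, wlon, tol_m=3.0):
    """True once the rover is within GPS accuracy of a waypoint, so the
    controller advances to the next one instead of chasing GPS noise."""
    return haversine_m(lat, lon, wlat, wlon) <= tol_m
```

The replay loop would walk the recorded list, steering toward the first waypoint for which `reached` is still false.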

Sensing Layer

  • ADS1115 ADC — reads analog signals (the Pi has no analog inputs)
  • Voltage divider — scales battery voltage safely
  • Together: monitor crawler battery and trigger return-to-home
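The divider/ADC arithmetic is straightforward; the resistor values and the 2S-LiPo low-voltage threshold below are example assumptions, not measured values from this build:

```python
# Battery monitoring math. Assumptions: a 2S LiPo pack (~8.4 V full)
# scaled into the ADS1115's input range by a resistive divider.

R_TOP = 10_000       # ohms, battery side (example value)
R_BOTTOM = 4_700     # ohms, ADC side (example value)
LOW_BATTERY_V = 6.8  # ~3.4 V/cell on 2S; tune for the actual pack

def battery_voltage(adc_voltage: float) -> float:
    """Undo the divider: V_batt = V_adc * (R_top + R_bottom) / R_bottom."""
    return adc_voltage * (R_TOP + R_BOTTOM) / R_BOTTOM

def should_return_home(adc_voltage: float) -> bool:
    """Trigger return-to-home once the pack sags to the threshold."""
    return battery_voltage(adc_voltage) <= LOW_BATTERY_V
```

With these values a full 8.4 V pack presents about 2.69 V at the ADC, safely inside the ADS1115's default input range.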

Vision Layer

  • Raspberry Pi Camera Module 3 (Standard) — captures images of the drag cloth, with autofocus
  • Camera cable — flexibility in mounting position
  • Camera mount + ball head — precise angle adjustment for consistent framing

Sampling Layer

  • Drag bar (PVC/dowel) — holds cloth behind rover
  • White flannel cloth — collects ticks, treated with permethrin before each use cycle
  • Optional dark backing — improves contrast for detection
  • Ticks incapacitated/killed on contact (nymphs <1 min, adults 1-3 hrs)

Physical + Mounting Layer

  • Waterproof enclosure — protects Pi, ADC, wiring
  • Mount plate — base for enclosure, battery, components
  • Zip ties + Velcro — secure mounting

Wiring Layer

  • Servo extension cables, connector taps, terminal block, ground wire

Optional Enhancements

  • Buzzer — alerts on low battery or mission complete
  • LED strip — consistent lighting for image capture

How the System Operates

1. Start

Rover placed at "home", GPS position saved.

2. Patrol

Rover drives along yard edge, drag cloth collects ticks.

3. Sampling Loop

move → stop → capture image → move → repeat

Each capture includes: image, timestamp, GPS location.
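Each capture could be stored as a small JSON record alongside the image on the SD card; the field names here are illustrative, not a fixed schema:

```python
import time

def capture_record(image_path, lat, lon, fix_quality=None):
    """Bundle one sampling-loop capture: image + timestamp + GPS.
    fix_quality is a placeholder for whatever the u-blox module reports."""
    return {
        "image": image_path,
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "lat": lat,
        "lon": lon,
        "fix_quality": fix_quality,
    }
```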

4. Detection (Offline or Later)

Process images: count ticks, filter debris, build dataset.
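Because detection is offline, even a crude first pass is useful: threshold dark pixels against the white cloth and count connected regions, filtering by size to drop single-pixel noise and large debris. A dependency-free sketch (real captures may well need something smarter):

```python
def count_dark_blobs(gray, threshold=100, min_px=2, max_px=200):
    """Count connected dark regions in a grayscale image, given as a
    list of rows of 0-255 values. Size bounds filter noise and debris."""
    h, w = len(gray), len(gray[0])
    seen = [[False] * w for _ in range(h)]
    blobs = 0
    for y in range(h):
        for x in range(w):
            if seen[y][x] or gray[y][x] >= threshold:
                continue
            # flood-fill this dark region to measure its size
            stack, size = [(y, x)], 0
            seen[y][x] = True
            while stack:
                cy, cx = stack.pop()
                size += 1
                for ny, nx in ((cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)):
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                            and gray[ny][nx] < threshold:
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            if min_px <= size <= max_px:
                blobs += 1
    return blobs
```

The thresholds are guesses to calibrate once real cloth images exist; the optional dark backing would flip the contrast assumption.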

5. Mapping

Generate tick density heatmap and time-based trends.
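The heatmap reduces to binning per-capture tick counts into lat/lon cells. A sketch assuming records of `(lat, lon, tick_count)`; the cell size is a tunable assumption:

```python
from collections import Counter

def density_grid(records, cell_deg=0.0001):
    """Bin tick counts into lat/lon cells (~11 m of latitude at the
    default cell size) for a simple heatmap."""
    grid = Counter()
    for lat, lon, count in records:
        cell = (round(lat / cell_deg), round(lon / cell_deg))
        grid[cell] += count
    return grid
```

Time-based trends fall out of the same idea with the capture timestamp added as a binning key.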

6. Return-to-Home

When battery low: rover navigates back and alerts you.


Design Decisions

Software

  • Pi OS: Standard Raspberry Pi OS
  • Language: Python
  • Data upload: WiFi preferred; offline queue on SD card, syncs when back in range
  • Central app: This Next.js/Convex project — parts, build docs, test runs, data logging, vision/ML, public how-to guide

Vision / Detection

  • TBD — depends on image quality from real captures
  • May start with manual labeling, possibly train ML later

Hardware

  • Drag bar mounted slightly raised, with the cloth trailing as far behind the rover as possible
  • No obstacle avoidance for MVP — yard assumed mostly clear
  • Run time unknown; smart return-to-home on low battery

Operations

  • Daily use: set it off each morning, runs as long as battery allows
  • Won't run in rain; handles morning dew (~9am starts)
  • App should warn about rain/weather before deploying

Data Pipeline

Rover (Pi) → WiFi upload → Convex backend → Next.js dashboard
                ↓ (if no WiFi)
         Local queue on SD → sync when reconnected
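The offline queue can be as simple as one JSON file per record, deleted only after a successful upload. `upload` below stands in for whatever client pushes to the Convex backend (hypothetical here):

```python
import json
import os

def sync_queue(queue_dir, upload):
    """Upload every queued record in order; delete each file only after
    its upload succeeds, so a dropped WiFi link loses nothing."""
    synced = 0
    for name in sorted(os.listdir(queue_dir)):
        if not name.endswith(".json"):
            continue
        path = os.path.join(queue_dir, name)
        with open(path) as f:
            record = json.load(f)
        try:
            upload(record)
        except OSError:
            break  # assume WiFi dropped; retry the rest next time
        os.remove(path)
        synced += 1
    return synced
```

Run on a timer or on network-up events; deleting only after success makes the sync safe to interrupt at any point.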

Open Questions

  • Exact drag bar attachment to TRX-4
  • Battery run time (determines mission length)
  • Image quality/resolution needed to resolve ticks (~3mm)
  • On-device vs offline detection
  • Cloth maintenance strategy mid-run