Tech With Htunn

Revolutionizing IoT Monitoring: My Personal Journey with LLM-Powered Observability


Last updated 13 days ago

TL;DR: I've been working on a cutting-edge IoT monitoring platform as a personal development project, combining Docker containers, FastAPI, MongoDB, Grafana, and a Llama LLM to enable natural language queries of IoT sensor data. Now I can simply ask "What's the temperature trend in the server room?" instead of digging through dashboards. This blog post shares my learning journey and the technical details of what I built.

The Problem: IoT Data Overload

As part of my personal learning journey in IoT and AI integration, I identified a common pain point: sensors everywhere, data flooding in, but making sense of it all requires specialized dashboard knowledge or complex queries. It's 2025, and many of us are still copy-pasting metrics into reports or screenshots into Slack. There had to be a better way.

That's exactly what motivated me to start developing this IoT Observability platform as a personal development project. I wanted to challenge myself to create a system where I could simply ask questions in plain English and get meaningful answers about sensor data. This project has been an incredible learning experience that has helped me develop skills across multiple technologies.

The Stack: Modern, Containerized, LLM-Powered

For my personal development project, I decided to challenge myself with a fully containerized architecture using Docker and Docker Compose. This approach helped me learn about container orchestration while building something practical. Here's what's under the hood of my learning project:

The core components:

  • IoT Simulator: Python service generating realistic temperature and humidity data (for testing or demo)

  • Metrics Service: FastAPI-powered REST API for data collection

  • MongoDB: NoSQL database optimized for time-series metrics storage

  • LLM Service: Llama-powered natural language processing - the heart of the system

  • Grafana: For traditional dashboard visualization needs

The deployment is handled through Docker Compose, making the entire setup portable and reproducible.
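Here's a minimal sketch of what that compose file could look like. The service names, build paths, and images are my assumptions for illustration; only the ports match the endpoints listed later in this post:

```yaml
# Sketch of docker-compose.yml; the real file has more configuration.
services:
  iot_simulator:
    build: ./simulator
    environment:
      - SIMULATION_INTERVAL=5
  metrics_service:
    build: ./metrics
    ports: ["8000:8000"]
    depends_on: [mongodb]
  mongodb:
    image: mongo:7
  llm_service:
    build: ./llm
    ports: ["8080:8080"]
    depends_on: [mongodb]
  grafana:
    image: grafana/grafana
    ports: ["3000:3000"]
```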

How Data Flows Through the System

Let's walk through how data moves through my system. This is where things get interesting, and where I really deepened my understanding of system architecture: the simulator posts readings to the Metrics Service's REST API, the Metrics Service persists them in MongoDB, and from there Grafana renders dashboards while the LLM Service pulls context to answer natural language questions.

The IoT Simulator: Testing Made Easy

During my self-learning development process, I realized I needed a reliable source of test data. As I didn't have actual IoT devices to work with at home, I created a Python-based IoT simulator that generates realistic temperature and humidity values for different rooms. What's cool is how I made it mimic real-world patterns - the kitchen runs hotter than the bedroom, bathroom humidity spikes periodically, and the outdoor sensor shows wider temperature swings. This was a great exercise in Python development for me.

Here's a sample of the data structure I designed:

{
  "id": "unique-uuid",
  "device_id": "device_001",
  "device_name": "Living Room Sensor",
  "location": "Living Room",
  "type": "temperature",
  "value": 23.4,
  "unit": "C",
  "timestamp": "2025-06-06T14:30:15.123456"
}

I built robust retry mechanisms and configurable intervals - because even in a simulated environment, I wanted to follow good IoT data collection practices!
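To make the idea concrete, here's a simplified sketch of how a simulator like mine can generate a reading in the structure above and retry transient send failures with backoff. The function names and temperature ranges are my illustration, not the actual simulator's internals:

```python
import random
import time
import uuid
from datetime import datetime, timezone

# Illustrative per-room ranges; the real simulator models more rooms
# and humidity patterns as well.
TEMP_RANGES = {"Living Room": (20.0, 24.0), "Kitchen": (22.0, 27.0)}

def generate_reading(device_id: str, device_name: str, location: str) -> dict:
    """Build one reading matching the JSON structure shown above."""
    low, high = TEMP_RANGES[location]
    return {
        "id": str(uuid.uuid4()),
        "device_id": device_id,
        "device_name": device_name,
        "location": location,
        "type": "temperature",
        "value": round(random.uniform(low, high), 1),
        "unit": "C",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

def send_with_retry(send, reading: dict, retries: int = 3, backoff: float = 1.0):
    """Retry transient failures with exponential backoff between attempts."""
    for attempt in range(retries):
        try:
            return send(reading)
        except ConnectionError:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * 2 ** attempt)
```

In the real service, `send` would be an HTTP POST to the Metrics Service; injecting it as a parameter keeps the retry logic easy to test.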

The Secret Sauce: LLM-Powered Intelligence

This is where my personal project truly shines and where I spent most of my learning time. Traditional monitoring systems make you learn their query language. I wanted to flip the script as part of my exploration of LLM capabilities - making my system learn my language instead.

I had to learn a lot about prompt engineering and context building to make this work. The LLM Service I developed processes natural language queries by:

  1. Receiving the query text ("What's the recent humidity trend in the kitchen?")

  2. Building relevant context from the MongoDB metrics collection

  3. Analyzing the intent (time-series analysis of humidity for kitchen location)

  4. Processing with the Llama model

  5. Returning human-readable insights

As I got more comfortable with the technology, I challenged myself further. For advanced queries, I implemented an agentic approach that enables more complex analysis, which was a fascinating learning experience.
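The core idea of the agentic approach is that the model proposes a plan of analysis tools, and the service executes them over the data. Here's a toy version with two hypothetical tools; the real loop feeds each tool's result back to the model for further reasoning:

```python
from statistics import mean

# Hypothetical analysis tools; the real service exposes richer ones.
def tool_average(values):
    return round(mean(values), 2)

def tool_trend(values):
    # Positive means rising, negative means falling (last vs first reading).
    return round(values[-1] - values[0], 2)

TOOLS = {"average": tool_average, "trend": tool_trend}

def run_agent(plan, values):
    """Execute a model-proposed plan (a list of tool names) over the data."""
    return {step: TOOLS[step](values) for step in plan}
```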

The beauty of this approach is that I can now extract insights without specialized knowledge. As an individual developing my skills, this has been incredibly rewarding - seeing how my personal project could potentially help make data accessible to non-technical users. It's been a great exercise in building practical AI applications.

Performance Considerations: Right-Sizing the LLM

One of my biggest learning challenges was running a large language model alongside IoT services on my personal development machine. This part of the project taught me a lot about optimization. I learned to optimize by:

  1. Using TinyLlama in quantized Q4_K_M format (reduced model size by ~75%)

  2. Configuring appropriate context window (2048 tokens)

  3. Implementing GPU acceleration when available (with fallbacks)

  4. Fine-tuning temperature and token settings for IoT-specific queries

I was thrilled to discover that even on my modest personal hardware (18GB RAM), the system performs well with response times typically under 2 seconds for basic queries. This was a great lesson in making AI technology accessible without enterprise-grade hardware.

How to Try My Project

If you're also on a learning journey and want to try what I've built, the setup is straightforward:

# Clone the repo
git clone <repository-url>
cd agentic-iot-observability

# Fire up the stack
docker compose up -d

# Test a natural language query
curl -X POST http://localhost:8080/query \
  -H "Content-Type: application/json" \
  -d '{"query": "What is the current temperature in the living room?"}'

The services are available at:

  • Grafana: http://localhost:3000 (admin/admin)

  • Metrics API: http://localhost:8000

  • LLM API: http://localhost:8080

Tips from My Learning Journey

Through my personal development journey with this project, I've gathered some practical tips I'd like to share:

  1. Start with modest LLM parameters: I learned to begin with lower context windows and increase as needed to avoid overwhelming my personal machine.

  2. Index your MongoDB collections properly: One of my early mistakes was poor indexing! I quickly discovered that timestamp-based queries are common, so ensuring they're optimized is critical.

  3. Consider rate limiting: Even in my personal development environment, I added rate limiting to the LLM service to prevent accidentally overwhelming my system with queries.

  4. Implement caching: I found that common queries can be cached to improve response times, which was a great learning experience in optimization.

  5. Customize for your context: While my simulator works for my learning purposes, I recommend adapting the data generation to match whatever use case you're exploring.
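For tip 4, the simplest version I found was time-bucketed memoization: identical queries within a short TTL window reuse the previous answer instead of re-running the model. This sketch uses a stand-in `run_llm` function and a hypothetical 30-second TTL:

```python
import time
from functools import lru_cache

CALLS = {"llm": 0}
TTL_SECONDS = 30  # assumption: sensor answers may be up to 30s stale

def run_llm(query: str) -> str:
    # Stand-in for the real Llama call; counts invocations for the demo.
    CALLS["llm"] += 1
    return f"answer to: {query}"

@lru_cache(maxsize=128)
def _cached(query: str, bucket: int) -> str:
    return run_llm(query)

def answer(query: str) -> str:
    # Queries in the same TTL bucket share a cache entry; when the
    # bucket rolls over, the model is consulted again.
    return _cached(query, int(time.time() // TTL_SECONDS))
```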

Extending the Platform for My Learning

One aspect I'm particularly proud of is how the modular design makes it easy for me to keep learning by extending the system:

# Adding a new sensor type? Just update the simulator:
iot_simulator:
  environment:
    - SIMULATION_INTERVAL=5
    - ENABLE_CO2_SENSORS=true  # New environment variable

For those wanting to use a different LLM, modify the environment variables:

llm_service:
  environment:
    - MODEL_PATH=/app/models/llama-2-7b-chat.Q4_K_M.gguf  # Different model
    - TEMPERATURE=0.5
    - MAX_TOKENS=2048

My Road Ahead

As part of my continuing personal development journey, I have several features I'd like to add to grow my skills further:

  • Statistical anomaly detection with configurable thresholds (to learn more about data science)

  • Predictive maintenance capabilities (to develop my ML skills)

  • Integration with automation systems for closed-loop control (to enhance my IoT knowledge)

  • Mobile app for on-the-go monitoring and alerts (to expand into mobile development)

Conclusion: My Personal Learning Journey with IoT Data

This personal development project has taught me a great deal about making IoT data accessible. The power of IoT has always been in its data, but that power stays locked away if you need specialized knowledge to reach it. By combining modern containerization, time-series storage, and LLM technology, I've built a platform that lets me query my IoT data without writing specialized queries.

For me, the real game-changer wasn't any single technology, but learning how they work together to create a seamless experience. This project has helped me develop skills across multiple domains and created something practical I can actually use.

Feel free to check out the project on GitHub and contribute to its development!

I've made this project open source under the MIT license, partly to document my learning journey and partly to invite others to learn alongside me. If you're on a similar personal development journey, I'd love to see your pull requests and feature suggestions!


๐Ÿ‘ฉโ€๐Ÿ’ป About the Author

I'm a technology enthusiast with a passion for AI and machine learning systems. My background spans software development, automation, and cloud architecture, with particular expertise in Python development, DevSecOps, and AWS and Azure cloud services. I enjoy building systems that bridge the gap between cutting-edge AI research and practical applications, especially those that leverage both local and cloud-based models. When I'm not coding, you can find me exploring the latest advancements in AI/ML or contributing to open-source projects.
