42 detailed projects across 3 tracks — 2 per class, Class 6 through 12. Every student builds something real, every single year.
A fully responsive static website displaying school announcements, events, timetable, and teacher info — built entirely with HTML5 and CSS3. Includes a working dark/light theme toggle using CSS custom properties. Students sketch wireframes first on paper, then code from scratch without any framework.
Students create a personal portfolio page showcasing three pieces of pure CSS art — geometric illustrations drawn entirely with div elements, borders, and transforms. No images, no canvas, no JavaScript. A sidebar lists CSS properties learned. This project deepens understanding of the box model, transforms, and keyframe animations in a creative, non-intimidating context.
A 20-question multiple-choice quiz app on any school subject. Features a per-question countdown timer, live score bar, animated result screen with pass/fail fanfare, and a "review wrong answers" mode. Questions are stored in a JSON array the student authors themselves — making the subject personal.
Students build a weather app that fetches real-time data from the OpenWeatherMap API using the Fetch API and async/await. Search any city, see current conditions, a 5-day forecast in cards, and an animated background that changes with weather type (sunny, rainy, cloudy, snowy). First introduction to real-world APIs, JSON parsing, and asynchronous JavaScript patterns.
A colour-coded CLI application that manages student records stored in a CSV file. Menu-driven: add, search, update, delete students. Auto-computes grade letter, percentage, and class rank. Combines file I/O, OOP, and exception handling in one practical tool.
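A minimal sketch of the grade-computation step, with the menu, CSV I/O, and ranking omitted; the percentage bands are illustrative assumptions, not the academy's actual grading scheme.

```python
def percentage(marks):
    """Average of subject marks (each out of 100), rounded to two decimals."""
    return round(sum(marks) / len(marks), 2)

def grade_letter(pct):
    """Map a percentage to a letter grade (illustrative bands)."""
    bands = [(90, "A+"), (80, "A"), (70, "B"), (60, "C"), (40, "D")]
    for cutoff, letter in bands:
        if pct >= cutoff:
            return letter
    return "F"
```

In the full project these helpers run inside the add/update menu actions, so every saved record carries its computed grade.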
Students implement the classic Snake game using Python's Turtle module — fully object-oriented with separate classes for Snake, Food, and Scoreboard. The project cements OOP principles (encapsulation, composition) in a context students are already familiar with. Optional: port to Pygame for smoother rendering and add levels with increasing speed.
A real CMS for the school blog. Teachers log in, create and edit posts with a rich-text editor, upload cover images, and toggle publish status. Students browse, search by tag, and comment. Deployed on PythonAnywhere — every pair gets a live public URL they can share with parents.
Students build a functional student portal: a Django REST Framework API backend paired with a React frontend. Students can view their enrolled courses, submit assignments as file uploads, and see grades. Teachers have a separate dashboard to grade submissions and post announcements. Introduces the pattern of decoupled backend + frontend that powers modern web products.
A production-style REST API for task management — like a simplified Trello backend. Supports user registration, JWT login, and full CRUD on tasks with deadlines and priority tags. Tested with an 18-case Postman collection and deployed live on Render with MongoDB Atlas.
A real-time multi-room chat application used by the actual PREKSHA class for project coordination. Node/Express backend with Socket.io handles room join/leave, message broadcast, typing indicators, and user presence. React frontend shows message history, active users sidebar, and emoji reactions. Room history persisted to MongoDB so messages survive page refresh.
A complete, live-deployed online store. React SPA + Node/Express API + MongoDB Atlas. Ships product catalog with search and filter, Redux cart, order management with email confirmation, and a minimal admin panel. Stripe test-mode checkout included. Both partners push feature branches and merge via pull requests.
Students build their personal developer portfolio using Next.js with static generation and Incremental Static Regeneration, pulling project content from a Contentful headless CMS. Polished Framer Motion page transitions and scroll animations. The portfolio showcases all their PREKSHA projects and is submitted as part of their college/internship applications from Class 11 onward.
A production-ready multi-tenant SaaS starter. Next.js 14 App Router, TypeScript end-to-end, Prisma ORM, NextAuth.js (Google + GitHub OAuth), Stripe subscription gating, GitHub Actions CI/CD, and Playwright E2E tests. Presented as a real product pitch to faculty and invited industry mentors at year end.
Students decompose a monolith into three microservices (Auth, Products, Orders) each in its own Docker container, communicating via REST. An Express API Gateway handles routing, JWT verification, and rate limiting. Docker Compose orchestrates the full stack locally. Each service has its own database schema. Demonstrates industry patterns: service isolation, container orchestration, and distributed tracing basics.
Students simulate a sequentially blinking LED chase circuit in Tinkercad, then build it physically on a breadboard: an NE555 timer IC acts as the clock oscillator feeding a CD4017 decade counter that drives 5 LEDs in sequence. Students hand-draw the schematic, calculate resistor values, and submit a one-page lab report explaining the NE555 clock signal.
Students wire an analogue sound sensor to an Arduino Nano to detect claps, then use the output to cycle through three LED patterns on an 8×8 LED matrix. Double-clap toggles the entire matrix on/off. Triple-clap cycles display mode. This project introduces analogue-to-digital conversion, software threshold detection, and simple pattern storage as 2D arrays in PROGMEM.
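The gesture logic reduces to grouping clap timestamps by gaps, sketched here in Python for clarity; the 500 ms gap is an illustrative assumption, and the actual sketch does this in Arduino C with the matrix patterns in PROGMEM.

```python
MAX_GAP_MS = 500  # claps closer together than this belong to one gesture

def gesture_size(clap_times):
    """Timestamps (ms) of detected claps -> number of claps in the final gesture."""
    count = 0
    for i, t in enumerate(clap_times):
        if i == 0 or t - clap_times[i - 1] > MAX_GAP_MS:
            count = 1  # too long a gap: start a new gesture
        else:
            count += 1
    return count
```

A result of 2 toggles the matrix, 3 cycles the display mode, matching the brief.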
Students code their first Arduino sketch: a proper traffic light cycling Red → Yellow → Green → Yellow with correct timing. A pedestrian push button interrupts the cycle to show a "walk" signal with a buzzer chirp. The key teaching moment is millis()-based non-blocking timing, replacing the bad habit of delay().
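The millis() pattern translates directly into any language: never sleep, just compare elapsed time on every loop pass. A Python sketch of the same idea, with time injected so it can be tested; the phase durations are illustrative, not the project's actual timings.

```python
# Cycle matching the brief: Red -> Yellow -> Green -> Yellow (durations in ms)
PHASES = [("RED", 5000), ("YELLOW", 1000), ("GREEN", 4000), ("YELLOW", 1000)]

class TrafficLight:
    def __init__(self, now_ms=0):
        self.index = 0
        self.phase_start = now_ms

    def update(self, now_ms):
        """Call as often as you like; advances the phase only when it is due."""
        _, duration = PHASES[self.index]
        if now_ms - self.phase_start >= duration:
            self.index = (self.index + 1) % len(PHASES)
            self.phase_start = now_ms
        return PHASES[self.index][0]
```

Because update() never blocks, the loop stays free to poll the pedestrian button on every pass, which is exactly why delay() has to go.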
Students build a two-wheeled robot that autonomously follows a black line on white cardboard using an array of 5 IR sensors. Starts with simple if/else bang-bang control, then students improve it to a proportional controller (P-control) that reduces oscillation. Race day at the end: whose robot completes the PREKSHA track layout fastest?
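The jump from bang-bang to P-control is a few lines of arithmetic, sketched here in Python; sensor weights, gain, and base speed are illustrative tuning assumptions (readings are 1 on the black line, 0 on white).

```python
WEIGHTS = [-2, -1, 0, 1, 2]  # leftmost ... rightmost IR sensor
KP = 40                      # proportional gain, tuned before race day

def line_error(readings):
    """Weighted average position of the line under the 5-sensor bar."""
    active = [w for w, r in zip(WEIGHTS, readings) if r]
    return sum(active) / len(active) if active else 0

def motor_speeds(readings, base=120):
    """Steer toward the line by speeding up the far wheel in proportion to error."""
    correction = KP * line_error(readings)
    return base + correction, base - correction  # (left, right)
```

With the line centred the wheels match; the further the line drifts, the harder the robot steers back, which is what damps the bang-bang oscillation.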
An Arduino parking lot model: two HC-SR04 ultrasonic sensors detect car entry and exit, a servo motor controls a barrier gate, and a 16×2 I2C LCD shows available slot count. Students build the physical model from cardboard with a toy car for the live demo. A manual override button lets an attendant bypass the sensors.
Students build an automatic plant watering system using a capacitive soil moisture sensor (longer-lasting than resistive types, which corrode in soil) and a 5V mini submersible pump controlled via a relay module. The system checks soil moisture every 10 minutes; if below threshold it waters for 3 seconds then checks again. An OLED shows real-time moisture percentage and last-watered timestamp. Demonstrated with a real plant on demo day.
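The check-water-recheck loop, sketched in Python with the sensor and pump injected as callables so the control logic is testable without hardware; the 35% threshold and burst cap are illustrative assumptions.

```python
THRESHOLD = 35  # percent moisture below which we water (assumed)

def tend(read_moisture, run_pump, max_bursts=5):
    """Water in short bursts until the soil reads above threshold."""
    for _ in range(max_bursts):
        if read_moisture() >= THRESHOLD:
            return "ok"
        run_pump(seconds=3)
    return "check pump"  # still dry after max_bursts: pump or wiring fault
```

Capping the number of bursts is the safety detail worth teaching: a dry sensor plus an unbounded loop would flood the pot.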
An ESP32 reads DHT22 (temperature + humidity), BMP280 (air pressure), and MQ-135 (air quality) every 30 seconds, publishing JSON to an MQTT broker. A Node-RED flow on the school RPi subscribes and renders live gauges, line charts, and sends a Telegram alert if temperature exceeds 38 °C. Dashboard accessible from any browser on the LAN.
Students build a real GPS tracker using an ESP32 and NEO-6M GPS module. The device reads NMEA sentences, parses latitude/longitude/speed, and publishes to a Firebase Realtime Database via Wi-Fi every 15 seconds. A simple HTML dashboard with the Google Maps JavaScript API displays the live position as a moving marker with a breadcrumb trail. Students attach it to a bag and watch their position update as they walk the school campus.
A full smart-home simulation: ESP32 controls an LED strip (PWM brightness), a DC fan via L298N motor driver, and reads DHT22. All devices are controllable from a custom HTML/JS dashboard via the ThingSpeak API. Dashboard shows live temperature charts, lets the user set an auto-fan threshold, and includes a time-based light scheduler.
Students build a Python-based drone controller simulator using Pygame for visualisation and the MSP (MultiWii Serial Protocol) to send commands to a Betaflight-flashed flight controller board. Phase 1 simulates a top-down 2D drone responding to keyboard commands. Phase 2 connects to an actual flight controller via USB serial and tests roll/pitch/yaw responses with motors disconnected (safe bench testing).
Raspberry Pi with Pi Camera monitors a room. OpenCV detects motion via MOG2 background subtraction; a Haar cascade then confirms a human face before triggering. On detection a Telegram bot sends an alert with a timestamped snapshot. A lightweight Flask app serves a live MJPEG stream viewable from any browser on the LAN.
Students build a 3-DOF robot arm (shoulder, elbow, wrist) using 3D-printed links and MG996R servos, then write a Python inverse kinematics solver that calculates joint angles from a desired (x, y, z) end-effector position. A Tkinter GUI lets users click a target point; the arm moves there smoothly via interpolated servo commands sent over serial. Teaches the bridging of mathematics and physical hardware.
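The mathematical core is the two-link planar solve; a Python sketch of that (x, y) step, omitting the base-rotation joint that adds the third dimension, with illustrative link lengths.

```python
import math

L1, L2 = 10.0, 8.0  # shoulder and elbow link lengths (illustrative, in cm)

def ik_2link(x, y):
    """Joint angles (radians) putting the end effector at (x, y), elbow bent one way."""
    d2 = x * x + y * y
    cos_elbow = (d2 - L1**2 - L2**2) / (2 * L1 * L2)  # law of cosines
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(
        L2 * math.sin(elbow), L1 + L2 * math.cos(elbow))
    return shoulder, elbow
```

Students verify the solver by running forward kinematics on the returned angles and checking the end effector lands back on the clicked target.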
The Class 12 IoT grand capstone: a robot navigates a 5×5 grid, identifies coloured packages via camera, and delivers to the matching shelf slot. RPi handles high-level navigation and computer vision; Arduino manages low-level motor and servo control. A Next.js cloud dashboard shows live robot position, task queue, and delivery history via MQTT.
Three robots work together to cover a grid area, divide pick-and-place tasks without collision, and communicate task completion over MQTT. A central Python swarm controller assigns tasks using a greedy nearest-robot algorithm. Webots simulation validates the algorithm before physical deployment. Each robot reports position and status; the cloud dashboard shows all three in real time. Demonstrates fleet management, task allocation, and collision avoidance.
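The greedy nearest-robot allocator is small enough to show whole; this sketch assumes grid coordinates and Manhattan distance, matching the grid world.

```python
def assign_tasks(robots, tasks):
    """robots: {name: (x, y)}, tasks: [(x, y), ...] -> {name: assigned task}."""
    free = dict(robots)
    plan = {}
    for task in tasks:
        if not free:
            break  # more tasks than robots: leftovers wait for the next round
        # Pick whichever still-free robot is closest to this task.
        name = min(free, key=lambda n: abs(free[n][0] - task[0])
                                     + abs(free[n][1] - task[1]))
        plan[name] = task
        del free[name]
    return plan
```

Greedy assignment is not globally optimal, which makes a good discussion point before the Webots validation runs.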
Students train their very first machine learning model with Google Teachable Machine — capturing 50+ webcam images per hand sign (Rock, Paper, Scissors, Background) and training an image classifier. The exported TF.js model is embedded in a web page that plays a live game, showing class prediction and confidence score in real time.
Students build a rule-based chatbot in Scratch that answers frequently asked questions about PREKSHA Academy — admission fees, tracks, timings, and contact info. Sprites respond to typed keywords using if/else chains and nested conditions. Students learn this is how early AI assistants worked before machine learning: explicit rules written by a human. They then reflect on what limitations this creates versus a trained model.
Students design a Google Form survey, collect 30+ responses from classmates, clean the data in Google Sheets, and build an analysis dashboard with pivot tables and four chart types. The final deliverable is a five-slide Canva "data story" presenting one headline insight and a recommendation to the class.
Students receive a deliberately messy 500-row retail sales CSV file (duplicate rows, inconsistent product names, missing values, mixed date formats) and must clean, model, and visualise it into a professional interactive Excel/Sheets dashboard. Slicers allow filtering by Region, Product Category, and Month. This project builds the "data wrangling" muscle that underpins all future data science work.
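Two of the cleaning steps from the brief, deduplication and product-name normalisation, sketched with only the standard library; the alias table and sample rows are illustrative assumptions, and students do the equivalent with spreadsheet formulas.

```python
ALIASES = {"t-shirt": "T-Shirt", "tshirt": "T-Shirt", "t shirt": "T-Shirt"}

def normalise_product(name):
    """Collapse inconsistent spellings of the same product to one canonical name."""
    key = name.strip().lower()
    return ALIASES.get(key, name.strip().title())

def clean_rows(rows):
    """Fix product names, then drop exact duplicates (order-preserving)."""
    seen, out = set(), []
    for row in rows:
        fixed = (normalise_product(row[0]), *row[1:])
        if fixed not in seen:
            seen.add(fixed)
            out.append(fixed)
    return out
```

Normalising before deduplicating matters: "tshirt" and "T-Shirt" rows only collapse into one once they share a canonical name.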
Students load a real IPL dataset (2008–2023) into a Google Colab notebook and perform guided exploratory data analysis. They clean the data, answer 8 specific analytical questions, create 8 visualisations, and export a one-page PDF summary. Subjects include top run-scorers, winning teams by venue, and toss-effect analysis.
Students analyse the sentiment of real tweets on a topic they choose (a Bollywood film, a cricket match, a tech launch) using VADER (Valence Aware Dictionary and sEntiment Reasoner) — a lexicon-based sentiment analyser designed for social media. They pre-process tweets (remove stopwords, URLs, mentions), compute sentiment scores, and create visualisations showing sentiment distribution, time-series sentiment shifts, and top positive/negative words.
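A toy scorer illustrating the lexicon-based idea at VADER's core: sum word valences and flip the sign after a negation. The tiny lexicon and single-word negation rule here are illustrative simplifications; the project uses the real VADER library.

```python
LEXICON = {"good": 1.9, "great": 3.1, "bad": -2.5, "awful": -3.3, "love": 3.2}
NEGATIONS = {"not", "no", "never"}

def score(text):
    """Sum valences of known words, negating the word right after a negation."""
    total, negate = 0.0, False
    for word in text.lower().split():
        if word in NEGATIONS:
            negate = True
            continue
        if word in LEXICON:
            total += -LEXICON[word] if negate else LEXICON[word]
        negate = False
    return total
```

Seeing how crude the rule is motivates VADER's extra machinery: intensifiers, punctuation emphasis, and emoji handling.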
Using the Student Performance dataset (UCI ML Repo), students build and compare Linear Regression and Random Forest models to predict final exam scores from study hours, attendance, and past grades. Models are evaluated with MSE and R², feature importance is visualised, and the best model is served as a live Gradio web app anyone can use.
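The simplest model in the comparison can be fitted by hand: one-feature least squares (say, study hours against final score), sketched here in pure Python before students reach for scikit-learn's multi-feature fitters.

```python
def fit_line(xs, ys):
    """Closed-form least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
          / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def r_squared(xs, ys, slope, intercept):
    """1 minus residual variance over total variance: the R² students report."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot
```

Computing R² once by hand makes the metric concrete before it reappears as a one-liner in the scikit-learn evaluation.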
Students build a Convolutional Neural Network from scratch using Keras (no transfer learning) to classify 36 fruit and vegetable categories from the Fruits-360 dataset. They manually design the architecture — experimenting with number of Conv layers, filter sizes, dropout rates — and document each experiment in a structured "experiment log" showing how each change affected validation accuracy. Teaches model architecture intuition before transfer learning shortcuts are introduced in Class 11.
A no-code automation built in N8N (self-hosted on the school Raspberry Pi) that reads a Google Sheets deadline tracker every morning at 08:00, finds assignments due within 48 hours, and sends personalised WhatsApp or email reminders — then marks them "Reminded" in the sheet to prevent duplicates. Real students use this from Day 1.
Students build an automation that keeps their GitHub profile README dynamically up-to-date: fetches their latest public repos, star counts, and language breakdown via the GitHub GraphQL API, generates a rich markdown file with charts and stats using Python, and commits + pushes the updated README automatically via a GitHub Action that runs every Sunday. Teaches API authentication, GraphQL queries, and automated Git workflows.
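One step of the pipeline, sketched offline: folding per-repo language byte counts (the shape the GraphQL API returns them in) into the percentages the README displays. The sample data is illustrative; the real workflow fetches it live with an authenticated query.

```python
def language_breakdown(repos):
    """repos: [{"languages": {name: bytes, ...}}, ...] -> {name: percent}."""
    totals = {}
    for repo in repos:
        for lang, size in repo["languages"].items():
            totals[lang] = totals.get(lang, 0) + size
    grand = sum(totals.values())
    return {lang: round(100 * size / grand, 1) for lang, size in totals.items()}
```

The Sunday GitHub Action runs this aggregation, renders the markdown, and commits only if the output changed.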
Students fine-tune MobileNetV2 on the PlantVillage dataset (54,000 leaf images, 38 disease classes) and deploy it as a Streamlit app where farmers upload a leaf photo and receive an instant disease diagnosis, confidence score, and treatment recommendation. Model is also exported to TFLite for potential on-device deployment.
Students run YOLOv8 inference on live webcam feed to detect COCO objects in real time — then fine-tune it on a custom 5-class dataset of school lab equipment (Arduino, breadboard, multimeter, RPi, soldering iron) using 200 images annotated in Roboflow. Final model detects lab equipment in real time and triggers an inventory alert when an item is removed from the bench. Deployed as a Streamlit app using CUDA acceleration on the school Master PC.
The Class 12 AI grand capstone: a full RAG chatbot grounded in PREKSHA curriculum PDFs and textbook chapters. LangChain RetrievalQA + GPT-4o + Pinecone vector database, a FastAPI streaming backend using Server-Sent Events, and a polished Next.js chat UI with source citations. Evaluated with RAGAS metrics and pitched as a real school product to academy management.
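The retrieval step at the heart of RAG, sketched with toy vectors: rank stored chunks by cosine similarity to the embedded query and hand the top-k to the LLM as context. Pinecone and LangChain replace this brute-force version in the real project.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, chunks, k=2):
    """chunks: [(text, vector), ...] -> the k texts most similar to the query."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:k]]
```

Keeping the retrieved texts alongside the answer is also what powers the UI's source citations.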
Students fine-tune a DistilBERT transformer model on a multi-label toxicity classification dataset (hate speech, profanity, threats, spam — 4 classes) and deploy it as a production-ready FastAPI microservice. A React moderation dashboard lets a human reviewer see queued content, model predictions with confidence scores, approve or override, and the human decisions are fed back to retrain the model weekly — closing the MLOps feedback loop.
Two projects per class, every year — each pair of projects builds on the previous year's skills and introduces one major new concept.
The proficiency each track builds by Class 12, measured by the complexity of work produced across all 42 projects.