Introducing Route Runners: Helping Civilians Escape in Record Time

April 21, 2025

Route Runner

[Demo video: scanning an area and viewing the generated AR model]

Route Runner aims to help first responders alert citizens within a nearby radius of a hazard to evacuate, using a 3D scan of their environment, generated by a custom-built drone, that contains an escape plan. We use technology to take the manual work out of the loop: a first responder only has to input a prompt to a chatbot, which then automatically dispatches a drone to collect environmental data over a specified region and sends the result to affected citizens. This saves time and lives, since citizens no longer have to comb through news and social media to figure out which areas are affected.
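
As a rough sketch of that first step, the snippet below turns a first responder's free-form prompt into a structured dispatch request using the openai Node client. The model name, JSON shape, and DispatchRequest type are assumptions for illustration, not the project's actual schema.

```typescript
import OpenAI from "openai";

// Hypothetical shape of a dispatch request; the real schema may differ.
interface DispatchRequest {
  location: string; // e.g. "Oklahoma City, OK" or "current location"
  radiusMeters: number;
}

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Turn a free-form first-responder prompt into a structured dispatch request.
async function parsePrompt(prompt: string): Promise<DispatchRequest> {
  const res = await client.chat.completions.create({
    model: "gpt-4o-mini", // assumed model; the post just says "ChatGPT"
    response_format: { type: "json_object" },
    messages: [
      {
        role: "system",
        content:
          'Extract {"location": string, "radiusMeters": number} from the user prompt. Respond with JSON only.',
      },
      { role: "user", content: prompt },
    ],
  });
  return JSON.parse(res.choices[0].message.content ?? "{}") as DispatchRequest;
}

// Example: parsePrompt("Scan my current location with a radius of 15 meters")
// -> { location: "current location", radiusMeters: 15 }
```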

Custom Drone: The Runner

Route Runner is a React/Next.js web app that communicates with a custom-made drone (also built during the competition) through a Raspberry Pi.
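
In practice, that hand-off could be as simple as an HTTP POST from the web app to a small server running on the Pi. The host, endpoint, and payload shape below are a hedged sketch, not the project's documented protocol.

```typescript
// Minimal sketch of the web app handing a flight plan to the Raspberry Pi.
// The Pi's address, endpoint, and payload shape are hypothetical.
interface Waypoint {
  lat: number;
  lon: number;
  altMeters: number;
}

async function sendFlightPlan(piHost: string, waypoints: Waypoint[]): Promise<void> {
  const res = await fetch(`http://${piHost}/flight-plan`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ waypoints }),
  });
  if (!res.ok) {
    throw new Error(`Drone rejected flight plan: ${res.status}`);
  }
}
```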

[Photo: the custom-built quadcopter]

We built a custom quadcopter on an F450 frame. For the flight controller and ESCs, we used the SpeedyBee F405 V3 50A stack with Flash Hobby D4215 650KV brushless motors. Building the quadcopter took about two hours; configuring it for the different missions ran through the night. We worked through a range of flight-controller firmware, starting with Betaflight, moving to ArduPilot, and finally settling on iNAV.

Web App: The Route

The web app consists of a prompt the user can send to the drone. This prompt can be something like “Scan my current location with a radius of 15 meters” or “Give me hazard data for Oklahoma City, OK”. Once the prompt has been submitted, the app uses ChatGPT along with Dijkstra’s algorithm to calculate and generate an iNAV flight path that is automatically sent to the drone. This ensures the flight path is quick, efficient, and covers the most area. The generated flight path is sent to a Raspberry Pi attached to the drone, which takes off as soon as the flight-path data arrives. The drone then streams a live feed that first responders can watch and take notes from.
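
As a minimal sketch of the routing step, Dijkstra's algorithm over a weighted graph of waypoints might look like this; how the project actually builds its waypoint graph isn't documented, so the Graph type here is an assumption.

```typescript
// Dijkstra's algorithm over a weighted graph of waypoint IDs.
// Edge weights could be distances between GPS waypoints.
type Graph = Map<string, Array<{ to: string; weight: number }>>;

function dijkstra(graph: Graph, start: string, goal: string): string[] {
  const dist = new Map<string, number>();
  const prev = new Map<string, string>();
  const unvisited = new Set(graph.keys());
  for (const node of unvisited) dist.set(node, Infinity);
  dist.set(start, 0);

  while (unvisited.size > 0) {
    // Pick the unvisited node with the smallest tentative distance.
    let current: string | undefined;
    for (const node of unvisited) {
      if (current === undefined || dist.get(node)! < dist.get(current)!) {
        current = node;
      }
    }
    if (current === undefined || dist.get(current) === Infinity) break;
    if (current === goal) break;
    unvisited.delete(current);

    // Relax every edge out of the current node.
    for (const { to, weight } of graph.get(current) ?? []) {
      const alt = dist.get(current)! + weight;
      if (alt < (dist.get(to) ?? Infinity)) {
        dist.set(to, alt);
        prev.set(to, current);
      }
    }
  }

  // Walk back from goal to start to recover the path.
  const path: string[] = [];
  for (let node: string | undefined = goal; node !== undefined; node = prev.get(node)) {
    path.unshift(node);
  }
  return path[0] === start ? path : []; // empty if unreachable
}
```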

Once the drone touches down, the live-feed video is sent back to the website, where a 3D model is generated using Agisoft Metashape to create Gaussian splats (video to 3D model). Once generated, the model is output to WebAR, and an AR experience is shown on the website. Above is a demo of what first responders would do to scan their current area and analyze it for threats. We plan to build an alert system that notifies citizens with an augmented-reality escape plan.
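
The hand-off from drone footage to the reconstruction step could be a simple upload endpoint in the Next.js app. This is a speculative sketch: the route path and field name are hypothetical, and the actual Metashape/Gaussian-splat pipeline runs outside the web app.

```typescript
// app/api/flight-video/route.ts (hypothetical route in the Next.js App Router)
import { writeFile } from "node:fs/promises";
import path from "node:path";

export async function POST(req: Request): Promise<Response> {
  const form = await req.formData();
  const video = form.get("video"); // field name is an assumption

  if (!(video instanceof File)) {
    return Response.json({ error: "missing video file" }, { status: 400 });
  }

  // Persist the upload; a separate worker would feed it to Agisoft Metashape
  // to reconstruct the scene and produce Gaussian splats for WebAR.
  const dest = path.join("/tmp", `flight-${Date.now()}.mp4`);
  await writeFile(dest, Buffer.from(await video.arrayBuffer()));

  return Response.json({ queued: dest });
}
```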

Getting Started

First, duplicate the .env file into a new file named .env.local. Update the value of your OpenAI API key there.
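
Assuming the key follows the usual OpenAI naming convention (check the repo's .env for the exact variable name), .env.local would contain something like:

```
OPENAI_API_KEY=sk-...
```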

The first time you run this project, you will need to install the dependencies. Run this command in your terminal:

```bash
yarn
```

To start the app, run:

```bash
yarn dev
```

Open http://localhost:3000 with your browser to see the result.

You can start editing the page by modifying app/page.tsx. The page auto-updates as you edit the file.

Make sure your OpenAI API key is set in .env.local before starting the app.

Deploy on Vercel

The easiest way to deploy your Next.js app is to use the Vercel Platform from the creators of Next.js.

Check out our Next.js deployment documentation for more details.