🟢 The Starting Line
It was October 15th and I was spending a few minutes scrolling my LinkedIn feed when something caught my attention — an announcement for the kickoff of Toyota Racing Development's Hack the Track hackathon.
I don’t know much about racing, but I’ve loved cars ever since I was a boy. I looked into the hackathon some more and grew increasingly interested: it was an opportunity to build something using a unique dataset from the world of motorsports.
This was certainly something out of left field for me, but the more I learned about how racing teams deal with massive amounts of live-streamed data — from car sensors to weather systems — the more I realized something:
this was a perfect opportunity to push the BEAM’s real-time strengths into a completely new domain — motorsport telemetry.
After reading the contest description, one category stood out immediately:
Real-Time Analytics — Design a tool that simulates real-time decision-making for a race engineer.
That was it. The perfect match for Elixir, Phoenix, and OTP.
I hesitated for a night — it was a big leap into a domain I knew nothing about — but the next day, I joined the hackathon.
Now the question was: what would I build?
🧩 Making Sense of the Data Jungle
The organizers released gigabytes of CSVs: lap timing data, race results, track weather, and — most interestingly — telemetry data from the GR Cup race cars.
At first glance, it was overwhelming. Millions of rows, hundreds of fields with names like accx_can, pbrake_f, Laptrigger_lapdist_dls. I didn’t even know what half the abbreviations meant.
I started simple — using tools like less, grep, and Excel to poke around. Then it clicked: Each row wasn’t just a data point — it was a heartbeat from the car.
Telemetry captures everything happening inside a racecar every fraction of a second: speed, throttle, brake pressure, gear changes, acceleration, GPS position.
And if I could take those heartbeats and replay them live, I could recreate the rhythm of a race — not a static CSV, but a living timeline of events.
That idea became the foundation of what I came to call GRapple (because it grapples with lots of data 😄):
a real-time race replay, analytics, and strategy system powered by Elixir and OTP.
⚙️ Turning Data into Motion
My first task was to sort the telemetry chronologically. That sounds simple, but the dataset wasn’t ordered, contained duplicates, and was over 11 million lines long, so I used Unix sort and uniq and wired them into Elixir for performance.
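The post doesn't show the exact commands, but here is a minimal sketch of what that pre-processing step might look like when shelled out from Elixir (the module name, file-naming convention, and sort key are my assumptions):

```elixir
# Hypothetical sketch: de-duplicate and sort a telemetry CSV by its
# timestamp column before streaming it into Elixir. Assumes the first
# comma-separated column holds the timestamp; adjust the key as needed.
defmodule Telemetry.Preprocess do
  def clean_and_sort!(path) do
    sorted_path = path <> ".sorted"

    # `sort -u` removes duplicate keys while sorting;
    # `-t, -k1,1` sorts on the first comma-separated field;
    # `-o` writes the result to a new file.
    {_, 0} = System.cmd("sort", ["-u", "-t,", "-k1,1", "-o", sorted_path, path])

    sorted_path
  end
end
```

Delegating the heavy lifting to `sort` keeps memory flat no matter how large the file is, which matters at 11 million rows.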
Once sorted, I used NimbleCSV to parse each lazily streamed event into well-defined structs. The core of it looks like this:
@doc """
Parses the given Telemetry Data CSV file into a lazy Stream of chronological TelemetryPoints.
WARNING: this function can take a few minutes if the CSV file is large (GBs)
"""
@spec parse_stream!(filename :: String.t()) :: {integer(), Enumerable.t()}
def parse_stream!(filename) do
cleaned_telemetry_file = clean_file!(filename) # remove dups
sorted_telemetry_file = sort_file_by_timestamp!(cleaned_telemetry_file)
telemetry_count = num_data_points(sorted_telemetry_file)
stream =
File.stream!(sorted_telemetry_file)
|> TelemetryParser.parse_stream(skip_headers: false)
|> Stream.map(fn row ->
try do
TelemetryPoint.from_row!(row)
rescue
e in [MatchError, ArgumentError] ->
Logger.debug("Skipping unparsable row: #{inspect(row)} (#{Exception.message(e)})")
nil
end
end)
# reject any rows that failed to be parsed
|> Stream.reject(&is_nil/1)
|> compute_deltas()
{telemetry_count, stream}
end
compute_deltas/1 does something interesting. It looks at the timestamps of two successive telemetry events, computes the delta between them, and stores it in the struct for the current event. This way, when we come to replay the telemetry live, we know the exact duration between any two events and can therefore simulate the race.
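One lazy way to implement that pairing is with a sliding window over the stream. The sketch below is my guess at compute_deltas/1, assuming each point carries a millisecond timestamp and a delta_ms field (both field names are assumptions):

```elixir
# Hypothetical sketch of compute_deltas/1. It pairs each event with the
# one that follows and stores the time gap (in milliseconds) on the
# current event, so the replayer knows how long to wait before the next
# one. Works on plain maps or structs that have both fields.
defmodule Telemetry.Deltas do
  def compute_deltas(stream) do
    stream
    |> Stream.chunk_every(2, 1)            # sliding pairs: [a,b], [b,c], ..., [last]
    |> Stream.map(fn
      [current, next] ->
        %{current | delta_ms: next.timestamp_ms - current.timestamp_ms}

      [last] ->
        # The final event has nothing after it, so its delta is zero.
        %{last | delta_ms: 0}
    end)
  end
end
```

Because this is all Stream-based, nothing is materialized until the replayer actually consumes the events.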
Then came the fun part: I needed a process that would emit those events live at the same timing intervals at which they originally happened (using the deltas from before). Enter the TelemetryReplayer.
At first, I thought about broadcasting every telemetry row as a message on a Phoenix PubSub topic so other processes could subscribe, react, and compute metrics in real time. That will work great when I incorporate LiveView later on, but since I'm still building the backend, I decided to keep it simple and have the TelemetryReplayer cast each telemetry event to its associated car GenServer (which I would implement next).
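In spirit, the replayer is just a process that walks the sorted stream, sleeps for each event's delta, and casts the event onward. Here's a minimal sketch under those assumptions (the CarRegistry name, the :telemetry_event message shape, and the :car_number / :delta_ms fields are all my inventions):

```elixir
# Hypothetical sketch of the TelemetryReplayer: a process that replays a
# chronologically sorted stream in "real time" by sleeping for each
# event's pre-computed delta before casting it to that car's GenServer.
defmodule TelemetryReplayer do
  def start_link(stream) do
    Task.start_link(fn -> replay(stream) end)
  end

  defp replay(stream) do
    Enum.each(stream, fn event ->
      GenServer.cast(car_server(event.car_number), {:telemetry_event, event})
      Process.sleep(event.delta_ms)
    end)
  end

  defp car_server(car_number) do
    # Assumes car processes are registered in a Registry by car number.
    {:via, Registry, {CarRegistry, car_number}}
  end
end
```

Running the loop in its own Task keeps the sleeps off any caller's critical path, and casting (rather than calling) means a slow car process can never stall the replay clock.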
At last, everything was starting to come together. I had gone from millions of rows of CSVs to well-defined data streamed in real time using the power of OTP. Seeing that first replay run felt surreal — after days of CSV wrangling, a virtual race had come to life in my terminal.

This was a truly motivating moment and I was ready to keep moving forward. Replaying the race live isn't enough, though: if I could have one GenServer process per car consuming its telemetry events and keeping rolling metrics as its state, I could unlock a whole new world of possibilities for analytics and insight.
🏎 Teaching GRapple to Think
Once the race replay worked, the next challenge was to give GRapple some intuition.
I built a MetricsServer, one GenServer process per car, that tracks real-time statistics — speed, throttle, brake pressure, soon acceleration, steering, and much more.
Each process consumes its car's telemetry events, and maintains rolling per-lap stats, smoothing raw signals with exponential moving averages (EMA) to highlight trends in driver behavior. Once a new lap starts, it stores the past lap in its own history. Each process is supervised by a MetricsServerSupervisor for fault-tolerance.
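The EMA update itself is a one-liner; the sketch below shows how a MetricsServer might fold each event into its rolling state. The field names, the smoothing factor, and the message shape are all assumptions on my part:

```elixir
# Hypothetical sketch of a per-car MetricsServer. ema/3 blends each new
# sample into the running average; an alpha near 1.0 tracks the raw
# signal closely, while an alpha near 0.0 smooths it heavily.
defmodule MetricsServer do
  use GenServer

  @alpha 0.2  # smoothing factor (assumed value)

  def start_link(car_number) do
    GenServer.start_link(__MODULE__, car_number)
  end

  @impl true
  def init(car_number) do
    {:ok, %{car: car_number, speed_ema: nil, max_speed: 0.0}}
  end

  @impl true
  def handle_cast({:telemetry_event, event}, state) do
    {:noreply,
     %{state |
       speed_ema: ema(state.speed_ema, event.speed, @alpha),
       max_speed: max(state.max_speed, event.speed)}}
  end

  # The first sample seeds the average; afterwards, standard EMA.
  def ema(nil, sample, _alpha), do: sample
  def ema(prev, sample, alpha), do: alpha * sample + (1 - alpha) * prev
end
```

Because the state lives inside the process, querying a car's current metrics is just a matter of asking its GenServer.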
This way, I could get the metrics for any car in the race at any given time by looking at the state of its associated running process. It also helps tremendously in reasoning about the system and keeps things simple: each process knows its car's performance best. Using hand-crafted test CSV files, I started to see the system in action, and it was mesmerizing. Data flowed from CSV file to the TelemetryReplayer to a car's MetricsServer, which performed rolling metric calculations, all in real time.
And this was just the beginning. Eventually, I am hoping to implement another dedicated Analytics process that will query each MetricsServer and derive higher-level race insights.
🧠 What I've Learned So Far
This project reminded me why I love Elixir and the BEAM.
- The Actor Model is perfect for modeling a race with multiple cars.
- OTP made concurrency predictable.
- Streams let me handle millions of rows lazily and efficiently.
- Elixir is the right tool for this job.
There's still quite a way to go before I reach the finish line with this project, but it's been refreshing to step into the world of motorsports, and a joy to use Elixir along the way.
🏁 The Road Ahead
In the next stage, I’ll start extending the MetricsServer to compute more metrics per car using the telemetry events. So far, I've implemented speed, throttle behavior, and brake pressure rolling metrics, but will need to add acceleration, lateral G (for cornering insight), steering, lap timing, and (if time permits) lap section comparison for more insight. Eventually, I will connect it to AI models via Replicate to simulate a race engineer that can answer live questions about performance and even strategize race decisions.
I'll be writing some more blog posts as I continue building GRapple for the Toyota Hack the Track 2025 challenge. Stay tuned.