STEM Hub – AI-Powered Formula Search and LaTeX Generator

By Seokhyeon Byun

Why did I start this project?

In aerospace engineering, mathematical equations are part of daily work, for example in fluid dynamics and aerodynamics, space vehicle dynamics, and thermal and propulsion systems. And as academic publishing becomes increasingly digital, LaTeX has become essential for technical documentation. As a student in this field, I kept running into two problems. First, it is hard for the human brain to remember every concept and equation learned over long periods. Second, converting handwritten equations from paper into LaTeX can be time-consuming.

Then I found the Google Gemini API Developer Competition. I thought I could use this opportunity to build potential solutions for these problems and learn new skills.

STEM Hub — Homepage UI

Tech stack I used

  • Frontend: React, TypeScript, Tailwind CSS
  • Backend & AI Integration: Next.js (Server Actions), Google Gemini API, Vercel AI SDK, LangChain
  • Hosting & Deployment: Railway

Core Features

When I designed the system architecture and features, I had three considerations.

  1. Some users already know which equation or theory they are looking for; they just need its LaTeX form.
  2. Other users don’t know or can’t remember the exact name of an equation or theory, but they do know its category.
  3. I needed to demonstrate the potential and usefulness of the Gemini API for the competition.
Search page where users choose the search option that fits their need (latest UI/UX)

1. Direct Search - When you know what you’re looking for

The most straightforward feature I built was the direct search. If you know you need the “Navier-Stokes equation” or “Fourier Transform,” you just type it in and get the formula, description, and LaTeX code instantly. No need to dig through hardcover textbooks or recall the exact mathematical notation.

Direct Search — results with LaTeX output
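To make the flow concrete, here is a minimal sketch of what a direct-search Server Action could look like with this stack, using the Vercel AI SDK’s generateObject together with the Google Gemini provider. The schema, prompt wording, and model name (gemini-1.5-flash) are my own illustrative assumptions, not the project’s exact code.

```ts
'use server';

import { generateObject } from 'ai';
import { google } from '@ai-sdk/google';
import { z } from 'zod';

// Illustrative response shape: formula name, plain-language description, raw LaTeX.
const FormulaResult = z.object({
  name: z.string(),
  description: z.string(),
  latex: z.string(),
});

// Hypothetical server action: look up a formula the user already knows by name.
export async function directSearch(query: string) {
  const { object } = await generateObject({
    model: google('gemini-1.5-flash'), // model choice is an assumption
    schema: FormulaResult,
    prompt:
      `Return the standard form of "${query}" as LaTeX, with a short ` +
      `description aimed at engineering students.`,
  });
  return object; // { name, description, latex }
}
```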

2. Multi-step Search - When you’re not sure exactly what you need

Sometimes you know you need something related to “fluid dynamics” but can’t remember whether it’s Bernoulli’s equation or the continuity equation. Instead of forcing users to type a perfect prompt the way they would in ChatGPT, I built a guided search where you browse by field (physics, engineering, math), narrow down by category, and finally find exactly what you’re looking for.
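A rough sketch of how that drill-down could be wired, again assuming the Vercel AI SDK’s generateObject with the Google provider; the two-step split, function names, and prompts are illustrative rather than the project’s actual implementation.

```ts
'use server';

import { generateObject } from 'ai';
import { google } from '@ai-sdk/google';
import { z } from 'zod';

// Hypothetical step 1: list categories within a chosen field (e.g. "fluid dynamics").
export async function listCategories(field: string): Promise<string[]> {
  const { object } = await generateObject({
    model: google('gemini-1.5-flash'),
    schema: z.object({ categories: z.array(z.string()) }),
    prompt: `List 5-8 common topic categories within ${field} for engineering students.`,
  });
  return object.categories;
}

// Hypothetical step 2: list candidate formulas in a category so the user can
// recognize the one they half-remember (e.g. Bernoulli vs. continuity).
export async function listFormulas(field: string, category: string) {
  const { object } = await generateObject({
    model: google('gemini-1.5-flash'),
    schema: z.object({
      formulas: z.array(z.object({ name: z.string(), summary: z.string() })),
    }),
    prompt: `List the key formulas in ${category} (${field}), each with a one-line summary.`,
  });
  return object.formulas;
}
```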

3. The “Convert” Feature

Originally, I built a Convert feature that turned handwritten equations into LaTeX: users could upload screenshots or photos of their handwritten notes, and the app would automatically convert them to full LaTeX for digital notes. This feature was designed specifically for people who aren’t familiar with modern AI chat interfaces or who prefer taking handwritten notes.
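Gemini’s vision-capable models can handle this kind of transcription directly. Here is a hedged sketch of what the image-to-LaTeX step could look like as a Server Action using the Vercel AI SDK’s multimodal messages; the form field name, prompt, and model choice are assumptions.

```ts
'use server';

import { generateText } from 'ai';
import { google } from '@ai-sdk/google';

// Hypothetical server action: turn an uploaded photo of handwritten math into LaTeX.
export async function convertHandwriting(formData: FormData): Promise<string> {
  const file = formData.get('image') as File; // assumed form field name
  const image = Buffer.from(await file.arrayBuffer());

  const { text } = await generateText({
    model: google('gemini-1.5-flash'), // any vision-capable Gemini model would do
    messages: [
      {
        role: 'user',
        content: [
          {
            type: 'text',
            text: 'Transcribe the handwritten equations in this image as LaTeX only.',
          },
          { type: 'image', image },
        ],
      },
    ],
  });
  return text; // raw LaTeX, which still benefits from a cleanup pass (see Challenges)
}
```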

Check the demo video below to see the original features and UI/UX.

Under-3-minute demo video from the competition submission (the original project name was Pro Formula)

Source code on GitHub

Challenges

  1. One of the biggest headaches was dealing with LaTeX formatting. The AI would generate perfectly valid mathematical expressions, but then LaTeX would break because of missing escapes or weird formatting. I spent way too much time debugging things like summations and integrals that would render fine in one context but break in another. Eventually, I built custom cleaning functions that could handle these edge cases automatically (a simplified sketch follows this list).
  2. Initially, I deployed on Vercel because of its free hosting for individual developers. But the Gemini API calls sometimes took longer than Vercel’s serverless function execution limits, causing frustrating timeouts. After a few too many failed requests, I migrated to Railway, which gave me more flexibility with execution times.
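The actual cleaning functions aren’t shown here, but as a simplified sketch of the kind of normalization involved: stripping code fences and math delimiters the model likes to add, and undoing doubled backslashes that appear when LaTeX is round-tripped through JSON. The real edge-case handling went well beyond this.

```ts
// Simplified sketch of a LaTeX cleanup pass; the real functions handled more edge cases.
export function cleanLatex(raw: string): string {
  let latex = raw.trim();

  // Strip markdown code fences the model sometimes wraps around its output.
  latex = latex.replace(/^`{3}(?:latex|tex)?\s*/i, '').replace(/`{3}\s*$/, '').trim();

  // Remove stray math delimiters; the renderer adds its own.
  latex = latex.replace(/^\$\$?/, '').replace(/\$\$?$/, '').trim();

  // Heuristic: collapse doubled backslashes before commands (e.g. \\frac -> \frac),
  // which show up when LaTeX is escaped twice through JSON.
  latex = latex.replace(/\\\\(?=[A-Za-z])/g, '\\');

  return latex;
}
```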

Things I learned

Unfortunately, my project wasn’t selected as a finalist, but I’d still like to share some of the lessons I learned.

  • I should have built a more robust pipeline to filter and validate user inputs instead of relying on the AI for every request, which drove up API costs unnecessarily. Implementing rate limiting with Redis and input validation would have been more professional and cost-effective (a minimal sketch follows this list). Relying purely on AI outputs also wasn’t ideal, since responses can vary slightly between requests, making the system less predictable.
  • Users don’t care how fancy your AI integration is if the app breaks when something goes wrong. Building robust error handling and retry mechanisms turned out to be just as important as the core search functionality.
  • I focused too much on academic users rather than general users. While this wasn’t explicitly mentioned in the competition requirements, I realized this after analyzing the selected projects and seeing they targeted broader audiences.
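On the first point, here is a minimal sketch of fixed-window rate limiting with Redis placed in front of the Gemini calls, assuming the ioredis client; the limits, key naming, and window size are placeholders.

```ts
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL!); // connection URL is an assumption

// Fixed-window rate limiter: allow `limit` requests per `windowSeconds` for each
// key (e.g. an IP address or user id) before a request ever reaches the Gemini API.
export async function allowRequest(
  key: string,
  limit = 20,
  windowSeconds = 60,
): Promise<boolean> {
  const redisKey = `rate:${key}`;
  const count = await redis.incr(redisKey);
  if (count === 1) {
    // First hit in this window: start the expiry clock.
    await redis.expire(redisKey, windowSeconds);
  }
  return count <= limit;
}
```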