When "Polish Over Security" Costs Real Money
Client wanted to "polish features first, security later." Found exposed OpenAI API key in frontend code. Anyone could steal it and rack up unlimited charges.
A potential client sent me their React + Firebase app for review.
"Need some help polishing before MVP launch," they said. "Focus on features and usability first — security can wait."
I opened the code. First thing I saw:
```javascript
const openai = new OpenAIApi(new Configuration({
  apiKey: "sk-proj-0DWd29YLvwUMGU3vCa04XWh..."
}));
```
Anyone who opens Chrome DevTools can pull that key out of the bundled JavaScript and run up thousands of dollars in OpenAI charges.
The "Polish First" Problem
The client's logic seemed reasonable: get core features working, worry about backend architecture later. AI features were the selling point. Security felt like overhead.
But here's what "security later" actually meant:
Exposed API credentials: $20/day in AI costs could become $2000/day overnight
Missing Firestore rules: Any user could read/write any data
Client-side everything: Business logic exposed to anyone who wanted to see it
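The Firestore problem alone is a one-file fix. A minimal sketch of locked-down rules, assuming a hypothetical per-user data layout (the collection names here are illustrative, not the client's actual schema):

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Each user can only touch documents under their own uid.
    match /users/{userId}/{document=**} {
      allow read, write: if request.auth != null
                         && request.auth.uid == userId;
    }
  }
}
```

Without something like this, the default open rules mean every authenticated (or in the worst configurations, unauthenticated) visitor can read and overwrite everyone else's data.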
The Real Conversation
Client: "The exposed key issue - the old dev can fix that easy, right?"
Me: "It requires a real architecture change. Those API calls have to move to a secure backend."
Client: "What about just focusing on the AI chat improvements?"
Me: "Your app could be bankrupting you right now and you wouldn't know."
This is the moment every developer faces: explain why security isn't optional, or watch the client learn expensive lessons later.
What Actually Happened
I sent a detailed code review flagging critical security holes. The client's response?
"Can we just focus on making the AI better first?"
They wanted to postpone the one thing that could save their business.
The Fix Was Obvious
Move sensitive operations to Firebase Cloud Functions. Add proper security rules. Secure the credentials. Basic stuff that should have been done from day one.
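The core pattern is simple: the browser never talks to OpenAI directly; it calls your backend, and the key lives only in server-side config. A minimal sketch in plain Node, the real version would wrap this in an HTTPS Cloud Function and read the key from Secret Manager or environment config; `buildChatRequest` and the request shape are my illustration, not the client's actual code:

```javascript
// Build the upstream OpenAI request entirely on the server, so the
// API key never ships to the browser. In Firebase, this logic would
// sit inside an HTTPS Cloud Function handler.
function buildChatRequest(userMessage, apiKey) {
  // Validate client input server-side; the frontend can't be trusted.
  if (typeof userMessage !== "string" || userMessage.trim() === "") {
    throw new Error("message is required");
  }
  return {
    url: "https://api.openai.com/v1/chat/completions",
    options: {
      method: "POST",
      headers: {
        // Key comes from server-side config (e.g. process.env.OPENAI_KEY),
        // never from code bundled into the client.
        "Authorization": `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "gpt-4o-mini", // placeholder model name
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}
```

The frontend then calls this endpoint (gated by Firebase Auth) instead of OpenAI, which also gives you one place to add per-user rate limits and spending caps.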
Estimated fix time: 2-3 weeks
Cost of not fixing: Potentially unlimited
What I Learned
Clients often think security is about preventing theoretical problems. They don't realize it's about preventing immediate financial disaster.
When someone says "security can wait," they're usually saying "I don't understand what security problems cost."
The conversation ends one of two ways: fix it now, or explain to stakeholders why the OpenAI bill is five figures.