Gotchaa Lab

Vibe coding security risks: what Malaysian businesses need to know in 2026

20 March 2026·4 min read·By Gotchaa Lab

Vibe coding sounds like a dream. You describe what you want in plain English, an AI writes the code, you ship it. No reading, no debugging, no thinking about edge cases. Tools like Cursor, Bolt, Lovable, and Replit have made this shockingly easy.

But vibe coding security risks are real. According to the Veracode 2025 GenAI Code Security Report, nearly 45% of AI-generated code contains security flaws. A Wiz study found that 20% of vibe-coded apps have serious vulnerabilities or configuration errors. Worse, this code often passes a basic review, so the flaws ship unnoticed.

If you're a Malaysian business building customer-facing apps or handling data under PDPA, that number should bother you.

What is vibe coding, exactly?

Andrej Karpathy coined the term to mean fully giving in to the vibes, letting AI handle code while you just steer. It works surprisingly well for prototypes and hackathon projects. The trouble starts when people treat vibe-coded software as production-ready.

What are the real vibe coding security risks?

AI models generate code from patterns in training data. They don't understand your business logic or your database schema. In practice, this means AI regularly:

  • Pulls in outdated dependencies with known vulnerabilities
  • Generates placeholder API keys that never get removed
  • Skips input sanitisation (hello, SQL injection)
  • Builds login flows that look right but miss token expiration or rate limiting
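To make the SQL injection bullet concrete, here is a minimal, hypothetical sketch of the pattern we see most often in AI-generated code: user input spliced directly into a query string, next to the parameterised version that fixes it. The table and values are invented for illustration.

```python
import sqlite3

# Toy database for the demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

user_input = "alice' OR '1'='1"  # attacker-controlled value

# Vulnerable: the input is spliced straight into the SQL string,
# so the OR clause becomes part of the query and matches every row.
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()

# Safe: a ? placeholder makes the driver treat the input as data, not SQL.
# No user is literally named "alice' OR '1'='1", so nothing matches.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(len(unsafe), len(safe))  # prints "2 0"
```

The vulnerable query leaks every row; the parameterised one returns nothing. The fix is one placeholder, which is exactly why this class of bug survives a quick skim of AI output.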

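The token-expiration bullet deserves an example too. Below is a minimal stdlib sketch of the expiry check that AI-generated login flows often omit: a signed token carries an `exp` claim, and verification rejects both tampered and expired tokens. This is an illustration, not production auth; the secret, names, and format are invented, and real code should use a vetted library such as PyJWT.

```python
import base64, hashlib, hmac, json, time

SECRET = b"demo-secret"  # hypothetical; load from a real secret store in production

def issue_token(user, ttl=3600):
    # Payload carries an expiry timestamp; the HMAC signature covers it.
    payload = json.dumps({"user": user, "exp": time.time() + ttl}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_token(token):
    body, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(body)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered
    claims = json.loads(payload)
    if claims["exp"] < time.time():
        return None  # expired -- the check vibe-coded flows tend to skip
    return claims

tok = issue_token("alice")
print(verify_token(tok)["user"])  # prints "alice"
```

A token issued with a negative TTL fails verification immediately; drop the `exp` check and it would be accepted forever, which is the bug described above.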
Snyk's CEO has said AI-generated code is 30-40% more vulnerable than human-written code. That tracks with what we've seen.

Do companies actually accept vibe coding?

Some, but mostly for throwaway work. Addy Osmani at Google noted that the consensus among engineering leaders is that "vibe coding puts critical software qualities at risk: security, clarity, maintainability, and team knowledge."

The distinction that matters: AI-assisted engineering, where developers use AI but still review and understand every line, is completely different from vibe coding. Professional teams do the former.

How this affects Malaysian businesses

If your app handles customer data, you're subject to the Personal Data Protection Act (PDPA). A breach caused by sloppy AI-generated code is still your problem. The AI won't show up in court for you.

We've had Malaysian startups come to us after building MVPs with vibe coding. The story is always similar: the prototype works, but it falls apart under real traffic, has no error handling, and stores passwords in plain text. Rebuilding it costs more than doing it right the first time would have.

How we use AI coding tools at Gotchaa Lab

We use Cursor, Claude, and GitHub Copilot daily as part of our AI-powered development process. But we treat AI-generated code like code from a new hire: we review all of it.

AI handles boilerplate and first drafts. Architecture and security decisions go through a senior engineer. Every PR runs through the same CI pipeline and security scans regardless of who (or what) wrote the code. And anything touching authentication, payments, or personal data gets written and reviewed by humans only. AI tools don't know your codebase, your deployment setup, or Malaysian compliance requirements, so we fill in that context ourselves.

How to reduce vibe coding security risks

Keep vibe coding for prototypes. Don't ship it to real users without someone reviewing the code. If you don't have developers in-house, bring in an external team for a security review before it goes live. Run tools like Snyk or SonarQube on your codebase. Even basic linting catches a lot of common AI mistakes.

Vibe coding isn't going away, and neither are the security problems it creates. The smart move is using AI to move faster while keeping people around who actually understand what's running in production.

Want to build with AI tools but do it properly? Talk to us. We'll be straight with you about what AI can handle and where you still need people.

This article does not constitute professional cybersecurity advice. For specific security assessments, consult a qualified cybersecurity professional.

References

  1. Veracode 2025 GenAI Code Security Report
  2. Andrej Karpathy on vibe coding
  3. Addy Osmani: Vibe coding is not the same as AI-assisted engineering
  4. Kaspersky: Security risks of vibe coding and LLM assistants
  5. Malaysia Personal Data Protection Act (PDPA)


Need help building this for your business?

We help Malaysian companies turn ideas like these into working software. Free consultation, no obligation.