I spoke about vibe coding at SETU School recently.
Video: https://lnkd.in/g4nFnHWG
Transcript: https://lnkd.in/gNJuVvYB
Here are the top messages from the talk:
What is vibe coding
It's where we ask the model to write and run code; we don't read the code, we just inspect the behaviour.
It's a coder's tactic, not a methodology. Use it when speed trumps certainty.
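To make that concrete, here's a minimal sketch of the loop, assuming the official openai Python SDK and an illustrative word-count task (the task, model name, and filenames are placeholders, not from the talk):

```python
import os
import subprocess
import tempfile

from openai import OpenAI  # assumes the official openai SDK is installed

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

task = ("Write a Python script that prints the 10 most common words "
        "in a text file named input.txt. Reply with code only.")

resp = client.chat.completions.create(
    model="gpt-4o",  # any capable code model works here
    messages=[{"role": "user", "content": task}],
)
raw = resp.choices[0].message.content

# Strip a markdown fence if the model added one; we still never *read* the code.
lines = raw.strip().splitlines()
if lines and lines[0].startswith("```"):
    lines = lines[1:]
if lines and lines[-1].startswith("```"):
    lines = lines[:-1]
code = "\n".join(lines)

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(code)

# Run it and inspect the behaviour, not the source.
result = subprocess.run(["python", f.name],
                        capture_output=True, text=True, timeout=60)
print(result.stdout or result.stderr)
os.unlink(f.name)
```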
Why it's catching on
• Non-coders can now ship apps - no mental overhead of syntax.
• Coders think at a higher level - stay in problem space.
• Model capability keeps widening - the "vibe-able" slice grows daily.
How to work with it day-to-day
• Fail fast, hop models - if Claude errors, paste into Gemini or OpenAI (sketch after this list).
• Cross-validate outputs - ask a second LLM to critique or replicate; cheaper than reading 400 lines of code.
• Switch modes deliberately - vibe coding when you don't care about internals and time is scarce, AI-assisted coding when you must own the code (read + tweak), manual only for the gnarly 5% the model still can't handle.
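A rough sketch of the first two tactics; ask_claude, ask_gemini, and ask_openai are hypothetical wrappers you'd write around each vendor's SDK, each taking a prompt and returning text:

```python
def hop_models(prompt: str, backends) -> str:
    """Try each model in turn; move on as soon as one errors."""
    last_error = None
    for name, ask in backends:
        try:
            return ask(prompt)
        except Exception as err:  # fail fast: don't debug, just hop
            last_error = f"{name} failed: {err}"
            # Paste the error into the next model's prompt, as in the talk.
            prompt += f"\n(Previous attempt via {name} errored: {err})"
    raise RuntimeError(last_error or "no backends given")

def cross_validate(artifact: str, ask_reviewer) -> str:
    """Ask a second LLM to critique the first one's output."""
    return ask_reviewer(
        "Review this generated code for bugs, missed edge cases, and "
        "security issues. Be specific:\n\n" + artifact
    )

# Usage, with your own wrappers:
# code = hop_models(task, [("claude", ask_claude), ("gemini", ask_gemini)])
# print(cross_validate(code, ask_openai))
```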
What should we watch out for
• Security risk - running unseen code can nuke your files; see the sandbox sketch below.
• Quality cliffs - small edge cases break; drop the use case or wait for the next model upgrade.
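One cheap, partial mitigation for the security bullet, sketched in Python: run generated code from a throwaway directory with a timeout. This is an illustration only, not real isolation; for that, use a container or VM:

```python
import pathlib
import subprocess
import tempfile

def run_sandboxed(code: str, timeout: int = 30) -> str:
    """Run generated code in a scratch dir so casual file writes land there.
    Note: this does NOT stop absolute-path writes or network access."""
    with tempfile.TemporaryDirectory() as scratch:
        script = pathlib.Path(scratch) / "generated.py"
        script.write_text(code)
        result = subprocess.run(
            ["python", str(script)],
            cwd=scratch,  # relative-path writes stay in the scratch dir
            capture_output=True, text=True, timeout=timeout,
        )
    return result.stdout or result.stderr
```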
What are the business implications
• Vendors still matter - they absorb legal risk, project-manage, and can be bashed on price now that AI halves their grunt work.
• Prototype-to-prod blur - the vibe-coded PoC could be hardened instead of rewritten.
• UX convergence - chat + artifacts/canvas is becoming the default "front-end"; underlying apps become API + data.
How does this impact education
• Curriculum can refresh term-by-term - LLMs draft notes, slides, even whole modules.
• Assessment shifts back to subjective - LLM-graded essays/projects at scale.
• Teach "learning how to learn" - Pomodoro focus, spaced recall, chunking concepts, as in Learn Like a Pro (Barbara Oakley).
• Best tactic for staying current - experiment > read; anything written is weeks out of date.
What are the risks
• Overconfidence risk - silent failures look like success until they hit prod.
• Skill atrophy - teams might lose the muscle to debug when vibe coding stalls.
• Legal & compliance gaps - unclear license chains for AI-generated artefacts.
• Waiting game trap - "just wait for the next model" can become a habit that freezes delivery.