Why learning programming the hard way still matters in the AI age

Professor Mark Mahoney explains why mastering programming basics is critical, even as LLM tools like Claude Code and GitHub Copilot reshape how we write code.
Large language model (LLM) coding tools like GitHub Copilot, Claude Code, and Gemini CLI are transforming how software developers write code. But even as these applications grow capable of churning out entire codebases from simple prompts, veteran computer science educators like Professor Mark Mahoney argue that foundational programming skills remain essential. In an interview with freeCodeCamp podcast host Quincy Larson, Mahoney shared his thoughts on the evolving landscape of programming education, the limitations of AI, and why learning the "hard way" shouldn't be discounted.
The value of traditional programming education
Mahoney, a computer science professor at Carthage College with over two decades of teaching experience, emphasized the importance of building a strong foundation in programming. For him, the practice of debugging and understanding code deeply is irreplaceable—even in an age where LLMs can generate code instantly.
Mahoney observed that beginners who rely primarily on AI tools miss the opportunity to learn critical problem-solving skills. "If an LLM writes most of the code for you, it's fine," Mahoney remarked. "But you need to step through that code, observe its effects, and understand its behavior—not just trust that the AI got it right."
The professor is particularly wary of students developing what he described as a lack of resilience. "If you're just given the answer every time, you miss out on the struggles and problem-solving processes that actually make you a strong developer."
AI as a tool, not a crutch
Mahoney doesn't see LLM tools as inherently bad, but he sees their role in education and development as complementary to traditional methods. Citing a recent personal experience, Mahoney shared how he used Claude Code to save time on a small educational project. When creating a visualization tool to explain Git pull requests to his students, the LLM helped build a rough solution. However, Mahoney still had to review and tweak the AI-generated code to align with best practices.
"Rather than have it immediately generate code," Mahoney says, "these days I ask LLMs to create a plan. I will iterate on that plan and, once it's reasonable, have it write the code. But only someone with experience can identify when an LLM’s plan is flawed."
This iterative troubleshooting prevents the creation of "technical debt"—code that is functional on the surface but problematic in the long term. Mahoney stressed that without fundamental programming knowledge, developers are unlikely to spot such issues and could unknowingly introduce bugs.
Programming with LLMs: Not all roses
Despite his optimism about new tools, Mahoney remains aware of their limitations. He highlighted a concrete example from his visualization project: the LLM-generated code attempted to store critical data in a global document object, which was inefficient given the project structure. "If I hadn't noticed that issue and corrected it, the codebase would have already started carrying technical debt," he explained.
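The interview doesn't include Mahoney's actual code, but the anti-pattern he describes—stashing application state on a shared global object—and the kind of fix an experienced developer would apply can be sketched in TypeScript. The names here (`globalScratch`, `PullRequestStore`) are illustrative, not from his project:

```typescript
// Anti-pattern: state dumped onto a shared global object.
// Any module can mutate it, and nothing documents who owns what.
const globalScratch: Record<string, unknown> = {};
globalScratch["pullRequests"] = [{ id: 1, title: "Fix typo" }];

// Refactor: an encapsulated store with a narrow interface.
// State lives in one place, and mutations go through named methods,
// so a reviewer can see exactly where and how data changes.
interface PullRequest {
  id: number;
  title: string;
}

class PullRequestStore {
  private prs: PullRequest[] = [];

  add(pr: PullRequest): void {
    this.prs.push(pr);
  }

  all(): readonly PullRequest[] {
    return this.prs;
  }
}

const store = new PullRequestStore();
store.add({ id: 1, title: "Fix typo" });
console.log(store.all().length); // 1
```

Both versions "work" on the surface, which is exactly Mahoney's point: only someone who understands the structure of the codebase will notice that the first version quietly accrues technical debt.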
Not everyone can work around these flaws as easily. Beginners who lean heavily on AI for learning may receive functional, but suboptimal, code. Worse, they may lack the skills to realize it’s suboptimal. For instance, a novice developer might use AI to debug errors without addressing the root cause, ultimately reinforcing bad habits.
The economic and ethical concerns of AI reliance
Beyond technical shortcomings, Mahoney touched on potential financial barriers. While some companies subsidize LLM access, costs can accumulate quickly. For example, Mahoney spends $20 monthly on Claude Code and often exhausts his token limits despite coding part-time. He predicts that costs might become prohibitive for smaller organizations.
Additionally, legal and ethical obstacles remain pressing. Mahoney mentioned a former student whose employer prohibits AI-based tools like Copilot for fear of intellectual property or security risks. These restrictions underscore why adaptability and thorough knowledge remain vital.
"In some parts of the world, regulatory issues or infrastructure challenges could limit access to these tools altogether," Mahoney said. "We can’t count on AI always being an option."
Why learning the hard way builds better developers
While sophisticated LLM tools make it easier to "just-in-time learn" coding, Mahoney insists that fully understanding the development process provides more value long-term. He suggested that companies hiring engineers may increasingly value those who demonstrate fluency in core programming principles.
"If I were hiring someone, I’d want proof they learned the hard way," Mahoney said. "It shows grit—an understanding of debugging, problem-solving, and software design that’s hard to mimic just from interacting with an AI."
This ethos, Mahoney argued, not only prepares developers for complex work environments but also equips them with critical skills like adaptability and troubleshooting. In his classroom—and through his online learning platform Playback Press—Mahoney encourages students to grapple with coding challenges head-on rather than look for shortcuts.
The irreplaceable role of human instructors
The discussion also touched on the critical role of human instructors. Mahoney described teaching not as presenting information, but as actively motivating students. "My job is to ignite their passion for learning," he said. "An LLM might provide answers, but it can’t inspire." Mahoney also noted that in many ways, human teachers are better equipped to tailor their instruction to individual learning styles—something AI hasn’t mastered.
He gave the example of students who struggle with confidence. A teacher can modulate their approach, sometimes challenging a student to figure something out, and at other times guiding them step by step. This dynamic, Mahoney believes, is essential for deep, lifelong learning.
A blended future
So, where do we go from here? While Mahoney acknowledges that LLMs are here to stay, he encourages educators, students, and industry leaders to adopt a "blended" approach. LLMs can act as supplementary tools—"infinitely patient tutors," as he called them—but shouldn’t replace rigorous, hands-on learning.
"An ideal future combines the best of both worlds," Mahoney concluded. "AI tools assist in untangling complexities or speeding up repetitive tasks, while human-driven education ensures mastery and resilience."
Aspiring developers who rely too heavily on AI risk losing critical skills altogether. In contrast, those who combine the efficiencies of modern tools with traditional learning progress more confidently toward becoming not just coders, but truly exceptional developers.
Staff Writer
Chris covers artificial intelligence, machine learning, and software development trends.