Media Students Have No Idea How Bad Tech Used to Be and It Makes Me Feel Ancient
TL;DR
As a university lecturer teaching audio and video coding, I’m struck by how students today take tech advancements for granted. They’ve never experienced the struggles with low-quality audio, encoding artifacts, or outdated tech like we did—they just expect everything to “work.” Makes me wonder if it’s a generational shift, me being a tech nerd, or both. Trying to make them care about these nuances often makes me feel old.
------------------------------------------------------------
I teach audio and video coding at a university for media students, and I'm always surprised at how many of the tech challenges that were significant in the past, and that shaped so much of what we know, never cross my students' minds today.
For instance, when I introduced them to Bluetooth codecs, quality loss, and latency, they admitted they'd never noticed low-quality SBC encoding until I demonstrated it. The removal of the headphone jack? They barely remember it; for them it has always been Bluetooth, and they never lived through the early, awful days of A2DP audio. Most have never really thought about audio quality at all because, for them, "it just works." Even features like spatial audio on iOS are left on by default without a second thought. They've heard of Dolby Atmos but rarely ask what it actually is or does.
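When I want a quick, concrete classroom demo of what "lossy" means, coarse re-quantization of a test tone gets the idea across. To be clear, this is not SBC or any real Bluetooth codec; it's just a minimal sketch (function names are my own) showing that fewer bits mean measurably more noise:

```python
import math

# Hypothetical classroom demo: quantization noise vs. bit depth.
# Not a real codec -- just coarse re-quantization of a sine tone.

def make_sine(freq_hz=440.0, rate_hz=48000, n=4800):
    """One tenth of a second of a 440 Hz test tone."""
    return [math.sin(2 * math.pi * freq_hz * i / rate_hz) for i in range(n)]

def quantize(samples, bits):
    """Snap each sample to the nearest of 2**bits signed levels."""
    levels = 2 ** (bits - 1)
    return [round(s * levels) / levels for s in samples]

def snr_db(clean, degraded):
    """Signal-to-noise ratio of the degraded copy, in decibels."""
    signal = sum(s * s for s in clean)
    noise = sum((s - d) ** 2 for s, d in zip(clean, degraded))
    return 10 * math.log10(signal / noise)

sine = make_sine()
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: SNR ~ {snr_db(sine, quantize(sine, bits)):.1f} dB")
```

Dropping from 16 to 4 bits costs roughly 6 dB of SNR per bit, which students can both see in the numbers and, if you play the quantized samples back, clearly hear.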
Similarly, while they've encountered poor video streams, they've never consciously registered encoding artifacts, blocky video, or the glitchy chaos that dropped B- and P-frames cause. When I teach HDR and tone mapping, they recognize the terms but have never stopped to ask whether they're watching SDR or HDR content, or even what their screen's peak brightness is.
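The idea behind tone mapping is simpler than the term-dropping suggests: an HDR scene spans a far wider luminance range than an SDR display can reproduce, so a curve compresses it. A minimal sketch using Reinhard's classic global operator, L / (1 + L) (my choice of illustration, not any particular product's pipeline):

```python
# Illustrative only: Reinhard's global tone-mapping operator.
# Maps an unbounded scene luminance into the display range [0, 1).

def reinhard(luminance):
    """Compress scene luminance toward 1.0; bright values saturate."""
    return luminance / (1.0 + luminance)

# Dim scene values pass through almost linearly; very bright ones
# get squeezed, which is why specular highlights survive on SDR.
for scene in (0.1, 1.0, 10.0, 1000.0):
    print(f"scene {scene:>7.1f} -> display {reinhard(scene):.3f}")
```

Once students plot this curve, "peak brightness" stops being an abstract spec: it's the ceiling the whole mapping is squeezing the scene under.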
For most of them, as long as it works, it’s fine. Content consumption usually means watching Netflix on their phone with headphones, and that’s enough.
Meanwhile, I’m teaching about AutoEQ, codecs, quality differences, display calibration, motion smoothing, and HDR. But for many students, these are abstract concepts because their experience tells them everything is already “good enough.”
This isn’t meant to be a rant—I’m genuinely curious whether this is just me being a tech nerd, a generational shift, or maybe a mix of both. So much has improved over the years that things we once struggled to fix, improve, or even understand have become non-issues. People now take these advancements for granted, as if they’ve always been that way.
Sometimes, trying to get media students to care about these details makes me feel old. It’s funny and humbling to realize how quickly the focus shifts from “how to make it better” to “it works, so why bother?”