When you are programming, you're the link between your concept of the problem and the machine instructions that get the computer to solve it for you. To me, it's magic.
+----------+                   +------------------+
| Problem  +--o << magic >> o--+ Machine Language |
+----------+                   +------------------+
So you can think of programming as a series of transformations, each one moving a step away from how you naturally think about the problem and a step toward how your computer "thinks" about it. When you get to machine language, you're done. To speed up the process, what you want are language features that help you model the solution as naturally as possible while maintaining suitable efficiency after translation to machine code. Is anyone really working on that anymore? When I read about interesting programming language research these days, it's fun enough, but I can't help thinking it's a bit off-target.
A true advance would be a more powerful language that helps us model real-world problems more easily and more accurately. Instead, we keep making small variations on languages invented by the 1960s. Maybe there just isn't a fundamentally more natural computer language, and I'm hoping for too much. Or maybe we've had computers long enough now that the concepts have hardened in our minds and left us closed and inflexible. When you read CS papers from the '50s and '60s, they are full of excitement for revolutionary changes that they--naively--thought could be right around the corner. I think because computers were newer, they took fewer things for granted. I mean, when's the last time you considered that computers don't have to be von Neumann machines? Back then, the sky was the limit and anything was possible. I miss that, even though I wasn't even there!