I think we (the programming language field) have yet to reach a sweet spot that matches up with the scales, aspirations, and life cycles we want to deal with.
That sweet spot yet to come will certainly have a very strong and dynamic notion of modules, and one part of the notion will almost certainly involve both protection and intermodule communication. But I don't think we need to be religious about the past (even the parts of the past that were done better than today).
My main complaint about most language use today (including the languages you mentioned) is (a) that the very weak "data structure and procedure" style is retained (via getters and setters, etc.), and (b) that the CPU is used to set the notion of "time passing", so that bad synchronization schemes have to be employed. This leads to fragile, very large, inscrutable, and intractable systems.
It was known in the 70s how to eliminate both of these problems. (But not all of the problems involved with scaling and very-high-level expression.)
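To make the first of these concrete, here is a minimal sketch (in Python, with invented names; this is not the actual 70s systems, just the shape of the idea) of objects that own their state and interact only by sending messages, never through getters and setters:

```python
import queue
import threading
import time

class Actor:
    """Owns its state; the mailbox is the only way to affect it."""
    def __init__(self):
        self._mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, message, *args):
        self._mailbox.put((message, args))         # no getters, no setters

    def _run(self):
        while True:                                # messages are handled one
            message, args = self._mailbox.get()    # at a time, so the object
            getattr(self, "on_" + message)(*args)  # sets its own "time"

class Counter(Actor):
    def __init__(self):
        super().__init__()
        self._count = 0        # private state; nobody reaches in

    def on_increment(self):
        self._count += 1

    def on_report(self):
        print("count is", self._count)

counter = Counter()
counter.send("increment")
counter.send("increment")
counter.send("report")         # prints "count is 2" once the mailbox drains
time.sleep(0.1)                # give the daemon thread time to finish
```

Because each object serializes its own mailbox, "time passing" becomes a local property of the object rather than something a global CPU clock imposes on everyone at once.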
So, to me, the issue is much more "how should we design and program systems that must live in an ever-growing 'ecology' of ongoing systems?"
Besides using the good 70s solutions to the two problems alluded to above, we could also think about the idea of protected modules which don't send explicit messages but use some form of efficient and general publish and subscribe.
Imagine that each module in "a space" exhibits two kinds of "bulletin boards": (a) one that tells the space what resources the module needs to do its job, and (b) one that exhibits what the module has produced. A "super-Linda" kind of matcher can then be used to automatically do the brokering. This is more late-bound and has more possibilities for graceful scaling (because objects are dealing not with targets but with needs).
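A toy version of this, with an exact-match broker standing in for the "super-Linda" matcher (the names, and the string-key matching itself, are my simplifications; a real matcher would work over much richer descriptions of needs and products):

```python
class Space:
    """A toy broker: modules post needs and products; it does the matching."""
    def __init__(self):
        self.offers = {}   # description -> a value some module produced
        self.needs = {}    # description -> callbacks of modules still waiting

    def publish(self, description, value):
        """A module posts what it has produced; it never names a target."""
        self.offers[description] = value
        for resolve in self.needs.pop(description, []):
            resolve(value)                    # the space brokers the match

    def require(self, description, resolve):
        """A module posts what it needs; it never names a producer."""
        if description in self.offers:
            resolve(self.offers[description])
        else:
            self.needs.setdefault(description, []).append(resolve)

space = Space()
# A consumer states a need, not knowing (or caring) who will meet it.
space.require("census:2020:population", lambda v: print("got", v))
# A producer states what it has made, not knowing who is listening.
space.publish("census:2020:population", 331_449_281)
```

The toy keeps the point that matters: the late binding. Producers and consumers each deal only with the space and with descriptions of needs, never with each other.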
Similar mechanisms can be used to completely separate "meanings" from "optimizations", so that semantic debugging, testing, and validation can be done independently of attempts to make things go faster.
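One low-tech way to picture this separation, assuming nothing more than a slow-but-obvious "meaning" and a fast candidate "optimization" (both functions are invented for the example):

```python
import random

def sum_of_squares_meaning(xs):
    """The meaning: written for clarity, with no cleverness at all."""
    return sum(x * x for x in xs)

def sum_of_squares_optimized(xs):
    """A stand-in for some tuned implementation of the same meaning."""
    total = 0
    for x in xs:
        total += x * x
    return total

def check(meaning, optimized, trials=1000):
    """Semantic validation: the optimization must agree with the meaning."""
    for _ in range(trials):
        xs = [random.randint(-100, 100) for _ in range(random.randint(0, 50))]
        assert optimized(xs) == meaning(xs), xs

check(sum_of_squares_meaning, sum_of_squares_optimized)
```

Debugging and validation run entirely against the meaning; the optimization can be swapped, tuned, or thrown away without touching what the system is defined to do.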
One of the mottos from Xerox PARC was "Math Wins!", and I think this is still the case. By "math" I mean mathematical thinking and making up new mathematics when it will help problem solving. It is possible to make meta systems in which the new maths can be turned into programming systems and run. This is somewhat equivalent to "escaping from bricks to inventing arches" when things start to get too complex. This is one of the most powerful properties of computers, but it is rarely used to the extent needed and possible in most of today's systems.
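As a (very) small illustration of such a meta system, here is a generic rewrite engine plus two made-up rules giving Peano addition; the "new math" is just the rules, and the engine turns it into something that runs (everything here is invented for the example):

```python
def rewrite(term, rules):
    """Rewrite subterms first, then apply rules until none fire."""
    if isinstance(term, tuple):
        term = tuple(rewrite(t, rules) for t in term)
    for rule in rules:
        result = rule(term)
        if result is not None:
            return rewrite(result, rules)
    return term

def add_zero(t):
    # the rule add(0, n) -> n
    if isinstance(t, tuple) and t[0] == "add" and t[1] == 0:
        return t[2]

def add_succ(t):
    # the rule add(S(m), n) -> S(add(m, n))
    if (isinstance(t, tuple) and t[0] == "add"
            and isinstance(t[1], tuple) and t[1][0] == "S"):
        return ("S", ("add", t[1][1], t[2]))

two = ("S", ("S", 0))
print(rewrite(("add", two, two), [add_zero, add_succ]))
# -> ('S', ('S', ('S', ('S', 0)))), i.e. 2 + 2 = 4 in the invented notation
```

The notation and its laws were made up first; the programming system fell out of them, rather than the other way around.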
And so on. It's not that this stuff is easy or solved, it's that too many people in the field are simply trying to *cope*, where what we need are lots of people trying to make *real progress*!