I believe that we are coming out of a time when hardware and software needed to be designed with respect to each other. A classic example is augmented reality, which started with hand-built tracking systems and now achieves markerless tracking on commodity mobile devices.
Massive data (text, image, audio, location, etc.) and the ability to process it is all that matters now. The internet is not the Memex. The internet is PageRank and BigTable. Semantic meaning is almost certainly coming, and my bet on the mechanism is Information Distance.
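Information Distance itself is defined in terms of Kolmogorov complexity, which is uncomputable, but it has a practical stand-in: the normalized compression distance, which swaps in a real compressor. Here is a minimal sketch using Python's zlib; the example strings are illustrative, and zlib is only one of many compressors that could be used.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: a computable approximation of
    information distance, using compressed length in place of
    Kolmogorov complexity. Smaller means more similar."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Strings that share structure compress well together,
# so their distance is smaller than for unrelated strings.
a = b"the quick brown fox jumps over the lazy dog" * 20
b = b"the quick brown fox jumps over the lazy cat" * 20
c = b"colorless green ideas sleep furiously tonight" * 20
print(ncd(a, b) < ncd(a, c))
```

The appeal of this measure is that it is parameter-free and domain-agnostic: the same function compares text, audio, or genomes, which is why it is a plausible mechanism for extracting semantic similarity from massive data.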
Human-Computer Interaction is becoming more comfortable and intuitive for humans, but the very devices that enable effortless interaction also give the system more data about who and what we are. And the moment computers become better or cheaper at something than humans, computers move into that role. Consider the now-silent floors of the stock exchanges.
As computing becomes more ubiquitous (and I mean everything from fMRIs to thermostats), data about us will continue to be produced in larger, more informative streams. The ability of software to detect patterns in that information will continue to improve, along with hardware processing speed. The models of human behavior that we thought we knew, Fitts' law, GOMS, etc., will be replaced by data-driven models that will probably be more neural in nature than hand-rolled. Cognition will be understood (modellable?) in ways that we are only just beginning to see.
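Fitts' law is a good example of the kind of hand-rolled model in question: a closed-form prediction of pointing time from just two geometric quantities. A minimal sketch, using the common Shannon formulation MT = a + b·log2(D/W + 1); the constants a and b below are illustrative placeholders, since in practice they are fit to observed user data.

```python
import math

def fitts_mt(distance: float, width: float,
             a: float = 0.1, b: float = 0.15) -> float:
    """Predicted movement time (seconds) to acquire a target of a given
    width at a given distance, per Fitts' law (Shannon formulation).
    The intercept a and slope b would normally be fit empirically."""
    return a + b * math.log2(distance / width + 1)

# Farther or smaller targets take longer; nearer or wider ones are faster.
print(fitts_mt(256, 32))
print(fitts_mt(64, 64))
```

The contrast with a data-driven model is exactly the point: this formula hard-codes which features matter and how, whereas a learned model would infer both from the behavioral stream itself.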
How we fit into that brave new world is going to be the central question of HCC in the coming years. It won’t be how we direct it.