Filtering out the inconsequential is not trivial.
Many would argue that the beauty of the internet is its absence of hierarchy. But that same absence means there is no inherent way to differentiate the quality of information on the internet, where quality is defined by the user's interest. To be fair, many strides have been made to make the internet more comprehensible: organizations categorizing new information, filtering by trustworthy sources, and search engines, to name a few. These entities have a two-fold effect: on one hand, "important" or "relevant" information is more readily available; on the other, the information that reaches us through these mediums is out of our control. General popularity strongly influences what gets filtered to the average user, and profiling helps personalize the filter. These filters may help us reach the site we desire, but they can also hinder that possibility. The fundamental problem remains:
How do we guarantee that we are able to access the "perfect site"?
Guaranteeing the "perfect site" is the ideal toward which the internet strives. A given search may return millions of similar results. To narrow them down, one can add more keywords, but at some point it becomes pure guessing as to how a site would phrase an idea, or which synonyms it would use. This is the Achilles' heel of any search engine.
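To make this limitation concrete, here is a minimal sketch of exact-keyword retrieval (the pages and their texts are invented for illustration): a page that expresses the same idea in different words is simply invisible to the search.

```python
# A toy inverted index over three invented pages.
documents = {
    "page_a": "how to repair a car engine at home",
    "page_b": "automobile maintenance done in your own garage",  # same idea, different words
    "page_c": "car insurance quotes and repair costs",
}

# Map each word to the set of pages that contain it.
index = {}
for page, text in documents.items():
    for word in text.split():
        index.setdefault(word, set()).add(page)

def search(query):
    """Return pages containing every query word (exact matches only)."""
    result = None
    for word in query.split():
        pages = index.get(word, set())
        result = pages if result is None else result & pages
    return result or set()

print(search("car repair"))         # {'page_a', 'page_c'} -- page_b is never found
print(search("automobile repair"))  # set() -- guessed the wrong synonym, lost everything
```

The second query shows the guessing game: the user must stumble onto the exact vocabulary a page happens to use, or the page might as well not exist.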
It is impossible to say exactly how such issues can be resolved. However, I propose a general idea that may guide a solution.
Right now, there is a very simple equality among all websites: none can be considered a parent of the others (aside from sub-pages or content with a common producer). At most, a site's purpose can be to link to others, just as a search engine does. But the linked-to sites can very well link back to the linking site, demonstrating that no site stands above any other. Any connection between websites must therefore be intentional: one must know of another site in order to link to it. This creates a natural barrier between websites.
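One way to see this flat structure is to model hyperlinks as a directed graph (the site names below are invented): as soon as links form a cycle, no consistent parent/child ordering of sites can exist.

```python
# Hypothetical link graph: an edge means "links to".
links = {
    "search_engine": ["blog", "news"],
    "blog": ["news", "search_engine"],  # links back, so neither site is "above" the other
    "news": ["blog"],
}

def reachable(graph, start):
    """All sites reachable from `start` by following links."""
    seen, stack = set(), [start]
    while stack:
        site = stack.pop()
        if site not in seen:
            seen.add(site)
            stack.extend(graph.get(site, []))
    return seen

# Every site can reach every other, so the graph admits no hierarchy.
for site in links:
    print(site, "->", sorted(reachable(links, site)))
```

Since each site can reach every other by following links, any attempt to rank one as the "parent" of another collapses; the web is a graph, not a tree.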
An inherent webbing between websites could abolish all barriers between the user and the "perfect site". This webbing would be drawn according to content, an idea, or even a single phrase or word. Imagine that, while reading one page, you could find every other page that shares an idea expressed in it. If the next page you choose contains information closer to what you desire, you use another phrase or idea it expresses to find yet another page. All evidence of your exploration could be compiled, so that the next page you reach is an intelligent improvement on the pages shown to you before. In essence, it is constant googling, but cumulative and over content and ideas rather than mere words. This would cure the disconnect between similar (or dissimilar) sites. A rough sketch of this mechanism follows.
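The sketch below is one minimal way to realize the cumulative part of this idea, in the spirit of relevance feedback: pages are reduced to word-count vectors (a crude stand-in for "ideas"), every page the user visits is folded into a running profile, and the remaining pages are ranked against that whole trail rather than a single query. The page names and texts are invented.

```python
from collections import Counter
import math

# Invented corpus of pages.
pages = {
    "memex_essay": "associative trails link ideas across documents",
    "hypertext_history": "hypertext links documents by shared ideas",
    "car_ads": "buy cheap cars great deals on cars",
    "search_critique": "keyword search fails to capture shared ideas",
}

def vectorize(text):
    """Crude content representation: a bag of word counts."""
    return Counter(text.split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

vectors = {name: vectorize(text) for name, text in pages.items()}

# The user's trail so far: each page read is accumulated into one profile,
# so ranking reflects the whole exploration, not just the last query.
profile = Counter()
for visited in ["memex_essay", "search_critique"]:
    profile += vectors.pop(visited)

ranking = sorted(vectors, key=lambda name: cosine(profile, vectors[name]),
                 reverse=True)
print(ranking)  # ['hypertext_history', 'car_ads'] -- the related page surfaces first
```

In practice one would want far richer representations of ideas than word counts, but the design choice is the point: the profile grows with every page visited, so each recommendation is informed by the entire trail of exploration rather than a single set of keywords.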
This proposition does not claim to give the one-and-only fix for the internet. Rather, it shows, in light of Bush's prophetic ideas, that the growth, storage, and access of information can still be improved.