What if the content/facts of all Wikipedia articles were semantically linked by a prediction-modeling application?
Of course, all of this linking would need to be done by the community, but I have a great deal of faith in the Wikipedia community.
All that really needs to happen is for links in sentences that contain dates like "2054", "June 11", or "05/21/1976" to be declared a prediction, a fact, speculation, or some other category (for now, via something like a rel="prediction" or time attribute). I think it would have to work along with, and within, human-language syntax for now, because I doubt people want to qualify every word they write with semantic markup; but requiring sentences that contain a date to follow a few rules isn't too crazy.
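As a rough sketch of what that could look like (the attribute values and article names here are placeholders I made up, not an existing Wikipedia convention), a date-bearing sentence might be annotated like this:

```html
<!-- Hypothetical markup: each sentence containing a date declares
     what kind of claim it makes via a rel value on the date link. -->
<p>By <a rel="prediction" href="/wiki/2054">2054</a>, the station is
expected to be decommissioned.</p>
<p>On <time datetime="1976-05-21">May 21, 1976</time>, the treaty
was signed.</p>
```

The second sentence carries no rel value at all, which is the point: only lines that contain a date need any rule, and a plain `time` element already marks a statement as dated fact rather than prediction.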
Another piece is needed to tie the annotation back to the actual information, but that could just be a link to the article in which the sentence appears; since that's where the annotation lives anyway, it comes almost for free. It's a start.
In other words, a query like "Show me predictions for 2054 based on Wikipedia" could return every article that contains a prediction for 2054, and you could jump straight to those articles from the results. Not as granular as "linked data" should be, but right now the web is basically all about making it easier to look at selfies and bad journalism.
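A minimal sketch of that query, assuming a toy corpus of community-annotated sentences (the article titles, sentences, and rel values below are invented for illustration):

```python
import re

# Hypothetical annotated corpus: (article title, sentence, rel value),
# as the community might tag date-bearing sentences.
ANNOTATED = [
    ("Space elevator", "A working elevator is predicted by 2054.", "prediction"),
    ("Apollo 11", "The landing took place on July 20, 1969.", "fact"),
    ("Fusion power", "Commercial plants may appear by 2054.", "speculation"),
]

def claims_for_year(year, kinds=("prediction", "speculation")):
    """Return (article, sentence) pairs that mention the given year
    and whose annotation is one of the requested kinds."""
    pattern = re.compile(r"\b%d\b" % year)
    return [
        (title, sentence)
        for title, sentence, rel in ANNOTATED
        if rel in kinds and pattern.search(sentence)
    ]

# "Show me predictions for 2054" over the toy corpus:
results = claims_for_year(2054)
```

Each result already names the article it came from, which is the "link back to the actual information" piece described above.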
I think this could have a lot of research power.