The world needs cost-effective, democratic, accessible microprediction.
If everyone copies this prototypical site, it just might happen -- did you know it is open source?
What's microprediction, you say?
The act of making thousands of predictions of the same type over and over again. Microprediction can:
- Clean and enrich live data
- Alert you to outliers and anomalies
- Provide you with short-term forecasts
- Identify patterns in model residuals
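As a taste of the anomaly-alerting item above, here is a minimal sketch (not part of any real API) of a rolling z-score detector that flags values far from the recent mean:

```python
from collections import deque
from math import sqrt

def zscore_alerts(values, window=20, threshold=3.0):
    """Return indices of values more than `threshold` rolling standard
    deviations from the rolling mean of the previous `window` values."""
    recent = deque(maxlen=window)
    alerts = []
    for i, x in enumerate(values):
        if len(recent) >= 2:
            mean = sum(recent) / len(recent)
            var = sum((v - mean) ** 2 for v in recent) / (len(recent) - 1)
            std = sqrt(var)
            if std > 0 and abs(x - mean) / std > threshold:
                alerts.append(i)
        recent.append(x)
    return alerts

# A gently oscillating stream with one spike at index 30:
stream = [10.0, 10.2] * 15 + [50.0] + [10.0, 10.2] * 5
# zscore_alerts(stream) flags only the spike: [30]
```

A real microprediction swarm would do this implicitly: a published value that lands far out in the swarm's predicted distribution is, by definition, an anomaly.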
It can also be combined with patterns from Control Theory and Reinforcement Learning to engineer low-cost but tailored intelligent applications. Often enough, AI is microprediction, albeit bundled with other mathematical or application logic.
The provided API (or Python client) seeks to address the high cost of bespoke data science -- at least as it applies to microprediction.
- You publish a live data value.
- The sequence of these values gets predicted by a swarm of algorithms.
- Anyone can write a crawler that tries to predict many different streams.
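The three steps above can be sketched as a toy, in-memory exchange. This is an illustration of the mechanics only, not the real API: the real service is networked, collects distributional (not point) predictions, and rewards accuracy.

```python
# Toy "exchange": a publisher sets values, simple crawlers submit
# one-step-ahead point predictions, and the exchange keeps score.

class ToyExchange:
    def __init__(self):
        self.history = []    # published stream values
        self.pending = {}    # crawler name -> prediction for the next value
        self.scores = {}     # crawler name -> cumulative absolute error

    def submit(self, crawler, prediction):
        self.pending[crawler] = prediction

    def set(self, value):
        # Settle outstanding predictions against the newly published value.
        for crawler, prediction in self.pending.items():
            self.scores[crawler] = self.scores.get(crawler, 0.0) + abs(prediction - value)
        self.pending = {}
        self.history.append(value)

# Two trivial crawlers: predict the last value, or the running mean.
def last_value(history):
    return history[-1] if history else 0.0

def running_mean(history):
    return sum(history) / len(history) if history else 0.0

exchange = ToyExchange()
for value in [1.0, 2.0, 3.0, 4.0, 5.0]:
    exchange.submit("last", last_value(exchange.history))
    exchange.submit("mean", running_mean(exchange.history))
    exchange.set(value)

# On this trending stream, the "last value" crawler accumulates less error.
```

The point of the real exchange is exactly this separation of concerns: the publisher knows nothing about the crawlers, and the crawlers know nothing about the application behind the stream.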
Microprediction APIs make it easy to:
- Separate the act of microprediction from other application logic.
- Invite contributions from other people and machines.
- Benefit from other data you may never have considered.
- Benchmark your work for research purposes.
- Automate ongoing performance analysis.
Perhaps this is how our future AI overlords will solve realtime decision making and control problems.
A community effort
Seamless reuse of data and algorithms is the aim here:
- Algorithms crawl from one stream to another.
- Algorithms discover causal links between streams.
Let's say your store is predicting sales and I'm optimizing an HVAC system across the street. Your feature space and mine probably have a lot in common.
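The shared-feature-space idea can be made concrete with a small sketch. The feature names and function signatures below are illustrative assumptions, not from any real API:

```python
from datetime import datetime

# Two unrelated tasks -- store sales and HVAC load -- reuse one common
# feature extractor, each adding only its task-specific extras.

def common_features(when: datetime, temperature_c: float) -> dict:
    return {
        "hour": when.hour,
        "is_weekend": when.weekday() >= 5,
        "temperature_c": temperature_c,
    }

def sales_features(when, temperature_c, foot_traffic):
    features = common_features(when, temperature_c)
    features["foot_traffic"] = foot_traffic   # task-specific extra
    return features

def hvac_features(when, temperature_c, occupancy):
    features = common_features(when, temperature_c)
    features["occupancy"] = occupancy         # task-specific extra
    return features

noon_saturday = datetime(2021, 6, 5, 12)
sales = sales_features(noon_saturday, 28.0, foot_traffic=120)
hvac = hvac_features(noon_saturday, 28.0, occupancy=35)
```

In a microprediction network, that shared portion need not be engineered by hand at all: algorithms crawling both streams discover the common structure themselves.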
Help us create a microprediction network
Microprediction is inherently hard because it is a search in the space of the world's data and models. It is the kind of problem likely to be solved by a complex network, not by individuals working alone. We think microprediction will be solved collectively, by everyone, or not very well at all.
Allow us one analogy. Document search was pretty lame before the web. Cold-start artisan data scientists trying to organize the world's feature spaces might as well be librarians trying to replace the internet with the Dewey decimal system.
Driving down the cost of microprediction, and thus AI, requires more powerful, ruthless orchestrating mechanisms. Help us evolve low-cost supply chains for microprediction. Help us build a lattice of nowcasts of numbers of civic, commercial and scientific significance that, like the web itself, can grow to become an indispensable resource. Perhaps over time that feature lattice can become as essential to realtime operations as the web is to commerce today.
Right now we are making microprediction easy for individuals, organizations and small-to-medium-size enterprises who cannot afford teams of data scientists. In the long term, perhaps this can be the beginning of a more profound realtime, collective intelligence.
It's worth a shot.
Please consider contributing in some way (and check the footer for cash incentives).