Big data, Google, and the end of free will (ft.com)
Interesting read, though I would have liked more emphasis on the fact that all of these algorithms are biased against you.
You aren't just giving control away in exchange for an all-knowing algorithm that helps you decide how to live your life. No, these algorithms all actively use this control, and they have no interest in making your life better.
They all want you to buy something you wouldn't otherwise have bought, or to spend more time on their service watching ads than is sensible.
They might try to be useful on the side, so that you don't abandon them entirely, but whenever they can, they will do what's best for their company, because that's what they are programmed to do.
Your perspective seems right, relative to the status quo. But in time, fair algorithms could be put to use by open-source and Creative Commons communities to empower regular people. Today it is easy to fork or extend a system, even an AI system, and data can be collected by offering services. I don't think a few companies could hold AI hostage, especially because the norm is to share and the barrier to entry, in terms of computing power, is low.
If humans write the algorithms, then all the subconscious biases of their authors will still influence them.
Algorithms are not holy. Nor is data.
The Unabomber's manifesto doesn't seem so crazy now, does it?